WorldWideScience

Sample records for sample processing system

  1. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    Science.gov (United States)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

A new method has been developed for monitoring and control of automated sample processing and preparation, especially desalting of samples before chemical analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of sample preparation and processing, but will also help optimize operational time and the use of consumables.
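The flush-and-trigger sequencing described above amounts to a threshold-driven loop: run a step, flush until both probes read low, then advance. A minimal sketch, assuming hypothetical probe-reading callables and an illustrative 5 µS/cm "low-conductivity" threshold (neither value is from the brief):

```python
def run_desalting_sequence(read_inlet, read_outlet, steps, low_threshold=5.0):
    """Advance through processing steps; each step's flush is considered
    complete when BOTH non-contact probes read below low_threshold (uS/cm)."""
    completed = []
    for step in steps:
        step()  # e.g. acidify, basify, or elute
        # Flush with low-conductivity water until the system resets;
        # real code would dispense flush water on each pass.
        while read_inlet() > low_threshold or read_outlet() > low_threshold:
            continue
        completed.append(step.__name__)
    return completed
```

Because each step waits for the conductivity reset rather than a fixed flush time, consumable use tracks the actual state of the media, which is the optimization the brief describes.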

  2. An Automated Sample Processing System for Planetary Exploration

    Science.gov (United States)

    Soto, Juancarlos; Lasnik, James; Roark, Shane; Beegle, Luther

    2012-01-01

An Automated Sample Processing System (ASPS) for wet chemistry processing of organic materials on the surface of Mars has been jointly developed by Ball Aerospace and the Jet Propulsion Laboratory. The mechanism has been built and tested to demonstrate Technology Readiness Level (TRL) 4. This paper describes the function of the system, the mechanism design, lessons learned, and several challenges that were overcome.

  3. Design aspects of automation system for initial processing of fecal samples

    International Nuclear Information System (INIS)

    Sawant, Pramilla D.; Prabhu, Supreetha P.; Suja, A.; Wankhede, Sonal; Chaudhary, Seema; Rao, D.D.; Pradeepkumar, K.S.; Das, A.P.; Badodkar, B.D.

    2014-01-01

The procedure for initial handling of fecal samples at the Bioassay Lab., Trombay is as follows: an overnight fecal sample is collected from the worker in a kit consisting of a polythene bag placed in a wide-mouth polythene container closed with an inner lid and a screw cap. The occupational worker collects the sample in the polythene bag. On receipt, the polythene container along with the sample is weighed; the polythene bag containing the fecal sample is then lifted out of the container using a pair of tongs, placed inside a crucible, and ashed inside a muffle furnace at 450 °C. After complete ashing, the crucible containing white ash is taken up for further radiochemical processing. This paper describes the various steps in developing a prototype automated system for initial handling of fecal samples, intended to automate the above procedure. The system, once developed, will help eliminate manual intervention up to the ashing stage and reduce the biological hazard involved in handling such samples

  4. Robotic system for process sampling

    International Nuclear Information System (INIS)

    Dyches, G.M.

    1985-01-01

    A three-axis cartesian geometry robot for process sampling was developed at the Savannah River Laboratory (SRL) and implemented in one of the site radioisotope separations facilities. Use of the robot reduces personnel radiation exposure and contamination potential by routinely handling sample containers under operator control in a low-level radiation area. This robot represents the initial phase of a longer term development program to use robotics for further sample automation. Preliminary design of a second generation robot with additional capabilities is also described. 8 figs

  5. Down sampled signal processing for a B Factory bunch-by-bunch feedback system

    International Nuclear Information System (INIS)

    Hindi, H.; Hosseini, W.; Briggs, D.; Fox, J.; Hutton, A.

    1992-03-01

A bunch-by-bunch feedback scheme is studied for damping coupled bunch synchrotron oscillations in the proposed PEP II B Factory. The quasi-linear feedback system design incorporates a phase detector to provide a quantized measure of bunch phase, digital signal processing to compute an error correction signal, and a kicker system to correct the energy of the bunches. A farm of digital processors, operating in parallel, is proposed to compute correction signals for the 1658 bunches of the B Factory. This paper studies the use of down sampled processing to reduce the computational complexity of the feedback system. We present simulation results showing the effect of down sampling on beam dynamics. Results show that down sampled processing can reduce the scale of the processing task by a factor of 10
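The down-sampling idea — computing a new correction only every Nth turn and holding it constant in between (a zero-order hold) — can be sketched as follows. The proportional gain and decimation factor are illustrative assumptions, not parameters from the paper:

```python
def downsampled_corrections(phase, decimation=10, gain=-0.1):
    """Compute a kicker correction from the measured bunch phase only every
    `decimation` turns; hold the last correction on the turns in between."""
    corrections = []
    held = 0.0
    for turn, x in enumerate(phase):
        if turn % decimation == 0:
            held = gain * x  # illustrative proportional correction
        corrections.append(held)
    return corrections
```

Only one correction computation is performed per `decimation` turns, which is the factor-of-10 reduction in processing load the paper reports for its processor farm.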

  6. Highly oriented Bi-system bulk sample prepared by a decomposition-crystallization process

    International Nuclear Information System (INIS)

    Xi Zhengping; Zhou Lian; Ji Chunlin

    1992-01-01

A decomposition-crystallization method for preparing highly oriented Bi-system bulk samples is reported. The effects of processing parameters (decomposition temperature, cooling rate, and post-treatment conditions) on texture and superconductivity are investigated. The method has successfully produced highly textured Bi-system bulk samples. High-temperature annealing does not destroy the growing texture, but the cooling rate has some effect on texture and superconductivity. Annealing in an N2/O2 atmosphere can improve the superconductivity of the textured sample. The superconductivity of Bi(Pb)-Sr-Ca-Cu-O bulk material has been reported in numerous papers. Research on Jc has concentrated on tapes containing the 2223 phase, with very few studies on the Jc of bulk samples; the reason for this lack of studies is that the behavior of the superconducting phases at high temperature had not been known. The authors have reported that the 2212 phase melts incongruently at about 875 °C and that the c-axis orients perpendicular to the surface during crystallization of the 2212 phase. Based on that result, a decomposition-crystallization method was proposed to prepare highly oriented Bi-system bulk samples. In this paper, the process is described in detail and the effects of processing parameters on texture and superconductivity are reported

  7. Event Processing and Variable Part of Sample Period Determining in Combined Systems Using GA

    Science.gov (United States)

    Strémy, Maximilián; Závacký, Pavol; Jedlička, Martin

    2011-01-01

    This article deals with combined dynamic systems and usage of modern techniques in dealing with these systems, focusing particularly on sampling period design, cyclic processing tasks and related processing algorithms in the combined event management systems using genetic algorithms.

  8. Recent results of the investigation of a micro-fluidic sampling chip and sampling system for hot cell aqueous processing streams

    International Nuclear Information System (INIS)

    Tripp, J.; Smith, T.; Law, J.

    2013-01-01

A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially, sampling technologies were evaluated, and micro-fluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. Different sampling volumes have been tested; the 10 μl volume produced data with much smaller relative standard deviations than the 2 μl volume. In addition, the production of a less expensive, mass-produced sampling chip was investigated to avoid chip reuse, thus increasing sampling reproducibility/accuracy. The micro-fluidic-based robotic sampling system's mechanical elements were tested to ensure analytical reproducibility and the optimum robotic handling of micro-fluidic sampling chips. (authors)

  9. Sample and data management process description

    International Nuclear Information System (INIS)

    Kessner, J.H.

    2000-01-01

    The sample and data management process was initiated in 1994 as a result of a process improvement workshop. The purpose of the workshop was to develop a sample and data management process that would reduce cycle time and costs, simplify systems and procedures, and improve customer satisfaction for sampling, analytical services, and data management activities

  10. Sample processing device and method

    DEFF Research Database (Denmark)

    2011-01-01

A sample processing device is disclosed, which comprises a first substrate and a second substrate, where the first substrate has a first surface comprising two area types: a first area type with a first contact angle with water and a second area type with a second contact angle with water, the first contact angle being smaller than the second contact angle. The first substrate defines an inlet system and a preparation system in areas of the first type, which two areas are separated by a barrier system in an area of the second type. The inlet system is adapted to receive...

  11. Remote sampling of process fluids in radiochemical plants

    International Nuclear Information System (INIS)

    Sengar, P.B.; Bhattacharya, R.; Ozarde, P. D.; Rana, D.S.

    1990-01-01

Sampling of process fluids, continuous or periodic, is an essential requirement in any chemical process plant in order to keep control of process variables. In a radiochemical plant, the task of taking and conveying samples is a very tricky affair. This is because neither the vessels/equipment containing radioactive effluents can be approached for manual sampling, nor can the sampled fluids be handled directly. The problems become more acute at higher levels of radioactivity. As such, innovative systems have to be devised to obtain and handle radioactive samples using remote operations. The remote sampling system developed in this Division has some unique features, such as taking only the requisite amount of sample in the microlitre range, a practically maintenance-free design, and avoidance of excess radioactive fluid coming out of process systems. The paper describes the design of the remote sampling system in detail and compares it with existing systems. The design aims at simplicity of operation, homogenised representative samples, and high economy in man-rem expenditure. The performance of a prototype system has also been evaluated. (author). 3 refs

  12. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...
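The variographic bridge between sampling theory and statistical process control rests on the empirical variogram of a 1-D process stream. A minimal sketch of that computation, using the standard estimator rather than any code from the book:

```python
def variogram(stream, max_lag=10):
    """Empirical variogram of a 1-D process stream:
        v(j) = sum_i (x[i+j] - x[i])^2 / (2 * (n - j))
    The shape of v(j) vs. lag j reveals range, sill, and nugget effect."""
    n = len(stream)
    v = {}
    for j in range(1, min(max_lag, n - 1) + 1):
        diffs = [(stream[i + j] - stream[i]) ** 2 for i in range(n - j)]
        v[j] = sum(diffs) / (2 * (n - j))
    return v
```

The extrapolation of v(j) to lag zero estimates the 0-D sampling variance (nugget), while the growth of v(j) with lag captures the 1-D process variation — the decomposition the book uses to connect sampling correctness with process control.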

  13. Automated sample-processing and titration system for determining uranium in nuclear materials

    International Nuclear Information System (INIS)

    Harrar, J.E.; Boyle, W.G.; Breshears, J.D.; Pomernacki, C.L.; Brand, H.R.; Kray, A.M.; Sherry, R.J.; Pastrone, J.A.

    1977-01-01

The system is designed for accurate, precise, and selective determination of 10 to 180 mg of uranium in 2 to 12 cm³ of solution. Samples, standards, and their solutions are handled on a weight basis. These weights, together with their appropriate identification numbers, are stored in computer memory and are used automatically in the assay calculations after each titration. The measurement technique (controlled-current coulometry) is based on the Davies-Gray and New Brunswick Laboratory method, in which U(VI) is reduced to U(IV) in strong H3PO4, followed by titration of the U(IV) with electrogenerated V(V). Solution pretreatment and titration are automatic. The analyzer is able to process 44 samples per loading of the sample changer, at a rate of 4 to 9 samples per hour. The system includes a comprehensive fault-monitoring system that detects analytical errors, guards against abnormal conditions which might cause errors, and prevents unsafe operation. A detailed description of the system, information on the reliability of the component subsystems, and a summary of its evaluation by the New Brunswick Laboratory are presented
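In controlled-current coulometry the assay ultimately reduces to Faraday's law: the titration charge fixes the moles of analyte. A hypothetical sketch of that conversion, assuming the two-electron U(IV) → U(VI) step implied by the titration with electrogenerated V(V) (the actual assay calculation in the system is more elaborate, weight-corrected, and not given here):

```python
FARADAY = 96485.3   # C/mol, Faraday constant
M_URANIUM = 238.03  # g/mol, atomic mass of natural uranium

def uranium_mass_mg(charge_coulombs, electrons_per_atom=2):
    """Convert total titration charge to uranium mass via Faraday's law,
    assuming U(IV) -> U(VI), i.e. two electrons per uranium atom."""
    moles = charge_coulombs / (electrons_per_atom * FARADAY)
    return moles * M_URANIUM * 1000.0  # g -> mg
```

At constant current, the charge is simply current times titration time, which is why controlled-current coulometry lends itself so well to automation: only a time measurement and an endpoint detection are needed.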

  14. Waste retrieval sluicing system vapor sampling and analysis plan for evaluation of organic emissions, process test phase III

    International Nuclear Information System (INIS)

    SASAKI, L.M.

    1999-01-01

This sampling and analysis plan identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for vapor samples obtained to address vapor issues related to the sluicing of tank 241-C-106. Sampling will be performed in accordance with Waste Retrieval Sluicing System Emissions Collection Phase III (Jones 1999) and Process Test Plan Phase III, Waste Retrieval Sluicing System Emissions Collection (Powers 1999). Analytical requirements include those specified in Request for Ecology Concurrence on Draft Strategy/Path Forward to Address Concerns Regarding Organic Emissions from C-106 Sluicing Activities (Peterson 1998). The Waste Retrieval Sluicing System was installed to retrieve and transfer high-heat sludge from tank 241-C-106 to tank 241-AY-102, which is designed for high-heat waste storage. During initial sluicing of tank 241-C-106 in November 1998, operations were halted due to detection of unexpectedly high volatile organic compounds in emissions that exceeded regulatory permit limits. Several workers also reported smelling sharp odors and throat irritation. Vapor grab samples from the 296-C-006 ventilation system were taken as soon as possible after detection; the analyses indicated that volatile and semi-volatile organic compounds were present. In December 1998, a process test (phase I) was conducted in which the pumps in tanks 241-C-106 and 241-AY-102 were operated and vapor samples obtained to determine constituents that may be present during active sluicing of tank 241-C-106. The process test was suspended when a jumper leak was detected. On March 7, 1999, phase II of the process test was performed; the sluicing system was operated for approximately 7 hours and was ended using the controlled shutdown method when the allowable amount of solids had been transferred to 241-AY-102. The phase II test was successful; however, further testing is required to obtain vapor samples at higher emission levels

  15. Sample Handling and Processing on Mars for Future Astrobiology Missions

    Science.gov (United States)

    Beegle, Luther; Kirby, James P.; Fisher, Anita; Hodyss, Robert; Saltzman, Alison; Soto, Juancarlos; Lasnik, James; Roark, Shane

    2011-01-01

In most analytical investigations, there is a need to process complex field samples for the unique detection of analytes, especially when detecting low-concentration organic molecules that may identify extraterrestrial life. Sample processing for analytical instruments is time-, resource-, and manpower-intensive in terrestrial laboratories. Every step in this laborious process will have to be automated for in situ life detection. We have developed, and are currently demonstrating, an automated wet chemistry preparation system that can operate autonomously on Earth and is designed to operate under Martian ambient conditions. This will enable a complete wet chemistry laboratory as part of future missions. Our system, namely the Automated Sample Processing System (ASPS), receives fines, extracts organics through solvent extraction, processes the extract by removing non-organic soluble species and delivers sample to multiple instruments for analysis (including for non-organic soluble species).

  16. Tank vapor sampling and analysis data package for tank 241-C-106 waste retrieval sluicing system process test phase III, sampled March 28, 1999

    International Nuclear Information System (INIS)

    LOCKREM, L.L.

    1999-01-01

This data package presents sampling data and analytical results from the March 28, 1999, vapor sampling of Hanford Site single-shell tank 241-C-106 during active sluicing. Samples were obtained from the 296-C-006 ventilation system stack and ambient air at several locations. Characterization Project Operations (CPO) was responsible for the collection of all SUMMA™ canister samples. The Special Analytical Support (SAS) vapor team was responsible for the collection of all triple sorbent trap (TST), sorbent tube train (STT), polyurethane foam (PUF), and particulate filter samples collected at the 296-C-006 stack. The SAS vapor team used the non-electrical vapor sampling (NEVS) system to collect samples of the air, gases, and vapors from the 296-C-006 stack. The SAS vapor team collected and analyzed these samples for Lockheed Martin Hanford Corporation (LMHC) and Tank Waste Remediation System (TWRS) in accordance with the sampling and analytical requirements specified in the Waste Retrieval Sluicing System Vapor Sampling and Analysis Plan (SAP) for Evaluation of Organic Emissions, Process Test Phase III, HNF-4212, Rev. 0-A, (LMHC, 1999). All samples were stored in a secured Radioactive Materials Area (RMA) until the samples were radiologically released and received by SAS for analysis. The Waste Sampling and Characterization Facility (WSCF) performed the radiological analyses. The samples were received on April 5, 1999

  17. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by the agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  18. Sampling system for in vivo ultrasound images

    DEFF Research Database (Denmark)

    Jensen, Jorgen Arendt; Mathorne, Jan

    1991-01-01

Newly developed algorithms for processing medical ultrasound images use the high-frequency sampled transducer signal. This paper describes the demands imposed on a sampling system suitable for acquiring such data and gives details of a prototype constructed. It acquires full clinical images at a sampling frequency of 20 MHz with a resolution of 12 bits. The prototype can be used for real-time image processing. An example of a clinical in vivo image is shown and various aspects of the data acquisition process are discussed.

  19. Two-Stage Variable Sample-Rate Conversion System

    Science.gov (United States)

    Tkacenko, Andre

    2009-01-01

A two-stage variable sample-rate conversion (SRC) system has been proposed as part of a digital signal-processing system in a digital communication radio receiver that utilizes a variety of data rates. The proposed system would be used as an interface between (1) an analog-to-digital converter used in the front end of the receiver to sample an intermediate-frequency signal at a fixed input rate and (2) digitally implemented tracking loops in subsequent stages that operate at various sample rates that are generally lower than the input sample rate. This two-stage system would be capable of converting from an input sample rate to a desired lower output sample rate that could be variable and not necessarily a rational fraction of the input rate.
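One common way to realize such a scheme is a fixed coarse stage followed by a variable fine stage that interpolates at fractional sample positions. A minimal sketch under two simplifying assumptions — linear interpolation for the fine stage and no anti-alias filtering in the coarse stage — neither of which is claimed to match the proposed system's actual design:

```python
def resample_linear(x, ratio):
    """Variable-ratio stage: linearly interpolate x at fractional
    positions t = k / ratio, where ratio = out_rate / in_rate
    (need not be a rational fraction)."""
    out = []
    k = 0
    while True:
        t = k / ratio
        i = int(t)
        if i >= len(x) - 1:
            break
        frac = t - i
        out.append((1 - frac) * x[i] + frac * x[i + 1])
        k += 1
    return out

def two_stage_src(x, decimation, ratio):
    """Stage 1: fixed integer decimation (a real design would low-pass
    filter first). Stage 2: variable fine-ratio interpolation."""
    stage1 = x[::decimation]
    return resample_linear(stage1, ratio)
```

Splitting the conversion this way keeps the expensive filtering at a fixed, easily optimized rate, while the cheap interpolator absorbs the variable (possibly irrational) residual ratio.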

  20. An improved sampling system installed for reprocessing

    International Nuclear Information System (INIS)

    Finsterwalder, L.; Zeh, H.

    1979-03-01

Sampling devices are needed for taking representative samples from individual process containers during the reprocessing of irradiated fuel. The aqueous process stream in a reprocessing plant frequently contains, in addition to the dissolved radioactive materials, small quantities of solid matter: a fraction of fuel material still remaining undissolved; insoluble fission, corrosion, or degradation products; and, in exceptional cases, ion exchange resin or silica gel. The solid matter is deposited partly on the upper surfaces of the sampling system, and the resulting radiation makes maintenance and repair of the sampler more difficult. The purpose of the development work was to reduce the chance of accident, cut maintenance costs, and lower the radiation exposure of the personnel. A new sampling system was developed and is described. (author)

  1. ISOLOK VALVE ACCEPTANCE TESTING FOR DWPF SME SAMPLING PROCESS

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T.; Hera, K.; Coleman, C.; Jones, M.; Wiedenman, B.

    2011-12-05

Evaluation of the Defense Waste Processing Facility (DWPF) Chemical Process Cell (CPC) cycle time identified several opportunities to improve the CPC processing time. Of the opportunities, a focus area related to optimizing the equipment and efficiency of the sample turnaround time for the DWPF Analytical Laboratory was identified. The Mechanical Systems & Custom Equipment Development (MS&CED) Section of the Savannah River National Laboratory (SRNL) evaluated the possibility of using an Isolok® sampling valve as an alternative to the Hydragard® valve for taking process samples. Previous viability testing was conducted with favorable results using the Isolok sampler and reported in SRNL-STI-2010-00749 (1). This task has the potential to improve operability, reduce maintenance time and decrease CPC cycle time. This report summarizes the results from acceptance testing which was requested in Task Technical Request (TTR) HLW-DWPF-TTR-2010-0036 (2) and which was conducted as outlined in Task Technical and Quality Assurance Plan (TTQAP) SRNL-RP-2011-00145 (3). The Isolok to be tested is the same model which was tested, qualified, and installed in the Sludge Receipt Adjustment Tank (SRAT) sample system. RW-0333P QA requirements apply to this task. This task was to qualify the Isolok sampler for use in the DWPF Slurry Mix Evaporator (SME) sampling process. The Hydragard, which is the current baseline sampling method, was used for comparison to the Isolok sampling data. The Isolok sampler is an air-powered grab sampler used to 'pull' a sample volume from a process line. The operation of the sampler is shown in Figure 1. The image on the left shows the Isolok's spool extended into the process line and the image on the right shows the sampler retracted and then dispensing the liquid into the sampling container. To determine tank homogeneity, a Coliwasa sampler was used to grab samples at a high and low location within the mixing tank. Data from

  2. Representative process sampling - in practice

    DEFF Research Database (Denmark)

    Esbensen, Kim; Friis-Pedersen, Hans Henrik; Julius, Lars Petersen

    2007-01-01

Didactic data sets representing a range of real-world processes are used to illustrate "how to do" representative process sampling and process characterisation. The selected process data lead to diverse variogram expressions with different systematics (no range vs. important ranges; trends and/or periodicity; different nugget effects and process variations ranging from less than one lag to full variogram lag). Variogram data analysis leads to a fundamental decomposition into 0-D sampling vs. 1-D process variances, based on the three principal variogram parameters: range, sill and nugget effect...

  3. Data processing system for NBT experiments

    International Nuclear Information System (INIS)

    Takahashi, C.; Hosokawa, M.; Shoji, T.; Fujiwara, M.

    1981-07-01

A data processing system for the Nagoya Bumpy Torus (NBT) has been developed. Since plasmas are produced and heated in steady state by high-power microwaves, data sampling and processing take place on a long time scale, on the order of one minute. The system, which consists of a NOVA 3/12 minicomputer and many data acquisition devices, is designed to sample and process a large amount of data before the next discharge starts. Several features of such a long-time-scale data processing system are described in detail. (author)

  4. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    Science.gov (United States)

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
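The binomial design question behind such a system is how many kernels must be drawn so that at least one tracer is recovered with a given confidence. A minimal sketch under the binomial approximation (each drawn kernel is a tracer with probability n/lot); the kernel counts used below are purely illustrative:

```python
from math import ceil, log

def prob_at_least_one_tracer(n_tracers, lot_kernels, sample_kernels):
    """P(a sample of `sample_kernels` kernels contains >= 1 tracer),
    binomial approximation with per-kernel tracer probability n/lot."""
    p = n_tracers / lot_kernels
    return 1 - (1 - p) ** sample_kernels

def required_sample_size(n_tracers, lot_kernels, confidence=0.95):
    """Smallest sample size achieving the requested recovery confidence,
    from 1 - (1-p)^k >= confidence."""
    p = n_tracers / lot_kernels
    return ceil(log(1 - confidence) / log(1 - p))
```

For example, with 1 tracer per 100 kernels, roughly 300 kernels must be sampled to recover a tracer with 95% confidence; multinomial extensions handle several tracer codes at once.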

  5. Radar Doppler Processing with Nonuniform Sampling.

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

Conventional signal processing to estimate radar Doppler frequency often assumes uniform pulse/sample spacing. This is for the convenience of the processing. More recent performance enhancements in processor capability allow optimally processing nonuniform pulse/sample spacing, thereby overcoming some of the baggage that attends uniform sampling, such as Doppler ambiguity and SNR losses due to sidelobe control measures.
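With nonuniform pulse times the usual FFT bins are unavailable, but the Doppler frequency can still be estimated by evaluating a matched filter on a grid of candidate frequencies. A minimal sketch of that idea (a plain grid search with illustrative times and frequencies, not the report's algorithm):

```python
import cmath

def doppler_estimate(times, samples, freqs):
    """Estimate Doppler frequency from nonuniformly spaced pulses by
    maximizing the matched-filter magnitude |sum_n x[n] e^{-j 2 pi f t_n}|
    over a candidate frequency grid."""
    best_f, best_mag = None, -1.0
    for f in freqs:
        s = sum(x * cmath.exp(-2j * cmath.pi * f * t)
                for t, x in zip(times, samples))
        if abs(s) > best_mag:
            best_f, best_mag = f, abs(s)
    return best_f
```

Because the pulse times enter the exponent directly, irregular spacing costs nothing here; it also breaks the strict periodicity that causes Doppler ambiguity in uniformly sampled processing.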

  6. Tank vapor sampling and analysis data package for tank 241-C-106 waste retrieval sluicing system process test phase III

    Energy Technology Data Exchange (ETDEWEB)

    LOCKREM, L.L.

    1999-08-13

This data package presents sampling data and analytical results from the March 28, 1999, vapor sampling of Hanford Site single-shell tank 241-C-106 during active sluicing. Samples were obtained from the 296-C-006 ventilation system stack and ambient air at several locations. Characterization Project Operations (CPO) was responsible for the collection of all SUMMA™ canister samples. The Special Analytical Support (SAS) vapor team was responsible for the collection of all triple sorbent trap (TST), sorbent tube train (STT), polyurethane foam (PUF), and particulate filter samples collected at the 296-C-006 stack. The SAS vapor team used the non-electrical vapor sampling (NEVS) system to collect samples of the air, gases, and vapors from the 296-C-006 stack. The SAS vapor team collected and analyzed these samples for Lockheed Martin Hanford Corporation (LMHC) and Tank Waste Remediation System (TWRS) in accordance with the sampling and analytical requirements specified in the Waste Retrieval Sluicing System Vapor Sampling and Analysis Plan (SAP) for Evaluation of Organic Emissions, Process Test Phase III, HNF-4212, Rev. 0-A, (LMHC, 1999). All samples were stored in a secured Radioactive Materials Area (RMA) until the samples were radiologically released and received by SAS for analysis. The Waste Sampling and Characterization Facility (WSCF) performed the radiological analyses. The samples were received on April 5, 1999.

  7. Dry sample storage system for an analytical laboratory supporting plutonium processing

    International Nuclear Information System (INIS)

    Treibs, H.A.; Hartenstein, S.D.; Griebenow, B.L.; Wade, M.A.

    1990-01-01

    The Special Isotope Separation (SIS) plant is designed to provide removal of undesirable isotopes in fuel grade plutonium by the atomic vapor laser isotope separation (AVLIS) process. The AVLIS process involves evaporation of plutonium metal, and passage of an intense beam of light from a laser through the plutonium vapor. The laser beam consists of several discrete wavelengths, tuned to the precise wavelength required to ionize the undesired isotopes. These ions are attracted to charged plates, leaving the bulk of the plutonium vapor enriched in the desired isotopes to be collected on a cold plate. Major portions of the process consist of pyrochemical processes, including direct reduction of the plutonium oxide feed material with calcium metal, and aqueous processes for purification of plutonium in residues. The analytical laboratory for the plant is called the Material and Process Control Laboratory (MPCL), and provides for the analysis of solid and liquid process samples

  8. Multi-dimensional virtual system introduced to enhance canonical sampling

    Science.gov (United States)

    Higo, Junichi; Kasahara, Kota; Nakamura, Haruki

    2017-10-01

When an important process of a molecular system occurs via a combination of two or more rare events, which occur almost independently of one another, computational sampling of the important process is difficult. Here, to sample such a process effectively, we developed a new method, named the "multi-dimensional Virtual-system coupled Monte Carlo (multi-dimensional-VcMC)" method, where the system interacts with a virtual system expressed by two or more virtual coordinates. Each virtual coordinate controls sampling along a reaction coordinate. By setting multiple reaction coordinates to be related to the corresponding rare events, sampling of the important process can be enhanced. An advantage of multi-dimensional-VcMC is its simplicity: namely, the conformation moves widely in the multi-dimensional reaction coordinate space without knowledge of canonical distribution functions of the system. To examine the effectiveness of the algorithm, we introduced a toy model where two molecules (receptor and its ligand) bind and unbind to each other. The receptor has a deep binding pocket, to which the ligand enters for binding. Furthermore, a gate is set at the entrance of the pocket, and the gate is usually closed. Thus, the molecular binding takes place via the two events: ligand approach to the pocket and gate opening. In two-dimensional (2D)-VcMC, the two molecules exhibited repeated binding and unbinding, and an equilibrated distribution was obtained as expected. A conventional canonical simulation, which was 200 times longer than 2D-VcMC, failed to sample the binding/unbinding effectively. The current method is applicable to various biological systems.

  9. SCADA based radioactive sample bottle delivery system for fuel reprocessing project

    International Nuclear Information System (INIS)

    Kaushik, Subrat; Munj, Niket; Chauhan, R.K.; Jayaram, M.N.; Haneef, K.K.M.

    2014-01-01

    Radioactive samples of process streams need to be analyzed in a centralized control lab to measure the concentration of heavy elements as well as activity at various stages of reprocessing plants. The sample is taken remotely from biologically shielded process cells through sampling blisters into sample bottles. These are then transferred to the control lab, located about 50 meters away, using a vacuum transfer system. The bottle movement is tracked from origin to destination in a rich-HMI SCADA system using infrared non-contact proximity sensors located along the sampling line; these sensors are connected to a PLC in a fail-safe mode. The sample bottle travels at a speed of 10 m/s under vacuum motive force, and the detection time is of the order of 1 ms. Flow meters have been used to measure the air flow in the sampling line. The system has been designed, developed, tested and commissioned, and has been in use for four years. (author)
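
The tracking described above can be sketched as a simple ordered-sensor check: given timestamped detections from proximity sensors along the line, verify the bottle passes the sensors in order and estimate its speed. Sensor positions and the event format are hypothetical, chosen only to match the 50 m line and 10 m/s nominal speed in the abstract.

```python
# hypothetical positions of proximity sensors along the 50 m line (meters)
sensors = [0.0, 12.5, 25.0, 37.5, 50.0]

def track(events):
    """events: list of (sensor_index, time_s); returns (avg speed, in-order flag)."""
    in_order = all(a[0] < b[0] for a, b in zip(events, events[1:]))
    (i0, t0), (i1, t1) = events[0], events[-1]
    speed = (sensors[i1] - sensors[i0]) / (t1 - t0)
    return speed, in_order

# a bottle travelling at the nominal 10 m/s triggers each sensor in turn
events = [(i, sensors[i] / 10.0) for i in range(len(sensors))]
speed, ok = track(events)
print(round(speed, 1), ok)   # 10.0 True
```

A missed or out-of-order detection (a stuck bottle, a failed sensor) would show up as `in_order == False` or an implausible speed, which is the kind of fail-safe condition the PLC would act on.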

  10. Microfabricated Devices for Sample Extraction, Concentrations, and Related Sample Processing Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Gang; Lin, Yuehe

    2006-12-01

    This is an invited book chapter. As with other analytical techniques, sample pretreatment, sample extraction, sample introduction, and related techniques are of extreme importance for micro-electro-mechanical systems (MEMS). Bio-MEMS devices and systems start with a sampling step. The biological sample then usually undergoes some kind of sample preparation before the actual analysis. These steps may involve extracting the target analyte from its matrix, removing interferences from the sample, derivatizing the sample to detectable species, or performing a sample preconcentration step. The integration of components for sample pretreatment into microfluidic devices represents one of the remaining bottlenecks toward achieving true miniaturized total analysis systems (µTAS). This chapter provides a thorough state-of-the-art review of the developments in this field to date.

  11. Development of sampling systems and special analyses for pressurized gasification processes; Paineistettujen kaasutusprosessien naeytteenottomenetelmien ja erityisanalytiikan kehittaeminen

    Energy Technology Data Exchange (ETDEWEB)

    Staahlberg, P.; Oesch, P.; Leppaemaeki, E.; Moilanen, A.; Nieminen, M.; Korhonen, J. [VTT Energy, Espoo (Finland)

    1996-12-01

    The reliability of sampling methods used for measuring impurities contained in gasification gas was studied, and new methods were developed for sampling and sample analysis. The aim of the method development was to improve the representativeness of the samples and to speed up the analysis of gas composition. The study focused on tar, nitrogen and sulphur compounds contained in the gasification gas. In the study of sampling reliability, the effects on gas samples drawn from the process of probe and sampling-line materials suitable for high temperatures, and of solids deposited in the sampling devices, were studied. Measurements were carried out in the temperature range of 250 - 850 deg C, both in real conditions and in conditions simulating gasification gas. The durability of samples during storage was also studied. The other main aim of the study was to increase the number of quickly measurable gas components by developing on-line analytical methods based on GC, FTIR and FI (flow injection) techniques for the measurement of nitrogen and sulphur compounds in gasification gas. As these methods are suitable only for gases that do not contain condensing components that disturb the operation of the analysers (heavy tar compounds, water), a sampling system operating on the dilution principle was developed. The system operates at high pressures and temperatures and is suitable for gasification gases containing heavy tar compounds. The capability of analysing heavy tar compounds (molecular weight >200 g/mol) was improved by increasing the number of compounds identified and calibrated with model substances, and by developing analytical methods based on high-temperature GC analysis and the thermogravimetric method. (author)

  12. Remote sampling system in reprocessing: present and future perspective

    International Nuclear Information System (INIS)

    Garcha, J.S.; Balakrishnan, V.P.; Rao, M.K.

    1990-01-01

    For the process and inventory control of reprocessing plant operation, it is essential to analyse samples from the various process vessels to assess plant performance and, if needed, take corrective action on the operating parameters. In view of the very high radioactive inventory in the plant, these plants are operated remotely behind thick shielding. The liquid sampling also has to be carried out by remote techniques, as no direct approach is feasible. A vacuum-assisted air-lift method is employed to obtain samples from remotely located process vessels. A brief description of the present technique, the design criteria, and the various interlocks and manual operations involved in sampling and despatching samples to the analytical laboratory is given in the paper. A design approach for making the sampling system a fully automated remote operation is also attempted. Utilisation of custom-built robots and a dedicated computer for the various operations and interlocks has been envisaged to ensure a completely remotised system for adoption in future plants. (author). 2 figs., 2 tabs

  13. SCADA based radioactive sample bottle delivery system for fuel reprocessing project

    International Nuclear Information System (INIS)

    Kaushik, Subrat; Munj, Niket; Chauhan, R.K.; Kumar, Pramod; Mishra, A.C.

    2011-01-01

    Radioactive samples of process streams need to be analyzed in a centralized control lab to measure the concentration of heavy elements as well as activity at various stages of reprocessing plants. The sample is taken remotely from biologically shielded process cells through sampling blisters into sample bottles. These are then transferred to the control lab, located about 50 meters away, using a vacuum transfer system. The bottle movement is tracked from origin to destination in a rich-HMI SCADA system using infrared non-contact proximity sensors located along the sampling line; these sensors are connected to a PLC in a fail-safe mode. The sample bottle travels at a speed of 10 m/s under vacuum motive force, and the detection time is of the order of 1 ms. Flow meters have been used to measure the air flow in the sampling line

  14. Industrial variographic analysis for continuous sampling system validation

    DEFF Research Database (Denmark)

    Engström, Karin; Esbensen, Kim Harry

    2017-01-01

    Karin Engström, LKAB mining, Kiruna, Sweden, continues to present illuminative cases from the process industries. Here she reveals more from her ongoing PhD project, showing the application of variographic characterisation for on-line continuous control of process sampling systems, including the one...
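
Variographic characterisation rests on the experimental variogram of a 1-D process data series, V(j) = Σ (h_{i+j} − h_i)² / (2(N − j)). A minimal sketch, using a synthetic series (trend plus a 24-step cycle) purely for illustration:

```python
import math

def variogram(h, max_lag):
    """Experimental variogram of a 1-D process data series h."""
    N = len(h)
    return [sum((h[i + j] - h[i]) ** 2 for i in range(N - j)) / (2.0 * (N - j))
            for j in range(1, max_lag + 1)]

# synthetic process data: a slow trend plus a 24-step cycle, for illustration only
data = [10 + 0.01 * t + math.sin(2 * math.pi * t / 24) for t in range(200)]
v = variogram(data, 48)
print(v[23] < v[11])   # lag 24 (one full cycle) shows a variogram minimum: True
```

In industrial use, the nugget (the variogram extrapolated to lag zero) estimates the minimum possible sampling-plus-analysis error, which is what makes the variogram useful for validating a continuous sampling system.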

  15. Testing of a Microfluidic Sampling System for High Temperature Electrochemical MC&A

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Candido [Argonne National Lab. (ANL), Argonne, IL (United States); Nichols, Kevin [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-11-27

    This report describes the preliminary validation of a high-temperature microfluidic chip system for sampling of electrochemical process salt. Electroanalytical and spectroscopic techniques are attractive candidates for improvement through high-throughput sample analysis via miniaturization. Further, microfluidic chip systems are amenable to micro-scale chemical processing, such as rapid, automated sample purification to improve sensor performance. The microfluidic chip was tested to determine the feasibility of the system for high-temperature applications and the conditions under which microfluidic systems can be used to generate salt droplets at process temperature, to support development of material balance and control systems in a used fuel treatment facility. In FY13, the project focused on testing a quartz microchip device with molten salts at near-process temperatures. The equipment was installed in a glove box and tested up to 400°C using commercial thermal transfer fluids as the carrier phase. Preliminary tests were carried out with a low-melting halide salt to initially characterize the properties of this novel liquid-liquid system and to investigate the operating regimes for inducing droplet flow within candidate carrier fluids. Initial results show that the concept is viable for high-temperature sampling, but further development is required to optimize the system to operate with process-relevant molten salts.

  16. Sampled-data models for linear and nonlinear systems

    CERN Document Server

    Yuz, Juan I

    2014-01-01

    Sampled-data Models for Linear and Nonlinear Systems provides a fresh new look at a subject with which many researchers may think themselves familiar. Rather than emphasising the differences between sampled-data and continuous-time systems, the authors proceed from the premise that, with modern sampling rates being as high as they are, it is becoming more appropriate to emphasise connections and similarities. The text is driven by three motives:
    · the ubiquity of computers in modern control and signal-processing equipment means that sampling of systems that really evolve continuously is unavoidable;
    · although superficially straightforward, sampling can easily produce erroneous results when not treated properly; and
    · the need for a thorough understanding of many aspects of sampling among researchers and engineers dealing with applications to which they are central.
    The authors tackle many misconceptions which, although appearing reasonable at first sight, are in fact either p...
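
For a linear system driven through a zero-order hold, the sampled-data model is exact, not an approximation. A minimal sketch for the scalar case dx/dt = a·x + b·u (a standard result, shown here as an illustration rather than anything specific to the book):

```python
import math

def zoh_discretize(a, b, T):
    """Exact zero-order-hold discretization of the scalar system dx/dt = a*x + b*u,
    giving x[k+1] = ad*x[k] + bd*u[k]."""
    ad = math.exp(a * T)
    bd = (ad - 1.0) / a * b
    return ad, bd

# fast sampling (small T) recovers the Euler approximation ad ~ 1 + a*T, bd ~ b*T
ad, bd = zoh_discretize(a=-2.0, b=1.0, T=0.001)
print(abs(ad - (1 - 2.0 * 0.001)) < 1e-5, abs(bd - 0.001) < 1e-5)   # True True
```

The gap between the exact model and the Euler shortcut is exactly the kind of sampling subtlety the book warns about: negligible at high rates, but a real source of error when T is not small relative to the system dynamics.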

  17. The development of neutron activation, sample transportation and γ-ray counting routine system for numbers of geological samples

    International Nuclear Information System (INIS)

    Shibata Shin-nosuke; Tanaka, Tsuyoshi; Minami, Masayo

    2001-01-01

    A new gamma-ray counting and data processing system for non-destructive neutron activation analysis has been set up in the Radioisotope Center at Nagoya University. The system carries out gamma-ray counting, sample changing and data processing automatically, freeing the analyst from many of the complicated operations in INAA. In this study, we have arranged a simple analytical procedure that makes practical work easier than before. The concrete workflow is described, from the preparation of powder rock samples to gamma-ray counting and data processing with the new INAA system. The procedure was then evaluated for speed and accuracy using two Geological Survey of Japan rock reference samples, JB-1a and JG-1a, with two United States Geological Survey reference samples, BCR-1 and G-2, used as standards. Twenty-two elements for JB-1a and 25 elements for JG-1a were analyzed; the uncertainties are <5% for Na, Sc, Fe, Co, La, Ce, Sm, Eu, Yb, Lu, Hf, Ta and Th, and <10% for Cr, Zn, Cs, Ba, Nd, Tb and U. This system will enable us to analyze more than 1500 geological samples per year. (author)
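
When standards are irradiated alongside the unknowns, as here, concentrations follow from the comparator (relative) method: the sample's decay-corrected peak area per unit mass, scaled by the standard's. A minimal sketch with purely illustrative numbers:

```python
def inaa_concentration(a_sam, m_sam, a_std, m_std, c_std):
    """Comparator-method INAA: element concentration in the sample from
    decay-corrected gamma peak areas (a), masses (m), and the standard's
    known concentration c_std."""
    return c_std * (a_sam / m_sam) / (a_std / m_std)

# illustrative numbers only: equal masses, sample peak 4% stronger than standard
c = inaa_concentration(a_sam=5200, m_sam=0.1, a_std=5000, m_std=0.1, c_std=25.0)
print(round(c, 1))   # 26.0
```

In practice the peak areas must first be corrected to a common decay time and irradiation/counting geometry; that correction, not the arithmetic above, is where an automated counting and data processing system earns its keep.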

  18. The Safeguards analysis applied to the RRP. Automatic sampling authentication system

    International Nuclear Information System (INIS)

    Ono, Sawako; Nakashima, Shinichi; Iwamoto, Tomonori

    2004-01-01

    The sampling for analysis from vessels and columns at the Rokkasho Reprocessing Plant (RRP) is performed mostly by automatic sampling systems. The safeguards samples for verification will also be taken using these sampling systems and transferred to the OSL through the pneumatic transfer network owned and controlled by the operator. In order to maintain sample integrity and continuity of knowledge (CoK) throughout sample processing, it is essential to develop and establish authentication measures for the automatic sampling system, including the transfer network. We have developed the Automatic Sampling Authentication System (ASAS) in consultation with the IAEA. This paper describes the structure, function and concept of ASAS. (author)

  19. Improved mixing and sampling systems for vitrification melter feeds

    International Nuclear Information System (INIS)

    Ebadian, M.A.

    1998-01-01

    This report summarizes the methods used and results obtained during the study of waste slurry mixing and sampling systems in fiscal year 1997 (FY97) at the Hemispheric Center for Environmental Technology (HCET) at Florida International University (FIU). The objective of this work is to determine optimal mixing configurations and operating conditions, as well as improved sampling technology, for defense waste processing facility (DWPF) waste melter feeds at US Department of Energy (DOE) sites. Most of the research on this project was performed experimentally, using a tank mixing configuration with different rotating impellers. The slurry simulants for the experiments were prepared in-house based on the properties of typical waste slurries at the DOE sites. A sampling system was designed to withdraw slurry from the mixing tank. To obtain insight into the waste mixing process, the slurry flow in the mixing tank was also simulated numerically by applying computational fluid dynamics (CFD) methods. The major parameters investigated in both the experimental and numerical studies included mixer power consumption, mixing time to reach slurry uniformity, slurry type, solids concentration, impeller type, impeller size, impeller rotating speed, sampling tube size, and sampling velocity. Application of the results to the DWPF melter feed preparation process will enhance and modify the technical base for designing slurry transportation equipment and pipeline systems. These results will also serve as an important reference for improving waste slurry mixing performance and melter operating conditions. These factors will contribute to an increase in the capability of the vitrification process and the quality of the waste glass.

  20. Sampling phased array a new technique for signal processing and ultrasonic imaging

    OpenAIRE

    Bulavinov, A.; Joneit, D.; Kröning, M.; Bernus, L.; Dalichow, M.H.; Reddy, K.M.

    2006-01-01

    Different signal processing and image reconstruction techniques are applied in ultrasonic non-destructive material evaluation. In recent years, rapid development in the fields of microelectronics and computer engineering has led to wide application of phased array systems. A new phased array technique, called "Sampling Phased Array", has been developed at the Fraunhofer Institute for non-destructive testing. It realizes a unique approach to the measurement and processing of ultrasonic signals. The sampling...

  1. Rapid DNA analysis for automated processing and interpretation of low DNA content samples.

    Science.gov (United States)

    Turingan, Rosemary S; Vasantgadkar, Sameer; Palombo, Luke; Hogan, Catherine; Jiang, Hua; Tan, Eugene; Selden, Richard F

    2016-01-01

    Short tandem repeat (STR) analysis of casework samples with low DNA content includes those resulting from the transfer of epithelial cells from the skin to an object (e.g., cells on a water bottle, or the brim of a cap), blood spatter stains, and small bone and tissue fragments. Low DNA content (LDC) samples are important in a wide range of settings, including disaster response teams to assist in victim identification and family reunification, military operations to identify friend or foe, criminal forensics to identify suspects and exonerate the innocent, and medical examiner and coroner offices to identify missing persons. Processing LDC samples requires experienced laboratory personnel, isolated workstations, sophisticated equipment, and transport time, and involves complex procedures. We present a rapid DNA analysis system designed specifically to generate STR profiles from LDC samples in field-forward settings by non-technical operators. By performing STR analysis in the field, close to the site of collection, rapid DNA analysis has the potential to increase throughput and to provide actionable information in real time. A Low DNA Content BioChipSet (LDC BCS) was developed and manufactured by injection molding. It was designed to function in the fully integrated Accelerated Nuclear DNA Equipment (ANDE) instrument previously designed for analysis of buccal swabs and other high DNA content samples (Investigative Genet. 4(1):1-15, 2013). The LDC BCS performs efficient DNA purification followed by microfluidic ultrafiltration of the purified DNA, maximizing the quantity of DNA available for subsequent amplification and electrophoretic separation and detection of amplified fragments. The system demonstrates accuracy, precision, resolution, signal strength, and peak height ratios appropriate for casework analysis. The LDC rapid DNA analysis system is effective for the generation of STR profiles from a wide range of sample types.
The technology broadens the range of sample

  2. Design development of robotic system for on line sampling in fuel reprocessing

    International Nuclear Information System (INIS)

    Balasubramanian, G.R.; Venugopal, P.R.; Padmashali, G.K.

    1990-01-01

    This presentation describes the design and development work being carried out on an automated sampling system for fast-reactor fuel reprocessing plants. The plant proposes to use an integrated sampling system. The sample is taken across regular process streams from any intermediate hold-up pot. A robot system is planned to take the sample from the sample pot, transfer it to a sample bottle, cap the bottle and transfer the bottle to a pneumatic conveying station. The system covers a large number of sample pots. Alternative automated systems are also examined (1). (author). 4 refs., 2 figs

  3. Double Shell Tank (DST) Process Waste Sampling Subsystem Definition Report

    International Nuclear Information System (INIS)

    RASMUSSEN, J.H.

    2000-01-01

    This report defines the Double-Shell Tank (DST) Process Waste Sampling Subsystem (PWSS). This subsystem definition report fully describes and identifies the system boundaries of the PWSS. This definition provides a basis for developing functional, performance, and test requirements (i.e., subsystem specification), as necessary, for the PWSS. The resultant PWSS specification will include the sampling requirements to support the transfer of waste from the DSTs to the Privatization Contractor during Phase 1 of Waste Feed Delivery

  4. 291-B-1 stack monitoring and sampling system annual system assessment report

    International Nuclear Information System (INIS)

    Ridge, T.M.

    1994-01-01

    The B Plant 291-B-1 main stack exhausts gaseous effluents to the atmosphere from the 221-B Building canyon and cells, the No. 1 Vessel Ventilation System (VVS1), the 212-B Cask Station cell ventilation system, and, to a limited capacity, the 224-B Building. VVS1 collects offgases from various process tanks in 221-B Building, while the 224-B system maintains a negative pressure in out-of-service, sealed process tanks. B Plant Administration Manual, WHC-CM-7-5, Section 5.30 requires an annual system assessment to evaluate and report the present condition of the sampling and monitoring system associated with Stack 291-B-1 (System Number B977A) at B Plant. The system is functional and performing satisfactorily

  5. Static sampling of dynamic processes - a paradox?

    Science.gov (United States)

    Mälicke, Mirko; Neuper, Malte; Jackisch, Conrad; Hassler, Sibylle; Zehe, Erwin

    2017-04-01

    Environmental systems monitoring aims, at its core, at the detection of spatio-temporal patterns of processes and system states, which is a pre-requisite for understanding and explaining their baffling heterogeneity. Most observation networks rely on distributed point sampling of states and fluxes of interest, which is combined with proxy variables from either remote sensing or near-surface geophysics. The cardinal question of the appropriate experimental design of such a monitoring network has up to now been answered in many different ways. Suggested approaches range from sampling in a dense regular grid using the so-called green machine, transects along typical catenas, clustering of several observation sensors in presumed functional units or HRUs, arrangements of those clusters along presumed lateral flow paths, and, last but not least, a nested, randomized, stratified arrangement of sensors or samples. Common to all these approaches is that they provide a rather static spatial sampling, while state variables and their spatial covariance structure change dynamically in time. It is hence of key interest how much of our still incomplete understanding stems from inappropriate sampling and how much needs to be attributed to an inappropriate analysis of spatial data sets. We suggest that it is much more promising to analyze the spatial variability of processes, for instance changes in soil moisture values, than to investigate the spatial variability of soil moisture states themselves. This is because wetting of the soil, reflected in a soil moisture increase, is caused by a totally different meteorological driver - rainfall - than drying of the soil. We hence propose that the rising and the falling limbs of soil moisture time series belong essentially to different ensembles, as they are influenced by different drivers. Positive and negative temporal changes in soil moisture need, hence, to be analyzed separately. We test this idea using the CAOS data set as a benchmark.
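
The proposed separation of rising and falling limbs amounts to splitting the increment series of each soil moisture record by sign before any variability analysis. A minimal sketch (the sample series is invented for illustration):

```python
def split_limbs(theta):
    """Separate a soil-moisture series into wetting (rising) and drying (falling)
    increments, to be analyzed as distinct ensembles."""
    d = [b - a for a, b in zip(theta, theta[1:])]
    wetting = [x for x in d if x > 0]   # driven by rainfall
    drying = [x for x in d if x < 0]    # driven by evapotranspiration/drainage
    return wetting, drying

# invented volumetric soil moisture series: two wetting pulses, slow drying
theta = [0.20, 0.35, 0.33, 0.31, 0.30, 0.42, 0.40]
wet, dry = split_limbs(theta)
print(len(wet), len(dry))   # 2 4
```

Spatial statistics (e.g. variograms) computed separately on the `wetting` and `drying` ensembles then reflect the distinct drivers, rather than mixing them as a statistic on the raw states would.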

  6. Disc valve for sampling erosive process streams

    Science.gov (United States)

    Mrochek, J.E.; Dinsmore, S.R.; Chandler, E.W.

    1986-01-07

    A four-port disc valve is described for sampling erosive, high-temperature process streams. A rotatable disc defining opposed first and second sampling cavities rotates between fixed faceplates defining flow passageways positioned to be alternately in axial alignment with the first and second cavities. Silicon carbide inserts and liners composed of alpha silicon carbide are provided in the faceplates and in the sampling cavities to limit erosion while providing lubricity for smooth and precise operation under harsh process conditions. 1 fig.

  7. Special study for the manual transfer of process samples from CPP [Chemical Processing Plant] 601 to RAL [Remote Analytical Laboratory]

    International Nuclear Information System (INIS)

    Marts, D.J.

    1987-05-01

    A study of alternate methods to manually transport radioactive samples from their glove boxes to the Remote Analytical Laboratory (RAL) was conducted at the Idaho National Engineering Laboratory. The study was performed to mitigate the effects of a potential loss of sampling capability that could take place if a malfunction in the Pneumatic Transfer System (PTS) occurred. Samples are required to be taken from the cell glove boxes and analyzed at the RAL regardless of the operational status of the PTS. This paper documents the conclusions of the study and how a decision was reached on the best handling scenarios for manually transporting 15 mL vials of liquid process samples from the K, W, U, WG, or WH cell glove boxes in the Chemical Processing Plant (CPP) 601 to the RAL. This study of methods to manually remove the samples from the glove boxes, package them for safe shipment, transport them by the safest route, receive them at the RAL, and safely unload them was conducted by EG and G Idaho, Inc., for Westinghouse Idaho Nuclear Company as part of the Glove Box Sampling and Transfer System Project for the Fuel Processing Facilities Upgrade, Task 10, Subtask 2. The study focused on the safest and most reliable scenarios that could be implemented using existing equipment. Hardware modifications and new hardware proposals were identified, and their impact on the handling scenario was evaluated. A conclusion was reached that, by utilizing the existing facility hardware, these samples can be safely transported manually from the sample stations in CPP 601 to the RAL, and that additional hardware could facilitate the transportation process even further.

  8. System for Earth Sample Registration SESAR: Services for IGSN Registration and Sample Metadata Management

    Science.gov (United States)

    Chan, S.; Lehnert, K. A.; Coleman, R. J.

    2011-12-01

    through web services. Based on valuable feedback from the user community, we will introduce enhancements that add greater flexibility to the system to accommodate the vast diversity of metadata that users want to store. Users will be able to create custom metadata fields and use these for the samples they register. Users will also be able to group samples into 'collections' to make retrieval for research projects or publications easier. An improved interface design will allow for better workflow transition and navigation throughout the application. In keeping up with the demands of a growing community, SESAR has also made process changes to ensure efficiency in system development. For example, we have implemented a release cycle to better track enhancements and fixes to the system, and an API library that facilitates reusability of code. Usage tracking, metrics and surveys capture information to guide the direction of future developments. A new set of administrative tools allows greater control of system management.

  9. A novel PMT test system based on waveform sampling

    Science.gov (United States)

    Yin, S.; Ma, L.; Ning, Z.; Qian, S.; Wang, Y.; Jiang, X.; Wang, Z.; Yu, B.; Gao, F.; Zhu, Y.; Wang, Z.

    2018-01-01

    Compared with the traditional test system based on a QDC, TDC and scaler, a test system based on waveform sampling was constructed for signal sampling of the 8" R5912 and the 20" R12860 Hamamatsu PMTs in different energy states, from single to multiple photoelectrons. In order to achieve high throughput and to reduce the dead time in data processing, data acquisition software based on LabVIEW was developed, running with a parallel mechanism. The analysis algorithm is realized in LabVIEW, and the spectra of charge, amplitude, signal width and rise time are analyzed offline. The results from the Charge-to-Digital Converter, Time-to-Digital Converter and waveform sampling are discussed in a detailed comparison.
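
The advantage of waveform sampling is that charge, amplitude and timing all come from the same digitized record. A minimal offline sketch of those pulse features (the waveform, sample spacing and baseline are invented; real PMT analysis would also fit the baseline and interpolate the threshold crossings):

```python
def pulse_features(samples, dt_ns, baseline):
    """Charge (integral), amplitude and 10-90% rise time of a negative PMT pulse,
    estimated from raw waveform samples."""
    pulse = [baseline - s for s in samples]     # flip negative pulse to positive
    amp = max(pulse)
    charge = sum(pulse) * dt_ns                 # integral in (V * ns), arbitrary units
    i_peak = pulse.index(amp)
    t10 = next(i for i in range(i_peak + 1) if pulse[i] >= 0.1 * amp)
    t90 = next(i for i in range(i_peak + 1) if pulse[i] >= 0.9 * amp)
    return charge, amp, (t90 - t10) * dt_ns

# invented 1 ns-sampled single-photoelectron-like pulse
wave = [0.0, 0.0, -0.1, -0.5, -1.0, -0.7, -0.3, -0.1, 0.0]
q, a, rise = pulse_features(wave, dt_ns=1.0, baseline=0.0)
print(a, rise)   # 1.0 2.0
```

Histogramming `q` and `a` over many triggers yields the charge and amplitude spectra that the abstract compares against the QDC/TDC results.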

  10. Study on infrasonic characteristics of coal samples in failure process under uniaxial loading

    Directory of Open Access Journals (Sweden)

    Bing Jia

    To study the precursory infrasonic characteristics of coal sample failure, a coal rock stress loading system and an infrasonic wave acquisition system were adopted, and infrasonic tests during uniaxial loading were made on coal samples from the studied area. Wavelet filtering, fast Fourier transform, and relative infrasonic energy methods were used to analyze the characteristics of the infrasonic waves in the loading process, including time-domain characteristics and relative energy. The analysis demonstrated that the frequencies of the infrasonic signals in the loading process mainly lie within 5–10 Hz, significantly different from noise signals. The changes of the infrasonic signals show clear periodic characters in the time domain. Meanwhile, the relative energy changes of the infrasonic wave also show periodic characters, divided into two stages by the yield limit of the coal samples; these are clear and easy to recognize, so they can be used as precursory characteristics for recognizing coal sample failure. Moreover, the infrasonic waves generated by coal samples have low frequency and low attenuation, can be collected without coupling, and can be transmitted over long distances. This study provides important support for further in-situ prediction of coal rock failures. Keywords: Infrasound, Relative energy, Time-frequency analysis, Failure prediction, Identification feature
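
The relative-energy measure used above can be sketched as the fraction of spectral power falling in the 5–10 Hz infrasonic band. The sketch below uses a direct DFT (fine for short records; a real implementation would use an FFT) and an invented 8 Hz test tone:

```python
import math

def band_relative_energy(x, fs, f_lo, f_hi):
    """Relative spectral energy of signal x (sampled at fs Hz) inside [f_lo, f_hi] Hz,
    via a direct DFT; O(N^2), so suitable for short records only."""
    N = len(x)
    energy = total = 0.0
    for k in range(1, N // 2):
        re = sum(x[n] * math.cos(2 * math.pi * k * n / N) for n in range(N))
        im = -sum(x[n] * math.sin(2 * math.pi * k * n / N) for n in range(N))
        p = re * re + im * im
        total += p
        if f_lo <= k * fs / N <= f_hi:
            energy += p
    return energy / total

fs = 100.0                                                        # sampling rate, Hz
sig = [math.sin(2 * math.pi * 8 * n / fs) for n in range(200)]    # 8 Hz test tone
r = band_relative_energy(sig, fs, 5.0, 10.0)
print(r > 0.99)   # nearly all energy in the 5-10 Hz infrasonic band: True
```

Tracking this ratio over successive windows of the loading record is one way the periodic, two-stage energy behaviour described in the abstract could be made quantitative.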

  11. Processing a Complex Architectural Sampling with Meshlab: the Case of Piazza della Signoria

    Science.gov (United States)

    Callieri, M.; Cignoni, P.; Dellepiane, M.; Ranzuglia, G.; Scopigno, R.

    2011-09-01

    The paper presents a recent 3D scanning project performed with long-range scanning technology, showing how a complex sampled dataset can be processed with the features available in MeshLab, an open source mesh processing system. MeshLab is a portable and extensible system aimed at helping the processing of the typical not-so-small unstructured models that arise in 3D scanning, providing a set of tools for editing, cleaning, processing, inspecting, rendering and converting meshes. The MeshLab system started in late 2005 as part of a university course and has considerably evolved since then, thanks to the effort of the Visual Computing Lab and the support of several EC-funded projects. MeshLab has so far gained excellent visibility and distribution, with several thousand downloads every month, and continuous evolution. The aim of this scanning campaign was to sample the façades of the buildings located in Piazza della Signoria (Florence, Italy). This digital 3D model was required, in the framework of a Regional Project, as a basic background model for presenting a complex set of images using a virtual-navigation metaphor (following the PhotoSynth approach). Processing of complex datasets, such as the ones produced by long-range scanners, often requires specialized, difficult-to-use and costly software packages. We show in the paper how it is possible to process this kind of data inside an open source tool, thanks to the many new features recently introduced in MeshLab for the management of large sets of sampled points.

  12. Study on auto-plating process time versus recovery for polonium, Po-210 in environmental sample

    International Nuclear Information System (INIS)

    Jalal Sharib; Zaharudin Ahmad; Abdul Kadir Ishak; Norfaizal Mohamed; Ahmad Sanadi Abu Bakar; Yii Mei Wo; Kamarozaman Ishak; Siti Aminah Yusoff

    2008-08-01

    This study was carried out to evaluate the time effectiveness and recovery of 16 samples from 4 Kuala Muda stations during the auto-plating procedure for determining polonium (Po-210) activity concentration in environmental samples. The study used Kuala Muda sediment as the sample, following the same methodology throughout. The auto-plating process was run for 4, 12, 24 and 30 hours on a silver disc for 4 samples per station, and the discs were then counted for one (1) day using an alpha spectrometry counting system. The objective of this study is to justify the auto-plating duration via its effect on the chemical yield of Po-209. The results showed that recovery increases with time and becomes constant at 24 hours of auto-plating. This means that 24 hours is the optimum time for the auto-plating process for determination of Po-210 activity concentration in environmental samples. (Author)

  13. Note: A simple image processing based fiducial auto-alignment method for sample registration.

    Science.gov (United States)

    Robertson, Wesley D; Porto, Lucas R; Ip, Candice J X; Nantel, Megan K T; Tellkamp, Friedjof; Lu, Yinfei; Miller, R J Dwayne

    2015-08-01

    A simple method for the location and auto-alignment of sample fiducials for sample registration using widely available MATLAB/LabVIEW software is demonstrated. The method is robust, easily implemented, and applicable to a wide variety of experiment types for improved reproducibility and increased setup speed. The software uses image processing to locate and measure the diameter and center point of circular fiducials for distance self-calibration and iterative alignment and can be used with most imaging systems. The method is demonstrated to be fast and reliable in locating and aligning sample fiducials, provided here by a nanofabricated array, with accuracy within the optical resolution of the imaging system. The software was further demonstrated to register, load, and sample the dynamically wetted array.
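
The core image-processing step, locating a circular fiducial's center and diameter, can be sketched without any imaging library: threshold the image, take the centroid of the bright pixels, and derive an equivalent-circle diameter from the pixel area. This is a generic illustration of the idea, not the authors' MATLAB/LabVIEW code; the binary test image is synthetic.

```python
import math

def fiducial_center_diameter(img):
    """Centroid and equivalent-circle diameter (in pixels) of a single bright
    fiducial in a binary image (list of rows of 0/1)."""
    pts = [(x, y) for y, row in enumerate(img) for x, v in enumerate(row) if v]
    n = len(pts)
    cx = sum(p[0] for p in pts) / n
    cy = sum(p[1] for p in pts) / n
    diameter = 2.0 * math.sqrt(n / math.pi)   # from pixel area = pi * (d/2)^2
    return (cx, cy), diameter

# synthetic 7x7 image with a disc of radius 2 centered at (3, 3)
img = [[0] * 7 for _ in range(7)]
for y in range(7):
    for x in range(7):
        if (x - 3) ** 2 + (y - 3) ** 2 <= 4:
            img[y][x] = 1
c, d = fiducial_center_diameter(img)
print(c, round(d, 1))   # (3.0, 3.0) 4.1
```

Measuring two such fiducials a known distance apart gives the pixels-per-millimeter self-calibration, after which the offset of any fiducial from its nominal position is the stage correction to apply iteratively.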

  14. Design compliance matrix waste sample container filling system for nested, fixed-depth sampling system

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    This design compliance matrix document provides specific design-related functional characteristics, constraints, and requirements for the container filling system that is part of the nested, fixed-depth sampling system. This document addresses performance, external interfaces, ALARA, Authorization Basis, environmental, and design code requirements for the container filling system. The container filling system will interface with the waste stream from the fluidic pumping channels of the nested, fixed-depth sampling system and will fill containers with waste that meets the Resource Conservation and Recovery Act (RCRA) criteria for waste containing volatile and semi-volatile organic materials. The specifications for the nested, fixed-depth sampling system are described in a Level 2 Specification document (HNF-3483, Rev. 1). The basis for this design compliance matrix document is the Tank Waste Remediation System (TWRS) desk instructions for design compliance matrix documents (PI-CP-008-00, Rev. 0).

  15. Use the Bar Code System to Improve Accuracy of the Patient and Sample Identification.

    Science.gov (United States)

    Chuang, Shu-Hsia; Yeh, Huy-Pzu; Chi, Kun-Hung; Ku, Hsueh-Chen

    2018-01-01

    Timely and correct sample collection is highly related to patient safety. From January to April 2016, the sample error rate was 11.1%, caused by mislabeled patient information and wrong sample containers. We developed a barcode-based "Specimen Identification System" through process reengineering of the TRM, using bar code scanners, added sample container instructions, and a mobile app. In conclusion, the bar code system improved patient safety and created a greener environment.

  16. Neutron-activation analysis of routine mineral-processing samples

    International Nuclear Information System (INIS)

    Watterson, J.; Eddy, B.; Pearton, D.

    1974-01-01

    Instrumental neutron-activation analysis was applied to a suite of typical mineral-processing samples to establish which elements can be rapidly determined in them by this technique. A total of 35 elements can be determined with precisions (from the counting statistics) ranging from better than 1 per cent to approximately 20 per cent. The elements that can be determined have been tabulated together with the experimental conditions, the precision from the counting statistics, and the estimated number of analyses possible per day. With an automated system, this number can be as high as 150 in the most favourable cases
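    The quoted precisions follow from Poisson counting statistics, where the relative 1-sigma uncertainty of a gamma peak with N counts is 1/sqrt(N). A small illustrative sketch (not part of the paper):

```python
import math

def relative_precision(counts):
    """Relative 1-sigma precision of a Poisson peak area: sqrt(N)/N = 1/sqrt(N)."""
    return 1.0 / math.sqrt(counts)

def counts_for_precision(rel):
    """Counts needed in a peak to reach a target relative precision."""
    return math.ceil(1.0 / rel ** 2)

# ~1 per cent precision needs 10^4 peak counts; ~20 per cent needs only 25,
# which is why weakly activated elements sit at the poor end of the range.
```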

  17. Density meter algorithm and system for estimating sampling/mixing uncertainty

    International Nuclear Information System (INIS)

    Shine, E.P.

    1986-01-01

    The Laboratories Department at the Savannah River Plant (SRP) has installed a six-place density meter with an automatic sampling device. This paper describes the statistical software developed to analyze the density of uranyl nitrate solutions using this automated system. The purpose of this software is twofold: to estimate the sampling/mixing and measurement uncertainties in the process and to provide a measurement control program for the density meter. Non-uniformities in density are analyzed both analytically and graphically. The mean density and its limit of error are estimated. Quality control standards are analyzed concurrently with process samples and used to control the density meter measurement error. The analyses are corrected for concentration due to evaporation of samples waiting to be analyzed. The results of this program have been successful in identifying sampling/mixing problems and controlling the quality of analyses
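    Sampling/mixing and measurement uncertainties of this kind are commonly separated with a balanced one-way variance-components analysis on replicate measurements per sample. The sketch below illustrates that idea; it is an assumption about the statistical approach, not SRP's actual software.

```python
def variance_components(groups):
    """Split total variance into measurement (within-sample) and
    sampling/mixing (between-sample) components.

    groups: list of lists, replicate density readings per sample
    (balanced design).  Returns (measurement_var, sampling_var) from a
    one-way ANOVA: s2_sampling = (MS_between - MS_within) / n_rep.
    """
    k = len(groups)                 # number of samples
    n = len(groups[0])              # replicate measurements per sample
    means = [sum(g) / n for g in groups]
    grand = sum(means) / k
    ms_within = sum((x - m) ** 2
                    for g, m in zip(groups, means) for x in g) / (k * (n - 1))
    ms_between = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    return ms_within, max(0.0, (ms_between - ms_within) / n)

# Three samples, two density readings each (g/mL); the spread between
# sample means far exceeds the spread between replicate readings,
# pointing at a sampling/mixing problem rather than a meter problem.
mv, sv = variance_components([[1.000, 1.002], [1.010, 1.012], [1.020, 1.022]])
```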

  19. Water and steam sampling systems; Provtagningssystem foer vatten och aanga

    Energy Technology Data Exchange (ETDEWEB)

    Hellman, Mats

    2009-10-15

    The supervision of cycle chemistry can be divided into two parts, the sampling system and the chemical analysis. In modern steam generating plants most of the chemical analyses are carried out on-line. The detection limits of these analyzers have been pushed down to the ppt range (parts per trillion); however, the analyses are no more correct than the accuracy of the sampling system. Much attention has been paid to the analyzers and to the statistics used to interpret the results, but the sampling procedures have received far less attention. This report aims to give guidance on the considerations to be made regarding sampling systems. Sampling is necessary since most analyses of the parameters of interest cannot be carried out in-situ on-line in the steam cycle. Today's on-line instruments for pH, conductivity, silica etc. are designed to receive a water sample at a temperature of 10-30 deg C. This means that the sampling system has to extract a representative sample from the process, then transport and cool it down to room temperature without changing the characteristics of the fluid. Research work, standards and other reports can be found in the literature; although they give similar recommendations in most respects, there are some discrepancies that may be confusing. This report covers all parts of the sampling system: Sample points and nozzles; Sample lines; Valves, regulating and on-off; Sample coolers; Temperature, pressure and flow rate control; Cooling water; and Water recovery. On-line analyzers connected to the sampling system are not covered. This report aims to clarify which guidelines are most appropriate among the existing ones. The report should also give guidance on the design of the sampling system in order to achieve representative samples. In addition, the report gives an overview of the fluid mechanics involved in sampling. The target group of this report is owners and operators of steam generators, vendors of power plant equipment, consultants working in

  20. REMOTE IN-CELL SAMPLING IMPROVEMENTS PROGRAM AT THE SAVANNAH RIVER SITE (SRS) DEFENSE WASTE PROCESSING FACILITY (DWPF)

    International Nuclear Information System (INIS)

    Marzolf, A

    2007-01-01

    Remote Systems Engineering (RSE) of the Savannah River National Lab (SRNL), in combination with Defense Waste Processing Facility (DWPF) Engineering and Operations, has evaluated the existing equipment and processes used in the facility sample cells for 'pulling' samples from the radioactive waste stream and performing in-cell equipment repairs/replacements. RSE has designed and tested equipment for improving remote in-cell sampling evolutions and reducing the time required for in-cell maintenance of existing equipment. The equipment within the present process tank sampling system has been in constant use since facility start-up over 17 years ago. At present, the method for taking samples within the sample cells produces excessive maintenance and downtime due to frequent failures of the sampling station equipment and manipulator. The location and orientation of many sampling stations within the sample cells are not conducive to manipulator operation, and the overextension of manipulators required to perform many in-cell operations is a major cause of manipulator failures. To improve sampling operations and reduce downtime due to equipment maintenance, a Portable Sampling Station (PSS), wireless in-cell cameras, and new commercially available sampling technology have been designed, developed and/or adapted, and tested. The uniqueness of the designs, the results of the scoping tests, and the benefits relative to in-cell operation and waste reduction are presented

  1. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process that have been established in Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s were performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time consuming and inefficient, especially for the sample counting and measurement process: the sample needs to be changed and the measurement software set up for every one-hour counting period, and both procedures are performed manually for every sample. Hence, an automatic sample changer (ASC) system consisting of hardware and software has been developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and invoke the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
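    The batch control flow described above can be sketched in Python. The actual software is written in LabVIEW and drives GammaVision; all class and method names below are illustrative stand-ins, not the real APIs.

```python
class MockHardware:
    """Stand-in for both the ASC driver and the spectrometer interface;
    the method names are illustrative assumptions, not real APIs."""
    def __init__(self):
        self.log = []
    def move_to(self, pos):
        self.log.append(("move", pos))
    def clear(self):
        self.log.append(("clear",))
    def acquire(self, seconds):
        self.log.append(("acquire", seconds))
    def save_spectrum(self, name):
        self.log.append(("save", name))
        return name

def run_batch(changer, spectrometer, n_samples, count_time_s=3600):
    """Count n_samples consecutively: move the changer to each position,
    then trigger a blocking acquisition and save the spectrum."""
    results = []
    for pos in range(1, n_samples + 1):
        changer.move_to(pos)                 # place next sample on the detector
        spectrometer.clear()
        spectrometer.acquire(count_time_s)   # returns when counting is done
        results.append(spectrometer.save_spectrum(f"sample_{pos:02d}.spc"))
    return results

hw = MockHardware()
spectra = run_batch(hw, hw, 3, count_time_s=3600)
```

    Replacing the per-sample manual setup with one loop like this is what removes the hourly operator intervention.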

  2. The LITA Drill and Sample Delivery System

    Science.gov (United States)

    Paulsen, G.; Yoon, S.; Zacny, K.; Wettergreeng, D.; Cabrol, N. A.

    2013-12-01

    The Life in the Atacama (LITA) project has a goal of demonstrating autonomous roving, sample acquisition, delivery and analysis operations in the Atacama, Chile. To meet the sample handling requirement, Honeybee Robotics developed a rover-deployed, rotary-percussive, autonomous drill, called the LITA Drill, capable of penetrating to ~80 cm in various formations and of capturing and delivering subsurface samples to a 20-cup carousel. The carousel has a built-in capability to press the samples within each cup and to position target cups underneath instruments for analysis. The drill and sample delivery system had to have mass and power requirements consistent with a flight system: the drill weighs 12 kg and uses less than 100 watts of power to penetrate ~80 cm. The LITA Drill auger has been designed with two distinct stages. The lower part has deep and gently sloping flutes for retaining powdered sample, while the upper section has shallow and steep flutes for preventing borehole collapse and for efficient movement of cuttings and fall-back material out of the hole. The drill uses the so-called 'bite-sampling' approach: samples are taken in short, 5-10 cm bites. To take the first bite, the drill is lowered onto the ground, and upon drilling of the first bite the auger is retracted into an auger tube. The auger and auger tube are then lifted off the ground and positioned next to the carousel. To deposit the sample, the auger is rotated and retracted above the auger tube. The cuttings retained on the flutes are either gravity-fed or brushed off by a passive side brush into the cup. After the sample from the first bite has been deposited, the drill is lowered back into the same hole to take the next bite. This process is repeated until the target depth is reached. Bite sampling is analogous to peck drilling in machining, where a bit is periodically retracted to clear chips.
If there is some fall back into the hole once the auger has cleared the hole, this
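    The bite-sampling sequence can be sketched as a simple plan generator; the bite length below is an illustrative value within the stated 5-10 cm range, not a parameter from the paper.

```python
def bite_plan(target_depth_cm, bite_cm=7.5):
    """Return the successive hole depths for bite sampling: drill one
    short bite, retract fully to deposit the sample at the carousel,
    then return to the bottom of the hole for the next bite, repeating
    until the target depth is reached."""
    depths, depth = [], 0.0
    while depth < target_depth_cm:
        depth = min(depth + bite_cm, target_depth_cm)  # last bite may be short
        depths.append(depth)
    return depths

# An ~80 cm hole with 7.5 cm bites needs 11 drill/retract/deposit cycles
plan = bite_plan(80.0, bite_cm=7.5)
```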

  3. System to determine present elements in oily samples

    International Nuclear Information System (INIS)

    Mendoza G, Y.

    2004-11-01

    The Chemistry Department of the National Institute of Nuclear Investigations of Mexico is dedicated to analyzing samples of oleaginous and other materials to determine which elements of the periodic table are present in them, using the neutron activation analysis (NAA) technique. This technique has been developed to determine major elements in any solid, aqueous, industrial or environmental sample; it basically consists of irradiating a sample with neutrons from the TRIGA Mark III reactor, acquiring the gamma spectra the sample emits, and finally processing that information. The quantification step of the analysis is carried out manually, which requires a great number of calculations. The main objective of this project is the development of software that performs the quantitative NAA for multielemental determination of samples automatically. The work is divided into four chapters. The first chapter briefly presents the history of radioactivity and basic concepts that allow a better understanding of this work. The second chapter explains the NAA used in the sample analysis, describes the process to be carried out, mentions the characteristics of the devices used, and illustrates the process with an example. The third chapter describes the development of the algorithm and the selection of the programming language. The fourth chapter shows the structure of the system, its general mode of operation, the execution of processes, and how results are obtained. Finally, the results produced during the development of this project are presented. (Author)
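    NAA quantification of this kind typically follows the comparator (relative) method, where an unknown is compared against a standard irradiated together with it. The sketch below is a hedged illustration of that calculation; the function and its decay-correction handling are assumptions, not the described system's actual code.

```python
import math

def concentration_comparator(a_sample, m_sample, a_std, m_std, c_std,
                             t_sample_s=0.0, t_std_s=0.0, half_life_s=None):
    """Comparator-method NAA: element concentration in the sample from
    the gamma peak areas (counts) of sample and standard.

    a_*: peak areas; m_*: masses; c_std: known concentration of the
    standard.  If half_life_s is given, both peak areas are decay-
    corrected back to a common reference time using the counting delays
    t_sample_s and t_std_s.
    """
    if half_life_s:
        lam = math.log(2) / half_life_s
        a_sample *= math.exp(lam * t_sample_s)
        a_std *= math.exp(lam * t_std_s)
    return c_std * (a_sample / m_sample) / (a_std / m_std)

# Sample peak twice as large as the standard's, at half the mass:
# four times the specific activity => four times the concentration.
c = concentration_comparator(2000.0, 0.5, 1000.0, 1.0, 10.0)
```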

  4. Quality evaluation of processed clay soil samples.

    Science.gov (United States)

    Steiner-Asiedu, Matilda; Harrison, Obed Akwaa; Vuvor, Frederick; Tano-Debrah, Kwaku

    2016-01-01

    This study assessed the microbial quality of clay samples sold in two major Ghanaian markets. It was a cross-sectional study evaluating processed clay and the effects its consumption has on the nutrition of consumers in the political capital of Ghana. The items examined were processed clay soil samples. Staphylococcus spp. and fecal coliforms, including Klebsiella, Escherichia, Shigella and Enterobacter spp., were isolated from the clay samples. Samples from the Kaneshie market in Accra recorded the highest total viable count (6.5 log cfu/g) and staphylococcal count (5.8 log cfu/g). For fecal coliforms, Madina market samples had the highest count (6.5 log cfu/g) and also recorded the highest levels of yeasts and moulds. For Koforidua, the total viable count was highest in the samples from the Zongo market (6.3 log cfu/g). Central market samples had the highest counts of fecal coliforms (4.6 log cfu/g) and of yeasts and moulds (6.5 log cfu/g). The "Small" market recorded the highest staphylococcal count (6.2 log cfu/g). The water activity of the clay samples was low, ranging between 0.65±0.01 and 0.66±0.00 for samples collected from Koforidua and Accra, respectively. The clay samples were found to contain Klebsiella spp., Escherichia, Enterobacter, Shigella spp., Staphylococcus spp., yeasts and moulds. These have health implications when the clay is consumed.

  5. Planetary Sample Caching System Design Options

    Science.gov (United States)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

    Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: 1) transfer of a raw sample from the tool to the SHEC subsystem, and 2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit change-out as the mechanism for transferring tubes to, and samples in tubes from, the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.

  6. Core sampling system spare parts assessment

    International Nuclear Information System (INIS)

    Walter, E.J.

    1995-01-01

    Soon, there will be 4 independent core sampling systems obtaining samples from the underground tanks. It is desirable that these systems be available for sampling during the next 2 years. This assessment was prepared to evaluate the adequacy of the spare parts identified for the core sampling system and to provide recommendations that may remediate overages or inadequacies of spare parts

  7. Sampling Transition Pathways in Highly Correlated Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, David

    2004-10-20

    This research grant supported my group's efforts to apply and extend the method of transition path sampling that we invented during the late 1990s. This methodology is based upon a statistical mechanics of trajectory space. Traditional statistical mechanics focuses on state space, and with it one can use Monte Carlo methods to facilitate importance sampling of states. With our formulation of a statistical mechanics of trajectory space, we have succeeded in creating algorithms by which importance sampling can be done for dynamical processes. In particular, we are able to study rare but important events without prior knowledge of transition states or mechanisms. In perhaps the most impressive application of transition path sampling, my group combined forces with Michele Parrinello and his coworkers to unravel the dynamics of autoionization of water [5]. This process is the fundamental kinetic step underlying pH. Other applications concern the nature of dynamics far from equilibrium [1, 7], nucleation processes [2], cluster isomerization, melting and dissociation [3, 6], and molecular motors [10]. Research groups throughout the world are adopting transition path sampling. In part this has been the result of our efforts to provide pedagogical presentations of the technique [4, 8, 9], as well as new procedures for interpreting trajectories of complex systems [11].

  8. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim

    2005-01-01

    Sampling errors (sampling variances) can be reduced greatly, however, and sampling biases can be eliminated completely, by respecting a simple set of rules and guidelines provided by the Theory of Sampling (TOS). A systematic approach to the description of process heterogeneity furnishes in-depth knowledge about the specific variability of any 1-D lot

  9. Test processing integrated system (S.I.D.E.X.)

    International Nuclear Information System (INIS)

    Sabas, M.; Oules, H.; Badel, D.

    1969-01-01

    The Test Processing Integrated System is composed mainly of a CAE 9080 (equivalent to the S.D.S. 9300) computer equipped with a 100,000 samples/sec acquisition system. The system is designed for high-speed data acquisition and processing of environment-test data, and also for calculation of structural models. Such a digital approach to data processing has many advantages over conventional methods based on analog instruments. (author)

  10. Calculation of parameter failure probability of thermodynamic system by response surface and importance sampling method

    International Nuclear Information System (INIS)

    Shang Yanlong; Cai Qi; Chen Lisheng; Zhang Yangwei

    2012-01-01

    In this paper, a combined response surface and importance sampling method was applied to calculate the parameter failure probability of a thermodynamic system. A mathematical model was presented for parameter failure of the physical process in the thermodynamic system, from which the combined response surface and importance sampling algorithm was established; the performance degradation model of the components and the simulation process of parameter failure in the physical process of the thermodynamic system were also presented. The parameter failure probability of the purification water system in a nuclear reactor was obtained by the combined method. The results show that the combined method is effective for calculating the parameter failure probability of a thermodynamic system with high dimensionality and non-linear characteristics: it achieves satisfactory precision with less computing time than the direct sampling method while avoiding the drawbacks of the response surface method. (authors)
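    As an illustration of the combined arithmetic, the sketch below, an assumed toy problem rather than the paper's thermodynamic model, fits a quadratic response surface to a few evaluations of a performance function g, locates the approximate design point where the surface crosses zero, and centers an importance-sampling proposal there to estimate the failure probability P(g < 0):

```python
import math
import random

def g(x):
    """'Expensive' performance function (toy); failure when g(x) < 0, x ~ N(0, 1)."""
    return 5.0 - x

def response_surface(xs, ys):
    """Interpolating quadratic surrogate through three (x, y) points
    via Newton divided differences."""
    c0 = ys[0]
    c1 = (ys[1] - ys[0]) / (xs[1] - xs[0])
    c2 = ((ys[2] - ys[1]) / (xs[2] - xs[1]) - c1) / (xs[2] - xs[0])
    return lambda x: c0 + c1 * (x - xs[0]) + c2 * (x - xs[0]) * (x - xs[1])

def bisect_root(f, lo, hi, tol=1e-9):
    """Root of f on [lo, hi] assuming a sign change."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# 1) Build the surrogate from a handful of expensive g-evaluations
xs = [0.0, 3.0, 6.0]
surf = response_surface(xs, [g(x) for x in xs])
x_star = bisect_root(surf, 0.0, 10.0)   # approximate design point

# 2) Importance sampling with a standard-normal proposal shifted to x_star,
#    so failures are no longer rare under the sampling density.
rng = random.Random(0)
n = 20000
acc = 0.0
for _ in range(n):
    x = rng.gauss(x_star, 1.0)
    if g(x) < 0.0:
        # weight = N(0,1) density / N(x_star,1) density at x
        acc += math.exp(-0.5 * x * x + 0.5 * (x - x_star) ** 2)
p_fail = acc / n    # true value here: 1 - Phi(5), about 2.9e-7
```

    Direct Monte Carlo would need on the order of 10^8 samples to see this failure probability at all; the shifted proposal resolves it with 2·10^4.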

  11. Sample processing procedures and radiocarbon dating

    International Nuclear Information System (INIS)

    Svetlik, Ivo; Tomaskova, Lenka; Dreslerova, Dagmar

    2010-01-01

    The article outlines radiocarbon dating routines and highlights the potential and limitations of this method. The authors' institutions have been jointly running a conventional radiocarbon dating laboratory using the international CRL code. A procedure based on the synthesis of benzene is used. Small samples are sent abroad for dating because no AMS instrumentation is available in the Czech Republic so far. Our laboratory plans to introduce routines for the processing of milligram samples and preparation of graphitized targets for AMS

  12. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often limited number of available animals of these models, the establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical
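    The volume estimation by point counting mentioned above rests on the Cavalieri principle: volume is slice thickness times the area associated with one grid point times the total number of grid points hitting the tissue. A short sketch with illustrative numbers (not values from the protocol):

```python
def cavalieri_volume(points_per_slice, slice_thickness_cm, grid_spacing_cm):
    """Cavalieri estimator: V = t * a(p) * sum(P_i), where t is the
    slice thickness, a(p) the area per grid point (spacing squared),
    and P_i the number of grid points hitting the organ on slice i."""
    area_per_point = grid_spacing_cm ** 2
    return slice_thickness_cm * area_per_point * sum(points_per_slice)

# e.g. 5 slices cut at 1 cm, counted with a 0.5 cm point grid
v = cavalieri_volume([40, 52, 60, 49, 33], 1.0, 0.5)   # volume in cm^3
```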

  13. Compact multipurpose sub-sampling and processing of in-situ cores with press (pressurized core sub-sampling and extrusion system)

    Energy Technology Data Exchange (ETDEWEB)

    Anders, E.; Muller, W.H. [Technical Univ. of Berlin, Berlin (Germany). Chair of Continuum Mechanics and Material Theory

    2008-07-01

    Climate change, declining resources and over-consumption result in a need for sustainable resource allocation and habitat conservation, and call for new technologies and prospects for damage containment. In order to increase knowledge of the environment and to define potential hazards, it is necessary to gain an understanding of the deep biosphere. In addition, the benthic conditions of sediment structure and gas hydrates, temperature, pressure and bio-geochemistry must be maintained during the sequences of sampling, retrieval, transfer, storage and downstream analysis. In order to investigate highly unstable gas hydrates, which decompose under pressure and temperature changes, a suite of research technologies has been developed by the Technische Universitat Berlin (TUB), Germany. This includes the pressurized core sub-sampling and extrusion system (PRESS), which was developed in the European Union project HYACE/HYACINTH. The project enabled well-defined sectioning and transfer of drilled pressure cores obtained by a rotary corer and the Fugro pressure corer into transportation and investigation chambers. This paper described HYACINTH pressure coring and the HYACINTH core transfer. Autoclave coring tools and HYACINTH core logging, coring tools, and sub-sampling were also discussed. It was concluded that possible future applications include, but are not limited to, research in shales and other tight formations, carbon dioxide sequestration, oil and gas exploration, coalbed methane, and microbiology of the deep biosphere. To meet the corresponding requirements and to incorporate the experience from previous expeditions, the pressure coring system would need to be redesigned to adapt it to the new applications. 3 refs., 5 figs.

  14. A mixed signal ECG processing platform with an adaptive sampling ADC for portable monitoring applications.

    Science.gov (United States)

    Kim, Hyejung; Van Hoof, Chris; Yazicioglu, Refet Firat

    2011-01-01

    This paper describes a mixed-signal ECG processing platform with a 12-bit ADC architecture that can adapt its sampling rate according to the input signal's rate of change. This enables the sampling of ECG signals at a significantly reduced data rate without loss of information. The presented adaptive sampling scheme reduces the ADC power consumption, enables the processing of ECG signals with lower power consumption, and reduces the power consumption of the radio while streaming the ECG signals. The test results show that running a CWT-based R-peak detection algorithm on the adaptively sampled ECG signals consumes only 45.6 μW and leads to 36% less overall system power consumption.
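    An offline illustration of the adaptive-sampling idea (the paper implements it in the ADC hardware; the send-on-delta scheme and thresholds below are assumptions): keep a sample only when the signal has moved by more than a delta since the last kept sample, with a maximum gap to bound the local rate, so samples cluster around fast features such as R peaks and thin out on flat segments.

```python
def adaptive_sample(signal, delta, max_gap):
    """Return the indices kept by a send-on-delta scheme: a sample is
    kept when it differs from the last kept sample by more than `delta`,
    or when `max_gap` samples have passed since the last kept one."""
    kept = [0]
    for i in range(1, len(signal)):
        if abs(signal[i] - signal[kept[-1]]) > delta or i - kept[-1] >= max_gap:
            kept.append(i)
    return kept

# Flat baseline with one sharp "R peak": kept samples cluster at the peak
sig = [0.0] * 40 + [0.2, 0.6, 1.0, 0.6, 0.2] + [0.0] * 40
idx = adaptive_sample(sig, delta=0.15, max_gap=16)
```

    Here 85 uniform samples reduce to 11 kept ones, with every sample of the peak retained, which is the kind of data-rate reduction that lowers the downstream processing and radio power.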

  15. Systems and methods for laser assisted sample transfer to solution for chemical analysis

    Science.gov (United States)

    Van Berkel, Gary J; Kertesz, Vilmos; Ovchinnikova, Olga S

    2013-08-27

    Systems and methods are described for laser ablation of an analyte from a specimen and capturing of the analyte in a dispensed solvent to form a testing solution. A solvent dispensing and extraction system can form a liquid microjunction with the specimen. The solvent dispensing and extraction system can include a surface sampling probe. The laser beam can be directed through the surface sampling probe. The surface sampling probe can also serve as an atomic force microscopy probe. The surface sampling probe can form a seal with the specimen. The testing solution including the analyte can then be analyzed using an analytical instrument or undergo further processing.

  16. Atmospheric scanning electron microscope system with an open sample chamber: Configuration and applications

    Energy Technology Data Exchange (ETDEWEB)

    Nishiyama, Hidetoshi, E-mail: hinishiy@jeol.co.jp [JEOL Ltd., 3-1-2, Musashino, Akishima, Tokyo 196-8558 (Japan); Koizumi, Mitsuru, E-mail: koizumi@jeol.co.jp [JEOL Technics Ltd., 2-6-38 Musashino, Akishima, Tokyo 196-0021 (Japan); Ogawa, Koji, E-mail: kogawa@jeol.co.jp [JEOL Technics Ltd., 2-6-38 Musashino, Akishima, Tokyo 196-0021 (Japan); Kitamura, Shinich, E-mail: kitamura@jeol.co.jp [JEOL Ltd., 3-1-2, Musashino, Akishima, Tokyo 196-8558 (Japan); Konyuba, Yuji, E-mail: ykonyuub@jeol.co.jp [JEOL Ltd., 3-1-2, Musashino, Akishima, Tokyo 196-8558 (Japan); Watanabe, Yoshiyuki, E-mail: watanabeyoshiy@pref.yamagata.jp [Yamagata Research Institute of Technology, 2-2-1, Matsuei, Yamagata 990-2473 (Japan); Ohbayashi, Norihiko, E-mail: n.ohbayashi@m.tohoku.ac.jp [Laboratory of Membrane Trafficking Mechanisms, Department of Developmental Biology and Neurosciences, Graduate School of Life Sciences, Tohoku University, Aobayama, Aoba-ku, Sendai, Miyagi 980-8578 (Japan); Fukuda, Mitsunori, E-mail: nori@m.tohoku.ac.jp [Laboratory of Membrane Trafficking Mechanisms, Department of Developmental Biology and Neurosciences, Graduate School of Life Sciences, Tohoku University, Aobayama, Aoba-ku, Sendai, Miyagi 980-8578 (Japan); Suga, Mitsuo, E-mail: msuga@jeol.co.jp [JEOL Ltd., 3-1-2, Musashino, Akishima, Tokyo 196-8558 (Japan); Sato, Chikara, E-mail: ti-sato@aist.go.jp [Biomedical Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), 1-1-4, Umezono, Tsukuba 305-8568 (Japan)

    2014-12-15

    An atmospheric scanning electron microscope (ASEM) with an open sample chamber and optical microscope (OM) is described and recent developments are reported. In this ClairScope system, the base of the open sample dish is sealed to the top of the inverted SEM column, allowing the liquid-immersed sample to be observed by OM from above and by SEM from below. The optical axes of the two microscopes are aligned, ensuring that the same sample areas are imaged to realize quasi-simultaneous correlative microscopy in solution. For example, the cathodoluminescence of ZnO particles was directly demonstrated. The improved system has (i) a fully motorized sample stage, (ii) a column protection system in the case of accidental window breakage, and (iii) an OM/SEM operation system controlled by a graphical user interface. The open sample chamber allows the external administration of reagents during sample observation. We monitored the influence of added NaCl on the random motion of silica particles in liquid. Further, using fluorescence as a transfection marker, the effect of small interfering RNA-mediated knockdown of endogenous Varp on Tyrp1 trafficking in melanocytes was examined. A temperature-regulated titanium ASEM dish allowed the dynamic observation of colloidal silver nanoparticles as they were heated to 240 °C and sintered. - Highlights: • Atmospheric SEM (ASEM) allows observation of samples in liquid or gas. • Open sample chamber allows in situ monitoring of evaporation and sintering processes. • in situ monitoring of processes during reagent administration is also accomplished. • Protection system for film breakage is developed for ASEM. • Usability of ASEM has been improved significantly including GUI control.

  17. Atmospheric scanning electron microscope system with an open sample chamber: Configuration and applications

    International Nuclear Information System (INIS)

    Nishiyama, Hidetoshi; Koizumi, Mitsuru; Ogawa, Koji; Kitamura, Shinich; Konyuba, Yuji; Watanabe, Yoshiyuki; Ohbayashi, Norihiko; Fukuda, Mitsunori; Suga, Mitsuo; Sato, Chikara

    2014-01-01

    An atmospheric scanning electron microscope (ASEM) with an open sample chamber and optical microscope (OM) is described and recent developments are reported. In this ClairScope system, the base of the open sample dish is sealed to the top of the inverted SEM column, allowing the liquid-immersed sample to be observed by OM from above and by SEM from below. The optical axes of the two microscopes are aligned, ensuring that the same sample areas are imaged to realize quasi-simultaneous correlative microscopy in solution. For example, the cathodoluminescence of ZnO particles was directly demonstrated. The improved system has (i) a fully motorized sample stage, (ii) a column protection system in the case of accidental window breakage, and (iii) an OM/SEM operation system controlled by a graphical user interface. The open sample chamber allows the external administration of reagents during sample observation. We monitored the influence of added NaCl on the random motion of silica particles in liquid. Further, using fluorescence as a transfection marker, the effect of small interfering RNA-mediated knockdown of endogenous Varp on Tyrp1 trafficking in melanocytes was examined. A temperature-regulated titanium ASEM dish allowed the dynamic observation of colloidal silver nanoparticles as they were heated to 240 °C and sintered. - Highlights: • Atmospheric SEM (ASEM) allows observation of samples in liquid or gas. • Open sample chamber allows in situ monitoring of evaporation and sintering processes. • in situ monitoring of processes during reagent administration is also accomplished. • Protection system for film breakage is developed for ASEM. • Usability of ASEM has been improved significantly including GUI control

  18. Output Information Based Fault-Tolerant Iterative Learning Control for Dual-Rate Sampling Process with Disturbances and Output Delay

    Directory of Open Access Journals (Sweden)

    Hongfeng Tao

    2018-01-01

    Full Text Available For a class of single-input single-output (SISO) dual-rate sampling processes with disturbances and output delay, this paper presents a robust fault-tolerant iterative learning control algorithm based on output information. Firstly, the dual-rate sampling process with output delay is transformed into a delay-free discrete state-space model at the slow sampling rate by using lifting technology; then an output-information-based fault-tolerant iterative learning control scheme is designed, and the control process is turned into an equivalent two-dimensional (2D) repetitive process. Moreover, based on repetitive process stability theory, sufficient conditions for the stability of the system and the design method of the robust controller are given in terms of the linear matrix inequality (LMI) technique. Finally, the flow control simulations of two flow tanks in series demonstrate the feasibility and effectiveness of the proposed method.

  19. System for Packaging Planetary Samples for Return to Earth

    Science.gov (United States)

    Badescu, Mircea; Bar-Cohen, Yoseph; Backes, Paul G.; Sherrit, Stewart; Bao, Xiaoqi; Scott, James S.

    2010-01-01

    A system is proposed for packaging material samples on a remote planet (especially Mars) in sealed sample tubes in preparation for later return to Earth. The sample tubes (Figure 1) would comprise (1) tubes initially having open tops and closed bottoms; (2) small, bellows-like collapsible bodies inside the tubes at their bottoms; and (3) plugs to be eventually used to close the tops of the tubes. The top inner surface of each tube would be coated with solder. The side of each plug, which would fit snugly into a tube, would feature a solder-filled ring groove. The system would include equipment for storing, manipulating, filling, and sealing the tubes. The containerization system (see Figure 2) would be organized in stations: a storage station, a loading station, and a heating station. These stations could be arranged in a circular or linear pattern to minimize manipulator complexity, allowing for a compact, mass-efficient design. A simple manipulator arm moves the sample tube between stations. The storage station holds the unloaded sample tubes and plugs before sealing, as well as the sealed sample tubes after loading; its chambers also allow for plug insertion into the sample tube. At the loading station the sample is poured or inserted into the sample tube and the tube is topped off. At the heating station the plug is heated so the solder ring melts and seals the plug to the sample tube. The process is performed as follows: Each tube is filled or slightly overfilled with sample material and the excess is wiped off the top. Then the plug is inserted into the top section of the tube, packing the sample material against the collapsible bellows-like body, which accommodates the sample volume. The plug and the top of the tube are heated momentarily to melt the solder and seal the tube.

  20. A novel atmospheric tritium sampling system

    Science.gov (United States)

    Qin, Lailai; Xia, Zhenghai; Gu, Shaozhong; Zhang, Dongxun; Bao, Guangliang; Han, Xingbo; Ma, Yuhua; Deng, Ke; Liu, Jiayu; Zhang, Qin; Ma, Zhaowei; Yang, Guo; Liu, Wei; Liu, Guimin

    2018-06-01

    The health hazard of tritium depends on its chemical form, so sampling different chemical forms of tritium simultaneously is significant. Here a novel atmospheric tritium sampling system (TS-212) was developed to collect tritiated water (HTO), tritiated hydrogen (HT) and tritiated methane (CH3T) simultaneously. It consists of an air inlet system, three parallel sampling channels, a hydrogen supply module, a methane supply module and a remote control system. It operates at air flow rates of 1 L/min to 5 L/min, with the catalyst furnace at 200 °C for HT sampling and 400 °C for CH3T sampling. Conversion rates of both HT and CH3T to HTO were larger than 99%. The collection efficiency of the two-stage trap sets for HTO was larger than 96% over a 12 h working time without blockage. Therefore, the collection efficiencies of TS-212 are larger than 95% for tritium in its different chemical forms in the environment. In addition, the remote control system makes sampling more intelligent, reducing the operator's workload. Based on the performance parameters described above, the TS-212 can be used to sample atmospheric tritium in its different chemical forms.
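The overall efficiency figure quoted above follows from multiplying the stage efficiencies; a minimal sketch of that arithmetic (the numbers are the abstract's quoted bounds, not new measurements):

```python
def overall_efficiency(conversion: float, collection: float) -> float:
    """Overall sampling efficiency as the product of the two stage efficiencies:
    catalytic conversion to HTO, then two-stage trap collection."""
    return conversion * collection

# Bounds quoted in the abstract: >99% conversion of HT/CH3T to HTO,
# >96% trap collection efficiency for HTO.
eff = overall_efficiency(0.99, 0.96)
print(eff)  # ~0.95, consistent with the reported >95% overall efficiency
```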

  1. Probabilistic Round Trip Contamination Analysis of a Mars Sample Acquisition and Handling Process Using Markovian Decompositions

    Science.gov (United States)

    Hudson, Nicolas; Lin, Ying; Barengoltz, Jack

    2010-01-01

    A method is developed for evaluating the probability of a Viable Earth Microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission. A scenario is analyzed in which multiple core samples would be acquired using a rotary percussive coring tool deployed from an arm on a MER-class rover. The analysis is structured by decomposing the sample acquisition and handling process into a series of discrete time steps, and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. By defining the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, and makes the analysis tractable by breaking the process down into small analyzable steps.
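The step-wise decomposition described above can be sketched as repeated application of per-component release probabilities and a transport matrix to an expected-count vector; the component names and all numbers below are illustrative placeholders, not values from the mission analysis:

```python
import numpy as np

# Hypothetical components: [coring tool, arm, sample tube]
n0 = np.array([10.0, 5.0, 0.0])         # expected VEMs per component at step 0
release = np.array([0.01, 0.005, 0.0])  # per-step release probability

# T[i, j]: probability that a VEM released from component i lands on j
# (rows need not sum to 1; the remainder is lost to the environment)
T = np.array([[0.0, 0.2, 0.1],
              [0.1, 0.0, 0.05],
              [0.0, 0.0, 0.0]])

def step(n):
    """One sampling step of the Markov chain: VEMs released from each
    component are redistributed according to the transport matrix."""
    released = n * release
    return n - released + released @ T

n = n0
for _ in range(3):  # three discrete sampling steps
    n = step(n)
print(n[2])  # expected VEM count that has reached the sample tube
```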

  2. Ultra-High-Throughput Sample Preparation System for Lymphocyte Immunophenotyping Point-of-Care Diagnostics.

    Science.gov (United States)

    Walsh, David I; Murthy, Shashi K; Russom, Aman

    2016-10-01

    Point-of-care (POC) microfluidic devices often lack the integration of common sample preparation steps, such as preconcentration, which can limit their utility in the field. In this technology brief, we describe a system that combines the necessary sample preparation methods to perform sample-to-result analysis of large-volume (20 mL) biopsy model samples with staining of captured cells. Our platform combines centrifugal-paper microfluidic filtration and an analysis system to process large, dilute biological samples. Utilizing commercialization-friendly manufacturing methods and materials, yielding a sample throughput of 20 mL/min, and allowing for on-chip staining and imaging bring together a practical, yet powerful approach to microfluidic diagnostics of large, dilute samples. © 2016 Society for Laboratory Automation and Screening.

  3. On-line sample processing methods in flow analysis

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald

    2008-01-01

    In this chapter, the state of the art of flow injection and related approaches thereof for automation and miniaturization of sample processing regardless of the aggregate state of the sample medium is overviewed. The potential of the various generations of flow injection for implementation of in

  4. A multi-probe thermophoretic soot sampling system for high-pressure diffusion flames

    Energy Technology Data Exchange (ETDEWEB)

    Vargas, Alex M.; Gülder, Ömer L. [Institute for Aerospace Studies, University of Toronto, Toronto, Ontario M3H 5T6 (Canada)

    2016-05-15

    Optical diagnostics and physical probing of the soot processes in high pressure combustion pose challenges that are not faced in atmospheric flames. One of the preferred methods of studying soot in atmospheric flames is in situ thermophoretic sampling followed by transmission electron microscopy imaging and analysis for soot sizing and morphology. The application of this method of sampling to high pressures has been held back by various operational and mechanical problems. In this work, we describe a rotating disk multi-probe thermophoretic soot sampling system, driven by a microstepping stepper motor, fitted into a high-pressure chamber capable of producing sooting laminar diffusion flames up to 100 atm. Innovative aspects of the sampling system design include an easy and precise control of the sampling time down to 2.6 ms, avoidance of the drawbacks of the pneumatic drivers used in conventional thermophoretic sampling systems, and the capability to collect ten consecutive samples in a single experimental run. Proof of principle experiments were performed using this system in a laminar diffusion flame of methane, and primary soot diameter distributions at various pressures up to 10 atm were determined. High-speed images of the flame during thermophoretic sampling were recorded to assess the influence of probe intrusion on the flow field of the flame.

  5. Application of a Dual-Arm Robot in Complex Sample Preparation and Measurement Processes.

    Science.gov (United States)

    Fleischer, Heidi; Drews, Robert Ralf; Janson, Jessica; Chinna Patlolla, Bharath Reddy; Chu, Xianghua; Klos, Michael; Thurow, Kerstin

    2016-10-01

    Automation systems with applied robotics have already been established in industrial applications for many years. In the field of life sciences, a comparable high level of automation can be found in the areas of bioscreening and high-throughput screening. Strong deficits still exist in the development of flexible and universal fully automated systems in the field of analytical measurement. Reasons are the heterogeneous processes with complex structures, which include sample preparation and transport, analytical measurements using complex sensor systems, and suitable data analysis and evaluation. Furthermore, the use of nonstandard sample vessels with various shapes and volumes results in an increased complexity. The direct use of existing automation solutions from bioscreening applications is not possible. A flexible automation system for sample preparation, analysis, and data evaluation is presented in this article. It is applied for the determination of cholesterol in biliary endoprosthesis using gas chromatography-mass spectrometry (GC-MS). A dual-arm robot performs both transport and active manipulation tasks to ensure human-like operation. This general robotic concept also enables the use of manual laboratory devices and equipment and is thus suitable in areas with a high standardization grade. © 2016 Society for Laboratory Automation and Screening.

  6. Fully automated data collection and processing system on macromolecular crystallography beamlines at the PF

    International Nuclear Information System (INIS)

    Yamada, Yusuke; Hiraki, Masahiko; Matsugaki, Naohiro; Chavas, Leonard M.G.; Igarashi, Noriyuki; Wakatsuki, Soichi

    2012-01-01

    A fully automated data collection and processing system has been developed on the macromolecular crystallography beamlines at the Photon Factory. In this system, sample exchange, centering and data collection are performed sequentially for all samples stored in the sample exchange system at a beamline without any manual operations. Processing of the collected data sets is also performed automatically. The results are stored in the database system, and users can monitor the progress and results of the automated experiment via a Web browser. (author)
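The per-sample sequence (exchange, centering, collection, processing, storage) amounts to a simple driver loop; the function names below are placeholders for illustration, not the actual PF beamline API:

```python
# Hypothetical driver loop for the workflow described above; none of
# these callables come from the real Photon Factory software.
def run_automated_collection(samples, exchange, center, collect, process, store):
    for sample in samples:
        exchange(sample)         # mount from the sample exchange system
        center(sample)           # automatic sample centering
        data = collect(sample)   # data collection
        result = process(data)   # automatic data processing
        store(sample, result)    # record progress/results in the database

log = []
run_automated_collection(
    ["xtal-1", "xtal-2"],
    exchange=lambda s: log.append(("mount", s)),
    center=lambda s: log.append(("center", s)),
    collect=lambda s: (log.append(("collect", s)), s)[1],
    process=lambda d: d.upper(),
    store=lambda s, r: log.append(("store", s, r)),
)
```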

  7. RaPToRS Sample Delivery System

    Science.gov (United States)

    Henchen, Robert; Shibata, Kye; Krieger, Michael; Pogozelski, Edward; Padalino, Stephen; Glebov, Vladimir; Sangster, Craig

    2010-11-01

    At various labs (NIF, LLE, NRL), activated material samples are used to measure reaction properties. The Rapid Pneumatic Transport of Radioactive Samples (RaPToRS) system quickly and safely moves these radioactive samples through a closed PVC tube via airflow. The carrier travels from the reaction chamber to the control and analysis station, braking pneumatically at the outlet. A reversible multiplexer routes samples from various locations near the shot chamber to the analysis station and allows users to remotely load unactivated samples without manually approaching the reaction chamber. All elements of the system (pneumatic drivers, flow control valves, optical position sensors, multiplexers, Geiger counters, and release gates at the analysis station) can be controlled manually or automatically using a custom LabVIEW interface. A prototype is currently operating at NRL in Washington, DC. Prospective facilities for RaPToRS systems include LLE and NIF.

  8. The UK Biobank sample handling and storage protocol for the collection, processing and archiving of human blood and urine.

    Science.gov (United States)

    Elliott, Paul; Peakman, Tim C

    2008-04-01

    UK Biobank is a large prospective study in the UK to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. Extensive data and biological samples are being collected from 500,000 participants aged between 40 and 69 years. The biological samples that are collected and how they are processed and stored will have a major impact on the future scientific usefulness of the UK Biobank resource. The aim of the UK Biobank sample handling and storage protocol is to specify methods for the collection and storage of participant samples that give maximum scientific return within the available budget. Processing or storage methods that, as far as can be predicted, will preclude current or future assays have been avoided. The protocol was developed through a review of the literature on sample handling and processing, wide consultation within the academic community and peer review. Protocol development addressed which samples should be collected, how and when they should be processed and how the processed samples should be stored to ensure their long-term integrity. The recommended protocol was extensively tested in a series of validation studies. UK Biobank collects about 45 ml blood and 9 ml of urine with minimal local processing from each participant using the vacutainer system. A variety of preservatives, anti-coagulants and clot accelerators is used appropriate to the expected end use of the samples. Collection of other material (hair, nails, saliva and faeces) was also considered but rejected for the full cohort. Blood and urine samples from participants are transported overnight by commercial courier to a central laboratory where they are processed and aliquots of urine, plasma, serum, white cells and red cells stored in ultra-low temperature archives. Aliquots of whole blood are also stored for potential future production of immortalized cell lines. A standard panel of haematology assays is

  9. Challenging genosensors in food samples: The case of gluten determination in highly processed samples.

    Science.gov (United States)

    Martín-Fernández, Begoña; de-los-Santos-Álvarez, Noemí; Martín-Clemente, Juan Pedro; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz

    2016-01-01

    Electrochemical genosensors have undergone enormous development in the last decades, but only very few have achieved quantification of target content in highly processed food samples. The detection of allergens, and particularly gluten, is challenging because legislation establishes a threshold of 20 ppm for labeling as gluten-free, but most genosensors express their results as DNA concentration or DNA copies. This paper describes the first attempt to correlate the genosensor response with the wheat content of real samples, even in the case of highly processed food. A sandwich-based format was used, comprising a capture probe immobilized onto a screen-printed gold electrode and a signaling probe functionalized with fluorescein isothiocyanate (FITC), both hybridizing with the target. The hybridization event was electrochemically monitored by adding an anti-FITC peroxidase (antiFITC-HRP) and its substrate, tetramethylbenzidine. Binary model mixtures, as a reference material, and real samples were analyzed. DNA from food was extracted and a fragment encoding the immunodominant peptide of α2-gliadin amplified by a tailored PCR. The sensor was able to selectively distinguish cereals toxic for celiac patients, such as different varieties of wheat, barley, rye and oats, from non-toxic plants. As little as 0.001% (10 mg/kg) of wheat flour in an inert matrix was reliably detected, which directly competes with the current method of choice for DNA detection, real-time PCR. A good correlation with the official immunoassay was found in highly processed food samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. On sampling and modeling complex systems

    International Nuclear Information System (INIS)

    Marsili, Matteo; Mastromatteo, Iacopo; Roudi, Yasser

    2013-01-01

    The study of complex systems is limited by the fact that only a few variables are accessible for modeling and sampling, which are not necessarily the most relevant ones to explain the system behavior. In addition, empirical data typically undersample the space of possible states. We study a generic framework where a complex system is seen as a system of many interacting degrees of freedom, which are known only in part, that optimize a given function. We show that the underlying distribution with respect to the known variables has the Boltzmann form, with a temperature that depends on the number of unknown variables. In particular, when the influence of the unknown degrees of freedom on the known variables is not too irregular, the temperature decreases as the number of variables increases. This suggests that models can be predictable only when the number of relevant variables is less than a critical threshold. Concerning sampling, we argue that the information that a sample contains on the behavior of the system is quantified by the entropy of the frequency with which different states occur. This allows us to characterize the properties of maximally informative samples: within a simple approximation, the most informative frequency size distributions have power law behavior and Zipf's law emerges at the crossover between the undersampled regime and the regime where the sample contains enough statistics to make inferences on the behavior of the system. These ideas are illustrated in some applications, showing that they can be used to identify relevant variables or to select the most informative representations of data, e.g. in data clustering. (paper)
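The entropy-of-frequencies criterion can be illustrated with a toy computation (a sketch, not the authors' code): a sample whose state frequencies follow a broad, Zipf-like distribution carries more information about the system than one dominated by a single state:

```python
from collections import Counter
from math import log

def state_entropy(sample):
    """Entropy (in nats) of the empirical frequency of observed states."""
    counts = Counter(sample)
    n = len(sample)
    return -sum((c / n) * log(c / n) for c in counts.values())

# A Zipf-like sample (state counts 8, 4, 2, 1, 1) versus a sample
# dominated by a single state (15 of one state, 1 of another).
zipf_like = [s for s, c in enumerate([8, 4, 2, 1, 1]) for _ in range(c)]
peaked = [0] * 15 + [1]
print(state_entropy(zipf_like) > state_entropy(peaked))  # True
```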

  11. On a Robust MaxEnt Process Regression Model with Sample-Selection

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2018-04-01

    Full Text Available In a regression analysis, a sample-selection bias arises when a dependent variable is partially observed as a result of the sample selection. This study introduces a Maximum Entropy (MaxEnt process regression model that assumes a MaxEnt prior distribution for its nonparametric regression function and finds that the MaxEnt process regression model includes the well-known Gaussian process regression (GPR model as a special case. Then, this special MaxEnt process regression model, i.e., the GPR model, is generalized to obtain a robust sample-selection Gaussian process regression (RSGPR model that deals with non-normal data in the sample selection. Various properties of the RSGPR model are established, including the stochastic representation, distributional hierarchy, and magnitude of the sample-selection bias. These properties are used in the paper to develop a hierarchical Bayesian methodology to estimate the model. This involves a simple and computationally feasible Markov chain Monte Carlo algorithm that avoids analytical or numerical derivatives of the log-likelihood function of the model. The performance of the RSGPR model in terms of the sample-selection bias correction, robustness to non-normality, and prediction, is demonstrated through results in simulations that attest to its good finite-sample performance.
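The GPR special case mentioned above can be sketched in a few lines of numpy; the kernel, length scale and noise level are illustrative choices for a toy problem, not the RSGPR model itself:

```python
import numpy as np

def rbf(a, b, length=0.2):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gpr_mean(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean of standard Gaussian process regression,
    the special case the MaxEnt process model reduces to."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    return rbf(x_test, x_train) @ np.linalg.solve(K, y_train)

x = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * x)                # noiseless toy data
pred = gpr_mean(x, y, np.array([0.25]))  # predict at x = 0.25
print(pred)  # should be close to sin(pi/2) = 1
```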

  12. A 4 GS/sec Instability Feedback Processing System for Intra-bunch Instabilities

    CERN Document Server

    Dusatko, J; Fox, J D; Pollock, K; Rivetta, C H; Turgut, O; Hofle, W

    2013-01-01

    We present the architecture and implementation overview of a proof-of-principle digital signal processing system developed to study control of Electron-Cloud and Transverse Mode Coupling Instabilities (TMCI) in the CERN SPS. This system is motivated by intensity increases planned as part of the High Luminosity LHC upgrade. It is based on a reconfigurable processing architecture which samples intra-bunch motion and applies correction signals at a 4 GSa/s rate, allowing multiple samples across a single 3.2 ns SPS bunch. This initial demonstration system is a rapidly developed prototype consisting of both commercial and custom-designed hardware that implements feedback control on a single bunch. It contains a high speed ADC and DAC, capable of sampling at up to 4 GSa/s, with a 16-tap FIR control filter for each bunch sample slice. Other system features include a timing subsystem to synchronize the sampling to the injection and the bunch 1 markers, the capability of generating arbitrary time domain signals to drive...
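A 16-tap FIR acting on per-slice samples can be sketched as follows; the tap values are an illustrative placeholder, not the SPS control filter design:

```python
import numpy as np

N_TAPS = 16

def fir_output(history, taps):
    """FIR correction for one bunch slice: dot product of the taps with
    the most recent N_TAPS position samples (newest first)."""
    return float(np.dot(taps, history[-len(taps):][::-1]))

# Illustrative taps: a first-difference (differentiator-like) filter
taps = np.zeros(N_TAPS)
taps[0], taps[1] = 1.0, -1.0

history = np.sin(0.3 * np.arange(64))   # simulated motion of one slice
correction = fir_output(history, taps)  # equals history[-1] - history[-2]
```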

  13. A Consistent System for Coding Laboratory Samples

    Science.gov (United States)

    Sih, John C.

    1996-07-01

    A formal laboratory coding system is presented to keep track of laboratory samples. Preliminary useful information regarding the sample (origin and history) is gained without consulting a research notebook. Since this system uses and retains the same research notebook page number for each new experiment (reaction), finding and distinguishing products (samples) of the same or different reactions becomes an easy task. Using this system multiple products generated from a single reaction can be identified and classified in a uniform fashion. Samples can be stored and filed according to stage and degree of purification, e.g. crude reaction mixtures, recrystallized samples, chromatographed or distilled products.
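A scheme of this kind, keyed to the notebook page number, can be sketched as follows; the exact label format is an illustrative assumption, not the one in the article:

```python
def sample_code(initials: str, page: int, product: str, stage: str) -> str:
    """Build a sample label: researcher initials, notebook page number,
    product letter within the reaction, and purification-stage suffix."""
    stages = {"crude": "C", "recrystallized": "R",
              "chromatographed": "Ch", "distilled": "D"}
    return f"{initials}-{page}-{product}{stages[stage]}"

# Two products of the same reaction keep the same page number,
# so both remain traceable to a single notebook entry.
print(sample_code("JS", 142, "A", "crude"))            # JS-142-AC
print(sample_code("JS", 142, "B", "chromatographed"))  # JS-142-BCh
```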

  14. Surface studies of plasma processed Nb samples

    International Nuclear Information System (INIS)

    Tyagi, Puneet V.; Doleans, Marc; Hannah, Brian S.; Afanador, Ralph; Stewart, Stephen; Mammosser, John; Howell, Matthew P; Saunders, Jeffrey W; Degraff, Brian D; Kim, Sang-Ho

    2015-01-01

    Contaminants present at the top surface of superconducting radio frequency (SRF) cavities can act as field emitters and restrict the cavity accelerating gradient. A room-temperature in-situ plasma processing technology for SRF cavities, aiming to clean hydrocarbons from the inner surface of cavities, has recently been developed at the Spallation Neutron Source (SNS). Surface studies of the plasma-processed Nb samples by secondary ion mass spectrometry (SIMS) and scanning Kelvin probe (SKP) showed that the Ne-O2 plasma processing is very effective at removing carbonaceous contaminants from the top surface and improves the surface work function by 0.5 to 1.0 eV.

  15. Use of robotic systems for radiochemical sample changing and for analytical sample preparation

    International Nuclear Information System (INIS)

    Delmastro, J.R.; Hartenstein, S.D.; Wade, M.A.

    1989-01-01

    Two uses of the Perkin-Elmer (PE) robotic system will be presented. In the first, a PE robot functions as an automatic sample changer for up to five low energy photon spectrometry (LEPS) detectors operated with a Nuclear Data ND 6700 system. The entire system, including the robot, is controlled by an IBM PC-AT using software written in compiled BASIC. Problems associated with the development of the system and modifications to the robot will be presented. In the second, an evaluation study was performed to assess the abilities of the PE robotic system for performing complex analytical sample preparation procedures. For this study, a robotic system based upon the PE robot and auxiliary devices was constructed and programmed to perform the preparation of final product samples (UO3) for accountability and impurity specification analyses. These procedures require sample dissolution, dilution, and liquid-liquid extraction steps. The results of an in-depth evaluation of all system components will be presented

  16. Recommended practice for process sampling for partial pressure analysis

    International Nuclear Information System (INIS)

    Blessing, James E.; Ellefson, Robert E.; Raby, Bruce A.; Brucker, Gerardo A.; Waits, Robert K.

    2007-01-01

    This Recommended Practice describes and recommends various procedures and types of apparatus for obtaining representative samples of process gases from >10⁻² Pa (10⁻⁴ Torr) for partial pressure analysis using a mass spectrometer. The document was prepared by a subcommittee of the Recommended Practices Committee of the American Vacuum Society. The subcommittee was comprised of vacuum users and manufacturers of mass spectrometer partial pressure analyzers who have practical experience in the sampling of process gas atmospheres

  17. Air sampling system for airborne surveys

    International Nuclear Information System (INIS)

    Jupiter, C.; Tipton, W.J.

    1975-01-01

    An air sampling system has been designed for installation on the Beechcraft King Air A-100 aircraft as a part of the Aerial Radiological Measuring System (ARMS). It is intended for both particle and whole gas sampling. The sampling probe is designed for isokinetic sampling and is mounted on a removable modified escape hatch cover, behind the co-pilot's seat, and extends about two feet forward of the hatch cover in the air stream lines. Directly behind the sampling probe inside the modified hatch cover is an expansion chamber, space for a 5-inch diameter filter paper cassette, and an optional four-stage cascade impactor for particle size distribution measurements. A pair of motors and blower pumps provide the necessary 0.5 atmosphere pressure across the type MSA 1106 B glass fiber filter paper to allow a flow rate of 50 cfm. The MSA 1106 B filter paper is designed to trap sub-micrometer particles with a high efficiency; it was chosen to enable a quantitative measurement of airborne radon daughters, one of the principal sources of background signals when radiological surveys are being performed. A venturi section and pressure gauges allow air flow rate measurements so that airborne contaminant concentrations may be quantified. A whole gas sampler capable of sampling a cubic meter of air is mounted inside the aircraft cabin. A nuclear counting system on board the aircraft provides capability for α, β and γ counting of filter paper samples. Design data are presented and types of survey missions which may be served by this system are described
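The quantification step, converting collected filter activity into an airborne concentration via the measured flow rate, can be sketched as follows; the activity and sampling time are made-up example values, and only the 50 cfm flow rate comes from the description above:

```python
CFM_TO_M3_PER_MIN = 0.0283168  # 1 cubic foot = 0.0283168 m^3

def air_concentration(activity_bq, flow_cfm, minutes, efficiency=1.0):
    """Airborne concentration (Bq/m^3) from collected filter activity,
    air flow rate, sampling time, and collection efficiency."""
    volume_m3 = flow_cfm * CFM_TO_M3_PER_MIN * minutes
    return activity_bq / (volume_m3 * efficiency)

# e.g. 500 Bq collected over 10 min at the system's 50 cfm flow rate
c = air_concentration(500.0, 50.0, 10.0)
print(c)  # Bq/m^3
```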

  18. A novel approach to process carbonate samples for radiocarbon measurements with helium carrier gas

    Energy Technology Data Exchange (ETDEWEB)

    Wacker, L., E-mail: wacker@phys.ethz.ch [Laboratory of Ion Beam Physics, ETH Zurich, 8093 Zurich (Switzerland); Fueloep, R.-H. [Institute of Geology and Mineralogy, University of Cologne, 50674 Cologne (Germany); Hajdas, I. [Laboratory of Ion Beam Physics, ETH Zurich, 8093 Zurich (Switzerland); Molnar, M. [Laboratory of Ion Beam Physics, ETH Zurich, 8093 Zurich (Switzerland); Institute of Nuclear Research, Hungarian Academy of Sciences, 4026 Debrecen (Hungary); Rethemeyer, J. [Institute of Geology and Mineralogy, University of Cologne, 50674 Cologne (Germany)

    2013-01-15

    Most laboratories prepare carbonate samples for radiocarbon analysis by acid decomposition in evacuated glass tubes and subsequent reduction of the evolved CO₂ to graphite in self-made reduction manifolds. This process is time consuming and labor intensive. In this work, we have tested a new approach for the preparation of carbonate samples, where any high-vacuum system is avoided and helium is used as a carrier gas. The liberation of CO₂ from carbonates with phosphoric acid is performed in a similar way as is often done in stable isotope ratio mass spectrometry, where CO₂ is released with acid in a septum-sealed tube under a helium atmosphere. The formed CO₂ is later flushed in a helium flow by means of a double-walled needle mounted from the tubes to the zeolite trap of the automated graphitization equipment (AGE). It essentially replaces the elemental analyzer normally used for the combustion of organic samples. The process can be fully automated, from sampling the released CO₂ in the septum-sealed tubes with a commercially available auto-sampler to graphitization with the automated graphitization equipment. The new method yields low sample blanks of about 50,000 years. Results of processed reference materials (IAEA-C2, FIRI-C) are in agreement with their consensus values.

  19. Standard Format for Chromatographic-polarimetric System small samples assessment

    International Nuclear Information System (INIS)

    Naranjo, S.; Fajer, V.; Fonfria, C.; Patinno, R.

    2012-01-01

    The treatment of samples containing optically active substances, evaluated as part of the quality control of raw material entering an industrial process and during the modifications exerted on it to obtain the desired final composition, is still an unsolved problem for many industries. That is the case of the sugarcane industry. The problem is often compounded because the samples to be evaluated are no bigger than one milliliter. Reduction of gel beds in G-10 and G-50 chromatographic columns having an inner diameter of 16 mm, instead of 25, and bed heights adjustable to requirements by means of sliding stoppers to increase analytical power, were evaluated with glucose and sucrose standards in concentrations from 1 to 10 g/dL, using aliquots of 1 ml without undesirable dilutions that could affect either detection or the chromatographic profile. Assays with seaweed extracts gave good results, which are shown. The advantage of determining the concentration of a separated substance from the height of its peak is established, along with the resulting savings in time and reagents. The expanded uncertainty of samples in both systems is compared. Several programs for data acquisition, storage and processing are also presented. (Author)

  20. Evaluation of the Picture Archiving and Communication System (PACS) as a Process Improvement Example (Sivas Numune Hospital Application)

    Directory of Open Access Journals (Sweden)

    Ali Rıza İnce

    2013-09-01

    Full Text Available. It is a reality that firms producing goods or services frequently benefit from process improvement and similar techniques to achieve their objectives in today's competitive environment. Being process-oriented means giving weight to the creation of quality rather than to the control of quality. In this sense, process improvement is seen as an approach targeted by Total Quality Management. Hospital information systems are computer systems that collect and archive all or most of a hospital's data for use in assessment. A major problem in such systems is storing visual data without loss or damage while keeping the data available as quickly as possible. The system called PACS has been developed to overcome this problem. PACS is an archiving system that stores the visual data obtained from imaging systems in different units in one place and allows the transfer of these data to users at different points when needed. In this study, the efficiency of PACS as a process improvement technique was examined through its application in a hospital. From the evaluation of the results and the implementation of the system, it is concluded that the system provides a positive contribution to customers and the hospital as an example of an important process improvement.

  1. Perfluoroalkyl Acid Concentrations in Blood Samples Subjected to Transportation and Processing Delay

    DEFF Research Database (Denmark)

    Bach, Cathrine Carlsen; Henriksen, Tine Brink; Bossi, Rossana

    2015-01-01

    and transportation prior to processing and samples with immediate processing and freezing. METHODS: Pregnant women recruited at Aarhus University Hospital, Denmark, (n = 88) provided paired blood samples. For each pair of samples, one was immediately processed and plasma was frozen, and the other was delayed...... and transported as whole blood before processing and freezing of plasma (similar to the Danish National Birth Cohort). We measured 12 perfluoroalkyl acids and present results for compounds with more than 50% of samples above the lower limit of quantification. RESULTS: For samples taken in the winter, relative...... differences between the paired samples ranged between -77 and +38% for individual perfluoroalkyl acids. In most cases concentrations were lower in the delayed and transported samples, e.g. the relative difference was -29% (95% confidence interval -30; -27) for perfluorooctane sulfonate. For perfluorooctanoate...

  2. System design specification for rotary mode core sample trucks No. 2, 3, and 4 programmable logic controller

    International Nuclear Information System (INIS)

    Dowell, J.L.; Akers, J.C.

    1995-01-01

    The system this document describes controls several functions of the Core Sample Trucks used to obtain nuclear waste samples from various underground storage tanks at Hanford. The system monitors the sampling process and provides alarms and other feedback to ensure the sampling process is performed within the prescribed operating envelope. The intended audience for this document is anyone associated with rotary or push mode core sampling. This document describes the alarm and control logic installed on Rotary Mode Core Sample Trucks (RMCST) #2, #3, and #4. It is intended to define the particular requirements of the RMCST alarm and control operation (not defined elsewhere) sufficiently for detailed design to implement on a Programmable Logic Controller (PLC).

  3. System and process for dissolution of solids

    Science.gov (United States)

    Liezers, Martin; Farmer, III, Orville T.

    2017-10-10

    A system and process are disclosed for dissolution of solids, including "difficult-to-dissolve" solids. A solid sample may be ablated in an ablation device to generate nanoscale particles. The nanoparticles may then be swept into a coupled plasma device operating at atmospheric pressure, where they are atomized. The plasma exhaust may be delivered directly into an aqueous fluid to form a solution containing the atomized and dissolved solids. The composition of the resulting solution reflects the composition of the original solid sample.

  4. Real-time recursive hyperspectral sample and band processing algorithm architecture and implementation

    CERN Document Server

    Chang, Chein-I

    2017-01-01

    This book explores recursive architectures in designing progressive hyperspectral imaging algorithms. In particular, it makes progressive imaging algorithms recursive by introducing the concept of Kalman filtering into algorithm design, so that hyperspectral imagery can be processed not only progressively, sample by sample or band by band, but also recursively via recursive equations. It can be considered a companion to the author's book Real-Time Progressive Hyperspectral Image Processing, published by Springer in 2016. The book explores recursive structures in algorithm architecture; implements algorithmic recursive architecture in conjunction with progressive sample and band processing; derives Recursive Hyperspectral Sample Processing (RHSP) techniques according to the Band-Interleaved Sample/Pixel (BIS/BIP) acquisition format; and develops Recursive Hyperspectral Band Processing (RHBP) techniques according to the Band SeQuential (BSQ) acquisition format for hyperspectral data.
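The recursion the book builds on can be illustrated with a minimal sketch, a running-mean update, which is not the actual RHSP/RHBP algorithms but shows why recursive processing never needs to revisit past samples or bands:

```python
def recursive_mean(samples):
    """Update the running mean recursively as each new sample (or band
    value) arrives: m_k = m_{k-1} + (x_k - m_{k-1}) / k.
    Only the previous estimate is kept; past data is never revisited."""
    m = 0.0
    for k, x in enumerate(samples, start=1):
        m += (x - m) / k
    return m
```

The same pattern, carrying forward only the previous estimate plus an innovation term, is the essence of Kalman-style recursive estimation applied band by band.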

  5. Perfluoroalkyl Acid Concentrations in Blood Samples Subjected to Transportation and Processing Delay.

    Science.gov (United States)

    Bach, Cathrine Carlsen; Henriksen, Tine Brink; Bossi, Rossana; Bech, Bodil Hammer; Fuglsang, Jens; Olsen, Jørn; Nohr, Ellen Aagaard

    2015-01-01

    In studies of perfluoroalkyl acids, the validity and comparability of measured concentrations may be affected by differences in the handling of biospecimens. We aimed to investigate whether measured plasma levels of perfluoroalkyl acids differed between blood samples subjected to delay and transportation prior to processing and samples with immediate processing and freezing. Pregnant women recruited at Aarhus University Hospital, Denmark, (n = 88) provided paired blood samples. For each pair of samples, one was immediately processed and plasma was frozen, and the other was delayed and transported as whole blood before processing and freezing of plasma (similar to the Danish National Birth Cohort). We measured 12 perfluoroalkyl acids and present results for compounds with more than 50% of samples above the lower limit of quantification. For samples taken in the winter, relative differences between the paired samples ranged between -77 and +38% for individual perfluoroalkyl acids. In most cases concentrations were lower in the delayed and transported samples, e.g. the relative difference was -29% (95% confidence interval -30; -27) for perfluorooctane sulfonate. For perfluorooctanoate there was no difference between the two setups [corresponding estimate 1% (0, 3)]. Differences were negligible in the summer for all compounds. Transport of blood samples and processing delay, similar to conditions applied in some large, population-based studies, may affect measured perfluoroalkyl acid concentrations, mainly when outdoor temperatures are low. Attention to processing conditions is needed in studies of perfluoroalkyl acid exposure in humans.
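The paired comparison reported above can be reproduced with a simple helper; the function name and the plain paired-difference formula are illustrative assumptions, since the paper's exact estimator is not given here:

```python
def relative_difference_pct(delayed, immediate):
    """Percent relative difference between a delayed/transported sample
    and its immediately processed pair: (delayed - immediate) / immediate * 100.
    Negative values mean the delayed sample measured lower."""
    return 100.0 * (delayed - immediate) / immediate
```

For example, a perfluorooctane sulfonate concentration of 1.42 µg/L in the delayed sample against 2.0 µg/L in the immediately frozen one gives a relative difference of -29%, matching the magnitude reported for winter samples.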

  6. Automated sample mounting and alignment system for biological crystallography at a synchrotron source

    International Nuclear Information System (INIS)

    Snell, Gyorgy; Cork, Carl; Nordmeyer, Robert; Cornell, Earl; Meigs, George; Yegian, Derek; Jaklevic, Joseph; Jin, Jian; Stevens, Raymond C.; Earnest, Thomas

    2004-01-01

    High-throughput data collection for macromolecular crystallography requires an automated sample mounting system for cryo-protected crystals that functions reliably when integrated into protein-crystallography beamlines at synchrotrons. Rapid mounting and dismounting of the samples increases the efficiency of the crystal screening and data collection processes, where many crystals can be tested for the quality of diffraction. The sample-mounting subsystem has random access to 112 samples, stored under liquid nitrogen. Results of extensive tests regarding the performance and reliability of the system are presented. To further increase throughput, we have also developed a sample transport/storage system based on 'puck-shaped' cassettes, which can hold sixteen samples each. Seven cassettes fit into a standard dry shipping Dewar. The robotic crystal mounting and alignment system, together with its instrumentation control software and relational database, allows automated screening and data collection.

  7. Systems and methods for self-synchronized digital sampling

    Science.gov (United States)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.
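The core idea, deriving the sampling clock from the machine's rotational frequency so that samples stay phase-locked to the rotation, can be sketched under the simplifying assumption of a constant rotational speed (the patented synchronizer does this in hardware; the names here are illustrative):

```python
def synchronized_sample_times(rot_freq_hz, samples_per_rev, n_revs):
    """Timestamps of synchronous samples: `samples_per_rev` equally spaced
    samples per shaft revolution, so every revolution is sampled at the
    same phase angles regardless of the rotational speed chosen."""
    period = 1.0 / rot_freq_hz          # one revolution, in seconds
    dt = period / samples_per_rev       # sampling interval locked to rotation
    return [i * dt for i in range(samples_per_rev * n_revs)]
```

With a varying rotational frequency signal, the same scheme amounts to continuously rescaling `dt`, which is what makes the sampling self-synchronized rather than fixed-rate.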

  8. NJOY nuclear data processing system: user's manual

    International Nuclear Information System (INIS)

    MacFarlane, R.E.; Barrett, R.J.; Muir, D.W.; Boicourt, R.M.

    1978-12-01

    The NJOY nuclear data processing system is a comprehensive computer code package for producing cross sections for neutron and photon transport calculations from ENDF/B-IV and -V evaluated nuclear data. This user's manual provides a concise description of the code, input instructions, sample problems, and installation instructions. 1 figure, 3 tables

  9. The process system analysis for advanced spent fuel management technology (I)

    International Nuclear Information System (INIS)

    Lee, H. H.; Lee, J. R.; Kang, D. S.; Seo, C. S.; Shin, Y. J.; Park, S. W.

    1997-12-01

    Various pyrochemical processes were evaluated, and viable options were selected in consideration of proliferation safety, technological feasibility, and compatibility with the domestic nuclear power system. Detailed technical analyses of the selected options followed, covering unit process flowsheets including the physico-chemical characteristics of the process systems, preliminary concept development, process design criteria, and materials for equipment. Supplementary analyses were also carried out on supporting technologies, including sampling and transport of molten salt, design criteria and equipment for glove box systems, and remote operation technologies. (author). 40 refs., 49 tabs., 37 figs

  10. Automatic remote sampling and delivery system incorporating decontamination and disposal of sample bottles

    International Nuclear Information System (INIS)

    Savarkar, V.K.; Mishra, A.K.; Bajpai, D.D.; Nair, M.K.T.

    1990-01-01

    The present generation of reprocessing plants has sampling and delivery systems that must be operated manually, with the associated problems. Complete automation and remotisation of the sampling system has therefore been considered to reduce manual intervention and personnel exposure. As part of this scheme, an attempt has been made to automate and remotise the various steps in the sampling system. This paper discusses in detail the development work carried out in this area, as well as the tests conducted to incorporate the same in the existing plants. (author). 3 figs

  11. Water sample-collection and distribution system

    Science.gov (United States)

    Brooks, R. R.

    1978-01-01

    The collection and distribution system samples water from six designated stations, filters it if desired, and delivers it to various analytical sensors. The system may be controlled by the Water Monitoring Data Acquisition System or operated manually.

  12. Fiber laser-microscope system for femtosecond photodisruption of biological samples.

    Science.gov (United States)

    Yavaş, Seydi; Erdogan, Mutlu; Gürel, Kutan; Ilday, F Ömer; Eldeniz, Y Burak; Tazebay, Uygar H

    2012-03-01

    We report on the development of an ultrafast fiber laser-microscope system for femtosecond photodisruption of biological targets. A mode-locked Yb-fiber laser oscillator generates few-nJ pulses at a 32.7 MHz repetition rate, amplified up to ∼125 nJ at 1030 nm. Following dechirping in a grating compressor, ∼240 fs-long pulses are delivered to the sample through a diffraction-limited microscope, which allows real-time imaging and control. The laser can generate arbitrary pulse patterns, formed by two acousto-optic modulators (AOMs) controlled by a custom-developed field-programmable gate array (FPGA) controller. This capability opens the route to fine optimization of the ablation process and management of thermal effects. Sample positioning, exposure time, and imaging are all computerized. The capability of the system to perform femtosecond photodisruption is demonstrated through experiments on tissue and individual cells.

  13. Active Fault Diagnosis in Sampled-data Systems

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2015-01-01

    The focus in this paper is on active fault diagnosis (AFD) in closed-loop sampled-data systems. Applying the same AFD architecture as for continuous-time systems does not directly result in the same set of closed-loop matrix transfer functions. For continuous-time systems, the LFT (linear fractional transformation) structure in the connection between the parametric faults and the matrix transfer function (also known as the fault signature matrix) applied for AFD is not directly preserved for sampled-data systems. As a consequence, the AFD methods cannot be applied directly to sampled-data systems. Two methods are considered in this paper to handle the fault signature matrix for sampled-data systems such that standard AFD methods can be applied. The first method is based on a discretization of the system such that the LFT structure is preserved, resulting in the same LFT structure in the fault signature matrix.

  14. Research on How to Remove Efficiently the Condensate Water of Sampling System

    International Nuclear Information System (INIS)

    Cho, SungHwan; Kim, MinSoo; Choi, HoYoung; In, WonHo

    2015-01-01

    Corrosion occurred in the measurement chamber of the O₂ and H₂ analyzer, making it impossible to measure the O₂ and H₂ concentrations. The condensate water was confirmed to arise from the temperature difference encountered as the internal gas of the disposal and degasifier tanks is brought into the analyzer. A heating system was therefore installed inside and outside the gas sampling panel to remove the condensate water generated in the analyzer and piping. For cases where condensate water is not removed by the heating system, a drain port was also installed in the gas sampling panel to collect the condensate water of the sampling system. A large volume of condensate water was found in the pipe line during the purging process after the equipment was installed. The condensate water was fully removed by the installed heating cable and drain port. The heating cable was operated constantly at 80 to 90 °C, which allows precise measurement of the gas concentration and a longer maintenance interval by preventing condensate water from forming. When installing instruments for measuring gas, such as O₂ and H₂ analyzers, consideration must be given to whether condensate water may be present due to the temperature difference between the measuring system and the analyzer.

  15. Research on How to Remove Efficiently the Condensate Water of Sampling System

    Energy Technology Data Exchange (ETDEWEB)

    Cho, SungHwan; Kim, MinSoo; Choi, HoYoung; In, WonHo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Corrosion occurred in the measurement chamber of the O₂ and H₂ analyzer, making it impossible to measure the O₂ and H₂ concentrations. The condensate water was confirmed to arise from the temperature difference encountered as the internal gas of the disposal and degasifier tanks is brought into the analyzer. A heating system was therefore installed inside and outside the gas sampling panel to remove the condensate water generated in the analyzer and piping. For cases where condensate water is not removed by the heating system, a drain port was also installed in the gas sampling panel to collect the condensate water of the sampling system. A large volume of condensate water was found in the pipe line during the purging process after the equipment was installed. The condensate water was fully removed by the installed heating cable and drain port. The heating cable was operated constantly at 80 to 90 °C, which allows precise measurement of the gas concentration and a longer maintenance interval by preventing condensate water from forming. When installing instruments for measuring gas, such as O₂ and H₂ analyzers, consideration must be given to whether condensate water may be present due to the temperature difference between the measuring system and the analyzer.

  16. Sample processing method for the determination of perchlorate in milk

    International Nuclear Information System (INIS)

    Dyke, Jason V.; Kirk, Andrea B.; Kalyani Martinelango, P.; Dasgupta, Purnendu K.

    2006-01-01

    In recent years, many different water sources and foods have been reported to contain perchlorate. Studies indicate that significant levels of perchlorate are present in both human and dairy milk. The determination of perchlorate in milk is particularly important due to its potential health impact on infants and children. As for many other biological samples, sample preparation is more time consuming than the analysis itself. The concurrent presence of large amounts of fats, proteins, carbohydrates, etc., demands some initial cleanup; otherwise the separation column lifetime and the limit of detection are both greatly compromised. Reported milk processing methods require the addition of chemicals such as ethanol, acetic acid or acetonitrile. Reagent addition is undesirable in trace analysis. We report here an essentially reagent-free sample preparation method for the determination of perchlorate in milk. Milk samples are spiked with isotopically labeled perchlorate and centrifuged to remove lipids. The resulting liquid is placed in a disposable centrifugal ultrafilter device with a molecular weight cutoff of 10 kDa, and centrifuged. Approximately 5-10 ml of clear liquid, ready for analysis, is obtained from a 20 ml milk sample. Both bovine and human milk samples have been successfully processed and analyzed by ion chromatography-mass spectrometry (IC-MS). Standard addition experiments show good recoveries. The repeatability of the analytical result for the same sample in multiple sample cleanup runs ranged from 3 to 6% R.S.D. This processing technique has also been successfully applied for the determination of iodide and thiocyanate in milk

  17. LIIS: A web-based system for culture collections and sample annotation

    Directory of Open Access Journals (Sweden)

    Matthew S Forster

    2014-04-01

    Full Text Available. The Lab Information Indexing System (LIIS) is a web-driven database application for laboratories looking to store their sample or culture metadata on a central server. The design was driven by a need to replace traditional paper storage with an easier-to-search format and to extend current spreadsheet storage methods. The system supports the import and export of CSV spreadsheets and stores general metadata designed to complement the environmental packages provided by the Genomic Standards Consortium. The goals of the LIIS are to simplify the storage and archival processes and to provide an easy-to-access library of laboratory annotations. The program will find utility in microbial ecology laboratories or any lab that needs to annotate samples or cultures.

  18. Importance sampling of rare events in chaotic systems

    DEFF Research Database (Denmark)

    Leitão, Jorge C.; Parente Lopes, João M.Viana; Altmann, Eduardo G.

    2017-01-01

    Finding and sampling rare trajectories in dynamical systems is a difficult computational task underlying numerous problems and applications. In this paper we show how to construct Metropolis-Hastings Monte-Carlo methods that can efficiently sample rare trajectories in the (extremely rough) phase space of chaotic systems. As examples of our general framework we compute the distribution of finite-time Lyapunov exponents (in different chaotic maps) and the distribution of escape times (in transient-chaos problems). Our methods sample exponentially rare states in a polynomial number of samples (in both low- and high-dimensional systems). An open-source software that implements our algorithms and reproduces our results can be found in reference [J. Leitao, A library to sample chaotic systems, 2017, https://github.com/jorgecarleitao/chaospp].
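The approach can be sketched for a concrete case: a Metropolis-Hastings random walk over initial conditions of the logistic map, biased by exp(βtλ) toward large finite-time Lyapunov exponents. This is only an illustration of the sampling idea; the parameter values and the simple clamped random-walk proposal are assumptions, not the paper's exact algorithm.

```python
import math
import random

def ftle(x0, t, r=4.0):
    """Finite-time Lyapunov exponent of the logistic map x -> r*x*(1-x),
    averaged over t iterations starting from x0."""
    x, s = x0, 0.0
    for _ in range(t):
        s += math.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)
        x = r * x * (1.0 - x)
    return s / t

def mh_sample_ftle(n_samples, t=20, beta=2.0, step=1e-3, seed=1):
    """Metropolis-Hastings over initial conditions with target weight
    exp(beta * t * lambda), so trajectories with large expansion rates
    are visited far more often than under uniform sampling."""
    rng = random.Random(seed)
    x = 0.3
    lam = ftle(x, t)
    out = []
    for _ in range(n_samples):
        # Random-walk proposal, clamped to the open unit interval.
        xp = min(max(x + rng.gauss(0.0, step), 1e-9), 1.0 - 1e-9)
        lam_p = ftle(xp, t)
        # Accept with probability min(1, exp(beta * t * (lam_p - lam))).
        if math.log(rng.random() + 1e-300) < beta * t * (lam_p - lam):
            x, lam = xp, lam_p
        out.append((x, lam))
    return out
```

Re-weighting the visited states by exp(-βtλ) then recovers the unbiased distribution of finite-time Lyapunov exponents, including its exponentially rare tails.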

  19. Fluid sample collection and distribution system. [qualitative analysis of aqueous samples from several points

    Science.gov (United States)

    Brooks, R. L. (Inventor)

    1979-01-01

    A multipoint fluid sample collection and distribution system is provided wherein the sample inputs are made through one or more of a number of sampling valves to a progressive cavity pump which is not susceptible to damage by large unfiltered particles. The pump output is through a filter unit that can provide a filtered multipoint sample. An unfiltered multipoint sample is also provided. An effluent sample can be taken and applied to a second progressive cavity pump for pumping to a filter unit that can provide one or more filtered effluent samples. The second pump can also provide an unfiltered effluent sample. Means are provided to periodically back flush each filter unit without shutting off the whole system.

  20. Performance test of SAUNA xenon mobile sampling system

    International Nuclear Information System (INIS)

    Hu Dan; Yang Bin; Yang Weigeng; Jia Huaimao; Wang Shilian; Li Qi; Zhao Yungang; Fan Yuanqing; Chen Zhanying; Chang Yinzhong; Liu Shujiang; Zhang Xinjun; Wang Jun

    2011-01-01

    In this article, the structure and basic functions of the SAUNA noble gas xenon mobile sampling system are introduced. The sampling capacity of the system is about 2.2 mL of xenon per day, as determined from a 684-h operation. The system can be conveniently transported to designated locations to collect xenon samples for routine or emergency environmental monitoring. (authors)

  1. Testing results of Monte Carlo sampling processes in MCSAD

    International Nuclear Information System (INIS)

    Pinnera, I.; Cruz, C.; Abreu, Y.; Leyva, A.; Correa, C.; Demydenko, C.

    2009-01-01

    The Monte Carlo Simulation of Atom Displacements (MCSAD) is a code implemented by the authors to simulate the complete process of atom displacement (AD) formation. This code uses the Monte Carlo (MC) method to sample all the processes involved in the transport of gamma and electron radiation through matter. The kernel of the calculations relies on a model based on an algorithm developed by the authors, which first separates multiple electron elastic scattering events from single events at higher scattering angles and then, from the latter, samples those leading to AD at high transferred atomic recoil energies. Tests have been developed to check the sampling algorithms against the corresponding theoretical distribution functions. Satisfactory results have been obtained, which indicate the soundness of the methods and subroutines used in the code. (Author)
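The kind of check described, comparing a sampling routine against its theoretical distribution, can be sketched as follows. Inverse-transform sampling of an exponential is used as an illustrative stand-in for MCSAD's physics samplers; the function names and tolerance are assumptions:

```python
import math
import random

def sample_exponential(lam, n, seed=0):
    """Inverse-transform sampling: X = -ln(1 - U) / lam with U uniform
    on [0, 1) reproduces the exponential distribution with rate lam."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

def check_against_theory(samples, lam, tol=0.05):
    """Crude validation: the empirical mean should approach the
    theoretical mean 1/lam as the sample size grows."""
    mean = sum(samples) / len(samples)
    return abs(mean - 1.0 / lam) < tol
```

In practice such tests compare full histograms (or empirical CDFs) with the theoretical distribution function, not just the mean, but the structure of the check is the same.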

  2. Automated processing of whole blood samples for the determination of immunosuppressants by liquid chromatography tandem-mass spectrometry.

    Science.gov (United States)

    Vogeser, Michael; Spöhrer, Ute

    2006-01-01

    Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspension, transfer of whole blood aliquots into a deep-well plate, addition of internal standard solution, mixing, and protein precipitation by addition of an organic solvent is performed by the robotic system. After centrifugation of the plate, the deproteinized supernatants are submitted to on-line solid phase extraction, using column switching prior to LC-MS/MS analysis. The only manual actions within the entire process are decapping of the tubes, and transfer of the deep-well plate from the robotic system to a centrifuge and finally to the HPLC autosampler. Whole blood pools were used to assess the reproducibility of the entire analytical system for measuring tacrolimus concentrations. A total coefficient of variation of 1.7% was found for the entire automated analytical process (n=40; mean tacrolimus concentration, 5.3 microg/L). Close agreement between tacrolimus results obtained after manual and automated sample preparation was observed. The analytical system described here, comprising automated protein precipitation, on-line solid phase extraction and LC-MS/MS analysis, is convenient and precise, and minimizes hands-on time and the risk of mistakes in the quantification of whole blood immunosuppressant concentrations compared to conventional methods.

  3. Capillary absorption spectrometer and process for isotopic analysis of small samples

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, M. Lizabeth; Kelly, James F.; Sams, Robert L.; Moran, James J.; Newburn, Matthew K.; Blake, Thomas A.

    2018-04-24

    A capillary absorption spectrometer and process are described that provide highly sensitive and accurate stable-isotope absorption measurements of analytes in a sample gas, including isotopologues of carbon and oxygen obtained from gas and biological samples. The system further provides isotopic images of microbial communities that allow tracking of nutrients at the single-cell level. It targets naturally occurring variations in carbon and oxygen isotopes, which avoids the need for expensive isotopically labeled mixtures and allows study of samples taken from the field without modification. The process also permits sampling in vivo, enabling real-time ambient studies of microbial communities.

  4. Consensus of heterogeneous multi-agent systems based on sampled data with a small sampling delay

    International Nuclear Information System (INIS)

    Wang Na; Wu Zhi-Hai; Peng Li

    2014-01-01

    In this paper, consensus problems of heterogeneous multi-agent systems based on sampled data with a small sampling delay are considered. First, a consensus protocol based on sampled data with a small sampling delay for heterogeneous multi-agent systems is proposed. Then, algebraic graph theory, the matrix method, the stability theory of linear systems, and some other techniques are employed to derive necessary and sufficient conditions guaranteeing that heterogeneous multi-agent systems asymptotically achieve the stationary consensus. Finally, simulations are performed to demonstrate the correctness of the theoretical results. (interdisciplinary physics and related areas of science and technology)
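A minimal sketch of sampled-data consensus for first-order agents with no delay (the paper additionally handles heterogeneous dynamics and a small sampling delay, which this illustration omits):

```python
def consensus_step(states, adjacency, eps):
    """One sampled-data consensus update: each agent moves toward the
    sampled states of its neighbours, x_i += eps * sum_j a_ij (x_j - x_i)."""
    n = len(states)
    return [states[i] + eps * sum(adjacency[i][j] * (states[j] - states[i])
                                  for j in range(n))
            for i in range(n)]

def run_consensus(states, adjacency, eps, steps):
    """Iterate the update; for a connected graph and small enough eps
    (the sampling period times the coupling gain), states converge to a
    common value -- the average, when the adjacency matrix is symmetric."""
    for _ in range(steps):
        states = consensus_step(states, adjacency, eps)
    return states
```

The stability condition on `eps` mirrors the role of the sampling period in the paper's necessary and sufficient conditions: too large a sampling interval destroys convergence.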

  5. Stationary and related stochastic processes sample function properties and their applications

    CERN Document Server

    Cramér, Harald

    2004-01-01

    This graduate-level text offers a comprehensive account of the general theory of stationary processes, with special emphasis on the properties of sample functions. Assuming a familiarity with the basic features of modern probability theory, the text develops the foundations of the general theory of stochastic processes, examines processes with a continuous-time parameter, and applies the general theory to procedures key to the study of stationary processes. Additional topics include analytic properties of the sample functions and the problem of time distribution of the intersections between a

  6. Development of a sample preparation system for AMS radiocarbon dating at CRICH, Korea

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myung-Jin; Lee, Byeong-Cheol; Lim, Eun-Soo [Cultural Research Institute of Chungcheong Heritage, Gongju (Korea, Republic of); Hong, Duk-Geun [Kangwon National University, Chuncheon (Korea, Republic of); Park, Soon-Bal [Chungnam National University, Daejeon (Korea, Republic of); Youn, Min-Young [Seoul National University, Seoul (Korea, Republic of)

    2010-01-15

    We developed a sample preparation system for radiocarbon dating by AMS measurement at the Cultural Research Institute of Chungcheong Heritage, Korea. From an investigation of the reduction process, the optimum graphitization temperature was chosen as 625 °C. Using Aldrich graphite powder of 0.75 ± 0.023 pMC, the background value of our preparation system was kept at a low level. Robustness against chemical treatment and contamination was also verified with samples of Oxalic acid II and IAEA-C4; the resulting values, 134.04 ± 0.99 pMC and 0.38 ± 0.043 pMC, were in good agreement with the consensus values. Our conventional ages agreed very well with those of Beta Analytic Co. and SNU-AMS, and no memory effect existed in the preparation system. We therefore concluded that the sample preparation system operates stably and that the basic radiocarbon dating procedures are fully verified.
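For reference, pMC values map to conventional radiocarbon ages through the standard relation using the Libby mean life of 8033 years; the IAEA-C4 background of 0.38 pMC quoted above corresponds to roughly 45 ka BP:

```python
import math

def conventional_age(pmc):
    """Conventional radiocarbon age in years BP from percent modern carbon:
    t = -8033 * ln(pMC / 100), with 8033 years the Libby mean life."""
    return -8033.0 * math.log(pmc / 100.0)
```

A modern sample (100 pMC) gives 0 BP, and lower pMC values give exponentially older ages, which is why a low, stable background pMC sets the dating limit of a preparation system.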

  7. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    Science.gov (United States)

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets the protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
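Purge protocols of this kind typically declare a well ready for sampling once successive readings of the monitored field parameters stabilize within set tolerances. A hedged sketch of such a rule follows; the three-reading window and the example tolerances are illustrative assumptions, not Robowell's documented criteria:

```python
def parameters_stable(readings, tolerances):
    """Return True when, for every monitored parameter, the last three
    readings vary by no more than that parameter's tolerance -- a common
    purge-stabilization rule in ground water sampling protocols."""
    for name, tol in tolerances.items():
        last = readings[name][-3:]
        if len(last) < 3 or max(last) - min(last) > tol:
            return False
    return True
```

An automated system can evaluate this check after each purge-volume increment and trigger sample collection (or an alert) as soon as it passes.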

  8. FFTF gas processing systems

    International Nuclear Information System (INIS)

    Halverson, T.G.

    1977-01-01

    The design and operation of the two radioactive gas processing systems at the Fast Flux Test Facility (FFTF) exemplify the concept that will be used in the first generation of Liquid Metal Fast Breeder Reactors (LMFBRs). The two systems, the Radioactive Argon Processing System (RAPS) and the Cell Atmosphere Processing System (CAPS), process the argon and nitrogen used in the FFTF as cover gas on liquid metal systems and as inert atmospheres in steel-lined cells housing sodium equipment. The RAPS specifically processes the argon cover gas from the reactor coolant system, providing for decontamination and eventual reuse. The CAPS processes radioactive gases from inerted cells and other liquid metal cover gas systems, providing for decontamination and ultimate discharge to the atmosphere. The cryogenic processing of waste gas by both systems is described.

  9. ACS sampling system: design, implementation, and performance evaluation

    Science.gov (United States)

    Di Marcantonio, Paolo; Cirami, Roberto; Chiozzi, Gianluca

    2004-09-01

    By means of the ACS (ALMA Common Software) framework, we designed and implemented a sampling system which allows sampling of every Characteristic Component Property with a specific, user-defined, sustained frequency limited only by the hardware. Collected data are sent to various clients (one or more Java plotting widgets, a dedicated GUI, or a COTS application) using the ACS/CORBA Notification Channel. The data transport is optimized: samples are cached locally and sent in packets with a lower, user-defined frequency to keep the network load under control. Simultaneous sampling of the Properties of different Components is also possible. Together with the design and implementation issues, we present the performance of the sampling system evaluated on two different platforms: on a VME-based system using the VxWorks RTOS (currently adopted by ALMA) and on a PC/104+ embedded platform using the Red Hat 9 Linux operating system. The PC/104+ solution offers, as an alternative, a low-cost, PC-compatible hardware environment with a free and open operating system.
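
    The cache-and-packetize transport described above can be sketched as follows. This is a hypothetical Python sketch: the real system delivers packets over the ACS/CORBA Notification Channel rather than a local callback, and flushing is driven by a timer rather than a fixed packet size.

```python
class SampledProperty:
    """Cache samples locally and deliver them in packets at a lower rate,
    reducing per-sample network overhead as the abstract describes."""

    def __init__(self, packet_size, sink):
        self.packet_size = packet_size
        self.sink = sink          # callable receiving one packet (a list)
        self.cache = []

    def sample(self, value):
        """Record one high-rate sample; ship a packet when the cache fills."""
        self.cache.append(value)
        if len(self.cache) >= self.packet_size:
            self.flush()

    def flush(self):
        """Deliver whatever is cached (also used for the final partial packet)."""
        if self.cache:
            self.sink(list(self.cache))
            self.cache.clear()

packets = []
prop = SampledProperty(packet_size=4, sink=packets.append)
for v in range(10):
    prop.sample(v)
prop.flush()  # deliver the partial final packet
assert packets == [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

    Ten per-sample notifications collapse into three packet deliveries, which is the network-load control the abstract refers to.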

  10. High speed real-time wavefront processing system for a solid-state laser system

    Science.gov (United States)

    Liu, Yuan; Yang, Ping; Chen, Shanqiu; Ma, Lifang; Xu, Bing

    2008-03-01

    A high-speed real-time wavefront processing system for a solid-state laser beam cleanup system has been built. This system consists of a Core 2 industrial PC (IPC) running the Linux and real-time Linux (RT-Linux) operating systems (OS), a PCI image grabber, and a D/A card. The phase aberrations of the output beam from solid-state lasers often vary rapidly with intracavity thermal effects and environmental influence. To compensate for the phase aberrations of solid-state lasers successfully, a high-speed real-time wavefront processing system is presented. Compared to former systems, this system improves the processing speed significantly. In the new system, the acquisition of image data, the output of control voltage data, and the implementation of the reconstructor control algorithm are treated as real-time tasks in kernel space, while the display of wavefront information and man-machine interaction are treated as non-real-time tasks in user space. Parallel processing of the real-time tasks in Symmetric Multi-Processor (SMP) mode is the main strategy for improving the speed. In this paper, the performance and efficiency of this wavefront processing system are analyzed. The open-loop experimental results show that the sampling frequency of this system is up to 3300 Hz, and that the system can deal well with phase aberrations from solid-state lasers.

  11. Data processing system for ETL TPE-2

    International Nuclear Information System (INIS)

    Yahagi, E.; Kiyama, M.

    1988-01-01

    The data processing system for ETL TPE-2 consists of two CPU systems composing a duplex system. One is used as the data acquisition system (DAS), which controls the various data input devices, performs data acquisition, and communicates with the main controller of TPE-2 to confirm safe system operation. The other is used as the data processing system (DPS), which processes the data after acquisition, handles the interconnection with the mainframe, and supports software development. A transient memory system, which has 64 channels of 8-bit ADCs with a maximum sampling frequency of 20 MHz and a 4 KB buffer memory in each channel, is used to record the time-sequential experimental data. Two CAMAC crates are used to acquire information on the experimental conditions and Thomson scattering data; they compose a serial highway system over fiber optics. The CAMAC crate for Thomson scattering data is controlled by a personal computer, an HP-85, and is available for stand-alone use; communication between the CAMAC system and the DAS is easily performed by using a CAMAC memory module as an intermediary, without the complicated procedures involved in connecting different types of computers. Two magnetic disk pack units, each with a formatted storage capacity of 158 KB and able to record the data of over 2,000 shots, are used in parallel with a magnetic tape handler for the data file. Thus we realized high-speed data processing over a wide range of experimental shots and ensured the preservation of the data. (author)

  12. Imaging systems and algorithms to analyze biological samples in real-time using mobile phone microscopy.

    Science.gov (United States)

    Shanmugam, Akshaya; Usmani, Mohammad; Mayberry, Addison; Perkins, David L; Holcomb, Daniel E

    2018-01-01

    Miniaturized imaging devices have pushed the boundaries of point-of-care imaging, but existing mobile-phone-based imaging systems do not exploit the full potential of smart phones. This work demonstrates the use of simple imaging configurations to deliver superior image quality and the ability to handle a wide range of biological samples. Results presented in this work are from analysis of fluorescent beads under fluorescence imaging, as well as helminth eggs and freshwater mussel larvae under white light imaging. To demonstrate the versatility of the systems, real-time analysis and post-processing results of the sample count and sample size are presented in both still images and videos of flowing samples.

  13. Estimating rare events in biochemical systems using conditional sampling

    Science.gov (United States)

    Sundar, V. S.

    2017-01-01

    The paper focuses on the development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining such probabilities using brute-force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
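
    The core idea, expressing a rare event probability as a product of more frequent conditional probabilities, each estimated by Markov chain Monte Carlo with a modified Metropolis step, can be sketched in a one-dimensional toy setting. This is a hypothetical illustration assuming a standard normal variable, not the paper's biochemical formulation.

```python
import math
import random

def subset_simulation(g, sample0, b, n=2000, p0=0.1, seed=1):
    """Estimate P(g(X) > b) as a product P(F1) * P(F2|F1) * ... of conditional
    probabilities, the subset simulation idea from the abstract.
    1-D toy version: the proposal and acceptance rule assume X ~ N(0, 1)."""
    rng = random.Random(seed)
    xs = [sample0(rng) for _ in range(n)]
    prob = 1.0
    while True:
        gs = sorted((g(x) for x in xs), reverse=True)
        b_i = gs[int(p0 * n)]            # intermediate threshold (top-p0 quantile)
        if b_i >= b:                     # final level reached: count exceedances of b
            return prob * sum(1 for x in xs if g(x) > b) / n
        prob *= p0                       # each intermediate level contributes p0
        seeds = [x for x in xs if g(x) > b_i]
        xs = []
        i = 0
        while len(xs) < n:               # one modified-Metropolis step per seed
            x = seeds[i % len(seeds)]
            i += 1
            cand = x + rng.gauss(0.0, 1.0)
            accept = rng.random() < math.exp((x * x - cand * cand) / 2.0)
            if accept and g(cand) > b_i:  # stay inside the current failure region
                x = cand
            xs.append(x)

# Toy check: P(X > 3) for a standard normal is about 1.35e-3
p = subset_simulation(g=lambda x: x, sample0=lambda r: r.gauss(0.0, 1.0), b=3.0)
assert 1e-4 < p < 1e-2
```

    With p0 = 0.1, a probability of order 1e-3 is reached after only two intermediate levels, each estimated from frequent (10%) events, which is the variance reduction the abstract describes.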

  14. Principles of image processing in machine vision systems for the color analysis of minerals

    Science.gov (United States)

    Petukhova, Daria B.; Gorbunova, Elena V.; Chertov, Aleksandr N.; Korotaev, Valery V.

    2014-09-01

    At the moment, color sorting is one of the promising methods of mineral raw material enrichment. The method is based on registering color differences between images of the analyzed objects. As is generally known, difficulty in delimiting close color tints when sorting low-contrast minerals is one of the method's main disadvantages. This can be related to a wrong choice of color model and incomplete image processing in the machine vision system realizing the color sorting algorithm. Another problem is the need to reconfigure the image processing features when the type of analyzed mineral changes. This is due to the fact that the optical properties of mineral samples vary from one deposit to another, so searching for appropriate values of the image processing features is a non-trivial task that does not always have an acceptable solution. In addition, there are no uniform guidelines for determining the criteria of mineral sample separation. Ideally, reconfiguration of the image processing features would be performed by machine learning, but in practice it is carried out by adjusting operating parameters that are satisfactory only for one specific enrichment task. This approach usually leaves the machine vision system unable to rapidly estimate the concentration of the analyzed mineral ore by the color sorting method. This paper presents the results of research aimed at addressing these shortcomings in the organization of image processing for machine vision systems used for color sorting of mineral samples. The principles of color analysis for low-contrast minerals using machine vision systems are also studied. In addition, a special processing algorithm for color images of mineral samples is developed; it automatically determines the criteria of mineral sample separation based on an analysis of representative mineral samples. Experimental studies of the proposed algorithm
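
    Deriving separation criteria automatically from representative samples can be sketched with a nearest-mean color classifier. This is a hypothetical simplification: the paper's actual algorithm and color model are not specified in the abstract, and RGB means are merely the simplest possible criterion.

```python
def separation_criteria(samples):
    """Derive per-class mean colors from labelled representative samples,
    a minimal stand-in for automatic determination of separation criteria."""
    means = {}
    for label, pixels in samples.items():
        n = len(pixels)
        means[label] = tuple(sum(p[c] for p in pixels) / n for c in range(3))
    return means

def classify(pixel, means):
    """Assign a pixel to the class whose mean color is nearest (squared RGB distance)."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(means, key=lambda label: d2(pixel, means[label]))

# Hypothetical representative samples: dark 'ore' vs light 'gangue'
reference = {
    "ore":    [(90, 60, 40), (100, 70, 50)],
    "gangue": [(180, 180, 175), (190, 185, 180)],
}
means = separation_criteria(reference)
assert classify((95, 66, 44), means) == "ore"
assert classify((185, 183, 178), means) == "gangue"
```

    For low-contrast minerals the choice of color space matters far more than this sketch suggests, which is precisely the point the abstract makes about wrong color-model choices.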

  15. Adaptive multiple importance sampling for Gaussian processes

    Czech Academy of Sciences Publication Activity Database

    Xiong, X.; Šmídl, Václav; Filippone, M.

    2017-01-01

    Roč. 87, č. 8 (2017), s. 1644-1665 ISSN 0094-9655 R&D Projects: GA MŠk(CZ) 7F14287 Institutional support: RVO:67985556 Keywords : Gaussian Process * Bayesian estimation * Adaptive importance sampling Subject RIV: BB - Applied Statistics, Operational Research OBOR OECD: Statistics and probability Impact factor: 0.757, year: 2016 http://library.utia.cas.cz/separaty/2017/AS/smidl-0469804.pdf

  16. Process information displays from a computerized nuclear materials control and accounting system

    International Nuclear Information System (INIS)

    Ellis, J.H.

    1981-11-01

    A computerized nuclear materials control and accounting system is being developed for an LWR spent fuel reprocessing facility. This system directly accesses process instrument readings, sample analyses, and outputs of various on-line analytical instruments. In this paper, methods of processing and displaying this information in ways that aid in the efficient, timely, and safe control of the chemical processes of the facility are described

  17. ACTINIDE REMOVAL PROCESS SAMPLE ANALYSIS, CHEMICAL MODELING, AND FILTRATION EVALUATION

    Energy Technology Data Exchange (ETDEWEB)

    Martino, C.; Herman, D.; Pike, J.; Peters, T.

    2014-06-05

    Filtration within the Actinide Removal Process (ARP) currently limits the throughput in interim salt processing at the Savannah River Site. In this process, batches of salt solution with Monosodium Titanate (MST) sorbent are concentrated by crossflow filtration. The filtrate is subsequently processed to remove cesium in the Modular Caustic Side Solvent Extraction Unit (MCU) followed by disposal in saltstone grout. The concentrated MST slurry is washed and sent to the Defense Waste Processing Facility (DWPF) for vitrification. During recent ARP processing, there has been a degradation of filter performance manifested as the inability to maintain high filtrate flux throughout a multi-batch cycle. The objectives of this effort were to characterize the feed streams, to determine if solids (in addition to MST) are precipitating and causing the degraded performance of the filters, and to assess the particle size and rheological data to address potential filtration impacts. Equilibrium modelling with OLI Analyzer™ and OLI ESP™ was performed to determine chemical components at risk of precipitation and to simulate the ARP process. The performance of ARP filtration was evaluated to review potential causes of the observed filter behavior. Task activities for this study included extensive physical and chemical analysis of samples from the Late Wash Pump Tank (LWPT) and the Late Wash Hold Tank (LWHT) within ARP as well as samples of the tank farm feed from Tank 49H. The samples from the LWPT and LWHT were obtained from several stages of processing of Salt Batch 6D, Cycle 6, Batch 16.

  18. Remote laser drilling and sampling system for the detection of concealed explosives

    Science.gov (United States)

    Wild, D.; Pschyklenk, L.; Theiß, C.; Holl, G.

    2017-05-01

    The detection of hazardous materials like explosives is a central issue in national security in the field of counterterrorism. One major task includes the development of new methods and sensor systems for the detection. Many existing remote or standoff methods like infrared or Raman spectroscopy find their limits if the hazardous material is concealed in an object. Imaging technologies using x-ray or terahertz radiation usually yield no information about the chemical content itself. However, exact knowledge of the real threat potential of a suspicious object is crucial for disarming the device. A new approach deals with a laser drilling and sampling system for use as a verification detector for suspicious objects. The central part of the system is a miniaturised, diode-pumped Nd:YAG laser oscillator-amplifier. The system allows drilling into most materials like metals, synthetics or textiles with bore hole diameters in the micron scale. During the drilling process, the hazardous material can be sampled for further investigation with suitable detection methods. In the reported work, laser-induced breakdown spectroscopy (LIBS) is used to monitor the drilling process and to classify the drilled material. Experiments were also carried out to show the system's ability not to ignite even sensitive explosives like triacetone triperoxide (TATP). The detection of concealed hazardous material is shown for different explosives using liquid chromatography and ion mobility spectrometry.

  19. Sample distillation/graphitization system for carbon pool analysis by accelerator mass spectrometry (AMS)

    International Nuclear Information System (INIS)

    Pohlman, J.W.; Knies, D.L.; Grabowski, K.S.; DeTurck, T.M.; Treacy, D.J.; Coffin, R.B.

    2000-01-01

    A facility at the Naval Research Laboratory (NRL), Washington, DC, has been developed to extract, trap, cryogenically distill and graphitize carbon from a suite of organic and inorganic carbon pools for analysis by accelerator mass spectrometry (AMS). The system was developed to investigate carbon pools associated with the formation and stability of methane hydrates. However, since the carbon compounds found in hydrate fields are ubiquitous in aquatic ecosystems, this apparatus is applicable to a number of oceanographic and environmental sample types. Targeted pools are dissolved methane, dissolved organic carbon (DOC), dissolved inorganic carbon (DIC), solid organic matrices (e.g., seston, tissue and sediments), biomarkers and short-chain (C₁-C₅) hydrocarbons from methane hydrates. In most instances, the extraction, distillation and graphitization events are continuous within the system, thus minimizing the possibility of fractionation or contamination during sample processing. A variety of methods are employed to extract carbon compounds and convert them to CO₂ for graphitization. Dissolved methane and DIC from the same sample are sparged and cryogenically separated before the methane is oxidized in a high-temperature oxygen stream. DOC is oxidized to CO₂ by a 1200 W ultraviolet photo-oxidation lamp, and solids are oxidized in sealed, evacuated tubes. Hydrocarbons liberated from the dissociation of gas hydrates are cryogenically separated with a cryogenic temperature control unit, and biomarkers are separated and concentrated by preparative capillary gas chromatography (PCGC). With this system, up to 20 samples, standards or blanks can be processed per day.

  20. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    Energy Technology Data Exchange (ETDEWEB)

    Hsieh, Kai-Ta; Liu, Pei-Han [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Urban, Pawel L. [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Institute of Molecular Science, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China)

    2015-09-24

    Most real samples cannot be infused directly into mass spectrometers because they could contaminate delicate parts of the ion source and guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. The duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded during 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h⁻¹). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to the mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates
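
    The exponential fitting of dissolution profiles mentioned above can be sketched for a first-order release model, C(t) = C∞·(1 − e^(−kt)). This is a hypothetical illustration: the authors' exact model form is not given in the abstract, and the linearized least-squares fit below assumes noise-free data below the plateau.

```python
import math

def fit_release_rate(times, conc, c_inf):
    """Fit C(t) = c_inf * (1 - exp(-k * t)) for the rate constant k by
    linearizing ln(1 - C/c_inf) = -k * t and doing least squares through
    the origin (valid only while conc < c_inf)."""
    num = den = 0.0
    for t, c in zip(times, conc):
        y = math.log(1.0 - c / c_inf)
        num += t * y
        den += t * t
    return -num / den

# Synthetic dissolution profile with k = 0.43 1/h (the abstract's ibuprofen value)
k_true, c_inf = 0.43, 2.0
times = [0.5, 1, 2, 4, 6, 8, 10]
conc = [c_inf * (1 - math.exp(-k_true * t)) for t in times]
k = fit_release_rate(times, conc, c_inf)
assert abs(k - 0.43) < 1e-6
```

    With real, noisy extraction-MS data one would instead fit the nonlinear model directly (e.g. by weighted least squares), since the log transform amplifies noise near the plateau.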

  1. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    International Nuclear Information System (INIS)

    Hsieh, Kai-Ta; Liu, Pei-Han; Urban, Pawel L.

    2015-01-01

    Most real samples cannot be infused directly into mass spectrometers because they could contaminate delicate parts of the ion source and guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. The duration of every extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded during 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h⁻¹). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to the mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates the

  2. Slurry feed variability in West Valley's melter feed tank and sampling system

    International Nuclear Information System (INIS)

    Fow, C.L.; Kurath, D.E.; Pulsipher, B.A.; Bauer, B.P.

    1989-04-01

    The present plan for disposal of high-level wastes at West Valley is to vitrify the wastes for disposal in a deep geologic repository. The vitrification process involves mixing the high-level wastes with glass-forming chemicals and feeding the resulting slurry to a liquid-fed ceramic melter. Maintaining the quality of the glass product and proficient melter operation depend on the ability of the melter feed system to produce and maintain a homogeneous mixture of waste and glass-former materials. To investigate the mixing properties of the melter feed preparation system at West Valley, a statistically designed experiment was conducted using synthetic melter feed slurry over a range of concentrations. On the basis of the statistical data analysis, it was found that (1) a homogeneous slurry is produced in the melter feed tank, (2) the liquid-sampling system provides slurry samples that are statistically different from the slurry in the tank, and (3) analytical measurements are the major source of variability. A statistical quality control program for the analytical laboratory and a characterization test of the actual sampling system are recommended. 1 ref., 5 figs., 1 tab
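
    Attributing variability to sampling versus analytical measurement, as in finding (3) above, can be sketched with one-way variance components. This is a hypothetical, balanced-design illustration of the kind of decomposition such a statistically designed experiment supports, not the study's actual analysis.

```python
def variance_components(groups):
    """One-way ANOVA variance components for a balanced design:
    each inner list holds replicate analytical measurements of one sample.
    Returns (analytical variance, sampling variance)."""
    k = len(groups)                      # number of samples
    n = len(groups[0])                   # replicate measurements per sample
    means = [sum(g) / n for g in groups]
    grand = sum(means) / k
    # Within-group mean square = analytical (measurement) variance
    ms_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g) / (k * (n - 1))
    # Between-group mean square; excess over ms_within reflects sampling variance
    ms_between = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    var_analytical = ms_within
    var_sampling = max(0.0, (ms_between - ms_within) / n)
    return var_analytical, var_sampling

# Hypothetical data: duplicate analyses of three slurry samples
va, vs = variance_components([[10.0, 12.0], [11.0, 13.0], [9.0, 11.0]])
assert (va, vs) == (2.0, 0.0)   # all variability here is analytical
```

    When the between-sample mean square barely exceeds the within-sample mean square, as in this toy data, the analytical laboratory, not the sampling system, dominates the observed variability.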

  3. Sample vial inserts: A better approach for sampling heterogeneous slurry samples in the SRS Defense Waste Processing Facility

    International Nuclear Information System (INIS)

    Coleman, C.J.; Goode, S.R.

    1996-01-01

    A convenient and effective new approach for analyzing DWPF samples involves the use of inserts with volumes of 1.5--3 ml placed in the neck of 14 ml sample vials. The inserts have rims that conform to the rim of the vials so that they sit straight and stable in the vial. The DWPF tank sampling system fills the pre-weighed insert rather than the entire vial, so the vial functions only as the insert holder. The shielded cell operator then removes the vial cap and decants the insert containing the sample into a plastic bottle, crucible, etc., for analysis. Inert materials such as Teflon, plastic, and zirconium are used for the insert, so it is unnecessary to separate the insert from the sample for most analyses. The key advantage of using inserts to take DWPF samples, versus filling sample vials, is that it provides a convenient and almost foolproof way of obtaining and handling small volumes of slurry samples in a shielded cell without corrupting the sample. Since the insert allows the entire sample to be analyzed, this approach eliminates the errors inherent in subsampling the heterogeneous slurries that comprise DWPF samples. Slurry samples can then be analyzed with confidence. Analysis times are dramatically reduced by eliminating the drying and vitrification steps normally used to produce a homogeneous solid sample. Direct dissolution and elemental analysis of slurry samples are achieved in 8 hours or less, compared with 40 hours for analysis of vitrified slurry samples. Comparison of samples taken in inserts versus full vials indicates that the insert does not significantly affect sample composition.

  4. Sample introduction systems for the analysis of liquid microsamples by ICP-AES and ICP-MS

    Energy Technology Data Exchange (ETDEWEB)

    Todoli, Jose L. [Departamento de Quimica Analitica, Nutricion y Bromatologia, Universidad de Alicante, 03080 Alicante (Spain)]. E-mail: jose.todoli@ua.es; Mermet, Jean M. [Spectroscopy Forever, 01390 Tramoyes (France)

    2006-03-15

    There are many fields in which the available sample volume is the limiting factor for an elemental analysis. Over the last ten years, sample introduction systems used in plasma spectrometry (i.e., Inductively Coupled Plasma Atomic Emission Spectrometry, ICP-AES, and Mass Spectrometry, ICP-MS) have evolved in order to expand the field of applicability of these techniques to the analysis of micro- and nanosamples. A full understanding of the basic processes occurring throughout the sample introduction system is absolutely necessary to improve analytical performance. The first part of the present review deals with fundamental studies concerning the different phenomena taking place from aerosol production to analyte excitation/ionization when the liquid consumption rate does not exceed 100 µl/min. Existing sample introduction systems are currently far from the ideal and a significant effort has been made to develop new and efficient devices. Different approaches for continuously introducing small sample volumes (i.e., microsamples) have been reviewed and compared in the present work. Finally, applications as well as basic guidelines to select the best sample introduction system according to the sample particularities are given at the end of this review.

  5. Sample introduction systems for the analysis of liquid microsamples by ICP-AES and ICP-MS

    International Nuclear Information System (INIS)

    Todoli, Jose L.; Mermet, Jean M.

    2006-01-01

    There are many fields in which the available sample volume is the limiting factor for an elemental analysis. Over the last ten years, sample introduction systems used in plasma spectrometry (i.e., Inductively Coupled Plasma Atomic Emission Spectrometry, ICP-AES, and Mass Spectrometry, ICP-MS) have evolved in order to expand the field of applicability of these techniques to the analysis of micro- and nanosamples. A full understanding of the basic processes occurring throughout the sample introduction system is absolutely necessary to improve analytical performance. The first part of the present review deals with fundamental studies concerning the different phenomena taking place from aerosol production to analyte excitation/ionization when the liquid consumption rate does not exceed 100 μl/min. Existing sample introduction systems are currently far from the ideal and a significant effort has been made to develop new and efficient devices. Different approaches for continuously introducing small sample volumes (i.e., microsamples) have been reviewed and compared in the present work. Finally, applications as well as basic guidelines to select the best sample introduction system according to the sample particularities are given at the end of this review

  6. An Automated, Image Processing System for Concrete Evaluation

    International Nuclear Information System (INIS)

    Baumgart, C.W.; Cave, S.P.; Linder, K.E.

    1998-01-01

    Allied Signal Federal Manufacturing & Technologies (FM&T) was asked to perform a proof-of-concept study for the Missouri Highway and Transportation Department (MHTD), Research Division, in June 1997. The goal of this proof-of-concept study was to ascertain if automated scanning and imaging techniques might be applied effectively to the problem of concrete evaluation. In the current evaluation process, a concrete sample core is manually scanned under a microscope. Voids (or air spaces) within the concrete are then detected visually by a human operator by incrementing the sample under the cross-hairs of a microscope and by counting the number of "pixels" which fall within a void. Automation of the scanning and image analysis processes is desired to improve the speed of the scanning process, to improve evaluation consistency, and to reduce operator fatigue. An initial, proof-of-concept image analysis approach was successfully developed and demonstrated using acquired black and white imagery of concrete samples. In this paper, the automated scanning and image capture system currently under development will be described and the image processing approach developed for the proof-of-concept study will be demonstrated. A development update and plans for future enhancements are also presented
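
    The void pixel counting step being automated can be sketched as a simple threshold count on a grayscale scan, where dark pixels are taken as air voids. This is hypothetical: the study's actual segmentation and calibration are not described in the abstract.

```python
def void_fraction(image, threshold):
    """Count 'void' pixels (dark air spaces) in a grayscale concrete scan
    and return their fraction of the total, a simplified stand-in for the
    manual pixel counting the abstract automates."""
    void = total = 0
    for row in image:
        for px in row:
            total += 1
            if px < threshold:   # dark pixel -> inside a void
                void += 1
    return void / total

# Hypothetical 4x3 grayscale scan: bright paste (~200) with dark voids (<50)
scan = [
    [200, 210, 40, 205],
    [198, 35, 30, 201],
    [199, 202, 204, 45],
]
assert void_fraction(scan, threshold=100) == 4 / 12
```

    A production system would add illumination correction and connected-component analysis to report void counts and sizes, not just the area fraction.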

  7. Sampling system for a boiling reactor NPP

    International Nuclear Information System (INIS)

    Zabelin, A.I.; Yakovleva, E.D.; Solov'ev, Yu.A.

    1976-01-01

    Investigations and pilot operation of the nuclear power plant with a VK-50 boiling reactor reveal the necessity of normalizing the design of the water sampling system and of mandatory replacement of the needle-type throttle device by a helical one. A method for designing a helical throttle device has been worked out. The quantitative characteristics of depositions of corrosion products along the reactor water sampling line are presented. Recommendations are given on the organization of the sampling system of a nuclear power plant with BWR-type reactors.

  8. ANL small-sample calorimeter system design and operation

    International Nuclear Information System (INIS)

    Roche, C.T.; Perry, R.B.; Lewis, R.N.; Jung, E.A.; Haumann, J.R.

    1978-07-01

    The Small-Sample Calorimetric System is a portable instrument designed to measure the thermal power produced by radioactive decay of plutonium-containing fuels. The small-sample calorimeter is capable of measuring samples producing power up to 32 milliwatts at a rate of one sample every 20 min. The instrument is contained in two packages: a data-acquisition module consisting of a microprocessor with an 8K-byte nonvolatile memory, and a measurement module consisting of the calorimeter and a sample preheater. The total weight of the system is 18 kg

  9. A flexible system to capture sample vials in a storage box - the box vial scanner.

    Science.gov (United States)

    Nowakowski, Steven E; Kressin, Kenneth R; Deick, Steven D

    2009-01-01

    Tracking sample vials in a research environment is a critical task, and doing so efficiently can have a large impact on productivity, especially in high-volume laboratories. There are several challenges to automating the capture process, including the variety of containers used to store samples. We developed a fast and robust system to capture the location of sample vials being placed in storage that allows laboratories the flexibility to use sample containers of varying dimensions. With a single scan, this device captures the box identifier, the vial identifier, and the location of each vial within a freezer storage box. The sample vials are tracked through a barcode label affixed to the cap, while the boxes are tracked by a barcode label on the side of the box. Scanning units are placed at the point of use and forward data to a server application for processing the scanned data. Scanning units consist of an industrial barcode reader mounted in a fixture that positions the box for scanning and provides lighting during the scan. The server application transforms the scan data into a list of storage locations holding vial identifiers. The list is then transferred to the laboratory database. The box vial scanner captures the IDs and location information for an entire box of sample vials into the laboratory database in a single scan. The system accommodates a wide variety of vial sizes by inserting risers under the sample box, and a variety of storage box layouts are supported via the processing algorithm on the server.
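
    The server-side transformation of one scan's data into a list of storage locations can be sketched as follows. The coordinate-to-location scheme (row letter plus column number) is hypothetical; the abstract does not specify the layout encoding.

```python
def locate_vials(decoded):
    """Map decoded (col, row, vial_id) barcode hits from one box scan to
    storage locations such as 'A1', sketching the server-side step that
    turns scan data into a list of locations holding vial identifiers."""
    locations = {}
    for col, row, vial_id in decoded:
        loc = "%s%d" % (chr(ord("A") + row), col + 1)   # row -> letter, col -> number
        locations[loc] = vial_id
    return locations

# Hypothetical hits from the industrial barcode reader for one box
scan = [(0, 0, "VIAL-0012"), (3, 1, "VIAL-0077")]
assert locate_vials(scan) == {"A1": "VIAL-0012", "B4": "VIAL-0077"}
```

    Supporting different box layouts then amounts to swapping this mapping function per layout, which matches the abstract's note that layouts are handled by the server's processing algorithm.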

  10. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR

    NARCIS (Netherlands)

    Goessens, W H; Kluytmans, J A; den Toom, N; van Rijsoort-Vos, T H; Niesters, B G; Stolz, E; Verbrugh, H A; Quint, W G

    In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least 4%, i.e., 80 microliters, of

  11. Adaptive Rate Sampling and Filtering Based on Level Crossing Sampling

    Directory of Open Access Journals (Sweden)

    Saeed Mian Qaisar

    2009-01-01

    The recent sophistication of mobile systems and sensor networks demands more and more processing resources. In order to maintain system autonomy, energy saving has become one of the most difficult industrial challenges in mobile computing. Most efforts to achieve this goal focus on improving embedded-system design and battery technology, but very few studies exploit the time-varying nature of the input signal. This paper aims to achieve power efficiency by intelligently adapting the processing activity to the local characteristics of the input signal. It does so by completely rethinking the processing chain, adopting a nonconventional sampling scheme and adaptive-rate filtering. The proposed approach, based on the LCSS (Level Crossing Sampling Scheme), presents two filtering techniques able to adapt their sampling rate and filter order by analyzing the input signal variations online. The principle is to intelligently exploit the signal's local characteristics, which are usually never considered, in order to filter only the relevant signal parts with filters of the relevant order. This idea leads to a drastic gain in computational efficiency, and hence in processing power, when compared to classical techniques.
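
The level-crossing idea can be sketched in a few lines: samples are taken only when the signal crosses one of a set of uniformly spaced amplitude levels, so the local sampling rate follows the signal's activity. The following is a minimal illustration, not the paper's algorithm; the level spacing, the test signal, and the use of a densely pre-sampled input grid are all assumptions made for demonstration:

```python
import math

def level_crossing_sample(signal, dt, delta):
    """Level-crossing sampling of a densely pre-sampled signal: emit a
    (time, level) pair whenever the signal crosses a quantization level
    k*delta, with the crossing time found by linear interpolation."""
    samples = []
    for i in range(1, len(signal)):
        a, b = signal[i - 1], signal[i]
        if a == b:
            continue
        lo, hi = min(a, b), max(a, b)
        for k in range(math.ceil(lo / delta), math.floor(hi / delta) + 1):
            level = k * delta
            t = (i - 1) * dt + dt * (level - a) / (b - a)
            samples.append((t, level))
    return samples

# Bursty test signal: active sine for the first half, near-DC afterwards.
dt = 1e-3
sig = [math.sin(2 * math.pi * 5 * i * dt) if i < 500 else 0.01
       for i in range(1000)]
pts = level_crossing_sample(sig, dt, 0.1)
# All samples fall in the active half; the idle half costs nothing.
```

The active half of the test signal produces many samples while the idle half produces none, which is the source of the processing savings the abstract claims.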

  12. COED Transactions, Vol. X, No. 10, October 1978. Simulation of a Sampled-Data System on a Hybrid Computer.

    Science.gov (United States)

    Mitchell, Eugene E., Ed.

    The simulation of a sampled-data system on a full parallel hybrid computer is described. The sampled-data system simulated illustrates proportional-integral-derivative (PID) discrete control of a continuous second-order process representing a stirred tank. The stirred tank is simulated using continuous analog components, while PID…

  13. Sample Size for Tablet Compression and Capsule Filling Events During Process Validation.

    Science.gov (United States)

    Charoo, Naseem Ahmad; Durivage, Mark; Rahman, Ziyaur; Ayad, Mohamad Haitham

    2017-12-01

    During solid dosage form manufacturing, the uniformity of dosage units (UDU) is ensured by testing samples at 2 stages, that is, blend stage and tablet compression or capsule/powder filling stage. The aim of this work is to propose a sample size selection approach based on quality risk management principles for process performance qualification (PPQ) and continued process verification (CPV) stages by linking UDU to potential formulation and process risk factors. Bayes success run theorem appeared to be the most appropriate approach among various methods considered in this work for computing sample size for PPQ. The sample sizes for high-risk (reliability level of 99%), medium-risk (reliability level of 95%), and low-risk factors (reliability level of 90%) were estimated to be 299, 59, and 29, respectively. Risk-based assignment of reliability levels was supported by the fact that at low defect rate, the confidence to detect out-of-specification units would decrease which must be supplemented with an increase in sample size to enhance the confidence in estimation. Based on level of knowledge acquired during PPQ and the level of knowledge further required to comprehend process, sample size for CPV was calculated using Bayesian statistics to accomplish reduced sampling design for CPV. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
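
The quoted sample sizes follow from the success-run (zero-failure) theorem, n = ln(1 − C)/ln(R), rounded up. Assuming a 95% confidence level C throughout (an assumption consistent with, but not stated in, the abstract), the reported values 299, 59 and 29 are reproduced:

```python
import math

def success_run_sample_size(reliability, confidence=0.95):
    """Smallest zero-failure sample size n such that observing n
    successes demonstrates `reliability` at the given `confidence`:
    reliability**n <= 1 - confidence  =>  n = ln(1-C)/ln(R)."""
    return math.ceil(math.log(1 - confidence) / math.log(reliability))

sizes = {r: success_run_sample_size(r) for r in (0.99, 0.95, 0.90)}
# sizes == {0.99: 299, 0.95: 59, 0.90: 29}
```

The risk-based tiers in the abstract then map directly onto the reliability argument: higher-risk factors demand higher reliability and therefore larger zero-failure sample sizes.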

  14. 296-B-10 stack monitoring and sampling system annual system assessment report

    International Nuclear Information System (INIS)

    Ridge, T.M.

    1995-01-01

    The B Plant Administration Manual requires an annual system assessment to evaluate and report the present condition of the sampling and monitoring system associated with stack 296-B-10 at B Plant. The ventilation system of WESF (Waste Encapsulation and Storage Facility) is designed to provide airflow patterns so that air movement throughout the building is from areas of lesser radioactivity to areas of greater radioactivity. All potentially contaminated areas are maintained at a negative pressure with respect to the atmosphere so that air flows into the building at all times. The exhaust discharging through the 296-B-10 stack is continuously monitored and sampled using a sampling and monitoring probe assembly located approximately 17.4 meters (57 feet) above the base of the stack. The probe assembly consists of 5 nozzles for the sampling probe and 2 nozzles to monitor the flow. The sampling and monitoring system associated with Stack 296-B-10 is functional and performing satisfactorily.

  15. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR.

    Science.gov (United States)

    Mobli, Mehdi; Hoch, Jeffrey C

    2014-11-01

    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. 296-B-5 Stack monitoring and sampling system annual system assessment report

    International Nuclear Information System (INIS)

    Ridge, T.M.

    1995-02-01

    The B Plant Administration Manual requires an annual system assessment to evaluate and report the present condition of the sampling and monitoring system associated with Stack 296-B-5 at B Plant. The sampling and monitoring system associated with stack 296-B-5 is functional and performing satisfactorily. This document is an annual assessment report of the systems associated with the 296-B-5 stack

  17. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    Energy Technology Data Exchange (ETDEWEB)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  18. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    International Nuclear Information System (INIS)

    Shine, E. P.; Poirier, M. R.

    2013-01-01

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  19. Meteorites and cosmic dust: Interstellar heritage and nebular processes in the early solar system

    Directory of Open Access Journals (Sweden)

    Engrand C.

    2012-01-01

    Small solar system bodies like asteroids and comets have escaped planetary accretion. They are the oldest and best preserved witnesses of the formation of the solar system. Samples of these celestial bodies fall to Earth as meteorites and interplanetary dust. The STARDUST mission also recently returned to Earth cometary dust from comet 81P/Wild 2, a Jupiter-family comet (JFC). These samples provide unique insights into the physico-chemical conditions and early processes of the solar system. They also contain minute amounts of material inherited from the local interstellar medium that has survived the accretion processes in the solar system.

  20. Evaluation of standard methods for collecting and processing fuel moisture samples

    Science.gov (United States)

    Sally M. Haase; José Sánchez; David R. Weise

    2016-01-01

    A variety of techniques for collecting and processing samples to determine the moisture content of wildland fuels in support of fire management activities were evaluated. The effects of using a chainsaw or handsaw to collect samples of large-diameter wood, containers for storing and transporting collected samples, and quick-response ovens for estimating moisture content...

  1. A review of blood sample handling and pre-processing for metabolomics studies.

    Science.gov (United States)

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta

    2017-09-01

    Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there is still a fundamental need to consider pre-analytical variability that can introduce bias into the subsequent analytical process, decrease the reliability of the results and, moreover, confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample-related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design, sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood-derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve the standardization of protocols and the reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. High peak power processing up to 100 MV/M on various metallic samples

    International Nuclear Information System (INIS)

    Luong, M.; Bonin, B.; Safa, H.

    1996-01-01

    High peak power processing (HPPP) is a well-established way to reduce electronic field emission from radiofrequency (RF) metallic surfaces. The processing occurs because some kind of instability destroys the emitter, but the basic physical mechanism at work has not yet been clearly identified. RF processing experiments on samples of restricted area are described, with well-localized artificial emitting sites (protrusions from scratches on the sample surface). In order to disentangle the roles of thermal and mechanical effects in the processing, the samples were made from metals with different melting temperatures and tensile strengths. (author)

  3. High peak power processing up to 100 MV/m on various metallic samples

    International Nuclear Information System (INIS)

    Luong, M.; Bonin, B.; Safa, H.; Le Goff, A.

    1996-01-01

    The high peak power processing (HPPP) is a well-established way to reduce electronic field emission from radiofrequency (RF) metallic surfaces. The processing occurs because some kind of instability destroys the emitter, but the basic physical mechanism at work has not yet been clearly identified. The present study describes RF processing experiments on samples of restricted area, with well-localized artificial emitting sites (protrusions from scratches on the sample surface). In order to disentangle the roles of thermal and mechanical effects in the processing, the samples were made from metals with different melting temperatures and tensile strengths. (author)

  4. Rare behavior of growth processes via umbrella sampling of trajectories

    Science.gov (United States)

    Klymko, Katherine; Geissler, Phillip L.; Garrahan, Juan P.; Whitelam, Stephen

    2018-03-01

    We compute probability distributions of trajectory observables for reversible and irreversible growth processes. These results reveal a correspondence between reversible and irreversible processes, at particular points in parameter space, in terms of their typical and atypical trajectories. Thus key features of growth processes can be insensitive to the precise form of the rate constants used to generate them, recalling the insensitivity to microscopic details of certain equilibrium behavior. We obtained these results using a sampling method, inspired by the "s -ensemble" large-deviation formalism, that amounts to umbrella sampling in trajectory space. The method is a simple variant of existing approaches, and applies to ensembles of trajectories controlled by the total number of events. It can be used to determine large-deviation rate functions for trajectory observables in or out of equilibrium.
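
The spirit of the method, biasing an ensemble of trajectories by the total number of events and then reweighting back to the unbiased ensemble, can be illustrated on the simplest possible growth process, a constant-rate Poisson process. This is only a toy analogue: the rates, the observable, and the choice of tilted rate below are all assumptions, and the paper's s-ensemble machinery is far more general:

```python
import math
import random

random.seed(2)

def count_events(rate, T):
    """Number of events of a rate-`rate` Poisson process on [0, T]."""
    t, k = 0.0, 0
    while True:
        t += random.expovariate(rate)
        if t > T:
            return k
        k += 1

lam, T, k0, n = 1.0, 10.0, 25, 50_000
lam_b = k0 / T          # biased ("umbrella") rate centred on the rare region

# Estimate P(K >= k0) under rate `lam` by sampling trajectories under the
# biased rate and reweighting each by its trajectory likelihood ratio
# (lam/lam_b)**K * exp((lam_b - lam)*T).
est = 0.0
for _ in range(n):
    k = count_events(lam_b, T)
    if k >= k0:
        est += (lam / lam_b) ** k * math.exp((lam_b - lam) * T)
est /= n

# Exact tail of the Poisson(lam*T) distribution for comparison.
exact = 1.0 - sum(math.exp(-lam * T) * (lam * T) ** j / math.factorial(j)
                  for j in range(k0))
```

Direct sampling at the unbiased rate would need on the order of tens of thousands of trajectories per rare event, whereas the tilted ensemble hits the region of interest in roughly half of its trajectories.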

  5. Sampling inspection for the evaluation of time-dependent reliability of deteriorating systems under imperfect defect detection

    International Nuclear Information System (INIS)

    Kuniewski, Sebastian P.; Weide, Johannes A.M. van der; Noortwijk, Jan M. van

    2009-01-01

    The paper presents a sampling-inspection strategy for the evaluation of time-dependent reliability of deteriorating systems, where the deterioration is assumed to initiate at random times and at random locations. After initiation, defects weaken the system's resistance. The system becomes unacceptable when at least one defect reaches a critical depth. The defects are assumed to initiate at random times modeled as event times of a non-homogeneous Poisson process (NHPP) and to develop according to a non-decreasing time-dependent gamma process. The intensity rate of the NHPP is assumed to be a combination of a known time-dependent shape function and an unknown proportionality constant. When sampling inspection (i.e. inspection of a selected subregion of the system) results in a number of defect initiations, Bayes' theorem can be used to update prior beliefs about the proportionality constant of the NHPP intensity rate to the posterior distribution. On the basis of a time- and space-dependent Poisson process for the defect initiation, an adaptive Bayesian model for sampling inspection is developed to determine the predictive probability distribution of the time to failure. A potential application is, for instance, the inspection of a large vessel or pipeline suffering pitting/localized corrosion in the oil industry. The possibility of imperfect defect detection is also incorporated in the model.
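
The Bayesian updating step described above is conjugate when the unknown proportionality constant is given a gamma prior, since the number of initiations found in the inspected subregion is Poisson in the product of that constant and the integrated shape function. A minimal sketch follows; the prior parameters and observation numbers are invented for illustration, and the paper's full model additionally propagates this posterior through the gamma deterioration process and the imperfect-detection model:

```python
def update_intensity(a, b, n_defects, exposure):
    """Conjugate gamma-Poisson update for the unknown proportionality
    constant m of an NHPP intensity m*w(t).  Prior: m ~ Gamma(a, b).
    Data: N ~ Poisson(m * exposure) initiations in the inspected
    subregion, where `exposure` is the inspected fraction times the
    integral of the known shape function w over the observation window.
    Posterior: m ~ Gamma(a + N, b + exposure)."""
    return a + n_defects, b + exposure

# Vague prior, then 3 initiations found over an exposure of 2.0
# (all numbers invented for illustration):
a_post, b_post = update_intensity(1.0, 0.5, 3, 2.0)
posterior_mean = a_post / b_post   # E[m | data] = 4 / 2.5 = 1.6
```

The posterior then feeds the predictive distribution of the time to failure: larger observed initiation counts shift the inferred intensity upward and shorten the predicted remaining life.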

  6. Correction of Sample-Time Error for Time-Interleaved Sampling System Using Cubic Spline Interpolation

    Directory of Open Access Journals (Sweden)

    Qin Guo-jie

    2014-08-01

    Sample-time errors can greatly degrade the dynamic range of a time-interleaved sampling system. In this paper, a novel correction technique employing cubic spline interpolation is proposed for inter-channel sample-time error compensation. The cubic spline interpolation compensation filter is developed in the form of a finite-impulse-response (FIR) filter structure. The correction method for the interpolation compensation filter coefficients is deduced. A 4 GS/s two-channel, time-interleaved ADC prototype system has been implemented to evaluate the performance of the technique. The experimental results showed that the correction technique is effective in attenuating the spurious spurs and improving the dynamic performance of the system.
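
The compensation idea, resampling the skewed channel back onto its nominal time grid with a short interpolating FIR, can be sketched as follows. For brevity this uses a 4-tap cubic Lagrange fractional-delay filter in place of the paper's spline-derived coefficients, and the tone frequency, skew value, and two-channel model are assumptions:

```python
import math

def lagrange_fir(mu):
    """4-tap cubic-Lagrange fractional-delay FIR: evaluates a uniform
    sequence at fractional offset `mu` (in samples) from the second
    tap, using interpolation nodes at -1, 0, 1, 2."""
    return [
        -mu * (mu - 1) * (mu - 2) / 6,
        (mu + 1) * (mu - 1) * (mu - 2) / 2,
        -(mu + 1) * mu * (mu - 2) / 2,
        (mu + 1) * mu * (mu - 1) / 6,
    ]

# Two-channel interleaved ADC model: channel B suffers timing skew `delta`.
T, delta = 1.0, 0.08                     # nominal period and skew (a.u.)

def x(t):                                # slow test tone
    return math.sin(2 * math.pi * 1.23 * t / 64)

N = 512
chan_b = [x((2 * j + 1) * T + delta) for j in range(N)]   # skewed samples
ideal = [x((2 * j + 1) * T) for j in range(N)]            # nominal-grid values

mu = -delta / (2 * T)       # fractional delay on channel B's own 2T grid
h = lagrange_fir(mu)
corrected = [sum(h[i] * chan_b[j - 1 + i] for i in range(4))
             for j in range(1, N - 2)]

err_before = max(abs(a - b) for a, b in zip(chan_b[1:N - 2], ideal[1:N - 2]))
err_after = max(abs(a - b) for a, b in zip(corrected, ideal[1:N - 2]))
```

On this test tone the residual error after correction drops well below the raw skew error; the paper's spline-based FIR plays the same role with coefficients derived from the spline continuity conditions.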

  7. Guidelines for the processing and quality assurance of benthic invertebrate samples collected as part of the National Water-Quality Assessment Program

    Science.gov (United States)

    Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.

    1993-01-01

    Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. The intent of sample processing is to

  8. Metrology Sampling Strategies for Process Monitoring Applications

    KAUST Repository

    Vincent, Tyrone L.

    2011-11-01

    Shrinking process windows in very large scale integration semiconductor manufacturing have already necessitated the development of control systems capable of addressing sub-lot-level variation. Within-wafer control is the next milestone in the evolution of advanced process control from lot-based and wafer-based control. In order to adequately comprehend and control within-wafer spatial variation, inline measurements must be performed at multiple locations across the wafer. At the same time, economic pressures prompt a reduction in metrology, for both capital and cycle-time reasons. This paper explores the use of modeling and minimum-variance prediction as a method to select the sites for measurement on each wafer. The models are developed using the standard statistical tools of principal component analysis and canonical correlation analysis. The proposed selection method is validated using real manufacturing data, and results indicate that it is possible to significantly reduce the number of measurements with little loss in the information obtained for the process control systems. © 2011 IEEE.
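
The site-selection step can be illustrated with a greedy stand-in: column-pivoted Gram-Schmidt on a matrix of historical wafer maps repeatedly picks the site that explains the most remaining cross-site variance. This is only a simple proxy for the paper's minimum-variance prediction built on principal component and canonical correlation analysis, and the data below are invented:

```python
def select_sites(data, k):
    """Greedy site selection by column-pivoted Gram-Schmidt: pick the
    site (column) with the largest remaining variance across historical
    wafers (rows), deflate every other site's column against it, and
    repeat k times."""
    n, m = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(m)]
    cols = [[row[j] - means[j] for row in data] for j in range(m)]
    chosen = []
    for _ in range(k):
        j = max((c for c in range(m) if c not in chosen),
                key=lambda c: sum(v * v for v in cols[c]))
        norm2 = sum(v * v for v in cols[j])
        if norm2 == 0.0:
            break                      # nothing left to explain
        chosen.append(j)
        pivot = cols[j]
        for jj in range(m):
            if jj != j:
                proj = sum(p * q for p, q in zip(pivot, cols[jj])) / norm2
                cols[jj] = [q - proj * p for p, q in zip(pivot, cols[jj])]
        cols[j] = [0.0] * n
    return chosen

# Six historical wafers, three candidate sites; sites 0 and 1 are
# perfectly redundant, site 2 carries an independent signal.
u = [1, 2, 3, 4, 5, 6]
v = [1, -1, 2, -2, 3, -3]
wafers = [[ui, ui, vi] for ui, vi in zip(u, v)]
sites = select_sites(wafers, 2)   # site 2 first, then one of the twins
```

Only one of the two redundant sites is selected; the other can be predicted from the measured one rather than measured, which is exactly the metrology reduction the abstract describes.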

  9. Sampling from a system-theoretic viewpoint

    NARCIS (Netherlands)

    Meinsma, Gjerrit; Mirkin, Leonid

    2009-01-01

    This paper studies a system-theoretic approach to the problem of reconstructing an analog signal from its samples. The idea, borrowed from earlier treatments in the control literature, is to address the problem as a hybrid model-matching problem in which performance is measured by system norms. The

  10. Automated force volume image processing for biological samples.

    Directory of Open Access Journals (Sweden)

    Pavel Polyakov

    2011-04-01

    Atomic force microscopy (AFM) has now become a powerful technique for investigating, on a molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper specifically focuses on the analysis of AFM force curves collected on biological systems, in particular, bacteria. The goal is to provide fully automated tools to achieve theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for the processing of approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between AFM probe and bacterium are accounted for and mechanical interactions operating after contact are described in terms of Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope, or changes of curvature) which mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted making use of a regression procedure for fitting experiments to theory. The flexibility, accuracy and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image.
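
As a concrete instance of the Hertz part of the approach-curve analysis, the post-contact force can be fit to F = (4/3)·E·√R·δ^(3/2) by linear least squares in δ^(3/2). This is a minimal sketch with an idealized spherical tip and invented values, not the paper's full Hertz-Hooke pipeline with automatic critical-point detection:

```python
import math

def fit_hertz(indent, force, tip_radius):
    """Least-squares fit of the Hertz model F = (4/3)*E*sqrt(R)*d**1.5
    to the post-contact portion of an approach curve; returns the
    effective elastic modulus E (Pa)."""
    x = [d ** 1.5 for d in indent]
    # One-parameter linear regression through the origin in F vs d^(3/2).
    slope = sum(a * f for a, f in zip(x, force)) / sum(a * a for a in x)
    return 3 * slope / (4 * math.sqrt(tip_radius))

# Synthetic post-contact curve: E = 2 MPa, R = 20 nm tip (SI units).
E_true, R = 2e6, 20e-9
d = [i * 1e-9 for i in range(1, 21)]                  # 1-20 nm indentation
F = [4 / 3 * E_true * math.sqrt(R) * di ** 1.5 for di in d]
E_fit = fit_hertz(d, F, R)                            # recovers E_true
```

On real curves the contact point must first be located (the critical-point detection discussed in the abstract) before this regression is applied to the indentation region.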

  11. Digital processing data communication systems (bus systems)

    International Nuclear Information System (INIS)

    Fleck, K.

    1980-01-01

    After an introduction to the technology of digital processing data communication systems, the following chapters are included: digital communication of processing data in automation technology, the technology of bit-serial communication, the implementation of a bus system, data transmission in Honeywell's TDC-2000 system, and the CS 275 process bus in the TELEPERM M automation system of Siemens AG. (WB) [de

  12. Sampled Data Systems Passivity and Discrete Port-Hamiltonian Systems

    NARCIS (Netherlands)

    Stramigioli, Stefano; Secchi, Cristian; Schaft, Arjan J. van der; Fantuzzi, Cesare

    2005-01-01

    In this paper, we present a novel way to approach the interconnection of a continuous and a discrete time physical system. This is done in a way which preserves passivity of the coupled system independently of the sampling time T. This strategy can be used both in the field of telemanipulation, for

  13. A THz Tomography System for Arbitrarily Shaped Samples

    Science.gov (United States)

    Stübling, E.; Bauckhage, Y.; Jelli, E.; Fischer, B.; Globisch, B.; Schell, M.; Heinrich, A.; Balzer, J. C.; Koch, M.

    2017-10-01

    We combine a THz time-domain spectroscopy system with a robotic arm. With this scheme, the THz emitter and receiver can be positioned perpendicular and at defined distance to the sample surface. Our system allows the acquisition of reflection THz tomographic images of samples with an arbitrarily shaped surface.

  14. Indigenous development of automated metallographic sample preparation system

    International Nuclear Information System (INIS)

    Kulkarni, A.P.; Pandit, K.M.; Deshmukh, A.G.; Sahoo, K.C.

    2005-01-01

    Surface preparation of specimens for metallographic studies on irradiated material involves extensive remote handling of radioactive material by skilled manpower. These are laborious and man-rem-intensive activities and limit the number of samples that can be prepared for metallographic studies. To overcome these limitations, automated systems have been developed for surface preparation of specimens in the PIE Division. The system includes (i) grinding and polishing stations, (ii) a water jet cleaning station, (iii) ultrasonic cleaning stations, (iv) a drying station, (v) a sample loading and unloading station, (vi) a dispenser for slurries and diluents, and (vii) an automated head for moving the sample holder disc from one station to another. The system allows the operator to program or change the sequence of sample preparation, including remote changing of grinding/polishing discs at the stations. Two such systems have been installed and commissioned in the hot cells of the PIE Division. These are being used for preparation of irradiated samples from nuclear fuels and structural components. This development has increased the throughput of metallography work and reduced radiation exposure (man-sieverts) to operators. This presentation will provide details of the challenges in undertaking this developmental work. (author)

  15. Control system for technological processes in tritium processing plants with process analysis

    International Nuclear Information System (INIS)

    Retevoi, Carmen Maria; Stefan, Iuliana; Balteanu, Ovidiu; Stefan, Liviu; Bucur, Ciprian

    2005-01-01

    Integration of a large variety of installations and equipment into a unitary system for controlling the technological process in tritium processing nuclear facilities is a rather complex task, particularly when experimental or new technologies are developed. Ensuring a high degree of versatility that allows easy modification of configurations and process parameters is a major requirement imposed on experimental installations. The large amount of data which must be processed, stored and easily accessed for subsequent analyses imposes the development of a large information network based on a highly integrated system containing the acquisition, control and technological process analysis data as well as a database system. On such a basis, integrated computation and control systems able to conduct the technological process can be developed, as well as protection systems for cases of failure or breakdown. The integrated system responds to the control and security requirements, in case of emergency, of the technological processes specific to industries that process radioactive or toxic substances with severe consequences in case of technological failure, as in the case of a tritium processing nuclear plant. In order to lower the risk of technological failure of these processes, an integrated software, database and process analysis system is developed which, based on an identification algorithm for the parameters important to protection and security systems, will display the process evolution trend. The system was checked on an existing plant that includes a tritium removal unit, ultimately to be used in a nuclear power plant, by simulating failure events as well as the process. The system will also include a complete database monitoring all the parameters and process analysis software for the main modules of the tritium processing plant, namely isotope separation, catalytic purification and cryogenic distillation

  16. Expert systems in process control systems

    International Nuclear Information System (INIS)

    Wittig, T.

    1987-01-01

To illustrate where the fundamental difference between expert systems in classical diagnosis and in industrial control lies, the work of process control instrumentation is used as an example of a job for expert systems. Starting from the general process of problem-solving, two classes of expert systems can be defined accordingly. (orig.)

  17. 32 CFR 806.27 - Samples of Air Force FOIA processing documents.

    Science.gov (United States)

    2010-07-01

    ... ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION ACT PROGRAM § 806.27 Samples of Air Force FOIA processing... determination within 20 workdays, we have instituted multitrack processing of requests. Based on the information... source; responsive records were part of the Air Force's decision-making process, and the prerelease...

  18. Processing scarce biological samples for light and transmission electron microscopy

    Directory of Open Access Journals (Sweden)

    P Taupin

    2008-06-01

Full Text Available Light microscopy (LM) and transmission electron microscopy (TEM) aim at understanding the structure-function relationship. With advances in biology, isolation and purification of scarce populations of cells or subcellular structures may not yield enough biological material for LM and TEM processing. A protocol for the preparation of scarce biological samples is presented. It is based on pre-embedding the biological samples, suspensions or pellets, in bovine serum albumin (BSA) and bis-acrylamide (BA), cross-linked and polymerized. This preparation provides a simple and reproducible technique for processing biological materials that are present in limited quantities and cannot be amplified, for light and transmission electron microscopy.

  19. Finite element simulation of the T-shaped ECAP processing of round samples

    Science.gov (United States)

    Shaban Ghazani, Mehdi; Fardi-Ilkhchy, Ali; Binesh, Behzad

    2018-05-01

Grain refinement is the only mechanism that increases the yield strength and toughness of materials simultaneously. Severe plastic deformation is one of the most promising methods of refining the microstructure of materials. Among the different severe plastic deformation processes, T-shaped equal channel angular pressing (T-ECAP) is a relatively new technique. In the present study, finite element analysis was conducted to evaluate the deformation behavior of metals during the T-ECAP process. The study focused mainly on flow characteristics, plastic strain distribution and its homogeneity, damage development, and pressing force, which are among the most important factors governing the sound and successful processing of nanostructured materials by severe plastic deformation techniques. The results showed that plastic strain is localized on the bottom side of the sample, and uniform deformation is not possible with T-ECAP processing. The friction coefficient between the sample and the die channel wall has little effect on the strain distributions in the mirror and transverse planes of the deformed sample. Damage analysis showed that superficial cracks may initiate from the bottom side of the sample, and their propagation is limited by the compressive state of stress. It was demonstrated that a V-shaped deformation zone exists in the T-ECAP process and that the pressing load needed to execute the deformation increases with friction.

  20. Modeling Dynamic Systems with Efficient Ensembles of Process-Based Models.

    Directory of Open Access Journals (Sweden)

    Nikola Simidjievski

Full Text Available Ensembles are a well established machine learning paradigm, leading to accurate and robust models, predominantly applied to predictive modeling tasks. Ensemble models comprise a finite set of diverse predictive models whose combined output is expected to yield an improved predictive performance as compared to an individual model. In this paper, we propose a new method for learning ensembles of process-based models of dynamic systems. The process-based modeling paradigm employs domain-specific knowledge to automatically learn models of dynamic systems from time-series observational data. Previous work has shown that ensembles based on sampling observational data (i.e., bagging and boosting) significantly improve the predictive performance of process-based models. However, this improvement comes at the cost of a substantial increase in the computational time needed for learning. To address this problem, the paper proposes a method that aims at efficiently learning ensembles of process-based models while maintaining their accurate long-term predictive performance. This is achieved by constructing ensembles by sampling domain-specific knowledge instead of sampling data. We apply the proposed method to a set of problems of automated predictive modeling in three lake ecosystems, using a library of process-based knowledge for modeling population dynamics, and evaluate its performance. The experimental results identify the optimal design decisions regarding the learning algorithm. The results also show that the proposed ensembles yield significantly more accurate predictions of population dynamics as compared to individual process-based models. Finally, while their predictive performance is comparable to that of ensembles obtained with the state-of-the-art methods of bagging and boosting, they are substantially more efficient.
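The contrast between sampling data and sampling knowledge can be sketched in a few lines. Below is a toy Python illustration: the function library and ensemble size are invented stand-ins for the paper's library of process-based components, not its actual contents.

```python
import random
import numpy as np

# A toy "library" of candidate model structures, standing in for the
# domain-specific library of process-based components (hypothetical).
LIBRARY = [lambda t: t, lambda t: t**2, lambda t: np.sin(t)]

def knowledge_sampled_ensemble(n_models, rng):
    """Build an ensemble by sampling model structures from the library
    (rather than resampling the training data, as bagging does)."""
    members = [rng.choice(LIBRARY) for _ in range(n_models)]
    # The ensemble prediction is the average of the members' outputs.
    return lambda t: float(np.mean([m(t) for m in members]))
```

Because the members differ in structure rather than in the data they saw, no repeated training runs over resampled data sets are needed, which is the source of the efficiency gain described above.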

  1. Process simulation in digital camera system

    Science.gov (United States)

    Toadere, Florin

    2012-06-01

The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to numerical signal and the color processing and rendering. We consider the image acquisition system to be linear, shift-invariant and axial; light propagation is orthogonal to the system. We use a spectral image processing algorithm to simulate the radiometric properties of a digital camera. The algorithm takes into consideration the transmittances of the light source, lenses and filters, and the quantum efficiency of a CMOS (complementary metal oxide semiconductor) sensor. The optical part is characterized by a multiple convolution of the point spread functions of the optical components. We use a Cooke triplet, the aperture, the light fall-off and the optical part of the CMOS sensor. The electrical part consists of Bayer sampling, interpolation, signal-to-noise ratio, dynamic range, analog-to-digital conversion and JPEG compression. We reconstruct the noisy, blurred image by blending differently exposed images to reduce the photon shot noise; we also filter the fixed-pattern noise and sharpen the image. Then come the color processing blocks: white balancing, color correction, gamma correction, and conversion from the XYZ color space to the RGB color space. For color reproduction we use an OLED (organic light emitting diode) monitor. The analysis can be useful in assisting students and engineers in image quality evaluation and imaging system design. Many other configurations of blocks can be used in our analysis.
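As an illustrative sketch of one stage of such a pipeline, the Bayer sampling step can be modeled in a few lines of Python/NumPy. The RGGB layout is an assumption for illustration; the paper does not specify the filter arrangement.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample an RGB image through an RGGB Bayer pattern: each pixel
    keeps only the one color channel its filter passes."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # R at even row, even col
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # G at even row, odd col
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # G at odd row, even col
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # B at odd row, odd col
    return mosaic
```

The interpolation block described above would then estimate the two missing channels at each pixel from neighboring mosaic values.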

  2. Use of an Electronic Tongue System and Fuzzy Logic to Analyze Water Samples

    Science.gov (United States)

    Braga, Guilherme S.; Paterno, Leonardo G.; Fonseca, Fernando J.

    2009-05-01

An electronic tongue (ET) system incorporating 8 chemical sensors was used in combination with two pattern recognition tools, principal component analysis (PCA) and fuzzy logic, for the discrimination/classification of water samples from different sources (tap, distilled and three brands of mineral water). The fuzzy program exhibited a higher accuracy than the PCA and allowed the ET to correctly classify 4 of the 5 types of water. The exception was one brand of mineral water, which was sometimes misclassified as tap water. The PCA, on the other hand, grouped the water samples into three clusters: one with the distilled water; a second with the tap water and one brand of mineral water; and a third with the other two brands of mineral water. Samples in the second and third clusters could not be distinguished. Nevertheless, close grouping between repeated tests indicated that the ET system response is reproducible. The potential use of fuzzy logic as the data processing tool in combination with an electronic tongue system is discussed.
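The PCA step of such an analysis can be sketched as follows in Python/NumPy. The sensor responses in the test are synthetic, not the paper's measurements; the point is only how multichannel readings project into a low-dimensional space where clusters become visible.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project rows of X (samples x sensor channels) onto the leading
    principal components, computed via SVD of the centered data."""
    Xc = X - X.mean(axis=0)                   # center each channel
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T           # scores in PC space
```

Water types that separate along the first components form distinct clusters in the score plot, as in the three-cluster grouping reported above.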

  3. Data Set for the manuscript entitled, "Sample Processing Approach for Detection of Ricin in Surface Samples."

    Data.gov (United States)

    U.S. Environmental Protection Agency — Figure. This dataset is associated with the following publication: Shah, S., S. Kane, A.M. Erler, and T. Alfaro. Sample Processing Approach for Detection of Ricin in...

  4. Optimizing the data acquisition rate for a remotely controllable structural monitoring system with parallel operation and self-adaptive sampling

    International Nuclear Information System (INIS)

    Sheng, Wenjuan; Guo, Aihuang; Liu, Yang; Azmi, Asrul Izam; Peng, Gang-Ding

    2011-01-01

We present a novel technique that optimizes the real-time remote monitoring and control of dispersed civil infrastructures. The monitoring system is based on fiber Bragg grating (FBG) sensors and transfers data via Ethernet. The technique combines parallel operation and self-adaptive sampling to increase the data acquisition rate in remotely controllable structural monitoring systems. The compact parallel operation mode is highly efficient at achieving the highest possible data acquisition rate for the FBG-sensor-based local data acquisition system. Self-adaptive sampling is introduced to continuously coordinate local acquisition and remote control for data acquisition rate optimization. Key issues that affect the operation of the whole system, such as the real-time data acquisition rate, data processing capability, and buffer usage, are investigated. The results show that, by introducing parallel operation and self-adaptive sampling, the data acquisition rate can be increased several times without affecting the system's operating performance in either local data acquisition or remote process control.
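One plausible reading of the self-adaptive sampling idea is a feedback loop on buffer usage. The sketch below is a minimal Python illustration under that assumption; the thresholds, step factor and maximum rate are all invented, not taken from the paper.

```python
MAX_RATE = 1000.0  # hypothetical upper limit on the local rate (Hz)

def adapt_rate(rate, buffer_fill, low=0.2, high=0.8, step=0.9):
    """Self-adaptive sampling sketch: reduce the local acquisition
    rate when the transmission buffer fills up, and raise it (up to
    MAX_RATE) when the buffer drains."""
    if buffer_fill > high:
        return rate * step                 # back off
    if buffer_fill < low:
        return min(rate / step, MAX_RATE)  # speed up, capped
    return rate                            # hold steady
```

Calling this once per acquisition cycle keeps local sampling matched to what the remote link can absorb, which is the coordination role the abstract attributes to self-adaptive sampling.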

  5. A large-scale cryoelectronic system for biological sample banking

    Science.gov (United States)

    Shirley, Stephen G.; Durst, Christopher H. P.; Fuchs, Christian C.; Zimmermann, Heiko; Ihmig, Frank R.

    2009-11-01

We describe a polymorphic electronic infrastructure for managing biological samples stored over liquid nitrogen. As part of this system we have developed new cryocontainers and carrier plates with attached Flash memory chips, providing a redundant and portable set of data with each sample. Our experimental investigations show that basic Flash operation and endurance are adequate for the application down to liquid nitrogen temperatures. This identification technology can provide the best possible sample identification, documentation and tracking, bringing added value to each sample. The first application of the system is in worldwide collaborative research toward the production of an AIDS vaccine. The functionality and versatility of the system can lead to an essential optimization of sample and data exchange for global clinical studies.

  6. Blood Sample Transportation by Pneumatic Transportation Systems

    DEFF Research Database (Denmark)

    Nybo, Mads; Lund, Merete E; Titlestad, Kjell

    2018-01-01

    BACKGROUND: Pneumatic transportation systems (PTSs) are increasingly used for transportation of blood samples to the core laboratory. Many studies have investigated the impact of these systems on different types of analyses, but to elucidate whether PTSs in general are safe for transportation...... analysis, and the hemolysis index). CONCLUSIONS: Owing to their high degree of heterogeneity, the retrieved studies were unable to supply evidence for the safety of using PTSs for blood sample transportation. In consequence, laboratories need to measure and document the actual acceleration forces...

  7. A novel storage system for cryoEM samples.

    Science.gov (United States)

    Scapin, Giovanna; Prosise, Winifred W; Wismer, Michael K; Strickland, Corey

    2017-07-01

We present here a new cryoEM grid box storage system designed to simplify sample labeling, tracking and retrieval. The system is based on the crystal pucks widely used by the X-ray crystallography community for the storage and shipping of crystals. It is suitable for any cryoEM laboratory, but especially for large facilities that need accurate tracking of large numbers of samples coming from different sources. Copyright © 2017. Published by Elsevier Inc.

  8. Design and fabrication of a glovebox for the Plasma Hearth Process radioactive bench-scale system

    International Nuclear Information System (INIS)

    Wahlquist, D.R.

    1996-01-01

This paper presents some of the design considerations and fabrication techniques involved in building a glovebox for the Plasma Hearth Process (PHP) radioactive bench-scale system. The PHP radioactive bench-scale system uses a plasma torch to process a variety of radioactive materials into a final vitrified waste form. The processed waste will contain plutonium and trace amounts of other radioactive materials. The glovebox used in this system is located directly below the plasma chamber and is called the Hearth Handling Enclosure (HHE). The HHE is designed to maintain a confinement boundary between the processed waste and the operator. Operations that take place inside the HHE include raising and lowering the hearth using a hydraulic lift table, transporting the hearth within the HHE using an overhead monorail and hoist system, sampling and disassembling the processed waste and hearth, weighing the hearth, rebuilding a hearth, and sampling HEPA filters. The PHP radioactive bench-scale system is located at the TREAT facility at Argonne National Laboratory-West in Idaho Falls, Idaho.

  9. Design of FPGA based high-speed data acquisition and real-time data processing system on J-TEXT tokamak

    International Nuclear Information System (INIS)

    Zheng, W.; Liu, R.; Zhang, M.; Zhuang, G.; Yuan, T.

    2014-01-01

Highlights: • A data acquisition system for the polarimeter–interferometer diagnostic on the J-TEXT tokamak, based on FPGA and PXIe devices. • The system provides powerful data acquisition and real-time data processing performance. • Users can implement different data processing applications on the FPGA in a short time. • The system supports EPICS and has been integrated into the J-TEXT CODAC system. - Abstract: Tokamak experiments require high-speed data acquisition and processing systems. In traditional data acquisition systems, the sampling rate, number of channels and processing speed are limited by bus throughput and CPU speed. This paper presents a data acquisition and processing system based on an FPGA. The data can be processed in real time before being passed to the CPU. It provides processing capability for more channels at higher sampling rates than a traditional data acquisition system, while ensuring deterministic real-time performance. A working prototype has been developed for the newly built polarimeter–interferometer diagnostic on the Joint Texas Experimental Tokamak (J-TEXT). It provides 16 channels with a 120 MHz maximum sampling rate and 16-bit resolution. The onboard FPGA is able to calculate the plasma electron density and the Faraday rotation angle. A RAID 5 storage device providing 700 MB/s read-write speed is adopted to buffer the data to the hard disk continuously for better performance.

  10. Rotary Mode Core Sample System availability improvement

    International Nuclear Information System (INIS)

    Jenkins, W.W.; Bennett, K.L.; Potter, J.D.; Cross, B.T.; Burkes, J.M.; Rogers, A.C.

    1995-01-01

The Rotary Mode Core Sample System (RMCSS) is used to obtain stratified samples of the waste deposits in single-shell and double-shell waste tanks at the Hanford Site. The samples are used to characterize the waste in support of ongoing and future waste remediation efforts. Four sampling trucks have been developed to obtain these samples. Truck 1 was the first in operation and is currently used to obtain samples where the push mode is appropriate (i.e., no rotation of the drill). Truck 2 is similar to truck 1, except for added safety features, and is in operation to obtain samples using either the push mode or the rotary drill mode. Trucks 3 and 4 are now being fabricated to be essentially identical to truck 2.

  11. Candidate sample acquisition systems for the Rosetta

    International Nuclear Information System (INIS)

    Magnani, P.G.; Gerli, C.; Colombina, G.; Vielmo, P.

    1989-01-01

The Comet Nucleus Sample Return (CNSR) mission, one of the four cornerstones of the ESA scientific program, is one of the most complex space ventures of the coming decades, from both the technological and the deep-space exploration points of view. In the Rosetta scenario, the sample acquisition phase is the most critical point for the mission's overall success. This paper illustrates the main results obtained in the context of the ongoing CNSR-SAS activity. The main areas covered are: (1) sample property characterization (comet soil model, physical/chemical properties, reference material for testing); (2) concept identification for coring, shovelling, harpooning and anchoring; (3) the preferred concept (trade-off among concepts, identification of the preferred configuration); and (4) the proposed development activity for gaining the necessary confidence before the CNSR mission is finalized. Particular emphasis is given to the robotic and flexibility aspects of the identified sample acquisition system (SAS) configuration, intended as a means of enhancing overall system performance.

  12. Relationships between processing delay and microbial load of broiler neck skin samples.

    Science.gov (United States)

    Lucianez, A; Holmes, M A; Tucker, A W

    2010-01-01

The measurable microbial load on poultry carcasses during processing is determined by a number of factors, including farm of origin, processing hygiene, and external temperature. This study investigated associations between carcass microbial load and progressive delays to processing. A total of 30 carcasses were delayed immediately after defeathering and before evisceration in a commercial abattoir, in groups of five, and were held at ambient temperature for 1, 2, 3, 4, 6, and 8 h. Delayed carcasses were reintroduced to the processing line, and quantitative assessment of total viable count, coliforms, Staphylococcus aureus, and Pseudomonas spp. was undertaken on neck skin flap samples collected after carcass chilling and then pooled for each group. Sampling was repeated on 5 separate days, and the data were combined. Significant increases in total viable count (P = 0.001) and coliforms (P = 0.004), but not in S. aureus or Pseudomonas loads, were observed across the 8-h period of delay. In line with previous studies, there was significant variation in the microbiological data according to sampling day. In conclusion, there is a significant and measurable decline in the microbiological status of uneviscerated but defeathered poultry carcasses after an 8-h delay, but the variability of the sampling results, reflecting the wide range of factors that affect microbial load, means that it is not possible to determine maximum or minimum acceptable periods of processing delay on this criterion alone.

  13. Multiscale simulations of patchy particle systems combining Molecular Dynamics, Path Sampling and Green's Function Reaction Dynamics

    Science.gov (United States)

    Bolhuis, Peter

Important reaction-diffusion processes, such as biochemical networks in living cells or self-assembling soft matter, span many orders of magnitude in length and time scales. In these systems, the reactants' spatial dynamics at mesoscopic length and time scales of microns and seconds is coupled to the reactions between the molecules at microscopic length and time scales of nanometers and milliseconds. This wide range of length and time scales makes these systems notoriously difficult to simulate. While mean-field rate equations cannot describe such processes, the mesoscopic Green's Function Reaction Dynamics (GFRD) method enables efficient simulation at the particle level, provided the microscopic dynamics can be integrated out. Yet many processes exhibit non-trivial microscopic dynamics that can qualitatively change the macroscopic behavior, calling for an atomistic, microscopic description. The recently developed multiscale Molecular Dynamics Green's Function Reaction Dynamics (MD-GFRD) approach combines GFRD, for simulating the system at the mesoscopic scale where particles are far apart, with microscopic molecular (or Brownian) dynamics, for simulating the system at the microscopic scale where reactants are in close proximity. The association and dissociation of particles are treated with rare-event path sampling techniques. I will illustrate the efficiency of this method for patchy particle systems. Replacing the microscopic regime with a Markov state model (MSM) avoids explicit simulation of the microscopic regime completely; the MSM is then pre-computed using advanced path-sampling techniques such as multistate transition interface sampling. I illustrate this approach on patchy particle systems that show multiple modes of binding. MD-GFRD is generic and can be used to efficiently simulate reaction-diffusion systems at the particle level, including the orientational dynamics, opening up the possibility of large-scale simulations of e.g. protein signaling networks.

  14. Data acquisition and processing system for reactor noise analysis

    International Nuclear Information System (INIS)

    Costa Oliveira, J.; Morais Da Veiga, C.; Forjaz Trigueiros, D.; Pombo Duarte, J.

    1975-01-01

A data acquisition and processing system for reactor noise analysis by time correlation methods is described, consisting of one to four data-feeding channels (transducer, associated electronics and V/f converter), a sampling unit, a landline transmission system and a PDP 15 computer. The system is being applied to study the kinetic parameters of the 'Reactor Portugues de Investigacao', a 1 MW swimming-pool reactor. The main features that make this data acquisition and processing system a useful tool for noise analysis are: the improved characteristics of the analog-to-digital converters used to quantize the signals; the use of an on-line computer, which allows a large accumulation and rapid treatment of data together with an easy check of the correctness of the experiments; and the adoption of the two-detector time cross-correlation technique, which bypasses the limitations of low-efficiency detectors. (author)
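The two-detector time cross-correlation at the heart of such noise analysis can be estimated directly from the sampled signals. A minimal Python/NumPy sketch (with synthetic signals in the test, not reactor data):

```python
import numpy as np

def cross_correlation(x, y, max_lag):
    """Estimate the time cross-correlation C(k) between two detector
    signals for lags k = 0..max_lag (mean-removed, bias-corrected)."""
    n = len(x)
    x = x - x.mean()
    y = y - y.mean()
    return np.array([np.dot(x[:n - k], y[k:]) / (n - k)
                     for k in range(max_lag + 1)])
```

A transport delay between the two detectors shows up as the lag of the correlation peak, which is the quantity the kinetic-parameter analysis works from.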

  15. A user configurable data acquisition and signal processing system for high-rate, high channel count applications

    International Nuclear Information System (INIS)

    Salim, Arwa; Crockett, Louise; McLean, John; Milne, Peter

    2012-01-01

Highlights: ► The development of a new digital signal processing platform is described. ► The system allows users to configure real-time signal processing through software routines. ► The architecture of the DRUID system and its signal processing elements is described. ► A prototype of the DRUID system has been developed for the digital chopper-integrator. ► Results of acquisition on 96 channels at 500 kSamples/s per channel are presented. - Abstract: Real-time signal processing in plasma fusion experiments is required for control and for data reduction as plasma pulse times grow longer. The development time and cost for these high-rate, multichannel signal processing systems can be significant. This paper proposes a new digital signal processing (DSP) platform for the data acquisition system that allows users to easily customize real-time signal processing systems to meet their individual requirements. The D-TACQ reconfigurable user in-line DSP (DRUID) system carries out the signal processing tasks in hardware co-processors (CPs) implemented in an FPGA, with an embedded microprocessor (μP) for control. In the fully developed platform, users will be able to choose co-processors from a library and configure programmable parameters through the μP to meet their requirements. The DRUID system is implemented on a Spartan 6 FPGA, on the new rear transition module (RTM-T), a field upgrade to existing D-TACQ digitizers. As proof of concept, a multiply-accumulate (MAC) co-processor has been developed, which can be configured as a digital chopper-integrator for long-pulse magnetic fusion devices. The DRUID platform allows users to set options for the integrator, such as the number of masking samples. Results from the digital integrator are presented for a data acquisition system with 96 channels simultaneously acquiring data at 500 kSamples/s per channel.
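A software model of the integrator mode with masking samples might look as follows. This is an illustrative Python sketch under assumed semantics (masked samples simply do not contribute), not the DRUID specification; dt = 2 us corresponds to the 500 kSamples/s rate quoted above.

```python
def chopper_integrate(samples, n_mask=0, dt=2e-6):
    """Cumulative integral of a digitized signal, ignoring an initial
    block of n_mask masking samples (assumed semantics)."""
    total, out = 0.0, []
    for i, s in enumerate(samples):
        if i >= n_mask:          # masked samples do not contribute
            total += s * dt
        out.append(total)
    return out
```

In the hardware, the same accumulate-with-mask logic maps naturally onto the MAC co-processor, with n_mask as one of the programmable parameters set through the μP.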

  16. System design development for microwave and millimeter-wave materials processing

    Science.gov (United States)

    Feher, Lambert; Thumm, Manfred

    2002-06-01

The most notable effect in processing dielectrics with micro- and millimeter-waves is the volumetric heating of these materials, offering the opportunity of very high heating rates for the samples. In comparison to conventional heating, where the heat transfer is diffusive and depends on the thermal conductivity of the material, the microwave field penetrates the sample and acts as an instantaneous heat source at each point of the sample. Owing to this unique property, microwave heating at the 2.45 GHz and 915 MHz ISM (industrial, scientific, medical) frequencies has been established as an important industrial technology for more than 50 years. Successful industrial applications of microwaves include food processing systems, domestic ovens, the rubber industry, vacuum drying, etc. The present paper outlines microwave system development at Forschungszentrum Karlsruhe, IHM, transferring properties from the higher-frequency regime (millimeter-waves) to lower-frequency applications. In any case, the need to use higher frequencies such as 24 GHz (an ISM frequency) for industrial applications has to be carefully verified, either with respect to specific physical or engineering advantages or with respect to the limits that standard microwave technology meets for the given problem.

  17. DUAL-PROCESS, a highly reliable process control system

    International Nuclear Information System (INIS)

    Buerger, L.; Gossanyi, A.; Parkanyi, T.; Szabo, G.; Vegh, E.

    1983-02-01

A multiprocessor process control system is described. During its development, reliability was the most important aspect, because the system is used in the computerized control of a 5 MW research reactor. DUAL-PROCESS is fully compatible with the earlier single-processor control system PROCESS-24K. The paper deals in detail with the communication, synchronization, error detection and error recovery problems of the operating system. (author)

  18. Sampling and Timing: A Task for the Environmental Process

    NARCIS (Netherlands)

    Hilderink, G.H.; Broenink, Johannes F.

    2003-01-01

    Sampling and timing are considered a responsibility of the environment of the controller software. In this paper we illustrate a concept whereby an environmental process and multi-way events play an important role in applying timing to untimed CSP software architectures. We use this timing concept

  19. A Real-Time Image Acquisition And Processing System For A RISC-Based Microcomputer

    Science.gov (United States)

    Luckman, Adrian J.; Allinson, Nigel M.

    1989-03-01

A low-cost image acquisition and processing system has been developed for the Acorn Archimedes microcomputer. Using a reduced instruction set computer (RISC) architecture, the ARM (Acorn Risc Machine) processor provides instruction speeds suitable for image processing applications. The associated improvement in data transfer rate has allowed real-time video image acquisition without the need for frame-store memory external to the microcomputer. The system comprises real-time video digitising hardware, which interfaces directly to the Archimedes memory, and software providing an integrated image acquisition and processing environment. The hardware can digitise a video signal at up to 640 samples per video line, with programmable parameters such as sampling rate and gain. Software support includes a working environment for image capture and processing with pixel, neighbourhood and global operators. A friendly user interface is provided with the help of the Archimedes operating system's WIMP (Windows, Icons, Mouse and Pointer) Manager. Windows provide a convenient way of handling images on the screen, and program control is directed mostly by pop-up menus.

  20. Extreme Environment Sampling System Deployment Mechanism, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Future Venus or Comet mission architectures may feature robotic sampling systems comprised of a Sampling Tool and Deployment Mechanism. Since 2005, Honeybee has been...

  1. Aerosol particle losses in sampling systems

    International Nuclear Information System (INIS)

    Fan, B.J.; Wong, F.S.; Ortiz, C.A.; Anand, N.K.; McFarland, A.R.

    1993-01-01

When aerosols are sampled from stacks and ducts, it is usually necessary to transport them from the point of sampling to a location of collection or analysis. Losses of aerosol particles can occur in the inlet region of the probe, in straight horizontal and vertical tubes, and in elbows. For probes in laminar flow, the Saffman lift force can cause substantial particle losses over a short inlet region. An empirical model has been developed to predict probe inlet losses, which are often of the order of 40% for 10 μm AED particles. A user-friendly PC computer code, DEPOSITION, has been set up to model losses in transport systems. Experiments have been conducted to compare the actual aerosol particle losses in transport systems with those predicted by the DEPOSITION code.

  2. Robust nonhomogeneous training samples detection method for space-time adaptive processing radar using sparse-recovery with knowledge-aided

    Science.gov (United States)

    Li, Zhihui; Liu, Hanwei; Zhang, Yongshun; Guo, Yiduo

    2017-10-01

The performance of space-time adaptive processing (STAP) may degrade significantly when some of the training samples are contaminated by signal-like components (outliers) in nonhomogeneous clutter environments. To remove training samples contaminated by outliers in such environments, a robust nonhomogeneous training sample detection method using sparse recovery (SR) with knowledge aiding (KA) is proposed. First, the reduced-dimension (RD) overcomplete spatial-temporal steering dictionary is designed using prior knowledge of the system parameters and the possible target region. Second, the clutter covariance matrix (CCM) of the cell under test is efficiently estimated using a modified focal underdetermined system solver (FOCUSS) algorithm, in which the RD overcomplete spatial-temporal steering dictionary is applied. Third, the proposed statistics are formed by combining the estimated CCM with the generalized inner product (GIP) method, and the contaminated training samples can be detected and removed. Finally, several simulation results validate the effectiveness of the proposed KA-SR-GIP method.
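The FOCUSS algorithm referred to above is, at its core, an iteratively reweighted minimum-norm solver. Below is a generic Python/NumPy sketch of the basic iteration, not the paper's modified knowledge-aided variant.

```python
import numpy as np

def focuss(A, b, n_iter=30, eps=1e-8):
    """Basic FOCUSS: iteratively reweighted minimum-norm solution of
    the underdetermined system A x = b, which concentrates energy on
    a few entries of x."""
    x = np.linalg.pinv(A) @ b                # minimum-norm start
    for _ in range(n_iter):
        W = np.diag(np.abs(x) + eps)         # reweight toward large entries
        x = W @ np.linalg.pinv(A @ W) @ b    # weighted minimum-norm step
    return x
```

With an overcomplete steering dictionary as A, the iteration drives most coefficients toward zero, which is what makes a usable CCM estimate from limited data feasible.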

  3. Design of signal reception and processing system of embedded ultrasonic endoscope

    Science.gov (United States)

    Li, Ming; Yu, Feng; Zhang, Ruiqiang; Li, Yan; Chen, Xiaodong; Yu, Daoyin

    2009-11-01

    The Embedded Ultrasonic Endoscope, based on an embedded microprocessor and an embedded real-time operating system, sends a micro ultrasonic probe into the coelom through the biopsy channel of the electronic endoscope to obtain the histological features of the digestive organs by rotary scanning, and acquires pictures of the alimentary canal mucosal surface. At the same time, the ultrasonic signals are processed by the signal reception and processing system, forming images of the full histology of the digestive organs. The signal reception and processing system is an important component of the Embedded Ultrasonic Endoscope. However, the traditional design, which uses multi-level amplifiers and special digital processing circuits, no longer satisfies the high-performance, miniaturization, and low-power requirements of an embedded system, and the high noise introduced by multi-level amplifiers makes the extraction of small signals difficult. Therefore, this paper presents a signal reception and processing method based on a double variable-gain amplifier and an FPGA, increasing the flexibility and dynamic range of the signal reception and processing system, improving the system noise level, and reducing power consumption. Finally, we set up an embedded experimental system that uses a transducer with a center frequency of 8 MHz to scan membrane samples and displays the image of the ultrasonic echo reflected by each layer of the membrane at a frame rate of 5 Hz, verifying the correctness of the system.

  4. Gaussian process based intelligent sampling for measuring nano-structure surfaces

    Science.gov (United States)

    Sun, L. J.; Ren, M. J.; Yin, Y. H.

    2016-09-01

    Nanotechnology is the science and engineering of manipulating matter at the nanoscale, and it can be used to create many new materials and devices with a vast range of applications. As nanotech products increasingly enter the commercial marketplace, nanometrology becomes a stringent and enabling technology for the manipulation and quality control of nanotechnology. However, many measuring instruments, for instance scanning probe microscopes, are limited to relatively small areas of hundreds of micrometers and have very low efficiency. Intelligent sampling strategies are therefore required to improve the scanning efficiency when measuring large areas. This paper presents a Gaussian process based intelligent sampling method to address this problem. The method uses Gaussian process based Bayesian regression as the mathematical foundation for representing the surface geometry, and the posterior estimate of the Gaussian process is computed by combining the prior probability distribution with the maximum likelihood function. Each sampling point is then selected adaptively by determining the candidate position most likely to lie outside the required tolerance zone, and this point is inserted to update the model iteratively. Simulations on both nominal and manufactured nano-structure surfaces have been conducted to verify the validity of the proposed method. The results imply that the proposed method significantly improves measurement efficiency when measuring large structured surfaces.
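
The adaptive loop described above (fit a Gaussian process to the points measured so far, then sample next where the posterior is least certain) can be sketched in a few lines. This is a minimal illustration rather than the authors' implementation: the RBF kernel, its length scale, and the use of maximum posterior variance as a proxy for the tolerance-zone criterion are all assumptions.

```python
import numpy as np

def rbf_kernel(a, b, length=0.1, var=1.0):
    """Squared-exponential covariance between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_cand, noise=1e-6):
    """GP posterior mean and variance at the candidate points."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_cand)
    Kss = rbf_kernel(x_cand, x_cand)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks.T @ alpha
    v = np.linalg.solve(K, Ks)
    var = np.diag(Kss - Ks.T @ v)
    return mean, var

def adaptive_sample(surface, x0, candidates, n_points):
    """Iteratively measure the candidate with the largest posterior variance."""
    xs = list(x0)
    ys = [surface(x) for x in xs]
    for _ in range(n_points):
        _, var = gp_posterior(np.array(xs), np.array(ys), candidates)
        nxt = candidates[np.argmax(var)]   # most uncertain location
        xs.append(nxt)
        ys.append(surface(nxt))
    return np.array(xs)

surface = lambda x: np.sin(2 * np.pi * x)   # stand-in for a measured profile
cands = np.linspace(0.0, 1.0, 101)
pts = adaptive_sample(surface, [0.0, 1.0], cands, 8)
```

Starting from the two boundary seeds, the selector repeatedly fills in the most uncertain region, so the surface is covered without a dense raster scan.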

  5. Ergonomic analysis of radiopharmaceuticals samples preparation process

    International Nuclear Information System (INIS)

    Gomes, Luciene Betzler C.; Santos, Isaac Luquetti dos; Fonseca, Antonio Carlos C. da; Pellini, Marcos Pinto; Rebelo, Ana Maria

    2005-01-01

    The doses of radioisotopes to be administered to patients for diagnosis or therapy are prepared in the radiopharmacy sector. The preparation process adopts techniques aimed at reducing the exposure time of the professionals and the absorption of excessive doses by patients. An ergonomic analysis of this process contributes to the prevention of occupational illness and of accident risks during routine work, providing well-being and safety to the users involved and conferring an adequate working standard on the process. In this context, studies that analyze the relevant factors, point toward solutions, and establish proposals that minimize risks in these activities are clearly relevant. Through a methodology based on the concepts of ergonomics, the aim is to improve effectiveness and quality and to reduce the difficulties experienced by the workers. The prescribed work, established through codified norms and procedures, is compared with the work as actually carried out, the real work, in order to arrive at a correct appraisal focused on the activities. The objective of this work is to discuss an ergonomic analysis of the radioisotope sample preparation process in the Setor de Radiofarmacia do Hospital Universitario Clementino Fraga Filho da Universidade Federal do Rio de Janeiro (UFRJ). (author)

  6. Preliminary level 2 specification for the nested, fixed-depth sampling system

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    This preliminary Level 2 Component Specification establishes the performance, design, development, and test requirements for the in-tank sampling system that will support the BNFL contract in the final disposal of Hanford's High Level Wastes (HLW) and Low Activity Wastes (LAW). The PHMC will provide Low Activity Waste (LAW) tank wastes for final treatment by BNFL from double-shell feed tanks. Concerns about the inability of the baseline "grab" sampling to provide large-volume samples within time constraints have led to the development of a nested, fixed-depth sampling system. This sampling system will provide large-volume, representative samples without the environmental, radiation-exposure, and sample-volume impacts of the current baseline "grab" sampling method. This preliminary Level 2 Component Specification is not a general specification for tank sampling, but is based on a "record of decision", AGA (HNF-SD-TWR-AGA-001), the System Specification for the Double Shell Tank System (HNF-SD-WM-TRD-O07), and the BNFL privatization contract

  7. Aggregation and sampling in deterministic chaos: implications for chaos identification in hydrological processes

    Directory of Open Access Journals (Sweden)

    J. D. Salas

    2005-01-01

    A review of the literature reveals conflicting results regarding the existence and inherent nature of chaos in hydrological processes such as precipitation and streamflow, i.e. whether they are low-dimensional chaotic or stochastic. This issue is examined further in this paper, particularly the effect that certain types of transformations, such as aggregation and sampling, may have on the identification of the dynamics of the underlying system. First, we investigate the dynamics of daily streamflows for two rivers in Florida, one with strong surface and groundwater storage contributions and the other with a lesser basin storage contribution. Based on estimates of the delay time, the delay time window, and the correlation integral, our results suggest that the river with the stronger basin storage contribution departs significantly from the behavior of a chaotic system, while the departure is less significant for the river with the smaller basin storage contribution. We pose the hypothesis that the chaotic behavior depicted in continuous precipitation fields or small-time-step precipitation series becomes less identifiable as the aggregation (or sampling) time step increases. Similarly, because streamflows result from a complex transformation of precipitation that involves accumulating and routing excess rainfall throughout the basin and adding surface and groundwater flows, the end result may be that streamflows at the outlet of the basin depart from low-dimensional chaotic behavior. We also investigate the effect of aggregation and sampling using series derived from the Lorenz equations and show that, as the aggregation and sampling scales increase, the chaotic behavior deteriorates and eventually ceases to show evidence of low-dimensional determinism.
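
The aggregation and sampling transformations applied to the Lorenz-derived series can be reproduced with a short script. This is a sketch under assumed settings (standard Lorenz parameters, a 0.01 time step, block averaging versus decimation); the paper's actual chaos diagnostics, such as the delay time and correlation integral, are not computed here.

```python
import numpy as np

def lorenz_series(n, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz equations (4th-order Runge-Kutta), return x(t)."""
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    s = np.array([1.0, 1.0, 1.0])
    out = np.empty(n)
    for i in range(n):
        k1 = f(s); k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
        s = s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        out[i] = s[0]
    return out

def aggregate(series, m):
    """Non-overlapping block averages at aggregation scale m."""
    n = len(series) // m
    return series[: n * m].reshape(n, m).mean(axis=1)

def sample(series, m):
    """Keep every m-th value (sampling, as opposed to aggregation)."""
    return series[::m]

x = lorenz_series(2000)
x_agg = aggregate(x, 10)   # coarser series via block averaging
x_smp = sample(x, 10)      # coarser series via decimation
```

Running delay-embedding diagnostics on `x`, `x_agg`, and `x_smp` at increasing `m` is the kind of experiment the abstract describes: determinism that is visible in `x` washes out as the scale grows.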

  8. Development of FPGA-based digital signal processing system for radiation spectroscopy

    International Nuclear Information System (INIS)

    Lee, Pil Soo; Lee, Chun Sik; Lee, Ju Hahn

    2013-01-01

    We have developed an FPGA-based digital signal processing system that performs both online digital signal filtering and pulse-shape analysis for both particle and gamma-ray spectroscopy. Such functionality was made possible by a state-of-the-art programmable logic device and the system architecture employed. The system performance, as measured, for example, by the system dead time and the accuracy of pulse-height and rise-time determination, was evaluated with standard alpha- and gamma-ray sources using a CsI(Tl) scintillation detector. The results show that the present system has potential applications in various radiation-related fields such as particle identification, radiography, and radiation imaging. - Highlights: ► An FPGA-based digital processing system was developed for radiation spectroscopy. ► Our digital system has a 14-bit resolution and a 100-MHz sampling rate. ► The FPGA implements the online digital filtering and pulse-shape analysis. ► The pileup rejection is implemented in trigger logic before the digital filtering process. ► Our digital system was verified in alpha-gamma measurements using a CsI detector

  9. Rapid surface sampling and archival record system (RSSAR)

    International Nuclear Information System (INIS)

    Barren, E.; Bracco, A.; Dorn, S.B.

    1997-01-01

    The purpose is to develop a rapid surface (concrete, steel) contamination measurement system that will provide a "quick-look" indication of contaminated areas, an archival record, and automated analysis. A bulk sampling oven is also being developed. The sampling device consists of a sampling head, a quick-look detector, and an archiving system (sorbent tube). The head thermally desorbs semi-volatiles, such as PCBs, oils, etc., from concrete and steel surfaces; the volatilized materials are passed through the quick-look detector. The sensitivity of the detector can be attenuated for various contaminant levels. Volatilized materials are trapped in a tube filled with adsorbent. The tubes are housed in a magazine that also archives information about the sampling conditions. Analysis of the tubes can be done at a later date. The concrete sampling head is fitted with a tungsten-halogen lamp; in laboratory experiments it has extracted model contaminants by heating the top 4 mm of the surface to 250 °C within 100-200 s. The steel sampling head has been tested on different types of steels and has extracted model contaminants within 30 s. A mathematical model of heat and mass transport in concrete has been developed. The rate of contaminant removal is at a maximum when the moisture content is about 100 kg/m³. The system will be useful during decontamination and decommissioning operations

  10. Research on photodiode detector-based spatial transient light detection and processing system

    Science.gov (United States)

    Liu, Meiying; Wang, Hu; Liu, Yang; Zhao, Hui; Nan, Meng

    2016-10-01

    In order to realize real-time signal identification and processing of spatial transient light, the features and energy of the captured target light signal are first described and quantitatively calculated. Considering that the transient light signal occurs randomly, has a short duration, and has an evident beginning and ending, a photodiode-detector-based spatial transient light detection and processing system is proposed and designed in this paper. This system has a large field of view and is used to realize non-imaging energy detection of random, transient, and weak point targets under the complex background of the space environment. Weak-signal extraction under a strong background is difficult. In this paper, considering that the background signal changes slowly while the target signal changes quickly, a filter is adopted for background subtraction. Variable-speed sampling is realized by sampling data points at a gradually increasing interval, which resolves two dilemmas: the real-time processing of a large amount of data, and the power consumption required to store it. Test results with a self-made simulated signal demonstrate the effectiveness of the design scheme. The practical system operates reliably, and detection and processing of the target signal under a strong-sunlight background was realized. The results indicate that the system can detect the target signal's characteristic waveform in real time and monitor the system's working parameters. The prototype design could be used in a variety of engineering applications.
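
The two ideas in this design, slow-background subtraction and sampling at a gradually increasing interval, can be sketched as follows. The filter choice (an exponential moving average), the growth factor, and the signal shapes are illustrative assumptions, not the parameters of the actual system.

```python
import numpy as np

def variable_speed_indices(n, start_step=1, growth=1.2, max_step=64):
    """Indices sampled at a gradually increasing interval.

    Dense sampling right after the trigger (index 0), progressively
    sparser afterwards, which bounds both storage and processing load.
    """
    idx, i, step = [], 0, float(start_step)
    while i < n:
        idx.append(i)
        i += int(round(step))
        step = min(step * growth, max_step)
    return np.array(idx)

def subtract_background(signal, alpha=0.01):
    """Remove a slowly varying background with an exponential moving average.

    The EMA tracks the slow background; the residual keeps the fast
    transient component of interest.
    """
    bg = np.empty_like(signal)
    acc = signal[0]
    for i, v in enumerate(signal):
        acc = (1 - alpha) * acc + alpha * v
        bg[i] = acc
    return signal - bg

t = np.arange(5000)
background = 0.001 * t                                    # slow drift
transient = np.where((t > 2000) & (t < 2050), 5.0, 0.0)   # brief flash
clean = subtract_background(background + transient)
idx = variable_speed_indices(5000)
```

The flash survives background subtraction almost intact while the drift is suppressed, and the index schedule keeps only a small fraction of the 5000 raw points.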

  11. Fault tolerant controllers for sampled-data systems

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, Jakob

    2004-01-01

    A general compensator architecture for fault tolerant control (FTC) for sampled-data systems is proposed. The architecture is based on the YJBK parameterization of all stabilizing controllers, and uses the dual YJBK parameterization to quantify the performance of the fault tolerant system. The FTC...

  12. 46 CFR 161.002-15 - Sample extraction smoke detection systems.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 6 2010-10-01 2010-10-01 false Sample extraction smoke detection systems. 161.002-15..., CONSTRUCTION, AND MATERIALS: SPECIFICATIONS AND APPROVAL ELECTRICAL EQUIPMENT Fire-Protective Systems § 161.002-15 Sample extraction smoke detection systems. The smoke detecting system must consist of a means for...

  13. Establishing and evaluating bar-code technology in a blood sampling system: a model based on a human-centered design method.

    Science.gov (United States)

    Chou, Shin-Shang; Yan, Hsiu-Fang; Huang, Hsiu-Ya; Tseng, Kuan-Jui; Kuo, Shu-Chen

    2012-01-01

    This study used a human-centered design method to develop bar-code technology for the blood sampling process. Using multilevel analysis to gather information, the bar-code technology was constructed to identify patients, simplify the work process, and reduce medical error rates. A Technology Acceptance Model questionnaire was developed to assess the effectiveness of the system, and data on patient identification and sample errors were collected daily. The average score for the 8 items on users' perceived ease of use was 25.21 (3.72), for the 9 items on perceived usefulness 28.53 (5.00), and for the 14 items on task-technology fit 52.24 (7.09). The rates of patient identification errors and of samples with cancelled orders dropped to zero; however, a new type of error appeared after the new system was deployed, concerning the position of barcode stickers on the sample tubes. Overall, more than half of the nurses (62.5%) were willing to use the new system.

  14. Sample pretreatment in microsystems

    DEFF Research Database (Denmark)

    Perch-Nielsen, Ivan R.

    2003-01-01

    When a sample, e.g. from a patient, is processed using conventional methods, the sample must be transported to the laboratory where it is analyzed, after which the results are sent back. By integrating the separate steps of the analysis in a micro total analysis system (μTAS), results can be obtained faster and better, preferably with all the processes from sample to signal moved to the bedside of the patient. Of course there is still much to learn and study in the process of miniaturization. DNA analysis is one process subject to integration. There are roughly three steps in a DNA analysis: sample preparation → DNA amplification → DNA analysis. The overall goal of the project is integration of as many as possible of these steps. This thesis covers mainly pretreatment in a microchip. Some methods for sample pretreatment have been tested, the most conventional being fluorescence activated cell sorting...

  15. A hybrid solution using computational prediction and measured data to accurately determine process corrections with reduced overlay sampling

    Science.gov (United States)

    Noyes, Ben F.; Mokaberi, Babak; Mandoy, Ram; Pate, Alex; Huijgen, Ralph; McBurney, Mike; Chen, Owen

    2017-03-01

    Reducing overlay error via an accurate APC feedback system is one of the main challenges in high volume production of the current and future nodes in the semiconductor industry. The overlay feedback system directly affects the number of dies meeting overlay specification and the number of layers requiring dedicated exposure tools through the fabrication flow. Increasing the former number and reducing the latter number is beneficial for the overall efficiency and yield of the fabrication process. An overlay feedback system requires accurate determination of the overlay error, or fingerprint, on exposed wafers in order to determine corrections to be automatically and dynamically applied to the exposure of future wafers. Since current and future nodes require correction per exposure (CPE), the resolution of the overlay fingerprint must be high enough to accommodate CPE in the overlay feedback system, or overlay control module (OCM). Determining a high resolution fingerprint from measured data requires extremely dense overlay sampling that takes a significant amount of measurement time. For static corrections this is acceptable, but in an automated dynamic correction system this method creates extreme bottlenecks for the throughput of said system as new lots have to wait until the previous lot is measured. One solution is using a less dense overlay sampling scheme and employing computationally up-sampled data to a dense fingerprint. That method uses a global fingerprint model over the entire wafer; measured localized overlay errors are therefore not always represented in its up-sampled output. This paper will discuss a hybrid system shown in Fig. 1 that combines a computationally up-sampled fingerprint with the measured data to more accurately capture the actual fingerprint, including local overlay errors. Such a hybrid system is shown to result in reduced modelled residuals while determining the fingerprint, and better on-product overlay performance.
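
The hybrid idea, computationally up-sampling a global fingerprint model and then re-injecting the measured local residuals, can be sketched as follows. The polynomial basis, the Gaussian fade of each residual correction, and the `radius` parameter are illustrative assumptions; production overlay control models are far richer.

```python
import numpy as np

def fit_global_model(xy, v):
    """Least-squares fit of a low-order global fingerprint model.

    Basis 1, x, y, x*y is a stand-in for the real wafer model terms."""
    A = np.column_stack([np.ones(len(xy)), xy[:, 0], xy[:, 1], xy[:, 0] * xy[:, 1]])
    coef, *_ = np.linalg.lstsq(A, v, rcond=None)
    return coef

def model_predict(coef, xy):
    A = np.column_stack([np.ones(len(xy)), xy[:, 0], xy[:, 1], xy[:, 0] * xy[:, 1]])
    return A @ coef

def hybrid_fingerprint(xy_meas, v_meas, xy_dense, radius=1.0):
    """Up-sampled global model plus measured local residuals.

    Dense points near a measurement inherit that measurement's model
    residual, so local overlay errors survive the up-sampling."""
    coef = fit_global_model(xy_meas, v_meas)
    dense = model_predict(coef, xy_dense)
    resid = v_meas - model_predict(coef, xy_meas)
    for p, r in zip(xy_meas, resid):
        d = np.linalg.norm(xy_dense - p, axis=1)
        dense += np.exp(-(d / radius) ** 2) * r   # fade correction with distance
    return dense

# a 5x5 measurement grid with one localized overlay error at (2, 2)
xs, ys = np.meshgrid(np.arange(5.0), np.arange(5.0))
xy = np.column_stack([xs.ravel(), ys.ravel()])
v = 0.1 * xy[:, 0] + 0.2 * xy[:, 1]
v[12] += 1.0                       # local error the global model cannot capture
hybrid = hybrid_fingerprint(xy, v, xy)
model = model_predict(fit_global_model(xy, v), xy)
```

The global model alone misses most of the local error at (2, 2), while the hybrid output recovers it, which is the motivation given for the hybrid OCM scheme.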

  16. Capillary absorption spectrometer and process for isotopic analysis of small samples

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, M. Lizabeth; Kelly, James F.; Sams, Robert L.; Moran, James J.; Newburn, Matthew K.; Blake, Thomas A.

    2016-03-29

    A capillary absorption spectrometer and process are described that provide highly sensitive and accurate stable-isotope absorption measurements of analytes in a sample gas, which may include isotopologues of carbon and oxygen obtained from gas and biological samples. The system further provides isotopic images of microbial communities that allow tracking of nutrients at the single-cell level. It targets naturally occurring variations in carbon and oxygen isotopes, which avoids the need for expensive isotopically labeled mixtures and allows study of samples taken from the field without modification. The method also permits sampling in vivo, enabling real-time ambient studies of microbial communities.

  17. Robust media processing on programmable power-constrained systems

    Science.gov (United States)

    McVeigh, Jeff

    2005-03-01

    To achieve consumer-level quality, media systems must process continuous streams of audio and video data while maintaining exacting tolerances on sampling rate, jitter, synchronization, and latency. While it is relatively straightforward to design fixed-function hardware implementations to satisfy worst-case conditions, there is a growing trend to utilize programmable multi-tasking solutions for media applications. The flexibility of these systems enables support for multiple current and future media formats, which can reduce design costs and time-to-market. This paper provides practical engineering solutions to achieve robust media processing on such systems, with specific attention given to power-constrained platforms. The techniques covered in this article utilize the fundamental concepts of algorithm and software optimization, software/hardware partitioning, stream buffering, hierarchical prioritization, and system resource and power management. A novel enhancement to dynamically adjust processor voltage and frequency based on buffer fullness to reduce system power consumption is examined in detail. The application of these techniques is provided in a case study of a portable video player implementation based on a general-purpose processor running a non real-time operating system that achieves robust playback of synchronized H.264 video and MP3 audio from local storage and streaming over 802.11.
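
The buffer-fullness-driven voltage/frequency adjustment can be illustrated with a small control loop. The frequency levels and thresholds below are hypothetical, not taken from the paper; a real implementation would also scale voltage together with frequency and hook into the platform's power-management interface.

```python
# hypothetical frequency steps (MHz) for a scalable processor
LEVELS = [200, 400, 600, 800]

def next_level(level_idx, fullness, low=0.3, high=0.7):
    """Pick the next frequency step from decode-buffer fullness (0.0-1.0).

    A draining buffer (below `low`) means the renderer is outpacing the
    decoder, so step the clock up; a nearly full buffer (above `high`)
    means there is headroom to slow down and save power.
    """
    if fullness < low and level_idx < len(LEVELS) - 1:
        return level_idx + 1
    if fullness > high and level_idx > 0:
        return level_idx - 1
    return level_idx

def run(occupancy_trace, start=1):
    """Apply the policy to a trace of buffer-fullness readings."""
    idx = start
    freqs = []
    for f in occupancy_trace:
        idx = next_level(idx, f)
        freqs.append(LEVELS[idx])
    return freqs
```

A sustained low-occupancy trace ramps the clock up step by step, while a full buffer walks it back down, which is the qualitative behavior the paper's enhancement targets.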

  18. On the sample transport time of a pneumatic transfer system

    International Nuclear Information System (INIS)

    Kondo, Yoshihide

    1983-01-01

    The counts accumulated in the measuring system are affected by variations in the transport time of the sample in cyclic activation experiments with a mechanical sample transfer system. With the pneumatic transfer system that has been set up, the transport time varies according to factors such as the form, size, and weight of the samples, the pneumatic pressure, and so on. Understanding the relationships between the transport time and these variable factors is essential for conducting experiments with this transfer system. (author)

  19. Assessment of denitrification process in lower Ishikari river system, Japan.

    Science.gov (United States)

    Jha, Pawan Kumar; Minagawa, Masao

    2013-11-01

    The sediment denitrification rate and its role in removal of the dissolved nitrate load in the lower Ishikari river system were examined. Denitrification rates were measured using the acetylene inhibition technique on sediment samples collected during August 2009-July 2010. The denitrification rate varied from 0.001 to 1.9 μg N g⁻¹ DM h⁻¹, with an average value of 0.21 μg N g⁻¹ DM h⁻¹, in the lower Ishikari river system. The denitrification rate showed a positive correlation with the dissolved nitrate concentration in the river basin, indicating that the overlying water column supplied nitrate for the sediment denitrification processes. Nutrient enrichment experiments showed that the denitrification rate increased significantly with the addition of nitrate in samples collected from Barato Lake; however, no such increase was observed in samples collected from the Ishikari river main channel and its major tributaries, indicating that factors other than substrate concentration, such as the denitrifier population and the hydrological properties of the stream channel, including channel depth and flow velocity, may affect the denitrification rate in the lower Ishikari river system. The denitrification rate showed no significant increase with the addition of labile carbon (glucose), indicating that the sediment samples had sufficient organic matter to sustain denitrification activity. The results of a nutrient spiraling model indicate that the in-stream denitrification process removes on average 5% d⁻¹ of the dissolved nitrate load in the Ishikari river. This study was carried out to fill the gap in the availability of riverine denitrification rate measurements and their role in the nitrogen budget for Japanese rivers, which are characterized by short river lengths and high flow rates. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Quality evaluation of processed clay soil samples | Steiner-Asiedu ...

    African Journals Online (AJOL)

    Introduction: This study assessed the microbial quality of clay samples sold in two of the major Ghanaian markets. Methods: The study was a cross-sectional evaluation of processed clay and the effects it has on the nutrition of consumers in the capital of Ghana. The items for the examination were ...

  1. Modular microfluidic system for biological sample preparation

    Science.gov (United States)

    Rose, Klint A.; Mariella, Jr., Raymond P.; Bailey, Christopher G.; Ness, Kevin Dean

    2015-09-29

    A reconfigurable modular microfluidic system for preparation of a biological sample, including a series of reconfigurable modules for automated sample preparation adapted to selectively include a) a microfluidic acoustic focusing filter module, b) a dielectrophoresis bacteria filter module, c) a dielectrophoresis virus filter module, d) an isotachophoresis nucleic acid filter module, e) a lysis module, and f) an isotachophoresis-based nucleic acid filter.

  2. DSMC multicomponent aerosol dynamics: Sampling algorithms and aerosol processes

    Science.gov (United States)

    Palaniswaamy, Geethpriya

    The post-accident nuclear reactor primary and containment environments can be characterized by high temperatures and pressures, fission products, and nuclear aerosols. These aerosols evolve via natural transport processes as well as under the influence of engineered safety features. They can be hazardous and may pose a risk to the public if released into the environment. Computations of their evolution, movement, and distribution involve the study of various processes such as coagulation, deposition, and condensation, and are influenced by factors such as particle shape, charge, radioactivity, and spatial inhomogeneity. These many factors make the numerical study of nuclear aerosol evolution computationally very complicated. The focus of this research is on the use of the Direct Simulation Monte Carlo (DSMC) technique to elucidate the role of the various phenomena that influence nuclear aerosol evolution. In this research, several aerosol processes such as coagulation, deposition, condensation, and source reinforcement are explored for a multi-component aerosol dynamics problem in a spatially homogeneous medium. Among the various sampling algorithms explored, the Metropolis sampling algorithm was found to be effective and fast. Several test problems and test cases are simulated using the DSMC technique, and the DSMC results are verified against analytical and sectional results for appropriate test problems. The results show that the assumption of a single mean density is not appropriate because of the complicated effect of component densities on the aerosol processes. The methods developed and the insights gained will also be helpful in future research on the challenges associated with the description of fission product and aerosol releases.
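
The Metropolis sampling step can be illustrated on a toy discrete problem, e.g. selecting among particle pairs with unnormalized kernel-like weights. The weights and the uniform proposal below are illustrative assumptions; the dissertation's actual multicomponent sampling is more involved.

```python
import random

def metropolis_sample(weights, n_steps, seed=0):
    """Metropolis chain over discrete states with unnormalized weights.

    Proposes a uniformly random state and accepts with probability
    min(1, w_new / w_old); visit frequencies converge to w / sum(w).
    """
    rng = random.Random(seed)
    state = rng.randrange(len(weights))
    counts = [0] * len(weights)
    for _ in range(n_steps):
        prop = rng.randrange(len(weights))
        if rng.random() < min(1.0, weights[prop] / weights[state]):
            state = prop
        counts[state] += 1
    return counts

# illustrative coagulation-kernel-like weights for three particle pairs
weights = [1.0, 2.0, 4.0]
counts = metropolis_sample(weights, 200_000)
```

The attraction of this scheme for aerosol dynamics is that only weight ratios are needed, so the (expensive) normalization over all particle pairs is never computed.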

  3. System and method for laser assisted sample transfer to solution for chemical analysis

    Science.gov (United States)

    Van Berkel, Gary J; Kertesz, Vilmos

    2014-01-28

    A system and method for laser desorption of an analyte from a specimen and capturing of the analyte in a suspended solvent to form a testing solution are described. The method can include providing a specimen supported by a desorption region of a specimen stage and desorbing an analyte from a target site of the specimen with a laser beam centered at a radiation wavelength (λ). The desorption region is transparent to the radiation wavelength (λ), and the sampling probe and a laser source emitting the laser beam are on opposite sides of a primary surface of the specimen stage. The system can also be arranged so that the laser source and the sampling probe are on the same side of a primary surface of the specimen stage. The testing solution can then be analyzed using an analytical instrument or undergo further processing.

  4. Sampling rare events in nonequilibrium and nonstationary systems.

    Science.gov (United States)

    Berryman, Joshua T; Schilling, Tanja

    2010-12-28

    Although many computational methods for rare event sampling exist, this type of calculation is not usually practical for general nonequilibrium conditions, with macroscopically irreversible dynamics and away from both stationary and metastable states. A novel method for calculating the time-series of the probability of a rare event is presented which is designed for these conditions. The method is validated for the cases of the Glauber-Ising model under time-varying shear flow, the Kawasaki-Ising model after a quench into the region between nucleation dominated and spinodal decomposition dominated phase change dynamics, and the parallel open asymmetric exclusion process. The method requires a subdivision of the phase space of the system: it is benchmarked and found to scale well for increasingly fine subdivisions, meaning that it can be applied without detailed foreknowledge of the physically important reaction pathways.

  5. Analytical Chemistry Division's sample transaction system

    International Nuclear Information System (INIS)

    Stanton, J.S.; Tilson, P.A.

    1980-10-01

    The Analytical Chemistry Division uses the DECsystem-10 computer for a wide range of tasks: sample management, timekeeping, quality assurance, and data calculation. This document describes the features and operating characteristics of many of the computer programs used by the Division. The descriptions are divided into chapters, each of which covers all of the information about one aspect of the Analytical Chemistry Division's computer processing.

  6. A Monte Carlo Sampling Technique for Multi-phonon Processes

    Energy Technology Data Exchange (ETDEWEB)

    Hoegberg, Thure

    1961-12-15

    A sampling technique for selecting scattering angle and energy gain in Monte Carlo calculations of neutron thermalization is described. It is supposed that the scattering is separated into processes involving different numbers of phonons. The number of phonons involved is first determined. Scattering angle and energy gain are then chosen by using special properties of the multi-phonon term.
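
The two-stage scheme, first drawing the number of phonons involved and then the scattering angle and energy gain, can be sketched as follows. The distributions used here (a Poisson phonon number, an isotropic angle, a fixed phonon quantum with random sign) are placeholders, not the physical multi-phonon terms of the report.

```python
import math
import random

def sample_scattering(mean_phonons=0.8, phonon_energy=0.025, seed=None):
    """Two-stage sketch: draw the phonon number, then angle and energy gain.

    Stage 1 uses Knuth's method for a placeholder Poisson phonon count;
    stage 2 draws an isotropic scattering angle and sums random-sign
    phonon quanta (in eV) for the net energy gain.
    """
    rng = random.Random(seed)
    # stage 1: number of phonons exchanged
    n = 0
    l, p = math.exp(-mean_phonons), 1.0
    while True:
        p *= rng.random()
        if p < l:
            break
        n += 1
    # stage 2: scattering angle and net energy gain
    cos_theta = 2.0 * rng.random() - 1.0
    gain = sum(phonon_energy * (1 if rng.random() < 0.5 else -1)
               for _ in range(n))
    return n, cos_theta, gain

samples = [sample_scattering(seed=i) for i in range(1000)]
```

Conditioning the angle/energy draw on the phonon number is the essential structure: each multi-phonon term can use special properties of its own distribution, as the abstract notes.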

  7. IDAPS (Image Data Automated Processing System) System Description

    Science.gov (United States)

    1988-06-24

    This document describes the physical configuration and components used in the image processing system referred to as IDAPS (Image Data Automated ... Processing System). This system was developed by the Environmental Research Institute of Michigan (ERIM) for Eglin Air Force Base. The system is designed

  8. An Observation-based Assessment of Instrument Requirements for a Future Precipitation Process Observing System

    Science.gov (United States)

    Nelson, E.; L'Ecuyer, T. S.; Wood, N.; Smalley, M.; Kulie, M.; Hahn, W.

    2017-12-01

    Global models exhibit substantial biases in the frequency, intensity, duration, and spatial scales of precipitation systems. Much of this uncertainty stems from an inadequate representation of the processes by which water is cycled between the surface and atmosphere and, in particular, those that govern the formation and maintenance of cloud systems and their propensity to form precipitation. Progress toward improving precipitation process models requires observing systems capable of quantifying the coupling between the ice content, vertical mass fluxes, and precipitation yield of precipitating cloud systems. Spaceborne multi-frequency Doppler radar offers a unique opportunity to address this need, but the effectiveness of such a mission is heavily dependent on its ability to observe the processes of interest in the widest possible range of systems. Planning for a next-generation precipitation process observing system should, therefore, start with a fundamental evaluation of the trade-offs between sensitivity, resolution, sampling, cost, and the overall potential scientific yield of the mission. Here we provide an initial assessment of the scientific and economic trade-space by evaluating hypothetical spaceborne multi-frequency radars using a combination of current real-world and model-derived synthetic observations. Specifically, we alter the field of view, vertical resolution, and sensitivity of a hypothetical Ka- and W-band radar system and propagate those changes through precipitation detection and intensity retrievals. The results suggest that sampling biases introduced by reducing sensitivity disproportionately affect the light rainfall and frozen precipitation regimes that are critical for warm cloud feedbacks and ice sheet mass balance, respectively.
Coarser spatial resolution observations introduce regime-dependent biases in both precipitation occurrence and intensity that depend on cloud regime, with even the sign of the bias varying within a

  9. Onco-STS: a web-based laboratory information management system for sample and analysis tracking in oncogenomic experiments.

    Science.gov (United States)

    Gavrielides, Mike; Furney, Simon J; Yates, Tim; Miller, Crispin J; Marais, Richard

    2014-01-01

Whole genomes, whole exomes and transcriptomes of tumour samples are sequenced routinely to identify the drivers of cancer. The systematic sequencing and analysis of tumour samples, as well as other oncogenomic experiments, necessitates the tracking of relevant sample information throughout the investigative process. These metadata of the sequencing and analysis procedures include information about the samples and projects as well as the sequencing centres, platforms, data locations, results locations, alignments, analysis specifications and further information relevant to the experiments. The current work presents a sample tracking system for oncogenomic studies (Onco-STS) to store these data and make them easily accessible to the researchers who work with the samples. The system is a web application, which includes a database and a front-end web page that allows remote access, submission and updating of the sample data in the database. The Grails web application development framework was used for the development and implementation of the system. The resulting Onco-STS solution is efficient, secure and easy to use and is intended to replace the manual handling of text records. Onco-STS allows simultaneous remote access to the system, making collaboration among researchers more effective. The system stores both information on the samples in oncogenomic studies and details of the analyses conducted on the resulting data. Onco-STS is based on open-source software, is easy to develop further and can be modified according to a research group's needs. Hence it is suitable for laboratories that do not require a commercial system.

  10. Mars Aqueous Processing System

    Science.gov (United States)

    Berggren, Mark; Wilson, Cherie; Carrera, Stacy; Rose, Heather; Muscatello, Anthony; Kilgore, James; Zubrin, Robert

    2012-01-01

The goal of the Mars Aqueous Processing System (MAPS) is to establish a flexible process that generates multiple products useful for human habitation. Selectively extracting useful components into an aqueous solution, and then sequentially recovering individual constituents, yields a suite of refined or semi-refined products. Similarities in the bulk composition (although not necessarily the mineralogy) of Martian and Lunar soils potentially make MAPS widely applicable. Similar process steps can be conducted on both Mars and Lunar soils while tailoring the reaction extents and recoveries to the specifics of each location. The MAPS closed-loop process selectively extracts, and then recovers, constituents from soils using acids and bases. The emphasis on Mars involves the production of useful materials such as iron, silica, alumina, magnesia, and concrete, with recovery of oxygen as a byproduct. On the Moon, similar chemistry is applied with emphasis on oxygen production. This innovation has been demonstrated to produce high-grade materials, such as metallic iron, aluminum oxide, magnesium oxide, and calcium oxide, from lunar and Martian soil simulants. Most of the target products exhibited purities of 80 to 90 percent or more, allowing direct use for many potential applications. Up to one-fourth of the feed soil mass was converted to metal, metal oxide, and oxygen products. The soil residue contained elevated silica content, allowing for potential additional refining and extraction of materials needed for photovoltaic, semiconductor, and glass applications. A high-grade iron oxide concentrate derived from lunar soil simulant was used to produce a metallic iron component using a novel, combined hydrogen reduction/metal sintering technique. The part was subsequently machined and found to be structurally sound. 
The behavior of the lunar-simulant-derived iron product was very similar to that produced using the same methods on a Michigan iron

  11. Realization of microcontroller-based process control systems exemplified by a process state monitor for anaerobic biogas fermentation; Realisierung mikrocontrollerbasierter Prozessfuehrungssysteme am Beispiel eines Prozesszustandsmonitors fuer die anaerobe Biogasfermentation

    Energy Technology Data Exchange (ETDEWEB)

    Patzwahl, S.; Kramer, K.D. [Hochschule Harz, Wernigerode (Germany). Fachbereich Automatisierung und Informatik; Nacke, T. [Institut fuer Bioprozess- und Analysenmesstechnik e.V., Heilbad Heiligenstadt (Germany)

    2004-07-01

This paper describes possibilities for realizing microcontroller-based process control systems using strategies from computational intelligence. All design steps are combined in a design process with a direct interface to the process. A further topic is a firmware development system programmed especially for the main steps of the design process. The anaerobic fermentation process in biogas plants serves as an example control application. Using this design process and the programmed software, a process state monitor was developed for the fermentation process. The system is able to classify the process state online in biogas fermentation plants. (orig.)

  12. In-process weld sampling during hot end welds of type W overpacks

    International Nuclear Information System (INIS)

    Barnes, G.A.

    1998-01-01

Establishes the criteria and process controls to be used in obtaining, testing, and evaluating in-process weld samples during the hot end welding of Type W Overpack capsules used to overpack CsCl capsules for storage at WESF.

  13. System Identification of a Non-Uniformly Sampled Multi-Rate System in Aluminium Electrolysis Cells

    Directory of Open Access Journals (Sweden)

    Håkon Viumdal

    2014-07-01

Full Text Available Standard system identification algorithms are usually designed to generate mathematical models with equidistant sampling instants that are equal for both input and output variables. Unfortunately, real industrial data sets are often disrupted by missing samples, variations of sampling rates between variables (also known as multi-rate systems), and intermittent measurements. In industries with event-based maintenance or manual operational measures, intermittent measurements are performed, leading to uneven sampling rates. Such is the case with aluminium smelters, where in addition the materials fed into the cell create even more irregularity in sampling. Both measurements and feeding are mostly manually controlled. A simplified simulation of the metal level in an aluminium electrolysis cell is performed based on mass balance considerations. System identification methods based on Prediction Error Methods (PEM), such as Ordinary Least Squares (OLS), and the subspace method combined Deterministic and Stochastic system identification and Realization (DSR) and its variants, are applied to the model of a single electrolysis cell as found in aluminium smelters. Aliasing due to large sampling intervals can lead to unsuitable models, but with knowledge of the system dynamics it is easier to optimize the sampling and hence achieve successful models. The results of the simulation studies of molten aluminium height in the cells using the various algorithms tally well with the synthetic data sets used. System identification on a smaller data set from a real plant is also implemented in this work. Finally, some concrete suggestions are made for using these models in the smelters.
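The OLS step in the PEM family mentioned above can be sketched on a toy first-order ARX model. The plant, noise level, and parameter values below are illustrative assumptions, not the paper's electrolysis-cell model:

```python
import numpy as np

# Minimal OLS/PEM sketch: identify a first-order ARX model
#   y[k] = a*y[k-1] + b*u[k-1] + e[k]
# from simulated input/output data (all values illustrative).
rng = np.random.default_rng(0)
a_true, b_true, N = 0.9, 0.5, 500

u = rng.standard_normal(N)                 # input excitation
y = np.zeros(N)
for k in range(1, N):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 0.01 * rng.standard_normal()

# Stack regressors [y[k-1], u[k-1]] and solve the least-squares problem.
Phi = np.column_stack([y[:-1], u[:-1]])
a_hat, b_hat = np.linalg.lstsq(Phi, y[1:], rcond=None)[0]
print(f"a_hat = {a_hat:.3f}, b_hat = {b_hat:.3f}")
```

With equidistant, complete sampling the estimates converge to the true parameters; the paper's point is that missing or multi-rate samples break exactly this regressor construction.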

  14. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    Science.gov (United States)

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1985-08-05

A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises: (1) a whole blood sample disc; (2) a serum sample disc; (3) a sample preparation rotor; and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc by capillary action and centrifugally discharged into cavities of the sample preparation rotor, where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods. 5 figs.

  15. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    Science.gov (United States)

    Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.

    1988-01-01

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.

  16. Automated CBED processing: Sample thickness estimation based on analysis of zone-axis CBED pattern

    Energy Technology Data Exchange (ETDEWEB)

    Klinger, M., E-mail: klinger@post.cz; Němec, M.; Polívka, L.; Gärtnerová, V.; Jäger, A.

    2015-03-15

    An automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns each corresponding to different known thicknesses. The thickness of the most similar simulated pattern is then taken as the thickness estimate. The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns in two beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar{sup +} ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from the thickness estimation, it can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. - Highlights: • Automated TEM sample thickness estimation using zone-axis CBED is presented. • Computer vision and artificial intelligence are employed in CBED processing. • This approach reduces operator effort, analysis time and increases repeatability. • Individual parts can be employed in other analyses of CBED/diffraction pattern.
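The matching step described above, comparing an experimental pattern disk-by-disk against a series of simulations for known thicknesses and taking the thickness of the best match, can be sketched as follows. The synthetic Gaussian "patterns" and the correlation score are illustrative stand-ins for real simulated CBED patterns:

```python
import numpy as np

def best_thickness(experimental, simulated_stack, thicknesses):
    """Return the thickness whose simulated pattern best matches the
    experimental pattern under a normalized cross-correlation score."""
    def norm(p):
        p = p - p.mean()
        return p / (np.linalg.norm(p) + 1e-12)
    e = norm(experimental)
    scores = [float(np.sum(e * norm(s))) for s in simulated_stack]
    return thicknesses[int(np.argmax(scores))]

# Synthetic stand-ins: "patterns" are Gaussians whose width grows with
# thickness (purely illustrative; real input would be simulated CBED disks).
xx, yy = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
def pattern(t_nm):
    return np.exp(-(xx**2 + yy**2) / (0.05 + 0.002 * t_nm))

thicknesses = [20, 40, 60, 80, 100]                  # nm, illustrative
stack = [pattern(t) for t in thicknesses]
noise = 0.01 * np.random.default_rng(1).standard_normal(xx.shape)
t_est = best_thickness(pattern(60) + noise, stack, thicknesses)
print(t_est)  # 60
```

The real tool adds the hard parts this sketch omits: disk detection, pattern localization, and unifying the coordinate systems of experimental and simulated patterns before any comparison is made.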

  17. Invention and validation of an automated camera system that uses optical character recognition to identify patient name mislabeled samples.

    Science.gov (United States)

    Hawker, Charles D; McCarthy, William; Cleveland, David; Messinger, Bonnie L

    2014-03-01

    Mislabeled samples are a serious problem in most clinical laboratories. Published error rates range from 0.39/1000 to as high as 1.12%. Standardization of bar codes and label formats has not yet achieved the needed improvement. The mislabel rate in our laboratory, although low compared with published rates, prompted us to seek a solution to achieve zero errors. To reduce or eliminate our mislabeled samples, we invented an automated device using 4 cameras to photograph the outside of a sample tube. The system uses optical character recognition (OCR) to look for discrepancies between the patient name in our laboratory information system (LIS) vs the patient name on the customer label. All discrepancies detected by the system's software then require human inspection. The system was installed on our automated track and validated with production samples. We obtained 1 009 830 images during the validation period, and every image was reviewed. OCR passed approximately 75% of the samples, and no mislabeled samples were passed. The 25% failed by the system included 121 samples actually mislabeled by patient name and 148 samples with spelling discrepancies between the patient name on the customer label and the patient name in our LIS. Only 71 of the 121 mislabeled samples detected by OCR were found through our normal quality assurance process. We have invented an automated camera system that uses OCR technology to identify potential mislabeled samples. We have validated this system using samples transported on our automated track. Full implementation of this technology offers the possibility of zero mislabeled samples in the preanalytic stage.
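The core cross-check, comparing the LIS patient name with the OCR-read label name and routing any discrepancy to human inspection, can be sketched as below. The normalization, similarity threshold, and example names are illustrative assumptions, not the published system's logic:

```python
import difflib

def check_label(lis_name: str, ocr_name: str, spell_ratio: float = 0.85) -> str:
    """Compare the patient name from the LIS with the name read off the tube
    label; anything short of an exact match is routed to human review.
    Threshold and categories are illustrative assumptions."""
    canon = lambda s: " ".join(s.upper().split())   # case/whitespace-insensitive
    a, b = canon(lis_name), canon(ocr_name)
    if a == b:
        return "PASS"
    # The similarity ratio only suggests whether the mismatch looks like a
    # spelling variant or a genuinely different patient name.
    ratio = difflib.SequenceMatcher(None, a, b).ratio()
    return "REVIEW: spelling discrepancy" if ratio >= spell_ratio else "REVIEW: possible mislabel"

print(check_label("DOE, JOHN", "Doe,  John"))   # PASS
print(check_label("DOE, JOHN", "DOE, JOAN"))    # REVIEW: spelling discrepancy
print(check_label("DOE, JOHN", "SMITH, ALICE")) # REVIEW: possible mislabel
```

As in the validated system, a conservative design passes only exact matches automatically; both spelling discrepancies and suspected mislabels require a human decision.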

  18. Effects of data sampling rate on image quality in fan-beam-CT system

    International Nuclear Information System (INIS)

    Iwata, Akira; Yamagishi, Nobutoshi; Suzumura, Nobuo; Horiba, Isao.

    1984-01-01

Investigation was made into the relationship between spatial resolution or artifacts and data sampling rate in order to pursue the causes of the degradation of CT image quality by computer simulation. First the generation of projection data and the reconstruction calculation process are described, and then results are shown for the relation between angular sampling interval and spatial resolution or artifacts, and for the relation between projection data sampling interval and spatial resolution or artifacts. It was clarified that the formulation of the relationship between spatial resolution and data sampling rate derived previously for parallel X-ray beams can also be applied to fan beams. As a conclusion, when other reconstruction parameters are the same in fan beam CT systems, spatial resolution is determined by the projection data sampling rate rather than the angular sampling rate. The mechanism of artifact generation due to an insufficient number of angular samples was made clear. It was also made clear that there is a definite relationship among measuring region, angular sampling rate and projection data sampling rate, and that the amount of artifacts depending upon projection data sampling rate is proportional to the amount of spatial frequency components (aliasing components) of a test object above the Nyquist frequency of the projection data. (Wakatsuki, Y.)
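The aliasing mechanism invoked here is the standard Nyquist argument: any frequency component above half the sampling rate folds back into the measured band and cannot be distinguished from its alias. A minimal one-dimensional illustration (the frequencies are arbitrary, not CT-specific):

```python
import numpy as np

fs = 100.0                      # sampling rate (samples per unit length)
n = np.arange(200)
t = n / fs
f_high = 70.0                   # above the Nyquist frequency fs/2 = 50
f_alias = fs - f_high           # folds back to 30

x_high  = np.cos(2 * np.pi * f_high  * t)
x_alias = np.cos(2 * np.pi * f_alias * t)

# The two sampled sequences are identical: the sampled data cannot
# distinguish the 70-unit component from its 30-unit alias.
print(np.allclose(x_high, x_alias))  # True
```

In the CT context, test-object content above the Nyquist frequency of the projection data folds back in exactly this way and appears as reconstruction artifacts.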

  19. Natural Information Processing Systems

    OpenAIRE

    John Sweller; Susan Sweller

    2006-01-01

    Natural information processing systems such as biological evolution and human cognition organize information used to govern the activities of natural entities. When dealing with biologically secondary information, these systems can be specified by five common principles that we propose underlie natural information processing systems. The principles equate: (1) human long-term memory with a genome; (2) learning from other humans with biological reproduction; (3) problem solving through random ...

  20. Test plan for the Sample Transfer Canister system

    International Nuclear Information System (INIS)

    Flanagan, B.D.

    1998-01-01

The Sample Transfer Canister will be used by the Waste Receiving and Processing Facility (WRAP) for the transport of small quantity liquid samples that meet the definition of a limited quantity radioactive material, and may also be corrosive and/or flammable. These samples will be packaged and shipped in accordance with the US Department of Transportation (DOT) regulation 49 CFR 173.4, "Exceptions for small quantities." The Sample Transfer Canister is of a "French can" design, intended to be mated with a glove box for loading/unloading. Transport will typically take place north of the Wye Barricade between WRAP and the 222-S Laboratory. The Sample Transfer Canister will be shipped in an insulated ice chest, but the ice chest will not be a part of the small quantity package during prototype testing.

  1. Baseline Design Compliance Matrix for the Rotary Mode Core Sampling System

    International Nuclear Information System (INIS)

    LECHELT, J.A.

    2000-01-01

    The purpose of the design compliance matrix (DCM) is to provide a single-source document of all design requirements associated with the fifteen subsystems that make up the rotary mode core sampling (RMCS) system. It is intended to be the baseline requirement document for the RMCS system and to be used in governing all future design and design verification activities associated with it. This document is the DCM for the RMCS system used on Hanford single-shell radioactive waste storage tanks. This includes the Exhauster System, Rotary Mode Core Sample Trucks, Universal Sampling System, Diesel Generator System, Distribution Trailer, X-Ray Cart System, Breathing Air Compressor, Nitrogen Supply Trailer, Casks and Cask Truck, Service Trailer, Core Sampling Riser Equipment, Core Sampling Support Trucks, Foot Clamp, Ramps and Platforms and Purged Camera System. Excluded items are tools such as light plants and light stands. Other items such as the breather inlet filter are covered by a different design baseline. In this case, the inlet breather filter is covered by the Tank Farms Design Compliance Matrix

  2. Can an inadequate cervical cytology sample in ThinPrep be converted to a satisfactory sample by processing it with a SurePath preparation?

    Science.gov (United States)

    Sørbye, Sveinung Wergeland; Pedersen, Mette Kristin; Ekeberg, Bente; Williams, Merete E Johansen; Sauer, Torill; Chen, Ying

    2017-01-01

The Norwegian Cervical Cancer Screening Program recommends screening every 3 years for women between 25 and 69 years of age. There is a large difference in the percentage of unsatisfactory samples between laboratories that use different brands of liquid-based cytology. We wished to examine whether inadequate ThinPrep samples could be made satisfactory by processing them with the SurePath protocol. A total of 187 inadequate ThinPrep specimens from the Department of Clinical Pathology at University Hospital of North Norway were sent to Akershus University Hospital for conversion to SurePath medium. Ninety-one (48.7%) were processed through the automated "gynecologic" application for cervix cytology samples, and 96 (51.3%) were processed with the "nongynecological" automatic program. Out of 187 samples that had been unsatisfactory by ThinPrep, 93 (49.7%) were satisfactory after being converted to SurePath. The rate of satisfactory cytology was 36.6% and 62.5% for samples run through the "gynecology" program and "nongynecology" program, respectively. Of the 93 samples that became satisfactory after conversion from ThinPrep to SurePath, 80 (86.0%) were screened as normal while 13 samples (14.0%) were given an abnormal diagnosis, which included 5 atypical squamous cells of undetermined significance, 5 low-grade squamous intraepithelial lesions, 2 atypical glandular cells not otherwise specified, and 1 atypical squamous cells cannot exclude high-grade squamous intraepithelial lesion. A total of 2.1% (4/187) of the women received a diagnosis of cervical intraepithelial neoplasia 2 or higher at a later follow-up. Converting cytology samples from ThinPrep to SurePath processing can reduce the number of unsatisfactory samples. The samples should be run through the "nongynecology" program to ensure an adequate number of cells.

  3. On incomplete sampling under birth-death models and connections to the sampling-based coalescent.

    Science.gov (United States)

    Stadler, Tanja

    2009-11-07

The constant rate birth-death process is used as a stochastic model for many biological systems, for example phylogenies or disease transmission. As the biological data are usually not fully available, it is crucial to understand the effect of incomplete sampling. In this paper, we analyze the constant rate birth-death process with incomplete sampling. We derive the density of the bifurcation events for trees on n leaves which evolved under this birth-death-sampling process. This density is used for calculating prior distributions in Bayesian inference programs and for efficiently simulating trees. We show that the birth-death-sampling process can be interpreted as a birth-death process with reduced rates and complete sampling. This shows that joint inference of birth rate, death rate and sampling probability is not possible. The birth-death-sampling process is compared to the sampling-based population genetics model, the coalescent. It is shown that despite many similarities between these two models, the distribution of bifurcation times remains different even in the case of very large population sizes. We illustrate these findings on a Hepatitis C virus dataset from Egypt. We show that the transmission time estimates are significantly different; the widely used Gamma statistic even changes its sign from negative to positive when switching from the coalescent to the birth-death process.
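The forward model being analyzed can be sketched with a small Gillespie simulation: grow a population under a constant-rate birth-death process, then sample each lineage alive at the end with probability rho (the incomplete-sampling step). The parameters are illustrative, and the sketch simulates tip counts only, not the reconstructed trees or bifurcation densities derived in the paper:

```python
import random

def birth_death_tips(lam, mu, rho, t_end, seed=0):
    """Gillespie simulation of a constant-rate birth-death process started
    from a single lineage, followed by Bernoulli(rho) sampling of the
    lineages alive at t_end (the incomplete-sampling step)."""
    rng = random.Random(seed)
    n, t = 1, 0.0
    while n > 0:
        t += rng.expovariate(n * (lam + mu))                # time to next event
        if t >= t_end:
            break
        n += 1 if rng.random() < lam / (lam + mu) else -1   # birth or death
    return sum(rng.random() < rho for _ in range(n))        # sampled tips

# Illustrative parameters: birth rate 1.0, death rate 0.5, sampling prob. 0.6.
tips = [birth_death_tips(1.0, 0.5, 0.6, t_end=5.0, seed=s) for s in range(200)]
mean_tips = sum(tips) / len(tips)
print(mean_tips)
```

The paper's equivalence result says that the reconstructed trees from such runs are distributed as trees from a complete-sampling birth-death process with suitably reduced rates, which is precisely why birth rate, death rate, and sampling probability cannot all be inferred jointly.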

  4. Estimation of Uncertainty in Aerosol Concentration Measured by Aerosol Sampling System

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Chan; Song, Yong Jae; Jung, Woo Young; Lee, Hyun Chul; Kim, Gyu Tae; Lee, Doo Yong [FNC Technology Co., Yongin (Korea, Republic of)

    2016-10-15

FNC Technology Co., Ltd has developed test facilities for aerosol generation, mixing, sampling and measurement under high pressure and high temperature conditions. The aerosol generation system is connected to the aerosol mixing system, which injects a SiO{sub 2}/ethanol mixture. In the sampling system, a glass fiber membrane filter has been used to measure average mass concentration. Based on experimental results using a main carrier gas of a steam and air mixture, the uncertainty of the sampled aerosol concentration was estimated by applying the Gaussian error propagation law. The purpose of the tests is to develop a commercial test module for an aerosol generation, mixing and sampling system applicable to the environmental industry and to safety-related systems in nuclear power plants. The sampled aerosol concentration is not measured directly, but must be calculated from other quantities. Its uncertainty is a function of the flow rates of air and steam, the sampled mass, the sampling time, the condensed steam mass, and their absolute errors; the uncertainties of these variables propagate through the function. Using operating parameters and their individual errors from the aerosol test cases performed at FNC, the uncertainty of the aerosol concentration evaluated by the Gaussian error propagation law is less than 1%. The results of the uncertainty estimation in the aerosol sampling system will be utilized for the system performance data.
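The Gaussian (first-order) error propagation step can be sketched for a simplified concentration model C = m/(Q t). The measurement model and all numbers below are illustrative assumptions, not the facility's actual parameters:

```python
import math

# Illustrative inputs: filter mass gain, sampling flow rate, sampling time,
# each with an assumed absolute error.
m, sigma_m = 2.0e-3, 0.02e-3      # collected mass [g] and its error
Q, sigma_Q = 10.0, 0.05           # sampling flow rate [L/min] and its error
t, sigma_t = 30.0, 0.1            # sampling time [min] and its error

C = m / (Q * t)                   # aerosol mass concentration [g/L]

# For a product/quotient form C = m/(Q t), first-order Gaussian propagation
# gives relative variances that add in quadrature.
rel_var = (sigma_m / m) ** 2 + (sigma_Q / Q) ** 2 + (sigma_t / t) ** 2
sigma_C = C * math.sqrt(rel_var)

print(f"C = {C:.3e} g/L, relative uncertainty = {100 * sigma_C / C:.2f} %")
```

The facility's actual calculation includes additional terms (for example the condensed steam mass), but each enters the combined uncertainty the same way: through its partial derivative and absolute error.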

  5. Potentiometric chip-based multipumping flow system for the simultaneous determination of fluoride, chloride, pH, and redox potential in water samples.

    Science.gov (United States)

    Chango, Gabriela; Palacio, Edwin; Cerdà, Víctor

    2018-08-15

A simple potentiometric chip-based multipumping flow system (MPFS) has been developed for the simultaneous determination of fluoride, chloride, pH, and redox potential in water samples. The proposed system uses a poly(methyl methacrylate) microfluidic chip, combining the advantages of flow techniques with potentiometric detection. For this purpose, an automatic system has been designed and built, optimizing the variables involved in the process, such as pH, ionic strength, stirring, and sample volume. The system was applied successfully to water samples, yielding a versatile system with an analysis frequency of 12 samples per hour. Good correlation between the chloride and fluoride concentrations measured with the ISEs and by ion chromatography suggests satisfactory reliability of the system. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Order–disorder–reorder process in thermally treated dolomite samples

    DEFF Research Database (Denmark)

    Zucchini, Azzurra; Comodi, Paola; Katerinopoulou, Anna

    2012-01-01

A combined powder and single-crystal X-ray diffraction analysis of dolomite [CaMg(CO3)2] heated to 1,200 °C at 3 GPa was made to study the order–disorder–reorder process. The order/disorder transition is inferred to start below 1,100 °C, and complete disorder is attained at approximately 1,200 °C. Twinned crystals characterized by high internal order were found in samples annealed over 1,100 °C, and their fraction was found to increase with temperature. Evidence of twinning domains, combined with probable remaining disordered portions of the structure, implies that reordering processes occur during...

  7. Marine sediment sample pre-processing for macroinvertebrates metabarcoding: mechanical enrichment and homogenization

    Directory of Open Access Journals (Sweden)

    Eva Aylagas

    2016-10-01

Full Text Available Metabarcoding is an accurate and cost-effective technique that allows for simultaneous taxonomic identification of multiple environmental samples. Application of this technique to marine benthic macroinvertebrate biodiversity assessment for biomonitoring purposes requires standardization of laboratory and data analysis procedures. In this context, protocols for the creation and sequencing of amplicon libraries and their related bioinformatics analysis have recently been published. However, a standardized protocol describing all previous steps (i.e. processing and manipulation of environmental samples) for macroinvertebrate community characterization is lacking. Here, we provide detailed procedures for benthic environmental sample collection, processing, enrichment for macroinvertebrates, homogenization, and subsequent DNA extraction for metabarcoding analysis. Since this is the first protocol of its kind, it should be of use to any researcher in this field and has the potential for further improvement.

  8. Dosimetry systems for radiation processing

    International Nuclear Information System (INIS)

    McLaughlin, W.L.; Desrosiers, M.F.

    1995-01-01

    Dosimetry serves important functions in radiation processing, where large absorbed doses and dose rates from photon and electron sources have to be measured with reasonable accuracy. Proven dosimetry systems are widely used to perform radiation measurements in development of new processes, validation, qualification and verification (quality control) of established processes and archival documentation of day-to-day and plant-to-plant processing uniformity. Proper calibration and traceability of routine dosimetry systems to standards are crucial to the success of many large-volume radiation processes. Recent innovations and advances in performance of systems that enhance radiation measurement assurance and process diagnostics include dose-mapping media (new radiochromic film and solutions), optical waveguide systems for food irradiation, solid-state devices for real-time and passive dosimetry over wide dose-rate and dose ranges, and improved analytical instruments and data acquisition. (author)

  9. Processors and systems (picture processing)

    Energy Technology Data Exchange (ETDEWEB)

    Gemmar, P

    1983-01-01

    Automatic picture processing requires high performance computers and high transmission capacities in the processor units. The author examines the possibilities of operating processors in parallel in order to accelerate the processing of pictures. He therefore discusses a number of available processors and systems for picture processing and illustrates their capacities for special types of picture processing. He stresses the fact that the amount of storage required for picture processing is exceptionally high. The author concludes that it is as yet difficult to decide whether very large groups of simple processors or highly complex multiprocessor systems will provide the best solution. Both methods will be aided by the development of VLSI. New solutions have already been offered (systolic arrays and 3-d processing structures) but they also are subject to losses caused by inherently parallel algorithms. Greater efforts must be made to produce suitable software for multiprocessor systems. Some possibilities for future picture processing systems are discussed. 33 references.

  10. Automated sampling and data processing derived from biomimetic membranes

    International Nuclear Information System (INIS)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H; Boesen, T P; Emneus, J

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  11. Automated sampling and data processing derived from biomimetic membranes

    Energy Technology Data Exchange (ETDEWEB)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H [Aquaporin A/S, Diplomvej 377, DK-2800 Kgs. Lyngby (Denmark); Boesen, T P [Xefion ApS, Kildegaardsvej 8C, DK-2900 Hellerup (Denmark); Emneus, J, E-mail: Claus.Nielsen@fysik.dtu.d [DTU Nanotech, Technical University of Denmark, DK-2800 Kgs. Lyngby (Denmark)

    2009-12-15

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long time-series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost-efficient and fast way to acquire, process, and administer large amounts of voltage clamp data that would be too laborious and time-consuming to handle manually. (communication)

  12. Enhanced AFCI Sampling, Analysis, and Safeguards Technology Review

    Energy Technology Data Exchange (ETDEWEB)

    John Svoboda

    2009-09-01

    The focus of this study includes the investigation of sampling technologies used in industry and their potential application to nuclear fuel processing. The goal is to identify innovative sampling methods using state-of-the-art techniques that could evolve into the next-generation sampling and analysis system for metallic elements. Sampling and analysis of nuclear fuel recycling plant processes is required both to monitor operations and to ensure that Safeguards and Security goals are met. In addition, environmental regulations lead to additional samples and analyses to meet licensing requirements. The volume of samples taken by conventional means can constrain productivity while samples are analyzed, require process holding tanks that are sized to meet analytical rather than process needs (creating a larger facility footprint), or, in some cases, simply overwhelm analytical laboratory capabilities. These issues only grow when process flowsheets propose new separations systems and new byproduct material for transmutation purposes. Novel means of streamlining both sampling and analysis are being evaluated to increase efficiency while meeting all requirements for information. This report addresses just one part of the effort to develop and study novel methods by focusing on the sampling and analysis of aqueous samples for metallic elements. It presents an overview of the sampling requirements, including frequency, sensitivity, accuracy, and programmatic drivers, to demonstrate the magnitude of the task. The sampling and analysis system needed for metallic element measurements is then discussed, and novel options being applied to other industrial analytical needs are presented. Inductively coupled plasma mass spectrometry instruments are the most versatile for metallic element analyses and are thus chosen as the focus of the study. Candidate novel means of process sampling, as well as modifications that are necessary to couple such instruments to

  13. Digital processing data communication systems (bus systems). Digitale Prozessdaten-Kommunikations-Systeme (Bus Systeme)

    Energy Technology Data Exchange (ETDEWEB)

    Fleck, K

    1980-01-01

    After an introduction to the technology of digital processing data communication systems, the following chapters cover: digital communication of processing data in automation technology, the technology of bit-serial communication, the implementation of a bus system, data transmission in Honeywell's TDC-2000 system, and the CS 275 process bus in the TELEPERM M automation system of Siemens AG.

  14. Designing an enhanced groundwater sample collection system

    International Nuclear Information System (INIS)

    Schalla, R.

    1994-10-01

    As part of an ongoing technical support mission to achieve excellence and efficiency in environmental restoration activities at the Laboratory for Energy and Health-Related Research (LEHR), Pacific Northwest Laboratory (PNL) provided guidance on the design and construction of monitoring wells and identified the most suitable type of groundwater sampling pump and accessories for monitoring wells. The goal was to utilize a monitoring well design that would allow for hydrologic testing and reduce turbidity to minimize the impact of sampling. The sampling results of the newly designed monitoring wells were clearly superior to those of the previously installed monitoring wells. The new wells exhibited reduced turbidity, in addition to improved access for instrumentation and hydrologic testing. The variable-frequency submersible pump was selected as the best choice for obtaining groundwater samples. The literature references are listed at the end of this report. Despite some initial difficulties, the actual performance of the variable-frequency submersible pump and its accessories was effective in reducing sampling time and labor costs, and its ease of use was preferred over the previously used bladder pumps. The surface seal system, called the Dedicator, proved to be a useful accessory to prevent surface contamination while providing easy access for water-level measurements and for connecting the pump. Cost savings resulted from the use of the pre-production pumps (beta units) donated by the manufacturer for the demonstration. However, larger savings resulted from shortened field time due to the ease of using the submersible pumps and the surface seal access system. Proper deployment of the monitoring wells also resulted in cost savings and ensured representative samples.

  15. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long time-series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management.

  16. A dual slope charge sampling analog front-end for a wireless neural recording system.

    Science.gov (United States)

    Lee, Seung Bae; Lee, Byunghun; Gosselin, Benoit; Ghovanloo, Maysam

    2014-01-01

    This paper presents a novel dual slope charge sampling (DSCS) analog front-end (AFE) architecture, which amplifies neural signals by taking advantage of the charge sampling concept for analog signal conditioning, such as amplification and filtering. The presented DSCS-AFE achieves amplification, filtering, and sampling simultaneously, while consuming a very small amount of power. The output of the DSCS-AFE produces a pulse width modulated (PWM) signal that is proportional to the input voltage amplitude. A circular shift register (CSR) utilizes time division multiplexing (TDM) of the PWM pulses to create a pseudo-digital TDM-PWM signal that can feed a wireless transmitter. The 8-channel system-on-a-chip was fabricated in a 0.35-μm CMOS process, occupying 2.4 × 2.1 mm(2) and consuming 255 μW from a 1.8 V supply. Measured input-referred noise for the entire system, including the FPGA used to recover the PWM signal, is 6.50 μV(rms) over the 288 Hz~10 kHz range. For each channel, the sampling rate is 31.25 kHz and the power consumption is 31.8 μW.
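
    The AFE described above encodes each channel's amplitude as pulse width. A hedged sketch of the decoding step performed on the receiving side — recovering a value from the duty cycle of one PWM period — might look like this (the function and the binary sample format are illustrative assumptions, not the chip's actual logic):

```python
def pwm_decode(samples, v_ref=1.0):
    """Recover a normalized amplitude from one period of a binary PWM
    waveform: the duty cycle is the fraction of high samples."""
    high = sum(1 for s in samples if s > 0)
    return high / len(samples) * v_ref

# One 32-sample period at 25% duty cycle:
period = [1] * 8 + [0] * 24
print(pwm_decode(period))  # 0.25
```

    Because the information is carried in timing rather than amplitude, the recovered value is insensitive to gain errors in the wireless link.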

  17. Integrating system safety into the basic systems engineering process

    Science.gov (United States)

    Griswold, J. W.

    1971-01-01

    The basic elements of a systems engineering process are given along with a detailed description of what the safety system requires from the systems engineering process. Also discussed is the safety that the system provides to other subfunctions of systems engineering.

  18. A Sample Delivery System for Planetary Missions

    Data.gov (United States)

    National Aeronautics and Space Administration — The project will develop, test and characterize the performance of a prototype /sample delivery system (SDS) implemented as an end effector on a robotic arm capable...

  19. Process information systems in nuclear reprocessing

    International Nuclear Information System (INIS)

    Jaeschke, A.; Keller, H.; Orth, H.

    1987-01-01

    On the production management level, a process information system in a nuclear reprocessing plant (NRP) has to fulfill conventional operating functions and functions for nuclear material surveillance (safeguards). Based on today's state of the art of on-line process control technology, progress in hardware and software allows more process-specific intelligence to be introduced into process information systems. Using the example of an expert-system-aided laboratory management system as a component of an NRP process information system, the paper demonstrates that these technologies can already be applied. (DG) [de

  20. Controlling a sample changer using the integrated counting system

    International Nuclear Information System (INIS)

    Deacon, S.; Stevens, M.P.

    1985-06-01

    Control of the sample changer from a counting system can be achieved by using a Scaler Timer type 6255 and Sample Changer Control Interface type 6263. The interface used, however, has quite complex circuitry. The application therefore lends itself to the use of another 6000 Series module, the Integrated Counting System (ICS). Using this unit, control is carried out through a control program written in BASIC for the Commodore PET (or any other device with an IEEE-488 interface). The ICS then controls the sample changer through an interface unit which is relatively simple. A brief description of how the ICS controls the sample changer is given. The control program is then described; first the running options are given, followed by a program description, listing, and flowchart. (author)
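
    The loop such a control program implements — advance the changer, count for a fixed live time, repeat until every sample is measured — can be sketched in modern form as follows. This is a toy model; the class, method names, and result format are assumptions, not the 6000 Series command set or the original BASIC program:

```python
class SampleChanger:
    """Toy model of a changer driven through a counting-system interface."""
    def __init__(self, n_samples):
        self.n_samples = n_samples
        self.position = 0
        self.results = []

    def advance(self):
        # In the real system this would pulse the changer interface unit.
        self.position += 1

    def count(self, live_time_s):
        # Stand-in for starting the scaler/timer and reading back counts.
        return {"position": self.position, "live_time": live_time_s}

def run_batch(changer, live_time_s=60):
    """Advance, count, repeat until every sample has been measured."""
    while changer.position < changer.n_samples:
        changer.advance()
        changer.results.append(changer.count(live_time_s))
    return changer.results

results = run_batch(SampleChanger(5))
print(len(results))  # 5
```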

  1. Controlling a sample changer using the integrated counting system

    Energy Technology Data Exchange (ETDEWEB)

    Deacon, S; Stevens, M P

    1985-06-01

    Control of the sample changer from a counting system can be achieved by using a Scaler Timer type 6255 and Sample Changer Control Interface type 6263. The interface used, however, has quite complex circuitry. The application therefore lends itself to the use of another 6000 Series module, the Integrated Counting System (ICS). Using this unit, control is carried out through a control program written in BASIC for the Commodore PET (or any other device with an IEEE-488 interface). The ICS then controls the sample changer through an interface unit which is relatively simple. A brief description of how the ICS controls the sample changer is given. The control program is then described; first the running options are given, followed by a program description, listing, and flowchart.

  2. Experimental performance evaluation of two stack sampling systems in a plutonium facility

    International Nuclear Information System (INIS)

    Glissmeyer, J.A.

    1992-04-01

    The evaluation of two routine stack sampling systems at the Z-Plant plutonium facility operated by Rockwell International for USERDA is part of a larger study, sponsored by Rockwell and conducted by Battelle, Pacific Northwest Laboratories, of gaseous effluent sampling systems. The gaseous effluent sampling systems evaluated are located at the main plant ventilation stack (291-Z-1) and at a vessel vent stack (296-Z-3). A preliminary report, which was a paper study issued in April 1976, identified many deficiencies in the existing sampling systems and made recommendations for corrective action. The objectives of this experimental evaluation of those sampling systems were as follows: Characterize the radioactive aerosols in the stack effluents; Develop a tracer aerosol technique for validating particulate effluent sampling system performance; Evaluate the performance of the existing routine sampling systems and their compliance with the sponsor's criteria; and Recommend corrective action where required. The tracer aerosol approach to sampler evaluation was chosen because the low concentrations of radioactive particulates in the effluents would otherwise require much longer sampling times and thus more time to complete this evaluation. The following report describes the sampling systems that are the subject of this study and then details the experiments performed. The results are then presented and discussed. Much of the raw and finished data are included in the appendices

  3. Process-aware information systems : lessons to be learned from process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Jensen, K.; Aalst, van der W.M.P.

    2009-01-01

    A Process-Aware Information System (PAIS) is a software system that manages and executes operational processes involving people, applications, and/or information sources on the basis of process models. Example PAISs are workflow management systems, case-handling systems, enterprise information

  4. Online data processing system

    International Nuclear Information System (INIS)

    Nakahara, Yoshinori; Yagi, Hideyuki; Yamada, Takayuki

    1979-02-01

    A pulse height analyzer terminal system, PHATS, has been developed for online data processing via the JAERI-TOKAI computer network. The system is controlled using a micro-computer, MICRO-8, which was developed for the JAERI-TOKAI network. The system program consists of two subprograms: the online control system ONLCS and the pulse height analyzer control system PHACS. ONLCS links the terminal with the conversational programming system of the FACOM 230/75 through the JAERI-TOKAI network and controls data processing in TSS and remote batch modes. PHACS is used to control the input/output of data between the pulse height analyzer and a cassette-MT or typewriter. This report describes the hardware configuration and the system program in detail. The appendix explains the real-time monitor, the message types, and the PEX-to-PEX and Host-to-Host protocols required for the system programming. (author)

  5. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    Application of the Fourier transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB, and 15N-edited NOESY-HSQC spectra of a 13C,15N-labeled ubiquitin sample. The results revealed the general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time.
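
    The core idea — evaluating the Fourier sum directly at a frequency pair over randomly placed evolution times, instead of on a regular grid — can be sketched as follows. This is a toy 2D example with a synthetic single resonance, not the authors' implementation; the point sampling and frequencies are assumptions:

```python
import cmath
import random

def nudft2(samples, f1, f2):
    """Direct 2D Fourier sum over randomly placed (t1, t2) evolution times,
    evaluated at a single frequency pair instead of on a regular grid."""
    return sum(v * cmath.exp(-2j * cmath.pi * (f1 * t1 + f2 * t2))
               for t1, t2, v in samples)

random.seed(0)
# Synthetic signal with one resonance at (100 Hz, 50 Hz), observed at
# 200 random evolution-time points:
points = [(random.uniform(0, 0.01), random.uniform(0, 0.01)) for _ in range(200)]
samples = [(t1, t2, cmath.exp(2j * cmath.pi * (100 * t1 + 50 * t2)))
           for t1, t2 in points]

on_peak = abs(nudft2(samples, 100, 50))
off_peak = abs(nudft2(samples, 400, 300))
print(round(on_peak))  # 200 (every term adds coherently at the resonance)
```

    Off-resonance frequencies accumulate incoherently, which is why random sampling trades a regular grid for a low, noise-like artifact floor.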

  6. A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Berres, Anne Sabine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Adhinarayanan, Vignesh [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Turton, Terece [Univ. of Texas, Austin, TX (United States); Feng, Wu [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Rogers, David Honegger [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-05-12

    Large simulation data requires a lot of time and computational resources to compute, store, analyze, visualize, and run user studies on. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance energy consumption against the cognitive value of visualizations of the resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline using regular sampling as an example. The reasons for this choice are two-fold: using a simple example reduces unnecessary complexity, as we know what to expect from the results, and it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test we ran, and we conducted user studies on Amazon Mechanical Turk (AMT) for a range of different results produced through sampling.
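
    As a minimal sketch of the trade-off such a pipeline studies — regular sampling reduces stored data at the cost of reconstruction error — consider this illustrative 1D stand-in (the real work targets unstructured grids; the field, stride, and reconstruction scheme here are assumptions):

```python
def regular_sample(values, stride):
    """Keep every stride-th point of a 1D field (a simple stand-in for
    regular sampling of simulation output before storage)."""
    return values[::stride]

def reconstruct(sampled, stride, n):
    """Nearest-neighbor reconstruction back to the original resolution."""
    return [sampled[min(i // stride, len(sampled) - 1)] for i in range(n)]

field = [i * 0.1 for i in range(100)]
sampled = regular_sample(field, 4)              # 4x less data to store
approx = reconstruct(sampled, 4, len(field))
max_err = max(abs(a - b) for a, b in zip(field, approx))
print(len(sampled), round(max_err, 1))  # 25 0.3
```

    Measuring time, energy, and perceived quality against the sampling stride is exactly the kind of experiment the pipeline automates.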

  7. Process Parameter Optimization of Extrusion-Based 3D Metal Printing Utilizing PW-LDPE-SA Binder System.

    Science.gov (United States)

    Ren, Luquan; Zhou, Xueli; Song, Zhengyi; Zhao, Che; Liu, Qingping; Xue, Jingze; Li, Xiujuan

    2017-03-16

    Recently, with a broadening range of available materials and alteration of feeding processes, several extrusion-based 3D printing processes for metal materials have been developed. One emerging process is applicable to the fabrication of metal parts for electronics and composites. In this paper, some critical parameters of extrusion-based 3D printing processes were optimized through a series of experiments with a melting extrusion printer. The raw materials were copper powder and a thermoplastic organic binder system consisting of paraffin wax, low-density polyethylene, and stearic acid (PW-LDPE-SA). The homogeneity and rheological behaviour of the raw materials, the strength of the green samples, and the hardness of the sintered samples were investigated. Moreover, the printing and sintering parameters were optimized with an orthogonal design method. The influence of the factors on the ultimate tensile strength of the green samples can be ranked as follows: infill degree > raster angle > layer thickness. As for the sintering process, the major factor affecting hardness is sintering temperature, followed by holding time and heating rate. The highest hardness of the sintered samples was very close to the average hardness of commercially pure copper. Generally, the extrusion-based printing process for producing metal materials is a promising strategy because it has advantages over traditional approaches in cost, efficiency, and simplicity.
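
    The factor ranking reported (infill degree > raster angle > layer thickness) is the kind of result that range analysis of an orthogonal experiment produces: average the response at each level of a factor and compare the spreads. A hedged sketch with hypothetical data — the runs and responses below are invented for illustration, not the paper's measurements:

```python
def range_analysis(runs, factor):
    """Range analysis of an orthogonal experiment: average the response at
    each level of a factor; a larger spread means a stronger influence."""
    levels = {}
    for run in runs:
        levels.setdefault(run[factor], []).append(run["response"])
    means = [sum(v) / len(v) for v in levels.values()]
    return max(means) - min(means)

# Hypothetical two-level runs (tensile strength in MPa as the response):
runs = [
    {"infill": 60, "layer": 0.1, "response": 8.0},
    {"infill": 60, "layer": 0.2, "response": 7.5},
    {"infill": 90, "layer": 0.1, "response": 10.5},
    {"infill": 90, "layer": 0.2, "response": 10.0},
]
ranking = sorted(["infill", "layer"], key=lambda f: range_analysis(runs, f),
                 reverse=True)
print(ranking)  # ['infill', 'layer']
```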

  8. Process Parameter Optimization of Extrusion-Based 3D Metal Printing Utilizing PW–LDPE–SA Binder System

    Directory of Open Access Journals (Sweden)

    Luquan Ren

    2017-03-01

    Full Text Available Recently, with a broadening range of available materials and alteration of feeding processes, several extrusion-based 3D printing processes for metal materials have been developed. One emerging process is applicable to the fabrication of metal parts for electronics and composites. In this paper, some critical parameters of extrusion-based 3D printing processes were optimized through a series of experiments with a melting extrusion printer. The raw materials were copper powder and a thermoplastic organic binder system consisting of paraffin wax, low-density polyethylene, and stearic acid (PW–LDPE–SA). The homogeneity and rheological behaviour of the raw materials, the strength of the green samples, and the hardness of the sintered samples were investigated. Moreover, the printing and sintering parameters were optimized with an orthogonal design method. The influence of the factors on the ultimate tensile strength of the green samples can be ranked as follows: infill degree > raster angle > layer thickness. As for the sintering process, the major factor affecting hardness is sintering temperature, followed by holding time and heating rate. The highest hardness of the sintered samples was very close to the average hardness of commercially pure copper. Generally, the extrusion-based printing process for producing metal materials is a promising strategy because it has advantages over traditional approaches in cost, efficiency, and simplicity.

  9. Process Parameter Optimization of Extrusion-Based 3D Metal Printing Utilizing PW–LDPE–SA Binder System

    Science.gov (United States)

    Ren, Luquan; Zhou, Xueli; Song, Zhengyi; Zhao, Che; Liu, Qingping; Xue, Jingze; Li, Xiujuan

    2017-01-01

    Recently, with a broadening range of available materials and alteration of feeding processes, several extrusion-based 3D printing processes for metal materials have been developed. One emerging process is applicable to the fabrication of metal parts for electronics and composites. In this paper, some critical parameters of extrusion-based 3D printing processes were optimized through a series of experiments with a melting extrusion printer. The raw materials were copper powder and a thermoplastic organic binder system consisting of paraffin wax, low-density polyethylene, and stearic acid (PW–LDPE–SA). The homogeneity and rheological behaviour of the raw materials, the strength of the green samples, and the hardness of the sintered samples were investigated. Moreover, the printing and sintering parameters were optimized with an orthogonal design method. The influence of the factors on the ultimate tensile strength of the green samples can be ranked as follows: infill degree > raster angle > layer thickness. As for the sintering process, the major factor affecting hardness is sintering temperature, followed by holding time and heating rate. The highest hardness of the sintered samples was very close to the average hardness of commercially pure copper. Generally, the extrusion-based printing process for producing metal materials is a promising strategy because it has advantages over traditional approaches in cost, efficiency, and simplicity. PMID:28772665

  10. Robust H2 performance for sampled-data systems

    DEFF Research Database (Denmark)

    Rank, Mike Lind

    1997-01-01

    Robust H2 performance conditions under structured uncertainty, analogous to well-known methods for H∞ performance, have recently emerged in both discrete and continuous time. This paper considers the extension to uncertain sampled-data systems, taking into account inter-sample behavior. Convex conditions for robust H2 performance are derived for different uncertainty sets.

  11. Judgment sampling: a health care improvement perspective.

    Science.gov (United States)

    Perla, Rocco J; Provost, Lloyd P

    2012-01-01

    Sampling plays a major role in quality improvement work. Random sampling (assumed by most traditional statistical methods) is the exception in improvement situations. In most cases, some type of "judgment sample" is used to collect data from a system. Unfortunately, judgment sampling is not well understood. Judgment sampling relies upon those with process and subject matter knowledge to select useful samples for learning about process performance and the impact of changes over time. In many cases, where the goal is to learn about or improve a specific process or system, judgment samples are not merely the most convenient and economical approach; they are technically and conceptually the most appropriate approach. This is because improvement work is done in the real world in complex situations involving specific areas of concern and focus; in these situations, the assumptions of classical measurement theory neither can be met nor should an attempt be made to meet them. The purpose of this article is to describe judgment sampling and its importance in quality improvement work and studies, with a focus on health care settings.

  12. Examples of data processing systems. Data processing system for JT-60

    International Nuclear Information System (INIS)

    Aoyagi, Tetsuo

    1996-01-01

    The JT-60 data processing system is a large computer complex comprising many micro-computers, several mini-computers, and a main-frame computer. As a general introduction to the original system configuration has been published previously, some improvements are described here: a transient mass data storage system, a network database server, a data acquisition system using engineering workstations, and a graphic terminal emulator for X-Window. These new features are realized by utilizing recent progress in computer and network technology and the carefully designed user interface specification of the original system. (author)

  13. Environmental sampling

    International Nuclear Information System (INIS)

    Puckett, J.M.

    1998-01-01

    Environmental Sampling (ES) is a technology option that can have application as a transparency measure in nuclear nonproliferation. The basic process is to take a sample from the environment, e.g., soil, water, vegetation, or dust and debris from a surface, and through very careful sample preparation and analysis, determine the types, elemental concentrations, and isotopic compositions of actinides in the sample. The sample is prepared and the analysis performed in a clean chemistry laboratory (CCL). This ES capability is part of the IAEA Strengthened Safeguards System. Such a laboratory is planned to be built by JAERI at Tokai and will give Japan an intrinsic ES capability. This paper presents options for the use of ES as a transparency measure for nuclear nonproliferation.

  14. Sample-hold and analog multiplexer for multidetector systems

    Energy Technology Data Exchange (ETDEWEB)

    Goswami, G C; Ghoshdostidar, M R; Ghosh, B; Chaudhuri, N [North Bengal Univ., Darjeeling (India). Dept. of Physics

    1982-08-15

    A new sample-hold circuit with an analog multiplexer system is described. Designed for multichannel acquisition of data from an air shower array, the system is being used for accurate measurement of pulse heights from 16 channels with a single ADC.
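
    The principle behind such a system — hold all channel values at one instant, then digitize them sequentially through a single shared ADC — can be sketched as follows (an idealized software model, not the published circuit; the voltages and ADC parameters are assumptions):

```python
def adc_8bit(v, v_ref=5.0):
    """Idealized 8-bit ADC over 0..v_ref volts."""
    return max(0, min(255, int(v / v_ref * 255)))

def multiplex_readout(channels, adc):
    """Sample-and-hold then multiplex: freeze all channel voltages at one
    instant, then digitize them one at a time through a shared ADC."""
    held = list(channels)            # sample-hold: simultaneous capture
    return [adc(v) for v in held]    # multiplexer: sequential conversion

pulse_heights = [0.0, 2.5, 5.0, 1.25]
print(multiplex_readout(pulse_heights, adc_8bit))  # [0, 127, 255, 63]
```

    Holding before multiplexing is what guarantees that all 16 channels are measured at the same event time even though the ADC serves them one by one.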

  15. Security of legacy process control systems : Moving towards secure process control systems

    NARCIS (Netherlands)

    Oosterink, M.

    2012-01-01

    This white paper describes solutions which organisations may use to improve the security of their legacy process control systems. When we refer to a legacy system, we generally refer to old methodologies, technologies, computer systems or applications which are still in use, despite the fact that

  16. Demonstrating Reliable High Level Waste Slurry Sampling Techniques to Support Hanford Waste Processing

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, Steven E.

    2013-11-11

    The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capability using simulated Hanford High-Level Waste (HLW) formulations. This work represents one of the remaining technical issues with the high-level waste treatment mission at Hanford. The TOC must demonstrate the ability to adequately mix and sample high-level waste feed to meet the WTP Waste Acceptance Criteria and Data Quality Objectives. The sampling method employed must support both TOC and WTP requirements. To facilitate information transfer between the two facilities, the mixing and sampling demonstrations are led by the One System Integrated Project Team. The One System team's Waste Feed Delivery Mixing and Sampling Program has developed a full-scale sampling loop to demonstrate sampler capability. This paper discusses the full-scale sampling loop's ability to meet precision and accuracy requirements, including lessons learned during testing. Results of the testing showed that the Isolok(R) sampler chosen for implementation provides precise, repeatable results. The Isolok(R) sampler accuracy as tested did not meet the test success criteria. Review of the test data and the test platform by a sampling expert identified several issues regarding the sampler used to provide the reference material against which the Isolok's accuracy was judged. Recommendations were made to obtain new data to evaluate the sampler's accuracy utilizing a reference sampler that follows good sampling protocol.
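
    Precision and accuracy criteria of the kind applied to the Isolok(R) sampler are commonly expressed as relative standard deviation of replicates and relative bias against a reference value. A sketch with hypothetical replicate data — the numbers below are invented, not the Hanford test results:

```python
def precision_and_bias(measurements, reference):
    """Relative standard deviation (precision, %) and mean relative bias
    (accuracy, %) of replicate sampler results against a reference value."""
    n = len(measurements)
    mean = sum(measurements) / n
    var = sum((m - mean) ** 2 for m in measurements) / (n - 1)
    rsd = var ** 0.5 / mean * 100
    bias = (mean - reference) / reference * 100
    return rsd, bias

# Hypothetical replicate concentrations (g/L) against a 10.0 g/L reference:
rsd, bias = precision_and_bias([9.8, 9.9, 10.0, 9.7, 9.9], 10.0)
print(round(rsd, 1), round(bias, 1))  # 1.2 -1.4
```

    A sampler can pass the precision test (low RSD) and still fail accuracy (large bias), which mirrors the outcome reported above.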

  17. Demonstrating Reliable High Level Waste Slurry Sampling Techniques to Support Hanford Waste Processing

    International Nuclear Information System (INIS)

    Kelly, Steven E.

    2013-01-01

    The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capability using simulated Hanford High-Level Waste (HLW) formulations. This work represents one of the remaining technical issues with the high-level waste treatment mission at Hanford. The TOC must demonstrate the ability to adequately mix and sample high-level waste feed to meet the WTP Waste Acceptance Criteria and Data Quality Objectives. The sampling method employed must support both TOC and WTP requirements. To facilitate information transfer between the two facilities, the mixing and sampling demonstrations are led by the One System Integrated Project Team. The One System team's Waste Feed Delivery Mixing and Sampling Program has developed a full-scale sampling loop to demonstrate sampler capability. This paper discusses the full-scale sampling loop's ability to meet precision and accuracy requirements, including lessons learned during testing. Results of the testing showed that the Isolok(R) sampler chosen for implementation provides precise, repeatable results. The Isolok(R) sampler accuracy as tested did not meet the test success criteria. Review of the test data and the test platform by a sampling expert identified several issues regarding the sampler used to provide the reference material against which the Isolok's accuracy was judged. Recommendations were made to obtain new data to evaluate the sampler's accuracy utilizing a reference sampler that follows good sampling protocol.

  18. Integrated Monitoring System of Production Processes

    Directory of Open Access Journals (Sweden)

    Oborski Przemysław

    2016-12-01

    Full Text Available An integrated monitoring system for discrete manufacturing processes is presented in the paper. A multilayer hardware and software reference model was developed. The original research answers industry's need for integration of information flow in the production process. The reference model corresponds with the proposed data model, based on a multilayer data tree, which allows orders, products, and processes to be described and monitoring data to be saved. The elaborated models were implemented in the integrated monitoring system demonstrator developed in the project. It was built on the basis of multiagent technology to assure high flexibility and openness to applying intelligent algorithms for data processing. Currently, based on the experience gained, an integrated monitoring system for a real production system is being developed. The article presents the main problems of monitoring integration, including the specificity of discrete production, data processing, and the future application of Cyber-Physical Systems. Development of manufacturing systems is based more and more on taking advantage of intelligent solutions in machine and production process control and monitoring. Connecting technical systems, machine tools, and manufacturing process monitoring with advanced information processing seems to be one of the most important areas of near-future development. It will play an important role in the efficient operation and competitiveness of the whole production system. It is also an important area for the future application of Cyber-Physical Systems, which can radically improve the functionality of monitoring systems and reduce the cost of their implementation.

  19. Compact field color schlieren system for use in microgravity materials processing

    Science.gov (United States)

    Poteet, W. M.; Owen, R. B.

    1986-01-01

    A compact color schlieren system designed for field measurement of materials processing parameters has been built and tested in a microgravity environment. Improvements in the color filter design and a compact optical arrangement allowed the system described here to retain the traditional advantages of schlieren, such as simplicity, sensitivity, and ease of data interpretation. Testing was accomplished by successfully flying the instrument on a series of parabolic trajectories on the NASA KC-135 microgravity simulation aircraft. A variety of samples of interest in materials processing were examined. Although the present system was designed for aircraft use, the technique is well suited to space flight experimentation. A major goal of this effort was to accommodate the main optical system within a volume approximately equal to that of a Space Shuttle middeck locker. Future plans include the development of an automated space-qualified facility for use on the Shuttle and Space Station.

  20. Noncontact inspection laser system for characterization of piezoelectric samples

    International Nuclear Information System (INIS)

    Jimenez, F.J.; Frutos, J. de

    2004-01-01

    In this work, measurements were taken on a piezoelectric sample in dynamic behavior, in particular around the sample's resonance frequencies, where nonlinear effects are accentuated. Dimensional changes in the sample need to be studied, as that will allow a more reliable characterization of piezoelectric samples. The goal of this research is to develop an inspection system able to take measurements using a noncontact laser displacement transducer and to visualize, in a three-dimensional graphic environment, the displacement that takes place on a piezoelectric sample surface. In resonant mode, the vibration mode of the sample is visualized

  1. An in-process form error measurement system for precision machining

    International Nuclear Information System (INIS)

    Gao, Y; Huang, X; Zhang, Y

    2010-01-01

    In-process form error measurement for precision machining is studied. Due to two key problems, opaque barriers and vibration, in-process optical measurement of form error for precision machining has been a hard topic, and so far very few existing research works can be found. In this project, an in-process form error measurement device is proposed to deal with the two key problems. Based on our existing studies, a prototype system has been developed; it is the first of its kind that overcomes the two key problems. The prototype is based on a single laser sensor design of 50 nm resolution together with two techniques, a damping technique and a moving average technique, proposed for use with the device. The proposed damping technique improves vibration attenuation by up to 21 times compared with natural attenuation. The proposed moving average technique reduces errors by seven to ten times without distortion of the form profile results. The two proposed techniques are simple, but they are especially useful for the proposed device. For a workpiece sample, the measurement result under coolant conditions is only 2.5% larger than the one without coolant. For a certified Wyko test sample, the overall system measurement error can be as low as 0.3 µm, and the measurement repeatability error as low as 2.2%. The experimental results give confidence in using the proposed in-process form error measurement device. For better results, further improvements in design and testing are necessary
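    The moving average technique mentioned above can be illustrated with a short sketch. This is a generic centered moving average under assumed parameters (window size, noise level, profile shape), not the authors' exact filter:

```python
import numpy as np

def moving_average(profile, window=9):
    """Centered moving average; `window` is assumed odd."""
    kernel = np.ones(window) / window
    # mode="same" keeps the output aligned with the input profile;
    # values near the edges are biased because the kernel runs off the data.
    return np.convolve(profile, kernel, mode="same")

# Synthetic form profile: a slow sinusoidal form plus sensor noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
form = 0.5 * np.sin(2 * np.pi * x)
noisy = form + rng.normal(0.0, 0.05, x.size)
smoothed = moving_average(noisy, window=9)
```

    Because the underlying form varies slowly relative to the window, averaging suppresses the noise while leaving the profile shape essentially undistorted, which is the property the abstract claims for the technique.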

  2. Fundamentals of process intensification: A process systems engineering view

    DEFF Research Database (Denmark)

    Babi, Deenesh Kavi; Sales Cruz, Alfonso Mauricio; Gani, Rafiqul

    2016-01-01

    This chapter gives an overview of the fundamentals of process intensification from a process systems engineering point of view. The concept of process intensification, including process integration, is explained together with the drivers for applying process intensification, which can be achieved...

  3. Lagoa Real design. Description and evaluation of sampling system

    International Nuclear Information System (INIS)

    Hashizume, B.K.

    1982-10-01

    This report describes the sample preparation system for drill cores from the Lagoa Real Project, aiming to obtain a representative fraction of the split drill cores. The combined sampling-plus-analysis error and the analytical accuracy were determined by delayed neutron analysis. (author)

  4. Molecular dynamics coupled with a virtual system for effective conformational sampling.

    Science.gov (United States)

    Hayami, Tomonori; Kasahara, Kota; Nakamura, Haruki; Higo, Junichi

    2018-07-15

    An enhanced conformational sampling method is proposed: virtual-system coupled canonical molecular dynamics (VcMD). Although VcMD enhances sampling along a reaction coordinate, the method is free from estimation of a canonical distribution function along that coordinate. The method introduces a virtual system that does not necessarily obey a physical law; to enhance sampling, the virtual system is coupled with the molecular system to be studied. The resultant snapshots produce a canonical ensemble. The method was applied to a system consisting of two short peptides in an explicit solvent. A conventional molecular dynamics simulation ten times longer than the VcMD run was performed, along with adaptive umbrella sampling. Free-energy landscapes computed from the three simulations mutually converged well. VcMD provided quicker association/dissociation motions of the peptides than conventional molecular dynamics did. The VcMD method is applicable to various complicated systems because of its methodological simplicity. © 2018 Wiley Periodicals, Inc.

  5. Pippi — Painless parsing, post-processing and plotting of posterior and likelihood samples

    Science.gov (United States)

    Scott, Pat

    2012-11-01

    Interpreting samples from likelihood or posterior probability density functions is rarely as straightforward as it seems it should be. Producing publication-quality graphics of these distributions is often similarly painful. In this short note I describe pippi, a simple, publicly available package for parsing and post-processing such samples, as well as generating high-quality PDF graphics of the results. Pippi is easily and extensively configurable and customisable, both in its options for parsing and post-processing samples, and in the visual aspects of the figures it produces. I illustrate some of these using an existing supersymmetric global fit, performed in the context of a gamma-ray search for dark matter. Pippi can be downloaded and followed at http://github.com/patscott/pippi.

  6. A novel heterogeneous training sample selection method on space-time adaptive processing

    Science.gov (United States)

    Wang, Qiang; Zhang, Yongshun; Guo, Yiduo

    2018-04-01

    The performance of ground target detection with space-time adaptive processing (STAP) decreases when the clutter power becomes non-homogeneous because training samples are contaminated by target-like signals. To solve this problem, a novel nonhomogeneous training sample selection method based on sample similarity is proposed, which converts training sample selection into a convex optimization problem. Firstly, the deficiencies of sample selection using the generalized inner product (GIP) are analyzed. Secondly, the similarities of different training samples are obtained by calculating the mean-Hausdorff distance, so as to reject contaminated training samples. Thirdly, the cell under test (CUT) and the residual training samples are projected into the orthogonal subspace of the target in the CUT, and mean-Hausdorff distances between the projected CUT and training samples are calculated. Fourthly, the distances are sorted by value, and the training samples with the larger values are preferred, realizing the dimension reduction. Finally, simulation results with Mountain-Top data verify the effectiveness of the proposed method.
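    The selection steps above can be sketched numerically. The following is a minimal sketch, not the authors' implementation: the function names, the dimensions, and the use of a single target steering vector for the orthogonal projection are all assumptions, and the final ranking simply follows the abstract's stated preference for larger-distance samples:

```python
import numpy as np

def mean_hausdorff(a, b):
    # Mean (average) Hausdorff distance between two complex vectors,
    # treating their entries as point sets in the complex plane.
    d = np.abs(a[:, None] - b[None, :])
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

def select_samples(training, cut, steering, keep):
    """Rank training snapshots against the CUT in the subspace
    orthogonal to the (hypothetical) target steering vector."""
    v = steering / np.linalg.norm(steering)
    proj = lambda x: x - v * (v.conj() @ x)   # remove the target component
    cut_p = proj(cut)
    scores = np.array([mean_hausdorff(proj(s), cut_p) for s in training])
    # Per the abstract, the larger-distance samples are preferred.
    order = np.argsort(scores)[::-1]
    return order[:keep], scores

# Illustrative data: 10 training snapshots of dimension 8.
rng = np.random.default_rng(1)
steering = np.exp(1j * np.arange(8))
training = [rng.standard_normal(8) + 1j * rng.standard_normal(8)
            for _ in range(10)]
cut = rng.standard_normal(8) + 1j * rng.standard_normal(8)
idx, scores = select_samples(training, cut, steering, keep=4)
```

    The projection step is what protects the ranking from target-like contamination: any component along the steering vector is removed from both the CUT and the training samples before distances are compared.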

  7. The EnzymeTracker: an open-source laboratory information management system for sample tracking.

    Science.gov (United States)

    Triplet, Thomas; Butler, Gregory

    2012-01-26

    In many laboratories, researchers store experimental data on their own workstations using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, for which laboratories need to pay expensive license fees. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and to export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50 scientists.
The EnzymeTracker is freely available online at http

  8. Computer-controlled sampling system for airborne particulates

    International Nuclear Information System (INIS)

    Hall, C.F.; Anspaugh, L.R.; Koval, J.S.; Phelps, P.L.; Steinhaus, R.J.

    1975-01-01

    A self-contained, mobile, computer-controlled air-sampling system has been designed and fabricated that also collects and records the data from eight meteorological sensors. The air samplers are activated automatically when the collected meteorological data meet the criteria specified at the beginning of the data-collection run. The filters from the samplers are intended to collect airborne 239Pu for later radionuclide analysis and correlation with the meteorological data, for the study of resuspended airborne radioactivity and the development of a predictive model. This paper describes the system hardware, discusses the system and software concepts, and outlines the operational procedures for the system
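    The criteria-triggered activation described above can be sketched as a simple check run each polling cycle. All sensor names and thresholds below are illustrative assumptions, not the system's actual configuration:

```python
# Acceptance windows specified at the start of a data-collection run.
# Names and values are hypothetical examples.
CRITERIA = {                        # (min, max) windows per sensor
    "wind_speed_m_s":   (2.0, 10.0),
    "wind_dir_deg":     (180.0, 270.0),
    "rel_humidity_pct": (0.0, 80.0),
}

def criteria_met(readings, criteria=CRITERIA):
    """True when every monitored reading lies inside its window,
    i.e. when the air samplers should be running."""
    return all(lo <= readings[name] <= hi
               for name, (lo, hi) in criteria.items())

# Example polling-cycle readings.
resuspension_event = {"wind_speed_m_s": 6.5,
                      "wind_dir_deg": 210.0,
                      "rel_humidity_pct": 45.0}
calm_conditions = dict(resuspension_event, wind_speed_m_s=0.5)
```

    In use, the control loop would read the eight sensors, log the values, and switch the samplers on only while `criteria_met` holds, mirroring the automatic activation the abstract describes.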

  9. Diagnostic effectiveness of immunoassays systems for hepatitis C virus in samples from multi-transfusion patients

    International Nuclear Information System (INIS)

    Rivero Jimenez, Rene A; Merlin Linares, Julio C; Blanco de Armas, Madelin; Navea Leyva, Leonor M

    2009-01-01

    Hepatitis C virus (HCV) transmission through blood is a health problem in Cuba and worldwide. Several types of diagnostic immunoassays have been developed for blood certification, and in general they have high diagnostic sensitivity and specificity in healthy donors. However, their performance on samples from multi-transfused patients could be less effective. To assess the diagnostic effectiveness of the Cuban third-generation UMELISA HCV immunoassay (TecnoSUMA S.A., La Habana, Cuba) on samples from multi-transfused patients, 335 patient sera were processed in parallel by UBI HCV EIA 4.0 (United Biomedical, USA) and third-generation UMELISA HCV, and samples with discordant results were verified by the PCR COBAS AmpliScreen HCV Test, v2 system (Roche, USA). Comparing the third-generation UMELISA HCV system with UBI HCV EIA 4.0, a diagnostic sensitivity of 95.8% (95% CI: 92.5-99.15) and a diagnostic specificity of 100% (95% CI: 99.7-100) were achieved, with a Youden index of 0.96 (0.93-0.99) and kappa of 0.9582 (95% CI: 0.9276-0.9888), p = 0.000. Both immunoassay systems were satisfactory for the immunodiagnosis of multi-transfused patients

  10. Design and development on automated control system of coated fuel particle fabrication process

    International Nuclear Information System (INIS)

    Liu Malin; Shao Youlin; Liu Bing

    2013-01-01

    With the trend toward large-scale production of HTR coated fuel particles, the original manual control system can no longer meet requirements, and an industrial-grade automated control system for coated fuel particle fabrication needs to be developed. A comprehensive analysis of the successive four-layer coating process of TRISO-type coated fuel particles was carried out. It was found that the coating process could be divided into five subsystems and nine operating states, and the establishment of a DCS-type (distributed control system) automated control system was proposed. According to the rigorous requirements of the coated-particle preparation process, design considerations for the DCS were proposed, including the principles of coordinated control, safety and reliability, integration specification, practicality and ease of use, and openness and ease of updating. A complete automated control system for the coated fuel particle preparation process was built on the basis of these principles in manufacturing practice. The system was put into operation in the production of irradiated samples for the HTR-PM demonstration project. The experimental results prove that the system achieves better control of the coated fuel particle preparation process and meets the requirements of factory-scale production. (authors)

  11. A gamma camera image processing system

    International Nuclear Information System (INIS)

    Chen Weihua; Mei Jufang; Jiang Wenchuan; Guo Zhenxiang

    1987-01-01

    A microcomputer-based gamma camera image processing system is introduced. Compared with other systems, the feature of this system is that an inexpensive microcomputer has been combined with specially developed hardware, such as a data acquisition controller, a data processor and a dynamic display controller. Thus the image processing has been speeded up and the performance-to-cost ratio of the system raised

  12. Super-rapid medical film processing system

    International Nuclear Information System (INIS)

    Honda, C.; Iwata, M.; Nozaki, H.

    1988-01-01

    A new super-rapid medical film processing system cuts processing time from 90 to 45 seconds, a critical advantage in traumatic injury, surgical operation, and other time-vital applications. The system consists of new films, new processing chemicals (developer and fixer), and a new high-speed medical film processor. The system's creation is made possible by three new technologies. In film, multilayered monodispersed grains reduce processing time. In the processing chemicals, an innovative design maximizes processing speed. And in the processor itself, a new drying apparatus increases drying efficiency. Together, these technologies achieve 45-second processing without degradation of image quality

  13. Multi-channel data acquisition and processing system for moessbauer spectroscopy

    International Nuclear Information System (INIS)

    Jin Ge; Yang Yanming

    1987-01-01

    A multi-channel data acquisition and processing system for Moessbauer spectroscopy is described, which consists of an intelligent interface and a BC3-80 microcomputer. The system has eight data channels, each containing a counting circuit and a memory. A Z80 CPU is used as the main unit for control and access. The microcomputer is used for real-time spectrum display, saving data to disk, printing data and data processing. The system is applicable to a high counting-rate multi-wire proportional chamber and can greatly increase the counting rate for measuring Moessbauer spectra. The signals from each wire in the chamber pass through a corresponding amplifier and differential discriminator and are recorded by the corresponding data channel; the data of all channels are added by the microcomputer. In addition, two channels can be used to measure an absorption and a scattering spectrum at the same time, so that the internal and surface information of the sample is obtained simultaneously

  14. Stability of arsenic compounds in seafood samples during processing and storage by freezing

    DEFF Research Database (Denmark)

    Dahl, Lisbeth; Molin, Marianne; Amlund, Heidi

    2010-01-01

    was observed after processing or after storage by freezing. The content of tetramethylarsonium ion was generally low in all samples types, but increased significantly in all fried samples of both fresh and frozen seafood. Upon storage by freezing, the arsenobetaine content was reduced significantly, but only...

  15. X-Ray Computed Tomography: The First Step in Mars Sample Return Processing

    Science.gov (United States)

    Welzenbach, L. C.; Fries, M. D.; Grady, M. M.; Greenwood, R. C.; McCubbin, F. M.; Zeigler, R. A.; Smith, C. L.; Steele, A.

    2017-01-01

    The Mars 2020 rover mission will collect and cache samples from the martian surface for possible retrieval and subsequent return to Earth. If the samples are returned, that mission would likely present an opportunity to analyze returned Mars samples within a geologic context on Mars. In addition, it may provide definitive information about the existence of past or present life on Mars. Mars sample return presents unique challenges for the collection, containment, transport, curation and processing of samples [1]. Foremost in the processing of returned samples are the closely paired considerations of life detection and Planetary Protection. In order to achieve Mars Sample Return (MSR) science goals, reliable analyses will depend on overcoming some challenging signal/noise-related issues, where sparse martian organic compounds must be reliably analyzed against the contamination background. While reliable analyses will depend on initial clean acquisition and robust documentation of all aspects of developing and managing the cache [2], there needs to be a reliable sample handling and analysis procedure that accounts for a variety of materials which may or may not contain evidence of past or present martian life. A recent report [3] suggests that a defined set of measurements should be made to effectively inform both science and Planetary Protection, when applied in the context of the two competing null hypotheses: 1) that there is no detectable life in the samples; or 2) that there is martian life in the samples. The defined measurements would include a phased approach that would be accepted by the community to preserve the bulk of the material but provide unambiguous science data that can be used and interpreted by various disciplines. Foremost is the concern that the initial steps would ensure the pristine nature of the samples. Preliminary, non-invasive techniques such as computed X-ray tomography (XCT) have been suggested as the first method to interrogate and

  16. Research of pneumatic control transmission system for small irradiation samples

    International Nuclear Information System (INIS)

    Bai Zhongxiong; Zhang Haibing; Rong Ru; Zhang Tao

    2008-01-01

    In order to reduce absorbed-dose damage to the operator, pneumatic control has been adopted to realize rapid transmission of small irradiation samples. The on/off state of the pneumatic circuit and the transmission directions of the rapid transmission system are controlled by the electrical control part. The main program initializes the system, detects the position of the manual/automatic change-over switch, and calls the corresponding subprogram to achieve automatic or manual operation. The automatic subprogram achieves automatic sample transmission; the manual subprogram handles deflation and the back-and-forth movement of the irradiation samples. This paper introduces the implementation of the system in detail, in terms of both hardware and software design. (authors)

  17. Towards high performance processing in modern Java-based control systems

    International Nuclear Information System (INIS)

    Misiowiec, M.; Buczak, W.; Buttner, M.

    2012-01-01

    CERN controls software is often developed on a Java foundation. Some systems carry out a combination of data-, network- and processor-intensive tasks within strict time limits; hence there is a demand for high-performing, quasi real-time solutions. The system must handle tens of thousands of data samples every second along its three tiers, applying complex computations throughout. To accomplish this goal, a deep understanding of multi-threading, memory management and inter-process communication was required. There are unexpected traps hidden behind excessive use of 64-bit memory or the severe impact of modern garbage collectors on the processing flow. Tuning the JVM configuration significantly affects the execution of the code. Even more important are the number of threads and the data structures shared between them; accurately dividing work into independent tasks can boost system performance. Thorough profiling with dedicated tools helped understand the bottlenecks and choose algorithmically optimal solutions. Different virtual machines were tested in a variety of setups and with different garbage collection options. Overall, the work revealed the actual hard limits of the whole setup. We present this process of designing a challenging system in view of the characteristics and limitations of the contemporary Java run-time environment. (authors)

  18. Microchip systems for imaging liquid and high temperature processes in TEM & SEM

    DEFF Research Database (Denmark)

    Jensen, Eric; Canepa, Silvia; Møller-Nilsen, Rolf Erling Robberstad

    2014-01-01

    Microchip systems have found their way into electron microscopes as miniature platforms for controlled liquid and gaseous environments that also begin to include electrical contacts and other types of interactions with the sample, such as application of forces and irradiation with light. This presentation will explain the different types of microchip systems and give examples of some of the results we have achieved with our devices, and of how such devices can be used for research related to energy storage and conversion. Heaters can be made in several ways, and monocrystalline silicon [...]. Both systems will allow high-resolution imaging of heterogeneous electrochemical processes such as those in batteries. Based on the suspended microfluidic channels, we are also developing microchips that enable ultrafast freezing of processes in liquids.

  19. High quality joining techniques: in-process assurance (IPA) welding system

    Energy Technology Data Exchange (ETDEWEB)

    Kaihara, Shoichiro [Ishikawajima-Harima Heavy Industries Co. Ltd., Tokyo (Japan)

    1996-08-01

    On July 1, 1995, the Product Liability Law came into force, and further reliability has been demanded in industry. Recently, with the progress of electronics, the proportion of welding performed by automatic welders and robots has increased. By memorizing proper welding conditions, welding from the initial to the final pass can be done fully automatically. Feedback mechanisms have also been fitted to mechanized welders, making in-process control feasible. The approach to confirming in-process welding quality in arc welding is explained. The IPA welding system uses multimedia, collecting images and sound, sampling changes in welding conditions and the state of the arc on the same screen, and monitoring deviation from the range of proper welding conditions. In case of abnormality, an inspector or a computer carries out image diagnosis and welding control, and the system indicates the soundness of the welded parts. The basic concept and flow chart of this system are shown. An experiment applying the system to arc welding is reported, and the correlation of welding phenomena and welding conditions is examined. (K.I.)

  20. High quality joining techniques: in-process assurance (IPA) welding system

    International Nuclear Information System (INIS)

    Kaihara, Shoichiro

    1996-01-01

    On July 1, 1995, the Product Liability Law came into force, and further reliability has been demanded in industry. Recently, with the progress of electronics, the proportion of welding performed by automatic welders and robots has increased. By memorizing proper welding conditions, welding from the initial to the final pass can be done fully automatically. Feedback mechanisms have also been fitted to mechanized welders, making in-process control feasible. The approach to confirming in-process welding quality in arc welding is explained. The IPA welding system uses multimedia, collecting images and sound, sampling changes in welding conditions and the state of the arc on the same screen, and monitoring deviation from the range of proper welding conditions. In case of abnormality, an inspector or a computer carries out image diagnosis and welding control, and the system indicates the soundness of the welded parts. The basic concept and flow chart of this system are shown. An experiment applying the system to arc welding is reported, and the correlation of welding phenomena and welding conditions is examined. (K.I.)

  1. Study on sampling of continuous linear system based on generalized Fourier transform

    Science.gov (United States)

    Li, Huiguang

    2003-09-01

    In the study of signals and systems, a signal's spectrum and a system's frequency characteristic can be discussed through the Fourier Transform (FT) and the Laplace Transform (LT). However, some singular signals such as the impulse function and the signum signal satisfy neither Riemann nor Lebesgue integration; in mathematics they are called generalized functions. This paper introduces a new definition, the Generalized Fourier Transform (GFT), and discusses generalized functions, the Fourier Transform and the Laplace Transform within a unified frame. When a continuous linear system is sampled, this paper proposes a new method to judge whether the spectrum will overlap after the generalized Fourier transform (GFT). Causal and non-causal systems are studied, and a sampling method that maintains the system's dynamic performance is presented. The results can be used for ordinary sampling and non-Nyquist sampling, and they also have practical meaning for research on the discretization of continuous linear systems and on non-Nyquist sampling of signals and systems. In particular, the condition for ensuring controllability and observability of MIMO continuous systems in references 13 and 14 is an applicable example of this paper.
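    The spectrum-overlap question the paper analyzes can be illustrated with standard Nyquist reasoning (a generic numeric sketch, not the paper's GFT-based method): sampling a 7 Hz cosine at 10 Hz folds it onto 3 Hz, so the spectral copies overlap and the two signals become indistinguishable from their samples.

```python
import numpy as np

# 7 Hz cosine sampled at 10 Hz, below the Nyquist rate of 14 Hz.
f, fs = 7.0, 10.0
n = np.arange(50)
t = n / fs
x = np.cos(2 * np.pi * f * t)

# The samples match a 3 Hz cosine exactly: the spectral copies
# produced by sampling overlap (alias frequency |fs - f| = 3 Hz).
alias = abs(fs - f)
x_alias = np.cos(2 * np.pi * alias * t)
overlap = np.allclose(x, x_alias)      # sample sequences coincide

# Sampling at 20 Hz (above the Nyquist rate) removes the ambiguity.
t2 = n / 20.0
distinct = not np.allclose(np.cos(2 * np.pi * f * t2),
                           np.cos(2 * np.pi * alias * t2))
```

    The identity cos(2π·7n/10) = cos(2πn − 2π·7n/10) = cos(2π·3n/10) is exact, which is why the overlap check succeeds at every sample index.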

  2. A newly developed grab sampling system for collecting stratospheric air over Antarctica

    Directory of Open Access Journals (Sweden)

    Hideyuki Honda

    1996-07-01

    Full Text Available In order to measure the concentrations of various minor constituents and their isotopic ratios in the stratosphere over Antarctica, a simple grab sampling system was newly developed. The sampling system was designed to be launched by a small number of personnel using a rubber balloon under severe experimental conditions. Special attention was paid to minimizing contamination of the sample air, as well as to allowing easy handling of the system. The sampler consisted mainly of a 15-liter sample container with electromagnetic and manual valves, control electronics for executing the air sampling procedures and sending the position and status information of the sampler to the ground station, batteries and a transmitter. All these parts were assembled in an aluminum-frame gondola with a shock-absorbing system for landing. The sampler was equipped with a turn-over mechanism for the gondola to minimize contamination from the gondola, as well as with a GPS receiver and a rawinsonde for tracking. The total weight of the sampler was about 11 kg. To receive, display and store the position and status data of the sampling system at the ground station, a simple data acquisition system with a portable receiver and a microcomputer was also developed. A new gas handling system was prepared to simplify the injection of He gas into the balloon. For air sampling experiments, three sampling systems were launched at Syowa Station (69°00′S, 39°35′E), Antarctica, and then recovered on sea ice near the station on January 22 and 25, 1996.

  3. STP Position Paper: Recommended Practices for Sampling and Processing the Nervous System (Brain, Spinal Cord, Nerve, and Eye) during Nonclinical General Toxicity Studies

    Science.gov (United States)

    The Society of Toxicologic Pathology charged a Nervous System Sampling Working Group with devising recommended practices to routinely screen the central and peripheral nervous systems in Good Laboratory Practice-type nonclinical general toxicity studies. Brains should be trimmed ...

  4. Impact of implementing ISO 9001:2008 standard on the Spanish Renal Research Network biobank sample transfer process.

    Science.gov (United States)

    Cortés, M Alicia; Irrazábal, Emanuel; García-Jerez, Andrea; Bohórquez-Magro, Lourdes; Luengo, Alicia; Ortiz-Arduán, Alberto; Calleros, Laura; Rodríguez-Puyol, Manuel

    2014-01-01

    Biobank certification ISO 9001:2008 aims to improve the management of processes performed. This has two objectives: customer satisfaction and continuous improvement. This paper presents the impact of certification ISO 9001:2008 on the sample transfer process in a Spanish biobank specialising in kidney patient samples. The biobank experienced a large increase in the number of samples between 2009 (12,582 vials) and 2010 (37,042 vials). The biobank of the Spanish Renal Research Network (REDinREN), located at the University of Alcalá, has implemented ISO standard 9001:2008 for the effective management of human material given to research centres. Using surveys, we analysed two periods in the “sample transfer” process. During the first period between 1-10-12 and 26-11-12 (8 weeks), minimal changes were made to correct isolated errors. In the second period, between 7-01-13 and 18-02-13 (6 weeks), we carried out general corrective actions. The identification of problems and implementation of corrective actions for certification allowed: a 70% reduction in the process execution time, a significant increase (200%) in the number of samples processed and a 25% improvement in the process. The increase in the number of samples processed was directly related to process improvement. The certification of ISO standard 9001:2008, obtained in July 2013, allowed an improvement of the REDinREN biobank processes to be achieved, which increased quality and customer satisfaction.

  5. System to determine present elements in oily samples; Sistema para determinar elementos presentes en muestras oleosas

    Energy Technology Data Exchange (ETDEWEB)

    Mendoza G, Y

    2004-11-01

    The Chemistry Department of the National Institute of Nuclear Research of Mexico analyzes samples of oily and other materials to determine which elements of the periodic table are present, using the neutron activation analysis (NAA) technique. This technique has been developed to determine the major elements in any solid, aqueous, industrial or environmental sample. It basically consists of irradiating a sample with neutrons from the TRIGA Mark III reactor, analyzing the gamma spectra that the sample emits, and finally processing the information. The quantification is currently carried out manually, which requires a great number of calculations. The main objective of this project is the development of software that performs the quantitative NAA analysis for multielemental determination of samples automatically. To fulfill this objective, the project is divided into four chapters. The first chapter briefly presents the history of radioactivity and basic concepts that allow a better understanding of this work. The second chapter explains the NAA technique used in the sample analysis, describes the process to be carried out, mentions the characteristics of the devices used, and illustrates the process with an example. The third chapter describes the development of the algorithm and the selection of the programming language. The fourth chapter shows the structure of the system, its general mode of operation, the execution of processes and the obtaining of results. The results produced in the course of this project are then presented. (Author)
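    The quantification such software automates can be illustrated with the standard relative (comparator) NAA calculation, where the unknown concentration scales with the ratio of decay-corrected specific activities of sample and standard. This is a generic sketch under assumed, illustrative values, not the code described in this thesis:

```python
import math

LN2 = math.log(2)

def specific_activity(peak_area, live_time_s, decay_time_s,
                      half_life_s, mass_g):
    """Decay-corrected count rate per gram, referred back to the end
    of irradiation (simple model; detector geometry and efficiency
    cancel when sample and standard are counted identically)."""
    lam = LN2 / half_life_s
    return peak_area / live_time_s * math.exp(lam * decay_time_s) / mass_g

# Comparator method with illustrative numbers.
half_life_s = 2.245 * 60          # e.g. 28Al from 27Al(n,gamma)28Al
a_sample = specific_activity(12000, 300, 600, half_life_s, 0.50)
a_standard = specific_activity(20000, 300, 480, half_life_s, 0.10)
c_standard_ppm = 1000.0           # certified content of the standard
c_sample_ppm = c_standard_ppm * a_sample / a_standard
```

    Automating exactly this chain of corrections, once per gamma peak and per element, is what removes the "great number of calculations" the abstract mentions.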

  6. An Inductively-Powered Wireless Neural Recording System with a Charge Sampling Analog Front-End.

    Science.gov (United States)

    Lee, Seung Bae; Lee, Byunghun; Kiani, Mehdi; Mahmoudi, Babak; Gross, Robert; Ghovanloo, Maysam

    2016-01-15

    An inductively-powered wireless integrated neural recording system (WINeR-7) is presented for wireless and battery-less neural recording from freely-behaving animal subjects inside a wirelessly-powered standard homecage. The WINeR-7 system employs a novel wide-swing dual slope charge sampling (DSCS) analog front-end (AFE) architecture, which performs amplification, filtering, sampling, and analog-to-time conversion (ATC) with minimal interference and a small amount of power. The output of the DSCS-AFE produces a pseudo-digital pulse width modulated (PWM) signal. A circular shift register (CSR) time-division multiplexes (TDM) the PWM pulses to create a TDM-PWM signal, which is fed into an on-chip 915 MHz transmitter (Tx). The AFE and Tx are supplied at 1.8 V and 4.2 V, respectively, by a power management block, which includes a high-efficiency active rectifier and automatic resonance tuning (ART), operating at 13.56 MHz. The 8-ch system-on-a-chip (SoC) was fabricated in a 0.35-μm CMOS process, occupying 5.0 × 2.5 mm² and consuming 51.4 mW. For each channel, the sampling rate is 21.48 kHz and the power consumption is 19.3 μW. In vivo experiments were conducted on freely behaving rats in an energized homecage by continuously delivering 51.4 mW to the WINeR-7 system in a closed-loop fashion and recording local field potentials (LFP).

  7. DATA PROCESSING CONCEPTS FOR THE INTEGRATION OF SAR INTO OPERATIONAL VOLCANO MONITORING SYSTEMS

    Directory of Open Access Journals (Sweden)

    F. J. Meyer

    2013-05-01

    Remote sensing plays a critical role in operational volcano monitoring due to the often remote locations of volcanic systems and the large spatial extent of potential eruption precursor signals. Despite the all-weather capabilities of radar remote sensing and its high performance in monitoring change, the contribution of radar data to operational monitoring activities has been limited in the past. This is largely due to (1) the high data costs associated with radar data, (2) the slow data processing and delivery procedures, and (3) the limited temporal sampling provided by spaceborne radars. With this paper, we present new data processing and data integration techniques that mitigate some of the above-mentioned limitations and allow for a meaningful integration of radar remote sensing data into operational volcano monitoring systems. The data integration concept presented here combines advanced data processing techniques with fast data access procedures in order to provide high-quality radar-based volcano hazard information at improved temporal sampling rates. First performance analyses show that the integration of SAR can significantly improve the ability of operational systems to detect eruptive precursors. The developed technology is therefore expected to improve operational hazard detection, alerting, and management capabilities.

  8. Modulation transfer function cascade model for a sampled IR imaging system.

    Science.gov (United States)

    de Luca, L; Cardone, G

    1991-05-01

    The performance of the infrared scanning radiometer (IRSR) is strongly stressed in convective heat transfer applications where high spatial frequencies in the signal that describes the thermal image are present. The need to characterize more deeply the system spatial resolution has led to the formulation of a cascade model for the evaluation of the actual modulation transfer function of a sampled IR imaging system. The model can yield both the aliasing band and the averaged modulation response for a general sampling subsystem. For a line scan imaging system, which is the case of a typical IRSR, a rule of thumb that states whether the combined sampling-imaging system is either imaging-dependent or sampling-dependent is proposed. The model is tested by comparing it with other noncascade models as well as by ad hoc measurements performed on a commercial digitized IRSR.
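A minimal numeric sketch of a cascade MTF for a sampled imaging system: subsystem MTFs multiply, and the detector aperture contributes a sinc term. The Gaussian optics MTF and the specific parameters here are illustrative assumptions, not the paper's subsystem models.

```python
import math

def mtf_optics(f, f_cutoff):
    # Illustrative Gaussian optics MTF (assumed form, cycles per mm)
    return math.exp(-(f / f_cutoff) ** 2)

def mtf_sampling(f, pitch_mm):
    # Averaging over a detector aperture of width `pitch_mm` gives a sinc term
    x = math.pi * f * pitch_mm
    return 1.0 if x == 0 else abs(math.sin(x) / x)

def mtf_cascade(f, f_cutoff, pitch_mm):
    # Cascade model: the combined response is the product of subsystem MTFs
    return mtf_optics(f, f_cutoff) * mtf_sampling(f, pitch_mm)

# A 0.1 mm sampling pitch puts the Nyquist frequency at 5 cycles/mm;
# content above it folds back as aliasing.
nyquist = 0.5 / 0.1
print(round(mtf_cascade(2.0, 8.0, 0.1), 4))
```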

  9. Stabilization of nonlinear systems using sampled-data output-feedback fuzzy controller based on polynomial-fuzzy-model-based control approach.

    Science.gov (United States)

    Lam, H K

    2012-02-01

    This paper investigates the stability of sampled-data output-feedback (SDOF) polynomial-fuzzy-model-based control systems. Representing the nonlinear plant by a polynomial fuzzy model, an SDOF fuzzy controller is proposed to perform the control process using the system output information. As only the system output is available for feedback compensation, the controller design and system analysis are more challenging than in the full-state-feedback case. Furthermore, because of the sampling activity, the control signal is kept constant by the zero-order hold during the sampling period, which complicates the system dynamics and makes the stability analysis more difficult. In this paper, two cases of SDOF fuzzy controllers are considered, which do or do not share the same number of fuzzy rules as the fuzzy model. The system stability is investigated based on Lyapunov stability theory using the sum-of-squares (SOS) approach. SOS-based stability conditions are obtained to guarantee system stability and synthesize the SDOF fuzzy controller. Simulation examples are given to demonstrate the merits of the proposed SDOF fuzzy control approach.

  10. Solubility of airborne uranium samples from uranium processing plant

    International Nuclear Information System (INIS)

    Kravchik, T.; Oved, S.; Sarah, R.; Gonen, R.; Paz-Tal, O.; Pelled, O.; German, U.; Tshuva, A.

    2005-01-01

    During the production and machining of uranium metal, aerosols might be released to the air. Inhalation of these aerosols is the main route of internal exposure of workers. To assess the radiation dose from the intake of these uranium compounds it is necessary to know their absorption type, based on their dissolution rate in the extracellular aqueous environment of lung fluid. In its ICRP-66 model, the International Commission on Radiological Protection (ICRP) has assigned UF4 and UO3 to absorption type M (blood absorption with a 10% fraction absorbed with a half-time of 10 minutes and a 90% fraction with a half-time of 140 days) and UO2 and U3O8 to absorption type S (blood absorption with a half-time of 7000 days). The solubility classification of uranium compounds defined by the ICRP can serve as general guidance. At specific workplaces, differences can be encountered because of differences in compound production processes and the presence of additional compounds with different solubility characteristics. According to ICRP recommendations, material-specific rates of absorption should be preferred to default parameters whenever specific experimental data exist. Solubility profiles of uranium aerosols were determined by performing in vitro chemical solubility tests on air samples taken from uranium production and machining facilities. The dissolution rate was determined over 100 days in a simulant solution of the extracellular airway lining fluid. The filter sample was immersed in a test vial holding 60 ml of simulant fluid, which was maintained at 37 °C inside a thermostatic bath and at a physiological pH of 7.2-7.6. The test vials with the solution were shaken to simulate the conditions inside the extracellular aqueous environment of the lung as closely as possible. The tests indicated that the uranium aerosol samples taken from the metal production and machining facilities at the Nuclear Research Center Negev (NRCN)
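The ICRP two-component absorption model cited above can be evaluated directly: the fraction still undissolved at time t is fr·exp(−sr·t) + (1 − fr)·exp(−ss·t). A sketch with the type M default parameters (rates derived from the half-times quoted in the record):

```python
import math

def undissolved_fraction(t_days, fr, sr_per_day, ss_per_day):
    """Fraction of inhaled material still undissolved at time t for the ICRP
    two-component model: a rapid fraction fr dissolving at rate sr and a slow
    fraction (1 - fr) dissolving at rate ss."""
    return fr * math.exp(-sr_per_day * t_days) + (1 - fr) * math.exp(-ss_per_day * t_days)

# Type M defaults: 10% rapid (half-time 10 min) and 90% slow (half-time 140 d)
sr = math.log(2) / (10.0 / (60 * 24))   # rapid dissolution rate, per day
ss = math.log(2) / 140.0                # slow dissolution rate, per day
print(round(undissolved_fraction(100.0, 0.1, sr, ss), 3))  # → 0.549
```

At the 100-day end point of the in vitro test, essentially all of the rapid fraction is gone and roughly 55% of a type M material would remain undissolved.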

  11. Relative humidity effects on water vapour fluxes measured with closed-path eddy-covariance systems with short sampling lines

    DEFF Research Database (Denmark)

    Fratini, Gerardo; Ibrom, Andreas; Arriga, Nicola

    2012-01-01

    It has been formerly recognised that increasing relative humidity in the sampling line of closed-path eddy-covariance systems leads to increasing attenuation of water vapour turbulent fluctuations, resulting in strong latent heat flux losses. This occurrence has been analyzed for very long (50 m...... from eddy-covariance systems featuring short (4 m) and very short (1 m) sampling lines running at the same clover field and show that relative humidity effects persist also for these setups, and should not be neglected. Starting from the work of Ibrom and co-workers, we propose a mixed method...... and correction method proposed here is deemed applicable to closed-path systems featuring a broad range of sampling lines, and indeed applicable also to passive gases as a special case. The methods described in this paper are incorporated, as processing options, in the free and open-source eddy...

  12. Development and Evaluation of a Pilot Prototype Automated Online Sampling System

    International Nuclear Information System (INIS)

    Whitaker, M.J.

    2000-01-01

    An automated online sampling system has been developed for the BNFL-Hanford Technetium Monitoring Program. The system was designed to be flexible and allows for the collection and delivery of samples to a variety of detection devices that may be used

  13. Data Validation Package - April and July 2015 Groundwater and Surface Water Sampling at the Gunnison, Colorado, Processing Site

    Energy Technology Data Exchange (ETDEWEB)

    Linard, Joshua [Dept. of Energy (DOE), Washington, DC (United States). Office of Legacy Management; Campbell, Sam [Navarro Research and Engineering, Inc., Oak Ridge, TN (United States)

    2016-02-01

    This event included annual sampling of groundwater and surface water locations at the Gunnison, Colorado, Processing Site. Sampling and analyses were conducted as specified in the Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites. Samples were collected in April from 28 monitoring wells, three domestic wells, and six surface locations at the processing site as specified in the 2010 Ground Water Compliance Action Plan for the Gunnison, Colorado, Processing Site. Domestic wells 0476 and 0477 were sampled in July because the homes were unoccupied in April and the wells were not in use. Duplicate samples were collected from locations 0113, 0248, and 0477. One equipment blank was collected during this sampling event. Water levels were measured at all monitoring wells that were sampled. No issues that require additional action or follow-up were identified during the data validation process.

  14. Progressive sample processing of band selection for hyperspectral imagery

    Science.gov (United States)

    Liu, Keng-Hao; Chien, Hung-Chang; Chen, Shih-Yu

    2017-10-01

    Band selection (BS) is one of the most important topics in hyperspectral image (HSI) processing. The objective of BS is to find a set of representative bands that can represent the whole image with low inter-band redundancy. Many types of BS algorithms have been proposed in the past. However, most of them can only be carried out in an off-line manner, meaning they can only be applied to pre-collected data. Such off-line methods are of limited use in time-critical applications, particularly disaster prevention and target detection. To tackle this issue, a new concept, called progressive sample processing (PSP), was proposed recently. PSP is an "on-line" framework in which a specific type of algorithm can process the currently collected data during data transmission under the band-interleaved-by-sample/pixel (BIS/BIP) protocol. This paper proposes an online BS method that integrates sparse-based BS into the PSP framework, called PSP-BS. In PSP-BS, BS is carried out by updating the BS result recursively, pixel by pixel, in the same way that a Kalman filter updates data information in a recursive fashion. The sparse regression is solved by the orthogonal matching pursuit (OMP) algorithm, and the recursive equations of PSP-BS are derived using matrix decomposition. Experiments conducted on a real hyperspectral image show that PSP-BS can progressively output the BS status with very low computing time. Convergence of the BS results during transmission can be achieved quickly by using a rearranged pixel transmission sequence. This significant advantage allows BS to be implemented in real time as the HSI data are transmitted pixel by pixel.
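The sparse regression step is solved by orthogonal matching pursuit. A compact, generic OMP (not the recursive PSP-BS update derived in the paper; the dictionary and signal below are toy assumptions) might look like:

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal matching pursuit: greedily pick k columns (atoms) of
    dictionary D that best explain y, re-fitting the coefficients by least
    squares after each selection."""
    residual = y.astype(float).copy()
    support, coef = [], None
    for _ in range(k):
        corr = D.T @ residual
        corr[support] = 0.0                      # do not reselect chosen atoms
        support.append(int(np.argmax(np.abs(corr))))
        sub = D[:, support]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
        residual = y - sub @ coef                # orthogonalise the residual
    return support, coef

# Toy example: y is an exact mix of atoms 0 and 2
D = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
y = 3.0 * D[:, 0] + 2.0 * D[:, 2]
support, coef = omp(D, y, 2)
print(sorted(support))  # → [0, 2]
```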

  15. Wavelet data processing of micro-Raman spectra of biological samples

    Science.gov (United States)

    Camerlingo, C.; Zenone, F.; Gaeta, G. M.; Riccio, R.; Lepore, M.

    2006-02-01

    A wavelet multi-component decomposition algorithm is proposed for processing data from micro-Raman spectroscopy (μ-RS) of biological tissue. μ-RS has recently been recognized as a promising tool for biopsy tests and in vivo diagnosis of degenerative human tissue pathologies, due to the high chemical and structural information content of this spectroscopic technique. However, measurements of biological tissues are usually hampered by typically low-level signals and by the presence of noise and background components caused by light diffusion or fluorescence. To overcome these problems, a numerical method based on the discrete wavelet transform is used to analyse data from μ-RS measurements performed in vitro on animal (pig and chicken) tissue samples and, in a preliminary form, on human skin and oral tissue biopsies from normal subjects. Visible-light μ-RS was performed using a He-Ne laser and a monochromator with a liquid-nitrogen-cooled charge-coupled device equipped with a grating of 1800 grooves mm⁻¹. The validity of the proposed data procedure has been tested on the well-characterized Raman spectra of reference acetylsalicylic acid samples.
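The paper's multi-component decomposition is not reproduced here, but the underlying idea (transform, shrink the detail coefficients, invert) can be illustrated with a single-level Haar transform and soft thresholding; the spectrum values and threshold below are hypothetical.

```python
import math

def haar_dwt(x):
    """One level of the Haar transform: approximation and detail coefficients."""
    a = [(x[2*i] + x[2*i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def haar_idwt(a, d):
    """Inverse of haar_dwt (exact reconstruction when nothing is thresholded)."""
    x = []
    for ai, di in zip(a, d):
        x.append((ai + di) / math.sqrt(2))
        x.append((ai - di) / math.sqrt(2))
    return x

def soft_threshold(coeffs, thr):
    """Shrink coefficients toward zero; small (noise-like) details vanish."""
    return [math.copysign(max(abs(v) - thr, 0.0), v) for v in coeffs]

def denoise(x, thr):
    a, d = haar_dwt(x)
    return haar_idwt(a, soft_threshold(d, thr))

# Hypothetical 8-point spectrum with a Raman band around index 4-5
spectrum = [10.0, 10.2, 10.1, 9.9, 30.0, 30.1, 10.0, 10.1]
clean = denoise(spectrum, 0.2)
```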

  16. MarsVac: Pneumatic Sampling System for Planetary Exploration

    Science.gov (United States)

    Zacny, K.; Mungas, G.; Chu, P.; Craft, J.; Davis, K.

    2008-12-01

    We propose a Mars Sample Return scheme whereby a sample of regolith is acquired directly into a Mars Ascent Vehicle using a pneumatic system. Unlike prior developments that used suction to collect fines, the proposed system uses positive pressure to move the regolith. We envisage three pneumatic tubes embedded inside the three legs of the lander. Upon landing, the legs will bury themselves into the regolith and the tubes will fill with regolith. With one puff of gas, the regolith can be lifted into a sampling chamber onboard the Mars Ascent Vehicle. An additional chamber can be opened to acquire atmospheric gas and dust. The entire MSR system will require only (1) an actuator to open and close the sampling chamber and (2) a valve to open the gas cylinder. In a recent study related to lunar excavation, funded under the NASA SBIR program, we have shown that it is possible to lift over 3000 grams of soil with only 1 gram of gas at 1 atm. Tests were also conducted under Mars atmospheric pressure conditions (5 torr). In September 2008, we will perform tests at 1/6 g (Moon) and 1/3 g (Mars) to determine mass lifting efficiencies in reduced gravities.

  17. Double-Shell Tank (DST) Ventilation System Vapor Sampling and Analysis Plan

    International Nuclear Information System (INIS)

    SASAKI, L.M.

    2000-01-01

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for vapor samples from the primary ventilation systems of the AN, AP, AW, and AY/AZ tank farms. Sampling will be performed in accordance with Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis (Air DQO) (Mulkey 1999). The sampling will verify whether current air emission estimates used in the permit application are correct and provide information for future air permit applications. Vapor samples will be obtained from tank farm ventilation systems, downstream from the tanks and upstream of any filtration. Samples taken in support of the DQO will consist of SUMMA(trademark) canisters, triple sorbent traps (TSTs), sorbent tube trains (STTs), and polyurethane foam (PUF) samples. Particulate filter samples and tritium traps will be taken for radiation screening to allow the release of the samples for analysis. The following sections provide the general methodology and procedures to be used in the preparation, retrieval, transport, analysis, and reporting of results from the vapor samples.

  18. 40 CFR 205.171-2 - Test exhaust system sample selection and preparation.

    Science.gov (United States)

    2010-07-01

    ... Systems § 205.171-2 Test exhaust system sample selection and preparation. (a)(1) Exhaust systems... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Test exhaust system sample selection and preparation. 205.171-2 Section 205.171-2 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY...

  19. 21 CFR 211.110 - Sampling and testing of in-process materials and drug products.

    Science.gov (United States)

    2010-04-01

    ... PHARMACEUTICALS Production and Process Controls § 211.110 Sampling and testing of in-process materials and drug... capsule weight variation; (2) Disintegration time; (3) Adequacy of mixing to assure uniformity and... production process, e.g., at commencement or completion of significant phases or after storage for long...

  20. Data process of liquid scintillation counting

    International Nuclear Information System (INIS)

    Ishikawa, Hiroaki; Kuwajima, Susumu.

    1975-01-01

    The use of liquid scintillation counting systems has spread significantly because automatic sample changers and printers have come to be incorporated. However, the system will only be fully systematized when automatic data processing and automatic preparation of the radioactive samples to be measured are realized. Dry or wet oxidation is applied in sample preparation when radioactive materials are difficult to dissolve in the scintillator solution. Over the past several years, automatic sample combustion systems, which automate dry oxidation, have spread rapidly and contribute greatly to labor saving. Since printers generally report only the counted number, data processing systems have been developed that speed up the calculation process and automatically correct for sample quenching to obtain the required final radioactivity. Data processing systems are roughly divided into on-line and off-line systems according to whether computers are connected directly or indirectly, while the hardware is classified into input, calculating and output devices. The calculation to determine sample activity by the external standard method is also explained. (Wakatsuki, Y.)
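The external standard method mentioned at the end corrects for quenching by converting a measured quench index into a counting efficiency through a calibration curve, then dividing the net count rate by that efficiency. A sketch with a made-up calibration curve (the curve points, ESCR value, and count rates are all hypothetical):

```python
def efficiency_from_escr(escr, calibration):
    """Linear interpolation on a quench calibration curve:
    external-standard channel ratio (ESCR) vs counting efficiency."""
    pts = sorted(calibration)
    for (x0, e0), (x1, e1) in zip(pts, pts[1:]):
        if x0 <= escr <= x1:
            return e0 + (e1 - e0) * (escr - x0) / (x1 - x0)
    raise ValueError("ESCR outside calibration range")

def dpm(cpm_gross, cpm_background, escr, calibration):
    """Disintegrations per minute = net count rate / counting efficiency."""
    eff = efficiency_from_escr(escr, calibration)
    return (cpm_gross - cpm_background) / eff

cal = [(0.4, 0.30), (0.6, 0.60), (0.8, 0.90)]  # hypothetical quench curve
print(round(dpm(6300.0, 300.0, 0.5, cal), 1))  # → 13333.3
```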

  1. Using the sampling method to propagate uncertainties of physical parameters in systems with fissile material

    International Nuclear Information System (INIS)

    Campolina, Daniel de Almeida Magalhães

    2015-01-01

    There is an uncertainty in all the components that comprise the model of a nuclear system. Assessing the impact of uncertainties in the simulation of fissile material systems is essential for realistic calculations, which have been replacing conservative model calculations as computational power increases. The propagation of uncertainty in a Monte Carlo simulation by sampling the input parameters is recent because of the huge computational effort required. By analyzing the uncertainty propagated to the effective neutron multiplication factor (keff), the effects of sample size, computational uncertainty and the efficiency of a random number generator in representing the distributions that characterize physical uncertainty in a light water reactor were investigated. A program entitled GBsample was implemented to enable the application of the random sampling method, which requires an automated process and robust statistical tools. The program was based on the black-box model, and the MCNPX code with parallel processing was used for the particle transport calculation. The uncertainties considered were taken from a benchmark experiment in which the effect on keff of physical uncertainties is assessed through a conservative method. It was found that the efficiency of the random sampling method can be improved by selecting distributions obtained from a random number generator so as to obtain a better representation of the uncertainty figures. After convergence of the method is achieved, the best number of components to be sampled was determined, in order to reduce the variance of the propagated uncertainty without increasing computational time. It was also observed that if the sampling method is used to calculate the effect on keff due to physical uncertainties reported by
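The sampling-based propagation the record describes reduces to a simple loop: draw the uncertain inputs from their distributions, run the model for each draw, and take statistics of the resulting keff values. The model function below is a smooth toy placeholder, not MCNPX, and the input distributions are assumptions.

```python
import random
import statistics

def keff_model(enrichment, density):
    # Toy stand-in for the transport code: NOT a physics model, just a
    # smooth function of two uncertain inputs (both nominally 1.0).
    return 0.9 + 0.05 * enrichment + 0.02 * density

def propagate(n_samples, seed=1):
    """Sample the inputs, evaluate the model, and return (mean, stdev) of keff."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_samples):
        e = rng.gauss(1.0, 0.05)   # assumed input distributions
        d = rng.gauss(1.0, 0.03)
        results.append(keff_model(e, d))
    return statistics.mean(results), statistics.stdev(results)

mean, std = propagate(1000)
print(round(mean, 3))
```

The propagated standard deviation approaches the analytic value sqrt((0.05·0.05)² + (0.02·0.03)²) as the sample size grows, which is how sample-size effects like those studied in the thesis can be examined.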

  2. State-of-the-art mass spectrometer system for determination of uranium and plutonium isotopic distributions in process samples

    International Nuclear Information System (INIS)

    Polson, C.A.

    1983-01-01

    A Finnigan MAT 261 automated thermal ionization mass spectrometer system was purchased by the Savannah River Plant. The MAT 261 is a highly precise, fully automated instrument. Many features make this instrument state of the art in precision isotopic composition measurements. A unique feature of the MAT 261 is the ion detection system, which permits measurement of three uranium or plutonium masses simultaneously. All Faraday cup measuring channels are of the same design and each is equipped with a dedicated amplifier. Each amplifier is connected to a linear voltage/frequency measuring system for ion current integration. These outputs are fed into a Hewlett-Packard 9845T desk-top computer. The computer, and the Finnigan-developed software package, control filament heating cycles, sample preconditioning, ion beam focusing, carrousel rotation, mass selection, and data collection and reduction. Precision, accuracy, and linearity were determined under normal laboratory conditions using an NBS suite of uranium standards. These results, along with other developments in setting up the instrument, are presented.

  3. Using Process Mining to Learn from Process Changes in Evolutionary Systems

    NARCIS (Netherlands)

    Günther, Christian W.; Rinderle, S.B.; Reichert, M.U.; van der Aalst, Wil M.P.; Recker, Jan

    2008-01-01

    Traditional information systems struggle with the requirement to provide flexibility and process support while still enforcing some degree of control. Accordingly, adaptive process management systems (PMSs) have emerged that provide some flexibility by enabling dynamic process changes during

  4. Development of RF non-IQ sampling module for Helium RFQ LLRF system

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Hae-Seong; Ahn, Tae-Sung; Kim, Seong-Gu; Kwon, Hyeok-Jung; Kim, Han-Sung; Song, Young-Gi; Seol, Kyung-Tae; Cho, Yong-Sub [KOMAC, Gyeongju (Korea, Republic of)

    2015-05-15

    KOMAC (Korea Multi-purpose Accelerator Complex) plans to develop a helium irradiation system. This system includes ion source, LEBT, RFQ and MEBT systems to transport helium particles to the target. In particular, the RFQ (Radio Frequency Quadrupole) system should receive 200 MHz RF within 1% amplitude error stability. To supply stable 200 MHz RF to the RFQ, the low-level radio frequency (LLRF) should be controlled by the control system. The helium RFQ LLRF control system adopted a non-IQ sampling method to sample the analog input RF. The sampled input data are processed to obtain the I and Q values, which are used to monitor the amplitude and phase of the RF signal. In this paper, the non-IQ sampling logic and the amplitude- and phase-calculating logic of the FPGA are introduced. The non-IQ sampling module and the amplitude and phase computing module were developed using the Xilinx ISE design suite, a tool for developing FPGA logic. In the future, a PI gain module and a frequency error computing module will be developed.
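Once non-IQ samples of the RF are available, the I and Q values follow from correlating one cycle of samples with quadrature references, and amplitude and phase follow from I and Q. This sketch assumes five samples per RF cycle, a common non-IQ choice; the record does not state the actual FPGA sampling ratio.

```python
import math

def iq_demodulate(samples, samples_per_cycle):
    """Recover I/Q (hence amplitude and phase) from non-IQ samples spanning
    one RF cycle by correlating with cosine and sine references."""
    n = len(samples)
    i_sum = sum(s * math.cos(2 * math.pi * k / samples_per_cycle)
                for k, s in enumerate(samples))
    q_sum = sum(s * math.sin(2 * math.pi * k / samples_per_cycle)
                for k, s in enumerate(samples))
    i_val = 2.0 * i_sum / n          # factor 2/n normalises the correlation
    q_val = 2.0 * q_sum / n
    amplitude = math.hypot(i_val, q_val)
    phase_deg = math.degrees(math.atan2(q_val, i_val))
    return amplitude, phase_deg

# Synthetic signal: amplitude 1.8, phase 30 degrees, 5 samples per cycle
spc = 5
sig = [1.8 * math.cos(2 * math.pi * k / spc - math.radians(30)) for k in range(spc)]
amp, ph = iq_demodulate(sig, spc)
print(round(amp, 3), round(ph, 1))  # → 1.8 30.0
```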

  5. A real-time data acquisition and processing system for the analytical laboratory automation of a HTR spent fuel reprocessing facility

    International Nuclear Information System (INIS)

    Watzlawik, K.H.

    1979-12-01

    A real-time data acquisition and processing system for the analytical laboratory of an experimental HTR spent fuel reprocessing facility is presented. The on-line open-loop system combines in-line and off-line analytical measurement procedures including data acquisition and evaluation as well as analytical laboratory organisation under the control of a computer-supported laboratory automation system. In-line measurements are performed for density, volume and temperature in process tanks and registration of samples for off-line measurements. Off-line computer-coupled experiments are potentiometric titration, gas chromatography and X-ray fluorescence analysis. Organisational sections like sample registration, magazining, distribution and identification, multiple data assignment and especially calibrations of analytical devices are performed by the data processing system. (orig.) [de]

  6. Design and Development of a Robot-Based Automation System for Cryogenic Crystal Sample Mounting at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Shu, D.; Preissner, C.; Nocher, D.; Han, Y.; Barraza, J.; Lee, P.; Lee, W.-K.; Cai, Z.; Ginell, S.; Alkire, R.; Lazarski, K.; Schuessler, R.; Joachimiak, A.

    2004-01-01

    X-ray crystallography is the primary method to determine the 3D structures of complex macromolecules at high resolution. In the years to come, the Advanced Photon Source (APS) and similar 3rd-generation synchrotron sources elsewhere will become the most powerful tools for studying atomic structures of biological molecules. One of the major bottlenecks in the x-ray data collection process is the constant need to change and realign the crystal sample. This is a very time- and manpower-consuming task. An automated sample mounting system will help to solve this bottleneck problem. We have developed a novel robot-based automation system for cryogenic crystal sample mounting at the APS. Design of the robot-based automation system, as well as its on-line test results at the Argonne Structural Biology Center (SBC) 19-BM experimental station, are presented in this paper

  7. Development of a compact and cost effective multi-input digital signal processing system

    Science.gov (United States)

    Darvish-Molla, Sahar; Chin, Kenrick; Prestwich, William V.; Byun, Soo Hyun

    2018-01-01

    A prototype digital signal processing system (DSP) was developed using a microcontroller interfaced with a 12-bit sampling ADC, which offers a considerably inexpensive solution for processing multiple detectors with high throughput. After digitization of the incoming pulses, a simple pulse height analysis algorithm was employed to maximize the output counting rate. Moreover, an algorithm aiming at real-time pulse pile-up deconvolution was implemented. The system was tested using a NaI(Tl) detector, in comparison with a traditional analogue system and a commercial digital system, over a variety of count rates. The performance of the prototype system was consistently superior to the analogue and commercial digital systems up to an input count rate of 61 kcps, while at higher input rates it was slightly inferior to the commercial digital system but still superior to the analogue system. Considering overall cost, size and flexibility, this custom-made multi-input digital signal processing system (MMI-DSP) was the most reliable choice for 2D microdosimetric data collection, or for any measurement in which simultaneous collection of data from multiple inputs is required.
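A minimal software analogue of the pulse height analysis step (threshold crossing, local maximum as pulse height, histogramming into an MCA-style spectrum); the pile-up deconvolution algorithm itself is not reproduced, and the trace below is synthetic.

```python
def pulse_height_analysis(trace, threshold, n_bins, max_height):
    """Minimal PHA: find threshold crossings, take the local maximum of each
    pulse as its height, and fill a histogram (a pulse-height spectrum)."""
    spectrum = [0] * n_bins
    i, n = 0, len(trace)
    while i < n:
        if trace[i] > threshold:
            j = i
            while j < n and trace[j] > threshold:   # walk to end of the pulse
                j += 1
            height = max(trace[i:j])
            bin_idx = min(int(height / max_height * n_bins), n_bins - 1)
            spectrum[bin_idx] += 1
            i = j
        else:
            i += 1
    return spectrum

# Synthetic digitized trace with two pulses (heights 8 and 9)
trace = [0, 0, 2, 8, 3, 0, 0, 1, 6, 9, 4, 0, 0]
spec = pulse_height_analysis(trace, 1.5, 4, 10.0)
print(spec)  # → [0, 0, 0, 2]
```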

  8. Process control using modern systems of information processing

    International Nuclear Information System (INIS)

    Baldeweg, F.

    1984-01-01

    Modern digital automation techniques allow the application of demanding types of process control. These types of process control are characterized by their belonging to higher levels in a multilevel model. Functional and technical aspects of the performance of digital automation plants are presented and explained. A modern automation system is described considering special procedures of process control (e.g. real time diagnosis)

  9. A microcomputer controlled sample changer system for γ-ray spectroscopy

    International Nuclear Information System (INIS)

    Jost, D.T.; Kraehenbuehl, U.; Gunten, H.R. von

    1982-01-01

    A Z-80 based microcomputer is used to control a sample changer system connected to two γ-ray spectrometers. Samples are changed according to preselected counting criteria (maximum time and/or desired precision). Special precautions were taken to avoid the loss of information and of samples in case of a power failure. (orig.)
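The preselected counting criteria (maximum time and/or desired precision) map naturally onto a stop rule based on counting statistics: the relative error of N accumulated counts is 1/sqrt(N). A sketch of such a rule, with hypothetical thresholds; the record does not describe the Z-80 firmware itself.

```python
import math

def should_advance(counts, elapsed_s, max_time_s, target_rel_precision):
    """Decide whether the changer should move to the next sample: either the
    time limit is reached, or the counting statistics already meet the
    requested relative precision (1/sqrt(N) for N counts)."""
    if elapsed_s >= max_time_s:
        return True
    if counts > 0 and 1.0 / math.sqrt(counts) <= target_rel_precision:
        return True
    return False

# 10,000 counts give 1% relative precision, so counting can stop early
print(should_advance(10000, 120.0, 3600.0, 0.01))  # → True
```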

  10. Synchrotron/crystal sample preparation

    Science.gov (United States)

    Johnson, R. Barry

    1993-01-01

    The Center for Applied Optics (CAO) of the University of Alabama in Huntsville (UAH) prepared this final report entitled 'Synchrotron/Crystal Sample Preparation' in completion of contract NAS8-38609, Delivery Order No. 53. Hughes Danbury Optical Systems (HDOS) is manufacturing the Advanced X-ray Astrophysics Facility (AXAF) mirrors. These thin-walled, grazing incidence, Wolter Type-1 mirrors, varying in diameter from 1.2 to 0.68 meters, must be ground and polished using state-of-the-art techniques in order to prevent undue stress due to damage or the presence of crystals and inclusions. The effect of crystals on the polishing and grinding process must also be understood. This involves coating special samples of Zerodur and measuring the reflectivity of the coatings in a synchrotron system. In order to gain the understanding needed on the effect of the Zerodur crystals by the grinding and polishing process, UAH prepared glass samples by cutting, grinding, etching, and polishing as required to meet specifications for witness bars for synchrotron measurements and for investigations of crystals embedded in Zerodur. UAH then characterized these samples for subsurface damage and surface roughness and figure.

  11. Nuclear-station post-accident liquid-sampling system: developed by Duke Power Company

    International Nuclear Information System (INIS)

    Burton, D.A.; Birch, M.L.; Orth, W.C.

    1981-01-01

    The accident at Three Mile Island showed that means must be provided to determine the radioactivity levels in high activity liquid and gaseous systems of a nuclear power plant without undue radiation exposure to personnel. The Duke Power Post Accident Liquid Sampling System provides the means for obtaining diluted liquid samples and diluted dissolved gas samples following a reactor accident involving substantial core damage. Their approach yields a straightforward engineering solution at a fraction of the cost of other systems. A description of the system, general design criteria, and color coded flow diagrams are included

  12. Test plan for evaluating the performance of the in-tank fluidic sampling system

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    The PHMC will provide Low Activity Waste (LAW) tank wastes for final treatment by a privatization contractor from double-shell feed tanks 241-AP-102 and 241-AP-104. Concerns about the inability of the baseline "grab" sampling to provide large-volume samples within time constraints have led to the development of a conceptual sampling system that would be deployed in a feed tank riser. This sampling system will provide large-volume, representative samples without the environmental, radiation exposure, and sample volume impacts of the current baseline "grab" sampling method. This test plan identifies "proof-of-principle" cold tests for the conceptual sampling system using simulant materials. The need for additional testing was identified as a result of completing the tests described in the previous revision of this test plan. Revision 1 outlines tests that will evaluate the system's performance and its ability to provide samples that are representative of a tank's contents within a 95 percent confidence interval, to recover from plugging, to sample supernatant wastes with over 25 wt% solids content, and to evaluate the impact of sampling at different heights within the feed tank. The test plan also identifies operating parameters that will optimize the performance of the sampling system.

  13. Automated processing of forensic casework samples using robotic workstations equipped with nondisposable tips: contamination prevention.

    Science.gov (United States)

    Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M

    2008-05-01

    An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol.
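The bleach dilution above is simple arithmetic worth making explicit; a minimal sketch, using the 10.8% stock concentration and 1/5th dilution factor given in the abstract:

```python
# Dilution of the commercial bleach stock to the working wash concentration.
stock_pct = 10.8       # commercial bleach stock, % sodium hypochlorite
dilution_factor = 5    # 1 part stock + 4 parts water = 1/5th dilution

working_pct = stock_pct / dilution_factor
print(f"working wash: {working_pct:.2f}% sodium hypochlorite")  # ~2.16%, reported as ~2%
```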

  14. A New Cryogenic Sample Manipulator For SRC's Scienta 2002 System

    International Nuclear Information System (INIS)

    Gundelach, Chad T.; Fisher, Mike V.; Hoechst, Hartmut

    2004-01-01

    We discuss the first bench tests of a sample manipulator which was recently designed at SRC for the Scienta 2002 User system. The manipulator concept utilizes the 10 deg. angular window of the Scienta in the horizontal plane (angle dispersion) by rotating the sample normal around the vertical axis while angular scans along the vertical axis (energy dispersion) are continuous within ±30 deg. relative to the electron lens by rotating the sample around the horizontal axis. With this concept it is possible to precisely map the entire two-dimensional k-space of a crystal by means of stitching together 10 deg. wide stripes centered +15 deg. to -50 deg. relative to the sample normal. Three degrees of translational freedom allow positioning the sample surface at the focal point of the analyzer. Two degrees of rotational freedom are available at this position for manipulating the sample. Samples are mounted to a standard holder and transferred to the manipulator via a load-lock system attached to a prep chamber. The manipulator is configured with a cryogenic cold head, an electrical heater, and a temperature sensor permitting continuous closed-loop operation for 20-380 K
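The stripe-stitching geometry can be sketched numerically. The 10 deg. analyser window and the +15 deg. to -50 deg. range of stripe centers are taken from the abstract; the assumption that centers step in 10 deg. increments (so stripes tile without overlap) is ours:

```python
import numpy as np

stripe_width = 10.0  # deg, angular window of the Scienta analyser
# Assumed: stripe centers spaced one stripe width apart, from +15 deg downward.
centers = np.arange(15.0, -50.0 - 1e-9, -stripe_width)
# Each stripe covers [center - 5, center + 5] deg relative to the sample normal.
coverage = (centers.max() + stripe_width / 2, centers.min() - stripe_width / 2)
print(list(centers), coverage)
```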

  15. A system for on-line monitoring of light element concentration distributions in thin samples

    Energy Technology Data Exchange (ETDEWEB)

    Brands, P.J.M. E-mail: p.j.m.brands@tue.nl; Mutsaers, P.H.A.; Voigt, M.J.A. de

    1999-09-02

At the Cyclotron Laboratory, a scanning proton microprobe is used to determine concentration distributions in biomedical samples. The data acquired in these measurements used to be analysed in a time-consuming off-line analysis. To avoid the loss of valuable measurement and analysis time, DYANA was developed. DYANA is an on-line method for the analysis of data from biomedical measurements. By using a database of background shapes, light elements, such as Na and Mg, can be fitted even more precisely than in conventional fitting procedures. The entire analysis takes only several seconds and is performed while the acquisition system is gathering a new subset of data. Data acquisition must be guaranteed and must not be interfered with by other parallel processes. Therefore, the analysis, the data acquisition and the experiment control are performed on a PCI-based Pentium personal computer (PC) running a real-time operating system. A second PC is added to run a graphical user interface for interaction with the experimenter and for monitoring of the analysed results. The system is illustrated here using atherosclerotic tissue but is applicable to all kinds of thin samples.

  16. Performance of Identifiler Direct and PowerPlex 16 HS on the Applied Biosystems 3730 DNA Analyzer for processing biological samples archived on FTA cards.

    Science.gov (United States)

    Laurin, Nancy; DeMoors, Anick; Frégeau, Chantal

    2012-09-01

Direct amplification of STR loci from biological samples collected on FTA cards without prior DNA purification was evaluated using Identifiler Direct and PowerPlex 16 HS in conjunction with the use of a high throughput Applied Biosystems 3730 DNA Analyzer. In order to reduce the overall sample processing cost, reduced PCR volumes combined with various FTA disk sizes were tested. Optimized STR profiles were obtained using a 0.53 mm disk size in 10 μL PCR volume for both STR systems. These protocols proved effective in generating high quality profiles on the 3730 DNA Analyzer from both blood and buccal FTA samples. Reproducibility, concordance, robustness, sample stability and profile quality were assessed using a collection of blood and buccal samples on FTA cards from volunteer donors as well as from convicted offenders. The newly developed protocols offer enhanced throughput capability and cost effectiveness without compromising the robustness and quality of the STR profiles obtained. These results support the use of these protocols for processing convicted offender samples submitted to the National DNA Data Bank of Canada. Similar protocols could be applied to the processing of casework reference samples or in paternity or family relationship testing. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  17. Identification of continuous-time systems from samples of input ...

    Indian Academy of Sciences (India)

Abstract. This paper presents an introductory survey of the methods that have been developed for identification of continuous-time systems from samples of input-output data. The two basic approaches may be described as (i) the indirect method, where first a discrete-time model is estimated from the sampled data and then ...
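The indirect route in (i) can be illustrated with a toy example: discretize a known continuous-time state matrix under zero-order hold, then recover it from the discrete-time model via the matrix logarithm. This is a minimal sketch of the conversion step, not any of the survey's specific estimators:

```python
import numpy as np
from scipy.linalg import expm, logm

dt = 0.1
Ac = np.array([[0.0, 1.0],
               [-2.0, -3.0]])   # true continuous-time system matrix

Ad = expm(Ac * dt)              # exact zero-order-hold discretization
Ac_hat = logm(Ad).real / dt     # indirect step: discrete-time model -> continuous-time
print(np.allclose(Ac_hat, Ac))  # True
```

In practice Ad would come from a discrete-time identification routine applied to the sampled data; the logarithm step is then only as good as that estimate.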

  18. Bangkok's mass rapid transit system's commuter decision-making process in using integrated smartcards

    Directory of Open Access Journals (Sweden)

    Peerakan Kaewwongwattana

    2016-05-01

Full Text Available This paper studied the decision-making process behind the use of an integrated smartcard ticketing system by Bangkok metropolitan transit commuters. A second-order Confirmatory Factor Analysis using LISREL 9.10 was undertaken on Bangkok commuters' decision-making process on the use of an integrated smartcard system. The sample consisted of 300 Bangkok commuters obtained by accidental sampling, using questionnaires with a 5-point Likert scale. The research questionnaires used scale estimation that achieved a confidence value of 0.84. The research instruments used rating scales measuring information search, alternative choices, and use decision across the 15 variables in the decision-making process, which had factor loadings between 0.49 and 0.89; sorted in descending order, use decision, alternative choices and information search had factor loadings of 0.89, 0.65 and 0.49, respectively, and the overall level was high. There was a good fit of the decision-making model to the empirical data (chi-square = 34.55, probability (p) = 0.94, df = 49, RMSEA = 0.00, GFI = 0.98, AGFI = 0.96, SRMR = 0.04).

  19. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    Science.gov (United States)

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-03

An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of (99)Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
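The reported duty cycle can be sanity-checked from the abstract's own figures: 215 waste analyses plus 54 hourly spike-addition runs over 54 h of continuous operation:

```python
samples = 215 + 54          # waste analyses plus hourly spike-addition runs
minutes = 54 * 60           # 54 h of continuous operation
per_sample = minutes / samples
print(f"{per_sample:.1f} min per analysis")  # ~12.0 min, consistent with the 12.5 min cycle
```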

  20. Stationary stochastic processes theory and applications

    CERN Document Server

    Lindgren, Georg

    2012-01-01

Some Probability and Process Background: Sample space, sample function, and observables; Random variables and stochastic processes; Stationary processes and fields; Gaussian processes; Four historical landmarks. Sample Function Properties: Quadratic mean properties; Sample function continuity; Derivatives, tangents, and other characteristics; Stochastic integration; An ergodic result; Exercises. Spectral Representations: Complex-valued stochastic processes; Bochner's theorem and the spectral distribution; Spectral representation of a stationary process; Gaussian processes; Stationary counting processes; Exercises. Linear Filters - General Properties: Linear time invariant filters; Linear filters and differential equations; White noise in linear systems; Long range dependence, non-integrable spectra, and unstable systems; The ARMA-family. Linear Filters - Special Topics: The Hilbert transform and the envelope; The sampling theorem; Karhunen-Loève expansion. Classical Ergodic Theory and Mixing: The basic ergodic theorem in L2; Stationarity and transformations; The ergodic th...

  1. Power processing systems for ion thrusters.

    Science.gov (United States)

    Herron, B. G.; Garth, D. R.; Finke, R. C.; Shumaker, H. A.

    1972-01-01

    The proposed use of ion thrusters to fulfill various communication satellite propulsion functions such as east-west and north-south stationkeeping, attitude control, station relocation and orbit raising, naturally leads to the requirement for lightweight, efficient and reliable thruster power processing systems. Collectively, the propulsion requirements dictate a wide range of thruster power levels and operational lifetimes, which must be matched by the power processing. This paper will discuss the status of such power processing systems, present system design alternatives and project expected near future power system performance.

  2. Large sample hydrology in NZ: Spatial organisation in process diagnostics

    Science.gov (United States)

    McMillan, H. K.; Woods, R. A.; Clark, M. P.

    2013-12-01

    A key question in hydrology is how to predict the dominant runoff generation processes in any given catchment. This knowledge is vital for a range of applications in forecasting hydrological response and related processes such as nutrient and sediment transport. A step towards this goal is to map dominant processes in locations where data is available. In this presentation, we use data from 900 flow gauging stations and 680 rain gauges in New Zealand, to assess hydrological processes. These catchments range in character from rolling pasture, to alluvial plains, to temperate rainforest, to volcanic areas. By taking advantage of so many flow regimes, we harness the benefits of large-sample and comparative hydrology to study patterns and spatial organisation in runoff processes, and their relationship to physical catchment characteristics. The approach we use to assess hydrological processes is based on the concept of diagnostic signatures. Diagnostic signatures in hydrology are targeted analyses of measured data which allow us to investigate specific aspects of catchment response. We apply signatures which target the water balance, the flood response and the recession behaviour. We explore the organisation, similarity and diversity in hydrological processes across the New Zealand landscape, and how these patterns change with scale. We discuss our findings in the context of the strong hydro-climatic gradients in New Zealand, and consider the implications for hydrological model building on a national scale.
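One of the recession signatures mentioned above can be sketched as a straight-line fit to the log of streamflow on a recession limb; the exponential store and its 20-day timescale below are illustrative choices, not the authors' exact formulation:

```python
import numpy as np

k_true = 20.0                    # recession timescale, days (assumed)
t = np.arange(60.0)              # days since the flood peak
q = 5.0 * np.exp(-t / k_true)    # synthetic recession limb, mm/day

# Signature: the slope of log(Q) versus t estimates -1/k.
slope, intercept = np.polyfit(t, np.log(q), 1)
k_est = -1.0 / slope
print(f"estimated recession timescale: {k_est:.1f} days")  # 20.0
```

On real gauge data one would first isolate rain-free recession periods before fitting, and compare k across catchments as a process indicator.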

  3. Infrared surface analysis using a newly developed thin-sample preparation system.

    Science.gov (United States)

    Nagai, Naoto; Nishiyama, Itsuo; Kishima, Yoshio; Iida, Katsuhiko; Mori, Koichi

    2009-01-01

    We developed a new sampling system, the Nano Catcher, for measuring the surface chemical structure of polymers or industrial products and we evaluated the performance of the system. The system can directly pick up surface species whose depth is on the order of approximately 100 nm and can easily provide a sample for a Fourier transform infrared (FT-IR) system without the necessity of passing it over to a measurement plate. The FT-IR reflection data obtained from the Nano Catcher were compared with those obtained using the attenuated total reflection (ATR) method and sampling by hand. Chemical structural analysis of a depth region from a few tens of nanometers to a few hundred nanometers can be directly performed using this system. Such depths are beyond the scope of conventional X-ray photoelectron spectroscopy (XPS) and ATR methods. We can expect the use of the Nano Catcher system to lead to a great improvement in the detection of signals of surface species in these depth regions.

  4. The EnzymeTracker: an open-source laboratory information management system for sample tracking

    Directory of Open Access Journals (Sweden)

    Triplet Thomas

    2012-01-01

Full Text Available Abstract Background In many laboratories, researchers store experimental data on their own workstation using spreadsheets. However, this approach poses a number of problems, ranging from sharing issues to inefficient data-mining. Standard spreadsheets are also error-prone, as data do not undergo any validation process. To overcome spreadsheets' inherent limitations, a number of proprietary systems have been developed, for which laboratories need to pay expensive license fees. Those costs are usually prohibitive for most laboratories and prevent scientists from benefiting from more sophisticated data management systems. Results In this paper, we propose the EnzymeTracker, a web-based laboratory information management system for sample tracking, as an open-source and flexible alternative that aims at facilitating entry, mining and sharing of experimental biological data. The EnzymeTracker features online spreadsheets and tools for monitoring numerous experiments conducted by several collaborators to identify and characterize samples. It also provides libraries of shared data such as protocols, and administration tools for data access control using OpenID and user/team management. Our system relies on a database management system for efficient data indexing and management and a user-friendly AJAX interface that can be accessed over the Internet. The EnzymeTracker facilitates data entry by dynamically suggesting entries and providing smart data-mining tools to effectively retrieve data. Our system features a number of tools to visualize and annotate experimental data, and export highly customizable reports. It also supports QR matrix barcoding to facilitate sample tracking. Conclusions The EnzymeTracker was designed to be easy to use and offers many benefits over spreadsheets, thus presenting the characteristics required to facilitate acceptance by the scientific community. It has been successfully used for 20 months on a daily basis by over 50

  5. Analysis format and evaluation methods for effluent particle sampling systems in nuclear facilities

    International Nuclear Information System (INIS)

    Schwendiman, L.C.; Glissmeyer, J.A.

    1976-06-01

    Airborne effluent sampling systems for nuclear facilities are frequently designed, installed, and operated without a systematic approach which discloses and takes into account all the circumstances and conditions which would affect the validity and adequacy of the sample. Without a comprehensive check list or something similar, the designer of the system may not be given the important information needed to provide a good design. In like manner, an already operating system may be better appraised. Furthermore, the discipline of a more formal approach may compel the one who will use the system to make sure he knows what he wants and can thus give the designer the needed information. An important consideration is the criteria to be applied to the samples to be taken. This analysis format consists of a listing of questions and statements calling forth the necessary information required to analyze a sampling system. With this information developed, one can proceed with an evaluation, the methodology of which is also discussed in the paper. Errors in probe placement, failure to sample at the proper rate, delivery line losses, and others are evaluated using mathematical models and empirically derived relationships. Experimental methods are also described for demonstrating that quality sampling will be achieved. The experiments include using a temporary, simple, but optimal sample collection system to evaluate the more complex systems. The use of tracer particles injected in the stream is also discussed. The samples obtained with the existing system are compared with those obtained by the temporary, optimal system

  6. Vapor and gas sampling of single-shell tank 241-U-104 using the in situ vapor sampling system

    International Nuclear Information System (INIS)

    Lockrem, L.L.

    1997-01-01

The Vapor Issue Resolution Program tasked the Vapor Team (VT) to collect representative headspace samples from Hanford Site single-shell tank (SST) 241-U-104. This document presents In Situ Vapor Sampling System (ISVS) data resulting from the July 16, 1996 sampling of SST 241-U-104. Analytical results will be presented in separate reports issued by the Pacific Northwest National Laboratory (PNNL) which supplied and analyzed the sample media

  7. Vapor and gas sampling of single-shell tank 241-S-103 using the in situ vapor sampling system

    International Nuclear Information System (INIS)

    Lockrem, L.L.

    1997-01-01

    The Vapor Issue Resolution Program tasked the Vapor Team (VT) to collect representative headspace samples from Hanford Site single-shell tank (SST) 241-S-103. This document presents In Situ Vapor Sampling System (ISVS) data resulting from the June 12, 1996 sampling of SST 241-S-103. Analytical results will be presented in separate reports issued by the Pacific Northwest National Laboratory (PNNL) which supplied and analyzed the sample media

  8. Vapor and gas sampling of single-shell tank 241-S-106 using the in situ vapor sampling system

    International Nuclear Information System (INIS)

    Lockrem, L.L.

    1997-01-01

The Vapor Issue Resolution Program tasked the Vapor Team (VT) to collect representative headspace samples from Hanford Site single-shell tank (SST) 241-S-106. This document presents In Situ Vapor Sampling System (ISVS) data resulting from the June 13, 1996 sampling of SST 241-S-106. Analytical results will be presented in separate reports issued by the Pacific Northwest National Laboratory (PNNL) which supplied and analyzed the sample media

  9. Social network supported process recommender system.

    Science.gov (United States)

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling to assist the process modeling. However, most of the existing technologies only use the process structure analysis and do not take the social features of processes into account, while the process modeling is complex and comprehensive in most situations. This paper studies the feasibility of social network research technologies on process recommendation and builds a social network system of processes based on the features similarities. Then, three process matching degree measurements are presented and the system implementation is discussed subsequently. Finally, experimental evaluations and future works are introduced.

  10. Social Network Supported Process Recommender System

    Directory of Open Access Journals (Sweden)

    Yanming Ye

    2014-01-01

    Full Text Available Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling to assist the process modeling. However, most of the existing technologies only use the process structure analysis and do not take the social features of processes into account, while the process modeling is complex and comprehensive in most situations. This paper studies the feasibility of social network research technologies on process recommendation and builds a social network system of processes based on the features similarities. Then, three process matching degree measurements are presented and the system implementation is discussed subsequently. Finally, experimental evaluations and future works are introduced.

  11. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which indicates signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose... algorithm generates random sampling patterns dedicated for event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed.
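A minimal example of a constrained pattern generator: draw random sampling instants on a time grid while enforcing a minimum spacing, as an event-driven ADC's recovery time would require. The grid period and spacing constraint are illustrative, and rejection sampling stands in for the paper's (unnamed) generator:

```python
import numpy as np

def constrained_pattern(n_samp, n_grid, min_gap, rng):
    """Draw n_samp sampling points on an n_grid-point time grid,
    rejecting patterns whose spacing is below min_gap grid periods."""
    while True:
        pts = np.sort(rng.choice(n_grid, n_samp, replace=False))
        if np.all(np.diff(pts) >= min_gap):
            return pts

rng = np.random.default_rng(1)
grid_period = 10e-6  # s, assumed ADC timing grid
pattern = constrained_pattern(n_samp=8, n_grid=100, min_gap=5, rng=rng) * grid_period
print(pattern)
```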

  12. A synchrotron radiation microtomography system for the analysis of trabecular bone samples.

    Science.gov (United States)

    Salomé, M; Peyrin, F; Cloetens, P; Odet, C; Laval-Jeantet, A M; Baruchel, J; Spanne, P

    1999-10-01

    X-ray computed microtomography is particularly well suited for studying trabecular bone architecture, which requires three-dimensional (3-D) images with high spatial resolution. For this purpose, we describe a three-dimensional computed microtomography (microCT) system using synchrotron radiation, developed at ESRF. Since synchrotron radiation provides a monochromatic and high photon flux x-ray beam, it allows high resolution and a high signal-to-noise ratio imaging. The principle of the system is based on truly three-dimensional parallel tomographic acquisition. It uses a two-dimensional (2-D) CCD-based detector to record 2-D radiographs of the transmitted beam through the sample under different angles of view. The 3-D tomographic reconstruction, performed by an exact 3-D filtered backprojection algorithm, yields 3-D images with cubic voxels. The spatial resolution of the detector was experimentally measured. For the application to bone investigation, the voxel size was set to 6.65 microm, and the experimental spatial resolution was found to be 11 microm. The reconstructed linear attenuation coefficient was calibrated from hydroxyapatite phantoms. Image processing tools are being developed to extract structural parameters quantifying trabecular bone architecture from the 3-D microCT images. First results on human trabecular bone samples are presented.

  13. 40 CFR 90.421 - Dilute gaseous exhaust sampling and analytical system description.

    Science.gov (United States)

    2010-07-01

    ... gas mixture temperature, measured at a point immediately ahead of the critical flow venturi, must be... analytical system description. (a) General. The exhaust gas sampling system described in this section is... requirements are as follows: (1) This sampling system requires the use of a Positive Displacement Pump—Constant...

  14. Effect of sample preparation methods on photometric determination of the tellurium and cobalt content in the samples of copper concentrates

    Directory of Open Access Journals (Sweden)

    Viktoriya Butenko

    2016-03-01

Full Text Available The methods for determination of cobalt and nickel in copper concentrates currently used in factory laboratories are very labor-intensive and time-consuming. The limiting stage of the analysis is preliminary chemical sample preparation. Carrying out the decomposition of industrial samples with concentrated mineral acids in open systems does not allow the metrological characteristics of the methods to be improved; for this reason, improving the methods of sample preparation is quite relevant and of practical interest. The work was dedicated to determining the optimal conditions of preliminary chemical preparation of copper concentrate samples for the subsequent determination of cobalt and tellurium in the obtained solution using the tellurium-spectrophotometric method. Decomposition of the samples was carried out by acid dissolution in individual mineral acids and their mixtures with heating in an open system, as well as by using ultrasonication and microwave radiation in a closed system. In order to select the optimal conditions for the decomposition of the samples in a closed system, the phase contact time and the ultrasonic generator's power were varied. Intensification of the decomposition of copper concentrates with nitric acid (1:1), ultrasound and microwave radiation allowed cobalt and tellurium to be transferred quantitatively into solution in 20 and 30 min, respectively. This reduced the amount of reactants used and improved the accuracy of determination by running the process under strictly identical conditions.

  15. Sampling algorithms for validation of supervised learning models for Ising-like systems

    Science.gov (United States)

    Portman, Nataliya; Tamblyn, Isaac

    2017-12-01

In this paper, we build and explore supervised learning models of ferromagnetic system behavior, using Monte-Carlo sampling of the spin configuration space generated by the 2D Ising model. Given the enormous size of the space of all possible Ising model realizations, the question arises as to how to choose a reasonable number of samples that will form physically meaningful and non-intersecting training and testing datasets. Here, we propose a sampling technique called "ID-MH" that uses the Metropolis-Hastings algorithm, creating a Markov process across energy levels within the predefined configuration subspace. We show that application of this method retains phase transitions in both training and testing datasets and serves the purpose of validation of a machine learning algorithm. For larger lattice dimensions, ID-MH is not feasible as it requires knowledge of the complete configuration space. As such, we develop a new "block-ID" sampling strategy: it decomposes the given structure into square blocks with lattice dimension N ≤ 5 and uses ID-MH sampling of candidate blocks. Further comparison of the performance of commonly used machine learning methods such as random forests, decision trees, k nearest neighbors and artificial neural networks shows that the PCA-based Decision Tree regressor is the most accurate predictor of magnetizations of the Ising model. For energies, however, the accuracy of prediction is not satisfactory, highlighting the need to consider more algorithmically complex methods (e.g., deep learning).
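The Metropolis-Hastings machinery underlying the ID-MH scheme can be sketched for a small 2D Ising lattice. This is a plain single-spin-flip sampler at a fixed temperature, not the energy-level-targeted variant the paper proposes:

```python
import numpy as np

def metropolis_ising(L=5, beta=0.4, steps=20000, seed=0):
    """Single-spin-flip Metropolis sampling of an L x L Ising lattice
    with periodic boundary conditions (coupling J = 1)."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(L, L))
    for _ in range(steps):
        i, j = rng.integers(0, L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nb   # energy change if spin (i, j) is flipped
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1
    return spins

config = metropolis_ising()
print(config.mean())  # magnetization per spin, in [-1, 1]
```

Sampling at several beta values and labeling configurations with their magnetization or energy yields the kind of training/testing datasets the paper's models consume.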

  16. Reachable Distance Space: Efficient Sampling-Based Planning for Spatially Constrained Systems

    KAUST Repository

    Xinyu Tang,

    2010-01-25

Motion planning for spatially constrained robots is difficult due to additional constraints placed on the robot, such as closure constraints for closed chains or requirements on end-effector placement for articulated linkages. It is usually computationally too expensive to apply sampling-based planners to these problems since it is difficult to generate valid configurations. We overcome this challenge by redefining the robot's degrees of freedom and constraints into a new set of parameters, called reachable distance space (RD-space), in which all configurations lie in the set of constraint-satisfying subspaces. This enables us to directly sample the constrained subspaces with complexity linear in the number of the robot's degrees of freedom. In addition to supporting efficient sampling of configurations, we show that the RD-space formulation naturally supports planning and, in particular, we design a local planner suitable for use by sampling-based planners. We demonstrate the effectiveness and efficiency of our approach for several systems including closed chain planning with multiple loops, restricted end-effector sampling, and on-line planning for drawing/sculpting. We can sample single-loop closed chain systems with 1,000 links in time comparable to open chain sampling, and we can generate samples for 1,000-link multi-loop systems of varying topologies in less than a second. © 2010 The Author(s).
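The reachable-distance idea can be illustrated on the simplest closed chain, a planar four-bar: rather than sampling joint angles and rejecting violations, sample the diagonal length directly inside the interval where both two-link sub-chains can reach it, so every draw satisfies closure. The link lengths below are arbitrary, and this toy version omits the recursive decomposition used for long chains:

```python
import numpy as np

def sample_four_bar(l1, l2, l3, l4, rng):
    """Sample a closure-satisfying diagonal for sub-chains (l1,l2) and (l3,l4)."""
    # Reachable-distance interval of each two-link sub-chain, then intersect.
    lo = max(abs(l1 - l2), abs(l3 - l4))
    hi = min(l1 + l2, l3 + l4)
    assert lo <= hi, "chain cannot close"
    d = rng.uniform(lo, hi)  # every d in [lo, hi] yields a valid configuration
    # Interior angle at the base of each sub-chain (law of cosines).
    a1 = np.arccos((l1**2 + d**2 - l2**2) / (2 * l1 * d))
    a2 = np.arccos((l3**2 + d**2 - l4**2) / (2 * l3 * d))
    return d, a1, a2

rng = np.random.default_rng(2)
d, a1, a2 = sample_four_bar(2.0, 1.5, 1.8, 1.2, rng)
print(d, a1, a2)
```

Note the arccos arguments stay in [-1, 1] exactly on the reachable interval, which is the point: the constraint-satisfying subspace is parameterized directly.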

  17. Information Processing Capacity of Dynamical Systems

    Science.gov (United States)

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-07-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory.
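The capacity measure can be sketched in its simplest form: the linear memory capacity of a small echo-state reservoir, summing the squared correlations between delayed inputs and their best linear reconstructions from the state. Reservoir size, spectral radius, and input scaling below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)
N, T = 20, 5000
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9: fading memory
w_in = rng.normal(size=N)

u = rng.uniform(-1, 1, T)                        # i.i.d. input signal
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = np.tanh(W @ x[t - 1] + w_in * u[t])   # reservoir state update

warm = 100                                       # discard transient
capacity = 0.0
for k in range(1, 2 * N):                        # capacity of reconstructing u[t-k]
    X, y = x[warm:, :], np.roll(u, k)[warm:]
    w, *_ = np.linalg.lstsq(X, y - y.mean(), rcond=None)
    capacity += np.corrcoef(X @ w, y)[0, 1] ** 2
print(capacity)  # bounded above by N = 20 state variables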

  18. Information Processing Capacity of Dynamical Systems

    Science.gov (United States)

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-01-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory. PMID:22816038

  19. Tritium inventory differences: I. Sampling and U-getter pump holdup

    International Nuclear Information System (INIS)

    Ellefson, R.E.; Gill, J.T.

    1986-01-01

    Inventory differences (ID) in tritium material balance accounts (MBA) can occur with unmeasured transfers from the process or unmeasured holdup in the system. Small but cumulatively significant quantities of tritium can leave the MBA through routine capillary sampling of the process gas. A predictor model for estimating the quantity of tritium leaving the MBA by sampling has been developed and implemented. The model calculates the gas transferred per sample; using the tritium concentration in the process and the number of samples, the quantity of tritium transferred is predicted. The model is verified by PVT measurement of process transfers from multiple samplings. Comparison of predicted sample transfers with IDs from several MBAs reveals that sampling typically represents 50% of unmeasured transfers for regularly sampled processes.
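The predictor model summarized above multiplies the gas removed per sample by the tritium concentration and the number of samples. A hedged back-of-the-envelope sketch, using the ideal gas law for the per-sample aliquot; every number below is an illustrative assumption, not a value from the paper:

```python
# Moles of gas removed by one capillary sample, via PV = nRT.
R = 8.314          # J/(mol K)
P = 101_325.0      # Pa, pressure of the sampled aliquot (assumed)
V = 10e-6          # m^3, capillary sample volume (10 mL, assumed)
T = 298.0          # K
mol_per_sample = P * V / (R * T)

tritium_fraction = 0.60      # mole fraction T2 in the process gas (assumed)
n_samples = 250              # samples over the accounting period (assumed)
grams_per_mol_T2 = 6.032     # molar mass of T2, g/mol

# Predicted tritium leaving the MBA by sampling alone.
g_T = mol_per_sample * tritium_fraction * n_samples * grams_per_mol_T2
print(f"predicted tritium transfer: {g_T:.3f} g")
```

With these made-up inputs the model predicts a few tenths of a gram per accounting period, which is the kind of small-but-cumulative transfer the abstract describes.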

  20. Sampling phased array - a new technique for ultrasonic signal processing and imaging

    OpenAIRE

    Verkooijen, J.; Boulavinov, A.

    2008-01-01

    Over the past 10 years, the improvement in the field of microelectronics and computer engineering has led to significant advances in ultrasonic signal processing and image construction techniques that are currently being applied to non-destructive material evaluation. A new phased array technique, called 'Sampling Phased Array', has been developed at the Fraunhofer Institute for Non-Destructive Testing [1]. It realises a unique approach of measurement and processing of ultrasonic signals. Th...

  1. Information processing systems, reasoning modules, and reasoning system design methods

    Science.gov (United States)

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  2. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  3. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
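The routing these three patent records describe, in which different reasoning modules process abstractions whose individuals belong to different ontology classification types, can be sketched as a simple dispatch over a working-memory graph. This is an illustrative toy, not the patented implementation; all class names, type names, and individuals are invented.

```python
from dataclasses import dataclass

@dataclass
class Abstraction:
    """A working-memory node: an individual plus its ontology type."""
    individual: str
    classification: str

class ReasoningModule:
    """Processes only abstractions of one classification type."""
    def __init__(self, handles: str):
        self.handles = handles

    def process(self, a: Abstraction) -> str:
        return f"{self.handles}-module processed {a.individual}"

# Working memory: abstractions drawn from a semantic graph (invented data).
working_memory = [Abstraction("alice", "Person"),
                  Abstraction("server42", "Host"),
                  Abstraction("bob", "Person")]

# One reasoning module per classification type, as in the patent claims.
modules = {"Person": ReasoningModule("Person"),
           "Host": ReasoningModule("Host")}

# Dispatch each abstraction to the module for its classification type.
results = [modules[a.classification].process(a) for a in working_memory]
print(results)
```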

  4. On Distributed Port-Hamiltonian Process Systems

    NARCIS (Netherlands)

    Lopezlena, Ricardo; Scherpen, Jacquelien M.A.

    2004-01-01

    In this paper we use the term distributed port-Hamiltonian Process Systems (DPHPS) to refer to the result of merging the theory of distributed Port-Hamiltonian systems (DPHS) with the theory of process systems (PS). Such a concept is useful for combining the systematic interconnection of PHS with the

  5. Power distribution system reliability evaluation using dagger-sampling Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Y.; Zhao, S.; Ma, Y. [North China Electric Power Univ., Hebei (China). Dept. of Electrical Engineering

    2009-03-11

    A dagger-sampling Monte Carlo simulation method was used to evaluate power distribution system reliability. The dagger-sampling technique was used to record the failure of a component as an incident and to determine its occurrence probability by generating incident samples using random numbers. The dagger sampling technique was combined with the direct sequential Monte Carlo method to calculate average values of load point indices and system indices. Results of the 2 methods with simulation times of up to 100,000 years were then compared. The comparative evaluation showed that less computing time was required using the dagger-sampling technique due to its higher convergence speed. When simulation times were 1000 years, the dagger-sampling method required 0.05 seconds to accomplish an evaluation, while the direct method required 0.27 seconds. 12 refs., 3 tabs., 4 figs.
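The dagger-sampling idea, generating several correlated Bernoulli trials from a single random number by splitting [0, 1) into subintervals of width p, can be sketched as follows. The failure probability and trial counts are invented, and the paper's load-point and system indices are not reproduced; only the sampling trick itself is shown.

```python
import numpy as np

rng = np.random.default_rng(1)
p = 0.05                      # component failure probability per trial (assumed)
S = int(1 / p)                # trials generated per random number
n_draws = 2000

# Dagger sampling: trial j of a draw fails iff U falls in [j*p, (j+1)*p).
u = rng.random(n_draws)
dagger_trials = np.array([(u >= j * p) & (u < (j + 1) * p)
                          for j in range(S)]).T.ravel()

# Direct sampling with the same total number of trials, for comparison.
direct_trials = rng.random(n_draws * S) < p

print(f"dagger estimate: {dagger_trials.mean():.4f}")
print(f"direct estimate: {direct_trials.mean():.4f}  (true p = {p})")
```

With S·p = 1, as here, every draw yields exactly one failure among its S trials, so the dagger estimate of p has zero variance; that variance reduction is the source of the faster convergence reported in the abstract.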

  6. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    International Nuclear Information System (INIS)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc; Hart, A. John

    2013-01-01

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to its relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance such as heating and cooling rates restrict the parameter space that can be explored. Perhaps more importantly, maximization of research throughput, and the successful and efficient translation of materials processing knowledge to production-scale systems, rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  7. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    Energy Technology Data Exchange (ETDEWEB)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States); Hart, A. John, E-mail: ajhart@mit.edu [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States); Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States)

    2013-11-15

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to its relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance such as heating and cooling rates restrict the parameter space that can be explored. Perhaps more importantly, maximization of research throughput, and the successful and efficient translation of materials processing knowledge to production-scale systems, rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  8. System design description for sampling fuel in K basins

    International Nuclear Information System (INIS)

    Baker, R.B.

    1996-01-01

    This System Design Description provides: (1) statements of the Spent Nuclear Fuel Project's (SNFP) needs requiring sampling of fuel in the K East and K West Basins, (2) the sampling equipment functions and requirements, (3) a general work plan and the design logic being followed to develop the equipment, and (4) a summary description of the design for the sampling equipment. The report summarizes the integrated application of both the subject equipment and the canister sludge sampler in near-term characterization campaigns at K Basins.

  9. Target Tracking of a Linear Time Invariant System under Irregular Sampling

    Directory of Open Access Journals (Sweden)

    Jin Xue-Bo

    2012-11-01

    Full Text Available Due to event-triggered sampling in a system, or with the aim of reducing data storage, many tracking applications will encounter irregular sampling times. By calculating the matrix exponential using an inverse Laplace transform, this paper transforms the irregular-sampling tracking problem into the problem of tracking with time-varying system parameters. Using the common Kalman filter, the developed method is used to track a target for a simulated trajectory and for video tracking. The results of simulation experiments have shown that it can obtain good estimation performance even with highly irregular measurement sampling times.
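The transformation described above, treating irregular sampling as a time-varying discrete system with transition matrix exp(A·dt), can be sketched with a 1-D constant-velocity target, for which exp(A·dt) has a closed form. The model, noise levels, and crude noise discretization below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(2)

def F(dt):
    """exp(A*dt) for the constant-velocity model A = [[0,1],[0,0]]."""
    return np.array([[1.0, dt], [0.0, 1.0]])

H = np.array([[1.0, 0.0]])       # position-only measurements
Q_rate, R = 0.01, 0.25           # process noise intensity, measurement variance

# Irregularly spaced sample times; true target moves at velocity 2.
times = np.cumsum(rng.uniform(0.1, 1.0, size=60))
z = 2.0 * times + rng.normal(0.0, np.sqrt(R), size=len(times))

x, P = np.array([0.0, 0.0]), np.eye(2) * 10.0
t_prev, errs = 0.0, []
for t, zk in zip(times, z):
    dt = t - t_prev
    Fk = F(dt)                                       # time-varying transition
    x = Fk @ x
    P = Fk @ P @ Fk.T + Q_rate * dt * np.eye(2)      # crudely discretized noise
    S = H @ P @ H.T + R                              # innovation covariance
    K = P @ H.T / S                                  # Kalman gain (S is 1x1)
    x = x + (K * (zk - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    t_prev = t
    errs.append(abs(x[0] - 2.0 * t))

print(f"final position error: {errs[-1]:.3f}, velocity estimate: {x[1]:.3f}")
```

Because F is recomputed from the actual elapsed dt at every step, the same filter code handles any sampling pattern; a fixed-rate filter is just the special case dt = const.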

  10. Vapor and gas sampling of single-shell tank 241-B-102 using the in situ vapor sampling system

    International Nuclear Information System (INIS)

    Lockrem, L.L.

    1997-01-01

    The Vapor Issue Resolution Program tasked the Vapor Team (the team) to collect representative headspace samples from Hanford Site single-shell tank (SST) 241-B-102. This document presents sampling data resulting from the April 18, 1996 sampling of SST 241-B-102. Analytical results will be presented in a separate report issued by Pacific Northwest National Laboratory (PNNL), which supplied and analyzed the sampling media. The team, consisting of Sampling and Mobile Laboratories (SML) and Special Analytical Studies (SAS) personnel, used the vapor sampling system (VSS) to collect representative samples of the air, gases, and vapors from the headspace of SST 241-B-102 with sorbent traps and SUMMA canisters

  11. [Identification of Systemic Contaminations with Legionella Spec. in Drinking Water Plumbing Systems: Sampling Strategies and Corresponding Parameters].

    Science.gov (United States)

    Völker, S; Schreiber, C; Müller, H; Zacharias, N; Kistemann, T

    2017-05-01

    After the amendment of the Drinking Water Ordinance in 2011, the requirements for the hygienic-microbiological monitoring of drinking water installations have increased significantly. In the BMBF-funded project "Biofilm Management" (2010-2014), we examined the extent to which established sampling strategies in practice can uncover drinking water plumbing systems systemically colonized with Legionella. Moreover, we investigated additional parameters that might be suitable for detecting systemic contaminations. We subjected the drinking water plumbing systems of 8 buildings with known microbial contamination (Legionella) to an intensive hygienic-microbiological sampling with high spatial and temporal resolution. A total of 626 hot drinking water samples were analyzed with classical culture-based methods. In addition, comprehensive hygienic observations were conducted in each building, and qualitative interviews with operators and users were carried out. Collected tap-specific parameters were quantitatively analyzed by means of sensitivity and accuracy calculations. The systemic presence of Legionella in drinking water plumbing systems has a high spatial and temporal variability. Established sampling strategies were only partially suitable to detect long-term Legionella contaminations in practice. In particular, the sampling of hot water at the calorifier and circulation re-entrance showed little significance in terms of contamination events. To detect the systemic presence of Legionella, the parameters stagnation (qualitatively assessed) and temperature (compliance with the 5K-rule) showed better results. © Georg Thieme Verlag KG Stuttgart · New York.
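The tap-specific parameters in this study were scored by sensitivity and accuracy calculations. A minimal sketch of those two measures from confusion counts; the counts below are invented, not the study's data:

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of truly contaminated taps the predictor flags."""
    return tp / (tp + fn)

def accuracy(tp: int, tn: int, fp: int, fn: int) -> float:
    """Fraction of all taps the predictor classifies correctly."""
    return (tp + tn) / (tp + tn + fp + fn)

# e.g. "stagnation observed" as a predictor of contamination at a tap
# (hypothetical confusion counts)
tp, fp, tn, fn = 42, 10, 60, 8
print(f"sensitivity = {sensitivity(tp, fn):.2f}, "
      f"accuracy = {accuracy(tp, tn, fp, fn):.2f}")
```

A parameter like stagnation "shows better results" in the abstract's sense when both of these scores beat those of the established sampling points.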

  12. Some experiences in controlling contamination of environmental materials during sampling and processing for low-level actinide analysis

    International Nuclear Information System (INIS)

    Harvey, B.R.; Lovett, M.B.; Boggis, S.J.

    1987-01-01

    Selected experiences in the control of contamination and the threat it poses to the quality of analytical data are discussed in the context of the whole analytical process, from collection of marine environmental samples, through handling and radiochemical separation, to the final interpretation of results. Examples include a demonstration of the contamination introduced during sediment core sectioning, contamination of sea water by a ship's pumping system, and the effect of filtration on the apparent partitioning of radionuclides between the solid and liquid phases of sea water. (author) 11 refs.; 4 tabs

  13. Some experiences in controlling contamination of environmental materials during sampling and processing for low-level actinide analysis

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, B R; Lovett, M B; Boggis, S J

    1987-10-01

    Selected experiences in the control of contamination and the threat it poses to the quality of analytical data are discussed in the context of the whole analytical process, from collection of marine environmental samples, through handling and radiochemical separation, to the final interpretation of results. Examples include a demonstration of the contamination introduced during sediment core sectioning, contamination of sea water by a ship's pumping system, and the effect of filtration on the apparent partitioning of radionuclides between the solid and liquid phases of sea water. (author) 11 refs.; 4 tabs.

  14. Robotic Materials Handling in Space: Mechanical Design of the Robot Operated Materials Processing System HitchHiker Experiment

    Science.gov (United States)

    Voellmer, George

    1997-01-01

    The Goddard Space Flight Center has developed the Robot Operated Materials Processing System (ROMPS) that flew aboard STS-64 in September, 1994. The ROMPS robot transported pallets containing wafers of different materials from their storage racks to a furnace for thermal processing. A system of tapered guides and compliant springs was designed to deal with the potential misalignments. The robot and all the sample pallets were locked down for launch and landing. The design of the passive lockdown system, and the interplay between it and the alignment system are presented.

  15. Proposing a Process-Oriented Systems Research for Systems Thinking Development

    Directory of Open Access Journals (Sweden)

    Jae Eon Yu

    2017-04-01

    Full Text Available This paper discusses systems thinking development from Churchman’s systems ideas related to critical systems practice that appreciates the use of systems methods from sociolinguistic perspectives and poststructuralist thought. Systems research enabled us to understand and reinterpret Churchman’s philosophy and systems approach through the works of Deleuze and Foucault. Based upon the interpretation of Churchman’s philosophy and systems approach, I propose ‘process-oriented systems research’ developed from the use of social appreciative process and Churchman’s metasystem approach. By applying a metasystem approach into practice, I basically appreciate Deleuzian ethics and Foucault’s theory of discourse in order to deal with issues of power and knowledge, and metaethics or moral epistemology, where the meaning of good or bad is discussed. A detailed account of an application of process-oriented systems research is given to demonstrate how I use systems methods to examine the usefulness of the systems research in practice.

  16. Set Up of an Automatic Water Quality Sampling System in Irrigation Agriculture

    Directory of Open Access Journals (Sweden)

    Emanuel Heinz

    2013-12-01

    Full Text Available We have developed a high-resolution automatic sampling system for continuous in situ measurements of stable water isotopic composition and nitrogen solutes along with hydrological information. The system facilitates concurrent monitoring of a large number of water and nutrient fluxes (ground, surface, irrigation and rain water) in irrigated agriculture. For this purpose we couple an automatic sampling system with a Wavelength-Scanned Cavity Ring Down Spectrometry System (WS-CRDS) for stable water isotope analysis (δ2H and δ18O), a reagentless hyperspectral UV photometer (ProPS) for monitoring nitrate content, and various water level sensors for hydrometric information. The automatic sampling system consists of different sampling stations equipped with pumps, a switch cabinet for valve and pump control and a computer operating the system. The complete system is operated via internet-based control software, allowing supervision from nearly anywhere. The system is currently set up at the International Rice Research Institute (Los Baños, The Philippines) in a diversified rice growing system to continuously monitor water and nutrient fluxes. Here we present the system’s technical set-up and provide initial proof-of-concept with results for the isotopic composition of different water sources and nitrate values from the 2012 dry season.

  17. Rapid Surface Sampling and Archival Record (RSSAR) System. Topical report, October 1, 1993--December 31, 1994

    International Nuclear Information System (INIS)

    1997-01-01

    This report describes the results of Phase 1 efforts to develop a Rapid Surface Sampling and Archival Record (RSSAR) System for the detection of semivolatile organic contaminants on concrete, transite, and metal surfaces. The characterization of equipment and building surfaces for the presence of contaminants as part of building decontamination and decommissioning activities is an immensely large task of concern to both government and industry. Contaminated and clean materials must be clearly identified and segregated so that the clean materials can be recycled or reused, if possible, or disposed of more cheaply as nonhazardous waste. Characterization of building and equipment surfaces will be needed during initial investigations, during cleanup operations, and during the final confirmatory process, increasing the total number of samples well beyond that needed for initial characterization. This multiplicity of information places a premium on the ability to handle and track data as efficiently as possible. Aware of the shortcomings of traditional surface characterization technology, GE, with DOE support, has undertaken a 12-month effort to complete Phase 1 of a proposed four-phase program to develop the RSSAR system. The objectives of this work are to provide instrumentation to cost-effectively sample concrete and steel surfaces, provide a quick-look indication for the presence or absence of contaminants, and collect samples for later, more detailed analysis in a readily accessible and addressable form. The Rapid Surface Sampling and Archival Record (RSSAR) System will be a modular instrument made up of several components: (1) sampling heads for concrete surfaces, steel surfaces, and bulk samples; (2) quick-look detectors for photoionization and ultraviolet; and (3) a multisample trapping module to trap and store vaporized contaminants in a manner suitable for subsequent detailed lab-based analyses.

  18. New and conventional evaporative systems in concentrating nitrogen samples prior to isotope-ratio analysis

    International Nuclear Information System (INIS)

    Lober, R.W.; Reeder, J.D.; Porter, L.K.

    1987-01-01

    Studies were conducted to quantify and compare the efficiencies of various evaporative systems used in evaporating ¹⁵N samples prior to mass spectrometric analysis. Two new forced-air systems were designed and compared with a conventional forced-air system and with an open-air dry bath technique for effectiveness in preventing atmospheric contamination of evaporating samples. The forced-air evaporative systems significantly reduced the time needed to evaporate samples as compared to the open-air dry bath technique; samples were evaporated to dryness in 2.5 h with the forced-air systems as compared to 8 to 10 h on the open-air dry bath. The effectiveness of a given forced-air system in preventing atmospheric contamination of evaporating samples was significantly affected by the flow rate of the air stream flowing over the samples. The average atmospheric contaminant N found in samples evaporated on the open-air dry bath was 0.3 μg N, indicating very low concentrations of atmospheric NH₃ during this study. However, in previous studies the authors have experienced significant contamination of ¹⁵N samples evaporated on an open-air dry bath because the level of contaminant N in the laboratory atmosphere varied and could not be adequately controlled. Average cross-contaminant levels of 0.28, 0.20, and 1.01 μg of N were measured between samples evaporated on the open-air dry bath, the newly designed forced-air system, and the conventional forced-air system, respectively. The cross-contamination level is significantly higher on the conventional forced-air system than on the other two systems, and could significantly alter the atom % ¹⁵N of highly enriched, low-[N] evaporating samples.
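The closing point, that a given amount of contaminant N biases highly enriched, low-N samples far more than large ones, follows from simple isotope mass balance. A hedged sketch; the sample sizes, enrichment, and contaminant amount are invented, and natural abundance is taken as 0.366 atom % ¹⁵N:

```python
def mixed_atom_pct(sample_ug_n: float, sample_pct: float,
                   contam_ug_n: float, contam_pct: float = 0.366) -> float:
    """Atom % 15N of a sample after mixing in contaminant N (mass balance)."""
    total = sample_ug_n + contam_ug_n
    return (sample_ug_n * sample_pct + contam_ug_n * contam_pct) / total

# 1 ug of natural-abundance contaminant N barely moves a large sample...
print(mixed_atom_pct(1000.0, 50.0, 1.0))   # ~49.95 atom %
# ...but substantially biases a low-N one.
print(mixed_atom_pct(20.0, 50.0, 1.0))     # ~47.64 atom %
```

The same microgram of contamination costs about 0.05 atom % in the first case and over 2 atom % in the second, which is why the evaporation system's contamination control matters most for highly enriched, low-[N] samples.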

  19. Moonrise: Sampling the South Pole-Aitken Basin to Address Problems of Solar System Significance

    Science.gov (United States)

    Zeigler, R. A.; Jolliff, B. L.; Korotev, R. L.; Shearer, C. K.

    2016-01-01

    A mission to land in the giant South Pole-Aitken (SPA) Basin on the Moon's southern farside and return a sample to Earth for analysis is a high priority for Solar System science. Such a sample would be used to determine the age of the SPA impact; the chronology of the basin, including the ages of basins and large impacts within SPA, with implications for early Solar System dynamics and the magmatic history of the Moon; the age and composition of volcanic rocks within SPA; the origin of the thorium signature of SPA, with implications for the origin of exposed materials and the thermal evolution of the Moon; and possibly the magnetization that forms a strong anomaly especially evident in the northern parts of the SPA basin. It is well known from studies of the Apollo regolith that rock fragments found in the regolith form a representative collection of many different rock types delivered to the site by the impact process (Fig. 1). Such samples are well documented to contain a broad suite of materials that reflect both the local major rock formations and some exotic materials from far distant sources. Within the SPA basin, modeling of the impact ejection process indicates that the regolith would be dominated by SPA substrate, formed at the time of the SPA basin-forming impact and for the most part moved around by subsequent impacts. Consistent with GRAIL data, the SPA impact likely formed a vast melt body tens of km thick that took perhaps several million years to cool, but that nonetheless represents barely an instant in geologic time that should be readily apparent through integrated geochronologic studies involving multiple chronometers. It is anticipated that a statistically significant number of age determinations would yield not only the age of SPA but also the ages of several prominent nearby basins and large craters within SPA. This chronology would provide a contrast to the Imbrium-dominated chronology of the nearside Apollo samples and an independent test of

  20. Analysis of bioethanol samples through Inductively Coupled Plasma Mass Spectrometry with a total sample consumption system

    Science.gov (United States)

    Sánchez, Carlos; Lienemann, Charles-Philippe; Todolí, Jose-Luis

    2016-10-01

    Real bioethanol samples have been directly analyzed through ICP-MS by means of the so-called High Temperature Torch Integrated Sample Introduction System (hTISIS). Because bioethanol samples may contain water, experiments have been carried out in order to determine the effect of ethanol concentration on the ICP-MS response. The ethanol content studied ranged from 0 to 50%, because higher alcohol concentrations led to carbon deposits on the ICP-MS interface. The spectrometer's default spray chamber (double pass) equipped with a glass concentric pneumatic micronebulizer has been taken as the reference system. Two flow regimes have been evaluated: continuous sample aspiration at 25 μL min⁻¹ and 5 μL air-segmented sample injection. The hTISIS temperature has been shown to be critical; in fact, ICP-MS sensitivity increased with this variable up to 100-200 °C, depending on the solution tested. Higher chamber temperatures led to either a drop in signal or a plateau. Compared with the reference system, the hTISIS improved the sensitivities by a factor of 4 to 8, while average detection limits were 6 times lower for the latter device. Regarding the influence of the ethanol concentration on sensitivity, it has been observed that an increase in temperature was not enough to eliminate the interferences. It was also necessary to modify the torch position with respect to the ICP-MS interface to overcome them. This fact was likely due to the different extent of ion plasma radial diffusion encountered as a function of the matrix when working at high chamber temperatures. When the torch was moved 1 mm down the plasma axis, ethanolic and aqueous solutions provided statistically equal sensitivities. A preconcentration procedure has been applied in order to validate the methodology. It has been found that, under optimum conditions from the point of view of matrix effects, recoveries for spiked samples were close to 100%. Furthermore, analytical concentrations for real

  1. Nano-Scale Sample Acquisition Systems for Small Class Exploration Spacecraft

    Science.gov (United States)

    Paulsen, G.

    2015-12-01

    The paradigm for space exploration is changing. Large and expensive missions are very rare, and the space community is turning to smaller, lighter, and less expensive missions that can still perform great exploration. These missions are also within reach of commercial companies, such as the Google Lunar X Prize teams that develop small-scale lunar missions. Recent commercial endeavors such as Planet Labs, Inc. and Sky Box Imaging, Inc. show that there are new benefits and business models associated with the miniaturization of space hardware. The Nano-Scale Sample Acquisition System includes NanoDrill for the capture of small rock cores and PlanetVac for the capture of surface regolith. These two systems are part of the ongoing effort to develop "Micro Sampling" systems for deployment by small spacecraft with limited payload capacities. The ideal applications include prospecting missions to the Moon and asteroids. The MicroDrill is a rotary-percussive coring drill that captures cores 7 mm in diameter and up to 2 cm long. The drill weighs less than 1 kg and can capture a core from a 40 MPa strength rock within a few minutes, with less than 10 W of power and less than 10 N of preload. The PlanetVac is a pneumatic regolith acquisition system that can capture a surface sample in a touch-and-go maneuver. These sampling systems were integrated within the footpads of a commercial quadcopter for testing. As such, they could also be used by geologists on Earth to explore difficult-to-reach locations.

  2. Stochastic bounded consensus tracking of leader-follower multi-agent systems with measurement noises based on sampled data with general sampling delay

    International Nuclear Information System (INIS)

    Wu Zhi-Hai; Peng Li; Xie Lin-Bo; Wen Ji-Wei

    2013-01-01

    In this paper we provide a unified framework for consensus tracking of leader-follower multi-agent systems with measurement noises based on sampled data with a general sampling delay. First, a stochastic bounded consensus tracking protocol based on sampled data with a general sampling delay is presented by employing the delay decomposition technique. Then, necessary and sufficient conditions are derived for guaranteeing leader-follower multi-agent systems with measurement noises and a time-varying reference state to achieve mean square bounded consensus tracking. The obtained results cover no sampling delay, a small sampling delay and a large sampling delay as three special cases. Last, simulations are provided to demonstrate the effectiveness of the theoretical results. (interdisciplinary physics and related areas of science and technology)
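The mean-square bounded tracking behaviour this record describes can be illustrated with a toy sampled-data simulation: followers update from noisy sampled neighbour and leader states, and the tracking error stays bounded rather than vanishing. The graph, gains, sampling period, and noise level below are invented, and the paper's sampling-delay analysis is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)
n, steps, h = 4, 400, 0.05          # followers, sampled steps, sampling period
k, sigma = 0.8, 0.05                # coupling gain, measurement noise std

leader = 0.0
x = rng.uniform(-5, 5, size=n)       # follower states (arbitrary start)
A = np.ones((n, n)) - np.eye(n)      # complete follower graph
b = np.ones(n)                       # every follower senses the leader
deg = A.sum(axis=1)

for _ in range(steps):
    leader += 0.2 * h                               # slowly moving reference
    noisy_x = x + rng.normal(0, sigma, size=n)      # sampled, noisy neighbours
    noisy_leader = leader + rng.normal(0, sigma)
    # Protocol: move toward noisy neighbour states and noisy leader state.
    u = k * (A @ noisy_x - deg * x + b * (noisy_leader - x))
    x = x + h * u                                   # sampled-data (Euler) update

err = np.abs(x - leader)
print(f"max tracking error after {steps} steps: {err.max():.3f}")
```

Because the reference keeps moving and the measurements are noisy, the error settles to a nonzero bounded value (here roughly v/(k·b) plus a noise term), which is exactly the "bounded consensus tracking" notion in the abstract, as opposed to exact consensus.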

  3. Design of penicillin fermentation process simulation system

    Science.gov (United States)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes is attracting increasing attention: it can ensure safety and provide products of consistent quality. The design of a simulation system for batch-process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical nonlinear, dynamic, multi-stage batch production process, is taken as the research object. A visual, human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system provides an effective platform for research on batch-process fault diagnosis.
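    The paper's simulator is not reproduced here, but the kind of dynamics such a system models can be sketched with a standard simplified batch-fermentation model: logistic biomass growth with Luedeking-Piret product formation, integrated by explicit Euler. All parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative parameters (not from the paper)
mu_max, X_max = 0.11, 12.0      # 1/h max specific growth rate; g/L max biomass
alpha, beta = 0.08, 0.002       # growth- and non-growth-associated product terms
dt, t_end = 0.1, 200.0          # h

t = np.arange(0.0, t_end + dt, dt)
X = np.empty_like(t); P = np.empty_like(t)
X[0], P[0] = 0.1, 0.0           # g/L initial biomass and penicillin

# Explicit Euler integration of the batch model
for k in range(len(t) - 1):
    dX = mu_max * X[k] * (1.0 - X[k] / X_max)   # logistic biomass growth
    dP = alpha * dX + beta * X[k]               # Luedeking-Piret product formation
    X[k + 1] = X[k] + dt * dX
    P[k + 1] = P[k] + dt * dP

print(f"final biomass {X[-1]:.2f} g/L, penicillin {P[-1]:.2f} g/L")
```

A fault-diagnosis testbed would generate such trajectories repeatedly, inject disturbances (e.g., a drop in mu_max), and let the monitoring algorithm flag the deviation.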

  4. Variable-temperature sample system for ion implantation at -192 to +500 °C

    International Nuclear Information System (INIS)

    Fuller, C.T.

    1978-04-01

    A variable-temperature sample system based on exchange-gas coupling was developed for ion-implantation use. The sample temperature can be controlled from -192 °C to +500 °C with rapid cooling. The system also has provisions for focusing and alignment of the ion beam, electron suppression, temperature monitoring, sample-current measurement, and cryo-shielding. Design considerations and operating characteristics are discussed. 5 figures

  5. Improving the Acquisition and Management of Sample Curation Data

    Science.gov (United States)

    Todd, Nancy S.; Evans, Cindy A.; Labasse, Dan

    2011-01-01

    This paper discusses the sample documentation processes currently used during and after a mission, examines the challenges and special considerations involved in designing effective sample curation data systems, and looks at the results of a simulated sample return mission and the lessons learned from this simulation. In addition, it introduces a new data architecture for an integrated sample curation data system being implemented at the NASA Astromaterials Acquisition and Curation department and discusses how it improves on existing data management systems.

  6. Process fluid cooling system

    International Nuclear Information System (INIS)

    Farquhar, N.G.; Schwab, J.A.

    1977-01-01

    A system of heat exchangers is disclosed for cooling process fluids. The system is particularly applicable to cooling steam generator blowdown fluid in a nuclear plant prior to chemical purification of the fluid; it minimizes the potential for boiling of the plant cooling water that cools the blowdown fluid.

  7. A whole process quality control system for energy measuring instruments inspection based on IOT technology

    Science.gov (United States)

    Yin, Bo; Liu, Li; Wang, Jiahan; Li, Xiran; Liu, Zhenbo; Li, Dewei; Wang, Jun; Liu, Lu; Wu, Jun; Xu, Tingting; Cui, He

    2017-10-01

    Electric energy measurement is fundamental work: accurate measurements play a vital role in protecting the economic interests of both parties to a power supply contract, and the standardized management of measurement laboratories at all levels directly affects the fairness of measurement. Currently, metering laboratories generally use one-dimensional bar codes as the recognition object, advance the testing process through manual management, and require human input of most test data to generate reports. This process has many problems and potential risks: data cannot be saved completely, the status of an inspection cannot be traced, the inspection process is not fully controllable, and so on. To meet a provincial metrology center's requirements for whole-process management of performance tests of power measuring appliances, we used large-capacity RF tags as the process-management information medium and developed a general measurement experiment management system. We formulated a standardized full performance test process, improved the raw-data recording mode of the experimental process, developed an automatic warehouse inventory device, and established a strict system for sample transfer and storage. The system ensures that all raw inspection data can be traced, achieves full life-cycle control of samples, and significantly improves the quality control level and the effectiveness of inspection work.

  8. Process gas solidification system

    International Nuclear Information System (INIS)

    1980-01-01

    A process for withdrawing gaseous UF6 from a first system and directing same into a second system for converting the gas to liquid UF6 at an elevated temperature, additionally including the step of withdrawing the resulting liquid UF6 from the second system, subjecting it to a specified sequence of flash-evaporation, cooling and solidification operations, and storing it as a solid in a plurality of storage vessels. (author)

  9. Preview-based sampling for controlling gaseous simulations

    KAUST Repository

    Huang, Ruoguan; Melek, Zeki; Keyser, John

    2011-01-01

    to maintain. During the high resolution simulation, a matching process ensures that the properties sampled from the low resolution simulation are maintained. This matching process keeps the different resolution simulations aligned even for complex systems

  10. Process fault diagnosis using knowledge-based systems

    International Nuclear Information System (INIS)

    Sudduth, A.L.

    1991-01-01

    Advancing technology in process plants has led to an increased need for computer-based process diagnostic systems to assist the operator. One approach to this problem is to use an embedded knowledge-based system to interpret measurement signals. Knowledge-based systems using only symptom-based rules are inadequate for real-time diagnosis of dynamic systems; therefore a model-based approach is necessary. Though several forms of model-based reasoning have been proposed, the use of qualitative causal models incorporating first-principles knowledge of process behavior, structure, and function appears to have the most promise as a robust modeling methodology. In this paper the structure of a diagnostic system is described which uses model-based reasoning and conventional numerical methods to perform process diagnosis. This system is being applied to the emergency diesel generator system in nuclear stations.

  11. Integrating a Genetic Algorithm Into a Knowledge-Based System for Ordering Complex Design Processes

    Science.gov (United States)

    Rogers, James L.; McCulley, Collin M.; Bloebaum, Christina L.

    1996-01-01

    The design cycle associated with large engineering systems requires an initial decomposition of the complex system into design processes which are coupled through the transference of output data. Some of these design processes may be grouped into iterative subcycles. In analyzing or optimizing such a coupled system, it is essential to be able to determine the best ordering of the processes within these subcycles to reduce design cycle time and cost. Many decomposition approaches assume the capability is available to determine what design processes and couplings exist and what order of execution will be imposed during the design cycle. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature, a genetic algorithm, has been added to DeMAID (Design Manager's Aid for Intelligent Decomposition) to allow the design manager to rapidly examine many different combinations of ordering processes in an iterative subcycle and to optimize the ordering based on cost, time, and iteration requirements. Two sample test cases are presented to show the effects of optimizing the ordering with a genetic algorithm.
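    The genetic-algorithm ordering idea can be sketched on a toy problem. This is a stand-in, not DeMAID itself: the coupling list, the operators, and the cost function (number of feedback couplings, i.e., couplings whose producing process is executed after its consumer) are illustrative assumptions.

```python
import random

random.seed(1)

# couplings[(src, dst)]: output of process src is consumed by process dst.
# The coupling (3, 1) closes an iterative subcycle 1 -> 2 -> 3 -> 1.
couplings = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (3, 1)]
n = 6

def feedback_cost(order):
    """Number of couplings whose producer is executed after its consumer."""
    pos = {p: i for i, p in enumerate(order)}
    return sum(1 for src, dst in couplings if pos[src] > pos[dst])

def order_crossover(p1, p2):
    """Keep a slice of parent 1, fill the remaining genes in parent 2's order."""
    a, b = sorted(random.sample(range(n), 2))
    middle = p1[a:b]
    fill = [g for g in p2 if g not in middle]
    return fill[:a] + middle + fill[a:]

def mutate(order, prob=0.2):
    if random.random() < prob:
        i, j = random.sample(range(n), 2)
        order[i], order[j] = order[j], order[i]
    return order

pop = [random.sample(range(n), n) for _ in range(30)]
for _ in range(60):
    pop.sort(key=feedback_cost)
    survivors = pop[:10]                    # truncation selection (elitist)
    children = [mutate(order_crossover(random.choice(survivors),
                                       random.choice(survivors)))
                for _ in range(20)]
    pop = survivors + children

best = min(pop, key=feedback_cost)
print(best, feedback_cost(best))
```

Because processes 1, 2, and 3 form a cycle, at least one feedback is unavoidable; the GA converges to an ordering with exactly one, mirroring how DeMAID searches orderings that minimize iteration requirements.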

  12. Improvements to sample processing and measurement to enable more widespread environmental application of tritium

    Energy Technology Data Exchange (ETDEWEB)

    Moran, James; Alexander, Thomas; Aalseth, Craig; Back, Henning; Mace, Emily; Overman, Cory; Seifert, Allen; Freeburg, Wilcox

    2017-08-01

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of T behavior in the environment.

  13. A high volume sampling system for isotope determination of volatile halocarbons and hydrocarbons

    Directory of Open Access Journals (Sweden)

    E. Bahlmann

    2011-10-01

    Full Text Available The isotopic composition of volatile organic compounds (VOCs) can provide valuable information on their sources and fate that is not deducible from mixing ratios alone. In particular, the reported carbon stable isotope ratios of chloromethane and bromomethane from different sources cover a δ13C range of almost 100‰, making isotope ratios a very promising tool for studying the biogeochemistry of these compounds. So far, the determination of the isotopic composition of C1 and C2 halocarbons other than chloromethane has been hampered by their low mixing ratios.

    In order to determine the carbon isotopic composition of C1 and C2 halocarbons with mixing ratios as low as 1 pptv, (i) a field-suitable cryogenic high volume sampling system and (ii) a chromatographic set-up for processing these samples have been developed and validated. The sampling system was tested at two different sampling sites, an urban and a coastal location in Northern Germany. The average δ13C values for bromomethane at the urban site were −42.9 ± 1.1‰ and agreed well with previously published results, but at the coastal site bromomethane was substantially enriched in 13C by almost 10‰. Less pronounced differences were observed for chlorodifluoromethane, 1,1,1-trichloroethane and chloromethane. We suggest that these differences are related to the turnover of these compounds in ocean surface waters. Furthermore, we report first carbon isotope ratios for iodomethane (−40.4‰ to −79.8‰), bromoform (−13.8‰ to 22.9‰), and other halocarbons.
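    As a reminder of the δ13C notation used above: it expresses a sample's 13C/12C ratio relative to the VPDB reference standard, in per mil. A minimal computation (the sample ratio below is made up to correspond roughly to the −42.9‰ urban bromomethane value):

```python
R_VPDB = 0.0112372  # 13C/12C ratio of the VPDB reference standard

def delta13C(r_sample: float) -> float:
    """delta 13C in per mil relative to VPDB."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# Hypothetical measured 13C/12C ratio of a 13C-depleted sample:
r = 0.0107552
print(f"{delta13C(r):.1f} per mil")
```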

  14. Method for Business Process Management System Selection

    OpenAIRE

    Westelaken, van de, Thijs; Terwee, Bas; Ravesteijn, Pascal

    2013-01-01

    In recent years business process management (BPM), and specifically information systems that support the analysis, design and execution of processes (also called business process management systems, BPMS), have been getting more attention. This has led to an increase in research on BPM and BPMS. However, the research on BPMS is mostly focused on the architecture of the system and how to implement such systems. How to select a BPM system that fits the strategy and goals of a specific organization is ...

  15. Masking of endotoxin in surfactant samples: Effects on Limulus-based detection systems.

    Science.gov (United States)

    Reich, Johannes; Lang, Pierre; Grallert, Holger; Motschmann, Hubert

    2016-09-01

    Over the last few decades the Limulus Amebocyte Lysate (LAL) assay has been the most sensitive method for the detection of endotoxins (lipopolysaccharides) and is well accepted in a broad field of applications. Recently, Low Endotoxin Recovery (LER) has been noticed in biopharmaceutical drug products, whereby the detection of potential endotoxin contaminations is not ensured. Notably, most of these drug products contain surfactants, which can have crucial effects on the detectability of endotoxin. In order to analyze the driving forces of LER, endotoxin detection in samples containing nonionic surfactants in various buffer systems was investigated. The results show that the process of LER is kinetically controlled and temperature-dependent. Furthermore, only the simultaneous presence of nonionic surfactants and components capable of forming metal complexes resulted in LER. In addition, capacity experiments show that even hazardous amounts of endotoxin can remain undetectable within such formulation compositions. In conclusion, the LER phenomenon is caused by endotoxin masking and not by test interference. In this process, the supramolecular structure of endotoxin is altered and exhibits only a limited susceptibility to binding to Factor C of Limulus-based detection systems. We propose a two-step mechanism of endotoxin masking by complex-forming agents and nonionic surfactants. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  16. Control Charts for Processes with an Inherent Between-Sample Variation

    Directory of Open Access Journals (Sweden)

    Eva Jarošová

    2018-06-01

    Full Text Available A number of processes to which statistical control is applied are subject to various effects that cause random changes in the mean value. The removal of these fluctuations is either technologically impossible or economically disadvantageous under current conditions. The frequent occurrence of signals in the Shewhart chart due to these fluctuations is then undesirable, and therefore the conventional control limits need to be extended. Several approaches to the design of control charts with extended limits are presented in the paper and applied to data from a real production process. The methods assume samples of size greater than 1. The performance of the charts is examined using the operating characteristic and the average run length. The study reveals that in many cases reducing the risk of false alarms is insufficient.
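    One common way to extend the limits is to base them on the total variance of the subgroup means (within-sample plus between-sample components) instead of the within-sample variance alone. A minimal sketch on simulated data; the variance components and subgroup sizes are illustrative assumptions, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(42)
m, n = 50, 5                      # number of subgroups, subgroup size
sigma_b, sigma_w = 2.0, 1.0       # between-sample and within-sample std devs

# In-control process whose mean wanders randomly from sample to sample
data = rng.normal(0.0, sigma_b, size=(m, 1)) + rng.normal(0.0, sigma_w, size=(m, n))
xbar = data.mean(axis=1)
grand = xbar.mean()

# Conventional Shewhart limits: within-subgroup variability only
s_within = np.sqrt(data.var(axis=1, ddof=1).mean())
halfwidth_conv = 3.0 * s_within / np.sqrt(n)

# Extended limits: use the observed variance of the subgroup means,
# which contains both the within- and between-sample components
halfwidth_ext = 3.0 * xbar.std(ddof=1)

signals_conv = int(np.sum(np.abs(xbar - grand) > halfwidth_conv))
signals_ext = int(np.sum(np.abs(xbar - grand) > halfwidth_ext))
print(signals_conv, signals_ext)
```

With an inherent between-sample component this large, the conventional limits flag a large fraction of perfectly in-control subgroups, while the extended limits flag almost none, which is the false-alarm problem the record describes.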

  17. Does Pneumatic Tube System Transport Contribute to Hemolysis in ED Blood Samples?

    Science.gov (United States)

    Phelan, Michael P; Reineks, Edmunds Z; Hustey, Fredric M; Berriochoa, Jacob P; Podolsky, Seth R; Meldon, Stephen; Schold, Jesse D; Chamberlin, Janelle; Procop, Gary W

    2016-09-01

    Our goal was to determine whether hemolysis among blood samples obtained in an emergency department and sent to the laboratory through a pneumatic tube system differed from that in samples that were hand-carried. The hemolysis index is measured on all samples submitted for potassium analysis. We queried our hospital laboratory database system (SunQuest®) for potassium results for specimens obtained between January 2014 and July 2014. From facility maintenance records, we identified periods of system downtime, during which specimens were hand-carried to the laboratory. During the study period, 15,851 blood specimens were transported via our pneumatic tube system and 92 samples were hand delivered. The proportions of hemolyzed specimens in the two groups were not significantly different (13.6% vs. 13.1% [p=0.90]). Results were consistent when the criterion was limited to gross (3.3% vs. 3.3% [p=0.99]) or mild (10.3% vs. 9.8% [p=0.88]) hemolysis. The hemolysis rate showed minimal variation during the study period (12.6%-14.6%). We found no statistical difference in the percentages of hemolyzed specimens transported by the pneumatic tube system or hand delivered to the laboratory. Certain features of pneumatic tube systems might contribute to hemolysis (e.g., speed, distance, packing material). Since each system is unique in design, we encourage medical facilities to consider whether their method of transport might contribute to hemolysis in samples obtained in the emergency department.
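    The reported comparison can be reproduced approximately with a two-proportion z-test. The hemolyzed counts below are back-calculated from the reported percentages and group sizes, so they are approximations, not the study's raw counts.

```python
from math import sqrt, erfc

# Approximate counts reconstructed from the reported rates
n_tube, k_tube = 15851, 2156      # ~13.6% hemolyzed, pneumatic tube
n_hand, k_hand = 92, 12           # ~13.0% hemolyzed, hand-carried

p1, p2 = k_tube / n_tube, k_hand / n_hand
p_pool = (k_tube + k_hand) / (n_tube + n_hand)
se = sqrt(p_pool * (1.0 - p_pool) * (1.0 / n_tube + 1.0 / n_hand))
z = (p1 - p2) / se
p_value = erfc(abs(z) / sqrt(2.0))   # two-sided normal approximation
print(f"z = {z:.2f}, p = {p_value:.2f}")
```

The p-value lands around 0.9, consistent with the study's conclusion of no significant difference; note that with only 92 hand-carried samples the test has limited power to detect small differences.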

  18. Development of DUMAS data processing system

    International Nuclear Information System (INIS)

    Sakamoto, Hiroshi

    1982-01-01

    In the field of nuclear experiments, faster data processing has recently been required as the amount of data per event and the rate of event occurrence per unit time have increased. The DUMAS project at RCNP requires the development of a data processing system capable of high-speed transfer and processing. The system should transfer the data of 5 multiwire proportional counters and other counters from the laboratory to the counting room at a rate of 1000 events per second, and should also perform fairly complex processing in the counting room, such as histogramming, particle identification, and calculation of various polarizations, as well as dumping to secondary memory. Furthermore, easy start-up, adjustment, inspection, and maintenance, using non-specialized hardware and software, should be considered. A system currently being investigated to satisfy these requirements is described. The main points are as follows: employ CAMAC for the interface with the readout circuits; transfer data between the laboratory and the counting room by converting byte-serial transfer to bit-serial optical fiber communication; and unify the data processing computers within the PDP-11 family by connecting two minicomputers. Development of such a data processing system also seems useful as preparatory research for the development of NUMATRON measuring instruments. (Wakatsuki, Y.)

  19. Process evaluation distributed system

    Science.gov (United States)

    Moffatt, Christopher L. (Inventor)

    2006-01-01

    The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on that information; it utilizes a personal digital assistant (PDA). The data display module is in communication with the database server and includes a website for viewing collected process data in a desired metrics form; it also provides editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module minimizes the requirement for manual input of the collected process data.

  20. SPALAX new generation: New process design for a more efficient xenon production system for the CTBT noble gas network.

    Science.gov (United States)

    Topin, Sylvain; Greau, Claire; Deliere, Ludovic; Hovesepian, Alexandre; Taffary, Thomas; Le Petit, Gilbert; Douysset, Guilhem; Moulin, Christophe

    2015-11-01

    The SPALAX (Système de Prélèvement Automatique en Ligne avec l'Analyse du Xénon) is one of the systems used in the International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) to detect radioactive xenon releases following a nuclear explosion. Approximately 10 years after the industrialization of the first system, the CEA has developed the SPALAX New Generation, SPALAX-NG, with the aim of increasing the global sensitivity and reducing the overall size of the system. A major breakthrough has been obtained by improving the sampling stage and the purification/concentration stage. The sampling stage evolution consists of increasing the sampling capacity and improving the gas treatment efficiency across new permeation membranes, leading to an increase in the xenon production capacity by a factor of 2-3. The purification/concentration stage evolution consists of using a new adsorbent, Ag@ZSM-5 (or Ag-PZ2-25), with a much larger xenon retention capacity than activated charcoal, enabling a significant reduction in the overall size of this stage. The energy consumption of the system is similar to that of the current SPALAX system. The SPALAX-NG process is able to produce samples of almost 7 cm³ of xenon every 12 h, making it the most productive xenon process among the IMS systems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Towards automatic parameter tuning of stream processing systems

    KAUST Repository

    Bilal, Muhammad

    2017-09-27

    Optimizing the performance of big-data streaming applications has become a daunting and time-consuming task: parameters may be tuned from a space of hundreds or even thousands of possible configurations. In this paper, we present a framework for automating parameter tuning for stream-processing systems. Our framework supports standard black-box optimization algorithms as well as a novel gray-box optimization algorithm. We demonstrate the multiple benefits of automated parameter tuning in optimizing three benchmark applications in Apache Storm. Our results show that a hill-climbing algorithm that uses a new heuristic sampling approach based on Latin Hypercube provides the best results. Our gray-box algorithm provides comparable results while being two to five times faster.
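    The gray-box algorithm is specific to the paper, but the winning black-box strategy, hill climbing seeded by Latin Hypercube samples, can be sketched on a toy objective. The objective function (a stand-in for measured throughput over two normalized parameters) and all tuning constants below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

def objective(x):
    """Toy stand-in for a measured performance metric; optimum at (0.3, 0.7)."""
    return -((x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2)

def latin_hypercube(n_samples, dims):
    """One point per stratum in each dimension, with strata randomly paired."""
    strata = np.tile(np.arange(n_samples), (dims, 1))
    u = (rng.permuted(strata, axis=1).T + rng.random((n_samples, dims))) / n_samples
    return u

# 1) Space-filling initial design over the normalized parameter space
samples = latin_hypercube(20, 2)
best = max(samples, key=objective)
best_val = objective(best)

# 2) Hill climbing from the best initial sample, with a shrinking step size
step = 0.1
for _ in range(400):
    cand = np.clip(best + rng.normal(0.0, step, size=2), 0.0, 1.0)
    if objective(cand) > best_val:
        best, best_val = cand, objective(cand)
    step *= 0.99

print(best, best_val)
```

In a real tuner each `objective` call would deploy a configuration and benchmark the Storm topology, which is why sample-efficient designs like Latin Hypercube matter.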

  2. Process-aware information systems : bridging people and software through process technology

    NARCIS (Netherlands)

    Dumas, M.; Aalst, van der W.M.P.; Hofstede, ter A.H.M.

    2005-01-01

    A unifying foundation to design and implement process-aware information systems. This publication takes on the formidable task of establishing a unifying foundation and set of common underlying principles to effectively model, design, and implement process-aware information systems. Authored by

  3. Introduction to digital image processing

    CERN Document Server

    Pratt, William K

    2013-01-01

    CONTINUOUS IMAGE CHARACTERIZATION. Continuous Image Mathematical Characterization: Image Representation; Two-Dimensional Systems; Two-Dimensional Fourier Transform; Image Stochastic Characterization. Psychophysical Vision Properties: Light Perception; Eye Physiology; Visual Phenomena; Monochrome Vision Model; Color Vision Model. Photometry and Colorimetry: Photometry; Color Matching; Colorimetry Concepts; Color Spaces. DIGITAL IMAGE CHARACTERIZATION. Image Sampling and Reconstruction: Image Sampling and Reconstruction Concepts; Monochrome Image Sampling Systems; Monochrome Image Reconstruction Systems; Color Image Sampling Systems. Image Quantization: Scalar Quantization; Processing Quantized Variables; Monochrome and Color Image Quantization. DISCRETE TWO-DIMENSIONAL LINEAR PROCESSING. Discrete Image Mathematical Characterization: Vector-Space Image Representation; Generalized Two-Dimensional Linear Operator; Image Statistical Characterization; Image Probability Density Models; Linear Operator Statistical Representation; Superposition and Convolution; Finite-Area Superp...

  4. Development of Spreadsheet-Based Integrated Transaction Processing Systems and Financial Reporting Systems

    Science.gov (United States)

    Ariana, I. M.; Bagiada, I. M.

    2018-01-01

    Development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets for accounting data processing. The purposes of this study are: 1) to describe spreadsheet-based integrated transaction processing and financial reporting systems; and 2) to test their technical and operational feasibility. This is a research-and-development study whose main steps are: 1) needs analysis (needs assessment); 2) development of the spreadsheet-based systems; and 3) testing of their feasibility. Technical feasibility covers the ability of the hardware and operating system to run the accounting application, and the application's simplicity and ease of use. Operational feasibility covers users' ability to operate the application, the application's ability to produce information, and the application's controls. The instrument used to assess technical and operational feasibility is an expert-perception questionnaire using a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed by percentage analysis, comparing the sum of the actual answers for each item with the ideal (maximum) answer for that item. The spreadsheet-based systems integrate sales, purchase, and cash transaction processing to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The systems were judged feasible on both technical (87.50%) and operational (84.17%) aspects.
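    The percentage analysis can be sketched in a few lines. The expert responses below are made-up illustrations; 4 is the ideal (maximum) score per item on the Likert scale, and ratings averaging 3.5 happen to reproduce the 87.50% technical figure.

```python
def feasibility_percentage(scores, ideal=4):
    """Sum of actual Likert answers over the ideal total, as a percentage."""
    return 100.0 * sum(scores) / (ideal * len(scores))

# Hypothetical expert ratings for four technical-feasibility items
technical = [4, 3, 4, 3]
print(f"technical feasibility: {feasibility_percentage(technical):.2f}%")
```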

  5. Integration mockup and process material management system

    Science.gov (United States)

    Verble, Adas James, Jr.

    1992-01-01

    Work to define and develop a full-scale Space Station Freedom (SSF) mockup, with the flexibility to evolve into future designs, to validate techniques for maintenance and logistics and to verify human task allocations and support trade studies, is described. This work began in early 1985 and ended in August 1991. The mockups are presently being used at MSFC in Building 4755 as a technology and design testbed, as well as for public display. Micro Craft also began work on the Process Material Management System (PMMS) under this contract. The PMMS simulator was a sealed enclosure for tests to identify liquid, gaseous, and particulate samples and specimens, including urine, waste water, condensate, hazardous gases, surrogate gases, liquids, and solids. Because the SSF would require many trade studies to validate techniques for maintenance and logistics and to verify system task allocations, it was necessary to develop a full-scale mockup representative of the current SSF design that could easily be changed as the design evolved. The tasks defined for Micro Craft were to provide the personnel, services, tools, and materials for the SSF mockup, which would consist of four modules, nodes, interior components, and part-task mockups of MSFC-responsible engineering systems, including the Environmental Control and Life Support System (ECLSS) testbed. For the initial study, the mockups were low-fidelity, soft mockups of graphics art bottle, and other low-cost materials, which evolved into higher-fidelity mockups as the design evolved, by modifying or rebuilding, an important cost-saving factor in the design process. We designed, fabricated, and maintained the full-size mockup shells and support stands. The shells consisted of cylinders, end cones, rings, longerons, docking ports, crew airlocks, and windows. The ECLSS required a heavier cylinder to support the ECLSS systems test program. Details of this activity will be covered.

  6. 3D Body Scanning Measurement System Associated with RF Imaging, Zero-padding and Parallel Processing

    Directory of Open Access Journals (Sweden)

    Kim Hyung Tae

    2016-04-01

    Full Text Available This work presents a novel signal processing method for high-speed 3D body measurements using millimeter waves with a graphics processing unit (GPU) and zero-padding fast Fourier transform (ZPFFT). The proposed measurement system consists of a radio-frequency (RF) antenna array for penetrable measurement, a high-speed analog-to-digital converter (ADC) for significant data acquisition, and a GPU for fast signal processing. The RF waves of the transmitter and the receiver are converted to real and imaginary signals that are sampled by the high-speed ADC and synchronized with the kinematic positions of the scanner. Because the distance between the surface and the antenna is related to the peak frequency of the conjugate signals, a fast Fourier transform (FFT) is applied in the signal processing after sampling. The sampling time is finite owing to the short scanning time, so to increase the effective resolution, zero-padding is applied to interpolate the spectra of the sampled signals, allowing a sub-bin (1/m) floating-point frequency to be considered. The GPU and a parallel algorithm are applied to accelerate the ZPFFT because of its large number of additional mathematical operations. 3D body images are finally obtained from the spectrograms, i.e., the arrangement of the ZPFFT results in 3D space.
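    The benefit of zero-padding for peak-frequency estimation can be demonstrated in a few lines. The sampling rate, record length, and tone frequency below are illustrative values, not the scanner's actual parameters.

```python
import numpy as np

fs, n, f_true = 100.0, 64, 12.34          # Hz sample rate, record length, tone frequency
t = np.arange(n) / fs
sig = np.cos(2 * np.pi * f_true * t)

# Peak bin of the plain FFT: resolution limited to fs/n = 1.5625 Hz
spec = np.abs(np.fft.rfft(sig))
f_raw = np.fft.rfftfreq(n, 1 / fs)[spec.argmax()]

# Zero-pad to 4096 points: the FFT now samples the underlying DTFT
# on a much finer frequency grid, so the peak lands near the true tone.
m = 4096
spec_pad = np.abs(np.fft.rfft(sig, n=m))
f_pad = np.fft.rfftfreq(m, 1 / fs)[spec_pad.argmax()]

print(f"raw estimate {f_raw:.4f} Hz, zero-padded estimate {f_pad:.4f} Hz")
```

Zero-padding adds no new information, it only interpolates the spectrum; but since the peak frequency maps directly to surface-antenna distance in the scanner, the finer grid translates into finer range resolution.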

  7. Advanced information processing system

    Science.gov (United States)

    Lala, J. H.

    1984-01-01

    Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.

  8. Process-aware information systems : design, enactment and analysis

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Wah, B.W.

    2009-01-01

    Process-aware information systems support operational business processes by combining advances in information technology with recent insights from management science. Workflow management systems are typical examples of such systems. However, many other types of information systems are also "process aware".

  9. Improving preanalytic processes using the principles of lean production (Toyota Production System).

    Science.gov (United States)

    Persoon, Thomas J; Zaleski, Sue; Frerichs, Janice

    2006-01-01

    The basic technologies used in preanalytic processes for chemistry tests have been mature for a long time, and improvements in preanalytic processes have lagged behind improvements in analytic and postanalytic processes. We describe our successful efforts to improve chemistry test turnaround time from a central laboratory by improving preanalytic processes, using existing resources and the principles of lean production. Our goal is to report 80% of chemistry tests in less than 1 hour and to no longer recognize a distinction between expedited and routine testing. We used principles of lean production (the Toyota Production System) to redesign preanalytic processes. The redesigned preanalytic process has fewer steps and uses 1-piece flow to move blood samples through the accessioning, centrifugation, and aliquoting processes. Median preanalytic processing time was reduced from 29 to 19 minutes, and the laboratory met the goal of reporting 80% of chemistry results in less than 1 hour for 11 consecutive months.

  10. Spectral BRDF measurements of metallic samples for laser processing applications

    International Nuclear Information System (INIS)

    Vitali, L; Fustinoni, D; Gramazio, P; Niro, A

    2015-01-01

    The spectral bidirectional reflectance distribution function (BRDF) of metals plays an important role in industrial processing involving laser-surface interaction. In particular, in laser metal machining, absorbance is strongly dependent on the radiation incidence angle as well as on the finishing and contamination grade of the surface, and in turn it can considerably affect processing results. Very recently, laser radiation has also been used to structure metallic surfaces in order to produce many particular optical effects, ranging from high-level polishing to angular color shifting. Of course, full knowledge of the spectral BRDF of these structured layers makes it possible to infer reflectance or color for any irradiation and viewing angles. In this paper, we present Vis-NIR spectral BRDF measurements of laser-polished metallic, opaque, flat samples commonly employed in such applications. The resulting optical properties seem to depend on the atmospheric composition during the polishing process in addition to the roughness. The measurements are carried out with a Perkin Elmer Lambda 950 double-beam spectrophotometer, equipped with the Absolute Reflectance/Transmittance Analyzer (ARTA) motorized goniometer. (paper)

  11. Determination of dominant biogeochemical processes in a contaminated aquifer-wetland system using multivariate statistical analysis

    Science.gov (United States)

    Baez-Cazull, S. E.; McGuire, J.T.; Cozzarelli, I.M.; Voytek, M.A.

    2008-01-01

Determining the processes governing aqueous biogeochemistry in a wetland hydrologically linked to an underlying contaminated aquifer is challenging due to the complex exchange between the systems and their distinct responses to changes in precipitation, recharge, and biological activities. To evaluate temporal and spatial processes in the wetland-aquifer system, water samples were collected using cm-scale multichambered passive diffusion samplers (peepers) to span the wetland-aquifer interface over a period of 3 yr. Samples were analyzed for major cations and anions, methane, and a suite of organic acids resulting in a large dataset of over 8000 points, which was evaluated using multivariate statistics. Principal component analysis (PCA) was chosen with the purpose of exploring the sources of variation in the dataset to expose related variables and provide insight into the biogeochemical processes that control the water chemistry of the system. Factor scores computed from PCA were mapped by date and depth. Patterns observed suggest that (i) fermentation is the process controlling the greatest variability in the dataset and it peaks in May; (ii) iron and sulfate reduction were the dominant terminal electron-accepting processes in the system and were associated with fermentation but had more complex seasonal variability than fermentation; (iii) methanogenesis was also important and associated with bacterial utilization of minerals as a source of electron acceptors (e.g., barite BaSO4); and (iv) seasonal hydrological patterns (wet and dry periods) control the availability of electron acceptors through the reoxidation of reduced iron-sulfur species enhancing iron and sulfate reduction. Copyright © 2008 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.
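The PCA step described above — mean-center the chemistry variables, extract the leading component, and compute factor scores per sample — can be sketched with a stdlib-only power iteration. The dataset here is synthetic; a real analysis of the peeper data would use a statistics package on the full matrix:

```python
import math

def pca_first_component(data):
    """First principal component of `data` (rows = samples, columns =
    measured variables): mean-center, build the covariance matrix,
    find its leading eigenvector by power iteration, and return the
    loading vector plus per-sample factor scores."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    X = [[row[j] - means[j] for j in range(p)] for row in data]
    C = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
          for b in range(p)] for a in range(p)]        # covariance matrix
    v = [1.0] * p
    for _ in range(200):                               # power iteration
        w = [sum(C[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    scores = [sum(X[i][j] * v[j] for j in range(p)) for i in range(n)]
    return v, scores

# Synthetic example: variable 0 (say, sulfate) dominates the variance,
# so the first component should load almost entirely on it
data = [[2.0, 0.10], [4.0, 0.20], [6.0, 0.10], [8.0, 0.20], [10.0, 0.15]]
loadings, scores = pca_first_component(data)
```

Mapping `scores` by sampling date and depth is then a plotting exercise, as done in the study.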

  12. Hydrogen determination using secondary processes of recoil proton interaction with sample material

    International Nuclear Information System (INIS)

    Muminov, V.A.; Khajdarov, R.A.; Navalikhin, L.V.; Pardaev, Eh.

    1980-01-01

Possibilities for determining the hydrogen content of different materials are studied using secondary processes of the interaction of recoil protons (from irradiation in a fast-neutron field) with the sample material, which result in characteristic X-ray emission. The excited radiation is recorded with a detector placed in a protective screen at a certain distance from the analyzed object and the neutron source. The method is tested on bromine-containing samples (30% Br, 0.5% H) and on tungsten dioxide. The detection limit for hydrogen is 0.05% at a confidence coefficient of 0.9. The neutron flux was 10³ neutrons/cm²·s, the measurement time 15-20 minutes, and the sample-to-detector distance 12-15 cm.

  13. Handbook of signal processing systems

    CERN Document Server

    Deprettere, Ed; Leupers, Rainer; Takala, Jarmo

    2013-01-01

    Handbook of Signal Processing Systems is organized in three parts. The first part motivates representative applications that drive and apply state-of-the art methods for design and implementation of signal processing systems; the second part discusses architectures for implementing these applications; the third part focuses on compilers and simulation tools, describes models of computation and their associated design tools and methodologies. This handbook is an essential tool for professionals in many fields and researchers of all levels.

  14. 28 CFR 28.12 - Collection of DNA samples.

    Science.gov (United States)

    2010-07-01

    ... Homeland Security, collecting DNA samples from: (1) Aliens lawfully in, or being processed for lawful... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Collection of DNA samples. 28.12 Section 28.12 Judicial Administration DEPARTMENT OF JUSTICE DNA IDENTIFICATION SYSTEM DNA Sample Collection...

  15. Cobit system in the audit processes of the systems of computer systems

    Directory of Open Access Journals (Sweden)

    Julio Jhovany Santacruz Espinoza

    2017-12-01

Full Text Available The present research was carried out to show the benefits of using COBIT in the auditing of computer systems. The problem addressed is: how does the use of COBIT affect the audit process in institutions? The main objective is to identify the impact of the use of COBIT on the auditing processes applied to computer systems in both public and private organizations. To achieve the stated objectives, the research first develops the conceptualization of key terms for an easy understanding of the subject. As a conclusion, we can say that COBIT makes it possible, using information from the IT departments, to identify the Information Technology (IT) resources specified in COBIT — such as files, programs, computer networks, and the personnel that use or manipulate the information — with the purpose of providing the information that the organization or company requires to achieve its objectives.

  16. An NRTA data processing system: PROMAC-J

    International Nuclear Information System (INIS)

    Ikawa, Koji; Ihara, Hitoshi; Nishimura, Hideo

    1993-09-01

The application of Near-Real-Time Materials Accountancy (NRTA) has been studied as an advanced safeguards measure for a spent nuclear fuel reprocessing plant. With a view to practical application of the NRTA concept to a real plant, a data processing system has been developed that emphasizes effective and prompt processing of the NRTA data obtained in the field, so that a user can easily analyze time-sequential MUF data on site using the decision analyses. The NRTA data processing system was used to process and analyze the NRTA data obtained from September to December 1985 during a full-scale field test of the proposed NRTA model for the PNC Tokai reprocessing plant. The result of this field test showed that the NRTA data processing system would provide sufficient information under real plant conditions. The data processing system was then improved, reflecting the experience obtained in the field test. This report describes the hardware and software of the JAERI NRTA data processing system, developed as an improvement of the previous system that had been developed and transferred to the PNC Tokai reprocessing plant. Improvements were made to both the hardware components and the software. (author)

  17. An NRTA data processing system: PROMAC-J

    International Nuclear Information System (INIS)

    Ihara, Hitoshi; Nishimura, Hideo; Ikawa, Koji

    1991-03-01

The application of Near-Real-Time Materials Accountancy (NRTA) has been studied as an advanced safeguards measure for a spent nuclear fuel reprocessing plant. With a view to practical application of the NRTA concept to a real plant, a data processing system has been developed that emphasizes effective and prompt processing of the NRTA data obtained in the field, so that a user can easily analyze time-sequential MUF data on site using the decision analyses. The NRTA data processing system was used to process and analyze the NRTA data obtained from September to December 1985 during a full-scale field test of the proposed NRTA model for the PNC Tokai reprocessing plant. The result of this field test showed that the NRTA data processing system would provide sufficient information under real plant conditions. The data processing system was then improved, reflecting the experience obtained in the field test. This report describes the hardware and software of the JAERI NRTA data processing system, developed as an improvement of the previous system that had been developed and transferred to the PNC Tokai reprocessing plant. Improvements were made to both the hardware components and the software. (author)

  18. Tank waste remediation system (TWRS) privatization contractor samples waste envelope D material 241-C-106

    Energy Technology Data Exchange (ETDEWEB)

    Esch, R.A.

    1997-04-14

This report represents the Final Analytical Report on Tank Waste Remediation System (TWRS) Privatization Contractor Samples for Waste Envelope D. All work was conducted in accordance with ''Addendum 1 of the Letter of Instruction (LOI) for TWRS Privatization Contractor Samples Addressing Waste Envelope D Materials - Revision 0, Revision 1, and Revision 2'' (Jones 1996, Wiemers 1996a, Wiemers 1996b). Tank 241-C-106 (C-106) was selected by TWRS Privatization for the Part 1A Envelope D high-level waste demonstration. Twenty bottles of Tank C-106 material were collected by Westinghouse Hanford Company using a grab sampling technique and transferred to the 325 building for processing by the Pacific Northwest National Laboratory (PNNL). At the 325 building, the contents of the twenty bottles were combined into a single Initial Composite Material. This composite was subsampled for the laboratory-scale screening test and characterization testing, and the remainder was transferred to the 324 building for bench-scale preparation of the Privatization Contractor samples.

  19. Features, Events, and Processes: System Level

    Energy Technology Data Exchange (ETDEWEB)

    D. McGregor

    2004-04-19

The primary purpose of this analysis is to evaluate System Level features, events, and processes (FEPs). The System Level FEPs typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analysis and model reports. The System Level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations, or are addressed in background information used in development of the regulations. This evaluation determines which of the System Level FEPs are excluded from the modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the information presented in analysis reports, model reports, direct input, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  20. PFP Wastewater Sampling Facility

    International Nuclear Information System (INIS)

    Hirzel, D.R.

    1995-01-01

This test report documents the results obtained while conducting operational testing of the sampling equipment in the 225-WC building, the PFP Wastewater Sampling Facility. The Wastewater Sampling Facility houses equipment to sample and monitor the PFP's liquid effluents before the stream is discharged to the 200 Area Treated Effluent Disposal Facility (TEDF). The majority of the streams are nonradioactive discharges from the PFP heating, ventilation, and air conditioning (HVAC) systems. The streams that might be contaminated are processed through the Low Level Waste Treatment Facility (LLWTF) before discharging to TEDF. The sampling equipment consists of two flow-proportional composite samplers, an ultrasonic flowmeter, pH and conductivity monitors, a chart recorder, and associated relays and current isolators that interconnect the equipment to allow proper operation. Data signals from the monitors are received in the 234-5Z Shift Office, which contains a chart recorder and an alarm annunciator panel. The data signals are also duplicated and sent to the TEDF control room through the Local Control Unit (LCU). Performing the OTP has verified the operability of the PFP wastewater sampling system. This Operability Test Report documents the acceptance of the sampling system for use.

  1. The effect of sample grinding procedures after processing on gas production profiles and end-product formation in expander processed barley and peas

    NARCIS (Netherlands)

    Azarfar, A.; Poel, van der A.F.B.; Tamminga, S.

    2007-01-01

    Grinding is a technological process widely applied in the feed manufacturing industry and is a prerequisite for obtaining representative samples for laboratory procedures (e.g. gas production analysis). When feeds are subjected to technological processes other than grinding (e.g. expander

  2. Does Pneumatic Tube System Transport Contribute to Hemolysis in ED Blood Samples?

    Directory of Open Access Journals (Sweden)

    Fredric M. Hustey

    2016-09-01

Full Text Available Introduction: Our goal was to determine if the hemolysis among blood samples obtained in an emergency department and then sent to the laboratory in a pneumatic tube system was different from that in samples that were hand-carried. Methods: The hemolysis index is measured on all samples submitted for potassium analysis. We queried our hospital laboratory database system (SunQuest®) for potassium results for specimens obtained between January 2014 and July 2014. From facility maintenance records, we identified periods of system downtime, during which specimens were hand-carried to the laboratory. Results: During the study period, 15,851 blood specimens were transported via our pneumatic tube system and 92 samples were hand delivered. The proportions of hemolyzed specimens in the two groups were not significantly different (13.6% vs. 13.1%; p=0.90). Results were consistent when the criterion was limited to gross (3.3% vs. 3.3%; p=0.99) or mild (10.3% vs. 9.8%; p=0.88) hemolysis. The hemolysis rate showed minimal variation during the study period (12.6%–14.6%). Conclusion: We found no statistical difference in the percentages of hemolyzed specimens transported by a pneumatic tube system or hand delivered to the laboratory. Certain features of pneumatic tube systems might contribute to hemolysis (e.g., speed, distance, packing material). Since each system is unique in design, we encourage medical facilities to consider whether their method of transport might contribute to hemolysis in samples obtained in the emergency department.
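The comparison reported above (13.6% vs. 13.1%, p=0.90) is a standard two-proportion test. A sketch using the pooled normal approximation; the counts below are back-calculated from the reported percentages and are therefore approximate:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test: returns the z statistic and the
    two-sided p-value from the normal approximation."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value via the error-function form of the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# ~13.6% of 15,851 tube-transported vs. ~13.1% of 92 hand-carried samples
z, p = two_proportion_z(2156, 15851, 12, 92)
```

With only 92 hand-carried specimens the test has little power, which is consistent with the large p-value the authors report.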

  3. Advanced pressure tube sampling tools

    International Nuclear Information System (INIS)

    Wittich, K.C.; King, J.M.

    2002-01-01

Deuterium concentration is an important parameter that must be assessed to evaluate the fitness for service of CANDU pressure tubes. In-reactor pressure tube sampling allows accurate deuterium concentration assessments to be made without the expense associated with fuel channel removal. This technology, which AECL has developed over the past fifteen years, has become the standard method for deuterium concentration assessment. AECL is developing a multi-head tool that would reduce in-reactor handling overhead by allowing one tool to sequentially sample at all four axial pressure tube locations before removal from the reactor. Four sets of independent cutting heads, like those on the existing sampling tools, facilitate this, incorporating proven technology demonstrated in more than 1400 in-reactor samples taken to date. The multi-head tool is delivered by AECL's Advanced Delivery Machine or other similar delivery machines. Further, AECL has developed an automated sample handling system that receives and processes the tool once it is out of the reactor. This system retrieves samples from the tool, then dries, weighs, and places them in labelled vials, which are directed into shielded shipping flasks. The multi-head wet sampling tool and the automated sample handling system are based on proven technology and offer continued savings and dose reduction to utilities in a competitive electricity market. (author)

  4. Dynamics Explorer science data processing system

    International Nuclear Information System (INIS)

    Smith, P.H.; Freeman, C.H.; Hoffman, R.A.

    1981-01-01

The Dynamics Explorer project has acquired the ground data processing system from the Atmosphere Explorer project to provide a central computer facility for the data processing, data management, and data analysis activities of the investigators. Access to this system is via remote terminals at the investigators' facilities, which provide ready access to the data sets derived from groups of instruments on both spacecraft. The original system has been upgraded with both new hardware and enhanced software. The additions include color and grey-scale graphics terminals, an augmentation computer, a micrographics facility, a versatile database with a directory and data management system, and graphics display software packages. (orig.)

  5. An automated blood sampling system used in positron emission tomography

    International Nuclear Information System (INIS)

    Eriksson, L.; Bohm, C.; Kesselberg, M.

    1988-01-01

    Fast dynamic function studies with positron emission tomography (PET), has the potential to give accurate information of physiological functions of the brain. This capability can be realised if the positron camera system accurately quantitates the tracer uptake in the brain with sufficiently high efficiency and in sufficiently short time intervals. However, in addition, the tracer concentration in blood, as a function of time, must be accurately determined. This paper describes and evaluates an automated blood sampling system. Two different detector units are compared. The use of the automated blood sampling system is demonstrated in studies of cerebral blood flow, in studies of the blood-brain barrier transfer of amino acids and of the cerebral oxygen consumption. 5 refs.; 7 figs

  6. On-chip sample preparation for complete blood count from raw blood.

    Science.gov (United States)

    Nguyen, John; Wei, Yuan; Zheng, Yi; Wang, Chen; Sun, Yu

    2015-03-21

    This paper describes a monolithic microfluidic device capable of on-chip sample preparation for both RBC and WBC measurements from whole blood. For the first time, on-chip sample processing (e.g. dilution, lysis, and filtration) and downstream single cell measurement were fully integrated to enable sample preparation and single cell analysis from whole blood on a single device. The device consists of two parallel sub-systems that perform sample processing and electrical measurements for measuring RBC and WBC parameters. The system provides a modular environment capable of handling solutions of various viscosities by adjusting the length of channels and precisely controlling mixing ratios, and features a new 'offset' filter configuration for increased duration of device operation. RBC concentration, mean corpuscular volume (MCV), cell distribution width, WBC concentration and differential are determined by electrical impedance measurement. Experimental characterization of over 100,000 cells from 10 patient blood samples validated the system's capability for performing on-chip raw blood processing and measurement.

  7. Neuro-genetic system for optimization of GMI samples sensitivity.

    Science.gov (United States)

    Pitta Botelho, A C O; Vellasco, M M B R; Hall Barbosa, C R; Costa Silva, E

    2016-03-01

Magnetic sensors are widely used in several engineering areas. Among them, magnetic sensors based on the Giant Magnetoimpedance (GMI) effect are a new family of magnetic sensing devices with huge potential for applications involving measurements of ultra-weak magnetic fields. The sensitivity of magnetometers is directly associated with the sensitivity of their sensing elements. The GMI effect is characterized by a large variation of the impedance (magnitude and phase) of a ferromagnetic sample when subjected to a magnetic field. Recent studies have shown that phase-based GMI magnetometers have the potential to increase the sensitivity by about 100 times. The sensitivity of GMI samples depends on several parameters, such as sample length, external magnetic field, DC level, and frequency of the excitation current. However, this dependency is yet to be sufficiently well modeled in quantitative terms, so the search for the set of parameters that optimizes a sample's sensitivity is usually empirical and very time consuming. This paper deals with this problem by proposing a new neuro-genetic system aimed at maximizing the impedance phase sensitivity of GMI samples. A Multi-Layer Perceptron (MLP) Neural Network is used to model the impedance phase, and a Genetic Algorithm uses the information provided by the neural network to determine which set of parameters maximizes the impedance phase sensitivity. The results obtained with a data set composed of four different GMI sample lengths demonstrate that the neuro-genetic system is able to correctly and automatically determine the set of conditioning parameters responsible for maximizing their phase sensitivities. Copyright © 2015 Elsevier Ltd. All rights reserved.
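The optimization loop described above — a trained surrogate model queried by a genetic algorithm — can be sketched as follows. The MLP is replaced here by a stand-in analytic "sensitivity" function, and the parameter names, bounds, and GA settings are illustrative, not the paper's:

```python
import random

def ga_maximize(fitness, bounds, pop_size=30, generations=60, seed=1):
    """Minimal real-coded genetic algorithm: tournament selection of
    two parents, blend (averaging) crossover, clamped Gaussian
    mutation; returns the fittest individual of the final population."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)                  # tournament of 2
            p1 = a if fitness(a) > fitness(b) else b
            a, b = rng.sample(pop, 2)
            p2 = a if fitness(a) > fitness(b) else b
            child = [(x + y) / 2 for x, y in zip(p1, p2)]  # blend crossover
            for j, (lo, hi) in enumerate(bounds):          # Gaussian mutation
                child[j] = min(hi, max(lo, child[j] + rng.gauss(0, 0.05 * (hi - lo))))
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Stand-in "sensitivity" surface peaking at length = 5 cm, freq = 100 kHz;
# in the paper's system this would be the MLP model of the phase sensitivity
def sensitivity(ind):
    length, freq = ind
    return -((length - 5.0) ** 2) - ((freq - 100.0) / 50.0) ** 2

best = ga_maximize(sensitivity, bounds=[(1.0, 15.0), (10.0, 1000.0)])
```

The GA only ever calls `fitness`, so swapping the stand-in for a trained surrogate model is a one-line change.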

  8. Development and evaluation of a multiple-plate fraction collector for sample processing: application to radioprofiling in drug metabolism studies.

    Science.gov (United States)

    Barros, Anthony; Ly, Van T; Chando, Theodore J; Ruan, Qian; Donenfeld, Scott L; Holub, David P; Christopher, Lisa J

    2011-04-05

    Microplate scintillation counters are utilized routinely in drug metabolism laboratories for the off-line radioanalysis of fractions collected during HPLC radioprofiling. In this process, the current fraction collection technology is limited by the number of plates that can be used per injection as well as the potential for sample loss due to dripping or spraying as the fraction collector head moves from well to well or between plates. More importantly, sample throughput is limited in the conventional process, since the collection plates must be manually exchanged after each injection. The Collect PAL, an innovative multiple-plate fraction collector, was developed to address these deficiencies and improve overall sample throughput. It employs a zero-loss design and has sub-ambient temperature control. Operation of the system is completely controlled with software and up to 24 (96- or 384-well) fraction collection plates can be loaded in a completely automated run. The system may also be configured for collection into various-sized tubes or vials. At flow rates of 0.5 or 1.0 mL/min and at collection times of 10 or 15s, the system precisely delivered 83-μL fractions (within 4.1% CV) and 250-μL fractions (within 1.4% CV), respectively, of three different mobile phases into 12 mm × 32 mm vials. Similarly, at a flow rate of 1 mL/min and 10s collection times, the system precisely dispensed mobile phase containing a [(14)C]-radiolabeled compound across an entire 96-well plate (% CV was within 5.3%). Triplicate analyses of metabolism test samples containing [(14)C]buspirone and its metabolites, derived from three different matrices (plasma, urine and bile), indicated that the Collect PAL produced radioprofiles that were reproducible and comparable to the current technology; the % CV for 9 selected peaks in the radioprofiles generated with the Collect PAL were within 9.3%. Radioprofiles generated by collecting into 96- and 384-well plates were qualitatively comparable
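The precision figures quoted above are percent coefficients of variation (%CV = 100 · SD / mean) of the dispensed fraction volumes. A minimal sketch with hypothetical volumes mirroring the 250-μL, 15-s collection case:

```python
from statistics import mean, stdev

def percent_cv(volumes):
    """Percent coefficient of variation (%CV = 100 * SD / mean), the
    precision metric reported for dispensed fraction volumes."""
    return 100.0 * stdev(volumes) / mean(volumes)

# Hypothetical 250-uL fractions dispensed at 1 mL/min, 15 s each
fractions = [249.1, 251.2, 250.4, 248.8, 250.9, 249.6]
cv = percent_cv(fractions)  # well under the reported 1.4% CV bound
```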

  9. Immunosuppressant therapeutic drug monitoring by LC-MS/MS: workflow optimization through automated processing of whole blood samples.

    Science.gov (United States)

    Marinova, Mariela; Artusi, Carlo; Brugnolo, Laura; Antonelli, Giorgia; Zaninotto, Martina; Plebani, Mario

    2013-11-01

    Although, due to its high specificity and sensitivity, LC-MS/MS is an efficient technique for the routine determination of immunosuppressants in whole blood, it involves time-consuming manual sample preparation. The aim of the present study was therefore to develop an automated sample-preparation protocol for the quantification of sirolimus, everolimus and tacrolimus by LC-MS/MS using a liquid handling platform. Six-level commercially available blood calibrators were used for assay development, while four quality control materials and three blood samples from patients under immunosuppressant treatment were employed for the evaluation of imprecision. Barcode reading, sample re-suspension, transfer of whole blood samples into 96-well plates, addition of internal standard solution, mixing, and protein precipitation were performed with a liquid handling platform. After plate filtration, the deproteinised supernatants were submitted for SPE on-line. The only manual steps in the entire process were de-capping of the tubes, and transfer of the well plates to the HPLC autosampler. Calibration curves were linear throughout the selected ranges. The imprecision and accuracy data for all analytes were highly satisfactory. The agreement between the results obtained with manual and those obtained with automated sample preparation was optimal (n=390, r=0.96). In daily routine (100 patient samples) the typical overall total turnaround time was less than 6h. Our findings indicate that the proposed analytical system is suitable for routine analysis, since it is straightforward and precise. Furthermore, it incurs less manual workload and less risk of error in the quantification of whole blood immunosuppressant concentrations than conventional methods. © 2013.
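Quantification against six-level calibrators, as described, rests on a linear calibration curve: fit the detector response against nominal concentration, then back-calculate unknowns from the fitted line. A sketch with hypothetical tacrolimus-like values (the concentrations and responses are invented for illustration):

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a*x + b, as used for a
    calibration curve of analyte/IS peak-area ratio vs. concentration."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    b = my - a * mx
    return a, b

# Hypothetical six-level calibrators (ng/mL) and peak-area ratios
conc = [1, 2, 5, 10, 20, 30]
resp = [0.021, 0.040, 0.101, 0.199, 0.402, 0.598]
a, b = fit_line(conc, resp)
unknown = (0.25 - b) / a   # back-calculate a patient sample's concentration
```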

  10. Laboratory evaluation of a gasifier particle sampling system using model compounds of different particle morphology

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, Patrik T.; Malik, Azhar; Pagels, Joakim; Lindskog, Magnus; Rissler, Jenny; Gudmundsson, Anders; Bohgard, Mats; Sanati, Mehri [Lund University, Division of Ergonomics and Aerosol Technology, P.O. Box 118, Lund (Sweden)

    2011-07-15

The objective of this work was to design and evaluate an experimental setup for field studies of particle formation in biomass gasification processes. The setup includes a high-temperature dilution probe and a denuder to separate solid particles from condensable volatile material. The efficiency of the setup in removing volatile material from the sampled stream, and the influence of condensation on particles with different morphologies, are presented. To study the sampling setup, model aerosols were created with a nebulizer, producing compact, solid KCl particles, and with a diffusion flame burner, producing agglomerated, irregular soot particles. The nebulizer and soot generator were followed by an evaporation-condensation section where volatile material, dioctyl sebacate (DOS), was added to the system as a tar model compound. The model aerosol particles were heated to 200 °C to create a system containing both solid particles and volatile organic material in the gas phase. The heated aerosol particles were sampled and diluted at the same temperature with the dilution probe. Downstream of the probe, the DOS was adsorbed in the denuder. This was achieved by slowly decreasing the temperature of the diluted sample towards ambient level in the denuder, thereby reducing the supersaturation of organic vapors, which decreased the probability of tar condensation and nucleation of new particles. Both the generation system and the sampling technique gave reproducible results. A DOS collection efficiency of >99% was achieved if the denuder inlet concentration was diluted to less than 1-6 mg/m³, depending on the denuder flow rate. Higher concentrations led to a significant impact on the resulting KCl size distribution. The model compounds were chosen to study the effect of particle morphology on the particle characteristics obtained after the sampling setup. When similar amounts of volatile material condensed on soot agglomerates and
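The dilution criterion above (denuder inlet below roughly 1-6 mg/m³ for >99% DOS collection) translates into a minimum dilution factor for a given raw-gas tar load. A trivial sketch; the raw concentration value is invented for illustration:

```python
def required_dilution(raw_mg_m3, target_mg_m3):
    """Minimum dilution factor so the denuder inlet tar (DOS) load
    drops to the concentration window where collection stays >99%."""
    if raw_mg_m3 <= target_mg_m3:
        return 1.0          # already within the window, no dilution needed
    return raw_mg_m3 / target_mg_m3

# e.g. 120 mg/m3 of condensable organics diluted to the 6 mg/m3 upper bound
df = required_dilution(120.0, 6.0)
```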

  11. System Theory and Physiological Processes.

    Science.gov (United States)

    Jones, R W

    1963-05-03

    Engineers and physiologists working together in experimental and theoretical studies predict that the application of system analysis to biological processes will increase understanding of these processes and broaden the base of system theory. Richard W. Jones, professor of electrical engineering at Northwestern University, Evanston, Illinois, and John S. Gray, professor of physiology at Northwestern's Medical School, discuss these developments. Their articles are adapted from addresses delivered in Chicago in November 1962 at the 15th Annual Conference on Engineering in Medicine and Biology.

  12. A multitransputer parallel processing system (MTPPS)

    International Nuclear Information System (INIS)

    Jethra, A.K.; Pande, S.S.; Borkar, S.P.; Khare, A.N.; Ghodgaonkar, M.D.; Bairi, B.R.

    1993-01-01

This report describes the design and implementation of a 16-node Multi Transputer Parallel Processing System (MTPPS), a platform for parallel program development. It is a MIMD machine based on the message-passing paradigm. The basic compute engine is an Inmos transputer, the IMS T800-20. A transputer with local memory constitutes the processing element (NODE) of this MIMD architecture. Multiple NODEs can be connected to each other in an identifiable network topology through the high-speed serial links of the transputer. A Network Configuration Unit (NCU) incorporates the necessary hardware to provide software-controlled network configuration. The system is modularly expandable, and more NODEs can be added to achieve the required processing power. The system is a back end to an IBM PC, which has been integrated into the system to provide the user I/O interface; PC resources are available to the programmer. The interface hardware between the PC and the network of transputers is Inmos compatible, so all commercially available development software compatible with Inmos products can run on this system. While giving the details of design and implementation, this report briefly summarises MIMD architectures, transputer architecture, and parallel processing software development issues. LINPACK performance evaluation of the system and solutions of neutron physics and plasma physics problems are discussed along with results. (author). 12 refs., 22 figs., 3 tabs., 3 appendixes

  13. Sample Management System for Heavy Ion Irradiation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — A robotic sample management device and system for the exposure of biological and material specimens to heavy ion beams of the NASA Space Radiation Laboratory (NSRL)...

  14. Sample Management System for Heavy Ion Irradiation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — A robotic sample management device and system for the exposure of biological and material specimens to heavy ion beams of the NASA Space Radiation Laboratory (NSRL)...

  15. Endogenous System Microbes as Treatment Process ...

    Science.gov (United States)

    Monitoring the efficacy of treatment strategies to remove pathogens in decentralized systems remains a challenge. Evaluating log reduction targets by measuring pathogen levels is hampered by their sporadic and low occurrence rates. Fecal indicator bacteria are used in centralized systems to indicate the presence of fecal pathogens, but are ineffective decentralized treatment process indicators as they generally occur at levels too low to assess log reduction targets. System challenge testing by spiking with high loads of fecal indicator organisms, like MS2 coliphage, has limitations, especially for large systems. Microbes that are endogenous to the decentralized system, occur in high abundances and mimic removal rates of bacterial, viral and/or parasitic protozoan pathogens during treatment could serve as alternative treatment process indicators to verify log reduction targets. To identify abundant microbes in wastewater, the bacterial and viral communities were examined using deep sequencing. Building infrastructure-associated bacteria, like Zoogloea, were observed as dominant members of the bacterial community in graywater. In blackwater, bacteriophage of the order Caudovirales constituted the majority of contiguous sequences from the viral community. This study identifies candidate treatment process indicators in decentralized systems that could be used to verify log removal during treatment. The association of the presence of treatment process indic

  16. The installation of a multiport ground-water sampling system in the 300 Area

    International Nuclear Information System (INIS)

    Gilmore, T.J.

    1989-06-01

    In 1988, the Pacific Northwest Laboratory installed a multiport groundwater sampling system in well 399-1-20, drilled north of the 300 Area on the Hanford Site in southwestern Washington State. The purpose of installing the multiport system is to evaluate methods of determining the vertical distribution of contaminants and hydraulic heads in ground water. Well 399-1-20 is adjacent to a cluster of four Resource Conservation and Recovery Act (RCRA) ground-water monitoring wells. This proximity makes it possible to compare sampling intervals and head measurements between the multiport system and the RCRA monitoring wells. Drilling and installation of the multiport system took 42 working days. Six sampling ports were installed in the upper unconfined aquifer at depths of approximately 120, 103, 86, 74, 56, and 44 feet. The locations of the sampling ports were determined by the hydrogeology of the area and the screened intervals of adjacent ground-water monitoring wells. The system was installed by backfilling sand around the sampling ports and isolating the ports with bentonite seals. The method proved adequate. For future installation, however, development and evaluation of an alternative method is recommended. In the alternative method suggested, the multiport system would be placed inside a cased and screened well, using packers to isolate the sampling zones. 4 refs., 8 figs., 1 tab

  17. L. monocytogenes in a cheese processing facility: Learning from contamination scenarios over three years of sampling.

    Science.gov (United States)

    Rückerl, I; Muhterem-Uyar, M; Muri-Klinger, S; Wagner, K-H; Wagner, M; Stessl, B

    2014-10-17

    The aim of this study was to analyze the changing patterns of Listeria monocytogenes contamination in a cheese processing facility manufacturing a wide range of ready-to-eat products. Characterization of L. monocytogenes isolates included genotyping by pulsed-field gel electrophoresis (PFGE) and multi-locus sequence typing (MLST). Disinfectant-susceptibility tests and the assessment of L. monocytogenes survival in fresh cheese were also conducted. During the sampling period between 2010 and 2013, a total of 1284 environmental samples were investigated. Overall occurrence rates of Listeria spp. and L. monocytogenes were 21.9% and 19.5%, respectively. Identical L. monocytogenes genotypes were found in the food processing environment (FPE), raw materials and products. Interventions after the sampling events changed contamination scenarios substantially. The high diversity of globally distributed L. monocytogenes genotypes was reduced by identifying the major sources of contamination. Although susceptible to a broad range of disinfectants and cleaners, one dominant L. monocytogenes sequence type (ST) 5 could not be eradicated from drains and floors. Notably, intense humidity and steam could be observed in all rooms, and water residues were visible on floors due to intensified cleaning strategies. This could explain the high L. monocytogenes contamination of the FPE (drains, shoes and floors) throughout the study (15.8%). The outcome of a challenge experiment in fresh cheese showed that L. monocytogenes could survive after 14 days of storage at insufficient cooling temperatures (8 and 16°C). All efforts to reduce L. monocytogenes environmental contamination eventually led to a transition from dynamic to stable contamination scenarios. Consequently, implementation of systematic environmental monitoring via in-house systems should either aim for total avoidance of FPE colonization, or emphasize a first reduction of L. monocytogenes to sites where

  18. Development of an automatic sample changer and a data acquisition system

    International Nuclear Information System (INIS)

    Bianchini, Ricardo M.; Estevez, Jorge; Vollmer, Alberto E.; Iglicki, Flora A.

    1999-01-01

    An automatic electro-pneumatic sample changer with a rotating sample holder is described. The changer is coupled through an electronic interface with the data acquisition station. The software to automate the system has been designed. (author)
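The advance-then-measure loop such a changer automates can be sketched in a few lines. Everything below (class names, the counting model, the number of positions) is invented for illustration; the real system is electro-pneumatic hardware driven through an electronic interface:

```python
import random

class SampleChanger:
    """Minimal model of a rotating sample holder indexed 0..n-1."""
    def __init__(self, n_positions):
        self.n = n_positions
        self.position = 0

    def advance(self):
        # one electro-pneumatic step of the rotating holder
        self.position = (self.position + 1) % self.n

def acquire(position):
    # stand-in for the data-acquisition station: a toy counting measurement,
    # seeded by position so the sketch is deterministic
    random.seed(position)
    return sum(random.random() < 0.5 for _ in range(100))

def run_batch(n_samples):
    """Measure every position once: acquire, record, advance, repeat."""
    changer = SampleChanger(n_samples)
    results = {}
    for _ in range(n_samples):
        results[changer.position] = acquire(changer.position)
        changer.advance()
    return results

counts = run_batch(8)
print(f"measured {len(counts)} samples at positions {sorted(counts)}")
```

The point of the sketch is the decoupling: the changer only knows how to index positions, while the acquisition station only knows how to measure whatever is in front of it.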

  19. Control measurement system in purex process

    International Nuclear Information System (INIS)

    Mani, V.V.S.

    1985-01-01

    The dependence of a bulk facility handling the Purex process on its control measurement system for evaluating process performance hardly needs to be emphasized. Process control, plant control, inventory control and quality control are the four components of the control measurement system. The scope and requirements of each component are different, and the measurement methods are selected accordingly. However, each measurement system has six important elements, which are described in detail. The quality assurance programme carried out by the laboratory, as a mechanism through which the quality of measurements is regularly tested and stated in quantitative terms, is also explained in terms of internal and external quality assurance, with examples. Suggestions for making the control measurement system more responsive to operational needs in future are also briefly discussed. (author)

  20. Expert systems and optimisation in process control

    Energy Technology Data Exchange (ETDEWEB)

    Mamdani, A.; Efstathiou, J. (eds.)

    1986-01-01

    This report brings together recent developments both in expert systems and in optimisation, and deals with current applications in industry. Part One is concerned with Artificial Intelligence in planning and scheduling and with rule-based control implementation. The tasks of control maintenance, rescheduling and planning are each discussed in relation to new theoretical developments, techniques available, and sample applications. Part Two covers model based control techniques in which the control decisions are used in a computer model of the process. Fault diagnosis, maintenance and trouble-shooting are just some of the activities covered. Part Three contains case studies of projects currently in progress, giving details of the software available and the likely future trends. One of these, on qualitative plant modelling as a basis for knowledge-based operator aids in nuclear power stations is indexed separately.

  1. Expert systems and optimisation in process control

    International Nuclear Information System (INIS)

    Mamdani, A.; Efstathiou, J.

    1986-01-01

    This report brings together recent developments both in expert systems and in optimisation, and deals with current applications in industry. Part One is concerned with Artificial Intelligence in planning and scheduling and with rule-based control implementation. The tasks of control maintenance, rescheduling and planning are each discussed in relation to new theoretical developments, techniques available, and sample applications. Part Two covers model based control techniques in which the control decisions are used in a computer model of the process. Fault diagnosis, maintenance and trouble-shooting are just some of the activities covered. Part Three contains case studies of projects currently in progress, giving details of the software available and the likely future trends. One of these, on qualitative plant modelling as a basis for knowledge-based operator aids in nuclear power stations is indexed separately. (author)

  2. Method for Business Process Management System Selection

    NARCIS (Netherlands)

    Thijs van de Westelaken; Bas Terwee; Pascal Ravesteijn

    2013-01-01

    In recent years business process management (BPM) and specifically information systems that support the analysis, design and execution of processes (also called business process management systems (BPMS)) are getting more attention. This has led to an increase in research on BPM and BPMS. However

  3. Signals, processes, and systems an interactive multimedia introduction to signal processing

    CERN Document Server

    Karrenberg, Ulrich

    2013-01-01

    This is a very new concept for learning signal processing, not only from the physically based scientific fundamentals, but also from a didactic perspective based on modern results of brain research. The textbook together with the DVD form a learning system that provides investigative studies and enables the reader to interactively visualize even complex processes. The unique didactic concept is built on visualizing signals and processes on the one hand, and on graphical programming of signal processing systems on the other. The concept has been designed especially for microelectronics, computer technology and communication. The book allows the reader to develop, modify, and optimize useful applications using DasyLab - a professional and globally supported software package for metrology and control engineering. With the 3rd edition, the software is also suitable for 64-bit systems running on Windows 7. Real signals can be acquired, processed and played on the sound card of your computer. The book provides more than 200 pre-pr...

  4. Systems integration processes for space nuclear electric propulsion systems

    International Nuclear Information System (INIS)

    Olsen, C.S.; Rice, J.W.; Stanley, M.L.

    1991-01-01

    The various components and subsystems that comprise a nuclear electric propulsion system should be developed and integrated so that each functions ideally and so that each is properly integrated with the other components and subsystems in the optimum way. This paper discusses how processes similar to those used in the development and integration of the subsystems that comprise the Multimegawatt Space Nuclear Power System concepts can be and are being efficiently and effectively utilized for these purposes. The processes discussed include the development of functional and operational requirements at the system and subsystem level; the assessment of individual nuclear power supply and thruster concepts and their associated technologies; the conduct of systems integration efforts including the evaluation of the mission benefits for each system; the identification and resolution of concepts development, technology development, and systems integration feasibility issues; subsystem, system, and technology development and integration; and ground and flight subsystem and integrated system testing

  5. Materials processing issues for non-destructive laser gas sampling (NDLGS)

    Energy Technology Data Exchange (ETDEWEB)

    Lienert, Thomas J [Los Alamos National Laboratory

    2010-12-09

    The Non-Destructive Laser Gas Sampling (NDLGS) process essentially involves three steps: (1) laser drilling through the top of a crimped tube made of 304L stainless steel (Hammar and Svensson Cr-eq/Ni-eq = 1.55, produced in 1985); (2) gas sampling; and (3) laser re-welding of the crimp. All three steps are performed in a sealed chamber with a fused silica window under controlled vacuum conditions. Quality requirements for successful processing call for a hermetic re-weld with no cracks or other defects in the fusion zone or HAZ. It has been well established that austenitic stainless steels (γ-SS), such as 304L, can suffer from solidification cracking if their Cr-eq/Ni-eq is below a critical value that causes solidification to occur as austenite (fcc structure) and their combined impurity level (%P + %S) is above ~0.02%. Conversely, for Cr-eq/Ni-eq values above the critical level, solidification occurs as ferrite (bcc structure), and cracking propensity is greatly reduced at all combined impurity levels. The consensus of results from studies by several researchers starting in the late 1970s indicates that the critical Cr-eq/Ni-eq value is ~1.5 for arc welds. However, more recent studies by the author and others show that the critical Cr-eq/Ni-eq value increases to ~1.6 for weld processes with very rapid thermal cycles, such as the pulsed Nd:YAG laser beam welding (LBW) process used here. Initial attempts at NDLGS using pulsed LBW resulted in considerable solidification cracking, consistent with the results of the work discussed above. After a brief introduction to the welding metallurgy of γ-SS, this presentation reviews the results of a study aimed at developing a production-ready process that eliminates cracking. The solution to the cracking issue, developed at LANL, involved locally augmenting the Cr content by applying either Cr or a Cr-rich stainless steel (ER 312) to the top
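The cracking criterion the abstract describes can be sketched numerically. The equivalent formulas below are the commonly cited Hammar-Svensson (1979) forms, and the composition is invented to land near the Cr-eq/Ni-eq = 1.55 quoted above; neither is data from the LANL study:

```python
# Sketch: Hammar-Svensson chromium/nickel equivalents and a solidification-
# cracking check against the thresholds quoted in the abstract (~1.5 for arc
# welds, ~1.6 for pulsed laser welds). Illustrative only.

def cr_eq(comp):
    # Hammar-Svensson chromium equivalent (wt.%), commonly cited coefficients
    return (comp.get("Cr", 0) + 1.37 * comp.get("Mo", 0) + 1.5 * comp.get("Si", 0)
            + 2.0 * comp.get("Nb", 0) + 3.0 * comp.get("Ti", 0))

def ni_eq(comp):
    # Hammar-Svensson nickel equivalent (wt.%)
    return (comp.get("Ni", 0) + 0.31 * comp.get("Mn", 0) + 22.0 * comp.get("C", 0)
            + 14.2 * comp.get("N", 0) + comp.get("Cu", 0))

def cracking_risk(comp, impurities, critical_ratio):
    """Susceptible if the ratio falls below the critical value (austenitic
    solidification) AND combined %P + %S exceeds ~0.02%."""
    ratio = cr_eq(comp) / ni_eq(comp)
    return (ratio < critical_ratio and impurities > 0.02), ratio

# Hypothetical 304L heat (wt.%), tuned to give Cr_eq/Ni_eq near 1.55
comp_304l = {"Cr": 18.0, "Mo": 0.2, "Si": 0.4, "Ni": 10.0, "Mn": 1.8,
             "C": 0.025, "N": 0.06, "Cu": 0.25}
risk_arc, ratio = cracking_risk(comp_304l, impurities=0.03, critical_ratio=1.5)
risk_lbw, _ = cracking_risk(comp_304l, impurities=0.03, critical_ratio=1.6)
print(f"Cr_eq/Ni_eq = {ratio:.2f}, arc-weld risk: {risk_arc}, pulsed-LBW risk: {risk_lbw}")
```

For this hypothetical heat the ratio sits between the two thresholds, reproducing the situation the abstract describes: safe under arc welding, crack-prone under rapid laser thermal cycles.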

  6. Development of systems of analysis in industrial processes with XRF. A technology transfer alternative

    International Nuclear Information System (INIS)

    Galvez, Juan; Poblete, Victor

    1999-01-01

    The FRX Laboratory's experience in developing a unit of analysis by excitation with a radioisotope source is described, with a discussion of its advantages, limitations, the types of existing units on the market, uses, values and the state of the art. The evolution of mining and metallurgical processes has led to the development of new technologies that provide quick and precise control and analysis operations, avoiding loss of raw material and chemical reagents and waste of materials and time, to obtain a better quality and purer product. The system developed by the FRX Laboratory is relatively low cost compared to other equipment available on the market, and it focuses on single-element analysis in hydrometallurgical processes. The system uses a NaI(Tl) detector with a beryllium window, related electronics, a monitor and a printer, and it controls the operation automatically using an appropriate program for taking samples, measuring, analysis, printing results, changing samples, etc. The sampling is continuous, so samples do not have to be taken or prepared chemically. This system can be extrapolated to other, more complex ones, using new kinds of detectors with higher resolution, more modern electronics, and new multichannel cards. The development of this kind of equipment in Chile means that dependence on foreign technology can be avoided by replacing expensive imported equipment, creating our own technology and transferring it to the domestic market, and even generating income by exporting these units and opening new development prospects (au)

  7. A state-of-the-art mass spectrometer system for determination of uranium and plutonium isotopic distributions in process samples

    International Nuclear Information System (INIS)

    Polson, C.A.

    1984-01-01

    A Finnigan MAT 261 automated thermal ionization mass spectrometer system was purchased by the Savannah River Plant and recently installed by Finnigan factory representatives. This instrument is a refinement of the MAT 260, which has been used routinely for three years in the laboratory at SRP. The MAT 261 is a highly precise, fully automated instrument, and many features make it state-of-the-art for precision isotopic composition measurements. A unique feature of the MAT 261 is the ion detection system, which permits measurement of the three uranium or plutonium masses simultaneously. All Faraday cup measuring channels are of the same design, and each is equipped with a dedicated amplifier. Each amplifier is connected to a linear voltage/frequency measuring system for ion current integration. These outputs are fed into a Hewlett-Packard 9845T desktop computer. The computer, and the Finnigan-developed software package, control filament heating cycles, sample reconditioning, ion beam focusing, carrousel rotation, mass selection, and data collection and reduction. Precision, accuracy, and linearity were determined under normal laboratory conditions using an NBS suite of uranium standards. These results, along with other developments in setting up the instrument, are presented
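The simultaneous multicollector scheme can be illustrated with a short sketch: each Faraday cup's voltage-to-frequency output is integrated over a common gate, and ratios of the accumulated counts cancel beam-intensity drift. The gate time, frequencies, and reference mass below are invented for illustration, not SRP parameters:

```python
# Sketch of isotope-ratio computation from three simultaneous Faraday-cup
# channels. Each V/F converter turns an ion current into a pulse rate;
# integrating all channels over the same gate gives ratios that are
# insensitive to common beam drift. Numbers are illustrative.

GATE_TIME = 4.0  # seconds per integration cycle (assumed)

def integrate(channel_hz, gate=GATE_TIME):
    """Counts accumulated in one gate for a given V/F output frequency."""
    return channel_hz * gate

def isotope_ratios(freqs_hz):
    """freqs_hz maps mass number -> V/F frequency; ratios are relative to 238."""
    counts = {mass: integrate(f) for mass, f in freqs_hz.items()}
    ref = counts[238]
    return {mass: c / ref for mass, c in counts.items()}

# Hypothetical natural-uranium-like beam (Hz per channel)
ratios = isotope_ratios({234: 54.0, 235: 7_200.0, 238: 992_800.0})
print(f"235/238 atom ratio estimate: {ratios[235]:.5f}")
```

Because all three channels share the same gate, the gate time cancels out of every ratio; only relative amplifier gains (calibrated against standards such as the NBS suite) remain as systematic factors.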

  8. RAPID PROCESSING OF ARCHIVAL TISSUE SAMPLES FOR PROTEOMIC ANALYSIS USING PRESSURE-CYCLING TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Vinuth N. Puttamallesh

    2017-06-01

    The advent of mass spectrometry-based proteomics has revolutionized our ability to study proteins from biological specimens in a high-throughput manner. Unlike cell line-based studies, biomedical research involving tissue specimens is often challenging due to limited sample availability. In addition, investigation of clinically relevant research questions often requires an enormous amount of time for prospective sample collection. Formalin-fixed, paraffin-embedded (FFPE) archived tissue samples are a rich source of tissue specimens for biomedical research. However, there are several challenges associated with analysing FFPE samples; protein cross-linking and degradation particularly affect proteomic analysis. We demonstrate that a barocycler that uses pressure-cycling technology enables efficient protein extraction and processing of small amounts of FFPE tissue samples for proteomic analysis. We identified 3,525 proteins from six 10 µm esophageal squamous cell carcinoma (ESCC) tissue sections. The barocycler allows efficient protein extraction and proteolytic digestion of proteins from FFPE tissue sections on par with conventional methods.

  9. Wideband 4-diode sampling circuit

    Science.gov (United States)

    Wojtulewicz, Andrzej; Radtke, Maciej

    2016-09-01

    The objective of this work was to develop a wide-band sampling circuit. The device should have the ability to collect samples of a very fast signal applied to its input, amplify them, and prepare them for further processing. The study emphasizes the method of sampling pulse shaping. The use of an ultrafast pulse generator allows sampling of signals with a wide frequency spectrum, reaching several gigahertz. The device uses a pulse transformer to prepare symmetrical pulses. Their final shape is formed with the help of a step-recovery diode, two coplanar strips and a Schottky diode. The resulting device can be used in a sampling oscilloscope, as well as in other measurement systems.
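Sampling oscilloscopes built around circuits like this typically rely on sequential equivalent-time sampling: a repetitive signal far above the strobe rate is reconstructed by delaying each narrow sampling pulse slightly more on every repetition. A minimal numeric sketch of that principle, with invented signal and strobe parameters:

```python
# Sketch of sequential equivalent-time sampling: a 5 GHz repetitive signal
# is reconstructed with a 10 MHz strobe whose period is an exact multiple of
# the signal period plus a small increment. Parameters are illustrative.
import math

f_sig = 5e9            # repetitive input frequency (5 GHz)
T_sig = 1 / f_sig
delta = T_sig / 100    # extra delay per strobe: 100 points per signal period
f_strobe = 10e6        # strobe rate (10 MHz)
T_strobe = 1 / f_strobe  # exactly 500 signal periods, so only delta advances phase

def signal(t):
    return math.sin(2 * math.pi * f_sig * t)

# the n-th strobe fires delta later (in equivalent time) than the previous one
samples = [signal(n * (T_strobe + delta)) for n in range(100)]

# the 100 slow-rate samples trace one full period of the fast signal
print(f"reconstructed peak amplitude: {max(samples):.3f}")
```

The effective time resolution is set by `delta` (here 2 ps) and the width of the shaped sampling pulse, not by the strobe rate, which is why a 10 MHz strobe can resolve a multi-gigahertz waveform.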

  10. The outlier sample effects on multivariate statistical data processing geochemical stream sediment survey (Moghangegh region, North West of Iran)

    International Nuclear Information System (INIS)

    Ghanbari, Y.; Habibnia, A.; Memar, A.

    2009-01-01

    In a geochemical stream sediment survey of the Moghangegh Region in northwest Iran (1:50,000 sheet), 152 samples were collected, and analysis and processing of the data revealed that the Yb, Sc, Ni, Li, Eu, Cd, Co, and As contents in one sample were far higher than in the other samples. After identifying this sample as an outlier, its destructive effect on multivariate statistical data processing in geochemical exploration was investigated. Pearson and Spearman correlation coefficient methods and cluster analysis were used for the multivariate studies, and scatter plots of some elements, together with regression profiles, are given for the cases of 152 and 151 samples, with the results compared. The investigation showed that the existence of an outlier sample may produce the following relations between elements: a true relation between two elements, neither of which has an outlier frequency in the outlier sample; a false relation between two elements, one of which has an outlier frequency in the outlier sample; and a completely false relation between two elements, both of which have outlier frequencies in the outlier sample
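The destructive effect described above is easy to reproduce with synthetic data (not the Moghangegh measurements): a single extreme sample manufactures a strong Pearson correlation between two otherwise unrelated elements, while the rank-based Spearman coefficient is barely moved:

```python
# Sketch: one outlier sample creates a false Pearson correlation between two
# uncorrelated "element" series; Spearman, computed on ranks, resists it.
# Data are synthetic; element names are only labels.
import random

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def ranks(v):
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(x, y):
    return pearson(ranks(x), ranks(y))

random.seed(1)
# 151 uncorrelated samples of two elements
ni = [random.gauss(50, 5) for _ in range(151)]
cd = [random.gauss(0.3, 0.05) for _ in range(151)]
# one outlier sample with extreme contents of both elements
ni_all, cd_all = ni + [400.0], cd + [9.0]

print(f"Pearson, 151 samples (no outlier): {pearson(ni, cd):+.2f}")
print(f"Pearson, 152 samples (outlier):    {pearson(ni_all, cd_all):+.2f}")
print(f"Spearman, 152 samples (outlier):   {spearman(ni_all, cd_all):+.2f}")
```

The outlier dominates both the covariance and the standard deviations in Pearson's formula, pulling the coefficient close to +1; in the rank domain it is just one more largest value, so Spearman stays near zero.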

  11. Automatic drawing and CAD actualization in processing data of radiant sampling in physics prospect

    International Nuclear Information System (INIS)

    Liu Jinsheng

    2010-01-01

    This paper discusses a method of processing radiant sampling data with a computer. With this method the curve of the radiant sampling data can be obtained, mineral masses can be combined, analysed and calculated, and the results recorded. The method has many merits: it is easy to learn, simple to use and highly efficient, and it adapts to all sorts of mines. (authors)

  13. Emerging halogenated flame retardants and hexabromocyclododecanes in food samples from an e-waste processing area in Vietnam.

    Science.gov (United States)

    Tao, Fang; Matsukami, Hidenori; Suzuki, Go; Tue, Nguyen Minh; Viet, Pham Hung; Takigami, Hidetaka; Harrad, Stuart

    2016-03-01

    This study reports concentrations of selected emerging halogenated flame retardants (HFRs) and hexabromocyclododecanes (HBCDs) in foodstuffs sourced from an e-waste processing area in Vietnam and two reference sites in Vietnam and Japan. Concentrations of all target HFRs in e-waste-impacted samples in this study significantly exceeded those in samples from the reference sites, suggesting that e-waste processing activities exert a substantial impact on local environmental contamination and human dietary exposure. Significant linear positive correlations were found between concentrations of syn-Dechlorane Plus (DP) and anti-DP in soils and those in co-located chicken samples, and concentrations at e-waste processing sites exceeded those reported for non-e-waste processing areas elsewhere.

  14. Cryogenic Liquid Sample Acquisition System for Remote Space Applications

    Science.gov (United States)

    Mahaffy, Paul; Trainer, Melissa; Wegel, Don; Hawk, Douglas; Melek, Tony; Johnson, Christopher; Amato, Michael; Galloway, John

    2013-01-01

    There is a need to autonomously acquire cryogenic hydrocarbon liquid samples from remote planetary locations, such as the lakes of Titan, for instruments such as mass spectrometers. Several problems had to be solved relative to collecting the right amount of cryogenic liquid sample into a warmer spacecraft, such as not allowing the sample to boil off or fractionate too early; controlling the intermediate and final pressures within carefully designed volumes; designing for various particulates and viscosities; designing to thermal-, mass-, and power-limited spacecraft interfaces; and reducing risk. Prior-art inlets for similar spaceflight instruments were designed primarily for atmospheric gas sampling and are not useful for this front-end application. These cryogenic liquid sample acquisition system designs for remote space applications allow for remote, autonomous, controlled sample collection of a range of challenging cryogenic sample types. The design can control the size of the sample, prevent fractionation, control pressures at various stages, and allow for various liquid sample levels. It is capable of collecting repeated samples autonomously in the difficult low-temperature conditions often found in planetary missions. It is capable of collecting samples for use by instruments from difficult sample types, such as the cryogenic hydrocarbon (methane, ethane, and propane) mixtures with solid particulates found on Titan. The design, with a warm actuated valve, is compatible with various spacecraft thermal and structural interfaces. The design uses controlled volumes, heaters, inlet and vent tubes, a cryogenic valve seat, inlet screens, temperature and cryogenic liquid sensors, seals, and vents to accomplish its task.

  15. Aligning Business Process Quality and Information System Quality

    OpenAIRE

    Heinrich, Robert

    2013-01-01

    Business processes and information systems mutually affect each other in non-trivial ways. Frequently, the business process design and the information system design are not well aligned. This means that business processes are designed without taking the information system impact into account, and vice versa. Missing alignment at design time often results in quality problems at runtime, such as large response times of information systems, large process execution times, overloaded information s...

  16. In Situ Visualization of the Phase Behavior of Oil Samples Under Refinery Process Conditions.

    Science.gov (United States)

    Laborde-Boutet, Cedric; McCaffrey, William C

    2017-02-21

    To help address production issues in refineries caused by the fouling of process units and lines, we have developed a setup as well as a method to visualize the behavior of petroleum samples under process conditions. The experimental setup relies on a custom-built micro-reactor fitted with a sapphire window at the bottom, which is placed over the objective of an inverted microscope equipped with a cross-polarizer module. Using reflection microscopy enables the visualization of opaque samples, such as petroleum vacuum residues, or asphaltenes. The combination of the sapphire window from the micro-reactor with the cross-polarizer module of the microscope on the light path allows high-contrast imaging of isotropic and anisotropic media. While observations are carried out, the micro-reactor can be heated to the temperature range of cracking reactions (up to 450 °C), can be subjected to H2 pressure relevant to hydroconversion reactions (up to 16 MPa), and can stir the sample by magnetic coupling. Observations are typically carried out by taking snapshots of the sample under cross-polarized light at regular time intervals. Image analyses may not only provide information on the temperature, pressure, and reactive conditions yielding phase separation, but may also give an estimate of the evolution of the chemical (absorption/reflection spectra) and physical (refractive index) properties of the sample before the onset of phase separation.

  17. The Process of Systemic Change

    Science.gov (United States)

    Duffy, Francis M.; Reigeluth, Charles M.; Solomon, Monica; Caine, Geoffrey; Carr-Chellman, Alison A.; Almeida, Luis; Frick, Theodore; Thompson, Kenneth; Koh, Joyce; Ryan, Christopher D.; DeMars, Shane

    2006-01-01

    This paper presents several brief papers about the process of systemic change. These are: (1) Step-Up-To-Excellence: A Protocol for Navigating Whole-System Change in School Districts by Francis M. Duffy; (2) The Guidance System for Transforming Education by Charles M. Reigeluth; (3) The Schlechty Center For Leadership In School Reform by Monica…

  18. Human-Systems Integration Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this project is to baseline a Human-Systems Integration Processes (HSIP) document as a companion to the NASA-STD-3001 and Human Integration Design...

  19. The Sample Handling System for the Mars Icebreaker Life Mission: from Dirt to Data

    Science.gov (United States)

    Dave, Arwen; Thompson, Sarah J.; McKay, Christopher P.; Stoker, Carol R.; Zacny, Kris; Paulsen, Gale; Mellerowicz, Bolek; Glass, Brian J.; Wilson, David; Bonaccorsi, Rosalba; hide

    2013-01-01

    The Mars Icebreaker Life mission will search for subsurface life on Mars. It consists of three payload elements: a drill to retrieve soil samples from approx. 1 meter below the surface, a robotic sample handling system to deliver the sample from the drill to the instruments, and the instruments themselves. This paper will discuss the robotic sample handling system.

  20. Oscillating systems with cointegrated phase processes

    DEFF Research Database (Denmark)

    Østergaard, Jacob; Rahbek, Anders; Ditlevsen, Susanne

    2017-01-01

    We present cointegration analysis as a method to infer the network structure of a linearly phase-coupled oscillating system. By defining a class of oscillating systems with interacting phases, we derive a data generating process where we can specify the coupling structure of a network that resembles biological processes. In particular we study a network of Winfree oscillators, for which we present a statistical analysis of various simulated networks, where we conclude on the coupling structure: the direction of feedback in the phase processes and proportional coupling strength between individual components of the system. We show that we can correctly classify the network structure for such a system by cointegration analysis, for various types of coupling, including uni-/bi-directional and all-to-all coupling. Finally, we analyze a set of EEG recordings and discuss the current
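The core idea, that coupled phases are individually non-stationary yet share a stationary linear combination, can be demonstrated with a toy simulation. The Euler scheme and all parameters below are illustrative assumptions, not the Winfree network analysed in the paper:

```python
# Sketch of cointegrated phase processes: two phases, each a drifting random
# walk (unit root), with unidirectional sinusoidal coupling. Their difference
# is mean-reverting, i.e. the phases are cointegrated with vector (1, -1).
import math, random

random.seed(0)
dt, omega, K, sigma = 0.01, 2 * math.pi, 4.0, 0.5  # illustrative parameters
theta1, theta2 = 0.0, 1.0
diffs = []
for _ in range(100_000):
    # oscillator 1 runs free; oscillator 2 is pulled toward oscillator 1
    d1 = omega * dt + sigma * math.sqrt(dt) * random.gauss(0, 1)
    d2 = (omega * dt + K * math.sin(theta1 - theta2) * dt
          + sigma * math.sqrt(dt) * random.gauss(0, 1))
    theta1 += d1
    theta2 += d2
    diffs.append(theta1 - theta2)

mean_diff = sum(diffs) / len(diffs)
var_diff = sum((d - mean_diff) ** 2 for d in diffs) / len(diffs)
print(f"phase difference: mean {mean_diff:+.2f}, variance {var_diff:.2f}")
```

Each phase drifts without bound (here by thousands of radians), while their difference stays bounded around a constant; detecting that bounded combination from data is exactly what a cointegration test formalizes.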

  1. Assessment of Processes of Change for Weight Management in a UK Sample

    Science.gov (United States)

    Andrés, Ana; Saldaña, Carmina; Beeken, Rebecca J.

    2015-01-01

    Objective The present study aimed to validate the English version of the Processes of Change questionnaire in weight management (P-Weight). Methods Participants were 1,087 UK adults, including people enrolled in a behavioural weight management programme, university students and an opportunistic sample. The mean age of the sample was 34.80 (SD = 13.56) years, and 83% were women. BMI ranged from 18.51 to 55.36 (mean = 25.92, SD = 6.26) kg/m2. Participants completed both the stages and processes questionnaires in weight management (S-Weight and P-Weight), and subscales from the EDI-2 and EAT-40. A refined version of the P-Weight consisting of 32 items was obtained based on the item analysis. Results The internal structure of the scale fitted a four-factor model, and statistically significant correlations with external measures supported the convergent validity of the scale. Conclusion The adequate psychometric properties of the P-Weight English version suggest that it could be a useful tool to tailor weight management interventions. PMID:25765163

  2. Chemical and Metallurgy Research (CMR) Sample Tracking System Design Document

    International Nuclear Information System (INIS)

    Bargelski, C. J.; Berrett, D. E.

    1998-01-01

    The purpose of this document is to describe the system architecture of the Chemical and Metallurgy Research (CMR) Sample Tracking System at Los Alamos National Laboratory. Along the way, observations are made concerning the objectives, constraints and limitations, technical approaches, and the technical deliverables

  3. Process analysis in a THTR trial reprocessing plant

    International Nuclear Information System (INIS)

    Brodda, B.G.; Filss, P.; Kirchner, H.; Kroth, K.; Lammertz, H.; Schaedlich, W.; Brocke, W.; Buerger, K.; Halling, H.; Watzlawik, K.H.

    1979-01-01

    The demands on an analytical control system for a THTR trial reprocessing plant are specified. In a rather detailed example, a typical sampling, sample monitoring and measuring process is described. Analytical control is partly automated. Data acquisition and evaluation by computer are described for some important, largely automated processes. Sample management and recording of in-line and off-line data are carried out by a data processing system. Some important experiments on sample taking, sample transport and on special analysis are described. (RB) [de

  4. Towards High Performance Processing In Modern Java Based Control Systems

    CERN Document Server

    Misiowiec, M; Buttner, M

    2011-01-01

    CERN controls software is often developed on a Java foundation. Some systems carry out a combination of data-, network- and processor-intensive tasks within strict time limits. Hence, there is a demand for high-performing, quasi-real-time solutions. Extensive prototyping of the new CERN monitoring and alarm software required us to address such expectations. The system must handle tens of thousands of data samples every second, along its three tiers, applying complex computations throughout. To accomplish the goal, a deep understanding of multithreading, memory management and interprocess communication was required. There are unexpected traps hidden behind an excessive use of 64-bit memory, or the severe impact of modern garbage collectors on the processing flow. Tuning the JVM configuration significantly affects the execution of the code. Even more important is the number of threads and the data structures used between them. Accurately dividing work into independent tasks might boost system performance. Thorough profili...
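The task-decomposition point above can be sketched compactly. This is a Python stand-in for the Java design the abstract describes, with invented chunk sizes and a trivial per-sample computation; the idea carries over directly to Java executors:

```python
# Sketch: splitting a stream of data samples into independent chunks lets a
# worker pool process them in parallel with no shared mutable state. The
# workload (sum of squares) and chunk size are illustrative.
from concurrent.futures import ProcessPoolExecutor

def process_chunk(chunk):
    # stand-in for a "complex computation" applied to each data sample
    return sum(x * x for x in chunk)

def chunked(seq, size):
    """Partition seq into consecutive chunks of at most `size` elements."""
    return [seq[i:i + size] for i in range(0, len(seq), size)]

if __name__ == "__main__":
    samples = list(range(100_000))
    chunks = chunked(samples, 10_000)  # independent units of work
    with ProcessPoolExecutor(max_workers=4) as pool:
        partials = list(pool.map(process_chunk, chunks))
    print(f"sum of squares over all samples: {sum(partials)}")
```

Because chunks share nothing, the pool needs no locks, and the partial results combine with a single reduction at the end; the same shape (fork independent tasks, join partials) is what makes thread counts and data structures between tiers so performance-critical.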

  5. Sampling and analysis plan for the characterization of eight drums at the 200-BP-5 pump-and-treat systems

    International Nuclear Information System (INIS)

    Laws, J.R.

    1995-01-01

Samples will be collected and analyzed to provide sufficient information for characterization of mercury and aluminum contamination in drums from the final rinse of the tanks in the two pump-and-treat systems supporting the 200-BP-5 Operable Unit. The data will be used to determine the type of contamination in the drums to properly designate the waste for disposal or treatment. This sampling plan does not replace the standing sampling requirements; it covers a separate sampling event to manage eight drums containing waste generated during an unanticipated contamination of the process water with mercury and aluminum nitrate nonahydrate (ANN). The Toxicity Characteristic Leaching Procedure (TCLP) will be used for extraction, and standard US Environmental Protection Agency (EPA) methods will be used for analysis.

  6. Experimental study of glass sampling devices

    International Nuclear Information System (INIS)

    Jouan, A.; Moncouyoux, J.P.; Meyere, A.

    1992-01-01

Two high-level liquid waste containment glass sampling systems have been designed and built. The first device fits entirely inside a standard glass storage canister, and may thus be used in facilities not initially designed for this function. It has been tested successfully in the nonradioactive prototype unit at Marcoule. The work primarily covered the design and construction of an articulated arm supporting the sampling vessel, and the mechanisms necessary for filling the vessel and recovering the sample. System actuation and operation are fully automatic, and the resulting sample is representative of the glass melt. Implementation of the device is delicate, however, and its reliability is estimated at about 75%. A second device was designed specifically for new vitrification facilities. It is installed directly on the glass melting furnace, and meets process operating and quality control requirements. Tests conducted at the Marcoule prototype vitrification facility demonstrated the feasibility of the system. Special attention was given to the sampling vessel transfer mechanisms, with two options for vessel filling and controlled sample cooling.

  7. A Process Management System for Networked Manufacturing

    Science.gov (United States)

    Liu, Tingting; Wang, Huifen; Liu, Linyan

With the development of computers, communications and networks, networked manufacturing has become one of the main manufacturing paradigms of the 21st century. In a networked manufacturing environment there exist a large number of cooperative tasks susceptible to alteration, conflicts caused by shared resources, and problems of cost and quality, all of which increase the complexity of administration. Process management is a technology used to design, enact, control, and analyze networked manufacturing processes; it supports efficient execution, effective management, conflict resolution, cost containment and quality control. In this paper we propose an integrated process management system for networked manufacturing. Requirements of process management are analyzed and the architecture of the system is presented, and a process model considering process cost and quality is developed. Finally, a case study is provided to explain how the system runs efficiently.

  8. Improvements to sample processing and measurement to enable more widespread environmental application of tritium.

    Science.gov (United States)

    Moran, James; Alexander, Thomas; Aalseth, Craig; Back, Henning; Mace, Emily; Overman, Cory; Seifert, Allen; Freeburg, Wilcox

    2017-08-01

Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water, and demonstrate the accuracy of these methods on a suite of standardized water samples. We identify a current quantification limit of 92.2 TU which, combined with our small sample sizes, corresponds to as little as 0.00133 Bq of total T activity. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of both natural and artificial T behavior in the environment.
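
The quoted activity can be cross-checked from the quantification limit and sample size, assuming the commonly used conversion of 1 TU ≈ 0.118 Bq per kilogram of water (the exact factor the authors used is not stated in the abstract):

```python
TU_TO_BQ_PER_KG = 0.118  # assumed: approximate activity of 1 tritium unit per kg of water

def tritium_activity_bq(tritium_units, water_mass_kg):
    """Total tritium activity implied by a concentration in TU and a water mass."""
    return tritium_units * TU_TO_BQ_PER_KG * water_mass_kg

# 92.2 TU in a 120 mg (1.2e-4 kg) water sample:
print(f"{tritium_activity_bq(92.2, 120e-6):.2e} Bq")
```

This gives about 1.31e-3 Bq, consistent with the 0.00133 Bq quoted; the small difference reflects whichever exact TU conversion factor the authors applied.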

  9. Sampling system and method

    Science.gov (United States)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  10. Remote sampling and analysis of highly radioactive samples in shielded boxes

    International Nuclear Information System (INIS)

    Kirpikov, D.A.; Miroshnichenko, I.V.; Pykhteev, O.Yu.

    2010-01-01

equipped with radiation-resistant video cameras for additional visual inspection. All the acquired information, including visual images, is sent to the workstation of an operator who controls the handling of highly radioactive samples. Authorized participants in chemical control are able to supervise the sampling procedure and process the obtained results via the local area network, owing to inclusion of the workstation in the common data acquisition, processing and storage system. The paper describes the developed equipment and solutions and discusses the issues related to development of the local computer-aided chemical control system.

  11. Enterprise and system of systems capability development life-cycle processes.

    Energy Technology Data Exchange (ETDEWEB)

    Beck, David Franklin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-08-01

This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architectural Framework (DoDAF), it is suitable for use with other architectural description frameworks.

  12. The Maia Spectroscopy Detector System: Engineering for Integrated Pulse Capture, Low-Latency Scanning and Real-Time Processing

    International Nuclear Information System (INIS)

    Kirkham, R.; Siddons, D.; Dunn, P.A.; Kuczewski, A.J.; Dodanwela, R.; Moorhead, G.F.; Ryan, C.G.; De Geronimo, G.; Beuttenmuller, R.; Pinelli, D.; Pfeffer, M.; Davey, P.; Jensen, M.; de Jonge, M.D.; Howard, D.L.; Kusel, M.; McKinlay, J.

    2010-01-01

The Maia detector system is engineered for energy-dispersive x-ray fluorescence spectroscopy and elemental imaging at photon rates exceeding 10^7/s, integrated scanning of samples with pixel transit times as small as 50 µs, high-definition images of 10^8 pixels, and real-time processing of detected events for spectral deconvolution and online display of pure elemental images. The system, developed by CSIRO and BNL, combines a planar silicon 384-detector array, application-specific integrated circuits for pulse shaping, peak detection and sampling, and optical data transmission to an FPGA-based pipelined, parallel processor. This paper describes the system and the underpinning engineering solutions.

  13. SAIL--a software system for sample and phenotype availability across biobanks and cohorts.

    Science.gov (United States)

    Gostev, Mikhail; Fernandez-Banet, Julio; Rung, Johan; Dietrich, Joern; Prokopenko, Inga; Ripatti, Samuli; McCarthy, Mark I; Brazma, Alvis; Krestyaninova, Maria

    2011-02-15

The Sample avAILability system (SAIL) is a web-based application for searching, browsing and annotating biological sample collections or biobank entries. By providing individual-level information on the availability of specific data types (phenotypes, genetic or genomic data) and samples within a collection, rather than the actual measurement data, resource integration can be facilitated. A flexible data structure enables collection owners to provide descriptive information on their samples using existing or custom vocabularies. Users can query for available samples by various parameters, combining them via logical expressions. The system can be scaled to hold data from millions of samples with thousands of variables. SAIL is available under the Affero GPL open source license: https://github.com/sail.

  14. Rapid Surface Sampling and Archival Record (RSSAR) system. Final report, October 1995 - May 1997

    International Nuclear Information System (INIS)

    1998-01-01

    This report describes the results of Phase 2 efforts to develop a Rapid Surface Sampling and Archival Record (RSSAR) System for the detection of semivolatile organic contaminants on concrete, transite, and metal surfaces. The characterization of equipment and building surfaces for the presence of contaminants as part of building decontamination and decommissioning activities is an immensely large task of concern to both government and industry. Because of the high cost of hazardous waste disposal, old, contaminated buildings cannot simply be demolished and scrapped. Contaminated and clean materials must be clearly identified and segregated so that the clean material can be recycled or reused, if possible, or disposed of more cheaply as nonhazardous waste. DOE has a number of sites requiring surface characterization. These sites are large, contain very heterogeneous patterns of contamination (requiring high sampling density), and will thus necessitate an enormous number of samples to be taken and analyzed. Characterization of building and equipment surfaces will be needed during initial investigations, during cleanup operations, and during the final confirmation process, increasing the total number of samples well beyond that needed for initial characterization. This multiplicity of information places a premium on the ability to handle and track data as efficiently as possible

  16. Monte Carlo importance sampling optimization for system reliability applications

    International Nuclear Information System (INIS)

    Campioni, Luca; Vestrucci, Paolo

    2004-01-01

This paper focuses on the reliability analysis of multicomponent systems by the importance sampling technique and, in particular, tackles the optimization aspect. A methodology based on the minimization of the variance at the component level is proposed for the class of systems consisting of independent components. The claim is that, by means of such a methodology, the optimal biasing can be achieved without resorting to the typical trial-and-error approach.
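
The component-level biasing idea can be sketched as follows. The system structure, true failure probabilities and biased probabilities below are hypothetical, and the paper's actual variance-minimization rule for choosing the biased probabilities is not reproduced here:

```python
import random

def system_failed(states):
    # Hypothetical structure: the system fails when at least two of
    # three independent components have failed.
    return sum(states) >= 2

def importance_sampling_estimate(p, q, n, seed=0):
    """Importance-sampled system failure probability with biasing at the
    component level: component i fails with probability q[i] instead of its
    true (rare) probability p[i], and each sample carries the product of
    per-component likelihood ratios as its weight."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        states, weight = [], 1.0
        for pi, qi in zip(p, q):
            failed = rng.random() < qi
            states.append(failed)
            weight *= pi / qi if failed else (1 - pi) / (1 - qi)
        if system_failed(states):
            total += weight
    return total / n

p = [1e-3, 2e-3, 1.5e-3]  # true component failure probabilities (hypothetical)
q = [0.3, 0.3, 0.3]       # biased sampling probabilities
print(importance_sampling_estimate(p, q, 200_000))
```

For this toy system the exact failure probability is about 6.49e-6; with the biased probabilities, the estimator recovers it from a sample size that would yield almost no failures under the true probabilities.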

  17. A Comprehensive Software and Database Management System for Glomerular Filtration Rate Estimation by Radionuclide Plasma Sampling and Serum Creatinine Methods.

    Science.gov (United States)

    Jha, Ashish Kumar

    2015-01-01

Glomerular filtration rate (GFR) estimation by the plasma sampling method is considered the gold standard. However, this method is not widely used because of the complex technique and cumbersome calculations, coupled with the lack of user-friendly software. The routinely used serum creatinine method (SrCrM) of GFR estimation also requires online calculators, which cannot be used without internet access. We have developed user-friendly software, "GFR estimation software", which gives the options to estimate GFR by the plasma sampling method as well as SrCrM. We used Microsoft Windows® as the operating system, Visual Basic 6.0 as the front end, and Microsoft Access® as the database tool to develop this software. We used Russell's formula for GFR calculation by the plasma sampling method. GFR calculations using serum creatinine were done using the MIRD, Cockcroft-Gault, Schwartz, and Counahan-Barratt methods. The developed software performs the mathematical calculations correctly and is user-friendly. It also enables storage and easy retrieval of the raw data, patient information, and calculated GFR for further processing and comparison. This is user-friendly software to calculate GFR by various plasma sampling methods and blood parameters, and a good system for storing the raw and processed data for future analysis.
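
Russell's formula and the software itself are not given in the abstract, but the serum-creatinine route can be illustrated with the Cockcroft-Gault formula, one of the published methods listed; the patient values in the example are made up:

```python
def cockcroft_gault_crcl(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Creatinine clearance (mL/min) by the Cockcroft-Gault formula:
    ((140 - age) * weight) / (72 * SrCr), times 0.85 for female patients."""
    crcl = ((140 - age_years) * weight_kg) / (72.0 * serum_creatinine_mg_dl)
    return 0.85 * crcl if female else crcl

# A 40-year-old, 70 kg male with serum creatinine 1.0 mg/dL:
print(round(cockcroft_gault_crcl(40, 70, 1.0, female=False), 1))  # → 97.2
```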

  18. Information processing in decision-making systems.

    Science.gov (United States)

    van der Meer, Matthijs; Kurth-Nelson, Zeb; Redish, A David

    2012-08-01

Decisions result from an interaction between multiple functional systems acting in parallel to process information in very different ways, each with strengths and weaknesses. In this review, the authors address three action-selection components of decision-making. The Pavlovian system releases an action from a limited repertoire of potential actions, such as approaching learned stimuli. Like the Pavlovian system, the habit system is computationally fast but, unlike the Pavlovian system, it permits arbitrary stimulus-action pairings. These associations are a "forward" mechanism: when a situation is recognized, the action is released. In contrast, the deliberative system is flexible but takes time to process. The deliberative system uses knowledge of the causal structure of the world to search into the future, planning actions to maximize expected rewards. Deliberation depends on the ability to imagine future possibilities, including novel situations, and it allows decisions to be taken without having previously experienced the options. Various anatomical structures have been identified that carry out the information processing of each of these systems: the hippocampus constitutes a map of the world that can be used for searching/imagining the future; dorsal striatal neurons represent situation-action associations; and the ventral striatum maintains value representations for all three systems. Each system presents vulnerabilities to pathologies that can manifest as psychiatric disorders. Understanding these systems and their relation to neuroanatomy opens up a deeper way to treat the structural problems underlying various disorders.

  19. Sample preparation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

Sample preparation prior to HPLC analysis is certainly one of the most important steps to consider in trace or ultratrace analysis. For many years scientists have tried to simplify the sample preparation process. It is rarely possible to inject a neat liquid sample, or a sample whose preparation is no more complex than dissolution in a given solvent. The latter process alone can remove insoluble materials, which is especially helpful with samples in complex matrices, provided other interactions do not affect extraction. Here it is very likely that a large number of components will not dissolve and are therefore eliminated by a simple filtration step. In most cases, sample preparation is not as simple as dissolution of the component of interest. At times enrichment is necessary: the component of interest is present in a very large volume or mass of material and needs to be concentrated in some manner so that a small volume of the concentrated or enriched sample can be injected into the HPLC. 88 refs.

  20. Process options and projected mass flows for the HTGR refabrication scrap recovery system

    International Nuclear Information System (INIS)

    Tiegs, S.M.

    1979-03-01

The two major uranium recovery processing options reviewed are (1) internal recovery of the scrap by the refabrication system and (2) transfer to and external recovery of the scrap by the head end of the reprocessing system. Each option was reviewed with respect to equipment requirements, preparatory processing, and material accountability. Because there may be a high cost factor on transfer of scrap fuel material to the reprocessing system for recovery, all of the scrap streams will be recycled internally within the refabrication system, with the exception of reject fuel elements, which will be transferred to the head end of the reprocessing system for uranium recovery. The refabrication facility will be fully remote; thus, simple recovery techniques were selected as the reference processes for scrap recovery. Crushing, burning, and leaching methods will be used to recover uranium from the HTGR refabrication scrap fuel forms, which include particles without silicon carbide coatings, particles with silicon carbide coatings, uncarbonized fuel rods, carbon furnace parts, perchloroethylene distillation bottoms, and analytical sample remnants. Mass flows through the reference scrap recovery system were calculated for the HTGR reference recycle facility operating with the highly enriched uranium fuel cycle. Output per day from the refabrication scrap recovery system is estimated to be 4.02 kg of 235 U and 10.85 kg of 233 U. Maximum equipment capacities were determined, and future work will be directed toward the development and costing of the scrap recovery system chosen as reference.

  1. The computer system for the express-analysis of the irradiation samples

    International Nuclear Information System (INIS)

    Vzorov, I.K.; Kalmykov, A.V.; Korenev, S.A.; Minashkin, V.F.; Sikolenko, V.V.

    1999-01-01

The computer system for express analysis (SEA) of irradiated samples is described. The system works together with a pulsed high-current electron and ion source and allows the irradiation regime to be corrected in real time. The SEA system automatically measures volt-ampere and volt-farad characteristics, sample resistance by the four-probe method, and sample capacitance parameters. Its parameters are: in the volt-ampere measurement regime, U_max = 200 V, minimal voltage step U_sh = 0.05 V, voltage accuracy 0.25%; in the capacitance measurement regime, measurement range 0-1600 pF, working frequency range 1-150 kHz, capacitance accuracy 0.5%, bias voltage range 1-200 V, minimal bias voltage step U_sh = 0.05 V. The SEA is managed by an IBM/AT computer, and the control and measuring apparatus is realized in the CAMAC standard. The software consists of procedures for display, control, data processing, and output.

  2. System and method for deriving a process-based specification

    Science.gov (United States)

    Hinchey, Michael Gerard (Inventor); Rash, James Larry (Inventor); Rouff, Christopher A. (Inventor)

    2009-01-01

    A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.

  3. Phantom construction by the lithography process for micro-radiographic system analysis

    International Nuclear Information System (INIS)

    Rocha, Henrique de Souza; Lopes, Ricardo Tadeu; Macedo, Pedro Ivo M.T.

    2002-01-01

In this work the viability of using a standard phantom, manufactured by the lithography process, for obtaining the spatial resolution of a microradiographic system was analyzed. The project called for the construction of three types of phantoms: one for obtaining the modulation transfer function in systems with resolutions between 10 and 60 μm, and two others for direct reading of the spatial resolution in systems with resolutions between 10 and 100 μm and between 100 and 400 μm. Although the results were obtained from preliminary samples of the built phantoms, good results were found for the spatial resolution. Using a reference system formed by a conventional microfocus X-ray tube with a CCD detector, it was possible to measure a spatial resolution of 15 μm at 20% modulation in a system with an estimated resolution of 12.5 μm.

  4. Systematic sampling with errors in sample locations

    DEFF Research Database (Denmark)

    Ziegel, Johanna; Baddeley, Adrian; Dorph-Petersen, Karl-Anton

    2010-01-01

Systematic sampling of points in continuous space is widely used in microscopy and spatial surveys. Classical theory provides asymptotic expressions for the variance of estimators based on systematic sampling as the grid spacing decreases. However, the classical theory assumes that the sample grid is exactly periodic; real physical sampling procedures may introduce errors in the placement of the sample points. This paper studies the effect of errors in sample positioning on the variance of estimators in the case of one-dimensional systematic sampling. First we sketch a general approach to variance analysis using point process methods. We then analyze three different models for the error process, calculate exact expressions for the variances, and derive asymptotic variances. Errors in the placement of sample points can lead to substantial inflation of the variance and dampening of zitterbewegung.
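
The variance inflation caused by placement errors can be illustrated with a small one-dimensional simulation; the integrand, grid spacing and Gaussian error model below are toy choices for illustration, not the models analyzed in the paper:

```python
import math
import random

def systematic_estimate(f, spacing, jitter_sd, rng):
    """Estimate the integral of f over [0, 1) from a randomly phased periodic
    grid whose points carry independent Gaussian placement errors."""
    phase = rng.random() * spacing
    total, x = 0.0, phase
    while x < 1.0:
        total += f(x + rng.gauss(0.0, jitter_sd))  # intended point x, realized with error
        x += spacing
    return spacing * total

f = lambda x: 1.0 + math.sin(2 * math.pi * x)  # smooth integrand; true integral = 1
rng = random.Random(1)
variances = {}
for sd in (0.0, 0.02):
    ests = [systematic_estimate(f, 0.1, sd, rng) for _ in range(5000)]
    mean = sum(ests) / len(ests)
    variances[sd] = sum((e - mean) ** 2 for e in ests) / len(ests)
    print(f"jitter sd={sd}: mean ~ {mean:.3f}, variance ~ {variances[sd]:.2e}")
```

With an exactly periodic grid the ten sample points cancel the sinusoid exactly, so the estimator's variance is numerically zero; a placement error of only 2% of the unit interval already inflates it to roughly 10^-3.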

  5. Tank 241-C-106 sampling data requirements developed through the data quality objectives (DQO) process

    International Nuclear Information System (INIS)

    Wang, O.S.; Bell, K.E.; Anderson, C.M.; Peffers, M.S.; Pulsipher, B.A.; Scott, J.L.

    1994-01-01

The rate of heat generation for tank 241-C-106 at the Hanford Site is estimated at more than 100,000 Btu/h. The heat is generated primarily from the radioactive decay of 90 Sr waste that was inadvertently transferred into the tank in the late 1960s. If proper tank cooling is not maintained, heat-induced structural damage to the tank's concrete shell could result in the release of nuclear waste to the environment. Because of high-heat concerns, in January 1991 tank 241-C-106 was designated a Watch List tank and deemed a Priority 1 safety issue. The Waste Tank Safety Program (WTSP) is responsible for the resolution of this safety issue. Although forced cooling is effective in the short term, the long-term resolution for tank cooling is waste retrieval. The Single-Shell Tank Retrieval Project (Retrieval) is responsible for the safe retrieval and transfer of radioactive waste from tank 241-C-106 to a selected double-shell tank. This data quality objective (DQO) study is an effort to determine engineering and design data needs for the WTSP and to assist Retrieval in designing contingency-action retrieval systems. The 7-step DQO process is a tool developed by the Environmental Protection Agency with the goal of identifying needs and reducing costs. This report discusses the results of two DQO efforts, one for the WTSP and one for Retrieval. The key data needs to support the WTSP are thermal conductivity, permeability, and heat-load profile. For Retrieval support, nine and three data needs were identified for retrieval engineering system design and HVAC system design, respectively. The updated schedule to drill two core samples using rotary mode is set for March 1994. The analysis of the samples is expected to be completed by September 1994.

  6. Data-driven soft sensor design with multiple-rate sampled data

    DEFF Research Database (Denmark)

    Lin, Bao; Recke, Bodil; Knudsen, Jørgen K.H.

    2007-01-01

Multi-rate systems are common in industrial processes, where quality measurements have a slower sampling rate than other process variables. Since inter-sample information is desirable for effective quality control, different approaches have been reported to estimate the quality between samples, including numerical interpolation, polynomial transformation, data lifting, and weighted partial least squares (WPLS). Two modifications to the original data lifting approach are proposed in this paper: reformulating the extraction of a fast model as an optimization problem, and ensuring the desired model properties through Tikhonov regularization. A comparative investigation of the four approaches is performed in this paper. Their applicability, accuracy and robustness to process noise are evaluated on a single-input single-output (SISO) system. The regularized data lifting and WPLS approaches...
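
Of the approaches listed, numerical interpolation is the simplest inter-sample quality estimator and can be sketched directly; the sampling rates and measurement values below are made up for illustration:

```python
from bisect import bisect_right

def interpolate_quality(fast_times, slow_times, slow_values):
    """Linearly interpolate slow-rate quality measurements onto a fast grid."""
    estimates = []
    for t in fast_times:
        i = bisect_right(slow_times, t) - 1        # index of segment containing t
        i = max(0, min(i, len(slow_times) - 2))    # clamp to valid segments
        t0, t1 = slow_times[i], slow_times[i + 1]
        y0, y1 = slow_values[i], slow_values[i + 1]
        estimates.append(y0 + (y1 - y0) * (t - t0) / (t1 - t0))
    return estimates

# Quality measured every 10 min, other process variables every 2 min:
slow_t, slow_y = [0, 10, 20], [1.0, 2.0, 1.5]
fast_t = [0, 2, 4, 6, 8, 10, 12]
print([round(v, 2) for v in interpolate_quality(fast_t, slow_t, slow_y)])
# → [1.0, 1.2, 1.4, 1.6, 1.8, 2.0, 1.9]
```

Interpolation is purely retrospective (it needs the next slow sample before it can fill the gap), which is one motivation for the model-based lifting and WPLS alternatives.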

  7. Pico-litre Sample Introduction and Acoustic Levitation Systems for Time Resolved Protein Crystallography Experiments at XFELS

    Directory of Open Access Journals (Sweden)

    Peter Docker

    2017-07-01

The system described in this work is a variant of traditional acoustic levitation as first described by Marzo et al. It uses multiple transducers, eliminating the requirement for a mirror surface and allowing for an open geometry, as the sound from the transducers combines to generate an acoustic trap configured to catch picolitres of crystal slurry. These acoustic traps also have the significant benefit of eliminating potential beam attenuation due to support structures or microfluidic devices. Additionally, they meet the need to eliminate sample environments when experiments are carried out using an X-ray Free Electron Laser (XFEL) such as the Linac Coherent Light Source (LCLS), as any sample environment would not survive the exposure to the X-ray beam. XFELs generate light a billion times brighter than the sun. The application for this system will be to examine turnover in beta-lactamase proteins, which is responsible for bacteria developing antibiotic resistance and therefore of significant importance to future world health. The system will allow diffraction data to be collected before and after turnover, allowing for a better understanding of the underlying processes. The authors first described this work at Nanotech 2017.

  8. Eco-efficiency of grinding processes and systems

    CERN Document Server

    Winter, Marius

    2016-01-01

    This research monograph aims at presenting an integrated assessment approach to describe, model, evaluate and improve the eco-efficiency of existing and new grinding processes and systems. Various combinations of grinding process parameters and system configurations can be evaluated based on the eco-efficiency. The book presents the novel concept of empirical and physical modeling of technological, economic and environmental impact indicators. This includes the integrated evaluation of different grinding process and system scenarios. The book is a valuable read for research experts and practitioners in the field of eco-efficiency of manufacturing processes but the book may also be beneficial for graduate students.

  9. Virtual sampling in variational processing of Monte Carlo simulation in a deep neutron penetration problem

    International Nuclear Information System (INIS)

    Allagi, Mabruk O.; Lewins, Jeffery D.

    1999-01-01

    In a further study of virtually processed Monte Carlo estimates in neutron transport, a shielding problem has been studied. The use of virtual sampling to estimate the importance function at a certain point in the phase space depends on the presence of neutrons from the real source at that point. But in deep penetration problems, not many neutrons will reach regions far away from the source. In order to overcome this problem, two suggestions are considered: (1) virtual sampling is used as far as the real neutrons can reach, then fictitious sampling is introduced for the remaining regions, distributed in all the regions, or (2) only one fictitious source is placed where the real neutrons almost terminate and then virtual sampling is used in the same way as for the real source. Variational processing is again found to improve the Monte Carlo estimates, being best when using one fictitious source in the far regions with virtual sampling (option 2). When fictitious sources are used to estimate the importances in regions far away from the source, some optimization has to be performed for the proportion of fictitious to real sources, weighted against accuracy and computational costs. It has been found in this study that the optimum number of cells to be treated by fictitious sampling is problem dependent, but as a rule of thumb, fictitious sampling should be employed in regions where the number of neutrons from the real source fall below a specified limit for good statistics

  10. The validation of a pixe system for trace element analysis of biological samples

    Science.gov (United States)

    Saied, S. O.; Crumpton, D.; Francois, P. E.

    1981-03-01

    A PIXE system has been developed for measuring trace element levels in biological samples and a study made of the precision and accuracy achievable. The calibration of the system has been established using thin targets of known elemental composition and the reproducibility studied using protons of energy 2.5 MeV. Both thick and thin samples prepared from NBS bovine liver have been analysed and the elemental ratios present established for a set of replicate samples. These are compared with the results of other workers. Problems relating to sample preparation are discussed.

  11. Unambiguous range-Doppler LADAR processing using 2 giga-sample-per-second noise waveforms

    International Nuclear Information System (INIS)

    Cole, Z.; Roos, P.A.; Berg, T.; Kaylor, B.; Merkel, K.D.; Babbitt, W.R.; Reibel, R.R.

    2007-01-01

    We demonstrate sub-nanosecond range and unambiguous sub-50-Hz Doppler resolved laser radar (LADAR) measurements using spectral holographic processing in rare-earth ion doped crystals. The demonstration utilizes pseudo-random-noise 2 giga-sample-per-second baseband waveforms modulated onto an optical carrier.
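    As a digital illustration of the kind of processing involved (the paper's correlator is analog spectral holography; the code length, delay, and noise level below are hypothetical), range can be recovered by cross-correlating a 2 GS/s pseudo-random-noise code with its delayed echo:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    fs = 2e9                                   # 2 giga-samples per second
    n = 4096
    tx = rng.choice([-1.0, 1.0], size=n)       # pseudo-random-noise baseband code

    true_delay = 137                           # hypothetical target delay, in samples
    rx = np.zeros(n + true_delay)
    rx[true_delay:] = tx                       # delayed echo of the code
    rx += 0.5 * rng.standard_normal(rx.size)   # additive receiver noise

    # Matched filter: the correlation peak marks the round-trip delay.
    corr = np.correlate(rx, tx, mode="valid")
    est_delay = int(np.argmax(corr))
    range_m = est_delay / fs * 3e8 / 2         # divide by 2 for two-way travel
    ```

    At 2 GS/s one sample corresponds to 0.5 ns, i.e. 7.5 cm of two-way range per sample, which is the sub-nanosecond resolution scale quoted in the abstract.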

  12. Unambiguous range-Doppler LADAR processing using 2 giga-sample-per-second noise waveforms

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Z. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States)]. E-mail: cole@s2corporation.com; Roos, P.A. [Spectrum Lab, Montana State University, P.O. Box 173510, Bozeman, MT 59717 (United States); Berg, T. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States); Kaylor, B. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States); Merkel, K.D. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States); Babbitt, W.R. [Spectrum Lab, Montana State University, P.O. Box 173510, Bozeman, MT 59717 (United States); Reibel, R.R. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States)

    2007-11-15

    We demonstrate sub-nanosecond range and unambiguous sub-50-Hz Doppler resolved laser radar (LADAR) measurements using spectral holographic processing in rare-earth ion doped crystals. The demonstration utilizes pseudo-random-noise 2 giga-sample-per-second baseband waveforms modulated onto an optical carrier.

  13. Process for Selecting System Level Assessments for Human System Technologies

    Science.gov (United States)

    Watts, James; Park, John

    2006-01-01

    The integration of the many life support systems necessary to construct a stable habitat is difficult. The correct identification of the appropriate technologies and corresponding interfaces is an exhaustive process. Once technologies are selected, secondary issues such as mechanical and electrical interfaces must be addressed. The required analytical and testing work must be approached in a piecewise fashion to achieve timely results. A repeatable process has been developed to identify and prioritize system-level assessment and testing needs. This Assessment Selection Process has been defined to assess cross-cutting integration issues at the system or component level. Assessments are used to identify risks, encourage future actions to mitigate risks, or spur further studies.

  14. Microfluidic Sample Preparation for Diagnostic Cytopathology

    Science.gov (United States)

    Mach, Albert J.; Adeyiga, Oladunni B.; Di Carlo, Dino

    2014-01-01

    The cellular components of body fluids are routinely analyzed to identify disease and treatment approaches. While significant focus has been placed on developing cell analysis technologies, tools to automate the preparation of cellular specimens have been more limited, especially for body fluids beyond blood. Preparation steps include separating, concentrating, and exposing cells to reagents. Sample preparation continues to be routinely performed off-chip by technicians, preventing cell-based point-of-care diagnostics, increasing the cost of tests, and reducing the consistency of the final analysis following multiple manually-performed steps. Here, we review the assortment of biofluids for which suspended cells are analyzed, along with their characteristics and diagnostic value. We present an overview of the conventional sample preparation processes for cytological diagnosis. We finally discuss the challenges and opportunities in developing microfluidic devices for the purpose of automating or miniaturizing these processes, with particular emphases on preparing large or small volume samples, working with samples of high cellularity, automating multi-step processes, and obtaining high purity subpopulations of cells. We hope to convey the importance of and help identify new research directions addressing the vast biological and clinical applications in preparing and analyzing the array of available biological fluids. Successfully addressing the challenges described in this review can lead to inexpensive systems to improve diagnostic accuracy while simultaneously reducing overall systemic healthcare costs. PMID:23380972

  15. Generic Health Management: A System Engineering Process Handbook Overview and Process

    Science.gov (United States)

    Wilson, Moses Lee; Spruill, Jim; Hong, Yin Paw

    1995-01-01

    Health Management, a system engineering process, is one of the processes, techniques, and technologies used to define, design, analyze, build, verify, and operate a system from the viewpoint of preventing, or minimizing, the effects of failure or degradation. It supports all ground and flight elements during manufacturing, refurbishment, integration, and operation through the combined use of hardware, software, and personnel. This document integrates the six Health Management Process phases into five phases in such a manner that Health Management is never a stand-alone task or effort that separately defines independent work functions.

  16. Process Information System - Nuclear Power Plant Krsko

    International Nuclear Information System (INIS)

    Mandic, D.; Barbic, B.; Linke, B.; Colak, I.

    1998-01-01

    The original NEK design used several Process Computer Systems (PCS) for both process control and process supervision, built by different manufacturers on different hardware and software platforms. Operational experience and new regulatory requirements imposed new technical and functional requirements on the PCS. Requirements such as acquisition of new signals from the technological processes and environment, implementation of new application programs, significant improvement of the MMI (Man-Machine Interface), transfer of process data to locations other than the Main Control Room (MCR), and archiving of process data with the capability to retrieve it for future analysis could not be implemented within the old systems. To satisfy the new requirements, NEK decided to build a new Process Information System (PIS). During the design and construction of PIS Project Phase I, in addition to the main foreign contractor, local architect-engineering and construction companies participated significantly. This paper presents the experience of NEK and its local partners. (author)

  17. Radio frequency tags systems to initiate system processing

    Science.gov (United States)

    Madsen, Harold O.; Madsen, David W.

    1994-09-01

    This paper describes the automatic identification technology installed at the Applied Magnetic Corp. MR fab. World-class manufacturing requires technology exploitation. This system combines (1) FluoroTrac cassette and operator tracking, (2) CELLworks cell controller software tools, and (3) Auto-Soft Inc. software integration services. The combined system eliminates operator keystrokes and errors during normal processing within a semiconductor fab. The methods and benefits of this system are described.

  18. The product composition control system at Savannah River: Statistical process control algorithm

    International Nuclear Information System (INIS)

    Brown, K.G.

    1994-01-01

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will be used to immobilize the approximately 130 million liters of high-level nuclear waste currently stored at the site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive insoluble sludge and precipitate and less radioactive water soluble salts. In DWPF, precipitate (PHA) is blended with insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository. Described here is the Product Composition Control System (PCCS) process control algorithm. The PCCS is the amalgam of computer hardware and software intended to ensure that the melt will be processable and that the glass waste form produced will be acceptable. Within PCCS, the Statistical Process Control (SPC) Algorithm guides control of the DWPF process. The SPC Algorithm is necessary to control the multivariate DWPF process in the face of uncertainties arising from the process, its feeds, sampling, modeling, and measurement systems. This article describes the functions performed by the SPC Algorithm: characterization of DWPF prior to making product, accounting for prediction uncertainty, accounting for measurement uncertainty, monitoring an SME batch, incorporating process information, and the advantages of the algorithm. 9 refs., 6 figs
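    The abstract does not give the PCCS equations, but a common way to monitor a multivariate process against measurement uncertainty is a Hotelling-style squared-distance check. The sketch below is an illustrative stand-in for that idea; the property vector, covariance, and control limit are all hypothetical, not the actual DWPF values:

    ```python
    import numpy as np

    def t_squared(x, target, cov_inv):
        """Hotelling-style squared distance of a measured melter-feed
        property vector from its target, weighted by the inverse of the
        measurement covariance."""
        d = np.asarray(x) - np.asarray(target)
        return float(d @ cov_inv @ d)

    # Hypothetical 3-component property vector (fractions) and uncertainties.
    target = np.array([0.50, 0.30, 0.20])
    cov = np.diag([0.01, 0.01, 0.01])      # assumed measurement covariance
    limit = 11.34                          # chi-squared 99% point, 3 d.o.f.

    batch = np.array([0.52, 0.29, 0.19])   # one hypothetical SME batch
    t2 = t_squared(batch, target, np.linalg.inv(cov))
    in_control = t2 <= limit               # acceptable to feed the melter?
    ```

    Weighting by the covariance is what lets a scheme like this distinguish genuine process drift from sampling and measurement noise, which is the core difficulty the SPC Algorithm addresses.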

  19. Solar System Samples for Research, Education, and Public Outreach

    Science.gov (United States)

    Allen, J.; Luckey, M.; McInturff, B.; Kascak, A.; Tobola, K.; Galindo, C.; Allen, C.

    2011-01-01

    In the next two years, during the NASA Year of the Solar System, spacecraft from NASA and our international partners will encounter a comet, orbit asteroid 4 Vesta, continue to explore Mars with rovers, and launch robotic explorers to the Moon and Mars. We have pieces of all these worlds in our laboratories, and their continued study provides incredibly valuable "ground truth" to complement space exploration missions. Extensive information about these unique materials, as well as actual lunar samples and meteorites, is available for display and education. The Johnson Space Center (JSC) has the unique responsibility to curate NASA's extraterrestrial samples from past and future missions. Curation includes documentation, preservation, preparation, and distribution of samples for research, education, and public outreach.

  20. Architecture Of High Speed Image Processing System

    Science.gov (United States)

    Konishi, Toshio; Hayashi, Hiroshi; Ohki, Tohru

    1988-01-01

    An architecture for a high-speed image processing system corresponding to a new shape-understanding algorithm is proposed, and a hardware system based on the architecture has been developed. The main considerations of the architecture are that the processors used should match the processing sequence of the target image and that the developed system should be practical for industrial use. As a result, each processing step can be performed at a speed of 80 nanoseconds per pixel.