WorldWideScience

Sample records for integrated sample pre-preparations

  1. Lab-on-Valve Micro Sequential Injection: A Versatile Approach for Implementing Integrated Sample Pre-preparations and Executing (Bio)Chemical Assays

    DEFF Research Database (Denmark)

    Hansen, Elo Harald

    waste generation. Most recently, the so-called third generation of FIA has emerged, that is, the Lab-on-Valve (LOV) approach, the conceptual basis of which is to incorporate all the necessary unit operational manipulations required, and, when possible, even the detection device, into a single small … integrated microconduit, or “laboratory”, placed atop a selection valve. The lecture will detail the evolution of the three generations of FIA, emphasis being placed on the LOV approach. Having proven itself as a versatile front end to a variety of detection techniques, its utility will be exemplified by a series … of the renewable microcolumn concept. Despite their excellent analytical chemical capabilities, ETAAS as well as ICPMS often require that the samples be subjected to suitable pretreatment in order to obtain the necessary sensitivity and selectivity. Either in order to separate the analyte from potentially …

  2. Lab-on-Valve Micro Sequential Injection: A Versatile Approach for Implementing Integrated Sample Pre-preparations and Executing (Bio)Chemical Assays

    DEFF Research Database (Denmark)

    Hansen, Elo Harald

    waste generation. Most recently, the Lab-on-Valve (LOV) approach has emerged. Termed the third generation of FIA, its conceptual basis is to incorporate all the necessary unit operational manipulations required in a chemical assay, and, when possible, even the detection device, into a single … small integrated microconduit, or “laboratory”, placed atop a selection valve. In the lecture, emphasis will be placed on the LOV approach. Having proven itself as a versatile front end to a variety of detection techniques, its utility will be exemplified by various applications. Particular focus …-phase microcolumn concept utilising hydrophobic as well as hydrophilic bead materials. Although ETAAS and ICPMS are both characterised by excellent analytical chemical capabilities, they nevertheless often require that the samples be subjected to suitable pretreatment in order to obtain the necessary sensitivity …

  3. POLAR ORGANIC CHEMICAL INTEGRATIVE SAMPLING ...

    Science.gov (United States)

    The purpose of the research presented in this paper is two-fold: (1) to demonstrate the coupling of two state-of-the-art techniques, a time-weighted polar organic chemical integrative sampler (POCIS) and micro-liquid chromatography-electrospray/ion trap mass spectrometry (µ-LC-ES/ITMS); and (2) to assess these methodologies in a real-world environment, wastewater effluent, for detecting six drugs (four prescription and two illicit). In the effluent from three wastewater treatment plants (WWTP), azithromycin was detected at concentrations ranging from 15 ng/L to 66 ng/L, equivalent to a total annual release of 0.4 to 4 kg into the receiving waters. Two illicit drugs, methamphetamine and methylenedioxymethamphetamine (MDMA), were detected and confirmed in the effluent from two WWTPs, at 2 ng/L and 0.5 ng/L, respectively. While the ecotoxicological significance of drugs in environmental matrices, particularly water, has not been closely examined, it can only be surmised that these substances have the potential to adversely affect biota that are continuously exposed to them, even at very low levels. The potential for chronic effects on human health is also unknown, but of increasing concern due to the multi-use character of water, particularly in densely populated arid areas. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality.

  4. Integrated sampling vs ion chromatography: Mathematical considerations

    International Nuclear Information System (INIS)

    Sundberg, L.L.

    1992-01-01

    This paper presents some general-purpose considerations that can be utilized when comparisons are made between the results of integrated sampling over several hours or days, and ion chromatography, where sample collection times are measured in minutes. The discussion is geared toward the measurement of soluble transition metal ions in BWR feedwater. Under steady-state conditions, the concentrations reported by both techniques should be in reasonable agreement. Transient operations affect both types of measurements. A simplistic model, applicable to both sampling techniques, is presented that demonstrates the effect of transients occurring during the acquisition of a steady-state sample. For a common set of conditions, the integrated concentration is proportional to the concentration and duration of the transient, and inversely proportional to the sample collection time. Adjusting the collection period during a known transient allows an estimation of the peak transient concentration. Though the probability of sampling a random transient with the integrated sampling technique is very high, its magnitude is severely diluted by long integration times. Transient concentrations are magnified with ion chromatography, but the probability of sampling a transient is significantly lower using normal ion chromatography operations. Various data-averaging techniques are discussed for integrated sampling and IC determinations. Time-weighted averages appear to offer advantages over arithmetic and geometric means for integrated sampling when the collection period is variable. For replicate steady-state ion chromatography determinations that bracket a transient sample, it may be advantageous to forgo the calculation of averages and report the data as trending information only.
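
    As a rough illustration of the dilution effect described above, the following sketch assumes a square-pulse transient of fixed peak concentration and duration riding on a steady-state baseline, averaged over the collection window. The function and the numbers are hypothetical, not taken from the paper.

```python
# Hedged sketch: time-averaged concentration reported by an integrated sample
# when a square-pulse transient (peak c_peak, duration t_pulse) occurs during
# a collection window of length t_collect on a steady-state baseline c_ss.
def integrated_concentration(c_ss, c_peak, t_pulse, t_collect):
    if t_pulse > t_collect:
        raise ValueError("the pulse must fit inside the collection window")
    return c_ss + (c_peak - c_ss) * t_pulse / t_collect  # transient is diluted

# Example: a 100 ppb spike lasting 5 min on a 1 ppb baseline
for t_collect in (30.0, 240.0, 1440.0):  # 30 min, 4 h, 24 h windows
    c = integrated_concentration(c_ss=1.0, c_peak=100.0, t_pulse=5.0, t_collect=t_collect)
    print(f"{t_collect:6.0f} min window -> {c:6.2f} ppb")
```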

  5. Visual Sample Plan (VSP) - FIELDS Integration

    Energy Technology Data Exchange (ETDEWEB)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Hassig, Nancy L.; Carlson, Deborah K.; Bing-Canar, John; Cooper, Brian; Roth, Chuck

    2003-04-19

    Two software packages, VSP 2.1 and FIELDS 3.5, are being used by environmental scientists to plan the number and type of samples required to meet project objectives, display those samples on maps, query a database of past sample results, produce spatial models of the data, and analyze the data in order to arrive at defensible decisions. VSP 2.0 is an interactive tool to calculate optimal sample size and optimal sample location based on user goals, risk tolerance, and variability in the environment and in lab methods. FIELDS 3.0 is a set of tools to explore the sample results in a variety of ways to make defensible decisions with quantified levels of risk and uncertainty; however, FIELDS 3.0 has only a limited sample-design module. VSP 2.0, on the other hand, offers over 20 sampling goals, allowing the user to input site-specific assumptions such as non-normality of sample results and separate variability between field and laboratory measurements, and to make two-sample comparisons, perform confidence interval estimation, use sequential search sampling methods, and much more. Over 1,000 copies of VSP are in use today. FIELDS is used in nine of the ten U.S. EPA regions, by state regulatory agencies, and most recently in several other countries. Both software packages have been peer-reviewed, enjoy broad usage, and have been accepted by regulatory agencies as well as site project managers as key tools to help collect data and make environmental cleanup decisions. Recently, the two software packages were integrated, allowing the user to take advantage of the many design options of VSP and the analysis and modeling options of FIELDS. The transition between the two is simple for the user: VSP can be called from within FIELDS, automatically passing a map to VSP and automatically retrieving sample locations and design information when the user returns to FIELDS. This paper will describe the integration, give a demonstration of the integrated package, and give users download …
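
    For context, the sketch below shows a textbook one-sample sample-size calculation of the kind such design tools automate. It is a hedged illustration, not VSP's actual formula or code, and the parameter values are invented.

```python
# Hedged sketch: classical sample-size formula for detecting a mean difference
# `delta` from an action level, given standard deviation `sigma`, false-positive
# rate `alpha`, and false-negative rate `beta` (not VSP's exact algorithm).
from math import ceil
from scipy.stats import norm

def one_sample_size(sigma, delta, alpha=0.05, beta=0.10):
    z_a = norm.ppf(1.0 - alpha)
    z_b = norm.ppf(1.0 - beta)
    return ceil(((z_a + z_b) * sigma / delta) ** 2)

# Example: sigma = 3 ppm, want to detect a 2 ppm exceedance of the action level
print(one_sample_size(sigma=3.0, delta=2.0))  # about 20 samples
```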

  6. Boson sampling with integrated optical circuits

    International Nuclear Information System (INIS)

    Bentivegna, M.

    2014-01-01

    Simulating the evolution of non-interacting bosons through a linear transformation acting on the system’s Fock state is strongly believed to be hard for a classical computer. This is commonly known as the Boson Sampling problem, and it has recently gained attention as the first possible way to demonstrate the superior computational power of quantum devices over classical ones. In this paper we describe the quantum optics approach to this problem, highlighting the role of integrated optical circuits.

  7. Multiscale sampling model for motion integration.

    Science.gov (United States)

    Sherbakov, Lena; Yazdanbakhsh, Arash

    2013-09-30

    Biologically plausible strategies for visual scene integration across spatial and temporal domains continue to be a challenging topic. The fundamental question we address is whether classical problems in motion integration, such as the aperture problem, can be solved in a model that samples the visual scene at multiple spatial and temporal scales in parallel. We hypothesize that fast interareal connections that allow feedback of information between cortical layers are the key processes that disambiguate motion direction. We developed a neural model showing how the aperture problem can be solved using different spatial sampling scales between LGN, V1 layer 4, V1 layer 6, and area MT. Our results suggest that multiscale sampling, rather than feedback explicitly, is the key process that gives rise to end-stopped cells in V1 and enables area MT to solve the aperture problem without the need for calculating intersecting constraints or crafting intricate patterns of spatiotemporal receptive fields. Furthermore, the model explains why end-stopped cells no longer emerge in the absence of V1 layer 6 activity (Bolz & Gilbert, 1986), why V1 layer 4 cells are significantly more end-stopped than V1 layer 6 cells (Pack, Livingstone, Duffy, & Born, 2003), and how it is possible to have a solution to the aperture problem in area MT with no solution in V1 in the presence of driving feedback. In summary, while much research in the field focuses on how a laminar architecture can give rise to complicated spatiotemporal receptive fields to solve problems in the motion domain, we show that motion integration can be reframed as an emergent property of multiscale sampling achieved concurrently within laminae and across multiple visual areas.

  8. INTEGRATIVE SAMPLING OF ANTIBIOTICS AND OTHER ...

    Science.gov (United States)

    Pharmaceuticals from human and veterinary use continually enter the environment through municipal wastewater treatment plants (WWTPs), surface runoff from animal waste, and direct disposal of unused medications. The presence of these chemicals, albeit often at subtherapeutic trace levels, may be partly responsible for the development of antibiotic-resistant bacteria and for sublethal effects in aquatic organisms. Conventional sampling techniques (i.e., grab sampling) often are insufficient for detecting these trace levels. A new sampling technique, the Polar Organic Chemical Integrative Sampler (POCIS), developed by scientists at the USGS's Columbia Environmental Research Center, can provide the time-weighted average concentrations of these complex mixtures. A pilot study targeting the antibiotic azithromycin involved deploying the POCIS for 30 days in the effluents of three WWTPs in Nevada, Utah, and South Carolina. Azithromycin was detected at each WWTP at 19 to 66 ng/L. This translates to a yearly loading, into each of the three receiving waters, of 0.4 to 4 kg/year. In a separate study investigating potential impacts of confined animal feeding operations on national wildlife refuges in the Delmarva peninsula, the antibiotic tetracycline and the natural hormone 17β-estradiol were detected at multiple sites. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality.

  9. Integration and interpolation of sampled waveforms

    International Nuclear Information System (INIS)

    Stearns, S.D.

    1978-01-01

    Methods for integrating, interpolating, and improving the signal-to-noise ratio of digitized waveforms are discussed with regard to seismic data from underground tests. The frequency-domain integration method and the digital interpolation method of Schafer and Rabiner are described and demonstrated using test data. The use of bandpass filtering for noise reduction is also demonstrated. With these methods, a backlog of seismic test data has been successfully processed

  10. Organ culture storage of pre-prepared corneal donor material for Descemet's membrane endothelial keratoplasty.

    Science.gov (United States)

    Bhogal, Maninder; Matter, Karl; Balda, Maria S; Allan, Bruce D

    2016-11-01

    To evaluate the effect of media composition and storage method on pre-prepared Descemet's membrane endothelial keratoplasty (DMEK) grafts. Fifty corneas were used. Endothelial wound healing and proliferation in different media were assessed using a standard injury model. DMEK grafts were stored using three methods: peeling with free scroll storage; partial peeling with storage on the stroma; and fluid bubble separation with storage on the stroma. Endothelial cell (EC) phenotype and the extent of endothelial overgrowth were examined. Global cell viability was assessed for storage methods that maintained a normal cell phenotype. Wounds of 1 mm healed within 4 days. Enhanced media did not increase EC proliferation but may have increased EC migration into the wounded area. Grafts that had been trephined showed evidence of EC overgrowth, whereas preservation of a physical barrier in the bubble group prevented this. In grafts stored in enhanced media or reapposed to the stroma after trephination, endothelial migration occurred sooner and cells underwent endothelial-mesenchymal transformation. Ongoing cell loss, with new patterns of cell death, was observed after returning grafts to storage. Grafts stored as free scrolls retained more viable ECs than grafts prepared with the fluid bubble method (74.2±3% vs 60.3±6%, p=0.04, n=8). Free scroll storage is superior to fluid bubble and partial peeling techniques. Free scrolls only showed overgrowth of ECs after 4 days in organ culture, indicating a viable time window for the clinical use of pre-prepared DMEK donor material using this method. Methods for tissue preparation and storage media developed for whole corneas should not be used in pre-prepared DMEK grafts without prior evaluation.

  11. FISHprep: A Novel Integrated Device for Metaphase FISH Sample Preparation

    DEFF Research Database (Denmark)

    Shah, Pranjul Jaykumar; Vedarethinam, Indumathi; Kwasny, Dorota

    2011-01-01

    We present a novel integrated device for preparing metaphase chromosome spread slides (FISHprep). The quality of cytogenetic analysis from patient samples greatly relies on the efficiency of sample pre-treatment and/or slide preparation. In cytogenetic slide preparation, cell cultures … are routinely used to process samples (for culture, arrest and fixation of cells) and/or to expand limited amounts of sample (in the case of prenatal diagnostics). Arguably, this expansion and other sample pretreatments form the longest part of the entire diagnostic protocol, spanning 3–4 days. We present here … with minimal handling for metaphase FISH slide preparation.

  12. Approximation of the exponential integral (well function) using sampling methods

    Science.gov (United States)

    Baalousha, Husam Musa

    2015-04-01

    The exponential integral (also known as the well function) is often used in hydrogeology to solve the Theis and Hantush equations. Many methods have been developed to approximate the exponential integral. Most of these methods are based on numerical approximations and are valid for a certain range of the argument value. This paper presents a new approach to approximate the exponential integral, based on sampling methods. Three different sampling methods, Latin Hypercube Sampling (LHS), Orthogonal Array (OA), and Orthogonal Array-based Latin Hypercube (OA-LH), have been used to approximate the function. Different argument values, covering a wide range, have been used. The results of the sampling methods were compared with results obtained by Mathematica software, which was used as a benchmark. All three sampling methods converge to the result obtained by Mathematica, at different rates. It was found that the orthogonal array (OA) method has the fastest convergence rate compared with LHS and OA-LH. The root mean square error (RMSE) of OA was on the order of 1E-08. This method can be used with any argument value, and can be used to solve other integrals in hydrogeology, such as the leaky aquifer integral.
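
    As a hedged illustration of the sampling idea (not the paper's actual algorithm), the sketch below estimates the well function from Latin Hypercube samples after a change of variable that maps the integral onto the unit interval, and compares the estimate against scipy's exp1 as a benchmark.

```python
# W(u) = E1(u) = ∫_u^∞ e^(-t)/t dt.  With the substitution t = u/x the integral
# becomes E1(u) = ∫_0^1 e^(-u/x)/x dx, estimated here as the mean of the
# integrand over Latin Hypercube samples of x in (0, 1].
import numpy as np
from scipy.stats import qmc
from scipy.special import exp1  # reference value (plays the role of the benchmark)

def well_function_lhs(u, n=4096, seed=0):
    x = qmc.LatinHypercube(d=1, seed=seed).random(n).ravel()
    x = np.clip(x, 1e-12, 1.0)          # guard against division by zero
    return float(np.mean(np.exp(-u / x) / x))

for u in (0.01, 0.1, 1.0):
    print(u, well_function_lhs(u), exp1(u))
```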

  13. Entropic sampling in the path integral Monte Carlo method

    International Nuclear Information System (INIS)

    Vorontsov-Velyaminov, P N; Lyubartsev, A P

    2003-01-01

    We have extended the entropic sampling Monte Carlo method to the case of path integral representation of a quantum system. A two-dimensional density of states is introduced into path integral form of the quantum canonical partition function. Entropic sampling technique within the algorithm suggested recently by Wang and Landau (Wang F and Landau D P 2001 Phys. Rev. Lett. 86 2050) is then applied to calculate the corresponding entropy distribution. A three-dimensional quantum oscillator is considered as an example. Canonical distributions for a wide range of temperatures are obtained in a single simulation run, and exact data for the energy are reproduced
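
    The core Wang-Landau update the abstract builds on can be illustrated compactly. The sketch below applies it to a classical 1D Ising ring rather than to the two-dimensional path-integral density of states used in the paper, so only the flat-histogram logic is shown; all parameters are illustrative.

```python
# Hedged illustration of Wang-Landau entropic sampling (classical 1D Ising
# ring, not the paper's quantum system): accept moves with probability
# g(E_old)/g(E_new), add ln f to ln g(E) after every step, and halve ln f
# whenever the visit histogram is roughly flat.
import numpy as np

rng = np.random.default_rng(0)
N = 16
spins = rng.choice([-1, 1], size=N)
E = int(-np.sum(spins * np.roll(spins, 1)))

log_g, hist, ln_f = {}, {}, 1.0
while ln_f > 1e-3:
    for _ in range(10000):
        i = rng.integers(N)
        dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % N])
        if np.log(rng.random()) < log_g.get(E, 0.0) - log_g.get(E + dE, 0.0):
            spins[i] *= -1
            E += dE
        log_g[E] = log_g.get(E, 0.0) + ln_f
        hist[E] = hist.get(E, 0) + 1
    counts = np.array(list(hist.values()))
    if counts.min() > 0.8 * counts.mean():   # flat-histogram criterion
        ln_f, hist = 0.5 * ln_f, {}          # refine f and reset the histogram

print(sorted(log_g.items()))                 # unnormalized ln g(E)
```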

  14. Nevada National Security Site Integrated Groundwater Sampling Plan, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Marutzky, Sam; Farnham, Irene

    2014-10-01

    The purpose of the Nevada National Security Site (NNSS) Integrated Sampling Plan (referred to herein as the Plan) is to provide a comprehensive, integrated approach for collecting and analyzing groundwater samples to meet the needs and objectives of the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Field Office (NNSA/NFO) Underground Test Area (UGTA) Activity. Implementation of this Plan will provide high-quality data required by the UGTA Activity for ensuring public protection in an efficient and cost-effective manner. The Plan is designed to ensure compliance with the UGTA Quality Assurance Plan (QAP). The Plan’s scope comprises sample collection and analysis requirements relevant to assessing the extent of groundwater contamination from underground nuclear testing. This Plan identifies locations to be sampled by corrective action unit (CAU) and location type, sampling frequencies, sample collection methodologies, and the constituents to be analyzed. In addition, the Plan defines data collection criteria such as well-purging requirements, detection levels, and accuracy requirements; identifies reporting and data management requirements; and provides a process to ensure coordination between NNSS groundwater sampling programs for sampling of interest to UGTA. This Plan does not address compliance with requirements for wells that supply the NNSS public water system or wells involved in a permitted activity.

  15. Analysis of Waste Isolation Pilot Plant Samples: Integrated Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Britt, Phillip F [ORNL

    2015-03-01

    Analysis of Waste Isolation Pilot Plant Samples: Integrated Summary Report. Summaries of conclusions, analytical processes, and analytical results. Analysis of samples taken from the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico, in support of the WIPP Technical Assessment Team (TAT) activities to determine, to the extent feasible, the mechanisms and chemical reactions that may have resulted in the breach of at least one waste drum and release of waste material in WIPP Panel 7 Room 7 on February 14, 2014. This report integrates and summarizes the results contained in three separate reports, listed below, and draws conclusions based on those results:
    (1) Chemical and Radiochemical Analyses of WIPP Samples R-15 C5 SWB and R16 C-4 Lip; PNNL-24003; Pacific Northwest National Laboratory, December 2014.
    (2) Analysis of Waste Isolation Pilot Plant (WIPP) Underground and MgO Samples by the Savannah River National Laboratory (SRNL); SRNL-STI-2014-00617; Savannah River National Laboratory, December 2014.
    (3) Report for WIPP UG Sample #3, R15C5 (9/3/14); LLNL-TR-667015; Lawrence Livermore National Laboratory, January 2015.
    This report is also contained in the Waste Isolation Pilot Plant Technical Assessment Team Report; SRNL-RP-2015-01198; Savannah River National Laboratory, March 17, 2015, as Appendix C: Analysis Integrated Summary Report.

  16. Controlling a sample changer using the integrated counting system

    International Nuclear Information System (INIS)

    Deacon, S.; Stevens, M.P.

    1985-06-01

    Control of the Sample Changer from a counting system can be achieved by using a Scaler Timer type 6255 and Sample Changer Control Interface type 6263. The interface used, however, has quite complex circuitry. The application therefore lends itself to the use of another 6000 Series module, the Integrated Counting System (ICS). Using this unit, control is carried out through a control program written in BASIC for the Commodore PET (or any other device with an IEEE-488 interface). The ICS then controls the sample changer through an interface unit which is relatively simple. A brief description of how the ICS controls the sample changer is given. The control program is then described: first the running options are given, followed by a program description, listing and flowchart. (author)

  17. Controlling a sample changer using the integrated counting system

    Energy Technology Data Exchange (ETDEWEB)

    Deacon, S; Stevens, M P

    1985-06-01

    Control of the Sample Changer from a counting system can be achieved by using a Scaler Timer type 6255 and Sample Changer Control Interface type 6263. The interface used, however, has quite complex circuitry. The application therefore lends itself to the use of another 6000 Series module, the Integrated Counting System (ICS). Using this unit, control is carried out through a control program written in BASIC for the Commodore PET (or any other device with an IEEE-488 interface). The ICS then controls the sample changer through an interface unit which is relatively simple. A brief description of how the ICS controls the sample changer is given. The control program is then described: first the running options are given, followed by a program description, listing and flowchart.

  18. An integrated approach for multi-level sample size determination

    International Nuclear Information System (INIS)

    Lu, M.S.; Teichmann, T.; Sanborn, J.B.

    1997-01-01

    Inspection procedures involving the sampling of items in a population often require steps of increasingly sensitive measurements, with correspondingly smaller sample sizes; these are referred to as multilevel sampling schemes. In the case of nuclear safeguards inspections verifying that there has been no diversion of Special Nuclear Material (SNM), these procedures have been examined often, and increasingly complex algorithms have been developed to implement them. The aim in this paper is to provide an integrated approach and, in so doing, to describe a systematic, consistent method that proceeds logically from level to level with increasing accuracy. The authors emphasize that the methods discussed are generally consistent with those presented in the references mentioned, and yield comparable results when the error models are the same. However, because of its systematic, integrated approach, the proposed method elucidates the conceptual understanding of what goes on and, in many cases, simplifies the calculations. In nuclear safeguards inspections, an important aspect of verifying nuclear items to detect any possible diversion of nuclear fissile materials is the sampling of such items at various levels of sensitivity. The first step usually is sampling by 'attributes', involving measurements of relatively low accuracy, followed by further levels of sampling involving greater accuracy. This process is discussed in some detail in the references given; the nomenclature is also described. Here, the authors outline a coordinated step-by-step procedure for achieving such multilevel sampling, and they develop the relationships between the accuracy of measurement and the sample size required at each stage, i.e., at the various levels. The logic of the underlying procedures is carefully elucidated; the calculations involved, and their implications, are clearly described; and the process is put in a form that allows systematic generalization.

  19. Phase contrast STEM for thin samples: Integrated differential phase contrast.

    Science.gov (United States)

    Lazić, Ivan; Bosch, Eric G T; Lazar, Sorin

    2016-01-01

    It has been known since the 1970s that the movement of the center of mass (COM) of a convergent beam electron diffraction (CBED) pattern is linearly related to the (projected) electrical field in the sample. We re-derive a contrast transfer function (CTF) for a scanning transmission electron microscopy (STEM) imaging technique based on this movement from the point of view of image formation and continue by performing a two-dimensional integration on the two images based on the two components of the COM movement. The resulting integrated COM (iCOM) STEM technique yields a scalar image that is linear in the phase shift caused by the sample and therefore also in the local (projected) electrostatic potential field of a thin sample. We confirm that the differential phase contrast (DPC) STEM technique using a segmented detector with 4 quadrants (4Q) yields a good approximation for the COM movement. Performing a two-dimensional integration, just as for the COM, we obtain an integrated DPC (iDPC) image which is approximately linear in the phase of the sample. Besides deriving the CTFs of iCOM and iDPC, we clearly point out the objects of the two corresponding imaging techniques, and highlight the differences to objects corresponding to COM-, DPC-, and (HA)ADF-STEM. The theory is validated with simulations and we present first experimental results of the iDPC-STEM technique showing its capability for imaging both light and heavy elements with atomic resolution and a good signal-to-noise ratio (SNR).
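
    The two-dimensional integration step can be sketched with a standard Fourier-space integration of a two-component, gradient-like signal. The conventions and prefactors below are illustrative and do not reproduce the paper's CTF treatment; the test image and function names are invented.

```python
# Hedged sketch: recover a scalar "integrated" image from the two components
# of a DPC/COM-style signal, assuming the signal approximates the gradient of
# a phase image.  Integration is done in Fourier space.
import numpy as np

def integrate_2d(gx, gy):
    ny, nx = gx.shape
    kx = np.fft.fftfreq(nx)[None, :]          # cycles per pixel, x (axis 1)
    ky = np.fft.fftfreq(ny)[:, None]          # cycles per pixel, y (axis 0)
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                            # avoid dividing by zero at DC
    num = kx * np.fft.fft2(gx) + ky * np.fft.fft2(gy)
    phi = np.fft.ifft2(num / (2j * np.pi * k2)).real
    return phi - phi.mean()                   # DC term is undefined; set mean to 0

# Self-check: integrate the numerical gradient of a known phase image
y, x = np.mgrid[0:128, 0:128] / 128.0
phase = np.sin(2 * np.pi * 3 * x) * np.cos(2 * np.pi * 2 * y)
gy, gx = np.gradient(phase)                   # d/dy first, then d/dx
recon = integrate_2d(gx, gy)
print(np.corrcoef(recon.ravel(), (phase - phase.mean()).ravel())[0, 1])  # close to 1
```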

  20. Phase contrast STEM for thin samples: Integrated differential phase contrast

    International Nuclear Information System (INIS)

    Lazić, Ivan; Bosch, Eric G.T.; Lazar, Sorin

    2016-01-01

    It has been known since the 1970s that the movement of the center of mass (COM) of a convergent beam electron diffraction (CBED) pattern is linearly related to the (projected) electrical field in the sample. We re-derive a contrast transfer function (CTF) for a scanning transmission electron microscopy (STEM) imaging technique based on this movement from the point of view of image formation and continue by performing a two-dimensional integration on the two images based on the two components of the COM movement. The resulting integrated COM (iCOM) STEM technique yields a scalar image that is linear in the phase shift caused by the sample and therefore also in the local (projected) electrostatic potential field of a thin sample. We confirm that the differential phase contrast (DPC) STEM technique using a segmented detector with 4 quadrants (4Q) yields a good approximation for the COM movement. Performing a two-dimensional integration, just as for the COM, we obtain an integrated DPC (iDPC) image which is approximately linear in the phase of the sample. Besides deriving the CTFs of iCOM and iDPC, we clearly point out the objects of the two corresponding imaging techniques, and highlight the differences to objects corresponding to COM-, DPC-, and (HA)ADF-STEM. The theory is validated with simulations and we present first experimental results of the iDPC-STEM technique showing its capability for imaging both light and heavy elements with atomic resolution and a good signal-to-noise ratio (SNR). - Highlights: • First DPC-based atomic resolution images of potential and charge density are obtained. • This is enabled by integration and differentiation of 2D DPC signals, respectively. • Integrated DPC (iDPC) based on 4 quadrant imaging is compared to iCOM imaging. • Noise analysis and comparison with standard STEM imaging modes is provided. • iDPC allows direct imaging of light (C, N, O …) and heavy (Ga, Au …) atoms together.

  1. Phase contrast STEM for thin samples: Integrated differential phase contrast

    Energy Technology Data Exchange (ETDEWEB)

    Lazić, Ivan, E-mail: ivan.lazic@fei.com; Bosch, Eric G.T.; Lazar, Sorin

    2016-01-15

    It has been known since the 1970s that the movement of the center of mass (COM) of a convergent beam electron diffraction (CBED) pattern is linearly related to the (projected) electrical field in the sample. We re-derive a contrast transfer function (CTF) for a scanning transmission electron microscopy (STEM) imaging technique based on this movement from the point of view of image formation and continue by performing a two-dimensional integration on the two images based on the two components of the COM movement. The resulting integrated COM (iCOM) STEM technique yields a scalar image that is linear in the phase shift caused by the sample and therefore also in the local (projected) electrostatic potential field of a thin sample. We confirm that the differential phase contrast (DPC) STEM technique using a segmented detector with 4 quadrants (4Q) yields a good approximation for the COM movement. Performing a two-dimensional integration, just as for the COM, we obtain an integrated DPC (iDPC) image which is approximately linear in the phase of the sample. Besides deriving the CTFs of iCOM and iDPC, we clearly point out the objects of the two corresponding imaging techniques, and highlight the differences to objects corresponding to COM-, DPC-, and (HA)ADF-STEM. The theory is validated with simulations and we present first experimental results of the iDPC-STEM technique showing its capability for imaging both light and heavy elements with atomic resolution and a good signal-to-noise ratio (SNR). - Highlights: • First DPC-based atomic resolution images of potential and charge density are obtained. • This is enabled by integration and differentiation of 2D DPC signals, respectively. • Integrated DPC (iDPC) based on 4 quadrant imaging is compared to iCOM imaging. • Noise analysis and comparison with standard STEM imaging modes is provided. • iDPC allows direct imaging of light (C, N, O …) and heavy (Ga, Au …) atoms together.

  2. Integrated sampling and analysis plan for samples measuring >10 mrem/hour

    International Nuclear Information System (INIS)

    Haller, C.S.

    1992-03-01

    This integrated sampling and analysis plan was prepared to assist in the planning and scheduling of Hanford Site sampling and analytical activities for all waste characterization samples that measure greater than 10 mrem/hour. This report also satisfies the requirements of the renegotiated Interim Milestone M-10-05 of the Hanford Federal Facility Agreement and Consent Order (the Tri-Party Agreement). For purposes of comparing the various analytical needs with the Hanford Site laboratory capabilities, the analytical requirements of the various programs were normalized by converting the required laboratory effort for each type of sample to a common unit of work, the standard analytical equivalency unit (AEU). The AEU approximates the amount of laboratory resources required to perform an extensive suite of analyses on five core segments individually, plus one additional suite of analyses on a composite sample derived from a mixture of the five core segments, and to prepare a validated RCRA-type data package.

  3. An integrated and accessible sample data library for Mars sample return science

    Science.gov (United States)

    Tuite, M. L., Jr.; Williford, K. H.

    2015-12-01

    Over the course of the next decade or more, many thousands of geological samples will be collected and analyzed in a variety of ways by researchers at the Jet Propulsion Laboratory (California Institute of Technology) in order to facilitate discovery and contextualize observations made of Mars rocks both in situ and here on Earth if samples are eventually returned. Integration of data from multiple analyses of samples, including petrography, thin section and SEM imaging, isotope and organic geochemistry, XRF, XRD, and Raman spectrometry, is a challenge and a potential obstacle to discoveries that require supporting lines of evidence. We report the development of a web-accessible repository, the Sample Data Library (SDL), for the sample-based data that are generated by the laboratories and instruments that comprise JPL's Center for Analysis of Returned Samples (CARS), in order to facilitate collaborative interpretation of potential biosignatures in Mars-analog geological samples. The SDL is constructed using low-cost, open-standards-based Amazon Web Services (AWS), including web-accessible storage, relational database services, and a virtual web server. The data structure is sample-centered, with a shared registry for assigning unique identifiers to all samples, including International Geo-Sample Numbers. Both raw and derived data produced by instruments and post-processing workflows are automatically uploaded to online storage and linked via the unique identifiers. Through the web interface, users are able to find all the analyses associated with a single sample or search across features shared by multiple samples, sample localities, and analysis types. Planned features include more sophisticated search and analytical interfaces as well as data discoverability through NSF's EarthCube program.

  4. Nevada National Security Site Integrated Groundwater Sampling Plan, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Farnham, Irene

    2018-03-01

    The purpose is to provide a comprehensive, integrated approach for collecting and analyzing groundwater samples to meet the needs and objectives of the DOE/EM Nevada Program’s UGTA Activity. Implementation of this Plan will provide high-quality data required by the UGTA Activity for ensuring public protection in an efficient and cost-effective manner. The Plan is designed to ensure compliance with the UGTA Quality Assurance Plan (QAP) (NNSA/NFO, 2015); the Federal Facility Agreement and Consent Order (FFACO) (1996, as amended); and DOE Order 458.1, Radiation Protection of the Public and the Environment (DOE, 2013). The Plan’s scope comprises sample collection and analysis requirements relevant to assessing both the extent of groundwater contamination from underground nuclear testing and the impact of testing on water quality in downgradient communities. This Plan identifies locations to be sampled by CAU and location type, sampling frequencies, sample collection methodologies, and the constituents to be analyzed. In addition, the Plan defines data collection criteria such as well purging, detection levels, and accuracy requirements/recommendations; identifies reporting and data management requirements; and provides a process to ensure coordination between NNSS groundwater sampling programs for sampling analytes of interest to UGTA. Information used in the Plan development, including the rationale for selection of wells, sampling frequency, and the analytical suite, is discussed under separate cover (N-I, 2014) and is not reproduced herein. This Plan does not address compliance for those wells involved in a permitted activity. Sampling and analysis requirements associated with these wells are described in their respective permits and are discussed in NNSS environmental reports (see Section 5.2). In addition, sampling for UGTA CAUs that are in the Closure Report (CR) stage is not included in this Plan. Sampling requirements for these CAUs are described in the CR.

  5. Integration of Apollo Lunar Sample Data into Google Moon

    Science.gov (United States)

    Dawson, Melissa D.; Todd, Nancy S.; Lofgren, Gary

    2010-01-01

    The Google Moon Apollo Lunar Sample Data Integration project is a continuation of the Apollo 15 Google Moon Add-On project, which provides a scientific and educational tool for the study of the Moon and its geologic features. The main goal of this project is to provide a user-friendly interface for an interactive and educational outreach and learning tool for the Apollo missions. Specifically, this project's focus is the dissemination of information about the lunar samples collected during the Apollo missions by providing any additional information needed to enhance the Apollo mission data on Google Moon. Apollo missions 15 and 16 were chosen to be completed first due to the availability of digitized lunar sample photographs and the amount of media associated with these missions. The user will be able to learn about the lunar samples collected in these Apollo missions, as well as see videos, pictures, and 360-degree panoramas of the lunar surface depicting the lunar samples in their natural state, following collection and during processing at NASA. Once completed, these interactive data layers will be submitted for inclusion into the Apollo 15 and 16 missions on Google Moon.

  6. SAMPLE RESULTS FROM THE INTEGRATED SALT DISPOSITION PROGRAM MACROBATCH 4 TANK 21H QUALIFICATION SAMPLES

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T.; Fink, S.

    2011-06-22

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H to qualify them for use in the Integrated Salt Disposition Program (ISDP) Batch 4 processing. All sample results agree with expectations based on prior analyses, where available. No issues with the projected Salt Batch 4 strategy are identified. This revision includes additional data points that were not available in the original issue of the document, such as additional plutonium results and the results of the monosodium titanate (MST) sorption test and the extraction-scrub-strip (ESS) test. This report covers the revision to the Tank 21H qualification sample results for Macrobatch (Salt Batch) 4 of the Integrated Salt Disposition Program (ISDP). A previous document covers initial characterization, which includes results for a number of non-radiological analytes. These results were used to perform aluminum solubility modeling to determine the hydroxide needs for Salt Batch 4 to prevent the precipitation of solids. Sodium hydroxide was then added to Tank 21 and additional samples were pulled for the analyses discussed in this report. This work was specified by a Task Technical Request and by a Task Technical and Quality Assurance Plan (TTQAP).

  7. Integrated microfabricated biodevices. New advances in sample preparation (T2)

    International Nuclear Information System (INIS)

    Guttman, A.

    2002-01-01

    Full text: Interdisciplinary science and technologies have converged in the past few years to create exciting challenges and opportunities, which involve novel, integrated microfabricated systems, facilitating large-scale analytical applications. These new devices are referred to as lab-on-a-chip or micro Total Analysis Systems (uTAS). Their development involves both established and evolving technologies, which include microlithography, micromachining, micro-electromechanical systems (MEMS) technology, microfluidics and nanotechnology. The advent of this extremely powerful and rapid analysis technique opens up new horizons in analytical chemistry and molecular biology, capable of revealing global changes in gene expression levels by enabling genome, proteome and metabolome analysis on microchips. This presentation will provide an overview of the key device subject areas and the basic interdisciplinary technologies. It will also give a better understanding of how to utilize these miniaturized technologies as well as to provide appropriate technical solutions to problems perceived as being more fundamental. Theoretical and practical aspects of integrating sample preparation/purification and analysis units with chemical and biochemical reactors in monolithic microdevices are going to be thoroughly discussed. Important applications for this novel 'synergized' technology in high throughput analysis of biologically important molecules will also be addressed. (author)

  8. Storage Effects on Sample Integrity of Environmental Surface Sampling Specimens with Bacillus anthracis Spores.

    Science.gov (United States)

    Perry, K Allison; O'Connell, Heather A; Rose, Laura J; Noble-Wang, Judith A; Arduino, Matthew J

    The effect of packaging, shipping temperatures, and storage times on the recovery of Bacillus anthracis Sterne spores from swabs was investigated. Macrofoam swabs were pre-moistened, inoculated with Bacillus anthracis spores, and packaged in primary containment or secondary containment before storage at -15°C, 5°C, 21°C, or 35°C for 0-7 days. Swabs were processed according to validated Centers for Disease Control/Laboratory Response Network culture protocols, and the percent recovery relative to a reference sample (T0) was determined for each variable. No differences were observed in recovery between swabs held at -15°C and 5°C (p ≥ 0.23). These two temperatures provided significantly better recovery than swabs held at 21°C or 35°C (all 7 days pooled, p ≤ 0.04). The percent recovery at 5°C was not significantly different if processed on days 1, 2 or 4, but was significantly lower on day 7 (day 2 vs. 7, 5°C, 10², p=0.03). Secondary containment provided significantly better percent recovery than primary containment, regardless of storage time (5°C data, p ≤ 0.008). The integrity of environmental swab samples containing Bacillus anthracis spores shipped in secondary containment was maintained when stored at -15°C or 5°C and processed within 4 days to yield the optimum percent recovery of spores.

  9. Short Note An integrated remote sampling approach for aquatic ...

    African Journals Online (AJOL)

    A sampling method and apparatus for collecting meaningful and quantifiable samples of aquatic macroinvertebrates, and the macrophytes they are associated with, are presented. Where physical danger from wildlife is a significant factor, especially in Africa, this apparatus offers some safety in that it can be operated from a ...

  10. Glass sampling program during DWPF Integrated Cold Runs

    International Nuclear Information System (INIS)

    Plodinec, M.J.

    1990-01-01

    The described glass sampling program is designed to achieve two objectives: to demonstrate the Defense Waste Processing Facility's (DWPF) ability to control and verify the radionuclide release properties of the glass product, and to confirm DWPF's readiness to obtain glass samples during production, and SRL's readiness to analyze and test those samples remotely. The DWPF strategy for control of the radionuclide release properties of the glass product, and verification of its acceptability, are described in this report. The basic approach of the test program is then defined.

  11. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    OpenAIRE

    Toffoli, Valeria; Carrato, Sergio; Lee, Dongkyu; Jeon, Sangmin; Lazzarino, Marco

    2013-01-01

    The design and characteristics of a micro-system for thermogravimetric analysis (TGA) in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using milli-Q water and polyurethane microcapsule.

  12. An integrate-over-temperature approach for enhanced sampling.

    Science.gov (United States)

    Gao, Yi Qin

    2008-02-14

    A simple method is introduced to achieve efficient random walking in the energy space in molecular dynamics simulations which thus enhances the sampling over a large energy range. The approach is closely related to multicanonical and replica exchange simulation methods in that it allows configurations of the system to be sampled in a wide energy range by making use of Boltzmann distribution functions at multiple temperatures. A biased potential is quickly generated using this method and is then used in accelerated molecular dynamics simulations.
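
    A minimal sketch of the kind of multi-temperature bias the abstract describes is given below: the effective potential is built from a weighted sum of Boltzmann factors at several temperatures, which lowers barriers relative to the original potential. Uniform weights are used purely for illustration; in the actual method the weights are adapted iteratively and the bias enters the dynamics through rescaled forces.

```python
# Hedged sketch of an integrate-over-temperature style biased potential:
#   U_eff(x) = -(1/beta0) * ln( sum_k n_k * exp(-beta_k * U(x)) )
# Uniform weights n_k are assumed here; barrier heights shrink under U_eff.
import numpy as np
from scipy.special import logsumexp

def effective_potential(U, betas, beta0, weights=None):
    U, betas = np.asarray(U, float), np.asarray(betas, float)
    if weights is None:
        weights = np.full(len(betas), 1.0 / len(betas))
    log_sum = logsumexp(-np.outer(betas, U), b=weights[:, None], axis=0)
    return -log_sum / beta0

# Double-well example: an 8 kT barrier at the target temperature
x = np.linspace(-2.0, 2.0, 401)
U = 8.0 * (x**2 - 1.0) ** 2
U_eff = effective_potential(U, betas=np.linspace(0.2, 1.0, 6), beta0=1.0)
barrier = lambda v: v[len(v) // 2] - v.min()   # value at x = 0 minus the minimum
print(barrier(U), barrier(U_eff))              # the biased barrier is much lower
```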

  13. SAMPLE RESULTS FROM THE INTEGRATED SALT DISPOSITION PROGRAM MACROBATCH 5 TANK 21H QUALIFICATION MST, ESS AND PODD SAMPLES

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T.; Fink, S.

    2012-04-24

    Savannah River National Laboratory (SRNL) performed experiments on qualification material for use in the Integrated Salt Disposition Program (ISDP) Batch 5 processing. This qualification material was a composite created from recent samples from Tank 21H and archived samples from Tank 49H to match the projected blend from these two tanks. Additionally, samples of the composite were used in the Actinide Removal Process (ARP) and extraction-scrub-strip (ESS) tests. ARP and ESS test results met expectations. A sample from Tank 21H was also analyzed for the Performance Objectives Demonstration Document (PODD) requirements. SRNL was able to meet all of the requirements, including the desired detection limits for all the PODD analytes. This report details the results of the Actinide Removal Process (ARP), Extraction-Scrub-Strip (ESS) and Performance Objectives Demonstration Document (PODD) samples of Macrobatch (Salt Batch) 5 of the Integrated Salt Disposition Program (ISDP).

  14. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    Directory of Open Access Journals (Sweden)

    Valeria Toffoli

    2013-12-01

    Full Text Available The design and characteristics of a micro-system for thermogravimetric analysis (TGA) in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using milli-Q water and polyurethane microcapsule. The results demonstrated that our approach provides a faster and more sensitive TGA with respect to commercial systems.

  15. Heater-Integrated Cantilevers for Nano-Samples Thermogravimetric Analysis

    Science.gov (United States)

    Toffoli, Valeria; Carrato, Sergio; Lee, Dongkyu; Jeon, Sangmin; Lazzarino, Marco

    2013-01-01

    The design and characteristics of a micro-system for thermogravimetric analysis (TGA) in which heater, temperature sensor and mass sensor are integrated into a single device are presented. The system consists of a suspended cantilever that incorporates a microfabricated resistor, used as both heater and thermometer. A three-dimensional finite element analysis was used to define the structure parameters. TGA sensors were fabricated by standard microlithographic techniques and tested using milli-Q water and polyurethane microcapsule. The results demonstrated that our approach provides a faster and more sensitive TGA with respect to commercial systems.

  16. An integrated sampling and analysis approach for improved biodiversity monitoring

    Science.gov (United States)

    DeWan, Amielle A.; Zipkin, Elise F.

    2010-01-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species in conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.

  17. Integrating a sampling oscilloscope card and spectroscopy ADCs in a data acquisition system

    CERN Document Server

    Maartensson, L

    2001-01-01

    A high-rate sampling oscilloscope card has been integrated into an existing data acquisition system for spectroscopy ADCs. This makes possible experiments in which pulse-shape analysis is important. Good performance characteristics of the integrated system have been achieved. Spectroscopy ADC data, together with pulse-shape data sampled 512 times at 100 MHz, are saved to hard disk at event rates up to about 1 kHz with low dead-time losses.

  18. Apollo Lunar Sample Integration into Google Moon: A New Approach to Digitization

    Science.gov (United States)

    Dawson, Melissa D.; Todd, nancy S.; Lofgren, Gary E.

    2011-01-01

    The Google Moon Apollo Lunar Sample Data Integration project is part of a larger, LASER-funded 4-year lunar rock photo restoration project by NASA's Acquisition and Curation Office [1]. The objective of this project is to enhance the Apollo mission data already available on Google Moon with information about the lunar samples collected during the Apollo missions. To this end, we have combined rock sample data from various sources, including Curation databases, mission documentation and lunar sample catalogs, with newly available digital photography of rock samples to create a user-friendly, interactive tool for learning about the Apollo Moon samples.

  19. Passive sampling of selected endocrine disrupting compounds using polar organic chemical integrative samplers

    International Nuclear Information System (INIS)

    Arditsoglou, Anastasia; Voutsa, Dimitra

    2008-01-01

    Two types of polar organic chemical integrative samplers (pharmaceutical POCIS and pesticide POCIS) were examined for their sampling efficiency of selected endocrine disrupting compounds (EDCs). Laboratory-based calibration of POCISs was conducted by exposing them to high and low concentrations of 14 EDCs (4-alkyl-phenols, their ethoxylate oligomers, bisphenol A, selected estrogens and synthetic steroids) for different time periods. The kinetic studies showed an integrative uptake of up to 28 days. The sampling rates for the individual compounds were obtained. The use of POCISs could provide an integrative approach to assessing the quality status of aquatic systems, especially in cases of high variation in water concentrations of EDCs. The sampling efficiency of POCISs under various field conditions was assessed after their deployment in different aquatic environments. - Calibration and field performance of polar organic integrative samplers for monitoring EDCs in aquatic environments
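
    For reference, the way a calibrated sampling rate is typically used to derive a time-weighted average (TWA) water concentration from an integrative sampler can be sketched as below; the relation C_TWA = N/(Rs·t) is the standard integrative-uptake approximation, and the numbers are invented.

```python
# Hedged sketch: TWA water concentration from an integrative passive sampler,
#   C_TWA = N / (Rs * t),
# where N is the analyte mass accumulated on the sorbent, Rs the laboratory-
# calibrated sampling rate, and t the deployment time (integrative phase only).
def twa_concentration(mass_ng, sampling_rate_l_per_day, days):
    """Time-weighted average concentration in ng/L."""
    return mass_ng / (sampling_rate_l_per_day * days)

# Example: 120 ng accumulated over a 28-day deployment at Rs = 0.2 L/day
print(twa_concentration(120.0, 0.2, 28))  # about 21 ng/L
```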

  20. Background Information for the Nevada National Security Site Integrated Sampling Plan, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Farnham, Irene; Marutzky, Sam

    2014-12-01

    This document describes the process followed to develop the Nevada National Security Site (NNSS) Integrated Sampling Plan (referred to herein as the Plan). It provides the Plan’s purpose and objectives, and briefly describes the Underground Test Area (UGTA) Activity, including the conceptual model and regulatory requirements as they pertain to groundwater sampling. Background information on other NNSS groundwater monitoring programs—the Routine Radiological Environmental Monitoring Plan (RREMP) and Community Environmental Monitoring Program (CEMP)—and their integration with the Plan are presented. Descriptions of the evaluations, comments, and responses of two Sampling Plan topical committees are also included.

  1. Asymptotic Effectiveness of the Event-Based Sampling According to the Integral Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2007-01-01

    Full Text Available Rapid progress in intelligent sensing technology creates new interest in the analysis and design of non-conventional sampling schemes. The investigation of event-based sampling according to the integral criterion is presented in this paper. The investigated sampling scheme is an extension of the pure linear send-on-delta/level-crossing algorithm utilized for reporting the state of objects monitored by intelligent sensors. The motivation for using event-based integral sampling is outlined. Related work on adaptive sampling is summarized. Analytical closed-form formulas for the evaluation of the mean rate of event-based traffic, and for the asymptotic integral sampling effectiveness, are derived. Simulation results verifying the analytical formulas are reported. The effectiveness of the integral sampling is compared with the related linear send-on-delta/level-crossing scheme. The calculation of the asymptotic effectiveness for common signals, which model the state evolution of dynamic systems in time, is exemplified.
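
    The two triggering rules compared in the paper can be contrasted with a short simulation. The sketch below implements plain send-on-delta and one common variant of the integral criterion (trigger when the accumulated area between the signal and its last reported value exceeds a threshold); thresholds and the test signal are arbitrary.

```python
# Hedged sketch: event counts produced by send-on-delta versus an
# integral-criterion trigger on the same signal.
import numpy as np

def send_on_delta(t, x, delta):
    events, last = [0], x[0]
    for i in range(1, len(x)):
        if abs(x[i] - last) >= delta:
            events.append(i)
            last = x[i]
    return events

def integral_criterion(t, x, theta):
    events, last, area = [0], x[0], 0.0
    for i in range(1, len(x)):
        area += abs(x[i] - last) * (t[i] - t[i - 1])  # accumulated deviation area
        if area >= theta:
            events.append(i)
            last, area = x[i], 0.0
    return events

t = np.linspace(0.0, 10.0, 10001)
x = np.sin(t) + 0.3 * np.sin(5.0 * t)
print(len(send_on_delta(t, x, delta=0.1)),
      len(integral_criterion(t, x, theta=0.05)))
```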

  2. Integrating sphere based reflectance measurements for small-area semiconductor samples

    Science.gov (United States)

    Saylan, S.; Howells, C. T.; Dahlem, M. S.

    2018-05-01

    This article describes a method that enables reflectance spectroscopy of small semiconductor samples using an integrating sphere, without the use of additional optical elements. We employed an inexpensive sample holder to measure the reflectance of different samples through 2-, 3-, and 4.5-mm-diameter apertures and applied a mathematical formulation to remove the bias in the measured spectra caused by illumination of the holder. Using the proposed method, the reflectance of samples fabricated from expensive or rare materials and/or by low-throughput processes can be measured. It can also be incorporated to infer the internal quantum efficiency of small-area, research-level solar cells. Moreover, small samples that reflect light at large angles and exhibit scattering may also be measured reliably, by virtue of the integrating sphere's insensitivity to directionality.
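
    A sketch of the kind of holder-illumination correction described above (not the paper's exact formulation) is given below: the measured reflectance is modeled as an area-weighted mixture of sample and holder contributions and solved for the sample term. The function name, the mixing model, and the numbers are assumptions.

```python
# Hedged sketch: remove the holder contribution from a small-aperture
# reflectance measurement, assuming an area-weighted linear mixing model
#   R_meas = f * R_sample + (1 - f) * R_holder.
def corrected_reflectance(r_meas, r_holder, f_sample):
    """f_sample: fraction of the illuminated spot falling on the sample."""
    return (r_meas - (1.0 - f_sample) * r_holder) / f_sample

# Example with invented numbers: 70% of the spot on the sample,
# holder reflectance 0.05, raw measurement 0.31
print(corrected_reflectance(r_meas=0.31, r_holder=0.05, f_sample=0.70))  # ≈ 0.42
```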

  3. Efficient sampling over rough energy landscapes with high barriers: A combination of metadynamics with integrated tempering sampling

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Y. Isaac [Institute of Theoretical and Computational Chemistry, College of Chemistry and Molecular Engineering, Peking University, Beijing 100871 (China); Zhang, Jun; Che, Xing; Yang, Lijiang; Gao, Yi Qin, E-mail: gaoyq@pku.edu.cn [Institute of Theoretical and Computational Chemistry, College of Chemistry and Molecular Engineering, Peking University, Beijing 100871 (China); Biodynamic Optical Imaging Center, Peking University, Beijing 100871 (China)

    2016-03-07

    In order to efficiently overcome high free energy barriers embedded in a complex energy landscape and to calculate overall thermodynamic properties using molecular dynamics simulations, we developed and implemented a sampling strategy that combines metadynamics with the (selective) integrated tempering sampling (ITS/SITS) method. The dominant local minima on the potential energy surface (PES) are partially exalted by accumulating history-dependent potentials as in metadynamics, and the sampling over the entire PES is further enhanced by ITS/SITS. With this hybrid method, the simulated system can be rapidly driven across the dominant barrier along selected collective coordinates. ITS/SITS then ensures a fast convergence of the sampling over the entire PES and an efficient calculation of the overall thermodynamic properties of the simulated system. To test the accuracy and efficiency of this method, we first benchmarked it on the calculation of the ϕ–ψ distribution of alanine dipeptide in explicit solvent. We further applied it to examine the design of template molecules for aromatic meta-C–H activation in solution and to investigate solution conformations of the nonapeptide Bradykinin, which involve slow cis-trans isomerizations of three proline residues.

  4. Accelerated sampling by infinite swapping of path integral molecular dynamics with surface hopping

    Science.gov (United States)

    Lu, Jianfeng; Zhou, Zhennan

    2018-02-01

    To accelerate the thermal equilibrium sampling of multi-level quantum systems, the infinite swapping limit of a recently proposed multi-level ring polymer representation is investigated. In the infinite swapping limit, the ring polymer evolves according to an averaged Hamiltonian with respect to all possible surface index configurations of the ring polymer and thus connects the surface hopping approach to the mean-field path-integral molecular dynamics. A multiscale integrator for the infinite swapping limit is also proposed to enable efficient sampling based on the limiting dynamics. Numerical results demonstrate the huge improvement of sampling efficiency of the infinite swapping compared with the direct simulation of path-integral molecular dynamics with surface hopping.

  5. The Luna 16 and Luna 20 samples and their integrated studies in India

    International Nuclear Information System (INIS)

    Lal, D.

    1974-01-01

    The results of an integrated study of the physical and chemical properties of lunar samples returned to Earth by the automatic Soviet stations Luna 16 and Luna 20, carried out with a view to delineating the evolutionary history of the Moon, are reported. The nature of the Luna 16 and Luna 20 landing sites and lunar samples, and the manner in which the integrated analyses were planned and executed, are discussed. It is noted that the two lunar missions have provided a wealth of new information for unravelling the early history of planetary formation. (A.K.)

  6. Prevalence of the integration status for human papillomavirus 16 in esophageal carcinoma samples.

    Science.gov (United States)

    Li, Shuying; Shen, Haie; Li, Ji; Hou, Xiaoli; Zhang, Ke; Li, Jintao

    2018-03-01

    To investigate the etiology of esophageal cancer (EC) related to human papillomavirus (HPV) infection, fresh surgically resected tissue samples and clinical information were obtained from 189 patients. Genomic DNA was extracted, and HPV was detected using polymerase chain reaction (PCR) with the HPV L1 gene primers MY09/11; HPV16 was detected using HPV16 E6 type-specific primer sets. Copies of HPV16 E2, E6, and the human housekeeping gene β-actin were quantified using quantitative PCR to analyze the relationship between HPV16 integration and esophageal squamous cell carcinoma, and the relationship between HPV16 integration status and the clinical information of the patients. Of the 189 samples, 168 were HPV positive, of which 76 were HPV16 positive. Among the HPV16-positive samples, 2 cases (2.6%, 2/76; E2/E6 ratio > 1) were purely episomal, 65 (85.5%, 65/76; E2/E6 ratio between 0 and 1) contained a mixture of integrated and episomal forms, and 9 (11.8%, 9/76; E2/E6 ratio = 0) were purely integrated. The results indicate that integration of HPV16 into the host genome was more common than the purely episomal form, and the prevalence of HPV16 integration increased with the pathological stage of EC. The high prevalence of HPV16 suggests that HPV16 has an etiological effect on the progression of EC, and the predominance of the integrated over the episomal form indicates that continuous HPV infection is key to the malignant conversion and canceration of esophageal epithelial cells.
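
    The classification rule stated above maps directly onto a small helper function; the thresholds follow the E2/E6 ratios given in the abstract, while the function name and the example copy numbers are illustrative (a sketch, not the authors' analysis code).

        def hpv16_integration_status(e2_copies: float, e6_copies: float) -> str:
            """Classify HPV16 physical status from qPCR copy numbers via the E2/E6 ratio."""
            if e6_copies <= 0:
                raise ValueError("E6 not detected: sample is not HPV16 positive")
            ratio = e2_copies / e6_copies
            if ratio == 0:                       # E2 fully disrupted by integration
                return "purely integrated"
            if ratio > 1:                        # E2 at least as abundant as E6
                return "purely episomal"
            return "mixed episomal and integrated"

        print(hpv16_integration_status(0.0, 1.2e4))    # purely integrated
        print(hpv16_integration_status(4.0e3, 1.2e4))  # mixed episomal and integrated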

  7. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

    Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample-handling techniques and equipment. The world of sample handling has passed a threshold where older, 'old-fashioned' traditional techniques no longer provide the ability to see the sample, owing to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation and manipulation, newer and more sophisticated sample-handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged which integrates sample preparation and analysis to enable on-line, near real-time analysis. Examples of these newer sample-handling methods will be discussed and current examples provided as alternatives to traditional sample handling. Two new techniques applying microwave-energy-enhanced ultra-trace sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition; a demonstration relevant to semiconductor materials will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of these methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. A new automated, integrated method for handling samples for ultra-trace analysis has been developed, and an on-line, near real-time measurement system will be described that enables many new automated sample handling and measurement capabilities. This

  8. Geostatistical integration and uncertainty in pollutant concentration surface under preferential sampling

    Directory of Open Access Journals (Sweden)

    Laura Grisotto

    2016-04-01

    In this paper the focus is on environmental statistics, with the aim of estimating the concentration surface and related uncertainty of an air pollutant. We used air quality data recorded by a network of monitoring stations within a Bayesian framework to overcome difficulties in accounting for prediction uncertainty and to integrate information provided by deterministic models based on emissions, meteorology, and the chemico-physical characteristics of the atmosphere. Several authors have proposed such integration, but all the proposed approaches rely on the representativeness and completeness of existing air pollution monitoring networks. We considered the situation in which the spatial process of interest and the sampling locations are not independent. This is known in the literature as the preferential sampling problem, which, if ignored in the analysis, can bias geostatistical inferences. We developed a Bayesian geostatistical model to account for preferential sampling, with the main interest in statistical integration and uncertainty. We used PM10 data arising from the air quality network of the Environmental Protection Agency of Lombardy Region (Italy) and numerical outputs from the deterministic model. We specified an inhomogeneous Poisson process for the intensity of the sampling locations and a shared spatial random component model for the dependence between the spatial locations of the monitors and the pollution surface. We found larger differences in predicted standard deviations in areas not properly covered by the air quality network. In conclusion, in this context inferences on prediction uncertainty may be misleading when geostatistical modelling does not take preferential sampling into account.
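
    In schematic form (standard preferential-sampling notation, not copied from the paper), the shared-component structure couples the pollutant surface and the monitor locations through a common latent spatial field S(x):

        Y(x_i) \mid S \sim \mathcal{N}\bigl(\mu(x_i) + S(x_i),\, \tau^2\bigr) \quad \text{(PM}_{10}\text{ at monitor } x_i\text{)},
        \qquad
        X \sim \mathrm{IPP}(\lambda), \quad \log \lambda(x) = \alpha_0 + \alpha_1 S(x) \quad \text{(monitor locations)},

    where S(·) is a zero-mean Gaussian process, μ(x) can absorb the deterministic-model output as a covariate, and a non-zero α_1 encodes the preferential dependence between monitor placement and the pollution surface.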

  9. Passive sampling of selected pesticides in aquatic environment using polar organic chemical integrative samplers.

    Science.gov (United States)

    Thomatou, Alphanna-Akrivi; Zacharias, Ierotheos; Hela, Dimitra; Konstantinou, Ioannis

    2011-08-01

    Polar organic chemical integrative samplers (POCIS) were examined for their sampling efficiency for 12 pesticides and one metabolite commonly detected in surface waters. Laboratory-based calibration experiments of POCIS were conducted, and the determined passive sampling rates were applied to the monitoring of pesticide levels in Lake Amvrakia, Western Greece. Spot sampling was also performed for comparison purposes. Calibration experiments were performed on the basis of static renewal exposure of POCIS under stirred conditions for different time periods of up to 28 days. The analytical procedures were based on the coupling of POCIS and solid-phase extraction on Oasis HLB cartridges with gas chromatography-mass spectrometry. The recovery of the target pesticides from the POCIS was generally >79% with low relative standard deviations (RSD). Pesticide levels in the lake were determined during the monitoring campaign using both passive and spot sampling, with higher concentrations measured by spot sampling in most cases. Passive sampling by POCIS provides a useful tool for the monitoring of pesticides in aquatic systems, since integrative sampling at rates sufficient for analytical quantitation of ambient levels was observed. Calibration data are needed for a greater number of compounds in order to extend the use of POCIS in environmental monitoring.
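
    Once a sampling rate R_s has been calibrated for a compound, the time-weighted average (TWA) water concentration follows from the mass accumulated on the sorbent via the standard integrative-sampler relation C_TWA = N_s / (R_s · t); a small helper with illustrative numbers is shown below (not the authors' code).

        def twa_concentration(mass_sorbed_ng: float, sampling_rate_L_per_day: float, days: float) -> float:
            """Time-weighted average water concentration (ng/L): C_TWA = N_s / (R_s * t).
            Valid while uptake by the sampler remains in the linear (integrative) regime."""
            return mass_sorbed_ng / (sampling_rate_L_per_day * days)

        # Example: 120 ng accumulated over a 28-day deployment at R_s = 0.2 L/day -> ~21 ng/L.
        print(twa_concentration(120.0, 0.2, 28.0))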

  10. Controlled cooling versus rapid freezing of teratozoospermic semen samples: Impact on sperm chromatin integrity

    Directory of Open Access Journals (Sweden)

    Shivananda N Kalludi

    2011-01-01

    Aim: The present study evaluates the impact of controlled slow cooling and rapid freezing techniques on sperm chromatin integrity in teratozoospermic and normozoospermic samples. Setting: The study was done in a university infertility clinic, a tertiary healthcare center serving the general population. Design: It was a prospective in vitro study. Materials and Methods: Semen samples from normozoospermic (N=16) and teratozoospermic (N=13) infertile men were cryopreserved using controlled cooling and rapid freezing techniques. Sperm chromatin integrity was analyzed in fresh and frozen-thawed samples. Statistical Analysis Used: Data were reported as mean and standard error of the mean (mean ± SEM). The difference between the two techniques was determined by a paired t-test. Results: Freeze-thaw-induced chromatin denaturation was significantly (P<0.01) elevated in the post-thaw samples of both the normozoospermic and teratozoospermic groups. Compared to rapid freezing, there was no difference in the number of red sperms (with DNA damage) by the controlled slow cooling method in either the normozoospermic or the teratozoospermic group. Freeze-thaw-induced sperm chromatin denaturation in teratozoospermic samples did not vary between the controlled slow cooling and rapid freezing techniques. Conclusions: Since the controlled slow cooling technique involves the use of an expensive instrument and a time-consuming protocol, rapid freezing can be a good alternative technique for teratozoospermic and normozoospermic samples when sperm DNA damage is a concern.

  11. MPLEx: a Robust and Universal Protocol for Single-Sample Integrative Proteomic, Metabolomic, and Lipidomic Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nakayasu, Ernesto S.; Nicora, Carrie D.; Sims, Amy C.; Burnum-Johnson, Kristin E.; Kim, Young-Mo; Kyle, Jennifer E.; Matzke, Melissa M.; Shukla, Anil K.; Chu, Rosalie K.; Schepmoes, Athena A.; Jacobs, Jon M.; Baric, Ralph S.; Webb-Robertson, Bobbie-Jo; Smith, Richard D.; Metz, Thomas O.; Chia, Nicholas

    2016-05-03

    ABSTRACT

    Integrative multi-omics analyses can empower more effective investigation and complete understanding of complex biological systems. Despite recent advances in a range of omics analyses, multi-omic measurements of the same sample are still challenging, and current methods have not been well evaluated in terms of reproducibility and broad applicability. Here we adapted a solvent-based method, widely applied for extracting lipids and metabolites, to add proteomics to mass spectrometry-based multi-omics measurements. The metabolite, protein, and lipid extraction (MPLEx) protocol proved to be robust and applicable to a diverse set of sample types, including cell cultures, microbial communities, and tissues. To illustrate the utility of this protocol, an integrative multi-omics analysis was performed using a lung epithelial cell line infected with Middle East respiratory syndrome coronavirus, which showed the impact of this virus on the host glycolytic pathway and also suggested a role for lipids during infection. The MPLEx method is a simple, fast, and robust protocol that can be applied for integrative multi-omic measurements from diverse sample types (e.g., environmental, in vitro, and clinical).

    IMPORTANCE In systems biology studies, the integration of multiple omics measurements (i.e., genomics, transcriptomics, proteomics, metabolomics, and lipidomics) has been shown to provide a more complete and informative view of biological pathways. Thus, the prospect of extracting different types of molecules (e.g., DNAs, RNAs, proteins, and metabolites) and performing multiple omics measurements on single samples is very attractive, but such studies are challenging due to the fact that the extraction conditions differ according to the molecule type. Here, we adapted an organic solvent-based extraction method that demonstrated

  12. Integrated sample-to-detection chip for nucleic acid test assays.

    Science.gov (United States)

    Prakash, R; Pabbaraju, K; Wong, S; Tellier, R; Kaler, K V I S

    2016-06-01

    Nucleic acid-based diagnostic techniques are routinely used for the detection of infectious agents. Most of these assays rely on nucleic acid extraction platforms for the extraction and purification of nucleic acids and on a separate real-time PCR platform for quantitative nucleic acid amplification tests (NATs). Several microfluidic lab-on-chip (LOC) technologies have been developed in which mechanical and chemical methods are used for the extraction and purification of nucleic acids. Microfluidic technologies have also been effectively utilized for chip-based real-time PCR assays. However, there are few examples of microfluidic systems that have successfully integrated these two key processes. In this study, we have implemented an electro-actuation-based LOC micro-device that leverages multi-frequency actuation of sample and reagent droplets for chip-based nucleic acid extraction and real-time reverse transcription PCR (qRT-PCR) amplification from clinical samples. Our prototype micro-device combines chemical lysis with electric-field-assisted isolation of nucleic acid in a four-channel parallel processing scheme. Furthermore, a four-channel parallel qRT-PCR amplification and detection assay is integrated to deliver the sample-to-detection NAT chip. The NAT chip combines dielectrophoresis and electrostatic/electrowetting actuation methods with resistive micro-heaters and temperature sensors to perform chip-based integrated NATs. The two chip modules have been validated using different panels of clinical samples, and their performance has been compared with standard platforms. This study established that our integrated NAT chip system has a sensitivity and specificity comparable to those of the standard platforms while providing up to a 10-fold reduction in sample/reagent volumes.

  13. Methodological integrative review of the work sampling technique used in nursing workload research.

    Science.gov (United States)

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities; however, the work sampling methods used are diverse, making comparisons of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002 and 2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002 and 2012 reporting on research which used work sampling to examine nursing workload were included. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and of subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods, and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. The authors' suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.

  14. Digital pulse-shape analyzer based on fast sampling of an integrated charge pulse

    International Nuclear Information System (INIS)

    Jordanov, V.T.; Knoll, G.F.

    1995-01-01

    A novel configuration for pulse-shape analysis and discrimination has been developed. The current pulse from the detector is sent to a gated integrator and then sampled by a flash analog-to-digital converter (ADC). The sampled data are processed digitally, allowing implementation of a near-optimum weighting function and elimination of some of the instabilities associated with the gated integrator. The analyzer incorporates a pileup rejection circuit that reduces pileup effects at high counting rates. The system was tested with a liquid scintillator. Figures of merit for neutron-gamma pulse-shape discrimination were found to be 0.78 at 25 keV (electron equivalent energy) and 3.5 at 500 keV. The technique described in this paper was developed for use in a near tissue-equivalent neutron-gamma dosimeter which employs a liquid scintillator detector.
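
    As a reminder of how such figures of merit are conventionally computed (a generic sketch using a simple charge-comparison parameter, not the paper's near-optimum digital weighting), the neutron and gamma events form two peaks in the discrimination parameter and FOM = separation / (FWHM_n + FWHM_gamma):

        import numpy as np

        def figure_of_merit(psd_neutron: np.ndarray, psd_gamma: np.ndarray) -> float:
            """FOM = |mean_n - mean_g| / (FWHM_n + FWHM_g), assuming near-Gaussian peaks."""
            fwhm = lambda v: 2.355 * np.std(v)   # FWHM of a Gaussian from its standard deviation
            return abs(psd_neutron.mean() - psd_gamma.mean()) / (fwhm(psd_neutron) + fwhm(psd_gamma))

        # Synthetic example: tail-to-total charge ratios for the two particle types.
        rng = np.random.default_rng(0)
        print(figure_of_merit(rng.normal(0.30, 0.02, 5000), rng.normal(0.20, 0.02, 5000)))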

  15. Integrated Automation of High-Throughput Screening and Reverse Phase Protein Array Sample Preparation

    DEFF Research Database (Denmark)

    Pedersen, Marlene Lemvig; Block, Ines; List, Markus

    into automated robotic high-throughput screens, which allows subsequent protein quantification. In this integrated solution, samples are directly forwarded to automated cell lysate preparation and preparation of dilution series, including reformatting to a protein spotter-compatible format, after the high-throughput screening. Tracking of huge sample numbers and data analysis from a high-content screen to RPPAs is accomplished via MIRACLE, a custom-made software suite developed by us. To this end, we demonstrate that the RPPAs generated in this manner deliver reliable protein readouts and that GAPDH and TFR levels can

  16. Site-Wide Integrated Water Monitoring - Defining and Implementing Sampling Objectives to Support Site Closure - 13060

    International Nuclear Information System (INIS)

    Wilborn, Bill; Knapp, Kathryn; Farnham, Irene; Marutzky, Sam

    2013-01-01

    The Underground Test Area (UGTA) activity is responsible for assessing and evaluating the effects of the underground nuclear weapons tests on groundwater at the Nevada National Security Site (NNSS), formerly the Nevada Test Site (NTS), and implementing a corrective action closure strategy. The UGTA strategy is based on a combination of characterization, modeling studies, monitoring, and institutional controls (i.e., monitored natural attenuation). The closure strategy verifies through appropriate monitoring activities that contaminants of concern do not exceed Safe Drinking Water Act (SDWA) standards at the regulatory boundary and that adequate institutional controls are established and administered to ensure protection of the public. Other programs conducted at the NNSS supporting the environmental mission include the Routine Radiological Environmental Monitoring Program (RREMP), Waste Management, and the Infrastructure Program. Given the current programmatic and operational demands for various water-monitoring activities at the same locations, and the ever-increasing resource challenges, cooperative and collaborative approaches to conducting the work are necessary. For this reason, an integrated sampling plan is being developed by the UGTA activity to define sampling and analysis objectives, reduce duplication, eliminate unnecessary activities, and minimize costs. The sampling plan will ensure the right data sets are developed to support closure and efficient transition to long-term monitoring. The plan will include an integrated reporting mechanism for communicating results and integrating process improvements within the UGTA activity as well as with other U.S. Department of Energy (DOE) programs. (authors)

  17. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Science.gov (United States)

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical

  18. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Directory of Open Access Journals (Sweden)

    David C Pavlacky

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous

  19. Homogeneous immunosubtraction integrated with sample preparation is enabled by a microfluidic format

    Science.gov (United States)

    Apori, Akwasi A.; Herr, Amy E.

    2011-01-01

    Immunosubtraction is a powerful and resource-intensive laboratory medicine assay that reports both protein mobility and binding specificity. To expedite and automate this electrophoretic assay, we report advances to the electrophoretic immunosubtraction assay by introducing a homogeneous, rather than heterogeneous, format with integrated sample preparation. To accomplish homogeneous immunosubtraction, a step-decrease in separation matrix pore size at the head of a polyacrylamide gel electrophoresis (PAGE) separation channel enables 'subtraction' of the target analyte when capture antibody is present (as the large immune complex is excluded from PAGE), but no subtraction when capture antibody is absent. Inclusion of sample preparation functionality via small-pore-size polyacrylamide membranes is also key to automated operation (i.e., sample enrichment, fluorescence sample labeling, and mixing of sample with free capture antibody). Homogeneous sample preparation and assay operation allow on-the-fly, integrated subtraction of one to multiple protein targets and reuse of each device. Optimization of the assay is detailed, which allowed ~95% subtraction of target with 20% non-specific extraction of large species at the optimal antibody-antigen ratio, providing the conditions needed for selective target identification. We demonstrate the assay on putative markers of injury and inflammation in cerebrospinal fluid (CSF), an emerging area of diagnostics research, by rapidly reporting protein mobility and binding specificity within the sample matrix. We simultaneously detect S100B and C-reactive protein, suspected biomarkers for traumatic brain injury (TBI), in ~2 min. Lastly, we demonstrate S100B detection (65 nM) in raw human CSF with a lower limit of detection of ~3.25 nM, within the clinically relevant concentration range for detecting TBI in CSF. Beyond the novel CSF assay introduced here, a fully automated immunosubtraction assay would impact a spectrum of routine but labor

  20. Sampled-data consensus in switching networks of integrators based on edge events

    Science.gov (United States)

    Xiao, Feng; Meng, Xiangyu; Chen, Tongwen

    2015-02-01

    This paper investigates the event-driven sampled-data consensus in switching networks of multiple integrators and studies both the bidirectional interaction and leader-following passive reaction topologies in a unified framework. In these topologies, each information link is modelled by an edge of the information graph and assigned a sequence of edge events, which activate the mutual data sampling and controller updates of the two linked agents. Two kinds of edge-event-detecting rules are proposed for the general asynchronous data-sampling case and the synchronous periodic event-detecting case. They are implemented in a distributed fashion, and their effectiveness in reducing communication costs and solving consensus problems under a jointly connected topology condition is shown by both theoretical analysis and simulation examples.
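
    As a toy illustration of edge-event triggering for single integrators (an illustrative sketch; the detector rules and network switching in the paper are more general), each edge refreshes its sampled disagreement only when that disagreement has drifted by more than a threshold since the last edge event:

        import numpy as np

        def edge_event_consensus(x0, edges, delta=0.05, dt=0.01, steps=2000, gain=1.0):
            """Single-integrator consensus where each edge (i, j) resamples its disagreement
            only when it drifts by more than delta since the last edge event on that link."""
            x = np.array(x0, dtype=float)
            sampled = {e: x[e[0]] - x[e[1]] for e in edges}   # last sampled disagreement per edge
            for _ in range(steps):
                u = np.zeros_like(x)
                for (i, j) in edges:
                    if abs((x[i] - x[j]) - sampled[(i, j)]) > delta:   # edge event: resample the link
                        sampled[(i, j)] = x[i] - x[j]
                    u[i] -= gain * sampled[(i, j)]
                    u[j] += gain * sampled[(i, j)]
                x += dt * u                                    # integrator dynamics x_dot = u
            return x

        # Four agents on a line graph converge to a neighbourhood of their average (0.625).
        print(edge_event_consensus([1.0, 0.0, -0.5, 2.0], [(0, 1), (1, 2), (2, 3)]))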

  1. Strategies for monitoring the emerging polar organic contaminants in water with emphasis on integrative passive sampling.

    Science.gov (United States)

    Söderström, Hanna; Lindberg, Richard H; Fick, Jerker

    2009-01-16

    Although polar organic contaminants (POCs) such as pharmaceuticals are considered to be among today's most pressing emerging contaminants, few of them are regulated or included in ongoing monitoring programs. However, the growing concern among the public and researchers, together with new legislation within the European Union, the Registration, Evaluation and Authorisation of Chemicals (REACH) system, will increase the future need for simple, low-cost strategies for monitoring and risk assessment of POCs in aquatic environments. In this article, we give an overview of the advantages and shortcomings of traditional and novel sampling techniques available for monitoring emerging POCs in water. The benefits and drawbacks of using active and biological sampling are discussed and the principles of organic passive samplers (PS) presented. A detailed overview of the types of polar organic PS available, their classes of target compounds and fields of application is given, and the considerations involved in using them, such as environmental effects and quality control, are discussed. The usefulness of biological sampling of POCs in water was found to be limited. Polar organic PS were considered to be the only available, but nevertheless efficient, alternative to active water sampling, owing to their simplicity, low cost, lack of need for power supply or maintenance, and the ability to collect time-integrative samples with a single sample collection. However, polar organic PS need to be further developed before they can be used as a standard in water-quality monitoring programs.

  2. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition: "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics. "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice. "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math. Featuring new developments in the field combined with all aspects of obtaining, interpreting, and using sample data, Sampling provides an up-to-date treatment of the subject.

  3. Towards an integrated petrophysical tool for multiphase flow properties of core samples

    Energy Technology Data Exchange (ETDEWEB)

    Lenormand, R. [Institut Francais du Petrole, Rueil Malmaison (France)

    1997-08-01

    This paper describes the first use of an Integrated Petrophysical Tool (IPT) on reservoir rock samples. The IPT simultaneously measures the following petrophysical properties: (1) the complete capillary pressure cycle (primary drainage, spontaneous and forced imbibition, and secondary drainage), from which the wettability of the core is obtained via the USBM index; (2) end-points and parts of the relative permeability curves; and (3) the formation factor and resistivity index. The IPT is based on the steady-state injection of one fluid through the sample placed in a Hassler cell. The experiment leading to the whole Pc cycle on two reservoir sandstones consists of about 30 steps at various oil or water flow rates; it takes about four weeks and is operated at room conditions. Relative permeabilities are in line with standard steady-state measurements, and capillary pressures are in accordance with standard centrifuge measurements. There is no comparison available for the resistivity index, but the results are in agreement with literature data. However, the accurate determination of saturation remains the main difficulty, and some improvements are proposed. In conclusion, the Integrated Petrophysical Tool is as accurate as standard methods and has the advantage of providing the various parameters on the same sample during a single experiment. The IPT is easy to use and can be automated. In addition, it can be operated at reservoir conditions.
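
    For context, the USBM wettability index mentioned above is conventionally computed from the areas under the two forced-displacement branches of the capillary pressure cycle, I_USBM = log10(A1/A2), with A1 the area under the forced (secondary) drainage curve and A2 the area above the forced imbibition curve. The sketch below is a generic illustration of that calculation, not the IPT's internal processing.

        import numpy as np

        def usbm_index(sw_drain, pc_drain, sw_imb, pc_imb) -> float:
            """USBM wettability index: log10(A1/A2), areas taken between the Pc curves and Pc = 0."""
            a1 = abs(np.trapz(pc_drain, sw_drain))   # area under forced (secondary) drainage, Pc > 0
            a2 = abs(np.trapz(pc_imb, sw_imb))       # area above forced imbibition, Pc < 0
            return float(np.log10(a1 / a2))

        # Illustrative curves with equal areas give an index of 0 (neutral wettability).
        sw = np.linspace(0.2, 0.8, 20)
        print(usbm_index(sw, 2.0 * (0.8 - sw), sw, -2.0 * (sw - 0.2)))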

  4. Uncertainty assessment of integrated distributed hydrological models using GLUE with Markov chain Monte Carlo sampling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena; Madsen, Henrik; Rosbjerg, Dan

    2008-01-01

    In recent years, there has been an increase in the application of distributed, physically-based and integrated hydrological models. Many questions remain regarding how to properly calibrate and validate distributed models and assess the uncertainty of the estimated parameters and of the spatially distributed outputs; in particular, multi-site validation must complement the usual time validation. In this study, we develop, through an application, a comprehensive framework for multi-criteria calibration and uncertainty assessment of distributed, physically-based, integrated hydrological models. A revised version of the generalized likelihood uncertainty estimation (GLUE) procedure based on Markov chain Monte Carlo sampling is applied in order to improve the performance of the methodology in estimating parameters and posterior output distributions. The description of the spatial variations of the hydrological processes is accounted for by defining ...
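
    As a schematic reminder of the underlying GLUE idea (a generic sketch, not the revised MCMC-based procedure developed in the paper): candidate parameter sets are sampled, scored with an informal likelihood against observations, and only 'behavioural' sets above a threshold are retained and used as weights on the model predictions.

        import numpy as np

        def glue(sample_params, run_model, likelihood, observations, n_samples=5000, threshold=0.3):
            """Generic GLUE: keep behavioural parameter sets and weight their predictions."""
            kept_params, weights, preds = [], [], []
            for _ in range(n_samples):
                theta = sample_params()                  # draw from the prior parameter ranges
                sim = run_model(theta)
                score = likelihood(sim, observations)    # informal likelihood, e.g. Nash-Sutcliffe
                if score > threshold:                    # behavioural parameter set
                    kept_params.append(theta)
                    weights.append(score)
                    preds.append(sim)
            w = np.array(weights) / np.sum(weights)
            return kept_params, w, np.average(np.array(preds), axis=0, weights=w)

        # Toy usage: the "model" is y = a * x on a fixed grid, scored by Nash-Sutcliffe efficiency.
        x = np.linspace(0, 1, 50)
        obs = 2.0 * x + np.random.default_rng(1).normal(0, 0.05, x.size)
        nse = lambda sim, o: 1.0 - np.sum((sim - o) ** 2) / np.sum((o - o.mean()) ** 2)
        print(glue(lambda: np.random.uniform(0, 4), lambda a: a * x, nse, obs)[2][:5])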

  5. Evaluation of inlet sampling integrity on NSF/NCAR airborne platforms

    Science.gov (United States)

    Campos, T. L.; Stith, J. L.; Stephens, B. B.; Romashkin, P.

    2017-12-01

    An inlet test project was conducted during IDEAS-IV-GV (2013) to evaluate the sampling integrity of two inlet designs. Use of a single CO2 sensor provided a high-precision detector and a large difference between the mean cabin and external concentrations (500-700 ppmv in the cabin). The original HIAPER Modular InLet (HIMIL) consists of a tapered, flow-straightening, flow-through 'cigar' mounted on a strut. The cigar's center sampling line sits 12" from the fuselage skin. An o-ring seals the feedthrough plate coupling the sampling lines from the strut into the cigar; however, there is no seal to prevent air inside the strut from seeping out around the cigar body. A pressure-equalizing drain hole in the strut access panel was positioned at an approximate distance of 4" from the fuselage to ensure that air from any source draining out of the strut was confined to a low release point. A second, aft-facing inlet design was also evaluated, in which the sampling center line was moved farther from the fuselage, to a height of 16". A similar approach was also applied to sampling locations on the C-130 in 2015. The results of these tests and recommendations for best practices will be presented.

  6. The sample of INTEGRAL SPI-ACS gamma-ray bursts

    International Nuclear Information System (INIS)

    Rau, A.; Kienlin, A. von; Licht, G.G.; Hurley, K.

    2005-01-01

    The anti-coincidence system of the spectrometer on board INTEGRAL is operated as a nearly omnidirectional gamma-ray burst detector above ∼75 keV. During the elapsed mission time, 324 burst candidates were detected. As part of the 3rd Interplanetary Network of gamma-ray detectors, the cosmic origin of 115 bursts was confirmed. Here we present a preliminary analysis of the SPI-ACS gamma-ray burst sample. In particular, we discuss the origin of a significant population of short events (duration < 0.2 s) and a possible method for a flux calibration of the data.

  7. Integrity of the Human Faecal Microbiota following Long-Term Sample Storage.

    Directory of Open Access Journals (Sweden)

    Elahe Kia

    In studies of the human microbiome, faecal samples are frequently used as a non-invasive proxy for the study of the intestinal microbiota. To obtain reliable insights, the need for bacterial DNA of high quality and integrity following appropriate faecal sample collection and preservation steps is paramount. In a study of dietary mineral balance in the context of type 2 diabetes (T2D), faecal samples were collected from healthy and T2D individuals throughout a 13-day residential trial. These samples were freeze-dried, then stored mostly at -20°C from the trial date in 2000/2001 until the current research in 2014. Given the relative antiquity of these samples (~14 years), we sought to evaluate DNA quality and comparability to freshly collected human faecal samples. Following the extraction of bacterial DNA, gel electrophoresis indicated that our DNA extracts were more sheared than extracts made from freshly collected faecal samples, but still of sufficiently high molecular weight to support amplicon-based studies. Likewise, spectrophotometric assessment of extracts revealed that they were of high quality and quantity. A subset of bacterial 16S rRNA gene amplicons were sequenced using Illumina MiSeq and compared against publicly available sequence data representing a similar cohort analysed by the American Gut Project (AGP). Notably, our bacterial community profiles were highly consistent with those from the AGP data. Our results suggest that when faecal specimens are stored appropriately, the microbial profiles are preserved and robust to extended storage periods.

  8. Estimating cross-validatory predictive p-values with integrated importance sampling for disease mapping models.

    Science.gov (United States)

    Li, Longhai; Feng, Cindy X; Qiu, Shi

    2017-06-30

    An important statistical task in disease mapping problems is to identify divergent regions with unusually high or low risk of disease. Leave-one-out cross-validatory (LOOCV) model assessment is the gold standard for estimating the predictive p-values that can flag such divergent regions. However, actual LOOCV is time-consuming because one needs to rerun a Markov chain Monte Carlo analysis for each posterior distribution in which an observation is held out as a test case. This paper introduces a new method, called integrated importance sampling (iIS), for estimating LOOCV predictive p-values using only Markov chain samples drawn from the posterior based on the full data set. The key step in iIS is to integrate away the latent variables associated with the test observation with respect to their conditional distribution, without reference to the actual observation. By following the general theory for importance sampling, the formula used by iIS can be proved to be equivalent to the LOOCV predictive p-value. We compare iIS and three other existing methods in the literature on two disease mapping datasets. Our empirical results show that the predictive p-values estimated with iIS are almost identical to the predictive p-values estimated with actual LOOCV and outperform those given by the three existing methods, namely, posterior predictive checking, ordinary importance sampling, and the ghosting method of Marshall and Spiegelhalter (2003). Copyright © 2017 John Wiley & Sons, Ltd.
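
    For orientation, the ordinary importance sampling estimator that iIS improves upon reweights full-data posterior draws by 1/p(y_i | θ) so that they mimic the leave-one-out posterior; iIS additionally integrates out the latent variables of the held-out unit before forming the weights. The sketch below implements only the ordinary estimator, with illustrative inputs.

        import numpy as np
        from scipy.stats import norm

        def is_loocv_pvalue(loglik_obs: np.ndarray, tail_prob_rep: np.ndarray) -> float:
            """Ordinary importance-sampling estimate of a LOOCV predictive p-value.
            loglik_obs[s]    : log p(y_i | theta_s) under full-data posterior draw s
            tail_prob_rep[s] : P(Y_i_rep >= y_i | theta_s), per-draw predictive tail probability"""
            log_w = -loglik_obs                  # weights w_s proportional to 1 / p(y_i | theta_s)
            log_w -= log_w.max()                 # stabilize before exponentiating
            w = np.exp(log_w)
            return float(np.sum(w * tail_prob_rep) / np.sum(w))

        # Toy Gaussian example: posterior draws of a mean parameter, unit observation noise, y_i = 2.5.
        rng = np.random.default_rng(0)
        theta = rng.normal(0.0, 0.3, 4000)
        y_i = 2.5
        loglik = norm.logpdf(y_i, loc=theta, scale=1.0)
        tail = norm.sf(y_i, loc=theta, scale=1.0)
        print(is_loocv_pvalue(loglik, tail))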

  9. The 4-vessel Sampling Approach to Integrative Studies of Human Placental Physiology In Vivo.

    Science.gov (United States)

    Holme, Ane M; Holm, Maia B; Roland, Marie C P; Horne, Hildegunn; Michelsen, Trond M; Haugen, Guttorm; Henriksen, Tore

    2017-08-02

    The human placenta is highly inaccessible for research while still in utero. The current understanding of human placental physiology in vivo is therefore largely based on animal studies, despite the high diversity among species in placental anatomy, hemodynamics and duration of the pregnancy. The vast majority of human placenta studies are ex vivo perfusion studies or in vitro trophoblast studies. Although in vitro studies and animal models are essential, extrapolation of the results from such studies to the human placenta in vivo is uncertain. We aimed to study human placenta physiology in vivo at term, and present a detailed protocol of the method. Exploiting the intraabdominal access to the uterine vein just before the uterine incision during planned cesarean section, we collect blood samples from the incoming and outgoing vessels on the maternal and fetal sides of the placenta. When combining concentration measurements from blood samples with volume blood flow measurements, we are able to quantify placental and fetal uptake and release of any compound. Furthermore, placental tissue samples from the same mother-fetus pairs can provide measurements of transporter density and activity and other aspects of placental functions in vivo. Through this integrative use of the 4-vessel sampling method we are able to test some of the current concepts of placental nutrient transfer and metabolism in vivo, both in normal and pathological pregnancies. Furthermore, this method enables the identification of substances secreted by the placenta to the maternal circulation, which could be an important contribution to the search for biomarkers of placenta dysfunction.
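
    The quantification step described here corresponds to the Fick principle: combining the arteriovenous concentration difference across each side of the placenta with the corresponding volume blood flow gives the net uptake (or, if negative, release) of a compound. In generic notation (not the authors' symbols):

        \text{Fetal uptake} = Q_{\text{umbilical}} \times \bigl(C_{\text{umbilical vein}} - C_{\text{umbilical artery}}\bigr),
        \qquad
        \text{Uteroplacental uptake} = Q_{\text{uterine}} \times \bigl(C_{\text{uterine artery}} - C_{\text{uterine vein}}\bigr).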

  10. Validation of the abbreviated Radon Progeny Integrating Sampling Unit (RPISU) method for Mesa County, Colorado

    International Nuclear Information System (INIS)

    Langner, G.H. Jr.

    1987-06-01

    The US Department of Energy (DOE) Office of Remedial Action and Waste Technology established the Technical Measurements Center at the DOE Grand Junction, Colorado, Projects Office to standardize, calibrate, and compare measurements made in support of DOE remedial action programs. Indoor radon-daughter concentration measurements are made to determine whether a structure is in need of remedial action. The Technical Measurements Center conducted this study to validate an abbreviated Radon Progeny Integrating Sampling Unit (RPISU) method of making indoor radon-daughter measurements to determine whether a structure has a radon-daughter concentration (RDC) below the levels specified in various program standards. The Technical Measurements Center established a criterion against which RDC measurements made using the RPISU sampling method are evaluated to determine if sampling can be terminated or whether further measurements are required. This abbreviated RPISU criterion was tested against 317 actual sets of RPISU data from measurements made over an eight-year period in Mesa County, Colorado. The data from each location were tested against a standard that was assumed to be the same as the actual annual average RDC from that location. At only two locations was the criterion found to fail. Using the abbreviated RPISU method, only 0.6% of locations sampled can be expected to be falsely indicated as having annual average RDC levels below a given standard

  11. Localization of fluorescently labeled structures in frozen-hydrated samples using integrated light electron microscopy.

    Science.gov (United States)

    Faas, F G A; Bárcena, M; Agronskaia, A V; Gerritsen, H C; Moscicka, K B; Diebolder, C A; van Driel, L F; Limpens, R W A L; Bos, E; Ravelli, R B G; Koning, R I; Koster, A J

    2013-03-01

    Correlative light and electron microscopy is an increasingly popular technique to study complex biological systems at various levels of resolution. Fluorescence microscopy can be employed to scan large areas to localize regions of interest, which are then analyzed by electron microscopy to obtain morphological and structural information from a selected field of view at nm-scale resolution. Previously, an integrated approach to room temperature correlative microscopy was described. Combined use of light and electron microscopy within one instrument greatly simplifies sample handling, avoids cumbersome experimental overheads, simplifies navigation between the two modalities, and improves the success rate of image correlation. Here, an integrated approach for correlative microscopy under cryogenic conditions is presented. Its advantages over the room temperature approach include safeguarding the native hydrated state of the biological specimen, preservation of the fluorescence signal without risk of quenching due to heavy atom stains, and reduced photobleaching. The potential of cryo integrated light and electron microscopy is demonstrated for the detection of viable bacteria, the study of in vitro polymerized microtubules, the localization of mitochondria in mouse embryonic fibroblasts, and for investigating virus-induced intracellular membrane modifications within mammalian cells. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Author Contribution to the Pu Handbook II: Chapter 37 LLNL Integrated Sample Preparation Glovebox (TEM) Section

    International Nuclear Information System (INIS)

    Wall, Mark A.

    2016-01-01

    The development of our Integrated Actinide Sample Preparation Laboratory (IASPL) commenced in 1998, driven by the need to perform transmission electron microscopy studies on naturally aged plutonium and its alloys, looking for the microstructural effects of the radiological decay process (1). Remodeling and construction of a laboratory within the Chemistry and Materials Science Directorate facilities at LLNL was required to turn a standard radiological laboratory into a Radiological Materials Area (RMA) and Radiological Buffer Area (RBA) containing type I, II and III workplaces. Two inert atmosphere dry-train glove boxes with antechambers and entry/exit fumehoods (Figure 1), having a baseline atmosphere of 1 ppm oxygen and 1 ppm water vapor, a utility fumehood, and a third, portable, double-walled enclosure have been installed and commissioned. These capabilities, along with highly trained technical staff, facilitate the safe operation of sample preparation processes and instrumentation, and sample handling while minimizing oxidation or corrosion of the plutonium. In addition, we are currently developing the capability to safely transfer small metallographically prepared samples to a mini-SEM for microstructural imaging and chemical analysis. The gloveboxes continue to be the most crucial element of the laboratory, allowing nearly oxide-free sample preparation for a wide variety of LLNL-based characterization experiments, which include transmission electron microscopy, electron energy loss spectroscopy, optical microscopy, electrical resistivity, ion implantation, X-ray diffraction and absorption, magnetometry, metrological surface measurements, high-pressure diamond anvil cell equation-of-state, phonon dispersion measurements, X-ray absorption and emission spectroscopy, and differential scanning calorimetry. The sample preparation and materials processing capabilities in the IASPL have also facilitated experimentation at world-class facilities such as the

  13. Author Contribution to the Pu Handbook II: Chapter 37 LLNL Integrated Sample Preparation Glovebox (TEM) Section

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Mark A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-10-25

    The development of our Integrated Actinide Sample Preparation Laboratory (IASPL) commenced in 1998, driven by the need to perform transmission electron microscopy studies on naturally aged plutonium and its alloys, looking for the microstructural effects of the radiological decay process (1). Remodeling and construction of a laboratory within the Chemistry and Materials Science Directorate facilities at LLNL was required to turn a standard radiological laboratory into a Radiological Materials Area (RMA) and Radiological Buffer Area (RBA) containing type I, II and III workplaces. Two inert atmosphere dry-train glove boxes with antechambers and entry/exit fumehoods (Figure 1), having a baseline atmosphere of 1 ppm oxygen and 1 ppm water vapor, a utility fumehood, and a third, portable, double-walled enclosure have been installed and commissioned. These capabilities, along with highly trained technical staff, facilitate the safe operation of sample preparation processes and instrumentation, and sample handling while minimizing oxidation or corrosion of the plutonium. In addition, we are currently developing the capability to safely transfer small metallographically prepared samples to a mini-SEM for microstructural imaging and chemical analysis. The gloveboxes continue to be the most crucial element of the laboratory, allowing nearly oxide-free sample preparation for a wide variety of LLNL-based characterization experiments, which include transmission electron microscopy, electron energy loss spectroscopy, optical microscopy, electrical resistivity, ion implantation, X-ray diffraction and absorption, magnetometry, metrological surface measurements, high-pressure diamond anvil cell equation-of-state, phonon dispersion measurements, X-ray absorption and emission spectroscopy, and differential scanning calorimetry. The sample preparation and materials processing capabilities in the IASPL have also facilitated experimentation at world-class facilities such as the

  14. A cost-effective technique for integrating personal radiation dose assessment with personal gravimetric sampling

    International Nuclear Information System (INIS)

    Strydom, R.; Rolle, R.; Van der Linde, A.

    1992-01-01

    During recent years there has been an increasing awareness internationally of radiation levels in the mining and milling of radioactive ores, including those from non-uranium mines. A major aspect of radiation control is concerned with the measurement of radiation levels and the assessment of radiation doses incurred by individual workers. Current techniques available internationally for personnel monitoring of radiation exposures are expensive and there is a particular need to reduce the cost of personal radiation monitoring in South African gold mines because of the large labour force employed. In this regard the obvious benefits of integrating personal radiation monitoring with existing personal monitoring systems already in place in South African gold mines should be exploited. A system which can be utilized for this purpose is personal gravimetric sampling. A new cost-effective technique for personal radiation monitoring, which can be fully integrated with the personal gravimetric sampling strategy being implemented on mines, has been developed in South Africa. The basic principles of this technique and its potential in South African mines are described. 9 refs., 7 figs

  15. Magnetic particles for in vitro molecular diagnosis: From sample preparation to integration into microsystems.

    Science.gov (United States)

    Tangchaikeeree, Tienrat; Polpanich, Duangporn; Elaissari, Abdelhamid; Jangpatarapongsa, Kulachart

    2017-10-01

    Colloidal magnetic particles (MPs) have been developed in association with molecular diagnosis for several decades. MPs have the great advantage of easy manipulation using a magnet. In nucleic acid detection, these particles can act as a capture support for rapid and simple biomolecule separation. The surfaces of MPs can be modified by coating with various polymer materials to provide functionalization for different applications. The use of MPs enhances the sensitivity and specificity of detection due to the specific activity on the surface of the particles. Practical applications of MPs demonstrate greater efficiency than conventional methods. Beyond traditional detection, MPs have been successfully adopted as a smart carrier in microfluidic and lab-on-a-chip biosensors. The versatility of MPs has enabled their integration into small single detection units. MPs-based biosensors can facilitate rapid and highly sensitive detection of very small amounts of a sample. In this review, the application of MPs to the detection of nucleic acids, from sample preparation to analytical readout systems, is described. State-of-the-art integrated microsystems containing microfluidic and lab-on-a-chip biosensors for the nucleic acid detection are also addressed. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Integrating field sampling, geostatistics and remote sensing to map wetland vegetation in the Pantanal, Brazil

    Directory of Open Access Journals (Sweden)

    J. Arieira

    2011-03-01

    Development of efficient methodologies for mapping wetland vegetation is of key importance to wetland conservation. Here we propose the integration of a number of statistical techniques, in particular cluster analysis, universal kriging and error propagation modelling, to combine observations from remote sensing and field sampling for mapping vegetation communities and estimating uncertainty. The approach results in seven vegetation communities with a known floral composition that can be mapped over large areas using remotely sensed data. The relationships between the remotely sensed data and the vegetation patterns, captured in four factorial axes, were described using multiple linear regression models. These were then used in a universal kriging procedure to reduce the mapping uncertainty. Cross-validation procedures and Monte Carlo simulations were used to quantify the uncertainty in the resulting map. Cross-validation showed that classification accuracy varies with community type, as a result of sampling density and configuration. A map of uncertainty derived from the Monte Carlo simulations revealed significant spatial variation in classification, but this had little impact on the proportion and arrangement of the communities observed. These results suggest that mapping could be improved by increasing the number of field observations of those communities with a scattered and small patch-size distribution, or by including a larger number of digital images as explanatory variables in the model. Comparison of the resulting plant community map with a flood duration map revealed that flooding duration is an important driver of vegetation zonation. This mapping approach is able to integrate field point data and high-resolution remote-sensing images, providing a new basis to map wetland vegetation and allowing its future application in habitat management, conservation assessment and long-term ecological monitoring in wetland

  17. Integrated science and engineering for the OSIRIS-REx asteroid sample return mission

    Science.gov (United States)

    Lauretta, D.

    2014-07-01

    Introduction: The Origins, Spectral Interpretation, Resource Identification, and Security-Regolith Explorer (OSIRIS-REx) asteroid sample return mission will survey near-Earth asteroid (101955) Bennu to understand its physical, mineralogical, and chemical properties, assess its resource potential, refine the impact hazard, and return a sample of this body to the Earth [1]. This mission is scheduled for launch in 2016 and will rendezvous with the asteroid in 2018. Sample return to the Earth follows in 2023. The OSIRIS-REx mission has the challenge of visiting asteroid Bennu, characterizing it at global and local scales, then selecting the best site on the asteroid surface to acquire a sample for return to the Earth. Minimizing the risk of exploring an unknown world requires a tight integration of science and engineering to inform flight system and mission design. Defining the Asteroid Environment: We have performed an extensive astronomical campaign in support of OSIRIS-REx. Lightcurve and phase function observations were obtained with UA Observatories telescopes located in southeastern Arizona during the 2005--2006 and 2011--2012 apparitions [2]. We observed Bennu using the 12.6-cm radar at the Arecibo Observatory in 1999, 2005, and 2011 and the 3.5-cm radar at the Goldstone tracking station in 1999 and 2005 [3]. We conducted near-infrared measurements using the NASA Infrared Telescope Facility at the Mauna Kea Observatory in Hawaii in September 2005 [4]. Additional spectral observations were obtained in July 2011 and May 2012 with the Magellan 6.5-m telescope [5]. We used the Spitzer space telescope to observe Bennu in May 2007 [6]. The extensive knowledge gained as a result of our telescopic characterization of Bennu was critical in the selection of this object as the OSIRIS-REx mission target. In addition, we use these data, combined with models of the asteroid, to constrain over 100 different asteroid parameters covering orbital, bulk, rotational, radar

  18. Elemental distribution and sample integrity comparison of freeze-dried and frozen-hydrated biological tissue samples with nuclear microprobe

    Energy Technology Data Exchange (ETDEWEB)

    Vavpetič, P., E-mail: primoz.vavpetic@ijs.si [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Vogel-Mikuš, K. [Biotechnical Faculty, Department of Biology, University of Ljubljana, Jamnikarjeva 101, SI-1000 Ljubljana (Slovenia); Jeromel, L. [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Ogrinc Potočnik, N. [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); FOM-Institute AMOLF, Science Park 104, 1098 XG Amsterdam (Netherlands); Pongrac, P. [Biotechnical Faculty, Department of Biology, University of Ljubljana, Jamnikarjeva 101, SI-1000 Ljubljana (Slovenia); Department of Plant Physiology, University of Bayreuth, Universitätstr. 30, 95447 Bayreuth (Germany); Drobne, D.; Pipan Tkalec, Ž.; Novak, S.; Kos, M.; Koren, Š.; Regvar, M. [Biotechnical Faculty, Department of Biology, University of Ljubljana, Jamnikarjeva 101, SI-1000 Ljubljana (Slovenia); Pelicon, P. [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia)

    2015-04-01

    The analysis of biological samples in the frozen-hydrated state with the micro-PIXE technique at the Jožef Stefan Institute (JSI) nuclear microprobe has matured to a point that enables us to measure and examine frozen tissue samples routinely as a standard research method. A cryotome-cut slice of the frozen-hydrated biological sample is mounted between two thin foils and positioned on the sample holder. The temperature of the cold stage in the measuring chamber is kept below 130 K throughout the insertion of the samples and the proton beam exposure. The matrix composition of frozen-hydrated tissue consists mostly of ice. Sample deterioration during proton beam exposure is monitored during the experiment, as both Elastic Backscattering Spectrometry (EBS) and Scanning Transmission Ion Microscopy (STIM) in on–off axis geometry are recorded together with the events in two PIXE detectors and backscattered ions from the chopper in a single list-mode file. The aim of this experiment was to determine the differences and similarities between two biological sample preparation techniques for micro-PIXE analysis, namely freeze-drying and frozen-hydrated sample preparation, in order to evaluate the improvements, if any, in elemental localisation offered by the latter technique. In the presented work, a standard micro-PIXE configuration for tissue mapping at JSI was used with five detection systems operating in parallel, with a proton beam cross section of 1.0 × 1.0 μm² and a beam current of 100 pA. The comparison of the resulting elemental distributions measured in biological tissue prepared in the frozen-hydrated and in the freeze-dried state revealed differences in the distribution of particular elements at the cellular level, due to morphology alteration in particular tissue compartments induced either by water removal in the lyophilisation process or by unsatisfactory preparation of samples for cutting and mounting during the shock-freezing phase of sample preparation.

  19. An analog memory integrated circuit for waveform sampling up to 900 MHz

    International Nuclear Information System (INIS)

    Haller, G.M.; Wooley, B.A.

    1994-01-01

    The potential of switched-capacitor technology for acquiring analog signals in high-energy physics (HEP) applications has been demonstrated in a number of analog memory designs. The design and implementation of a switched-capacitor memory suitable for capturing high-speed analog waveforms is described. Highlights of the presented circuit are a 900 MHz sampling frequency (generated on chip), input-signal-independent cell pedestals and sampling instants, and cell gains that are insensitive to component sizes. A two-channel version of the memory with 32 cells for each channel has been integrated in a 2-μm complementary metal oxide semiconductor (CMOS) process with polysilicon-to-polysilicon capacitors. The measured rms cell response variation in a channel after cell pedestal subtraction is less than 0.3 mV across the full input signal range. The cell-to-cell gain matching is better than 0.01% rms, and the nonlinearity is less than 0.03% for a 2.5-V input range. The dynamic range of the memory exceeds 13 bits, and the peak signal-to-(noise + distortion) ratio for a 21.4 MHz sine wave sampled at 900 MHz is 59 dB.
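
    The quoted dynamic-range figure can be reproduced from digitized records with a simple sine-fit analysis: fit the known tone by least squares and treat the residual as noise plus distortion. The sketch below uses a synthetic record with made-up noise, not measurements from the described chip.

        import numpy as np

        fs, f0, n = 900e6, 21.4e6, 4096                    # sampling rate, tone frequency, record length
        t = np.arange(n) / fs
        rng = np.random.default_rng(1)
        x = np.sin(2 * np.pi * f0 * t) + 1e-3 * rng.standard_normal(n)   # synthetic sampled record

        basis = np.column_stack([np.sin(2 * np.pi * f0 * t),
                                 np.cos(2 * np.pi * f0 * t),
                                 np.ones(n)])              # tone (sine, cosine) plus DC offset
        coef, *_ = np.linalg.lstsq(basis, x, rcond=None)
        tone = basis[:, :2] @ coef[:2]                     # fitted tone, offset excluded
        residual = x - basis @ coef                        # everything else: noise + distortion

        sinad = 10 * np.log10(np.mean(tone ** 2) / np.mean(residual ** 2))
        print(f"SINAD of the synthetic record: {sinad:.1f} dB")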

  20. A lab-on-a-chip system with integrated sample preparation and loop-mediated isothermal amplification for rapid and quantitative detection of Salmonella spp. in food samples

    DEFF Research Database (Denmark)

    Sun, Yi; Than Linh, Quyen; Hung, Tran Quang

    2015-01-01

    and usually take a few hours to days to complete. In response to the demand for rapid on-line or at-site detection of pathogens, in this study we describe for the first time an eight-chamber lab-on-a-chip (LOC) system with integrated magnetic bead-based sample preparation and loop-mediated isothermal... amplification (LAMP) for rapid and quantitative detection of Salmonella spp. in food samples. The whole diagnostic procedure, including DNA isolation, isothermal amplification, and real-time detection, was accomplished in a single chamber. Up to eight samples could be handled simultaneously and the system... was capable of detecting Salmonella at a concentration of 50 cells per test within 40 min. The simple design, together with the high level of integration, isothermal amplification, and quantitative analysis of multiple samples in a short time, will greatly enhance the practical applicability of the LOC system for rapid...

  1. Path integral methods for primordial density perturbations - sampling of constrained Gaussian random fields

    International Nuclear Information System (INIS)

    Bertschinger, E.

    1987-01-01

    Path integrals may be used to describe the statistical properties of a random field such as the primordial density perturbation field. In this framework the probability distribution is given for a Gaussian random field subjected to constraints such as the presence of a protovoid or supercluster at a specific location in the initial conditions. An algorithm has been constructed for generating samples of a constrained Gaussian random field on a lattice using Monte Carlo techniques. The method makes possible a systematic study of the density field around peaks or other constrained regions in the biased galaxy formation scenario, and it is effective for generating initial conditions for N-body simulations with rare objects in the computational volume. 21 references
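
    The idea of imposing a constraint on a lattice Gaussian random field can be sketched with the later, widely used Hoffman-Ribak construction (shown below for a single point-value constraint with a toy covariance); this is an illustration of the concept, not the Monte Carlo algorithm of the paper.

        import numpy as np

        rng = np.random.default_rng(7)
        n, ell = 200, 15.0                               # lattice size, correlation length (illustrative)
        x = np.arange(n)
        cov = np.exp(-np.abs(x[:, None] - x[None, :]) / ell)   # toy stationary covariance matrix

        L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))  # small jitter for numerical stability
        f_unc = L @ rng.standard_normal(n)               # unconstrained realisation of the field

        x0, c0 = 100, 3.0                                # constraint: a high peak value c0 at site x0
        # Hoffman-Ribak conditioning: f_c = f_unc + Cov[:, x0]/Cov[x0, x0] * (c0 - f_unc[x0])
        f_con = f_unc + cov[:, x0] / cov[x0, x0] * (c0 - f_unc[x0])

        assert np.isclose(f_con[x0], c0)                 # the constraint is satisfied exactly
        print("field value at the constrained site:", round(f_con[x0], 3))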

  2. Workshop on New Views of the Moon: Integrated Remotely Sensed, Geophysical, and Sample Datasets

    Science.gov (United States)

    Jolliff, Brad L.; Ryder, Graham

    1998-01-01

    It has been more than 25 years since Apollo 17 returned the last of the Apollo lunar samples. Since then, a vast amount of data has been obtained from the study of rocks and soils from the Apollo and Luna sample collections and, more recently, on a set of about a dozen lunar meteorites collected on Earth. Based on direct studies of the samples, many constraints have been established for the age, early differentiation, crust and mantle structure, and subsequent impact modification of the Moon. In addition, geophysical experiments at the surface, as well as remote sensing from orbit and Earth-based telescopic studies, have provided additional datasets about the Moon that constrain the nature of its surface and internal structure. Some might be tempted to say that we know all there is to know about the Moon and that it is time to move on from this simple satellite to more complex objects. However, the ongoing Lunar Prospector mission and the highly successful Clementine mission have provided important clues to the real geological complexity of the Moon, and have shown us that we still do not adequately understand the geologic history of Earth's companion. These missions, like Galileo during its lunar flyby, are providing global information viewed through new kinds of windows, and providing a fresh context for models of lunar origin, evolution, and resources, and perhaps even some grist for new questions and new hypotheses. The probable detection and characterization of water ice at the poles, the extreme concentration of Th and other radioactive elements in the Procellarum-Imbrium-Frigoris resurfaced areas of the nearside of the Moon, and the high-resolution gravity modeling enabled by these missions are examples of the kinds of exciting new results that must be integrated with the extant body of knowledge based on sample studies, in situ experiments, and remote-sensing missions to bring about the best possible understanding of the Moon and its history.

  3. A holistic passive integrative sampling approach for assessing the presence and potential impacts of waterborne environmental contaminants

    Science.gov (United States)

    Petty, J.D.; Huckins, J.N.; Alvarez, D.A.; Brumbaugh, W. G.; Cranor, W.L.; Gale, R.W.; Rastall, A.C.; Jones-Lepp, T. L.; Leiker, T.J.; Rostad, C. E.; Furlong, E.T.

    2004-01-01

    As an integral part of our continuing research in environmental quality assessment approaches, we have developed a variety of passive integrative sampling devices widely applicable for use in defining the presence and potential impacts of a broad array of contaminants. The semipermeable membrane device has gained widespread use for sampling hydrophobic chemicals from water and air, the polar organic chemical integrative sampler is applicable for sequestering waterborne hydrophilic organic chemicals, the stabilized liquid membrane device is used to integratively sample waterborne ionic metals, and the passive integrative mercury sampler is applicable for sampling vapor phase or dissolved neutral mercury species. This suite of integrative samplers forms the basis for a new passive sampling approach for assessing the presence and potential toxicological significance of a broad spectrum of environmental contaminants. In a proof-of-concept study, three of our four passive integrative samplers were used to assess the presence of a wide variety of contaminants in the waters of a constructed wetland, and to determine the effectiveness of the constructed wetland in removing contaminants. The wetland is used for final polishing of secondary-treatment municipal wastewater and the effluent is used as a source of water for a state wildlife area. Numerous contaminants, including organochlorine pesticides, polycyclic aromatic hydrocarbons, organophosphate pesticides, and pharmaceutical chemicals (e.g., ibuprofen, oxindole, etc.) were detected in the wastewater. Herein we summarize the results of the analysis of the field-deployed samplers and demonstrate the utility of this holistic approach.

  4. Towards an Integrated QR Code Biosensor: Light-Driven Sample Acquisition and Bacterial Cellulose Paper Substrate.

    Science.gov (United States)

    Yuan, Mingquan; Jiang, Qisheng; Liu, Keng-Ku; Singamaneni, Srikanth; Chakrabartty, Shantanu

    2018-06-01

    This paper addresses two key challenges toward an integrated forward error-correcting biosensor based on our previously reported self-assembled quick-response (QR) code. The first challenge involves the choice of the paper substrate for printing and self-assembling the QR code. We have compared four different substrates that include regular printing paper, Whatman filter paper, nitrocellulose membrane and lab-synthesized bacterial cellulose. We report that, out of the four substrates, bacterial cellulose outperforms the others in terms of probe (gold nanorods) and ink retention capability. The second challenge involves remote activation of the analyte sampling and the QR code self-assembly process. In this paper, we use light as a trigger signal and a graphite layer as a light-absorbing material. The resulting change in temperature due to infrared absorption leads to a temperature gradient that then exerts a diffusive force driving the analyte toward the regions of self-assembly. The working principle has been verified in this paper using assembled biosensor prototypes, where we demonstrate a higher sample flow rate due to light-induced thermal gradients.

  5. Comprehensive profiling of retroviral integration sites using target enrichment methods from historical koala samples without an assembled reference genome

    Directory of Open Access Journals (Sweden)

    Pin Cui

    2016-03-01

    Full Text Available Background. Retroviral integration into the host germline results in permanent viral colonization of vertebrate genomes. The koala retrovirus (KoRV) is currently invading the germline of the koala (Phascolarctos cinereus) and provides a unique opportunity for studying retroviral endogenization. Previous analyses of KoRV integration patterns in modern koalas demonstrate that they share integration sites primarily if they are related, indicating that the process is currently driven by vertical transmission rather than infection. However, due to methodological challenges, KoRV integrations have not been comprehensively characterized. Results. To overcome these challenges, we applied and compared three target enrichment techniques coupled with next generation sequencing (NGS) and a newly customized sequence-clustering based computational pipeline to determine the integration sites for 10 museum Queensland and New South Wales (NSW) koala samples collected between the 1870s and the late 1980s. A secondary aim of this study sought to identify common integration sites across modern and historical specimens by comparing our dataset to previously published studies. Several million sequences were processed, and the KoRV integration sites in each koala were characterized. Conclusions. Although the three enrichment methods each exhibited bias in integration site retrieval, a combination of two methods, Primer Extension Capture and hybridization capture, is recommended for future studies on historical samples. Moreover, identification of integration sites shows that the proportion of integration sites shared between any two koalas is quite small.

  6. Pathway Relevance Ranking for Tumor Samples through Network-Based Data Integration.

    Directory of Open Access Journals (Sweden)

    Lieven P C Verbeke

    Full Text Available The study of cancer, a highly heterogeneous disease with different causes and clinical outcomes, requires a multi-angle approach and the collection of large multi-omics datasets that, ideally, should be analyzed simultaneously. We present a new pathway relevance ranking method that is able to prioritize pathways according to the information contained in any combination of tumor related omics datasets. Key to the method is the conversion of all available data into a single comprehensive network representation containing not only genes but also individual patient samples. Additionally, all data are linked through a network of previously identified molecular interactions. We demonstrate the performance of the new method by applying it to breast and ovarian cancer datasets from The Cancer Genome Atlas. By integrating gene expression, copy number, mutation and methylation data, the method's potential to identify key pathways involved in breast cancer development shared by different molecular subtypes is illustrated. Interestingly, certain pathways were ranked equally important for different subtypes, even when the underlying (epi-)genetic disturbances were diverse. Next to prioritizing universally high-scoring pathways, the pathway ranking method was able to identify subtype-specific pathways. Often the score of a pathway could not be motivated by a single mutation, copy number or methylation alteration, but rather by a combination of genetic and epigenetic disturbances, stressing the need for a network-based data integration approach. The analysis of ovarian tumors, as a function of survival-based subtypes, demonstrated the method's ability to correctly identify key pathways, irrespective of tumor subtype. A differential analysis of survival-based subtypes revealed several pathways with higher importance for the bad-outcome patient group than for the good-outcome patient group. Many of the pathways exhibiting higher importance for the bad
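
    As a toy illustration of network-based scoring in the same spirit (explicitly not the authors' algorithm), genes altered in one tumor sample can seed a personalized PageRank over an interaction network, and a pathway can then be scored by the aggregate rank of its member genes. All gene names, interactions and pathway memberships below are invented.

        import networkx as nx

        interactions = [("TP53", "MDM2"), ("MDM2", "CDKN1A"), ("PIK3CA", "AKT1"),
                        ("AKT1", "MTOR"), ("TP53", "AKT1"), ("BRCA1", "TP53")]
        G = nx.Graph(interactions)

        altered = {"TP53", "PIK3CA"}                        # genes altered in one tumor sample (invented)
        personalization = {g: (1.0 if g in altered else 0.0) for g in G}
        scores = nx.pagerank(G, alpha=0.85, personalization=personalization)

        pathways = {"p53 signalling": ["TP53", "MDM2", "CDKN1A"],
                    "PI3K/AKT/mTOR": ["PIK3CA", "AKT1", "MTOR"]}
        ranking = {name: sum(scores.get(g, 0.0) for g in genes) / len(genes)
                   for name, genes in pathways.items()}
        for name, score in sorted(ranking.items(), key=lambda kv: -kv[1]):
            print(f"{name}: {score:.3f}")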

  7. Carbon Nanotube Integrative Sampler (CNIS) for passive sampling of nanosilver in the aquatic environment.

    Science.gov (United States)

    Shen, Li; Fischer, Jillian; Martin, Jonathan; Hoque, Md Ehsanul; Telgmann, Lena; Hintelmann, Holger; Metcalfe, Chris D; Yargeau, Viviane

    2016-11-01

    Nanomaterials such as nanosilver (AgNP) can be released into the aquatic environment through production, usage, and disposal. Sensitive and cost-effective methods are needed to monitor AgNPs in the environment. This work is hampered by a lack of sensitive methods to detect nanomaterials in environmental matrixes. The present study focused on the development, calibration and application of a passive sampling technique for detecting AgNPs in aquatic matrixes. A Carbon Nanotube Integrative Sampler (CNIS) was developed using multi-walled carbon nanotubes (CNTs) as the sorbent for accumulating AgNPs and other Ag species from water. Sampling rates were determined in the laboratory for different sampler configurations and in different aquatic matrixes. The sampler was field tested at the Experimental Lakes Area, Canada, in lake water dosed with AgNPs. For a configuration of the CNIS consisting of CNTs bound to carbon fiber (i.e. CNT veil) placed in Chemcatcher® housing, the time weighted average (TWA) concentrations of silver estimated from deployments of the sampler in lake mesocosms dosed with AgNPs were similar to the measured concentrations of "colloidal silver" (i.e. <0.22μm in size) in the water column. For a configuration of CNIS consisting of CNTs in loose powder form placed in a custom made housing that were deployed in a whole lake dosed with AgNPs, the estimated TWA concentrations of "CNIS-labile Ag" were similar to the concentrations of total silver measured in the epilimnion of the lake. However, sampling rates for the CNIS in various matrixes are relatively low (i.e. 1-20mL/day), so deployment periods of several weeks are required to detect AgNPs at environmentally relevant concentrations, which can allow biofilms to develop on the sampler and could affect the sampling rates. With further development, this novel sampler may provide a simple and sensitive method for screening for the presence of AgNPs in surface waters. Copyright © 2016 Elsevier B.V. All
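
    Integrative passive sampler data of this kind are converted to a time-weighted average concentration as C_TWA = M / (Rs · t), with M the mass accumulated on the sorbent, Rs the sampling rate and t the deployment time. The snippet below is a generic illustration with invented numbers, not values from the CNIS deployments.

        def twa_concentration(mass_ng, sampling_rate_ml_per_day, days):
            """Time-weighted average concentration in ng/L: C = M / (Rs * t)."""
            litres_sampled = sampling_rate_ml_per_day * days / 1000.0
            return mass_ng / litres_sampled

        mass_ng = 4.2        # Ag accumulated on the CNT sorbent (hypothetical)
        rate = 10.0          # sampling rate in mL/day, within the reported 1-20 mL/day range
        days = 28            # deployment long enough for detectable accumulation
        print(f"TWA Ag concentration ≈ {twa_concentration(mass_ng, rate, days):.1f} ng/L")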

  8. Determination of 222Rn in water samples from wells and springs in Tokyo by a modified integral counting method

    International Nuclear Information System (INIS)

    Homma, Y.; Murase, Y.; Handa, K.; Murakami, I.

    1997-01-01

    222Rn in 2-L water samples was extracted with 30 mL of toluene, and 21 mL of the toluene solution was transferred into a liquid scintillation vial in which PPO (2,5-diphenyloxazole) had been placed in advance. The total activity of 222Rn in the water sample was calculated from the Ostwald solubility coefficients of 222Rn in toluene and water at the temperature of the sample water and from the volumes of water and toluene. About 40% of the 222Rn dissolved in a 2-L water sample can be collected. After standing for 3.5 h, the equilibrium mixture of 222Rn and its daughters was measured with an Aloka liquid scintillation spectrometer using a modified integral counting method, which extrapolates the integral counting curve not to the zero pulse height but to the zero detection threshold (the average energy required to produce a measurable pulse) of the liquid scintillation spectrometer. The general method, in which the water sample (usually about 10 mL) is agitated with a liquid scintillation cocktail, is practical when the activity of 222Rn is high. When 10 mL of water sample is added, however, variable amounts of quencher may also be introduced; in some cases the water sample is preserved with nitric acid. The slope of the integral counting rate curve increases as the quench level of the sample increases. Therefore, it is clear that the modified integral counting method gives more accurate 222Rn concentrations for strongly quenched water samples than the conventional integral counting method. A 222Rn sample of 0.2 Bq/L can be determined within an overall uncertainty of 3.1%
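
    The roughly 40% recovery follows from two-phase partitioning: with a toluene/water partition coefficient K (the ratio of the Ostwald solubility coefficients), the fraction extracted is K·Vtol / (K·Vtol + Vwater). The sketch below assumes K ≈ 45 at room temperature purely for illustration.

        def extracted_fraction(k_partition, v_toluene_l, v_water_l):
            """Fraction of dissolved 222Rn transferred into the toluene phase."""
            return k_partition * v_toluene_l / (k_partition * v_toluene_l + v_water_l)

        K, V_TOL, V_WATER = 45.0, 0.030, 2.0     # assumed partition coefficient; 30 mL toluene, 2 L water
        f_extracted = extracted_fraction(K, V_TOL, V_WATER)
        f_counted = f_extracted * 21.0 / 30.0    # only 21 of the 30 mL of toluene goes into the vial

        print(f"extracted into toluene: {f_extracted:.0%}, counted in the vial: {f_counted:.0%}")
        # Total sample activity is recovered as (counted activity) / f_counted.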

  9. Lights Will Guide You : Sample Preparation and Applications for Integrated Laser and Electron Microscopy

    Science.gov (United States)

    Karreman, M. A.

    2013-03-01

    Correlative microscopy is the combined use of two different forms of microscopy in the study of a specimen, allowing for the exploitation of the advantages of both imaging tools. The integrated Laser and Electron Microscope (iLEM), developed at Utrecht University, combines a fluorescence microscope (FM) and a transmission electron microscope (TEM) in a single set-up. The region of interest in the specimen is labeled or tagged with a fluorescent probe and can easily be identified within a large field of view with the FM. Next, this same area is retraced in the TEM and can be studied at high resolution. The iLEM demands samples that can be imaged with both FM and TEM. Biological specimens, typically composed of light elements, generate low image contrast in the TEM. Therefore, these samples are often ‘contrasted’ with heavy metal stains. FM, on the other hand, images fluorescent samples. Sample preparation for correlative microscopy, and iLEM in particular, is complicated by the fact that the heavy metal stains employed for TEM quench the fluorescent signal of the probe that is imaged with FM. The first part of this thesis outlines preparation procedures for biological material yielding specimens that can be imaged with the iLEM. Here, approaches for the contrasting of thin sections of cells and tissue are introduced that do not affect the fluorescence signal of the probe that marks the region of interest. Furthermore, two novel procedures, VIS2FIXH and VIS2FIXFS, are described that allow for the chemical fixation of thin sections of cryo-immobilized material. These procedures greatly expedite the sample preparation process, and open up novel possibilities for the immuno-labeling of difficult antigens, e.g., proteins and lipids that are challenging to preserve. The second part of this thesis describes applications of iLEM in research in the field of life and material science. The iLEM was employed in the study of UVC-induced apoptosis (programmed cell death) of

  10. imFASP: An integrated approach combining in-situ filter-aided sample pretreatment with microwave-assisted protein digestion for fast and efficient proteome sample preparation.

    Science.gov (United States)

    Zhao, Qun; Fang, Fei; Wu, Ci; Wu, Qi; Liang, Yu; Liang, Zhen; Zhang, Lihua; Zhang, Yukui

    2016-03-17

    An integrated sample preparation method, termed "imFASP", which combined in-situ filter-aided sample pretreatment and microwave-assisted trypsin digestion, was developed for the preparation of microgram and even nanogram amounts of complex protein samples with high efficiency in 1 h. For the imFASP method, proteins dissolved in 8 M urea were loaded onto a filter device with a molecular weight cut-off (MWCO) of 10 kDa, followed by in-situ protein preconcentration, denaturation, reduction, alkylation, and microwave-assisted tryptic digestion. Compared with the traditional in-solution sample preparation method, the imFASP method generated more protein and peptide identifications (IDs) from the preparation of a 45 μg Escherichia coli protein sample due to its higher efficiency, and the sample preparation throughput was significantly improved, by a factor of 14 (1 h vs. 15 h). More importantly, when the starting amount of E. coli cell lysate decreased to the nanogram level (50-500 ng), the numbers of proteins and peptides identified by the imFASP method improved by at least 30% and 44%, respectively, compared with the traditional in-solution preparation method, suggesting a dramatically higher peptide recovery of the imFASP method for trace amounts of complex proteome samples. All these results demonstrate that the imFASP method developed here has high potential for the highly efficient and high-throughput preparation of trace amounts of complex proteome samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Towards a fully automated lab-on-a-disc system integrating sample enrichment and detection of analytes from complex matrices

    DEFF Research Database (Denmark)

    Andreasen, Sune Zoëga

    the technology on a large scale from fulfilling its potential for maturing into applied technologies and products. In this work, we have taken the first steps towards realizing a capable and truly automated “sample-to-answer” analysis system, aimed at small molecule detection and quantification from a complex... sample matrix. The main result is a working prototype of a microfluidic system, integrating centrifugal microfluidics for sample handling, supported liquid membrane extraction (SLM) for selective and effective sample treatment, and in-situ electrochemical detection. As a case study...

  12. Optical method for the characterization of laterally patterned samples in integrated circuits

    Science.gov (United States)

    Maris, Humphrey J [Barrington, RI

    2009-03-17

    Disclosed is a method for characterizing a sample having a structure disposed on or within the sample, comprising the steps of applying a first pulse of light to a surface of the sample for creating a propagating strain pulse in the sample, applying a second pulse of light to the surface so that the second pulse of light interacts with the propagating strain pulse in the sample, sensing from a reflection of the second pulse a change in optical response of the sample, and relating a time of occurrence of the change in optical response to at least one dimension of the structure.

  13. SEAMIST™ in-situ instrumentation and vapor sampling system applications in the Sandia Mixed Waste Landfill Integrated Demonstration Program

    International Nuclear Information System (INIS)

    Lowry, W.E.; Dunn, S.D.; Cremer, S.C.; Williams, C.

    1994-01-01

    The SEAMIST™ inverting membrane deployment system has been used successfully at the Mixed Waste Landfill Integrated Demonstration (MWLID) for multipoint vapor sampling/pressure measurement/permeability measurement/sensor integration demonstrations and borehole lining. Several instruments were deployed inside the SEAMIST™ lined boreholes to detect metals, radionuclides, moisture, and geologic variations. The liner protected the instruments from contamination, maintained support of the uncased borehole wall, and sealed the total borehole from air circulation. The current activities have included the installation of three multipoint vapor sampling systems and sensor integration systems in 100-foot-deep vertical boreholes. A long term pressure monitoring program has recorded barometric pressure effects at depth with relatively high spatial resolution. The SEAMIST™ system has been integrated with a variety of hydrologic and chemical sensors for in-situ measurements, demonstrating its versatility as an instrument deployment system which allows easy emplacement and removal. Standard SEAMIST™ vapor sampling systems were also integrated with state-of-the-art VOC analysis technologies (automated GC, UV laser fluorometer). The results and status of these demonstration tests are presented

  14. Particle integrity, sampling, and application of a DNA-tagged tracer for aerosol transport studies

    Energy Technology Data Exchange (ETDEWEB)

    Kaeser, Cynthia Jeanne [Michigan State Univ., East Lansing, MI (United States)

    2017-07-21

    formulations of two different food-grade sugars (maltodextrin and erythritol) to humidity as high as 66% had no significant effect on the DNA label’s degradation or the particle’s aerodynamic diameter, confirming particle stability under such conditions. In summary, confirmation of the DNATrax particles’ size and label integrity under variable conditions combined with experiment multiplexing and high resolution sampling provides a powerful experimental design for modeling aerosol transport through occupied indoor and outdoor locations.

  15. Integrating the Theory of Sampling into Underground Mine Grade Control Strategies

    Directory of Open Access Journals (Sweden)

    Simon C. Dominy

    2018-05-01

    Full Text Available Grade control in underground mines aims to deliver quality tonnes to the process plant via the accurate definition of ore and waste. It comprises a decision-making process including data collection and interpretation; local estimation; development and mining supervision; ore and waste destination tracking; and stockpile management. The foundation of any grade control programme is that of high-quality samples collected in a geological context. The requirement for quality samples has long been recognised, where they should be representative and fit-for-purpose. Once a sampling error is introduced, it propagates through all subsequent processes, contributing to data uncertainty, which leads to poor decisions and financial loss. Proper application of the Theory of Sampling reduces errors during sample collection, preparation, and assaying. To achieve quality, sampling techniques must minimise delimitation, extraction, and preparation errors. Underground sampling methods include linear (chip and channel), grab (broken rock), and drill-based samples. Grade control staff should be well-trained and motivated, and operating staff should understand the critical need for grade control. Sampling must always be undertaken with a strong focus on safety, and alternatives sought if the risk to humans is high. A quality control/quality assurance programme must be implemented, particularly when samples contribute to a reserve estimate. This paper assesses grade control sampling with emphasis on underground gold operations and presents recommendations for optimal practice through the application of the Theory of Sampling.
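
    One widely used element of the Theory of Sampling is Gy's fundamental sampling error formula, which links the sampling variance to particle top size and sample mass. The sketch below evaluates it for a low-grade gold ore with generic textbook-style factor values; none of the numbers come from the paper.

        import math

        def fse_relative_std(shape_f, granulometric_g, mineralogical_c, liberation_l,
                             top_size_cm, sample_mass_g, lot_mass_g):
            """Relative standard deviation of Gy's fundamental sampling error."""
            variance = (shape_f * granulometric_g * mineralogical_c * liberation_l
                        * top_size_cm ** 3 * (1.0 / sample_mass_g - 1.0 / lot_mass_g))
            return math.sqrt(variance)

        grade = 2e-6                                        # ~2 g/t gold expressed as a mass fraction
        rho_gold, rho_gangue = 19.3, 2.7                    # densities in g/cm^3
        c = ((1 - grade) / grade) * ((1 - grade) * rho_gold + grade * rho_gangue)   # mineralogical factor

        rsd = fse_relative_std(shape_f=0.5, granulometric_g=0.25, mineralogical_c=c,
                               liberation_l=0.05, top_size_cm=0.1,
                               sample_mass_g=1000.0, lot_mass_g=1e6)
        print(f"FSE relative standard deviation for a 1 kg sample: {rsd:.1%}")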

  16. The integrated performance evaluation program quality assurance guidance in support of EM environmental sampling and analysis activities

    International Nuclear Information System (INIS)

    1994-05-01

    EM's (DOE's Environmental Restoration and Waste Management) Integrated Performance Evaluation Program (IPEP) has the purpose of integrating information from existing PE programs with expanded QA activities to develop information about the quality of radiological, mixed waste, and hazardous environmental sample analyses provided by all laboratories supporting EM programs. The guidance addresses the goals of identifying specific PE sample programs and contacts, identifying specific requirements for participation in DOE's internal and external (regulatory) programs, identifying key issues relating to the application and interpretation of PE materials for EM headquarters and field office managers, and providing technical guidance covering PE materials for site-specific activities. Performance Evaluation (PE) materials or samples are necessary for the quality assurance/control programs covering environmental data collection

  17. Assessing Exhaustiveness of Stochastic Sampling for Integrative Modeling of Macromolecular Structures.

    Science.gov (United States)

    Viswanath, Shruthi; Chemmama, Ilan E; Cimermancic, Peter; Sali, Andrej

    2017-12-05

    Modeling of macromolecular structures involves structural sampling guided by a scoring function, resulting in an ensemble of good-scoring models. By necessity, the sampling is often stochastic, and must be exhaustive at a precision sufficient for accurate modeling and assessment of model uncertainty. Therefore, the very first step in analyzing the ensemble is an estimation of the highest precision at which the sampling is exhaustive. Here, we present an objective and automated method for this task. As a proxy for sampling exhaustiveness, we evaluate whether two independently and stochastically generated sets of models are sufficiently similar. The protocol includes testing 1) convergence of the model score, 2) whether model scores for the two samples were drawn from the same parent distribution, 3) whether each structural cluster includes models from each sample proportionally to its size, and 4) whether there is sufficient structural similarity between the two model samples in each cluster. The evaluation also provides the sampling precision, defined as the smallest clustering threshold that satisfies the third, most stringent test. We validate the protocol with the aid of enumerated good-scoring models for five illustrative cases of binary protein complexes. Passing the proposed four tests is necessary, but not sufficient for thorough sampling. The protocol is general in nature and can be applied to the stochastic sampling of any set of models, not just structural models. In addition, the tests can be used to stop stochastic sampling as soon as exhaustiveness at desired precision is reached, thereby improving sampling efficiency; they may also help in selecting a model representation that is sufficiently detailed to be informative, yet also sufficiently coarse for sampling to be exhaustive. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
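
    The flavour of tests 2 and 3 can be sketched with generic statistics, a two-sample Kolmogorov-Smirnov test on the scores and a chi-square test on cluster membership counts; the published protocol specifies its own test choices and thresholds, and the scores and counts below are synthetic.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        scores_a = rng.normal(10.0, 1.0, 500)               # model scores from independent run A (synthetic)
        scores_b = rng.normal(10.1, 1.0, 500)               # model scores from independent run B (synthetic)

        ks_stat, ks_p = stats.ks_2samp(scores_a, scores_b)
        print(f"same parent score distribution? KS p-value = {ks_p:.3f}")

        # Rows = structural clusters, columns = counts of models from run A and run B.
        cluster_counts = np.array([[230, 245],
                                   [180, 170],
                                   [ 90,  85]])
        chi2, chi_p, dof, expected = stats.chi2_contingency(cluster_counts)
        print(f"clusters populated proportionally? chi-square p-value = {chi_p:.3f}")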

  18. Integrative analysis of single nucleotide polymorphisms and gene expression efficiently distinguishes samples from closely related ethnic populations

    Directory of Open Access Journals (Sweden)

    Yang Hsin-Chou

    2012-07-01

    Full Text Available Abstract Background Ancestry informative markers (AIMs) are a type of genetic marker that is informative for tracing the ancestral ethnicity of individuals. Application of AIMs has gained substantial attention in population genetics, forensic sciences, and medical genetics. Single nucleotide polymorphisms (SNPs), the materials of AIMs, are useful for classifying individuals from distinct continental origins but cannot discriminate individuals with subtle genetic differences from closely related ancestral lineages. Proof-of-principle studies have shown that gene expression (GE) also is a heritable human variation that exhibits differential intensity distributions among ethnic groups. GE supplies ethnic information supplemental to SNPs; this motivated us to integrate SNP and GE markers to construct AIM panels with a reduced number of required markers and provide high accuracy in ancestry inference. Few studies in the literature have considered GE in this aspect, and none have integrated SNP and GE markers to aid classification of samples from closely related ethnic populations. Results We integrated a forward variable selection procedure into flexible discriminant analysis to identify key SNP and/or GE markers with the highest cross-validation prediction accuracy. By analyzing genome-wide SNP and/or GE markers in 210 independent samples from four ethnic groups in the HapMap II Project, we found that average testing accuracies for a majority of classification analyses were quite high, except for SNP-only analyses that were performed to discern study samples containing individuals from two close Asian populations. The average testing accuracies ranged from 0.53 to 0.79 for SNP-only analyses and increased to around 0.90 when GE markers were integrated together with SNP markers for the classification of samples from closely related Asian populations. Compared to GE-only analyses, integrative analyses of SNP and GE markers showed comparable testing

  19. Design and performance of a multi-channel, multi-sampling, PSD-enabling integrated circuit

    International Nuclear Information System (INIS)

    Engel, G.L.; Hall, M.J.; Proctor, J.M.; Elson, J.M.; Sobotka, L.G.; Shane, R.; Charity, R.J.

    2009-01-01

    This paper presents the design and test results of an eight-channel prototype integrated circuit chip intended to greatly simplify the pulse-processing electronics needed for large arrays of scintillation detectors. Because the chip design employs (user-controlled) multi-region charge integration, particle identification is incorporated into the basic design. Each channel on the chip also contains a time-to-voltage converter which provides relative time information. The pulse-height integrals and the relative time are all stored on capacitors and are either reset, after a user controlled time, or sequentially read out if acquisition of the event is desired. Each of the three pulse-height sub-channels consists of a gated integrator with eight programmable charging rates and an externally programmable gate generator that defines the start (with four time ranges) and width (with four time ranges) of the gate relative to an external discriminator signal. The chip supports three triggering modes, two time ranges, two power modes, and produces four sparsified analog pulse trains (three for the integrators and another for the time) with synchronized addresses for off-chip digitization with a pipelined ADC. The eight-channel prototype chip occupies an area of 2.8 mm × 5.7 mm, dissipates 60 mW (low-power mode), and was fabricated in the AMI 0.5-μm process (C5N).

  20. Design and performance of a multi-channel, multi-sampling, PSD-enabling integrated circuit

    Energy Technology Data Exchange (ETDEWEB)

    Engel, G.L., E-mail: gengel@siue.ed [Department of Electrical and Computer Engineering, VLSI Design Research Laboratory, Southern Illinois University Edwardsville, Engineering Building, Room 3043 Edwardsville, IL 62026 1081 (United States); Hall, M.J.; Proctor, J.M. [Department of Electrical and Computer Engineering, VLSI Design Research Laboratory, Southern Illinois University Edwardsville, Engineering Building, Room 3043 Edwardsville, IL 62026 1081 (United States); Elson, J.M.; Sobotka, L.G.; Shane, R.; Charity, R.J. [Departments of Chemistry and Physics, Washington University, Saint Louis, MO 63130 (United States)

    2009-12-21

    This paper presents the design and test results of an eight-channel prototype integrated circuit chip intended to greatly simplify the pulse-processing electronics needed for large arrays of scintillation detectors. Because the chip design employs (user-controlled) multi-region charge integration, particle identification is incorporated into the basic design. Each channel on the chip also contains a time-to-voltage converter which provides relative time information. The pulse-height integrals and the relative time are all stored on capacitors and are either reset, after a user controlled time, or sequentially read out if acquisition of the event is desired. Each of the three pulse-height sub-channels consists of a gated integrator with eight programmable charging rates and an externally programmable gate generator that defines the start (with four time ranges) and width (with four time ranges) of the gate relative to an external discriminator signal. The chip supports three triggering modes, two time ranges, two power modes, and produces four sparsified analog pulse trains (three for the integrators and another for the time) with synchronized addresses for off-chip digitization with a pipelined ADC. The eight-channel prototype chip occupies an area of 2.8 mm × 5.7 mm, dissipates 60 mW (low-power mode), and was fabricated in the AMI 0.5-μm process (C5N).
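
    The multi-region charge integration that underlies the particle identification can be illustrated in software with a two-gate tail-to-total ratio on a synthetic scintillation pulse; gate positions and decay constants below are invented, and the chip of course performs the integration with analog gated integrators.

        import numpy as np

        dt = 1.0                                   # ns per sample
        t = np.arange(0, 400, dt)

        def pulse(fast_frac, tau_fast=20.0, tau_slow=150.0):
            """Synthetic two-component scintillation pulse."""
            return fast_frac * np.exp(-t / tau_fast) + (1 - fast_frac) * np.exp(-t / tau_slow)

        def psd_ratio(p, total_gate=(0, 400), tail_gate=(60, 400)):
            """Tail-to-total charge ratio from two integration gates."""
            total = p[(t >= total_gate[0]) & (t < total_gate[1])].sum() * dt
            tail = p[(t >= tail_gate[0]) & (t < tail_gate[1])].sum() * dt
            return tail / total

        gamma_like = pulse(fast_frac=0.9)          # mostly fast component
        neutron_like = pulse(fast_frac=0.6)        # larger slow component
        print("gamma-like PSD:", round(psd_ratio(gamma_like), 3))
        print("neutron-like PSD:", round(psd_ratio(neutron_like), 3))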

  1. Quantification of integrated HIV DNA by repetitive-sampling Alu-HIV PCR on the basis of Poisson statistics.

    Science.gov (United States)

    De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos

    2014-06-01

    Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR by use of dilutions of an integration standard and on samples of 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need of a standard dilution curve. Implementation of the CI estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
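
    The core of such a Poisson analysis fits in a few lines: if k of n replicate reactions are negative, the maximum-likelihood estimate of the mean number of detectable proviruses per reaction is λ = −ln(k/n). The sketch below (illustrative, not the published analysis) also derives an approximate confidence interval from the binomial uncertainty on the negative fraction.

        import numpy as np
        from scipy.stats import beta

        def copies_per_reaction(n_replicates, n_negative, conf=0.95):
            """Poisson estimate with Clopper-Pearson CI; assumes 0 < negatives < replicates."""
            p_neg = n_negative / n_replicates
            lam = -np.log(p_neg)                          # ML estimate of mean copies per reaction
            a = 1.0 - conf
            p_lo = beta.ppf(a / 2, n_negative, n_replicates - n_negative + 1)
            p_hi = beta.ppf(1 - a / 2, n_negative + 1, n_replicates - n_negative)
            return lam, -np.log(p_hi), -np.log(p_lo)

        lam, lo, hi = copies_per_reaction(42, 18)         # e.g. 18 negative wells out of 42 replicates
        print(f"~{lam:.2f} copies/reaction (95% CI {lo:.2f}-{hi:.2f})")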

  2. Loaded dice in Monte Carlo : importance sampling in phase space integration and probability distributions for discrepancies

    NARCIS (Netherlands)

    Hameren, Andreas Ferdinand Willem van

    2001-01-01

    Discrepancies play an important role in the study of uniformity properties of point sets. Their probability distributions help in the analysis of the efficiency of the Quasi Monte Carlo method of numerical integration, which uses point sets that are distributed more uniformly than sets of
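
    As a generic illustration of importance sampling in numerical integration (not an example taken from the thesis), the sketch below estimates the integral of exp(−10x) over [0, 1] with uniform sampling and with an exponential importance density concentrated where the integrand is large, and compares the per-sample spread of the two estimators.

        import numpy as np

        rng = np.random.default_rng(0)
        f = lambda x: np.exp(-10.0 * x)
        n = 10_000

        # Plain Monte Carlo with uniform points on [0, 1].
        x_uni = rng.uniform(0.0, 1.0, n)
        est_uni = f(x_uni).mean()

        # Importance density q(x) = a exp(-a x) / (1 - exp(-a)) on [0, 1], with a = 8.
        a = 8.0
        u = rng.uniform(0.0, 1.0, n)
        x_imp = -np.log(1.0 - u * (1.0 - np.exp(-a))) / a      # inverse-CDF sampling from q
        q = a * np.exp(-a * x_imp) / (1.0 - np.exp(-a))
        weights = f(x_imp) / q
        est_imp = weights.mean()

        exact = (1.0 - np.exp(-10.0)) / 10.0
        print(f"exact {exact:.5f}  uniform {est_uni:.5f}  importance {est_imp:.5f}")
        print(f"std per sample: uniform {f(x_uni).std():.4f}  importance {weights.std():.4f}")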

  3. Peripheral biomarkers revisited: integrative profiling of peripheral samples for psychiatric research.

    Science.gov (United States)

    Hayashi-Takagi, Akiko; Vawter, Marquis P; Iwamoto, Kazuya

    2014-06-15

    Peripheral samples, such as blood and skin, have been used for decades in psychiatric research as surrogates for central nervous system samples. Although the validity of the data obtained from peripheral samples has been questioned and other state-of-the-art techniques, such as human brain imaging, genomics, and induced pluripotent stem cells, seem to reduce the value of peripheral cells, accumulating evidence has suggested that revisiting peripheral samples is worthwhile. Here, we re-evaluate the utility of peripheral samples and argue that establishing an understanding of the common signaling and biological processes in the brain and peripheral samples is required for the validity of such models. First, we present an overview of the available types of peripheral cells and describe their advantages and disadvantages. We then briefly summarize the main achievements of omics studies, including epigenome, transcriptome, proteome, and metabolome analyses, as well as the main findings of functional cellular assays, the results of which imply that alterations in neurotransmission, metabolism, the cell cycle, and the immune system may be partially responsible for the pathophysiology of major psychiatric disorders such as schizophrenia. Finally, we discuss the future utility of peripheral samples for the development of biomarkers and tailor-made therapies, such as multimodal assays that are used as a battery of disease and trait pathways and that might be potent and complementary tools for use in psychiatric research. © 2013 Society of Biological Psychiatry. Published by Society of Biological Psychiatry. All rights reserved.

  4. Integration of georeferencing, habitat, sampling, and genetic data for documentation of wild plant genetic resources

    Science.gov (United States)

    Plant genetic resource collections provide novel materials to the breeding and research communities. Availability of detailed documentation of passport, phenotypic, and genetic data increases the value of the genebank accessions. Inclusion of georeferenced sources, habitats, and sampling data in co...

  5. A fast charge-integrating sample-and-hold circuit for fast decision-making with calorimeter arrays

    International Nuclear Information System (INIS)

    Schuler, G.

    1982-01-01

    This paper describes a fast charge-integrating sample-and-hold circuit, particularly suited to the fast trigger electronics used with large arrays of photomultipliers in total-energy measurements of high-energy particle interactions. During a gate logic pulse, the circuit charges a capacitor with the current fed into the signal input. The output voltage is equal to the voltage developed across the capacitor, which is held until a fast clear discharges the capacitor. The main characteristics of the fast charge-integrating sample-and-hold circuit are: i) a conversion factor of 1 V/220 pC; ii) a droop rate of 4 mV/μs for a 50 Ω load; and iii) a 1 μs fast-clear time. (orig.)

  6. Integration of electromagnetic induction sensor data in soil sampling scheme optimization using simulated annealing.

    Science.gov (United States)

    Barca, E; Castrignanò, A; Buttafuoco, G; De Benedetto, D; Passarella, G

    2015-07-01

    Soil survey is generally time-consuming, labor-intensive, and costly. Optimization of the sampling scheme allows one to reduce the number of sampling points without decreasing, and sometimes even while increasing, the accuracy of the investigated attribute. Maps of bulk soil electrical conductivity (ECa) recorded with electromagnetic induction (EMI) sensors could be effectively used to direct soil sampling design for assessing the spatial variability of soil moisture. A protocol, using a field-scale bulk ECa survey, has been applied in an agricultural field in the Apulia region (southeastern Italy). Spatial simulated annealing was used as a method to optimize the spatial soil sampling scheme taking into account sampling constraints, field boundaries, and preliminary observations. Three optimization criteria were used. The first criterion (minimization of the mean of the shortest distances, MMSD) optimizes the spreading of the point observations over the entire field by minimizing the expectation of the distance between an arbitrarily chosen point and its nearest observation; the second criterion (minimization of the weighted mean of the shortest distances, MWMSD) is a weighted version of the MMSD, which uses the digital gradient of the gridded ECa data as weighting function; and the third criterion (mean of average ordinary kriging variance, MAOKV) minimizes the mean kriging estimation variance of the target variable. The last criterion utilizes the variogram model of soil water content estimated in a previous trial. The procedures, or a combination of them, were tested and compared in a real case. Simulated annealing was implemented by the software MSANOS, able to define or redesign any sampling scheme by increasing or decreasing the original sampling locations. The output consists of the computed sampling scheme, the convergence time, and the cooling law, which can be an invaluable support to the process of sampling design. The proposed approach has found the optimal solution in a reasonable computation time. The
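
    A minimal version of spatial simulated annealing for the MMSD criterion is easy to sketch: repeatedly perturb one sample location, accept the move if it lowers the mean nearest-sample distance (or occasionally if it does not), and cool the acceptance temperature. Grid size, sample size and cooling schedule below are illustrative, not those of the study.

        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(5)
        field = np.array([(i, j) for i in range(50) for j in range(50)], dtype=float)  # evaluation grid

        def mmsd(samples):
            """Mean distance from every field location to its nearest sample point."""
            d, _ = cKDTree(samples).query(field)
            return d.mean()

        n_samples, temp, cooling = 25, 1.0, 0.995
        current = field[rng.choice(len(field), n_samples, replace=False)].copy()
        score = mmsd(current)

        for _ in range(3000):
            candidate = current.copy()
            k = rng.integers(n_samples)
            candidate[k] += rng.normal(0.0, 3.0, 2)                 # perturb one sample point
            candidate[k] = np.clip(candidate[k], 0.0, 49.0)         # keep it inside the field
            new_score = mmsd(candidate)
            if new_score < score or rng.random() < np.exp((score - new_score) / temp):
                current, score = candidate, new_score               # accept improvement or occasional uphill move
            temp *= cooling

        print(f"optimised MMSD: {score:.2f} grid units")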

  7. Immunoassay of C-reactive protein by hot electron induced electrochemiluminescence using integrated electrodes with hydrophobic sample confinement

    Energy Technology Data Exchange (ETDEWEB)

    Ylinen-Hinkka, T., E-mail: tiina.ylinen-hinkka@aalto.fi [Laboratory of Analytical Chemistry, Aalto University School of Chemical Technology, P.O. Box 16100, FI-00076 Aalto (Finland); Niskanen, A.J.; Franssila, S. [Department of Materials Science and Engineering, Aalto University School of Chemical Technology, P.O. Box 16200, FI-00076 Aalto (Finland); Kulmala, S. [Laboratory of Analytical Chemistry, Aalto University School of Chemical Technology, P.O. Box 16100, FI-00076 Aalto (Finland)

    2011-09-19

    Highlights: • C-reactive protein has been determined in the concentration range 0.01-10 mg L⁻¹ using an electrochemiluminescence microchip which employs integrated electrodes with hydrophobic sample confinement. • This arrangement enables very simple and fast CRP analysis amenable to point-of-care applications. - Abstract: C-reactive protein (CRP) was determined in the concentration range 0.01-10 mg L⁻¹ using hot electron induced electrochemiluminescence (HECL) with devices combining both working and counter electrodes and sample confinement on a single chip. The sample area on the electrodes was defined by a hydrophobic ring, which enabled dispensing the reagents and the analyte directly on the electrode. Immunoassay of CRP by HECL using integrated electrodes is a good candidate for a high-sensitivity point-of-care CRP test, because the concentration range is suitable, miniaturisation of the measurement system has been demonstrated and the assay method with integrated electrodes is easy to use. High-sensitivity CRP tests can be used to monitor the current state of cardiovascular disease and also to predict future cardiovascular problems in apparently healthy people.

  8. Finite Sample Comparison of Parametric, Semiparametric, and Wavelet Estimators of Fractional Integration

    DEFF Research Database (Denmark)

    Nielsen, Morten Ø.; Frederiksen, Per Houmann

    2005-01-01

    In this paper we compare through Monte Carlo simulations the finite sample properties of estimators of the fractional differencing parameter, d. This involves frequency domain, time domain, and wavelet based approaches, and we consider both parametric and semiparametric estimation methods. The estimators are briefly introduced and compared, and the criteria adopted for measuring finite sample performance are bias and root mean squared error. Most importantly, the simulations reveal that (1) the frequency domain maximum likelihood procedure is superior to the time domain parametric methods, (2) all... ...the time domain parametric methods, and (4) without sufficient trimming of scales the wavelet-based estimators are heavily biased.
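
    One of the semiparametric estimators in this family, the Geweke-Porter-Hudak (GPH) log-periodogram regression, is short enough to sketch directly; the bandwidth choice and the ARFIMA(0,d,0) test series below are illustrative and not tied to the paper's simulation design.

        import numpy as np
        from scipy.special import gammaln

        def gph_estimate(x, bandwidth_power=0.5):
            """Geweke-Porter-Hudak log-periodogram estimate of d."""
            n = len(x)
            m = int(n ** bandwidth_power)                   # number of low Fourier frequencies used
            j = np.arange(1, m + 1)
            lam = 2.0 * np.pi * j / n
            dft = np.fft.fft(x - x.mean())[1:m + 1]
            periodogram = np.abs(dft) ** 2 / (2.0 * np.pi * n)
            regressor = np.log(4.0 * np.sin(lam / 2.0) ** 2)
            slope = np.polyfit(regressor, np.log(periodogram), 1)[0]
            return -slope

        # Quick check on an ARFIMA(0, d, 0) series built from its truncated MA expansion.
        d_true, n = 0.3, 4096
        k = np.arange(n)
        psi = np.exp(gammaln(k + d_true) - gammaln(k + 1) - gammaln(d_true))   # MA weights, truncated
        eps = np.random.default_rng(1).standard_normal(2 * n)
        series = np.array([psi @ eps[n + t - np.arange(n)] for t in range(n)])
        print("estimated d:", round(gph_estimate(series), 2))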

  9. Total elimination of sampling errors in polarization imagery obtained with integrated microgrid polarimeters.

    Science.gov (United States)

    Tyo, J Scott; LaCasse, Charles F; Ratliff, Bradley M

    2009-10-15

    Microgrid polarimeters operate by integrating a focal plane array with an array of micropolarizers. The Stokes parameters are estimated by comparing polarization measurements from pixels in a neighborhood around the point of interest. The main drawback is that the measurements used to estimate the Stokes vector are made at different locations, leading to a false polarization signature owing to instantaneous field-of-view (IFOV) errors. We demonstrate for the first time, to our knowledge, that spatially band limited polarization images can be ideally reconstructed with no IFOV error by using a linear system framework.

  10. Macrophyte species distribution, indices of biotic integrity and sampling intensity in isolated Florida marshes

    Science.gov (United States)

    This study tested macrophyte condition metrics calculated after decreasing the sampling effort and area by 33% to 66% in 74 emergent isolated wetlands. Four belted transects from wetland edge to center were established and rooted macrophytes were identified. The eff

  11. GIBSI: an integrated modelling system for watershed management – sample applications and current developments

    Directory of Open Access Journals (Sweden)

    A. N. Rousseau

    2007-11-01

    Full Text Available Hydrological and pollutant fate models have long been developed for research purposes. Today, they find an application in integrated watershed management, as decision support systems (DSS). GIBSI is such a DSS designed to assist stakeholders in watershed management. It includes a watershed database coupled to a GIS and accessible through a user-friendly interface, as well as modelling tools that simulate, on a daily time step, hydrological processes such as evapotranspiration, runoff, soil erosion, agricultural pollutant transport and surface water quality. Therefore, GIBSI can be used to assess a priori the effect of management scenarios (reservoirs, land use, waste water effluents, diffuse sources of pollution, that is, agricultural pollution) on surface hydrology and water quality. For illustration purposes, this paper presents several management-oriented applications using GIBSI on the 6680 km² Chaudière River watershed, located near Quebec City (Canada). They include impact assessments of: (i) a municipal clean water program; (ii) agricultural nutrient management scenarios; (iii) past and future land use changes; as well as (iv) determination of achievable performance standards of pesticide management practices. Current and future developments of GIBSI are also presented, as these will extend current uses of this tool and make it useable and applicable by stakeholders on other watersheds. Finally, the conclusion emphasizes some of the challenges that remain for a better use of DSS in integrated watershed management.

  12. Integration of sampling based battery state of health estimation method in electric vehicles

    International Nuclear Information System (INIS)

    Ozkurt, Celil; Camci, Fatih; Atamuradov, Vepa; Odorry, Christopher

    2016-01-01

    Highlights: • Presentation of a prototype system with full charge-discharge cycling capability. • Presentation of SoH estimation results for systems degraded in the lab. • Discussion of integration alternatives of the presented method in EVs. • Simulation model based on the presented SoH estimation for a real EV battery system. • Optimization of the number of battery cells to be selected for the SoH test. - Abstract: Battery cost is one of the crucial parameters negatively affecting the wide deployment of Electric Vehicles (EVs). Accurate State of Health (SoH) estimation plays an important role in reducing the total ownership cost and improving the availability and safety of the battery, avoiding early disposal of the batteries and decreasing unexpected failures. A circuit design for SoH estimation in a battery system, based on selected battery cells, and its integration into EVs are presented in this paper. A prototype microcontroller has been developed and used for accelerated aging tests of a battery system. The data collected in the lab tests have been utilized to simulate a real EV battery system. Results of the accelerated aging tests and the simulation are presented in the paper. The paper also discusses identification of the best number of battery cells to be selected for the SoH estimation test. In addition, different application options of the presented approach for EV batteries are discussed in the paper.

  13. An integrated paper-based sample-to-answer biosensor for nucleic acid testing at the point of care.

    Science.gov (United States)

    Choi, Jane Ru; Hu, Jie; Tang, Ruihua; Gong, Yan; Feng, Shangsheng; Ren, Hui; Wen, Ting; Li, XiuJun; Wan Abas, Wan Abu Bakar; Pingguan-Murphy, Belinda; Xu, Feng

    2016-02-07

    With advances in point-of-care testing (POCT), lateral flow assays (LFAs) have been explored for nucleic acid detection. However, biological samples generally contain complex compositions and low amounts of target nucleic acids, and currently require laborious off-chip nucleic acid extraction and amplification processes (e.g., tube-based extraction and polymerase chain reaction (PCR)) prior to detection. To the best of our knowledge, even though the integration of DNA extraction and amplification into a paper-based biosensor has been reported, a combination of LFA with the aforementioned steps for simple colorimetric readout has not yet been demonstrated. Here, we demonstrate for the first time an integrated paper-based biosensor incorporating nucleic acid extraction, amplification and visual detection or quantification using a smartphone. A handheld battery-powered heating device was specially developed for nucleic acid amplification in POC settings, which is coupled with this simple assay for rapid target detection. The biosensor can successfully detect Escherichia coli (as a model analyte) in spiked drinking water, milk, blood, and spinach with a detection limit as low as 10-1000 CFU mL⁻¹, and Streptococcus pneumoniae in clinical blood samples, highlighting its potential use in medical diagnostics, food safety analysis and environmental monitoring. As compared to the lengthy conventional assay, which requires more than 5 hours for the entire sample-to-answer process, it takes about 1 hour for our integrated biosensor. The integrated biosensor holds great potential for detection of various target analytes for wide applications in the near future.

  14. An integrated rock magnetic and EPR study in soil samples from a hydrocarbon prospective area

    Science.gov (United States)

    González, F.; Aldana, M.; Costanzo-Álvarez, V.; Díaz, M.; Romero, I.

Magnetic susceptibility (MS) and organic matter free radical concentration (OMFRC), determined by electron paramagnetic resonance, have been measured in soil samples (≈1.5 m depth) from an oil prospective area located on the southern flank of the Venezuelan Andean Range. S-ratios close to 1, as well as high-temperature susceptibility analyses, reveal magnetite as the chief magnetic phase in most of these samples. Ethane concentrations and normalized MS and OMFRC data have been plotted against the relative position of 22 sampling sites sequentially arranged from north to south. Although there is no linear correlation between the MS and OMFRC data, these two profiles seem to vary in like fashion. A southern MS and OMFRC anomaly coincides with the zone of highest ethane concentration, which overlies a “Cretaceous kitchen”. OMFRC highs could be linked to the degradation or alteration of organic matter, the possible result of hydrocarbon gas leakage, whose surface expression is the stressed ferns observed by remote sensing studies previously performed in the area. Ethane anomalies are associated with this seepage, which also produces changes in the magnetic mineralogy detected as positive MS anomalies.

  15. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Directory of Open Access Journals (Sweden)

    Andreas Steimer

Full Text Available Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have been conducted that deal with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which is missing such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing
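As a concrete illustration of the sampling idea described above, the following sketch (a minimal assumption of the setup, not the authors' code or parameter values) integrates an exponential integrate-and-fire neuron under noisy drive with simple Euler steps and collects the interspike intervals that would serve as samples.

```python
import numpy as np

# Minimal sketch of an exponential integrate-and-fire (EIF) neuron driven by
# noisy input, collecting interspike intervals (ISIs) as "samples".
# Parameter values are illustrative assumptions, not those used in the paper.

rng = np.random.default_rng(0)
dt, T = 1e-4, 20.0                 # time step [s], total simulated time [s]
tau_m, E_L = 20e-3, -65e-3         # membrane time constant [s], resting potential [V]
V_T, delta_T = -50e-3, 2e-3        # soft threshold and slope factor [V]
V_spike, V_reset = -30e-3, -60e-3  # spike cut-off and reset [V]
mu, sigma = 14e-3, 4e-3            # mean drive and noise amplitude [V]

V, last_spike, isis = E_L, 0.0, []
for step in range(int(T / dt)):
    t = step * dt
    # EIF dynamics: leak + exponential sodium term + mean drive, plus noise
    dV = (-(V - E_L) + delta_T * np.exp((V - V_T) / delta_T) + mu) / tau_m
    V += dV * dt + sigma * np.sqrt(dt / tau_m) * rng.standard_normal()
    if V >= V_spike:               # spike: record the ISI and reset
        isis.append(t - last_spike)
        last_spike, V = t, V_reset

isis = np.array(isis)
print(len(isis), isis.mean())      # number of ISI samples and their mean
```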

  16. Use of spatially distributed time-integrated sediment sampling networks and distributed fine sediment modelling to inform catchment management.

    Science.gov (United States)

    Perks, M T; Warburton, J; Bracken, L J; Reaney, S M; Emery, S B; Hirst, S

    2017-11-01

Under the EU Water Framework Directive, suspended sediment is omitted from environmental quality standards and compliance targets. This omission is partly explained by difficulties in assessing the complex dose-response of ecological communities. But equally, it is hindered by a lack of spatially distributed estimates of suspended sediment variability across catchments. In this paper, we demonstrate the inability of traditional, discrete sampling campaigns for assessing exposure to fine sediment. Sampling frequencies based on Environmental Quality Standard protocols, whilst reflecting typical manual sampling constraints, are unable to determine the magnitude of sediment exposure with an acceptable level of precision. Deviations from actual concentrations range between -35 and +20% based on the interquartile range of simulations. As an alternative, we assess the value of low-cost, suspended sediment sampling networks for quantifying suspended sediment transfer (SST). In this study of the 362 km² upland Esk catchment we observe that spatial patterns of sediment flux are consistent over the two year monitoring period across a network of 17 monitoring sites. This enables the key contributing sub-catchments of Butter Beck (SST: 1141 t km⁻² yr⁻¹) and Glaisdale Beck (SST: 841 t km⁻² yr⁻¹) to be identified. The time-integrated samplers offer a feasible alternative to traditional infrequent and discrete sampling approaches for assessing spatio-temporal changes in contamination. In conjunction with a spatially distributed diffuse pollution model (SCIMAP), time-integrated sediment sampling is an effective means of identifying critical sediment source areas in the catchment, which can better inform sediment management strategies for pollution prevention and control. Copyright © 2017 Elsevier Ltd. All rights reserved.
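The abstract reports sub-catchment sediment transfer in t km⁻² yr⁻¹; a back-of-the-envelope sketch of how such a figure is derived from a time-integrated sampler record is given below. The scaling from trapped mass to catchment load is a simplifying assumption for illustration, not the authors' calibration procedure.

```python
# Minimal sketch: converting a time-integrated sampler record into a specific
# suspended sediment transfer (SST) figure in t per km^2 per year.
# The mass-to-load factor is a placeholder assumption; in practice it comes
# from calibrating the samplers against monitored loads.

def specific_sst(trapped_mass_kg: float,
                 deployment_days: float,
                 catchment_area_km2: float,
                 mass_to_load_factor: float) -> float:
    """Annualised sediment transfer per unit catchment area (t km^-2 yr^-1)."""
    load_kg = trapped_mass_kg * mass_to_load_factor      # scale trap mass to catchment load
    load_t_per_year = (load_kg / 1000.0) * (365.0 / deployment_days)
    return load_t_per_year / catchment_area_km2

# Hypothetical sub-catchment: 1.2 kg trapped over 30 days, 15 km^2 area
print(specific_sst(trapped_mass_kg=1.2, deployment_days=30,
                   catchment_area_km2=15, mass_to_load_factor=5.0e5))
```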

  17. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Science.gov (United States)

    Steimer, Andreas; Schindler, Kaspar

    2015-01-01

Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have been conducted that deal with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which is missing such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing computational

  18. Integrated Circuits for Rapid Sample Processing and Electrochemical Detection of Biomarkers

    Science.gov (United States)

    Besant, Justin

The trade-off between speed and sensitivity of detection is a fundamental challenge in the design of point-of-care diagnostics. As the relevant molecules in many diseases exist natively at extremely low levels, many gold-standard diagnostic tests are designed with high sensitivity at the expense of long incubations needed to amplify the target analytes. The central aim of this thesis is to design new strategies to detect biologically relevant analytes with both high speed and sensitivity. The response time of a biosensor is limited by the ability of the target analyte to accumulate to detectable levels at the sensor surface. We overcome this limitation by designing a range of integrated devices to optimize the flux of the analyte to the sensor by increasing the effective analyte concentration, shortening the required diffusion distance, and confining the analyte in close proximity to the sensor. We couple these devices with novel ultrasensitive electrochemical transduction strategies to convert rare analytes into a detectable signal. We showcase the clinical utility of these approaches with several applications including cancer diagnosis, bacterial identification, and antibiotic susceptibility profiling. We design and optimize a device to isolate rare cancer cells from the bloodstream with near 100% efficiency and 10 000-fold specificity. We analyse pathogen-specific nucleic acids by lysing bacteria in close proximity to an electrochemical sensor and find that this approach has 10-fold higher sensitivity than standard lysis in bulk solution. We design an electronic chip to read out the antibiotic susceptibility profile with an hour-long incubation by concentrating bacteria into nanoliter chambers with integrated electrodes. Finally, we report a strategy for ultrasensitive visual readout of nucleic acids as low as 100 fM within 10 minutes using an amplification cascade. The strategies presented could guide the development of fast, sensitive and low-cost diagnostics.
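The summary argues that response time is limited by analyte transport to the sensor; a simple diffusion-time estimate (a generic textbook relation offered here as an illustrative assumption, not a calculation taken from the thesis) shows why shortening the diffusion distance matters so much.

```python
# Minimal sketch: characteristic 1-D diffusion time, t ~ L^2 / (2 D).
# The diffusivity below is a generic assumption for a small protein in water,
# not a value from the thesis.

def diffusion_time_s(distance_m: float, diffusivity_m2_s: float) -> float:
    """Approximate time for an analyte to diffuse a distance L."""
    return distance_m ** 2 / (2.0 * diffusivity_m2_s)

D = 1e-10                        # m^2/s, assumed small-protein diffusivity
for L in (1e-3, 100e-6, 10e-6):  # 1 mm, 100 um, 10 um diffusion distances
    print(f"L = {L*1e6:6.0f} um -> t ~ {diffusion_time_s(L, D):8.1f} s")
# Shrinking the distance from 1 mm to 10 um cuts the diffusion time
# from ~5000 s to ~0.5 s, which is the motivation for analyte confinement.
```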

  19. The Lyman alpha reference sample. II. Hubble space telescope imaging results, integrated properties, and trends

    Energy Technology Data Exchange (ETDEWEB)

Hayes, Matthew; Östlin, Göran; Duval, Florent; Sandberg, Andreas; Guaita, Lucia; Melinder, Jens; Rivera-Thorsen, Thøger [Department of Astronomy, Oskar Klein Centre, Stockholm University, AlbaNova University Centre, SE-106 91 Stockholm (Sweden); Adamo, Angela [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Schaerer, Daniel [Université de Toulouse, UPS-OMP, IRAP, F-31000 Toulouse (France); Verhamme, Anne; Orlitová, Ivana [Geneva Observatory, University of Geneva, 51 Chemin des Maillettes, CH-1290 Versoix (Switzerland); Mas-Hesse, J. Miguel; Otí-Floranes, Héctor [Centro de Astrobiología (CSIC-INTA), Departamento de Astrofísica, P.O. Box 78, E-28691 Villanueva de la Cañada (Spain); Cannon, John M.; Pardy, Stephen [Department of Physics and Astronomy, Macalester College, 1600 Grand Avenue, Saint Paul, MN 55105 (United States); Atek, Hakim [Laboratoire d'Astrophysique, École Polytechnique Fédérale de Lausanne (EPFL), Observatoire, CH-1290 Sauverny (Switzerland); Kunth, Daniel [Institut d'Astrophysique de Paris, UMR 7095, CNRS and UPMC, 98 bis Bd Arago, F-75014 Paris (France); Laursen, Peter [Dark Cosmology Centre, Niels Bohr Institute, University of Copenhagen, Juliane Maries Vej 30, DK-2100 Copenhagen (Denmark); Herenz, E. Christian, E-mail: matthew@astro.su.se [Leibniz-Institut für Astrophysik (AIP), An der Sternwarte 16, D-14482 Potsdam (Germany)

    2014-02-10

We report new results regarding the Lyα output of galaxies, derived from the Lyman Alpha Reference Sample, and focused on Hubble Space Telescope imaging. For 14 galaxies we present intensity images in Lyα, Hα, and UV, and maps of Hα/Hβ, Lyα equivalent width (EW), and Lyα/Hα. We present Lyα and UV radial light profiles and show they are well-fitted by Sérsic profiles, but Lyα profiles show indices systematically lower than those of the UV (n ≈ 1-2 instead of ≳ 4). This reveals a general lack of the central concentration in Lyα that is ubiquitous in the UV. Photometric growth curves increase more slowly for Lyα than the far ultraviolet, showing that small apertures may underestimate the EW. For most galaxies, however, flux and EW curves flatten by radii ≈10 kpc, suggesting that if placed at high-z only a few of our galaxies would suffer from large flux losses. We compute global properties of the sample in large apertures, and show total Lyα luminosities to be independent of all other quantities. Normalized Lyα throughput, however, shows significant correlations: escape is found to be higher in galaxies of lower star formation rate, dust content, mass, and nebular quantities that suggest harder ionizing continuum and lower metallicity. Six galaxies would be selected as high-z Lyα emitters, based upon their luminosity and EW. We discuss the results in the context of high-z Lyα and UV samples. A few galaxies have EWs above 50 Å, and one shows f_esc^Lyα of 80%; such objects have not previously been reported at low-z.
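The radial light profiles mentioned above are fitted with Sérsic profiles; the following sketch (the standard textbook form of the profile, with illustrative parameter values that are not taken from the paper) shows how a shallow, exponential-like profile (n ≈ 1, as found for Lyα) compares with a centrally concentrated one (n ≈ 4, as found for the UV).

```python
import numpy as np

# Minimal sketch of the Sersic surface-brightness profile,
#   I(r) = I_e * exp(-b_n * ((r/r_e)**(1/n) - 1)),
# with the common approximation b_n ~ 2n - 1/3. Parameter values are
# illustrative, not measurements from the Lyman Alpha Reference Sample.

def sersic(r, I_e=1.0, r_e=1.0, n=1.0):
    b_n = 2.0 * n - 1.0 / 3.0          # approximation, adequate for n >~ 0.5
    return I_e * np.exp(-b_n * ((r / r_e) ** (1.0 / n) - 1.0))

r = np.linspace(0.1, 5.0, 6)           # radii in units of the effective radius
print(sersic(r, n=1.0))                # shallow, exponential-like (Lya-like)
print(sersic(r, n=4.0))                # centrally concentrated (UV-like)
```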

  20. The Lyman alpha reference sample. II. Hubble space telescope imaging results, integrated properties, and trends

    International Nuclear Information System (INIS)

    Hayes, Matthew; Östlin, Göran; Duval, Florent; Sandberg, Andreas; Guaita, Lucia; Melinder, Jens; Rivera-Thorsen, Thøger; Adamo, Angela; Schaerer, Daniel; Verhamme, Anne; Orlitová, Ivana; Mas-Hesse, J. Miguel; Otí-Floranes, Héctor; Cannon, John M.; Pardy, Stephen; Atek, Hakim; Kunth, Daniel; Laursen, Peter; Herenz, E. Christian

    2014-01-01

We report new results regarding the Lyα output of galaxies, derived from the Lyman Alpha Reference Sample, and focused on Hubble Space Telescope imaging. For 14 galaxies we present intensity images in Lyα, Hα, and UV, and maps of Hα/Hβ, Lyα equivalent width (EW), and Lyα/Hα. We present Lyα and UV radial light profiles and show they are well-fitted by Sérsic profiles, but Lyα profiles show indices systematically lower than those of the UV (n ≈ 1-2 instead of ≳ 4). This reveals a general lack of the central concentration in Lyα that is ubiquitous in the UV. Photometric growth curves increase more slowly for Lyα than the far ultraviolet, showing that small apertures may underestimate the EW. For most galaxies, however, flux and EW curves flatten by radii ≈10 kpc, suggesting that if placed at high-z only a few of our galaxies would suffer from large flux losses. We compute global properties of the sample in large apertures, and show total Lyα luminosities to be independent of all other quantities. Normalized Lyα throughput, however, shows significant correlations: escape is found to be higher in galaxies of lower star formation rate, dust content, mass, and nebular quantities that suggest harder ionizing continuum and lower metallicity. Six galaxies would be selected as high-z Lyα emitters, based upon their luminosity and EW. We discuss the results in the context of high-z Lyα and UV samples. A few galaxies have EWs above 50 Å, and one shows f_esc^Lyα of 80%; such objects have not previously been reported at low-z.

  1. Calibration model maintenance in melamine resin production: Integrating drift detection, smart sample selection and model adaptation.

    Science.gov (United States)

    Nikzad-Langerodi, Ramin; Lughofer, Edwin; Cernuda, Carlos; Reischer, Thomas; Kantner, Wolfgang; Pawliczek, Marcin; Brandstetter, Markus

    2018-07-12

The physico-chemical properties of Melamine Formaldehyde (MF) based thermosets are largely influenced by the degree of polymerization (DP) in the underlying resin. On-line supervision of the turbidity point by means of vibrational spectroscopy has recently emerged as a promising technique to monitor the DP of MF resins. However, spectroscopic determination of the DP relies on chemometric models, which are usually sensitive to drifts caused by instrumental and/or sample-associated changes occurring over time. In order to detect the time point when drifts start causing prediction bias, we here explore a universal drift detector based on a faded version of the Page-Hinkley (PH) statistic, which we test in three data streams from an industrial MF resin production process. We employ committee disagreement (CD), computed as the variance of model predictions from an ensemble of partial least squares (PLS) models, as a measure for sample-wise prediction uncertainty and use the PH statistic to detect changes in this quantity. We further explore supervised and unsupervised strategies for (semi-)automatic model adaptation upon detection of a drift. For the former, manual reference measurements are requested whenever statistical thresholds on Hotelling's T² and/or Q-Residuals are violated. Models are subsequently re-calibrated using weighted partial least squares in order to increase the influence of newer samples, which increases the flexibility when adapting to new (drifted) states. Unsupervised model adaptation is carried out exploiting the dual antecedent-consequent structure of a recently developed fuzzy systems variant of PLS termed FLEXFIS-PLS. In particular, antecedent parts are updated while maintaining the internal structure of the local linear predictors (i.e. the consequents). We found improved drift detection capability of the CD compared to Hotelling's T² and Q-Residuals when used in combination with the proposed PH test. Furthermore, we found that active
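The abstract describes drift detection by applying a Page-Hinkley test to the committee disagreement of a PLS ensemble; the sketch below implements the plain (non-faded) PH test on a generic uncertainty stream, as an illustrative assumption of how such a detector operates rather than a reproduction of the authors' faded variant or thresholds.

```python
# Minimal sketch of a Page-Hinkley (PH) drift detector applied to a stream of
# per-sample uncertainty values (e.g. committee disagreement of a PLS ensemble).
# Plain PH statistic; the paper uses a faded variant, and the delta/threshold
# values here are illustrative assumptions.

class PageHinkley:
    def __init__(self, delta: float = 0.005, threshold: float = 0.5):
        self.delta = delta          # tolerated drift magnitude
        self.threshold = threshold  # alarm threshold (lambda)
        self.n = 0
        self.mean = 0.0
        self.cum = 0.0              # cumulative deviation m_t
        self.cum_min = 0.0          # running minimum of m_t

    def update(self, x: float) -> bool:
        """Feed one observation; return True if a drift alarm is raised."""
        self.n += 1
        self.mean += (x - self.mean) / self.n
        self.cum += x - self.mean - self.delta
        self.cum_min = min(self.cum_min, self.cum)
        return (self.cum - self.cum_min) > self.threshold

# Hypothetical uncertainty stream: stable at first, then drifting upwards
stream = [0.10, 0.11, 0.09, 0.10, 0.12, 0.30, 0.35, 0.40, 0.45, 0.50]
ph = PageHinkley()
for i, cd in enumerate(stream):
    if ph.update(cd):
        print("drift detected at sample", i)
        break
```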

  2. Automation and integration of multiplexed on-line sample preparation with capillary electrophoresis for DNA sequencing

    Energy Technology Data Exchange (ETDEWEB)

    Tan, H.

    1999-03-31

The purpose of this research is to develop a multiplexed sample processing system in conjunction with multiplexed capillary electrophoresis for high-throughput DNA sequencing. The concept from DNA template to called bases was first demonstrated with a manually operated single capillary system. Later, an automated microfluidic system with 8 channels based on the same principle was successfully constructed. The instrument automatically processes 8 templates through reaction, purification, denaturation, pre-concentration, injection, separation and detection in a parallel fashion. A multiplexed freeze/thaw switching principle and a distribution network were implemented to manage flow direction and sample transportation. Dye-labeled terminator cycle-sequencing reactions are performed in an 8-capillary array in a hot air thermal cycler. Subsequently, the sequencing ladders are directly loaded into a corresponding size-exclusion chromatographic column operated at ≈60 °C for purification. On-line denaturation and stacking injection for capillary electrophoresis is simultaneously accomplished at a cross assembly set at ≈70 °C. Not only the separation capillary array but also the reaction capillary array and purification columns can be regenerated after every run. DNA sequencing data from this system allow base calling up to 460 bases with an accuracy of 98%.

  3. Utility of the Community Integration Questionnaire in a sample of adults with neurological and neuropsychiatric disorders receiving prevocational training.

    Science.gov (United States)

    Tomaszewski, Robert; Mitrushina, Maura

    2015-08-03

To investigate the utility of the Community Integration Questionnaire (CIQ) in a mixed sample of adults with neurological and neuropsychiatric disorders. Cross-sectional, interview-based study. Participants were community-dwelling adults with disabilities resulting from neurological and neuropsychiatric disorders (N = 54), who participated in a pre-vocational readiness and social skills training program. Psychometric properties of the Community Integration Questionnaire (CIQ) were assessed and validated against the Mayo-Portland Adaptability Inventory (MPAI) and the Problem Checklist from the New York University Head Injury Family Interview (PCL). Based on the revised scoring procedures, psychometric properties of the CIQ Home Competency scale were excellent, followed by the Total score and the Social Integration scale. The Productive Activity scale had low content validity and a weak association with the total score. Convergent and discriminant validity of the CIQ were demonstrated by correlation patterns with MPAI scales in the expected direction. A significant relationship was found with the PCL Physical/Dependency scale. Significant associations were found with sex, living status, and record of subsequent employment. The results provide support for the use of the CIQ as a measure of participation in individuals with neurological and neuropsychiatric diagnoses and resulting disabilities. Implications for Rehabilitation: An important goal of rehabilitation and training programs for individuals with dysfunction of the central nervous system is to promote their participation in social, vocational, and domestic activities. The Community Integration Questionnaire (CIQ) is a brief and efficient instrument for measuring these participation domains. This study demonstrated good psychometric properties and high utility of the CIQ in a sample of 54 individuals participating in a prevocational training program.

  4. Calibration and field performance of membrane-enclosed sorptive coating for integrative passive sampling of persistent organic pollutants in water

    International Nuclear Information System (INIS)

    Vrana, Branislav; Paschke, Albrecht; Popp, Peter

    2006-01-01

    Membrane-enclosed sorptive coating (MESCO) is a miniaturised monitoring device that enables integrative passive sampling of persistent, hydrophobic organic pollutants in water. The system combines the passive sampling with solventless preconcentration of organic pollutants from water and subsequent desorption of analytes on-line into a chromatographic system. Exchange kinetics of chemicals between water and MESCO was studied at different flow rates of water, in order to characterize the effect of variable environmental conditions on the sampler performance, and to identify a method for in situ correction of the laboratory-derived calibration data. It was found that the desorption of chemicals from MESCO into water is isotropic to the absorption of the analytes onto the sampler under the same exposure conditions. This allows for the in situ calibration of the uptake of pollutants using elimination kinetics of performance reference compounds and more accurate estimates of target analyte concentrations. A field study was conducted to test the sampler performance alongside spot sampling. A good agreement of contaminant patterns and water concentrations was obtained by the two sampling techniques. - A robust calibration method of a passive sampling device for monitoring of persistent organic pollutants in water is described
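Passive samplers of this kind are usually evaluated by converting the absorbed mass into a time-weighted average water concentration, with the elimination of performance reference compounds (PRCs) used to adjust the laboratory sampling rate to field conditions. The sketch below shows that generic first-order treatment; the relations are standard passive-sampling formulas and the numbers are assumptions, not the MESCO calibration data.

```python
import math

# Minimal sketch of time-weighted average (TWA) concentration estimation for an
# integrative passive sampler, with an in situ correction of the sampling rate
# from performance reference compound (PRC) dissipation. All numerical values
# are illustrative assumptions.

def prc_corrected_sampling_rate(rs_lab_l_per_day: float,
                                prc_fraction_remaining: float,
                                ke_lab_per_day: float,
                                exposure_days: float) -> float:
    """Scale the lab sampling rate by the ratio of field to lab PRC elimination rates."""
    ke_field = -math.log(prc_fraction_remaining) / exposure_days
    return rs_lab_l_per_day * (ke_field / ke_lab_per_day)

def twa_concentration_ng_per_l(absorbed_mass_ng: float,
                               rs_field_l_per_day: float,
                               exposure_days: float) -> float:
    """Linear-uptake (integrative) approximation: Cw = N / (Rs * t)."""
    return absorbed_mass_ng / (rs_field_l_per_day * exposure_days)

rs_field = prc_corrected_sampling_rate(rs_lab_l_per_day=0.5,
                                       prc_fraction_remaining=0.6,
                                       ke_lab_per_day=0.04,
                                       exposure_days=14)
print(twa_concentration_ng_per_l(absorbed_mass_ng=25.0,
                                 rs_field_l_per_day=rs_field,
                                 exposure_days=14))   # ~3.9 ng/L
```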

  5. Integrated preservation and sample clean up procedures for studying water ingestion by recreational swimmers via urinary biomarker determination.

    Science.gov (United States)

    Cantú, Ricardo; Shoemaker, Jody A; Kelty, Catherine A; Wymer, Larry J; Behymer, Thomas D; Dufour, Alfred P; Magnuson, Matthew L

    2017-08-22

The use of cyanuric acid as a biomarker for ingestion of swimming pool water may lead to quantitative knowledge of the volume of water ingested during swimming, contributing to a better understanding of disease resulting from ingestion of environmental contaminants. When swimming pool water containing chlorinated cyanurates is inadvertently ingested, cyanuric acid is excreted quantitatively within 24 h as a urinary biomarker of ingestion. Because the volume of water ingested can be quantitatively estimated by calculation from the concentration of cyanuric acid in 24 h urine samples, a procedure for preservation, cleanup, and analysis of cyanuric acid was developed to meet the logistical demands of large scale studies. From a practical standpoint, urine collected from swimmers cannot be analyzed immediately, given requirements of sample collection, shipping, handling, etc. Thus, to maintain quality control to allow confidence in the results, it is necessary to preserve the samples in a manner that ensures as quantitative analysis as possible. The preservation and clean-up of cyanuric acid in urine is complicated because typical approaches often are incompatible with the keto-enol tautomerization of cyanuric acid, interfering with cyanuric acid sample preparation, chromatography, and detection. Therefore, this paper presents a novel integration of sample preservation, clean-up, chromatography, and detection to determine cyanuric acid in 24 h urine samples. Fortification of urine with cyanuric acid (0.3-3.0 mg/L) demonstrated accuracy (86-93% recovery) and high reproducibility (RSD urine suggested sufficient cyanuric acid stability for sample collection procedures, while longer holding times suggested instability of the unpreserved urine. Preserved urine exhibited a loss of around 0.5% after 22 days at refrigerated storage conditions of 4 °C. Published by Elsevier B.V.
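The point of the biomarker is that the ingested water volume can be back-calculated from the cyanuric acid excreted in a 24 h urine collection; a minimal sketch of that mass-balance calculation is given below, assuming essentially complete 24 h excretion as described in the abstract, with all numbers hypothetical.

```python
# Minimal sketch: estimating swimming-pool water ingestion from a 24 h urinary
# cyanuric acid measurement. Assumes near-complete excretion within 24 h; all
# numerical values are hypothetical.

def ingested_volume_ml(urine_conc_mg_per_l: float,
                       urine_volume_l: float,
                       pool_conc_mg_per_l: float) -> float:
    """Volume of pool water ingested (mL) from a simple mass balance."""
    excreted_mass_mg = urine_conc_mg_per_l * urine_volume_l
    return 1000.0 * excreted_mass_mg / pool_conc_mg_per_l

# Hypothetical swimmer: 0.8 mg/L in 1.5 L of 24 h urine, pool at 30 mg/L cyanuric acid
print(ingested_volume_ml(0.8, 1.5, 30.0))   # -> 40 mL ingested
```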

  6. Comparing the performance of cluster random sampling and integrated threshold mapping for targeting trachoma control, using computer simulation.

    Directory of Open Access Journals (Sweden)

    Jennifer L Smith

Full Text Available Implementation of trachoma control strategies requires reliable district-level estimates of trachomatous inflammation-follicular (TF), generally collected using the recommended gold-standard cluster randomized surveys (CRS). Integrated Threshold Mapping (ITM) has been proposed as an integrated and cost-effective means of rapidly surveying trachoma in order to classify districts according to treatment thresholds. ITM differs from CRS in a number of important ways, including the use of a school-based sampling platform for children aged 1-9 and a different age distribution of participants. This study uses computerised sampling simulations to compare the performance of these survey designs and evaluate the impact of varying key parameters. Realistic pseudo gold standard data for 100 districts were generated that maintained the relative risk of disease between important sub-groups and incorporated empirical estimates of disease clustering at the household, village and district level. To simulate the different sampling approaches, 20 clusters were selected from each district, with individuals sampled according to the protocol for ITM and CRS. Results showed that ITM generally under-estimated the true prevalence of TF over a range of epidemiological settings and introduced more district misclassification according to treatment thresholds than did CRS. However, the extent of underestimation and resulting misclassification was found to be dependent on three main factors: (i) the district prevalence of TF; (ii) the relative risk of TF between enrolled and non-enrolled children within clusters; and (iii) the enrollment rate in schools. Although in some contexts the two methodologies may be equivalent, ITM can introduce a bias-dependent shift as prevalence of TF increases, resulting in a greater risk of misclassification around treatment thresholds. In addition to strengthening the evidence base around choice of trachoma survey methodologies, this study illustrates
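To make the simulation idea concrete, the sketch below contrasts a household-style survey of all children with a school-based survey that only reaches enrolled (lower-risk) children; it is a heavily simplified assumption-driven toy, not the authors' pseudo gold standard or protocol, and all parameter values are placeholders.

```python
import numpy as np

# Toy sketch of the sampling comparison described above: a district with
# clustered TF prevalence is surveyed either household-style (all children) or
# school-style (enrolled children only, carrying a lower relative risk here).
# All parameters are illustrative assumptions.

rng = np.random.default_rng(1)
true_prev, cluster_sd = 0.15, 0.05     # district TF prevalence and between-cluster spread
rr_enrolled = 0.7                      # assumed relative risk of TF among enrolled children
n_clusters, n_children = 20, 50

def survey(school_based: bool) -> float:
    estimates = []
    for _ in range(n_clusters):
        p_cluster = float(np.clip(rng.normal(true_prev, cluster_sd), 0.0, 1.0))
        if school_based:
            p_cluster *= rr_enrolled   # only enrolled, lower-risk children are reachable
        estimates.append(rng.binomial(n_children, p_cluster) / n_children)
    return float(np.mean(estimates))

print("household-based estimate:", round(survey(False), 3))
print("school-based estimate:   ", round(survey(True), 3))   # tends to sit lower
```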

  7. A One System Integrated Approach to Simulant Selection for Hanford High Level Waste Mixing and Sampling Tests - 13342

    International Nuclear Information System (INIS)

    Thien, Mike G.; Barnes, Steve M.

    2013-01-01

    The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capabilities using simulated Hanford High-Level Waste (HLW) formulations. This represents one of the largest remaining technical issues with the high-level waste treatment mission at Hanford. Previous testing has focused on very specific TOC or WTP test objectives and consequently the simulants were narrowly focused on those test needs. A key attribute in the Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2010-2 is to ensure testing is performed with a simulant that represents the broad spectrum of Hanford waste. The One System Integrated Project Team is a new joint TOC and WTP organization intended to ensure technical integration of specific TOC and WTP systems and testing. A new approach to simulant definition has been mutually developed that will meet both TOC and WTP test objectives for the delivery and receipt of HLW. The process used to identify critical simulant characteristics, incorporate lessons learned from previous testing, and identify specific simulant targets that ensure TOC and WTP testing addresses the broad spectrum of Hanford waste characteristics that are important to mixing, sampling, and transfer performance are described. (authors)

  8. A One System Integrated Approach to Simulant Selection for Hanford High Level Waste Mixing and Sampling Tests - 13342

    Energy Technology Data Exchange (ETDEWEB)

    Thien, Mike G. [Washington River Protection Solutions, LLC, P.O Box 850, Richland WA, 99352 (United States); Barnes, Steve M. [Waste Treatment Plant, 2435 Stevens Center Place, Richland WA 99354 (United States)

    2013-07-01

    The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capabilities using simulated Hanford High-Level Waste (HLW) formulations. This represents one of the largest remaining technical issues with the high-level waste treatment mission at Hanford. Previous testing has focused on very specific TOC or WTP test objectives and consequently the simulants were narrowly focused on those test needs. A key attribute in the Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2010-2 is to ensure testing is performed with a simulant that represents the broad spectrum of Hanford waste. The One System Integrated Project Team is a new joint TOC and WTP organization intended to ensure technical integration of specific TOC and WTP systems and testing. A new approach to simulant definition has been mutually developed that will meet both TOC and WTP test objectives for the delivery and receipt of HLW. The process used to identify critical simulant characteristics, incorporate lessons learned from previous testing, and identify specific simulant targets that ensure TOC and WTP testing addresses the broad spectrum of Hanford waste characteristics that are important to mixing, sampling, and transfer performance are described. (authors)

  9. A One System Integrated Approach to Simulant Selection for Hanford High Level Waste Mixing and Sampling Tests

    International Nuclear Information System (INIS)

    Thien, Mike G.; Barnes, Steve M.

    2013-01-01

    The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capabilities using simulated Hanford High-Level Waste (HLW) formulations. This represents one of the largest remaining technical issues with the high-level waste treatment mission at Hanford. Previous testing has focused on very specific TOC or WTP test objectives and consequently the simulants were narrowly focused on those test needs. A key attribute in the Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2010-2 is to ensure testing is performed with a simulant that represents the broad spectrum of Hanford waste. The One System Integrated Project Team is a new joint TOC and WTP organization intended to ensure technical integration of specific TOC and WTP systems and testing. A new approach to simulant definition has been mutually developed that will meet both TOC and WTP test objectives for the delivery and receipt of HLW. The process used to identify critical simulant characteristics, incorporate lessons learned from previous testing, and identify specific simulant targets that ensure TOC and WTP testing addresses the broad spectrum of Hanford waste characteristics that are important to mixing, sampling, and transfer performance are described

  10. Measurement of the integrated luminosities of cross-section scan data samples around the ψ(3770) mass region

    Science.gov (United States)

    Ablikim, M.; Achasov, M. N.; Ahmed, S.; Albrecht, M.; Alekseev, M.; Amoroso, A.; An, F. F.; An, Q.; Bai, Y.; Bakina, O.; Baldini Ferroli, R.; Ban, Y.; Begzsuren, K.; Bennett, D. W.; Bennett, J. V.; Berger, N.; Bertani, M.; Bettoni, D.; Bianchi, F.; Boger, E.; Boyko, I.; Briere, R. A.; Cai, H.; Cai, X.; Cakir, O.; Calcaterra, A.; Cao, G. F.; Cetin, S. A.; Chai, J.; Chang, J. F.; Chang, W. L.; Chelkov, G.; Chen, G.; Chen, H. S.; Chen, J. C.; Chen, M. L.; Chen, P. L.; Chen, S. J.; Chen, X. R.; Chen, Y. B.; Chu, X. K.; Cibinetto, G.; Cossio, F.; Dai, H. L.; Dai, J. P.; Dbeyssi, A.; Dedovich, D.; Deng, Z. Y.; Denig, A.; Denysenko, I.; Destefanis, M.; De Mori, F.; Ding, Y.; Dong, C.; Dong, J.; Dong, L. Y.; Dong, M. Y.; Dou, Z. L.; Du, S. X.; Duan, P. F.; Fang, J.; Fang, S. S.; Fang, Y.; Farinelli, R.; Fava, L.; Fegan, S.; Feldbauer, F.; Felici, G.; Feng, C. Q.; Fioravanti, E.; Fritsch, M.; Fu, C. D.; Gao, Q.; Gao, X. L.; Gao, Y.; Gao, Y. G.; Gao, Z.; Garillon, B.; Garzia, I.; Gilman, A.; Goetzen, K.; Gong, L.; Gong, W. X.; Gradl, W.; Greco, M.; Gu, L. M.; Gu, M. H.; Gu, Y. T.; Guo, A. Q.; Guo, L. B.; Guo, R. P.; Guo, Y. P.; Guskov, A.; Haddadi, Z.; Han, S.; Hao, X. Q.; Harris, F. A.; He, K. L.; He, X. Q.; Heinsius, F. H.; Held, T.; Heng, Y. K.; Holtmann, T.; Hou, Z. L.; Hu, H. M.; Hu, J. F.; Hu, T.; Hu, Y.; Huang, G. S.; Huang, J. S.; Huang, X. T.; Huang, X. Z.; Huang, Z. L.; Hussain, T.; Ikegami Andersson, W.; Irshad, M.; Ji, Q.; Ji, Q. P.; Ji, X. B.; Ji, X. L.; Jiang, X. S.; Jiang, X. Y.; Jiao, J. B.; Jiao, Z.; Jin, D. P.; Jin, S.; Jin, Y.; Johansson, T.; Julin, A.; Kalantar-Nayestanaki, N.; Kang, X. S.; Kavatsyuk, M.; Ke, B. C.; Khan, T.; Khoukaz, A.; Kiese, P.; Kliemt, R.; Koch, L.; Kolcu, O. B.; Kopf, B.; Kornicer, M.; Kuemmel, M.; Kuessner, M.; Kupsc, A.; Kurth, M.; Kühn, W.; Lange, J. S.; Lara, M.; Larin, P.; Lavezzi, L.; Leiber, S.; Leithoff, H.; Li, C.; Li, Cheng; Li, D. M.; Li, F.; Li, F. Y.; Li, G.; Li, H. B.; Li, H. J.; Li, J. C.; Li, J. W.; Li, K. J.; Li, Kang; Li, Ke; Li, Lei; Li, P. L.; Li, P. R.; Li, Q. Y.; Li, T.; Li, W. D.; Li, W. G.; Li, X. L.; Li, X. N.; Li, X. Q.; Li, Z. B.; Liang, H.; Liang, Y. F.; Liang, Y. T.; Liao, G. R.; Liao, L. Z.; Libby, J.; Lin, C. X.; Lin, D. X.; Liu, B.; Liu, B. J.; Liu, C. X.; Liu, D.; Liu, D. Y.; Liu, F. H.; Liu, Fang; Liu, Feng; Liu, H. B.; Liu, H. L.; Liu, H. M.; Liu, Huanhuan; Liu, Huihui; Liu, J. B.; Liu, J. Y.; Liu, K.; Liu, K. Y.; Liu, Ke; Liu, L. D.; Liu, Q.; Liu, S. B.; Liu, X.; Liu, Y. B.; Liu, Z. A.; Liu, Zhiqing; Long, Y. F.; Lou, X. C.; Lu, H. J.; Lu, J. G.; Lu, Y.; Lu, Y. P.; Luo, C. L.; Luo, M. X.; Luo, X. L.; Lusso, S.; Lyu, X. R.; Ma, F. C.; Ma, H. L.; Ma, L. L.; Ma, M. M.; Ma, Q. M.; Ma, X. N.; Ma, X. Y.; Ma, Y. M.; Maas, F. E.; Maggiora, M.; Malik, Q. A.; Mangoni, A.; Mao, Y. J.; Mao, Z. P.; Marcello, S.; Meng, Z. X.; Messchendorp, J. G.; Mezzadri, G.; Min, J.; Min, T. J.; Mitchell, R. E.; Mo, X. H.; Mo, Y. J.; Morales Morales, C.; Morello, G.; Muchnoi, N. Yu; Muramatsu, H.; Mustafa, A.; Nakhoul, S.; Nefedov, Y.; Nerling, F.; Nikolaev, I. B.; Ning, Z.; Nisar, S.; Niu, S. L.; Niu, X. Y.; Olsen, S. L.; Ouyang, Q.; Pacetti, S.; Pan, Y.; Papenbrock, M.; Patteri, P.; Pelizaeus, M.; Pellegrino, J.; Peng, H. P.; Peng, Z. Y.; Peters, K.; Pettersson, J.; Ping, J. L.; Ping, R. G.; Pitka, A.; Poling, R.; Prasad, V.; Qi, H. R.; Qi, M.; Qi, T. Y.; Qian, S.; Qiao, C. F.; Qin, N.; Qin, X. S.; Qin, Z. H.; Qiu, J. F.; Rashid, K. H.; Redmer, C. F.; Richter, M.; Ripka, M.; Rolo, M.; Rong, G.; Rosner, Ch.; Ruan, X. 
D.; Sarantsev, A.; Savrié, M.; Schnier, C.; Schoenning, K.; Shan, W.; Shan, X. Y.; Shao, M.; Shen, C. P.; Shen, P. X.; Shen, X. Y.; Sheng, H. Y.; Shi, X.; Song, J. J.; Song, W. M.; Song, X. Y.; Sosio, S.; Sowa, C.; Spataro, S.; Sun, G. X.; Sun, J. F.; Sun, L.; Sun, S. S.; Sun, X. H.; Sun, Y. J.; Sun, Y. K.; Sun, Y. Z.; Sun, Z. J.; Sun, Z. T.; Tan, Y. T.; Tang, C. J.; Tang, G. Y.; Tang, X.; Tapan, I.; Tiemens, M.; Tsednee, B.; Uman, I.; Varner, G. S.; Wang, B.; Wang, B. L.; Wang, C. W.; Wang, D.; Wang, D. Y.; Wang, Dan; Wang, K.; Wang, L. L.; Wang, L. S.; Wang, M.; Wang, Meng; Wang, P.; Wang, P. L.; Wang, W. P.; Wang, X. F.; Wang, Y.; Wang, Y. F.; Wang, Y. Q.; Wang, Z.; Wang, Z. G.; Wang, Z. Y.; Wang, Zongyuan; Weber, T.; Wei, D. H.; Weidenkaff, P.; Wen, S. P.; Wiedner, U.; Wolke, M.; Wu, L. H.; Wu, L. J.; Wu, Z.; Xia, L.; Xia, X.; Xia, Y.; Xiao, D.; Xiao, Y. J.; Xiao, Z. J.; Xie, Y. G.; Xie, Y. H.; Xiong, X. A.; Xiu, Q. L.; Xu, G. F.; Xu, J. J.; Xu, L.; Xu, Q. J.; Xu, Q. N.; Xu, X. P.; Yan, F.; Yan, L.; Yan, W. B.; Yan, W. C.; Yan, Y. H.; Yang, H. J.; Yang, H. X.; Yang, L.; Yang, S. L.; Yang, Y. H.; Yang, Y. X.; Yang, Yifan; Ye, M.; Ye, M. H.; Yin, J. H.; You, Z. Y.; Yu, B. X.; Yu, C. X.; Yu, J. S.; Yuan, C. Z.; Yuan, Y.; Yuncu, A.; Zafar, A. A.; Zallo, A.; Zeng, Y.; Zeng, Z.; Zhang, B. X.; Zhang, B. Y.; Zhang, C. C.; Zhang, D. H.; Zhang, H. H.; Zhang, H. Y.; Zhang, J.; Zhang, J. L.; Zhang, J. Q.; Zhang, J. W.; Zhang, J. Y.; Zhang, J. Z.; Zhang, K.; Zhang, L.; Zhang, S. F.; Zhang, T. J.; Zhang, X. Y.; Zhang, Y.; Zhang, Y. H.; Zhang, Y. T.; Zhang, Yang; Zhang, Yao; Zhang, Yu; Zhang, Z. H.; Zhang, Z. P.; Zhang, Z. Y.; Zhao, G.; Zhao, J. W.; Zhao, J. Y.; Zhao, J. Z.; Zhao, Lei; Zhao, Ling; Zhao, M. G.; Zhao, Q.; Zhao, S. J.; Zhao, T. C.; Zhao, Y. B.; Zhao, Z. G.; Zhemchugov, A.; Zheng, B.; Zheng, J. P.; Zheng, W. J.; Zheng, Y. H.; Zhong, B.; Zhou, L.; Zhou, Q.; Zhou, X.; Zhou, X. K.; Zhou, X. R.; Zhou, X. Y.; Zhu, A. N.; Zhu, J.; Zhu, J.; Zhu, K.; Zhu, K. J.; Zhu, S.; Zhu, S. H.; Zhu, X. L.; Zhu, Y. C.; Zhu, Y. S.; Zhu, Z. A.; Zhuang, J.; Zou, B. S.; Zou, J. H.; BESIII Collaboration

    2018-05-01

To investigate the nature of the ψ(3770) resonance and to measure the cross section for e⁺e⁻ → D D̄, a cross-section scan data sample, distributed among 41 center-of-mass energy points from 3.73 to 3.89 GeV, was taken with the BESIII detector operated at the BEPCII collider in the year 2010. By analyzing the large angle Bhabha scattering events, we measure the integrated luminosity of the data sample at each center-of-mass energy point. The total integrated luminosity of the data sample is 76.16 ± 0.04 ± 0.61 pb⁻¹, where the first uncertainty is statistical and the second systematic. Supported by National Key Basic Research Program of China (2015CB856700), National Natural Science Foundation of China (NSFC) (11235011, 11335008, 11425524, 11625523, 11635010), the Chinese Academy of Sciences (CAS) Large-Scale Scientific Facility Program, the CAS Center for Excellence in Particle Physics (CCEPP), Joint Large-Scale Scientific Facility Funds of the NSFC and CAS (U1332201, U1532257, U1532258), CAS Key Research Program of Frontier Sciences (QYZDJ-SSW-SLH003, QYZDJ-SSW-SLH040), 100 Talents Program of CAS, National 1000 Talents Program of China, INPAC and Shanghai Key Laboratory for Particle Physics and Cosmology, German Research Foundation DFG under Contracts Nos. Collaborative Research Center CRC 1044, FOR 2359, Istituto Nazionale di Fisica Nucleare, Italy, Koninklijke Nederlandse Akademie van Wetenschappen (KNAW) (530-4CDP03), Ministry of Development of Turkey (DPT2006K-120470), National Science and Technology fund, The Swedish Research Council, U. S. Department of Energy (DE-FG02-05ER41374, DE-SC-0010118, DE-SC-0010504, DE-SC-0012069), University of Groningen (RuG) and the Helmholtzzentrum fuer Schwerionenforschung GmbH (GSI), Darmstadt, WCU Program of National Research Foundation of Korea (R32-2008-000-10155-0)
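The integrated luminosity at each scan point is obtained from the counted large-angle Bhabha events; a minimal sketch of that relation, L = N_observed / (σ_Bhabha × ε), is given below with purely illustrative numbers (the actual cross sections, efficiencies, and corrections used by BESIII are not reproduced here).

```python
# Minimal sketch of an integrated-luminosity determination from large-angle
# Bhabha scattering events: L_int = N_obs / (sigma_Bhabha * efficiency).
# The event count, effective cross section, and efficiency below are
# illustrative assumptions, not BESIII values.

def integrated_luminosity_pb_inv(n_bhabha_observed: float,
                                 sigma_bhabha_nb: float,
                                 efficiency: float) -> float:
    """Integrated luminosity in pb^-1 (1 nb^-1 = 1e-3 pb^-1)."""
    lumi_nb_inv = n_bhabha_observed / (sigma_bhabha_nb * efficiency)
    return lumi_nb_inv * 1.0e-3

# Hypothetical scan point: 2.0e5 selected Bhabha events, 150 nb effective
# cross section within the detector acceptance, 80% selection efficiency.
print(integrated_luminosity_pb_inv(2.0e5, 150.0, 0.8))   # ~1.7 pb^-1
```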

  11. Integration

    DEFF Research Database (Denmark)

    Emerek, Ruth

    2004-01-01

The contribution discusses the different conceptions of integration in Denmark, and what can be understood by successful integration.

  12. Wide-range bipolar pulse conductance instrument employing current and voltage modes with sampled or integrated signal acquisition

    Energy Technology Data Exchange (ETDEWEB)

    Calhoun, R K; Holler, F J [Kentucky Univ., Lexington, KY (United States). Dept. of Chemistry; Geiger, jr, R F; Nieman, T A [Illinois Univ., Urbana, IL (United States). Dept. of Chemistry; Caserta, K J [Procter and Gamble Co., Cincinnati, OH (United States)

    1991-11-05

An instrument for measuring solution conductance using the bipolar pulse technique is described. The instrument is capable of measuring conductances in the range of 5×10⁻⁹-10 Ω⁻¹ with 1% accuracy or better in as little as 32 μs. Accuracy of 0.001-0.01% is achievable over the range 1×10⁻⁶-1 Ω⁻¹. Circuitry and software are described that allow the instrument to automatically adjust the pulse height, pulse duration, excitation mode (current or voltage pulse) and data acquisition mode (sampled or integrated) to acquire data of optimum accuracy and precision. The urease-catalyzed decomposition of urea is used to illustrate the versatility of the instrument, and other applications are cited. (author). 60 refs.; 7 figs.; 2 tabs.

  13. Labor market integration, immigration experience, and psychological distress in a multi-ethnic sample of immigrants residing in Portugal.

    Science.gov (United States)

    Teixeira, Ana F; Dias, Sónia F

    2018-01-01

This study aims at examining how factors relating to immigrants' experience in the host country affect psychological distress (PD). Specifically, we analyzed the association among socio-economic status (SES), integration in the labor market, specific immigration experience characteristics, and PD in a multi-ethnic sample of immigrant individuals residing in Lisbon, Portugal. Using a sample (n = 1375) consisting of all main immigrant groups residing in Portugal's metropolitan area of Lisbon, we estimated multivariable linear regression models of PD regressed on selected sets of socio-economic independent variables. A psychological distress scale was constructed based on five items (feeling physically tired, feeling psychologically tired, feeling happy, feeling full of energy, and feeling lonely). Variables associated with a decrease in PD were being male (demographic), being satisfied with one's income level (SES), living with the core family and having a higher number of children (social isolation), planning to remain in Portugal for a longer period of time (migration project), and considering oneself to be in good health (subjective health status). Variables associated with higher PD were job insecurity (labor market) and the perception that health professionals were not willing to understand immigrants during a clinical interaction. The study findings emphasize the importance of labor market integration and access to good quality jobs for immigrants' psychological well-being, as well as of family ties in the host country, the intention to reside long term in the host country, and good subjective (physical) health. Our research suggests the need to foster cross-national studies of immigrant populations in order to understand the social mechanisms that cut across all migrant groups and contribute to lower psychological well-being.

  14. An integrative pharmacological approach to radio telemetry and blood sampling in pharmaceutical drug discovery and safety assessment.

    Science.gov (United States)

    Litwin, Dennis C; Lengel, David J; Kamendi, Harriet W; Bialecki, Russell A

    2011-01-18

A successful integration of the automated blood sampling (ABS) and telemetry (ABST) system is described. The new ABST system facilitates concomitant collection of physiological variables with blood and urine samples for determination of drug concentrations and other biochemical measures in the same rat without handling artifacts. Integration was achieved by designing a 13-inch circular receiving antenna that operates as a plug-in replacement for the existing pair of DSI's orthogonal antennas and is compatible with the rotating cage and open floor design of the BASi Culex® ABS system. The circular receiving antenna's electrical configuration consists of a pair of electrically orthogonal half-toroids that reinforce reception of a dipole transmitter operating within the coil's interior while reducing both external noise pickup and interference from other adjacent dipole transmitters. For validation, measured baclofen concentration (ABST vs. satellite (μM): 69.6 ± 23.8 vs. 76.6 ± 19.5, p = NS) and mean arterial pressure (ABST vs. traditional DSI telemetry (mm Hg): 150 ± 5 vs. 147 ± 4, p = NS) variables were quantitatively and qualitatively similar between rats housed in the ABST system and traditional home cage approaches. The ABST system offers unique advantages over traditional between-group study paradigms that include improved data quality and significantly reduced animal use. The superior within-group model facilitates assessment of multiple physiological and biochemical responses to test compounds in the same animal. The ABST also provides opportunities to evaluate temporal relations between parameters and to investigate anomalous outlier events because drug concentrations, physiological and biochemical measures for each animal are available for comparisons.

  15. [Integrity].

    Science.gov (United States)

    Gómez Rodríguez, Rafael Ángel

    2014-01-01

To say that someone possesses integrity is to claim that that person is almost predictable in responses to specific situations, and that he or she can judge prudentially and act correctly. There is a close interrelationship between integrity and autonomy, and autonomy rests on the deeper moral claim of all humans to integrity of the person. Integrity has two senses of significance for medical ethics: one sense refers to the integrity of the person in its bodily, psychosocial and intellectual elements; in the second sense, integrity is a virtue. Another facet of integrity of the person is the integrity of the values we cherish and espouse. The physician must be a person of integrity if the integrity of the patient is to be safeguarded. Autonomy has reduced the violations of the past, but the character and virtues of the physician are the ultimate safeguard of the autonomy of the patient. A very important field in medicine is scientific research. It is the character of the investigator that determines the moral quality of research. The problem arises when legitimate self-interests are replaced by selfish ones, particularly when human subjects are involved. The final safeguard of the moral quality of research is the character and conscience of the investigator. Teaching must be relevant in the scientific field, but the most effective way to teach virtue ethics is through the example of a respected scientist.

  16. Integration of continuous-flow sampling with microchip electrophoresis using poly(dimethylsiloxane)-based valves in a reversibly sealed device.

    Science.gov (United States)

    Li, Michelle W; Martin, R Scott

    2007-07-01

    Here we describe a reversibly sealed microchip device that incorporates poly(dimethylsiloxane) (PDMS)-based valves for the rapid injection of analytes from a continuously flowing stream into a channel network for analysis with microchip electrophoresis. The microchip was reversibly sealed to a PDMS-coated glass substrate and microbore tubing was used for the introduction of gas and fluids to the microchip device. Two pneumatic valves were incorporated into the design and actuated on the order of hundreds of milliseconds, allowing analyte from a continuously flowing sampling stream to be injected into an electrophoresis separation channel. The device was characterized in terms of the valve actuation time and pushback voltage. It was also found that the addition of sodium dodecyl sulfate (SDS) to the buffer system greatly increased the reproducibility of the injection scheme and enabled the analysis of amino acids derivatized with naphthalene-2,3-dicarboxaldehyde/cyanide. Results from continuous injections of a 0.39 nL fluorescein plug into the optimized system showed that the injection process was reproducible (RSD of 0.7%, n = 10). Studies also showed that the device was capable of monitoring off-chip changes in concentration with a device lag time of 90 s. Finally, the ability of the device to rapidly monitor on-chip concentration changes was demonstrated by continually sampling from an analyte plug that was derivatized upstream from the electrophoresis/continuous flow interface. A reversibly sealed device of this type will be useful for the continuous monitoring and analysis of processes that occur either off-chip (such as microdialysis sampling) or on-chip from other integrated functions.

  17. Utility of the Croatian translation of the community integration questionnaire-revised in a sample of adults with moderate to severe traumatic brain injury.

    Science.gov (United States)

    Tršinski, Dubravko; Tadinac, Meri; Bakran, Žarko; Klepo, Ivana

    2018-02-23

To examine the utility of the Community Integration Questionnaire-Revised, translated into Croatian, in a sample of adults with moderate to severe traumatic brain injury. The Community Integration Questionnaire-Revised was administered to a sample of 88 adults with traumatic brain injury and to a control sample matched by gender, age and education. Participants with traumatic brain injury were divided into four subgroups according to injury severity. The internal consistency of the Community Integration Questionnaire-Revised was satisfactory. The differences between the group with traumatic brain injury and the control group were statistically significant for the overall Community Integration Questionnaire-Revised score, as well as for all the subscales apart from the Home Integration subscale. The Community Integration Questionnaire-Revised score varied significantly across subgroups with different severity of traumatic brain injury. The results show that the Croatian translation of the Community Integration Questionnaire-Revised is useful in assessing participation in adults with traumatic brain injury and confirm previous findings that severity of injury predicts community integration. Results of the new Electronic Social Networking scale indicate that persons who are more active on electronic social networks report better results for other domains of community integration, especially social activities. Implications for rehabilitation: The Croatian translation of the Community Integration Questionnaire-Revised is a valid tool for long-term assessment of participation in various domains in persons with moderate to severe traumatic brain injury. Persons with traumatic brain injury who are more active in the use of electronic social networking are also more integrated into the social and productivity domains. Targeted training in the use of new technologies could enhance participation after traumatic brain injury.

  18. Impact of delay to cryopreservation on RNA integrity and genome-wide expression profiles in resected tumor samples.

    Directory of Open Access Journals (Sweden)

    Elodie Caboux

Full Text Available The quality of tissue samples and extracted mRNA is a major source of variability in tumor transcriptome analysis using genome-wide expression microarrays. During and immediately after surgical tumor resection, tissues are exposed to metabolic, biochemical and physical stresses characterized as "warm ischemia". Current practice advocates cryopreservation of biosamples within 30 minutes of resection, but this recommendation has not been systematically validated by measurements of mRNA decay over time. Using Illumina HumanHT-12 v3 Expression BeadChips, providing a genome-wide coverage of over 24,000 genes, we have analyzed gene expression variation in samples of 3 hepatocellular carcinomas (HCC) and 3 lung carcinomas (LC) cryopreserved at times up to 2 hours after resection. RNA Integrity Numbers (RIN) revealed no significant deterioration of mRNA up to 2 hours after resection. Genome-wide transcriptome analysis detected non-significant gene expression variations of -3.5%/hr (95% CI: -7.0%/hr to 0.1%/hr; p = 0.054). In LC, no consistent gene expression pattern was detected in relation with warm ischemia. In HCC, a signature of 6 up-regulated genes (CYP2E1, IGLL1, CABYR, CLDN2, NQO1, SLC13A5) and 6 down-regulated genes (MT1G, MT1H, MT1E, MT1F, HABP2, SPINK1) was identified (FDR < 0.05). Overall, our observations support the current recommendation of a time to cryopreservation of up to 30 minutes and emphasize the need for identifying tissue-specific genes deregulated following resection, to avoid misinterpreting expression changes induced by warm ischemia as pathologically significant changes.

  19. Method for more accurate transmittance measurements of low-angle scattering samples using an integrating sphere with an entry port beam diffuser

    International Nuclear Information System (INIS)

    Nilsson, Annica M.; Jonsson, Andreas; Jonsson, Jacob C.; Roos, Arne

    2011-01-01

    For most integrating sphere measurements, the difference in light distribution between a specular reference beam and a diffused sample beam can result in significant errors. The problem becomes especially pronounced in integrating spheres that include a port for reflectance or diffuse transmittance measurements. The port is included in many standard spectrophotometers to facilitate a multipurpose instrument, however, absorption around the port edge can result in a detected signal that is too low. The absorption effect is especially apparent for low-angle scattering samples, because a significant portion of the light is scattered directly onto that edge. In this paper, a method for more accurate transmittance measurements of low-angle light-scattering samples is presented. The method uses a standard integrating sphere spectrophotometer, and the problem with increased absorption around the port edge is addressed by introducing a diffuser between the sample and the integrating sphere during both reference and sample scan. This reduces the discrepancy between the two scans and spreads the scattered light over a greater portion of the sphere wall. The problem with multiple reflections between the sample and diffuser is successfully addressed using a correction factor. The method is tested for two patterned glass samples with low-angle scattering and in both cases the transmittance accuracy is significantly improved.

  20. Development of Solid Ceramic Dosimeters for the Time-Integrative Passive Sampling of Volatile Organic Compounds in Waters.

    Science.gov (United States)

    Bonifacio, Riza Gabriela; Nam, Go-Un; Eom, In-Yong; Hong, Yong-Seok

    2017-11-07

Time-integrative passive sampling of volatile organic compounds (VOCs) in water can now be accomplished using a solid ceramic dosimeter. A nonporous ceramic, which excludes the permeation of water, allowing only gas-phase diffusion of VOCs into the resin inside the dosimeter, effectively captured the VOCs. The mass accumulation of 11 VOCs linearly increased with time over a wide range of aqueous-phase concentrations (16.9 to 1100 μg L⁻¹), and the linearity was dependent upon the Henry's constant (H). The average diffusivity of the VOCs in the solid ceramic was 1.46 × 10⁻¹⁰ m² s⁻¹ at 25 °C, which was 4 orders of magnitude lower than that in air (8.09 × 10⁻⁶ m² s⁻¹). This value was 60% greater than that in the water-permeable porous ceramic (0.92 × 10⁻¹⁰ m² s⁻¹), suggesting that its mass accumulation could be more effective than that of porous ceramic dosimeters. The mass accumulation of the VOCs in the solid ceramic dosimeter increased in the presence of salt (≥0.1 M) and with increasing temperature (4 to 40 °C) but varied only slightly with dissolved organic matter concentration. The solid ceramic dosimeter was suitable for the field testing and measurement of time-weighted average concentrations of VOC-contaminated waters.
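Because uptake into the dosimeter occurs by gas-phase diffusion through the ceramic wall, the time-weighted average water concentration can be back-calculated from the accumulated mass using Fick's first law and the dimensionless Henry's constant. The sketch below shows that generic relation with hypothetical sampler geometry and Henry's constant; only the order of magnitude of the ceramic diffusivity echoes the abstract, and nothing else is taken from the paper.

```python
# Minimal sketch of back-calculating a time-weighted average (TWA) aqueous VOC
# concentration from the mass accumulated in a solid ceramic dosimeter.
# Uptake is treated as steady gas-phase diffusion through the ceramic wall:
#   m = (D_e * A / L) * H * C_water * t   =>   C_water = m * L / (D_e * A * H * t)
# Geometry and Henry's constant are hypothetical assumptions.

def twa_water_conc_ug_per_l(accumulated_mass_ug: float,
                            wall_thickness_m: float,
                            diffusivity_m2_s: float,
                            area_m2: float,
                            henry_dimensionless: float,
                            exposure_s: float) -> float:
    c_water_ug_per_m3 = (accumulated_mass_ug * wall_thickness_m /
                         (diffusivity_m2_s * area_m2 * henry_dimensionless * exposure_s))
    return c_water_ug_per_m3 / 1000.0      # convert from per m^3 to per L

print(twa_water_conc_ug_per_l(accumulated_mass_ug=5.0,     # mass found on the resin
                              wall_thickness_m=1.5e-3,
                              diffusivity_m2_s=1.5e-10,     # order of magnitude from the abstract
                              area_m2=5.0e-4,
                              henry_dimensionless=0.2,      # assumed dimensionless H
                              exposure_s=30 * 24 * 3600))   # 30-day deployment
```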

  1. Implementing reduced-risk integrated pest management in fresh-market cabbage: influence of sampling parameters, and validation of binomial sequential sampling plans for the cabbage looper (Lepidoptera Noctuidae).

    Science.gov (United States)

    Burkness, Eric C; Hutchison, W D

    2009-10-01

    Populations of cabbage looper, Trichoplusia ni (Lepidoptera: Noctuidae), were sampled in experimental plots and commercial fields of cabbage (Brassica spp.) in Minnesota during 1998-1999 as part of a larger effort to implement an integrated pest management program. Using a resampling approach and Wald's sequential probability ratio test, sampling plans with different sampling parameters were evaluated using independent presence/absence and enumerative data. Evaluations and comparisons of the different sampling plans were made based on the operating characteristic and average sample number functions generated for each plan and through the use of a decision probability matrix. Values for upper and lower decision boundaries, sequential error rates (alpha, beta), and tally threshold were modified to determine parameter influence on the operating characteristic and average sample number functions. The following parameters resulted in the most desirable operating characteristic and average sample number functions: an action threshold of 0.1 proportion of plants infested, tally threshold of 1, alpha = beta = 0.1, upper boundary of 0.15, lower boundary of 0.05, and resampling with replacement. We found that sampling parameters can be modified and evaluated using resampling software to achieve desirable operating characteristic and average sample number functions. Moreover, management of T. ni by using binomial sequential sampling should provide a good balance between cost and reliability by minimizing sample size and maintaining a high level of correct decisions (>95%) to treat or not treat.
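
    A minimal sketch of Wald's sequential probability ratio test for presence/absence (binomial) counts, using the lower and upper decision boundaries and error rates reported above (0.05, 0.15, alpha = beta = 0.1); this is the generic SPRT stopping rule, not the authors' resampling software.

        import math

        def sprt_decision(n_sampled, n_infested, p0=0.05, p1=0.15, alpha=0.1, beta=0.1):
            """Wald's SPRT for binomial presence/absence data: keep sampling until the
            log-likelihood ratio crosses one of the two decision boundaries."""
            upper = math.log((1 - beta) / alpha)   # crossing this -> decide 'treat'
            lower = math.log(beta / (1 - alpha))   # crossing this -> decide 'no treatment'
            llr = (n_infested * math.log(p1 / p0)
                   + (n_sampled - n_infested) * math.log((1 - p1) / (1 - p0)))
            if llr >= upper:
                return "treat"
            if llr <= lower:
                return "no treatment"
            return "continue sampling"

        # Example: 30 plants examined, 6 infested (tally threshold of 1 larva per plant)
        print(sprt_decision(30, 6))  # -> 'treat'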

  2. Modelling forest canopy height by integrating airborne LiDAR samples with satellite Radar and multispectral imagery

    Science.gov (United States)

    García, Mariano; Saatchi, Sassan; Ustin, Susan; Balzter, Heiko

    2018-04-01

    Spatially-explicit information on forest structure is paramount to estimating aboveground carbon stocks for designing sustainable forest management strategies and mitigating greenhouse gas emissions from deforestation and forest degradation. LiDAR measurements provide samples of forest structure that must be integrated with satellite imagery to predict and map landscape-scale variations of forest structure. Here we evaluate the capability of existing satellite synthetic aperture radar (SAR) with multispectral data to estimate forest canopy height over five study sites across two biomes in North America, namely temperate broadleaf and mixed forests and temperate coniferous forests. Pixel size affected the modelling results, with an improvement in model performance as pixel resolution coarsened from 25 m to 100 m. Likewise, the sample size was an important factor in the uncertainty of height prediction using the Support Vector Machine modelling approach. Larger sample sizes yielded better results, but the improvement stabilised when the sample size reached approximately 10% of the study area. We also evaluated the impact of surface moisture (soil and vegetation moisture) on the modelling approach. Whereas surface moisture had a moderate effect on the proportion of the variance explained by the model (up to 14%), its impact was more evident in the bias of the models, with bias reaching values up to 4 m. Averaging the incidence angle corrected radar backscatter coefficient (γ°) reduced the impact of surface moisture on the models and improved their performance at all study sites, with R2 ranging between 0.61 and 0.82, RMSE between 2.02 and 5.64 m, and bias between 0.02 and -0.06 at 100 m spatial resolution. An evaluation of the relative importance of the variables in the model performance showed that for the study sites located within the temperate broadleaf and mixed forests biome ALOS-PALSAR HV polarised backscatter was the most important
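
    A hedged sketch of the general idea of training a Support Vector Machine regression on LiDAR-sampled pixels and predicting canopy height elsewhere from radar backscatter plus a multispectral index; the synthetic data, predictor choice and hyperparameters are assumptions, not the study's configuration (scikit-learn is assumed to be available).

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)

        # Synthetic 100 m pixels: HV backscatter (dB), NDVI, and LiDAR canopy height (m)
        n = 500
        hv_db = rng.uniform(-18, -6, n)
        ndvi = rng.uniform(0.3, 0.9, n)
        height = 2.5 * (hv_db + 18) + 10 * ndvi + rng.normal(0, 2, n)  # toy relationship

        X = np.column_stack([hv_db, ndvi])
        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
        model.fit(X[:400], height[:400])      # pixels with LiDAR samples -> training data

        pred = model.predict(X[400:])         # predict where only imagery is available
        rmse = np.sqrt(np.mean((pred - height[400:]) ** 2))
        print(f"RMSE on held-out pixels: {rmse:.2f} m")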

  3. Comparison of fabric analysis of snow samples by Computer-Integrated Polarization Microscopy and Automatic Ice Texture Analyzer

    Science.gov (United States)

    Leisinger, Sabine; Montagnat, Maurine; Heilbronner, Renée; Schneebeli, Martin

    2014-05-01

    Accurate knowledge of fabric anisotropy is crucial to understand the mechanical behavior of snow and firn, but is also important for understanding metamorphism. The Computer-Integrated Polarization Microscopy (CIP) method used for fabric analysis was developed by Heilbronner and Pauli in the early 1990s and uses a slightly modified traditional polarization microscope. First developed for quartz, it can be applied to other uniaxial minerals. Up to now this method has mainly been used in structural geology. However, it is also well suited for the fabric analysis of snow, firn and ice. The method is based on the analysis of first-order interference color images by a slightly modified optical polarization microscope, a grayscale camera and a computer. The optical polarization microscope is equipped with high-quality objectives, a rotating table and two polarizers that can be introduced above and below the thin section, as well as a full-wave plate. Additionally, two quarter-wave plates for circular polarization are needed. Alternatively, it is also possible to create circular polarization from a set of crossed-polarized images through image processing. A narrow-band interference filter transmitting a wavelength between 660 and 700 nm is also required. Finally, a monochrome digital camera is used to capture the input images. The idea is to record the change of interference colors while the thin section is being rotated once through 180°. The azimuth and inclination of the c-axis are defined by the color change. Recording the color change through a red filter produces a signal with a well-defined amplitude and phase angle. An advantage of this method lies in the simple conversion of an ordinary optical microscope to a fabric analyzer. The Automatic Ice Texture Analyzer (AITA), as the first fully functional instrument to measure c-axis orientation, was developed by Wilson and others (2003). Most recent fabric analysis of snow and firn samples was carried

  4. Asymptotic bounded consensus tracking of double-integrator multi-agent systems with bounded-jerk target based on sampled-data without velocity measurements

    International Nuclear Information System (INIS)

    Wu Shuang-Shuang; Wu Zhi-Hai; Peng Li; Xie Lin-Bo

    2017-01-01

    This paper investigates asymptotic bounded consensus tracking (ABCT) of double-integrator multi-agent systems (MASs) with an asymptotically-unbounded-acceleration and bounded-jerk target (AUABJT) available to only a subset of agents, based on sampled data without velocity measurements. A sampled-data consensus tracking protocol (CTP) without velocity measurements is proposed to guarantee that double-integrator MASs track an AUABJT available to only some of the agents. The eigenvalue analysis method together with the augmented matrix method is used to obtain the necessary and sufficient conditions for ABCT. A numerical example is provided to illustrate the effectiveness of the theoretical results. (paper)

  5. An open-chain imaginary-time path-integral sampling approach to the calculation of approximate symmetrized quantum time correlation functions.

    Science.gov (United States)

    Cendagorta, Joseph R; Bačić, Zlatko; Tuckerman, Mark E

    2018-03-14

    We introduce a scheme for approximating quantum time correlation functions numerically within the Feynman path integral formulation. Starting with the symmetrized version of the correlation function expressed as a discretized path integral, we introduce a change of integration variables often used in the derivation of trajectory-based semiclassical methods. In particular, we transform to sum and difference variables between forward and backward complex-time propagation paths. Once the transformation is performed, the potential energy is expanded in powers of the difference variables, which allows us to perform the integrals over these variables analytically. The manner in which this procedure is carried out results in an open-chain path integral (in the remaining sum variables) with a modified potential that is evaluated using imaginary-time path-integral sampling rather than requiring the generation of a large ensemble of trajectories. Consequently, any number of path integral sampling schemes can be employed to compute the remaining path integral, including Monte Carlo, path-integral molecular dynamics, or enhanced path-integral molecular dynamics. We believe that this approach constitutes a different perspective in semiclassical-type approximations to quantum time correlation functions. Importantly, we argue that our approximation can be systematically improved within a cumulant expansion formalism. We test this approximation on a set of one-dimensional problems that are commonly used to benchmark approximate quantum dynamical schemes. We show that the method is at least as accurate as the popular ring-polymer molecular dynamics technique and linearized semiclassical initial value representation for correlation functions of linear operators in most of these examples and improves the accuracy of correlation functions of nonlinear operators.
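
    The record describes the method at a level above code; as generic background only, the sketch below runs a plain Metropolis imaginary-time path-integral Monte Carlo estimate of <x^2> for a 1-D harmonic oscillator (primitive ring-polymer action, hbar = m = 1). It illustrates imaginary-time path-integral sampling in its simplest form and is not the authors' open-chain correlation-function scheme.

        import numpy as np

        def pimc_harmonic(beta=10.0, omega=1.0, n_beads=64, n_sweeps=20000, step=0.5, seed=1):
            """Metropolis path-integral Monte Carlo for a 1-D harmonic oscillator.
            Primitive discretisation of the imaginary-time action; returns <x^2>."""
            rng = np.random.default_rng(seed)
            tau = beta / n_beads
            x = np.zeros(n_beads)
            samples = []
            for sweep in range(n_sweeps):
                for k in range(n_beads):
                    prev, nxt = x[(k - 1) % n_beads], x[(k + 1) % n_beads]
                    def local_action(xk):
                        # kinetic spring terms to both neighbours plus the potential
                        return ((xk - prev) ** 2 + (nxt - xk) ** 2) / (2.0 * tau) \
                               + tau * 0.5 * omega ** 2 * xk ** 2
                    x_new = x[k] + rng.uniform(-step, step)
                    if rng.random() < np.exp(-(local_action(x_new) - local_action(x[k]))):
                        x[k] = x_new
                if sweep > n_sweeps // 5:          # discard equilibration sweeps
                    samples.append(np.mean(x ** 2))
            return np.mean(samples)

        # Exact quantum result at beta=10, omega=1: <x^2> = 0.5*coth(5) ~ 0.5000
        print(pimc_harmonic())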

  7. Extreme-temperature lab on a chip for optogalvanic spectroscopy of ultra small samples - key components and a first integration attempt

    International Nuclear Information System (INIS)

    Berglund, Martin; Khaji, Zahra; Persson, Anders; Sturesson, Peter; Breivik, Johan Söderberg; Thornell, Greger; Klintberg, Lena

    2016-01-01

    This is a short summary of the authors’ recent R and D on valves, combustors, plasma sources, and pressure and temperature sensors, realized in high-temperature co-fired ceramics, and an account of the first attempt to monolithically integrate them to form a lab on a chip for sample administration, preparation and analysis, as a stage in optogalvanic spectroscopy. (paper)

  8. 'Integration'

    DEFF Research Database (Denmark)

    Olwig, Karen Fog

    2011-01-01

    , while the countries have adopted disparate policies and ideologies, differences in the actual treatment and attitudes towards immigrants and refugees in everyday life are less clear, due to parallel integration programmes based on strong similarities in the welfare systems and in cultural notions...... of equality in the three societies. Finally, it shows that family relations play a central role in immigrants’ and refugees’ establishment of a new life in the receiving societies, even though the welfare society takes on many of the social and economic functions of the family....

  9. Viral load and genomic integration of HPV 16 in cervical samples from HIV-1-infected and uninfected women in Burkina Faso.

    Science.gov (United States)

    Rousseau, Marie-Noelle Didelot; Costes, Valérie; Konate, Issouf; Nagot, Nicolas; Foulongne, Vincent; Ouedraogo, Abdoulaye; Van de Perre, Philippe; Mayaud, Philippe; Segondy, Michel

    2007-06-01

    The relationships between human papillomavirus type 16 (HPV 16) viral load, HPV 16 integration status, human immunodeficiency virus type 1 (HIV-1) status, and cervical cytology were studied among women enrolled in a cohort of female sex workers in Burkina Faso. The study focused on 24 HPV 16-infected women. The HPV 16 viral load in cervical samples was determined by real-time PCR. Integration ratio was estimated as the ratio between E2 and E6 genes DNA copy numbers. Integrated HPV16 viral load was defined as the product of HPV 16 viral load by the integration ratio. High HPV 16 viral load and high integration ratio were more frequent among women with squamous intraepithelial lesions compared with women with normal cytology (33% vs. 11%, and 33% vs. 0%, respectively), and among women with high-grade squamous intraepithelial lesions compared with women without high-grade squamous intraepithelial lesions (50% vs. 17%, and 50% vs. 11%, respectively). High HPV 16 DNA load, but not high integration ratio, was also more frequent among HIV-1-positive women (39% vs. 9%; and 23% vs. 18%, respectively). The absence of statistical significance of these differences might be explained by the small study sample size. High-integrated HPV 16 DNA load was significantly associated with the presence of high-grade squamous intraepithelial lesions (50% vs. 5%, P = 0.03) in univariate and multivariate analysis (adjusted odds-ratio: 19.05; 95% confidence interval (CI), 1.11-328.3, P = 0.03), but not with HIV-1 or other high-risk HPV types (HR-HPV). Integrated HPV 16 DNA load may be considered as a useful marker of high-grade cervical lesions in HPV 16-infected women. (c) 2007 Wiley-Liss, Inc.

  10. Comprehensive Interpretation of a Three-Point Gauss Quadrature with Variable Sampling Points and Its Application to Integration for Discrete Data

    Directory of Open Access Journals (Sweden)

    Young-Doo Kwon

    2013-01-01

    Full Text Available This study examined the characteristics of a variable three-point Gauss quadrature using a variable set of weighting factors and corresponding optimal sampling points. The major findings were as follows. The one-point, two-point, and three-point Gauss quadratures that adopt the Legendre sampling points and the well-known Simpson’s 1/3 rule were found to be special cases of the variable three-point Gauss quadrature. In addition, the three-point Gauss quadrature may have out-of-domain sampling points beyond the domain end points. By applying the quadratically extrapolated integrals and nonlinearity index, the accuracy of the integration could be increased significantly for evenly acquired data, which is popular with modern sophisticated digital data acquisition systems, without using higher-order extrapolation polynomials.
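
    For reference, the fixed-node special cases mentioned above (the classical three-point Gauss–Legendre rule and Simpson's 1/3 rule) can be written in a few lines; this is textbook quadrature, not the paper's variable-sampling-point formulation.

        import math

        def gauss3(f, a, b):
            """Classical three-point Gauss-Legendre quadrature on [a, b]."""
            nodes = (-math.sqrt(3.0 / 5.0), 0.0, math.sqrt(3.0 / 5.0))
            weights = (5.0 / 9.0, 8.0 / 9.0, 5.0 / 9.0)
            mid, half = 0.5 * (a + b), 0.5 * (b - a)
            return half * sum(w * f(mid + half * t) for w, t in zip(weights, nodes))

        def simpson13(f, a, b):
            """Simpson's 1/3 rule, another special case of the three-point family."""
            return (b - a) / 6.0 * (f(a) + 4.0 * f(0.5 * (a + b)) + f(b))

        # Integral of exp(x) on [0, 1]; exact value is e - 1
        f = math.exp
        print(gauss3(f, 0.0, 1.0), simpson13(f, 0.0, 1.0), math.e - 1.0)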

  11. A Prorating Method for Estimating MMPI-2-RF Scores From MMPI Responses: Examination of Score Fidelity and Illustration of Empirical Utility in the PERSEREC Police Integrity Study Sample.

    Science.gov (United States)

    Tarescavage, Anthony M; Corey, David M; Ben-Porath, Yossef S

    2016-04-01

    The purpose of the current study was to identify Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF) correlates of police officer integrity violations and other problem behaviors in an archival database with original MMPI item responses and collateral information regarding integrity violations obtained for 417 male officers. In Study 1, we estimated MMPI-2-RF scores from the MMPI item pool (which includes approximately 80% of the MMPI-2-RF items) in a normative sample, a psychiatric inpatient sample, and a police officer sample, and conducted analyses that demonstrated the comparability of estimated and full scale scores for 41 of the 51 MMPI-2-RF scales. In Study 2, we correlated estimated MMPI-2-RF scores with information about subsequent integrity violations and problem behaviors from the integrity violation data set. Several meaningful associations were obtained, predominately with scales from the emotional, thought, and behavioral dysfunction domains of the MMPI-2-RF. Application of a correction for range restriction yielded substantially improved validity estimates. Finally, we calculated relative risk ratios for the statistically significant findings using cutoffs lower than 65T, which is traditionally used to identify clinically significant elevations, and found several meaningful relative risk ratios. © The Author(s) 2015.
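
    The prorating idea, estimating a full scale score from the roughly 80% of items available in the older item pool, can be sketched generically as rescaling the observed raw sum by the ratio of total to available items; the function below is an illustration of that generic idea, not the authors' scoring procedure.

        def prorated_raw_score(item_responses, n_items_total):
            """Prorate a scale's raw score when only a subset of its items is available.

            item_responses : list of 0/1 responses, with None for items missing
                             from the older item pool
            n_items_total  : number of items on the full scale
            """
            answered = [r for r in item_responses if r is not None]
            if not answered:
                raise ValueError("no scorable items available")
            return sum(answered) * n_items_total / len(answered)

        # Hypothetical 10-item scale with only 8 items present in the older item pool
        print(prorated_raw_score([1, 0, 1, 1, 0, None, 1, 0, None, 1], 10))  # -> 6.25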

  12. Augmenting comprehension of geological relationships by integrating 3D laser scanned hand samples within a GIS environment

    Science.gov (United States)

    Harvey, A. S.; Fotopoulos, G.; Hall, B.; Amolins, K.

    2017-06-01

    Geological observations can be made on multiple scales, including micro- (e.g. thin section), meso- (e.g. hand-sized to outcrop) and macro- (e.g. outcrop and larger) scales. Types of meso-scale samples include, but are not limited to, rocks (including drill cores), minerals, and fossils. The spatial relationship among samples paired with physical (e.g. granulometric composition, density, roughness) and chemical (e.g. mineralogical and isotopic composition) properties can aid in interpreting geological settings, such as paleo-environmental and formational conditions as well as geomorphological history. Field samples are collected along traverses in the area of interest based on characteristic representativeness of a region, predetermined rate of sampling, and/or uniqueness. The location of a sample can provide relative context in seeking out additional key samples. Beyond labelling and recording of geospatial coordinates for samples, further analysis of physical and chemical properties may be conducted in the field and laboratory. The main motivation for this paper is to present a workflow for the digital preservation of samples (via 3D laser scanning) paired with the development of cyber infrastructure, which offers geoscientists and engineers the opportunity to access an increasingly diverse worldwide collection of digital Earth materials. This paper describes a Web-based graphical user interface developed using Web AppBuilder for ArcGIS for digitized meso-scale 3D scans of geological samples to be viewed alongside the macro-scale environment. Over 100 samples of virtual rocks, minerals and fossils populate the developed geological database and are linked explicitly with their associated attributes, characteristic properties, and location. Applications of this new Web-based geological visualization paradigm in the geosciences demonstrate the utility of such a tool in an age of increasing global data sharing.

  13. Integration of two-dimensional LC-MS with multivariate statistics for comparative analysis of proteomic samples

    NARCIS (Netherlands)

    Gaspari, M.; Verhoeckx, K.C.M.; Verheij, E.R.; Greef, J. van der

    2006-01-01

    LC-MS-based proteomics requires methods with high peak capacity and a high degree of automation, integrated with data-handling tools able to cope with the massive data produced and able to quantitatively compare them. This paper describes an off-line two-dimensional (2D) LC-MS method and its

  14. Impact of collection container material and holding times on sample integrity for mercury and methylmercury in water

    Energy Technology Data Exchange (ETDEWEB)

    Riscassi, Ami L [ORNL]; Miller, Carrie L [ORNL]; Brooks, Scott C [ORNL]

    2014-01-01

    Mercury (Hg) and methylmercury (MeHg) concentrations in streamwater can vary on short timescales (hourly or less) during storm flow and on a diel cycle; the frequency and timing of sampling required to accurately characterize these dynamics may be difficult to accomplish manually. Automated sampling can assist in sample collection; however, its use has been limited for Hg and MeHg analysis due to stability concerns of trace concentrations during extended storage times. We examined the viability of using automated samplers with disposable low-density polyethylene (LDPE) sample bags to collect industrially contaminated streamwater for unfiltered and filtered Hg and MeHg analysis. Specifically, we investigated the effect of holding times ranging from hours to days on streamwater collected during baseflow and storm flow. Unfiltered and filtered Hg and MeHg concentrations decreased with increases in time prior to sample processing; holding times of 24 hours or less resulted in concentration changes (mean 11 ± 7% different) similar to variability in duplicates collected manually during analogous field conditions (mean 7 ± 10% different). Comparisons of samples collected with manual and automated techniques throughout a year for a wide range of stream conditions were also found to be similar to differences observed between duplicate grab samples. These results demonstrate that automated sampling into LDPE bags with holding times of 24 hours or less can be effectively used to collect streamwater for Hg and MeHg analysis, and encourage the testing of these materials and methods for implementation in other aqueous systems where high-frequency sampling is warranted.

  15. Sampling and Mapping Soil Erosion Cover Factor for Fort Richardson, Alaska. Integrating Stratification and an Up-Scaling Method

    National Research Council Canada - National Science Library

    Wang, Guangxing; Gertner, George; Anderson, Alan B; Howard, Heidi

    2006-01-01

    When a ground and vegetation cover factor related to soil erosion is mapped with the aid of remotely sensed data, a cost-efficient sample design to collect ground data and obtain an accurate map is required...

  16. All-integrated and highly sensitive paper based device with sample treatment platform for Cd2+ immunodetection in drinking/tap waters.

    Science.gov (United States)

    López Marzo, Adaris M; Pons, Josefina; Blake, Diane A; Merkoçi, Arben

    2013-04-02

    Nowadays, the development of systems, devices, or methods that integrate several process steps into one multifunctional step for clinical, environmental, or industrial purposes constitutes a challenge for many ongoing research projects. Here, we present a new integrated paper based cadmium (Cd(2+)) immunosensing system in lateral flow format, which integrates the sample treatment process with the analyte detection process. The principle of Cd(2+) detection is based on competitive reaction between the cadmium-ethylenediaminetetraacetic acid-bovine serum albumin-gold nanoparticles (Cd-EDTA-BSA-AuNP) conjugate deposited on the conjugation pad strip and the Cd-EDTA complex formed in the analysis sample for the same binding sites of the 2A81G5 monoclonal antibody (mAb), specific to Cd-EDTA but not Cd(2+) free, which is immobilized onto the test line. This platform operates without any sample pretreatment step for Cd(2+) detection thanks to an extra conjugation pad that ensures Cd(2+) complexation with EDTA and interference masking through ovalbumin (OVA). The detection and quantification limits found for the device were 0.1 and 0.4 ppb, respectively, these being the lowest limits reported up to now for metal sensors based on paper. The accuracy of the device was evaluated by addition of known quantities of Cd(2+) to different drinking water samples and subsequent Cd(2+) content analysis. Sample recoveries ranged from 95 to 105% and the coefficient of variation for the intermediate precision assay was less than 10%. In addition, the results obtained here were compared with those obtained with the well-established inductively coupled plasma emission spectroscopy (ICPES) and the analysis of certificate standard samples.
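
    The accuracy evaluation described above, adding known quantities of Cd2+ to drinking water and checking how much is found, amounts to a standard spike-recovery calculation; the numbers below are invented for illustration.

        def percent_recovery(measured_spiked_ppb, measured_unspiked_ppb, added_ppb):
            """Spike recovery in percent: (found - background) / added * 100."""
            return (measured_spiked_ppb - measured_unspiked_ppb) / added_ppb * 100.0

        # Hypothetical tap-water sample: background 0.2 ppb, spiked with 2.0 ppb Cd2+
        print(f"{percent_recovery(2.15, 0.2, 2.0):.1f} %")  # -> 97.5 %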

  17. Estimating pesticide sampling rates by the polar organic chemical integrative sampler (POCIS) in the presence of natural organic matter and varying hydrodynamic conditions

    International Nuclear Information System (INIS)

    Charlestra, Lucner; Amirbahman, Aria; Courtemanch, David L.; Alvarez, David A.; Patterson, Howard

    2012-01-01

    The polar organic chemical integrative sampler (POCIS) was calibrated to monitor pesticides in water under controlled laboratory conditions. The effect of natural organic matter (NOM) on the sampling rates (Rs) was evaluated in microcosms containing different levels of total organic carbon (TOC). The effect of hydrodynamics was studied by comparing Rs values measured in stirred (SBE) and quiescent (QBE) batch experiments and a flow-through system (FTS). The level of NOM in the water used in these experiments had no effect on the magnitude of the pesticide sampling rates (p > 0.05). However, flow velocity and turbulence significantly increased the sampling rates of the pesticides in the FTS and SBE compared to the QBE (p < 0.001). The calibration data generated can be used to derive pesticide concentrations in water from POCIS deployed in stagnant and turbulent environmental systems without correction for NOM. - Highlights: ► We assessed the effect of TOC and stirring on pesticide sampling rates by POCIS. ► Total organic carbon (TOC) had no effect on the sampling rates. ► Water flow and stirring significantly increased the magnitude of the sampling rates. ► The sampling rates generated are directly applicable to field conditions. - This study provides POCIS sampling rate data that can be used to estimate freely dissolved concentrations of toxic pesticides in natural waters.
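
    In practice the calibration data are used through the standard integrative-sampler relation C_TWA = N / (Rs · t); a minimal sketch with placeholder numbers, not values from the study.

        def pocis_twa(analyte_mass_ng, sampling_rate_L_per_day, deployment_days):
            """Time-weighted average water concentration from a POCIS extract (ng/L).

            Uses the usual linear-uptake assumption for integrative passive samplers:
            C_TWA = N / (Rs * t).
            """
            return analyte_mass_ng / (sampling_rate_L_per_day * deployment_days)

        # Hypothetical deployment: 120 ng of a pesticide recovered, Rs = 0.2 L/day, 28 days
        print(f"{pocis_twa(120.0, 0.2, 28):.1f} ng/L")  # -> ~21.4 ng/L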

  18. Integration of auto analysis program of gamma spectrum and software and determination of element content in sample by k-zero method

    International Nuclear Information System (INIS)

    Trinh Quang Vinh; Truong Thi Hong Loan; Mai Van Nhon; Huynh Truc Phuong

    2014-01-01

    Integrating a gamma-spectrum auto-analysis program with elemental analysis software based on the k-zero method is an objective for many researchers. This work is the first step in building a gamma-spectrum auto-analysis program, which includes modules for reading and displaying spectra, calibrating peak energies, smoothing spectra, calculating peak areas and determining the content of elements in a sample. The results from measurements of standard samples on a low-level spectrometer with an HPGe detector are then compared to those of other gamma-spectrum auto-analysis programs. (author)
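
    The peak-area module mentioned above usually reduces to subtracting a baseline estimated from channels flanking the peak region; the sketch below is a generic net-area calculation on a hypothetical spectrum, not the program described in the record.

        import numpy as np

        def net_peak_area(counts, peak_lo, peak_hi, bg_width=5):
            """Net area of a gamma-ray peak with a baseline estimated as the average
            of `bg_width` channels on each side of the peak region."""
            counts = np.asarray(counts, dtype=float)
            gross = counts[peak_lo:peak_hi + 1].sum()
            n_peak = peak_hi - peak_lo + 1
            left_bg = counts[peak_lo - bg_width:peak_lo].mean()
            right_bg = counts[peak_hi + 1:peak_hi + 1 + bg_width].mean()
            background = 0.5 * (left_bg + right_bg) * n_peak
            net = gross - background
            uncertainty = np.sqrt(gross + background)   # rough counting statistics
            return net, uncertainty

        # Hypothetical spectrum: flat ~50-count background with a peak in channels 100-110
        rng = np.random.default_rng(3)
        spectrum = rng.poisson(50, 200)
        spectrum[100:111] += rng.poisson(200, 11)
        print(net_peak_area(spectrum, 100, 110))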

  19. An Integrated Solution-Based Rapid Sample Preparation Procedure for the Analysis of N-Glycans From Therapeutic Monoclonal Antibodies.

    Science.gov (United States)

    Aich, Udayanath; Liu, Aston; Lakbub, Jude; Mozdzanowski, Jacek; Byrne, Michael; Shah, Nilesh; Galosy, Sybille; Patel, Pramthesh; Bam, Narendra

    2016-03-01

    Consistent glycosylation in therapeutic monoclonal antibodies is a major concern in the biopharmaceutical industry as it impacts the drug's safety and efficacy and manufacturing processes. Large numbers of samples are created for the analysis of glycans during various stages of recombinant proteins drug development. Profiling and quantifying protein N-glycosylation is important but extremely challenging due to its microheterogeneity and more importantly the limitations of existing time-consuming sample preparation methods. Thus, a quantitative method with fast sample preparation is crucial for understanding, controlling, and modifying the glycoform variance in therapeutic monoclonal antibody development. Presented here is a rapid and highly quantitative method for the analysis of N-glycans from monoclonal antibodies. The method comprises a simple and fast solution-based sample preparation method that uses nontoxic reducing reagents for direct labeling of N-glycans. The complete work flow for the preparation of fluorescently labeled N-glycans takes a total of 3 h with less than 30 min needed for the release of N-glycans from monoclonal antibody samples. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  20. A laboratory assessment of the Waveband Integrated Bioaerosol Sensor (WIBS-4) using individual samples of pollen and fungal spore material

    Science.gov (United States)

    Healy, David A.; O'Connor, David J.; Burke, Aoife M.; Sodeau, John R.

    2012-12-01

    A bioaerosol sensing instrument referred to as WIBS-4, designed to continuously monitor ambient bioaerosols on-line, has been used to record a multiparameter “signature” from each of a number of Primary Biological Aerosol Particulate (PBAP) samples found in air. These signatures were obtained in a controlled laboratory environment and are based on the size, asymmetry (“shape”) and auto-fluorescence of the particles. Fifteen samples from two separate taxonomic ranks (kingdoms), Plantae (×8) and Fungi (×7), were individually introduced to the WIBS-4 for measurement along with two non-fluorescing chemical solids, common salt and chalk. Over 2000 individual-particle measurements were recorded for each sample type, and the ability of the WIBS spectroscopic technique to distinguish between chemicals, pollen and fungal spore material was examined by identifying individual PBAP signatures. The results obtained show that WIBS-4 could potentially be a very useful analytical tool for distinguishing between natural airborne PBAP samples, such as the fungal spores, and may potentially play an important role in detecting and discriminating the toxic fungal spore Aspergillus fumigatus from others in real time. If the sizing range of the commercial instrument were customarily increased and the instrument permitted to operate simultaneously in its two sizing ranges, pollen and spores could potentially be discriminated between. The data also suggest that the gain-setting sensitivity of the detector would have to be reduced by a factor of >5 to routinely detect in-range fluorescence measurements for pollen samples.

  1. An integrated sample preparation to determine coccidiostats and emerging Fusarium-mycotoxins in various poultry tissues with LC-MS/MS.

    Science.gov (United States)

    Jestoi, Marika; Rokka, Mervi; Peltonen, Kimmo

    2007-05-01

    An existing sample preparation technique used for ionophoric coccidiostats (lasalocid, monensin, salinomycin and narasin) was applied to the analysis of the emerging Fusarium mycotoxins beauvericin (BEA) and enniatins (ENNs) in poultry tissues (liver and meat). Maduramicin was also included, and liver was introduced as a new sample matrix. The developed methods were validated and applied for the determination of coccidiostats and BEA/ENNs in Finnish poultry tissues in 2004-2005. The validation parameters demonstrated that the integrated sample preparation technique is applicable to the parallel determination of these contaminants in poultry tissues. Of the samples analysed (276 meat and 43 liver), only trace levels of LAS, MON, SAL, NAR and MAD were detected, in 7, 3, 5, 6 and 4% of the samples, respectively. Interestingly, for the first time, traces of BEA and ENNs could also be detected in animal tissues. BEA and ENNs A, A1, B and B1 were found in 2, 0.3, 0.6, 4 and 3% of the samples, respectively. The simultaneous presence of coccidiostats and mycotoxins was detected in three turkey samples in 2004.

  2. Integrated sampling and analysis unit for the determination of sexual pheromones in environmental air using fabric phase sorptive extraction and headspace-gas chromatography-mass spectrometry.

    Science.gov (United States)

    Alcudia-León, M Carmen; Lucena, Rafael; Cárdenas, Soledad; Valcárcel, Miguel; Kabir, Abuzar; Furton, Kenneth G

    2017-03-10

    This article presents a novel unit that integrates, for the first time, air sampling and preconcentration based on the use of fabric phase sorptive extraction principles. The determination of Tuta absoluta sexual pheromone traces in environmental air has been selected as the analytical problem. For this aim, a novel laboratory-built unit made up of commercial brass elements, serving as a holder for the sol-gel coated fabric extraction phase, has been designed and optimized. The performance of the integrated unit was evaluated by analyzing environmental air sampled in tomato crops. The unit can work in sampling and analysis modes, which eliminates any need for sorptive-phase manipulation prior to instrumental analysis. In the sampling mode, the unit can be connected to a sampling pump to pass the air through the sorptive phase at a controlled flow-rate. In the analysis mode, it is placed in the gas chromatograph autosampler without any instrumental modification. It also diminishes the risk of cross contamination between sampling and analysis. The performance of the new unit has been evaluated using the main components of the sexual pheromone of Tuta absoluta [(3E,8Z,11Z)-tetradecatrien-1-yl acetate and (3E,8Z)-tetradecadien-1-yl acetate] as model analytes. The limits of detection for both compounds were found to be 1.6 μg and 0.8 μg, respectively, while the precision (expressed as relative standard deviation) was better than 3.7%. Finally, the unit has been deployed in the field to analyze a number of real-life samples, some of which were found to be positive. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Unprecedented Integral-Free Debye Temperature Formulas: Sample Applications to Heat Capacities of ZnSe and ZnTe

    Directory of Open Access Journals (Sweden)

    R. Pässler

    2017-01-01

    Full Text Available Detailed analytical and numerical analyses are performed for combinations of several complementary sets of measured heat capacities, for ZnSe and ZnTe, from the liquid-helium region up to 600 K. The isochoric (harmonic) parts of heat capacities, CVh(T), are described within the frame of a properly devised four-oscillator hybrid model. Additional anharmonicity-related terms are included for comprehensive numerical fittings of the isobaric heat capacities, Cp(T). The contributions of Debye and non-Debye type due to the low-energy acoustical phonon sections are represented here for the first time by unprecedented, integral-free formulas. Indications for weak electronic contributions to the cryogenic heat capacities are found for both materials. A novel analytical framework has been constructed for high-accuracy evaluations of Debye function integrals via a couple of integral-free formulas, consisting of Debye’s conventional low-temperature series expansion in combination with an unprecedented high-temperature series representation for reciprocal values of the Debye function. The zero-temperature limits of Debye temperatures have been detected from published low-temperature Cp(T) data sets to be significantly lower than previously estimated, namely, 270 (±3) K for ZnSe and 220 (±2) K for ZnTe. The high-temperature limits of the “true” (harmonic) lattice Debye temperatures are found to be 317 K for ZnSe and 262 K for ZnTe.
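
    For context, the conventional Debye-function integral that the integral-free series above are designed to replace is the textbook expression (standard background written here in LaTeX; this is not one of the authors' new formulas):

        C_V^{\mathrm{Debye}}(T) \;=\; 9 N k_B \left(\frac{T}{\Theta_D}\right)^{3} \int_0^{\Theta_D/T} \frac{x^{4} e^{x}}{\left(e^{x}-1\right)^{2}}\, dx ,
        \qquad
        \lim_{T \to 0} C_V^{\mathrm{Debye}}(T) \;=\; \frac{12\pi^{4}}{5}\, N k_B \left(\frac{T}{\Theta_D}\right)^{3} .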

  4. Time integrated Pesticide analysis in the tropical Rio Tapezco in Costa Rica by using passive sampling approaches

    Science.gov (United States)

    Weiss, Frederik; Stamm, Christian; Ruepert, Clemens; Zurbrügg, Christian; Eggen, Rik

    2016-04-01

    Tropical areas are pesticide "hot spots". Global data indicate that in these regions the annual average pesticide application rates and surface runoff potentials can be very high. However, in tropical regions, information about the pesticide entry routes, their environmental behavior, and the degree of water pollution is often lacking. Catchment-scale monitoring data are required to fill these knowledge gaps and to gain a better systematic understanding of the environmental fate, behavior and impacts of pesticides in tropical aquatic environments. Accordingly, our study was conducted in the tropical Rio Tapezco catchment in the Zarcero canton, Costa Rica. The area covers 5112 ha, ranges between an altitude of 1580 and 2010 m above sea level and receives a total annual precipitation between 1500 and 3500 mm. The catchment is intensively used for the horticultural production of vegetables and herbs. It is a hot spot of pesticide use, with an average application rate of about 22 kg/ha of arable land per cropping cycle. In conjunction with poor pesticide application practices, the tropical climate, heavy precipitation and continuous pesticide application throughout the year, the risks for water pollution and environmental health are high. Indeed, previous spot sampling showed that in streams of the study area, several pesticides were found in concentrations up to 6.8 μg/L. While these data indicate the risk for the aquatic environment, seasonal grab sampling only poorly reflects the highly dynamic concentration time series. Additionally, the assessment of the actual pollution level was restricted by a limited analytical window. To close these research gaps, we sampled the rivers of the study area continuously between the end of July and the beginning of October 2015 using three passive sampling systems (Chemcatcher® with styrene-divinylbenzene reverse-phase sulfonated discs, polydimethylsiloxane sheets, and a water-level-proportional water sampler). Samples

  5. An integrated sample pretreatment platform for quantitative N-glycoproteome analysis with combination of on-line glycopeptide enrichment, deglycosylation and dimethyl labeling

    Energy Technology Data Exchange (ETDEWEB)

    Weng, Yejing; Qu, Yanyan; Jiang, Hao; Wu, Qi [National Chromatographic Research and Analysis Center, Key Laboratory of Separation Science for Analytical Chemistry, Dalian Institute of Chemical Physics, Chinese Academy of Sciences, Dalian 116023 (China); University of the Chinese Academy of Sciences, Beijing 100039 (China); Zhang, Lihua, E-mail: lihuazhang@dicp.ac.cn [National Chromatographic Research and Analysis Center, Key Laboratory of Separation Science for Analytical Chemistry, Dalian Institute of Chemical Physics, Chinese Academy of Sciences, Dalian 116023 (China); Yuan, Huiming [National Chromatographic Research and Analysis Center, Key Laboratory of Separation Science for Analytical Chemistry, Dalian Institute of Chemical Physics, Chinese Academy of Sciences, Dalian 116023 (China); Zhou, Yuan [National Chromatographic Research and Analysis Center, Key Laboratory of Separation Science for Analytical Chemistry, Dalian Institute of Chemical Physics, Chinese Academy of Sciences, Dalian 116023 (China); University of the Chinese Academy of Sciences, Beijing 100039 (China); Zhang, Xiaodan; Zhang, Yukui [National Chromatographic Research and Analysis Center, Key Laboratory of Separation Science for Analytical Chemistry, Dalian Institute of Chemical Physics, Chinese Academy of Sciences, Dalian 116023 (China)

    2014-06-23

    Highlights: • An integrated platform for quantitative N-glycoproteome analysis was established. • On-line enrichment, deglycosylation and labeling could be achieved within 160 min. • An N₂-assisted interface was applied to improve the compatibility of the platform. • The platform exhibited improved quantification accuracy, precision and throughput. - Abstract: Relative quantification of N-glycoproteomes shows great promise for the discovery of candidate biomarkers and therapeutic targets. The traditional protocol for quantitative analysis of glycoproteomes is usually performed off-line, and suffers from long sample preparation time and the risk of sample loss or contamination due to manual manipulation. In this study, a novel integrated sample preparation platform for quantitative N-glycoproteome analysis was established, combining online N-glycopeptide capture by a HILIC column, sample buffer exchange by an N₂-assisted HILIC–RPLC interface, deglycosylation by a hydrophilic PNGase F immobilized enzymatic reactor (hIMER) and solid dimethyl labeling on a C18 precolumn. To evaluate the performance of such a platform, two equal aliquots of immunoglobulin G (IgG) digests were sequentially pretreated, followed by MALDI-TOF MS analysis. The signal intensity ratio of heavy/light (H/L) labeled deglycosylated peptides with the equal aliquots was 1.00 (RSD = 6.2%, n = 3), much better than that obtained by the offline protocol, with an H/L ratio of 0.76 (RSD = 11.6%, n = 3). Additionally, the total on-line sample preparation time was greatly shortened to 160 min, much faster than that of the offline approach (24 h). Furthermore, such an integrated pretreatment platform was successfully applied to analyze the two kinds of hepatocarcinoma ascites syngeneic cell lines with high (Hca-F) and low (Hca-P) lymph node metastasis rates. For H/L labeled Hca-P lysates with the equal aliquots, 99.6% of log₂ ratios (H/L) of quantified glycopeptides ranged from −1

  7. Analysis of Reflectance and Transmittance Measurements on Absorbing and Scattering Small Samples Using a Modified Integrating Sphere Setup

    DEFF Research Database (Denmark)

    Jernshøj, Kit Drescher; Hassing, Søren

    2009-01-01

    The purpose of the article is to analyse reflectance and transmittance measurements on small scattering and absorbing samples. Small samples, such as green leaves, pose a particular experimental challenge when the sample beam has a larger cross-section than the sample to be measured. The experimental errors introduced

  8. Integration of GC-MSD and ER-Calux® assay into a single protocol for determining steroid estrogens in environmental samples.

    Science.gov (United States)

    Avberšek, Miha; Žegura, Bojana; Filipič, Metka; Heath, Ester

    2011-11-01

    There are many published studies that use either chemical or biological methods to investigate steroid estrogens in the aquatic environment, but rarer are those that combine both. In this study, gas chromatography with mass selective detection (GC-MSD) and the ER-Calux(®) estrogenicity assay were integrated into a single protocol for simultaneous determination of natural (estrone--E1, 17β-estradiol--E2, estriol--E3) and synthetic (17α-ethinylestradiol--EE2) steroid estrogens concentrations and the total estrogenic potential of environmental samples. For integration purposes, several solvents were investigated and the commonly used dimethyl sulphoxide (DMSO) in the ER-Calux(®) assay was replaced by ethyl acetate, which is more compatible with gas chromatography and enables the same sample to be analysed by both GC-MSD and the ER-Calux(®) assay. The integrated protocol was initially tested using a standard mixture of estrogens. The results for pure standards showed that the estrogenicity calculated on the basis of GC-MSD and the ER-Calux(®) assay exhibited good correlation (r(2)=0.96; α=0.94). The result remained the same when spiked waste water extracts were tested (r(2)=0.92, α=1.02). When applied to real waste water influent and effluent samples the results proved (r(2)=0.93; α=0.99) the applicability of the protocol. The main advantages of this newly developed protocol are simple sample handling for both methods, and reduced material consumption and labour. In addition, it can be applied as either a complete or sequential analysis where the ER-Calux(®) assay is used as a pre-screening method prior to the chemical analysis. Copyright © 2011 Elsevier B.V. All rights reserved.
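
    Chemical concentrations from GC-MSD are conventionally compared with ER-Calux estrogenicity by converting each analyte to 17β-estradiol equivalents (EEQ) using relative potency factors; the potencies and concentrations below are rough placeholders for illustration, not values from this study.

        # Hypothetical relative estrogenic potencies (17beta-estradiol = 1.0)
        RELATIVE_POTENCY = {"E1": 0.2, "E2": 1.0, "E3": 0.01, "EE2": 1.2}

        def chemical_eeq(concentrations_ng_L):
            """Sum of concentration x relative potency, in ng E2-equivalents per litre."""
            return sum(concentrations_ng_L[c] * RELATIVE_POTENCY[c]
                       for c in concentrations_ng_L)

        # Hypothetical effluent, to be compared with the EEQ reported by the ER-Calux assay
        measured = {"E1": 12.0, "E2": 3.0, "E3": 25.0, "EE2": 0.8}
        print(f"Chemically derived EEQ ~ {chemical_eeq(measured):.2f} ng/L")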

  9. An Integrated Approach Using Chaotic Map & Sample Value Difference Method for Electrocardiogram Steganography and OFDM Based Secured Patient Information Transmission.

    Science.gov (United States)

    Pandey, Anukul; Saini, Barjinder Singh; Singh, Butta; Sood, Neetu

    2017-10-18

    This paper presents a scheme for hiding a patient's confidential data in an electrocardiogram (ECG) signal and its subsequent wireless transmission. The patient's confidential data are embedded in the ECG (called the stego-ECG) using a chaotic map and the sample value difference approach. The sample value difference approach effectually hides the patient's confidential data in ECG sample pairs at predefined locations. The chaotic map generates these predefined locations through the use of selective control parameters. Subsequently, the wireless transmission of the stego-ECG is analyzed using the Orthogonal Frequency Division Multiplexing (OFDM) system in a Rayleigh fading scenario for telemedicine applications. Evaluation of the proposed method on all 48 records of the MIT-BIH arrhythmia ECG database demonstrates that the embedding does not alter the diagnostic features of the cover ECG. The secret data imperceptibility in the stego-ECG is evident through the statistical and clinical performance measures. Statistical measures comprise Percentage Root-mean-square Difference (PRD), Peak Signal-to-Noise Ratio (PSNR), and Kullback-Leibler Divergence (KL-Div), while clinical metrics include wavelet Energy Based Diagnostic Distortion (WEDD) and Wavelet-based Weighted PRD (WWPRD). Various channel signal-to-noise-ratio scenarios are simulated for wireless communication of the stego-ECG in the OFDM system. Over all 48 records of the MIT-BIH arrhythmia database, the proposed method resulted on average in PRD = 0.26, PSNR = 55.49, KL-Div = 3.34 × 10⁻⁶, WEDD = 0.02, and WWPRD = 0.10 with a secret data size of 21 Kb. Further, a comparative analysis of the proposed method and recent existing works was also performed. The results clearly demonstrated the superiority of the proposed method.
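
    A much-simplified sketch of the two ingredients named above: a logistic (chaotic) map that derives embedding locations from a key, and a sample-value-difference rule that hides one bit per sample pair by nudging the pair's difference parity. The quantisation step, map parameter and pairing rule are assumptions made for illustration and do not reproduce the authors' scheme.

        import numpy as np

        def logistic_locations(key, n_bits, n_samples, r=3.99):
            """Derive n_bits distinct sample-pair indices from a logistic map seeded
            by `key` (0 < key < 1)."""
            x, seen, loc = key, set(), []
            while len(loc) < n_bits:
                x = r * x * (1.0 - x)
                idx = int(x * (n_samples // 2 - 1)) * 2    # even index -> pair (idx, idx+1)
                if idx not in seen:
                    seen.add(idx)
                    loc.append(idx)
            return loc

        def embed_bits(ecg, bits, key=0.3141, quantum=1):
            """Hide bits by forcing the parity of the quantised difference of a sample pair."""
            stego = ecg.astype(np.int64).copy()
            for bit, i in zip(bits, logistic_locations(key, len(bits), len(ecg))):
                diff = (stego[i + 1] - stego[i]) // quantum
                if (diff % 2) != bit:                      # adjust by one quantum if parity differs
                    stego[i + 1] += quantum
            return stego

        # Toy example: 12 bits of "patient data" hidden in a synthetic integer ECG trace
        ecg = (1000 * np.sin(np.linspace(0, 6 * np.pi, 2000))).astype(np.int64)
        secret = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]
        stego = embed_bits(ecg, secret)
        print("max sample change:", int(np.max(np.abs(stego - ecg))))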

  10. A computer program integrating a multichannel analyzer with gamma analysis for the estimation of ²²⁶Ra concentration in soil samples

    International Nuclear Information System (INIS)

    Wilson, J. E.

    1992-08-01

    A new hardware/software system has been implemented using the existing three-regions-of-interest method for determining the concentration of ²²⁶Ra in soil samples for the Pollutant Assessment Group of the Oak Ridge National Laboratory. Consisting of a personal computer containing a multichannel analyzer, the system utilizes a new program combining the multichannel analyzer with a program analyzing gamma-radiation spectra for ²²⁶Ra concentrations. This program uses a menu interface to minimize and simplify the tasks of system operation

  11. Deciphering the Correlation between Breast Tumor Samples and Cell Lines by Integrating Copy Number Changes and Gene Expression Profiles

    Directory of Open Access Journals (Sweden)

    Yi Sun

    2015-01-01

    Full Text Available Breast cancer is one of the most common cancers, with a high incidence rate and a high mortality rate worldwide. Although different breast cancer cell lines have been widely used in laboratory investigations, accumulated evidence over the past decades has indicated that genomic differences exist between cancer cell lines and tissue samples. The abundant molecular profiles of cancer cell lines and tumor samples deposited in the Cancer Cell Line Encyclopedia and The Cancer Genome Atlas now allow a systematic comparison of breast cancer cell lines with breast tumors. We depicted the genomic characteristics of primary breast tumors based on copy number variation and gene expression profiles, and the breast cancer cell lines were compared to different subgroups of breast tumors. We found that some of the breast cancer cell lines show high correlation with the tumor group that agrees with previous knowledge, while a large proportion do not, including the most widely used MCF7, MDA-MB-231, and T-47D. We present a computational framework to identify the cell lines that most closely resemble a certain tumor group for breast tumor studies. Our investigation provides a useful guide to bridge the gap between cell lines and tumors and helps to select the most suitable cell line models for personalized cancer studies.

  12. Development and operation of an integrated sampling probe and gas analyzer for turbulent mixing studies in complex supersonic flows

    Science.gov (United States)

    Wiswall, John D.

    For many aerospace applications, mixing enhancement between co-flowing streams has been identified as a critical and enabling technology. Due to short fuel residence times in scramjet combustors, combustion is limited by the molecular mixing of hydrogen (fuel) and air. Determining the mixedness of fuel and air in these complex supersonic flowfields is critical to the advancement of novel injection schemes currently being developed at UTA in collaboration with NASA Langley and intended to be used on a future two-stage to orbit (~Mach 16) hypersonic air-breathing vehicle for space access. Expanding on previous work, an instrument has been designed, fabricated, and tested in order to measure mean concentrations of injected helium (a passive scalar used instead of hazardous hydrogen) and to quantitatively characterize the nature of the high-frequency concentration fluctuations encountered in the compressible, turbulent, and high-speed (up to Mach 3.5) complex flows associated with the new supersonic injection schemes. This important high-frequency data is not yet attainable when employing other techniques such as Laser Induced Fluorescence, Filtered Rayleigh Scattering or mass spectroscopy in the same complex supersonic flows. The probe operates by exploiting the difference between the thermodynamic properties of two species through independent massflow measurements and calibration. The probe samples isokinetically from the flowfield's area of interest and the helium concentration may be uniquely determined by hot-film anemometry and internally measured stagnation conditions. The final design has a diameter of 0.25" and is only 2.22" long. The overall accuracy of the probe is 3% in molar fraction of helium. The frequency response of mean concentration measurements is estimated at 103 Hz, while high-frequency hot-film measurements were conducted at 60 kHz. Additionally, the work presents an analysis of the probe's internal mixing effects and the effects of the spatial

  13. Integration of morphological data sets for phylogenetic analysis of Amniota: the importance of integumentary characters and increased taxonomic sampling.

    Science.gov (United States)

    Hill, Robert V

    2005-08-01

    Several mutually exclusive hypotheses have been advanced to explain the phylogenetic position of turtles among amniotes. Traditional morphology-based analyses place turtles among extinct anapsids (reptiles with a solid skull roof), whereas more recent studies of both morphological and molecular data support an origin of turtles from within Diapsida (reptiles with a doubly fenestrated skull roof). Evaluation of these conflicting hypotheses has been hampered by nonoverlapping taxonomic samples and the exclusion of significant taxa from published analyses. Furthermore, although data from soft tissues and anatomical systems such as the integument may be particularly relevant to this problem, they are often excluded from large-scale analyses of morphological systematics. Here, conflicting hypotheses of turtle relationships are tested by (1) combining published data into a supermatrix of morphological characters to address issues of character conflict and missing data; (2) increasing taxonomic sampling by more than doubling the number of operational taxonomic units to test internal relationships within suprageneric ingroup taxa; and (3) increasing character sampling by approximately 25% by adding new data on the osteology and histology of the integument, an anatomical system that has been historically underrepresented in morphological systematics. The morphological data set assembled here represents the largest yet compiled for Amniota. Reevaluation of character data from prior studies of amniote phylogeny favors the hypothesis that turtles indeed have diapsid affinities. Addition of new ingroup taxa alone leads to a decrease in overall phylogenetic resolution, indicating that existing characters used for amniote phylogeny are insufficient to explain the evolution of more highly nested taxa. Incorporation of new data from the soft and osseous components of the integument, however, helps resolve relationships among both basal and highly nested amniote taxa. Analysis of a

  15. A Pilot Study on Integrating Videography and Environmental Microbial Sampling to Model Fecal Bacterial Exposures in Peri-Urban Tanzania.

    Science.gov (United States)

    Julian, Timothy R; Pickering, Amy J

    2015-01-01

    Diarrheal diseases are a leading cause of under-five mortality and morbidity in sub-Saharan Africa. Quantitative exposure modeling provides opportunities to investigate the relative importance of fecal-oral transmission routes (e.g. hands, water, food) responsible for diarrheal disease. Modeling, however, requires accurate descriptions of individuals' interactions with the environment (i.e., activity data). Such activity data are largely lacking for people in low-income settings. In the present study, we collected activity data and microbiological sampling data to develop a quantitative microbial exposure model for two female caretakers in peri-urban Tanzania. Activity data were combined with microbiological data of contacted surfaces and fomites (e.g. broom handle, soil, clothing) to develop example exposure profiles describing second-by-second estimates of fecal indicator bacteria (E. coli and enterococci) concentrations on the caretaker's hands. The study demonstrates the application and utility of video activity data to quantify exposure factors for people in low-income countries and apply these factors to understand fecal contamination exposure pathways. This study provides both a methodological approach for the design and implementation of larger studies, and preliminary data suggesting contacts with dirt and sand may be important mechanisms of hand contamination. Increasing the scale of activity data collection and modeling to investigate individual-level exposure profiles within target populations for specific exposure scenarios would provide opportunities to identify the relative importance of fecal-oral disease transmission routes.

  16. High throughput and accurate serum proteome profiling by integrated sample preparation technology and single-run data independent mass spectrometry analysis.

    Science.gov (United States)

    Lin, Lin; Zheng, Jiaxin; Yu, Quan; Chen, Wendong; Xing, Jinchun; Chen, Chenxi; Tian, Ruijun

    2018-03-01

    Mass spectrometry (MS)-based serum proteome analysis is extremely challenging due to its high complexity and dynamic range of protein abundances. Developing a high-throughput and accurate serum proteomic profiling approach capable of analyzing large cohorts is urgently needed for biomarker discovery. Herein, we report a streamlined workflow for fast and accurate proteomic profiling from 1 μL of blood serum. The workflow combined an integrated technique for highly sensitive and reproducible sample preparation and a new data-independent acquisition (DIA)-based MS method. Compared with a standard data-dependent acquisition (DDA) approach, the optimized DIA method doubled the number of detected peptides and proteins with better reproducibility. Without protein immunodepletion and prefractionation, the single-run DIA analysis enables quantitative profiling of over 300 proteins with a 50 min gradient time. The quantified proteins span more than five orders of magnitude of abundance and contain over 50 FDA-approved disease markers. The workflow allowed us to analyze 20 serum samples per day, with about 358 protein groups per sample being identified. A proof-of-concept study on renal cell carcinoma (RCC) serum samples confirmed the feasibility of the workflow for large-scale serum proteomic profiling and disease-related biomarker discovery. Blood serum or plasma is the predominant specimen for clinical proteomic studies, while its analysis is extremely challenging owing to its high complexity. Many past efforts in serum proteomics have aimed at maximizing protein identifications, whereas few have been concerned with throughput and reproducibility. Here, we establish a rapid, robust and highly reproducible DIA-based workflow for streamlined serum proteomic profiling from 1 μL of serum. The workflow does not require protein depletion or pre-fractionation, while still being able to detect disease-relevant proteins accurately. The workflow is promising for clinical applications.

  17. A sero-survey of rinderpest in nomadic pastoral systems in central and southern Somalia from 2002 to 2003, using a spatially integrated random sampling approach.

    Science.gov (United States)

    Tempia, S; Salman, M D; Keefe, T; Morley, P; Freier, J E; DeMartini, J C; Wamwayi, H M; Njeumi, F; Soumaré, B; Abdi, A M

    2010-12-01

    A cross-sectional sero-survey, using a two-stage cluster sampling design, was conducted between 2002 and 2003 in ten administrative regions of central and southern Somalia, to estimate the seroprevalence and geographic distribution of rinderpest (RP) in the study area, as well as to identify potential risk factors for the observed seroprevalence distribution. The study was also used to test the feasibility of the spatially integrated investigation technique in nomadic and semi-nomadic pastoral systems. In the absence of a systematic list of livestock holdings, the primary sampling units were selected by generating random map coordinates. A total of 9,216 serum samples were collected from cattle aged 12 to 36 months at 562 sampling sites. Two apparent clusters of RP seroprevalence were detected. Four potential risk factors associated with the observed seroprevalence were identified: the mobility of cattle herds, the cattle population density, the proximity of cattle herds to cattle trade routes and cattle herd size. Risk maps were then generated to assist in designing more targeted surveillance strategies. The observed seroprevalence in these areas declined over time. In subsequent years, similar seroprevalence studies in neighbouring areas of Kenya and Ethiopia also showed a very low seroprevalence of RP or the absence of antibodies against RP. The progressive decline in RP antibody prevalence is consistent with virus extinction. Verification of freedom from RP infection in the Somali ecosystem is currently in progress.

  18. A novel method for sampling the suspended sediment load in the tidal environment using bi-directional time-integrated mass-flux sediment (TIMS) samplers

    Science.gov (United States)

    Elliott, Emily A.; Monbureau, Elaine; Walters, Glenn W.; Elliott, Mark A.; McKee, Brent A.; Rodriguez, Antonio B.

    2017-12-01

    Identifying the source and abundance of sediment transported within tidal creeks is essential for studying the connectivity between coastal watersheds and estuaries. The fine-grained suspended sediment load (SSL) makes up a substantial portion of the total sediment load carried within an estuarine system, and efficient sampling of the SSL is critical to our understanding of nutrient and contaminant transport, anthropogenic influence, and the effects of climate. Unfortunately, traditional methods of sampling the SSL, including instantaneous measurements and automatic samplers, can be labor intensive, expensive and often yield insufficient mass for comprehensive geochemical analysis. In estuaries this issue is even more pronounced due to bi-directional tidal flow. This study tests the efficacy of a time-integrated mass sediment sampler (TIMS) design, originally developed for uni-directional flow within the fluvial environment and modified in this work for implementation in the tidal environment under bi-directional flow conditions. Our new TIMS design utilizes an 'L'-shaped outflow tube to prevent backflow, and when deployed in mirrored pairs, each sampler collects sediment uniquely in one direction of tidal flow. Laboratory flume experiments using dye and particle image velocimetry (PIV) were used to characterize the flow within the sampler, specifically to quantify the settling velocities and identify stagnation points. Further laboratory tests of sediment indicate that the bi-directional TIMS capture up to 96% of the incoming SSL across a range of flow velocities (0.3-0.6 m s-1). The modified TIMS design was tested in the field at two distinct sampling locations within the tidal zone. Single-time-point suspended sediment samples were collected at high and low tide and compared to time-integrated suspended sediment samples collected by the bi-directional TIMS over the same four-day period. Particle-size composition from the bi-directional TIMS were representative of the array of

  19. Stratified Entomological Sampling in Preparation for an Area-Wide Integrated Pest Management Program: The Example of Glossina palpalis gambiensis (Diptera: Glossinidae) in the Niayes of Senegal

    International Nuclear Information System (INIS)

    Bouyer, Jeremy; Seck, Momar Talla; Guerrini, Laure; Sall, Baba; Ndiaye, Elhadji Youssou; Vreysen, Marc J.B.

    2010-01-01

    The riverine tsetse species Glossina palpalis gambiensis Vanderplank 1949 (Diptera: Glossinidae) inhabits riparian forests along river systems in West Africa. The government of Senegal has embarked on a project to eliminate this tsetse species, and African animal trypanosomoses, from the Niayes area using an area-wide integrated pest management approach. A stratified entomological sampling strategy was therefore developed using spatial analytical tools and mathematical modeling. A preliminary phytosociological census identified eight types of suitable habitat, which could be discriminated from LandSat 7ETM satellite images and were denominated wet areas. At the end of March 2009, 683 unbaited Vavoua traps had been deployed, and the observed infested area in the Niayes was 525 km2. In the remaining area, a mathematical model was used to assess the risk that flies were present despite a sequence of zero catches. The analysis showed that this risk was above 0.05 in 19% of this area, which will be considered as infested during the control operations. The remote sensing analysis that identified the wet areas allowed a restriction of the area to be surveyed to 4% of the total surface area (7,150 km2), whereas the mathematical model provided an efficient method to improve the accuracy and the robustness of the sampling protocol. The final size of the control area will be decided based on the entomological collection data. This entomological sampling procedure might be used for other vector or pest control scenarios. (Authors)
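
    The abstract refers to a mathematical model for the risk that flies are present despite a sequence of zero catches, without giving its form. One common way to frame such a calculation is sketched below; the per-trap-night detection probability and the trap-night counts are invented, and the published model may differ.

    ```python
    # Hedged sketch of the kind of calculation the abstract alludes to: the
    # probability of observing only zero catches if flies were in fact present
    # at some low density. The per-trap-night detection probability p and the
    # numbers of trap-nights are hypothetical.
    def risk_of_missed_infestation(p_detect_per_trap_night: float, n_trap_nights: int) -> float:
        """P(all catches are zero | flies present) = (1 - p)^n."""
        return (1.0 - p_detect_per_trap_night) ** n_trap_nights

    for n in (10, 30, 60):
        risk = risk_of_missed_infestation(0.05, n)
        flag = "treat as infested" if risk > 0.05 else "accept as fly-free"
        print(f"{n:3d} trap-nights: residual risk = {risk:.3f} -> {flag}")
    ```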

  20. Monolithic Integration of Sampled Grating DBR with Electroabsorption Modulator by Combining Selective-Area-Growth MOCVD and Quantum-Well Intermixing

    International Nuclear Information System (INIS)

    Hong-Bo, Liu; Ling-Juan, Zhao; Jiao-Qing, Pan; Hong-Liang, Zhu; Fan, Zhou; Bao-Jun, Wang; Wei, Wang

    2008-01-01

    We present the monolithic integration of a sampled-grating distributed Bragg reflector (SG-DBR) laser with a quantum-well electroabsorption modulator (QW-EAM) by combining ultra-low-pressure (55 mbar) selective-area-growth (SAG) metal-organic chemical vapour deposition (MOCVD) and quantum-well intermixing (QWI) for the first time. The QW-EAM and the gain section can be grown simultaneously by using SAG MOCVD technology. Meanwhile, the QWI technology offers an abrupt band-gap change between the two functional sections, which reduces internal absorption loss. The experimental results show that the threshold current Ith is 62 mA and the output power reaches 3.6 mW. The wavelength tuning range covers 30 nm, and all the corresponding side mode suppression ratios are over 30 dB. The extinction ratios at the available wavelength channels reach more than 14 dB at a bias of -5 V.

  1. Comparison of a high temperature torch integrated sample introduction system with a desolvation system for the analysis of microsamples through inductively coupled plasma mass spectrometry

    Science.gov (United States)

    Sánchez, Raquel; Cañabate, Águeda; Bresson, Carole; Chartier, Frédéric; Isnard, Hélène; Maestre, Salvador; Nonell, Anthony; Todolí, José-Luis

    2017-03-01

    This work describes for the first time the comparison of the analytical performances obtained with a high temperature torch integrated sample introduction system (hTISIS) against those found with a commercially available desolvation system (APEX) associated with inductively coupled plasma mass spectrometry (ICP-MS). A double-pass spray chamber was taken as the reference system. Similar detection limits and sensitivities were obtained in continuous injection mode at low liquid flow rates for the APEX and the hTISIS operating at high temperatures. In contrast, in the air-segmented injection mode, the detection limits obtained with the hTISIS at high temperatures were up to 12 times lower than those found for the APEX. Regarding memory effects, wash-out times were shorter in continuous mode and peaks were narrower in air-segmented mode for the hTISIS as compared to the APEX. Non-spectral interferences (matrix effects) were studied with 10% nitric acid, 2% methanol, for an ICP multielemental solution and a hydro-organic matrix containing 70% (v/v) acetonitrile in water, 15 mmol L-1 ammonium acetate and 0.5% formic acid containing lanthanide complexes. In all the cases, matrix effects were less severe for the hTISIS operating at 200 °C and the APEX than for the double-pass spray chamber. Finally, two spiked reference materials (sea water and Antarctic krill) were analyzed. The hTISIS operating at 200 °C gave the best results compared to those obtained with the APEX and the double-pass spray chamber. In conclusion, despite the simplicity of the hTISIS, it provided, at low liquid flow rates, results similar to or better than those obtained with the other sample introduction systems.

  2. The SAMPL5 challenge for embedded-cluster integral equation theory: solvation free energies, aqueous pKa, and cyclohexane–water log D

    CERN Document Server

    Tielker, Nicolas; Heil, Jochen; Kloss, Thomas; Ehrhart, Sebastian; Güssregen, Stefan; Schmidt, K. Friedemann; Kast, Stefan M.

    2016-01-01

    We predict cyclohexane–water distribution coefficients (log D7.4) for drug-like molecules taken from the SAMPL5 blind prediction challenge by the “embedded cluster reference interaction site model” (EC-RISM) integral equation theory. This task involves the coupled problem of predicting both partition coefficients (log P) of neutral species between the solvents and aqueous acidity constants (pKa) in order to account for a change of protonation states. The first issue is addressed by calibrating an EC-RISM-based model for solvation free energies derived from the “Minnesota Solvation Database” (MNSOL) for both water and cyclohexane utilizing a correction based on the partial molar volume, yielding a root mean square error (RMSE) of 2.4 kcal mol−1 for water and 0.8–0.9 kcal mol−1 for cyclohexane depending on the parametrization. The second one is treated by employing on one hand an empirical pKa model (MoKa) and, on the other hand, an EC-RISM-derived regression of published acidity constants (RMSE...
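
    The coupling of log P and pKa described in this abstract is usually expressed through the Henderson–Hasselbalch correction, under the assumption that only the neutral species partitions into the organic phase. The sketch below illustrates that standard relation with invented input values; it is not the EC-RISM workflow itself, which derives the underlying free energies from the integral equation theory.

    ```python
    import math

    def log_d(log_p: float, pka: float, ph: float = 7.4, acid: bool = True) -> float:
        """Standard Henderson-Hasselbalch correction: only the neutral species
        is assumed to partition into the organic phase (a simplification used
        here for illustration only)."""
        if acid:   # neutral fraction shrinks as pH rises above pKa
            return log_p - math.log10(1.0 + 10.0 ** (ph - pka))
        else:      # base: neutral fraction shrinks as pH drops below pKa
            return log_p - math.log10(1.0 + 10.0 ** (pka - ph))

    # hypothetical example: an acid with log P = 2.0 and pKa = 5.0 at pH 7.4
    print(round(log_d(2.0, 5.0, acid=True), 2))   # about -0.40
    ```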

  3. Maintaining Breast Cancer Specimen Integrity and Individual or Simultaneous Extraction of Quality DNA, RNA, and Proteins from Allprotect-Stabilized and Nonstabilized Tissue Samples

    LENUS (Irish Health Repository)

    Mee, Blanaid C.

    2011-12-29

    The Saint James's Hospital Biobank was established in 2008, to develop a high-quality breast tissue BioResource, as a part of the breast cancer clinical care pathway. The aims of this work were: (1) to ascertain the quality of RNA, DNA, and protein in biobanked carcinomas and normal breast tissues, (2) to assess the efficacy of AllPrep® (Qiagen) in isolating RNA, DNA, and protein simultaneously, (3) to compare AllPrep with RNEasy® and QIAamp® (both Qiagen), and (4) to examine the effectiveness of Allprotect® (Qiagen), a new tissue stabilization medium in preserving DNA, RNA, and proteins. One hundred eleven frozen samples of carcinoma and normal breast tissue were analyzed. Tumor and normal tissue morphology were confirmed by frozen sections. Tissue type, tissue treatment (Allprotect vs. no Allprotect), extraction kit, and nucleic acid quantification were analyzed by utilizing a 4 factorial design (SPSS PASW 18 Statistics Software®). QIAamp (DNA isolation), AllPrep (DNA, RNA, and Protein isolation), and RNeasy (RNA isolation) kits were assessed and compared. Mean DNA yield and A260/280 values using QIAamp were 33.2 ng/μL and 1.86, respectively, and using AllPrep were 23.2 ng/μL and 1.94. Mean RNA yield and RNA Integrity Number (RIN) values with RNeasy were 73.4 ng/μL and 8.16, respectively, and with AllPrep were 74.8 ng/μL and 7.92. Allprotect-treated tissues produced higher RIN values of borderline significance (P=0.055). No discernible loss of RNA stability was detected after 6 h incubation of stabilized or nonstabilized tissues at room temperature or 4°C or in 9 freeze-thaw cycles. Allprotect requires further detailed evaluation, but we consider AllPrep to be an excellent option for the simultaneous extraction of RNA, DNA, and protein from tumor and normal breast tissues. The essential presampling procedures that maintain the diagnostic integrity of pathology specimens do not appear to compromise the quality of molecular isolates.

  4. High-Throughput Analysis With 96-Capillary Array Electrophoresis and Integrated Sample Preparation for DNA Sequencing Based on Laser Induced Fluorescence Detection

    Energy Technology Data Exchange (ETDEWEB)

    Xue, Gang [Iowa State Univ., Ames, IA (United States)

    2001-01-01

    The purpose of this research was to improve the fluorescence detection for the multiplexed capillary array electrophoresis, extend its use beyond the genomic analysis, and to develop an integrated micro-sample preparation system for high-throughput DNA sequencing. The authors first demonstrated multiplexed capillary zone electrophoresis (CZE) and micellar electrokinetic chromatography (MEKC) separations in a 96-capillary array system with laser-induced fluorescence detection. Migration times of four kinds of fluoresceins and six polyaromatic hydrocarbons (PAHs) are normalized to one of the capillaries using two internal standards. The relative standard deviations (RSD) after normalization are 0.6-1.4% for the fluoresceins and 0.1-1.5% for the PAHs. Quantitative calibration of the separations based on peak areas is also performed, again with substantial improvement over the raw data. This opens up the possibility of performing massively parallel separations for high-throughput chemical analysis for process monitoring, combinatorial synthesis, and clinical diagnosis. The authors further improved the fluorescence detection by step laser scanning. A computer-controlled galvanometer scanner is adapted for scanning a focused laser beam across a 96-capillary array for laser-induced fluorescence detection. The signal at a single photomultiplier tube is temporally sorted to distinguish among the capillaries. The limit of detection for fluorescein is 3 x 10^-11 M (S/N = 3) for 5 mW of total laser power scanned at 4 Hz. The observed cross-talk among capillaries is 0.2%. Advantages include the efficient utilization of light due to the high duty-cycle of step scan, good detection performance due to the reduction of stray light, ruggedness due to the small mass of the galvanometer mirror, low cost due to the simplicity of components, and flexibility due to the independent paths for excitation and emission.
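
    The abstract states that migration times are normalized to one reference capillary using two internal standards. A minimal sketch of such a two-point (linear) normalization is given below; the mapping and all numbers are illustrative assumptions, not the authors' exact calibration procedure.

    ```python
    # Illustrative sketch of normalizing migration times to a reference
    # capillary using two internal standards (IS1, IS2). The mapping assumed
    # here is a simple linear interpolation between the two standards; all
    # numbers are invented for illustration.
    def normalize(t, is1, is2, is1_ref, is2_ref):
        """Map a migration time t from one capillary onto the reference
        capillary's time axis via the two internal standards."""
        scale = (is2_ref - is1_ref) / (is2 - is1)
        return is1_ref + (t - is1) * scale

    # reference capillary: IS1 at 100 s, IS2 at 200 s
    # another capillary:   IS1 at 104 s, IS2 at 212 s, analyte at 160 s
    print(round(normalize(160.0, 104.0, 212.0, 100.0, 200.0), 1))  # -> 151.9 s
    ```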

  5. Preserving Samples and Their Scientific Integrity — Insights into MSR from the Astromaterials Acquisition and Curation Office at NASA Johnson Space Center

    Science.gov (United States)

    Calaway, M. J.; Regberg, A. B.; Mitchell, J. L.; Fries, M. D.; Zeigler, R. A.; McCubbin, F. M.; Harrington, A. D.

    2018-04-01

    Rigorous collection of samples for contamination knowledge, the information gained from the characterization of reference materials and witness plates in concurrence with sample return, is essential for MSR mission success.

  6. Subsurface seeding of surface harmful algal blooms observed through the integration of autonomous gliders, moored environmental sample processors, and satellite remote sensing in southern California

    KAUST Repository

    Seegers, Bridget N.; Birch, James M.; Marin, Roman; Scholin, Chris A.; Caron, David A.; Seubert, Erica L.; Howard, Meredith D. A.; Robertson, George L.; Jones, Burton

    2015-01-01

    effluent plumes, and other processes. Multi-month Webb Slocum glider deployments combined with MBARI environmental sample processors (ESPs), weekly pier sampling, and ocean color data provided a multidimensional characterization of the development

  7. 199. Disrupted Integration in Early Psychosis: A Preliminary Exploration of the Relationship Between Neural Synchronization and Higher Order Cognition in a First-Episode Psychosis Sample.

    Science.gov (United States)

    Leonhardt, Bethany; Vohs, Jennifer; Lysaker, Paul; Bartolomeo, Lisa; O’Donnell, Brian; Breier, Alan

    2017-01-01

    Background: Disruptions in the ability to integrate information into the complex ideas needed to make sense of and recover from psychiatric challenges are considered a core source of dysfunction in schizophrenia spectrum disorders (SSD). These disruptions are believed to take place at the level of basic brain functioning, through neural synchrony and neurocognitive functioning in which information is encountered, encoded and made available for memory, and at the level of higher order cognition in which ideas are formed and reflected upon. In this study, we sought to explore the link between difficulties in integration at the level of basic brain functioning and integration at the level of self-reflectivity and insight in first-episode patients. The role of disrupted integration has particular importance in early phases of illness, as it may impact the likelihood that an individual is able to move toward recovery. As more work is done in early intervention in SSD, it is pivotal that underlying factors that impact the ability to recover are investigated. Methods: To assess the ability to integrate information at the level of basic brain function we used electroencephalography (EEG) collected using an Auditory Steady State Response (ASSR) paradigm, and the Brief Assessment of Cognition in Schizophrenia (BACS). To assess integration at the level of conscious reflection we used the Metacognition Assessment Scale-Abbreviated (MAS-A), and to assess insight we used the Scale to Assess Awareness of Mental Disorders (SUMD). Participants were 14 adults with first-episode psychosis. Results: Pearson correlations were calculated to assess the relationship of EEG power across a range of frequency bands and neurocognition with MAS-A total scores and SUMD insight score. These revealed that the MAS-A total score was significantly negatively correlated with gamma activity, and was positively correlated with BACS total score. SUMD insight was significantly positively correlated with gamma activity, and negatively

  8. POLAR ORGANIC CHEMICAL INTEGRATIVE SAMPLING AND LIQUID CHROMATOGRAPHY-ELECTROSPRAY/ION-TRAP MASS SPECTROMETRY FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS

    Science.gov (United States)

    The purpose of the research presented in this paper is two-fold: (1) to demonstrate the coupling of two state-of-the-art techniques: a time-weighted polar organic integrative sampler (POCIS) and micro-liquid chromatography-electrospray/ion trap mass spectrometry (u-LC-ES/ITMS...

  9. Analysis of cryoprepared, non-dehydrated sample systems by means of a newly developed ToF-SIMS instrument with integrated high-vacuum cutting apparatus

    International Nuclear Information System (INIS)

    Moeller, Joerg

    2008-01-01

    The aim of the present thesis was to construct an analysis apparatus that allows cryocutting or cryocracking preparation of cryofixed samples under vacuum conditions, followed by ToF-SIMS analysis.

  10. Pressure-driven one-step solid phase-based on-chip sample preparation on a microfabricated plastic device and integration with flow-through polymerase chain reaction (PCR).

    Science.gov (United States)

    Tran, Hong Hanh; Trinh, Kieu The Loan; Lee, Nae Yoon

    2013-10-01

    In this study, we fabricate a monolithic poly(methyl methacrylate) (PMMA) microdevice on which solid phase-based DNA preparation and flow-through polymerase chain reaction (PCR) units were functionally integrated for one-step sample preparation and amplification operated by pressure. Chelex resin, which is used as a solid support for DNA preparation, captures denatured proteins but releases DNA, and the purified DNA can then be used as a template in a subsequent amplification process. Using the PMMA microdevices, DNA was successfully purified from both Escherichia coli and human hair samples, and the plasmid vector inserted in E. coli and the D1S80 locus in human genomic DNA were successfully amplified from the on-chip purified E. coli and human hair samples. Furthermore, the integration potential of the proposed sample preparation and flow-through PCR units was successfully demonstrated on a monolithic PMMA microdevice with a seamless flow, which could pave the way for pressure-driven, simple one-step sample preparation and amplification with greatly decreased manufacturing cost and enhanced device disposability. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Note: A 102 dB dynamic-range charge-sampling readout for ionizing particle/radiation detectors based on an application-specific integrated circuit (ASIC)

    Science.gov (United States)

    Pullia, A.; Zocca, F.; Capra, S.

    2018-02-01

    An original technique for the measurement of charge signals from ionizing particle/radiation detectors has been implemented in an application-specific integrated circuit form. The device performs linear measurements of the charge both within and beyond its output voltage swing. The device features an unprecedented spectroscopic dynamic range of 102 dB and is suitable for high-resolution ion and X-γ ray spectroscopy. We believe that this approach may change a widespread paradigm according to which no high-resolution spectroscopy is possible when working close to or beyond the limit of the preamplifier's output voltage swing.

  12. Integral test of JENDL-3.2 data by re-analysis of sample reactivity measurements at SEG and STEK facilities

    International Nuclear Information System (INIS)

    Dietze, Klaus

    2001-01-01

    Sample reactivity measurements, which have been performed at the fast-thermal coupled facilities RRR/SEG and STEK, have been re-analyzed using the JNC route for reactor calculation JENDL-3.2 // SLAROM / CITATION / PERKY. C/E-values of central reactivity worths (CRW) of FP nuclides, structural materials, and standards are given. (author)

  13. Sample pretreatment in microsystems

    DEFF Research Database (Denmark)

    Perch-Nielsen, Ivan R.

    2003-01-01

    When a sample, e.g. from a patient, is processed using conventional methods, the sample must be transported to the laboratory where it is analyzed, after which the results are sent back. By integrating the separate steps of the analysis in a micro total analysis system (μTAS), results can be obtained faster and better, preferably with all the processes from sample to signal moved to the bedside of the patient. Of course there is still much to learn and study in the process of miniaturization. DNA analysis is one process subject to integration. There are roughly three steps in a DNA analysis: sample preparation → DNA amplification → DNA analysis. The overall goal of the project is the integration of as many of these steps as possible. This thesis covers mainly pretreatment in a microchip. Some methods for sample pretreatment have been tested; the most conventional is fluorescence-activated cell sorting.

  14. An integrated geochemical, geophysical and mineralogical study of river sediments in an alpine area and soil samples near a steel plant in Austria

    Science.gov (United States)

    Irfan, M. I.; Meisel, T.

    2012-04-01

    The concentration of nickel and chromium in any part of the ecosystem is of environmental concern, in particular for human health, because some of their species can cause health problems, e.g. dermatitis and cancer. Sediment samples collected from the river Vordernberger Bach (Leoben, Austria) in an alpine region, and soil samples collected in an area adjacent to a steel production unit in the same narrow valley, were investigated. In previous studies a correlation between magnetic susceptibility values and the concentration of nickel and chromium showed that a magnetic susceptibility meter can be used as an in-situ device to point out contaminated areas. The purpose of the whole study is to understand the real (point or diffuse, anthropogenic or geogenic) sources of contamination of soils, water and river sediments through heavy metal deposition. Unseparated, magnetic and non-magnetic fractions of the soil samples were investigated for geochemical and mineralogical aspects with XRF, ICP-MS, EMPA, a Multi-Functional Kappabridge (MFK1) and laser ablation coupled with ICP-MS. A mineralogical study of sediment samples from several sampling points with higher Ni and Cr content was performed. Sediment samples were sieved below 1.4 mm and a concentrate of heavy minerals was then prepared in the field through panning. The concentrated heavy minerals were subjected to heavy liquid separation in the laboratory. Separated magnetic and non-magnetic fractions below 0.71/0.1 mm and with density greater than 2.9 g/cm3 were selected for mineralogical investigation. The abundance of typical anthropogenic particles, e.g. spherical particles, tinder, roasted ores, and iron and steel mill slag, was observed under the microscope. Magnetite (mostly anthropogenic), maghemite, chromspinel, chromite (types I and II), (Ca,Al)-ferrite, wustite, apatite (anthropogenic), olivine mixed crystals, calcium silicate and spinel (anthropogenic) are found in the magnetic fraction. Non-magnetic fractions contain hematite, siderite

  15. Subsurface seeding of surface harmful algal blooms observed through the integration of autonomous gliders, moored environmental sample processors, and satellite remote sensing in southern California

    KAUST Repository

    Seegers, Bridget N.

    2015-04-01

    An observational study was performed in the central Southern California Bight in Spring 2010 to understand the relationship between seasonal spring phytoplankton blooms and coastal processes that included nutrient input from upwelling, wastewater effluent plumes, and other processes. Multi-month Webb Slocum glider deployments combined with MBARI environmental sample processors (ESPs), weekly pier sampling, and ocean color data provided a multidimensional characterization of the development and evolution of harmful algal blooms (HABs). Results from the glider and ESP observations demonstrated that blooms of toxic Pseudo-nitzschia sp. can develop offshore and subsurface prior to their manifestation in the surface layer and/or near the coast. A significant outbreak and surface manifestation of the blooms coincided with periods of upwelling, or other processes that caused shallowing of the pycnocline and subsurface chlorophyll maximum. Our results indicate that subsurface populations can be an important source for “seeding” surface Pseudo-nitzschia HAB events in southern California.

  16. The Integration of Plant Sample Analysis, Laboratory Studies, and Thermodynamic Modeling to Predict Slag-Matte Equilibria in Nickel Sulfide Converting

    Science.gov (United States)

    Hidayat, Taufiq; Shishin, Denis; Grimsey, David; Hayes, Peter C.; Jak, Evgueni

    2018-02-01

    The Kalgoorlie Nickel Smelter (KNS) produces low Fe, low Cu nickel matte in its Peirce-Smith converter operations. To inform process development in the plant, new fundamental data are required on the effect of CaO in slag on the distribution of arsenic between slag and matte. A combination of plant sample analysis, high-temperature laboratory experiments, and thermodynamic modeling was carried out to identify process conditions in the converter and to investigate the effect of slag composition on the chemical behavior of the system. The high-temperature experiments involved re-equilibration of industrial matte-slag-lime samples at 1498 K (1225 °C) and P(SO2) = 0.12 atm on a magnetite/quartz substrate, rapid quenching in water, and direct measurement of phase compositions using electron probe X-ray microanalysis (EPMA) and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). A private thermodynamic database for the Ca-Cu-Fe-Mg-Ni-O-S-Si-(As) system was used together with the FactSage software package to assist in the analysis. Thermodynamic predictions combined with plant sample characterization and the present experimental data provide a quantitative basis for the analysis of the effect of CaO fluxing on the slag-matte thermochemistry during nickel sulfide converting, in particular on the spinel liquidus and the distribution of elements between slag and matte as a function of CaO addition.

  17. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps

    Science.gov (United States)

    Mackey, Sean

    2016-01-01

    Background: We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Objective: Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Methods: Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. Results: The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. Conclusions: The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes. PMID:27261155

  18. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps.

    Science.gov (United States)

    O'Reilly-Shah, Vikas; Mackey, Sean

    2016-06-03

    We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes.

  19. Time-dependent integrity during storage of natural surface water samples for the trace analysis of pharmaceutical products, feminizing hormones and pesticides

    Directory of Open Access Journals (Sweden)

    Prévost Michèle

    2010-04-01

    Full Text Available Monitoring and analysis of trace contaminants such as pharmaceuticals and pesticides require the preservation of the samples before they can be quantified using the appropriate analytical methods. Our objective is to determine the sample shelf life to ensure proper quantification of ultratrace contaminants. To this end, we tested the stability of a variety of pharmaceutical products including caffeine, natural steroids, and selected pesticides under refrigerated storage conditions. The analysis was performed using multi-residue methods based on on-line solid-phase extraction combined with liquid chromatography tandem mass spectrometry (SPE-LC-MS/MS) in the selected reaction monitoring mode. After 21 days of storage, no significant difference in the recoveries was observed compared to day 0 for pharmaceutical products, while for pesticides, significant losses occurred for DIA and simazine after 10 days (14% and 17% reduction, respectively) and a statistically significant decrease in the recovery was noted for cyanazine (78% disappearance). However, the estrogen and progestogen steroids were unstable during storage. The disappearance rates obtained after 21 days of storage vary from 63 to 72% for the feminizing hormones. Overall, pharmaceuticals and pesticides seem to be stable under refrigerated storage for up to about 10 days (except cyanazine), while steroidal hormones can be quite sensitive to degradation and should not be stored for more than a few days.

  20. Boat sampling

    International Nuclear Information System (INIS)

    Citanovic, M.; Bezlaj, H.

    1994-01-01

    This presentation describes the essential boat sampling activities: on-site boat sampling process optimization and qualification; boat sampling of base material (beltline region); boat sampling of weld material (weld No. 4); and problems associated with weld crown variations, RPV shell inner radius tolerance, local corrosion pitting and water clarity. The equipment used for boat sampling is also described. 7 pictures

  1. Graph sampling

    OpenAIRE

    Zhang, L.-C.; Patone, M.

    2017-01-01

    We synthesise the existing theory of graph sampling. We propose a formal definition of sampling in finite graphs, and provide a classification of potential graph parameters. We develop a general approach of Horvitz–Thompson estimation to T-stage snowball sampling, and present various reformulations of some common network sampling methods in the literature in terms of the outlined graph sampling theory.
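
    The Horvitz–Thompson estimation mentioned in this abstract weights each sampled unit by the inverse of its inclusion probability under the sampling design. A minimal sketch with invented values (not tied to the paper's snowball-sampling reformulation):

    ```python
    # Minimal Horvitz-Thompson sketch: estimate a population (or graph) total
    # from a sample by weighting each observed value with the inverse of its
    # inclusion probability pi_i. Values and probabilities below are invented.
    def horvitz_thompson_total(values, inclusion_probs):
        return sum(y / pi for y, pi in zip(values, inclusion_probs))

    sampled_values = [4.0, 7.0, 1.0]           # y_i for the sampled units
    inclusion_probabilities = [0.2, 0.5, 0.1]  # pi_i under the sampling design
    print(horvitz_thompson_total(sampled_values, inclusion_probabilities))  # 44.0
    ```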

  2. Time-integrated activity coefficient estimation for radionuclide therapy using PET and a pharmacokinetic model: A simulation study on the effect of sampling schedule and noise

    Energy Technology Data Exchange (ETDEWEB)

    Hardiansyah, Deni [Medical Radiation Physics/Radiation Protection, Medical Faculty Mannheim, Universitätsmedizin Mannheim, Heidelberg University, Mannheim 68167, Germany and Department of Radiation Oncology, Medical Faculty Mannheim, Universitätsmedizin Mannheim, Heidelberg University, Mannheim 68167 (Germany); Guo, Wei; Glatting, Gerhard, E-mail: gerhard.glatting@medma.uni-heidelberg.de [Medical Radiation Physics/Radiation Protection, Medical Faculty Mannheim, Universitätsmedizin Mannheim, Heidelberg University, Mannheim 68167 (Germany); Kletting, Peter [Department of Nuclear Medicine, Ulm University, Ulm 89081 (Germany); Mottaghy, Felix M. [Department of Nuclear Medicine, University Hospital, RWTH Aachen University, Aachen 52074, Germany and Department of Nuclear Medicine, Maastricht University Medical Center MUMC+, Maastricht 6229 (Netherlands)

    2016-09-15

    Purpose: The aim of this study was to investigate the accuracy of PET-based treatment planning for predicting the time-integrated activity coefficients (TIACs). Methods: The parameters of a physiologically based pharmacokinetic (PBPK) model were fitted to the biokinetic data of 15 patients to derive assumed true parameters and were used to construct true mathematical patient phantoms (MPPs). Biokinetics of 150 MBq 68Ga-DOTATATE-PET was simulated with different noise levels [fractional standard deviation (FSD) 10%, 1%, 0.1%, and 0.01%], and seven combinations of measurements at 30 min, 1 h, and 4 h p.i. PBPK model parameters were fitted to the simulated noisy PET data using population-based Bayesian parameters to construct predicted MPPs. Therapy simulations were performed as 30 min infusion of 90Y-DOTATATE of 3.3 GBq in both true and predicted MPPs. Prediction accuracy was then calculated as relative variability v_organ between TIACs from both MPPs. Results: Large variability values of one time-point protocols [e.g., FSD = 1%, 240 min p.i., v_kidneys = (9 ± 6)%, and v_tumor = (27 ± 26)%] show inaccurate prediction. Accurate TIAC prediction of the kidneys was obtained for the case of two measurements (1 and 4 h p.i.), e.g., FSD = 1%, v_kidneys = (7 ± 3)%, and v_tumor = (22 ± 10)%, or three measurements, e.g., FSD = 1%, v_kidneys = (7 ± 3)%, and v_tumor = (22 ± 9)%. Conclusions: 68Ga-DOTATATE-PET measurements could possibly be used to predict the TIACs of 90Y-DOTATATE when using a PBPK model and population-based Bayesian parameters. The two time-point measurement at 1 and 4 h p.i. with a noise up to FSD = 1% allows an accurate prediction of the TIACs in kidneys.
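
    The prediction accuracy above is reported as a relative variability v_organ between TIACs from the predicted and true phantoms. One plausible reading of that quantity, the mean and standard deviation of the per-phantom relative deviation, is sketched below with invented TIACs; the paper's exact definition may differ.

    ```python
    from statistics import mean, stdev

    def relative_variability(tiac_pred, tiac_true):
        """Per-phantom relative deviation (%) of predicted vs. 'true' TIACs,
        summarized as (mean, SD). This is one plausible reading of v_organ;
        the paper's exact definition may differ."""
        devs = [100.0 * abs(p - t) / t for p, t in zip(tiac_pred, tiac_true)]
        return mean(devs), stdev(devs)

    # invented kidney TIACs (h) for five mathematical patient phantoms
    true_kidneys = [2.1, 1.8, 2.5, 2.0, 1.6]
    pred_kidneys = [2.3, 1.7, 2.6, 1.8, 1.7]
    m, s = relative_variability(pred_kidneys, true_kidneys)
    print(f"v_kidneys = ({m:.0f} ± {s:.0f})%")   # prints (7 ± 3)% for these numbers
    ```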

  3. [A meta-analysis of the impact of sample, kind of outcome measurement and time of follow up on occupational re-integration after vocational retraining].

    Science.gov (United States)

    Streibelt, M; Egner, U

    2012-12-01

    Vocational Rehabilitation (VR) is an essential element of intervention to rehabilitate people with work disability due to chronic diseases. The activities are heterogeneous; of particular importance in the rehabilitation process is vocational retraining. These interventions are cost-intensive, take a long time and have a decisive impact on the main goal, the return to work (RTW). However, vocational retraining is conducted under different settings; the question is to what extent this leads to specific methodical implications of RTW measurement that have an impact on the level of RTW. The analysis concentrated on the main outcome, RTW after vocational retraining. A structured review was conducted of all German-language publications from 2005 to 2010 that report an RTW quota after vocational retraining. The main methodical conditions were: kind of RTW measurement (point-in-time rate vs. cumulative course rate), time of follow-up in years, and sample definition (all participants vs. participants with regular completion of retraining, RC). The impact of these conditions on the level of RTW was estimated using a meta-regression model. The time of follow-up was standardized on the mean (1 year). 20 publications from 10 studies were included in the analysis. In all, 23 RTW quotas were observed, of which 22 were included in the regression model. A positive impact on the level of RTW was identified for reduction of the sample to participants with RC (b=10.04; p=0.001), the time of follow-up (b=5.47; p=0.052) and, by trend, the interaction of the kind of RTW measurement and the time of follow-up (b=6.32; p=0.090). The main effect of the kind of RTW measurement was not significant (p=0.787). The model fit was calculated with an adjusted R2=75.17%. The level of RTW given in the publications is very heterogeneous, but a major part of the variance was explained by the 3 methodical conditions we examined. Based on this, a classification system for RTW measurement

  4. Balanced sampling

    NARCIS (Netherlands)

    Brus, D.J.

    2015-01-01

    In balanced sampling a linear relation between the soil property of interest and one or more covariates with known means is exploited in selecting the sampling locations. Recent developments make this sampling design attractive for statistical soil surveys. This paper introduces balanced sampling

  5. Ensemble Sampling

    OpenAIRE

    Lu, Xiuyuan; Van Roy, Benjamin

    2017-01-01

    Thompson sampling has emerged as an effective heuristic for a broad range of online decision problems. In its basic form, the algorithm requires computing and sampling from a posterior distribution over models, which is tractable only for simple special cases. This paper develops ensemble sampling, which aims to approximate Thompson sampling while maintaining tractability even in the face of complex models such as neural networks. Ensemble sampling dramatically expands on the range of applica...
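
    As a rough illustration of the idea summarized above, the sketch below runs an ensemble-sampling-style agent on a Bernoulli bandit: M simple models are fitted to randomly perturbed data, one model is drawn uniformly each round, and the agent acts greedily with respect to it. The perturbation scheme and all parameters are assumptions made for illustration, not the paper's exact algorithm.

    ```python
    import random

    # Ensemble-sampling-style sketch on a Bernoulli bandit: keep M simple
    # models fitted to randomly perturbed data, pick one model uniformly each
    # round, and act greedily with respect to it. The Gaussian reward
    # perturbation used here is one common choice, not necessarily the paper's.
    random.seed(0)
    true_means = [0.2, 0.5, 0.7]       # unknown arm means (simulation only)
    M, sigma = 10, 0.5                 # ensemble size, perturbation scale
    counts = [[1e-6] * len(true_means) for _ in range(M)]
    sums = [[random.gauss(0.5, 0.5) * 1e-6 for _ in true_means] for _ in range(M)]

    plays = [0] * len(true_means)
    for t in range(2000):
        m = random.randrange(M)                         # sample one ensemble member
        estimates = [s / c for s, c in zip(sums[m], counts[m])]
        arm = max(range(len(estimates)), key=estimates.__getitem__)
        reward = 1.0 if random.random() < true_means[arm] else 0.0
        plays[arm] += 1
        for k in range(M):                              # perturbed update per member
            counts[k][arm] += 1.0
            sums[k][arm] += reward + random.gauss(0.0, sigma)
    print("plays per arm:", plays)                      # the best arm typically dominates
    ```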

  6. Laser sampling

    International Nuclear Information System (INIS)

    Gorbatenko, A A; Revina, E I

    2015-01-01

    The review is devoted to the major advances in laser sampling. The advantages and drawbacks of the technique are considered. Specific features of combinations of laser sampling with various instrumental analytical methods, primarily inductively coupled plasma mass spectrometry, are discussed. Examples of practical implementation of hybrid methods involving laser sampling as well as corresponding analytical characteristics are presented. The bibliography includes 78 references

  7. Comparison of the sampling rates and partitioning behaviour of polar and non-polar contaminants in the polar organic chemical integrative sampler and a monophasic mixed polymer sampler for application as an equilibrium passive sampler.

    Science.gov (United States)

    Jeong, Yoonah; Schäffer, Andreas; Smith, Kilian

    2018-06-15

    In this work, Oasis HLB® beads were embedded in a silicone matrix to make a single-phase passive sampler with a higher affinity for polar and ionisable compounds than silicone alone. The applicability of this mixed polymer sampler (MPS) was investigated for 34 aquatic contaminants (log KOW -0.03 to 6.26) in batch experiments. The influence of flow was investigated by comparing uptake under static and stirred conditions. The sampler characteristics of the MPS were assessed in terms of sampling rates (RS) and sampler-water partition coefficients (KSW), and these were compared to those of the polar organic chemical integrative sampler (POCIS) as a reference kinetic passive sampler. The MPS was characterized as an equilibrium sampler for both polar and non-polar compounds, with faster uptake rates and a shorter time to reach equilibrium than the POCIS. Water flow rate impacted sampling rates by up to a factor of 12 when comparing static and stirred conditions. In addition, the relative accumulation of compounds in the polyethersulfone (PES) membranes versus the inner Oasis HLB sorbent was compared for the POCIS, and ranged from <1% to 83% depending on the analyte properties. This is indicative of a potentially significant lag-phase for less polar compounds within POCIS. The findings of this study can be used to quantitatively describe the partitioning and kinetic behaviour of the MPS and POCIS for a range of aquatic organic contaminants for application in field sampling. Copyright © 2018 Elsevier B.V. All rights reserved.
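
    Sampling rates (RS) and sampler-water partition coefficients (KSW) of the kind reported above are commonly tied together through a one-compartment first-order uptake model. The sketch below shows that generic relation with hypothetical numbers; it is not the calibration model fitted in the paper.

    ```python
    import math

    # Generic one-compartment first-order uptake model often used for passive
    # samplers: C_s(t) = K_sw * C_w * (1 - exp(-k_e * t)), with the sampling
    # rate R_s = k_e * K_sw * V_s. All numbers below are hypothetical.
    K_sw = 500.0      # sampler-water partition coefficient (L/L)
    V_s = 0.005       # sampler volume (L)
    R_s = 0.05        # sampling rate (L/day)
    C_w = 1.0         # water concentration (ug/L)

    k_e = R_s / (K_sw * V_s)                 # exchange rate constant (1/day)
    t95 = math.log(20.0) / k_e               # time to reach 95% of equilibrium

    for t in (1, 5, 10, int(round(t95))):
        C_s = K_sw * C_w * (1.0 - math.exp(-k_e * t))
        print(f"day {t:3d}: sampler concentration = {C_s:6.1f} ug/L")
    print(f"t95 = {t95:.0f} days")
    ```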

  8. SELECTED REQUIREMENTS OF INTEGRATED MANAGEMENT SYSTEMS BASED ON PAS 99 SPECIFICATION

    Directory of Open Access Journals (Sweden)

    Paweł Nowicki

    2013-03-01

    Full Text Available The aim of this research was to analyze the ways of integrating management systems in the food sector. The study involved the documentation, audits, corrective and preventive actions, and management review phases described in the specification PAS 99, which are among the common elements of integrated management systems. Four organizations were selected for the study. The organizations had introduced and certified at least two standardized management systems. It was assumed that the investigated organizations should have implemented the HACCP system. The research was carried out as a case study. The employees responsible for the functioning of the management systems were interviewed in all four organizations. The study was conducted in the form of in-depth interviews based on a pre-prepared script. The scenario was developed based on the PAS 99 guideline. The process of integration of management systems implemented in the studied companies reveals the full compliance of an integrated management system with PAS 99 in the policy area.

  9. Soil sampling

    International Nuclear Information System (INIS)

    Fortunati, G.U.; Banfi, C.; Pasturenzi, M.

    1994-01-01

    This study attempts to survey the problems associated with techniques and strategies of soil sampling. Keeping in mind the well defined objectives of a sampling campaign, the aim was to highlight the most important aspect of representativeness of samples as a function of the available resources. Particular emphasis was given to the techniques and particularly to a description of the many types of samplers which are in use. The procedures and techniques employed during the investigations following the Seveso accident are described. (orig.)

  10. Language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik

    1998-01-01

    This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our sampling method is used with different genetic classifications (Voegelin & Voegelin 1977, Ruhlen 1987, Grimes ed. 1997) and argue that, on the whole, our sampling technique compares favourably with other methods, especially in the case of exploratory research.

  11. Sample preparation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Sample preparation prior to HPLC analysis is certainly one of the most important steps to consider in trace or ultratrace analysis. For many years scientists have tried to simplify the sample preparation process. It is rarely possible to inject a neat liquid sample, or a sample whose preparation is no more complex than dissolution in a given solvent. The latter process alone can remove insoluble materials, which is especially helpful with samples in complex matrices if other interactions do not affect extraction. Here it is very likely that a large number of components will not dissolve and are therefore eliminated by a simple filtration step. In most cases, however, the process of sample preparation is not as simple as dissolution of the component of interest. At times enrichment is necessary, that is, the component of interest is present in a very large volume or mass of material and needs to be concentrated in some manner so that a small volume of the concentrated or enriched sample can be injected into the HPLC. 88 refs

  12. Sampling Development

    Science.gov (United States)

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  13. Environmental sampling

    International Nuclear Information System (INIS)

    Puckett, J.M.

    1998-01-01

    Environmental Sampling (ES) is a technology option that can have application in transparency in nuclear nonproliferation. The basic process is to take a sample from the environment, e.g., soil, water, vegetation, or dust and debris from a surface, and through very careful sample preparation and analysis, determine the types, elemental concentration, and isotopic composition of actinides in the sample. The sample is prepared and the analysis performed in a clean chemistry laboratory (CCL). This ES capability is part of the IAEA Strengthened Safeguards System. Such a Laboratory is planned to be built by JAERI at Tokai and will give Japan an intrinsic ES capability. This paper presents options for the use of ES as a transparency measure for nuclear nonproliferation

  14. Spherical sampling

    CERN Document Server

    Freeden, Willi; Schreiner, Michael

    2018-01-01

    This book presents, in a consistent and unified overview, results and developments in the field of today's spherical sampling, particularly arising in mathematical geosciences. Although the book often refers to original contributions, the authors have made them accessible to (graduate) students and scientists not only from mathematics but also from the geosciences and geoengineering. Building a library of topics in spherical sampling theory, it shows how advances in this theory lead to new discoveries in mathematical, geodetic and geophysical as well as other scientific branches like neuro-medicine. A must-read for everybody working in the area of spherical sampling.

  15. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  16. Fluidic sampling

    International Nuclear Information System (INIS)

    Houck, E.D.

    1992-01-01

    This paper covers the development of the fluidic sampler and its testing in a fluidic transfer system. The major findings of this paper are as follows. Fluidic jet samplers can dependably produce unbiased samples of acceptable volume. The fluidic transfer system with a fluidic sampler in-line will transfer water to a net lift of 37.2-39.9 feet at an average rate of 0.02-0.05 gpm (77-192 cc/min). The fluidic sample system circulation rate compares very favorably with the normal 0.016-0.026 gpm (60-100 cc/min) circulation rate that is commonly produced for this lift and solution with the jet-assisted airlift sample system normally used at ICPP. The volume of the sample taken with a fluidic sampler is dependent on the motive pressure to the fluidic sampler, the sample bottle size, and the fluidic sampler jet characteristics. The fluidic sampler should be supplied with fluid at a motive pressure of 140-150 percent of the peak vacuum-producing motive pressure for the jet in the sampler. Fluidic transfer systems should be operated by emptying a full pumping chamber to nearly empty or empty during the pumping cycle; this maximizes the solution transfer rate.

  17. Integral or integrated marketing

    Directory of Open Access Journals (Sweden)

    Davčik Nebojša

    2006-01-01

    Full Text Available Since its earliest beginnings, marketing theorists and experts have tried to develop efficient business organizations and to raise marketing performance to a higher, business-integrated level. The core issue in this paper is the dialectical and practical dilemma of whether an organization should develop an integrated or an integral marketing approach. The presented company cases, as well as the dialectical and functional explanations of this dilemma, clearly show that integrated marketing is a narrower approach than integral marketing if we take as the focal point a new, unique and complete entity. In integration, the essence is in bringing different parts together, which does not necessarily create a new entity. The key elements in the definition of integral marketing are necessity and holism, i.e. the necessity to develop a new, holistic entity.

  18. Sampling methods

    International Nuclear Information System (INIS)

    Loughran, R.J.; Wallbrink, P.J.; Walling, D.E.; Appleby, P.G.

    2002-01-01

    Methods for the collection of soil samples to determine levels of 137Cs and other fallout radionuclides, such as excess 210Pb and 7Be, will depend on the purposes (aims) of the project, site and soil characteristics, analytical capacity, the total number of samples that can be analysed and the sample mass required. The latter two will depend partly on detector type and capabilities. A variety of field methods have been developed for different field conditions and circumstances over the past twenty years, many of them inherited or adapted from soil science and sedimentology. The use of 137Cs in erosion studies has been widely developed, while the application of fallout 210Pb and 7Be is still developing. Although it is possible to measure these nuclides simultaneously, it is common for experiments to be designed around the use of 137Cs alone. Caesium studies typically involve comparison of the inventories found at eroded or sedimentation sites with that of a 'reference' site. An accurate characterization of the depth distribution of these fallout nuclides is often required in order to apply and/or calibrate the conversion models. However, depending on the tracer involved, the depth distribution, and thus the sampling resolution required to define it, differs. For example, a depth resolution of 1 cm is often adequate when using 137Cs. However, fallout 210Pb and 7Be commonly have very strong surface maxima that decrease exponentially with depth, and fine depth increments are required at or close to the soil surface. Consequently, different depth incremental sampling methods are required when using different fallout radionuclides. Geomorphic investigations also frequently require determination of the depth distribution of fallout nuclides on slopes and depositional sites as well as their total inventories

  19. Integral-preserving integrators

    International Nuclear Information System (INIS)

    McLaren, D I; Quispel, G R W

    2004-01-01

    Ordinary differential equations having a first integral may be solved numerically using one of several methods, with the integral preserved to machine accuracy. One such method is the discrete gradient method. It is shown here that the order of the method can be bootstrapped repeatedly to higher orders of accuracy. The method is illustrated using the Henon-Heiles system. (letter to the editor)
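
The abstract above describes the discrete gradient method only in words. The following is a minimal sketch, assuming the Gonzalez midpoint discrete gradient and the Hénon-Heiles Hamiltonian mentioned in the abstract; the step size, initial condition and fixed-point tolerance are illustrative choices, not taken from the paper.

```python
# Minimal sketch (not the authors' code) of an energy-preserving discrete
# gradient integrator, illustrated on the Henon-Heiles Hamiltonian.
import numpy as np

def H(z):                      # z = (x, y, px, py)
    x, y, px, py = z
    return 0.5 * (px**2 + py**2) + 0.5 * (x**2 + y**2) + x**2 * y - y**3 / 3.0

def gradH(z):
    x, y, px, py = z
    return np.array([x + 2*x*y, y + x**2 - y**2, px, py])

# Constant skew-symmetric structure matrix of the canonical equations z' = S grad H(z)
S = np.block([[np.zeros((2, 2)),  np.eye(2)],
              [-np.eye(2),        np.zeros((2, 2))]])

def discrete_gradient(z0, z1):
    """Gonzalez midpoint discrete gradient: satisfies dg . (z1 - z0) = H(z1) - H(z0)."""
    dz = z1 - z0
    g = gradH(0.5 * (z0 + z1))
    nrm2 = dz @ dz
    if nrm2 == 0.0:
        return g
    return g + (H(z1) - H(z0) - g @ dz) / nrm2 * dz

def step(z, h, tol=1e-13, max_iter=100):
    """One implicit step (z1 - z)/h = S * dg(z, z1), solved by fixed-point iteration."""
    z1 = z.copy()
    for _ in range(max_iter):
        z_new = z + h * S @ discrete_gradient(z, z1)
        done = np.linalg.norm(z_new - z1) < tol
        z1 = z_new
        if done:
            break
    return z1

z = np.array([0.0, 0.1, 0.5, 0.0])   # illustrative initial condition
E0 = H(z)
for _ in range(2000):
    z = step(z, h=0.05)
print("energy drift:", H(z) - E0)     # preserved to roughly solver/round-off accuracy
```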

  20. Interactive Sample Book (ISB)

    DEFF Research Database (Denmark)

    Heimdal, Elisabeth Jacobsen; Lenau, Torben Anker; Guglielmi, Michel

    2009-01-01

    Interactive textiles are still quite an unknown phenomenon to many. It is thus often difficult to communicate what kind of potentials lie within these materials. This is why the ISB project was started (supervisor: Torben A. Lenau), as a practice-based research project... and senses in relation to integrated decoration and function, primarily for indoor applications. The result of the project will be a number of interactive textiles, to be gathered in an interactive sample book (ISB), in a similar way as the sample books of wallpapers one can take home from the shop and choose from. In other words, it is a kind of display material, which in a simple manner can illustrate how different techniques and smart materials work. The sample book should display a number of possibilities where sensor technology, smart materials and textiles are mixed to such an extent that the textile...

  1. Development of SYVAC sampling techniques

    International Nuclear Information System (INIS)

    Prust, J.O.; Dalrymple, G.J.

    1985-04-01

    This report describes the requirements of a sampling scheme for use with the SYVAC radiological assessment model. The constraints on the number of samples that may be taken are considered. The conclusions from earlier studies using the deterministic generator sampling scheme are summarised. The method of Importance Sampling and a High Dose algorithm, which are designed to preferentially sample the high dose region of the parameter space, are reviewed in the light of experience gained from earlier studies and the requirements of site assessment and sensitivity analyses. In addition, the use of an alternative numerical integration method for estimating risk is discussed. It is recommended that the method of Importance Sampling be developed and tested for use with SYVAC. An alternative numerical integration method is not recommended for investigation at this stage but should be the subject of future work. (author)
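
As a companion to the abstract above, here is a generic importance-sampling sketch for a risk-type expectation, biasing the proposal toward the high-dose region and correcting with likelihood weights. It illustrates the technique under review, not the SYVAC code; the Gaussian parameter distribution, the dose model and the shifted proposal are all hypothetical choices.

```python
# Generic importance-sampling sketch for a risk-type expectation E[dose(X)].
import numpy as np

rng = np.random.default_rng(0)

def dose(x):
    # Hypothetical dose model: non-zero only in the upper tail of the parameter
    return np.exp(x) * (x > 2.0)

mu, sigma = 0.0, 1.0            # "true" parameter distribution: X ~ N(mu, sigma)
mu_q = 3.0                      # proposal mean, shifted toward the high-dose region
n = 100_000

# Plain Monte Carlo
x = rng.normal(mu, sigma, n)
mc = dose(x).mean()

# Importance sampling: draw from the shifted proposal, reweight by p(y)/q(y)
y = rng.normal(mu_q, sigma, n)
log_w = -0.5 * ((y - mu) ** 2 - (y - mu_q) ** 2) / sigma**2
is_est = np.mean(np.exp(log_w) * dose(y))

print(f"plain MC: {mc:.4e}   importance sampling: {is_est:.4e}")
```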

  2. Integrated inventory information system

    Digital Repository Service at National Institute of Oceanography (India)

    Sarupria, J.S.; Kunte, P.D.

    The nature of oceanographic data and the management of inventory level information are described in Integrated Inventory Information System (IIIS). It is shown how a ROSCOPO (report on observations/samples collected during oceanographic programme...

  3. Nonadiabatic transition path sampling

    International Nuclear Information System (INIS)

    Sherman, M. C.; Corcelli, S. A.

    2016-01-01

    Fewest-switches surface hopping (FSSH) is combined with transition path sampling (TPS) to produce a new method called nonadiabatic path sampling (NAPS). The NAPS method is validated on a model electron transfer system coupled to a Langevin bath. Numerically exact rate constants are computed using the reactive flux (RF) method over a broad range of solvent frictions that span from the energy diffusion (low friction) regime to the spatial diffusion (high friction) regime. The NAPS method is shown to quantitatively reproduce the RF benchmark rate constants over the full range of solvent friction. Integrating FSSH within the TPS framework expands the applicability of both approaches and creates a new method that will be helpful in determining detailed mechanisms for nonadiabatic reactions in the condensed-phase.

  4. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for which drawing of blood for continuous tests can be life-threatening in their own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites, was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a., lab-on-a-printed-circuit board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central labbased chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  5. An Attempt to Determine the Construct Validity of Measures Hypothesized to Represent an Orientation to Right, Left, or Integrated Hemispheric Brain Function for a Sample of Primary School Children.

    Science.gov (United States)

    Dumbrower, Jule; And Others

    1981-01-01

    This study attempts to obtain evidence of the construct validity of pupil ability tests hypothesized to represent orientation to right, left, or integrated hemispheric function, and of teacher observation subscales intended to reveal behaviors in school setting that were hypothesized to portray preference for right or left brain function. (Author)

  6. Integrated economics

    International Nuclear Information System (INIS)

    Bratton, T.J.

    1992-01-01

    This article offers ideas for evaluating integrated solid waste management systems through the use of a conceptual cost overview. The topics of the article include the integrated solid waste management system; making assumptions about community characteristics, waste generation rates, waste collection responsibility, integrated system components, sizing and economic life of system facilities, system implementation schedule, facility ownership, and system administration; integrated system costs; integrated system revenues; system financing; cost projections; and making decisions

  7. Determinação de ácido acético em amostra de vinagre adulterada com ácido clorídrico - um experimento integrado de titulação potenciométrica e condutométrica Determination of acetic acid in a vinegar sample adulterated with hydrochloric acid - an integrated experiment of potentiometric and conductometric titrations

    Directory of Open Access Journals (Sweden)

    José Vinicius Martins

    2010-01-01

    Full Text Available The determination of acetic acid in an adulterated vinegar sample using simultaneous potentiometric and conductometric titrations was used as an example of an integrated experiment in instrumental analysis. An Excel® spreadsheet, which allows simultaneous data entry and the construction of superimposed experimental curves (conductometric, potentiometric, first- and second-derivative potentiometric curves, and a distribution diagram of the acetate species as a function of pH), was used as a powerful tool to discuss the fundamental concepts involved in each technique and to choose the best of them to quantify, without mutual interference, H3CCOOH and HCl in the adulterated vinegar sample.
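
The distribution diagram mentioned in the abstract follows from the standard alpha-fraction expressions for a monoprotic acid. The sketch below recomputes it in Python rather than Excel; Ka = 1.75e-5 is the usual literature value for acetic acid and is an assumption here, not a value quoted in the paper.

```python
# Acetate species distribution (fractions of CH3COOH and CH3COO-) versus pH.
import numpy as np

Ka = 1.75e-5                      # acid dissociation constant of acetic acid

pH = np.linspace(0, 14, 141)
h = 10.0 ** (-pH)                 # [H+]
alpha_HA = h / (h + Ka)           # fraction present as CH3COOH
alpha_A = Ka / (h + Ka)           # fraction present as CH3COO-

for p in (3.0, 4.76, 7.0):        # pKa of acetic acid is about 4.76
    i = np.argmin(np.abs(pH - p))
    print(f"pH {pH[i]:4.1f}: CH3COOH {alpha_HA[i]:.2f}  CH3COO- {alpha_A[i]:.2f}")
```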

  8. Sampling airborne radioactivity

    International Nuclear Information System (INIS)

    Cohen, B.S.

    1988-01-01

    Radioactive contaminants have historically been considered apart from chemical contaminants because it is their radiological properties that determine their biological and environmental impact. Additionally, they have been regulated by special government agencies concerned with radiological protection. Radioactive contaminants are also distinguished by the specialized and very sensitive methods available for the detection of radioactivity. Measurements of a few thousand atoms per liter are not uncommon. Radiation detectors in common use are gas-filled chambers, scintillation and semiconductor detectors, and the more recently developed thermoluminescent and etched-track detectors. Solid-state nuclear track detectors consist of a large group of inorganic and organic dielectrics which register tracks when traversed by heavy charged particles. They do not respond to light, beta particles or gamma-ray photons and thus provide a very low background system for the detection of extremely low levels of radioactivity. In addition, no power source or electronic equipment is required. Cellulose nitrate detectors are currently in use for long-term integrated sampling of environmental radon. Thermoluminescent dosimeters (TLDs) are crystalline materials in which electrons that have been displaced by an interaction with ionizing radiation become trapped at an elevated energy level and emit visible light when released from that energy level. As with etched-track detectors, no power or electronic equipment is needed for the TLDs at a measurement site, but they respond to alpha, beta and gamma radiation. Thermoluminescent dosimeters are useful for long-term environmental monitoring, and have also been newly incorporated into integrating radon detection systems

  9. Integridade e externalização: estudo exploratório em uma amostra de estudantes de psicologia Integrity and externalizing: an exploratory study in a sample of psychology students

    Directory of Open Access Journals (Sweden)

    Viviane Oliveira Baumgartl

    2009-12-01

    Full Text Available Testes psicológicos que avaliam o construto integridade são amplamente utilizados nos Estados Unidos com o objetivo de tentar prever a ocorrência de comportamentos contraprodutivos no ambiente de trabalho, tais como atrasos, roubos e abuso de substâncias químicas. O presente estudo buscou investigar a relação entre integridade e externalização (fator de personalidade ligado à disposição em apresentar problemas ligados ao controle de impulsos, tendo em vista o fato dos dois construtos estarem relacionados conceitualmente. Participaram da pesquisa 209 estudantes de psicologia, provenientes de duas Universidades de Minas Gerais (pública e particular, que foram submetidos à aplicação de uma versão traduzida e adaptada do teste de integridade Personnel Reaction Blank (PRB e do Inventário de Externalização-100. A investigação da relação entre integridade e externalização indicou uma associação moderada e negativa (r=-0,59 entre os escores globais dos dois instrumentos. Houve, portanto, uma associação entre maior manifestação de comportamentos dignos e honestos e menor manifestação de comportamentos antissociais impulsivos. Sugestões de estudos futuros são apontadas.Psychological tests that assess the construct integrity are widely used in the United States with the aim of preventing the occurrence of counter-productive behaviors in the workplace, such as delays, thefts and chemical substances abuse. The present study aimed to investigate the relation between integrity and externalizing (personality factor that reflects proneness to an array of impulse-control problems considering that both constructs are conceptually related. A total of 209 Psychology students from two universities in Minas Gerais participated in the present study. They were submitted to a translated and adapted version of the test of integrity Personnel Reaction Blank (PRB and to the Externalization Inventory-100. The investigation of the relation

  10. Gauge Integral

    Directory of Open Access Journals (Sweden)

    Coghetto Roland

    2017-10-01

    Full Text Available Some authors have formalized the integral in the Mizar Mathematical Library (MML. The first article in a series on the Darboux/Riemann integral was written by Noboru Endou and Artur Korniłowicz: [6]. The Lebesgue integral was formalized a little later [13] and recently the integral of Riemann-Stieltjes was introduced in the MML by Keiko Narita, Kazuhisa Nakasho and Yasunari Shidama [12].

  11. Dissecting the pathobiology of altered MRI signal in amyotrophic lateral sclerosis: A post mortem whole brain sampling strategy for the integration of ultra-high-field MRI and quantitative neuropathology.

    Science.gov (United States)

    Pallebage-Gamarallage, Menuka; Foxley, Sean; Menke, Ricarda A L; Huszar, Istvan N; Jenkinson, Mark; Tendler, Benjamin C; Wang, Chaoyue; Jbabdi, Saad; Turner, Martin R; Miller, Karla L; Ansorge, Olaf

    2018-03-13

    Amyotrophic lateral sclerosis (ALS) is a clinically and histopathologically heterogeneous neurodegenerative disorder, in which therapy is hindered by the rapid progression of disease and lack of biomarkers. Magnetic resonance imaging (MRI) has demonstrated its potential for detecting the pathological signature and tracking disease progression in ALS. However, the microstructural and molecular pathological substrate is poorly understood and generally defined histologically. One route to understanding and validating the pathophysiological correlates of MRI signal changes in ALS is to directly compare MRI to histology in post mortem human brains. The article delineates a universal whole brain sampling strategy of pathologically relevant grey matter (cortical and subcortical) and white matter tracts of interest suitable for histological evaluation and direct correlation with MRI. A standardised systematic sampling strategy that was compatible with co-registration of images across modalities was established for regions representing phosphorylated 43-kDa TAR DNA-binding protein (pTDP-43) patterns that were topographically recognisable with defined neuroanatomical landmarks. Moreover, tractography-guided sampling facilitated accurate delineation of white matter tracts of interest. A digital photography pipeline at various stages of sampling and histological processing was established to account for structural deformations that might impact alignment and registration of histological images to MRI volumes. Combined with quantitative digital histology image analysis, the proposed sampling strategy is suitable for routine implementation in a high-throughput manner for acquisition of large-scale histology datasets. Proof of concept was determined in the spinal cord of an ALS patient where multiple MRI modalities (T1, T2, FA and MD) demonstrated sensitivity to axonal degeneration and associated heightened inflammatory changes in the lateral corticospinal tract. Furthermore

  12. VECTOR INTEGRATION

    NARCIS (Netherlands)

    Thomas, E. G. F.

    2012-01-01

    This paper deals with the theory of integration of scalar functions with respect to a measure with values in a, not necessarily locally convex, topological vector space. It focuses on the extension of such integrals from bounded measurable functions to the class of integrable functions, proving

  13. Unbiased Sampling and Meshing of Isosurfaces

    KAUST Repository

    Yan, Dongming

    2014-05-07

    In this paper, we present a new technique to generate unbiased samples on isosurfaces. An isosurface, F(x,y,z) = c , of a function, F , is implicitly defined by trilinear interpolation of background grid points. The key idea of our approach is that of treating the isosurface within a grid cell as a graph (height) function in one of the three coordinate axis directions, restricted to where the slope is not too high, and integrating / sampling from each of these three. We use this unbiased sampling algorithm for applications in Monte Carlo integration, Poisson-disk sampling, and isosurface meshing.

  14. Unbiased Sampling and Meshing of Isosurfaces

    KAUST Repository

    Yan, Dongming; Wallner, Johannes; Wonka, Peter

    2014-01-01

    In this paper, we present a new technique to generate unbiased samples on isosurfaces. An isosurface, F(x,y,z) = c , of a function, F , is implicitly defined by trilinear interpolation of background grid points. The key idea of our approach is that of treating the isosurface within a grid cell as a graph (height) function in one of the three coordinate axis directions, restricted to where the slope is not too high, and integrating / sampling from each of these three. We use this unbiased sampling algorithm for applications in Monte Carlo integration, Poisson-disk sampling, and isosurface meshing.
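
The key idea of the two records above - treating the isosurface inside a cell as a height function along the axis where the gradient component dominates, and weighting samples by the area element - can be illustrated with a small rejection sampler. The sketch below uses an analytic sphere and a single box instead of the trilinear cells of the paper, so it is only a schematic illustration of the weighting step.

```python
# Area-uniform sampling of an isosurface F = c treated as a height function
# z = h(x, y) wherever the z-derivative dominates (so the slope is bounded),
# using rejection with weight sqrt(1 + hx^2 + hy^2).
import numpy as np

rng = np.random.default_rng(1)

def F(x, y, z):
    return x * x + y * y + z * z

def gradF(x, y, z):
    return np.array([2 * x, 2 * y, 2 * z])

c = 1.0
lo, hi = 0.0, 1.2                 # box [lo,hi]^3; a patch of the unit sphere lies inside

def solve_z(x, y):
    """Solve F(x, y, z) = c for z in [lo, hi] by bisection (F is increasing in z here)."""
    a, b = lo, hi
    if (F(x, y, a) - c) * (F(x, y, b) - c) > 0:
        return None               # isosurface does not cross this vertical line
    for _ in range(60):
        m = 0.5 * (a + b)
        if (F(x, y, a) - c) * (F(x, y, m) - c) <= 0:
            b = m
        else:
            a = m
    return 0.5 * (a + b)

def sample_point():
    while True:
        x, y = rng.uniform(lo, hi, 2)
        z = solve_z(x, y)
        if z is None:
            continue
        gx, gy, gz = gradF(x, y, z)
        if abs(gz) < max(abs(gx), abs(gy)):
            continue              # z-direction not dominant: slope too high here
        hx, hy = -gx / gz, -gy / gz                    # slopes of the height function
        area_elem = np.sqrt(1.0 + hx * hx + hy * hy)
        if rng.uniform() < area_elem / np.sqrt(3.0):   # bound holds since |hx|, |hy| <= 1
            return np.array([x, y, z])

pts = np.array([sample_point() for _ in range(5)])
print(pts)
```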

  15. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: Abstract; Introduction; Concepts of Population, Sample, and Sampling. Initial Ramifications: Abstract; Introduction; Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes; Appendix to Chapter 2: More on Equal Probability Sampling; Horvitz-Thompson Estimator; Sufficiency; Likelihood; Non-Existence Theorem. More Intricacies: Abstract; Introduction; Unequal Probability Sampling Strategies; PPS Sampling. Exploring Improved Ways: Abstract; Introduction; Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling. Modeling: Introduction; Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...
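
Since the table of contents above lists the Horvitz-Thompson estimator, a brief sketch of that estimator may help: it divides each sampled value by its known inclusion probability to give an unbiased estimate of the population total. The population, the size measure and the Poisson sampling design below are invented for illustration.

```python
# Horvitz-Thompson estimate of a population total under unequal-probability sampling.
import numpy as np

rng = np.random.default_rng(2)

y = rng.gamma(shape=2.0, scale=50.0, size=1000)            # hypothetical population values
size_measure = np.clip(y + rng.normal(0, 10, size=y.size), 1.0, None)
pi = np.minimum(1.0, 100 * size_measure / size_measure.sum())  # inclusion probabilities, E[n] ~ 100

# Poisson sampling: each unit enters the sample independently with probability pi_i
in_sample = rng.uniform(size=y.size) < pi
ht_total = np.sum(y[in_sample] / pi[in_sample])            # Horvitz-Thompson estimator

print(f"true total : {y.sum():.1f}")
print(f"HT estimate: {ht_total:.1f}   (sample size n = {in_sample.sum()})")
```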

  16. Systematic sampling with errors in sample locations

    DEFF Research Database (Denmark)

    Ziegel, Johanna; Baddeley, Adrian; Dorph-Petersen, Karl-Anton

    2010-01-01

    Systematic sampling of points in continuous space is widely used in microscopy and spatial surveys. Classical theory provides asymptotic expressions for the variance of estimators based on systematic sampling as the grid spacing decreases. However, the classical theory assumes that the sample grid is exactly periodic; real physical sampling procedures may introduce errors in the placement of the sample points. This paper studies the effect of errors in sample positioning on the variance of estimators in the case of one-dimensional systematic sampling. First we sketch a general approach to variance analysis using point process methods. We then analyze three different models for the error process, calculate exact expressions for the variances, and derive asymptotic variances. Errors in the placement of sample points can lead to substantial inflation of the variance, dampening of zitterbewegung...
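
A quick Monte Carlo experiment makes the effect described above concrete: jittering the positions of a one-dimensional systematic sample inflates the variance of the resulting integral estimate. The test function, the spacing and the iid Gaussian error model are illustrative assumptions, not the three error models analysed in the paper.

```python
# Systematic sampling of a 1-D integral with and without position errors.
import numpy as np

rng = np.random.default_rng(3)

f = lambda x: np.exp(-x) * (1 + 0.3 * np.sin(20 * x))   # integrand on [0, 1]
n = 20
T = 1.0 / n                                              # grid spacing

def systematic_estimate(jitter_sd):
    u = rng.uniform(0, T)                                # random start
    x = (u + T * np.arange(n) + rng.normal(0, jitter_sd, n)) % 1.0  # jittered, wrapped
    return T * f(x).sum()                                # spacing * sum estimates the integral

reps = 20_000
for sd in (0.0, 0.1 * T, 0.5 * T):
    est = np.array([systematic_estimate(sd) for _ in range(reps)])
    print(f"jitter sd = {sd:.4f}:  mean = {est.mean():.4f}  variance = {est.var():.2e}")
```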

  17. Interorganisational Integration

    DEFF Research Database (Denmark)

    Lyngsø, Anne Marie; Godtfredsen, Nina Skavlan; Frølich, Anne

    2016-01-01

    INTRODUCTION: Despite many initiatives to improve coordination of patient pathways and intersectoral cooperation, Danish health care is still fragmented, lacking intra- and interorganisational integration. This study explores barriers to and facilitators of interorganisational integration...... at a university hospital in the Capital Region of Denmark. RESULTS AND DISCUSSION: Our results can be grouped into five influencing areas for interorganisational integration: communication/information transfer, committed leadership, patient engagement, the role and competencies of the general practitioner...... and organisational culture. Proposed solutions to barriers in each area hold the potential to improve care integration as experienced by individuals responsible for supporting and facilitating it. Barriers and facilitators to integrating care relate to clinical, professional, functional and normative integration...

  18. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2011-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital

  19. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2010-01-01

    A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue to digital

  20. Vertical integration

    International Nuclear Information System (INIS)

    Antill, N.

    1999-01-01

    This paper focuses on the trend in international energy companies towards vertical integration in the gas chain from wellhead to power generation, horizontal integration in refining and marketing businesses, and the search for larger projects with lower upstream costs. The shape of the petroleum industry in the next millennium, the creation of super-major oil companies, and the relationship between size and risk are discussed. The dynamics of vertical integration, present events and future developments are considered. (UK)

  1. Integral equations

    CERN Document Server

    Moiseiwitsch, B L

    2005-01-01

    Two distinct but related approaches hold the solutions to many mathematical problems--the forms of expression known as differential and integral equations. The method employed by the integral equation approach specifically includes the boundary conditions, which confers a valuable advantage. In addition, the integral equation approach leads naturally to the solution of the problem--under suitable conditions--in the form of an infinite series.Geared toward upper-level undergraduate students, this text focuses chiefly upon linear integral equations. It begins with a straightforward account, acco

  2. The effect of integrated reporting on integrated thinking between risk ...

    African Journals Online (AJOL)

    IIRC (2013b: 3), integrated thinking takes into account the connectivity and ... historical information and provides investors and other stakeholders with .... in the disclosure of risks and opportunities by using a sample of the top 100 JSE-.

  3. Integration of paper-based microarray and time-of-flight secondary ion mass spectrometry (ToF-SIMS) for parallel detection and quantification of molecules in multiple samples automatically.

    Science.gov (United States)

    Chu, Kuo-Jui; Chen, Po-Chun; You, Yun-Wen; Chang, Hsun-Yun; Kao, Wei-Lun; Chu, Yi-Hsuan; Wu, Chen-Yi; Shyue, Jing-Jong

    2018-04-16

    With its low-cost fabrication and ease of modification, paper-based analytical devices have developed rapidly in recent years. Microarrays allow automatic analysis of multiple samples or multiple reactions with minimal sample consumption. While cellulose paper is generally used, its high background in spectrometry outside the visible range has limited its application mostly to colorimetric analysis. In this work, glass-microfiber paper is used as the substrate for a microarray. The glass microfiber is essentially chemically inert SiOx, and the lower background from this inorganic microfiber avoids interfering with organic analytes in various spectrometers. However, the generally used wax printing fails to wet glass microfibers to form hydrophobic barriers. Therefore, to prepare the hydrophobic-hydrophilic pattern, the glass-microfiber paper was first modified with an octadecyltrichlorosilane (OTS) self-assembled monolayer (SAM) to make the paper hydrophobic. A hydrophilic microarray was then prepared using a CO2 laser scriber that selectively removed the OTS layer in a designed pattern. One-microliter aqueous drops of peptides at various concentrations were then dispensed inside the round patterns where the OTS SAM was removed, while the surrounding area with the OTS layer served as a barrier to separate each drop. The resulting specimen of multiple spots was automatically analyzed with a time-of-flight secondary ion mass spectrometer (ToF-SIMS), and all of the secondary ions were collected. Among the various cluster ions that have been developed over the past decade, pulsed C60+ was selected as the primary ion because of its high secondary ion intensity in the high mass region, its minimal alteration of the surface when operating within the static limit, and its spatial resolution at the ∼μm level. In the resulting spectra, parent ions of various peptides (in the forms [M+H]+ and [M+Na]+) were readily identified for parallel detection of molecules in a mixture

  4. Integrated Design

    DEFF Research Database (Denmark)

    Lenau, Torben Anker

    1999-01-01

    A homepage on the internet with course material, lecture plan, student exercises, etc. Continuously updated during the course Integrated Design (80402, 80403).

  5. 21 CFR 211.170 - Reserve samples.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Reserve samples. 211.170 Section 211.170 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL... of deterioration unless visual examination would affect the integrity of the reserve sample. Any...

  6. Organising integration

    DEFF Research Database (Denmark)

    Axelsson, Runo

    2013-01-01

    Background: In Sweden, as in many other countries, there has been a succession of trends in the organisation of health care and other welfare services. These trends have had different implications for the integration of services in the health and welfare system. Aims: One aim is to discuss...... the implications of different organisational trends for the integration of health and welfare services. Another aim is to introduce a Swedish model of financial coordination as a flexible way to organise integration. Organisational trends: In the 1960’s there was an expansion of health and welfare services leading...... an increasing lack of integration in the health and welfare system. In the 2000’s, there has been a re-centralisation through mergers of hospitals, regions and state agencies. It has become clear, however, that mergers do not promote integration but rather increase the bureaucratisation of the system. Model...

  7. Adaptation of a radiofrequency glow discharge optical emission spectrometer (RF-GD-OES) to the analysis of light elements (carbon, nitrogen, oxygen and hydrogen) in solids: glove box integration for the analysis of nuclear samples

    International Nuclear Information System (INIS)

    Hubinois, J.-C.

    2001-01-01

    The purpose of this work is to use radiofrequency glow discharge optical emission spectrometry to quantitatively determine carbon, nitrogen, oxygen and hydrogen at low concentrations (in the ppm range) in nuclear materials. In this study, before the definitive contamination of the system, work was carried out on non-radioactive materials (steel, pure iron, copper and titanium). As the initial apparatus could not deliver an RF power inducing a reproducible discharge and was not adapted to the analysis of light elements: 1- the radiofrequency system had to be changed; 2- the systems controlling gaseous atmospheres had to be improved in order to obtain analytical signals stemming strictly from the sample; 3- three discharge lamps had to be tested and compared in terms of performance; 4- the system for collection of light had to be optimized. The modifications brought to the initial system improved the intensity and stability of the signals, which allowed lower detection limits (1000 times lower for carbon). The latter are in the ppm range for carbon and about a few tens of ppm for nitrogen and oxygen in pure irons. Calibration curves were plotted for materials presenting very different sputtering rates in order to check the existence of a 'function of analytical transfer', with the purpose of compensating for the lack of reference materials certified for light elements at low concentration. Transposition of this type of function to other matrices remains to be checked. Concerning hydrogen, since no usable reference material is available for our technique, materials certified in deuterium (chosen as a surrogate for hydrogen) were studied in order to demonstrate the feasibility of hydrogen analysis. Parallel to this work, results obtained by modeling an RF discharge show that the performance of the lamp can be improved and that the optical system must be strictly adapted to the glow discharge. (author) [fr

  8. The integration of immigrants

    OpenAIRE

    Bauböck, Rainer

    1995-01-01

    from the Table of Contents: Migration and integration - Basic concepts and definitions; Immigration and Integration policies; The legal framework for integration; Dimension of social integration; Cultural integration; Conclusions;

  9. On the Sampling

    OpenAIRE

    Güleda Doğan

    2017-01-01

    This editorial is on statistical sampling, which is one of the two most important reasons for editorial rejection from our journal Turkish Librarianship. The stages of quantitative research, the stage at which sampling takes place, the importance of sampling for a study, deciding on sample size, and sampling methods are summarised briefly.

  10. Information sampling behavior with explicit sampling costs

    Science.gov (United States)

    Juni, Mordechai Z.; Gureckis, Todd M.; Maloney, Laurence T.

    2015-01-01

    The decision to gather information should take into account both the value of information and its accrual costs in time, energy and money. Here we explore how people balance the monetary costs and benefits of gathering additional information in a perceptual-motor estimation task. Participants were rewarded for touching a hidden circular target on a touch-screen display. The target’s center coincided with the mean of a circular Gaussian distribution from which participants could sample repeatedly. Each “cue” — sampled one at a time — was plotted as a dot on the display. Participants had to repeatedly decide, after sampling each cue, whether to stop sampling and attempt to touch the hidden target or continue sampling. Each additional cue increased the participants’ probability of successfully touching the hidden target but reduced their potential reward. Two experimental conditions differed in the initial reward associated with touching the hidden target and the fixed cost per cue. For each condition we computed the optimal number of cues that participants should sample, before taking action, to maximize expected gain. Contrary to recent claims that people gather less information than they objectively should before taking action, we found that participants over-sampled in one experimental condition, and did not significantly under- or over-sample in the other. Additionally, while the ideal observer model ignores the current sample dispersion, we found that participants used it to decide whether to stop sampling and take action or continue sampling, a possible consequence of imperfect learning of the underlying population dispersion across trials. PMID:27429991
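
A simplified expected-gain calculation for the task described above can be written in a few lines: after n cues the target centre is estimated by their mean, whose error is 2-D Gaussian with standard deviation sigma/sqrt(n), and each cue costs a fixed amount. The numbers and the neglect of motor noise are assumptions, so this is an ideal-observer-style sketch rather than the authors' model.

```python
# Expected gain versus number of cues for a hidden circular target.
import numpy as np

sigma = 40.0        # dispersion of the cue distribution (screen pixels, hypothetical)
radius = 15.0       # radius of the hidden circular target (hypothetical)
reward0 = 100.0     # reward for a hit before any cues are bought
cost = 2.0          # fixed cost per additional cue

n = np.arange(1, 41)
# 2-D Gaussian error with per-axis sd sigma/sqrt(n): P(|error| <= r) is the Rayleigh CDF
p_hit = 1.0 - np.exp(-radius**2 * n / (2.0 * sigma**2))
expected_gain = (reward0 - cost * n) * p_hit

best = n[np.argmax(expected_gain)]
print(f"optimal number of cues: {best}, expected gain: {expected_gain.max():.2f}")
```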

  11. Integrative psychotherapy.

    Science.gov (United States)

    Kozarić-Kovacić, Dragica

    2008-09-01

    The main purposes of the article are to present the history of integration in psychotherapy, the reasons for the development of integrative approaches, and the approaches to integration in psychotherapy. Three approaches to integration in psychotherapy exist: theoretical integration, theoretical eclecticism, and common factors in different psychotherapeutic trends. In integrative psychotherapy, the basic epistemology, theory, and clinical practice are based on phenomenology, field theory, holism, dialogue, and the co-creation of dialogue in the therapeutic relationship. The main criticism is that integrative psychotherapy suffers from confusion and many unresolved controversies. It is difficult to theoretically and methodologically define a clinically applied model that is based on such different epistemological and theoretical presumptions. Integrative psychotherapy is a synthesis of humanistic psychotherapy, object relations theory, and psychoanalytical self psychology. It focuses on the dynamics and potentials of human relationships, with a goal of changing the relations and understanding internal and external resistances. The process of integrative psychotherapy is primarily focused on the developmental-relational model and the co-creation of the psychotherapeutic relationship as a single interactive event, which is not unilateral, but rather a joint endeavor by both the therapist and the patient/client. The need for a relationship is an important human need and represents a process of attunement that occurs as a response to the need for a relationship, a unique interpersonal contact between two people. If this need is not met, it manifests in different feelings and various defenses. To meet this need, we need another person with whom we can establish a sensitive, attuned relationship. Thus, the therapist becomes the person who tries to supplement what the person did not receive. Neuroscience can be a source of integration through different therapies. We

  12. Integration of Chandrasekhar's integral equation

    International Nuclear Information System (INIS)

    Tanaka, Tasuku

    2003-01-01

    We solve Chandrasekhar's integral equation for radiative transfer in the plane-parallel atmosphere by iterative integration. The primary thrust in radiative transfer has been to solve the forward problem, i.e., to evaluate the radiance, given the optical thickness and the scattering phase function. In the area of satellite remote sensing, our problem is the inverse problem: to retrieve the surface reflectance and the optical thickness of the atmosphere from the radiance measured by satellites. In order to retrieve the optical thickness and the surface reflectance from the radiance at the top of the atmosphere (TOA), we should express the radiance at TOA 'explicitly' in the optical thickness and the surface reflectance. Chandrasekhar formalized radiative transfer in the plane-parallel atmosphere as a simultaneous integral equation, and he obtained the second approximation. Since then no higher approximation has been reported. In this paper, we obtain the third approximation of the scattering function. We integrate functions derived from the second approximation over the interval from 1 to ∞ in the inverse of the cosine of the zenith angle. We can obtain the indefinite integral rather easily in the form of a series expansion. However, the integrals at the upper limit, ∞, are not yet known to us. We can assess the converged values of those series expansions at ∞ through calculus. For the integration, we choose coupling pairs to avoid unnecessary terms in the outcome of the integral and discover that the simultaneous integral equation can be reduced to a single integral equation. Through algebraic calculation, we obtain the third approximation as a polynomial of the third degree in the atmospheric optical thickness

  13. How Sample Size Affects a Sampling Distribution

    Science.gov (United States)

    Mulekar, Madhuri S.; Siegel, Murray H.

    2009-01-01

    If students are to understand inferential statistics successfully, they must have a profound understanding of the nature of the sampling distribution. Specifically, they must comprehend the determination of the expected value and standard error of a sampling distribution as well as the meaning of the central limit theorem. Many students in a high…
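
A short simulation makes the point concrete: the sampling distribution of the mean narrows as n grows, with standard error sigma/sqrt(n), even for a skewed parent population. The exponential population below is an arbitrary choice for illustration.

```python
# Empirical versus theoretical standard error of the sample mean.
import numpy as np

rng = np.random.default_rng(4)
population_sd = 1.0          # sd of an Exponential(1) population

for n in (5, 25, 100):
    means = rng.exponential(1.0, size=(20000, n)).mean(axis=1)
    print(f"n = {n:3d}: empirical SE = {means.std(ddof=1):.4f}, "
          f"theory sigma/sqrt(n) = {population_sd / np.sqrt(n):.4f}")
```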

  14. Radioactivity in environmental samples

    International Nuclear Information System (INIS)

    Fornaro, Laura

    2001-01-01

    The objective of this practical work is to familiarize the student with radioactivity measurements in environmental samples. For that purpose, the chosen samples were a natural potassium salt, a uranium or thorium salt, and a sample of drinking water

  15. DNA Sampling Hook

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The DNA Sampling Hook is a significant improvement on a method of obtaining a tissue sample from a live fish in situ from an aquatic environment. A tissue sample...

  16. Iowa Geologic Sampling Points

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Point locations of geologic samples/files in the IGS repository. Types of samples include well cuttings, outcrop samples, cores, drillers logs, measured sections,...

  17. Integrated services

    International Nuclear Information System (INIS)

    Chafcouloff, S.; Michel, G.; Trice, M.; Clark, G.; Cosad, C.; Forbes, K.

    1995-01-01

    Integrated services is the name given to several services grouped together under a single contract. Four key factors determine the success of integrated services projects: teamwork, common objectives, technology, and shared benefits. For oil companies, integration means smoother, more efficient operations by bringing service companies on board as part of the team. For the service industry, it means a radical change in the way business is conducted, taking on more responsibility in return for greater incentives. This article reviews the need for change and the approach Schlumberger has adopted to meet this challenge. 20 figs., 20 refs

  18. Integrated services

    Energy Technology Data Exchange (ETDEWEB)

    Chafcouloff, S.; Michel, G.; Trice, M. [Schlumberger Integrated Project Management Group, Montrouge (France); Clark, G. [Schlumberger Testing Services, Aberdeen (United Kingdom); Cosad, C.; Forbes, K. [Schlumberger Integrated Project Management Group, Aberdeen (United Kingdom)

    1995-12-31

    Integrated services is the name given to several services grouped together under a single contract. Four key factors determine the success of integrated services projects: teamwork, common objectives, technology, and shared benefits. For oil companies, integration means smoother, more efficient operations by bringing service companies on board as part of the team. For the service industry, it means a radical change in the way business is conducted, taking on more responsibility in return for greater incentives. This article reviews the need for change and the approach Schlumberger has adopted to meet this challenge. 20 figs., 20 refs

  19. Network and adaptive sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Combining the two statistical techniques of network sampling and adaptive sampling, this book illustrates the advantages of using them in tandem to effectively capture sparsely located elements in unknown pockets. It shows how network sampling is a reliable guide in capturing inaccessible entities through linked auxiliaries. The text also explores how adaptive sampling is strengthened in information content through subsidiary sampling with devices to mitigate unmanageable expanding sample sizes. Empirical data illustrates the applicability of both methods.

  20. Integrating Nephelometer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Uin, J. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The Integrating Nephelometer (Figure 1) is an instrument that measures aerosol light scattering. It measures aerosol optical scattering properties by detecting (with a wide angular integration – from 7 to 170°) the light scattered by the aerosol and subtracting the light scattered by the carrier gas, the instrument walls and the background noise in the detector (zeroing). Zeroing is typically performed for 5 minutes every day at midnight UTC. The scattered light is split into red (700 nm), green (550 nm), and blue (450 nm) wavelengths and captured by three photomultiplier tubes. The instrument can measure total scatter as well as backscatter only (from 90 to 170°) (Heintzenberg and Charlson 1996; Anderson et al. 1996; Anderson and Ogren 1998; TSI 3563 2015) At ARM (Atmospheric Radiation Measurement), two identical Nephelometers are usually run in series with a sample relative humidity (RH) conditioner between them. This is possible because Nephelometer sampling is non-destructive and the sample can be passed on to another instrument. The sample RH conditioner scans through multiple RH values in cycles, treating the sample. This kind of setup allows to study how aerosol particles’ light scattering properties are affected by humidification (Anderson et al. 1996). For historical reasons, the two Nephelometers in this setup are labeled “wet” and “dry”, with the “dry” Nephelometer usually being the one before the conditioner and sampling ambient air (the names are switched for the MAOS measurement site due to the high RH of the ambient air).
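
The zero-subtraction step described in the handbook can be summarised in a few lines of code: the aerosol scattering coefficient at each wavelength is the measured total signal minus the zero (filtered-air) baseline. The numerical values below are invented for illustration and are not instrument data.

```python
# Aerosol scattering coefficient = total signal - zero baseline, per wavelength.
# Units here are inverse megametres (Mm^-1); all numbers are made up.
wavelengths = ("blue 450 nm", "green 550 nm", "red 700 nm")
total_signal = {"blue 450 nm": 41.2, "green 550 nm": 33.5, "red 700 nm": 24.1}
zero_baseline = {"blue 450 nm": 27.6, "green 550 nm": 12.4, "red 700 nm": 4.6}

for wl in wavelengths:
    sigma_sp = total_signal[wl] - zero_baseline[wl]   # aerosol scattering coefficient
    print(f"{wl}: sigma_sp = {sigma_sp:.1f} Mm^-1")
```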

  1. Sample of learners and teachers perception on the implementation of the affectivity and integral sexuality in the Costa Rican Educational Program Percepción de una muestra de educandos y docentes sobre la implementación del programa educación para la afectividad y la sexualidad integral

    Directory of Open Access Journals (Sweden)

    Giselle León León

    2013-04-01

    Full Text Available The case study focuses on the relevance of the implementation of the Affectivity and Integral Sexuality in the Costa Rican Educational Program (EAIS). This program was proposed by the Ministry of Public Education (MEP) in 2013. It was implemented using a mixed approach under a dominant mixed classification. To gather data for the different study categories (knowledge of the topic, teaching models, and methodological strategies), a questionnaire was applied to nineteen students and semi-structured interviews were carried out with five Science teachers from a high school within the Costa Rican Metropolitan Area. Some contributions were emphasized in matrices (charts) and contrasted through triangulation of both theory and participants. Among the main conclusions stands out the relevance of programs for affectivity and integral sexuality supported by professionals in education and by the learning population of the study. In addition, it was identified that the participants’ (learners and instructors) knowledge of the topic of the study was merely biologically oriented. Recibido 12 de enero de 2013 • Corregido 08 de marzo de 2013 • Aceptado 13 de marzo de 2013. La investigación versa sobre la pertinencia de la implementación del programa Educación para la Afectividad y la Sexualidad Integral (EASI) propuesto por el Ministerio de Educación Pública de Costa Rica (MEP), a partir del año 2013. Esta se realizó con el enfoque mixto, en la clasificación de mixto dominante. Para recopilar la información de las categorías del estudio (nivel de conocimiento del tema, modelos de enseñanza y estrategias metodológicas), se aplicó un cuestionario a diecinueve estudiantes (11 de sétimo y 8 de noveno año) y se realizaron entrevistas semi-estructuradas a 5 docentes de enseñanza de las ciencias de un colegio diurno del área metropolitana costarricense. Algunos de los aportes fueron resaltados en matrices (tablas) y contrastados mediante la triangulación de

  2. INTEGRATED EDUCATION

    Directory of Open Access Journals (Sweden)

    Lioara-Bianca BUBOIU

    2015-04-01

    Full Text Available Accepting and valuing people with disabilities is a key aspect of social policies promoted worldwide. The implementation of these policies aims to normalize the lives of people with disabilities through full integration into the society to which they belong. Removing discrimination and social barriers equates to a maturing of society, a maturing expressed in accepting the diversity that surrounds us. Each person must be appreciated at their true value regardless of their condition of normality or deviation from it. Valuing individuals can be achieved only through full acceptance in society, by assigning statuses and fulfilling social roles. School integration of children with special educational needs in mainstream education is a challenge and involves many aspects if it is to be successful. It is the premise of social integration, the basis for future socio-professional insertion. Integrated education is the first step towards a world of equal opportunities, a world without discrimination.

  3. Integral cryptanalysis

    DEFF Research Database (Denmark)

    Knudsen, Lars Ramkilde; Wagner, David

    2002-01-01

    This paper considers a cryptanalytic approach called integral cryptanalysis. It can be seen as a dual to differential cryptanalysis and applies to ciphers not vulnerable to differential attacks. The method is particularly applicable to block ciphers which use bijective components only.

  4. Systems integration.

    Science.gov (United States)

    Siemieniuch, C E; Sinclair, M A

    2006-01-01

    The paper presents a view of systems integration, from an ergonomics/human factors perspective, emphasising the process of systems integration as is carried out by humans. The first section discusses some of the fundamental issues in systems integration, such as the significance of systems boundaries, systems lifecycle and systems entropy, issues arising from complexity, the implications of systems immortality, and so on. The next section outlines various generic processes for executing systems integration, to act as guides for practitioners. These address both the design of the system to be integrated and the preparation of the wider system in which the integration will occur. Then the next section outlines some of the human-specific issues that would need to be addressed in such processes; for example, indeterminacy and incompleteness, the prediction of human reliability, workload issues, extended situation awareness, and knowledge lifecycle management. For all of these, suggestions and further readings are proposed. Finally, the conclusions section reiterates in condensed form the major issues arising from the above.

  5. Functional Integration

    Science.gov (United States)

    Cartier, Pierre; DeWitt-Morette, Cecile

    2010-06-01

    Acknowledgements; List symbols, conventions, and formulary; Part I. The Physical and Mathematical Environment: 1. The physical and mathematical environment; Part II. Quantum Mechanics: 2. First lesson: gaussian integrals; 3. Selected examples; 4. Semiclassical expansion: WKB; 5. Semiclassical expansion: beyond WKB; 6. Quantum dynamics: path integrals and operator formalism; Part III. Methods from Differential Geometry: 7. Symmetries; 8. Homotopy; 9. Grassmann analysis: basics; 10. Grassmann analysis: applications; 11. Volume elements, divergences, gradients; Part IV. Non-Gaussian Applications: 12. Poisson processes in physics; 13. A mathematical theory of Poisson processes; 14. First exit time: energy problems; Part V. Problems in Quantum Field Theory: 15. Renormalization 1: an introduction; 16. Renormalization 2: scaling; 17. Renormalization 3: combinatorics; 18. Volume elements in quantum field theory Bryce DeWitt; Part VI. Projects: 19. Projects; Appendix A. Forward and backward integrals: spaces of pointed paths; Appendix B. Product integrals; Appendix C. A compendium of gaussian integrals; Appendix D. Wick calculus Alexander Wurm; Appendix E. The Jacobi operator; Appendix F. Change of variables of integration; Appendix G. Analytic properties of covariances; Appendix H. Feynman's checkerboard; Bibliography; Index.

  6. Microfabricated Devices for Sample Extraction, Concentrations, and Related Sample Processing Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Gang; Lin, Yuehe

    2006-12-01

    This is an invited book chapter. As with other analytical techniques, sample pretreatment, sample extraction, sample introduction, and related techniques are of extreme importance for micro-electro-mechanical systems (MEMS). Bio-MEMS devices and systems start with a sampling step. The biological sample then usually undergoes some kind of sample preparation before the actual analysis. These steps may involve extracting the target analyte from its matrix, removing interferences from the sample, derivatizing the sample to detectable species, or performing a sample preconcentration step. The integration of components for sample pretreatment into microfluidic devices represents one of the remaining bottlenecks towards achieving true miniaturized total analysis systems (µTAS). This chapter provides a thorough state-of-the-art review of the developments in this field to date.

  7. Sampling procedures and tables

    International Nuclear Information System (INIS)

    Franzkowski, R.

    1980-01-01

    Characteristics, defects, defectives - Sampling by attributes and by variables - Sample versus population - Frequency distributions for the number of defectives or the number of defects in the sample - Operating characteristic curve, producer's risk, consumer's risk - Acceptable quality level AQL - Average outgoing quality AOQ - Standard ISO 2859 - Fundamentals of sampling by variables for fraction defective. (RW)
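
For the attribute-sampling quantities listed above, a short sketch shows how the operating characteristic curve and the average outgoing quality follow from the binomial distribution. The single sampling plan n = 80, c = 2 is an arbitrary example, not a plan taken from ISO 2859.

```python
# OC curve and AOQ for a single sampling plan (sample size n, acceptance number c).
from math import comb

def prob_accept(p, n=80, c=2):
    """Binomial probability of finding at most c defectives in a sample of n."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(c + 1))

for p in (0.005, 0.01, 0.02, 0.04, 0.08):
    pa = prob_accept(p)                 # point on the operating characteristic curve
    aoq = p * pa                        # average outgoing quality (rejected lots screened)
    print(f"fraction defective {p:.3f}:  Pa = {pa:.3f}   AOQ = {aoq:.4f}")
```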

  8. Effective sample labeling

    International Nuclear Information System (INIS)

    Rieger, J.T.; Bryce, R.W.

    1990-01-01

    Ground-water samples collected for hazardous-waste and radiological monitoring have come under strict regulatory and quality assurance requirements as a result of laws such as the Resource Conservation and Recovery Act. To comply with these laws, the labeling system used to identify environmental samples had to be upgraded to ensure proper handling and to protect collection personnel from exposure to sample contaminants and sample preservatives. The sample label now used at the Pacific Northwest Laboratory is a complete sample document. In the event that other paperwork on a labeled sample is lost, the necessary information can be found on the label

  9. Sampling stored product insect pests: a comparison of four statistical sampling models for probability of pest detection

    Science.gov (United States)

    Statistically robust sampling strategies form an integral component of grain storage and handling activities throughout the world. Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult due to species biology and behavioral characteristics. ...
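
As an illustration of the kind of detection-probability calculation the abstract refers to, the sketch below compares a Poisson model with a negative binomial model for aggregated insect counts. The densities and dispersion parameter are invented; the abstract does not state which four models were compared.

```python
# Probability of detecting at least one insect in n grain samples.
import numpy as np

def p_detect_poisson(mean_per_sample, n_samples):
    # Poisson counts: P(at least one insect across n samples)
    return 1.0 - np.exp(-mean_per_sample * n_samples)

def p_detect_negbin(mean_per_sample, n_samples, k=0.5):
    # Negative binomial with dispersion k: P(count = 0) = (k / (k + m))**k per sample
    p_zero = (k / (k + mean_per_sample)) ** k
    return 1.0 - p_zero ** n_samples

mean_density = 0.05      # mean insects per sample unit (hypothetical)
for n in (5, 10, 20, 50):
    print(f"n = {n:2d}: Poisson {p_detect_poisson(mean_density, n):.3f}   "
          f"neg. binomial {p_detect_negbin(mean_density, n):.3f}")
```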

  10. Enhanced conformational sampling using enveloping distribution sampling.

    Science.gov (United States)

    Lin, Zhixiong; van Gunsteren, Wilfred F

    2013-10-14

    To lessen the problem of insufficient conformational sampling in biomolecular simulations is still a major challenge in computational biochemistry. In this article, an application of the method of enveloping distribution sampling (EDS) is proposed that addresses this challenge and its sampling efficiency is demonstrated in simulations of a hexa-β-peptide whose conformational equilibrium encompasses two different helical folds, i.e., a right-handed 2.7(10∕12)-helix and a left-handed 3(14)-helix, separated by a high energy barrier. Standard MD simulations of this peptide using the GROMOS 53A6 force field did not reach convergence of the free enthalpy difference between the two helices even after 500 ns of simulation time. The use of soft-core non-bonded interactions in the centre of the peptide did enhance the number of transitions between the helices, but at the same time led to neglect of relevant helical configurations. In the simulations of a two-state EDS reference Hamiltonian that envelops both the physical peptide and the soft-core peptide, sampling of the conformational space of the physical peptide ensures that physically relevant conformations can be visited, and sampling of the conformational space of the soft-core peptide helps to enhance the transitions between the two helices. The EDS simulations sampled many more transitions between the two helices and showed much faster convergence of the relative free enthalpy of the two helices compared with the standard MD simulations with only a slightly larger computational effort to determine optimized EDS parameters. Combined with various methods to smoothen the potential energy surface, the proposed EDS application will be a powerful technique to enhance the sampling efficiency in biomolecular simulations.
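
The two-state EDS reference Hamiltonian that the abstract relies on has a compact closed form, sketched below with a numerically stable log-sum-exp. The smoothness parameter s and the energy offsets are exactly the quantities the authors report having to optimize; the values used here are arbitrary placeholders.

```python
# Two-state EDS reference potential:
#   V_ref = -(1/(beta*s)) * ln( exp(-beta*s*(V_A - E_A)) + exp(-beta*s*(V_B - E_B)) )
# V_A / V_B stand for the two end-state Hamiltonians (here: physical and soft-core peptide).
import numpy as np

kB = 0.008314          # kJ/(mol K)
T = 300.0
beta = 1.0 / (kB * T)

def v_ref(vA, vB, s=0.03, eA=0.0, eB=12.0):
    a = -beta * s * (np.asarray(vA) - eA)
    b = -beta * s * (np.asarray(vB) - eB)
    return -np.logaddexp(a, b) / (beta * s)    # numerically stable log-sum-exp

# Example: along some coordinate, state A is low on the left, state B on the right
vA = np.array([-50.0, -20.0,  30.0, 120.0])
vB = np.array([150.0,  60.0,  -5.0, -40.0])
print(v_ref(vA, vB))
```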

  11. Absorption factor for cylindrical samples

    International Nuclear Information System (INIS)

    Sears, V.F.

    1984-01-01

    The absorption factor for the scattering of X-rays or neutrons in cylindrical samples is calculated by numerical integration for the case in which the absorption coefficients of the incident and scattered beams are not equal. An extensive table of values having an absolute accuracy of 10^-4 is given in a companion report [Sears (1983). Atomic Energy of Canada Limited, Report No. AECL-8176]. In the present paper an asymptotic expression is derived for the absorption factor which can be used with an error of less than 10^-3 for most cases of interest in both neutron inelastic scattering and neutron diffraction in crystals. (Auth.)
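
For readers who want a feel for the quantity being tabulated, here is a Monte Carlo sketch of the absorption factor for a long cylinder with unequal incident- and scattered-beam absorption coefficients. It is a generic illustration, not the numerical scheme or the asymptotic expression of the paper; the radius, mu values and scattering angle are arbitrary.

```python
# Absorption factor A = (1/area) * integral over the cross-section of
# exp(-mu_i*t_i - mu_f*t_f), where t_i and t_f are the in-sample path lengths
# of the incident and scattered beams (long-cylinder, in-plane scattering).
import numpy as np

rng = np.random.default_rng(5)

def absorption_factor(R, mu_i, mu_f, two_theta_deg, n=200_000):
    # Uniform points over the circular cross-section (rejection sampling)
    pts = rng.uniform(-R, R, size=(2 * n, 2))
    pts = pts[np.einsum("ij,ij->i", pts, pts) <= R * R][:n]
    x, y = pts[:, 0], pts[:, 1]

    # Incident beam travels along +x: path length from entry point to (x, y)
    t_i = x + np.sqrt(R * R - y * y)

    # Scattered beam leaves at angle 2*theta in the plane: path length to exit
    d = np.array([np.cos(np.radians(two_theta_deg)), np.sin(np.radians(two_theta_deg))])
    pd = pts @ d
    t_f = -pd + np.sqrt(pd * pd - (x * x + y * y) + R * R)

    return np.exp(-mu_i * t_i - mu_f * t_f).mean()

print(absorption_factor(R=0.5, mu_i=1.0, mu_f=1.5, two_theta_deg=90.0))
```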

  12. Material sampling for rotor evaluation

    International Nuclear Information System (INIS)

    Mercaldi, D.; Parker, J.

    1990-01-01

    Decisions regarding continued operation of aging rotating machinery must often be made without adequate knowledge of rotor material conditions. Physical specimens of the material are not generally available due to lack of an appropriate sampling technique or the high cost and inconvenience of obtaining such samples. This is despite the fact that examination of such samples may be critical to effectively assess the degradation of mechanical properties of the components in service or to permit detailed examination of microstructure and surface flaws. Such information permits a reduction in the uncertainty of remaining life estimates for turbine rotors to avoid unnecessarily premature and costly rotor retirement decisions. This paper describes the operation and use of a recently developed material sampling device which machines and recovers an undeformed specimen from the surface of rotor bores or other components for metallurgical analysis. The removal of the thin, wafer-like sample has a negligible effect on the structural integrity of these components, due to the geometry and smooth surface finish of the resulting shallow depression. Samples measuring approximately 0.03 to 0.1 inches (0.76 to 2.5 mm) thick by 0.5 to 1.0 inch (1.3 to 2.5 cm) in diameter can be removed without mechanical deformation or thermal degradation of the sample or the remaining component material. The device is operated remotely from a control console and can be used externally or internally on any surface for which there is at least a three inch (7.6 cm) working clearance. Application of the device in two case studies of turbine-generator evaluations are presented

  13. Sampling in practice

    DEFF Research Database (Denmark)

    Esbensen, Kim Harry; Petersen, Lars

    2005-01-01

    A basic knowledge of the Theory of Sampling (TOS) and a set of only eight sampling unit operations is all the practical sampler needs to ensure representativeness of samples extracted from all kinds of lots: production batches, - truckloads, - barrels, sub-division in the laboratory, sampling...... in nature and in the field (environmental sampling, forestry, geology, biology), from raw materials or manufactory processes etc. We here can only give a brief introduction to the Fundamental Sampling Principle (FSP) and these eight Sampling Unit Operations (SUO’s). Always respecting FSP and invoking only...... the necessary SUO’s (dependent on the practical situation) is the only prerequisite needed for eliminating all sampling bias and simultaneously minimizing sampling variance, and this is in addition a sure guarantee for making the final analytical results trustworthy. No reliable conclusions can be made unless...

  14. Integrated Toys

    DEFF Research Database (Denmark)

    Petersson, Eva

    2005-01-01

    the theoretical foundations of play and learning. In this presentation, we explore pedagogical potentials of new technologies and traditional toys integrated into a physical and virtual toy (hereinafter called integrated toy) with specific focus on the open-ended toy and non-formal learning. The integrated toy......Toys play a crucial role in supporting children’s learning and creation of meaning in their everyday life. Children also play with toys out of an interest to interact with others e.g. peers and adults. Tendencies of digital technology in toys have led to greater opportunities for manipulation...... and interaction supporting children’s play and learning such that technology is ever-present in the play environments of children. Although electronics have been deployed in tools for play and learning, most of it has facilitated individual learning. Computer games, for instance, most often are designed...

  15. Integrated, Continuous Emulsion Creamer.

    Science.gov (United States)

    Cochrane, Wesley G; Hackler, Amber L; Cavett, Valerie J; Price, Alexander K; Paegel, Brian M

    2017-12-19

    Automated and reproducible sample handling is a key requirement for high-throughput compound screening and currently demands heavy reliance on expensive robotics in screening centers. Integrated droplet microfluidic screening processors are poised to replace robotic automation by miniaturizing biochemical reactions to the droplet scale. These processors must generate, incubate, and sort droplets for continuous droplet screening, passively handling millions of droplets with complete uniformity, especially during the key step of sample incubation. Here, we disclose an integrated microfluidic emulsion creamer that packs ("creams") assay droplets by draining away excess oil through microfabricated drain channels. The drained oil coflows with the creamed emulsion and is then reintroduced to disperse the droplets at the circuit terminus for analysis. Creamed emulsion assay incubation time dispersion was 1.7%, 3-fold less than other reported incubators. The integrated, continuous emulsion creamer (ICEcreamer) was used to miniaturize and optimize measurements of various enzymatic activities (phosphodiesterase, kinase, bacterial translation) under multiple- and single-turnover conditions. Combining the ICEcreamer with current integrated microfluidic DNA-encoded library bead processors eliminates potentially cumbersome instrumentation engineering challenges and is compatible with assays of diverse target class activities commonly investigated in drug discovery.

  16. Sampling of ore

    International Nuclear Information System (INIS)

    Boehme, R.C.; Nicholas, B.L.

    1987-01-01

    This invention relates to a method of an apparatus for ore sampling. The method includes the steps of periodically removing a sample of the output material of a sorting machine, weighing each sample so that each is of the same weight, measuring a characteristic such as the radioactivity, magnetivity or the like of each sample, subjecting at least an equal portion of each sample to chemical analysis to determine the mineral content of the sample and comparing the characteristic measurement with desired mineral content of the chemically analysed portion of the sample to determine the characteristic/mineral ratio of the sample. The apparatus includes an ore sample collector, a deflector for deflecting a sample of ore particles from the output of an ore sorter into the collector and means for moving the deflector from a first position in which it is clear of the particle path from the sorter to a second position in which it is in the particle path at predetermined time intervals and for predetermined time periods to deflect the sample particles into the collector. The apparatus conveniently includes an ore crusher for comminuting the sample particle, a sample hopper means for weighing the hopper, a detector in the hopper for measuring a characteristic such as radioactivity, magnetivity or the like of particles in the hopper, a discharge outlet from the hopper and means for feeding the particles from the collector to the crusher and then to the hopper

  17. Genetic Sample Inventory

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database archives genetic tissue samples from marine mammals collected primarily from the U.S. east coast. The collection includes samples from field programs,...

  18. Superposition Enhanced Nested Sampling

    Directory of Open Access Journals (Sweden)

    Stefano Martiniani

    2014-08-01

    Full Text Available The theoretical analysis of many problems in physics, astronomy, and applied mathematics requires an efficient numerical exploration of multimodal parameter spaces that exhibit broken ergodicity. Monte Carlo methods are widely used to deal with these classes of problems, but such simulations suffer from a ubiquitous sampling problem: The probability of sampling a particular state is proportional to its entropic weight. Devising an algorithm capable of sampling efficiently the full phase space is a long-standing problem. Here, we report a new hybrid method for the exploration of multimodal parameter spaces exhibiting broken ergodicity. Superposition enhanced nested sampling combines the strengths of global optimization with the unbiased or athermal sampling of nested sampling, greatly enhancing its efficiency with no additional parameters. We report extensive tests of this new approach for atomic clusters that are known to have energy landscapes for which conventional sampling schemes suffer from broken ergodicity. We also introduce a novel parallelization algorithm for nested sampling.
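
    As a point of reference for the baseline algorithm, the sketch below implements plain nested sampling (not the superposition-enhanced variant of the paper) for a two-dimensional Gaussian likelihood under a uniform prior; the constrained replacement step is a short random walk, and the number of live points, step sizes and likelihood are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(0)

      def log_like(x):
          """Unnormalized 2-D Gaussian log-likelihood centred at the origin (sigma = 0.1)."""
          return -0.5 * float((x ** 2).sum()) / 0.1 ** 2

      def constrained_draw(threshold, start, steps=30, step_size=0.05):
          """Short random walk inside the prior box [-1, 1]^2, restricted to log L > threshold."""
          x = start.copy()
          for _ in range(steps):
              trial = x + rng.normal(0.0, step_size, size=2)
              if np.all(np.abs(trial) <= 1.0) and log_like(trial) > threshold:
                  x = trial
          return x

      K, iterations = 100, 1000                      # live points, nested-sampling steps
      live = rng.uniform(-1.0, 1.0, size=(K, 2))     # live points drawn from the uniform prior
      live_logL = np.array([log_like(p) for p in live])

      log_terms = []
      for i in range(1, iterations + 1):
          worst = int(np.argmin(live_logL))          # lowest-likelihood live point
          log_w = -i / K - np.log(K)                 # prior mass shrinks as X_i ~ exp(-i/K), w_i ~ X_i / K
          log_terms.append(live_logL[worst] + log_w)
          start = live[rng.integers(K)]              # replace it with a constrained prior draw
          live[worst] = constrained_draw(live_logL[worst], start)
          live_logL[worst] = log_like(live[worst])

      log_Z = np.logaddexp.reduce(log_terms)
      # Crude analytic check: Z = (1/4) * integral of exp(log L) over the plane = 2*pi*0.1**2 / 4
      print(log_Z, np.log(2 * np.pi * 0.1 ** 2 / 4))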

  19. Chorionic villus sampling

    Science.gov (United States)

    ... medlineplus.gov/ency/article/003406.htm Chorionic villus sampling Chorionic villus sampling (CVS) is a test some pregnant women have ...

  20. Sampling on Quasicrystals

    OpenAIRE

    Grepstad, Sigrid

    2011-01-01

    We prove that quasicrystals are universal sets of stable sampling in any dimension. Necessary and sufficient density conditions for stable sampling and interpolation sets in one dimension are studied in detail.

  1. Genetic Sample Inventory - NRDA

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database archives genetic tissue samples from marine mammals collected in the North-Central Gulf of Mexico from 2010-2015. The collection includes samples from...

  2. Test sample handling apparatus

    International Nuclear Information System (INIS)

    1981-01-01

    A test sample handling apparatus using automatic scintillation counting for gamma detection, for use in such fields as radioimmunoassay, is described. The apparatus automatically and continuously counts large numbers of samples rapidly and efficiently by the simultaneous counting of two samples. By means of sequential ordering of non-sequential counting data, it is possible to obtain precisely ordered data while utilizing sample carrier holders having a minimum length. (U.K.)

  3. Laboratory Sampling Guide

    Science.gov (United States)

    2012-05-11

    environment, and by ingestion of foodstuffs that have incorporated C-14 by photosynthesis. Like tritium, C-14 is a very low energy beta emitter and is... bacterial growth and to minimize development of solids in the sample. • Properly identify each sample container with name, SSN, and collection start and...sampling in the same cardboard carton. The sample may be kept cool or frozen during collection to control odor and bacterial growth. • Once

  4. High speed network sampling

    OpenAIRE

    Rindalsholt, Ole Arild

    2005-01-01

    Master's thesis in network and system administration. Classical sampling methods play an important role in the current practice of Internet measurement. With today's high speed networks, routers cannot manage to generate complete Netflow data for every packet. They have to perform restricted sampling. This thesis summarizes some of the most important sampling schemes and their applications before diving into an analysis of the effect of sampling Netflow records.
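
    The thesis abstract mentions restricted sampling of Netflow records without showing what the simplest schemes look like; the sketch below contrasts 1-in-N systematic sampling with independent probabilistic sampling of a packet stream and rescales the sampled byte count into an estimate of the total. The record format and all numbers are illustrative assumptions, not material from the thesis.

      import random

      def systematic_sample(packets, n):
          """Keep every n-th packet (1-in-N systematic sampling)."""
          return packets[::n]

      def probabilistic_sample(packets, p, seed=42):
          """Keep each packet independently with probability p."""
          rng = random.Random(seed)
          return [pkt for pkt in packets if rng.random() < p]

      # Illustrative packet stream: (packet_id, byte_count); not a real Netflow record format
      packets = [(i, 64 + (i * 37) % 1436) for i in range(100_000)]
      true_bytes = sum(size for _, size in packets)

      for name, sample, scale in [
          ("systematic 1-in-100", systematic_sample(packets, 100), 100),
          ("probabilistic p=0.01", probabilistic_sample(packets, 0.01), 100),
      ]:
          estimate = scale * sum(size for _, size in sample)
          print(f"{name}: estimated {estimate:,} bytes vs true {true_bytes:,}")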

  5. Scientific integrity

    DEFF Research Database (Denmark)

    Merlo, Domenico Franco; Vahakangas, Kirsi; Knudsen, Lisbeth E.

    2008-01-01

    consent was obtained. Integrity is central to environmental health research searching for causal relations. It requires open communication and trust and any violation (i.e., research misconduct, including fabrication or falsification of data, plagiarism, conflicting interests, etc.) may endanger...

  6. Integrated marketing.

    Science.gov (United States)

    2006-01-01

    St. John Health consists of nine hospitals throughout southern Michigan. Recently, in an attempt to brand the system as the state's premiere place for medical services, the system launched 'Real Medicine', a campaign that brands all nine hospitals together. Using print, radio, and television spots, the effort also integrates direct mail collateral and brochures to reach consumers.

  7. Integrative teaching

    NARCIS (Netherlands)

    Harris, Robert; Smids, Annejoke; Kors, Ninja

    2007-01-01

    This is an article about the integration of instrumental teaching, aural skills and keyboard skills and music theory at the pre-tertiary level. Team teaching and discipline crossover offer a possible solution to students’ inability to apply skills taught by specialists in separate fields. A personal

  8. Box Integrals

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.; Crandall, Richard E.

    2006-06-01

    By a "box integral" we mean here an expectation $\\langle|\\vec r - \\vec q|^s \\rangle$ where $\\vec r$runs over the unit $n$-cube,with $\\vec q$ and $s$ fixed, explicitly:\\begin eqnarray*&&\\int_01 \\cdots \\int_01 \\left((r_1 - q_1)2 + \\dots+(r_n-q_n)2\\right)^ s/2 \\ dr_1 \\cdots dr_n.\\end eqnarray* The study ofbox integrals leads one naturally into several disparate fields ofanalysis. While previous studies have focused upon symbolic evaluationand asymptotic analysis of special cases (notably $s = 1$), we workherein more generally--in interdisciplinary fashion--developing resultssuch as: (1) analytic continuation (in complex $s$), (2) relevantcombinatorial identities, (3) rapidly converging series, (4) statisticalinferences, (5) connections to mathematical physics, and (6)extreme-precision quadrature techniques appropriate for these integrals.These intuitions and results open up avenues of experimental mathematics,with a view to new conjectures and theorems on integrals of thistype.

  9. Mars Sample Handling Functionality

    Science.gov (United States)

    Meyer, M. A.; Mattingly, R. L.

    2018-04-01

    The final leg of a Mars Sample Return campaign would be an entity that we have referred to as Mars Returned Sample Handling (MRSH). This talk will address our current view of the functional requirements on MRSH, focused on the Sample Receiving Facility (SRF).

  10. IAEA Sampling Plan

    Energy Technology Data Exchange (ETDEWEB)

    Geist, William H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-15

    The objectives for this presentation are to describe the method that the IAEA uses to determine a sampling plan for nuclear material measurements; describe the terms detection probability and significant quantity; list the three nuclear materials measurement types; describe the sampling method applied to an item facility; and describe multiple method sampling.
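
    The presentation itself is not reproduced here, but the commonly quoted attribute-sampling approximation behind such plans can be sketched: if M items out of a stratum of N would have to be falsified to divert one significant quantity, a sample of size n approximately equal to N*(1 - (1 - DP)^(1/M)) gives detection probability DP. The formula is the standard textbook approximation and the example numbers are assumptions, not figures from the presentation.

      import math

      def attribute_sample_size(N, M, detection_probability):
          """Smallest n with (roughly) P(at least one of M falsified items is sampled) >= DP."""
          return math.ceil(N * (1.0 - (1.0 - detection_probability) ** (1.0 / M)))

      # Illustrative numbers only: 300 items in the stratum, 20 falsified items needed to
      # divert one significant quantity, 90 % detection probability
      print(attribute_sample_size(N=300, M=20, detection_probability=0.90))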

  11. Developing Water Sampling Standards

    Science.gov (United States)

    Environmental Science and Technology, 1974

    1974-01-01

    Participants in the D-19 symposium on aquatic sampling and measurement for water pollution assessment were informed that determining the extent of waste water stream pollution is not a cut and dry procedure. Topics discussed include field sampling, representative sampling from storm sewers, suggested sampler features and application of improved…

  12. Generalized sampling in Julia

    DEFF Research Database (Denmark)

    Jacobsen, Christian Robert Dahl; Nielsen, Morten; Rasmussen, Morten Grud

    2017-01-01

    Generalized sampling is a numerically stable framework for obtaining reconstructions of signals in different bases and frames from their samples. For example, one can use wavelet bases for reconstruction given frequency measurements. In this paper, we will introduce a carefully documented toolbox...... for performing generalized sampling in Julia. Julia is a new language for technical computing with focus on performance, which is ideally suited to handle the large size problems often encountered in generalized sampling. The toolbox provides specialized solutions for the setup of Fourier bases and wavelets....... The performance of the toolbox is compared to existing implementations of generalized sampling in MATLAB....

  13. The Lunar Sample Compendium

    Science.gov (United States)

    Meyer, Charles

    2009-01-01

    The Lunar Sample Compendium is a succinct summary of the data obtained from 40 years of study of Apollo and Luna samples of the Moon. Basic petrographic, chemical and age information is compiled, sample-by-sample, in the form of an advanced catalog in order to provide a basic description of each sample. The LSC can be found online using Google. The initial allocation of lunar samples was done sparingly, because it was realized that scientific techniques would improve over the years and new questions would be formulated. The LSC is important because it enables scientists to select samples within the context of the work that has already been done and facilitates better review of proposed allocations. It also provides back up material for public displays, captures information found only in abstracts, grey literature and curatorial databases and serves as a ready access to the now-vast scientific literature.

  14. Image Sampling with Quasicrystals

    Directory of Open Access Journals (Sweden)

    Mark Grundland

    2009-07-01

    Full Text Available We investigate the use of quasicrystals in image sampling. Quasicrystals produce space-filling, non-periodic point sets that are uniformly discrete and relatively dense, thereby ensuring the sample sites are evenly spread out throughout the sampled image. Their self-similar structure can be attractive for creating sampling patterns endowed with a decorative symmetry. We present a brief general overview of the algebraic theory of cut-and-project quasicrystals based on the geometry of the golden ratio. To assess the practical utility of quasicrystal sampling, we evaluate the visual effects of a variety of non-adaptive image sampling strategies on photorealistic image reconstruction and non-photorealistic image rendering used in multiresolution image representations. For computer visualization of point sets used in image sampling, we introduce a mosaic rendering technique.
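
    To make the cut-and-project idea concrete, the sketch below generates a one-dimensional quasicrystal point set based on the golden ratio by keeping those points of the integer lattice Z^2 whose internal coordinate falls inside an acceptance window and projecting them to a physical coordinate. The window, patch size and one-dimensional setting are illustrative assumptions; the paper works with two-dimensional point sets for image sampling.

      import numpy as np

      PHI = (1 + np.sqrt(5)) / 2   # golden ratio

      def fibonacci_quasicrystal(n_max):
          """1-D cut-and-project set: keep lattice points (a, b) of Z^2 whose internal
          coordinate -a + b*PHI lies in a half-open window of width 1 + PHI, and project
          them to the physical coordinate a*PHI + b."""
          points = []
          for a in range(-n_max, n_max + 1):
              for b in range(-n_max, n_max + 1):
                  if -1.0 <= -a + b * PHI < PHI:     # acceptance window of width 1 + PHI
                      points.append(a * PHI + b)
          return np.sort(np.array(points))

      points = fibonacci_quasicrystal(40)
      gaps = np.diff(points[5:-5])                   # drop a few edge points of the finite patch
      # The set is non-periodic yet uniformly discrete: only two gap lengths appear, ratio PHI
      print(np.unique(np.round(gaps, 6)))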

  15. Urine sample collection protocols for bioassay samples

    Energy Technology Data Exchange (ETDEWEB)

    MacLellan, J.A.; McFadden, K.M.

    1992-11-01

    In vitro radiobioassay analyses are used to measure the amount of radioactive material excreted by personnel exposed to the potential intake of radioactive material. The analytical results are then used with various metabolic models to estimate the amount of radioactive material in the subject's body and the original intake of radioactive material. Proper application of these metabolic models requires knowledge of the excretion period. It is normal practice to design the bioassay program based on a 24-hour excretion sample. The Hanford bioassay program simulates a total 24-hour urine excretion sample with urine collection periods lasting from one-half hour before retiring to one-half hour after rising on two consecutive days. Urine passed during the specified periods is collected in three 1-L bottles. Because the daily excretion volume given in Publication 23 of the International Commission on Radiological Protection (ICRP 1975, p. 354) for Reference Man is 1.4 L, it was proposed to use only two 1-L bottles as a cost-saving measure. This raised the broader question of what should be the design capacity of a 24-hour urine sample kit.

  16. Urine sample collection protocols for bioassay samples

    Energy Technology Data Exchange (ETDEWEB)

    MacLellan, J.A.; McFadden, K.M.

    1992-11-01

    In vitro radiobioassay analyses are used to measure the amount of radioactive material excreted by personnel exposed to the potential intake of radioactive material. The analytical results are then used with various metabolic models to estimate the amount of radioactive material in the subject's body and the original intake of radioactive material. Proper application of these metabolic models requires knowledge of the excretion period. It is normal practice to design the bioassay program based on a 24-hour excretion sample. The Hanford bioassay program simulates a total 24-hour urine excretion sample with urine collection periods lasting from one-half hour before retiring to one-half hour after rising on two consecutive days. Urine passed during the specified periods is collected in three 1-L bottles. Because the daily excretion volume given in Publication 23 of the International Commission on Radiological Protection (ICRP 1975, p. 354) for Reference Man is 1.4 L, it was proposed to use only two 1-L bottles as a cost-saving measure. This raised the broader question of what should be the design capacity of a 24-hour urine sample kit.

  17. Planetary Sample Caching System Design Options

    Science.gov (United States)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

    Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: 1) transfer of a raw sample from the tool to the SHEC subsystem and 2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit change out as the mechanism for transferring tubes to and samples in tubes from the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER class rover.

  18. Ground beetle habitat templets and riverbank integrity

    OpenAIRE

    Van Looy, Kris; Vanacker, Stijn; Jochems, Hans; De Blust, Geert; Dufrêne, M

    2006-01-01

    The habitat templet approach was used in a scale-sensitive bioindicator assessment for the ecological integrity of riverbanks and for specific responses to river management. Ground beetle habitat templets were derived from a catchment scale sampling, integrating the overall variety of bank types. This coarse-filter analysis was integrated in the reach scale fine-filtering approaches of community responses to habitat integrity and river management impacts. Higher species diversity was associat...

  19. Electrical discharge machining for vessel sample removal

    International Nuclear Information System (INIS)

    Litka, T.J.

    1993-01-01

    Due to aging-related problems or essential metallurgy information (plant-life extension or decommissioning) of nuclear plants, sample removal from vessels may be required as part of an examination. Vessel or cladding samples with cracks may be removed to determine the cause of cracking. Vessel weld samples may be removed to determine the weld metallurgy. In all cases, an engineering analysis must be done prior to sample removal to determine the vessel's integrity upon sample removal. Electrical discharge machining (EDM) is being used for in-vessel nuclear power plant vessel sampling. Machining operations in reactor coolant system (RCS) components must be accomplished while collecting machining chips that could cause damage if they become part of the flow stream. The debris from EDM is a fine talclike particulate (no chips), which can be collected by flushing and filtration

  20. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
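
    The book covers many designs; as a minimal illustration of the kind of calculation involved, the sketch below computes the sample size needed to estimate a population mean to within a chosen margin of error, n = (z*sigma/margin)^2, with an optional finite-population correction. The formula is generic textbook material and the numbers are assumptions, not examples from the book.

      import math
      from statistics import NormalDist

      def sample_size_for_mean(sigma, margin, confidence=0.95, population=None):
          """n = (z * sigma / margin)^2, optionally with a finite-population correction."""
          z = NormalDist().inv_cdf(1.0 - (1.0 - confidence) / 2.0)
          n = (z * sigma / margin) ** 2
          if population is not None:
              n = n / (1.0 + (n - 1.0) / population)   # finite-population correction
          return math.ceil(n)

      # Estimate a mean with standard deviation ~12 to within +/-2 at 95 % confidence
      print(sample_size_for_mean(sigma=12, margin=2))                   # effectively infinite population
      print(sample_size_for_mean(sigma=12, margin=2, population=500))   # finite population of 500 units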

  1. Statistical sampling strategies

    International Nuclear Information System (INIS)

    Andres, T.H.

    1987-01-01

    Systems assessment codes use mathematical models to simulate natural and engineered systems. Probabilistic systems assessment codes carry out multiple simulations to reveal the uncertainty in values of output variables due to uncertainty in the values of the model parameters. In this paper, methods are described for sampling sets of parameter values to be used in a probabilistic systems assessment code. Three Monte Carlo parameter selection methods are discussed: simple random sampling, Latin hypercube sampling, and sampling using two-level orthogonal arrays. Three post-selection transformations are also described: truncation, importance transformation, and discretization. Advantages and disadvantages of each method are summarized
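
    Two of the three Monte Carlo selection methods named in the abstract are easy to contrast in code; the sketch below implements simple random sampling and Latin hypercube sampling over the unit hypercube (the orthogonal-array variant and the post-selection transformations are omitted). It is a generic illustration, not code from the assessment codes being described.

      import numpy as np

      def simple_random_sample(n, d, rng):
          """n points drawn independently and uniformly from the unit d-cube."""
          return rng.random((n, d))

      def latin_hypercube_sample(n, d, rng):
          """Each coordinate is stratified into n equal bins, one point per bin, with
          the bins permuted independently across dimensions."""
          samples = np.empty((n, d))
          for j in range(d):
              perm = rng.permutation(n)
              samples[:, j] = (perm + rng.random(n)) / n
          return samples

      rng = np.random.default_rng(7)
      for name, sampler in [("simple random", simple_random_sample),
                            ("Latin hypercube", latin_hypercube_sample)]:
          pts = sampler(100, 2, rng)
          occupied = len(np.unique((pts[:, 0] * 100).astype(int)))
          # Latin hypercube sampling fills all 100 strata of each coordinate; SRS usually does not
          print(f"{name}: {occupied} of 100 first-coordinate strata occupied")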

  2. Statistical distribution sampling

    Science.gov (United States)

    Johnson, E. S.

    1975-01-01

    Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.

  3. Development of bull trout sampling protocols

    Science.gov (United States)

    R. F. Thurow; J. T. Peterson; J. W. Guzevich

    2001-01-01

    This report describes results of research conducted in Washington in 2000 through Interagency Agreement #134100H002 between the U.S. Fish and Wildlife Service (USFWS) and the U.S. Forest Service Rocky Mountain Research Station (RMRS). The purpose of this agreement is to develop a bull trout (Salvelinus confluentus) sampling protocol by integrating...

  4. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
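
    To see the kind of discrepancy the authors describe, the short simulation below (assuming NumPy and SciPy are available) compares the nominal level and the observed false-positive rate of a one-sample t-test at a very small sample size when the underlying null distribution is skewed; the distribution, sample size and cutoff are illustrative assumptions, and the Edgeworth-expansion machinery of the paper is not reproduced.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      n, n_tests, alpha = 10, 100_000, 0.001      # tiny samples, many tests, strict cutoff

      # Null data from a skewed distribution, shifted so the null hypothesis (mean 0) is true
      data = rng.exponential(scale=1.0, size=(n_tests, n)) - 1.0
      res = stats.ttest_1samp(data, popmean=0.0, axis=1)

      actual = float((res.pvalue < alpha).mean())
      print(f"nominal level {alpha}, observed false-positive rate {actual:.4f}")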

  5. Sampling system and method

    Science.gov (United States)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  6. Simple street tree sampling

    Science.gov (United States)

    David J. Nowak; Jeffrey T. Walton; James Baldwin; Jerry. Bond

    2015-01-01

    Information on street trees is critical for management of this important resource. Sampling of street tree populations provides an efficient means to obtain street tree population information. Long-term repeat measures of street tree samples supply additional information on street tree changes and can be used to report damages from catastrophic events. Analyses of...

  7. Sampling or gambling

    Energy Technology Data Exchange (ETDEWEB)

    Gy, P.M.

    1981-12-01

    Sampling can be compared to no other technique. A mechanical sampler must above all be selected according to its aptitude for supressing or reducing all components of the sampling error. Sampling is said to be correct when it gives all elements making up the batch of matter submitted to sampling an uniform probability of being selected. A sampler must be correctly designed, built, installed, operated and maintained. When the conditions of sampling correctness are not strictly respected, the sampling error can no longer be controlled and can, unknown to the user, be unacceptably large: the sample is no longer representative. The implementation of an incorrect sampler is a form of gambling and this paper intends to show that at this game the user is nearly always the loser in the long run. The users' and the manufacturers' interests may diverge and the standards which should safeguard the users' interests very often fail to do so by tolerating or even recommending incorrect techniques such as the implementation of too narrow cutters traveling too fast through the stream to be sampled.

  8. Integrated Design

    DEFF Research Database (Denmark)

    Jørgensen, Michael; Nielsen, M. W.; Strømann-Andersen, Jakob Bjørn

    2011-01-01

    and describe the decision process. Specific attention is given to how the engineering input was presented and how it was able to facilitate the design development. Site and context, building shape, organization of functions and HVAC-systems were all included to obtain a complete picture of the building......, low-energy consumption, and high-quality indoor environment. We use this case study to investigate how technical knowledge about building performance can be integrated into the conceptual design stage. We have selected certain points during the design process that represented design challenges...

  9. Biological sample collector

    Science.gov (United States)

    Murphy, Gloria A [French Camp, CA

    2010-09-07

    A biological sample collector is adapted to collect several biological samples in a plurality of filter wells. A biological sample collector may comprise a manifold plate for mounting a filter plate thereon, the filter plate having a plurality of filter wells therein; a hollow slider for engaging and positioning a tube that slides therethrough; and a slide case within which the hollow slider travels to allow the tube to be aligned with a selected filter well of the plurality of filter wells, wherein when the tube is aligned with the selected filter well, the tube is pushed through the hollow slider and into the selected filter well to sealingly engage the selected filter well and to allow the tube to deposit a biological sample onto a filter in the bottom of the selected filter well. The biological sample collector may be portable.

  10. Fluidics platform and method for sample preparation

    Science.gov (United States)

    Benner, Henry W.; Dzenitis, John M.

    2016-06-21

    Provided herein are fluidics platforms and related methods for performing integrated sample collection and solid-phase extraction of a target component of the sample all in one tube. The fluidics platform comprises a pump, particles for solid-phase extraction and a particle-holding means. The method comprises contacting the sample with one or more reagents in a pump, coupling a particle-holding means to the pump and expelling the waste out of the pump while the particle-holding means retains the particles inside the pump. The fluidics platform and methods herein described allow solid-phase extraction without pipetting and centrifugation.

  11. Integrated controls

    International Nuclear Information System (INIS)

    Hollaway, F.W.

    1985-01-01

    During 1984, all portions of the Nova control system that were necessary for the support of laser activation and completion of the Nova project were finished and placed in service on time. The Nova control system has been unique in providing, on schedule, the capabilities required in the central control room and in various local control areas throughout the facility. The ambitious goal of deploying this system early enough to use it as an aid in the activation of the laser was accomplished; thus the control system made a major contribution to the completion of Nova activation on schedule. Support and enhancement activities continued during the year on the VAX computer systems, central control room, operator consoles and displays, Novanet data communications network, system-level software for both the VAX and LSI-11 computers, Praxis control system computer language, software management tools, and the development system, which includes office terminals. Computational support was also supplied for a wide variety of test fixtures required by the optical and mechanical subsystems. Significant new advancements were made in four areas in integrated controls this year: the integration software (which includes the shot scheduler), the Praxis language, software quality assurance audit, and software development and data handling. A description of the accomplishments in each of these areas follows

  12. PFP Wastewater Sampling Facility

    International Nuclear Information System (INIS)

    Hirzel, D.R.

    1995-01-01

    This test report documents the results obtained while conducting operational testing of the sampling equipment in the 225-WC building, the PFP Wastewater Sampling Facility. The Wastewater Sampling Facility houses equipment to sample and monitor the PFP's liquid effluents before discharging the stream to the 200 Area Treated Effluent Disposal Facility (TEDF). The majority of the streams are not radioactive and consist of discharges from the PFP Heating, Ventilation, and Air Conditioning (HVAC) systems. The streams that might be contaminated are processed through the Low Level Waste Treatment Facility (LLWTF) before discharging to TEDF. The sampling equipment consists of two flow-proportional composite samplers, an ultrasonic flowmeter, pH and conductivity monitors, chart recorder, and associated relays and current isolators to interconnect the equipment to allow proper operation. Data signals from the monitors are received in the 234-5Z Shift Office which contains a chart recorder and alarm annunciator panel. The data signals are also duplicated and sent to the TEDF control room through the Local Control Unit (LCU). Performing the OTP has verified the operability of the PFP wastewater sampling system. This Operability Test Report documents the acceptance of the sampling system for use

  13. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international  forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  14. Waste classification sampling plan

    International Nuclear Information System (INIS)

    Landsman, S.D.

    1998-01-01

    The purpose of this sampling plan is to explain the method used to collect and analyze data necessary to verify and/or determine the radionuclide content of the B-Cell decontamination and decommissioning waste stream so that the correct waste classification for the waste stream can be made, and to collect samples for studies of decontamination methods that could be used to remove fixed contamination present on the waste. The scope of this plan is to establish the technical basis for collecting samples and compiling quantitative data on the radioactive constituents present in waste generated during deactivation activities in B-Cell. Sampling and radioisotopic analysis will be performed on the fixed layers of contamination present on structural material and internal surfaces of process piping and tanks. In addition, dose rate measurements on existing waste material will be performed to determine the fraction of dose rate attributable to both removable and fixed contamination. Samples will also be collected to support studies of decontamination methods that are effective in removing the fixed contamination present on the waste. Sampling performed under this plan will meet criteria established in BNF-2596, Data Quality Objectives for the B-Cell Waste Stream Classification Sampling, J. M. Barnett, May 1998

  15. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2009-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of samples requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  16. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2013-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of samples requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  17. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2012-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of samples requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author).

  18. How to take environmental samples for stable isotope analyses

    International Nuclear Information System (INIS)

    Rogers, K.M.

    2009-01-01

    It is possible to analyse a diverse range of samples for environmental investigations. The main types are soil/sediments, vegetation, fauna, shellfish, waste and water. Each type of samples requires different storage and collection methods. Outlined here are the preferred methods of collection to ensure maximum sample integrity and reliability. (author)

  19. Integrating Stealth

    Science.gov (United States)

    2015-12-01

    fact, these weapon systems should be portrayed as having stealth technology; specifically in the physical sense referring to low observable...a sampling of the current uses of the term “stealth” in Air Force Doctrine and Joint Publications: Reference Source “Precise planning will...destroy-americas-f-35-battle-13429 7. Russian / PLA Low Band Surveillance Radars. http://www.ausairpower.net/APA-Rus-Low-Band-Radars.html 8

  20. Sample Return Robot

    Data.gov (United States)

    National Aeronautics and Space Administration — This Challenge requires demonstration of an autonomous robotic system to locate and collect a set of specific sample types from a large planetary analog area and...

  1. Ecotoxicology statistical sampling

    International Nuclear Information System (INIS)

    Saona, G.

    2012-01-01

    This presentation introduces general concepts in toxicology sampling design, such as the distribution of organic or inorganic contaminants, microbiological contamination, and the determination of sampling positions in an ecosystem for ecotoxicological bioassays.

  2. Mini MAX - Medicaid Sample

    Data.gov (United States)

    U.S. Department of Health & Human Services — To facilitate wider use of MAX, CMS contracted with Mathematica to convene a technical expert panel (TEP) and determine the feasibility of creating a sample file for...

  3. Operational air sampling report

    International Nuclear Information System (INIS)

    Lyons, C.L.

    1994-03-01

    Nevada Test Site vertical shaft and tunnel events generate beta/gamma fission products. The REECo air sampling program is designed to measure these radionuclides at various facilities supporting these events. The current testing moratorium and closure of the Decontamination Facility has decreased the scope of the program significantly. Of the 118 air samples collected in the only active tunnel complex, only one showed any airborne fission products. Tritiated water vapor concentrations were very similar to previously reported levels. The 206 air samples collected at the Area-6 decontamination bays and laundry were again well below any Derived Air Concentration calculation standard. Laboratory analyses of these samples were negative for any airborne fission products

  4. Collecting Samples for Testing

    Science.gov (United States)

    ... that used for CSF in that they require aspiration of a sample of the fluid through a ...

  5. Roadway sampling evaluation.

    Science.gov (United States)

    2014-09-01

    The Florida Department of Transportation (FDOT) has traditionally required that all sampling and testing of asphalt mixtures be at the Contractor's production facility. With recent staffing cuts, as well as budget reductions, FDOT has been cons...

  6. Soil Gas Sampling

    Science.gov (United States)

    Field Branches Quality System and Technical Procedures: This document describes general and specific procedures, methods and considerations to be used and observed when collecting soil gas samples for field screening or laboratory analysis.

  7. Soil Sampling Operating Procedure

    Science.gov (United States)

    EPA Region 4 Science and Ecosystem Support Division (SESD) document that describes general and specific procedures, methods, and considerations when collecting soil samples for field screening or laboratory analysis.

  8. A Geology Sampling System for Small Bodies

    Science.gov (United States)

    Naids, Adam J.; Hood, Anthony D.; Abell, Paul; Graff, Trevor; Buffington, Jesse

    2016-01-01

    Human exploration of microgravity bodies is being investigated as a precursor to a Mars surface mission. Asteroids, comets, dwarf planets, and the moons of Mars all fall into this microgravity category and some are being discussed as potential mission targets. Obtaining geological samples for return to Earth will be a major objective for any mission to a small body. Currently, the knowledge base for geology sampling in microgravity is in its infancy. Humans interacting with non-engineered surfaces in microgravity environment pose unique challenges. In preparation for such missions a team at the NASA Johnson Space Center has been working to gain experience on how to safely obtain numerous sample types in such an environment. This paper describes the type of samples the science community is interested in, highlights notable prototype work, and discusses an integrated geology sampling solution.

  9. A Geology Sampling System for Microgravity Bodies

    Science.gov (United States)

    Hood, Anthony; Naids, Adam

    2016-01-01

    Human exploration of microgravity bodies is being investigated as a precursor to a Mars surface mission. Asteroids, comets, dwarf planets, and the moons of Mars all fall into this microgravity category and some are being discussed as potential mission targets. Obtaining geological samples for return to Earth will be a major objective for any mission to a microgravity body. Currently the knowledge base for geology sampling in microgravity is in its infancy. Humans interacting with non-engineered surfaces in microgravity environment pose unique challenges. In preparation for such missions a team at the NASA Johnson Space Center has been working to gain experience on how to safely obtain numerous sample types in such an environment. This paper describes the type of samples the science community is interested in, highlights notable prototype work, and discusses an integrated geology sampling solution.

  10. Statistical sampling plans

    International Nuclear Information System (INIS)

    Jaech, J.L.

    1984-01-01

    In auditing and in inspection, one selects a number of items by some set of procedures and performs measurements which are compared with the operator's values. This session considers the problem of how to select the samples to be measured, and what kinds of measurements to make. In the inspection situation, the ultimate aim is to independently verify the operator's material balance. The effectiveness of the sample plan in achieving this objective is briefly considered. The discussion focuses on the model plant

  11. Two phase sampling

    CERN Document Server

    Ahmad, Zahoor; Hanif, Muhammad

    2013-01-01

    The development of estimators of population parameters based on two-phase sampling schemes has seen a dramatic increase in the past decade. Various authors have developed estimators of population using either one or two auxiliary variables. The present volume is a comprehensive collection of estimators available in single and two phase sampling. The book covers estimators which utilize information on single, two and multiple auxiliary variables of both quantitative and qualitative nature. Th...

  12. Multidimensional singular integrals and integral equations

    CERN Document Server

    Mikhlin, Solomon Grigorievich; Stark, M; Ulam, S

    1965-01-01

    Multidimensional Singular Integrals and Integral Equations presents the results of the theory of multidimensional singular integrals and of equations containing such integrals. Emphasis is on singular integrals taken over Euclidean space or in the closed manifold of Liapounov and equations containing such integrals. This volume is comprised of eight chapters and begins with an overview of some theorems on linear equations in Banach spaces, followed by a discussion on the simplest properties of multidimensional singular integrals. Subsequent chapters deal with compounding of singular integrals

  13. Integrated Microfluidic Gas Sensors for Water Monitoring

    Science.gov (United States)

    Zhu, L.; Sniadecki, N.; DeVoe, D. L.; Beamesderfer, M.; Semancik, S.; DeVoe, D. L.

    2003-01-01

    A silicon-based microhotplate tin oxide (SnO2) gas sensor integrated into a polymer-based microfluidic system for monitoring of contaminants in water systems is presented. This device is designed to sample a water source, control the sample vapor pressure within a microchannel using integrated resistive heaters, and direct the vapor past the integrated gas sensor for analysis. The sensor platform takes advantage of novel technology allowing direct integration of discrete silicon chips into a larger polymer microfluidic substrate, including seamless fluidic and electrical interconnects between the substrate and silicon chip.

  14. Uranium tailings sampling manual

    International Nuclear Information System (INIS)

    Feenstra, S.; Reades, D.W.; Cherry, J.A.; Chambers, D.B.; Case, G.G.; Ibbotson, B.G.

    1985-01-01

    The purpose of this manual is to describe the requisite sampling procedures for the application of uniform high-quality standards to detailed geotechnical, hydrogeological, geochemical and air quality measurements at Canadian uranium tailings disposal sites. The selection and implementation of applicable sampling procedures for such measurements at uranium tailings disposal sites are complicated by two primary factors. Firstly, the physical and chemical nature of uranium mine tailings and effluent is considerably different from natural soil materials and natural waters. Consequently, many conventional methods for the collection and analysis of natural soils and waters are not directly applicable to tailings. Secondly, there is a wide range in the physical and chemical nature of uranium tailings. The composition of the ore, the milling process, the nature of tailings deposition, and effluent treatment vary considerably and are highly site-specific. Therefore, the definition and implementation of sampling programs for uranium tailings disposal sites require considerable evaluation, and often innovation, to ensure that appropriate sampling and analysis methods are used which provide the flexibility to take into account site-specific considerations. The following chapters describe the objective and scope of a sampling program, preliminary data collection, and the procedures for sampling of tailings solids, surface water and seepage, tailings pore-water, and wind-blown dust and radon

  15. Reactor water sampling device

    International Nuclear Information System (INIS)

    Sakamaki, Kazuo.

    1992-01-01

    The present invention concerns a reactor water sampling device for sampling reactor water in an in-core monitor (neutron measuring tube) housing in a BWR type reactor. The upper end portion of a drain pipe of the reactor water sampling device is attached detachably to an in-core monitor flange. A push-up rod is inserted in the drain pipe vertically movably. A sampling vessel and a vacuum pump are connected to the lower end of the drain pipe. A vacuum pump is operated to depressurize the inside of the device and move the push-up rod upwardly. Reactor water in the in-core monitor housing flows between the drain pipe and the push-up rod and flows into the sampling vessel. With such a constitution, reactor water in the in-core monitor housing can be sampled rapidly with neither opening the lid of the reactor pressure vessel nor being in contact with air. Accordingly, operator's exposure dose can be reduced. (I.N.)

  16. Wet gas sampling

    Energy Technology Data Exchange (ETDEWEB)

    Welker, T.F.

    1997-07-01

    The quality of gas has changed drastically in the past few years. Most gas is wet with hydrocarbons, water, and heavier contaminants that tend to condense if not handled properly. If a gas stream is contaminated with condensables, the sampling of that stream must be done in a manner that will ensure all of the components in the stream are introduced into the sample container as the composite. The sampling and handling of wet gas is extremely difficult under ideal conditions. There are no ideal conditions in the real world. The problems related to offshore operations and other wet gas systems, as well as the transportation of the sample, are additional problems that must be overcome if the analysis is to mean anything to the producer and gatherer. The sampling of wet gas systems is decidedly more difficult than sampling conventional dry gas systems. Wet gas systems generally result in the measurement of one heating value at the inlet of the pipe and a drastic reduction in the heating value of the gas at the outlet end of the system. This is caused by the fallout or accumulation of the heavier products that, at the inlet, may be in the vapor state in the pipeline; hence, the high gravity and high BTU. But, in fact, because of pressure and temperature variances, these liquids condense and form a liquid that is actually running down the pipe as a stream or is accumulated in drips to be blown from the system. (author)

  17. Lunar Sample Compendium

    Science.gov (United States)

    Meyer, Charles

    2005-01-01

    The purpose of the Lunar Sample Compendium will be to inform scientists, astronauts and the public about the various lunar samples that have been returned from the Moon. This Compendium will be organized rock by rock in the manner of a catalog, but will not be as comprehensive, nor as complete, as the various lunar sample catalogs that are available. Likewise, this Compendium will not duplicate the various excellent books and reviews on the subject of lunar samples (Cadogen 1981, Heiken et al. 1991, Papike et al. 1998, Warren 2003, Eugster 2003). However, it is thought that an online Compendium, such as this, will prove useful to scientists proposing to study individual lunar samples and should help provide backup information for lunar sample displays. This Compendium will allow easy access to the scientific literature by briefly summarizing the significant findings of each rock along with the documentation of where the detailed scientific data are to be found. In general, discussion and interpretation of the results is left to the formal reviews found in the scientific literature. An advantage of this Compendium will be that it can be updated, expanded and corrected as need be.

  18. Nonuniform sampling by quantiles

    Science.gov (United States)

    Craft, D. Levi; Sonstrom, Reilly E.; Rovnyak, Virginia G.; Rovnyak, David

    2018-03-01

    A flexible strategy for choosing samples nonuniformly from a Nyquist grid using the concept of statistical quantiles is presented for broad classes of NMR experimentation. Quantile-directed scheduling is intuitive and flexible for any weighting function, promotes reproducibility and seed independence, and is generalizable to multiple dimensions. In brief, weighting functions are divided into regions of equal probability, which define the samples to be acquired. Quantile scheduling therefore achieves close adherence to a probability distribution function, thereby minimizing gaps for any given degree of subsampling of the Nyquist grid. A characteristic of quantile scheduling is that one-dimensional, weighted NUS schedules are deterministic, however higher dimensional schedules are similar within a user-specified jittering parameter. To develop unweighted sampling, we investigated the minimum jitter needed to disrupt subharmonic tracts, and show that this criterion can be met in many cases by jittering within 25-50% of the subharmonic gap. For nD-NUS, three supplemental components to choosing samples by quantiles are proposed in this work: (i) forcing the corner samples to ensure sampling to specified maximum values in indirect evolution times, (ii) providing an option to triangular backfill sampling schedules to promote dense/uniform tracts at the beginning of signal evolution periods, and (iii) providing an option to force the edges of nD-NUS schedules to be identical to the 1D quantiles. Quantile-directed scheduling meets the diverse needs of current NUS experimentation, but can also be used for future NUS implementations such as off-grid NUS and more. A computer program implementing these principles (a.k.a. QSched) in 1D- and 2D-NUS is available under the general public license.
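
    The quantile idea translates directly into a few lines of code: divide the cumulative weighting function into regions of equal probability and take one Nyquist grid point per region. The sketch below does this for an exponentially decaying weighting function in one dimension; the decay constant, grid size and number of samples are illustrative assumptions, and the program is not the authors' QSched implementation.

      import numpy as np

      def quantile_schedule(weights, n_samples):
          """Pick n_samples grid indices, one per region of equal cumulative probability."""
          w = np.asarray(weights, dtype=float)
          cdf = np.cumsum(w) / w.sum()
          targets = (np.arange(n_samples) + 0.5) / n_samples   # quantile midpoints
          return np.unique(np.searchsorted(cdf, targets))

      # Exponentially decaying weighting over a 256-point Nyquist grid, 64 samples kept
      grid = np.arange(256)
      weights = np.exp(-grid / 80.0)
      schedule = quantile_schedule(weights, 64)
      print(len(schedule), schedule[:10])          # samples cluster where the weight is largest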

  19. AND/OR Importance Sampling

    OpenAIRE

    Gogate, Vibhav; Dechter, Rina

    2012-01-01

    The paper introduces AND/OR importance sampling for probabilistic graphical models. In contrast to importance sampling, AND/OR importance sampling caches samples in the AND/OR space and then extracts a new sample mean from the stored samples. We prove that AND/OR importance sampling may have lower variance than importance sampling; thereby providing a theoretical justification for preferring it over importance sampling. Our empirical evaluation demonstrates that AND/OR importance sampling is ...
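
    For readers who want the baseline that the AND/OR variant builds on, the sketch below shows ordinary self-normalized importance sampling for an expectation under an unnormalized target; the target, proposal and sample count are illustrative assumptions, and the AND/OR caching itself is not reproduced.

      import numpy as np

      rng = np.random.default_rng(3)

      def unnormalized_target(x):
          """Unnormalized density: a mixture of two Gaussian bumps (true mean is 1.0)."""
          return np.exp(-0.5 * (x - 2.0) ** 2) + 0.5 * np.exp(-0.5 * (x + 1.0) ** 2)

      # Proposal: a single wide Gaussian that covers both modes
      n = 200_000
      x = rng.normal(0.0, 3.0, size=n)
      proposal_pdf = np.exp(-0.5 * (x / 3.0) ** 2) / (3.0 * np.sqrt(2.0 * np.pi))

      w = unnormalized_target(x) / proposal_pdf    # importance weights
      estimate = np.sum(w * x) / np.sum(w)         # self-normalized estimate of E[x]
      print("estimated mean:", estimate)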

  20. Sample collection and documentation

    International Nuclear Information System (INIS)

    Cullings, Harry M.; Fujita, Shoichiro; Watanabe, Tadaaki; Yamashita, Tomoaki; Tanaka, Kenichi; Endo, Satoru; Shizuma, Kiyoshi; Hoshi, Masaharu; Hasai, Hiromi

    2005-01-01

    Beginning within a few weeks after the bombings and periodically during the intervening decades, investigators in Hiroshima and Nagasaki have collected samples of materials that were in the cities at the time of the bombings. Although some early efforts were not driven by specific measurement objectives, many others were. Even some of the very earliest samples collected in 1945 were based on carefully conceived research plans and detailed specifications for samples appropriate to particular retrospective measurements, i.e., of particular residual quantities remaining from exposure to the neutrons and gamma rays from the bombs. This chapter focuses mainly on the work of groups at two institutions that have actively collaborated since the 1980s in major collection efforts and have shared samples among themselves and with other investigators: the Radiation Effects Research Foundation (RERF) and its predecessor the Atomic Bomb Casualty Commission (ABCC), and Hiroshima University. In addition, a number of others are listed, who also contributed to the literature by their collection of samples. (J.P.N.)

  1. Applying recursive numerical integration techniques for solving high dimensional integrals

    International Nuclear Information System (INIS)

    Ammon, Andreas; Genz, Alan; Hartung, Tobias; Jansen, Karl; Volmer, Julia; Leoevey, Hernan

    2016-11-01

    The error scaling for Markov-Chain Monte Carlo techniques (MCMC) with N samples behaves like 1/√(N). This scaling often makes it very time-intensive to reduce the error of computed observables, in particular for applications in lattice QCD. It is therefore highly desirable to have alternative methods at hand which show an improved error scaling. One candidate for such an alternative integration technique is the method of recursive numerical integration (RNI). The basic idea of this method is to use an efficient low-dimensional quadrature rule (usually of Gaussian type) and apply it iteratively to integrate over high-dimensional observables and Boltzmann weights. We present the application of such an algorithm to the topological rotor and the anharmonic oscillator and compare the error scaling to MCMC results. In particular, we demonstrate that the RNI technique shows an error scaling in the number of integration points m that is at least exponential.
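    To make the idea of applying a low-dimensional quadrature rule iteratively concrete, the toy sketch below evaluates a high-dimensional integral whose weight couples only neighbouring variables by repeated one-dimensional Gauss-Legendre quadratures (a transfer-matrix style recursion). It is an illustration of the general principle under an assumed toy coupling, not the authors' lattice setup.

```python
# Toy illustration of recursive numerical integration: a d-dimensional integral
# with nearest-neighbour couplings reduced to repeated 1D Gauss-Legendre rules.
import numpy as np

def recursive_integral(d, n_nodes=32, a=-4.0, b=4.0):
    x, w = np.polynomial.legendre.leggauss(n_nodes)   # nodes/weights on [-1, 1]
    x = 0.5 * (b - a) * x + 0.5 * (b + a)             # map nodes to [a, b]
    w = 0.5 * (b - a) * w
    # Assumed nearest-neighbour kernel K(x_i, x_{i+1}) on a periodic chain of length d.
    K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 - 0.25 * x[:, None] ** 4)
    T = K * w[None, :]                                 # fold the quadrature weights in
    return np.trace(np.linalg.matrix_power(T, d))      # iterate the 1D rule d times

# A 100-dimensional integral evaluated with 32 quadrature nodes per dimension.
print(recursive_integral(d=100))
```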

  2. Applying recursive numerical integration techniques for solving high dimensional integrals

    Energy Technology Data Exchange (ETDEWEB)

    Ammon, Andreas [IVU Traffic Technologies AG, Berlin (Germany); Genz, Alan [Washington State Univ., Pullman, WA (United States). Dept. of Mathematics; Hartung, Tobias [King' s College, London (United Kingdom). Dept. of Mathematics; Jansen, Karl; Volmer, Julia [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Leoevey, Hernan [Humboldt Univ. Berlin (Germany). Inst. fuer Mathematik

    2016-11-15

    The error scaling for Markov-Chain Monte Carlo techniques (MCMC) with N samples behaves like 1/√(N). This scaling often makes it very time-intensive to reduce the error of computed observables, in particular for applications in lattice QCD. It is therefore highly desirable to have alternative methods at hand which show an improved error scaling. One candidate for such an alternative integration technique is the method of recursive numerical integration (RNI). The basic idea of this method is to use an efficient low-dimensional quadrature rule (usually of Gaussian type) and apply it iteratively to integrate over high-dimensional observables and Boltzmann weights. We present the application of such an algorithm to the topological rotor and the anharmonic oscillator and compare the error scaling to MCMC results. In particular, we demonstrate that the RNI technique shows an error scaling in the number of integration points m that is at least exponential.

  3. Combining Electrochemical Sensors with Miniaturized Sample Preparation for Rapid Detection in Clinical Samples

    Science.gov (United States)

    Bunyakul, Natinan; Baeumner, Antje J.

    2015-01-01

    Clinical analyses benefit world-wide from rapid and reliable diagnostic tests. New tests are sought with greatest demand not only for new analytes, but also to reduce the costs, complexity and lengthy analysis times of current techniques. Among the myriad of possibilities available today to develop new test systems, amperometric biosensors are prominent players—best represented by the ubiquitous amperometric-based glucose sensors. Electrochemical approaches in general require little, and often only simple, hardware, are rugged, and yet provide low limits of detection. They thus offer many of the desirable attributes for point-of-care/point-of-need tests. This review focuses on investigating the important integration of sample preparation with (primarily electrochemical) biosensors. Sample clean-up requirements, miniaturized sample preparation strategies, and their potential integration with sensors will be discussed, focusing on clinical sample analyses. PMID:25558994

  4. Groundwater sampling: Chapter 5

    Science.gov (United States)

    Wang, Qingren; Munoz-Carpena, Rafael; Foster, Adam; Migliaccio, Kati W.; Li, Yuncong; Migliaccio, Kati

    2011-01-01

    About the book: As water quality becomes a leading concern for people and ecosystems worldwide, it must be properly assessed in order to protect water resources for current and future generations. Water Quality Concepts, Sampling, and Analyses supplies practical information for planning, conducting, or evaluating water quality monitoring programs. It presents the latest information and methodologies for water quality policy, regulation, monitoring, field measurement, laboratory analysis, and data analysis. The book addresses water quality issues, water quality regulatory development, monitoring and sampling techniques, best management practices, and laboratory methods related to the water quality of surface and ground waters. It also discusses basic concepts of water chemistry and hydrology related to water sampling and analysis; instrumentation; water quality data analysis; and evaluation and reporting results.

  5. INEL Sample Management Office

    International Nuclear Information System (INIS)

    Watkins, C.

    1994-01-01

    The Idaho National Engineering Laboratory (INEL) Sample Management Office (SMO) was formed as part of the EG&G Idaho Environmental Restoration Program (ERP) in June 1990. Since then, the SMO has been recognized and sought out by other prime contractors and programs at the INEL. Since December 1991, the DOE-ID Division Directors for the Environmental Restoration Division and Waste Management Division supported the expansion of the INEL ERP SMO into the INEL site-wide SMO. The INEL SMO serves as a point of contact for multiple environmental analytical chemistry and laboratory issues (e.g., capacity, capability). The SMO chemists work with project managers during planning to help develop data quality objectives, select appropriate analytical methods, identify special analytical services needs, identify a source for the services, and ensure that requirements for sampling and analysis (e.g., preservations, sample volumes) are clear and technically accurate. The SMO chemists also prepare work scope statements for the laboratories performing the analyses.

  6. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...
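    As a flavour of the general-purpose techniques such a monograph covers, the sketch below shows classic rejection (accept/reject) sampling, which draws independent samples from an unnormalized target density under a proposal envelope. It is a generic illustration with an assumed bimodal target and envelope constant, not code taken from the book.

```python
# Minimal rejection sampler: a classic general-purpose method for drawing
# independent samples from an (unnormalized) target density.
import numpy as np

rng = np.random.default_rng(1)

def rejection_sample(target, proposal_sample, proposal_pdf, M, n):
    """Draw n independent samples from `target` using accept/reject with an
    envelope satisfying M * proposal_pdf(x) >= target(x) for all x."""
    out = []
    while len(out) < n:
        x = proposal_sample()
        if rng.uniform() * M * proposal_pdf(x) <= target(x):
            out.append(x)
    return np.array(out)

# Example (assumed): a bimodal unnormalized target under a wide normal envelope.
target = lambda x: np.exp(-0.5 * (x - 2) ** 2) + 0.7 * np.exp(-0.5 * (x + 2) ** 2)
proposal_pdf = lambda x: np.exp(-0.5 * (x / 3) ** 2) / (3 * np.sqrt(2 * np.pi))
samples = rejection_sample(target, lambda: rng.normal(0, 3), proposal_pdf, M=15.0, n=5000)
print(samples.mean(), samples.std())
```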

  7. Radioactive air sampling methods

    CERN Document Server

    Maiello, Mark L

    2010-01-01

    Although the field of radioactive air sampling has matured and evolved over decades, it has lacked a single resource that assimilates technical and background information on its many facets. Edited by experts and with contributions from top practitioners and researchers, Radioactive Air Sampling Methods provides authoritative guidance on measuring airborne radioactivity from industrial, research, and nuclear power operations, as well as naturally occurring radioactivity in the environment. Designed for industrial hygienists, air quality experts, and health physicists, the book delves into the applied research advancing and transforming practice with improvements to measurement equipment, human dose modeling of inhaled radioactivity, and radiation safety regulations. To present a wide picture of the field, it covers the international and national standards that guide the quality of air sampling measurements and equipment. It discusses emergency response issues, including radioactive fallout and the assets used ...

  8. Integrated care.

    Science.gov (United States)

    Warwick-Giles, Lynsey; Checkland, Kath

    2018-03-19

    Purpose: The purpose of this paper is to try and understand how several organisations in one area in England are working together to develop an integrated care programme. Weick's (1995) concept of sensemaking is used as a lens to examine how the organisations are working collaboratively and maintaining the programme. Design/methodology/approach: Qualitative methods included: non-participant observations of meetings, interviews with key stakeholders and the collection of documents relating to the programme. These provided wider contextual information about the programme. Comprehensive field notes were taken during observations and analysed alongside interview transcriptions using NVIVO software. Findings: This paper illustrates the importance of the construction of a shared identity across all organisations involved in the programme. Furthermore, the wider policy discourse impacted on how the programme developed and influenced how organisations worked together. Originality/value: The role of leaders from all organisations involved in the programme was of significance to the overall development of the programme and the sustained momentum behind the programme. Leaders were able to generate a "narrative of success" to drive the programme forward. This is of particular relevance to evaluators, highlighting the importance of using multiple methods to allow researchers to probe beneath the surface of programmes to ensure that evidence moves beyond this public narrative.

  9. Analysis of monazite samples

    International Nuclear Information System (INIS)

    Kartiwa Sumadi; Yayah Rohayati

    1996-01-01

    The 'monazit' analytical program has been set up for routine rare earth element analysis in monazite and xenotime mineral samples. The total relative error of the analysis is very low, less than 2.50%, and the reproducibility of the counting statistics and the stability of the instrument were excellent. The precision and accuracy of the analytical program are very good, with maximum relative percentages of 5.22% and 1.61%, respectively. The mineral compositions of the 30 monazite samples have also been calculated using their chemical constituents, and the results were compared to the grain counting microscopic analysis.

  10. How to Deal with FFT Sampling Influences on ADEV Calculations

    National Research Council Canada - National Science Library

    Chang, Po-Cheng

    2007-01-01

    ...) values while the numerical integration is used for the time and frequency (T&F) conversion. These ADEV errors occur because parts of the FFT sampling have no contributions to the ADEV calculation...

  11. Volume Ray Casting with Peak Finding and Differential Sampling

    KAUST Repository

    Knoll, A.; Hijazi, Y.; Westerteiger, R.; Schott, M.; Hansen, C.; Hagen, H.

    2009-01-01

    classification. In this paper, we introduce a method for rendering such features by explicitly solving for isovalues within the volume rendering integral. In addition, we present a sampling strategy inspired by ray differentials that automatically matches

  12. Analysis of metal samples

    International Nuclear Information System (INIS)

    Ramirez T, J.J.; Lopez M, J.; Sandoval J, A.R.; Villasenor S, P.; Aspiazu F, J.A.

    2001-01-01

    Elemental, metallographic and phase analyses were carried out in order to determine the oxidation states of Fe contained in three metallic pieces of unknown material: a block, a plate and a cylinder. Results are presented from the elemental analysis, which was carried out at the Tandem Accelerator of ININ by proton-induced X-ray emission (PIXE). The phase analysis was carried out by X-ray diffraction, which made it possible to identify the type of alloy or alloys formed. The combined application of nuclear techniques with metallographic techniques allows the integral characterization of industrial metals. (Author)

  13. Preeminence and prerequisites of sample size calculations in clinical trials

    OpenAIRE

    Richa Singhal; Rakesh Rana

    2015-01-01

    The key components while planning a clinical study are the study design, study duration, and sample size. These features are an integral part of planning a clinical trial efficiently, ethically, and cost-effectively. This article describes some of the prerequisites for sample size calculation. It also explains that sample size calculation is different for different study designs. The article in detail describes the sample size calculation for a randomized controlled trial when the primary out...
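    A common prerequisite the article alludes to can be illustrated with the standard two-arm formula for comparing means, n per group = 2(z_{1-α/2} + z_{1-β})² σ² / Δ². The sketch below implements that generic textbook formula; the effect size, standard deviation, significance level, and power are assumed example values, not numbers from the article.

```python
# Standard sample-size formula for a two-arm trial comparing means.
from scipy.stats import norm
import math

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    z_a = norm.ppf(1 - alpha / 2)   # two-sided significance level
    z_b = norm.ppf(power)           # desired power
    return math.ceil(2 * (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2)

# Example (assumed): detect a 5-unit difference with SD 12, 5% two-sided alpha, 80% power.
print(n_per_group(delta=5.0, sigma=12.0))   # about 91 participants per group
```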

  14. Improving the Acquisition and Management of Sample Curation Data

    Science.gov (United States)

    Todd, Nancy S.; Evans, Cindy A.; Labasse, Dan

    2011-01-01

    This paper discusses the current sample documentation processes used during and after a mission, examines the challenges and special considerations needed for designing effective sample curation data systems, and looks at the results of a simulated sample return mission and the lessons learned from this simulation. In addition, it introduces a new data architecture for an integrated sample curation data system being implemented at the NASA Astromaterials Acquisition and Curation department and discusses how it improves on existing data management systems.

  15. Liquid waste sampling device

    International Nuclear Information System (INIS)

    Kosuge, Tadashi

    1998-01-01

    A liquid pumping pressure regulator is disposed on the midway of a pressure control tube which connects the upper portion of a sampling pot and the upper portion of a liquid waste storage vessel. With such a constitution, when the pressure in the sampling pot is made negative, and liquid wastes are sucked to the liquid pumping tube passing through the sampling pot, the difference between the pressure on the entrance of the liquid pumping pressure regulator of the pressure regulating tube and the pressure at the bottom of the liquid waste storage vessel is made constant. An opening degree controlling meter is disposed to control the degree of opening of a pressure regulating valve for sending actuation pressurized air to the liquid pumping pressure regulator. Accordingly, even if the liquid level of liquid wastes in the liquid waste storage vessel is changed, the height for the suction of the liquid wastes in the liquid pumping tube can be kept constant. With such procedures, sampling can be conducted correctly, and the discharge of the liquid wastes to the outside can be prevented. (T.M.)

  16. IXM gas sampling procedure

    International Nuclear Information System (INIS)

    Pingel, L.A.

    1995-01-01

    Ion Exchange Modules (IXMs) are used at the 105-KE and -KW Fuel Storage Basins to control radionuclide concentrations in the water. A potential safety concern relates to production of hydrogen gas by radiolysis of the water trapped in the ion exchange media of spent IXMs. This document provides a procedure for sampling the gases in the head space of the IXM

  17. Request for wood samples

    NARCIS (Netherlands)

    NN,

    1977-01-01

    In recent years the wood collection at the Rijksherbarium was greatly expanded following a renewed interest in wood anatomy as an aid for solving classification problems. Staff members of the Rijksherbarium added to the collection by taking interesting wood samples with them from their expeditions

  18. Check Sample Abstracts.

    Science.gov (United States)

    Alter, David; Grenache, David G; Bosler, David S; Karcher, Raymond E; Nichols, James; Rajadhyaksha, Aparna; Camelo-Piragua, Sandra; Rauch, Carol; Huddleston, Brent J; Frank, Elizabeth L; Sluss, Patrick M; Lewandrowski, Kent; Eichhorn, John H; Hall, Janet E; Rahman, Saud S; McPherson, Richard A; Kiechle, Frederick L; Hammett-Stabler, Catherine; Pierce, Kristin A; Kloehn, Erica A; Thomas, Patricia A; Walts, Ann E; Madan, Rashna; Schlesinger, Kathie; Nawgiri, Ranjana; Bhutani, Manoop; Kanber, Yonca; Abati, Andrea; Atkins, Kristen A; Farrar, Robert; Gopez, Evelyn Valencerina; Jhala, Darshana; Griffin, Sonya; Jhala, Khushboo; Jhala, Nirag; Bentz, Joel S; Emerson, Lyska; Chadwick, Barbara E; Barroeta, Julieta E; Baloch, Zubair W; Collins, Brian T; Middleton, Owen L; Davis, Gregory G; Haden-Pinneri, Kathryn; Chu, Albert Y; Keylock, Joren B; Ramoso, Robert; Thoene, Cynthia A; Stewart, Donna; Pierce, Arand; Barry, Michelle; Aljinovic, Nika; Gardner, David L; Barry, Michelle; Shields, Lisa B E; Arnold, Jack; Stewart, Donna; Martin, Erica L; Rakow, Rex J; Paddock, Christopher; Zaki, Sherif R; Prahlow, Joseph A; Stewart, Donna; Shields, Lisa B E; Rolf, Cristin M; Falzon, Andrew L; Hudacki, Rachel; Mazzella, Fermina M; Bethel, Melissa; Zarrin-Khameh, Neda; Gresik, M Vicky; Gill, Ryan; Karlon, William; Etzell, Joan; Deftos, Michael; Karlon, William J; Etzell, Joan E; Wang, Endi; Lu, Chuanyi M; Manion, Elizabeth; Rosenthal, Nancy; Wang, Endi; Lu, Chuanyi M; Tang, Patrick; Petric, Martin; Schade, Andrew E; Hall, Geraldine S; Oethinger, Margret; Hall, Geraldine; Picton, Avis R; Hoang, Linda; Imperial, Miguel Ranoa; Kibsey, Pamela; Waites, Ken; Duffy, Lynn; Hall, Geraldine S; Salangsang, Jo-Anne M; Bravo, Lulette Tricia C; Oethinger, Margaret D; Veras, Emanuela; Silva, Elvia; Vicens, Jimena; Silva, Elvio; Keylock, Joren; Hempel, James; Rushing, Elizabeth; Posligua, Lorena E; Deavers, Michael T; Nash, Jason W; Basturk, Olca; Perle, Mary Ann; Greco, Alba; Lee, Peng; Maru, Dipen; Weydert, Jamie Allen; Stevens, Todd M; Brownlee, Noel A; Kemper, April E; Williams, H James; Oliverio, Brock J; Al-Agha, Osama M; Eskue, Kyle L; Newlands, Shawn D; Eltorky, Mahmoud A; Puri, Puja K; Royer, Michael C; Rush, Walter L; Tavora, Fabio; Galvin, Jeffrey R; Franks, Teri J; Carter, James Elliot; Kahn, Andrea Graciela; Lozada Muñoz, Luis R; Houghton, Dan; Land, Kevin J; Nester, Theresa; Gildea, Jacob; Lefkowitz, Jerry; Lacount, Rachel A; Thompson, Hannis W; Refaai, Majed A; Quillen, Karen; Lopez, Ana Ortega; Goldfinger, Dennis; Muram, Talia; Thompson, Hannis

    2009-02-01

    The following abstracts are compiled from Check Sample exercises published in 2008. These peer-reviewed case studies assist laboratory professionals with continuing medical education and are developed in the areas of clinical chemistry, cytopathology, forensic pathology, hematology, microbiology, surgical pathology, and transfusion medicine. Abstracts for all exercises published in the program will appear annually in AJCP.

  19. Integrated reporting and board features

    Directory of Open Access Journals (Sweden)

    Rares HURGHIS

    2017-02-01

    In the last two decades the concept of sustainability reporting has gained importance in companies' annual reports, a trend which is also embedded in integrated reporting. Issuing an integrated report has become a necessity, because the report explains to investors how the organization creates value over time. The governance structure, more exactly the board of directors, decides whether or not the company will issue an integrated report. Thus, are there certain features of the board that might influence the issuing of an integrated report? Do the companies which issue an integrated report have certain features of the governance structure in common? Looking for an answer to these questions, we look for possible correlations between a disclosure index and the characteristics of the corporate governance structure, on a sample of companies participating in the International Integrated Reporting Council Examples Database. The results highlight that only the size of the board influences the extent to which the issued integrated report is in accordance with the International Framework.

  20. Biological Sampling Variability Study

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hutchison, Janine R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-08

    There are many sources of variability that exist in the sample collection and analysis process. This paper addresses many, but not all, sources of variability. The main focus of this paper was to better understand and estimate variability due to differences between samplers. Variability between days was also studied, as well as random variability within each sampler. Experiments were performed using multiple surface materials (ceramic and stainless steel), multiple contaminant concentrations (10 spores and 100 spores), and with and without the presence of interfering material. All testing was done with sponge sticks using 10-inch by 10-inch coupons. Bacillus atrophaeus was used as the BA surrogate. Spores were deposited using wet deposition. Grime was coated on the coupons which were planned to include the interfering material (Section 3.3). Samples were prepared and analyzed at PNNL using CDC protocol (Section 3.4) and then cultured and counted. Five samplers were trained so that samples were taken using the same protocol. Each sampler randomly sampled eight coupons each day, four coupons with 10 spores deposited and four coupons with 100 spores deposited. Each day consisted of one material being tested. The clean samples (no interfering materials) were run first, followed by the dirty samples (coated with interfering material). There was a significant difference in recovery efficiency between the coupons with 10 spores deposited (mean of 48.9%) and those with 100 spores deposited (mean of 59.8%). There was no general significant difference between the clean and dirty (containing interfering material) coupons or between the two surface materials; however, there was a significant interaction between concentration amount and presence of interfering material. The recovery efficiency was close to the same for coupons with 10 spores deposited, but for the coupons with 100 spores deposited, the recovery efficiency for the dirty samples was significantly larger (65

  1. Cerenkov fiber sampling calorimeters

    International Nuclear Information System (INIS)

    Arrington, K.; Kefford, D.; Kennedy, J.; Pisani, R.; Sanzeni, C.; Segall, K.; Wall, D.; Winn, D.R.; Carey, R.; Dye, S.; Miller, J.; Sulak, L.; Worstell, W.; Efremenko, Y.; Kamyshkov, Y.; Savin, A.; Shmakov, K.; Tarkovsky, E.

    1994-01-01

    Clear optical fibers were used as the Cerenkov sampling medium in Pb (electromagnetic) and Cu (hadron) absorbers in spaghetti calorimeters, for high rate and high radiation dose experiments, such as the forward region of high energy colliders. The fiber axes were aligned close to the direction of the incident particles (1-7 degrees). The 7 λ deep hadron tower contained 2.8% by volume 1.5 mm diameter core clear plastic fibers. The 27 radiation length deep electromagnetic towers had packing fractions of 6.8% and 7.2% of 1 mm diameter core quartz fibers as the active Cerenkov sampling medium. The energy resolution on electrons and pions, energy response, pulse shapes and angular studies are presented.

  2. Model Identification of Integrated ARMA Processes

    Science.gov (United States)

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…

  3. Underwater Sediment Sampling Research

    Science.gov (United States)

    2017-01-01

    impacted sediments was found to be directly related to the concentration of crude oil detected in the sediment pore waters. Applying this mathematical... The USCG R&D Center sought to develop a bench top system to determine the amount of total...scattered. The approach here is to sample the interstitial water between the grains of sand and attempt to determine the amount of oil in and on

  4. ITOUGH2 sample problems

    International Nuclear Information System (INIS)

    Finsterle, S.

    1997-11-01

    This report contains a collection of ITOUGH2 sample problems. It complements the ITOUGH2 User's Guide [Finsterle, 1997a], and the ITOUGH2 Command Reference [Finsterle, 1997b]. ITOUGH2 is a program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis. It is based on the TOUGH2 simulator for non-isothermal multiphase flow in fractured and porous media [Pruess, 1987, 1991a]. The report ITOUGH2 User's Guide [Finsterle, 1997a] describes the inverse modeling framework and provides the theoretical background. The report ITOUGH2 Command Reference [Finsterle, 1997b] contains the syntax of all ITOUGH2 commands. This report describes a variety of sample problems solved by ITOUGH2. Table 1.1 contains a short description of the seven sample problems discussed in this report. The TOUGH2 equation-of-state (EOS) module that needs to be linked to ITOUGH2 is also indicated. Each sample problem focuses on a few selected issues shown in Table 1.2. ITOUGH2 input features and the usage of program options are described. Furthermore, interpretations of selected inverse modeling results are given. Problem 1 is a multipart tutorial, describing basic ITOUGH2 input files for the main ITOUGH2 application modes; no interpretation of results is given. Problem 2 focuses on non-uniqueness, residual analysis, and correlation structure. Problem 3 illustrates a variety of parameter and observation types, and describes parameter selection strategies. Problem 4 compares the performance of minimization algorithms and discusses model identification. Problem 5 explains how to set up a combined inversion of steady-state and transient data. Problem 6 provides a detailed residual and error analysis. Finally, Problem 7 illustrates how the estimation of model-related parameters may help compensate for errors in that model.

  5. Lunar sample studies

    International Nuclear Information System (INIS)

    1977-01-01

    Lunar samples discussed and the nature of their analyses are: (1) an Apollo 15 breccia which is thoroughly analyzed as to the nature of the mature regolith from which it derived and the time and nature of the lithification process, (2) two Apollo 11 and one Apollo 12 basalts analyzed in terms of chemistry, Cross-Iddings-Pirsson-Washington norms, mineralogy, and petrography, (3) eight Apollo 17 mare basalts, also analyzed in terms of chemistry, Cross-Iddings-Pirsson-Washington norms, mineralogy, and petrography. The first seven are shown to be chemically similar although of two main textural groups; the eighth is seen to be distinct in both chemistry and mineralogy, (4) a troctolitic clast from a Fra Mauro breccia, analyzed and contrasted with other high-temperature lunar mineral assemblages. Two basaltic clasts from the same breccia are shown to have affinities with rock 14053, and (5) the uranium-thorium-lead systematics of three Apollo 16 samples are determined; serious terrestrial-lead contamination of the first two samples is attributed to bandsaw cutting in the lunar curatorial facility

  6. Sustainable Mars Sample Return

    Science.gov (United States)

    Alston, Christie; Hancock, Sean; Laub, Joshua; Perry, Christopher; Ash, Robert

    2011-01-01

    The proposed Mars sample return mission will be completed using natural Martian resources for the majority of its operations. The system uses the following technologies: In-Situ Propellant Production (ISPP), a methane-oxygen propelled Mars Ascent Vehicle (MAV), a carbon dioxide powered hopper, and a hydrogen fueled balloon system (large balloons and small weather balloons). The ISPP system will produce the hydrogen, methane, and oxygen using a Sabatier reactor, a water electrolysis cell, water extracted from the Martian surface, and carbon dioxide extracted from the Martian atmosphere. Indigenous hydrogen will fuel the balloon systems and locally-derived methane and oxygen will fuel the MAV for the return of a 50 kg sample to Earth. The ISPP system will have a production cycle of 800 days and the estimated overall mission length is 1355 days from Earth departure to return to low Earth orbit. Combining these advanced technologies will enable the proposed sample return mission to be executed with reduced initial launch mass and thus be more cost efficient. The successful completion of this mission will serve as the next step in the advancement of Mars exploration technology.

  7. Bottom sample taker

    Energy Technology Data Exchange (ETDEWEB)

    Garbarenko, O V; Slonimskiy, L D

    1982-01-01

    In order to improve the quality of the samples taken during offshore exploration from benthic sediments, the proposed design of the sample taker has a device which makes it possible to regulate the depth of submersion of the core lifter. For this purpose the upper part of the core lifter has an inner delimiting ring, and within the core lifter there is a piston suspended on a cable. The position of the piston in relation to the core lifter is previously assigned depending on the compactness of the benthic sediments and is fixed by tension of the cable which is held by a clamp in the cover of the core taker housing. When lowered to the bottom, the core taker is released, and under the influence of hydrostatic pressure of sea water, it enters the sediments. The magnitude of penetration is limited by the distance between the piston and the stopping ring. The piston also guarantees better preservation of the sample when the instrument is lifted to the surface.

  8. Sample-taking apparatus

    Energy Technology Data Exchange (ETDEWEB)

    Tanov, Y I; Ismailov, R A; Orazov, A

    1980-10-07

    The invention refers to equipment for testing water-bearing levels in loose rocks. Its purpose is to remove, simultaneously with the rock sample, a separate fluid sample from the assigned interval. The sample-taking apparatus contains a core lifter which can be submerged into the casing string, with a housing and a front endpiece in the form of a rod with a piston which covers the cavity of the core lifter, as well as a mechanism for fixing and moving the endpiece within the core lifter cavity. The device differs from known similar devices in that the upper part of the housing of the core lifter is equipped with a filter and a mobile casing which covers the filter. In this case the casing is connected to the endpiece rod, and the endpiece is installed so that its movement is limited by fixing in the upper position; in the extreme upper position it divides the core lifter cavity into two parts, a filter settling tank and a core-receiving cavity.

  9. Visualizing the Sample Standard Deviation

    Science.gov (United States)

    Sarkar, Jyotirmoy; Rashid, Mamunur

    2017-01-01

    The standard deviation (SD) of a random sample is defined as the square-root of the sample variance, which is the "mean" squared deviation of the sample observations from the sample mean. Here, we interpret the sample SD as the square-root of twice the mean square of all pairwise half deviations between any two sample observations. This…
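    The interpretation described above is easy to check numerically: the sample SD (with the n-1 divisor) equals the square root of twice the mean of the squared pairwise half-deviations. The short sketch below verifies this identity on assumed random data.

```python
# Quick numerical check: sample SD equals the square root of twice the mean
# of the squared pairwise half-deviations (x_i - x_j)/2 over all distinct pairs.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(42)
x = rng.normal(10, 3, size=50)            # assumed example data

half_dev_sq = [((a - b) / 2) ** 2 for a, b in combinations(x, 2)]
sd_pairwise = np.sqrt(2 * np.mean(half_dev_sq))
sd_classic = np.std(x, ddof=1)            # usual n-1 definition

print(sd_pairwise, sd_classic)            # the two values agree
```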

  10. Shared or Integrated: Which Type of Integration is More Effective Improves Students’ Creativity?

    Science.gov (United States)

    Mariyam, M.; Kaniawati, I.; Sriyati, S.

    2017-09-01

    Integrated science learning has various types of integration. This study aims to apply the shared and integrated types of integration with the project-based learning (PjBL) model to improve students' creativity on a waste recycling theme. The research method used is a quasi-experiment with a matching-only pretest-posttest design. The sample consists of 108 students: 36 students (experiment class 1), 35 students (experiment class 2) and 37 students (control class 3) at a junior high school in Tanggamus, Lampung. The results show that there is a difference in creativity improvement between the classes taught with the PjBL model with the shared type of integration, with the integrated type of integration, and without any integration in the waste recycling theme. The class taught with the PjBL model with the shared type of integration showed higher creativity improvement than those taught with the integrated type of integration and without any integration. Integrated science learning using the shared type combines only two lessons, so an intact concept results. Thus, the PjBL model with the shared type of integration improves students' creativity more effectively than the integrated type.

  11. High priority tank sampling and analysis report

    International Nuclear Information System (INIS)

    Brown, T.M.

    1998-01-01

    or grab sampled and used. A total of condensed phase samples from 144 tanks were used. Vapor samples for 82 of the tanks were used to address questions needing vapor analysis results. Additional High Priority and other tanks used to address specific questions provided comparable information to that expected from the original plan. Simultaneously, a robust systems integrated approach for establishing near term sampling requirements has been established as part of the Tank Waste Remediation System's culture. No further sampling and analysis will be conducted for the sole purpose of addressing the 12 questions in the Implementation Plan. Characterization sampling and analysis will continue in support of other requirements and decision making as identified through application of the systems integrated approach

  12. Downsampling Non-Uniformly Sampled Data

    Directory of Open Access Journals (Sweden)

    Fredrik Gustafsson

    2007-10-01

    Decimating a uniformly sampled signal by a factor D involves low-pass antialias filtering with normalized cutoff frequency 1/D followed by picking out every Dth sample. Alternatively, decimation can be done in the frequency domain using the fast Fourier transform (FFT) algorithm, after zero-padding the signal and truncating the FFT. We outline three approaches to decimate non-uniformly sampled signals, which are all based on interpolation. The interpolation is done in different domains, and the inter-sample behavior does not need to be known. The first one interpolates the signal to a uniform sampling, after which standard decimation can be applied. The second one interpolates a continuous-time convolution integral, which implements the antialias filter, after which every Dth sample can be picked out. The third frequency domain approach computes an approximate Fourier transform, after which truncation and IFFT give the desired result. Simulations indicate that the second approach is particularly useful. A thorough analysis is therefore performed for this case, using the assumption that the non-uniformly distributed sampling instants are generated by a stochastic process.
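    A minimal sketch of the first approach described above (interpolate onto a uniform grid, then apply standard decimation) is given below; the test signal, sampling instants and decimation factor D are assumptions chosen for illustration, not the authors' simulation setup.

```python
# Sketch of the first approach: interpolate the non-uniformly sampled signal
# onto a uniform grid, then apply standard decimation (antialias filter + keep
# every D-th sample).
import numpy as np
from scipy.signal import decimate

rng = np.random.default_rng(0)

# Assumed non-uniform sampling instants on [0, 10) and a slow sinusoid observed there.
t_nu = np.sort(rng.uniform(0.0, 10.0, size=2000))
y_nu = np.sin(2 * np.pi * 0.5 * t_nu)

# Step 1: linear interpolation onto a uniform grid (inter-sample behaviour assumed linear).
t_u = np.linspace(0.0, 10.0, 2000)
y_u = np.interp(t_u, t_nu, y_nu)

# Step 2: standard decimation by D = 4 (includes the antialias low-pass filter).
D = 4
y_dec = decimate(y_u, D, ftype="fir", zero_phase=True)
t_dec = t_u[::D][: len(y_dec)]
print(len(y_nu), "->", len(y_dec), "samples")
```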

  13. Thermal probe design for Europa sample acquisition

    Science.gov (United States)

    Horne, Mera F.

    2018-01-01

    The planned lander missions to the surface of Europa will access samples from the subsurface of the ice in a search for signs of life. A small thermal drill (probe) is proposed to meet the sample requirement of the Science Definition Team's (SDT) report for the Europa mission. The probe is 2 cm in diameter and 16 cm in length and is designed to access the subsurface to 10 cm deep and to collect five ice samples of 7 cm3 each, approximately. The energy required to penetrate the top 10 cm of ice in a vacuum is 26 Wh, approximately, and to melt 7 cm3 of ice is 1.2 Wh, approximately. The requirement stated in the SDT report of collecting samples from five different sites can be accommodated with repeated use of the same thermal drill. For smaller sample sizes, a smaller probe of 1.0 cm in diameter with the same length of 16 cm could be utilized that would require approximately 6.4 Wh to penetrate the top 10 cm of ice, and 0.02 Wh to collect 0.1 g of sample. The thermal drill has the advantage of simplicity of design and operations and the ability to penetrate ice over a range of densities and hardness while maintaining sample integrity.
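    The quoted melting energy can be roughly reproduced with a back-of-the-envelope calculation, sketched below under assumed ice properties and an assumed starting temperature of about 100 K (typical of Europa's surface); these values are our assumptions and are not taken from the paper.

```python
# Rough check of the ~1.2 Wh figure quoted above for melting 7 cm^3 of ice.
rho_ice = 0.92          # g/cm^3, assumed ice density
cp_ice = 2.0            # J/(g*K), assumed average specific heat of cold ice
latent_heat = 334.0     # J/g, heat of fusion
volume = 7.0            # cm^3 sample
t_start, t_melt = 100.0, 273.0   # K, assumed start and melting temperatures

mass = rho_ice * volume                                      # ~6.4 g
energy_j = mass * (cp_ice * (t_melt - t_start) + latent_heat)
energy_wh = energy_j / 3600.0
print(f"{energy_wh:.2f} Wh")                                 # ~1.2 Wh, consistent with the text
```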

  14. Definite Integrals using Orthogonality and Integral Transforms

    Directory of Open Access Journals (Sweden)

    Howard S. Cohl

    2012-10-01

    We obtain definite integrals for products of associated Legendre functions with Bessel functions, associated Legendre functions, and Chebyshev polynomials of the first kind using orthogonality and integral transforms.

  15. NID Copper Sample Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kouzes, Richard T.; Zhu, Zihua

    2011-09-12

    The current focal point of the nuclear physics program at PNNL is the MAJORANA DEMONSTRATOR, and the follow-on Tonne-Scale experiment, a large array of ultra-low background high-purity germanium detectors, enriched in 76Ge, designed to search for zero-neutrino double-beta decay (0νββ). This experiment requires the use of germanium isotopically enriched in 76Ge. The MAJORANA DEMONSTRATOR is a DOE and NSF funded project with a major science impact. The DEMONSTRATOR will utilize 76Ge from Russia, but for the Tonne-Scale experiment it is hoped that an alternate technology, possibly one under development at Nonlinear Ion Dynamics (NID), will be a viable, US-based, lower-cost source of separated material. Samples of separated material from NID require analysis to determine the isotopic distribution and impurities. DOE is funding NID through an SBIR grant for development of their separation technology for application to the Tonne-Scale experiment. The Environmental Molecular Sciences facility (EMSL), a DOE user facility at PNNL, has the required mass spectroscopy instruments for making isotopic measurements that are essential to the quality assurance for the MAJORANA DEMONSTRATOR and for the development of the future separation technology required for the Tonne-Scale experiment. A sample of isotopically separated copper was provided by NID to PNNL in January 2011 for isotopic analysis as a test of the NID technology. The results of that analysis are reported here. A second sample of isotopically separated copper was provided by NID to PNNL in August 2011 for isotopic analysis as a test of the NID technology. The results of that analysis are also reported here.

  16. Samples and Sampling Protocols for Scientific Investigations | Joel ...

    African Journals Online (AJOL)

    ... from sampling, through sample preparation, calibration to final measurement and reporting. This paper, therefore offers useful information on practical guidance on sampling protocols in line with best practice and international standards. Keywords: Sampling, sampling protocols, chain of custody, analysis, documentation ...

  17. Systematic Sampling and Cluster Sampling of Packet Delays

    OpenAIRE

    Lindh, Thomas

    2006-01-01

    Based on experiences of a traffic flow performance meter this paper suggests and evaluates cluster sampling and systematic sampling as methods to estimate average packet delays. Systematic sampling facilitates for example time analysis, frequency analysis and jitter measurements. Cluster sampling with repeated trains of periodically spaced sampling units separated by random starting periods, and systematic sampling are evaluated with respect to accuracy and precision. Packet delay traces have been ...
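    A minimal numerical illustration of the two schemes compared above (systematic sampling of every k-th packet versus cluster sampling with trains at random starting positions) is sketched below; the synthetic delay trace and all sampling parameters are assumptions for demonstration only.

```python
# Compare systematic sampling and cluster sampling ("trains" of consecutive
# packets with random starts) for estimating the average packet delay of a trace.
import numpy as np

rng = np.random.default_rng(7)
delays = rng.gamma(shape=2.0, scale=5.0, size=100_000)    # assumed synthetic delay trace (ms)

def systematic_sample(x, period, start=None):
    start = rng.integers(period) if start is None else start
    return x[start::period]                                # every period-th packet

def cluster_sample(x, train_len, n_trains):
    starts = rng.integers(0, len(x) - train_len, size=n_trains)
    return np.concatenate([x[s:s + train_len] for s in starts])

sys_est = systematic_sample(delays, period=100).mean()
clu_est = cluster_sample(delays, train_len=20, n_trains=50).mean()
print(f"true {delays.mean():.2f}  systematic {sys_est:.2f}  cluster {clu_est:.2f}")
```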

  18. Integrated project delivery : The designer as integrator

    NARCIS (Netherlands)

    Wamelink, J.W.F.; Koolwijk, J.S.J.; van Doorn, A.J.

    2012-01-01

    Process innovation related to integrated project delivery is an important topic in the building industry. Studies on process innovation through the use of integrated contracts usually focus on contractors, and particularly on the possibility of forward integration into the building process. Three

  19. NID Copper Sample Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kouzes, Richard T.; Zhu, Zihua

    2011-02-01

    The current focal point of the nuclear physics program at PNNL is the MAJORANA DEMONSTRATOR, and the follow-on Tonne-Scale experiment, a large array of ultra-low background high-purity germanium detectors, enriched in 76Ge, designed to search for zero-neutrino double-beta decay (0νββ). This experiment requires the use of germanium isotopically enriched in 76Ge. The DEMONSTRATOR will utilize 76Ge from Russia, but for the Tonne-Scale experiment it is hoped that an alternate technology under development at Nonlinear Ion Dynamics (NID) will be a viable, US-based, lower-cost source of separated material. Samples of separated material from NID require analysis to determine the isotopic distribution and impurities. The MAJORANA DEMONSTRATOR is a DOE and NSF funded project with a major science impact. DOE is funding NID through an SBIR grant for development of their separation technology for application to the Tonne-Scale experiment. The Environmental Molecular Sciences facility (EMSL), a DOE user facility at PNNL, has the required mass spectroscopy instruments for making these isotopic measurements that are essential to the quality assurance for the MAJORANA DEMONSTRATOR and for the development of the future separation technology required for the Tonne-Scale experiment. A sample of isotopically separated copper was provided by NID to PNNL for isotopic analysis as a test of the NID technology. The results of that analysis are reported here.

  20. Fluid sampling tool

    Science.gov (United States)

    Garcia, A.R.; Johnston, R.G.; Martinez, R.K.

    1999-05-25

    A fluid sampling tool is described for sampling fluid from a container. The tool has a fluid collecting portion which is drilled into the container wall, thereby affixing it to the wall. The tool may have a fluid extracting section which withdraws fluid collected by the fluid collecting section. The fluid collecting section has a fluted shank with an end configured to drill a hole into a container wall. The shank has a threaded portion for tapping the borehole. The shank is threadably engaged to a cylindrical housing having an inner axial passageway sealed at one end by a septum. A flexible member having a cylindrical portion and a bulbous portion is provided. The housing can be slid into an inner axial passageway in the cylindrical portion and sealed to the flexible member. The bulbous portion has an outer lip defining an opening. The housing is clamped into the chuck of a drill, the lip of the bulbous section is pressed against a container wall until the shank touches the wall, and the user operates the drill. Wall shavings (kerf) are confined in a chamber formed in the bulbous section as it folds when the shank advances inside the container. After sufficient advancement of the shank, an o-ring makes a seal with the container wall. 6 figs.

  1. NID Copper Sample Analysis

    International Nuclear Information System (INIS)

    Kouzes, Richard T.; Zhu, Zihua

    2011-01-01

    The current focal point of the nuclear physics program at PNNL is the MAJORANA DEMONSTRATOR, and the follow-on Tonne-Scale experiment, a large array of ultra-low background high-purity germanium detectors, enriched in 76Ge, designed to search for zero-neutrino double-beta decay (0νββ). This experiment requires the use of germanium isotopically enriched in 76Ge. The DEMONSTRATOR will utilize 76Ge from Russia, but for the Tonne-Scale experiment it is hoped that an alternate technology under development at Nonlinear Ion Dynamics (NID) will be a viable, US-based, lower-cost source of separated material. Samples of separated material from NID require analysis to determine the isotopic distribution and impurities. The MAJORANA DEMONSTRATOR is a DOE and NSF funded project with a major science impact. DOE is funding NID through an SBIR grant for development of their separation technology for application to the Tonne-Scale experiment. The Environmental Molecular Sciences facility (EMSL), a DOE user facility at PNNL, has the required mass spectroscopy instruments for making these isotopic measurements that are essential to the quality assurance for the MAJORANA DEMONSTRATOR and for the development of the future separation technology required for the Tonne-Scale experiment. A sample of isotopically separated copper was provided by NID to PNNL for isotopic analysis as a test of the NID technology. The results of that analysis are reported here.

  2. Fluid sampling tool

    Science.gov (United States)

    Garcia, Anthony R.; Johnston, Roger G.; Martinez, Ronald K.

    1999-05-25

    A fluid sampling tool for sampling fluid from a container. The tool has a fluid collecting portion which is drilled into the container wall, thereby affixing it to the wall. The tool may have a fluid extracting section which withdraws fluid collected by the fluid collecting section. The fluid collecting section has a fluted shank with an end configured to drill a hole into a container wall. The shank has a threaded portion for tapping the borehole. The shank is threadably engaged to a cylindrical housing having an inner axial passageway sealed at one end by a septum. A flexible member having a cylindrical portion and a bulbous portion is provided. The housing can be slid into an inner axial passageway in the cylindrical portion and sealed to the flexible member. The bulbous portion has an outer lip defining an opening. The housing is clamped into the chuck of a drill, the lip of the bulbous section is pressed against a container wall until the shank touches the wall, and the user operates the drill. Wall shavings (kerf) are confined in a chamber formed in the bulbous section as it folds when the shank advances inside the container. After sufficient advancement of the shank, an o-ring makes a seal with the container wall.

  3. Tritium sampling and measurement

    International Nuclear Information System (INIS)

    Wood, M.J.; McElroy, R.G.; Surette, R.A.; Brown, R.M.

    1993-01-01

    Current methods for sampling and measuring tritium are described. Although the basic techniques have not changed significantly over the last 10 y, there have been several notable improvements in tritium measurement instrumentation. The design and quality of commercial ion-chamber-based and gas-flow-proportional-counter-based tritium monitors for tritium-in-air have improved, an indirect result of fusion-related research in the 1980s. For tritium-in-water analysis, commercial low-level liquid scintillation spectrometers capable of detecting tritium-in-water concentrations as low as 0.65 Bq L-1 for counting times of 500 min are available. The most sensitive method for tritium-in-water analysis is still 3He mass spectrometry. Concentrations as low as 0.35 mBq L-1 can be detected with current equipment. Passive tritium-oxide-in-air samplers are now being used for workplace monitoring and even in some environmental sampling applications. The reliability, convenience, and low cost of passive tritium-oxide-in-air samplers make them attractive options for many monitoring applications. Airflow proportional counters currently under development look promising for measuring tritium-in-air in the presence of high gamma and/or noble gas backgrounds. However, these detectors are currently limited by their poor performance in humidities over 30%. 133 refs

  4. The iPSYCH2012 case-cohort sample

    DEFF Research Database (Denmark)

    Pedersen, C B; Bybjerg-Grauholm, J; Pedersen, M G

    2018-01-01

    The Integrative Psychiatric Research (iPSYCH) consortium has established a large Danish population-based Case-Cohort sample (iPSYCH2012) aimed at unravelling the genetic and environmental architecture of severe mental disorders. The iPSYCH2012 sample is nested within the entire Danish population...

  5. GNS Castor V/21 Headspace Gas Sampling 2014

    Energy Technology Data Exchange (ETDEWEB)

    Winston, Philip Lon [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-01-01

    Prior to performing an internal visual inspection, samples of the headspace gas of the GNS Castor V/21 cask were taken on June 12, 2014. These samples were taken in support of the CREIPI/Japanese nuclear industry effort to validate fuel integrity without visual inspection by measuring the 85Kr content of the cask headspace

  6. Integrated stationary Ornstein-Uhlenbeck process, and double integral processes

    Science.gov (United States)

    Abundo, Mario; Pirozzi, Enrica

    2018-03-01

    We find a representation of the integral of the stationary Ornstein-Uhlenbeck (ISOU) process in terms of Brownian motion B_t; moreover, we show that, under certain conditions on the functions f and g, the double integral process (DIP) D(t) = ∫_β^t g(s) (∫_α^s f(u) dB_u) ds can be thought of as the integral of a suitable Gauss-Markov process. Some theoretical and application details are given; among them we provide a simulation formula based on that representation by which sample paths, probability densities and first passage times of the ISOU process are obtained; the first-passage times of the DIP are also studied.
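    One elementary way to obtain sample paths of the ISOU process is direct discretization, sketched below with an Euler-Maruyama step for the OU process and a trapezoidal rule for its time integral. This is a generic illustration, not the representation-based simulation formula of the paper; the parameters theta, mu, sigma and the time grid are assumptions.

```python
# Euler-Maruyama simulation of a stationary Ornstein-Uhlenbeck process and of
# its time integral (ISOU), by direct discretization.
import numpy as np

rng = np.random.default_rng(3)
theta, mu, sigma = 1.0, 0.0, 0.5          # assumed mean-reversion rate, mean, noise intensity
T, n = 10.0, 10_000
dt = T / n

x = np.empty(n + 1)
x[0] = rng.normal(mu, sigma / np.sqrt(2 * theta))     # draw X_0 from the stationary law
dB = rng.normal(0.0, np.sqrt(dt), size=n)             # Brownian increments
for k in range(n):
    x[k + 1] = x[k] + theta * (mu - x[k]) * dt + sigma * dB[k]

# Trapezoidal approximation of the time integral of the OU path.
isou = np.concatenate([[0.0], np.cumsum(0.5 * (x[1:] + x[:-1]) * dt)])
print(x[-1], isou[-1])
```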

  7. Integrated proteomic and genomic analysis of colorectal cancer

    Science.gov (United States)

    Investigators who analyzed 95 human colorectal tumor samples have determined how gene alterations identified in previous analyses of the same samples are expressed at the protein level. The integration of proteomic and genomic data, or proteogenomics, pro

  8. Sampling and examination methods used for TMI-2 samples

    International Nuclear Information System (INIS)

    Marley, A.W.; Akers, D.W.; McIsaac, C.V.

    1988-01-01

    The purpose of this paper is to summarize the sampling and examination techniques that were used in the collection and analysis of TMI-2 samples. Samples ranging from auxiliary building air to core debris were collected and analyzed. Handling of the larger samples and many of the smaller samples had to be done remotely and many standard laboratory analytical techniques were modified to accommodate the extremely high radiation fields associated with these samples. The TMI-2 samples presented unique problems with sampling and the laboratory analysis of prior molten fuel debris. 14 refs., 8 figs

  9. The demagnetizing factors for the rectangular samples

    International Nuclear Information System (INIS)

    Akishin, P.G.; Gaganov, I.A.

    1990-01-01

    The influence of the demagnetization effect on the distribution of internal magnetic fields for finite samples is considered. The boundary integral method is used to compute the space distribution of the magnetic field in rectangular samples. On the basis of these calculations we compute the distribution of demagnetization factors in the sample for a μSR experimental set-up with the real field geometry. The corresponding mathematical expectation and dispersion of this distribution are estimated. The results of the calculation are used in the analysis of the μSR data obtained for high-Tc superconductors. It is shown for these compounds that the correction to the penetration depth related to the broadening of the field distribution is not more than 5%. 8 refs.; 2 figs.; 1 tab.

  10. Characterization of Volatiles Loss from Soil Samples at Lunar Environments

    Science.gov (United States)

    Kleinhenz, Julie; Smith, Jim; Roush, Ted; Colaprete, Anthony; Zacny, Kris; Paulsen, Gale; Wang, Alex; Paz, Aaron

    2017-01-01

    Resource Prospector (RP) Integrated Thermal Vacuum Test Program: a series of ground-based dirty thermal vacuum tests is being conducted to better understand subsurface sampling operations for RP, covering volatiles loss during sampling operations, hardware performance, sample removal and transfer, concept of operations, and instrumentation. Five test campaigns over 5 years have been conducted with RP hardware, with advancing hardware designs and additional RP subsystems. Volatiles sampling (4 years): using flight-forward regolith sampling hardware, empirically determine volatile retention at lunar-relevant conditions, use the data to improve theoretical predictions, determine the driving variables for retention, and bound the water loss potential to define measurement uncertainties. The main goal of this talk is to introduce our approach to characterizing volatiles loss for RP: introduce the facility and its capabilities, give an overview of the RP hardware used in integrated testing (most recent iteration), summarize the test variables used thus far, and review a sample of the results.

  11. Pro Spring Integration

    CERN Document Server

    Lui, M; Chan, Andy; Long, Josh

    2011-01-01

    Pro Spring Integration is an authoritative book from the experts that guides you through the vast world of enterprise application integration (EAI) and application of the Spring Integration framework towards solving integration problems. The book is: an introduction to the concepts of enterprise application integration; a reference on building event-driven applications using Spring Integration; a guide to solving common integration problems using Spring Integration. What makes this book unique is its coverage of contemporary technologies and real-world information, with a focus on common p

  12. Tactical Systems Integration Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Tactical Systems Integration Laboratory is used to design and integrate computer hardware and software and related electronic subsystems for tactical vehicles....

  13. On sampling social networking services

    OpenAIRE

    Wang, Baiyang

    2012-01-01

    This article aims at summarizing the existing methods for sampling social networking services and proposing a faster confidence interval for related sampling methods. It also includes comparisons of common network sampling techniques.

  14. Coupling methods for multistage sampling

    OpenAIRE

    Chauvet, Guillaume

    2015-01-01

    Multistage sampling is commonly used for household surveys when there exists no sampling frame, or when the population is scattered over a wide area. Multistage sampling usually introduces a complex dependence in the selection of the final units, which makes asymptotic results quite difficult to prove. In this work, we consider multistage sampling with simple random sampling without replacement at the first stage, and with an arbitrary sampling design for further stages. We consider coupling ...
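    A minimal two-stage example in the spirit of the setting described above is sketched below: simple random sampling without replacement (SRSWOR) of primary sampling units, then SRSWOR within each selected unit, with the usual expansion estimator of a population total. The population and sample sizes are illustrative assumptions, not values from the paper.

```python
# Two-stage sampling sketch: SRSWOR of primary sampling units (PSUs), then
# SRSWOR of elements within each selected PSU, with an expansion estimator.
import numpy as np

rng = np.random.default_rng(11)

# Assumed population: 200 PSUs (e.g. villages) of 50 households each, with some value y.
psus = [rng.gamma(2.0, 10.0, size=50) for _ in range(200)]

n_psu, m = 20, 10                                     # first- and second-stage sample sizes
sampled_psus = rng.choice(len(psus), size=n_psu, replace=False)

total_hat = 0.0
for i in sampled_psus:
    unit = psus[i]
    second = rng.choice(unit, size=m, replace=False)  # SRSWOR within the PSU
    psu_total_hat = len(unit) * second.mean()         # expand within the PSU
    total_hat += psu_total_hat / (n_psu / len(psus))  # expand over PSUs (inclusion prob. n/N)

true_total = sum(u.sum() for u in psus)
print(f"estimate {total_hat:.0f}  true {true_total:.0f}")
```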

  15. A method of language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees

    1993-01-01

    In recent years more attention is paid to the quality of language samples in typological work. Without an adequate sampling strategy, samples may suffer from various kinds of bias. In this article we propose a sampling method in which the genetic criterion is taken as the most important: samples...... to determine how many languages from each phylum should be selected, given any required sample size....

  16. Gated integrator with signal baseline subtraction

    Energy Technology Data Exchange (ETDEWEB)

    Wang, X.

    1996-12-17

    An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window. 5 figs.

  17. Gated integrator with signal baseline subtraction

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xucheng (Lisle, IL)

    1996-01-01

    An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window.

  18. Enhanced sampling algorithms.

    Science.gov (United States)

    Mitsutake, Ayori; Mori, Yoshiharu; Okamoto, Yuko

    2013-01-01

    In biomolecular systems (especially all-atom models) with many degrees of freedom such as proteins and nucleic acids, there exist an astronomically large number of local-minimum-energy states. Conventional simulations in the canonical ensemble are of little use, because they tend to get trapped in states of these energy local minima. Enhanced conformational sampling techniques are thus in great demand. A simulation in generalized ensemble performs a random walk in potential energy space and can overcome this difficulty. From only one simulation run, one can obtain canonical-ensemble averages of physical quantities as functions of temperature by the single-histogram and/or multiple-histogram reweighting techniques. In this article we review uses of the generalized-ensemble algorithms in biomolecular systems. Three well-known methods, namely, multicanonical algorithm, simulated tempering, and replica-exchange method, are described first. Both Monte Carlo and molecular dynamics versions of the algorithms are given. We then present various extensions of these three generalized-ensemble algorithms. The effectiveness of the methods is tested with short peptide and protein systems.
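
    As a rough illustration of the replica-exchange idea summarized above, the following minimal sketch (not taken from the record) runs Metropolis replicas of a toy one-dimensional double-well potential at several temperatures and occasionally swaps neighbouring replicas; the potential, temperature ladder and step size are assumptions chosen only for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    """Toy double-well potential standing in for a rugged biomolecular landscape."""
    return (x**2 - 1.0)**2

def metropolis_step(x, beta, step=0.5):
    """One Metropolis move at inverse temperature beta."""
    x_new = x + rng.uniform(-step, step)
    dE = energy(x_new) - energy(x)
    if dE <= 0 or rng.random() < np.exp(-beta * dE):
        return x_new
    return x

betas = np.array([0.2, 0.5, 1.0, 2.0, 5.0])      # one replica per inverse temperature
xs = rng.uniform(-1.5, 1.5, size=len(betas))

for sweep in range(5000):
    xs = np.array([metropolis_step(x, b) for x, b in zip(xs, betas)])
    i = rng.integers(len(betas) - 1)              # attempt a swap of neighbouring replicas
    delta = (betas[i] - betas[i + 1]) * (energy(xs[i]) - energy(xs[i + 1]))
    if rng.random() < np.exp(min(0.0, delta)):
        xs[i], xs[i + 1] = xs[i + 1], xs[i]

print("final replica positions:", np.round(xs, 2))
```

    The swap criterion is the standard parallel-tempering acceptance rule; a production code would also accumulate histograms for the reweighting step the abstract mentions.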

  19. Quantum Metropolis sampling.

    Science.gov (United States)

    Temme, K; Osborne, T J; Vollbrecht, K G; Poulin, D; Verstraete, F

    2011-03-03

    The original motivation to build a quantum computer came from Feynman, who imagined a machine capable of simulating generic quantum mechanical systems--a task that is believed to be intractable for classical computers. Such a machine could have far-reaching applications in the simulation of many-body quantum physics in condensed-matter, chemical and high-energy systems. Part of Feynman's challenge was met by Lloyd, who showed how to approximately decompose the time evolution operator of interacting quantum particles into a short sequence of elementary gates, suitable for operation on a quantum computer. However, this left open the problem of how to simulate the equilibrium and static properties of quantum systems. This requires the preparation of ground and Gibbs states on a quantum computer. For classical systems, this problem is solved by the ubiquitous Metropolis algorithm, a method that has basically acquired a monopoly on the simulation of interacting particles. Here we demonstrate how to implement a quantum version of the Metropolis algorithm. This algorithm permits sampling directly from the eigenstates of the Hamiltonian, and thus evades the sign problem present in classical simulations. A small-scale implementation of this algorithm should be achievable with today's technology.
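
    For contrast with the quantum version discussed above, the classical Metropolis algorithm that the abstract refers to can be sketched in a few lines; the one-dimensional Ising chain below is an illustrative choice of system, not something taken from the record.

```python
import numpy as np

rng = np.random.default_rng(1)

def metropolis_ising_chain(n_spins=50, beta=0.8, n_sweeps=2000):
    """Classical Metropolis sampling of a 1D Ising chain with periodic boundaries (J = 1)."""
    spins = rng.choice([-1, 1], size=n_spins)
    mags = []
    for _ in range(n_sweeps):
        for _ in range(n_spins):
            i = rng.integers(n_spins)
            # Energy change for flipping spin i.
            dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n_spins])
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i] *= -1
        mags.append(abs(spins.mean()))
    return float(np.mean(mags))

print("mean |magnetization| per spin:", round(metropolis_ising_chain(), 3))
```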

  20. [A comparison of convenience sampling and purposive sampling].

    Science.gov (United States)

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation, not by statistical power analysis.

  1. [Quality of DNA from archival pathological samples of gallbladder cancer].

    Science.gov (United States)

    Roa, Iván; de Toro, Gonzalo; Sánchez, Tamara; Slater, Jeannie; Ziegler, Anne Marie; Game, Anakaren; Arellano, Leonardo; Schalper, Kurt; de Aretxabala, Xabier

    2013-12-01

    The quality of the archival samples stored at pathology services could be a limiting factor for molecular biology studies. To determine the quality of DNA extracted from gallbladder cancer samples at different institutions, one hundred ninety-four samples coming from five medical centers in Chile were analyzed. DNA extraction was quantified by determining the genomic DNA concentration. The integrity of DNA was determined by polymerase chain reaction amplification of different length fragments of a constitutive gene (β-globin products of 110, 268 and 501 base pairs). The mean DNA concentration obtained in 194 gallbladder cancer samples was 48 ± 43.1 ng/µl. In 22% of samples, no amplification was achieved despite obtaining a mean DNA concentration of 58.3 ng/µl. In 81, 67 and 22% of samples, a DNA amplification of at least 110, 268 or 501 base pairs was obtained, respectively. No differences in DNA concentration according to the source of the samples were demonstrated. However, there were marked differences in DNA integrity among participating centers. Samples from public hospitals were of lower quality than those from private clinics. Despite some limitations, in 80% of cases the integrity of DNA in archival samples from pathology services in our country would allow the use of molecular biology techniques.

  2. Impact of Systematic Sampling on Causality in the presence of Unit Roots

    OpenAIRE

    Rajaguru GULASEKARAN

    2002-01-01

    In contrast to the stationary case, where systematic sampling preserves the direction of Granger causality, this paper shows that systematic sampling of integrated series may induce spurious causality, even if the series are used in differenced form.

  3. SAP crm integration testing

    OpenAIRE

    Černiavskaitė, Marija

    2017-01-01

    This Bachelor's thesis presents an analysis of SAP CRM and integration systems testing: an investigation of the SAP CRM and SAP PO systems, a presentation of the relationship between the systems, an introduction to a third-party (non-SAP) system – the Network Informational System (NIS) – which has an integration with SAP, a presentation of best CRM testing practices, and an analysis of and recommendations for integration testing. Practical integration testing is done in accordance with the recommendations.

  4. Pipeline integrity management

    Energy Technology Data Exchange (ETDEWEB)

    Guyt, J.; Macara, C.

    1997-12-31

    This paper focuses on some of the issues necessary for pipeline operators to consider when addressing the challenge of managing the integrity of their systems. Topics are: Definition; business justification; creation and safeguarding of technical integrity; control and deviation from technical integrity; pipelines; pipeline failure assessment; pipeline integrity assessment; leak detection; emergency response. 6 figs., 3 tabs.

  5. Physical Samples Linked Data in Action

    Science.gov (United States)

    Ji, P.; Arko, R. A.; Lehnert, K.; Bristol, S.

    2017-12-01

    Most data and metadata related to physical samples currently reside in isolated relational databases driven by diverse data models. The challenge of sharing, interchanging and integrating data from these different relational databases motivated us to publish Linked Open Data for collections of physical samples, using Semantic Web technologies including the Resource Description Framework (RDF), the RDF Query Language (SPARQL), and the Web Ontology Language (OWL). In the last few years, we have released four knowledge graphs concentrated on physical samples, including the System for Earth Sample Registration (SESAR), the USGS National Geochemical Database (NGDC), the Ocean Biogeographic Information System (OBIS), and the Earthchem Database. Currently the four knowledge graphs contain over 12 million facets (triples) about objects of interest to the geoscience domain. Choosing appropriate domain ontologies for representing the context of data is the core of the whole work. The Geolink ontology developed by the Earthcube Geolink project was used as the top level to represent common concepts like person, organization, cruise, etc. The physical sample ontology developed by the Interdisciplinary Earth Data Alliance (IEDA) and the Darwin Core vocabulary were used as the second level to describe details about geological samples and biological diversity. We also focused on finding and building the best tool chains to support the whole life cycle of publishing the linked data we have, including information retrieval, linked data browsing and data visualization. Currently, Morph, Virtuoso Server, LodView, LodLive, and YASGUI are employed for converting, storing, representing, and querying data in a knowledge base (RDF triplestore). Persistent digital identifiers are another main point we concentrated on. Open Researcher & Contributor IDs (ORCIDs), International Geo Sample Numbers (IGSNs), the Global Research Identifier Database (GRID) and other persistent identifiers were used to link different resources from various graphs with
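
    The record itself gives no code, but the general publishing pattern it describes (sample metadata as RDF triples queried with SPARQL) can be sketched with the rdflib library; the namespace, the IGSN-style identifier and the ORCID below are placeholders, not the actual IEDA or GeoLink vocabularies.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, RDFS

# Hypothetical namespace standing in for a physical-sample ontology.
EX = Namespace("http://example.org/sample#")

g = Graph()
g.bind("ex", EX)

sample = URIRef("http://example.org/igsn/ABC123")        # made-up IGSN-style identifier
g.add((sample, RDF.type, EX.PhysicalSample))
g.add((sample, RDFS.label, Literal("Basalt dredge sample")))
g.add((sample, EX.collectedBy, URIRef("https://orcid.org/0000-0000-0000-0000")))
g.add((sample, EX.material, Literal("basalt")))

# SPARQL query over the local knowledge graph.
query = """
PREFIX ex: <http://example.org/sample#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?s ?label WHERE {
    ?s a ex:PhysicalSample ;
       rdfs:label ?label .
}
"""
for row in g.query(query):
    print(row.s, row.label)
```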

  6. Integral consideration of integrated management systems

    International Nuclear Information System (INIS)

    Frauenknecht, Stefan; Schmitz, Hans

    2010-01-01

    The aim of the project for the NPPs Kruemmel and Brunsbuettel (Vattenfall) is the integral view of the business process as a basis for the implementation and operation of management systems in the domains of quality, safety and environment. The authors describe the integral view of the business processes in the frame of integrated management systems with a focus on nuclear safety, lessons learned in the past, the concept of a process-based controlling system and experiences from the practical realization.

  7. Sampling strategies for indoor radon investigations

    International Nuclear Information System (INIS)

    Prichard, H.M.

    1983-01-01

    Recent investigations prompted by concern about the environmental effects of residential energy conservation have produced many accounts of indoor radon concentrations far above background levels. In many instances time-normalized annual exposures exceeded the 4 WLM per year standard currently used for uranium mining. Further investigations of indoor radon exposures are necessary to judge the extent of the problem and to estimate the practicality of health effects studies. A number of trends can be discerned as more indoor surveys are reported. It is becoming increasingly clear that local geological factors play a major, if not dominant, role in determining the distribution of indoor radon concentrations in a given area. Within a given locale, indoor radon concentrations tend to be log-normally distributed, and sample means differ markedly from one region to another. The appreciation of geological factors and the general log-normality of radon distributions will improve the accuracy of population dose estimates and facilitate the design of preliminary health effects studies. The relative merits of grab samples, short- and long-term integrated samples, and more complicated dose assessment strategies are discussed in the context of several types of epidemiological investigations. A new passive radon sampler with a 24 hour integration time is described and evaluated as a tool for pilot investigations
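
    As a small numerical aside on the log-normality noted above (an illustration, not part of the record), the following sketch simulates a locale whose indoor radon concentrations are log-normally distributed and compares the plain sample mean with the log-normal estimate of the population mean; the geometric mean and geometric standard deviation are invented values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed log-normal parameters, for illustration only (geometric mean ~2, GSD ~2.5).
mu, sigma = np.log(2.0), np.log(2.5)
survey = rng.lognormal(mean=mu, sigma=sigma, size=50)     # e.g. 50 homes in one locale

log_c = np.log(survey)
geometric_mean = np.exp(log_c.mean())
# Estimate of the arithmetic population mean under the log-normal assumption.
lognormal_mean = np.exp(log_c.mean() + log_c.var(ddof=1) / 2.0)

print(f"sample arithmetic mean  : {survey.mean():.2f}")
print(f"geometric mean          : {geometric_mean:.2f}")
print(f"log-normal mean estimate: {lognormal_mean:.2f}")
```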

  8. Systematic sampling for suspended sediment

    Science.gov (United States)

    Robert B. Thomas

    1991-01-01

    Abstract - Because of high costs or complex logistics, scientific populations cannot be measured entirely and must be sampled. Accepted scientific practice holds that sample selection be based on statistical principles to assure objectivity when estimating totals and variances. Probability sampling--obtaining samples with known probabilities--is the only method that...

  9. Sampling Operations on Big Data

    Science.gov (United States)

    2015-11-29

    ...categories. These include edge sampling methods, where edges are selected by a predetermined criterion, and snowball sampling methods, where algorithms start... ...process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and

  10. Standard methods for sampling and sample preparation for gamma spectroscopy

    International Nuclear Information System (INIS)

    Taskaeva, M.; Taskaev, E.; Nikolov, P.

    1993-01-01

    The strategy for sampling and sample preparation is outlined: the necessary number of samples; the analysis and treatment of the results received; the quantity of the analysed material according to the radionuclide concentrations and analytical methods; and the minimal quantity and kind of data needed for making final conclusions and decisions on the basis of the results received. This strategy was tested in gamma spectroscopic analysis of radionuclide contamination of the region of the Eleshnitsa Uranium Mines. The water samples were taken and stored according to ASTM D 3370-82. The general sampling procedures were in conformity with the recommendations of ISO 5667. The radionuclides were concentrated by coprecipitation with iron hydroxide and ion exchange. The sampling of soil samples complied with the rules of ASTM C 998, and their sample preparation with ASTM C 999. After preparation the samples were sealed hermetically and measured. (author)

  11. Boat sampling technique for assessment of ageing of components

    International Nuclear Information System (INIS)

    Kumar, Kundan; Shyam, T.V.; Kayal, J.N.; Rupani, B.B.

    2006-01-01

    Boat sampling technique (BST) is a surface sampling technique, which has been developed for obtaining, in-situ, metal samples from the surface of an operating component without affecting its operating service life. The BST is non-destructive in nature and the sample is obtained without plastic deformation or without thermal degradation of the parent material. The shape and size of the sample depends upon the shape of the cutter and the surface geometry of the parent material. Miniature test specimens are generated from the sample and the specimens are subjected to various tests, viz. Metallurgical Evaluation, Metallographic Evaluation, Micro-hardness Evaluation, sensitisation test, small punch test etc. to confirm the integrity and assessment of safe operating life of the component. This paper highlights design objective of boat sampling technique, description of sampling module, sampling cutter and its performance evaluation, cutting process, boat samples, operational sequence of sampling module, qualification of sampling module, qualification of sampling technique, qualification of scooped region of the parent material, sample retrieval system, inspection, testing and examination to be carried out on the boat samples and scooped region. (author)

  12. [Types of samples: advantages and disadvantages]

    OpenAIRE

    Suprapto, Agus

    1994-01-01

    A sample is a part of a population, obtained with a sampling technique, that is used in a study for the purpose of making estimates about the nature of the total population. A sampling technique is more advantageous than a census because it can reduce cost and time, and it can gather deeper information and more accurate data. It is useful to distinguish two major types of sampling techniques. First, probability sampling, i.e. simple random sampling. Second, non-probability sampling, i.e. systematic sampling...

  13. Groundwater sampling in uranium reconnaissance

    International Nuclear Information System (INIS)

    Butz, T.R.

    1977-03-01

    The groundwater sampling program is based on the premise that ground water geochemistry reflects the chemical composition of, and geochemical processes active in the strata from which the sample is obtained. Pilot surveys have shown that wells are the best source of groundwater, although springs are sampled on occasion. The procedures followed in selecting a sampling site, the sampling itself, and the field measurements, as well as the site records made, are described

  14. The LITA Drill and Sample Delivery System

    Science.gov (United States)

    Paulsen, G.; Yoon, S.; Zacny, K.; Wettergreeng, D.; Cabrol, N. A.

    2013-12-01

    fall-back material will be augered out during auger re-insertion. The next bite will be taken only once the auger has reached the true bottom. In the bite sampling approach the stratigraphy is somewhat preserved since every time the sample is taken, it more or less represents the depth interval in the hole. There is going to be some level of cross contamination due to smearing of cuttings on the flutes against the borehole as the auger is being pulled out, or when the formation is very porous and unstable. The goal of the first drill campaign in the Atacama in May of 2012 was to demonstrate successful operation of the bite sampling method and to learn about the diversity of soils and rocks in the Atacama. In 2013, the sampling system was integrated onto the CMU Zoe rover and autonomously deployed in the Atacama. The drill penetrated various formations and delivered samples to a carousel. When the soil was very porous, poor sample recovery was observed. When the soil was dense and cohesive, sample recovery was 100% with little cross contamination. To enable greater sample recovery in loose and unstable formations, the auger diameter will be increased from the current 12.5 mm to 19 mm. Acknowledgements: The project has been funded by the NASA ASTEP program.

  15. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

    Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP) sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
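
    Conditional Poisson sampling as characterized above can also be implemented rejectively, by redrawing a Poisson sample until the realized size equals the target; the sketch below (with invented size measures, and not the list-sequential implementation mentioned in the record) illustrates that formulation.

```python
import numpy as np

rng = np.random.default_rng(3)

def conditional_poisson_sample(p, n, max_tries=100000):
    """Draw a fixed-size sample by rejecting Poisson draws whose size differs from n.

    p : working inclusion probabilities of the underlying Poisson design (the achieved
        CP inclusion probabilities differ slightly from these).
    """
    for _ in range(max_tries):
        selected = np.flatnonzero(rng.random(len(p)) < p)
        if len(selected) == n:
            return selected
    raise RuntimeError("no fixed-size sample obtained; adjust p or max_tries")

# Illustrative unequal probabilities proportional to size, summing to n = 4.
sizes = np.array([10, 20, 30, 40, 50, 60, 70, 80], dtype=float)
p = 4 * sizes / sizes.sum()
print("selected units:", conditional_poisson_sample(p, n=4))
```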

  16. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) from the study. The more the precision required, the greater is the required sample size. Sampling Techniques: The probability sampling techniques applied for health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are more recommended than the nonprobability sampling techniques, because the results of the study can be generalized to the target population.
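
    As a worked illustration of the factors listed above (not taken from the article), the sketch below computes the sample size needed to estimate a single proportion with a given confidence level and margin of error, with an optional finite population correction; the numbers in the example call are arbitrary.

```python
from math import ceil
from statistics import NormalDist

def sample_size_proportion(p_expected, margin, confidence=0.95, population=None):
    """n = z^2 * p(1-p) / d^2, optionally corrected for a finite population."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    n0 = (z**2) * p_expected * (1 - p_expected) / margin**2
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)   # finite population correction
    return ceil(n0)

# Example: expected prevalence 30%, +/-5% precision, 95% confidence, population of 2000.
print(sample_size_proportion(0.30, 0.05, confidence=0.95, population=2000))
```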

  17. Sample Transport for a European Sample Curation Facility

    Science.gov (United States)

    Berthoud, L.; Vrublevskis, J. B.; Bennett, A.; Pottage, T.; Bridges, J. C.; Holt, J. M. C.; Dirri, F.; Longobardo, A.; Palomba, E.; Russell, S.; Smith, C.

    2018-04-01

    This work has looked at the recovery of Mars Sample Return capsule once it arrives on Earth. It covers possible landing sites, planetary protection requirements, and transportation from the landing site to a European Sample Curation Facility.

  18. Sample Acquisition for Materials in Planetary Exploration (SAMPLE), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ORBITEC proposes to analyze, design, and develop a device for autonomous lunar surface/subsurface sampling and processing applications. The Sample Acquisition for...

  19. Water-borne pollutants sampling using porous suction samplers

    International Nuclear Information System (INIS)

    Baig, M.A.

    1997-01-01

    The common standard method of sampling water-borne pollutants in the vadose zone is core sampling followed by extraction of the pore fluid. This method does not allow repeated sampling at the same location at a later time. There is an alternative approach for sampling fluids (water-borne pollutants) from both saturated and unsaturated regions of the vadose zone using porous suction samplers. There are three types of porous suction samplers: vacuum-operated samplers, pressure-vacuum lysimeters, and high pressure-vacuum samplers. The suction samplers are operated in the range of 0-70 centibars and usually consist of ceramic and polytetrafluoroethylene (PTFE). The operating range of PTFE is higher than that of ceramic cups. These samplers are well suited for in situ and repeated sampling from the same location. This paper discusses the physical properties and operating conditions of such samplers to be utilized in our environmental sampling. (author)

  20. Choosing a suitable sample size in descriptive sampling

    International Nuclear Information System (INIS)

    Lee, Yong Kyun; Choi, Dong Hoon; Cha, Kyung Joon

    2010-01-01

    Descriptive sampling (DS) is an alternative to crude Monte Carlo sampling (CMCS) in finding solutions to structural reliability problems. It is known to be an effective sampling method in approximating the distribution of a random variable because it uses the deterministic selection of sample values and their random permutation. However, because this method is difficult to apply to complex simulations, the sample size is occasionally determined without thorough consideration. Input sample variability may cause the sample size to change between runs, leading to poor simulation results. This paper proposes a numerical method for choosing a suitable sample size for use in DS. Using this method, one can estimate a more accurate probability of failure in a reliability problem while running a minimal number of simulations. The method is then applied to several examples and compared with CMCS and conventional DS to validate its usefulness and efficiency
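
    The contrast between crude Monte Carlo sampling and descriptive sampling described above can be sketched as follows; the limit-state function and distributions are invented for illustration, and the quantile-permutation construction shows only the generic DS idea, not the sample-size method the paper proposes.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(4)
N = 2000
load_dist = NormalDist(mu=10, sigma=2)        # invented load distribution
res_dist = NormalDist(mu=15, sigma=1)         # invented resistance distribution

# Crude Monte Carlo sampling (CMCS): independent random draws.
load_mc = rng.normal(10, 2, N)
res_mc = rng.normal(15, 1, N)
pf_mc = np.mean(load_mc > res_mc)

# Descriptive sampling (DS): deterministic mid-point quantiles, randomly permuted per variable.
u = (np.arange(N) + 0.5) / N
load_ds = np.array([load_dist.inv_cdf(p) for p in rng.permutation(u)])
res_ds = np.array([res_dist.inv_cdf(p) for p in rng.permutation(u)])
pf_ds = np.mean(load_ds > res_ds)

# Exact failure probability: load - resistance ~ Normal(-5, sqrt(5)).
exact = 1.0 - NormalDist(mu=-5, sigma=5**0.5).cdf(0)
print(f"CMCS: {pf_mc:.4f}  DS: {pf_ds:.4f}  exact: {exact:.4f}")
```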

  1. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

    Classical Respondent-Driven Sampling (RDS) estimators are based on a Markov Process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.

  2. Biopolymers for sample collection, protection, and preservation.

    Science.gov (United States)

    Sorokulova, Iryna; Olsen, Eric; Vodyanoy, Vitaly

    2015-07-01

    One of the principal challenges in the collection of biological samples from air, water, and soil matrices is that the target agents are not stable enough to be transferred from the collection point to the laboratory of choice without experiencing significant degradation and loss of viability. At present, there is no method to transport biological samples over considerable distances safely, efficiently, and cost-effectively without the use of ice or refrigeration. Current techniques of protection and preservation of biological materials have serious drawbacks. Many known techniques of preservation cause structural damage, so that biological materials lose their structural integrity and viability. We review applications of a novel bacterial preservation process, which is nontoxic and water soluble and allows for the storage of samples without refrigeration. The method is capable of protecting the biological sample from the effects of the environment for extended periods of time and then allows for the easy release of these collected biological materials from the protective medium without structural or DNA damage. Strategies for sample collection, preservation, and shipment of bacterial and viral samples are described. The water-soluble polymer is used to immobilize the biological material by replacing the water molecules within the sample with molecules of the biopolymer. The cured polymer results in a solid protective film that is stable to many organic solvents but is quickly removed by the application of a water-based solution. The process of immobilization does not require the use of any additives, accelerators, or plasticizers and does not involve high temperature or radiation to promote polymerization.

  3. Buried waste integrated demonstration technology integration process

    International Nuclear Information System (INIS)

    Ferguson, J.S.; Ferguson, J.E.

    1992-04-01

    A Technology Integration Process was developed for the Idaho National Engineering Laboratory (INEL) Buried Waste Integrated Demonstration (BWID) Program to facilitate the transfer of technology and knowledge from industry, universities, and other Federal agencies into the BWID; to successfully transfer demonstrated technology and knowledge from the BWID to industry, universities, and other Federal agencies; and to share demonstrated technologies and knowledge between Integrated Demonstrations and other Department of Energy (DOE) programs spread throughout the DOE Complex. This document also details specific methods and tools for integrating and transferring technologies into or out of the BWID program. The document provides background on the BWID program and technology development needs, demonstrates the direction of technology transfer, illustrates current processes for this transfer, and lists points of contact for prospective participants in the BWID technology transfer efforts. The Technology Integration Process was prepared to ensure compliance with the requirements of DOE's Office of Technology Development (OTD)

  4. Power Systems Integration Laboratory | Energy Systems Integration Facility

    Science.gov (United States)

    Research in the Energy Systems Integration Facility's Power Systems Integration Laboratory focuses on microgrid applications. [Photo: engineers testing an inverter in the Power Systems Integration Laboratory.]

  5. Sampling and chemical analysis in environmental samples around Nuclear Power Plants and some environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Yong Woo; Han, Man Jung; Cho, Seong Won; Cho, Hong Jun; Oh, Hyeon Kyun; Lee, Jeong Min; Chang, Jae Sook [KORTIC, Taejon (Korea, Republic of)

    2002-12-15

    Twelve kinds of environmental samples, such as soil, seawater and underground water, were collected around Nuclear Power Plants (NPPs). Tritium chemical analysis was carried out for samples of rain water, pine needles, air, seawater, underground water, Chinese cabbage, rice grains and milk collected around the NPPs, and for surface seawater and rain water sampled over the country. Strontium was analyzed in soil samples taken at 60 district points in Korea. Tritium was analyzed in 21 samples of surface seawater around the Korean peninsula that were supplied by KFRDI (National Fisheries Research and Development Institute). Sampling and chemical analysis of environmental samples around the Kori, Woolsung, Youngkwang and Wooljin NPPs and the Taeduk science town for tritium and strontium analysis were managed according to plan. The results were handed over to KINS after all samples were analyzed.

  6. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods.

    Science.gov (United States)

    Coes, Alissa L; Paretti, Nicholas V; Foreman, William T; Iverson, Jana L; Alvarez, David A

    2014-03-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19-23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.

  7. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods

    Science.gov (United States)

    Coes, Alissa L.; Paretti, Nicholas V.; Foreman, William T.; Iverson, Jana L.; Alvarez, David A.

    2014-01-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19–23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.

  8. de Broglie Swapping Metadynamics for Quantum and Classical Sampling.

    Science.gov (United States)

    Nava, Marco; Quhe, Ruge; Palazzesi, Ferruccio; Tiwary, Pratyush; Parrinello, Michele

    2015-11-10

    This paper builds on our previous work on Path Integral Metadynamics [Ruge et al., J. Chem. Theory Comput. 2015, 11, 1383] in which we have accelerated sampling in quantum systems described by Feynman's Path Integrals using Metadynamics. We extend the scope of Path Integral Metadynamics by combining it with a replica exchange scheme in which artificially enhanced quantum effects play the same role as temperature does in parallel tempering. Our scheme can be adapted so as to be used in an ancillary way to sample systems described by classical statistical mechanics. Contrary to Metadynamics and many other sampling methods no collective variables need to be defined. The method in its two variants, quantum and classical, is tested in a number of examples.

  9. A Note on Information-Directed Sampling and Thompson Sampling

    OpenAIRE

    Zhou, Li

    2015-01-01

    This note introduces three Bayesian-style multi-armed bandit algorithms: Information-Directed Sampling, Thompson Sampling and Generalized Thompson Sampling. The goal is to give an intuitive explanation for these three algorithms and their regret bounds, and to provide some derivations that are omitted in the original papers.
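
    Of the three algorithms mentioned, Thompson Sampling for Bernoulli bandits with Beta priors is the easiest to sketch; the arm success probabilities below are invented, and Information-Directed Sampling and Generalized Thompson Sampling are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(5)

true_probs = [0.20, 0.45, 0.60]           # unknown to the agent; chosen for illustration
alpha = np.ones(len(true_probs))          # Beta(1, 1) priors on each arm's success rate
beta = np.ones(len(true_probs))

rewards = 0
for t in range(5000):
    theta = rng.beta(alpha, beta)         # sample one plausible success rate per arm
    arm = int(np.argmax(theta))           # play the arm that looks best under the sample
    reward = rng.random() < true_probs[arm]
    alpha[arm] += reward                  # Bayesian update of the chosen arm's posterior
    beta[arm] += 1 - reward
    rewards += reward

print("total reward:", rewards)
print("posterior means:", np.round(alpha / (alpha + beta), 3))
```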

  10. Sample summary report for KOR1 pressure tube sample

    International Nuclear Information System (INIS)

    Lee, Hee Jong; Nam, Min Woo; Choi, Young Ha

    2006-01-01

    This summary report includes basically the following: - The FLAW CHARACTERIZATION TABLE of KOR1 sample and supporting documentation. - The CROSS REFERENCE TABLES for each investigator, which is the SAMPLE INSPECTION TABLE that cross reference to the FLAW CHARACTERIZATION TABLE. - Each Sample Inspection Report as Appendices

  11. Specified assurance level sampling procedure

    International Nuclear Information System (INIS)

    Willner, O.

    1980-11-01

    In the nuclear industry design specifications for certain quality characteristics require that the final product be inspected by a sampling plan which can demonstrate product conformance to stated assurance levels. The Specified Assurance Level (SAL) Sampling Procedure has been developed to permit the direct selection of attribute sampling plans which can meet commonly used assurance levels. The SAL procedure contains sampling plans which yield the minimum sample size at stated assurance levels. The SAL procedure also provides sampling plans with acceptance numbers ranging from 0 to 10, thus, making available to the user a wide choice of plans all designed to comply with a stated assurance level
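
    The report's tables are not reproduced in the record, but the generic calculation behind a minimum-sample-size attribute plan can be sketched with the binomial distribution: find the smallest n such that a lot at the limiting defect fraction would be accepted (at most the acceptance number of defectives found) with probability no greater than one minus the assurance level. The values in the example call are illustrative.

```python
from math import comb

def min_sample_size(assurance, quality_limit, acceptance_number):
    """Smallest n such that a lot with defect fraction `quality_limit` is accepted
    (<= acceptance_number defectives found) with probability at most 1 - assurance."""
    def accept_prob(n):
        return sum(comb(n, k) * quality_limit**k * (1 - quality_limit)**(n - k)
                   for k in range(acceptance_number + 1))
    n = acceptance_number + 1
    while accept_prob(n) > 1 - assurance:
        n += 1
    return n

# Example: 95% assurance of rejecting a lot with 10% defectives, acceptance number 0.
print(min_sample_size(0.95, 0.10, 0))   # gives the classic zero-acceptance result of 29
```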

  12. A method of language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees

    1993-01-01

    In recent years more attention is paid to the quality of language samples in typological work. Without an adequate sampling strategy, samples may suffer from various kinds of bias. In this article we propose a sampling method in which the genetic criterion is taken as the most important: samples...... created with this method will reflect optimally the diversity of the languages of the world. On the basis of the internal structure of each genetic language tree a measure is computed that reflects the linguistic diversity in the language families represented by these trees. This measure is used...... to determine how many languages from each phylum should be selected, given any required sample size....

  13. The development of a Martian atmospheric Sample collection canister

    Science.gov (United States)

    Kulczycki, E.; Galey, C.; Kennedy, B.; Budney, C.; Bame, D.; Van Schilfgaarde, R.; Aisen, N.; Townsend, J.; Younse, P.; Piacentine, J.

    The collection of an atmospheric sample from Mars would provide significant insight into the understanding of the elemental composition and sub-surface out-gassing rates of noble gases. A team of engineers at the Jet Propulsion Laboratory (JPL), California Institute of Technology, has developed an atmospheric sample collection canister for Martian application. The engineering strategy has two basic elements: first, to collect two separately sealed 50 cubic centimeter unpressurized atmospheric samples with minimal sensing and actuation in a self-contained pressure vessel; and second, to package this atmospheric sample canister in such a way that it can be easily integrated into the orbiting sample capsule for collection and return to Earth. Sample collection and integrity are demonstrated by emulating the atmospheric collection portion of the Mars Sample Return mission on a compressed timeline. The test results were achieved by varying the pressure inside a thermal vacuum chamber while opening and closing the valve on the sample canister at Mars ambient pressure. A commercial off-the-shelf medical-grade micro-valve is utilized in the first iteration of this design to enable rapid testing of the system. The valve has been independently leak tested at JPL to quantify and separate the leak rates associated with the canister. The results are factored into an overall system design that quantifies mass, power, and sensing requirements for a Martian Atmospheric Sample Collection (MASC) canister as outlined in the Mars Sample Return mission profile. Qualitative results include the selection of materials to minimize sample contamination, preliminary science requirements, priorities in sample composition, flight valve selection criteria, a storyboard from sample collection to loading in the orbiting sample capsule, and contributions to maintaining "Earth-clean" exterior surfaces on the orbiting sample capsule.

  14. Preeminence and prerequisites of sample size calculations in clinical trials

    Directory of Open Access Journals (Sweden)

    Richa Singhal

    2015-01-01

    The key components in planning a clinical study are the study design, study duration, and sample size. These features are an integral part of planning a clinical trial efficiently, ethically, and cost-effectively. This article describes some of the prerequisites for sample size calculation. It also explains that sample size calculation is different for different study designs. The article describes in detail the sample size calculation for a randomized controlled trial when the primary outcome is a continuous variable and when it is a proportion or a qualitative variable.

  15. Grid Integration Research | Wind | NREL

    Science.gov (United States)

    Researchers study grid integration of wind. NREL's grid integration capabilities help electric power system operators to more efficiently manage wind integration with the grid system. [Photo: three wind turbines with transmission lines in the background.]

  16. Distribution Integration | Grid Modernization | NREL

    Science.gov (United States)

    The goal of NREL's distribution integration research is to tackle the challenges facing the widespread integration of distributed energy resources. [Photo: NREL engineers mapping out a grid model on a whiteboard.]

  17. Transmission Integration | Grid Modernization | NREL

    Science.gov (United States)

    The goal of NREL's transmission integration research is to identify integration issues, provide data, analysis, and models to enable the electric power system to address them, and find solutions that enable transmission grid integration.

  18. Sample preparation in foodomic analyses.

    Science.gov (United States)

    Martinović, Tamara; Šrajer Gajdošik, Martina; Josić, Djuro

    2018-04-16

    Representative sampling and adequate sample preparation are key factors for successful performance of further steps in foodomic analyses, as well as for correct data interpretation. Incorrect sampling and improper sample preparation can be sources of severe bias in foodomic analyses. It is well known that neither wrong sampling nor improper sample treatment can be corrected afterwards. These facts, frequently neglected in the past, are now taken into consideration, and the progress in sampling and sample preparation in foodomics is reviewed here. We report the use of highly sophisticated instruments for both high-performance and high-throughput analyses, as well as miniaturization and the use of laboratory robotics in metabolomics, proteomics, peptidomics and genomics.

  19. Sampling and estimating recreational use.

    Science.gov (United States)

    Timothy G. Gregoire; Gregory J. Buhyoff

    1999-01-01

    Probability sampling methods applicable to estimate recreational use are presented. Both single- and multiple-access recreation sites are considered. One- and two-stage sampling methods are presented. Estimation of recreational use is presented in a series of examples.

  20. Unit 06 - Sampling the World

    OpenAIRE

    Unit 06, CC in GIS; Parson, Charles; Nyerges, Timothy

    1990-01-01

    This unit begins the section on data acquisition by looking at how the infinite complexity of the real world can be discretized and sampled. It considers sampling techniques and associated issues of accuracy and standards.

  1. Public Use Microdata Samples (PUMS)

    Data.gov (United States)

    National Aeronautics and Space Administration — Public Use Microdata Samples (PUMS) are computer-accessible files containing records for a sample of housing units, with information on the characteristics of each...

  2. Guided episodic sampling for capturing and characterizing industrial plumes

    Science.gov (United States)

    Ou-Yang, Chang-Feng; Liao, Wei-Cheng; Chang, Chih-Chung; Hsieh, Hsin-Cheng; Wang, Jia-Lin

    2018-02-01

    An integrated sampling technique, dubbed trigger sampling, was developed to capture characteristic industrial emissions or plumes. In the field experiment, a hydrogen sulfide (H2S) analyzer was used as the triggering instrument at the boundary of a refinery plant due to frequent complaints of foul smell from local residents. Ten episodic samples were captured when the H2S level surpassed the prescribed trigger level of 8.5 ppbv over a three-day period. Three non-episodic (blank) samples and 23 road-side samples were also collected for comparison. All the 36 flask samples were analyzed by gas chromatography-mass spectrometry/flame ionization detection (GC-MS/FID) for 108 volatile organic compounds (VOCs). The total VOC abundance of the event samples exceeded that of the non-episodic samples by over 80 times in the extreme case. Alkanes were found to be the dominant constituents in the event samples, amounting to over 90% of the total VOC concentrations vs. only 30-40% for the blank and metropolitan samples. In addition, light alkanes in the event samples were highly correlated with the trigger species H2S (R2 = 0.82), implying their common origin. The matrix of chemical composition vs. sample types permitted easy visualization of the dominance of light alkanes for the event samples compared to other types of samples. Principal component analysis (PCA) identified two major contributors covering 93% of the total variance arising from the 36 samples, further quantifying the distinction of the triggered episodic samples from the contrast samples. The proposed trigger sampling is a coupling of fast and slow measurement techniques. In this example, the fast-response H2S analyzer served to "guide" sampling to capture industrial plumes which were then characterized by a relatively slow method of GC-MS/FID for detailed chemical composition representative of the prominent sources.

  3. A simple air sampling technique for monitoring nitrous oxide pollution

    Energy Technology Data Exchange (ETDEWEB)

    Austin, J C; Shaw, R; Moyes, D; Cleaton-Jones, P E

    1981-01-01

    A simple, inexpensive device for the continuous low-flow sampling of air was devised to permit monitoring of pollution by gaseous anaesthetics. The device consisted of a water-filled Perspex cylinder in which a double-walled flexible-film gas sample collection bag was suspended. Air samples could be aspirated into the collection bag at flow rates as low as 1 ml min-1 by allowing the water to drain from the cylinder at a controlled rate. The maintenance of sample integrity with aspiration and storage of samples of nitrous oxide in air at concentrations of 1000, 100 and 30 p.p.m. v/v was examined using gas chromatography. The sample bags retained a mean of 94% of the nitrous oxide in air samples containing nitrous oxide 25 p.p.m. over a 72-h storage period.

  4. Soil sampling in emergency situations

    International Nuclear Information System (INIS)

    Carvalho, Zenildo Lara de; Ramos Junior, Anthenor Costa

    1997-01-01

    The soil sampling methods used in the Goiania accident (1987) by the environmental team of the Brazilian Nuclear Energy Commission (CNEN) are described. The development of this method of soil sampling into an emergency sampling method used in a Nuclear Emergency Exercise at the Angra dos Reis Reactor Site (1991) is presented. A new method for soil sampling based on the Chernobyl environmental monitoring experience (1995) is suggested. (author)

  5. Patient identification in blood sampling.

    Science.gov (United States)

    Davidson, Anne; Bolton-Maggs, Paula

    The majority of adverse reports relating to blood transfusions result from human error, including misidentification of patients and incorrect labelling of samples. This article outlines best practice in blood sampling for transfusion (but is recommended for all pathology samples) and the role of patient empowerment in improving safety.

  6. Mars Sample Return Architecture Overview

    Science.gov (United States)

    Edwards, C. D.; Vijendran, S.

    2018-04-01

    NASA and ESA are exploring potential concepts for a Sample Retrieval Lander and Earth Return Orbiter that could return samples planned to be collected and cached by the Mars 2020 rover mission. We provide an overview of the Mars Sample Return architecture.

  7. The rise of survey sampling

    NARCIS (Netherlands)

    Bethlehem, J.

    2009-01-01

    This paper is about the history of survey sampling. It describes how sampling became an accepted scientific method. From the first ideas in 1895 it took some 50 years before the principles of probability sampling were widely accepted. This paper has a focus on developments in official statistics in

  8. Collection of biological samples in forensic toxicology.

    Science.gov (United States)

    Dinis-Oliveira, R J; Carvalho, F; Duarte, J A; Remião, F; Marques, A; Santos, A; Magalhães, T

    2010-09-01

    Forensic toxicology is the study and practice of the application of toxicology to the purposes of the law. The relevance of any finding is determined, in the first instance, by the nature and integrity of the specimen(s) submitted for analysis. This means that there are several specific challenges to select and collect specimens for ante-mortem and post-mortem toxicology investigation. Post-mortem specimens may be numerous and can present some special difficulties compared to clinical specimens, namely those resulting from autolytic and putrefactive changes. Storage stability is also an important issue to be considered during the pre-analytic phase, since its consideration should facilitate the assessment of sample quality and the analytical result obtained from that sample. The knowledge on degradation mechanisms and methods to increase storage stability may enable the forensic toxicologist to circumvent possible difficulties. Therefore, advantages and limitations of specimen preservation procedures are thoroughly discussed in this review. Presently, harmonized protocols for sampling in suspected intoxications would have obvious utility. In the present article an overview is given on sampling procedures for routinely collected specimens as well as on alternative specimens that may provide additional information on the route and timing of exposure to a specific xenobiotic. Last, but not least, a discussion on possible bias that can influence the interpretation of toxicological results is provided. This comprehensive review article is intended as a significant help for forensic toxicologists to accomplish their frequently overwhelming mission.

  9. Bessel beam CARS of axially structured samples

    Science.gov (United States)

    Heuke, Sandro; Zheng, Juanjuan; Akimov, Denis; Heintzmann, Rainer; Schmitt, Michael; Popp, Jürgen

    2015-06-01

    We report on a Bessel beam CARS approach for axial profiling of multi-layer structures. This study presents an experimental implementation for the generation of CARS by Bessel beam excitation using only passive optical elements. Furthermore, an analytical expression is provided describing the anti-Stokes field generated by a homogeneous sample. Based on the concept of coherent transfer functions, the underlying resolving power for axially structured geometries is investigated. It is found that, through the non-linearity of the CARS process in combination with the folded illumination geometry, continuous phase-matching is achieved starting from homogeneous samples up to spatial sample frequencies at twice that of the pumping electric field wave. The experimental and analytical findings are modeled by the implementation of the Debye integral and a scalar Green function approach. Finally, the goal of reconstructing an axially layered sample is demonstrated on the basis of the numerically simulated modulus and phase of the anti-Stokes far-field radiation pattern.

  10. Analysis of cryopreparated non-dehydrated sample systems by means of a newly developed Tof-SIMS instrument with integrated high-vacuum cutting apparature; Analysen kryopraeparierter nicht-dehydrierter Probensysteme mit Hilfe eines neu entwickelten ToF-SIMS-Instruments mit integrierter Hochvakuumkryoschnittapparatur

    Energy Technology Data Exchange (ETDEWEB)

    Moeller, Joerg

    2008-12-05

    The aim of the present thesis was to construct an analysis apparatus that allows cryofixed samples to be prepared by cryo-cutting or cryo-cracking under vacuum conditions and subsequently analysed by ToF-SIMS.

  11. One loop integrals reduction

    International Nuclear Information System (INIS)

    Sun Yi; Chang Haoran

    2012-01-01

    By further examining the symmetry of external momenta and masses in Feynman integrals, we fulfilled the method proposed by Battistel and Dallabona, and showed that recursion relations in this method can be applied to simplify Feynman integrals directly. (authors)

  12. Laplace Transforms without Integration

    Science.gov (United States)

    Robertson, Robert L.

    2017-01-01

    Calculating Laplace transforms from the definition often requires tedious integrations. This paper provides an integration-free technique for calculating Laplace transforms of many familiar functions. It also shows how the technique can be applied to probability theory.
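
    The record does not say which technique the paper uses, but one familiar integration-free route (given here only as an illustration) is to take the transform of the exponential as known and derive others by linearity, for example:

```latex
% Assume only the standard result  L{e^{at}}(s) = 1/(s - a)  and linearity.
\mathcal{L}\{\cos\omega t\}(s)
  = \tfrac{1}{2}\,\mathcal{L}\{e^{i\omega t}\}(s) + \tfrac{1}{2}\,\mathcal{L}\{e^{-i\omega t}\}(s)
  = \frac{1}{2}\left(\frac{1}{s - i\omega} + \frac{1}{s + i\omega}\right)
  = \frac{s}{s^{2} + \omega^{2}}, \qquad s > 0.
```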

  13. Sesquilinear uniform vector integral

    Indian Academy of Sciences (India)

    theory, together with his integral, dominate contemporary mathematics. ... directions belonging to Bartle and Dinculeanu (see [1], [6], [7] and [2]). ... in this manner, namely he integrated vector functions with respect to measures of bounded.

  14. The Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas; Heiselberg, Per; Knudstrup, Mary-Ann

    The Integrated Renovation Process (IRP) is a user customized methodology based on judiciously selected constructivist and interactive multi-criteria decision making methods (Galiotto, Heiselberg, & Knudstrup, 2014 (expected)). When applied for home renovation, the Integrated Renovation Process...

  15. Spring integration essentials

    CERN Document Server

    Pandey, Chandan

    2015-01-01

    This book is intended for developers who are either already involved with enterprise integration or planning to venture into the domain. Basic knowledge of Java and Spring is expected. For newer users, this book can be used to understand an integration scenario, what the challenges are, and how Spring Integration can be used to solve it. Prior experience of Spring Integration is not expected as this book will walk you through all the code examples.

  16. Boolean integral calculus

    Science.gov (United States)

    Tucker, Jerry H.; Tapia, Moiez A.; Bennett, A. Wayne

    1988-01-01

    The concept of Boolean integration is developed, and different Boolean integral operators are introduced. Given the changes in a desired function in terms of the changes in its arguments, the ways of 'integrating' (i.e. realizing) such a function, if it exists, are presented. The necessary and sufficient conditions for integrating, in different senses, the expression specifying the changes are obtained. Boolean calculus has applications in the design of logic circuits and in fault analysis.

  17. Integrated vs. Federated Search

    DEFF Research Database (Denmark)

    Løvschall, Kasper

    2009-01-01

    Presentation on the differences and similarities between integrated and federated search in a library context. Given at the theme day "Integrated Search - samsøgning i alle kilder" at Danmarks Biblioteksskole on 22 January 2009.

  18. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  19. Integrating EPICS and MDSplus

    International Nuclear Information System (INIS)

    Mastrovito, D.; Davis, W.; Dong, J.; Roney, P.; Sichta, P.

    2006-01-01

    The National Spherical Torus Experiment (NSTX) has been in operation at the Princeton Plasma Physics Laboratory (PPPL) since 1999. Since then, NSTX has made use of the Experimental Physics and Industrial Control System (EPICS) and MDSplus software packages, among others, for control and data acquisition. To date, the two products have been integrated using special 'bridging' programs that include client components for the EPICS and MDSplus servers. Recent improvements in the EPICS software have made it easier to develop a direct interface with MDSplus. This paper will describe the new EPICS extensions developed at PPPL that provide: (1) a direct data interface between EPICS process variables and MDSplus nodes; and (2) an interface between EPICS events and MDSplus events. These extensions have been developed for use with EPICS on Solaris and are currently being modified for use on real-time operating systems. Separately, an XML-RPC client was written to access EPICS 'trended' data, sampled usually once per minute during a 24 h period. The client extracts and writes a day's worth of trended data to a 'daily' MDSplus tree.

  20. Integrated nursery pest management

    Science.gov (United States)

    R. Kasten Dumroese

    2012-01-01

    What is integrated pest management? Take a look at the definition of each word to better understand the concept. Two of the words (integrated and management) are relatively straightforward. Integrated means to blend pieces or concepts into a unified whole, and management is the wise use of techniques to successfully accomplish a desired outcome. A pest is any biotic (...

  1. Sledge-Hammer Integration

    Science.gov (United States)

    Ahner, Henry

    2009-01-01

    Integration (here visualized as a pounding process) is mathematically realized by simple transformations, successively smoothing the bounding curve into a straight line and the region-to-be-integrated into an area-equivalent rectangle. The relationship to Riemann sums, and to the trapezoid and midpoint methods of numerical integration, is…
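
    For reference, a minimal Python sketch of the two standard quadrature rules named above (trapezoid and midpoint) on a toy integral; this is generic textbook material, not the paper's smoothing construction:

      import numpy as np

      def trapezoid(f, a, b, n):
          # Composite trapezoid rule with n subintervals.
          x = np.linspace(a, b, n + 1)
          y = f(x)
          h = (b - a) / n
          return h * (0.5 * y[0] + y[1:-1].sum() + 0.5 * y[-1])

      def midpoint(f, a, b, n):
          # Composite midpoint rule with n subintervals.
          h = (b - a) / n
          mids = a + h * (np.arange(n) + 0.5)
          return h * f(mids).sum()

      # Integral of sin(x) on [0, 1]; exact value is 1 - cos(1).
      print("exact    :", 1 - np.cos(1.0))
      print("trapezoid:", trapezoid(np.sin, 0.0, 1.0, 100))
      print("midpoint :", midpoint(np.sin, 0.0, 1.0, 100))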

  2. Discrete bipolar universal integrals

    Czech Academy of Sciences Publication Activity Database

    Greco, S.; Mesiar, Radko; Rindone, F.

    2014-01-01

    Roč. 252, č. 1 (2014), s. 55-65 ISSN 0165-0114 R&D Projects: GA ČR GAP402/11/0378 Institutional support: RVO:67985556 Keywords : bipolar integral * universal integral * Choquet integral Subject RIV: BA - General Mathematics Impact factor: 1.986, year: 2014 http://library.utia.cas.cz/separaty/2014/E/mesiar-0432224.pdf

  3. Responsibility and Integrated Thinking

    OpenAIRE

    Robinson, SJ

    2014-01-01

    Integrated thinking is essentially focused in dialogue and communication. This is partly because relationships and related purpose focus on action, which itself acts as a means of integration, and partly because critical dialogue enables better, more responsive, integrated thinking and action.

  4. Ramjets: Airframe integration

    NARCIS (Netherlands)

    Moerel, J.L.; Halswijk, W.

    2010-01-01

    These notes deal with the integration of a (sc)ramjet engine in either an axisymmetric or a waverider type of cruise missile configuration. The integration aspects relate to the integration of the external and internal flow paths in geometrical configurations that are being considered worldwide.

  5. Foundations for Psychotherapy Integration

    Directory of Open Access Journals (Sweden)

    António Branco Vasco

    2014-10-01

    Full Text Available The movement for integration in psychotherapy is clearly one of the main trends that can be observed in the field. The author stresses three main reasons for this state of affairs, as a way of justifying the importance of integration: historical and psychosocial, empirical, and philosophical. A specific way of thinking in integrative terms is also outlined - "paradigmatic complementarity."

  6. Wideband 4-diode sampling circuit

    Science.gov (United States)

    Wojtulewicz, Andrzej; Radtke, Maciej

    2016-09-01

    The objective of this work was to develop a wide-band sampling circuit. The device should be able to collect samples of a very fast signal applied to its input, amplify them and prepare them for further processing. The study emphasizes the method of sampling-pulse shaping. The use of an ultrafast pulse generator allows sampling of signals with a wide frequency spectrum, reaching several gigahertz. The device uses a pulse transformer to prepare symmetrical pulses. Their final shape is formed with the help of a step-recovery diode, two coplanar strips and a Schottky diode. The resulting device can be used in a sampling oscilloscope, as well as in other measurement systems.

  7. Steffensen's integral inequality for conformable fractional integrals

    Directory of Open Access Journals (Sweden)

    Mehmet Zeki Sarikaya

    2017-09-01

    Full Text Available The aim of this paper is to establish some Steffensen-type inequalities for the conformable fractional integral. The results presented here provide generalizations of those given in earlier works.

  8. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics on the automatic beta counting system, developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier and then coupled to an amplifier. The amplifier output is discriminated and is the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data and other information are punched out on a data card. The next sample to be counted is automatically selected. The beta counter uses the same electronics as the prior count did, the only difference being the sample identification number and the sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and getting the needed data punched on an 80-column data card.

  9. Manual versus automated blood sampling

    DEFF Research Database (Denmark)

    Teilmann, A C; Kalliokoski, Otto; Sørensen, Dorte B

    2014-01-01

    Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters......, glucocorticoid dynamics as well as the behavior of mice sampled repeatedly for 24 h by cheek blood, tail blood or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal...... corticosterone metabolites, and expressed more anxious behavior than did the mice of the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although less significantly. Mice subjected to automated blood sampling were less affected with regard to the parameters...

  10. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...

  11. Sample representativeness verification of the FADN CZ farm business sample

    Directory of Open Access Journals (Sweden)

    Marie Prášilová

    2011-01-01

    Full Text Available Sample representativeness verification is one of the key stages of statistical work. After joining the European Union, the Czech Republic also joined the Union's Farm Accountancy Data Network system. This is a sample of bodies and companies doing business in agriculture. Detailed production and economic data on the results of farming business are collected from that sample annually, and results for the entire population of the country's farms are then estimated and assessed. It is hence important that the sample be representative. Representativeness is assessed with respect to the number of farms included in the survey and to the degree of agreement between the sample measures and indices and those of the population. The paper deals with the special statistical techniques and methods of FADN CZ sample representativeness verification, including the necessary sample size determination procedure. The Czech farm population data have been obtained from the Czech Statistical Office data bank.

  12. Sample summary report for ARG 1 pressure tube sample

    International Nuclear Information System (INIS)

    Belinco, C.

    2006-01-01

    The ARG 1 sample is made from an un-irradiated Zr-2.5% Nb pressure tube. The sample has 103.4 mm ID, 112 mm OD and approximately 500 mm length. A punch mark was made very close to one end of the sample. The punch mark indicates the 12 o'clock position and also identifies the face of the tube for making all the measurements. The ARG 1 sample contains flaws on the ID and OD surfaces. There were no intentional flaws within the wall of the pressure tube sample. Once the flaws had been machined, the pressure tube sample was covered from the outside to hide the OD flaws. Approximately 50 mm of pressure tube length was left open at both ends to facilitate holding the sample in the fixtures for inspection. No flaws were machined in this 50 mm zone at either end of the pressure tube sample. A total of 20 flaws were machined in the ARG 1 sample. Of these, 16 flaws were on the OD surface and the remaining 4 on the ID surface of the pressure tube. The flaws were characterized into various groups such as axial flaws, circumferential flaws, etc.

  13. Integrated cryogenic sensors

    International Nuclear Information System (INIS)

    Juanarena, D.B.; Rao, M.G.

    1991-01-01

    Integrated cryogenic pressure-temperature, level-temperature, and flow-temperature sensors have several advantages over the conventional single parameter sensors. Such integrated sensors were not available until recently. Pressure Systems, Inc. (PSI) of Hampton, Virginia, has introduced precalibrated precision cryogenic pressure sensors at the Los Angeles Cryogenic Engineering Conference in 1989. Recently, PSI has successfully completed the development of integrated pressure-temperature and level-temperature sensors for use in the temperature range 1.5-375K. In this paper, performance characteristics of these integrated sensors are presented. Further, the effects of irradiation and magnetic fields on these integrated sensors are also reviewed

  14. Nature-based integration

    DEFF Research Database (Denmark)

    Pitkänen, Kati; Oratuomi, Joose; Hellgren, Daniela

    Increased attention to, and careful planning of, the integration of migrants into Nordic societies is ever more important. Nature-based integration is a new solution to respond to this need. This report presents the results of a Nordic survey and workshop and illustrates current practices of nature... based integration by case study descriptions from Denmark, Sweden, Norway and Finland. Across Nordic countries several practical projects and initiatives have been launched to promote the benefits of nature in integration, and there is also growing academic interest in the topic. Nordic countries have... the potential of becoming real forerunners in nature-based integration even at the global scale....

  15. Photonic Integrated Circuits

    Science.gov (United States)

    Krainak, Michael; Merritt, Scott

    2016-01-01

    Integrated photonics generally is the integration of multiple lithographically defined photonic and electronic components and devices (e.g. lasers, detectors, waveguides, passive structures, modulators, electronic control and optical interconnects) on a single platform with nanometer-scale feature sizes. The development of photonic integrated circuits permits size, weight, power and cost reductions for spacecraft microprocessors, optical communication, processor buses, advanced data processing, and integrated optic science instrument optical systems, subsystems and components. This is particularly critical for small spacecraft platforms. We will give an overview of some NASA applications for integrated photonics.

  16. Integration of generic issues

    International Nuclear Information System (INIS)

    Thatcher, D.

    1989-01-01

    The NRC has recognized the need to integrate generic issues (GIs). The GI process includes a number of phases, all of which should recognize the potential for overlap and conflict among related issues. In addition to the issues themselves, other related NRC and industry programs and activities need to be factored into the GI process. Integration has taken place, or is taking place, for a number of GIs. Each case of integration involves a specific set of circumstances and, as a result, the way in which integration proceeds can vary. This paper discusses the integration of issues in the generic issue process and provides a number of examples

  17. On the Use of Importance Sampling in Particle Transport Problems

    International Nuclear Information System (INIS)

    Eriksson, B.

    1965-06-01

    The idea of importance sampling is applied to the problem of solving integral equations of Fredholm's type; in particular, Boltzmann's neutron transport equation is considered. For the solution of the latter equation, an importance sampling technique is derived from simple transformations of the original transport equation into a similar equation. Examples are given of transformations that have been used with great success in practice
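
    As a minimal illustration of the idea (a generic sketch with assumed values, not the transformations derived in the report), the Python fragment below estimates the probability that a particle traverses a purely absorbing slab of optical thickness 5 mean free paths, first by analog sampling and then with an importance-sampled (stretched) path-length density and likelihood-ratio weights:

      import numpy as np

      rng = np.random.default_rng(0)
      TAU = 5.0        # slab thickness in mean free paths (assumed value)
      N = 100_000

      # Analog sampling: path lengths s ~ Exp(1); score 1 if the particle crosses the slab.
      s = rng.exponential(1.0, N)
      analog = (s > TAU).astype(float)

      # Importance sampling: draw s from a stretched density q(s) = (1/b) exp(-s/b), b > 1,
      # and weight each score by the likelihood ratio p(s)/q(s).
      b = TAU
      s_q = rng.exponential(b, N)
      weights = np.exp(-s_q) / ((1.0 / b) * np.exp(-s_q / b))
      weighted_score = (s_q > TAU).astype(float) * weights

      print("exact          :", np.exp(-TAU))
      print("analog estimate:", analog.mean(), "+/-", analog.std(ddof=1) / np.sqrt(N))
      print("IS estimate    :", weighted_score.mean(), "+/-", weighted_score.std(ddof=1) / np.sqrt(N))

    The importance-sampled estimator typically shows a much smaller standard error for the same number of histories, which is the practical benefit such transformations aim for.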

  18. The iPSYCH2012 case-cohort sample

    DEFF Research Database (Denmark)

    Pedersen, C B; Bybjerg-Grauholm, J; Pedersen, M G

    2017-01-01

    The Integrative Psychiatric Research (iPSYCH) consortium has established a large Danish population-based Case-Cohort sample (iPSYCH2012) aimed at unravelling the genetic and environmental architecture of severe mental disorders. The iPSYCH2012 sample is nested within the entire Danish population ... and environmental aetiologies of severe mental disorders. Molecular Psychiatry advance online publication, 19 September 2017; doi:10.1038/mp.2017.196....

  19. On the Use of Importance Sampling in Particle Transport Problems

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, B

    1965-06-15

    The idea of importance sampling is applied to the problem of solving integral equations of Fredholm's type; in particular, Boltzmann's neutron transport equation is considered. For the solution of the latter equation, an importance sampling technique is derived from simple transformations of the original transport equation into a similar equation. Examples are given of transformations that have been used with great success in practice.

  20. Comet coma sample return instrument

    Science.gov (United States)

    Albee, A. L.; Brownlee, Don E.; Burnett, Donald S.; Tsou, Peter; Uesugi, K. T.

    1994-01-01

    The sample collection technology and instrument concept for the Sample of Comet Coma Earth Return Mission (SOCCER) are described. The scientific goals of this Flyby Sample Return are to return coma dust and volatile samples from a known comet source, which will permit accurate elemental and isotopic measurements for thousands of individual solid particles and volatiles, detailed analysis of the dust structure, morphology, and mineralogy of the intact samples, and identification of the biogenic elements or compounds in the solid and volatile samples. With these intact samples, morphologic, petrographic, and phase structural features can be determined. Information on dust particle size, shape, and density can be ascertained by analyzing penetration holes and tracks in the capture medium. Time and spatial data on dust capture will provide an understanding of the flux dynamics of the coma and the jets. Additional information will include the identification of cosmic ray tracks in the cometary grains, which can provide a particle's process history and perhaps even the age of the comet. The measurements will be made with the same equipment used for studying micrometeorites for decades past; hence, the results can be directly compared without extrapolation or modification. The data will provide a powerful and direct technique for comparing the cometary samples with all known types of meteorites and interplanetary dust. This sample collection system will provide the first sample return from a specifically identified primitive body and will allow, for the first time, a direct method of matching meteoritic materials captured on Earth with known parent bodies.

  1. Chorionic villus sampling and amniocentesis.

    Science.gov (United States)

    Brambati, Bruno; Tului, Lucia

    2005-04-01

    The advantages and disadvantages of common invasive methods for prenatal diagnosis are presented in light of new investigations. Several aspects of first-trimester chorionic villus sampling and mid-trimester amniocentesis remain controversial, especially fetal loss rate, feto-maternal complications, and the extension of both sampling methods to less traditional gestational ages (early amniocentesis, late chorionic villus sampling), all of which complicate genetic counseling. A recent randomized trial involving early amniocentesis and late chorionic villus sampling has confirmed previous studies, leading to the unquestionable conclusion that transabdominal chorionic villus sampling is safer. The old dispute over whether limb reduction defects are caused by chorionic villus sampling gains new vigor, with a paper suggesting that this technique has distinctive teratogenic effects. The large experience involving maternal and fetal complications following mid-trimester amniocentesis allows a better estimate of risk for comparison with chorionic villus sampling. Transabdominal chorionic villus sampling, which appears to be the gold standard sampling method for genetic investigations between 10 and 15 completed weeks, permits rapid diagnosis in high-risk cases detected by first-trimester screening of aneuploidies. Sampling efficiency and karyotyping reliability are as high as in mid-trimester amniocentesis with fewer complications, provided the operator has the required training, skill and experience.

  2. Air sampling in the workplace

    International Nuclear Information System (INIS)

    Hickey, E.E.; Stoetzel, G.A.; Strom, D.J.; Cicotte, G.R.; Wiblin, C.M.; McGuire, S.A.

    1993-09-01

    This report provides technical information on air sampling that will be useful for facilities following the recommendations in the NRC's Regulatory Guide 8.25, Revision 1, ''Air sampling in the Workplace.'' That guide addresses air sampling to meet the requirements in NRC's regulations on radiation protection, 10 CFR Part 20. This report describes how to determine the need for air sampling based on the amount of material in process modified by the type of material, release potential, and confinement of the material. The purposes of air sampling and how the purposes affect the types of air sampling provided are discussed. The report discusses how to locate air samplers to accurately determine the concentrations of airborne radioactive materials that workers will be exposed to. The need for and the methods of performing airflow pattern studies to improve the accuracy of air sampling results are included. The report presents and gives examples of several techniques that can be used to evaluate whether the airborne concentrations of material are representative of the air inhaled by workers. Methods to adjust derived air concentrations for particle size are described. Methods to calibrate for volume of air sampled and estimate the uncertainty in the volume of air sampled are described. Statistical tests for determining minimum detectable concentrations are presented. How to perform an annual evaluation of the adequacy of the air sampling is also discussed

  3. Four integration patterns

    DEFF Research Database (Denmark)

    Bygstad, Bendik; Nielsen, Peter Axel; Munkvold, Bjørn Erik

    2010-01-01

    This paper aims to contribute to a theory of integration within the field of IS project management. Integration is a key IS project management issue when new systems are developed and implemented into an increasingly integrated information infrastructure in corporate and governmental organizations....... Expanding the perspective of traditional project management research, we draw extensively on central insights from IS research. Building on socio-technical IS research and Software Engineering research we suggest four generic patterns of integration: Big Bang, Stakeholder Integration, Technical Integration...... and Socio-Technical Integration. We analyze and describe the advantages and disadvantages of each pattern. The four patterns are ideal types. To explore the forces and challenges in these patterns three longitudinal case studies were conducted. In particular we investigate the management challenges for each...

  4. Searching for integrable systems

    International Nuclear Information System (INIS)

    Cary, J.R.

    1984-01-01

    Lack of integrability leads to undesirable consequences in a number of physical systems. The lack of integrability of the magnetic field leads to enhanced particle transport in stellarators and tokamaks with tearing-mode turbulence. Limitations of the luminosity of colliding beams may be due to the onset of stochasticity. Enhanced radial transport in mirror machines caused by the lack of integrability and/or the presence of resonances may be a significant problem in future devices. To improve such systems one needs a systematic method for finding integrable systems. Of course, it is easy to find integrable systems if no restrictions are imposed; textbooks are full of such examples. The problem is to find integrable systems given a set of constraints. An example of this type of problem is that of finding integrable vacuum magnetic fields with rotational transform. The solution to this problem is relevant to the magnetic-confinement program

  5. A 172 μW Compressively Sampled Photoplethysmographic (PPG) Readout ASIC With Heart Rate Estimation Directly From Compressively Sampled Data.

    Science.gov (United States)

    Pamula, Venkata Rajesh; Valero-Sarmiento, Jose Manuel; Yan, Long; Bozkurt, Alper; Hoof, Chris Van; Helleputte, Nick Van; Yazicioglu, Refet Firat; Verhelst, Marian

    2017-06-01

    A compressive sampling (CS) photoplethysmographic (PPG) readout with embedded feature extraction to estimate heart rate (HR) directly from compressively sampled data is presented. It integrates a low-power analog front end together with a digital back end to perform feature extraction to estimate the average HR over a 4 s interval directly from compressively sampled PPG data. The application-specific integrated circuit (ASIC) supports a uniform sampling mode (1x compression) as well as CS modes with compression ratios of 8x, 10x, and 30x. CS is performed through nonuniformly subsampling the PPG signal, while feature extraction is performed using least-squares spectral fitting through the Lomb-Scargle periodogram. The ASIC consumes 172 μW of power from a 1.2 V supply while reducing the relative LED driver power consumption by up to 30 times without significant loss of relevant information for accurate HR estimation.
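
    A software analogue of that feature-extraction step can be sketched as follows (a hypothetical illustration with an assumed sampling rate, compression ratio and heart rate, not the ASIC's fixed-point implementation): the PPG record is non-uniformly sub-sampled and the average heart rate is read off the peak of a Lomb-Scargle periodogram computed over the physiological band.

      import numpy as np
      from scipy.signal import lombscargle

      rng = np.random.default_rng(1)
      fs, duration, hr_bpm = 125.0, 4.0, 72.0          # assumed values
      t = np.arange(0.0, duration, 1.0 / fs)
      ppg = np.sin(2 * np.pi * (hr_bpm / 60.0) * t) + 0.1 * rng.standard_normal(t.size)

      # "Compressive" acquisition: keep a random 1-in-10 subset of the samples (10x ratio).
      keep = np.sort(rng.choice(t.size, size=t.size // 10, replace=False))
      t_cs, y_cs = t[keep], ppg[keep]

      # Least-squares spectral fit over a 40-200 bpm band; the periodogram peak gives the HR.
      freqs_hz = np.linspace(40.0 / 60.0, 200.0 / 60.0, 2000)
      pgram = lombscargle(t_cs, y_cs - y_cs.mean(), 2 * np.pi * freqs_hz)
      print("estimated HR:", 60.0 * freqs_hz[np.argmax(pgram)], "bpm")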

  6. Optimization of sampling parameters for standardized exhaled breath sampling.

    Science.gov (United States)

    Doran, Sophie; Romano, Andrea; Hanna, George B

    2017-09-05

    The lack of standardization of breath sampling is a major contributing factor to the poor repeatability of results and hence represents a barrier to the adoption of breath tests in clinical practice. On-line and bag breath sampling have advantages but do not suit multicentre clinical studies, whereas storage and robust transport are essential for the conduct of wide-scale studies. Several devices have been developed to control sampling parameters and to concentrate volatile organic compounds (VOCs) onto thermal desorption (TD) tubes and subsequently transport those tubes for laboratory analysis. We conducted three experiments to investigate (i) the fraction of breath sampled (whole vs. lower expiratory exhaled breath); (ii) breath sample volume (125, 250, 500 and 1000 ml) and (iii) breath sample flow rate (400, 200, 100 and 50 ml/min). The target VOCs were acetone and potential volatile biomarkers for oesophago-gastric cancer belonging to the aldehyde, fatty acid and phenol chemical classes. We also examined the collection execution time and the impact of environmental contamination. The experiments showed that the use of exhaled breath-sampling devices requires the selection of optimum sampling parameters. The increase in sample volume improved the levels of VOCs detected. However, the influence of the fraction of exhaled breath and the flow rate depends on the target VOCs measured. The concentration of potential volatile biomarkers for oesophago-gastric cancer was not significantly different between the whole and lower airway exhaled breath. While the recovery of phenols and acetone from TD tubes was lower when breath sampling was performed at a higher flow rate, other VOCs were not affected. A dedicated 'clean air supply' overcomes the contamination from ambient air, but the breath collection device itself can be a source of contaminants. In clinical studies using VOCs to diagnose gastro-oesophageal cancer, the optimum parameters are 500 ml sample volume

  7. Probabilistic data integration and computational complexity

    Science.gov (United States)

    Hansen, T. M.; Cordua, K. S.; Mosegaard, K.

    2016-12-01

    Inverse problems in Earth Sciences typically refer to the problem of inferring information about properties of the Earth from observations of geophysical data (the result of nature's solution to the 'forward' problem). This problem can be formulated more generally as a problem of 'integration of information'. A probabilistic formulation of data integration is in principle simple: if all information available (from e.g. geology, geophysics, remote sensing, chemistry…) can be quantified probabilistically, then different algorithms exist that allow solving the data integration problem either through an analytical description of the combined probability function, or by sampling the probability function. In practice, however, probabilistically based data integration may not be easy to apply successfully. This may be related to the use of sampling methods, which are known to be computationally costly. But another source of computational complexity is related to how the individual types of information are quantified. In one case a data integration problem is demonstrated where the goal is to determine the existence of buried channels in Denmark, based on multiple sources of geo-information. Because one type of information is too informative (and hence conflicting), this leads to a difficult sampling problem with unrealistic uncertainty. Resolving this conflict prior to data integration leads to an easy data integration problem with no biases. In another case it is demonstrated how imperfections in the description of the geophysical forward model (related to solving the wave equation) can lead to a difficult data integration problem, with severe bias in the results. If the modeling error is accounted for, the data integration problem becomes relatively easy, with no apparent biases. Both examples demonstrate that biased information can have a dramatic effect on the computational efficiency of solving a data integration problem and lead to biased results, and under

  8. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
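
    A much-simplified sketch of the flavour of such an inference (a hypothetical beta-binomial special case with an assumed uniform prior and assumed numbers, not the authors' two-group judgmental model): after observing n randomly selected items that are all acceptable, estimate the posterior probability that at least 99% of the unsampled items are also acceptable.

      import numpy as np

      rng = np.random.default_rng(2)
      N, n, target = 1000, 50, 0.99      # population size, sample size, required fraction (assumed)
      a0, b0 = 1.0, 1.0                  # uniform Beta prior on the acceptability probability p

      # Posterior on p after n acceptable observations and no failures: Beta(a0 + n, b0).
      p = rng.beta(a0 + n, b0, size=100_000)

      # Posterior predictive count of acceptable items among the N - n unsampled items.
      acceptable_unsampled = rng.binomial(N - n, p)
      prob = np.mean(acceptable_unsampled >= target * (N - n))
      print("P(at least 99% of unsampled items acceptable | all n sampled acceptable):", prob)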

  9. Gravimetric dust sampling for control purposes and occupational dust sampling.

    CSIR Research Space (South Africa)

    Unsted, AD

    1997-02-01

    Full Text Available Prior to the introduction of gravimetric dust sampling, konimeters had been used for dust sampling, which was largely for control purposes. Whether or not absolute results were achievable was not an issue since relative results were used to evaluate...

  10. Monte Carlo sampling of fission multiplicity.

    Energy Technology Data Exchange (ETDEWEB)

    Hendricks, J. S. (John S.)

    2004-01-01

    Two new methods have been developed for fission multiplicity modeling in Monte Carlo calculations. The traditional method of sampling neutron multiplicity from fission is to sample the number of neutrons above or below the average. For example, if there are 2.7 neutrons per fission, three would be chosen 70% of the time and two would be chosen 30% of the time. For many applications, particularly ³He coincidence counting, a better estimate of the true number of neutrons per fission is required. Generally, this number is estimated by sampling a Gaussian distribution about the average. However, because the tail of the Gaussian distribution is negative and negative neutrons cannot be produced, a slight positive bias can be found in the average value. For criticality calculations, the result of rejecting the negative neutrons is an increase in k_eff of 0.1% in some cases. For spontaneous fission, where the average number of neutrons emitted from fission is low, the error also can be unacceptably large. If the Gaussian width approaches the average number of fissions, 10% too many fission neutrons are produced by not treating the negative Gaussian tail adequately. The first method to treat the Gaussian tail is to determine a correction offset, which then is subtracted from all sampled values of the number of neutrons produced. This offset depends on the average value for any given fission at any energy and must be computed efficiently at each fission from the non-integrable error function. The second method is to determine a corrected zero point so that all neutrons sampled between zero and the corrected zero point are killed to compensate for the negative Gaussian tail bias. Again, the zero point must be computed efficiently at each fission. Both methods give excellent results with a negligible computing time penalty. It is now possible to include the full effects of fission multiplicity without the negative Gaussian tail bias.
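
    The contrast between the two sampling schemes discussed above can be illustrated with a short sketch (assumed mean multiplicity and Gaussian width, and a continuous Gaussian rather than the integer-rounded production treatment; this is not the reported implementation):

      import numpy as np

      rng = np.random.default_rng(3)
      nubar, width, N = 2.7, 1.1, 1_000_000     # mean multiplicity and Gaussian width (assumed)

      # Traditional scheme: sample the integers bracketing the mean so the average is preserved
      # (3 with probability 0.7 and 2 with probability 0.3 for nubar = 2.7).
      frac = nubar - np.floor(nubar)
      traditional = np.floor(nubar) + (rng.random(N) < frac)

      # Gaussian scheme with naive rejection of negative samples: discarding the negative tail
      # shifts the sample mean upward, which is the bias described in the abstract.
      g = rng.normal(nubar, width, N)
      gaussian_rejected = g[g >= 0.0]

      print("target nubar            :", nubar)
      print("traditional sample mean :", traditional.mean())
      print("Gaussian (reject < 0)   :", gaussian_rejected.mean())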

  11. An automated synthesis-purification-sample-management platform for the accelerated generation of pharmaceutical candidates.

    Science.gov (United States)

    Sutherland, J David; Tu, Noah P; Nemcek, Thomas A; Searle, Philip A; Hochlowski, Jill E; Djuric, Stevan W; Pan, Jeffrey Y

    2014-04-01

    A flexible and integrated flow-chemistry-synthesis-purification compound-generation and sample-management platform has been developed to accelerate the production of small-molecule organic-compound drug candidates in pharmaceutical research. Central to the integrated system is a Mitsubishi robot, which hands off samples throughout the process to the next station, including synthesis and purification, sample dispensing for purity and quantification analysis, dry-down, and aliquot generation.

  12. PIXE analysis of thin samples

    International Nuclear Information System (INIS)

    Kiss, Ildiko; Koltay, Ede; Szabo, Gyula; Laszlo, S.; Meszaros, A.

    1985-01-01

    Particle-induced X-ray emission (PIXE) multielemental analysis of thin film samples is reported. Calibration methods for K and L X-ray lines are discussed. The application of PIXE analysis to aerosol monitoring and multielement aerosol analysis is described. Results of the PIXE analysis of samples from two locations in Hungary are compared with the results of aerosol samples from Scandinavia and the USA. (D.Gy.)

  13. Lateral sample motion in the plate-rod impact experiments

    International Nuclear Information System (INIS)

    Zaretsky, Eugene; Levi-Hevroni, David; Shvarts, Dov; Ofer, Dror

    2000-01-01

    The velocity of lateral motion of cylindrical samples (9 mm diameter, 20 mm length) impacted by WHA impactors of 5 mm thickness was monitored by VISAR at different points on the sample surface, at distances of 1 to 4 mm from the impacted edge of the sample. The impactors were accelerated in a 25-mm pneumatic gun to velocities of about 300 m/sec. Integrating the VISAR data recorded at the different surface points after impact at the same velocity makes it possible to obtain the changes in sample shape during the initial period of deformation. It was found that the character of the lateral motion differs between samples made of WHA and of the commercial titanium alloy Ti-6Al-4V. A 2-D numerical simulation of the impact leads to the conclusion that the work hardening of the alloys is responsible for this difference

  14. Sample size determination in clinical trials with multiple endpoints

    CERN Document Server

    Sozu, Takashi; Hamasaki, Toshimitsu; Evans, Scott R

    2015-01-01

    This book integrates recent methodological developments for calculating the sample size and power in trials with more than one endpoint considered as multiple primary or co-primary, offering an important reference work for statisticians working in this area. The determination of sample size and the evaluation of power are fundamental and critical elements in the design of clinical trials. If the sample size is too small, important effects may go unnoticed; if the sample size is too large, it represents a waste of resources and unethically puts more participants at risk than necessary. Recently many clinical trials have been designed with more than one endpoint considered as multiple primary or co-primary, creating a need for new approaches to the design and analysis of these clinical trials. The book focuses on the evaluation of power and sample size determination when comparing the effects of two interventions in superiority clinical trials with multiple endpoints. Methods for sample size calculation in clin...
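
    As a toy illustration of the kind of calculation involved (a deliberately simplified sketch assuming two independent, normally distributed co-primary endpoints, equal allocation, known variance and a one-sided test per endpoint; the book treats the general problem, of which this is only a back-of-the-envelope special case):

      import numpy as np
      from statistics import NormalDist

      def n_per_arm(effect_sizes, alpha=0.025, overall_power=0.8):
          # Require every endpoint to be significant; under independence, give each endpoint a
          # per-endpoint power whose product reaches the overall target, then take the largest
          # of the per-endpoint sample sizes (standard two-sample z-test formula).
          per_power = overall_power ** (1.0 / len(effect_sizes))
          z_a = NormalDist().inv_cdf(1.0 - alpha)
          z_b = NormalDist().inv_cdf(per_power)
          return int(np.ceil(max(2.0 * (z_a + z_b) ** 2 / d ** 2 for d in effect_sizes)))

      print(n_per_arm([0.4, 0.5]))   # standardized effect sizes (assumed values)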

  15. PROSPER: an integrated feature-based tool for predicting protease substrate cleavage sites.

    Directory of Open Access Journals (Sweden)

    Jiangning Song

    Full Text Available The ability to catalytically cleave protein substrates after synthesis is fundamental for all forms of life. Accordingly, site-specific proteolysis is one of the most important post-translational modifications. The key to understanding the physiological role of a protease is to identify its natural substrate(s). Knowledge of the substrate specificity of a protease can dramatically improve our ability to predict its target protein substrates, but this information must be utilized in an effective manner in order to efficiently identify protein substrates by in silico approaches. To address this problem, we present PROSPER, an integrated feature-based server for in silico identification of protease substrates and their cleavage sites for twenty-four different proteases. PROSPER utilizes established specificity information for these proteases (derived from the MEROPS database) with a machine learning approach to predict protease cleavage sites using different but complementary sequence and structure characteristics. Features used by PROSPER include local amino acid sequence profile, predicted secondary structure, solvent accessibility and predicted native disorder. Thus, for proteases with known amino acid specificity, PROSPER provides a convenient, pre-prepared tool for use in identifying protein substrates for the enzymes. Systematic prediction analysis for the twenty-four proteases thus far included in the database revealed that the features we have included in the tool strongly improve cleavage site prediction performance, as evidenced by their contribution to identifying known cleavage sites in substrates for these enzymes. In comparison with two state-of-the-art prediction tools, PoPS and SitePrediction, PROSPER achieves greater accuracy and coverage. To our knowledge, PROSPER is the first comprehensive server capable of predicting cleavage sites of multiple proteases within a single substrate

  16. Environmental sampling for trace analysis

    International Nuclear Information System (INIS)

    Markert, B.

    1994-01-01

    Often too little attention is given to sampling before and after the actual instrumental measurement. This leads to errors, despite increasingly sensitive analytical systems. This is one of the first books to pay proper attention to representative sampling. It offers an overview of the most common techniques used today for taking environmental samples. The techniques are clearly presented, yield accurate and reproducible results, and can be used to sample air, water, soil and sediments, and plants and animals. A comprehensive handbook, this volume provides an excellent starting point for researchers in the rapidly expanding field of environmental analysis. (orig.)

  17. Sample Preprocessing For Atomic Spectrometry

    International Nuclear Information System (INIS)

    Kim, Sun Tae

    2004-08-01

    This book gives descriptions of atomic spectrometry. It covers atomic absorption spectrometry (the Maxwell-Boltzmann equation and the Beer-Lambert law, solvent extraction, HGAAS, ETAAS and CVAAS); inductively coupled plasma emission spectrometry (basic principles, plasma generation, instrumentation and interferences); and inductively coupled plasma mass spectrometry (instrumentation, pros and cons of ICP/MS, sample analysis, reagents, water, acids, fluxes and experimental materials, sampling, sample decomposition, and contamination and loss in open and closed systems).

  18. Energy Systems Integration Facility Videos

    Science.gov (United States)

    Videos from NREL's Energy Systems Integration Facility, including: NREL + SolarCity: Maximizing Solar Power on Electrical Grids; Redefining What's Possible for Renewable Energy: Grid Integration; and Robot-Powered Reliability Testing at NREL's ESIF Microgrid.

  19. Energy Systems Integration Laboratory

    Science.gov (United States)

    Research in the Energy Systems Integration Laboratory is advancing engineering knowledge and market deployment of hydrogen technologies. Applications include microgrids, energy storage for renewables integration, and home- and station

  20. Transdisciplinary knowledge integration : cases from integrated assessment and vulnerability assessment

    NARCIS (Netherlands)

    Hinkel, J.

    2008-01-01

    Keywords: climate change, integrated assessment, knowledge integration, transdisciplinary research, vulnerability, vulnerability assessment.
    This thesis explores how transdisciplinary knowledge integration can be facilitated in the context of integrated assessments and vulnerability