WorldWideScience

Sample records for sampling method results

  1. Interval estimation methods of the mean in small sample situation and the results' comparison

    International Nuclear Information System (INIS)

    Wu Changli; Guo Chunying; Jiang Meng; Lin Yuangen

    2009-01-01

    Methods for interval estimation of the sample mean, namely the classical method, the Bootstrap method, the Bayesian Bootstrap method, the Jackknife method and the spread method of the empirical characteristic distribution function, are described. Numerical calculations of the confidence intervals for the mean are carried out for sample sizes of 4, 5 and 6. The results indicate that the Bootstrap method and the Bayesian Bootstrap method are much more appropriate than the others in small-sample situations. (authors)
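
    As a minimal illustration of the bootstrap idea favoured in this comparison, the sketch below computes a percentile bootstrap interval for the mean of a very small sample; the data values, number of resamples and confidence level are arbitrary choices for the example, not taken from the paper.

```python
import numpy as np

def bootstrap_mean_ci(sample, n_boot=10_000, alpha=0.05, rng=None):
    """Percentile bootstrap confidence interval for the mean of a small sample."""
    rng = np.random.default_rng(rng)
    sample = np.asarray(sample, dtype=float)
    # Resample with replacement and record the mean of each resample.
    idx = rng.integers(0, sample.size, size=(n_boot, sample.size))
    boot_means = sample[idx].mean(axis=1)
    lo, hi = np.quantile(boot_means, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Hypothetical sample of size 5, as in the small-sample setting discussed above.
data = [4.1, 3.8, 4.4, 4.0, 3.9]
print(bootstrap_mean_ci(data, rng=0))
```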

  2. Toward cost-efficient sampling methods

    Science.gov (United States)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small set of vertices with high node degree can carry most of the structural information of a complex network. The two proposed methods are efficient at sampling high-degree nodes, so they remain useful even at low sampling rates, which makes them cost-efficient. The first method builds on the widely used stratified random sampling (SRS) method, and the second improves the well-known snowball sampling (SBS) method. To demonstrate the validity and accuracy of the two new methods, we compare them with existing sampling methods on three commonly used simulated networks (a scale-free network, a random network and a small-world network) and on two real networks. The experimental results show that the two proposed methods recover the true network structure characteristics, as reflected by the clustering coefficient, Bonacich centrality and average path length, much better than existing methods, especially when the sampling rate is low.
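
    The paper's algorithms are not spelled out in the abstract; as a rough sketch of the underlying idea (favouring high-degree vertices), the example below draws a degree-biased node sample from a synthetic scale-free network with networkx and compares the clustering coefficient of the sampled subgraph with that of the full graph. All parameters are chosen arbitrarily.

```python
import networkx as nx
import numpy as np

def degree_biased_sample(G, sample_rate=0.1, rng=None):
    """Sample nodes with probability proportional to degree (illustrative sketch)."""
    rng = np.random.default_rng(rng)
    nodes = np.array(G.nodes())
    degrees = np.array([G.degree(n) for n in nodes], dtype=float)
    probs = degrees / degrees.sum()
    k = max(1, int(sample_rate * len(nodes)))
    chosen = rng.choice(nodes, size=k, replace=False, p=probs)
    return G.subgraph(chosen)

G = nx.barabasi_albert_graph(2000, 3, seed=1)          # synthetic scale-free network
S = degree_biased_sample(G, sample_rate=0.05, rng=1)
print("full graph clustering:   ", nx.average_clustering(G))
print("sampled graph clustering:", nx.average_clustering(S))
```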

  3. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...
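
    The abstract gives no specific algorithms, but the classical accept/reject method is one of the general-purpose techniques for independent sampling from an arbitrary density that such a monograph covers; in the sketch below the target density, proposal and bound are chosen purely for illustration.

```python
import numpy as np

def rejection_sample(target_pdf, n, lo, hi, m, rng=None):
    """Accept/reject sampling from an (unnormalized) density on [lo, hi].

    `m` must satisfy target_pdf(x) <= m for all x in [lo, hi].
    """
    rng = np.random.default_rng(rng)
    out = []
    while len(out) < n:
        x = rng.uniform(lo, hi)
        u = rng.uniform(0.0, m)
        if u <= target_pdf(x):       # accept with probability target_pdf(x) / m
            out.append(x)
    return np.array(out)

# Illustrative target: an unnormalized bimodal density on [-3, 3].
pdf = lambda x: np.exp(-(x - 1.5) ** 2) + 0.6 * np.exp(-(x + 1.5) ** 2)
samples = rejection_sample(pdf, 5000, -3.0, 3.0, m=1.1, rng=0)
print(samples.mean(), samples.std())
```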

  4. Uniform Sampling Table Method and its Applications II--Evaluating the Uniform Sampling by Experiment.

    Science.gov (United States)

    Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei

    2015-01-01

    A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of the sampling site distribution, and accuracy and precision of the measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs of different shapes and sizes using four sampling methods. Gray correlation analysis was adopted for the comprehensive evaluation against the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of the occupation rate in a row and column was infinity, the relative accuracy was 99.50-99.89%, the reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method was easy to operate, and the selected samples were distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility and can be put into use in drug analysis.

  5. Assessing representativeness of sampling methods for reaching men who have sex with men: a direct comparison of results obtained from convenience and probability samples.

    Science.gov (United States)

    Schwarcz, Sandra; Spindler, Hilary; Scheer, Susan; Valleroy, Linda; Lansky, Amy

    2007-07-01

    Convenience samples are used to determine HIV-related behaviors among men who have sex with men (MSM) without measuring the extent to which the results are representative of the broader MSM population. We compared results from a cross-sectional survey of MSM recruited from gay bars between June and October 2001 to a random digit dial telephone survey conducted between June 2002 and January 2003. The men in the probability sample were older, better educated, and had higher incomes than men in the convenience sample; the convenience sample enrolled more employed men and men of color. Substance use around the time of sex was higher in the convenience sample, but other sexual behaviors were similar. HIV testing was common among men in both samples. Periodic validation, through comparison of data collected by different sampling methods, may be useful when relying on survey data for program and policy development.

  6. Some connections between importance sampling and enhanced sampling methods in molecular dynamics.

    Science.gov (United States)

    Lie, H C; Quer, J

    2017-11-21

    In molecular dynamics, enhanced sampling methods enable the collection of better statistics of rare events from a reference or target distribution. We show that a large class of these methods is based on the idea of importance sampling from mathematical statistics. We illustrate this connection by comparing the Hartmann-Schütte method for rare event simulation (J. Stat. Mech. Theor. Exp. 2012, P11004) and the Valsson-Parrinello method of variationally enhanced sampling [Phys. Rev. Lett. 113, 090601 (2014)]. We use this connection in order to discuss how recent results from the Monte Carlo methods literature can guide the development of enhanced sampling methods.
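
    As a generic illustration of the importance-sampling idea the paper identifies behind enhanced sampling (not the Hartmann-Schütte or Valsson-Parrinello schemes themselves), the sketch below estimates a rare-event probability under a standard normal by drawing from a shifted proposal and reweighting with the likelihood ratio; the threshold and sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
threshold = 5.0                      # rare event: X > 5 under X ~ N(0, 1)

# Naive Monte Carlo: almost no samples land in the rare region.
x = rng.standard_normal(n)
naive = np.mean(x > threshold)

# Importance sampling: sample from N(threshold, 1) and reweight by p(y)/q(y).
y = rng.normal(loc=threshold, scale=1.0, size=n)
log_w = -0.5 * y**2 - (-0.5 * (y - threshold) ** 2)   # log[p(y) / q(y)]
est = np.mean((y > threshold) * np.exp(log_w))

print("naive estimate:", naive)
print("importance-sampling estimate:", est)   # about 2.9e-7 for threshold = 5
```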

  7. Subrandom methods for multidimensional nonuniform sampling.

    Science.gov (United States)

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics. Copyright © 2016 Elsevier Inc. All rights reserved.
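
    The schedule generators studied in the paper are not reproduced here; as a hedged sketch of the seed-independent idea, the example below builds a deterministic subrandom (van der Corput/Halton) sequence and maps it through an exponential weighting to pick points from a one-dimensional Nyquist grid. Grid size and decay constant are invented for the illustration.

```python
import numpy as np

def halton(n, base):
    """First n points of the van der Corput sequence in the given base."""
    seq = np.zeros(n)
    for i in range(1, n + 1):
        f, k, x = 1.0, i, 0.0
        while k > 0:
            f /= base
            x += f * (k % base)
            k //= base
        seq[i - 1] = x
    return seq

grid = 128                                   # 1-D Nyquist grid size (illustrative)
n_pts = 32                                   # number of sampled grid points
u = halton(n_pts, 2)                         # subrandom, deterministic, no seed needed
# Exponentially weighted schedule: map the uniform [0, 1) values through the
# inverse CDF of an exponential weighting so early grid points are sampled densely.
tau = 0.35                                   # decay constant (arbitrary)
t = -tau * grid * np.log1p(-u * (1.0 - np.exp(-1.0 / tau)))
idx = np.unique(t.astype(int))               # sampled grid indices, duplicates merged
print(idx.tolist())
```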

  8. Method validation in plasma source optical emission spectroscopy (ICP-OES) - From samples to results

    International Nuclear Information System (INIS)

    Pilon, Fabien; Vielle, Karine; Birolleau, Jean-Claude; Vigneau, Olivier; Labet, Alexandre; Arnal, Nadege; Adam, Christelle; Camilleri, Virginie; Amiel, Jeanine; Granier, Guy; Faure, Joel; Arnaud, Regine; Beres, Andre; Blanchard, Jean-Marc; Boyer-Deslys, Valerie; Broudic, Veronique; Marques, Caroline; Augeray, Celine; Bellefleur, Alexandre; Bienvenu, Philippe; Delteil, Nicole; Boulet, Beatrice; Bourgarit, David; Brennetot, Rene; Fichet, Pascal; Celier, Magali; Chevillotte, Rene; Klelifa, Aline; Fuchs, Gilbert; Le Coq, Gilles; Mermet, Jean-Michel

    2017-01-01

    Even though ICP-OES (Inductively Coupled Plasma - Optical Emission Spectroscopy) is now a routine analysis technique, requirements for measuring processes impose complete control and mastery of the operating process and of the associated quality management system. The aim of this (collective) book is to guide the analyst throughout the measurement validation procedure and to help guarantee mastery of its different steps: administrative and physical management of samples in the laboratory, preparation and treatment of the samples before measurement, qualification and monitoring of the apparatus, instrument setting and calibration strategy, and exploitation of results in terms of accuracy, reliability and data covariance (with the practical determination of the accuracy profile). The most recent terminology is used in the book, and numerous examples and illustrations are given to aid understanding and to help in the preparation of method validation documents.

  9. Standard methods for sampling and sample preparation for gamma spectroscopy

    International Nuclear Information System (INIS)

    Taskaeva, M.; Taskaev, E.; Nikolov, P.

    1993-01-01

    The strategy for sampling and sample preparation is outlined: the necessary number of samples; analysis and treatment of the results received; the quantity of analysed material according to the radionuclide concentrations and analytical methods; and the minimal quantity and kind of data needed for drawing final conclusions and decisions on the basis of the results received. This strategy was tested in gamma spectroscopic analysis of radionuclide contamination of the region of the Eleshnitsa Uranium Mines. The water samples were taken and stored according to ASTM D 3370-82. The general sampling procedures were in conformity with the recommendations of ISO 5667. The radionuclides were concentrated by coprecipitation with iron hydroxide and ion exchange. The sampling of soil samples complied with the rules of ASTM C 998, and their sample preparation with ASTM C 999. After preparation the samples were sealed hermetically and measured. (author)

  10. Molecular Weights of Bovine and Porcine Heparin Samples: Comparison of Chromatographic Methods and Results of a Collaborative Survey

    Directory of Open Access Journals (Sweden)

    Sabrina Bertini

    2017-07-01

    In a collaborative study involving six laboratories in the USA, Europe, and India the molecular weight distributions of a panel of heparin sodium samples were determined, in order to compare heparin sodium of bovine intestinal origin with that of bovine lung and porcine intestinal origin. Porcine samples met the current criteria as laid out in the USP Heparin Sodium monograph. Bovine lung heparin samples had consistently lower average molecular weights. Bovine intestinal heparin was variable in molecular weight; some samples fell below the USP limits, some fell within these limits and others fell above the upper limits. These data will inform the establishment of pharmacopeial acceptance criteria for heparin sodium derived from bovine intestinal mucosa. The method for MW determination as described in the USP monograph uses a single, broad standard calibrant to characterize the chromatographic profile of heparin sodium on high-resolution silica-based GPC columns. These columns may be short-lived in some laboratories. Using the panel of samples described above, methods based on the use of robust polymer-based columns have been developed. In addition to the use of the USP’s broad standard calibrant for heparin sodium with these columns, a set of conditions have been devised that allow light-scattering detected molecular weight characterization of heparin sodium, giving results that agree well with the monograph method. These findings may facilitate the validation of variant chromatographic methods with some practical advantages over the USP monograph method.

  11. Molecular Weights of Bovine and Porcine Heparin Samples: Comparison of Chromatographic Methods and Results of a Collaborative Survey.

    Science.gov (United States)

    Bertini, Sabrina; Risi, Giulia; Guerrini, Marco; Carrick, Kevin; Szajek, Anita Y; Mulloy, Barbara

    2017-07-19

    In a collaborative study involving six laboratories in the USA, Europe, and India the molecular weight distributions of a panel of heparin sodium samples were determined, in order to compare heparin sodium of bovine intestinal origin with that of bovine lung and porcine intestinal origin. Porcine samples met the current criteria as laid out in the USP Heparin Sodium monograph. Bovine lung heparin samples had consistently lower average molecular weights. Bovine intestinal heparin was variable in molecular weight; some samples fell below the USP limits, some fell within these limits and others fell above the upper limits. These data will inform the establishment of pharmacopeial acceptance criteria for heparin sodium derived from bovine intestinal mucosa. The method for MW determination as described in the USP monograph uses a single, broad standard calibrant to characterize the chromatographic profile of heparin sodium on high-resolution silica-based GPC columns. These columns may be short-lived in some laboratories. Using the panel of samples described above, methods based on the use of robust polymer-based columns have been developed. In addition to the use of the USP's broad standard calibrant for heparin sodium with these columns, a set of conditions have been devised that allow light-scattering detected molecular weight characterization of heparin sodium, giving results that agree well with the monograph method. These findings may facilitate the validation of variant chromatographic methods with some practical advantages over the USP monograph method.

  12. Towards Cost-efficient Sampling Methods

    OpenAIRE

    Peng, Luo; Yongli, Li; Chong, Wu

    2014-01-01

    The sampling method has been paid much attention in the field of complex network in general and statistical physics in particular. This paper presents two new sampling methods based on the perspective that a small part of vertices with high node degree can possess the most structure information of a network. The two proposed sampling methods are efficient in sampling the nodes with high degree. The first new sampling method is improved on the basis of the stratified random sampling method and...

  13. Comparability of river suspended-sediment sampling and laboratory analysis methods

    Science.gov (United States)

    Groten, Joel T.; Johnson, Gregory D.

    2018-03-06

    Accurate measurements of suspended sediment, a leading water-quality impairment in many Minnesota rivers, are important for managing and protecting water resources; however, water-quality standards for suspended sediment in Minnesota are based on grab field sampling and total suspended solids (TSS) laboratory analysis methods that have underrepresented concentrations of suspended sediment in rivers compared to U.S. Geological Survey equal-width-increment or equal-discharge-increment (EWDI) field sampling and suspended sediment concentration (SSC) laboratory analysis methods. Because of this underrepresentation, the U.S. Geological Survey, in collaboration with the Minnesota Pollution Control Agency, collected concurrent grab and EWDI samples at eight sites to compare results obtained using different combinations of field sampling and laboratory analysis methods. Study results determined that grab field sampling and TSS laboratory analysis results were biased substantially low compared to EWDI sampling and SSC laboratory analysis results, respectively. Differences in both field sampling and laboratory analysis methods caused grab and TSS methods to be biased substantially low. The difference due to laboratory analysis methods was slightly greater than that due to field sampling methods. Sand-sized particles had a strong effect on the comparability of the field sampling and laboratory analysis methods. These results indicated that grab field sampling and TSS laboratory analysis methods fail to capture most of the sand being transported by the stream. The results indicate there is less of a difference among samples collected with grab field sampling and analyzed for TSS and the concentration of fines in SSC. Even though differences are present, the presence of strong correlations between SSC and TSS concentrations provides the opportunity to develop site-specific relations to address transport processes not captured by grab field sampling and TSS laboratory analysis methods.

  14. Sampling methods

    International Nuclear Information System (INIS)

    Loughran, R.J.; Wallbrink, P.J.; Walling, D.E.; Appleby, P.G.

    2002-01-01

    Methods for the collection of soil samples to determine levels of 137Cs and other fallout radionuclides, such as excess 210Pb and 7Be, will depend on the purposes (aims) of the project, site and soil characteristics, analytical capacity, the total number of samples that can be analysed and the sample mass required. The latter two will depend partly on detector type and capabilities. A variety of field methods have been developed for different field conditions and circumstances over the past twenty years, many of them inherited or adapted from soil science and sedimentology. The use of 137Cs in erosion studies has been widely developed, while the application of fallout 210Pb and 7Be is still developing. Although it is possible to measure these nuclides simultaneously, it is common for experiments to be designed around the use of 137Cs alone. Caesium studies typically involve comparison of the inventories found at eroded or sedimentation sites with that of a 'reference' site. An accurate characterization of the depth distribution of these fallout nuclides is often required in order to apply and/or calibrate the conversion models. However, depending on the tracer involved, the depth distribution, and thus the sampling resolution required to define it, differs. For example, a depth resolution of 1 cm is often adequate when using 137Cs. However, fallout 210Pb and 7Be commonly have very strong surface maxima that decrease exponentially with depth, and fine depth increments are required at or close to the soil surface. Consequently, different depth incremental sampling methods are required when using different fallout radionuclides. Geomorphic investigations also frequently require determination of the depth distribution of fallout nuclides on slopes and at depositional sites, as well as their total inventories.

  15. Comparisons of methods for generating conditional Poisson samples and Sampford samples

    OpenAIRE

    Grafström, Anton

    2005-01-01

    Methods for conditional Poisson sampling (CP-sampling) and Sampford sampling are compared and the focus is on the efficiency of the methods. The efficiency is investigated by simulation in different sampling situations. It was of interest to compare methods since new methods for both CP-sampling and Sampford sampling were introduced by Bondesson, Traat & Lundqvist in 2004. The new methods are acceptance rejection methods that use the efficient Pareto sampling method. They are found to be ...
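
    The acceptance-rejection algorithms compared in this work are not detailed in the abstract; the sketch below shows only the simplest rejective form of conditional Poisson sampling, redrawing an ordinary Poisson (independent Bernoulli) sample with given working inclusion probabilities until the realized sample size equals the target n. The probabilities are invented for the example.

```python
import numpy as np

def conditional_poisson_sample(p, n, rng=None, max_tries=100_000):
    """Rejective conditional Poisson sampling: redraw a Poisson sample with
    working inclusion probabilities `p` until its size equals `n`."""
    rng = np.random.default_rng(rng)
    p = np.asarray(p, dtype=float)
    for _ in range(max_tries):
        sample = np.flatnonzero(rng.random(p.size) < p)
        if sample.size == n:
            return sample
    raise RuntimeError("no sample of the requested size was accepted")

# Hypothetical working probabilities for 10 units, target sample size 3.
p = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.3, 0.2, 0.4, 0.3, 0.3])
print(conditional_poisson_sample(p, n=3, rng=42))
```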

  16. Validation of single-sample doubly labeled water method

    International Nuclear Information System (INIS)

    Webster, M.D.; Weathers, W.W.

    1989-01-01

    We have experimentally validated a single-sample variant of the doubly labeled water method for measuring metabolic rate and water turnover in a very small passerine bird, the verdin (Auriparus flaviceps). We measured CO2 production using the Haldane gravimetric technique and compared these values with estimates derived from isotopic data. Doubly labeled water results based on the one-sample calculations differed from Haldane values by less than 0.5% on average (range -8.3 to 11.2%, n = 9). Water flux computed by the single-sample method differed by -1.5% on average from results for the same birds based on the standard, two-sample technique (range -13.7 to 2.0%, n = 9).

  17. Air sampling methods to evaluate microbial contamination in operating theatres: results of a comparative study in an orthopaedics department.

    Science.gov (United States)

    Napoli, C; Tafuri, S; Montenegro, L; Cassano, M; Notarnicola, A; Lattarulo, S; Montagna, M T; Moretti, B

    2012-02-01

    To evaluate the level of microbial contamination of air in operating theatres using active [i.e. surface air system (SAS)] and passive [i.e. index of microbial air contamination (IMA) and nitrocellulose membranes positioned near the wound] sampling systems. Sampling was performed between January 2010 and January 2011 in the operating theatre of the orthopaedics department in a university hospital in Southern Italy. During surgery, the mean bacterial loads recorded were 2232.9 colony-forming units (cfu)/m2/h with the IMA method, 123.2 cfu/m3 with the SAS method and 2768.2 cfu/m2/h with the nitrocellulose membranes. Correlation was found between the results of the three methods. Staphylococcus aureus was detected in 12 of 60 operations (20%) with the membranes, five (8.3%) operations with the SAS method, and three operations (5%) with the IMA method. Use of nitrocellulose membranes placed near a wound is a valid method for measuring the microbial contamination of air. This method was more sensitive than the IMA method and was not subject to any calibration bias, unlike active air monitoring systems. Copyright © 2011 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  18. Statistical sampling method for releasing decontaminated vehicles

    International Nuclear Information System (INIS)

    Lively, J.W.; Ware, J.A.

    1996-01-01

    Earth moving vehicles (e.g., dump trucks, belly dumps) commonly haul radiologically contaminated materials from a site being remediated to a disposal site. Traditionally, each vehicle must be surveyed before being released. The logistical difficulties of implementing the traditional approach on a large scale demand that an alternative be devised. A statistical method (MIL-STD-105E, "Sampling Procedures and Tables for Inspection by Attributes") for assessing product quality from a continuous process was adapted to the vehicle decontamination process. This method produced a sampling scheme that automatically compensates and accommodates fluctuating batch sizes and changing conditions without the need to modify or rectify the sampling scheme in the field. Vehicles are randomly selected (sampled) upon completion of the decontamination process to be surveyed for residual radioactive surface contamination. The frequency of sampling is based on the expected number of vehicles passing through the decontamination process in a given period and the confidence level desired. This process has been successfully used for 1 year at the former uranium mill site in Monticello, Utah (a CERCLA regulated clean-up site). The method forces improvement in the quality of the decontamination process and results in a lower likelihood that vehicles exceeding the surface contamination standards are offered for survey. Implementation of this statistical sampling method on Monticello Projects has resulted in more efficient processing of vehicles through decontamination and radiological release, saved hundreds of hours of processing time, provided a high level of confidence that release limits are met, and improved the radiological cleanliness of vehicles leaving the controlled site.

  19. Sampling of temporal networks: Methods and biases

    Science.gov (United States)

    Rocha, Luis E. C.; Masuda, Naoki; Holme, Petter

    2017-11-01

    Temporal networks have been increasingly used to model a diversity of systems that evolve in time; for example, human contact structures over which dynamic processes such as epidemics take place. A fundamental aspect of real-life networks is that they are sampled within temporal and spatial frames. Furthermore, one might wish to subsample networks to reduce their size for better visualization or to perform computationally intensive simulations. The sampling method may affect the network structure and thus caution is necessary to generalize results based on samples. In this paper, we study four sampling strategies applied to a variety of real-life temporal networks. We quantify the biases generated by each sampling strategy on a number of relevant statistics such as link activity, temporal paths and epidemic spread. We find that some biases are common in a variety of networks and statistics, but one strategy, uniform sampling of nodes, shows improved performance in most scenarios. Given the particularities of temporal network data and the variety of network structures, we recommend that the choice of sampling methods be problem oriented to minimize the potential biases for the specific research questions on hand. Our results help researchers to better design network data collection protocols and to understand the limitations of sampled temporal network data.
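
    As a minimal sketch of the strategy reported to perform best (uniform sampling of nodes), the example below subsamples a toy contact sequence of (time, i, j) events by keeping only the events whose two endpoints both fall in a uniformly chosen node subset; the event list and sampling fraction are invented.

```python
import numpy as np

def uniform_node_subsample(events, n_nodes, fraction, rng=None):
    """Keep the temporal events whose endpoints both lie in a uniform node sample."""
    rng = np.random.default_rng(rng)
    kept = set(rng.choice(n_nodes, size=int(fraction * n_nodes), replace=False))
    return [(t, u, v) for (t, u, v) in events if u in kept and v in kept]

# Toy temporal network: random contacts among 50 nodes over 200 time steps.
rng = np.random.default_rng(3)
events = [(t, int(rng.integers(50)), int(rng.integers(50))) for t in range(200)]
sub = uniform_node_subsample(events, n_nodes=50, fraction=0.5, rng=3)
print(len(events), "events before,", len(sub), "after node subsampling")
```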

  20. Comparison between two sampling methods by results obtained using petrographic techniques, specially developed for minerals of the Itataia uranium phosphate deposit, Ceara, Brazil

    International Nuclear Information System (INIS)

    Salas, H.T.; Murta, R.L.L.

    1985-01-01

    The results of a comparison of two sampling methods applied to a gallery of the uranium-phosphate ore body of Itataia, Ceara State, Brazil, along 235 metres of mineralized zone, are presented. The results were obtained through petrographic techniques especially developed for and applied to both samplings. In the first, hand samples from systematic sampling at intervals of 2 metres were studied, estimates of the mineralogical composition were made, and some petrogenetic observations were verified for the first time. The second sampling was made at intervals of 20 metres; 570 tons of ore were extracted and distributed in sections, and a sample representing each section was studied after crushing at -65. The mineralogy was quantified and the degree of liberation of apatite calculated. Based on the mineralogical data obtained, it was possible to represent both samplings and to compare the main mineralogical groups (phosphates, carbonates and silicates). In spite of the different methods and methodologies used, and the quite irregular stockwork character of the mineralization, the results were satisfactory. (Author) [pt

  1. Distance sampling methods and applications

    CERN Document Server

    Buckland, S T; Marques, T A; Oedekoven, C S

    2015-01-01

    In this book, the authors cover the basic methods and advances within distance sampling that are most valuable to practitioners and in ecology more broadly. This is the fourth book dedicated to distance sampling. In the decade since the last book published, there have been a number of new developments. The intervening years have also shown which advances are of most use. This self-contained book covers topics from the previous publications, while also including recent developments in method, software and application. Distance sampling refers to a suite of methods, including line and point transect sampling, in which animal density or abundance is estimated from a sample of distances to detected individuals. The book illustrates these methods through case studies; data sets and computer code are supplied to readers through the book’s accompanying website.  Some of the case studies use the software Distance, while others use R code. The book is in three parts.  The first part addresses basic methods, the ...
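
    The blurb above contains no formulas; as a hedged sketch of the core line-transect idea the book covers, the example below fits a half-normal detection function to simulated perpendicular distances and converts it into the density estimate D = n / (2 L mu), where mu is the effective strip half-width. All simulation settings are arbitrary, and the untruncated maximum-likelihood estimate of sigma is used for simplicity.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate a line-transect survey: animals at density D_true within a strip of
# half-width w, detected with half-normal probability g(x) = exp(-x^2 / (2 sigma^2)).
D_true, L, w, sigma = 2.0, 50.0, 3.0, 1.0
n_animals = rng.poisson(D_true * 2 * w * L)
x = rng.uniform(0.0, w, size=n_animals)              # perpendicular distances
detected = x[rng.random(n_animals) < np.exp(-x**2 / (2 * sigma**2))]

# Half-normal MLE (untruncated approximation) and effective strip half-width.
sigma_hat = np.sqrt(np.mean(detected**2))
mu_hat = sigma_hat * np.sqrt(np.pi / 2.0)
D_hat = detected.size / (2.0 * L * mu_hat)
print(f"true density {D_true:.2f}, estimated density {D_hat:.2f}")
```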

  2. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
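
    As a minimal example of one of the simpler normalization strategies discussed in reviews of this kind (total-intensity or "sum" normalization, not necessarily the approach the authors recommend), the sketch below rescales each sample's feature intensities so that all samples have equal total signal; the intensity matrix is invented.

```python
import numpy as np

def total_sum_normalize(X):
    """Scale each sample (row) so that its feature intensities sum to 1."""
    X = np.asarray(X, dtype=float)
    return X / X.sum(axis=1, keepdims=True)

# Hypothetical intensity matrix: 3 samples x 5 metabolite features.
X = np.array([[120.0, 80.0, 40.0, 10.0, 50.0],
              [240.0, 150.0, 90.0, 25.0, 95.0],
              [ 60.0, 42.0, 18.0,  6.0, 24.0]])
print(total_sum_normalize(X).sum(axis=1))   # each row now sums to 1
```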

  3. Adaptive sampling method in deep-penetration particle transport problem

    International Nuclear Information System (INIS)

    Wang Ruihong; Ji Zhicheng; Pei Lucheng

    2012-01-01

    The deep-penetration problem has been one of the difficult problems in shielding calculations with the Monte Carlo method for several decades. In this paper, a particle-transport random-walk system that treats the emission point as a sampling station is built. Then, an adaptive sampling scheme is derived to obtain a better solution with the information achieved. The main advantage of the adaptive scheme is to choose the most suitable sampling number at the emission-point station so as to obtain the minimum value of the total cost in the course of the random walk. Further, a related importance sampling method is introduced. Its main principle is to define an importance function of the particle state and to ensure that the sampling number of emission particles is proportional to the importance function. The numerical results show that the adaptive scheme with the emission point as a station can overcome the difficulty of underestimating the result to some degree, and that the adaptive importance sampling method gives satisfactory results as well. (authors)

  4. Approximation of the exponential integral (well function) using sampling methods

    Science.gov (United States)

    Baalousha, Husam Musa

    2015-04-01

    Exponential integral (also known as the well function) is often used in hydrogeology to solve the Theis and Hantush equations. Many methods have been developed to approximate the exponential integral. Most of these methods are based on numerical approximations and are valid for a certain range of the argument value. This paper presents a new approach to approximate the exponential integral. The new approach is based on sampling methods. Three different sampling methods, Latin Hypercube Sampling (LHS), Orthogonal Array (OA), and Orthogonal Array-based Latin Hypercube (OA-LH), have been used to approximate the function. Different argument values, covering a wide range, have been used. The results of the sampling methods were compared with results obtained by Mathematica software, which was used as a benchmark. All three sampling methods converge to the result obtained by Mathematica, at different rates. It was found that the orthogonal array (OA) method has the fastest convergence rate compared with LHS and OA-LH. The root mean square error (RMSE) of OA was on the order of 1E-08. This method can be used with any argument value, and can be used to solve other integrals in hydrogeology such as the leaky aquifer integral.
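
    The authors' implementation is not given in the abstract; the sketch below illustrates the general idea with Latin Hypercube Sampling only, using the identity E1(x) = ∫_0^1 exp(-x/u)/u du and comparing the sampled estimate with scipy.special.exp1. Sample size and argument value are arbitrary.

```python
import numpy as np
from scipy.special import exp1

def lhs_exp1(x, n=2000, rng=None):
    """Latin Hypercube estimate of the well function E1(x) = ∫_0^1 exp(-x/u)/u du."""
    rng = np.random.default_rng(rng)
    # One stratified point per equal-width stratum of (0, 1), randomly permuted.
    u = (rng.permutation(n) + rng.uniform(size=n)) / n
    return np.mean(np.exp(-x / u) / u)

x = 0.5
print("LHS estimate:  ", lhs_exp1(x, rng=1))
print("scipy exp1(x): ", exp1(x))
```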

  5. Sampling methods for low-frequency electromagnetic imaging

    International Nuclear Information System (INIS)

    Gebauer, Bastian; Hanke, Martin; Schneider, Christoph

    2008-01-01

    For the detection of hidden objects by low-frequency electromagnetic imaging the linear sampling method works remarkably well despite the fact that the rigorous mathematical justification is still incomplete. In this work, we give an explanation for this good performance by showing that in the low-frequency limit the measurement operator fulfils the assumptions for the fully justified variant of the linear sampling method, the so-called factorization method. We also show how the method has to be modified in the physically relevant case of electromagnetic imaging with divergence-free currents. We present numerical results to illustrate our findings, and to show that similar performance can be expected for the case of conducting objects and layered backgrounds

  6. Direct sampling methods for inverse elastic scattering problems

    Science.gov (United States)

    Ji, Xia; Liu, Xiaodong; Xi, Yingxia

    2018-03-01

    We consider the inverse elastic scattering of incident plane compressional and shear waves from the knowledge of the far field patterns. Specifically, three direct sampling methods for location and shape reconstruction are proposed using different components of the far field patterns. Only inner products are involved in the computation, thus the novel sampling methods are very simple and fast to implement. With the help of the factorization of the far field operator, we give a lower bound on the proposed indicator functionals for sampling points inside the scatterers. For sampling points outside the scatterers, we show that the indicator functionals decay like Bessel functions as the sampling point moves away from the boundary of the scatterers. We also show that the proposed indicator functionals depend continuously on the far field patterns, which further implies that the novel sampling methods are extremely stable with respect to data error. For the case when the observation directions are restricted to a limited aperture, we first introduce some data retrieval techniques to obtain the data that cannot be measured directly and then use the proposed direct sampling methods for location and shape reconstruction. Finally, some numerical simulations in two dimensions are conducted with noisy data, and the results further verify the effectiveness and robustness of the proposed sampling methods, even for multiple multiscale cases and limited-aperture problems.

  7. An improved selective sampling method

    International Nuclear Information System (INIS)

    Miyahara, Hiroshi; Iida, Nobuyuki; Watanabe, Tamaki

    1986-01-01

    The coincidence methods which are currently used for the accurate activity standardisation of radionuclides require dead time and resolving time corrections, which tend to become increasingly uncertain as count rates exceed about 10 K. To reduce the dependence on such corrections, Muller, in 1981, proposed the selective sampling method using a fast multichannel analyser (50 ns ch-1) for measuring the count rates. It is, in many ways, more convenient and potentially more reliable to replace the MCA with scalers, and a circuit is described employing five scalers, two of them serving to measure the background correction. Results of comparisons using our new method and the coincidence method for measuring the activity of 60Co sources yielded agreement within statistical uncertainties. (author)

  8. On Angular Sampling Methods for 3-D Spatial Channel Models

    DEFF Research Database (Denmark)

    Fan, Wei; Jämsä, Tommi; Nielsen, Jesper Ødum

    2015-01-01

    This paper discusses generating three dimensional (3D) spatial channel models with emphasis on the angular sampling methods. Three angular sampling methods, i.e. modified uniform power sampling, modified uniform angular sampling, and random pairing methods, are proposed and investigated in detail. The random pairing method, which uses only twenty sinusoids in the ray-based model for generating the channels, presents good results if the spatial channel cluster has a small elevation angle spread. For spatial clusters with large elevation angle spreads, however, the random pairing method would fail and the other two methods should be considered.

  9. A method of language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees

    1993-01-01

    In recent years more attention has been paid to the quality of language samples in typological work. Without an adequate sampling strategy, samples may suffer from various kinds of bias. In this article we propose a sampling method in which the genetic criterion is taken as the most important: samples created with this method will optimally reflect the diversity of the languages of the world. On the basis of the internal structure of each genetic language tree, a measure is computed that reflects the linguistic diversity in the language families represented by these trees. This measure is used to determine how many languages from each phylum should be selected, given any required sample size.

  10. On-capillary sample cleanup method for the electrophoretic determination of carbohydrates in juice samples.

    Science.gov (United States)

    Morales-Cid, Gabriel; Simonet, Bartolomé M; Cárdenas, Soledad; Valcárcel, Miguel

    2007-05-01

    On many occasions, sample treatment is a critical step in electrophoretic analysis. As an alternative to batch procedures, in this work, a new strategy is presented with a view to develop an on-capillary sample cleanup method. This strategy is based on the partial filling of the capillary with carboxylated single-walled carbon nanotube (c-SWNT). The nanoparticles retain interferences from the matrix allowing the determination and quantification of carbohydrates (viz glucose, maltose and fructose). The precision of the method for the analysis of real samples ranged from 5.3 to 6.4%. The proposed method was compared with a method based on a batch filtration of the juice sample through diatomaceous earth and further electrophoretic determination. This method was also validated in this work. The RSD for this other method ranged from 5.1 to 6%. The results obtained by both methods were statistically comparable demonstrating the accuracy of the proposed methods and their effectiveness. Electrophoretic separation of carbohydrates was achieved using 200 mM borate solution as a buffer at pH 9.5 and applying 15 kV. During separation, the capillary temperature was kept constant at 40 degrees C. For the on-capillary cleanup method, a solution containing 50 mg/L of c-SWNTs prepared in 300 mM borate solution at pH 9.5 was introduced for 60 s into the capillary just before sample introduction. For the electrophoretic analysis of samples cleaned in batch with diatomaceous earth, it is also recommended to introduce into the capillary, just before the sample, a 300 mM borate solution as it enhances the sensitivity and electrophoretic resolution.

  11. Comparison of the Multiple-sample means with composite sample results for fecal indicator bacteria by quantitative PCR and culture

    Science.gov (United States)

    ABSTRACT: Few studies have addressed the efficacy of composite sampling for measurement of indicator bacteria by QPCR. In this study, composite results were compared to single sample results for culture- and QPCR-based water quality monitoring. Composite results for both methods ...

  12. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

    Audit sampling involves the application of audit procedures to less than 100% of items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As a widely used audit technique, in both its statistical and nonstatistical forms, the method is very important for auditors. It should be applied correctly to give a fair view of the financial statements and to satisfy the needs of all financial users. In order to be applied correctly the method must be understood by all its users, and mainly by auditors. Otherwise the risk of not applying it correctly would lead to loss of reputation and discredit, litigation and even prison. Since there is no unitary practice and methodology for applying the technique, the risk of applying it incorrectly is rather high. SWOT analysis is a technique that shows the advantages, disadvantages, threats and opportunities. We applied SWOT analysis to the sampling method from the perspective of three players: the audit company, the audited entity and the users of financial statements. The study shows that by applying the sampling method the audit company and the audited entity both save time, effort and money. The disadvantages of the method lie in the difficulty of applying and understanding it. Being widely used as an audit method and being a factor in a correct audit opinion, the sampling method's advantages, disadvantages, threats and opportunities must be understood by auditors.

  13. Field evaluation of personal sampling methods for multiple bioaerosols.

    Science.gov (United States)

    Wang, Chi-Hsun; Chen, Bean T; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  14. Interval sampling methods and measurement error: a computer simulation.

    Science.gov (United States)

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
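
    A small sketch of the kind of simulation described above, comparing momentary time sampling and partial-interval recording against the true proportion of time a randomly placed event is occurring; event count, durations and interval length are invented for the illustration and do not reproduce the study's conditions.

```python
import numpy as np

rng = np.random.default_rng(0)
session, interval, n_events, event_dur = 600.0, 10.0, 20, 4.0   # seconds (arbitrary)

# Random event onsets; build a fine-grained occupancy timeline.
onsets = rng.uniform(0, session - event_dur, size=n_events)
t = np.arange(0.0, session, 0.1)
occupied = np.zeros_like(t, dtype=bool)
for s in onsets:
    occupied |= (t >= s) & (t < s + event_dur)
true_prop = occupied.mean()

edges = np.arange(0.0, session + interval, interval)
# Momentary time sampling: score an interval if the event is occurring at its end.
mts = np.mean([occupied[np.searchsorted(t, e) - 1] for e in edges[1:]])
# Partial-interval recording: score an interval if the event occurs anywhere in it.
pir = np.mean([occupied[(t >= a) & (t < b)].any() for a, b in zip(edges[:-1], edges[1:])])

print(f"true proportion {true_prop:.2f}, MTS {mts:.2f}, PIR {pir:.2f}")
```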

  15. Sampling system and method

    Science.gov (United States)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  16. A method of language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees

    1993-01-01

    In recent years more attention is paid to the quality of language samples in typological work. Without an adequate sampling strategy, samples may suffer from various kinds of bias. In this article we propose a sampling method in which the genetic criterion is taken as the most important: samples...... to determine how many languages from each phylum should be selected, given any required sample size....

  17. Results of Plutonium Intercalibration in Seawater and Seaweed Samples

    International Nuclear Information System (INIS)

    Fukai, R.; Murray, C.N.

    1976-01-01

    The results of the intercalibration exercise for the measurement of plutonium-239 and 228 in two seawater samples SW-I-1 and SW-I-2 and a marine algae sample AG-I-1 are presented. Seventeen laboratories from 8 countries as well as the IAEA International Laboratory of Marine Radioactivity took part. A discussion of the results and methods used in the analysis is given. It is concluded that in spite of the complicated chemical procedures involved in plutonium analysis, the scatter of the reported results was much smaller than that for fission product radionuclides such as strontium-90, ruthenium-106, cesium-137 etc. (author)

  18. A comparison of fitness-case sampling methods for genetic programming

    Science.gov (United States)

    Martínez, Yuliana; Naredo, Enrique; Trujillo, Leonardo; Legrand, Pierrick; López, Uriel

    2017-11-01

    Genetic programming (GP) is an evolutionary computation paradigm for automatic program induction. GP has produced impressive results but it still needs to overcome some practical limitations, particularly its high computational cost, overfitting and excessive code growth. Recently, many researchers have proposed fitness-case sampling methods to overcome some of these problems, with mixed results in several limited tests. This paper presents an extensive comparative study of four fitness-case sampling methods, namely: Interleaved Sampling, Random Interleaved Sampling, Lexicase Selection and Keep-Worst Interleaved Sampling. The algorithms are compared on 11 symbolic regression problems and 11 supervised classification problems, using 10 synthetic benchmarks and 12 real-world data-sets. They are evaluated based on test performance, overfitting and average program size, comparing them with a standard GP search. Comparisons are carried out using non-parametric multigroup tests and post hoc pairwise statistical tests. The experimental results suggest that fitness-case sampling methods are particularly useful for difficult real-world symbolic regression problems, improving performance, reducing overfitting and limiting code growth. On the other hand, it seems that fitness-case sampling cannot improve upon GP performance when considering supervised binary classification.
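
    Among the compared methods, Lexicase Selection is simple enough to sketch from its published description: a parent is chosen by filtering the population through the fitness cases in random order, keeping at each step only the individuals that are best on the current case. The toy error matrix below is invented; this is not the authors' benchmark code.

```python
import numpy as np

def lexicase_select(errors, rng=None):
    """Select one parent index by lexicase selection.

    `errors[i, j]` is the error of individual i on fitness case j (lower is better).
    """
    rng = np.random.default_rng(rng)
    candidates = np.arange(errors.shape[0])
    for case in rng.permutation(errors.shape[1]):     # fitness cases in random order
        best = errors[candidates, case].min()
        candidates = candidates[errors[candidates, case] == best]
        if candidates.size == 1:
            break
    return rng.choice(candidates)

# Toy population of 6 individuals evaluated on 8 fitness cases.
rng = np.random.default_rng(5)
errors = rng.integers(0, 4, size=(6, 8))
print("selected parent:", lexicase_select(errors, rng=5))
```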

  19. Surveying immigrants without sampling frames - evaluating the success of alternative field methods.

    Science.gov (United States)

    Reichel, David; Morales, Laura

    2017-01-01

    This paper evaluates the sampling methods of an international survey, the Immigrant Citizens Survey, which aimed at surveying immigrants from outside the European Union (EU) in 15 cities in seven EU countries. In five countries, no sample frame was available for the target population. Consequently, alternative ways to obtain a representative sample had to be found. In three countries 'location sampling' was employed, while in two countries traditional methods were used with adaptations to reach the target population. The paper assesses the main methodological challenges of carrying out a survey among a group of immigrants for whom no sampling frame exists. The samples of the survey in these five countries are compared to results of official statistics in order to assess the accuracy of the samples obtained through the different sampling methods. It can be shown that alternative sampling methods can provide meaningful results in terms of core demographic characteristics although some estimates differ to some extent from the census results.

  20. Testing results of Monte Carlo sampling processes in MCSAD

    International Nuclear Information System (INIS)

    Pinnera, I.; Cruz, C.; Abreu, Y.; Leyva, A.; Correa, C.; Demydenko, C.

    2009-01-01

    The Monte Carlo Simulation of Atom Displacements (MCSAD) is a code implemented by the authors to simulate the complete process of atom displacement (AD) formation. This code makes use of the Monte Carlo (MC) method to sample all the processes involved in the transport of gamma and electron radiation through matter. The kernel of the calculations in this code relies on a model based on an algorithm developed by the authors, which first separates multiple electron elastic scattering events from single events at higher scattering angles and then, from the latter, samples those leading to AD at high transferred atomic recoil energies. Some tests have been developed to check the sampling algorithms against the corresponding theoretical distribution functions. Satisfactory results have been obtained, which indicate the soundness of the methods and subroutines used in the code. (Author)

  1. Validation of method in instrumental NAA for food products sample

    International Nuclear Information System (INIS)

    Alfian; Siti Suprapti; Setyo Purwanto

    2010-01-01

    NAA is a testing method that has not been standardized. To affirm and confirm that this method is valid, it must be validated with various standard reference materials. In this work, the validation is carried out for food product samples using NIST SRM 1567a (wheat flour) and NIST SRM 1568a (rice flour). The results show that the method, validated for testing nine elements (Al, K, Mg, Mn, Na, Ca, Fe, Se and Zn) in SRM 1567a and eight elements (Al, K, Mg, Mn, Na, Ca, Se and Zn) in SRM 1568a, passes the tests of accuracy and precision. It can be concluded that this method has the power to give valid results in the determination of elements in food product samples. (author)

  2. Neutron activation analysis of certified samples by the absolute method

    Science.gov (United States)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    The nuclear reaction analysis technique is mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for the calculated cross sections evaluated from systematic studies, we used the neutron activation analysis (NAA) technique to determine the concentrations of the various constituents of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to the use of a standard sample. Called the absolute method, it allows a measurement as accurate as the relative method. The results obtained by the absolute method showed that the values are as precise as those of the relative method, which requires the use of a standard sample for each element to be quantified.
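
    The abstract does not reproduce the activation equation it refers to; a commonly quoted form, given here for orientation rather than as the authors' exact expression, relates the induced activity to the element mass through the neutron flux, activation cross section and decay constant:

```latex
% Standard form of the activation equation (notation assumed, not from the paper):
%   A - induced activity, m - mass of the element, N_A - Avogadro's number,
%   theta - isotopic abundance, M - atomic mass, sigma - activation cross section,
%   phi - neutron flux, lambda - decay constant, t_i - irradiation time, t_d - decay time.
A = \frac{m \, N_A \, \theta}{M} \, \sigma \, \varphi \,
    \left(1 - e^{-\lambda t_i}\right) e^{-\lambda t_d}
```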

  3. Field evaluation of personal sampling methods for multiple bioaerosols.

    Directory of Open Access Journals (Sweden)

    Chi-Hsun Wang

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  4. Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris

    Science.gov (United States)

    Michael S. Williams; Jeffrey H. Gove

    2003-01-01

    Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...

  5. Sampling method of environmental radioactivity monitoring

    International Nuclear Information System (INIS)

    1984-01-01

    This manual provides sampling methods for environmental samples of airborne dust, precipitated dust, precipitated water (rain or snow), fresh water, soil, river or lake sediment, discharged water from a nuclear facility, grains, tea, milk, pasture grass, limnetic organisms, daily diet, index organisms, sea water, marine sediment and marine organisms, as well as methods for tritium and radioiodine determination, for radiation monitoring of radioactive fallout or of radioactivity released by nuclear facilities. The manual aims to present standard sampling procedures for environmental radioactivity monitoring regardless of the monitoring objectives, and describes preservation methods for environmental samples acquired at the sampling point for radiation counting, for all sample types except the human body. The sampling techniques adopted in this manual were selected on the criteria that they are suitable for routine monitoring and require no special skill. Based on this principle, the manual presents the outline and aims of sampling, the sampling position or object, sampling quantity, apparatus, equipment or vessels for sampling, sampling location, sampling procedures, pretreatment and preparation procedures of a sample for radiation counting, necessary recording items for sampling, and sample transportation procedures. Special attention is given in the chapter on tritium and radioiodine, because these radionuclides might be lost under the sample preservation methods intended for radiation counting of radionuclides less volatile than tritium or radioiodine. (Takagi, S.)

  6. Sampling and analysis methods for geothermal fluids and gases

    Energy Technology Data Exchange (ETDEWEB)

    Shannon, D. W.

    1978-01-01

    The data obtained for the first round robin sample collected at Mesa 6-2 wellhead, East Mesa Test Site, Imperial Valley are summarized. Test results are listed by method used for cross reference to the analytic methods section. Results obtained for radioactive isotopes present in the brine sample are tabulated. The data obtained for the second round robin sample collected from the Woolsey No. 1 first stage flash unit, San Diego Gas and Electric Niland Test Facility are presented in the same manner. Lists of the participants of the two round robins are given. Data from miscellaneous analyses are included. Summaries of values derived from the round robin raw data are presented. (MHR)

  7. Multi-frequency direct sampling method in inverse scattering problem

    Science.gov (United States)

    Kang, Sangwoo; Lambert, Marc; Park, Won-Kwang

    2017-10-01

    We consider the direct sampling method (DSM) for the two-dimensional inverse scattering problem. Although DSM is fast, stable, and effective, some phenomena remain unexplained by the existing results. We show that the imaging function of the direct sampling method can be expressed by a Bessel function of order zero. We also clarify the previously unexplained imaging phenomena and suggest a multi-frequency DSM to overcome the limitations of the traditional DSM. Our method is evaluated in simulation studies using both single and multiple frequencies.
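
    As a schematic of the relation described in this record (not the authors' exact formulas; the point-scatterer location x* and the discrete frequency set below are assumptions used only for illustration), the single-frequency imaging function behaves like a zeroth-order Bessel function of the distance to the scatterer, and a multi-frequency indicator can be formed by averaging over frequencies:

      \[
        I_{\mathrm{DSM}}(\mathbf{z};k) \;\propto\; \bigl|J_0\!\bigl(k\,|\mathbf{z}-\mathbf{x}^{\ast}|\bigr)\bigr|,
        \qquad
        I_{\mathrm{MDSM}}(\mathbf{z}) \;=\; \frac{1}{K}\sum_{m=1}^{K} I_{\mathrm{DSM}}(\mathbf{z};k_m).
      \]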

  8. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Laboratory Management Division of the DOE. Methods are prepared for entry into DOE Methods as chapter editors, together with DOE and other participants in this program, identify analytical and sampling method needs. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations

  9. An improved sampling method of complex network

    Science.gov (United States)

    Gao, Qi; Ding, Xintong; Pan, Feng; Li, Weixing

    2014-12-01

    Sampling a subnet is an important topic in complex network research. Sampling methods influence the structure and characteristics of the subnet. Random multiple snowball with Cohen (RMSC) process sampling, which combines the advantages of random sampling and snowball sampling, is proposed in this paper. It has the ability to explore global information and discover local structure at the same time. The experiments indicate that this novel sampling method can preserve the similarity between the sampled subnet and the original network in degree distribution, connectivity rate and average shortest path length. This method is applicable to situations where prior knowledge about the degree distribution of the original network is insufficient.
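
    For orientation, the snowball ingredient of such hybrid schemes can be sketched in a few lines of Python (a minimal illustration only: the seed count, neighbour quota, use of networkx and the Barabasi-Albert test graph are assumptions, and the Cohen step of RMSC is omitted):

      import random
      import networkx as nx

      def snowball_sample(graph, seeds=3, rounds=2, quota=5, rng=None):
          """Plain snowball sampling from random seeds, one of the two
          ingredients combined by hybrid schemes such as RMSC."""
          rng = rng or random.Random(1)
          sampled = set(rng.sample(list(graph.nodes), seeds))
          frontier = set(sampled)
          for _ in range(rounds):
              nxt = set()
              for node in frontier:
                  neigh = list(graph.neighbors(node))
                  nxt.update(rng.sample(neigh, min(quota, len(neigh))))
              frontier = nxt - sampled      # only expand from newly added nodes
              sampled |= nxt
          return graph.subgraph(sampled).copy()

      g = nx.barabasi_albert_graph(1000, 3, seed=42)   # scale-free test network
      sub = snowball_sample(g)
      print(sub.number_of_nodes(), sub.number_of_edges())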

  10. Comparison of sampling methods for animal manure

    NARCIS (Netherlands)

    Derikx, P.J.L.; Ogink, N.W.M.; Hoeksma, P.

    1997-01-01

    Currently available and recently developed sampling methods for slurry and solid manure were tested for bias and reproducibility in the determination of total phosphorus and nitrogen content of samples. Sampling methods were based on techniques in which samples were taken either during loading from

  11. Influences of different sample preparation methods on tooth enamel ESR signals

    International Nuclear Information System (INIS)

    Zhang Wenyi; Jiao Ling; Zhang Liang'an; Pan Zhihong; Zeng Hongyu

    2005-01-01

    Objective: To study the influences of different sample preparation methods on tooth enamel ESR signals in order to reduce the effect of dentine on their sensitivities to radiation. Methods: The enamel was separated from dentine of non-irradiated adult teeth by mechanical, chemical, or both methods. The samples of different preparations were scanned by an ESR spectrometer before and after irradiation. Results: The response of ESR signals of samples prepared with different methods to radiation dose was significantly different. Conclusion: The selection of sample preparation method is very important for dose reconstruction by tooth enamel ESR dosimetry, especially in the low dose range. (authors)

  12. Organic analysis of ambient samples collected near Tank 241-C-103: Results from samples collected on May 12, 1994

    International Nuclear Information System (INIS)

    Clauss, T.W.; Ligotke, M.W.; McVeety, B.D.; Lucke, R.B.; Young, J.S.; McCulloch, M.; Fruchter, J.S.; Goheen, S.C.

    1995-06-01

    This report describes organic analysis results from ambient samples collected both upwind and through the vapor sampling system (VSS) near Hanford waste storage Tank 241-C-103 (referred to as Tank C-103). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed. Quantitative results were obtained for organic compounds. Five organic tentatively identified compounds (TICs) were observed above the detection limit of ca. 10 ppbv, but standards for most of these were not available at the time of analysis, and the reported concentrations are semiquantitative estimates. In addition, we looked for the 40 standard TO-14 analytes. We observed 39. Of these, only one was observed above the 2-ppbv calibrated instrument detection limit. Dichloromethane was above the detection limits using both methods, but the result from the TO-14 method is traceable to a standard gas mixture and is considered more accurate. Organic analytes were found only in the sample collected through the VSS, suggesting that these compounds were residual contamination from a previous sampling job. Detailed descriptions of the results appear in the text.

  13. DOE methods for evaluating environmental and waste management samples

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K. [eds.]

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  14. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  15. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    Science.gov (United States)

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. Our objective was to guide the design of multiplier method population size estimation studies using respondent-driven sampling surveys so as to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
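
    To make the estimator concrete, a minimal sketch of the multiplier calculation with a delta-method confidence interval is shown below (the service count, survey proportion, sample size and design effect are hypothetical values, not numbers from the study, and uncertainty in M is ignored):

      import math

      M = 2500        # hypothetical count of unique objects distributed
      p_hat = 0.35    # hypothetical proportion reporting receipt in the RDS survey
      n = 400         # hypothetical RDS sample size
      deff = 2.0      # assumed design effect of the RDS survey

      N_hat = M / p_hat                                   # multiplier estimate M / P
      se_p = math.sqrt(deff * p_hat * (1 - p_hat) / n)    # SE of p_hat with design effect
      se_N = M * se_p / p_hat ** 2                        # delta-method SE of N_hat
      lo, hi = N_hat - 1.96 * se_N, N_hat + 1.96 * se_N
      print(f"N_hat = {N_hat:.0f}, approximate 95% CI ({lo:.0f}, {hi:.0f})")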

  16. Enhanced Sampling in Free Energy Calculations: Combining SGLD with the Bennett's Acceptance Ratio and Enveloping Distribution Sampling Methods.

    Science.gov (United States)

    König, Gerhard; Miller, Benjamin T; Boresch, Stefan; Wu, Xiongwu; Brooks, Bernard R

    2012-10-09

    One of the key requirements for the accurate calculation of free energy differences is proper sampling of conformational space. Especially in biological applications, molecular dynamics simulations are often confronted with rugged energy surfaces and high energy barriers, leading to insufficient sampling and, in turn, poor convergence of the free energy results. In this work, we address this problem by employing enhanced sampling methods. We explore the possibility of using self-guided Langevin dynamics (SGLD) to speed up the exploration process in free energy simulations. To obtain improved free energy differences from such simulations, it is necessary to account for the effects of the bias due to the guiding forces. We demonstrate how this can be accomplished for the Bennett's acceptance ratio (BAR) and the enveloping distribution sampling (EDS) methods. While BAR is considered among the most efficient methods available for free energy calculations, the EDS method developed by Christ and van Gunsteren is a promising development that reduces the computational costs of free energy calculations by simulating a single reference state. To evaluate the accuracy of both approaches in connection with enhanced sampling, EDS was implemented in CHARMM. For testing, we employ benchmark systems with analytical reference results and the mutation of alanine to serine. We find that SGLD with reweighting can provide accurate results for BAR and EDS where conventional molecular dynamics simulations fail. In addition, we compare the performance of EDS with other free energy methods. We briefly discuss the implications of our results and provide practical guidelines for conducting free energy simulations with SGLD.
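
    For reference, a minimal self-consistent Bennett acceptance ratio solver is sketched below (a generic textbook form assuming forward and reverse work values are already available, not the CHARMM implementation or the SGLD reweighting used in the study):

      import numpy as np
      from scipy.optimize import brentq

      def bar_delta_f(w_f, w_r, beta=1.0):
          """Bennett acceptance ratio estimate of Delta F.
          w_f: forward work values U1 - U0 sampled in state 0;
          w_r: reverse work values U0 - U1 sampled in state 1."""
          w_f, w_r = np.asarray(w_f), np.asarray(w_r)
          n0, n1 = len(w_f), len(w_r)
          fermi = lambda x: 1.0 / (1.0 + np.exp(np.clip(x, -500, 500)))

          def imbalance(df):
              c = df + np.log(n1 / n0) / beta      # Bennett's optimal shift
              return (fermi(beta * (w_f - c)).sum()
                      - fermi(beta * (w_r + c)).sum())

          # imbalance is monotonic in df; widen the bracket for extreme data
          return brentq(imbalance, -1000.0, 1000.0)

      # symmetric synthetic forward/reverse work values: estimate should be near zero
      rng = np.random.default_rng(0)
      print(bar_delta_f(rng.normal(1.0, 1.5, 500), rng.normal(1.0, 1.5, 500)))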

  17. 7 CFR 29.110 - Method of sampling.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling tobacco...

  18. Evaluation of sampling methods for Bacillus spore-contaminated HVAC filters.

    Science.gov (United States)

    Calfee, M Worth; Rose, Laura J; Tufts, Jenia; Morse, Stephen; Clayton, Matt; Touati, Abderrahmane; Griffin-Gatchalian, Nicole; Slone, Christina; McSweeney, Neal

    2014-01-01

    The objective of this study was to compare an extraction-based sampling method to two vacuum-based sampling methods (vacuum sock and 37mm cassette filter) with regards to their ability to recover Bacillus atrophaeus spores (surrogate for Bacillus anthracis) from pleated heating, ventilation, and air conditioning (HVAC) filters that are typically found in commercial and residential buildings. Electrostatic and mechanical HVAC filters were tested, both without and after loading with dust to 50% of their total holding capacity. The results were analyzed by one-way ANOVA across material types, presence or absence of dust, and sampling device. The extraction method gave higher relative recoveries than the two vacuum methods evaluated (p≤0.001). On average, recoveries obtained by the vacuum methods were about 30% of those achieved by the extraction method. Relative recoveries between the two vacuum methods were not significantly different (p>0.05). Although extraction methods yielded higher recoveries than vacuum methods, either HVAC filter sampling approach may provide a rapid and inexpensive mechanism for understanding the extent of contamination following a wide-area biological release incident. Published by Elsevier B.V.

  19. Single- versus multiple-sample method to measure glomerular filtration rate.

    Science.gov (United States)

    Delanaye, Pierre; Flamant, Martin; Dubourg, Laurence; Vidal-Petiot, Emmanuelle; Lemoine, Sandrine; Cavalier, Etienne; Schaeffner, Elke; Ebert, Natalie; Pottel, Hans

    2018-01-08

    There are many different ways to measure glomerular filtration rate (GFR) using various exogenous filtration markers, each having their own strengths and limitations. However, not only the marker, but also the methodology may vary in many ways, including the use of urinary or plasma clearance, and, in the case of plasma clearance, the number of time points used to calculate the area under the concentration-time curve, ranging from only one (Jacobsson method) to eight (or more) blood samples. We collected the results obtained from 5106 plasma clearances (iohexol or 51Cr-ethylenediaminetetraacetic acid (EDTA)) using three to four time points, allowing GFR calculation using the slope-intercept method and the Bröchner-Mortensen correction. For each time point, the Jacobsson formula was applied to obtain the single-sample GFR. We used Bland-Altman plots to determine the accuracy of the Jacobsson method at each time point. The single-sample method showed within 10% concordances with the multiple-sample method of 66.4%, 83.6%, 91.4% and 96.0% at the time points 120, 180, 240 and ≥300 min, respectively. Concordance was poorer at lower GFR levels, and this trend is in parallel with increasing age. Results were similar in males and females. Some discordance was found in the obese subjects. Single-sample GFR is highly concordant with a multiple-sample strategy, except in the low GFR range (<30 mL/min). © The Author 2018. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
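
    As a simplified sketch of the slope-intercept side of this comparison (a mono-exponential late fit and the commonly quoted Brochner-Mortensen coefficients are assumed; the Jacobsson single-sample formula itself is not reproduced, and the numbers in the example call are hypothetical):

      import numpy as np

      def gfr_slope_intercept(times_min, conc, dose):
          """Slope-intercept plasma clearance with Brochner-Mortensen correction.
          times_min: late sampling times (min); conc: plasma tracer concentrations;
          dose: injected tracer amount in matching units. Returns mL/min."""
          slope, ln_c0 = np.polyfit(times_min, np.log(conc), 1)
          k = -slope                              # elimination rate constant (1/min)
          c0 = np.exp(ln_c0)                      # back-extrapolated intercept
          cl_si = dose * k / c0                   # dose / AUC of the fitted exponential
          # commonly quoted Brochner-Mortensen coefficients (assumed here)
          return 0.990778 * cl_si - 0.001218 * cl_si ** 2

      print(round(gfr_slope_intercept([120, 180, 240], [1.36e-3, 7.9e-4, 4.6e-4], 40.0), 1))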

  20. Non-uniform sampling and wide range angular spectrum method

    International Nuclear Information System (INIS)

    Kim, Yong-Hae; Byun, Chun-Won; Oh, Himchan; Lee, JaeWon; Pi, Jae-Eun; Heon Kim, Gi; Lee, Myung-Lae; Ryu, Hojun; Chu, Hye-Yong; Hwang, Chi-Sun

    2014-01-01

    A novel method is proposed for simulating free space field propagation from a source plane to a destination plane that is applicable for both small and large propagation distances. The angular spectrum method (ASM) is widely used for simulating near field propagation, but it causes a numerical error when the propagation distance is large because of aliasing due to undersampling. The band-limited ASM satisfies the Nyquist sampling condition by limiting the bandwidth of the propagation field to avoid aliasing, which extends the applicable propagation distance of the ASM. However, the band-limited ASM also introduces an error due to the decrease of the effective sampling number in Fourier space when the propagation distance is large. In the proposed wide range ASM, we use non-uniform sampling in Fourier space to keep the effective sampling number constant even when the propagation distance is large. As a result, the wide range ASM can produce simulation results with high accuracy for both far and near field propagation. For non-paraxial wave propagation, we applied the wide range ASM to a shifted destination plane as well. (paper)
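
    For context, the baseline uniformly sampled ASM that the paper improves on can be sketched as follows (a standard transfer-function implementation, not the authors' non-uniform wide-range variant; the aperture and propagation parameters in the example are arbitrary):

      import numpy as np

      def angular_spectrum_propagate(u0, wavelength, dx, z):
          """Propagate a complex field u0 (N x N, pixel pitch dx) over distance z
          using the uniformly sampled angular spectrum method."""
          n = u0.shape[0]
          fx = np.fft.fftfreq(n, d=dx)                     # spatial frequencies
          fxx, fyy = np.meshgrid(fx, fx, indexing="ij")
          arg = 1.0 / wavelength ** 2 - fxx ** 2 - fyy ** 2
          kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
          h = np.exp(1j * kz * z) * (arg > 0)              # evanescent waves dropped
          return np.fft.ifft2(np.fft.fft2(u0) * h)

      u0 = np.zeros((256, 256), dtype=complex)
      u0[96:160, 96:160] = 1.0                             # square aperture
      u_z = angular_spectrum_propagate(u0, wavelength=633e-9, dx=10e-6, z=0.05)
      print(np.abs(u_z).max())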

  1. 19 CFR 151.83 - Method of sampling.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling. For...

  2. ALARA ASSESSMENT OF SETTLER SLUDGE SAMPLING METHODS

    International Nuclear Information System (INIS)

    Nelsen, L.A.

    2009-01-01

    The purpose of this assessment is to compare underwater and above water settler sludge sampling methods to determine if the added cost for underwater sampling for the sole purpose of worker dose reductions is justified. Initial planning for sludge sampling included container, settler and knock-out-pot (KOP) sampling. Due to the significantly higher dose consequence of KOP sludge, a decision was made to sample KOP underwater to achieve worker dose reductions. Additionally, initial plans were to utilize the underwater sampling apparatus for settler sludge. Since there are no longer plans to sample KOP sludge, the decision for underwater sampling for settler sludge needs to be revisited. The present sampling plan calls for spending an estimated $2,500,000 to design and construct a new underwater sampling system (per A21 C-PL-001 RevOE). This evaluation will compare and contrast the present method of above water sampling to the underwater method that is planned by the Sludge Treatment Project (STP) and determine if settler samples can be taken using the existing sampling cart (with potentially minor modifications) while maintaining doses to workers As Low As Reasonably Achievable (ALARA) and eliminate the need for costly redesigns, testing and personnel retraining

  3. ALARA ASSESSMENT OF SETTLER SLUDGE SAMPLING METHODS

    Energy Technology Data Exchange (ETDEWEB)

    NELSEN LA

    2009-01-30

    The purpose of this assessment is to compare underwater and above water settler sludge sampling methods to determine if the added cost for underwater sampling for the sole purpose of worker dose reductions is justified. Initial planning for sludge sampling included container, settler and knock-out-pot (KOP) sampling. Due to the significantly higher dose consequence of KOP sludge, a decision was made to sample KOP underwater to achieve worker dose reductions. Additionally, initial plans were to utilize the underwater sampling apparatus for settler sludge. Since there are no longer plans to sample KOP sludge, the decision for underwater sampling for settler sludge needs to be revisited. The present sampling plan calls for spending an estimated $2,500,000 to design and construct a new underwater sampling system (per A21 C-PL-001 RevOE). This evaluation will compare and contrast the present method of above water sampling to the underwater method that is planned by the Sludge Treatment Project (STP) and determine if settler samples can be taken using the existing sampling cart (with potentially minor modifications) while maintaining doses to workers As Low As Reasonably Achievable (ALARA) and eliminate the need for costly redesigns, testing and personnel retraining.

  4. Methods for Sampling and Measurement of Compressed Air Contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Stroem, L

    1976-10-15

    In order to improve the technique for measuring oil and water entrained in a compressed air stream, a laboratory study has been made of some methods for sampling and measurement. For this purpose water or oil as artificial contaminants were injected in thin streams into a test loop, carrying dry compressed air. Sampling was performed in a vertical run, down-stream of the injection point. Wall attached liquid, coarse droplet flow, and fine droplet flow were sampled separately. The results were compared with two-phase flow theory and direct observation of liquid behaviour. In a study of sample transport through narrow tubes, it was observed that, below a certain liquid loading, the sample did not move, the liquid remaining stationary on the tubing wall. The basic analysis of the collected samples was made by gravimetric methods. Adsorption tubes were used with success to measure water vapour. A humidity meter with a sensor of the aluminium oxide type was found to be unreliable. Oil could be measured selectively by a flame ionization detector, the sample being pretreated in an evaporation-condensation unit.

  5. Methods for Sampling and Measurement of Compressed Air Contaminants

    International Nuclear Information System (INIS)

    Stroem, L.

    1976-10-01

    In order to improve the technique for measuring oil and water entrained in a compressed air stream, a laboratory study has been made of some methods for sampling and measurement. For this purpose water or oil as artificial contaminants were injected in thin streams into a test loop, carrying dry compressed air. Sampling was performed in a vertical run, down-stream of the injection point. Wall attached liquid, coarse droplet flow, and fine droplet flow were sampled separately. The results were compared with two-phase flow theory and direct observation of liquid behaviour. In a study of sample transport through narrow tubes, it was observed that, below a certain liquid loading, the sample did not move, the liquid remaining stationary on the tubing wall. The basic analysis of the collected samples was made by gravimetric methods. Adsorption tubes were used with success to measure water vapour. A humidity meter with a sensor of the aluminium oxide type was found to be unreliable. Oil could be measured selectively by a flame ionization detector, the sample being pretreated in an evaporation-condensation unit.

  6. [Standard sample preparation method for quick determination of trace elements in plastic].

    Science.gov (United States)

    Yao, Wen-Qing; Zong, Rui-Long; Zhu, Yong-Fa

    2011-08-01

    Reference samples of electronic information product plastics containing heavy metals at known concentrations were prepared by the masterbatch method; their repeatability and precision were determined, and reference sample preparation procedures were established. X-ray fluorescence (XRF) spectroscopy was used to determine the repeatability and uncertainty in the analysis of the heavy metals and bromine in the samples. Working curves and measurement methods for the reference samples were established. The results showed that the method exhibited a very good linear relationship in the 200-2000 mg x kg(-1) concentration range for Hg, Pb, Cr and Br, and in the 20-200 mg x kg(-1) range for Cd, and that the repeatability over six replicate analyses was good. In tests of the circuit boards ICB288G and ICB288 from Mitsubishi Heavy Industries, the results agreed with the recommended values.
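
    To illustrate how such a working curve is used, a short sketch follows (all intensities and concentrations below are hypothetical, not values from the study):

      import numpy as np

      # hypothetical working-curve data for one element (e.g. Pb):
      # reference-sample concentrations (mg/kg) versus measured XRF intensities (counts)
      conc = np.array([200.0, 500.0, 1000.0, 1500.0, 2000.0])
      counts = np.array([410.0, 1020.0, 2050.0, 3100.0, 4080.0])

      slope, intercept = np.polyfit(conc, counts, 1)       # linear working curve
      unknown_counts = 1650.0                              # hypothetical unknown sample
      estimated_conc = (unknown_counts - intercept) / slope
      print(f"estimated concentration: {estimated_conc:.0f} mg/kg")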

  7. Research and application of sampling and analysis method of sodium aerosol

    International Nuclear Information System (INIS)

    Yu Xiaochen; Guo Qingzhou; Wen Ximeng

    1998-01-01

    A sampling and analysis method for sodium aerosol was investigated. Vacuum sampling technology is used in the sampling process, and the analysis methods adopted are volumetric analysis and atomic absorption. When the absolute sodium content is in the range of 0.1 mg to 1.0 mg, the deviation between the volumetric analysis and atomic absorption results is less than 2%. The method has been applied successfully to a sodium aerosol removal device. The analysis range, accuracy and precision meet the requirements for sodium aerosol research.

  8. Assessment of reagent effectiveness and preservation methods for equine faecal samples

    Directory of Open Access Journals (Sweden)

    Eva Vavrouchova

    2015-03-01

    Full Text Available The aim of our study was to identify the most suitable flotation solution and effective preservation method for the examination of equine faeces samples using the FLOTAC technique. Samples from naturally infected horses were transported to the laboratory and analysed accordingly. The sample from each horse was homogenized and divided into four parts: one was frozen, another two were preserved in different reagents, namely sodium acetate-acetic acid-formalin (SAF) or 5% formalin. The last part was examined as a fresh sample in three different flotation solutions (Sheather's solution, sodium chloride solution and sodium nitrate solution), all with a specific gravity of 1.200. The preserved samples were examined in the period from 14 to 21 days after collection. According to our results, the sucrose solution was the most suitable flotation solution for fresh samples (small strongyle eggs per gram: 706, compared to 360 in sodium chloride and 507 in sodium nitrate), and the sodium nitrate solution was the most efficient for the preserved samples (eggs per gram: 382, compared to 295 in the salt solution and 305 in the sucrose solution). Freezing appears to be the most effective method of sample preservation, resulting in minimal damage to fragile strongyle eggs, and therefore it is the simplest and most effective preservation method for the examination of large numbers of faecal samples without the necessity of examining them all within 48 hours of collection. Deep freezing as a preservation method for equine faeces samples has not yet, to our knowledge, been published.

  9. Sampling bee communities using pan traps: alternative methods increase sample size

    Science.gov (United States)

    Monitoring of the status of bee populations and inventories of bee faunas require systematic sampling. Efficiency and ease of implementation has encouraged the use of pan traps to sample bees. Efforts to find an optimal standardized sampling method for pan traps have focused on pan trap color. Th...

  10. New adaptive sampling method in particle image velocimetry

    International Nuclear Information System (INIS)

    Yu, Kaikai; Xu, Jinglei; Tang, Lan; Mo, Jianwei

    2015-01-01

    This study proposes a new adaptive method to enable the number of interrogation windows and their positions in a particle image velocimetry (PIV) image interrogation algorithm to become self-adapted according to the seeding density. The proposed method can relax the constraint of uniform sampling rate and uniform window size commonly adopted in the traditional PIV algorithm. In addition, the positions of the sampling points are redistributed on the basis of the spring force generated by the sampling points. The advantages include control of the number of interrogation windows according to the local seeding density and smoother distribution of sampling points. The reliability of the adaptive sampling method is illustrated by processing synthetic and experimental images. The synthetic example attests to the advantages of the sampling method. Compared with that of the uniform interrogation technique in the experimental application, the spatial resolution is locally enhanced when using the proposed sampling method. (technical design note)

  11. Results of tritium measurement in environmental samples and drainage

    International Nuclear Information System (INIS)

    Koike, Ryoji; Hirai, Yasuo

    1983-01-01

    In Ibaraki prefecture, the tritium concentration in the drainage from the nuclear facilities has been measured since 1974. Then, with the start of operation of the fuel reprocessing plant in 1977, the tritium concentration in environmental samples was to be measured also in order to examine the effect of the drainage on the environment. The results of the tritium measurement in Ibaraki prefecture up to about 1980 are described: sampling points, sampling and measuring methods, the tritium concentration in the drainage, air, inland water and seawater, respectively. The drainages have been taken from Japan Atomic Power Company, Japan Atomic Energy Research Institute, and Power Reactor and Nuclear Fuel Development Corporation (with the fuel reprocessing plant). The samples of air, inland water and seawater have been taken in the areas concerned. The tritium concentration was measured by a low-background liquid scintillation counter. The measured values in the environment have been generally at low level, not different from other areas. (Mori, K.)

  12. Soybean yield modeling using bootstrap methods for small samples

    Energy Technology Data Exchange (ETDEWEB)

    Dalposso, G.A.; Uribe-Opazo, M.A.; Johann, J.A.

    2016-11-01

    One of the problems that occurs when working with regression models concerns the sample size: since the statistical methods used in inferential analyses are asymptotic, a small sample may compromise the analysis because the estimates will be biased. An alternative is to use the bootstrap methodology, which in its non-parametric version does not require guessing or knowing the probability distribution that generated the original sample. In this work we used a small sample of soybean yield data and physical and chemical soil properties to determine a multiple linear regression model. Bootstrap methods were used for variable selection, identification of influential points and determination of confidence intervals for the model parameters. The results showed that the bootstrap methods enabled us to select the physical and chemical soil properties that were significant in the construction of the soybean yield regression model, to construct confidence intervals for the parameters, and to identify the points that had great influence on the estimated parameters. (Author)
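
    A minimal sketch of the case-resampling bootstrap idea for one regression coefficient is given below (hypothetical data; the study's multiple regression, variable selection and influence diagnostics are not reproduced):

      import numpy as np

      rng = np.random.default_rng(42)

      def bootstrap_slope_ci(x, y, n_boot=2000, alpha=0.05):
          """Non-parametric percentile bootstrap CI for the slope of a
          simple linear regression, resampling cases with replacement."""
          n = len(y)
          slopes = np.empty(n_boot)
          for b in range(n_boot):
              idx = rng.integers(0, n, n)
              slopes[b] = np.polyfit(x[idx], y[idx], 1)[0]
          return tuple(np.quantile(slopes, [alpha / 2, 1 - alpha / 2]))

      # hypothetical small sample: one soil property vs. soybean yield
      x = rng.normal(20.0, 3.0, size=25)
      y = 0.12 * x + rng.normal(0.0, 0.3, size=25)
      print(bootstrap_slope_ci(x, y))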

  13. Characterizing lentic freshwater fish assemblages using multiple sampling methods

    Science.gov (United States)

    Fischer, Jesse R.; Quist, Michael C.

    2014-01-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated that appreciable benefits over relatively few gears (e.g., to four) used in optimal seasons were not present. Specifically, over 90 % of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.

  14. Qualification for Safeguards Purposes of UF6 Sampling using Alumina – Results of the Evaluation Campaign of ABACC-Cristallini Method

    OpenAIRE

    ESTAEBAN ADOLFO; GAUTIER EDUARDO; MACHADO DA SILVA LUIS; FERNANDEZ MORENO SONIA; RENHA JR GERALDO; DIAS FABIO; PEREIRA DE OLIVEIRA JUNIOR OLIVIO; AMMARAGGI DAVID; MASON PETER; SORIANO MICHAEL; CROATTO PAUL; ZULEGER EVELYN; GIAQUINTO JOSEPH; HEXEL COLE; VERCOUTER THOMAS

    2017-01-01

    The procedure currently used to sample material from process lines in uranium enrichment plants consists of collecting the uranium hexafluoride (UF6) in the gaseous phase by desublimation inside a metal sampling cylinder cooled with liquid nitrogen or, in certain facilities, in a fluorothene P-10 type tube. The ABACC-Cristallini method (A-C method) has been proposed to collect the UF6 (gas) by adsorption on alumina (Al2O3) in the form of solid uranyl fluoride (UO2F2). This method uses a fluor...

  15. Comparison of vapor sampling system (VSS) and in situ vapor sampling (ISVS) methods on Tanks C-107, BY-108, and S-102

    International Nuclear Information System (INIS)

    Huckaby, J.L.; Edwards, J.A.; Evans, J.C.

    1996-05-01

    The objective of this report is to evaluate the equivalency of two methods used to sample nonradioactive gases and vapors in the Hanford Site high-level waste tank headspaces. In addition to the comparison of the two sampling methods, the effects of an in-line fine particle filter on sampling results are also examined to determine whether results are adversely affected by its presence. This report discusses data from a January 1996 sampling

  16. Comparison of sampling methods for radiocarbon dating of carbonyls in air samples via accelerator mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Schindler, Matthias, E-mail: matthias.schindler@physik.uni-erlangen.de; Kretschmer, Wolfgang; Scharf, Andreas; Tschekalinskij, Alexander

    2016-05-15

    Three new methods to sample and prepare various carbonyl compounds for radiocarbon measurements were developed and tested. Two of these procedures utilized the Strecker synthetic method to form amino acids from carbonyl compounds with either sodium cyanide or trimethylsilyl cyanide. The third procedure used semicarbazide to form crystalline carbazones with the carbonyl compounds. The resulting amino acids and semicarbazones were then separated and purified using thin layer chromatography. The separated compounds were then combusted to CO2 and reduced to graphite to determine 14C content by accelerator mass spectrometry (AMS). All of these methods were also compared with the standard carbonyl compound sampling method wherein a compound is derivatized with 2,4-dinitrophenylhydrazine and then separated by high-performance liquid chromatography (HPLC).

  17. Comparison of sampling methods for radiocarbon dating of carbonyls in air samples via accelerator mass spectrometry

    Science.gov (United States)

    Schindler, Matthias; Kretschmer, Wolfgang; Scharf, Andreas; Tschekalinskij, Alexander

    2016-05-01

    Three new methods to sample and prepare various carbonyl compounds for radiocarbon measurements were developed and tested. Two of these procedures utilized the Strecker synthetic method to form amino acids from carbonyl compounds with either sodium cyanide or trimethylsilyl cyanide. The third procedure used semicarbazide to form crystalline carbazones with the carbonyl compounds. The resulting amino acids and semicarbazones were then separated and purified using thin layer chromatography. The separated compounds were then combusted to CO2 and reduced to graphite to determine 14C content by accelerator mass spectrometry (AMS). All of these methods were also compared with the standard carbonyl compound sampling method wherein a compound is derivatized with 2,4-dinitrophenylhydrazine and then separated by high-performance liquid chromatography (HPLC).

  18. Radioactive air sampling methods

    CERN Document Server

    Maiello, Mark L

    2010-01-01

    Although the field of radioactive air sampling has matured and evolved over decades, it has lacked a single resource that assimilates technical and background information on its many facets. Edited by experts and with contributions from top practitioners and researchers, Radioactive Air Sampling Methods provides authoritative guidance on measuring airborne radioactivity from industrial, research, and nuclear power operations, as well as naturally occurring radioactivity in the environment. Designed for industrial hygienists, air quality experts, and health physicists, the book delves into the applied research advancing and transforming practice with improvements to measurement equipment, human dose modeling of inhaled radioactivity, and radiation safety regulations. To present a wide picture of the field, it covers the international and national standards that guide the quality of air sampling measurements and equipment. It discusses emergency response issues, including radioactive fallout and the assets used ...

  19. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods.

    Science.gov (United States)

    Coes, Alissa L; Paretti, Nicholas V; Foreman, William T; Iverson, Jana L; Alvarez, David A

    2014-03-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler, that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19-23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method. Published by Elsevier B.V.

  20. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods

    Science.gov (United States)

    Coes, Alissa L.; Paretti, Nicholas V.; Foreman, William T.; Iverson, Jana L.; Alvarez, David A.

    2014-01-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler, that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19–23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.

  1. Phylogenetic representativeness: a new method for evaluating taxon sampling in evolutionary studies

    Directory of Open Access Journals (Sweden)

    Passamonti Marco

    2010-04-01

    Full Text Available Abstract Background: Taxon sampling is a major concern in phylogenetic studies. Incomplete, biased, or improper taxon sampling can lead to misleading results in reconstructing evolutionary relationships. Several theoretical methods are available to optimize taxon choice in phylogenetic analyses. However, most involve some knowledge about the genetic relationships of the group of interest (i.e., the ingroup), or even a well-established phylogeny itself; these data are not always available in general phylogenetic applications. Results: We propose a new method to assess taxon sampling, developed from Clarke and Warwick statistics. This method aims to measure the "phylogenetic representativeness" of a given sample or set of samples and is based entirely on the pre-existing available taxonomy of the ingroup, which is commonly known to investigators. Moreover, our method also accounts for instability and discordance in taxonomies. A Python-based script suite, called PhyRe, has been developed to implement all analyses we describe in this paper. Conclusions: We show that this method is sensitive and allows direct discrimination between representative and unrepresentative samples. It is also informative about the addition of taxa to improve taxonomic coverage of the ingroup. While investigators' expertise remains essential in this field, phylogenetic representativeness provides an objective touchstone for planning phylogenetic studies.
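
    For intuition, a minimal version of the taxonomy-based distinctness statistic that such methods build on can be computed as follows (equal step weights between ranks and a hypothetical four-species sample are assumptions; PhyRe itself implements considerably more than this):

      from itertools import combinations

      def avg_taxonomic_distinctness(taxonomy):
          """Average taxonomic distinctness over all species pairs, where the
          pairwise weight is the first rank level at which the pair agrees.
          taxonomy: species name -> tuple of ranks from lowest (genus) upward."""
          species = list(taxonomy)
          weights = []
          for a, b in combinations(species, 2):
              ranks_a, ranks_b = taxonomy[a], taxonomy[b]
              shared = next((i + 1 for i, (ra, rb) in enumerate(zip(ranks_a, ranks_b))
                             if ra == rb), len(ranks_a) + 1)
              weights.append(shared)
          return sum(weights) / len(weights)

      sample = {                      # hypothetical ingroup sample
          "sp1": ("GenusA", "Fam1", "Ord1"),
          "sp2": ("GenusA", "Fam1", "Ord1"),
          "sp3": ("GenusB", "Fam1", "Ord1"),
          "sp4": ("GenusC", "Fam2", "Ord1"),
      }
      print(avg_taxonomic_distinctness(sample))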

  2. Fluidics platform and method for sample preparation

    Science.gov (United States)

    Benner, Henry W.; Dzenitis, John M.

    2016-06-21

    Provided herein are fluidics platforms and related methods for performing integrated sample collection and solid-phase extraction of a target component of the sample all in one tube. The fluidics platform comprises a pump, particles for solid-phase extraction and a particle-holding means. The method comprises contacting the sample with one or more reagents in a pump, coupling a particle-holding means to the pump and expelling the waste out of the pump while the particle-holding means retains the particles inside the pump. The fluidics platform and methods herein described allow solid-phase extraction without pipetting and centrifugation.

  3. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    Science.gov (United States)

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
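
    The probability sampling designs named above can be illustrated in a few lines (a toy sampling frame and arbitrary stratum labels are assumed):

      import numpy as np

      rng = np.random.default_rng(7)
      population = np.arange(1000)          # hypothetical sampling frame
      strata = population % 4               # hypothetical stratum labels

      # simple random sampling: equal, independent chance for every element
      srs = rng.choice(population, size=50, replace=False)

      # systematic sampling: random start, then every k-th element
      k = len(population) // 50
      start = rng.integers(k)
      systematic = population[start::k][:50]

      # stratified sampling: a separate simple random sample within each stratum
      stratified = np.concatenate([
          rng.choice(population[strata == s], size=12, replace=False)
          for s in np.unique(strata)
      ])
      print(len(srs), len(systematic), len(stratified))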

  4. Shear Strength of Remoulding Clay Samples Using Different Methods of Moulding

    Science.gov (United States)

    Norhaliza, W.; Ismail, B.; Azhar, A. T. S.; Nurul, N. J.

    2016-07-01

    Shear strength of clay soil is required to determine soil stability. Clay is known as a soil with complex natural formations, and it is very difficult to obtain undisturbed samples at the site. The aim of this paper was to determine the unconfined shear strength of remoulded clay prepared by different sample-moulding methods, namely Proctor compaction, a hand-operated soil compacter and a miniature mould. All the samples were remoulded with the same optimum moisture content (OMC) and density of 18% and 1880 kg/m3, respectively. The unconfined shear strength of the remoulded clay soils was 289.56 kPa at 4.8% strain for the Proctor compaction method, 261.66 kPa at 4.4% strain for the hand-operated method, and 247.52 kPa at 3.9% strain for the miniature mould method. Relative to the Proctor compaction method, the reduction in unconfined shear strength was 9.66% for the hand-operated method and 14.52% for the miniature mould method. Because there was no significant difference in the reduction of unconfined shear strength between the three methods, it can be concluded that remoulding clay by the hand-operated and miniature mould methods is acceptable, and these methods are suggested for preparing remoulded clay samples in future research. However, for comparison, the hand-operated method is more suitable for forming remoulded clay samples in terms of ease of use, time saving and lower energy requirements for unconfined shear strength determination purposes.

  5. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
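
    As a small illustration of clustering coded qualitative data (the binary code matrix is hypothetical, the choice of Jaccard distance with average linkage is an assumption, and latent class analysis is not shown):

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(0)
      codes = rng.integers(0, 2, size=(50, 12))   # 50 participants x 12 binary codes

      dist = pdist(codes.astype(bool), metric="jaccard")   # presence/absence distance
      tree = linkage(dist, method="average")               # hierarchical clustering
      clusters = fcluster(tree, t=3, criterion="maxclust") # cut into 3 clusters
      print(np.bincount(clusters)[1:])                     # cluster sizes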

  6. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  7. Coupling methods for multistage sampling

    OpenAIRE

    Chauvet, Guillaume

    2015-01-01

    Multistage sampling is commonly used for household surveys when no sampling frame exists, or when the population is scattered over a wide area. Multistage sampling usually introduces a complex dependence in the selection of the final units, which makes asymptotic results quite difficult to prove. In this work, we consider multistage sampling with simple random sampling without replacement at the first stage, and with an arbitrary sampling design for further stages. We consider coupling ...

  8. Present status of NMCC and sample preparation method for bio-samples

    International Nuclear Information System (INIS)

    Futatsugawa, S.; Hatakeyama, S.; Saitou, S.; Sera, K.

    1993-01-01

    In NMCC (Nishina Memorial Cyclotron Center) we are conducting research on nuclear medicine PET (positron emission computed tomography) and PIXE (particle induced X-ray emission) analysis using a compactly designed small cyclotron. The NMCC facilities have been opened to researchers of other institutions since April 1993. The present status of NMCC is described. Bio-samples (medical samples, plants, animals and environmental samples) have mainly been analyzed by PIXE in NMCC. Small amounts of bio-samples for PIXE are decomposed quickly and easily in a sealed PTFE (polytetrafluoroethylene) vessel with a microwave oven. This sample preparation method for bio-samples is also described. (author)

  9. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    Science.gov (United States)

    Popic, Tony J; Davila, Yvonne C; Wardle, Glenda M

    2013-01-01

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km² area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.

  10. Empirical method for matrix effects correction in liquid samples

    International Nuclear Information System (INIS)

    Vigoda de Leyt, Dora; Vazquez, Cristina

    1987-01-01

    A simple method for the determination of Cr, Ni and Mo in stainless steels is presented. In order to minimize matrix effects, the conditions of a liquid system to dissolve stainless steel chips have been developed. Pure element solutions were used as standards. Preparation of synthetic solutions containing all the elements of the steel, as well as mathematical corrections, is avoided, resulting in a simple chemical operation that simplifies the method of analysis. The variance analysis of the results obtained with steel samples shows that the three elements may be determined by comparison with the analytical curves obtained with the pure elements if the same parameters are used in the calibration curves. The accuracy and the precision were checked against other techniques using the British Chemical Standards of the Bureau of Analysed Samples Ltd. (England). (M.E.L.) [es

  11. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction: Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods: The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results: The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions: The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in

  12. Sampling Methods for Wallenius' and Fisher's Noncentral Hypergeometric Distributions

    DEFF Research Database (Denmark)

    Fog, Agner

    2008-01-01

    Several methods for generating variates with univariate and multivariate Wallenius' and Fisher's noncentral hypergeometric distributions are developed. Methods for the univariate distributions include: simulation of urn experiments, inversion by binary search, inversion by chop-down search from the mode, ratio-of-uniforms rejection method, and rejection by sampling in the tau domain. Methods for the multivariate distributions include: simulation of urn experiments, conditional method, Gibbs sampling, and Metropolis-Hastings sampling. These methods are useful for Monte Carlo simulation of models of biased sampling and models of evolution and for calculating moments and quantiles of the distributions.
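
    The first univariate technique listed, simulation of the biased urn experiment, can be sketched as follows for Wallenius' distribution; the parameter values are arbitrary and this is not the author's implementation:

      import numpy as np

      def wallenius_urn_sample(m1, m2, n, omega, rng):
          """Draw one variate from Wallenius' noncentral hypergeometric
          distribution by simulating the biased urn experiment: balls are
          taken one at a time, and at each step a red ball is picked with
          probability proportional to the total remaining weight of red
          balls (weight omega each) versus white balls (weight 1 each)."""
          x1, x2 = 0, 0                      # red / white balls drawn so far
          for _ in range(n):
              w_red = (m1 - x1) * omega      # remaining red weight
              w_white = (m2 - x2) * 1.0      # remaining white weight
              if rng.random() < w_red / (w_red + w_white):
                  x1 += 1
              else:
                  x2 += 1
          return x1                          # number of red balls drawn

      rng = np.random.default_rng(1)
      draws = [wallenius_urn_sample(m1=10, m2=15, n=12, omega=2.5, rng=rng)
               for _ in range(10000)]
      print("mean number of red balls drawn:", np.mean(draws))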

  13. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    Directory of Open Access Journals (Sweden)

    Tony J Popic

    Full Text Available Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km² area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.

  14. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1993-03-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others

  15. DOE methods for evaluating environmental and waste management samples.

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S C; McCulloch, M; Thomas, B L; Riley, R G; Sklarew, D S; Mong, G M; Fadeff, S K [eds.; Pacific Northwest Lab., Richland, WA (United States)

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others.

  16. A Comparison between Three Methods of Language Sampling: Freeplay, Narrative Speech and Conversation

    Directory of Open Access Journals (Sweden)

    Yasser Rezapour

    2011-10-01

    Full Text Available Objectives: Spontaneous language sample analysis is an important part of the language assessment protocol. Language samples give us useful information about how children use language in the natural situations of daily life. The purpose of this study was to compare Conversation, Freeplay, and narrative speech in terms of Mean Length of Utterance (MLU), Type-Token Ratio (TTR), and the number of utterances. Methods: By cluster sampling, a total of 30 Semnanian five-year-old boys with normal speech and language development were selected from the active kindergartens in Semnan city. Conversation, Freeplay, and narrative speech were the three language sample elicitation methods applied to obtain 15 minutes of each child's spontaneous language. Means for MLU, TTR, and the number of utterances were analyzed by dependent ANOVA. Results: The results showed no significant difference in the number of elicited utterances among these three language sampling methods. Narrative speech elicited longer MLU than Freeplay and Conversation, and compared to Freeplay and narrative speech, Conversation elicited higher TTR. Discussion: The results suggest that in the clinical assessment of Persian-speaking children, it is better to use narrative speech to elicit longer MLU and Conversation to elicit higher TTR.

  17. Rock sampling. [method for controlling particle size distribution]

    Science.gov (United States)

    Blum, P. (Inventor)

    1971-01-01

    A method for sampling rock and other brittle materials and for controlling resultant particle sizes is described. The method involves cutting grooves in the rock surface to provide a grouping of parallel ridges and subsequently machining the ridges to provide a powder specimen. The machining step may comprise milling, drilling, lathe cutting or the like; but a planing step is advantageous. Control of the particle size distribution is effected primarily by changing the height and width of these ridges. This control exceeds that obtainable by conventional grinding.

  18. The method of Sample Management in Neutron Activation Analysis Laboratory-Serpong

    International Nuclear Information System (INIS)

    Elisabeth-Ratnawati

    2005-01-01

    In a testing laboratory using the neutron activation analysis method, sample preparation is the main factor and cannot be neglected. Errors in sample preparation can give results with lower accuracy. This article explains the scheme of sample preparation, i.e. sample receipt administration, sample separation, fluid and solid sample preparation, sample grouping, irradiation, sample counting and post-irradiation sample holding. If sample management is applied properly, based on a Standard Operating Procedure, each sample has good traceability. Optimizing sample management requires trained and skilled personnel and a good facility. (author)

  19. Statistical Analysis Of Tank 19F Floor Sample Results

    International Nuclear Information System (INIS)

    Harris, S.

    2010-01-01

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis samples results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by a UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
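
    The report describes the UCL95% as a function of the number of samples, the average, and the standard deviation; a generic one-sided Student-t upper confidence limit of that form can be sketched as follows, using made-up concentrations rather than Tank 19F data:

      import numpy as np
      from scipy import stats

      # Hypothetical analyte concentrations from six scrape samples
      # (arbitrary units); these values are illustrative only.
      conc = np.array([1.8, 2.1, 2.4, 1.9, 2.2, 2.0])

      n = conc.size
      mean, sd = conc.mean(), conc.std(ddof=1)

      # One-sided upper 95% confidence limit on the mean concentration,
      # computed from the sample size, average, and standard deviation.
      t95 = stats.t.ppf(0.95, df=n - 1)
      ucl95 = mean + t95 * sd / np.sqrt(n)
      print(f"mean = {mean:.3f}, UCL95% = {ucl95:.3f}")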

  20. Estimating sample size for a small-quadrat method of botanical ...

    African Journals Online (AJOL)

    Reports the results of a study conducted to determine an appropriate sample size for a small-quadrat method of botanical survey for application in the Mixed Bushveld of South Africa. Species density and grass density were measured using a small-quadrat method in eight plant communities in the Nylsvley Nature Reserve.

  1. OPTIMAL METHOD FOR PREPARATION OF SILICATE ROCK SAMPLES FOR ANALYTICAL PURPOSES

    Directory of Open Access Journals (Sweden)

    Maja Vrkljan

    2004-12-01

    Full Text Available The purpose of this study was to determine an optimal dissolution method for silicate rock samples for further analytical purposes. The analytical FAAS method of determining cobalt, chromium, copper, nickel, lead and zinc content in a gabbro sample and the geochemical standard AGV-1 has been applied for verification. Dissolution in mixtures of various inorganic acids has been tested, as well as the Na2CO3 fusion technique. The results obtained by the different methods have been compared, and dissolution in the mixture of HNO3 + HF has been recommended as optimal.

  2. Failure Probability Calculation Method Using Kriging Metamodel-based Importance Sampling Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seunggyu [Korea Aerospace Research Institue, Daejeon (Korea, Republic of); Kim, Jae Hoon [Chungnam Nat’l Univ., Daejeon (Korea, Republic of)

    2017-05-15

    The kernel density was determined based on sampling points obtained in a Markov chain simulation and was assumed to be an importance sampling function. A Kriging metamodel was constructed in more detail in the vicinity of a limit state. The failure probability was calculated based on importance sampling, which was performed for the Kriging metamodel. A pre-existing method was modified to obtain more sampling points for the kernel density in the vicinity of the limit state. A stable numerical method was proposed to find a parameter of the kernel density. To assess the completeness of the Kriging metamodel, the possibility of changes in the calculated failure probability due to the uncertainty of the Kriging metamodel was calculated.
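
    A rough sketch of the importance-sampling step only, with a simple analytical limit state standing in for the Kriging metamodel and a shifted normal density standing in for the kernel density; all functions and parameters are assumptions, not the paper's method:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)

      # Limit state: g(x) <= 0 defines failure. A simple analytical g stands in
      # for the Kriging metamodel described in the abstract.
      def g(x):
          return 5.0 - x[:, 0] - x[:, 1]

      # Original inputs: two independent standard normal variables (density f).
      # Importance sampling density h: the same normals shifted toward the
      # limit state (a stand-in for the kernel density built from MCMC points).
      shift = 2.5
      N = 100_000
      x = rng.normal(loc=shift, scale=1.0, size=(N, 2))

      log_f = stats.norm.logpdf(x, loc=0.0, scale=1.0).sum(axis=1)
      log_h = stats.norm.logpdf(x, loc=shift, scale=1.0).sum(axis=1)
      weights = np.exp(log_f - log_h)               # likelihood ratio f/h
      fails = (g(x) <= 0.0)                         # indicator of failure

      pf_is = np.mean(fails * weights)              # importance sampling estimate
      pf_exact = stats.norm.sf(5.0 / np.sqrt(2.0))  # P(X1 + X2 >= 5), X1+X2 ~ N(0, 2)
      print(f"importance sampling estimate: {pf_is:.3e}, exact: {pf_exact:.3e}")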

  3. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Science.gov (United States)

    2010-07-01

    40 CFR Part 261, Appendix I, Representative Sampling Methods: The methods and equipment used for sampling waste materials will vary with the form and consistency of the waste materials to be sampled. Samples collected using the sampling...

  4. Mixed Methods Sampling: A Typology with Examples

    Science.gov (United States)

    Teddlie, Charles; Yu, Fen

    2007-01-01

    This article presents a discussion of mixed methods (MM) sampling techniques. MM sampling involves combining well-established qualitative and quantitative techniques in creative ways to answer research questions posed by MM research designs. Several issues germane to MM sampling are presented including the differences between probability and…

  5. Comparison of indoor air sampling and dust collection methods for fungal exposure assessment using quantitative PCR

    Science.gov (United States)

    Evaluating fungal contamination indoors is complicated because of the many different sampling methods utilized. In this study, fungal contamination was evaluated using five sampling methods and four matrices for results. The five sampling methods were a 48 hour indoor air sample ...

  6. Log sampling methods and software for stand and landscape analyses.

    Science.gov (United States)

    Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton; Shawn C. Clabough

    2008-01-01

    We describe methods for efficient, accurate sampling of logs at landscape and stand scales to estimate density, total length, cover, volume, and weight. Our methods focus on optimizing the sampling effort by choosing an appropriate sampling method and transect length for specific forest conditions and objectives. Sampling methods include the line-intersect method and...

  7. Evaluation of sampling methods for the detection of Salmonella in broiler flocks

    DEFF Research Database (Denmark)

    Skov, Marianne N.; Carstensen, B.; Tornoe, N.

    1999-01-01

    The present study compares four different sampling methods potentially applicable to detection of Salmonella in broiler flocks, based on collection of faecal samples (i) by hand, 300 fresh faecal samples, (ii) absorbed on five sheets of paper, (iii) absorbed on five pairs of socks (elastic cotton tubes pulled over the boots and termed 'socks') and (iv) by using only one pair of socks. Twenty-three broiler flocks were included in the investigation and 18 of these were found to be positive by at least one method. Seven serotypes of Salmonella with different patterns of transmission (mainly horizontal or vertical) were found in the investigation. The results showed that the sock method (five pairs of socks) had a sensitivity comparable with the hand collection method (60 pools of five faecal samples); the paper collection method was inferior, as was the use of only one pair of socks. Estimation ...

  8. New methods for sampling sparse populations

    Science.gov (United States)

    Anna Ringvall

    2007-01-01

    To improve surveys of sparse objects, methods that use auxiliary information have been suggested. Guided transect sampling uses prior information, e.g., from aerial photographs, for the layout of survey strips. Instead of being laid out straight, the strips will wind between potentially more interesting areas. 3P sampling (probability proportional to prediction) uses...

  9. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis samples results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).

  10. 19 CFR 151.70 - Method of sampling by Customs.

    Science.gov (United States)

    2010-04-01

    19 CFR § 151.70 (Customs Duties; Examination, Sampling, and Testing of Merchandise; Wool and Hair), Method of sampling by Customs: A general sample shall be taken from each sampling unit, unless it is not...

  11. Detection of protozoa in water samples by formalin/ether concentration method.

    Science.gov (United States)

    Lora-Suarez, Fabiana; Rivera, Raul; Triviño-Valencia, Jessica; Gomez-Marin, Jorge E

    2016-09-01

    Methods to detect protozoa in water samples are expensive and laborious. We evaluated the formalin/ether concentration method to detect Giardia sp., Cryptosporidium sp. and Toxoplasma in water. In order to test the properties of the method, we spiked water samples with different amounts of each protozoan (0, 10 and 50 cysts or oocysts) in a volume of 10 L of water. An immunofluorescence assay was used for detection of Giardia and Cryptosporidium. Toxoplasma oocysts were identified by morphology. The mean percent recovery in 10 repetitions of the entire method, in 10 samples spiked with ten parasites and read by three different observers, was 71.3 ± 12 for Cryptosporidium, 63 ± 10 for Giardia and 91.6 ± 9 for Toxoplasma, and the relative standard deviations of the method were 17.5, 17.2 and 9.8, respectively. Intraobserver variation, as measured by the intraclass correlation coefficient, was fair for Toxoplasma, moderate for Cryptosporidium and almost perfect for Giardia. The method was then applied to 77 samples of raw and drinkable water from three different water treatment plants. Cryptosporidium was found in 28 of 77 samples (36%) and Giardia in 31 of 77 samples (40%). These results identified significant differences among treatment processes in reducing the presence of Giardia and Cryptosporidium. In conclusion, the formalin/ether method to concentrate protozoa in water is a new alternative for low-resource countries, where there is an urgent need to monitor and follow the presence of these protozoa in drinkable water. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Sample processing method for the determination of perchlorate in milk

    International Nuclear Information System (INIS)

    Dyke, Jason V.; Kirk, Andrea B.; Kalyani Martinelango, P.; Dasgupta, Purnendu K.

    2006-01-01

    In recent years, many different water sources and foods have been reported to contain perchlorate. Studies indicate that significant levels of perchlorate are present in both human and dairy milk. The determination of perchlorate in milk is particularly important due to its potential health impact on infants and children. As for many other biological samples, sample preparation is more time consuming than the analysis itself. The concurrent presence of large amounts of fats, proteins, carbohydrates, etc., demands some initial cleanup; otherwise the separation column lifetime and the limit of detection are both greatly compromised. Reported milk processing methods require the addition of chemicals such as ethanol, acetic acid or acetonitrile. Reagent addition is undesirable in trace analysis. We report here an essentially reagent-free sample preparation method for the determination of perchlorate in milk. Milk samples are spiked with isotopically labeled perchlorate and centrifuged to remove lipids. The resulting liquid is placed in a disposable centrifugal ultrafilter device with a molecular weight cutoff of 10 kDa, and centrifuged. Approximately 5-10 ml of clear liquid, ready for analysis, is obtained from a 20 ml milk sample. Both bovine and human milk samples have been successfully processed and analyzed by ion chromatography-mass spectrometry (IC-MS). Standard addition experiments show good recoveries. The repeatability of the analytical result for the same sample in multiple sample cleanup runs ranged from 3 to 6% R.S.D. This processing technique has also been successfully applied for the determination of iodide and thiocyanate in milk

  13. Innovative methods for inorganic sample preparation

    Energy Technology Data Exchange (ETDEWEB)

    Essling, A.M.; Huff, E.A.; Graczyk, D.G.

    1992-04-01

    Procedures and guidelines are given for the dissolution of a variety of selected materials using fusion, microwave, and Parr bomb techniques. These materials include germanium glass, corium-concrete mixtures, and zeolites. Emphasis is placed on sample-preparation approaches that produce a single master solution suitable for complete multielement characterization of the sample. In addition, data are presented on the soil microwave digestion method approved by the Environmental Protection Agency (EPA). Advantages and disadvantages of each sample-preparation technique are summarized.

  14. Innovative methods for inorganic sample preparation

    International Nuclear Information System (INIS)

    Essling, A.M.; Huff, E.A.; Graczyk, D.G.

    1992-04-01

    Procedures and guidelines are given for the dissolution of a variety of selected materials using fusion, microwave, and Parr bomb techniques. These materials include germanium glass, corium-concrete mixtures, and zeolites. Emphasis is placed on sample-preparation approaches that produce a single master solution suitable for complete multielement characterization of the sample. In addition, data are presented on the soil microwave digestion method approved by the Environmental Protection Agency (EPA). Advantages and disadvantages of each sample-preparation technique are summarized

  15. A Novel Analysis Method for Paired-Sample Microbial Ecology Experiments.

    Science.gov (United States)

    Olesen, Scott W; Vora, Suhani; Techtmann, Stephen M; Fortney, Julian L; Bastidas-Oyanedel, Juan R; Rodríguez, Jorge; Hazen, Terry C; Alm, Eric J

    2016-01-01

    Many microbial ecology experiments use sequencing data to measure a community's response to an experimental treatment. In a common experimental design, two units, one control and one experimental, are sampled before and after the treatment is applied to the experimental unit. The four resulting samples contain information about the dynamics of organisms that respond to the treatment, but there are no analytical methods designed to extract exactly this type of information from this configuration of samples. Here we present an analytical method specifically designed to visualize and generate hypotheses about microbial community dynamics in experiments that have paired samples and few or no replicates. The method is based on the Poisson lognormal distribution, long studied in macroecology, which we found accurately models the abundance distribution of taxa counts from 16S rRNA surveys. To demonstrate the method's validity and potential, we analyzed an experiment that measured the effect of crude oil on ocean microbial communities in microcosm. Our method identified known oil degraders as well as two clades, Maricurvus and Rhodobacteraceae, that responded to amendment with oil but do not include known oil degraders. Our approach is sensitive to organisms that increased in abundance only in the experimental unit but less sensitive to organisms that increased in both control and experimental units, thus mitigating the role of "bottle effects".
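
    A small sketch of the distributional building block named in the abstract, the Poisson lognormal model for taxon counts; the parameters are arbitrary and this is not the authors' analysis code:

      import numpy as np
      from scipy import stats
      from scipy.integrate import quad

      rng = np.random.default_rng(3)

      # Simulate taxon counts from a Poisson lognormal distribution:
      # each taxon has a latent log-abundance drawn from a normal distribution,
      # and the observed count is Poisson with that (exponentiated) rate.
      mu, sigma, n_taxa = 1.0, 1.5, 2000
      latent_rate = np.exp(rng.normal(mu, sigma, n_taxa))
      counts = rng.poisson(latent_rate)

      def poisson_lognormal_pmf(k, mu, sigma):
          """P(K = k) under the Poisson lognormal model, obtained by numerical
          integration over the latent normal log-rate."""
          integrand = lambda z: stats.poisson.pmf(k, np.exp(z)) * stats.norm.pdf(z, mu, sigma)
          val, _ = quad(integrand, mu - 10 * sigma, mu + 10 * sigma)
          return val

      # Compare the empirical frequency of zero counts with the model prediction.
      print("empirical P(K=0):", np.mean(counts == 0))
      print("model     P(K=0):", poisson_lognormal_pmf(0, mu, sigma))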

  16. A Method for Microalgae Proteomics Analysis Based on Modified Filter-Aided Sample Preparation.

    Science.gov (United States)

    Li, Song; Cao, Xupeng; Wang, Yan; Zhu, Zhen; Zhang, Haowei; Xue, Song; Tian, Jing

    2017-11-01

    With the fast development of microalgal biofuel research, proteomics studies of microalgae have increased quickly. The filter-aided sample preparation (FASP) method has been a widely used proteomics sample preparation method since 2009. Here, a method for microalgae proteomics analysis based on modified filter-aided sample preparation (mFASP) is described to meet the characteristics of microalgae cells and eliminate the error caused by over-alkylation. Using Chlamydomonas reinhardtii as the model, the prepared sample was tested by standard LC-MS/MS and compared with previous reports. The results showed mFASP is suitable for most occasions in microalgae proteomics studies.

  17. 222Rn in water: A comparison of two sample collection methods and two sample transport methods, and the determination of temporal variation in North Carolina ground water

    International Nuclear Information System (INIS)

    Hightower, J.H. III

    1994-01-01

    Objectives of this field experiment were: (1) determine whether there was a statistically significant difference between the radon concentrations of samples collected by EPA's standard method, using a syringe, and an alternative, slow-flow method; (2) determine whether there was a statistically significant difference between the measured radon concentrations of samples mailed vs samples not mailed; and (3) determine whether there was a temporal variation of water radon concentration over a 7-month period. The field experiment was conducted at 9 sites, 5 private wells, and 4 public wells, at various locations in North Carolina. Results showed that a syringe is not necessary for sample collection, there was generally no significant radon loss due to mailing samples, and there was statistically significant evidence of temporal variations in water radon concentrations

  18. Preparation Of Deposited Sediment Sample By Casting Method For Environmental Study

    International Nuclear Information System (INIS)

    Hutabarat, Tommy; Ristin PI, Evarista

    2000-01-01

    The preparation of deposited sediment samples by the casting method for environmental study has been carried out. This method comprises size fraction separation and a casting process. The deposited sediment samples were wet sieved to separate the size fractions of >500 μm, (250-500) μm, (125-250) μm and (63-125) μm, and settling procedures were followed for the separation of the (40-63) μm, (20-40) μm and (10-20) μm fractions; the samples were then dried and ashed at 450 °C. In the casting process, polyester rapid-cure resin and methyl ethyl ketone peroxide (MEKP) hardener were used. The moulded sediment sample was poured into the caster and allowed to cure for 60 hours. The aim of this method is to obtain cast samples that can be used effectively and efficiently and that avoid cross-contamination between samples. Before casting, the samples were ground until fine. The results show that the cast product is ready to be used for natural radionuclide analysis.

  19. Sparse feature learning for instrument identification: Effects of sampling and pooling methods.

    Science.gov (United States)

    Han, Yoonchang; Lee, Subin; Nam, Juhan; Lee, Kyogu

    2016-05-01

    Feature learning for music applications has recently received considerable attention from many researchers. This paper reports on a sparse feature learning algorithm for musical instrument identification, and in particular, focuses on the effects of the frame sampling techniques for dictionary learning and the pooling methods for feature aggregation. To this end, two frame sampling techniques are examined: fixed and proportional random sampling. Furthermore, the effect of using onset frames was analyzed for both proposed sampling methods. Regarding summarization of the feature activation, a standard deviation pooling method is used and compared with the commonly used max- and average-pooling techniques. Using more than 47 000 recordings of 24 instruments from various performers, playing styles, and dynamics, a number of tuning parameters are experimented with, including the analysis frame size, the dictionary size, and the type of frequency scaling as well as the different sampling and pooling methods. The results show that the combination of proportional sampling and standard deviation pooling achieves the best overall performance of 95.62%, while the optimal parameter set varies among the instrument classes.
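
    A brief sketch of the pooling and proportional frame-sampling ideas discussed above, using random numbers in place of real sparse-coding activations; the array sizes and sampling fraction are assumptions:

      import numpy as np

      rng = np.random.default_rng(4)

      # Hypothetical frame-level feature activations for one audio clip:
      # rows are analysis frames, columns are dictionary atoms.
      activations = rng.random((500, 64))

      def pool(features, method):
          """Summarize frame-level activations into one clip-level vector."""
          if method == "max":
              return features.max(axis=0)
          if method == "average":
              return features.mean(axis=0)
          if method == "std":                      # standard deviation pooling
              return features.std(axis=0)
          raise ValueError(method)

      # Proportional random frame sampling: keep a fixed fraction of frames
      # (so longer recordings contribute more frames than shorter ones).
      def proportional_sample(features, fraction, rng):
          n_keep = max(1, int(round(fraction * len(features))))
          idx = rng.choice(len(features), size=n_keep, replace=False)
          return features[idx]

      sampled = proportional_sample(activations, fraction=0.2, rng=rng)
      clip_vector = pool(sampled, "std")
      print(clip_vector.shape)        # one 64-dimensional descriptor per clip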

  20. Chapter 12. Sampling and analytical methods

    International Nuclear Information System (INIS)

    Busenberg, E.; Plummer, L.N.; Cook, P.G.; Solomon, D.K.; Han, L.F.; Groening, M.; Oster, H.

    2006-01-01

    When water samples are taken for the analysis of CFCs, regardless of the sampling method used, contamination of samples by contact with atmospheric air (with its 'high' CFC concentrations) is a major concern. This is because groundwaters usually have lower CFC concentrations than those waters which have been exposed to the modern air. Some groundwaters might not contain CFCs and, therefore, are most sensitive to trace contamination by atmospheric air. Thus, extreme precautions are needed to obtain uncontaminated samples when groundwaters, particularly those with older ages, are sampled. It is recommended at the start of any CFC investigation that samples from a CFC-free source be collected and analysed, as a check upon the sampling equipment and methodology. The CFC-free source might be a deep monitoring well or, alternatively, CFC-free water could be carefully prepared in the laboratory. It is especially important that all tubing, pumps and connections that will be used in the sampling campaign be checked in this manner

  1. Measuring larval nematode contamination on cattle pastures: Comparing two herbage sampling methods.

    Science.gov (United States)

    Verschave, S H; Levecke, B; Duchateau, L; Vercruysse, J; Charlier, J

    2015-06-15

    Assessing levels of pasture larval contamination is frequently used to study the population dynamics of the free-living stages of parasitic nematodes of livestock. Direct quantification of infective larvae (L3) on herbage is the most applied method to measure pasture larval contamination. However, herbage collection remains labour intensive and there is a lack of studies addressing the variation induced by the sampling method and the required sample size. The aim of this study was (1) to compare two different sampling methods in terms of pasture larval count results and time required to sample, (2) to assess the amount of variation in larval counts at the level of sample plot, pasture and season, respectively and (3) to calculate the required sample size to assess pasture larval contamination with a predefined precision using random plots across pasture. Eight young stock pastures of different commercial dairy herds were sampled in three consecutive seasons during the grazing season (spring, summer and autumn). On each pasture, herbage samples were collected through both a double-crossed W-transect with samples taken every 10 steps (method 1) and four random located plots of 0.16 m(2) with collection of all herbage within the plot (method 2). The average (± standard deviation (SD)) pasture larval contamination using sampling methods 1 and 2 was 325 (± 479) and 305 (± 444)L3/kg dry herbage (DH), respectively. Large discrepancies in pasture larval counts of the same pasture and season were often seen between methods, but no significant difference (P = 0.38) in larval counts between methods was found. Less time was required to collect samples with method 2. This difference in collection time between methods was most pronounced for pastures with a surface area larger than 1 ha. The variation in pasture larval counts from samples generated by random plot sampling was mainly due to the repeated measurements on the same pasture in the same season (residual variance

  2. Evaluation of analytical results on DOE Quality Assessment Program Samples

    International Nuclear Information System (INIS)

    Jaquish, R.E.; Kinnison, R.R.; Mathur, S.P.; Sastry, R.

    1985-01-01

    Criteria were developed for evaluating the participants' analytical results in the DOE Quality Assessment Program (QAP). Historical data from previous QAP studies were analyzed using descriptive statistical methods to determine the interlaboratory precision that had been attained. Performance criteria used in other similar programs were also reviewed. Using these data, precision values and control limits were recommended for each type of analysis performed in the QA program. Results of the analysis performed by the QAP participants on the November 1983 samples were statistically analyzed and evaluated. The Environmental Measurements Laboratory (EML) values were used as the known values and 3-sigma precision values were used as control limits. Results were submitted by 26 participating laboratories for 49 different radionuclide-media combinations. The participants reported 419 results and of these, 350 or 84% were within control limits. Special attention was given to the data from gamma spectral analysis of air filters and water samples. Both normal probability and box plots were prepared for each nuclide to help evaluate the distribution of the data. Results that were outside the expected range were identified, and suggestions were made that laboratories check calculations and procedures for these results

  3. An adaptive Monte Carlo method under emission point as sampling station for deep penetration calculation

    International Nuclear Information System (INIS)

    Wang, Ruihong; Yang, Shulin; Pei, Lucheng

    2011-01-01

    The deep penetration problem has been one of the difficult problems in shielding calculations with the Monte Carlo method for several decades. In this paper, an adaptive technique using the emission point as a sampling station is presented. The main advantage is to choose the most suitable sampling number from the emission point station to obtain the minimum value of the total cost in the process of the random walk. Further, a related importance sampling method is also derived. The main principle is to define the importance function of the response due to the particle state and to ensure that the sampling number of the emission particles is proportional to the importance function. The numerical results show that the adaptive method using the emission point as a station can overcome the difficulty of underestimating the result to some degree, and the related importance sampling method gives satisfactory results as well. (author)

  4. Mapping species distributions with MAXENT using a geographically biased sample of presence data: a performance assessment of methods for correcting sampling bias.

    Science.gov (United States)

    Fourcade, Yoan; Engler, Jan O; Rödder, Dennis; Secondi, Jean

    2014-01-01

    MAXENT is now a common species distribution modeling (SDM) tool used by conservation practitioners for predicting the distribution of a species from a set of records and environmental predictors. However, datasets of species occurrence used to train the model are often biased in the geographical space because of unequal sampling effort across the study area. This bias may be a source of strong inaccuracy in the resulting model and could lead to incorrect predictions. Although a number of sampling bias correction methods have been proposed, there is no consensual guideline to account for it. We compared here the performance of five methods of bias correction on three datasets of species occurrence: one "virtual" derived from a land cover map, and two actual datasets for a turtle (Chrysemys picta) and a salamander (Plethodon cylindraceus). We subjected these datasets to four types of sampling biases corresponding to potential types of empirical biases. We applied five correction methods to the biased samples and compared the outputs of distribution models to unbiased datasets to assess the overall correction performance of each method. The results revealed that the ability of methods to correct the initial sampling bias varied greatly depending on bias type, bias intensity and species. However, the simple systematic sampling of records consistently ranked among the best performing across the range of conditions tested, whereas other methods performed more poorly in most cases. The strong effect of initial conditions on correction performance highlights the need for further research to develop a step-by-step guideline to account for sampling bias. However, this method seems to be the most efficient in correcting sampling bias and should be advised in most cases.
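
    One common way to implement the "systematic sampling of records" correction is spatial thinning to at most one record per grid cell; the sketch below uses synthetic occurrence points and an assumed cell size, and is not the authors' procedure:

      import numpy as np

      rng = np.random.default_rng(5)

      # Hypothetical occurrence records (longitude, latitude), spatially biased
      # toward one heavily surveyed corner of the study area.
      occ = np.vstack([rng.normal([2.0, 2.0], 0.3, (400, 2)),
                       rng.uniform(0.0, 10.0, (100, 2))])

      def spatial_thin(points, cell_size, rng):
          """Keep at most one record per grid cell, one simple way of
          systematically sampling records to reduce spatial sampling bias."""
          cells = np.floor(points / cell_size).astype(int)
          kept = []
          for cell in np.unique(cells, axis=0):
              idx = np.where((cells == cell).all(axis=1))[0]
              kept.append(rng.choice(idx))          # one random record per cell
          return points[np.array(kept)]

      thinned = spatial_thin(occ, cell_size=1.0, rng=rng)
      print(len(occ), "records before thinning,", len(thinned), "after")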

  5. Survey of sampling-based methods for uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; Sallaberry, C.J.; Storlie, C.B.

    2006-01-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (ii) generation of samples from uncertain analysis inputs, (iii) propagation of sampled inputs through an analysis, (iv) presentation of uncertainty analysis results, and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition

  6. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J. PhD. (.; .); Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
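
    A toy illustration of steps (2), (3), and (5) above, using a made-up model and rank transformation followed by correlation analysis as the sensitivity procedure; the inputs, model, and sample size are assumptions:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)

      # Step (2): generate a random sample of the uncertain inputs.
      n = 1000
      x1 = rng.uniform(0.0, 1.0, n)
      x2 = rng.normal(5.0, 1.0, n)
      x3 = rng.uniform(0.0, 2.0, n)

      # Step (3): propagate the sampled inputs through a toy analysis model.
      y = 4.0 * x1 + 0.5 * x2 + 0.1 * x3 ** 2 + rng.normal(0.0, 0.2, n)

      # Step (5): rank transformation plus correlation analysis
      # (Spearman rank correlation coefficients as sensitivity measures).
      for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
          rho, p = stats.spearmanr(x, y)
          print(f"{name}: rank correlation = {rho:+.3f} (p = {p:.2g})")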

  7. Examination of Hydrate Formation Methods: Trying to Create Representative Samples

    Energy Technology Data Exchange (ETDEWEB)

    Kneafsey, T.J.; Rees, E.V.L.; Nakagawa, S.; Kwon, T.-H.

    2011-04-01

    is placed in a sample, then the sample is flooded with water and cooled [Priest et al., 2009]. We have performed a number of tests in which hydrate was formed and the uniformity of the hydrate formation was examined. These tests have primarily used a variety of modifications of the excess gas method to make the hydrate, although we have also used a version of the excess water technique. Early on, we found difficulties in creating uniform samples with a particular sand/initial water saturation combination (F-110 Sand, ~35% initial water saturation). In many of our tests we selected this combination intentionally to determine whether we could use a method to make the samples uniform. The following methods were examined: Excess gas, Freeze/thaw/form, Freeze/pressurize/thaw, Excess gas followed by water saturation, Excess water, Sand and kaolinite, Use of a nucleation enhancer (SnoMax), and Use of salt in the water. Below, each method, the underlying hypothesis, and our results are briefly presented, followed by a brief conclusion. Many of the hypotheses investigated are not our own, but were presented to us. Much of the data presented is from x-ray CT scanning our samples. The x-ray CT scanner provides a three-dimensional density map of our samples. From this map and the physics that is occurring in our samples, we are able to gain an understanding of the spatial nature of the processes that occur, and attribute them to the locations where they occur.

  8. Development of sample preparation method for honey analysis using PIXE

    International Nuclear Information System (INIS)

    Saitoh, Katsumi; Chiba, Keiko; Sera, Koichiro

    2008-01-01

    We developed an original preparation method for honey samples (samples in paste-like state) specifically designed for PIXE analysis. The results of PIXE analysis of thin targets prepared by adding a standard containing nine elements to honey samples demonstrated that the preparation method bestowed sufficient accuracy on quantitative values. PIXE analysis of 13 kinds of honey was performed, and eight mineral components (Si, P, S, K, Ca, Mn, Cu and Zn) were detected in all honey samples. The principal mineral components were K and Ca, and the quantitative value for K accounted for the majority of the total value for mineral components. K content in honey varies greatly depending on the plant source. Chestnuts had the highest K content. In fact, it was 2-3 times that of Manuka, which is known as a high quality honey. K content of false-acacia, which is produced in the greatest abundance, was 1/20 that of chestnuts. (author)

  9. Methods of human body odor sampling: the effect of freezing.

    Science.gov (United States)

    Lenochova, Pavlina; Roberts, S Craig; Havlicek, Jan

    2009-02-01

    Body odor sampling is an essential tool in human chemical ecology research. However, methodologies of individual studies vary widely in terms of sampling material, length of sampling, and sample processing. Although these differences might have a critical impact on results obtained, almost no studies test validity of current methods. Here, we focused on the effect of freezing samples between collection and use in experiments involving body odor perception. In 2 experiments, we tested whether axillary odors were perceived differently by raters when presented fresh or having been frozen and whether several freeze-thaw cycles affected sample quality. In the first experiment, samples were frozen for 2 weeks, 1 month, or 4 months. We found no differences in ratings of pleasantness, attractiveness, or masculinity between fresh and frozen samples. Similarly, almost no differences between repeatedly thawed and fresh samples were found. We found some variations in intensity; however, this was unrelated to length of storage. The second experiment tested differences between fresh samples and those frozen for 6 months. Again no differences in subjective ratings were observed. These results suggest that freezing has no significant effect on perceived odor hedonicity and that samples can be reliably used after storage for relatively long periods.

  10. An improved correlated sampling method for calculating correction factor of detector

    International Nuclear Information System (INIS)

    Wu Zhen; Li Junli; Cheng Jianping

    2006-01-01

    In the case of a small detector lying inside a bulk medium, there are two problems in calculating the correction factors of the detector. One is that the detector is too small for the particles to arrive at and collide in; the other is that the ratio of the two quantities is not accurate enough. The method discussed in this paper, which combines correlated sampling with modified particle collision auto-importance sampling and has been realized on the MCNP-4C platform, can solve these two problems. In addition, three other variance reduction techniques are also combined with correlated sampling, respectively, to calculate a simple model of the detector correction factors. The results prove that, although all the variance reduction techniques combined with correlated sampling can improve the calculation efficiency, the method combining the modified particle collision auto-importance sampling with correlated sampling is the most efficient one. (authors)

  11. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    Science.gov (United States)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. The hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS and simulation using LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to three other simulation methods. The results show that for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
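
    A highly simplified, unconditional sketch of the LHS-plus-LU idea: stratified standard normal deviates are multiplied by the lower-triangular factor of an assumed exponential covariance matrix to produce a spatially correlated field (the conditional variant and the authors' exact construction are not shown, and the grid and parameters are illustrative):

      import numpy as np
      from scipy.linalg import cholesky
      from scipy.stats import norm

      rng = np.random.default_rng(7)

      # 1-D grid of cells and an exponential covariance model for the
      # hydrogeological property field (illustrative parameters).
      x = np.linspace(0.0, 100.0, 50)
      corr_length, variance = 20.0, 1.0
      cov = variance * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_length)

      # Latin hypercube sample of standard normal deviates: one stratum per
      # cell, randomly permuted, then mapped through the normal inverse CDF.
      n_cells = x.size
      u = (rng.permutation(n_cells) + rng.random(n_cells)) / n_cells
      z = norm.ppf(u)

      # The lower-triangular (LU/Cholesky) factor of the covariance matrix
      # imposes the spatial correlation on the stratified deviates.
      L = cholesky(cov + 1e-10 * np.eye(n_cells), lower=True)
      field = L @ z

      print("simulated field: mean %.2f, variance %.2f" % (field.mean(), field.var()))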

  12. Recent Results from the SAMPLE Experiment

    International Nuclear Information System (INIS)

    Ito, Takeyasu M.

    2004-01-01

    The previous two SAMPLE experiments yielded a measurement of the axial e-N form factor G_A^e substantially different from the theoretical estimate. In order to confirm this observation, a third SAMPLE experiment was carried out at a lower beam energy of 125 MeV (Q² = 0.038 (GeV/c)²) on a deuterium target. The data analysis is now at the final stage and the results are consistent with the theoretical prediction of the axial form factor G_A^e. Also, reevaluation of the background dilution factor and the electromagnetic radiative correction for the 200 MeV deuterium data leads to updated results, which are also consistent with the theoretical prediction

  13. Neonatal blood gas sampling methods | Goenka | South African ...

    African Journals Online (AJOL)

    There is little published guidance that systematically evaluates the different methods of neonatal blood gas sampling, where each method has its individual benefits and risks. This review critically surveys the available evidence to generate a comparison between arterial and capillary blood gas sampling, focusing on their ...

  14. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    International Nuclear Information System (INIS)

    Campolina, Daniel; Lima, Paulo Rubens I.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2015-01-01

    Sample size and computational uncertainty were varied in order to investigate sample efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate an LWR model and allow the mapping from uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. Mean range, standard deviation range and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties for 10 replicates of n samples was adopted as the convergence criterion for the method. Estimation of a 75 pcm uncertainty on reactor k_eff was accomplished by using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, in order to reduce the variance of the propagated uncertainty, it was found, for the example under investigation, that it is preferable to double the sample size rather than to double the number of particles followed in the Monte Carlo process of the MCNPX code. (author)

  15. A study on the weather sampling method for probabilistic consequence analysis

    International Nuclear Information System (INIS)

    Oh, Hae Cheol

    1996-02-01

    The main task of a probabilistic accident consequence analysis model is to predict the radiological situation and to provide a reliable quantitative data base for making decisions on countermeasures. The magnitude of an accident's consequences depends on the characteristics of the accident and the coincident weather. In probabilistic accident consequence analysis, it is necessary to repeat the atmospheric dispersion calculation with several hundred weather sequences to predict the full distribution of consequences which may occur following a postulated accidental release. It is desirable to select a representative sample of weather sequences from a meteorological record which is typical of the area over which the released radionuclides will disperse and which spans a sufficiently long period. The selection process is done by means of sampling techniques applied to a full year of hourly weather data characteristic of the plant site. In this study, the proposed weighted importance sampling method selects sequences in proportion to each bin size, to closely approximate the true frequency distribution of weather conditions at the site. The weighted importance sampling method results in substantially less sampling uncertainty than the previous technique and can therefore improve confidence in risk estimates.
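
    A minimal sketch of bin-proportional selection is shown below, assuming the hourly weather sequences have already been classified into bins; the quota rounding and the guarantee of at least one sequence per bin are illustrative choices, not the study's exact algorithm.

        import numpy as np

        def bin_proportional_selection(bin_labels, n_select, seed=0):
            # Select weather sequences so that each bin contributes a number of sequences
            # roughly proportional to its size, approximating the true frequency
            # distribution of weather conditions at the site.
            rng = np.random.default_rng(seed)
            labels = np.asarray(bin_labels)
            bins, counts = np.unique(labels, return_counts=True)
            quota = np.maximum(1, np.round(n_select * counts / counts.sum()).astype(int))
            chosen = []
            for b, q in zip(bins, quota):
                members = np.flatnonzero(labels == b)
                chosen.extend(rng.choice(members, size=min(q, len(members)), replace=False))
            return np.array(chosen)        # indices of the selected weather sequences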

  16. A Method for Choosing the Best Samples for Mars Sample Return.

    Science.gov (United States)

    Gordon, Peter R; Sephton, Mark A

    2018-05-01

    Success of a future Mars Sample Return mission will depend on the correct choice of samples. Pyrolysis-FTIR can be employed as a triage instrument for Mars Sample Return. The technique can thermally dissociate minerals and organic matter for detection. Identification of certain mineral types can determine the habitability of the depositional environment, past or present, while detection of organic matter may suggest past or present habitation. In Mars' history, the Theiikian era represents an attractive target for life search missions and the acquisition of samples. The acidic and increasingly dry Theiikian may have been habitable and followed a lengthy neutral and wet period in Mars' history during which life could have originated and proliferated to achieve relatively abundant levels of biomass with a wide distribution. Moreover, the sulfate minerals produced in the Theiikian are also known to be good preservers of organic matter. We have used pyrolysis-FTIR and samples from a Mars analog ferrous acid stream with a thriving ecosystem to test the triage concept. Pyrolysis-FTIR identified those samples with the greatest probability of habitability and habitation. A three-tier scoring system was developed based on the detection of (i) organic signals, (ii) carbon dioxide and water, and (iii) sulfur dioxide. The presence of each component was given a score of A, B, or C depending on whether the substance had been detected, tentatively detected, or not detected, respectively. Single-step (for greatest possible sensitivity) or multistep (for more diagnostic data) pyrolysis-FTIR methods informed the assignments. The system allowed the highest-priority samples to be categorized as AAA (or A*AA if the organic signal was complex), while the lowest-priority samples could be categorized as CCC. Our methods provide a mechanism with which to rank samples and identify those that should take the highest priority for return to Earth during a Mars Sample Return mission.
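
    The three-tier scoring described above maps naturally onto a short decision rule. The sketch below encodes only what the abstract states (A/B/C per component, with A* for a complex organic signal); the input vocabulary is an assumption.

        def triage_score(organics, co2_h2o, so2, complex_organics=False):
            # Each argument is 'detected', 'tentative' or 'absent'; the returned string
            # ranks the sample, e.g. 'AAA' (or 'A*AA' for a complex organic signal)
            # down to 'CCC' for the lowest-priority samples.
            grade = {"detected": "A", "tentative": "B", "absent": "C"}
            first = "A*" if (organics == "detected" and complex_organics) else grade[organics]
            return first + grade[co2_h2o] + grade[so2]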

  17. Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.

    Science.gov (United States)

    Kieffer, Kevin M.; Thompson, Bruce

    As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample-size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…

  18. Evaluation of Stress Loaded Steel Samples Using Selected Electromagnetic Methods

    International Nuclear Information System (INIS)

    Chady, T.

    2004-01-01

    In this paper the magnetic leakage flux and eddy current methods were used to evaluate changes of material properties caused by stress. Seven samples made of ferromagnetic material with different levels of applied stress were prepared. First, the leakage magnetic fields were measured by scanning the surface of the specimens with a GMR gradiometer. Next, the same samples were evaluated using an eddy current sensor. A comparison between the results obtained from both methods was carried out. Finally, selected parameters of the measured signal were calculated and utilized to evaluate the level of the applied stress. A strong correspondence between the amount of applied stress and the maximum amplitude of the derivative was confirmed.

  19. Two media method for linear attenuation coefficient determination of irregular soil samples

    International Nuclear Information System (INIS)

    Vici, Carlos Henrique Georges

    2004-01-01

    In several situations in nuclear applications, such as soil physics and geology, knowledge of the gamma-ray linear attenuation coefficient of irregular samples is necessary. This work presents the validation of a methodology for the determination of the linear attenuation coefficient (μ) of irregularly shaped samples, in such a way that it is not necessary to know the thickness of the sample considered. With this methodology, irregular soil samples (undeformed field samples) from the Londrina region, north of Paraná, were studied. The two media method was employed for the determination of μ. It consists of determining μ through the measurement of the attenuation of a gamma-ray beam by the sample sequentially immersed in two different media with known and appropriately chosen attenuation coefficients. For comparison, the theoretical value of μ was calculated as the product of the mass attenuation coefficient, obtained with the WinXcom code, and the measured sample density. This software employs the chemical composition of the samples and supplies a table of mass attenuation coefficients versus photon energy. To verify the validity of the two media method, compared with the simple gamma-ray transmission method, regular pumice stone samples were used. With these results for the attenuation coefficients and their respective deviations, it was possible to compare the two methods. We conclude that the two media method is a good tool for the determination of the linear attenuation coefficient of irregular materials, particularly in the study of soil samples. (author)
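
    As an illustration of the algebra typically underlying such measurements (the study's exact formulation may differ), assume a narrow collimated beam crossing a container of internal path length L that holds the irregular sample of unknown thickness x, the remainder of the path being filled by medium i with known coefficient mu_i, so that I_i = I_0 exp(-mu*x - mu_i*(L - x)) for i = 1, 2. Two measurements then determine both x and mu:

        import numpy as np

        def two_media_attenuation(I0, I1, I2, mu1, mu2, L):
            # Subtracting the log-attenuations measured in the two media eliminates the
            # unknown mu and yields the sample thickness x; mu then follows from either
            # single measurement.
            medium_path = np.log(I1 / I2) / (mu2 - mu1)      # L - x
            x = L - medium_path                              # reconstructed sample thickness
            mu = (np.log(I0 / I1) - mu1 * medium_path) / x   # linear attenuation coefficient
            return mu, x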

  20. Sample preparation method for scanning force microscopy

    CERN Document Server

    Jankov, I R; Szente, R N; Carreno, M N P; Swart, J W; Landers, R

    2001-01-01

    We present a method of sample preparation for studies of ion implantation on metal surfaces. The method, employing a mechanical mask, is specially adapted for samples analysed by Scanning Force Microscopy. It was successfully tested on polycrystalline copper substrates implanted with phosphorus ions at an acceleration voltage of 39 keV. The changes of the electrical properties of the surface were measured by Kelvin Probe Force Microscopy and the surface composition was analysed by Auger Electron Spectroscopy.

  1. Sampling and analysis methods for geothermal fluids and gases

    Energy Technology Data Exchange (ETDEWEB)

    Watson, J.C.

    1978-07-01

    The sampling procedures for geothermal fluids and gases include: sampling hot springs, fumaroles, etc.; sampling condensed brine and entrained gases; sampling steam-lines; low pressure separator systems; high pressure separator systems; two-phase sampling; downhole samplers; and miscellaneous methods. The recommended analytical methods compiled here cover physical properties, dissolved solids, and dissolved and entrained gases. The sequences of methods listed for each parameter are: wet chemical, gravimetric, colorimetric, electrode, atomic absorption, flame emission, x-ray fluorescence, inductively coupled plasma-atomic emission spectroscopy, ion exchange chromatography, spark source mass spectrometry, neutron activation analysis, and emission spectrometry. Material on correction of brine component concentrations for steam loss during flashing is presented. (MHR)

  2. Method validation to determine total alpha beta emitters in water samples using LSC

    International Nuclear Information System (INIS)

    Al-Masri, M. S.; Nashawati, A.; Al-akel, B.; Saaid, S.

    2006-06-01

    In this work a method was validated to determine gross alpha and beta emitters in water samples using a liquid scintillation counter. 200 ml of water from each sample were evaporated to 20 ml, and 8 ml of the concentrate were mixed with 12 ml of a suitable cocktail and measured in a Wallac Winspectral 1414 liquid scintillation counter. The lower detection limit (LDL) of the method was 0.33 DPM for total alpha emitters and 1.3 DPM for total beta emitters; the reproducibility limit was ±2.32 DPM and ±1.41 DPM for total alpha and beta emitters respectively, and the repeatability limit was ±2.19 DPM and ±1.11 DPM for total alpha and beta emitters respectively. The method is easy and fast because of the simple preparation steps and the large number of samples that can be measured at the same time. In addition, many real samples and standard samples were analyzed by the method and showed accurate results, so it was concluded that the method can be used with various water samples. (author)

  3. Parameter sampling capabilities of sequential and simultaneous data assimilation: II. Statistical analysis of numerical results

    International Nuclear Information System (INIS)

    Fossum, Kristian; Mannseth, Trond

    2014-01-01

    We assess and compare parameter sampling capabilities of one sequential and one simultaneous Bayesian, ensemble-based, joint state-parameter (JS) estimation method. In the companion paper, part I (Fossum and Mannseth 2014 Inverse Problems 30 114002), analytical investigations lead us to propose three claims, essentially stating that the sequential method can be expected to outperform the simultaneous method for weakly nonlinear forward models. Here, we assess the reliability and robustness of these claims through statistical analysis of results from a range of numerical experiments. Samples generated by the two approximate JS methods are compared to samples from the posterior distribution generated by a Markov chain Monte Carlo method, using four approximate measures of distance between probability distributions. Forward-model nonlinearity is assessed from a stochastic nonlinearity measure allowing for sufficiently large model dimensions. Both toy models (with low computational complexity, and where the nonlinearity is fairly easy to control) and two-phase porous-media flow models (corresponding to down-scaled versions of problems to which the JS methods have been frequently applied recently) are considered in the numerical experiments. Results from the statistical analysis show strong support of all three claims stated in part I. (paper)

  4. Statistical literacy and sample survey results

    Science.gov (United States)

    McAlevey, Lynn; Sullivan, Charles

    2010-10-01

    Sample surveys are widely used in the social sciences and business. The news media almost daily quote from them, yet they are widely misused. Using students with prior managerial experience embarking on an MBA course, we show that common sample survey results are misunderstood even by those managers who have previously done a statistics course. In general, they fare no better than managers who have never studied statistics. There are implications for teaching, especially in business schools, as well as for consulting.

  5. Results of Self-Absorption Study on the Versapor 3000 Filters for Radioactive Particulate Air Sampling

    International Nuclear Information System (INIS)

    Barnett, J.M.

    2008-01-01

    Since the mid-1980s the Pacific Northwest National Laboratory (PNNL) has used a value of 0.85 as a correction factor for the self-absorption of activity in particulate radioactive air samples. More recently, an effort was made to evaluate the current particulate radioactive air sample filters (Versapor® 3000) used at PNNL for self-absorption effects. Two methods were used in the study: (1) comparing the radioactivity concentration obtained by direct gas-flow proportional counting of the filter to the results obtained after acid digestion of the filter and counting again by gas-flow proportional detection, and (2) evaluating sample filters by high-resolution visual/infrared microscopy to determine the depth of material loading on or in the filter fiber material. Sixty samples were selected from the archive for acid digestion in the first method and about 30 samples were selected for high-resolution visual/infrared microscopy. Mass loading effects were also considered. From the sample filter analysis, a large error is associated with the average self-absorption factor; however, when the data are compared directly one-to-one, there appears, statistically, to be good correlation between the two analytical methods. The mass loading of the filters evaluated was <0.2 mg cm⁻² and was also compared against other published results. The microscopy analysis shows that the sample material remains on top of the filter paper and does not imbed into the filter media. The results of the microscopy evaluation lead to the conclusion that there is not a mechanism for significant self-absorption. The overall conclusion is that self-absorption is not a significant factor in the analysis of filters used at PNNL for radioactive air stack sampling of particulate radionuclides, and that an applied correction factor is conservative in determining overall sample activity. A new self-absorption factor of 1.0 is recommended.

  6. Effect of sample stratification on dairy GWAS results

    Directory of Open Access Journals (Sweden)

    Ma Li

    2012-10-01

    Full Text Available Abstract Background Artificial insemination and genetic selection are major factors contributing to population stratification in dairy cattle. In this study, we analyzed the effect of sample stratification and the effect of stratification correction on the results of a dairy genome-wide association study (GWAS). Three methods for stratification correction were used: the efficient mixed-model association expedited (EMMAX) method accounting for correlation among all individuals, a generalized least squares (GLS) method based on half-sib intraclass correlation, and a principal component analysis (PCA) approach. Results Historical pedigree data revealed that the 1,654 contemporary cows in the GWAS were all related when traced through approximately 10–15 generations of ancestors. Genome and phenotype stratifications had a striking overlap with the half-sib structure. A large elite half-sib family of cows contributed to the detection of favorable alleles that had low frequencies in the general population and high frequencies in the elite cows, and contributed to the detection of X chromosome effects. All three methods for stratification correction reduced the number of significant effects. The EMMAX method had the most severe reduction in the number of significant effects, and the PCA method using 20 principal components and GLS had similar significance levels. Removal of the elite cows from the analysis without using stratification correction removed many effects that were also removed by the three methods for stratification correction, indicating that stratification correction could have removed some true effects due to the elite cows. SNP effects with good consensus between different methods and effect size distributions from USDA's Holstein genomic evaluation included the DGAT1-NIBP region of BTA14 for production traits, a SNP 45 kb upstream from PIGY on BTA6, and two SNPs in NIBP on BTA14 for protein percentage. However, most of these consensus effects had

  7. Ionizing radiation as optimization method for aluminum detection from drinking water samples

    International Nuclear Information System (INIS)

    Bazante-Yamguish, Renata; Geraldo, Aurea Beatriz C.; Moura, Eduardo; Manzoli, Jose Eduardo

    2013-01-01

    The presence of organic compounds in water samples is often responsible for metal complexation; depending on the analytical method, the organic fraction may mask the evaluation of the real values of the metal concentration. Pre-treatment of the samples is advised when organic compounds are interfering agents, and sample mineralization may be accomplished by several chemical and/or physical methods. Here, ionizing radiation was used as an advanced oxidation process (AOP) for sample pre-treatment before the analytical determination of total and dissolved aluminum by ICP-OES in drinking water samples from wells and a spring source located in the Billings dam region. Before irradiation, the spring source and well samples showed aluminum levels of 0.020 mg/l and 0.2 mg/l, respectively; after irradiation, both samples showed an 8-fold increase in aluminum concentration. These results are discussed considering other physical and chemical parameters and peculiarities of the sample sources. (author)

  8. Solvent extraction method for rapid separation of strontium-90 in milk and food samples

    International Nuclear Information System (INIS)

    Hingorani, S.B.; Sathe, A.P.

    1991-01-01

    A solvent extraction method, using tributyl phosphate, for the rapid separation of strontium-90 in milk and other food samples is presented in this report, in view of the large number of samples received after the Chernobyl accident for checking radioactive contamination. The earlier nitration method used for the determination of 90Sr through its daughter 90Y takes over two weeks for the analysis of a sample, while this extraction method takes only 4 to 5 hours. The complete estimation, including initial counting, can be done in a single day. The chemical recovery varies between 80-90%, compared to 65-80% for the nitration method. The purity of the method has been established by following the decay of the separated yttrium-90. Some of the results obtained by adopting this chemical method for food analysis are included. The method is thus found to be rapid and convenient for the accurate estimation of strontium-90 in milk and food samples. (author). 2 tabs., 1 fig

  9. A Sequential Kriging reliability analysis method with characteristics of adaptive sampling regions and parallelizability

    International Nuclear Information System (INIS)

    Wen, Zhixun; Pei, Haiqing; Liu, Hai; Yue, Zhufeng

    2016-01-01

    The sequential Kriging reliability analysis (SKRA) method has been developed in recent years for nonlinear implicit response functions which are expensive to evaluate. This type of method includes EGRA, the efficient global reliability analysis method, and AK-MCS, the active learning reliability method combining a Kriging model and Monte Carlo simulation. The purpose of this paper is to improve SKRA through adaptive sampling regions and parallelizability. The adaptive sampling regions strategy is proposed to avoid selecting samples in regions where the probability density is so low that the accuracy of these regions has negligible effects on the results. The size of the sampling regions is adapted according to the failure probability calculated in the last iteration. Two parallel strategies, aimed at selecting multiple sample points at a time, are introduced and compared. The improvement is verified through several challenging examples. - Highlights: • The ISKRA method improves the efficiency of SKRA. • The adaptive sampling regions strategy reduces the number of needed samples. • The two parallel strategies reduce the number of needed iterations. • The accuracy of the optimal value impacts the number of samples significantly.
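
    For orientation, the sketch below implements a generic active-learning Kriging plus Monte Carlo loop (the AK-MCS family mentioned above) rather than the paper's specific adaptive-sampling-region or parallel strategies; the stopping value, initial design size and candidate budget are conventional choices, not taken from the paper.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor

        def ak_mcs_sketch(g, x_mc, n_init=12, u_stop=2.0, n_add_max=200, seed=0):
            # Refine a Kriging surrogate of the limit-state function g by adding, one at
            # a time, the Monte Carlo candidate with the smallest learning function
            # U = |mu|/sigma, until min(U) >= u_stop; then estimate P_f = P(g_hat <= 0).
            rng = np.random.default_rng(seed)
            idx = rng.choice(len(x_mc), size=n_init, replace=False)
            X, y = x_mc[idx], np.array([g(x) for x in x_mc[idx]])
            gp = GaussianProcessRegressor(normalize_y=True)
            for _ in range(n_add_max):
                gp.fit(X, y)
                mu, sigma = gp.predict(x_mc, return_std=True)
                u = np.abs(mu) / np.maximum(sigma, 1e-12)
                best = int(np.argmin(u))
                if u[best] >= u_stop:
                    break                                  # surrogate accurate near the limit state
                X = np.vstack([X, x_mc[best]])             # enrich the design of experiments
                y = np.append(y, g(x_mc[best]))
            return float(np.mean(mu <= 0))                 # estimated failure probability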

  10. Rheology and TIC/TOC results of ORNL tank samples

    International Nuclear Information System (INIS)

    Pareizs, J. M.; Hansen, E. K.

    2013-01-01

    The Savannah River National Laboratory (SRNL) was requested by Oak Ridge National Laboratory (ORNL) to perform total inorganic carbon (TIC), total organic carbon (TOC), and rheological measurements for several Oak Ridge tank samples. As-received slurry samples were diluted and submitted to SRNL-Analytical for TIC and TOC analyses. The settled-solids yield stress (also known as settled shear strength) of the as-received settled sludge samples was determined using the vane method; these measurements were obtained 24 hours after the samples had been allowed to settle undisturbed. Rheological or flow properties (Bingham plastic viscosity and Bingham plastic yield stress) were determined from flow curves of the homogenized, or well mixed, samples. Samples at other targeted total suspended solids (TSS) concentrations were also analyzed for flow properties; these samples were obtained by diluting the as-received sample with de-ionized (DI) water.

  11. Long-term frozen storage of urine samples: a trouble to get PCR results in Schistosoma spp. DNA detection?

    Science.gov (United States)

    Fernández-Soto, Pedro; Velasco Tirado, Virginia; Carranza Rodríguez, Cristina; Pérez-Arellano, José Luis; Muro, Antonio

    2013-01-01

    Human schistosomiasis remains a serious worldwide public health problem. At present, a sensitive and specific assay for routine diagnosis of schistosome infection is not yet available. The potential for detecting schistosome-derived DNA by PCR-based methods in human clinical samples is currently being investigated as a diagnostic tool with potential application in routine schistosomiasis diagnosis. Collection of diagnostic samples such as stool or blood is usually difficult in some populations. Urine, however, is a biological sample that can be collected non-invasively, is easy to obtain from people of all ages and easy to handle, but is still not widely used as a sample for PCR diagnosis. This could be due to the high variability in the reported efficiency of detection, resulting from the high variation in the storage and handling conditions of urine samples and in the DNA preservation and extraction methods. We evaluated different commercial DNA extraction methods on a series of long-term frozen-storage human urine samples from patients with parasitologically confirmed schistosomiasis in order to assess the effectiveness of PCR for Schistosoma spp. detection. Patient urine samples were frozen for 18 months up to 7 years until use. Results were compared with those obtained in PCR assays using fresh healthy human urine artificially contaminated with Schistosoma mansoni DNA and urine samples from mice experimentally infected with S. mansoni cercariae, stored frozen for at least 12 months before use. PCR results with fresh artificially contaminated human urine samples using different DNA extraction methods were much better than those obtained when long-term frozen human urine samples were used as the source of DNA template. Long-term frozen human urine samples are probably not a good source of DNA for use as a template in PCR detection of Schistosoma spp., regardless of the DNA extraction method used.

  12. Wet-digestion of environmental sample using silver-mediated electrochemical method

    International Nuclear Information System (INIS)

    Kuwabara, Jun

    2010-01-01

    An application of the silver-mediated electrochemical method to environmental samples as an effective digestion method for iodine analysis was investigated. The usual digestion method for 129I in many types of environmental sample is a combustion method using a quartz glass tube. The chemical yield of iodine in the combustion method decreases depending on the type of sample, whereas the silver-mediated electrochemical method is expected to achieve very low loss of iodine. In this study, a dried kombu (Laminaria) sample was digested in an electrochemical cell. For 1 g of sample, digestion was completed in about 24 hours under electrical conditions of <10 V and <2 A. After the digestion, the oxidized species of iodine were reduced to iodide by adding sodium sulfite, and the precipitate of silver iodide was then obtained. (author)

  13. A flexible method for multi-level sample size determination

    International Nuclear Information System (INIS)

    Lu, Ming-Shih; Sanborn, J.B.; Teichmann, T.

    1997-01-01

    This paper gives a flexible method for determining sample sizes for both systematic and random error models (this pertains to sampling problems in nuclear safeguards questions). In addition, the method allows different attribute rejection limits. The new method could assist in achieving a higher detection probability and enhance inspection effectiveness.
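
    The report's multi-level scheme is not reproduced here; as a generic point of reference only, a common single-level attribute-sampling calculation (the smallest sample whose chance of catching at least one defective item meets a target detection probability) can be sketched as follows, using the hypergeometric zero-defect probability. Parameter names and defaults are illustrative assumptions.

        def attribute_sample_size(population, assumed_defects, detection_prob=0.95):
            # Smallest n such that P(at least one of the assumed_defects items appears
            # in a random sample of n drawn without replacement) >= detection_prob.
            miss = 1.0                          # running P(no defect among the first n draws)
            for n in range(1, population + 1):
                miss *= (population - assumed_defects - n + 1) / (population - n + 1)
                if 1.0 - miss >= detection_prob:
                    return n
            return population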

  14. A novel heterogeneous training sample selection method on space-time adaptive processing

    Science.gov (United States)

    Wang, Qiang; Zhang, Yongshun; Guo, Yiduo

    2018-04-01

    The ground target detection performance of space-time adaptive processing (STAP) decreases when the clutter power becomes non-homogeneous because training samples are contaminated by target-like signals. In order to solve this problem, a novel non-homogeneous training sample selection method based on sample similarity is proposed, which converts training sample selection into a convex optimization problem. Firstly, the existing deficiencies of sample selection using the generalized inner product (GIP) are analyzed. Secondly, the similarities of different training samples are obtained by calculating the mean-Hausdorff distance, so as to reject the contaminated training samples. Thirdly, the cell under test (CUT) and the remaining training samples are projected into the orthogonal subspace of the target in the CUT, and the mean-Hausdorff distances between the projected CUT and the training samples are calculated. Fourthly, the distances are sorted by value and the training samples with the larger values are preferentially selected to realize the dimension reduction. Finally, simulation results with Mountain-Top data verify the effectiveness of the proposed method.
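
    A schematic of the selection step is sketched below. The projection onto the target-free subspace and the ranking by distance to the projected cell under test follow the abstract's description; the distance itself is left as a user-supplied callable (the paper uses a mean-Hausdorff distance), and the "keep the larger values" rule mirrors the abstract's wording rather than the paper's full criterion.

        import numpy as np

        def select_training_samples(cut, snapshots, steering, n_keep, distance):
            # Project the cell under test (CUT) and the candidate training snapshots onto
            # the orthogonal complement of the target steering vector, score each
            # candidate with a distance to the projected CUT, and keep the n_keep
            # candidates with the largest scores.
            s = steering / np.linalg.norm(steering)
            P = np.eye(len(s)) - np.outer(s, s.conj())   # projector onto the target-free subspace
            cut_p = P @ cut
            scores = np.array([distance(P @ x, cut_p) for x in snapshots])
            order = np.argsort(scores)[::-1]             # larger distance preferred, per the abstract
            return order[:n_keep]                        # indices of the selected training samples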

  15. Lipidomic analysis of biological samples: Comparison of liquid chromatography, supercritical fluid chromatography and direct infusion mass spectrometry methods.

    Science.gov (United States)

    Lísa, Miroslav; Cífková, Eva; Khalikova, Maria; Ovčačíková, Magdaléna; Holčapek, Michal

    2017-11-24

    Lipidomic analysis of biological samples in clinical research represents a challenging task for analytical methods, given the large number of samples and their extreme complexity. In this work, we compare direct infusion (DI) and chromatography - mass spectrometry (MS) lipidomic approaches, represented by three analytical methods, in terms of comprehensiveness, sample throughput, and validation results for the lipidomic analysis of biological samples represented by tumor tissue, surrounding normal tissue, plasma, and erythrocytes of kidney cancer patients. The methods are compared in one laboratory using an identical analytical protocol to ensure comparable conditions. An ultrahigh-performance liquid chromatography/MS (UHPLC/MS) method in hydrophilic interaction liquid chromatography mode and a DI-MS method are used for this comparison as the most widely used methods for lipidomic analysis, together with an ultrahigh-performance supercritical fluid chromatography/MS (UHPSFC/MS) method showing promising results in metabolomics analyses. The nontargeted analysis of pooled samples was performed using all tested methods, and 610 lipid species within 23 lipid classes were identified. The DI method provides the most comprehensive results due to the identification of some polar lipid classes which are not identified by the UHPLC and UHPSFC methods. On the other hand, the UHPSFC method provides excellent sensitivity for less polar lipid classes and the highest sample throughput, within a 10 min method time. The sample consumption of the DI method is 125 times higher than for the other methods, while only 40 μL of organic solvent is used for one sample analysis, compared to 3.5 mL and 4.9 mL in the case of the UHPLC and UHPSFC methods, respectively. The methods are validated for the quantitative lipidomic analysis of plasma samples with one internal standard for each lipid class. The results show the applicability of all tested methods for the lipidomic analysis of biological samples, depending on the analysis requirements.

  16. THE USE OF RANKING SAMPLING METHOD WITHIN MARKETING RESEARCH

    Directory of Open Access Journals (Sweden)

    CODRUŢA DURA

    2011-01-01

    Full Text Available The marketing and statistical literature available to practitioners provides a wide range of sampling methods that can be implemented in the context of marketing research. The ranking sampling method is based on dividing the general population into several strata, namely several subdivisions which are relatively homogeneous with regard to a certain characteristic. The sample is then composed by selecting from each stratum a certain number of components (which can be proportional or non-proportional to the size of the stratum) until the pre-established sample volume is reached. Using ranking sampling within marketing research requires the determination of some relevant statistical indicators - average, dispersion, sampling error, etc. To that end, the paper contains a case study which illustrates the actual approach used in order to apply the ranking sampling method within a marketing research study made by a company which provides Internet connection services, for a particular category of customers - small and medium enterprises.
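
    For the proportional variant mentioned above, the allocation of the pre-established sample volume across strata can be sketched as follows; this is a generic proportional-allocation helper, not the case study's actual computation.

        import numpy as np

        def proportional_allocation(strata_sizes, n_sample):
            # Give each stratum a share of the sample proportional to its size, then hand
            # the units lost to rounding to the strata with the largest fractional parts.
            sizes = np.asarray(strata_sizes, dtype=float)
            raw = n_sample * sizes / sizes.sum()
            alloc = np.floor(raw).astype(int)
            remainder = n_sample - alloc.sum()
            for i in np.argsort(raw - alloc)[::-1][:remainder]:
                alloc[i] += 1
            return alloc

    For example, proportional_allocation([1200, 600, 200], 100) spreads 100 interviews over three strata as 60, 30 and 10.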

  17. Advancing the Use of Passive Sampling in Risk Assessment and Management of Sediments Contaminated with Hydrophobic Organic Chemicals: Results of an International Ex Situ Passive Sampling Interlaboratory Comparison.

    Science.gov (United States)

    Jonker, Michiel T O; van der Heijden, Stephan A; Adelman, Dave; Apell, Jennifer N; Burgess, Robert M; Choi, Yongju; Fernandez, Loretta A; Flavetta, Geanna M; Ghosh, Upal; Gschwend, Philip M; Hale, Sarah E; Jalalizadeh, Mehregan; Khairy, Mohammed; Lampi, Mark A; Lao, Wenjian; Lohmann, Rainer; Lydy, Michael J; Maruya, Keith A; Nutile, Samuel A; Oen, Amy M P; Rakowska, Magdalena I; Reible, Danny; Rusina, Tatsiana P; Smedes, Foppe; Wu, Yanwen

    2018-03-20

    This work presents the results of an international interlaboratory comparison on ex situ passive sampling in sediments. The main objectives were to map the state of the science in passively sampling sediments, identify sources of variability, provide recommendations and practical guidance for standardized passive sampling, and advance the use of passive sampling in regulatory decision making by increasing confidence in the use of the technique. The study was performed by a consortium of 11 laboratories and included experiments with 14 passive sampling formats on 3 sediments for 25 target chemicals (PAHs and PCBs). The resulting overall interlaboratory variability was large (a factor of ∼10), but standardization of methods halved this variability. The remaining variability was primarily due to factors not related to passive sampling itself, i.e., sediment heterogeneity and analytical chemistry. Excluding the latter source of variability, by performing all analyses in one laboratory, showed that passive sampling results can have a high precision and a very low intermethod variability, implying that passive sampling, irrespective of the specific method used, is fit for implementation in risk assessment and management of contaminated sediments, provided that method setup and performance, as well as chemical analyses, are quality-controlled.

  18. Rapid-viability PCR method for detection of live, virulent Bacillus anthracis in environmental samples.

    Science.gov (United States)

    Létant, Sonia E; Murphy, Gloria A; Alfaro, Teneile M; Avila, Julie R; Kane, Staci R; Raber, Ellen; Bunt, Thomas M; Shah, Sanjiv R

    2011-09-01

    In the event of a biothreat agent release, hundreds of samples would need to be rapidly processed to characterize the extent of contamination and determine the efficacy of remediation activities. Current biological agent identification and viability determination methods are both labor- and time-intensive such that turnaround time for confirmed results is typically several days. In order to alleviate this issue, automated, high-throughput sample processing methods were developed in which real-time PCR analysis is conducted on samples before and after incubation. The method, referred to as rapid-viability (RV)-PCR, uses the change in cycle threshold after incubation to detect the presence of live organisms. In this article, we report a novel RV-PCR method for detection of live, virulent Bacillus anthracis, in which the incubation time was reduced from 14 h to 9 h, bringing the total turnaround time for results below 15 h. The method incorporates a magnetic bead-based DNA extraction and purification step prior to PCR analysis, as well as specific real-time PCR assays for the B. anthracis chromosome and pXO1 and pXO2 plasmids. A single laboratory verification of the optimized method applied to the detection of virulent B. anthracis in environmental samples was conducted and showed a detection level of 10 to 99 CFU/sample with both manual and automated RV-PCR methods in the presence of various challenges. Experiments exploring the relationship between the incubation time and the limit of detection suggest that the method could be further shortened by an additional 2 to 3 h for relatively clean samples.
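
    The decision logic behind RV-PCR can be illustrated with a short sketch; the 6-cycle change and the Ct value assigned to non-amplifying reactions are illustrative assumptions, not the published acceptance criteria.

        def rv_pcr_viable(ct_before, ct_after, delta_ct_threshold=6.0, no_amp_ct=45.0):
            # Growth of live organisms during incubation lowers the cycle threshold (Ct),
            # so a sufficiently large drop between the pre- and post-incubation aliquots
            # indicates viable cells; reactions that never cross threshold get a sentinel
            # Ct value so the comparison still works.
            before = no_amp_ct if ct_before is None else ct_before
            after = no_amp_ct if ct_after is None else ct_after
            return (before - after) >= delta_ct_threshold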

  19. Tank 48H Waste Composition and Results of Investigation of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Walker, D.D. [Westinghouse Savannah River Company, Aiken, SC (United States)]

    1997-04-02

    This report serves two purposes. First, it documents the analytical results of Tank 48H samples taken between April and August 1996. Second, it describes investigations of the precision of the sampling and analytical methods used on the Tank 48H samples.

  20. Comparison of DNA preservation methods for environmental bacterial community samples.

    Science.gov (United States)

    Gray, Michael A; Pratte, Zoe A; Kellogg, Christina A

    2013-02-01

    Field collections of environmental samples, for example corals, for molecular microbial analyses present distinct challenges. The lack of laboratory facilities in remote locations is common, and preservation of microbial community DNA for later study is critical. A particular challenge is keeping samples frozen in transit. Five nucleic acid preservation methods that do not require cold storage were compared for effectiveness over time and ease of use. Mixed microbial communities of known composition were created and preserved with DNAgard™, RNAlater®, DMSO-EDTA-salt (DESS), FTA® cards, and FTA Elute® cards. Automated ribosomal intergenic spacer analysis and clone libraries were used to detect specific changes in the faux communities over weeks and months of storage. A previously known bias in FTA® cards that results in lower recovery of pure cultures of Gram-positive bacteria was also detected in mixed community samples. There appears to be a uniform bias across all five preservation methods against microorganisms with high G + C DNA. Overall, the liquid-based preservatives (DNAgard™, RNAlater®, and DESS) outperformed the card-based methods. No single liquid method clearly outperformed the others, leaving the choice of method to be based on experimental design, field facilities, shipping constraints, and allowable cost.

  1. Fast and simple method for semiquantitative determination of calcium propionate in bread samples.

    Science.gov (United States)

    Phechkrajang, Chutima Matayatsuk; Yooyong, Surin

    2017-04-01

    Calcium propionate has been widely used as a preservative in bakery and in bread. It is sometimes not carefully used, or a high concentration is added to preserve products. High consumption of calcium propionate can lead to several health problems. This study aims to develop a fast and simple semiquantitative method based on color complex formation for the determination of calcium propionate in a bread sample. A red-brown complex was obtained from the reaction of ferric ammonium sulfate and propionate anion. The product was rapidly formed and easily observed with the concentration of propionate anion >0.4 mg/mL. A high-performance liquid chromatography (HPLC) method was also developed and validated for comparison. Twenty-two bread samples from three markets near Bangkok were randomly selected and assayed for calcium propionate using the above two developed methods. The results showed that 19/22 samples contained calcium propionate >2000 mg/kg. The results of the complex formation method agreed with the HPLC method.

  2. Fast and simple method for semiquantitative determination of calcium propionate in bread samples

    Directory of Open Access Journals (Sweden)

    Chutima Matayatsuk Phechkrajang

    2017-04-01

    Full Text Available Calcium propionate has been widely used as a preservative in bakery and in bread. It is sometimes not carefully used, or a high concentration is added to preserve products. High consumption of calcium propionate can lead to several health problems. This study aims to develop a fast and simple semiquantitative method based on color complex formation for the determination of calcium propionate in a bread sample. A red–brown complex was obtained from the reaction of ferric ammonium sulfate and propionate anion. The product was rapidly formed and easily observed with the concentration of propionate anion >0.4 mg/mL. A high-performance liquid chromatography (HPLC) method was also developed and validated for comparison. Twenty-two bread samples from three markets near Bangkok were randomly selected and assayed for calcium propionate using the above two developed methods. The results showed that 19/22 samples contained calcium propionate >2000 mg/kg. The results of the complex formation method agreed with the HPLC method.

  3. PhyloChip™ microarray comparison of sampling methods used for coral microbial ecology

    Science.gov (United States)

    Kellogg, Christina A.; Piceno, Yvette M.; Tom, Lauren M.; DeSantis, Todd Z.; Zawada, David G.; Andersen, Gary L.

    2012-01-01

    Interest in coral microbial ecology has been increasing steadily over the last decade, yet standardized methods of sample collection still have not been defined. Two methods were compared for their ability to sample coral-associated microbial communities: tissue punches and foam swabs, the latter being less invasive and preferred by reef managers. Four colonies of star coral, Montastraea annularis, were sampled in the Dry Tortugas National Park (two healthy and two with white plague disease). The PhyloChip™ G3 microarray was used to assess microbial community structure of amplified 16S rRNA gene sequences. Samples clustered based on methodology rather than coral colony. Punch samples from healthy and diseased corals were distinct. All swab samples clustered closely together with the seawater control and did not group according to the health state of the corals. Although more microbial taxa were detected by the swab method, there is a much larger overlap between the water control and swab samples than punch samples, suggesting some of the additional diversity is due to contamination from water absorbed by the swab. While swabs are useful for noninvasive studies of the coral surface mucus layer, these results show that they are not optimal for studies of coral disease.

  4. Probability Sampling Method for a Hidden Population Using Respondent-Driven Sampling: Simulation for Cancer Survivors.

    Science.gov (United States)

    Jung, Minsoo

    2015-01-01

    When there is no sampling frame within a certain group, or the group is concerned that making its population public would bring social stigma, we say the population is hidden. It is difficult to approach this kind of population in terms of survey methodology, because the response rate is low and members are not entirely honest in their responses when probability sampling is used. The only alternative known to address the problems caused by previous methods such as snowball sampling is respondent-driven sampling (RDS), which was developed by Heckathorn and his colleagues. RDS is based on a Markov chain and uses the social network information of the respondent. This characteristic allows for probability sampling when we survey a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results of this study, the influence of the chain-referral structure of RDS tends to diminish as the sample gets bigger, and the sample becomes stabilized as the waves progress. Therefore, the final sample can be essentially independent of the initial seeds if a certain sample size is secured, even if the initial seeds were selected through convenience sampling. Thus, RDS can be considered an alternative which can improve upon both key informant sampling and ethnographic surveys, and it should be applied to various cases domestically as well.

  5. Method for evaluation of radiative properties of glass samples

    Energy Technology Data Exchange (ETDEWEB)

    Mohelnikova, Jitka [Faculty of Civil Engineering, Brno University of Technology, Veveri 95, 602 00 Brno (Czech Republic)], E-mail: mohelnikova.j@fce.vutbr.cz

    2008-04-15

    The paper presents a simple calculation method for evaluating the radiative properties of window glasses. The method is based on a computer simulation model of the energy balance of a thermally insulated box with selected glass samples. A temperature profile of the air inside the box with a glass sample exposed to incident radiation was determined for defined boundary conditions. The spectral range of the radiation was considered in the interval between 280 and 2500 nm, which corresponds to the spectral range of solar radiation affecting windows in building facades. The air temperature rise within the box was determined in response to the incident radiation, from the beginning of the exposure until steady-state thermal conditions were reached. The steady-state temperature inside the insulated box serves for the evaluation of the box energy balance and the determination of the radiative properties of the glass sample. These properties are represented by glass characteristics such as mean values of transmittance, reflectance and absorptance calculated for a defined spectral range. The computer simulations were compared to experimental measurements on a real model of the insulated box. The results of both the calculations and the measurements are in good agreement. The method is recommended for preliminary evaluation of window glass radiative properties, which serve as data for the energy evaluation of buildings.

  6. Pi sampling: a methodical and flexible approach to initial macromolecular crystallization screening

    International Nuclear Information System (INIS)

    Gorrec, Fabrice; Palmer, Colin M.; Lebon, Guillaume; Warne, Tony

    2011-01-01

    Pi sampling, derived from the incomplete factorial approach, is an effort to maximize the diversity of macromolecular crystallization conditions and to facilitate the preparation of 96-condition initial screens. The Pi sampling method is derived from the incomplete factorial approach to macromolecular crystallization screen design. The resulting ‘Pi screens’ have a modular distribution of a given set of up to 36 stock solutions. Maximally diverse conditions can be produced by taking into account the properties of the chemicals used in the formulation and the concentrations of the corresponding solutions. The Pi sampling method has been implemented in a web-based application that generates screen formulations and recipes. It is particularly adapted to screens consisting of 96 different conditions. The flexibility and efficiency of Pi sampling is demonstrated by the crystallization of soluble proteins and of an integral membrane-protein sample

  7. A Method for Determining the Content of Glycoproteins in Biological Samples

    Directory of Open Access Journals (Sweden)

    Yang Gao

    2016-11-01

    Full Text Available The glycoprotein purified from the mycelium extract of Tremella fuciformis was marked with iodine through an iodine substitution reaction. The content of iodine, which is indicative of the amount of the marked tremella glycoprotein (ITG), was detected with inductively coupled plasma mass spectrometry (ICP-MS). The method was found to be stable, sensitive, and accurate at detecting the content of iodine-substituted glycoprotein, and was used in the quantitative analysis of biological samples, including blood and organs. Different biological samples were collected from rats after oral administration of ITG and were tested for iodine content by ICP-MS to calculate the amount of ITG in the samples. The results suggested that ICP-MS is a sensitive, stable, and accurate method for the detection of iodinated glycoproteins in blood and organs.

  8. Standard methods for sampling North American freshwater fishes

    Science.gov (United States)

    Bonar, Scott A.; Hubert, Wayne A.; Willis, David W.

    2009-01-01

    This important reference book provides standard sampling methods recommended by the American Fisheries Society for assessing and monitoring freshwater fish populations in North America. Methods apply to ponds, reservoirs, natural lakes, and streams and rivers containing cold and warmwater fishes. Range-wide and eco-regional averages for indices of abundance, population structure, and condition for individual species are supplied to facilitate comparisons of standard data among populations. Provides information on converting nonstandard to standard data, statistical and database procedures for analyzing and storing standard data, and methods to prevent transfer of invasive species while sampling.

  9. Relative efficiency of anuran sampling methods in a restinga habitat (Jurubatiba, Rio de Janeiro, Brazil).

    Science.gov (United States)

    Rocha, C F D; Van Sluys, M; Hatano, F H; Boquimpani-Freitas, L; Marra, R V; Marques, R V

    2004-11-01

    Studies on anurans in restinga habitats are few and, as a result, there is little information on which methods are more efficient for sampling them in this environment. Ten methods are usually used for sampling anuran communities in tropical and sub-tropical areas. In this study we evaluate which methods are more appropriate for this purpose in the restinga environment of Parque Nacional da Restinga de Jurubatiba. We analyzed six methods among those usually used for anuran sampling. For each method, we recorded the total amount of time spent (in min.), the number of researchers involved, and the number of species captured. We calculated a capture efficiency index (time necessary for a researcher to capture an individual frog) in order to make the data obtained comparable. Of the methods analyzed, the species inventory (9.7 min/searcher/ind. - MSI; richness = 6; abundance = 23) and the breeding site survey (9.5 MSI; richness = 4; abundance = 22) were the most efficient. The visual encounter inventory (45.0 MSI) and patch sampling (65.0 MSI) methods were of comparatively lower efficiency in the restinga, whereas the plot sampling and the pit-fall traps with drift-fence methods resulted in no frog captures. We conclude that there is a considerable difference in the efficiency of the methods used in the restinga environment and that the complete species inventory method is highly efficient for sampling frogs in the restinga studied and may be so in other restinga environments. Methods that are usually efficient in forested areas seem to be of little value in open restinga habitats.

  10. Relative efficiency of anuran sampling methods in a restinga habitat (Jurubatiba, Rio de Janeiro, Brazil

    Directory of Open Access Journals (Sweden)

    C. F. D. Rocha

    Full Text Available Studies on anurans in restinga habitats are few and, as a result, there is little information on which methods are more efficient for sampling them in this environment. Ten methods are usually used for sampling anuran communities in tropical and sub-tropical areas. In this study we evaluate which methods are more appropriate for this purpose in the restinga environment of Parque Nacional da Restinga de Jurubatiba. We analyzed six methods among those usually used for anuran sampling. For each method, we recorded the total amount of time spent (in min.), the number of researchers involved, and the number of species captured. We calculated a capture efficiency index (time necessary for a researcher to capture an individual frog) in order to make the data obtained comparable. Of the methods analyzed, the species inventory (9.7 min/searcher/ind. - MSI; richness = 6; abundance = 23) and the breeding site survey (9.5 MSI; richness = 4; abundance = 22) were the most efficient. The visual encounter inventory (45.0 MSI) and patch sampling (65.0 MSI) methods were of comparatively lower efficiency in the restinga, whereas the plot sampling and the pit-fall traps with drift-fence methods resulted in no frog captures. We conclude that there is a considerable difference in the efficiency of the methods used in the restinga environment and that the complete species inventory method is highly efficient for sampling frogs in the restinga studied and may be so in other restinga environments. Methods that are usually efficient in forested areas seem to be of little value in open restinga habitats.

  11. Passive sampling methods for contaminated sediments

    DEFF Research Database (Denmark)

    Peijnenburg, Willie J.G.M.; Teasdale, Peter R.; Reible, Danny

    2014-01-01

    “Dissolved” concentrations of contaminants in sediment porewater (Cfree) provide a more relevant exposure metric for risk assessment than do total concentrations. Passive sampling methods (PSMs) for estimating Cfree offer the potential for cost-efficient and accurate in situ characterization...

  12. An adaptive sampling method for variable-fidelity surrogate models using improved hierarchical kriging

    Science.gov (United States)

    Hu, Jiexiang; Zhou, Qi; Jiang, Ping; Shao, Xinyu; Xie, Tingli

    2018-01-01

    Variable-fidelity (VF) modelling methods have been widely used in complex engineering system design to mitigate the computational burden. Building a VF model generally includes two parts: design of experiments and metamodel construction. In this article, an adaptive sampling method based on improved hierarchical kriging (ASM-IHK) is proposed to refine the improved VF model. First, an improved hierarchical kriging model is developed as the metamodel, in which the low-fidelity model is varied through a polynomial response surface function to capture the characteristics of a high-fidelity model. Secondly, to reduce local approximation errors, an active learning strategy based on a sequential sampling method is introduced to make full use of the already required information on the current sampling points and to guide the sampling process of the high-fidelity model. Finally, two numerical examples and the modelling of the aerodynamic coefficient for an aircraft are provided to demonstrate the approximation capability of the proposed approach, as well as three other metamodelling methods and two sequential sampling methods. The results show that ASM-IHK provides a more accurate metamodel at the same simulation cost, which is very important in metamodel-based engineering design problems.
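
    A much-simplified stand-in for the variable-fidelity surrogate is sketched below: the low-fidelity prediction, passed through a fitted polynomial, supplies the trend, and a Gaussian process models the remaining high-fidelity discrepancy. This conveys only the general flavour of hierarchical kriging; the improved formulation and the adaptive, active-learning sampling of ASM-IHK are not reproduced, and the input arrays are assumed to be 2-D (samples by features).

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.linear_model import LinearRegression
        from sklearn.preprocessing import PolynomialFeatures

        def variable_fidelity_surrogate(lf_model, x_hf, y_hf, degree=2):
            # Trend = polynomial of the low-fidelity prediction; discrepancy = Gaussian
            # process fitted to the high-fidelity residuals. Returns a callable predictor.
            poly = PolynomialFeatures(degree)
            z = poly.fit_transform(lf_model(x_hf).reshape(-1, 1))
            trend = LinearRegression().fit(z, y_hf)
            gp = GaussianProcessRegressor(normalize_y=True).fit(x_hf, y_hf - trend.predict(z))
            def predict(x):
                zx = poly.transform(lf_model(x).reshape(-1, 1))
                return trend.predict(zx) + gp.predict(x)
            return predict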

  13. [Sampling methods for PM2.5 from stationary sources: a review].

    Science.gov (United States)

    Jiang, Jing-Kun; Deng, Jian-Guo; Li, Zhen; Li, Xing-Hua; Duan, Lei; Hao, Ji-Ming

    2014-05-01

    The new China national ambient air quality standard was published in 2012 and will be implemented in 2016. To meet the requirements of this new standard, monitoring and controlling PM2.5 emissions from stationary sources are very important. However, so far there is no national standard method for sampling PM2.5 from stationary sources. Different sampling methods for PM2.5 from stationary sources and the relevant international standards were reviewed in this study, including methods for PM2.5 sampling in flue gas and methods for PM2.5 sampling after dilution. Both the advantages and disadvantages of these sampling methods are discussed. For environmental management, methods for PM2.5 sampling in flue gas, such as the impactor and the virtual impactor, are suggested as a standard to determine filterable PM2.5. To evaluate the environmental and health effects of PM2.5 from stationary sources, a standard dilution method for sampling of total PM2.5 should be established.

  14. Long-term frozen storage of urine samples: a trouble to get PCR results in Schistosoma spp. DNA detection?

    Directory of Open Access Journals (Sweden)

    Pedro Fernández-Soto

    Full Text Available BACKGROUND: Human schistosomiasis remains a serious worldwide public health problem. At present, a sensitive and specific assay for routine diagnosis of schistosome infection is not yet available. The potential for detecting schistosome-derived DNA by PCR-based methods in human clinical samples is currently being investigated as a diagnostic tool with potential application in routine schistosomiasis diagnosis. Collection of diagnostic samples such as stool or blood is usually difficult in some populations. Urine, however, is a biological sample that can be collected non-invasively, is easy to obtain from people of all ages and easy to handle, but is still not widely used as a sample for PCR diagnosis. This could be due to the high variability in the reported efficiency of detection, resulting from the high variation in the storage and handling conditions of urine samples and in the DNA preservation and extraction methods. METHODOLOGY/PRINCIPAL FINDINGS: We evaluated different commercial DNA extraction methods on a series of long-term frozen-storage human urine samples from patients with parasitologically confirmed schistosomiasis in order to assess the effectiveness of PCR for Schistosoma spp. detection. Patient urine samples were frozen for 18 months up to 7 years until use. Results were compared with those obtained in PCR assays using fresh healthy human urine artificially contaminated with Schistosoma mansoni DNA and urine samples from mice experimentally infected with S. mansoni cercariae, stored frozen for at least 12 months before use. PCR results with fresh artificially contaminated human urine samples using different DNA extraction methods were much better than those obtained when long-term frozen human urine samples were used as the source of DNA template. CONCLUSIONS/SIGNIFICANCE: Long-term frozen human urine samples are probably not a good source for DNA extraction for use as a template in PCR detection of Schistosoma spp., regardless of the DNA

  15. Sample Results from MCU Solids Outage

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T.; Washington, A.; Oji, L.; Coleman, C.; Poirier, M.

    2014-09-22

    Savannah River National Laboratory (SRNL) has received several solid and liquid samples from MCU in an effort to understand and recover from the system outage starting on April 6, 2014. SRNL concludes that the presence of solids in the Salt Solution Feed Tank (SSFT) is the likely root cause for the outage, based upon the following discoveries: A solids sample from the extraction contactor #1 proved to be mostly sodium oxalate; A solids sample from the scrub contactor#1 proved to be mostly sodium oxalate; A solids sample from the Salt Solution Feed Tank (SSFT) proved to be mostly sodium oxalate; An archived sample from Tank 49H taken last year was shown to contain a fine precipitate of sodium oxalate; A solids sample from ; A liquid sample from the SSFT was shown to have elevated levels of oxalate anion compared to the expected concentration in the feed. Visual inspection of the SSFT indicated the presence of precipitated or transferred solids, which were likely also in the Salt Solution Receipt Tank (SSRT). The presence of the solids coupled with agitation performed to maintain feed temperature resulted in oxalate solids migration through the MCU system and caused hydraulic issues that resulted in unplanned phase carryover from the extraction into the scrub, and ultimately the strip contactors. Not only did this carryover result in the Strip Effluent (SE) being pushed out of waste acceptance specification, but it resulted in the deposition of solids into several of the contactors. At the same time, extensive deposits of aluminosilicates were found in the drain tube in the extraction contactor #1. However it is not known at this time how the aluminosilicate solids are related to the oxalate solids. The solids were successfully cleaned out of the MCU system. However, future consideration must be given to the exclusion of oxalate solids into the MCU system. There were 53 recommendations for improving operations recently identified. Some additional considerations or

  16. ANALYTICAL RESULTS OF MOX COLEMANITE CONCRETE SAMPLES POURED AUGUST 29, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Best, D.; Cozzi, A.; Reigel, M.

    2012-12-20

    The Mixed Oxide Fuel Fabrication Facility (MFFF) will use colemanite bearing concrete neutron absorber panels credited with attenuating neutron flux in the criticality design analyses and shielding operators from radiation. The Savannah River National Laboratory is tasked with measuring the total density, partial hydrogen density, and partial boron density of the colemanite concrete. Samples poured 8/29/12 were received on 9/20/2012 and analyzed. The average total density of each of the samples measured by the ASTM method C 642 was within the lower bound of 1.88 g/cm³. The average partial hydrogen density of samples 8.6.1, 8.7.1, and 8.5.3, as measured using method ASTM E 1311, met the lower bound of 6.04E-02 g/cm³. The average measured partial boron density of each sample met the lower bound of 1.65E-01 g/cm³ measured by the ASTM C 1301 method. The average partial hydrogen density of samples 8.5.1, 8.6.3, and 8.7.3 did not meet the lower bound. The samples, as received, were not wrapped in a moist towel as previous samples and appeared to be somewhat drier. This may explain the lower partial hydrogen density with respect to previous samples.

  17. Comparative study of methods on outlying data detection in experimental results

    International Nuclear Information System (INIS)

    Oliveira, P.M.S.; Munita, C.S.; Hazenfratz, R.

    2009-01-01

    The interpretation of experimental results through multivariate statistical methods may reveal the existence of outliers, which is rarely taken into account by analysts. However, their presence can influence the interpretation of results, generating false conclusions. This paper shows the importance of outlier determination for a database of 89 samples of ceramic fragments analyzed by neutron activation analysis. The results were submitted to five procedures to detect outliers: Mahalanobis distance, cluster analysis, principal component analysis, factor analysis, and standardized residuals. The results showed that although cluster analysis is one of the procedures most used to identify outliers, it can fail by not flagging samples that are easily identified as outliers by other methods. In general, the statistical procedures for the identification of outliers are little known to analysts. (author)
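
    To make the Mahalanobis-distance procedure above concrete, the following sketch flags multivariate outliers by their squared Mahalanobis distance from the sample mean, using a chi-squared cutoff. This is a generic Python illustration (assuming NumPy and SciPy), not the authors' implementation, and the simulated 89 x 5 matrix of elemental concentrations is hypothetical.

        import numpy as np
        from scipy.stats import chi2

        def mahalanobis_outliers(X, alpha=0.01):
            """Flag rows of X whose squared Mahalanobis distance exceeds the
            chi-squared quantile with p degrees of freedom (p = number of variables)."""
            X = np.asarray(X, dtype=float)
            diff = X - X.mean(axis=0)
            inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
            d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)   # squared distances
            cutoff = chi2.ppf(1 - alpha, df=X.shape[1])
            return d2, d2 > cutoff

        # Hypothetical data: 89 "ceramic fragments" with 5 elemental concentrations each
        rng = np.random.default_rng(0)
        X = rng.normal(size=(89, 5))
        X[3] += 6.0                                              # plant an artificial outlier
        d2, flags = mahalanobis_outliers(X)
        print("flagged sample indices:", np.where(flags)[0])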

  18. 40 CFR 80.8 - Sampling methods for gasoline and diesel fuel.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment (2010-07-01): PROGRAMS (CONTINUED), REGULATION OF FUELS AND FUEL ADDITIVES, General Provisions, § 80.8 Sampling methods for gasoline and diesel fuel. The sampling methods specified in this section shall be used to collect samples...

  19. Gap-Filling of Landsat 7 Imagery Using the Direct Sampling Method

    KAUST Repository

    Yin, Gaohong

    2016-12-28

    The failure of the Scan Line Corrector (SLC) on Landsat 7 imposed systematic data gaps on retrieved imagery and removed the capacity to provide spatially continuous fields. While a number of methods have been developed to fill these gaps, most of the proposed techniques are only applicable over relatively homogeneous areas. When they are applied to heterogeneous landscapes, retrieving image features and elements can become challenging. Here we present a gap-filling approach that is based on the adoption of the Direct Sampling multiple-point geostatistical method. The method employs a conditional stochastic resampling of known areas in a training image to simulate unknown locations. The approach is assessed across a range of both homogeneous and heterogeneous regions. Simulation results show that for homogeneous areas, satisfactory results can be obtained by simply adopting non-gap locations in the target image as baseline training data. For heterogeneous landscapes, bivariate simulations using an auxiliary variable acquired at a different date provides more accurate results than univariate simulations, especially as land cover complexity increases. Apart from recovering spatially continuous fields, one of the key advantages of the Direct Sampling is the relatively straightforward implementation process that relies on relatively few parameters.
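
    As a rough illustration of the Direct Sampling idea summarized above, the sketch below fills gap pixels by comparing the pattern of already-known neighbours against randomly scanned locations in a training image and copying the value from the first sufficiently similar match. It is a simplified univariate Python sketch only (the bivariate case with an auxiliary variable is not shown); the neighbourhood size, distance threshold, scan limit and toy arrays are assumptions rather than the configuration used in the study.

        import numpy as np

        def direct_sampling_fill(target, training, n_neighbors=8, threshold=0.05,
                                 max_scan=500, rng=None):
            """Fill NaN gaps in `target` by conditionally resampling `training`
            (both 2D arrays on the same value scale; `training` has no gaps)."""
            rng = rng or np.random.default_rng(0)
            filled = target.astype(float)
            gap_rows, gap_cols = np.where(np.isnan(filled))
            tr, tc = training.shape
            for idx in rng.permutation(len(gap_rows)):      # random simulation path
                r, c = gap_rows[idx], gap_cols[idx]
                for rad in range(1, 6):                     # grow window until enough neighbours
                    offs, vals = [], []
                    for dr in range(-rad, rad + 1):
                        for dc in range(-rad, rad + 1):
                            rr, cc = r + dr, c + dc
                            if (0 <= rr < filled.shape[0] and 0 <= cc < filled.shape[1]
                                    and not np.isnan(filled[rr, cc])):
                                offs.append((dr, dc))
                                vals.append(filled[rr, cc])
                    if len(offs) >= n_neighbors:
                        break
                if not offs:                                # isolated pixel: fall back to mean
                    filled[r, c] = training.mean()
                    continue
                vals = np.array(vals)
                best_val, best_dist = float(training.mean()), np.inf
                for _ in range(max_scan):                   # random scan of the training image
                    tr_r, tr_c = rng.integers(tr), rng.integers(tc)
                    patt = [training[tr_r + dr, tr_c + dc] for dr, dc in offs
                            if 0 <= tr_r + dr < tr and 0 <= tr_c + dc < tc]
                    if len(patt) < len(offs):
                        continue
                    dist = float(np.mean(np.abs(np.array(patt) - vals)))
                    if dist < best_dist:
                        best_val, best_dist = float(training[tr_r, tr_c]), dist
                    if dist <= threshold:                   # good enough: stop scanning
                        break
                filled[r, c] = best_val
            return filled

        # Toy usage: a smooth field with a missing block, using a gap-free copy as training
        truth = np.add.outer(np.linspace(0, 1, 40), np.linspace(0, 1, 40))
        gappy = truth.copy()
        gappy[15:20, 10:30] = np.nan
        restored = direct_sampling_fill(gappy, truth)
        print("max abs error inside the gap:", np.max(np.abs(restored - truth)[15:20, 10:30]))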

  20. Characterization of hazardous waste sites: a methods manual. Volume 2. Available sampling methods (second edition)

    International Nuclear Information System (INIS)

    Ford, P.J.; Turina, P.J.; Seely, D.E.

    1984-12-01

    Investigations at hazardous waste sites and sites of chemical spills often require on-site measurements and sampling activities to assess the type and extent of contamination. This document is a compilation of sampling methods and materials suitable to address most needs that arise during routine waste site and hazardous spill investigations. The sampling methods presented in this document are compiled by media, and were selected on the basis of practicality, economics, representativeness, compatibility with analytical considerations, and safety, as well as other criteria. In addition to sampling procedures, sample handling and shipping, chain-of-custody procedures, instrument certification, equipment fabrication, and equipment decontamination procedures are described. Sampling methods for soil, sludges, sediments, and bulk materials cover the solids medium. Ten methods are detailed for surface waters, groundwater and containerized liquids; twelve are presented for ambient air, soil gases and vapors, and headspace gases. A brief discussion of ionizing radiation survey instruments is also provided

  1. 2015 Long-Term Hydrologic Monitoring Program Sampling and Analysis Results Report for Project Rulison, Co

    Energy Technology Data Exchange (ETDEWEB)

    Findlay, Rick [Navarro Research and Engineering, Oak Ridge, TN (United States); Kautsky, Mark [US Department of Energy, Washington, DC (United States). Office of Legacy Management

    2015-12-01

    The U.S. Department of Energy (DOE) Office of Legacy Management conducted annual sampling at the Rulison, Colorado, Site for the Long-Term Hydrologic Monitoring Program (LTHMP) on May 20–22 and 27, 2015. Several of the landowners were not available to allow access to their respective properties, which created the need for several sample collection trips. This report documents the analytical results of the Rulison monitoring event and includes the trip report and the data validation package (Appendix A). The groundwater and surface water samples were shipped to the GEL Group Inc. laboratories for analysis. All requested analyses were successfully completed. Samples were analyzed for gamma-emitting radionuclides by high-resolution gamma spectrometry. Tritium was analyzed using two methods: the conventional tritium method, which has a detection limit on the order of 400 picocuries per liter (pCi/L), and the enriched method (for selected samples), which has a detection limit on the order of 3 pCi/L.

  2. The SAGES Legacy Unifying Globulars and Galaxies survey (SLUGGS): sample definition, methods, and initial results

    Energy Technology Data Exchange (ETDEWEB)

    Brodie, Jean P.; Romanowsky, Aaron J.; Jennings, Zachary G.; Pota, Vincenzo; Kader, Justin; Roediger, Joel C.; Villaume, Alexa; Arnold, Jacob A.; Woodley, Kristin A. [University of California Observatories, 1156 High Street, Santa Cruz, CA 95064 (United States); Strader, Jay [Department of Physics and Astronomy, Michigan State University, East Lansing, MI 48824 (United States); Forbes, Duncan A.; Pastorello, Nicola; Usher, Christopher; Blom, Christina; Kartha, Sreeja S. [Centre for Astrophysics and Supercomputing, Swinburne University, Hawthorn, VIC 3122 (Australia); Foster, Caroline; Spitler, Lee R., E-mail: jbrodie@ucsc.edu [Australian Astronomical Observatory, P.O. Box 915, North Ryde, NSW 1670 (Australia)

    2014-11-20

    We introduce and provide the scientific motivation for a wide-field photometric and spectroscopic chemodynamical survey of nearby early-type galaxies (ETGs) and their globular cluster (GC) systems. The SAGES Legacy Unifying Globulars and GalaxieS (SLUGGS) survey is being carried out primarily with Subaru/Suprime-Cam and Keck/DEIMOS. The former provides deep gri imaging over a 900 arcmin² field-of-view to characterize GC and host galaxy colors and spatial distributions, and to identify spectroscopic targets. The NIR Ca II triplet provides GC line-of-sight velocities and metallicities out to typically ∼8 R_e, and to ∼15 R_e in some cases. New techniques to extract integrated stellar kinematics and metallicities to large radii (∼2-3 R_e) are used in concert with GC data to create two-dimensional (2D) velocity and metallicity maps for comparison with simulations of galaxy formation. The advantages of SLUGGS compared with other, complementary, 2D-chemodynamical surveys are its superior velocity resolution, radial extent, and multiple halo tracers. We describe the sample of 25 nearby ETGs, the selection criteria for galaxies and GCs, the observing strategies, the data reduction techniques, and modeling methods. The survey observations are nearly complete and more than 30 papers have so far been published using SLUGGS data. Here we summarize some initial results, including signatures of two-phase galaxy assembly, evidence for GC metallicity bimodality, and a novel framework for the formation of extended star clusters and ultracompact dwarfs. An integrated overview of current chemodynamical constraints on GC systems points to separate, in situ formation modes at high redshifts for metal-poor and metal-rich GCs.

  3. Determination method for 129I in soil samples by MIP-MS

    International Nuclear Information System (INIS)

    Uezu, Yasuhiro; Nakano, Masanao; Fujita, Hiroki; Watanabe, Hitoshi; Maruo, Yoshihiro

    2001-01-01

    The radioactive iodine-129 (129I) is an important radionuclide for environmental assessment because it has a long half-life and accumulates in the human thyroid gland. A new analytical technique using a Microwave Induced Plasma Mass Spectrometer (MIP-MS) was applied to the determination of 129I in soil samples. In environmental samples, a large amount of matrix elements is present. Therefore, the matrix elements were eliminated by ashing at 1000 °C, and the iodine isotopes were trapped on activated charcoal and finally extracted with 10% tetramethylammonium hydroxide (TMAH). The concentrations of 129I in soil samples were compared between the results of neutron activation analysis and the MIP-MS method. The results showed excellent agreement. (author)

  4. Estimation of functional failure probability of passive systems based on adaptive importance sampling method

    International Nuclear Information System (INIS)

    Wang Baosheng; Wang Dongqing; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to estimate the functional failure probability of passive systems, an innovative adaptive importance sampling methodology is presented. In the proposed methodology, information on the variables is extracted by pre-sampling points in the failure region. An importance sampling density is then constructed from the sample distribution in the failure region. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters are considered in this paper. The probability of functional failure is then estimated with a combination of the response surface method and the adaptive importance sampling method. The numerical results demonstrate the high computational efficiency and excellent accuracy of the methodology compared with traditional probability analysis methods. (authors)
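
    To illustrate the general idea described above on a toy problem, the sketch below pre-samples a simple limit-state function to locate failure points, fits a Gaussian importance density to them, and estimates the failure probability with likelihood-ratio weights. The limit-state function, distributions and sample sizes are stand-ins, not the AP1000 passive-system model or the authors' response surface.

        import numpy as np
        from scipy.stats import multivariate_normal

        def limit_state(x):
            """Toy performance function: failure when g(x) < 0 (assumption)."""
            return 3.5 - x.sum(axis=1)

        dim = 2
        f0 = multivariate_normal(mean=np.zeros(dim), cov=np.eye(dim))   # input model

        # 1) pre-sampling: locate points in the failure region
        pre = f0.rvs(size=20000, random_state=1)
        fail = pre[limit_state(pre) < 0]            # assumes a few failures are found

        # 2) construct the importance density from the failure-region sample
        h = multivariate_normal(mean=fail.mean(axis=0), cov=np.cov(fail, rowvar=False))

        # 3) importance sampling estimate of the functional failure probability
        xs = h.rvs(size=5000, random_state=2)
        w = f0.pdf(xs) / h.pdf(xs)                  # likelihood-ratio weights
        pf = np.mean((limit_state(xs) < 0) * w)
        print(f"estimated failure probability: {pf:.2e}")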

  5. National comparison on volume sample activity measurement methods

    International Nuclear Information System (INIS)

    Sahagia, M.; Grigorescu, E.L.; Popescu, C.; Razdolescu, C.

    1992-01-01

    A national comparison of volume sample activity measurement methods may be regarded as a step toward accomplishing the traceability of environmental and food chain activity measurements to national standards. For this purpose, the Radionuclide Metrology Laboratory has distributed 137Cs and 134Cs water-equivalent solid standard sources to 24 laboratories having responsibilities in this matter. Every laboratory has to measure the activity of the received source(s) using its own standards, equipment and methods and report the obtained results to the organizer. The 'measured activities' will be compared with the 'true activities'. A final report will be issued, which is planned to evaluate the national level of precision of such measurements and give some suggestions for improvement. (Author)

  6. Comparing hair-morphology and molecular methods to identify fecal samples from Neotropical felids.

    Directory of Open Access Journals (Sweden)

    Carlos C Alberts

    Full Text Available To avoid certain problems encountered with more-traditional and invasive methods in behavioral-ecology studies of mammalian predators, such as felids, molecular approaches have been employed to identify feces found in the field. However, this method requires a complete molecular biology laboratory, and usually also requires very fresh fecal samples to avoid DNA degradation. Both conditions are normally absent in the field. To address these difficulties, identification based on morphological characters (length, color, banding, scales and medullar patterns) of hairs found in feces could be employed as an alternative. In this study we constructed a morphological identification key for guard hairs of eight Neotropical felids (jaguar, oncilla, Geoffroy's cat, margay, ocelot, Pampas cat, puma and jaguarundi) and compared its efficiency to that of a molecular identification method, using the ATP6 region as a marker. For this molecular approach, we simulated some field conditions by postponing sample-conservation procedures. A blind test of the identification key obtained a nearly 70% overall success rate, which we considered equivalent to or better than the results of some molecular methods (probably due to DNA degradation) found in other studies. The jaguar, puma and jaguarundi could be unequivocally discriminated from any other Neotropical felid. On a scale ranging from inadequate to excellent, the key proved poor only for the margay, with only 30% of its hairs successfully identified using this key, and intermediate for the remaining species (the oncilla, Geoffroy's cat, ocelot and Pampas cat. Complementary information about the known distributions of felid populations may be necessary to substantially improve the results obtained with the key. Our own molecular results were even better, since all blind-tested samples were correctly identified. Some of these identifications were made from samples kept in suboptimal

  7. Vapor space characterization of Waste Tank 241-C-103: Inorganic results from sample Job 7B (May 12-25, 1994)

    International Nuclear Information System (INIS)

    Ligotke, M.W.; Pool, K.H.; Lerner, B.D.

    1994-10-01

    This report provides analytical results for use in safety and toxicological evaluations of the vapor space of Hanford single-shell waste storage tank 241-C-103. Samples were analyzed to determine concentrations of ammonia, nitric oxide, nitrogen dioxide, sulfur oxides, and hydrogen cyanide. In addition to the samples, controls were analyzed that included blanks, spiked blanks, and spiked samples. These controls provided information about the suitability of the sampling and analytical methods. Also included are the following: information describing the methods and sampling procedures used; results of sample analyses; and conclusions and recommendations

  8. A novel sampling method for multiple multiscale targets from scattering amplitudes at a fixed frequency

    Science.gov (United States)

    Liu, Xiaodong

    2017-08-01

    A sampling method using the scattering amplitude is proposed for shape and location reconstruction in inverse acoustic scattering problems. Only matrix multiplication is involved in the computation, so the novel sampling method is very easy and simple to implement. With the help of the factorization of the far-field operator, we establish an inf-criterion for the characterization of the underlying scatterers. This result is then used to give a lower bound of the proposed indicator functional for sampling points inside the scatterers. For sampling points outside the scatterers, we show that the indicator functional decays like the Bessel functions as the sampling point moves away from the boundary of the scatterers. We also show that the proposed indicator functional depends continuously on the scattering amplitude, which further implies that the novel sampling method is extremely stable with respect to errors in the data. Different from classical sampling methods such as the linear sampling method or the factorization method, from the numerical point of view, the novel indicator takes its maximum near the boundary of the underlying target and decays like the Bessel functions as the sampling points move away from the boundary. The numerical simulations also show that the proposed sampling method can deal with the multiple multiscale case, even when the different components are close to each other.

  9. Sample preparation method for ICP-MS measurement of 99Tc in a large amount of environmental samples

    International Nuclear Information System (INIS)

    Kondo, M.; Seki, R.

    2002-01-01

    Sample preparation for the measurement of 99Tc in large amounts of soil and water samples by ICP-MS has been developed using 95mTc as a yield tracer. The method is based on the conventional method for small soil samples using incineration, acid digestion, extraction chromatography (TEVA resin) and ICP-MS measurement. Preconcentration of Tc by co-precipitation with ferric oxide was introduced. The matrix materials in large samples were removed more thoroughly than with the previous method while keeping a high recovery of Tc. The recovery of Tc was 70-80% for 100 g soil samples and 60-70% for 500 g of soil and 500 L of water samples. The detection limit of this method was evaluated as 0.054 mBq/kg in 500 g soil and 0.032 μBq/L in 500 L water. The determined value of 99Tc in IAEA-375 (a soil sample collected near the Chernobyl nuclear reactor) was 0.25 ± 0.02 Bq/kg. (author)

  10. A Non-Uniformly Under-Sampled Blade Tip-Timing Signal Reconstruction Method for Blade Vibration Monitoring

    Directory of Open Access Journals (Sweden)

    Zheng Hu

    2015-01-01

    Full Text Available High-speed blades are often prone to fatigue due to severe blade vibrations. In particular, synchronous vibrations can cause irreversible damage to the blade. Blade tip-timing (BTT) methods have become a promising way to monitor blade vibrations. However, synchronous vibrations cannot be adequately monitored by uniform BTT sampling. Therefore, non-equally mounted probes have been used, which results in a non-uniform sampling signal. Since under-sampling is an intrinsic drawback of BTT methods, how to analyze non-uniformly under-sampled BTT signals is a big challenge. In this paper, a novel reconstruction method for non-uniformly under-sampled BTT data is presented. The method is based on the periodically non-uniform sampling theorem. Firstly, a mathematical model of the non-uniform BTT sampling process is built. It can be treated as the sum of certain uniform sample streams. For each stream, an interpolating function is required to prevent aliasing in the reconstructed signal. Secondly, simultaneous equations of all interpolating functions in each sub-band are built, and the corresponding solutions are derived to remove unwanted replicas of the original signal caused by the sampling, which may overlay the original signal. Finally, numerical simulations and experiments are carried out to validate the feasibility of the proposed method. The results demonstrate that the accuracy of the reconstructed signal depends on the sampling frequency, the blade vibration frequency, the blade vibration bandwidth, the probe static offset and the number of samples. In practice, both types of blade vibration signals can be reconstructed from non-uniform BTT data acquired from only two probes.
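
    The reconstruction above treats the non-uniform probe signal as a sum of uniform streams; reproducing it faithfully is beyond a short snippet, so the sketch below instead shows the common least-squares treatment of synchronous BTT data, fitting the amplitude and phase of an assumed engine-order vibration from tip deflections seen at a few non-equally spaced probes. Probe angles, engine order and deflection values are hypothetical, and this is explicitly not the authors' sub-band reconstruction method.

        import numpy as np

        def fit_synchronous_vibration(theta_probes, deflections, engine_order):
            """Least-squares fit of y = A*sin(EO*theta) + B*cos(EO*theta) + C to
            blade-tip deflections measured at fixed probe angles (radians)."""
            theta = np.asarray(theta_probes, dtype=float)
            y = np.asarray(deflections, dtype=float)
            M = np.column_stack([np.sin(engine_order * theta),
                                 np.cos(engine_order * theta),
                                 np.ones_like(theta)])
            (A, B, C), *_ = np.linalg.lstsq(M, y, rcond=None)
            return np.hypot(A, B), np.arctan2(B, A), C      # amplitude, phase, offset

        # Hypothetical example: three non-equally spaced probes, engine order 4
        theta = np.deg2rad([0.0, 57.0, 131.0])
        deflections = 0.8 * np.sin(4 * theta + 0.6) + 0.1   # synthetic tip deflections (mm)
        amp, phase, offset = fit_synchronous_vibration(theta, deflections, engine_order=4)
        print(f"amplitude {amp:.2f} mm, phase {phase:.2f} rad, offset {offset:.2f} mm")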

  11. Serum chromium levels sampled with steel needle versus plastic IV cannula. Does method matter?

    DEFF Research Database (Denmark)

    Penny, Jeannette Ø; Overgaard, Søren

    2010-01-01

    PURPOSE: Modern metal-on-metal (MoM) joint articulations release metal ions into the body. Research tries to establish how much this elevates metal ion levels and whether it causes adverse effects. The steel needle that samples the blood may introduce additional chromium to the sample, thereby causing bias. This study aimed to test that theory. METHODS: We compared serum chromium values for two sampling methods, steel needle and IV plastic cannula, as well as sampling sequence, in 16 healthy volunteers. RESULTS: We found statistically significant chromium contamination from the steel needle, with mean differences between the two methods of 0.073 ng/mL for the first sample and 0.033 ng/mL for the second. No difference was found between the first and second plastic sample. The first steel needle sample contained an average of 0.047 ng/mL more than the second. This difference was only borderline

  12. Systems and methods for self-synchronized digital sampling

    Science.gov (United States)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.

  13. Study on methods of quantitative analysis of the biological thin samples in EM X-ray microanalysis

    International Nuclear Information System (INIS)

    Zhang Detian; Zhang Xuemin; He Kun; Yang Yi; Zhang Sa; Wang Baozhen

    2000-01-01

    Objective: To study methods for the quantitative analysis of biological thin samples. Methods: Hall theory was used to address qualitative analysis, background subtraction, stripping of overlapping peaks, external radiation, and spectral aberrations. Results: Reliable qualitative analysis and precise quantitative analysis were achieved. Conclusion: The methods for the analysis of biological thin samples in EM X-ray microanalysis can be used in biomedical research

  14. Method for analysing radium in powder samples and its application to uranium prospecting

    International Nuclear Information System (INIS)

    Gong Xinxi; Hu Minzhi.

    1987-01-01

    The decay daughters of Rn released from a powdered sample (soil) in a sealed bottle were collected on a piece of copper, and the radium in the sample can then be measured by counting α-particles with an Alphameter for uranium prospecting; this is therefore called the radium method. The method has many advantages, such as high sensitivity (the lowest limit of detection is 2.7 × 10⁻¹⁵ g of radium per gram of sample), high efficiency, low cost and ease of use. On the basis of measuring more than 700 samples taken along 20 sections in 8 deposits, the results show that the radium method is better than γ-measurement and equal to the 210Po method in its capability to discover anomalies. The author also summarizes the anomaly intensities of the radium method, the 210Po method and γ-measurement at the surface above deep blind ores, with or without surficial mineralization, the shapes of their profiles, and the variation of Ra/210Po ratios. According to the above-mentioned distinguishing features, uranium mineralization located in deep and/or shallow parts can be distinguished. The combined application of the radium, 210Po and γ-measurement methods may be regarded as one of the important methods for anomaly assessment. Based on radium measurements of 771 stream sediment samples in an area of 100 km², it is demonstrated that the radium method can be used in the stages of uranium reconnaissance and prospecting

  15. Method and apparatus for sampling atmospheric mercury

    Science.gov (United States)

    Trujillo, Patricio E.; Campbell, Evan E.; Eutsler, Bernard C.

    1976-01-20

    A method of simultaneously sampling particulate mercury, organic mercurial vapors, and metallic mercury vapor in the working and occupational environment and determining the amount of mercury derived from each such source in the sampled air. A known volume of air is passed through a sampling tube containing a filter for particulate mercury collection, a first adsorber for the selective adsorption of organic mercurial vapors, and a second adsorber for the adsorption of metallic mercury vapor. Carbon black molecular sieves are particularly useful as the selective adsorber for organic mercurial vapors. The amount of mercury adsorbed or collected in each section of the sampling tube is readily quantitatively determined by flameless atomic absorption spectrophotometry.

  16. Comparison of three mycobacterial DNA extraction methods from extrapulmonary samples for PCR assay

    Directory of Open Access Journals (Sweden)

    Khandaker Shadia

    2012-01-01

    Full Text Available The sensitivity of molecular diagnostic tests for extrapulmonary tuberculosis largely depends upon the efficiency of the DNA extraction method. The objective of our study was to compare three methods of extracting Mycobacterium tuberculosis DNA for testing by polymerase chain reaction (PCR). All three methods (heating, heating with sonication, and addition of lysis buffer with heating and sonication) were applied to 20 extrapulmonary samples. PCR positivity was 2 (10%), 4 (20%) and 7 (35%) in the samples extracted by the heating, heat+sonication and heat+sonication+lysis buffer methods, respectively. Of the extraction methods evaluated, the maximum number of PCR-positive results was achieved by the combined heat, sonication and lysis buffer method, which can be applied in routine clinical practice. Ibrahim Med. Coll. J. 2012; 6(1): 9-11

  17. A Proposal of Estimation Methodology to Improve Calculation Efficiency of Sampling-based Method in Nuclear Data Sensitivity and Uncertainty Analysis

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2014-01-01

    The uncertainty with the sampling-based method is evaluated by repeating transport calculations with a number of cross section data sets sampled from the covariance uncertainty data. In the transport calculation with the sampling-based method, the transport equation is not modified; therefore, all uncertainties of the responses such as k-eff, reaction rates, flux and power distribution can be obtained directly, all at one time, without code modification. However, a major drawback of the sampling-based method is that it requires an expensive computational load to obtain statistically reliable results (inside a 0.95 confidence level) in the uncertainty analysis. The purpose of this study is to develop a method for improving the computational efficiency and obtaining highly reliable uncertainty results when using the sampling-based method with Monte Carlo simulation. The proposed method reduces the convergence time of the response uncertainty by using multiple sets of sampled group cross sections in a single Monte Carlo simulation. The proposed method was verified by estimating the GODIVA benchmark problem, and the results were compared with those of the conventional sampling-based method. In this study, a sampling-based method based on the central limit theorem is proposed to improve calculation efficiency by reducing the number of repetitive Monte Carlo transport calculations required to obtain reliable uncertainty analysis results. Each set of sampled group cross sections is assigned to an active cycle group in a single Monte Carlo simulation. The criticality uncertainty for the GODIVA problem is evaluated by the proposed and previous methods. The results show that the proposed sampling-based method can efficiently decrease the number of Monte Carlo simulations required to evaluate the uncertainty of k-eff. It is expected that the proposed method will improve the computational efficiency of uncertainty analysis with the sampling-based method
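
    A schematic of the plain (non-accelerated) sampling-based propagation that the proposed method speeds up: cross sections are sampled from their covariance data, the transport calculation is repeated for each sampled library, and the spread of k-eff gives the nuclear-data uncertainty. The run_transport function, sensitivities and covariance values below are placeholders standing in for a real transport code and covariance library.

        import numpy as np

        def run_transport(xs_sample):
            """Placeholder for one transport calculation returning k-eff for a
            sampled cross-section set (a toy linear response is assumed here)."""
            sensitivity = np.array([0.30, -0.10, 0.05])
            return 1.000 + sensitivity @ (xs_sample - 1.0)

        rng = np.random.default_rng(42)
        mean_xs = np.ones(3)                          # nominal (relative) cross sections
        cov_xs = np.diag([0.02, 0.03, 0.01]) ** 2     # assumed covariance data

        n_samples = 500
        keff = np.empty(n_samples)
        for i in range(n_samples):
            xs = rng.multivariate_normal(mean_xs, cov_xs)   # one sampled library
            keff[i] = run_transport(xs)                     # one repeated calculation

        print(f"k-eff = {keff.mean():.5f} +/- {keff.std(ddof=1):.5f} (1 sigma)")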

  18. Gray bootstrap method for estimating frequency-varying random vibration signals with small samples

    Directory of Open Access Journals (Sweden)

    Wang Yanqing

    2014-04-01

    Full Text Available During environmental testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerances method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. Firstly, the estimated indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. In addition, GBM is applied to estimating data from a single flight test of a certain aircraft. Finally, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in the test analysis. The result shows that GBM is superior for estimating dynamic signals with small samples, and the estimated reliability is proved to be 100% at the given confidence level.
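
    For context, the sketch below shows the ordinary percentile bootstrap that GBM builds on: a small sample is resampled with replacement to obtain an estimated value, an estimated interval and an estimated uncertainty. The vibration-level values are invented, and the gray-model extension for frequency-varying signals is not reproduced.

        import numpy as np

        def bootstrap_interval(sample, n_boot=2000, alpha=0.05, rng=None):
            """Percentile bootstrap estimate of the mean and its confidence interval."""
            rng = rng or np.random.default_rng(0)
            sample = np.asarray(sample, dtype=float)
            boot_means = np.array([
                rng.choice(sample, size=sample.size, replace=True).mean()
                for _ in range(n_boot)
            ])
            lo, hi = np.percentile(boot_means, [100 * alpha / 2, 100 * (1 - alpha / 2)])
            return boot_means.mean(), (lo, hi), boot_means.std(ddof=1)

        # Hypothetical small sample of vibration levels (g RMS)
        levels = [4.1, 3.8, 4.6, 4.0, 4.4, 3.9, 4.2]
        est, (lo, hi), unc = bootstrap_interval(levels)
        print(f"estimate {est:.2f}, 95% interval [{lo:.2f}, {hi:.2f}], uncertainty {unc:.2f}")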

  19. A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions

    Science.gov (United States)

    Pan, Guang; Ye, Pengcheng; Yang, Zhidong

    2014-01-01

    Metamodels have been widely used in engineering design to facilitate the analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling method. In this paper, a new sequential optimization sampling method is proposed. Based on the new sampling method, metamodels can be constructed repeatedly through the addition of sampling points, namely, extremum points of the current metamodel and minimum points of a density function. More accurate metamodels are then constructed by repeating this procedure. The validity and effectiveness of the proposed sampling method are examined by studying typical numerical examples. PMID:25133206
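
    A minimal one-dimensional sketch of the sequential idea: fit a radial basis function metamodel to the current samples, take its minimizer as the next sample point, evaluate the expensive function there, and refit. The test function, Gaussian kernel width and the crude minimum-distance stand-in for the density criterion are simplifications chosen for illustration, not the paper's exact algorithm.

        import numpy as np

        def expensive_simulation(x):
            """Stand-in for a computationally expensive simulation (assumption)."""
            return (x - 0.3) ** 2 + 0.1 * np.sin(15 * x)

        def fit_rbf(x, y, eps=5.0, nugget=1e-8):
            """Gaussian RBF weights; a small nugget keeps the system well conditioned."""
            K = np.exp(-(eps * (x[:, None] - x[None, :])) ** 2) + nugget * np.eye(x.size)
            return np.linalg.solve(K, y)

        def rbf_predict(xq, x, w, eps=5.0):
            return np.exp(-(eps * (xq[:, None] - x[None, :])) ** 2) @ w

        x = np.array([0.0, 0.5, 1.0])                 # initial design
        y = expensive_simulation(x)
        grid = np.linspace(0.0, 1.0, 401)

        for it in range(5):                           # sequential infill iterations
            w = fit_rbf(x, y)
            pred = rbf_predict(grid, x, w)
            # crude stand-in for the density criterion: keep new points away from
            # existing samples so the design stays space-filling and well conditioned
            far_enough = np.min(np.abs(grid[:, None] - x[None, :]), axis=1) > 0.02
            x_new = grid[np.where(far_enough, pred, np.inf).argmin()]
            x = np.append(x, x_new)
            y = np.append(y, expensive_simulation(x_new))
            print(f"iteration {it}: new sample at x = {x_new:.3f}, f = {y[-1]:+.4f}")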

  20. Preparation of Samples for Leaf Architecture Studies, A Method for Mounting Cleared Leaves

    Directory of Open Access Journals (Sweden)

    Alejandra Vasco

    2014-09-01

    Full Text Available Premise of the study: Several recent waves of interest in leaf architecture have shown an expanding range of approaches and applications across a number of disciplines. Despite this increased interest, examination of existing archives of cleared and mounted leaves shows that current methods for mounting, in particular, yield unsatisfactory results and deterioration of samples over relatively short periods. Although techniques for clearing and staining leaves are numerous, published techniques for mounting leaves are scarce. Methods and Results: Here we present a complete protocol and recommendations for clearing, staining, and imaging leaves, and, most importantly, a method to permanently mount cleared leaves. Conclusions: The mounting protocol is faster than other methods, inexpensive, and straightforward; moreover, it yields clear and permanent samples that can easily be imaged, scanned, and stored. Specimens mounted with this method preserve well, with leaves that were mounted more than 35 years ago showing no signs of bubbling or discoloration.

  1. Analytic continuation of quantum Monte Carlo data. Stochastic sampling method

    Energy Technology Data Exchange (ETDEWEB)

    Ghanem, Khaldoon; Koch, Erik [Institute for Advanced Simulation, Forschungszentrum Juelich, 52425 Juelich (Germany)

    2016-07-01

    We apply Bayesian inference to the analytic continuation of quantum Monte Carlo (QMC) data from the imaginary axis to the real axis. Demanding a proper functional Bayesian formulation of any analytic continuation method leads naturally to the stochastic sampling method (StochS) as the Bayesian method with the simplest prior, while it excludes the maximum entropy method and Tikhonov regularization. We present a new efficient algorithm for performing StochS that reduces computational times by orders of magnitude in comparison to earlier StochS methods. We apply the new algorithm to a wide variety of typical test cases: spectral functions and susceptibilities from DMFT and lattice QMC calculations. Results show that StochS performs well and is able to resolve sharp features in the spectrum.

  2. Sampling designs and methods for estimating fish-impingement losses at cooling-water intakes

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-01-01

    Several systems for estimating fish impingement at power plant cooling-water intakes are compared to determine the most statistically efficient sampling designs and methods. Compared to a simple random sampling scheme, the stratified systematic random sampling scheme, the systematic random sampling scheme, and the stratified random sampling scheme yield higher efficiencies and better estimators for the parameters in two models of fish impingement as a time-series process. Mathematical results and illustrative examples of the applications of the sampling schemes to simulated and real data are given. Some sampling designs applicable to fish-impingement studies are presented in appendixes
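
    A small numerical illustration of why stratification helps in this setting: the sketch below compares a simple random sample of days with a stratified sample (the same total number of sampled days) when estimating total impingement over a simulated 90-day season. The strata, Poisson rates and sample sizes are invented for the example and are not taken from the report.

        import numpy as np

        rng = np.random.default_rng(7)

        # Simulated daily impingement counts in three "strata" of 30 days each
        # (for example low-, medium- and high-impingement periods); values invented.
        strata = [rng.poisson(20, 30), rng.poisson(80, 30), rng.poisson(300, 30)]
        true_total = sum(int(s.sum()) for s in strata)

        def stratified_estimate(strata, n_per_stratum, rng):
            """Estimate the seasonal total from a stratified random sample of days."""
            total = 0.0
            for s in strata:
                sample = rng.choice(s, size=n_per_stratum, replace=False)
                total += len(s) * sample.mean()       # N_h times the stratum sample mean
            return total

        def srs_estimate(strata, n, rng):
            """Estimate the seasonal total from a simple random sample of days."""
            days = np.concatenate(strata)
            return len(days) * rng.choice(days, size=n, replace=False).mean()

        strat = [stratified_estimate(strata, 5, rng) for _ in range(2000)]
        srs = [srs_estimate(strata, 15, rng) for _ in range(2000)]
        print(f"true total {true_total}")
        print(f"stratified estimate: mean {np.mean(strat):.0f}, sd {np.std(strat):.0f}")
        print(f"simple random estimate: mean {np.mean(srs):.0f}, sd {np.std(srs):.0f}")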

  3. Optical methods for microstructure determination of doped samples

    Science.gov (United States)

    Ciosek, Jerzy F.

    2008-12-01

    Optical methods to determine the refractive index profile of layered materials are commonly based on spectroscopic ellipsometry or transmittance/reflectance spectrometry. Measurements of spectral reflection and transmission usually permit characterization of optical materials and determination of their refractive index. However, it is also possible to characterize samples with dopants, impurities and defects using optical methods. The microstructures of a hydrogenated crystalline Si wafer and a layer of SiO2-ZrO2 composition are investigated. The first sample is a Si(001):H Czochralski-grown single-crystalline wafer with a 50 nm thick surface SiO2 layer. Hydrogen dose implantation continues to be an important issue in microelectronic device and sensor fabrication. Hydrogen-implanted silicon (Si:H) has become a topic of remarkable interest, mostly because of the potential of implantation-induced platelets and micro-cavities for the creation of gettering-active areas and for Si layer splitting. Oxygen precipitation and atmospheric impurities are analysed. The second sample is a layer of SiO2 and ZrO2 co-evaporated using two electron beam guns simultaneously in a reactive evaporation method. The composition and structure were investigated by X-ray photoelectron spectroscopy (XPS) and spectroscopic ellipsometry. The non-uniformity and composition of the layer are analysed using the average density method.

  4. Peak Bagging of red giant stars observed by Kepler: first results with a new method based on Bayesian nested sampling

    Science.gov (United States)

    Corsaro, Enrico; De Ridder, Joris

    2015-09-01

    The peak bagging analysis, namely the fitting and identification of single oscillation modes in stars' power spectra, coupled to the very high-quality light curves of red giant stars observed by Kepler, can play a crucial role for studying stellar oscillations of different flavor with an unprecedented level of detail. A thorough study of stellar oscillations would thus allow for deeper testing of stellar structure models and new insights in stellar evolution theory. However, peak bagging inferences are in general very challenging problems due to the large number of observed oscillation modes, hence of free parameters that can be involved in the fitting models. Efficiency and robustness in performing the analysis is what may be needed to proceed further. For this purpose, we developed a new code implementing the Nested Sampling Monte Carlo (NSMC) algorithm, a powerful statistical method well suited for Bayesian analyses of complex problems. In this talk we show the peak bagging of a sample of high signal-to-noise red giant stars by exploiting recent Kepler datasets and a new criterion for the detection of an oscillation mode based on the computation of the Bayesian evidence. Preliminary results for frequencies and lifetimes for single oscillation modes, together with acoustic glitches, are therefore presented.

  5. Peak Bagging of red giant stars observed by Kepler: first results with a new method based on Bayesian nested sampling

    Directory of Open Access Journals (Sweden)

    Corsaro Enrico

    2015-01-01

    Full Text Available The peak bagging analysis, namely the fitting and identification of single oscillation modes in stars’ power spectra, coupled to the very high-quality light curves of red giant stars observed by Kepler, can play a crucial role for studying stellar oscillations of different flavor with an unprecedented level of detail. A thorough study of stellar oscillations would thus allow for deeper testing of stellar structure models and new insights in stellar evolution theory. However, peak bagging inferences are in general very challenging problems due to the large number of observed oscillation modes, hence of free parameters that can be involved in the fitting models. Efficiency and robustness in performing the analysis is what may be needed to proceed further. For this purpose, we developed a new code implementing the Nested Sampling Monte Carlo (NSMC) algorithm, a powerful statistical method well suited for Bayesian analyses of complex problems. In this talk we show the peak bagging of a sample of high signal-to-noise red giant stars by exploiting recent Kepler datasets and a new criterion for the detection of an oscillation mode based on the computation of the Bayesian evidence. Preliminary results for frequencies and lifetimes for single oscillation modes, together with acoustic glitches, are therefore presented.

  6. A Novel Method to Handle the Effect of Uneven Sampling Effort in Biodiversity Databases

    Science.gov (United States)

    Pardo, Iker; Pata, María P.; Gómez, Daniel; García, María B.

    2013-01-01

    How reliable are results on the spatial distribution of biodiversity based on databases? Many studies have evidenced the uncertainty related to this kind of analysis due to sampling effort bias and the need for its quantification. Although a number of methods are available for that, little is known about their statistical limitations and discrimination capability, which could seriously constrain their use. We assess for the first time the discrimination capacity of two widely used methods and a proposed new one (FIDEGAM), all based on species accumulation curves, under different scenarios of sampling exhaustiveness using Receiver Operating Characteristic (ROC) analyses. Additionally, we examine to what extent the output of each method represents the sampling completeness in a simulated scenario where the true species richness is known. Finally, we apply FIDEGAM to a real situation and explore the spatial patterns of plant diversity in a National Park. FIDEGAM showed an excellent discrimination capability to distinguish between well and poorly sampled areas regardless of sampling exhaustiveness, whereas the other methods failed. Accordingly, FIDEGAM values were strongly correlated with the true percentage of species detected in a simulated scenario, whereas sampling completeness estimated with other methods showed no relationship due to null discrimination capability. Quantifying sampling effort is necessary to account for the uncertainty in biodiversity analyses; however, not all proposed methods are equally reliable. Our comparative analysis demonstrated that FIDEGAM was the most accurate discriminator method in all scenarios of sampling exhaustiveness, and therefore, it can be efficiently applied to most databases in order to enhance the reliability of biodiversity analyses. PMID:23326357
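
    The species accumulation curves that FIDEGAM and the other compared indices build on can be computed directly from occurrence records; the sketch below derives a mean randomized accumulation curve and uses its final slope as a crude completeness signal. The records are invented, and this is only the underlying ingredient, not the FIDEGAM index or its ROC evaluation.

        import numpy as np

        def accumulation_curve(records, n_perm=200, rng=None):
            """Mean randomized species accumulation curve for a list of species records."""
            rng = rng or np.random.default_rng(0)
            records = np.asarray(records)
            curves = np.zeros((n_perm, records.size))
            for p in range(n_perm):
                seen = set()
                for i, species in enumerate(rng.permutation(records)):
                    seen.add(species)
                    curves[p, i] = len(seen)
            return curves.mean(axis=0)

        # Hypothetical records from a well-sampled and a poorly sampled grid cell
        well = list("AABBBCCDDEEEFFFGGAABBCC")   # repeats: few new species at the end
        poor = list("ABCDEFGH")                  # every record is still a new species
        for name, recs in [("well sampled", well), ("poorly sampled", poor)]:
            curve = accumulation_curve(recs)
            final_slope = curve[-1] - curve[-2]  # new species added by the last record
            print(f"{name}: {curve[-1]:.1f} species observed, final slope {final_slope:.2f}")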

  7. Sampling Key Populations for HIV Surveillance: Results From Eight Cross-Sectional Studies Using Respondent-Driven Sampling and Venue-Based Snowball Sampling.

    Science.gov (United States)

    Rao, Amrita; Stahlman, Shauna; Hargreaves, James; Weir, Sharon; Edwards, Jessie; Rice, Brian; Kochelani, Duncan; Mavimbela, Mpumelelo; Baral, Stefan

    2017-10-20

    In using regularly collected or existing surveillance data to characterize engagement in human immunodeficiency virus (HIV) services among marginalized populations, differences in sampling methods may produce different pictures of the target population and may therefore result in different priorities for response. The objective of this study was to use existing data to evaluate the sample distribution of eight studies of female sex workers (FSW) and men who have sex with men (MSM), who were recruited using different sampling approaches in two locations within Sub-Saharan Africa: Manzini, Swaziland and Yaoundé, Cameroon. MSM and FSW participants were recruited using either respondent-driven sampling (RDS) or venue-based snowball sampling. Recruitment took place between 2011 and 2016. Participants at each study site were administered a face-to-face survey to assess sociodemographics, along with the prevalence of self-reported HIV status, frequency of HIV testing, stigma, and other HIV-related characteristics. Crude and RDS-adjusted prevalence estimates were calculated. Crude prevalence estimates from the venue-based snowball samples were compared with the overlap of the RDS-adjusted prevalence estimates, between both FSW and MSM in Cameroon and Swaziland. RDS samples tended to be younger (MSM aged 18-21 years in Swaziland: 47.6% [139/310] in RDS vs 24.3% [42/173] in Snowball, in Cameroon: 47.9% [99/306] in RDS vs 20.1% [52/259] in Snowball; FSW aged 18-21 years in Swaziland 42.5% [82/325] in RDS vs 8.0% [20/249] in Snowball; in Cameroon 15.6% [75/576] in RDS vs 8.1% [25/306] in Snowball). They were less educated (MSM: primary school completed or less in Swaziland 42.6% [109/310] in RDS vs 4.0% [7/173] in Snowball, in Cameroon 46.2% [138/306] in RDS vs 14.3% [37/259] in Snowball; FSW: primary school completed or less in Swaziland 86.6% [281/325] in RDS vs 23.9% [59/247] in Snowball, in Cameroon 87.4% [520/576] in RDS vs 77.5% [238/307] in Snowball) than the snowball
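
    The RDS adjustment referred to above typically reweights each respondent by the inverse of his or her reported network size; a minimal sketch in the spirit of the RDS-II (Volz-Heckathorn) estimator is shown below with invented data, omitting the seed handling and bootstrap variance estimation used in practice.

        import numpy as np

        def rds_ii_prevalence(outcome, degree):
            """Inverse-degree-weighted prevalence (RDS-II style point estimate)."""
            outcome = np.asarray(outcome, dtype=float)
            weights = 1.0 / np.asarray(degree, dtype=float)
            return np.sum(weights * outcome) / np.sum(weights)

        # Invented data: outcome indicator (1/0) and self-reported network size
        status = np.array([1, 0, 0, 1, 0, 0, 1, 0])
        degree = np.array([25, 5, 8, 40, 6, 10, 30, 4])

        print(f"crude prevalence:        {status.mean():.2f}")
        print(f"RDS-adjusted prevalence: {rds_ii_prevalence(status, degree):.2f}")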

  8. Cytotoxicity of Light-Cured Dental Materials according to Different Sample Preparation Methods

    Directory of Open Access Journals (Sweden)

    Myung-Jin Lee

    2017-03-01

    Full Text Available Dental light-cured resins can undergo different degrees of polymerization when applied in vivo. When polymerization is incomplete, toxic monomers may be released into the oral cavity. The present study assessed the cytotoxicity of different materials, using sample preparation methods that mirror clinical conditions. Composite and bonding resins were used and divided into four groups according to sample preparation method: uncured; directly cured samples, which were cured after being placed on solidified agar; post-cured samples, which were polymerized before being placed on agar; and “removed unreacted layer” samples, which had their oxygen-inhibition layer removed after polymerization. Cytotoxicity was evaluated using an agar diffusion test, MTT assay, and confocal microscopy. Uncured samples were the most cytotoxic, while removed unreacted layer samples were the least cytotoxic (p < 0.05). In the MTT assay, cell viability increased significantly in every group as the concentration of the extracts decreased (p < 0.05). Extracts from post-cured and removed unreacted layer samples of bonding resin were less toxic than post-cured and removed unreacted layer samples of composite resin. Removal of the oxygen-inhibition layer resulted in the lowest cytotoxicity. Clinicians should remove unreacted monomers on the resin surface immediately after restoring teeth with light-curing resin to improve the restoration biocompatibility.

  9. Field Sample Preparation Method Development for Isotope Ratio Mass Spectrometry

    International Nuclear Information System (INIS)

    Leibman, C.; Weisbrod, K.; Yoshida, T.

    2015-01-01

    Non-proliferation and International Security (NA-241) established a working group of researchers from Los Alamos National Laboratory (LANL), Pacific Northwest National Laboratory (PNNL) and Savannah River National Laboratory (SRNL) to evaluate the utilization of in-field mass spectrometry for safeguards applications. The survey of commercial off-the-shelf (COTS) mass spectrometers (MS) revealed that no instrumentation existed capable of meeting all the potential safeguards requirements for performance, portability, and ease of use. Additionally, fieldable instruments are unlikely to meet the International Target Values (ITVs) for accuracy and precision for isotope ratio measurements achieved with laboratory methods. The major gaps identified for in-field actinide isotope ratio analysis were in the areas of: 1. sample preparation and/or sample introduction, 2. size reduction of mass analyzers and ionization sources, 3. system automation, and 4. decreased system cost. Development work in areas 2 through 4, enumerated above, continues in the private and public sectors. LANL is focusing on developing sample preparation/sample introduction methods for use with the different sample types anticipated for safeguards applications. Addressing sample handling and sample preparation methods for MS analysis will enable the use of new MS instrumentation as it becomes commercially available. As one example, we have developed a rapid sample preparation method for the dissolution of uranium and plutonium oxides using ammonium bifluoride (ABF). ABF is a significantly safer and faster alternative to digestion with boiling combinations of highly concentrated mineral acids. Actinides digested with ABF yield fluorides, which can then be analyzed directly or chemically converted and separated using established column chromatography techniques as needed prior to isotope analysis. The reagent volumes and the sample processing steps associated with ABF sample digestion lend themselves to automation and field

  10. Radiochemistry methods in DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Fadeff, S.K.; Goheen, S.C.

    1994-08-01

    Current standard sources of radiochemistry methods are often inappropriate for use in evaluating US Department of Energy environmental and waste management (DOE/EM) samples. Examples of current sources include EPA, ASTM, Standard Methods for the Examination of Water and Wastewater, and HASL-300. The applicability of these methods is limited to specific matrices (usually water), radiation levels (usually environmental levels), and analytes (a limited number). The radiochemistry methods in DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) attempt to fill the applicability gap that exists between standard methods and those needed for DOE/EM activities. The Radiochemistry chapter in DOE Methods includes an 'analysis and reporting' guidance section as well as radiochemistry methods. A basis for identifying the DOE/EM radiochemistry needs is discussed. Within this needs framework, the applicability of standard methods and targeted new methods is identified. Sources of new methods (consolidated methods from DOE laboratories and submissions from individuals) and the methods review process are discussed. The processes involved in generating consolidated methods and editing individually submitted methods are compared. DOE Methods is a living document and continues to expand by adding various kinds of methods. Radiochemistry methods are highlighted in this paper. DOE Methods is intended to be a resource for methods applicable to DOE/EM problems. Although it is intended to support DOE, the guidance and methods are not necessarily exclusive to DOE. The document is available at no cost through the Laboratory Management Division of DOE, Office of Technology Development

  11. 7 CFR 51.308 - Methods of sampling and calculation of percentages.

    Science.gov (United States)

    2010-01-01

    ..., CERTIFICATION, AND STANDARDS), United States Standards for Grades of Apples, Methods of Sampling and Calculation of Percentages, § 51.308 Methods of sampling and calculation of percentages (2010-01-01). (a) When the numerical...

  12. Modified emission-transmission method for determining trace elements in solid samples using the XRF techniques

    International Nuclear Information System (INIS)

    Poblete, V.; Alvarez, M.; Hermosilla, M.

    2000-01-01

    This is a study of the analysis of trace elements in medium-thick solid samples by the modified emission-transmission method, using the energy dispersive X-ray fluorescence (EDXRF) technique. Absorption and enhancement effects are the main disadvantages of the EDXRF technique for the quantitative analysis of major and trace elements in solid samples. The implementation of this method and its application to a variety of samples were carried out using an infinitely thick multi-element target, from which the correction factors that account for absorption of all the analytes in the sample are calculated. The discontinuities in the mass absorption coefficient versus energy curves for each element, for medium-thick and homogeneous samples, are analyzed and corrected. A thorough analysis of the different theoretical and experimental variables is demonstrated using real samples, including certified material with known concentrations. The simplicity of the calculation method and the results obtained show the method's good precision, with possibilities for the non-destructive routine analysis of different solid samples using the EDXRF technique (author)

  13. A Simple Sampling Method for Estimating the Accuracy of Large Scale Record Linkage Projects.

    Science.gov (United States)

    Boyd, James H; Guiver, Tenniel; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Anderson, Phil; Dickinson, Teresa

    2016-05-17

    Record linkage techniques allow different data collections to be brought together to provide a wider picture of the health status of individuals. Ensuring high linkage quality is important to guarantee the quality and integrity of research. Current methods for measuring linkage quality typically focus on precision (the proportion of accepted links that are correct), given the difficulty of measuring the proportion of false negatives. The aim of this work is to introduce and evaluate a sampling-based method to estimate both precision and recall following record linkage. In the sampling-based method, record pairs from each threshold (including those below the identified cut-off for acceptance) are sampled and clerically reviewed. These results are then applied to the entire set of record pairs, providing estimates of false positives and false negatives. This method was evaluated on a synthetically generated dataset, where the true match status (which records belonged to the same person) was known. The sampled estimates of linkage quality were relatively close to the actual linkage quality metrics calculated for the whole synthetic dataset. The precision and recall measures for seven reviewers were very consistent, with little variation in the clerical assessment results (overall agreement using the Fleiss kappa statistic was 0.601). This method presents a possible means of accurately estimating matching quality and refining linkages in population-level linkage studies. The sampling approach is especially important for large project linkages where the number of record pairs produced may be very large, often running into millions.
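
    A schematic of the sampling-based estimation described above: candidate record pairs are binned by linkage score, a fixed number per bin is clerically reviewed, and the review results are scaled back to the bin totals to estimate true and false positives above the cut-off and false negatives below it. The bin totals, score ranges and review outcomes below are invented.

        # Invented per-bin totals of candidate record pairs and the number of sampled
        # pairs confirmed as true matches during clerical review.
        bins = [
            {"score": ">=0.9",               "pairs": 50000, "reviewed": 200, "true_in_sample": 198},
            {"score": "0.8-0.9",             "pairs": 12000, "reviewed": 200, "true_in_sample": 170},
            {"score": "0.7-0.8 (below cut)", "pairs": 9000,  "reviewed": 200, "true_in_sample": 40},
        ]
        accepted = {">=0.9", "0.8-0.9"}              # bins above the acceptance cut-off

        tp = fp = fn = 0.0
        for b in bins:
            match_rate = b["true_in_sample"] / b["reviewed"]
            est_true = match_rate * b["pairs"]        # scale the sample result to the bin
            if b["score"] in accepted:
                tp += est_true
                fp += b["pairs"] - est_true
            else:
                fn += est_true                        # true matches left unlinked

        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        print(f"estimated precision {precision:.3f}, estimated recall {recall:.3f}")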

  14. Gap-Filling of Landsat 7 Imagery Using the Direct Sampling Method

    Directory of Open Access Journals (Sweden)

    Gaohong Yin

    2016-12-01

    Full Text Available The failure of the Scan Line Corrector (SLC) on Landsat 7 imposed systematic data gaps on retrieved imagery and removed the capacity to provide spatially continuous fields. While a number of methods have been developed to fill these gaps, most of the proposed techniques are only applicable over relatively homogeneous areas. When they are applied to heterogeneous landscapes, retrieving image features and elements can become challenging. Here we present a gap-filling approach that is based on the adoption of the Direct Sampling multiple-point geostatistical method. The method employs a conditional stochastic resampling of known areas in a training image to simulate unknown locations. The approach is assessed across a range of both homogeneous and heterogeneous regions. Simulation results show that for homogeneous areas, satisfactory results can be obtained by simply adopting non-gap locations in the target image as baseline training data. For heterogeneous landscapes, bivariate simulations using an auxiliary variable acquired at a different date provides more accurate results than univariate simulations, especially as land cover complexity increases. Apart from recovering spatially continuous fields, one of the key advantages of the Direct Sampling is the relatively straightforward implementation process that relies on relatively few parameters.

  15. Sample Results From Tank 48H Samples HTF-48-14-158, -159, -169, and -170

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hang, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-04-28

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 48H in support of determining the cause for the unusually high dose rates at the sampling points for this tank. A set of two samples was taken from the quiescent tank, and two additional samples were taken after the contents of the tank were mixed. The results of the analyses of all the samples show that the contents of the tank have changed very little since the analysis of the previous sample in 2012. The solids are almost exclusively composed of tetraphenylborate (TPB) salts, and there is no indication of acceleration in the TPB decomposition. The filtrate composition shows a moderate increase in salt concentration and density, which is attributable to the addition of NaOH for the purposes of corrosion control. An older modeling simulation of the TPB degradation was updated, and the supernate results from a 2012 sample were run in the model. This result was compared to the results from the 2014 recent sample results reported in this document. The model indicates there is no change in the TPB degradation from 2012 to 2014. SRNL measured the buoyancy of the TPB solids in Tank 48H simulant solutions. It was determined that a solution of density 1.279 g/mL (~6.5M sodium) was capable of indefinitely suspending the TPB solids evenly throughout the solution. A solution of density 1.296 g/mL (~7M sodium) caused a significant fraction of the solids to float on the solution surface. As the experiments could not include the effect of additional buoyancy elements such as benzene or hydrogen generation, the buoyancy measurements provide an upper bound estimate of the density in Tank 48H required to float the solids.

  16. A novel sample preparation method using rapid nonheated saponification method for the determination of cholesterol in emulsified foods.

    Science.gov (United States)

    Jeong, In-Seek; Kwak, Byung-Man; Ahn, Jang-Hyuk; Leem, Donggil; Yoon, Taehyung; Yoon, Changyong; Jeong, Jayoung; Park, Jung-Min; Kim, Jin-Man

    2012-10-01

In this study, nonheated saponification was employed as a novel, rapid, and easy sample preparation method for the determination of cholesterol in emulsified foods. Cholesterol content was analyzed using gas chromatography with a flame ionization detector (GC-FID). The cholesterol extraction method was optimized for maximum recovery from baby food and infant formula. Under these conditions, the optimum extraction solvent was 10 mL ethyl ether per 1 to 2 g sample, and the saponification solution was 0.2 mL KOH in methanol. The cholesterol content in the products was determined to be within the certified range of certified reference materials (CRMs), NIST SRM 1544 and SRM 1849. The results of the recovery test performed using spiked materials were in the range of 98.24% to 99.45% with a relative standard deviation (RSD) between 0.83% and 1.61%. This method could be used to reduce sample pretreatment time and is expected to provide an accurate determination of cholesterol in emulsified food matrices such as infant formula and baby food. A novel, rapid, and easy sample preparation method using nonheated saponification was developed for cholesterol detection in emulsified foods. Recovery tests of CRMs were satisfactory, and the recoveries of spiked materials were accurate and precise. This method was effective and decreased the time required for analysis by 5-fold compared to the official method. © 2012 Institute of Food Technologists®

  17. Intercomparison of methods for determining 90Sr and 137Cs in plant samples

    International Nuclear Information System (INIS)

    Sha Lianmao; Zhao Min; Tian Guizhi

    1986-01-01

The results of an intercomparison of methods for determining 90Sr and 137Cs in plant samples are reported. Nine laboratories participated in the intercomparison. The samples used in the intercomparison were reed and tea powders. The analytical results for 90Sr in reed and 137Cs in tea from the different laboratories show good comparability and follow a normal distribution. Some reported results for 137Cs in reed are noticeably lower than the others. The results for 90Sr in tea from the different laboratories have poor comparability. The results obtained by HDEHP rapid extraction chromatography appear to be too high, and the cause is discussed. The 95% confidence intervals for the 90Sr and 137Cs content of the reed and tea samples are given

  18. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile
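For readers unfamiliar with these outcome rates, the sketch below applies simplified AAPOR-style formulas to call dispositions. Only the completed and partial interview counts come from the abstract; the refusal and non-contact counts are hypothetical values chosen solely so that the printed rates roughly resemble those reported, and the real AAPOR definitions include additional disposition categories.

```python
# Hedged sketch: simplified AAPOR-style outcome rates from call dispositions.
complete = 9469        # completed interviews (I) -- from the abstract
partial = 3547         # partial interviews (P) -- from the abstract
refusal = 3000         # refusals and break-offs (R) -- hypothetical
non_contact = 25500    # eligible numbers never contacted (NC) -- hypothetical

eligible = complete + partial + refusal + non_contact

response_rate = (complete + partial) / eligible                       # ~AAPOR RR2
cooperation_rate = (complete + partial) / (complete + partial + refusal)
refusal_rate = refusal / eligible
contact_rate = (complete + partial + refusal) / eligible

print(f"response {response_rate:.0%}, cooperation {cooperation_rate:.0%}, "
      f"refusal {refusal_rate:.0%}, contact {contact_rate:.0%}")
```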

  19. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Directory of Open Access Journals (Sweden)

    Kelly L'Engle

Full Text Available Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit

  20. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.

    Science.gov (United States)

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils

    2017-09-15

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association of antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period where the actual data collection was carried out. In the cross-sectional study samples were taken from 681 Danish pig farms, during five weeks from February to March 2015. The evaluation showed that the sampling

  1. Sampling methods for rumen microbial counts by Real-Time PCR techniques

    Directory of Open Access Journals (Sweden)

    S. Puppo

    2010-02-01

Full Text Available Fresh rumen samples were withdrawn from 4 cannulated buffalo females fed a fibrous diet in order to quantify bacterial concentration in the rumen by Real-Time PCR techniques. To obtain DNA of good quality from whole rumen fluid, eight (M1-M8) different pre-filtration methods (cheese cloths, glass-fibre and nylon filters) in combination with various centrifugation speeds (1000, 5000 and 14,000 rpm) were tested. Genomic DNA extraction was performed either on fresh or frozen samples (-20°C). The quantitative bacterial analysis was carried out according to a Real-Time PCR procedure for Butyrivibrio fibrisolvens reported in the literature. M5 proved to be the best sampling procedure, yielding genomic DNA of suitable quality. No differences were revealed between fresh and frozen samples.

  2. Estimation of creatinine in Urine sample by Jaffe's method

    International Nuclear Information System (INIS)

    Wankhede, Sonal; Arunkumar, Suja; Sawant, Pramilla D.; Rao, B.B.

    2012-01-01

In-vitro bioassay monitoring is based on the determination of activity concentrations in biological samples excreted from the body and is most suitable for alpha and beta emitters. A truly representative bioassay sample is one containing all the voids collected during a 24-h period; however, as this is technically difficult, overnight urine samples collected by the workers are analyzed. These overnight urine samples are collected over 10-16 h, but in the absence of any specific information a 12-h duration is assumed and the observed results are corrected accordingly to obtain the daily excretion rate. To reduce the uncertainty due to the unknown duration of sample collection, the IAEA has recommended two methods, viz. measurement of the specific gravity and of the creatinine excretion rate in the urine sample. Creatinine is the final metabolic product of creatine phosphate in the body and is excreted at a steady rate by people with normally functioning kidneys. It is, therefore, often used as a normalization factor for estimating the duration of sample collection. The present study reports the chemical procedure standardized for the estimation of creatinine and its application to urine samples collected from occupational workers. The chemical procedure for the estimation of creatinine in bioassay samples was standardized and applied successfully to bioassay samples collected from the workers. The creatinine excretion rate observed for these workers is lower than that reported in the literature. Further work is in progress to generate a data bank of creatinine excretion rates for most of the workers and to study the variability in the creatinine coefficient for the same individual based on the analysis of samples collected over different durations
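A minimal sketch of how creatinine-based normalization might work in practice is shown below; the variable names, the nominal 24-h creatinine excretion, and the sample values are assumptions for illustration, not the laboratory's actual reference data or procedure.

```python
# Hedged sketch: normalize an overnight urine bioassay result to a daily
# excretion rate using creatinine, instead of assuming a 12-h collection.
measured_activity_mBq = 4.2        # activity found in the overnight sample (hypothetical)
creatinine_in_sample_g = 0.55      # creatinine measured in the same sample (hypothetical)
nominal_daily_creatinine_g = 1.5   # assumed reference 24-h creatinine excretion

# Fraction of a full day's excretion represented by this sample:
fraction_of_day = creatinine_in_sample_g / nominal_daily_creatinine_g
estimated_duration_h = 24 * fraction_of_day

daily_excretion_mBq = measured_activity_mBq / fraction_of_day
print(f"~{estimated_duration_h:.1f} h collection, "
      f"daily excretion ~{daily_excretion_mBq:.1f} mBq/d")
```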

  3. Verification of spectrophotometric method for nitrate analysis in water samples

    Science.gov (United States)

    Kurniawati, Puji; Gusrianti, Reny; Dwisiwi, Bledug Bernanti; Purbaningtias, Tri Esti; Wiyantoko, Bayu

    2017-12-01

The aim of this research was to verify the spectrophotometric method for analyzing nitrate in water samples using the APHA 2012 Section 4500 NO3-B method. The verification parameters used were: linearity, method detection limit, limit of quantitation, level of linearity, accuracy and precision. Linearity was obtained using 0 to 50 mg/L nitrate standard solutions, and the correlation coefficient of the standard calibration linear regression equation was 0.9981. The method detection limit (MDL) was determined to be 0.1294 mg/L and the limit of quantitation (LOQ) was 0.4117 mg/L. The level of linearity (LOL) was 50 mg/L, and nitrate concentrations from 10 to 50 mg/L were linear at a 99% confidence level. The accuracy, determined as a recovery value, was 109.1907%. The precision, expressed as the percent relative standard deviation (%RSD) of repeatability, was 1.0886%. The tested performance criteria showed that the methodology was verified under laboratory conditions.
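The reported verification statistics follow standard formulas; the sketch below shows how they can be computed, using illustrative calibration and replicate data rather than the study's raw measurements (the MDL uses the common t-times-standard-deviation convention, which may differ in detail from the procedure applied here).

```python
import numpy as np

# Illustrative calibration data (instrument signal vs. nitrate concentration in mg/L).
conc = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
signal = np.array([0.002, 0.110, 0.221, 0.330, 0.442, 0.548])

slope, intercept = np.polyfit(conc, signal, 1)          # linearity (least squares)
r = np.corrcoef(conc, signal)[0, 1]                     # correlation coefficient

# MDL from replicate low-level measurements (t * s, 7 replicates, 99% confidence).
low_reps_conc = np.array([0.41, 0.38, 0.43, 0.40, 0.44, 0.39, 0.42])  # mg/L
t_99_6df = 3.143
mdl = t_99_6df * np.std(low_reps_conc, ddof=1)
loq = 10.0 * np.std(low_reps_conc, ddof=1)              # one common LOQ convention

# Accuracy (recovery) and precision (%RSD) from spiked replicates.
spike_true = 20.0
spike_found = np.array([21.9, 21.7, 22.0, 21.8])
recovery = spike_found.mean() / spike_true * 100.0
rsd = spike_found.std(ddof=1) / spike_found.mean() * 100.0

print(f"r = {r:.4f}, MDL = {mdl:.4f} mg/L, LOQ = {loq:.4f} mg/L, "
      f"recovery = {recovery:.2f}%, %RSD = {rsd:.2f}%")
```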

  4. Comparing two sampling methods to engage hard-to-reach communities in research priority setting

    Directory of Open Access Journals (Sweden)

    Melissa A. Valerio

    2016-10-01

Full Text Available Abstract Background Effective community-partnered and patient-centered outcomes research needs to address community priorities. However, optimal sampling methods to engage stakeholders from hard-to-reach, vulnerable communities to generate research priorities have not been identified. Methods In two similar rural, largely Hispanic communities, a community advisory board guided recruitment of stakeholders affected by chronic pain using a different method in each community: (1) snowball sampling, a chain-referral method, or (2) purposive sampling to recruit diverse stakeholders. In both communities, three groups of stakeholders attended a series of three facilitated meetings to orient, brainstorm, and prioritize ideas (9 meetings/community). Using mixed methods analysis, we compared stakeholder recruitment and retention as well as priorities from both communities’ stakeholders on mean ratings of their ideas based on importance and feasibility for implementation in their community. Results Of 65 eligible stakeholders in one community recruited by snowball sampling, 55 (85 %) consented, 52 (95 %) attended the first meeting, and 36 (65 %) attended all 3 meetings. In the second community, the purposive sampling method was supplemented by convenience sampling to increase recruitment. Of 69 stakeholders recruited by this combined strategy, 62 (90 %) consented, 36 (58 %) attended the first meeting, and 26 (42 %) attended all 3 meetings. Snowball sampling recruited more Hispanics and disabled persons (all P < 0.05). Despite differing recruitment strategies, stakeholders from the two communities identified largely similar ideas for research, focusing on non-pharmacologic interventions for management of chronic pain. Ratings on importance and feasibility for community implementation differed only on the importance of massage services (P = 0.045), which was higher for the purposive/convenience sampling group and for city improvements

  5. An economic passive sampling method to detect particulate pollutants using magnetic measurements.

    Science.gov (United States)

    Cao, Liwan; Appel, Erwin; Hu, Shouyun; Ma, Mingming

    2015-10-01

Identifying particulate matter (PM) emitted from industrial processes into the atmosphere is an important issue in environmental research. This paper presents a passive sampling method using simple artificial samplers that maintains the advantage of bio-monitoring, but overcomes some of its disadvantages. The samplers were tested in a heavily polluted area (Linfen, China) and compared to results from leaf samples. Spatial variations of magnetic susceptibility from artificial passive samplers and leaf samples show very similar patterns. Scanning electron microscopy suggests that the collected PM are mostly in the range of 2-25 μm; the frequent occurrence of spherical shapes indicates that industrial combustion dominates PM emission. Magnetic properties around power plants show different features than other plants. This sampling method provides a suitable and economical tool for semi-quantifying the temporal and spatial distribution of air quality; the samplers can be installed in a regular grid and calibrated against the weight of collected PM. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. SAMPLE RESULTS FROM THE INTEGRATED SALT DISPOSITION PROGRAM MACROBATCH 4 TANK 21H QUALIFICATION SAMPLES

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T.; Fink, S.

    2011-06-22

Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H to qualify them for use in the Integrated Salt Disposition Program (ISDP) Batch 4 processing. All sample results agree with expectations based on prior analyses where available. No issues with the projected Salt Batch 4 strategy are identified. This revision includes additional data points that were not available in the original issue of the document, such as additional plutonium results, the results of the monosodium titanate (MST) sorption test, and the extraction, scrub, and strip (ESS) test. This report covers the revision to the Tank 21H qualification sample results for Macrobatch (Salt Batch) 4 of the Integrated Salt Disposition Program (ISDP). A previous document covers the initial characterization, which includes results for a number of non-radiological analytes. These results were used to perform aluminum solubility modeling to determine the hydroxide needs for Salt Batch 4 to prevent the precipitation of solids. Sodium hydroxide was then added to Tank 21 and additional samples were pulled for the analyses discussed in this report. This work was specified by a Task Technical Request and by a Task Technical and Quality Assurance Plan (TTQAP).

  7. Tank 241-U-104 headspace gas and vapor characterization results from samples collected on July 16, 1996

    International Nuclear Information System (INIS)

    Pool, K.H.; Evans, J.C.; Hayes, J.C.; Mitroshkov, A.V.; Edwards, J.A.; Julya, J.L.; Thornton, B.M.; Fruchter, J.S.; Silvers, K.L.

    1997-08-01

This report presents the results from analyses of samples taken from the headspace of waste storage tank 241-U-104 (Tank U-104) at the Hanford Site in Washington State. Tank headspace samples collected by Westinghouse Hanford Company (WHC) were analyzed by Pacific Northwest National Laboratory (PNNL) to determine headspace concentrations of selected non-radioactive analytes. Analyses were performed by the Vapor Analytical Laboratory (VAL) at PNNL. Vapor concentrations from sorbent trap samples are based on measured sample volumes provided by WHC. No analytes were determined to be above the immediate notification limits specified by the sampling and analysis plan. None of the flammable constituents were present at concentrations above the analytical instrument detection limits. Total headspace flammability was estimated to be <0.108% of the lower flammability limit. Average measured concentrations of targeted gases, inorganic vapors, and selected organic vapors are provided in a table. A summary of experimental methods, including sampling methodology, analytical procedures, and quality assurance and control methods, is presented in Section 2.0. Detailed descriptions of the analytical results are provided in Section 3.0

  8. Sample similarity analysis of angles of repose based on experimental results for DEM calibration

    Science.gov (United States)

    Tan, Yuan; Günthner, Willibald A.; Kessler, Stephan; Zhang, Lu

    2017-06-01

As a fundamental material property, the particle-particle friction coefficient is usually calculated based on the angle of repose, which can be obtained experimentally. In the present study, the bottomless cylinder test was carried out to investigate this friction coefficient for a biomass material, willow chips. Because of their irregular shape and varying particle size distribution, calculation of the angle becomes less applicable and less decisive. In previous studies, only one section of these uneven slopes is chosen in most cases, although standard methods for defining a representative section are barely found. Hence, we present an efficient and reliable method based on a new technology, 3D scanning, which is used to digitize the surface of the heap and generate its point cloud. Two tangential lines of any selected section are then calculated through linear least-squares regression (LLSR), such that the left and right angles of repose of a pile can be derived. As a next step, a number of sections were randomly selected and the calculations repeated correspondingly to obtain a sample of angles, which was plotted in Cartesian coordinates as a scatter diagram. Subsequently, different samples were acquired through various selections of sections. By analyzing the similarities and differences between these samples, the reliability of the proposed method was verified. These preliminary results provide a realistic criterion for reducing the deviation between experiment and simulation caused by the random selection of a single angle, and they will be compared with simulation results in the future.
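A sketch of the section-wise computation described above is given below: the left and right flanks of a 2-D heap profile are fitted by ordinary least squares and their slopes converted to angles. The profile data, the apex-based split of the section, and the function name are illustrative assumptions, not the authors' code.

```python
import numpy as np

def repose_angles(x, z, apex_x=None):
    """Fit the left and right flanks of a heap profile (x, z) by least squares
    and return the left and right angles of repose in degrees."""
    x, z = np.asarray(x, float), np.asarray(z, float)
    if apex_x is None:
        apex_x = x[np.argmax(z)]            # split the section at its highest point
    left, right = x <= apex_x, x >= apex_x
    slope_left = np.polyfit(x[left], z[left], 1)[0]
    slope_right = np.polyfit(x[right], z[right], 1)[0]
    return (np.degrees(np.arctan(abs(slope_left))),
            np.degrees(np.arctan(abs(slope_right))))

# One section extracted from the scanned point cloud (hypothetical profile, in metres).
x = np.linspace(-0.85, 0.85, 41)
z = 0.6 - 0.7 * np.abs(x) + np.random.default_rng(1).normal(0.0, 0.01, x.size)
print(repose_angles(x, z))   # both angles should be close to arctan(0.7), about 35 degrees
```

Repeating this over many randomly chosen sections of the same heap yields the sample of angles whose scatter the study analyzes.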

  9. Sample similarity analysis of angles of repose based on experimental results for DEM calibration

    Directory of Open Access Journals (Sweden)

    Tan Yuan

    2017-01-01

Full Text Available As a fundamental material property, the particle-particle friction coefficient is usually calculated based on the angle of repose, which can be obtained experimentally. In the present study, the bottomless cylinder test was carried out to investigate this friction coefficient for a biomass material, willow chips. Because of their irregular shape and varying particle size distribution, calculation of the angle becomes less applicable and less decisive. In previous studies, only one section of these uneven slopes is chosen in most cases, although standard methods for defining a representative section are barely found. Hence, we present an efficient and reliable method based on a new technology, 3D scanning, which is used to digitize the surface of the heap and generate its point cloud. Two tangential lines of any selected section are then calculated through linear least-squares regression (LLSR), such that the left and right angles of repose of a pile can be derived. As a next step, a number of sections were randomly selected and the calculations repeated correspondingly to obtain a sample of angles, which was plotted in Cartesian coordinates as a scatter diagram. Subsequently, different samples were acquired through various selections of sections. By analyzing the similarities and differences between these samples, the reliability of the proposed method was verified. These preliminary results provide a realistic criterion for reducing the deviation between experiment and simulation caused by the random selection of a single angle, and they will be compared with simulation results in the future.

  10. Turbidity threshold sampling: Methods and instrumentation

    Science.gov (United States)

    Rand Eads; Jack Lewis

    2001-01-01

    Traditional methods for determining the frequency of suspended sediment sample collection often rely on measurements, such as water discharge, that are not well correlated to sediment concentration. Stream power is generally not a good predictor of sediment concentration for rivers that transport the bulk of their load as fines, due to the highly variable routing of...

  11. Solid phase microextraction headspace sampling of chemical warfare agent contaminated samples : method development for GC-MS analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jackson Lepage, C.R.; Hancock, J.R. [Defence Research and Development Canada, Medicine Hat, AB (Canada); Wyatt, H.D.M. [Regina Univ., SK (Canada)

    2004-07-01

    Defence R and D Canada-Suffield (DRDC-Suffield) is responsible for analyzing samples that are suspected to contain chemical warfare agents, either collected by the Canadian Forces or by first-responders in the event of a terrorist attack in Canada. The analytical techniques used to identify the composition of the samples include gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), Fourier-transform infrared spectroscopy (FT-IR) and nuclear magnetic resonance spectroscopy. GC-MS and LC-MS generally require solvent extraction and reconcentration, thereby increasing sample handling. The authors examined analytical techniques which reduce or eliminate sample manipulation. In particular, this paper presented a screening method based on solid phase microextraction (SPME) headspace sampling and GC-MS analysis for chemical warfare agents such as mustard, sarin, soman, and cyclohexyl methylphosphonofluoridate in contaminated soil samples. SPME is a method which uses small adsorbent polymer coated silica fibers that trap vaporous or liquid analytes for GC or LC analysis. Collection efficiency can be increased by adjusting sampling time and temperature. This method was tested on two real-world samples, one from excavated chemical munitions and the second from a caustic decontamination mixture. 7 refs., 2 tabs., 3 figs.

  12. Comparability among four invertebrate sampling methods and two multimetric indexes, Fountain Creek Basin, Colorado, 2010–2012

    Science.gov (United States)

    Bruce, James F.; Roberts, James J.; Zuellig, Robert E.

    2018-05-24

    The U.S. Geological Survey (USGS), in cooperation with Colorado Springs City Engineering and Colorado Springs Utilities, analyzed previously collected invertebrate data to determine the comparability among four sampling methods and two versions (2010 and 2017) of the Colorado Benthic Macroinvertebrate Multimetric Index (MMI). For this study, annual macroinvertebrate samples were collected concurrently (in space and time) at 15 USGS surface-water gaging stations in the Fountain Creek Basin from 2010 to 2012 using four sampling methods. The USGS monitoring project in the basin uses two of the methods and the Colorado Department of Public Health and Environment recommends the other two. These methods belong to two distinct sample types, one that targets single habitats and one that targets multiple habitats. The study results indicate that there are significant differences in MMI values obtained from the single-habitat and multihabitat sample types but methods from each program within each sample type produced comparable values. This study also determined that MMI values calculated by different versions of the Colorado Benthic Macroinvertebrate MMI are indistinguishable. This indicates that the Colorado Department of Public Health and Environment methods are comparable with the USGS monitoring project methods for single-habitat and multihabitat sample types. This report discusses the direct application of the study results to inform the revision of the existing USGS monitoring project in the Fountain Creek Basin.

  13. Measurement of cerebral blood flow by the blood sampling method using 99mTc-ECD. Simultaneous scintigram scanning of arterial blood samples and the brain with a gamma camera

    International Nuclear Information System (INIS)

    Hachiya, Takenori; Inugami, Atsushi; Iida, Hidehiro; Mizuta, Yoshihiko; Kawakami, Takeshi; Inoue, Minoru

    1999-01-01

To measure regional cerebral blood flow (rCBF) by blood sampling using 99mTc-ECD, we devised a method of measuring the radioactive concentration in an arterial blood sample with a gamma camera. In this method the head and a blood sample are placed within the same visual field to record the SPECT data of both specimens simultaneously. An evaluation of the counting-rate performance, applying a 30-hour decay method with a 99mTc solution, showed that although this method is not comparable to a well-type scintillation counter, in clinical cases the radioactive concentration in the arterial blood sample remained well within the dynamic range. In addition, examination of the influence of scattered radiation from the brain by the dilution method showed that it was negligible at distances of more than 7.5 cm between the brain and the arterial blood sample. In the present study we placed a head-shaped phantom next to the sample. The results of the examinations suggested that this method is suitable for clinical application, and because it does not require a well-type scintillation counter, it is expected to find wide application. (author)

  14. A proposal on alternative sampling-based modeling method of spherical particles in stochastic media for Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Lee, Jae Yong; KIm, Do Hyun; Kim, Jong Kyung [Dept. of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-08-15

    Chord length sampling method in Monte Carlo simulations is a method used to model spherical particles with random sampling technique in a stochastic media. It has received attention due to the high calculation efficiency as well as user convenience; however, a technical issue regarding boundary effect has been noted. In this study, after analyzing the distribution characteristics of spherical particles using an explicit method, an alternative chord length sampling method is proposed. In addition, for modeling in finite media, a correction method of the boundary effect is proposed. Using the proposed method, sample probability distributions and relative errors were estimated and compared with those calculated by the explicit method. The results show that the reconstruction ability and modeling accuracy of the particle probability distribution with the proposed method were considerably high. Also, from the local packing fraction results, the proposed method can successfully solve the boundary effect problem. It is expected that the proposed method can contribute to the increasing of the modeling accuracy in stochastic media.
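For orientation, the basic chord length sampling relations for spheres of radius R at packing fraction f in an infinite medium can be sketched as follows (standard CLS sampling rules; the boundary-effect correction that is the subject of this paper is not reproduced here).

```python
import numpy as np

def sample_track(R, f, n_segments, rng=None):
    """Sample alternating matrix/sphere chord segments along a ray (infinite medium).

    Matrix chords are exponential with mean 4R(1-f)/(3f); chords through a sphere
    follow p(l) = l / (2 R^2) on [0, 2R], sampled as l = 2R * sqrt(u).
    """
    rng = np.random.default_rng(rng)
    mean_matrix_chord = 4.0 * R * (1.0 - f) / (3.0 * f)
    matrix = rng.exponential(mean_matrix_chord, n_segments)
    sphere = 2.0 * R * np.sqrt(rng.random(n_segments))
    return matrix, sphere

matrix, sphere = sample_track(R=0.05, f=0.3, n_segments=200_000)
# Consistency check: the fraction of track length spent inside spheres approaches f.
est_f = sphere.sum() / (sphere.sum() + matrix.sum())
print(f"estimated packing fraction ~ {est_f:.3f}")
```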

  15. A proposal on alternative sampling-based modeling method of spherical particles in stochastic media for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Lee, Jae Yong; KIm, Do Hyun; Kim, Jong Kyung; Noh, Jae Man

    2015-01-01

    Chord length sampling method in Monte Carlo simulations is a method used to model spherical particles with random sampling technique in a stochastic media. It has received attention due to the high calculation efficiency as well as user convenience; however, a technical issue regarding boundary effect has been noted. In this study, after analyzing the distribution characteristics of spherical particles using an explicit method, an alternative chord length sampling method is proposed. In addition, for modeling in finite media, a correction method of the boundary effect is proposed. Using the proposed method, sample probability distributions and relative errors were estimated and compared with those calculated by the explicit method. The results show that the reconstruction ability and modeling accuracy of the particle probability distribution with the proposed method were considerably high. Also, from the local packing fraction results, the proposed method can successfully solve the boundary effect problem. It is expected that the proposed method can contribute to the increasing of the modeling accuracy in stochastic media

  16. Comparing Microbiome Sampling Methods in a Wild Mammal: Fecal and Intestinal Samples Record Different Signals of Host Ecology, Evolution.

    Science.gov (United States)

    Ingala, Melissa R; Simmons, Nancy B; Wultsch, Claudia; Krampis, Konstantinos; Speer, Kelly A; Perkins, Susan L

    2018-01-01

    The gut microbiome is a community of host-associated symbiotic microbes that fulfills multiple key roles in host metabolism, immune function, and tissue development. Given the ability of the microbiome to impact host fitness, there is increasing interest in studying the microbiome of wild animals to better understand these communities in the context of host ecology and evolution. Human microbiome research protocols are well established, but wildlife microbiome research is still a developing field. Currently, there is no standardized set of best practices guiding the collection of microbiome samples from wildlife. Gut microflora are typically sampled either by fecal collection, rectal swabbing, or by destructively sampling the intestinal contents of the host animal. Studies rarely include more than one sampling technique and no comparison of these methods currently exists for a wild mammal. Although some studies have hypothesized that the fecal microbiome is a nested subset of the intestinal microbiome, this hypothesis has not been formally tested. To address these issues, we examined guano (feces) and distal intestinal mucosa from 19 species of free-ranging bats from Lamanai, Belize, using 16S rRNA amplicon sequencing to compare microbial communities across sample types. We found that the diversity and composition of intestine and guano samples differed substantially. In addition, we conclude that signatures of host evolution are retained by studying gut microbiomes based on mucosal tissue samples, but not fecal samples. Conversely, fecal samples retained more signal of host diet than intestinal samples. These results suggest that fecal and intestinal sampling methods are not interchangeable, and that these two microbiotas record different information about the host from which they are isolated.

  17. Comparing Microbiome Sampling Methods in a Wild Mammal: Fecal and Intestinal Samples Record Different Signals of Host Ecology, Evolution

    Directory of Open Access Journals (Sweden)

    Melissa R. Ingala

    2018-05-01

Full Text Available The gut microbiome is a community of host-associated symbiotic microbes that fulfills multiple key roles in host metabolism, immune function, and tissue development. Given the ability of the microbiome to impact host fitness, there is increasing interest in studying the microbiome of wild animals to better understand these communities in the context of host ecology and evolution. Human microbiome research protocols are well established, but wildlife microbiome research is still a developing field. Currently, there is no standardized set of best practices guiding the collection of microbiome samples from wildlife. Gut microflora are typically sampled either by fecal collection, rectal swabbing, or by destructively sampling the intestinal contents of the host animal. Studies rarely include more than one sampling technique and no comparison of these methods currently exists for a wild mammal. Although some studies have hypothesized that the fecal microbiome is a nested subset of the intestinal microbiome, this hypothesis has not been formally tested. To address these issues, we examined guano (feces) and distal intestinal mucosa from 19 species of free-ranging bats from Lamanai, Belize, using 16S rRNA amplicon sequencing to compare microbial communities across sample types. We found that the diversity and composition of intestine and guano samples differed substantially. In addition, we conclude that signatures of host evolution are retained by studying gut microbiomes based on mucosal tissue samples, but not fecal samples. Conversely, fecal samples retained more signal of host diet than intestinal samples. These results suggest that fecal and intestinal sampling methods are not interchangeable, and that these two microbiotas record different information about the host from which they are isolated.

  18. Comparison of analytical methods for the determination of histamine in reference canned fish samples

    Science.gov (United States)

    Jakšić, S.; Baloš, M. Ž.; Mihaljev, Ž.; Prodanov Radulović, J.; Nešić, K.

    2017-09-01

    Two screening methods for histamine in canned fish, an enzymatic test and a competitive direct enzyme-linked immunosorbent assay (CD-ELISA), were compared with the reversed-phase liquid chromatography (RP-HPLC) standard method. For enzymatic and CD-ELISA methods, determination was conducted according to producers’ manuals. For RP-HPLC, histamine was derivatized with dansyl-chloride, followed by RP-HPLC and diode array detection. Results of analysis of canned fish, supplied as reference samples for proficiency testing, showed good agreement when histamine was present at higher concentrations (above 100 mg kg-1). At a lower level (16.95 mg kg-1), the enzymatic test produced some higher results. Generally, analysis of four reference samples according to CD-ELISA and RP-HPLC showed good agreement for histamine determination (r=0.977 in concentration range 16.95-216 mg kg-1) The results show that the applied enzymatic test and CD-ELISA appeared to be suitable screening methods for the determination of histamine in canned fish.

  19. Validation of EIA sampling methods - bacterial and biochemical analysis

    Digital Repository Service at National Institute of Oceanography (India)

    Sheelu, G.; LokaBharathi, P.A.; Nair, S.; Raghukumar, C.; Mohandass, C.

to temporal factors. A paired t-test between pre- and post-disturbance samples suggested that the above methods of sampling and variables like TC, protein and TOC could be used for monitoring disturbance....
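As a minimal illustration of the paired comparison mentioned above (hypothetical pre- and post-disturbance values, not the study's data):

```python
from scipy import stats

# Hedged sketch: paired t-test on pre- vs post-disturbance measurements
# (e.g., total counts, protein, or TOC at the same stations); values are hypothetical.
pre = [3.1, 2.8, 3.5, 3.0, 2.9, 3.3]
post = [2.4, 2.6, 2.9, 2.5, 2.2, 2.8]
t_stat, p_value = stats.ttest_rel(pre, post)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```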

  20. Results of the analyses of the intercomparison samples of natural dioxide SR-1

    International Nuclear Information System (INIS)

    Aigner, H.; Kuhn, E.; Deron, S.

    1980-08-01

Samples of a homogeneous powder of natural uranium dioxide, SR-1, were distributed to 37 laboratories in November 1977 for an intercomparison of the precisions and accuracies of wet chemical assays. 17 laboratories reported 18 sets of results (one laboratory applied two techniques). The analytical methods applied were: titration (11), coulometry (2), precipitation-gravimetry (1), fluorimetry (2), X-ray fluorescence (1) and neutron activation (1). Analysis of variance yields, for each combination of laboratory and technique, estimates of the measurement errors, the dissolution or treatment errors and the fluctuation of the measurements between sample bottles. Time effects have also been tested. The measurement errors vary between 0.01% and 6.4%. Eleven laboratories agree within 0.25% with the reference value. No mean obtained by wet chemical methods is biased by more than 0.4%. The biases of the other methods (fluorimetry, X-ray fluorescence and neutron activation) vary between 0.5% and 4.3%. The biases of 9 laboratories or techniques are greater than expected from their random errors. The mean bias of the fourteen wet chemical methods is equal to 0.08% U with a standard deviation of ±0.18% U

  1. Most Recent Sampling Results for Annex III Building

    Science.gov (United States)

    Contains email from Scott Miller, US EPA to Scott Kramer. Subject: Most Recent Sampling Results for Annex III Building. (2:52 PM) and Gore(TM) Surveys Analytical Results U.S. Geological Survey, Montgomery, AL.

  2. An efficient method for sampling the essential subspace of proteins

    NARCIS (Netherlands)

    Amadei, A; Linssen, A.B M; de Groot, B.L.; van Aalten, D.M.F.; Berendsen, H.J.C.

A method is presented for a more efficient sampling of the configurational space of proteins as compared to conventional sampling techniques such as molecular dynamics. The method is based on the large conformational changes in proteins revealed by the "essential dynamics" analysis. A form of
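In essence, essential dynamics diagonalizes the covariance matrix of the fitted atomic coordinates from a trajectory and keeps the few eigenvectors with the largest eigenvalues as the essential subspace; a minimal sketch with random stand-in coordinates is shown below (illustrative only, not the authors' implementation).

```python
import numpy as np

def essential_subspace(traj, n_modes=10):
    """traj: (n_frames, 3*n_atoms) array of fitted coordinates.
    Returns the leading eigenvalues and eigenvectors of the coordinate covariance
    matrix, sorted by decreasing eigenvalue (these span the essential subspace)."""
    centered = traj - traj.mean(axis=0)
    cov = np.cov(centered.T)                 # (3N x 3N) covariance of coordinates
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]
    return vals[order][:n_modes], vecs[:, order][:, :n_modes]

# Stand-in trajectory: 500 frames of a 30-atom system (random data for illustration).
rng = np.random.default_rng(0)
traj = rng.normal(size=(500, 90))
vals, vecs = essential_subspace(traj, n_modes=5)
print(vals)   # for a real protein, a few eigenvalues dominate the total fluctuation
```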

  3. Cast Stone Oxidation Front Evaluation: Preliminary Results For Samples Exposed To Moist Air

    International Nuclear Information System (INIS)

    Langton, C. A.; Almond, P. M.

    2013-01-01

The rate of oxidation is important to the long-term performance of reducing salt waste forms because the solubility of some contaminants, e.g., technetium, is a function of oxidation state. TcO4- in the salt solution is reduced to Tc(IV) and has been shown to react with ingredients in the waste form to precipitate low solubility sulfide and/or oxide phases. Upon exposure to oxygen, the compounds containing Tc(IV) oxidize to the pertechnetate ion, Tc(VII)O4-, which is very soluble. Consequently the rate of technetium oxidation front advancement into a monolith and the technetium leaching profile as a function of depth from an exposed surface are important to waste form performance and ground water concentration predictions. An approach for measuring contaminant oxidation rate (effective contaminant specific oxidation rate) based on leaching of select contaminants of concern is described in this report. In addition, the relationship between reduction capacity and contaminant oxidation is addressed. Chromate (Cr(VI)) was used as a non-radioactive surrogate for pertechnetate, Tc(VII), in Cast Stone samples prepared with 5 M Simulant. Cast Stone spiked with pertechnetate was also prepared and tested. Depth discrete subsamples spiked with Cr were cut from Cast Stone exposed to Savannah River Site (SRS) outdoor ambient temperature fluctuations and moist air. Depth discrete subsamples spiked with Tc-99 were cut from Cast Stone exposed to laboratory ambient temperature fluctuations and moist air. Similar conditions are expected to be encountered in the Cast Stone curing container. The leachability of Cr and Tc-99 and the reduction capacities, measured by the Angus-Glasser method, were determined for each subsample as a function of depth from the exposed surface. The results obtained to date were focused on continued method development and are preliminary and apply to the sample composition and curing / exposure conditions described in this report. The Cr oxidation front

  4. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

The uncertainty evaluation with the statistical method is performed by repeating the transport calculation with sampling of the directly perturbed nuclear data. Hence, a reliable uncertainty result can be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling the cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some methods for handling this have been noted; however, they can distort the distribution of the sampled cross sections. In this study, a sampling method for the nuclear data using a lognormal distribution is proposed. After that, criticality calculations with the sampled nuclear data are performed and the results are compared with those from the normal distribution conventionally used in previous studies. In this study, the statistical sampling method of the cross sections with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors. Also, a stochastic cross section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross section sampling was pursued with the normal and lognormal distributions. The uncertainties, which are caused by covariance of (n,.) cross sections, were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
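A sketch of the positivity-preserving idea is shown below: lognormal parameters are chosen to reproduce a cross section's mean and relative standard deviation, so every sampled value is strictly positive, unlike draws from a normal distribution with the same moments. Variable names and the example numbers are illustrative.

```python
import numpy as np

def sample_lognormal_xs(mean, rel_std, size, rng=None):
    """Sample cross sections from a lognormal distribution whose mean and
    standard deviation match the given nuclear-data mean and relative uncertainty."""
    rng = np.random.default_rng(rng)
    sigma2 = np.log(1.0 + rel_std**2)        # ln-space variance from moment matching
    mu = np.log(mean) - 0.5 * sigma2         # ln-space mean
    return rng.lognormal(mean=mu, sigma=np.sqrt(sigma2), size=size)

xs_normal = np.random.default_rng(0).normal(1.0, 0.6, 100_000)   # can go negative
xs_logn = sample_lognormal_xs(1.0, 0.6, 100_000, rng=0)          # always positive
print((xs_normal < 0).mean(),                 # fraction of unphysical negative samples
      xs_logn.min(), xs_logn.mean(), xs_logn.std())
```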

  5. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2013-01-01

The uncertainty evaluation with the statistical method is performed by repeating the transport calculation with sampling of the directly perturbed nuclear data. Hence, a reliable uncertainty result can be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling the cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some methods for handling this have been noted; however, they can distort the distribution of the sampled cross sections. In this study, a sampling method for the nuclear data using a lognormal distribution is proposed. After that, criticality calculations with the sampled nuclear data are performed and the results are compared with those from the normal distribution conventionally used in previous studies. In this study, the statistical sampling method of the cross sections with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors. Also, a stochastic cross section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross section sampling was pursued with the normal and lognormal distributions. The uncertainties, which are caused by covariance of (n,.) cross sections, were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis

  6. [Establishment and assessment of QA/QC method for sampling and analysis of atmosphere background CO2].

    Science.gov (United States)

    Liu, Li-xin; Zhou, Ling-xi; Xia, Ling-jun; Wang, Hong-yang; Fang, Shuang-xi

    2014-12-01

To strengthen the scientific management and sharing of greenhouse gas data obtained from atmospheric background stations in China, it is important to ensure the standardization of quality assurance and quality control methods for background CO2 sampling and analysis. Based on the greenhouse gas sampling and observation experience of the CMA, and using portable sampling observation and the WS-CRDS analysis technique as an example, the quality assurance measures for atmospheric CO2 sampling and observation at the Waliguan station (Qinghai), the glass bottle quality assurance measures, the systematic quality control method during sample analysis, the correction method during data processing, as well as the data grading quality markers and the data fitting interpolation method are systematically introduced. Finally, using this approach, the CO2 sampling and observation data at the atmospheric background stations in 3 typical regions were processed and the concentration variation characteristics were analyzed, indicating that this method can well capture the influences of regional and local environmental factors on the observation results and reflect the characteristics of natural and human activities in an objective and accurate way.

  7. Brachytherapy dose-volume histogram computations using optimized stratified sampling methods

    International Nuclear Information System (INIS)

    Karouzakis, K.; Lahanas, M.; Milickovic, N.; Giannouli, S.; Baltas, D.; Zamboglou, N.

    2002-01-01

A stratified sampling method for the efficient repeated computation of dose-volume histograms (DVHs) in brachytherapy is presented, as used for anatomy-based brachytherapy optimization methods. The aim of the method is to reduce the number of sampling points required for the calculation of DVHs for the body and the PTV. From the DVHs, quantities such as the conformity index (COIN) and COIN integrals are derived. This is achieved by using piecewise uniformly distributed sampling points, with the density in each region obtained from a survey of the gradients or the variance of the dose distribution in that region. The shape of the sampling regions is adapted to the patient anatomy and to the shape and size of the implant. For the application of this method a single preprocessing step is necessary, which requires only a few seconds. Ten clinical implants were used to study the appropriate number of sampling points, given a required accuracy for quantities such as cumulative DVHs, COIN indices and COIN integrals. We found that DVHs of very large tissue volumes surrounding the PTV, and also COIN distributions, can be obtained using 5-10 times fewer sampling points than with uniformly distributed points
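A toy sketch of the stratified idea follows: the volume is split into regions, more points are allocated to regions with larger dose variability (Neyman-style allocation from a small pilot sample), and each region's contribution to the cumulative DVH is weighted by its volume. The dose field and the two cubic regions are illustrative assumptions, not the paper's anatomy-adapted regions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dose(p):                        # illustrative point-source-like dose field
    return 1.0 / (0.01 + np.sum(p**2, axis=1))

def sample_inner(n):                # inner cube around the "implant", side 2
    return rng.uniform(-1.0, 1.0, size=(n, 3))

def sample_shell(n):                # cubic shell between the inner cube and side 8
    pts = np.empty((0, 3))
    while len(pts) < n:
        cand = rng.uniform(-4.0, 4.0, size=(2 * n, 3))
        cand = cand[np.abs(cand).max(axis=1) > 1.0]   # reject points in the inner cube
        pts = np.vstack([pts, cand])
    return pts[:n]

regions = [(sample_inner, 2.0**3), (sample_shell, 8.0**3 - 2.0**3)]

# Pilot survey of dose variability per region, then Neyman-style allocation.
pilot_std = np.array([dose(sampler(2000)).std() for sampler, _ in regions])
vols = np.array([v for _, v in regions])
alloc = vols * pilot_std
n_per_region = np.maximum((50_000 * alloc / alloc.sum()).astype(int), 1000)

# Stratified cumulative DVH: volume fraction receiving at least each dose level.
dose_bins = np.linspace(0.0, 5.0, 51)
dvh = np.zeros_like(dose_bins)
for (sampler, vol), n in zip(regions, n_per_region):
    d = dose(sampler(n))
    dvh += (d[None, :] >= dose_bins[:, None]).mean(axis=1) * vol / vols.sum()
print(np.round(dvh[:10], 4))
```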

  8. Evaluation of statistical methods for quantifying fractal scaling in water-quality time series with irregular sampling

    Science.gov (United States)

    Zhang, Qian; Harman, Ciaran J.; Kirchner, James W.

    2018-02-01

    River water-quality time series often exhibit fractal scaling, which here refers to autocorrelation that decays as a power law over some range of scales. Fractal scaling presents challenges to the identification of deterministic trends because (1) fractal scaling has the potential to lead to false inference about the statistical significance of trends and (2) the abundance of irregularly spaced data in water-quality monitoring networks complicates efforts to quantify fractal scaling. Traditional methods for estimating fractal scaling - in the form of spectral slope (β) or other equivalent scaling parameters (e.g., Hurst exponent) - are generally inapplicable to irregularly sampled data. Here we consider two types of estimation approaches for irregularly sampled data and evaluate their performance using synthetic time series. These time series were generated such that (1) they exhibit a wide range of prescribed fractal scaling behaviors, ranging from white noise (β = 0) to Brown noise (β = 2) and (2) their sampling gap intervals mimic the sampling irregularity (as quantified by both the skewness and mean of gap-interval lengths) in real water-quality data. The results suggest that none of the existing methods fully account for the effects of sampling irregularity on β estimation. First, the results illustrate the danger of using interpolation for gap filling when examining autocorrelation, as the interpolation methods consistently underestimate or overestimate β under a wide range of prescribed β values and gap distributions. Second, the widely used Lomb-Scargle spectral method also consistently underestimates β. A previously published modified form, using only the lowest 5 % of the frequencies for spectral slope estimation, has very poor precision, although the overall bias is small. Third, a recent wavelet-based method, coupled with an aliasing filter, generally has the smallest bias and root-mean-squared error among all methods for a wide range of
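For context, the Lomb-Scargle route to a spectral slope estimate works roughly as sketched below: compute the periodogram of the irregularly sampled series at a set of frequencies and fit a line in log-log space. The synthetic data and parameter choices are illustrative and, as the paper notes, this estimator tends to be biased for irregular sampling.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)

# Irregularly sampled synthetic record: random sampling times over ~10 years (days).
t = np.sort(rng.uniform(0.0, 3650.0, 400))
y = np.cumsum(rng.normal(scale=np.sqrt(np.diff(t, prepend=0.0))))   # Brownian path, beta ~ 2
y = y - y.mean()

# Lomb-Scargle periodogram on logarithmically spaced frequencies (cycles/day).
freqs = np.logspace(np.log10(1.0 / 3650.0), np.log10(0.05), 200)
pgram = lombscargle(t, y, 2.0 * np.pi * freqs)    # scipy expects angular frequencies

# Spectral slope: power ~ f**(-beta), so fit log(power) against log(frequency).
beta = -np.polyfit(np.log10(freqs), np.log10(pgram), 1)[0]
print(f"estimated beta ~ {beta:.2f}")
```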

  9. Radiochemistry methods in DOE Methods for Evaluating Environmental and Waste Management Samples: Addressing new challenges

    International Nuclear Information System (INIS)

    Fadeff, S.K.; Goheen, S.C.; Riley, R.G.

    1994-01-01

    Radiochemistry methods in Department of Energy Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) add to the repertoire of other standard methods in support of U.S. Department of Energy environmental restoration and waste management (DOE/EM) radiochemical characterization activities. Current standard sources of radiochemistry methods are not always applicable for evaluating DOE/EM samples. Examples of current sources include those provided by the US Environmental Protection Agency, the American Society for Testing and Materials, Standard Methods for the Examination of Water and Wastewater, and Environmental Measurements Laboratory Procedures Manual (HASL-300). The applicability of these methods is generally limited to specific matrices (usually water), low-level radioactive samples, and a limited number of analytes. DOE Methods complements these current standard methods by addressing the complexities of EM characterization needs. The process for determining DOE/EM radiochemistry characterization needs is discussed. In this context of DOE/EM needs, the applicability of other sources of standard radiochemistry methods is defined, and gaps in methodology are identified. Current methods in DOE Methods and the EM characterization needs they address are discussed. Sources of new methods and the methods incorporation process are discussed. The means for individuals to participate in (1) identification of DOE/EM needs, (2) the methods incorporation process, and (3) submission of new methods are identified

  10. A comprehensive comparison of perpendicular distance sampling methods for sampling downed coarse woody debris

    Science.gov (United States)

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams

    2013-01-01

    Many new methods for sampling down coarse woody debris have been proposed in the last dozen or so years. One of the most promising in terms of field application, perpendicular distance sampling (PDS), has several variants that have been progressively introduced in the literature. In this study, we provide an overview of the different PDS variants and comprehensive...

  11. Evaluation of environmental sampling methods for detection of Salmonella enterica in a large animal veterinary hospital.

    Science.gov (United States)

    Goeman, Valerie R; Tinkler, Stacy H; Hammac, G Kenitra; Ruple, Audrey

    2018-04-01

    Environmental surveillance for Salmonella enterica can be used for early detection of contamination; thus routine sampling is an integral component of infection control programs in hospital environments. At the Purdue University Veterinary Teaching Hospital (PUVTH), the technique regularly employed in the large animal hospital for sample collection uses sterile gauze sponges for environmental sampling, which has proven labor-intensive and time-consuming. Alternative sampling methods use Swiffer brand electrostatic wipes for environmental sample collection, which are reportedly effective and efficient. It was hypothesized that use of Swiffer wipes for sample collection would be more efficient and less costly than the use of gauze sponges. A head-to-head comparison between the 2 sampling methods was conducted in the PUVTH large animal hospital and relative agreement, cost-effectiveness, and sampling efficiency were compared. There was fair agreement in culture results between the 2 sampling methods, but Swiffer wipes required less time and less physical effort to collect samples and were more cost-effective.

  12. Sample preparation and biomass determination of SRF model mixture using cryogenic milling and the adapted balance method

    Energy Technology Data Exchange (ETDEWEB)

    Schnöller, Johannes, E-mail: johannes.schnoeller@chello.at; Aschenbrenner, Philipp; Hahn, Manuel; Fellner, Johann; Rechberger, Helmut

    2014-11-15

Highlights: • An alternative sample comminution procedure for SRF is tested. • Proof of principle is shown on a SRF model mixture. • The biogenic content of the SRF is analyzed with the adapted balance method. • The novel method combines combustion analysis and a data reconciliation algorithm. • Factors for the variance of the analysis results are statistically quantified. - Abstract: The biogenic fraction of a simple solid recovered fuel (SRF) mixture (80 wt% printer paper/20 wt% high density polyethylene) is analyzed with the in-house developed adapted balance method (aBM). This fairly new approach is a combination of combustion elemental analysis (CHNS) and a data reconciliation algorithm based on successive linearisation for evaluation of the analysis results. This method shows a great potential as an alternative way to determine the biomass content in SRF. However, the employed analytical technique (CHNS elemental analysis) restricts the probed sample mass to low amounts in the range of a few hundred milligrams. This requires sample comminution to small grain sizes (<200 μm) to generate representative SRF specimen. This is not easily accomplished for certain material mixtures (e.g. SRF with rubber content) by conventional means of sample size reduction. This paper presents a proof of principle investigation of the sample preparation and analysis of an SRF model mixture with the use of cryogenic impact milling (final sample comminution) and the adapted balance method (determination of biomass content). The so derived sample preparation methodology (cutting mills and cryogenic impact milling) shows a better performance in accuracy and precision for the determination of the biomass content than one solely based on cutting mills. The results for the determination of the biogenic fraction are within 1–5% of the data obtained by the reference methods, selective dissolution method (SDM) and 14C-method (14C-M)

  13. Analysis of inconsistent source sampling in monte carlo weight-window variance reduction methods

    Directory of Open Access Journals (Sweden)

    David P. Griesheimer

    2017-09-01

    Full Text Available The application of Monte Carlo (MC) to large-scale fixed-source problems has recently become possible with new hybrid methods that automate generation of parameters for variance reduction techniques. Two common variance reduction techniques, weight windows and source biasing, have been automated and popularized by the consistent adjoint-driven importance sampling (CADIS) method. This method uses the adjoint solution from an inexpensive deterministic calculation to define a consistent set of weight windows and source particles for a subsequent MC calculation. One of the motivations for source consistency is to avoid the splitting or rouletting of particles at birth, which requires computational resources. However, it is not always possible or desirable to implement such consistency, which results in inconsistent source biasing. This paper develops an original framework that mathematically expresses the coupling of the weight window and source biasing techniques, allowing the authors to explore the impact of inconsistent source sampling on the variance of MC results. A numerical experiment supports this new framework and suggests that certain classes of problems may be relatively insensitive to inconsistent source sampling schemes with moderate levels of splitting and rouletting.
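
    As a minimal illustration of the weight-window technique discussed above, the sketch below applies a textbook splitting/rouletting rule to a single particle weight. It is a generic scheme with made-up window bounds, not the CADIS-consistent or inconsistent-source framework developed in the paper.

      import random

      def apply_weight_window(weight, w_low, w_high, rng=random):
          """Split or roulette one particle weight against a weight window
          (generic textbook rule; bounds here are illustrative only)."""
          if weight > w_high:
              # Split into n copies so each copy falls inside the window.
              n = min(int(weight / w_high) + 1, 10)
              return [weight / n] * n
          if weight < w_low:
              # Russian roulette: survive with probability weight / w_survive.
              w_survive = 0.5 * (w_low + w_high)
              if rng.random() < weight / w_survive:
                  return [w_survive]
              return []
          return [weight]

      print(apply_weight_window(5.0, w_low=0.5, w_high=2.0))   # splitting
      print(apply_weight_window(0.1, w_low=0.5, w_high=2.0))   # rouletting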

  14. Comparison of Spot and Time Weighted Averaging (TWA) Sampling with SPME-GC/MS Methods for Trihalomethane (THM) Analysis

    Directory of Open Access Journals (Sweden)

    Don-Roger Parkinson

    2016-02-01

    Full Text Available Water samples were collected and analyzed for conductivity, pH, temperature and trihalomethanes (THMs) during the fall of 2014 at two monitored municipal drinking water source ponds. Both spot (or grab) and time weighted average (TWA) sampling methods were assessed over the same two-day sampling time period. For spot sampling, replicate samples were taken at each site and analyzed within 12 h of sampling by both headspace (HS-) and direct (DI-) solid phase microextraction (SPME) sampling/extraction methods followed by Gas Chromatography/Mass Spectrometry (GC/MS). For TWA, a two-day passive on-site TWA sampling was carried out at the same sampling points in the ponds. All SPME sampling methods undertaken used a 65-µm PDMS/DVB SPME fiber, which was found optimal for THM sampling. Sampling conditions were optimized in the laboratory using calibration standards of chloroform, bromoform, bromodichloromethane, dibromochloromethane, 1,2-dibromoethane and 1,2-dichloroethane, prepared in aqueous solutions from analytical grade samples. Calibration curves for all methods, with R2 values ranging from 0.985–0.998 (N = 5) over the linear quantitation range of 3–800 ppb, were achieved. The different sampling methods were compared for quantification of the water samples, and the results showed that the DI- and TWA-sampling methods gave better data and analytical metrics. Addition of 10% wt./vol. of (NH4)2SO4 salt to the sampling vial was found to aid extraction of THMs by increasing GC peak areas by about 10%, which resulted in lower detection limits for all techniques studied. However, for on-site TWA analysis of THMs in natural waters, the ionic strength conditions of the calibration standard(s) must be carefully matched to natural water conditions to properly quantitate THM concentrations. The data obtained from the TWA method may better reflect actual natural water conditions.
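
    The calibration step described above (linear curves with R2 values of 0.985–0.998 over 3–800 ppb) can be reproduced in outline with an ordinary least-squares fit. The peak areas below are invented for illustration and do not come from the study.

      import numpy as np

      # Hypothetical THM calibration: standard concentrations (ppb) vs GC/MS
      # peak areas. Values are illustrative only.
      conc = np.array([3.0, 50.0, 200.0, 500.0, 800.0])
      area = np.array([1.1e4, 1.9e5, 7.6e5, 1.9e6, 3.0e6])

      slope, intercept = np.polyfit(conc, area, 1)
      pred = slope * conc + intercept
      r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

      unknown_area = 4.2e5                       # a measured sample peak area
      unknown_conc = (unknown_area - intercept) / slope
      print(f"R2 = {r2:.4f}, unknown sample ~ {unknown_conc:.0f} ppb")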

  15. A multi-dimensional sampling method for locating small scatterers

    International Nuclear Information System (INIS)

    Song, Rencheng; Zhong, Yu; Chen, Xudong

    2012-01-01

    A multiple signal classification (MUSIC)-like multi-dimensional sampling method (MDSM) is introduced to locate small three-dimensional scatterers using electromagnetic waves. The indicator is built with the most stable part of signal subspace of the multi-static response matrix on a set of combinatorial sampling nodes inside the domain of interest. It has two main advantages compared to the conventional MUSIC methods. First, the MDSM is more robust against noise. Second, it can work with a single incidence even for multi-scatterers. Numerical simulations are presented to show the good performance of the proposed method. (paper)
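
    To make the MUSIC-type indicator concrete, the toy sketch below builds a multi-static response matrix for two point scatterers under a scalar Born model, takes its SVD, and peaks an indicator where test steering vectors are orthogonal to the noise subspace. The geometry, wavenumber and scatterer positions are assumptions for illustration; the paper's combinatorial multi-dimensional sampling nodes are not reproduced.

      import numpy as np

      k = 2 * np.pi                                         # wavenumber (assumed)
      tx = np.c_[np.linspace(-2, 2, 16), np.full(16, 5.0)]  # transceiver array

      def steering(point):
          d = np.linalg.norm(tx - point, axis=1)
          return np.exp(1j * k * d) / d                     # scalar Green-like vector

      scatterers = [np.array([0.3, 0.8]), np.array([-0.7, -0.4])]
      K = sum(np.outer(steering(s), steering(s)) for s in scatterers)

      U, _, _ = np.linalg.svd(K)
      noise = U[:, len(scatterers):]   # noise subspace (scatterer count assumed known)

      best, best_val = None, 0.0
      for x in np.linspace(-1.5, 1.5, 61):
          for y in np.linspace(-1.5, 1.5, 61):
              g = steering(np.array([x, y]))
              val = 1.0 / (np.linalg.norm(noise.conj().T @ g) + 1e-12)
              if val > best_val:
                  best, best_val = (x, y), val
      print("indicator peaks near", best)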

  16. Efficiency of snake sampling methods in the Brazilian semiarid region.

    Science.gov (United States)

    Mesquita, Paula C M D; Passos, Daniel C; Cechin, Sonia Z

    2013-09-01

    The choice of sampling methods is a crucial step in every field survey in herpetology. In countries where time and financial support are limited, the choice of methods is critical. The methods used to sample snakes are often chosen without objective criteria, with tradition apparently carrying more weight than suitability, and consequently studies using non-standardized methods are frequently found in the literature. We compared four commonly used methods for sampling snake assemblages in a semiarid area in Brazil, evaluating the cost-benefit of each method in terms of the number of individuals and species captured, time, and financial investment. We found that pitfall traps were the least effective method in all aspects evaluated and were not complementary to the other methods in terms of species abundance and assemblage structure. We conclude that methods can only be considered complementary if they are standardized to the objectives of the study. The use of pitfall traps in short-term surveys of the snake fauna in areas with shrubby vegetation and stony soil is not recommended.

  17. Comparison of three methods for recovery of Brucella canis DNA from canine blood samples.

    Science.gov (United States)

    Batinga, Maria Cryskely A; Dos Santos, Jaíne C; Lima, Julia T R; Bigotto, Maria Fernanda D; Muner, Kerstin; Faita, Thalita; Soares, Rodrigo M; da Silva, David A V; Oliveira, Trícia M F S; Ferreira, Helena L; Diniz, Jaqueline A; Keid, Lara B

    2017-12-01

    Brucella canis, a gram-negative, facultative intracellular and zoonotic bacterium, causes canine brucellosis. Direct methods are the most appropriate for the detection of canine brucellosis, and bacterial isolation from blood samples has been employed as the gold-standard method. However, due to the delay in obtaining results and the biological risk of bacterial culturing, the polymerase chain reaction (PCR) has been successfully used as an alternative method for the diagnosis of the infection. Sample preparation is a key step for successful PCR, and protocols that provide high DNA yield and purity are recommended to ensure high diagnostic sensitivity. The objective of this study was to evaluate the performance of PCR for the diagnosis of B. canis infection in 36 dogs by testing DNA from whole blood obtained through different extraction and purification protocols. Methods 1 and 2 were based on a commercial kit, using protocols recommended for DNA purification from whole blood and tissue samples, respectively. Method 3 was an in-house method based on enzymatic lysis and purification using organic solvents. The results of the PCR on samples obtained through the three different DNA extraction protocols were compared to blood culture. Of the 36 dogs, 13 (36.1%) were positive by blood culturing, while nine (25.0%), 14 (38.8%), and 15 (41.6%) were positive by PCR after DNA extraction using methods 1, 2 and 3, respectively. PCR performed on DNA purified by Method 2 was as efficient as blood culturing and as PCR performed on DNA purified with the in-house method, but had the advantage of being less laborious and is, therefore, a suitable alternative for direct B. canis detection in dogs. Copyright © 2017. Published by Elsevier B.V.

  18. The currently used commercial DNA-extraction methods give different results of clostridial and actinobacterial populations derived from human fecal samples.

    Science.gov (United States)

    Maukonen, Johanna; Simões, Catarina; Saarela, Maria

    2012-03-01

    Recently several human health-related microbiota studies have had partly contradictory results. As some differences may be explained by the methodologies applied, we evaluated how different storage conditions and commonly used DNA-extraction kits affect the bacterial composition, diversity, and numbers of human fecal microbiota. According to our results, the DNA-extraction did not affect the diversity, composition, or quantity of Bacteroides spp., whereas after a week's storage at -20 °C, the numbers of Bacteroides spp. were 1.6-2.5 log units lower (P < 0.05). The numbers of the Eubacterium rectale (Erec)-group, Clostridium leptum group, bifidobacteria, and Atopobium group were 0.5-4 log units higher (P < 0.05) after mechanical DNA-extraction as detected with qPCR, regardless of storage. Furthermore, the bacterial composition of the Erec-group differed significantly after different DNA-extractions; after enzymatic DNA-extraction, the most prevalent genera detected were Roseburia (39% of clones) and Coprococcus (10%), whereas after mechanical DNA-extraction, the most prevalent genera were Blautia (30%), Coprococcus (13%), and Dorea (10%). According to our results, rigorous mechanical lysis enables detection of higher bacterial numbers and diversity from human fecal samples. As it was shown that the results for clostridial and actinobacterial populations are highly dependent on the DNA-extraction methods applied, the use of different DNA-extraction protocols may explain the contradictory results previously obtained. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  19. Comparison of the analysis result between two laboratories using different methods

    International Nuclear Information System (INIS)

    Sri Murniasih; Agus Taftazani

    2017-01-01

    The analysis results for a volcanic ash sample were compared between two laboratories using different analysis methods. The research aims to improve testing laboratory quality and to foster cooperation with a testing laboratory from another country. Samples were tested at the Center for Accelerator of Science and Technology (CAST)-NAA laboratory using NAA, and at the University of Texas (UT), USA, using ICP-MS and ENAA methods. Of the 12 target elements, the CAST-NAA laboratory was able to report results for 11. The comparison shows that the results for the K, Mn, Ti and Fe elements from the two laboratories agree very closely, as indicated by the RSD values and correlation coefficients of the two laboratories' results. Examination of the differences shows that the results for the Al, Na, K, Fe, V, Mn, Ti, Cr and As elements are not significantly different between the laboratories. Of the 11 elements reported, only Zn showed significantly different values between the two laboratories. (author)

  20. Preparation of samples for leaf architecture studies, a method for mounting cleared leaves

    Science.gov (United States)

    Vasco, Alejandra; Thadeo, Marcela; Conover, Margaret; Daly, Douglas C.

    2014-01-01

    • Premise of the study: Several recent waves of interest in leaf architecture have shown an expanding range of approaches and applications across a number of disciplines. Despite this increased interest, examination of existing archives of cleared and mounted leaves shows that current methods for mounting, in particular, yield unsatisfactory results and deterioration of samples over relatively short periods. Although techniques for clearing and staining leaves are numerous, published techniques for mounting leaves are scarce. • Methods and Results: Here we present a complete protocol and recommendations for clearing, staining, and imaging leaves, and, most importantly, a method to permanently mount cleared leaves. • Conclusions: The mounting protocol is faster than other methods, inexpensive, and straightforward; moreover, it yields clear and permanent samples that can easily be imaged, scanned, and stored. Specimens mounted with this method preserve well, with leaves that were mounted more than 35 years ago showing no signs of bubbling or discoloration. PMID:25225627

  1. Methods for Characterisation of unknown Suspect Radioactive Samples

    International Nuclear Information System (INIS)

    Sahagia, M.; Grigorescu, E.L.; Luca, A.; Razdolescu, A.C.; Ivan, C.

    2001-01-01

    Full text: The paper presents various identification and measurement methods used for the examination of a wide variety of suspect radioactive materials whose circulation was not legally declared. The main types of examined samples were: radioactive sources, illegally trafficked; suspect radioactive materials or radioactively contaminated devices; uranium tablets; fire detectors containing 241Am sources; osmium samples containing radioactive 185Os or enriched 187Os. The types of analyses and determination methods were as follows: the chemical composition was determined by using identification reagents or by neutron activation analysis; the radionuclide composition was determined by using gamma-ray spectrometry; the activity and particle emission rates were determined by using calibrated radiometric equipment; the absorbed dose rate at the wall of all types of containers and samples was determined by using calibrated dose ratemeters. The radiation exposure risk for the population due to these radioactive materials was evaluated for every case. (author)

  2. Multielement methods of atomic fluorescence analysis of enviromental samples

    International Nuclear Information System (INIS)

    Rigin, V.I.

    1985-01-01

    A multielement method of atomic fluorescence analysis of environmental samples is suggested, based on sample decomposition by autoclave fluorination and gas-phase atomization of volatile compounds in an inductive argon plasma using a nondispersive polychromator. Detection limits of some elements (Be, Sr, Cd, V, Mo, Te, Ru etc.) for different sample forms introduced into the analyzer are given.

  3. Salmonella detection in poultry samples. Comparison of two commercial real-time PCR systems with culture methods for the detection of Salmonella spp. in environmental and fecal samples of poultry.

    Science.gov (United States)

    Sommer, D; Enderlein, D; Antakli, A; Schönenbrücher, H; Slaghuis, J; Redmann, T; Lierz, M

    2012-01-01

    The efficiency of two commercial PCR methods based on real-time technology, the foodproof® Salmonella detection system and the BAX® PCR Assay Salmonella system, was compared to standardized culture methods (EN ISO 6579:2002 - Annex D) for the detection of Salmonella spp. in poultry samples. Four sample matrices (feed, dust, boot swabs, feces) obtained directly from poultry flocks, as well as artificially spiked samples of the same matrices, were used. All samples were first tested for Salmonella spp. using culture methods as the gold standard. In addition, samples spiked with Salmonella Enteritidis were tested to evaluate the sensitivity of both PCR methods. Furthermore, all methods were evaluated in an annual ring-trial of the National Salmonella Reference Laboratory of Germany. Salmonella detection in the matrices feed, dust and boot swabs was comparable for both PCR systems, whereas the results for feces differed markedly. The quality, especially the freshness, of the fecal samples had an influence on the sensitivity of the real-time PCR and on the results of the culture methods. In fresh fecal samples, an initial spiking level of 100 cfu/25 g of Salmonella Enteritidis was detected. Fecal samples dried for two days allowed the detection of 14 cfu/25 g. Both real-time PCR protocols appear to be suitable for the detection of Salmonella spp. in all four matrices. The foodproof® system detected eight more samples as positive compared to the BAX® system, but had a potential false positive result in one case. In samples dried for 7 days, none of the methods was able to detect Salmonella, likely due to lethal cell damage. In general, the advantage of PCR analyses over the culture method is the reduction of working time from 4-5 days to only 2 days. However, especially for the analysis of fecal samples, official validation should be conducted according to the requirements of EN ISO 6579:2002 - Annex D.

  4. A DOE manual: DOE Methods for Evaluating Environmental and Waste Management Samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Riley, R.G.

    1994-01-01

    Waste Management inherently requires knowledge of the waste's chemical composition. The waste can often be analyzed by established methods; however, if the samples are radioactive, or are plagued by other complications, established methods may not be feasible. The US Department of Energy (DOE) has been faced with managing some waste types that are not amenable to standard or available methods, so new or modified sampling and analysis methods are required. These methods are incorporated into DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), which is a guidance/methods document for sampling and analysis activities in support of DOE sites. It is a document generated by consensus of the DOE laboratory staff and is intended to fill the gap within existing guidance documents (e. g., the Environmental Protection Agency's (EPA's) Test Methods for Evaluating Solid Waste, SW-846), which apply to low-level or non-radioactive samples. DOE Methods fills the gap by including methods that take into account the complexities of DOE site matrices. The most recent update, distributed in October 1993, contained quality assurance (QA), quality control (QC), safety, sampling, organic analysis, inorganic analysis, and radioanalytical guidance as well as 29 methods. The next update, which will be distributed in April 1994, will contain 40 methods and will therefore have greater applicability. All new methods are either peer reviewed or labeled ''draft'' methods. Draft methods were added to speed the release of methods to field personnel

  5. Implementation of direct LSC method for diesel samples on the fuel market

    International Nuclear Information System (INIS)

    Krištof, Romana; Hirsch, Marko; Kožar Logar, Jasmina

    2014-01-01

    The European Union develops common EU policy and strategy on biofuels and the sustainable bio-economy through several documents. The encouragement of biofuel consumption is therefore an obligation of each EU member state. The situation in the Slovenian fuel market is presented and compared with other EU countries against the values prescribed by EU directives. Diesel is the most common fuel for transportation needs in Slovenia, so the study was performed on diesel. The sampling network was designed in accordance with the fuel consumption statistics of the country; 75 sampling points were located on different types of roads. The quantity of bio-component in the diesel samples was determined by the direct LSC method through measurement of the C-14 content. The measured values ranged from 0 up to nearly 6 mass percent of bio-component in the fuel. The method proved to be appropriate, suitable and effective for studies of the real fuel market. - Highlights: • The direct LSC method was tested and applied on real fuel samples from the Slovenian market. • The results of the study are comparable with the findings of the official EUROSTAT report. • A comparison to other EU member states and EU directive prescriptions was performed

  6. The use of Geographic Information System (GIS) and non-GIS methods to assess the external validity of samples postcollection.

    Science.gov (United States)

    Richardson, Esther; Good, Margaret; McGrath, Guy; More, Simon J

    2009-09-01

    External validity is fundamental to veterinary diagnostic investigation, reflecting the accuracy with which sample results can be extrapolated to a broader population of interest. Probability sampling methods are routinely used during the collection of samples from populations, specifically to maximize external validity. Nonprobability sampling (e.g., of blood samples collected as part of routine surveillance programs or laboratory submissions) may provide useful data for further posthoc epidemiological analysis, adding value to the collection and submission of samples. As the sample has already been submitted, the analyst or investigator does not have any control over the sampling methodology, and hence over external validity, as routine probability sampling methods may not have been employed. The current study describes several Geographic Information System (GIS) and non-GIS methods, applied posthoc, to assess the external validity of samples collected using both probability and nonprobability sampling methods. These methods could equally be employed for inspecting other datasets. Mapping was conducted using ArcView 9.1. Based on this posthoc assessment, results from the random field sample could provide an externally valid, albeit relatively imprecise, estimate of national disease prevalence, of disease prevalence in 3 of the 4 provinces (all but Ulster, in the north and northwest, where sample size was small), and in beef and dairy herds. This study provides practical methods for examining the external validity of samples postcollection.

  7. Sampling and sample preparation methods for the analysis of trace elements in biological material

    International Nuclear Information System (INIS)

    Sansoni, B.; Iyengar, V.

    1978-05-01

    The authors attempt to give as systematic a treatment as possible of the sample taking and sample preparation of biological material (particularly in human medicine) for trace analysis (e.g. neutron activation analysis, atomic absorption spectrometry). Contamination and loss problems are discussed, as well as the manifold problems arising from the different consistencies of solid and liquid biological materials and the stabilization of the sample material. The processes of dry and wet ashing are dealt with in particular, and new methods are also described. (RB) [de

  8. [Progress in sample preparation and analytical methods for trace polar small molecules in complex samples].

    Science.gov (United States)

    Zhang, Qianchun; Luo, Xialin; Li, Gongke; Xiao, Xiaohua

    2015-09-01

    Small polar molecules such as nucleosides, amines, amino acids are important analytes in biological, food, environmental, and other fields. It is necessary to develop efficient sample preparation and sensitive analytical methods for rapid analysis of these polar small molecules in complex matrices. Some typical materials in sample preparation, including silica, polymer, carbon, boric acid and so on, are introduced in this paper. Meanwhile, the applications and developments of analytical methods of polar small molecules, such as reversed-phase liquid chromatography, hydrophilic interaction chromatography, etc., are also reviewed.

  9. Statistical sampling methods for soils monitoring

    Science.gov (United States)

    Ann M. Abbott

    2010-01-01

    Development of the best sampling design to answer a research question should be an interactive venture between the land manager or researcher and statisticians, and is the result of answering various questions. A series of questions that can be asked to guide the researcher in making decisions that will arrive at an effective sampling plan are described, and a case...

  10. INVESTIGATION OF THE TOTAL ORGANIC HALOGEN ANALYTICAL METHOD AT THE WASTE SAMPLING CHARACTERIZATION FACILITY (WSCF)

    International Nuclear Information System (INIS)

    DOUGLAS JG; MEZNARICH HD, PHD; OLSEN JR; ROSS GA; STAUFFER M

    2008-01-01

    Total organic halogen (TOX) is used as a parameter to screen groundwater samples at the Hanford Site. Trending is done for each groundwater well, and changes in TOX and other screening parameters can lead to costly changes in the monitoring protocol. The Waste Sampling and Characterization Facility (WSCF) analyzes groundwater samples for TOX using the United States Environmental Protection Agency (EPA) SW-846 method 9020B (EPA 1996a). Samples from the Soil and Groundwater Remediation Project (S and GRP) are submitted to the WSCF for analysis without information regarding the source of the sample; each sample is in essence a 'blind' sample to the laboratory. Feedback from the S and GRP indicated that some of the WSCF-generated TOX data from groundwater wells had a number of outlier values based on the historical trends (Anastos 2008a). Additionally, analysts at WSCF observed inconsistent TOX results among field sample replicates. Therefore, the WSCF lab performed an investigation of the TOX analysis to determine the cause of the outlier data points. Two causes were found that contributed to generating out-of-trend TOX data: (1) The presence of inorganic chloride in the groundwater samples: at inorganic chloride concentrations greater than about 10 parts per million (ppm), apparent TOX values increase with increasing chloride concentration. A parallel observation is the increase in apparent breakthrough of TOX from the first to the second activated-carbon adsorption tubes with increasing inorganic chloride concentration. (2) During the sample preparation step, excessive purging of the adsorption tubes with oxygen pressurization gas after sample loading may cause channeling in the activated-carbon bed. This channeling leads to poor removal of inorganic chloride during the subsequent wash step with aqueous potassium nitrate. The presence of this residual inorganic chloride then produces erroneously high TOX values. Changes in sample preparation were studied to more

  11. INVESTIGATION OF THE TOTAL ORGANIC HALOGEN ANALYTICAL METHOD AT THE WASTE SAMPLING AND CHARACTERIZATION FACILITY

    International Nuclear Information System (INIS)

    Douglas, J.G.; Meznarich, H.K.; Olsen, J.R.; Ross, G.A.; Stauffer, M.

    2009-01-01

    Total organic halogen (TOX) is used as a parameter to screen groundwater samples at the Hanford Site. Trending is done for each groundwater well, and changes in TOX and other screening parameters can lead to costly changes in the monitoring protocol. The Waste Sampling and Characterization Facility (WSCF) analyzes groundwater samples for TOX using the United States Environmental Protection Agency (EPA) SW-846 method 9020B (EPA 1996a). Samples from the Soil and Groundwater Remediation Project (SGRP) are submitted to the WSCF for analysis without information regarding the source of the sample; each sample is in essence a 'blind' sample to the laboratory. Feedback from the SGRP indicated that some of the WSCF-generated TOX data from groundwater wells had a number of outlier values based on the historical trends (Anastos 2008a). Additionally, analysts at WSCF observed inconsistent TOX results among field sample replicates. Therefore, the WSCF lab performed an investigation of the TOX analysis to determine the cause of the outlier data points. Two causes were found that contributed to generating out-of-trend TOX data: (1) The presence of inorganic chloride in the groundwater samples: at inorganic chloride concentrations greater than about 10 parts per million (ppm), apparent TOX values increase with increasing chloride concentration. A parallel observation is the increase in apparent breakthrough of TOX from the first to the second activated-carbon adsorption tubes with increasing inorganic chloride concentration. (2) During the sample preparation step, excessive purging of the adsorption tubes with oxygen pressurization gas after sample loading may cause channeling in the activated carbon bed. This channeling leads to poor removal of inorganic chloride during the subsequent wash step with aqueous potassium nitrate. The presence of this residual inorganic chloride then produces erroneously high TOX values. Changes in sample preparation were studied to more effectively

  12. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    Science.gov (United States)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

    surface of an M-dimensional, unit radius hyper-sphere, (ii) relocating the N points on a representative set of N hyper-spheres of different radii, and (iii) transforming the coordinates of those points to lie on N different hyper-ellipsoids spanning the multivariate Gaussian distribution. The above method is applied in a dimensionality reduction context by defining flow-controlling points over which representative sampling of hydraulic conductivity is performed, thus also accounting for the sensitivity of the flow and transport model to the input hydraulic conductivity field. The performance of the various stratified sampling methods, LH, SL, and ME, is compared to that of SR sampling in terms of reproduction of ensemble statistics of hydraulic conductivity and solute concentration for different sample sizes N (numbers of realizations). The results indicate that ME sampling constitutes an equally if not more efficient simulation method than LH and SL sampling, as it can reproduce to a similar extent statistics of the conductivity and concentration fields, yet with smaller sampling variability than SR sampling. References [1] Gutjahr A.L. and Bras R.L. Spatial variability in subsurface flow and transport: A review. Reliability Engineering & System Safety, 42, 293-316, (1993). [2] Helton J.C. and Davis F.J. Latin hypercube sampling and the propagation of uncertainty in analyses of complex systems. Reliability Engineering & System Safety, 81, 23-69, (2003). [3] Switzer P. Multiple simulation of spatial fields. In: Heuvelink G, Lemmens M (eds) Proceedings of the 4th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, Coronet Books Inc., pp 629-635 (2000).
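
    Of the stratified schemes compared above, plain Latin hypercube (LH) sampling is the simplest to sketch: each dimension is split into N equal-probability strata, one point is drawn per stratum, and the strata are randomly paired across dimensions before mapping to standard normal scores. The code below shows only that baseline LH step, not the SL/ME hyper-sphere and hyper-ellipsoid constructions described in the abstract.

      import numpy as np
      from scipy.stats import norm

      def latin_hypercube_gaussian(n_samples, n_dims, seed=None):
          """Latin hypercube sample mapped to independent standard normals."""
          rng = np.random.default_rng(seed)
          u = np.empty((n_samples, n_dims))
          for d in range(n_dims):
              # One point per stratum, strata randomly permuted per dimension.
              u[:, d] = (rng.permutation(n_samples) + rng.random(n_samples)) / n_samples
          return norm.ppf(u)

      z = latin_hypercube_gaussian(100, 5, seed=42)
      print(z.mean(axis=0))   # close to 0 in every dimension
      print(z.std(axis=0))    # close to 1 in every dimension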

  13. Quantifying viruses and bacteria in wastewater—Results, interpretation methods, and quality control

    Science.gov (United States)

    Francy, Donna S.; Stelzer, Erin A.; Bushon, Rebecca N.; Brady, Amie M.G.; Mailot, Brian E.; Spencer, Susan K.; Borchardt, Mark A.; Elber, Ashley G.; Riddell, Kimberly R.; Gellner, Terry M.

    2011-01-01

    Membrane bioreactors (MBR), used for wastewater treatment in Ohio and elsewhere in the United States, have pore sizes small enough to theoretically reduce concentrations of protozoa and bacteria, but not viruses. Sampling for viruses in wastewater is seldom done and not required. Instead, the bacterial indicators Escherichia coli (E. coli) and fecal coliforms are the required microbial measures of effluents for wastewater-discharge permits. Information is needed on the effectiveness of MBRs in removing human enteric viruses from wastewaters, particularly as compared to conventional wastewater treatment before and after disinfection. A total of 73 regular and 28 quality-control (QC) samples were collected at three MBR and two conventional wastewater plants in Ohio during 23 regular and 3 QC sampling trips in 2008-10. Samples were collected at various stages in the treatment processes and analyzed for bacterial indicators E. coli, fecal coliforms, and enterococci by membrane filtration; somatic and F-specific coliphage by the single agar layer (SAL) method; adenovirus, enterovirus, norovirus GI and GII, rotavirus, and hepatitis A virus by molecular methods; and viruses by cell culture. While addressing the main objective of the study-comparing removal of viruses and bacterial indicators in MBR and conventional plants-it was realized that work was needed to identify data analysis and quantification methods for interpreting enteric virus and QC data. Therefore, methods for quantifying viruses, qualifying results, and applying QC data to interpretations are described in this report. During each regular sampling trip, samples were collected (1) before conventional or MBR treatment (post-preliminary), (2) after secondary or MBR treatment (post-secondary or post-MBR), (3) after tertiary treatment (one conventional plant only), and (4) after disinfection (post-disinfection). Glass-wool fiber filtration was used to concentrate enteric viruses from large volumes, and small

  14. Precision and Accuracy of k0-NAA Method for Analysis of Multi Elements in Reference Samples

    International Nuclear Information System (INIS)

    Sri-Wardani

    2004-01-01

    The accuracy and precision of the k0-NAA method were determined in the analysis of multiple elements contained in reference samples. The results for multiple elements in the SRM 1633b sample were obtained with biases of up to 20%, but with good accuracy and precision. The results for As, Cd and Zn in the CCQM-P29 rice flour sample were very good, with biases of 0.5-5.6%. (author)

  15. Modification of a two blood sample method used for measurement of GFR with 99mTc-DTPA.

    Science.gov (United States)

    Surma, Marian J; Płachcińska, Anna; Kuśmierek, Jacek

    2018-01-01

    Measurements of GFR may be performed with a slope/intercept (S/I) method, using only two blood samples taken at strictly defined time points. The aim of the study was to modify this method in order to extend the time intervals suitable for blood sampling. The modification was based on varying a parameter of the Russel et al. model, selecting time intervals suitable for blood sampling, and assessing the uncertainty of the calculated results. Archived values of GFR measurements of 169 patients with different renal function, from 5.5 to 179 mL/min, calculated with a multiple blood sample method, were used. Concentrations of the radiopharmaceutical in consecutive minutes, from the 60th to the 190th minute after injection, were calculated theoretically using the archived parameters of the biexponential functions describing the decrease in 99mTc-DTPA concentration in blood plasma with time. These values, together with the injected activities, were treated as measurements and used for S/I clearance calculations. Next, values of S/I clearance were compared with the multiple blood sample method in order to calculate suitable values of the exponent present in the Russel model for every combination of two blood sampling time points. A model was considered accurately fitted to the measured values when SEE ≤ 3.6 mL/min. Assessment of the uncertainty of the obtained results was based on the law of error superposition, taking into account the mean square prediction error and also the errors introduced by pipetting, time measurement and stochastic radioactive decay. The accepted criteria resulted in extension of the time intervals suitable for blood sampling to between 60 and 90 minutes after injection for the first sample and between 150 and 180 minutes for the second sample. Uncertainty of the results was assessed as between 4 mL/min for GFR = 5-10 mL/min and 8 mL/min for GFR = 180 mL/min. The time intervals accepted for blood sampling fully satisfy nuclear medicine staff and ensure proper determination of GFR. Uncertainty of results is entirely
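
    For orientation, the sketch below shows the generic two-sample slope/intercept clearance calculation that methods of this kind start from: a single-exponential plasma curve fitted through the two samples and back-extrapolated to time zero. It is a textbook formulation with invented numbers, not the modified Russel-model method or the uncertainty treatment developed in the paper, and it omits the late-sampling correction normally applied in practice.

      import math

      def slope_intercept_clearance(dose, t1, c1, t2, c2):
          """Uncorrected two-sample slope/intercept clearance (mL/min).

          Generic single-exponential model C(t) = C0 * exp(-k t); illustrative
          sketch only.
          dose  -- injected activity (e.g., counts/min)
          t1,t2 -- sampling times after injection (min)
          c1,c2 -- plasma concentrations at t1, t2 (same activity units per mL)
          """
          k = math.log(c1 / c2) / (t2 - t1)     # elimination rate constant
          c0 = c1 * math.exp(k * t1)            # back-extrapolated intercept
          v = dose / c0                         # apparent distribution volume (mL)
          return k * v

      # Illustrative numbers only.
      print(slope_intercept_clearance(dose=3.0e7, t1=75, c1=9.5e3, t2=165, c2=5.1e3))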

  16. Methods of sampling airborne fungi in working environments of waste treatment facilities.

    Science.gov (United States)

    Černá, Kristýna; Wittlingerová, Zdeňka; Zimová, Magdaléna; Janovský, Zdeněk

    2016-01-01

    The objective of the present study was to evaluate and compare the efficiency of a filter-based sampling method and a high volume sampling method for sampling airborne culturable fungi present in waste sorting facilities. The membrane filters method was compared with the surface air system method. The selected sampling methods were modified and tested in 2 plastic waste sorting facilities. The total number of colony-forming units (CFU)/m3 of airborne fungi was dependent on the type of sampling device, on the time of sampling, which was carried out every hour from the beginning of the work shift, and on the type of cultivation medium (p < 0.05). Concentrations of airborne fungi ranged from 2×10² to 1.7×10⁶ CFU/m3 when using the membrane filters (MF) method, and from 3×10² to 6.4×10⁴ CFU/m3 when using the surface air system (SAS) method. Both methods showed comparable sensitivity to the fluctuations of the concentrations of airborne fungi during the work shifts. The SAS method is adequate for a fast indicative determination of the concentration of airborne fungi. The MF method is suitable for thorough assessment of working environment contamination by airborne fungi. Therefore we recommend the MF method for the implementation of a uniform standard methodology of airborne fungi sampling in working environments of waste treatment facilities. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.

  17. Sampling methods for pasture, soil and deposition for radioactivity emergency preparedness in the Nordic countries

    International Nuclear Information System (INIS)

    Isaksson, M.

    2002-01-01

    The aim of this work was to compare sampling techniques for pasture, soil and deposition, planned for radioactivity surveillance in emergency situations in the Nordic countries. The basis of the survey was a questionnaire, sent to radiation protection authorities and laboratories. Sampling of pasture is performed with a cutting height between 1 and 5 cm above the ground from an area of about 1 m². The sampling plots are usually randomly positioned. Soil samples, 3 to 20 cores in various patterns, are generally taken by a corer of varying diameter. For deposition sampling, precipitation collectors of different sizes are used. When comparing results, the differences between laboratories should be borne in mind so that proper corrections can be made. It is, however, important to consider that, especially in an emergency situation, the use of standardised methods may worsen the results if these methods are not part of the daily work. (orig.)

  18. Solvent Hold Tank Sample Results for MCU-16-934-935-936: June 2016 Monthly Sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-08-30

    Savannah River National Laboratory (SRNL) received one set of Solvent Hold Tank (SHT) samples (MCU-16-934-935-936), pulled on 07/01/2016, for analysis. The samples were combined and analyzed for composition. Analysis of the composite sample MCU-16-934-935-936 indicated that the Isopar™L concentration is above its nominal level (101%). The modifier (CS-7SB) and the TiDG concentrations are 8% and 29% below their nominal concentrations. This analysis confirms that the solvent may require the addition of TiDG, and possibly of modifier. Based on the current monthly sample, the levels of TiDG, Isopar™L, MaxCalix, and modifier are sufficient for continuing operation but are expected to decrease with time. Periodic characterization and trimming additions to the solvent are recommended. No impurities above the 1000 ppm level were found in this solvent by the Semi-Volatile Organic Analysis (SVOA). No impurities were observed in the Hydrogen Nuclear Magnetic Resonance (HNMR). However, up to 21.1 ± 4 micrograms of mercury per gram of solvent (or 17.5 μg/mL) was detected in this sample (as determined by the XRF method on the undigested sample). The current gamma level (1.41E5 dpm/mL) confirmed that the gamma concentration has returned to previous levels (as observed in the late 2015 samples) where the process operated normally and as expected.

  19. IMPROVEMENT OF METHODS FOR HYDROBIOLOGICAL RESEARCH AND MODIFICATION OF STANDARD TOOLS FOR SAMPLE COLLECTION

    Directory of Open Access Journals (Sweden)

    M. M. Aligadjiev

    2015-01-01

    Full Text Available Aim. The paper discusses improving methods of hydrobiological study by modifying the tools used to collect plankton and benthic samples. Methods. To improve the standard methods of hydrobiological research, we have developed tools for sampling zooplankton and the benthic environment of the Caspian Sea. Results. Long-term practice in collecting hydrobiological samples in the Caspian Sea shows that the sampling tools used to collect hydrobiological material require modernization. With the introduction of the invasive Azov and Black Sea comb jelly Mnemiopsis leidyi A. Agassiz into the Caspian Sea, there is a need to collect plankton samples without disturbing their integrity. Tools for collecting benthic fauna do not always give a complete picture of the state of benthic ecosystems because sampling sites cannot be selected visually. Moreover, when sampling with a dredge, samples are likely to be lost, especially in areas with difficult terrain. Conclusion. We propose modifying a small model of the Upstein net (applied in shallow water) to collect zooplankton samples with an upper inverted cone, which will significantly improve the catchability of the net in the Caspian Sea. The bottom sampler can be improved by installing a video camera for visual inspection of the bottom topography and by using sensors to determine the tilt of the dredge and the position of the valves of the bucket.

  20. Calculation of parameter failure probability of thermodynamic system by response surface and importance sampling method

    International Nuclear Information System (INIS)

    Shang Yanlong; Cai Qi; Chen Lisheng; Zhang Yangwei

    2012-01-01

    In this paper, a combined response surface and importance sampling method was applied to calculate the parameter failure probability of a thermodynamic system. A mathematical model was presented for parameter failure of the physical process in the thermodynamic system, from which the combined response surface and importance sampling algorithm was established; the performance degradation model of the components and the simulation process for parameter failure in the physical process were also presented. The parameter failure probability of the purification water system in a nuclear reactor was obtained with the combined method. The results show that the combined method is effective for calculating the parameter failure probability of a thermodynamic system with high dimensionality and non-linear characteristics, offering satisfactory precision with less computing time than the direct sampling method while avoiding the drawbacks of the response surface method. (authors)
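
    The importance-sampling half of the combination can be illustrated with a one-dimensional toy problem: estimating a small exceedance probability by sampling from a density shifted toward the failure region and reweighting by the likelihood ratio. The sketch below shows only that generic step, with an arbitrary threshold; the response-surface coupling and the purification water system model are not reproduced.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)
      n = 20_000
      threshold = 4.0            # "failure" when a standard normal input exceeds this
      shift = threshold          # importance density: N(shift, 1)

      x = rng.normal(loc=shift, scale=1.0, size=n)
      failed = x > threshold
      # Likelihood ratio phi(x) / phi(x - shift) for unit-variance normals.
      weights = np.exp(-shift * x + 0.5 * shift ** 2)
      p_fail = np.mean(failed * weights)

      print(f"importance-sampling estimate: {p_fail:.3e}")
      print(f"exact tail probability:       {norm.sf(threshold):.3e}")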

  1. Deficits in knowledge, attitude, and practice towards blood culture sampling: results of a nationwide mixed-methods study among inpatient care physicians in Germany.

    Science.gov (United States)

    Raupach-Rosin, Heike; Duddeck, Arne; Gehrlich, Maike; Helmke, Charlotte; Huebner, Johannes; Pletz, Mathias W; Mikolajczyk, Rafael; Karch, André

    2017-08-01

    Blood culture (BC) sampling rates in Germany are considerably lower than recommended. The aim of our study was to assess the knowledge, attitudes, and practice of physicians in Germany regarding BC diagnostics. We conducted a cross-sectional mixed-methods study among physicians working in inpatient care in Germany. Based on the results of qualitative focus groups, a questionnaire-based quantitative study was conducted in 2015-2016. In total, 706 medical doctors and final-year medical students from 11 out of 16 federal states in Germany participated. BC sampling was considered an important diagnostic tool by 95% of the participants. However, only 23% of them would collect BCs in all three scenarios for which BC ordering is recommended by present guidelines in Germany; almost one out of ten physicians would not have taken blood cultures in any of the three scenarios. The majority of participants (74%) reported not adhering to the guideline recommendation that blood culture sampling should include at least two blood culture sets from two different injection sites. A high degree of routine in blood culture sampling, perceived importance of blood culture diagnostics, the availability of an in-house microbiological lab, and the department the physician worked in were identified as predictors of good blood culture practice. Our study suggests that there are substantial deficits in BC ordering and the application of guidelines for good BC practice in Germany. Based on these findings, multimodal interventions appear necessary for improving BC diagnostics.

  2. Determination of methylmercury in marine biota samples with advanced mercury analyzer: method validation.

    Science.gov (United States)

    Azemard, Sabine; Vassileva, Emilia

    2015-06-01

    In this paper, we present a simple, fast and cost-effective method for determination of methyl mercury (MeHg) in marine samples. All important parameters influencing the sample preparation process were investigated and optimized. Full validation of the method was performed in accordance with ISO-17025 (ISO/IEC, 2005) and the Eurachem guidelines. Blanks, selectivity, working range (0.09-3.0 ng), recovery (92-108%), intermediate precision (1.7-4.5%), traceability, limit of detection (0.009 ng), limit of quantification (0.045 ng) and expanded uncertainty (15%, k=2) were assessed. Estimation of the uncertainty contribution of each parameter and demonstration of the traceability of measurement results were provided as well. Furthermore, the selectivity of the method was studied by analyzing the same sample extracts by advanced mercury analyzer (AMA) and gas chromatography-atomic fluorescence spectrometry (GC-AFS). Additional validation of the proposed procedure was carried out through participation in the IAEA-461 worldwide inter-laboratory comparison exercises. Copyright © 2014 Elsevier Ltd. All rights reserved.
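
    The validation quantities listed above (limit of detection, limit of quantification, expanded uncertainty with k = 2) follow widely used conventions that are easy to sketch: 3s and 10s of replicate blanks for LOD/LOQ, and quadrature combination of standard uncertainty components followed by a coverage factor of 2. The numbers below are invented for illustration and are not the paper's data or uncertainty budget.

      import statistics

      # Illustrative blank replicates (ng of MeHg), not data from the paper.
      blanks = [0.004, 0.002, 0.005, 0.003, 0.002, 0.004, 0.003]

      s_blank = statistics.stdev(blanks)
      lod = 3 * s_blank          # common 3*s convention for limit of detection
      loq = 10 * s_blank         # 10*s convention for limit of quantification

      # Expanded uncertainty: combine independent relative standard
      # uncertainties in quadrature, then apply coverage factor k = 2.
      rel_components = [0.03, 0.045, 0.02, 0.025]   # hypothetical budget terms
      u_combined = sum(u ** 2 for u in rel_components) ** 0.5
      U_expanded = 2 * u_combined

      print(f"LOD ~ {lod:.4f} ng, LOQ ~ {loq:.4f} ng, U (k=2) ~ {U_expanded:.1%}")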

  3. Efficient free energy calculations by combining two complementary tempering sampling methods.

    Science.gov (United States)

    Xie, Liangxu; Shen, Lin; Chen, Zhe-Ning; Yang, Mingjun

    2017-01-14

    Although energy barriers can be efficiently crossed in the reaction coordinate (RC) guided sampling, this type of method suffers from identification of the correct RCs or requirements of high dimensionality of the defined RCs for a given system. If only the approximate RCs with significant barriers are used in the simulations, hidden energy barriers with small to medium height would exist in other degrees of freedom (DOFs) relevant to the target process and consequently cause the problem of insufficient sampling. To address the sampling in this so-called hidden barrier situation, here we propose an effective approach to combine temperature accelerated molecular dynamics (TAMD), an efficient RC-guided sampling method, with the integrated tempering sampling (ITS), a generalized ensemble sampling method. In this combined ITS-TAMD method, the sampling along the major RCs with high energy barriers is guided by TAMD and the sampling of the rest of the DOFs with lower but not negligible barriers is enhanced by ITS. The performance of ITS-TAMD to three systems in the processes with hidden barriers has been examined. In comparison to the standalone TAMD or ITS approach, the present hybrid method shows three main improvements. (1) Sampling efficiency can be improved at least five times even if in the presence of hidden energy barriers. (2) The canonical distribution can be more accurately recovered, from which the thermodynamic properties along other collective variables can be computed correctly. (3) The robustness of the selection of major RCs suggests that the dimensionality of necessary RCs can be reduced. Our work shows more potential applications of the ITS-TAMD method as the efficient and powerful tool for the investigation of a broad range of interesting cases.

  4. Utility of the microculture method in non-invasive samples obtained from an experimental murine model with asymptomatic leishmaniasis.

    Science.gov (United States)

    Allahverdiyev, Adil M; Bagirova, Malahat; Cakir-Koc, Rabia; Elcicek, Serhat; Oztel, Olga Nehir; Canim-Ates, Sezen; Abamor, Emrah Sefik; Yesilkir-Baydar, Serap

    2012-07-01

    The sensitivity of diagnostic methods for visceral leishmaniasis (VL) decreases because of the low number of parasites and antibody amounts in asymptomatic healthy donors who are not suitable for invasive sample acquisition procedures. Therefore, new studies are urgently needed to improve the sensitivity and specificity of the diagnostic approaches in non-invasive samples. In this study, the sensitivity of the microculture method (MCM) was compared with polymerase chain reaction (PCR), enzyme-linked immunosorbent assay (ELISA), and immunofluorescent antibody test (IFAT) methods in an experimental murine model with asymptomatic leishmaniasis. Results showed that the percent of positive samples in ELISA, IFAT, and peripheral blood (PB) -PCR tests were 17.64%, 8.82%, and 5.88%, respectively, whereas 100% positive results were obtained with MCM and MCM-PCR methods. Thus, this study, for the first time, showed that MCM is more sensitive, specific, and economic than other methods, and the sensitivity of PCR that was performed to samples obtained from MCM was higher than sensitivity of the PCR method sampled by PB.

  5. SnagPRO: snag and tree sampling and analysis methods for wildlife

    Science.gov (United States)

    Lisa J. Bate; Michael J. Wisdom; Edward O. Garton; Shawn C. Clabough

    2008-01-01

    We describe sampling methods and provide software to accurately and efficiently estimate snag and tree densities at desired scales to meet a variety of research and management objectives. The methods optimize sampling effort by choosing a plot size appropriate for the specified forest conditions and sampling goals. Plot selection and data analyses are supported by...

  6. Generalized Likelihood Uncertainty Estimation (GLUE) Using Multi-Optimization Algorithm as Sampling Method

    Science.gov (United States)

    Wang, Z.

    2015-12-01

    For decades, distributed and lumped hydrological models have furthered our understanding of hydrological systems. The development of large-scale, high-precision hydrological simulation has refined spatial descriptions of hydrological behavior. This trend, however, is accompanied by increased model complexity and numbers of parameters, which brings new challenges for uncertainty quantification. Generalized Likelihood Uncertainty Estimation (GLUE), a Monte Carlo approach coupled with Bayesian estimation, has been widely used in uncertainty analysis of hydrological models. However, the stochastic sampling of prior parameters adopted by GLUE appears inefficient, especially in high dimensional parameter spaces. Heuristic optimization algorithms based on iterative evolution show better convergence speed and optimum-searching performance. In light of these features, this study adopted the genetic algorithm, differential evolution, and the shuffled complex evolution algorithm to search the parameter space and obtain parameter sets with large likelihoods. Based on this multi-algorithm sampling, hydrological model uncertainty analysis is conducted within the typical GLUE framework. To demonstrate the superiority of the new method, two hydrological models of different complexity are examined. The results show that the adaptive method tends to be efficient in sampling and effective in uncertainty analysis, providing an alternative path for uncertainty quantification.
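
    A bare-bones GLUE loop is easy to sketch: draw parameter sets, score each with an informal likelihood, keep the "behavioral" sets above a threshold, and form likelihood-weighted prediction bounds. The code below uses plain random sampling and a stand-in two-parameter model, so it illustrates the GLUE framework itself rather than the multi-optimization-algorithm sampling proposed in the abstract; all names and values are assumptions.

      import numpy as np

      rng = np.random.default_rng(1)
      t = np.arange(20.0)

      def toy_model(a, b):
          """Stand-in simulator (a simple recession curve), not a real
          hydrological model."""
          return a * np.exp(-b * t)

      obs = toy_model(10.0, 0.15) + rng.normal(0, 0.3, t.size)

      n = 5000
      a = rng.uniform(5, 15, n)              # prior range for parameter a
      b = rng.uniform(0.05, 0.4, n)          # prior range for parameter b
      sims = np.array([toy_model(ai, bi) for ai, bi in zip(a, b)])

      # Informal likelihood: Nash-Sutcliffe efficiency, negative values set to 0.
      nse = 1 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)
      like = np.clip(nse, 0, None)
      behavioral = like > 0.5                # GLUE behavioral threshold
      w = like[behavioral] / like[behavioral].sum()

      def weighted_quantile(values, weights, q):
          idx = np.argsort(values)
          cw = np.cumsum(weights[idx])
          return values[idx][np.searchsorted(cw, q * cw[-1])]

      lower = [weighted_quantile(sims[behavioral][:, j], w, 0.05) for j in range(t.size)]
      upper = [weighted_quantile(sims[behavioral][:, j], w, 0.95) for j in range(t.size)]
      print(f"{behavioral.sum()} behavioral sets; bounds at t=0: "
            f"({lower[0]:.2f}, {upper[0]:.2f})")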

  7. Methods of scaling threshold color difference using printed samples

    Science.gov (United States)

    Huang, Min; Cui, Guihua; Liu, Haoxue; Luo, M. Ronnier

    2012-01-01

    A series of printed samples on a semi-gloss paper substrate, with color differences of threshold magnitude, were prepared for scaling the visual color difference and evaluating the performance of different methods. The probabilities of perceptibility were normalized to Z-scores, and the different color differences were scaled against these Z-scores. The visual color differences obtained in this way were checked with the STRESS factor. The results indicated that only the scales changed, while the relative scales between pairs in the data were preserved.
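
    The normalization step described above is essentially a probit transform: the proportion of observers who judge a pair perceptibly different is converted to a Z-score through the inverse standard normal CDF. The sketch below shows only that step, with invented proportions.

      from scipy.stats import norm

      # Hypothetical proportions of observers reporting a perceptible difference
      # for a few printed sample pairs (not the study's data).
      proportions = {"pair_A": 0.25, "pair_B": 0.50, "pair_C": 0.84}

      z_scores = {name: norm.ppf(p) for name, p in proportions.items()}
      for name, z in z_scores.items():
          print(f"{name}: Z = {z:+.2f}")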

  8. Soil separator and sampler and method of sampling

    Science.gov (United States)

    O'Brien, Barry H [Idaho Falls, ID; Ritter, Paul D [Idaho Falls, ID

    2010-02-16

    A soil sampler includes a fluidized bed for receiving a soil sample. The fluidized bed may be in communication with a vacuum for drawing air through the fluidized bed and suspending particulate matter of the soil sample in the air. In a method of sampling, the air may be drawn across a filter, separating the particulate matter. Optionally, a baffle or a cyclone may be included within the fluidized bed for disentrainment, or dedusting, so only the finest particulate matter, including asbestos, will be trapped on the filter. The filter may be removable, and may be tested to determine the content of asbestos and other hazardous particulate matter in the soil sample.

  9. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    Science.gov (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. Given that most previous
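
    The method-of-moments estimator evaluated above is the classical Matheron semivariogram: half the mean squared difference of all point pairs, binned by lag distance. The sketch below implements only that baseline estimator on synthetic lognormal "throughfall" values; the robust and residual-maximum-likelihood variants compared in the study are not reproduced.

      import numpy as np

      def empirical_semivariogram(coords, values, n_bins=12, max_lag=None):
          """Classical (Matheron) method-of-moments semivariogram estimate."""
          diff = coords[:, None, :] - coords[None, :, :]
          dist = np.sqrt((diff ** 2).sum(-1))
          semi = 0.5 * (values[:, None] - values[None, :]) ** 2
          iu = np.triu_indices_from(dist, k=1)          # each pair counted once
          d, g = dist[iu], semi[iu]
          max_lag = max_lag or d.max() / 2
          edges = np.linspace(0, max_lag, n_bins + 1)
          lags, gamma = [], []
          for lo, hi in zip(edges[:-1], edges[1:]):
              m = (d > lo) & (d <= hi)
              if m.any():
                  lags.append(0.5 * (lo + hi))
                  gamma.append(g[m].mean())
          return np.array(lags), np.array(gamma)

      rng = np.random.default_rng(3)
      xy = rng.uniform(0, 50, size=(150, 2))               # 150 points, 50 m plot
      vals = rng.lognormal(mean=1.0, sigma=0.6, size=150)  # skewed, non-Gaussian
      lags, gamma = empirical_semivariogram(xy, vals)
      print(np.round(lags, 1), np.round(gamma, 2))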

  10. Analytical results of Tank 38H core samples -- Fall 1999

    International Nuclear Information System (INIS)

    Swingle, R.F.

    2000-01-01

    Two samples were pulled from Tank 38H in the Fall of 1999: a variable depth sample (VDS) of the supernate was pulled in October and a core sample from the salt layer was pulled in December. Analysis of the rinse from the outside of the core sample indicated no sign of volatile or semivolatile organics. Both supernate and solids from the VDS and the dried core sample solids were analyzed for isotopes which could pose a criticality concern and also for elements which could serve as neutron poisons, as well as other elements. Results of the elemental analyses of these samples show significant elements present to mitigate the potential for nuclear criticality. However, it should be noted the results given for the VDS solids elemental analyses may be higher than the actual concentration in the solids, since the filter paper was dissolved along with the sample solids

  11. Culture methods of allograft musculoskeletal tissue samples in Australian bacteriology laboratories.

    Science.gov (United States)

    Varettas, Kerry

    2013-12-01

    Samples of allograft musculoskeletal tissue are cultured by bacteriology laboratories to determine the presence of bacteria and fungi. In Australia, this testing is performed by 6 TGA-licensed clinical bacteriology laboratories with samples received from 10 tissue banks. Culture methods of swab and tissue samples employ a combination of solid agar and/or broth media to enhance micro-organism growth and maximise recovery. All six Australian laboratories receive Amies transport swabs and, except for one laboratory, a corresponding biopsy sample for testing. Three of the 6 laboratories culture at least one allograft sample directly onto solid agar. Only one laboratory did not use a broth culture for any sample received. An international literature review found that a similar combination of musculoskeletal tissue samples were cultured onto solid agar and/or broth media. Although variations of allograft musculoskeletal tissue samples, culture media and methods are used in Australian and international bacteriology laboratories, validation studies and method evaluations have challenged and supported their use in recovering fungi and aerobic and anaerobic bacteria.

  12. New method for simultaneous determination of 55Fe and 59Fe in blood serum samples

    International Nuclear Information System (INIS)

    Saukkonen, H.; Uhlenius, R.

    1978-01-01

    Routine methods for the measurement of 55Fe and 59Fe activities in biological samples are frequently required in metabolic studies of iron. A new simple method for the simultaneous determination of 59Fe and 55Fe concentration in 5 cm3 samples of blood is described and carefully evaluated. Before the measurement of the activity, organic matter was eliminated by HNO3-HClO4 wet ashing and iron was electroplated onto a copper plate. The accuracy of results was studied by assessing samples, which contained known amounts of radioactivity, and determining the counts per nanocurie in each case. The accuracy of the results of 59Fe and 55Fe determinations was found to be about 5%. The method has been routinely used to determine iron resorption in patients using the double isotope method. The determination proved to be satisfactory and not too laborious. When performing the yield determination there is a way of detecting and correcting mistakes or incompleteness in different stages of the measurement, thus leading to a high degree of reliability. (T.G.)

  13. Evaluation of statistical methods for quantifying fractal scaling in water-quality time series with irregular sampling

    Directory of Open Access Journals (Sweden)

    Q. Zhang

    2018-02-01

    Full Text Available River water-quality time series often exhibit fractal scaling, which here refers to autocorrelation that decays as a power law over some range of scales. Fractal scaling presents challenges to the identification of deterministic trends because (1) fractal scaling has the potential to lead to false inference about the statistical significance of trends and (2) the abundance of irregularly spaced data in water-quality monitoring networks complicates efforts to quantify fractal scaling. Traditional methods for estimating fractal scaling – in the form of spectral slope (β) or other equivalent scaling parameters (e.g., Hurst exponent) – are generally inapplicable to irregularly sampled data. Here we consider two types of estimation approaches for irregularly sampled data and evaluate their performance using synthetic time series. These time series were generated such that (1) they exhibit a wide range of prescribed fractal scaling behaviors, ranging from white noise (β = 0) to Brown noise (β = 2), and (2) their sampling gap intervals mimic the sampling irregularity (as quantified by both the skewness and mean of gap-interval lengths) in real water-quality data. The results suggest that none of the existing methods fully account for the effects of sampling irregularity on β estimation. First, the results illustrate the danger of using interpolation for gap filling when examining autocorrelation, as the interpolation methods consistently underestimate or overestimate β under a wide range of prescribed β values and gap distributions. Second, the widely used Lomb–Scargle spectral method also consistently underestimates β. A previously published modified form, using only the lowest 5 % of the frequencies for spectral slope estimation, has very poor precision, although the overall bias is small. Third, a recent wavelet-based method, coupled with an aliasing filter, generally has the smallest bias and root-mean-squared error among
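
    As an illustration of the Lomb–Scargle approach evaluated above, the sketch below estimates the spectral slope β of an irregularly sampled series from its periodogram using SciPy's scipy.signal.lombscargle. The time series, frequency grid and all numbers are placeholders; as the abstract notes, this estimator is biased under irregular sampling, so the sketch only shows the mechanics.

```python
# Hedged illustration (not the paper's code): estimating the spectral slope (beta)
# of an irregularly sampled series from a Lomb-Scargle periodogram.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 3650, 400))      # ~10 years of irregular sampling days
y = rng.normal(size=t.size)                 # placeholder series (white noise, beta ~ 0)
y = y - y.mean()                            # lombscargle expects a (near) zero-mean series

f = np.logspace(-3, -0.5, 60)               # cycles per day
pgram = lombscargle(t, y, 2 * np.pi * f)    # lombscargle takes angular frequencies

slope, _ = np.polyfit(np.log10(f), np.log10(pgram), 1)
beta = -slope                               # power law P(f) ~ f^(-beta)
print(f"estimated beta = {beta:.2f}")
```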

  14. 2015 Long-Term Hydrologic Monitoring Program Sampling and Analysis Results at Rio Blanco, Colorado

    Energy Technology Data Exchange (ETDEWEB)

    Findlay, Rick [Navarro Research and Engineering, Oak Ridge, TN (United States); Kautsky, Mark [US Department of Energy, Washington, DC (United States). Office of Legacy Management

    2015-12-01

    The U.S. Department of Energy (DOE) Office of Legacy Management conducted annual sampling at the Rio Blanco, Colorado, Site for the Long-Term Hydrologic Monitoring Program (LTHMP) on May 20–21, 2015. This report documents the analytical results of the Rio Blanco annual monitoring event, the trip report, and the data validation package. The groundwater and surface water monitoring samples were shipped to the GEL Group Inc. laboratories for conventional analysis of tritium and analysis of gamma-emitting radionuclides by high-resolution gamma spectrometry. A subset of water samples collected from wells near the Rio Blanco site was also sent to GEL Group Inc. for enriched tritium analysis. All requested analyses were successfully completed. Samples were collected from a total of four onsite wells, including two that are privately owned. Samples were also collected from two additional private wells at nearby locations and from nine surface water locations. Samples were analyzed for gamma-emitting radionuclides by high-resolution gamma spectrometry, and they were analyzed for tritium using the conventional method with a detection limit on the order of 400 picocuries per liter (pCi/L). Four locations (one well and three surface locations) were analyzed using the enriched tritium method, which has a detection limit on the order of 3 pCi/L. The enriched locations included the well at the Brennan Windmill and surface locations at CER-1, CER-4, and Fawn Creek 500 feet upstream.

  15. A method for three-dimensional quantitative observation of the microstructure of biological samples

    Science.gov (United States)

    Wang, Pengfei; Chen, Dieyan; Ma, Wanyun; Wu, Hongxin; Ji, Liang; Sun, Jialin; Lv, Danyu; Zhang, Lu; Li, Ying; Tian, Ning; Zheng, Jinggao; Zhao, Fengying

    2009-07-01

    Contemporary biology has entered the era of cell and molecular biology, and researchers now try to study the mechanisms of biological phenomena at the microscopic level. Accurate description of the microstructure of biological samples is an urgent need in many biomedical experiments. This paper introduces a method for three-dimensional quantitative observation of the microstructure of living biological samples based on two-photon laser scanning microscopy (TPLSM). TPLSM is a novel kind of fluorescence microscopy that offers low optical damage, high resolution, deep penetration depth and suitability for three-dimensional (3D) imaging. Fluorescently stained samples were observed by TPLSM, and their original shapes were then obtained through 3D image reconstruction. The spatial distribution of all objects in the samples, as well as their volumes, could be derived by image segmentation and mathematical calculation. Thus a quantitative 3D description of the sample microstructure was finally derived. We applied this method to quantitative analysis of the spatial distribution of chromosomes in meiotic mouse oocytes at metaphase, with satisfactory results.
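
    A minimal sketch of the quantitative step described above (segmentation followed by volume and position measurements), assuming the TPLSM stack has already been reconstructed and thresholded into a binary volume. The voxel sizes, function name and test objects are placeholders, not the authors' pipeline.

```python
# Illustrative sketch (assumed workflow, not the authors' code): label connected objects
# in a segmented 3-D stack and convert voxel counts into physical volumes and centroids.
import numpy as np
from scipy import ndimage

def quantify_objects(binary_stack, voxel_size_um=(1.0, 0.2, 0.2)):
    """binary_stack: 3-D boolean array (z, y, x); voxel_size_um: voxel edge lengths."""
    labels, n = ndimage.label(binary_stack)
    voxel_volume = float(np.prod(voxel_size_um))             # um^3 per voxel
    counts = ndimage.sum(binary_stack, labels, index=range(1, n + 1))
    centroids = ndimage.center_of_mass(binary_stack, labels, index=range(1, n + 1))
    volumes = np.asarray(counts) * voxel_volume
    return centroids, volumes

# toy usage: two synthetic objects in a small stack
stack = np.zeros((20, 64, 64), dtype=bool)
stack[5:9, 10:20, 10:18] = True
stack[12:15, 40:50, 30:44] = True
centroids, volumes = quantify_objects(stack)
print(len(volumes), "objects, volumes (um^3):", volumes)
```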

  16. Computational methods and modeling. 1. Sampling a Position Uniformly in a Trilinear Hexahedral Volume

    International Nuclear Information System (INIS)

    Urbatsch, Todd J.; Evans, Thomas M.; Hughes, H. Grady

    2001-01-01

    Monte Carlo particle transport plays an important role in some multi-physics simulations. These simulations, which may additionally involve deterministic calculations, typically use a hexahedral or tetrahedral mesh. Trilinear hexahedrons are attractive for physics calculations because faces between cells are uniquely defined, distance-to-boundary calculations are deterministic, and hexahedral meshes tend to require fewer cells than tetrahedral meshes. We discuss one aspect of Monte Carlo transport: sampling a position in a tri-linear hexahedron, which is made up of eight control points, or nodes, and six bilinear faces, where each face is defined by four non-coplanar nodes in three-dimensional Cartesian space. We derive, code, and verify the exact sampling method and propose an approximation to it. Our proposed approximate method uses about one-third the memory and can be twice as fast as the exact sampling method, but we find that its inaccuracy limits its use to well-behaved hexahedrons. Daunted by the expense of the exact method, we propose an alternate approximate sampling method. First, calculate beforehand an approximate volume for each corner of the hexahedron by taking one-eighth of the volume of an imaginary parallelepiped defined by the corner node and the three nodes to which it is directly connected. For the sampling, assume separability in the parameters, and sample each parameter, in turn, from a linear pdf defined by the sum of the four corner volumes at each limit (-1 and 1) of the parameter. This method ignores the quadratic portion of the pdf, but it requires less storage, has simpler sampling, and needs no extra, on-the-fly calculations. We simplify verification by designing tests that consist of one or more cells that entirely fill a unit cube. Uniformly sampling complicated cells that fill a unit cube will result in uniformly sampling the unit cube. Unit cubes are easily analyzed. The first problem has four wedges (or tents, or A frames) whose
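
    The alternate approximate sampling method proposed above can be sketched as follows; this is our reading of the abstract, not the authors' code, and the node layout, helper names and unit-cube test case are assumptions.

```python
# Sketch of the approximate corner-volume sampling idea described above. Reference
# coordinates (xi, eta, zeta) each run over [-1, 1]; "nodes" maps a sign triple to the
# corner's Cartesian coordinates.
import numpy as np

def corner_volumes(nodes):
    """Approximate volume for each corner: 1/8 of the parallelepiped spanned by the
    edges to the three directly connected corners."""
    vols = {}
    for signs, p in nodes.items():
        edges = []
        for axis in range(3):
            nb = list(signs)
            nb[axis] = -nb[axis]                      # flip one reference coordinate
            edges.append(nodes[tuple(nb)] - p)
        vols[signs] = abs(np.linalg.det(np.column_stack(edges))) / 8.0
    return vols

def sample_linear(a, b, u):
    """Draw t in [0, 1] from a linear pdf with (unnormalized) densities a at 0, b at 1."""
    if np.isclose(a, b):
        return u
    return (np.sqrt(a * a + u * (b * b - a * a)) - a) / (b - a)

def sample_position(nodes, rng):
    vols = corner_volumes(nodes)
    ref = []
    for axis in range(3):
        w_lo = sum(v for s, v in vols.items() if s[axis] == -1)
        w_hi = sum(v for s, v in vols.items() if s[axis] == +1)
        ref.append(2.0 * sample_linear(w_lo, w_hi, rng.random()) - 1.0)
    xi, eta, zeta = ref
    x = np.zeros(3)
    for (sx, sy, sz), p in nodes.items():             # trilinear shape functions
        x += 0.125 * (1 + sx * xi) * (1 + sy * eta) * (1 + sz * zeta) * p
    return x

# toy usage: a unit cube (equal corner volumes, so sampling reduces to uniform)
nodes = {(sx, sy, sz): np.array([(sx + 1) / 2, (sy + 1) / 2, (sz + 1) / 2])
         for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)}
rng = np.random.default_rng(0)
print(sample_position(nodes, rng))
```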

  17. Quantitative extraction of nucleotides from frozen muscle samples of Atlantic salmon (Salmo salar) and rainbow trout (Oncorhynchus mykiss): Effects of time taken to sample and extraction method

    DEFF Research Database (Denmark)

    Thomas, P.M.; Bremner, Allan; Pankhurst, N.W.

    2000-01-01

    time taken to sample, method 2 resulted in higher adenylate and lower IMP concentrations than method 1. These results indicate that method 2 is most effective in obtaining realistic nucleotide concentrations from fish muscle because it maintains the tissue temperature below the critical freeze zone, (-0...

  18. Comparison of soil sampling and analytical methods for asbestos at the Sumas Mountain Asbestos Site-Working towards a toolbox for better assessment.

    Science.gov (United States)

    Wroble, Julie; Frederick, Timothy; Frame, Alicia; Vallero, Daniel

    2017-01-01

    Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and difficulty in extrapolating soil concentrations to air concentrations. Environmental Protection Agency (EPA)'s Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or combination of methods, would yield more reliable soil asbestos data than other methods. Samples were collected using both traditional discrete ("grab") samples and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM) methods or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations that were not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating the results may be more variable than other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos resulted in no clear conclusions regarding preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than discrete samples.

  19. Comparison of chlorzoxazone one-sample methods to estimate CYP2E1 activity in humans

    DEFF Research Database (Denmark)

    Kramer, Iza; Dalhoff, Kim; Clemmesen, Jens O

    2003-01-01

    OBJECTIVE: Comparison of a one-sample with a multi-sample method (the metabolic fractional clearance) to estimate CYP2E1 activity in humans. METHODS: Healthy, male Caucasians (n=19) were included. The multi-sample fractional clearance (Cl(fe)) of chlorzoxazone was compared with one-time-point clearance estimation (Cl(est)) at 3, 4, 5 and 6 h. Furthermore, the metabolite/drug ratios (MRs) estimated from one-time-point samples at 1, 2, 3, 4, 5 and 6 h were compared with Cl(fe). RESULTS: The concordance between Cl(est) and Cl(fe) was highest at 6 h. The minimal mean prediction error (MPE) of Cl... The one-sample estimates, Cl(est) at 3 h or 6 h, and MR at 3 h, can serve as reliable markers of CYP2E1 activity. The one-sample clearance method is an accurate, renal function-independent measure of the intrinsic activity; it is simple to use and easily applicable to humans.

  20. Measurements of linear attenuation coefficients of irregular shaped samples by two media method

    International Nuclear Information System (INIS)

    Singh, Sukhpal; Kumar, Ashok; Thind, Kulwant Singh; Mudahar, Gurmel S.

    2008-01-01

    The linear attenuation coefficients of regular and irregular shaped flyash materials have been measured, without knowledge of the sample thickness, using a new technique, the 'two media method'. These values have also been measured with a standard gamma ray transmission method and obtained theoretically with the winXCOM computer code. The comparison shows that the two media method gives accurate attenuation coefficients for flyash materials
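
    The abstract does not give the working equations of the two media method. One plausible formulation, stated here as an assumption rather than the authors' derivation, takes a sample of unknown thickness t held in a container of fixed internal path length d that is filled in turn with two media of known attenuation coefficients:

```latex
% Hedged reconstruction (our assumption, not taken from the paper): transmission through
% a container of path length d holding a sample of unknown thickness t, filled in turn
% with media 1 and 2 of known attenuation coefficients mu_1 and mu_2.
\[
  \ln\frac{I_{0,1}}{I_1} = \mu_s t + \mu_1 (d - t), \qquad
  \ln\frac{I_{0,2}}{I_2} = \mu_s t + \mu_2 (d - t)
\]
% Subtracting the two equations eliminates mu_s and yields the unknown thickness,
\[
  d - t = \frac{\ln(I_{0,1}/I_1) - \ln(I_{0,2}/I_2)}{\mu_1 - \mu_2},
\]
% after which either transmission equation can be solved for the sample coefficient mu_s.
```

    Under this reading, the sample coefficient is obtained from two transmission measurements alone, which is consistent with the claim that no thickness measurement is needed.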

  1. A Proposal on the Advanced Sampling Based Sensitivity and Uncertainty Analysis Method for the Eigenvalue Uncertainty Analysis

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Song, Myung Sub; Shin, Chang Ho; Noh, Jae Man

    2014-01-01

    In using the perturbation theory, the uncertainty of the response can be estimated by a single transport simulation, and therefore it requires a small computational load. However, it has the disadvantage that the computation methodology must be modified whenever a different response type, such as multiplication factor, flux, or power distribution, is estimated. Hence, it is suitable for analyzing a few responses with many perturbed parameters. The statistical approach is a sampling-based method which uses cross sections randomly sampled from covariance data for analyzing the uncertainty of the response. XSUSA is a code based on the statistical approach. Only the cross sections are modified in the sampling-based method; thus, general transport codes can be directly utilized for the S/U analysis without any code modifications. However, to calculate the uncertainty distribution from the result, the code simulation must be repeated many times with randomly sampled cross sections, and this inefficiency is a known disadvantage of the stochastic method. In this study, to increase the estimation efficiency of the sampling-based S/U method, an advanced sampling and estimation method is proposed and verified. The main feature of the proposed method is that the cross section averaged over the individually sampled cross sections is used. The proposed method was validated using the perturbation theory
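
    A toy sketch of the sampling-based (statistical) approach described above: cross-section sets are drawn from covariance data, the response is evaluated for each set, and the spread of the responses gives the uncertainty. The three-group cross sections, the covariance matrix and the k_eff function below are placeholders standing in for real nuclear data and a real transport code.

```python
# Toy sketch of sampling-based S/U analysis (illustrative only).
import numpy as np

rng = np.random.default_rng(42)

sigma_mean = np.array([1.20, 0.35, 0.08])      # placeholder group cross sections
cov = np.diag([0.02, 0.01, 0.004]) ** 2        # placeholder covariance data

def k_eff(sigma):
    """Placeholder response: stands in for a full transport calculation of k-effective."""
    return 1.0 + 0.5 * sigma[0] - 0.8 * sigma[1] + 2.0 * sigma[2]

n_samples = 500
samples = rng.multivariate_normal(sigma_mean, cov, size=n_samples)
responses = np.array([k_eff(s) for s in samples])

print(f"mean k_eff = {responses.mean():.5f}, std (uncertainty) = {responses.std(ddof=1):.5f}")
```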

  2. Chromium speciation in environmental samples using a solid phase spectrophotometric method

    Science.gov (United States)

    Amin, Alaa S.; Kassem, Mohammed A.

    2012-10-01

    A solid phase extraction technique is proposed for preconcentration and speciation of chromium in natural waters using spectrophotometric analysis. The procedure is based on sorption of chromium(III) as the 4-(2-benzothiazolylazo)2,2'-biphenyldiol complex on a dextran-type anion-exchange gel (Sephadex DEAE A-25). After reduction of Cr(VI) by 0.5 ml of 96% concentrated H2SO4 and ethanol, the system was applied to total chromium. The concentration of Cr(VI) was calculated as the difference between the total Cr and the Cr(III) content. The influences of some analytical parameters such as pH of the aqueous solution, amount of 4-(2-benzothiazolylazo)2,2'-biphenyldiol (BTABD), and sample volume were investigated. The absorbance of the gel, packed in a 1.0 mm cell, is measured directly at 628 and 750 nm. The molar absorptivities were found to be 2.11 × 10^7 and 3.90 × 10^7 L mol^-1 cm^-1 for 500 and 1000 ml samples, respectively. Calibration is linear over the range 0.05-1.45 μg L^-1 with an RSD of <1.85% (n = 8). Using 35 mg of exchanger, the detection and quantification limits were 13 and 44 ng L^-1 for a 500 ml sample, whereas for a 1000 ml sample they were 8.0 and 27 ng L^-1, respectively. Increasing the sample volume can enhance the sensitivity. No considerable interferences from other investigated anions and cations have been observed on the chromium speciation. The proposed method was applied to the speciation of chromium in natural waters and total chromium preconcentration in microwave-digested tobacco, coffee, tea, and soil samples. The results were compared with those obtained using an ET AAS method, whereby the validity of the method was confirmed.

  3. Method validation for control determination of mercury in fresh fish and shrimp samples by solid sampling thermal decomposition/amalgamation atomic absorption spectrometry.

    Science.gov (United States)

    Torres, Daiane Placido; Martins-Teixeira, Maristela Braga; Cadore, Solange; Queiroz, Helena Müller

    2015-01-01

    A method for the determination of total mercury in fresh fish and shrimp samples by solid sampling thermal decomposition/amalgamation atomic absorption spectrometry (TDA AAS) has been validated following international foodstuff protocols in order to fulfill the Brazilian National Residue Control Plan. The experimental parameters had been previously studied and optimized according to specific legislation on validation and inorganic contaminants in foodstuff. Linearity, sensitivity, specificity, detection and quantification limits, precision (repeatability and within-laboratory reproducibility), robustness as well as accuracy of the method have been evaluated. Linearity of response was satisfactory for the two concentration ranges available on the TDA AAS equipment, between approximately 25.0 and 200.0 μg kg(-1) (quadratic regression) and 250.0 and 2000.0 μg kg(-1) (linear regression) of mercury. The residues for both ranges were homoscedastic and independent, with normal distribution. Correlation coefficients obtained for these ranges were higher than 0.995. The limits of quantification (LOQ) and of detection of the method (LDM), based on the signal standard deviation (SD) for a low-mercury sample, were 3.0 and 1.0 μg kg(-1), respectively. Repeatability of the method was better than 4%. Within-laboratory reproducibility achieved a relative SD better than 6%. Robustness of the method was evaluated and pointed to sample mass as a significant factor. Accuracy (assessed as analyte recovery) was calculated on the basis of the repeatability, and ranged from 89% to 99%. The results showed the suitability of the present method for direct mercury measurement in fresh fish and shrimp samples and the importance of monitoring the analysis conditions for food control purposes. Additionally, the competence of this method was recognized by accreditation under the standard ISO/IEC 17025.

  4. Determination of element concentrations in biological reference materials by solid sampling and other analytical methods

    International Nuclear Information System (INIS)

    Schauenburg, H.; Weigert, P.

    1992-01-01

    Using solid sampling with graphite furnace atomic absorption spectrometry (GFAAS), values for cadmium, copper, lead and zinc in six biological reference materials were obtained from up to four laboratories participating in three collaborative studies. These results are compared with those obtained with other methods used in routine analysis by official food control laboratories. Under certain conditions, solid sampling with GFAAS appears to be as suitable for routine analysis as conventional methods. (orig.)

  5. Rapid-Viability PCR Method for Detection of Live, Virulent Bacillus anthracis in Environmental Samples

    OpenAIRE

    Létant, Sonia E.; Murphy, Gloria A.; Alfaro, Teneile M.; Avila, Julie R.; Kane, Staci R.; Raber, Ellen; Bunt, Thomas M.; Shah, Sanjiv R.

    2011-01-01

    In the event of a biothreat agent release, hundreds of samples would need to be rapidly processed to characterize the extent of contamination and determine the efficacy of remediation activities. Current biological agent identification and viability determination methods are both labor- and time-intensive such that turnaround time for confirmed results is typically several days. In order to alleviate this issue, automated, high-throughput sample processing methods were developed in which real...

  6. A modified FOX-1 method for Micro-determination of hydrogen peroxide in honey samples.

    Science.gov (United States)

    Li, Dan; Wang, Meng; Cheng, Ni; Xue, Xiaofeng; Wu, Liming; Cao, Wei

    2017-12-15

    Hydrogen peroxide (H2O2) is a major antibacterial activity-associated biomarker in honey. Measurement of endogenous H2O2 in honey is of great value in predicting the H2O2-dependent antibacterial activity and in characterizing or selecting honey samples for use as an antibacterial agent or natural food preservative. Considering that current methods for H2O2 determination are either time-consuming or complicated and costly, a study was conducted to modify and validate the spectrophotometry-based ferrous oxidation-xylenol orange (FOX-1) method for micro-determination of H2O2 in honey samples. The results suggest that the proposed FOX-1 method is fast, sensitive, precise and repeatable. The method was successfully applied to the analysis of a total of 35 honey samples from 5 floral origins and 33 geographical origins. The proposed method is low-cost and easy to run, and it can be considered by researchers and industry for routine analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Extending the alias Monte Carlo sampling method to general distributions

    International Nuclear Information System (INIS)

    Edwards, A.L.; Rathkopf, J.A.; Smidt, R.K.

    1991-01-01

    The alias method is a Monte Carlo sampling technique that offers significant advantages over more traditional methods. It equals the accuracy of table lookup and the speed of equal probable bins. The original formulation of this method sampled from discrete distributions and was easily extended to histogram distributions. We have extended the method further to applications more germane to Monte Carlo particle transport codes: continuous distributions. This paper presents the alias method as originally derived and our extensions to simple continuous distributions represented by piecewise linear functions. We also present a method to interpolate accurately between distributions tabulated at points other than the point of interest. We present timing studies that demonstrate the method's increased efficiency over table lookup and show further speedup achieved through vectorization. 6 refs., 12 figs., 2 tabs
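
    For reference, the discrete alias method that the paper starts from can be sketched as below. This is a standard Vose-style construction, not the authors' code, and the extension to piecewise-linear continuous distributions described above is not reproduced here.

```python
# Sketch of the alias method for a discrete distribution: O(n) setup, O(1) per sample.
import random

def build_alias(probs):
    n = len(probs)
    scaled = [p * n for p in probs]
    alias, cutoff = [0] * n, [0.0] * n
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        cutoff[s], alias[s] = scaled[s], l          # bin s keeps prob cutoff[s], donates rest to l
        scaled[l] -= 1.0 - scaled[s]
        (small if scaled[l] < 1.0 else large).append(l)
    for i in large + small:                         # leftovers are numerically ~1
        cutoff[i] = 1.0
    return cutoff, alias

def sample_alias(cutoff, alias):
    i = random.randrange(len(cutoff))               # pick a bin uniformly
    return i if random.random() < cutoff[i] else alias[i]

# usage: sample a 4-outcome histogram distribution
cutoff, alias = build_alias([0.1, 0.2, 0.3, 0.4])
counts = [0] * 4
for _ in range(100000):
    counts[sample_alias(cutoff, alias)] += 1
print(counts)
```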

  8. A novel method for fission product noble gas sampling

    International Nuclear Information System (INIS)

    Jain, S.K.; Prakash, Vivek; Singh, G.K.; Vinay, Kr.; Awsthi, A.; Bihari, K.; Joyson, R.; Manu, K.; Gupta, Ashok

    2008-01-01

    Noble gases occur to some extent in the Earth's atmosphere, but the concentrations of all but argon are exceedingly low. Argon is plentiful, constituting almost 1% of the air. Fission product noble gases (FPNG) are produced by nuclear fission, and large amounts are generated in nuclear reactors. FPNG are beta-gamma emitters and contribute significantly to the public dose. During normal operation of a reactor the release of FPNG is negligible, but it increases in case of fuel failure. Xenon, a member of the FPNG family, helps in identifying fuel failure and its extent in PHWRs. For these reasons it becomes necessary to assess FPNG releases during operation of NPPs. The methodology presently used for FPNG assessment at almost all power stations is computer-based gamma-ray spectrometry, which identifies fission product noble gas nuclides through a peak search of the spectra. The air sample for this analysis is collected by the grab-sampling method, which has inherent disadvantages. An alternative method, which uses adsorption for the collection of air samples, was developed at Rajasthan Atomic Power Station (RAPS) 3 and 4 for the assessment of FPNG. This report presents details of the sampling method for FPNG and noble gases in different systems of a nuclear power plant. (author)

  9. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    Nishina Memorial Cyclotron Center (NMCC) has been open for nationwide common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At present, nearly 40 PIXE subjects in various research fields are pursued here, and more than 50,000 samples have been analyzed to date. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been continuously carried out. In particular, a 'standard-free method for quantitative analysis' made it possible to analyze infinitesimal samples, powdered samples and untreated bio samples, which could not be analyzed quantitatively well in the past. The 'standard-free method' and a 'powdered internal standard method' made the process of target preparation much easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, free from any ambiguity arising from complicated target preparation processes. (author)

  10. Evaluation of sample preparation methods and optimization of nickel determination in vegetable tissues

    Directory of Open Access Journals (Sweden)

    Rodrigo Fernando dos Santos Salazar

    2011-02-01

    Full Text Available Nickel, although essential to plants, may be toxic to plants and animals. It is mainly assimilated by food ingestion. However, information about the average levels of elements (including Ni) in edible vegetables from different regions is still scarce in Brazil. The objectives of this study were to: (a) evaluate and optimize a method for preparation of vegetable tissue samples for Ni determination; (b) optimize the analytical procedures for determination by Flame Atomic Absorption Spectrometry (FAAS) and by Electrothermal Atomic Absorption (ETAAS) in vegetable samples; and (c) determine the Ni concentration in vegetables consumed in the cities of Lorena and Taubaté in the Vale do Paraíba, State of São Paulo, Brazil. The results obtained by ETAAS or FAAS were validated by analyte addition and recovery tests. The most viable method tested for quantification of this element was HClO4-HNO3 wet digestion. All samples but the carrot tissue collected in Lorena contained Ni levels above those permitted by the Brazilian Ministry of Health. The most disturbing results, requiring more detailed studies, were the Ni concentrations measured in carrot samples from Taubaté, where levels were five times higher than permitted by Brazilian regulations.

  11. Reliability of different methods used for forming of working samples in the laboratory for seed testing

    Directory of Open Access Journals (Sweden)

    Opra Branislava

    2000-01-01

    Full Text Available The testing of seed quality starts from the moment a sample is formed in a warehouse during processing or packaging of the seed. Seed sampling, as the process of obtaining the working sample, also covers each step undertaken during testing in the laboratory. For the appropriate forming of a seed sample in the laboratory, the use of a seed divider is prescribed for large-seeded species (seed the size of wheat or larger; ISTA Rules, 1999). The aim of this paper was the comparison of different methods used for obtaining working samples of maize and wheat seed using conical, soil and centrifugal dividers. The number of seeds of added admixtures confirmed the reliability of working-sample formation. To each maize sample (1000 g), 10 seeds of each of the following admixtures were added: Zea mays L. (red pericarp), Hordeum vulgare L., Triticum aestivum L., and Glycine max (L.) Merr. Two methods were used for the formation of the maize seed working sample. To each wheat sample (1000 g), 10 seeds of each of the following species were added: Avena sativa (hulled seeds), Hordeum vulgare L., Galium tricorne Stokes, and Polygonum lapathifolium L. For the formation of wheat seed working samples four methods were used. An optimum of 9, but not fewer than 7, seeds of admixture were expected to be found in the maize seed working sample, while for wheat at least one seed of admixture was expected to be found in the working sample. The results confirmed that the formation of the maize seed working samples was most reliable when the centrifugal divider (the first method) was used (average admixture count 9.37). Of the observed admixtures, the seed of Triticum aestivum L. was the most uniformly distributed, also with the first method (6.93). The second method gave high average values satisfying the given criterion, but it should be used with prior homogenization of the sample being tested. The forming of wheat seed working samples is the most reliable if the

  12. Evaluation of various conventional methods for sampling weeds in potato and spinach crops

    Directory of Open Access Journals (Sweden)

    David Jamaica

    2014-04-01

    Full Text Available This study aimed to evaluate, at an exploratory level, some of the different conventional sampling designs in a section of a potato crop and in a commercial crop of spinach. Weeds were sampled in a 16 x 48 m section of a potato crop with a set grid of 192 sections. The cover and density of the weeds were recorded in squares of 0.25 to 64 m². The results were used to create a database that allowed for the simulation of different sampling designs: variables and square size. A second sampling was carried out with these results in a 1.16 ha spinach crop with a set grid of 6 x 6 m cells, evaluating the cover in 4 m² squares. Another database was created with this information, which was used to simulate other sampling designs, such as the distribution and quantity of sampling squares. According to the results obtained, a good approximation of the number of squares for diverse samples is 10-12 squares (4 m²) per hectare for richness and 18 or more squares per hectare for abundance. This square size is optimal since it allows for sampling a larger area without losing sight of low-profile species, and the cover variable best represents the abundance of the weeds.

  13. Preparation of samples for leaf architecture studies, a method for mounting cleared leaves.

    Science.gov (United States)

    Vasco, Alejandra; Thadeo, Marcela; Conover, Margaret; Daly, Douglas C

    2014-09-01

    Several recent waves of interest in leaf architecture have shown an expanding range of approaches and applications across a number of disciplines. Despite this increased interest, examination of existing archives of cleared and mounted leaves shows that current methods for mounting, in particular, yield unsatisfactory results and deterioration of samples over relatively short periods. Although techniques for clearing and staining leaves are numerous, published techniques for mounting leaves are scarce. • Here we present a complete protocol and recommendations for clearing, staining, and imaging leaves, and, most importantly, a method to permanently mount cleared leaves. • The mounting protocol is faster than other methods, inexpensive, and straightforward; moreover, it yields clear and permanent samples that can easily be imaged, scanned, and stored. Specimens mounted with this method preserve well, with leaves that were mounted more than 35 years ago showing no signs of bubbling or discoloration.

  14. Transformation-cost time-series method for analyzing irregularly sampled data

    Science.gov (United States)

    Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G. Baris; Kurths, Jürgen

    2015-06-01

    Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequence analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations—with associated costs—to transform the time series segments, we determine a new time series, that is our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allows us to test the stability of our method against noise and for different irregular samplings. In addition we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.

  16. Different methods for volatile sampling in mammals.

    Directory of Open Access Journals (Sweden)

    Marlen Kücklich

    Full Text Available Previous studies showed that olfactory cues are important for mammalian communication. However, many specific compounds that convey information between conspecifics are still unknown. To understand the mechanisms and functions of olfactory cues, olfactory signals such as volatile compounds emitted from individuals need to be assessed. Sampling of animals with and without scent glands was typically conducted using cotton swabs rubbed over the skin or fur and analysed by gas chromatography-mass spectrometry (GC-MS). However, this method has various drawbacks, including a high level of contamination. Thus, we adapted two methods of volatile sampling from other research fields and compared them to sampling with cotton swabs. To do so we assessed the body odor of common marmosets (Callithrix jacchus) using cotton swabs, thermal desorption (TD) tubes and, alternatively, a mobile GC-MS device containing a thermal desorption trap. Overall, TD tubes captured the most compounds (N = 113), with half of those compounds being volatile (N = 52). The mobile GC-MS captured the fewest compounds (N = 35), all of which were volatile. Cotton swabs contained an intermediate number of compounds (N = 55), but very few volatiles (N = 10). Almost all compounds found with the mobile GC-MS were also captured with TD tubes (94%). Hence, we recommend TD tubes for state-of-the-art sampling of body odor of mammals or other vertebrates, particularly for field studies, as they can be easily transported, stored and analysed with high-performance instruments in the lab. Nevertheless, cotton swabs capture compounds which may still contribute to the body odor, e.g. after bacterial fermentation, while profiles from the mobile GC-MS include only the most abundant volatiles of the body odor.

  17. Using mark-recapture distance sampling methods on line transect surveys

    Science.gov (United States)

    Burt, Louise M.; Borchers, David L.; Jenkins, Kurt J.; Marques, Tigao A

    2014-01-01

    Mark–recapture distance sampling (MRDS) methods are widely used for density and abundance estimation when the conventional DS assumption of certain detection at distance zero fails, as they allow detection at distance zero to be estimated and incorporated into the overall probability of detection to better estimate density and abundance. However, incorporating MR data in DS models raises survey and analysis issues not present in conventional DS. Conversely, incorporating DS assumptions in MR models raises issues not present in conventional MR. As a result, being familiar with either conventional DS methods or conventional MR methods does not on its own put practitioners in a good position to apply MRDS methods appropriately. This study explains the sometimes subtly different varieties of MRDS survey methods and the associated concepts underlying MRDS models. This is done as far as possible without giving mathematical details – in the hope that this will make the key concepts underlying the methods accessible to a wider audience than if we were to present the concepts via equations.

  18. Advanced Markov chain Monte Carlo methods learning from past samples

    CERN Document Server

    Liang, Faming; Carroll, Raymond J.

    2010-01-01

    This book provides comprehensive coverage of the simulation of complex systems using Monte Carlo methods. Developing algorithms that are immune to the local-trap problem has long been considered the most important topic in MCMC research. Various advanced MCMC algorithms that address this problem have been developed, including the modified Gibbs sampler, methods based on auxiliary variables, and methods making use of past samples. The focus of this book is on the algorithms that make use of past samples. This book includes the multicanonical algorithm, dynamic weighting, dynamically weight

  19. A random spatial sampling method in a rural developing nation

    Science.gov (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  20. Assessment of the effect of population and diary sampling methods on estimation of school-age children exposure to fine particles.

    Science.gov (United States)

    Che, W W; Frey, H Christopher; Lau, Alexis K H

    2014-12-01

    Population and diary sampling methods are employed in exposure models to sample simulated individuals and their daily activity on each simulation day. Different sampling methods may lead to variations in estimated human exposure. In this study, two population sampling methods (stratified-random and random-random) and three diary sampling methods (random resampling, diversity and autocorrelation, and Markov-chain cluster [MCC]) are evaluated. Their impacts on estimated children's exposure to ambient fine particulate matter (PM2.5 ) are quantified via case studies for children in Wake County, NC for July 2002. The estimated mean daily average exposure is 12.9 μg/m(3) for simulated children using the stratified population sampling method, and 12.2 μg/m(3) using the random sampling method. These minor differences are caused by the random sampling among ages within census tracts. Among the three diary sampling methods, there are differences in the estimated number of individuals with multiple days of exposures exceeding a benchmark of concern of 25 μg/m(3) due to differences in how multiday longitudinal diaries are estimated. The MCC method is relatively more conservative. In case studies evaluated here, the MCC method led to 10% higher estimation of the number of individuals with repeated exposures exceeding the benchmark. The comparisons help to identify and contrast the capabilities of each method and to offer insight regarding implications of method choice. Exposure simulation results are robust to the two population sampling methods evaluated, and are sensitive to the choice of method for simulating longitudinal diaries, particularly when analyzing results for specific microenvironments or for exposures exceeding a benchmark of concern. © 2014 Society for Risk Analysis.

  1. Multivariate Methods for Prediction of Geologic Sample Composition with Laser-Induced Breakdown Spectroscopy

    Science.gov (United States)

    Morris, Richard; Anderson, R.; Clegg, S. M.; Bell, J. F., III

    2010-01-01

    the CC ANN often gave results comparable to PLS. Averaging the spectra for each training sample and/or using feature selection to choose a small subset of wavelengths to use for predictions gave mixed results, with degraded performance in some cases and similar or slightly improved performance in other cases. However, training time was significantly reduced for both PLS and ANN methods by implementing feature selection, making this a potentially appealing method for initial, rapid-turn-around analyses necessary for Chemcam's tactical role on MSL. Choice of training samples has a strong influence on the accuracy of predictions. We are currently investigating the use of clustering algorithms (e.g. k-means, neural gas, etc.) to identify training sets that are spectrally similar to the unknown samples that are being predicted, and therefore result in improved predictions
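
    The PLS approach discussed above can be illustrated with scikit-learn's PLSRegression; the spectra, the single response variable and the number of components below are synthetic placeholders, not ChemCam data or the authors' models.

```python
# Hedged illustration: multivariate calibration of a composition value from spectra
# with partial least squares, evaluated by cross-validation. Data are synthetic stand-ins.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_samples, n_channels = 60, 2048
X = rng.normal(size=(n_samples, n_channels))          # stand-in for LIBS spectra
w = np.zeros(n_channels)
w[100:110] = 1.0                                      # a few "emission lines" carry the signal
y = X @ w + rng.normal(scale=0.5, size=n_samples)     # stand-in for wt% of one oxide

pls = PLSRegression(n_components=8)
scores = cross_val_score(pls, X, y, cv=5, scoring="neg_root_mean_squared_error")
print("cross-validated RMSE:", -scores.mean())

pls.fit(X, y)
predicted = pls.predict(X[:5])                        # predict composition of new spectra
```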

  2. Sampling methods for terrestrial amphibians and reptiles.

    Science.gov (United States)

    Paul Stephen Corn; R. Bruce. Bury

    1990-01-01

    Methods described for sampling amphibians and reptiles in Douglas-fir forests in the Pacific Northwest include pitfall trapping, time-constrained collecting, and surveys of coarse woody debris. The herpetofauna of this region differ in breeding and nonbreeding habitats and vagility, so that no single technique is sufficient for a community study. A combination of...

  3. Comparison of acid leachate and fusion methods to determine plutonium and americium in environmental samples

    International Nuclear Information System (INIS)

    Smith, L.L.; Markun, F.; TenKate, T.

    1992-06-01

    The Analytical Chemistry Laboratory at Argonne National Laboratory performs radiochemical analyses for a wide variety of sites within the Department of Energy complex. Since the chemical history of the samples may vary drastically from site to site, the effectiveness of any analytical technique may also vary. This study compares a potassium fluoride-pyrosulfate fusion technique with an acid leachate method. Both normal and high-fired soils and vegetation samples were analyzed for both americium and plutonium. Results show both methods work well, except for plutonium in high-fired soils. Here the fusion method provides higher accuracy

  4. Concentration comparison of selected constituents between groundwater samples collected within the Missouri River alluvial aquifer using purge and pump and grab-sampling methods, near the city of Independence, Missouri, 2013

    Science.gov (United States)

    Krempa, Heather M.

    2015-10-29

    The U.S. Geological Survey, in cooperation with the City of Independence, Missouri, Water Department, has historically collected water-quality samples using the purge and pump method (hereafter referred to as pump method) to identify potential contamination in groundwater supply wells within the Independence well field. If grab sample results are comparable to the pump method, grab samplers may reduce time, labor, and overall cost. This study was designed to compare constituent concentrations between samples collected within the Independence well field using the pump method and the grab method.

  5. A quick method based on SIMPLISMA-KPLS for simultaneously selecting outlier samples and informative samples for model standardization in near infrared spectroscopy

    Science.gov (United States)

    Li, Li-Na; Ma, Chang-Ming; Chang, Ming; Zhang, Ren-Cheng

    2017-12-01

    A novel method based on SIMPLe-to-use Interactive Self-modeling Mixture Analysis (SIMPLISMA) and Kernel Partial Least Squares (KPLS), named SIMPLISMA-KPLS, is proposed in this paper for the simultaneous selection of outlier samples and informative samples. It is a quick algorithm for model standardization (also called model transfer) in near infrared (NIR) spectroscopy. NIR data of corn samples, used for the analysis of protein content, are introduced to evaluate the proposed method. Piecewise direct standardization (PDS) is employed for model transfer, and SIMPLISMA-PDS-KPLS and KS-PDS-KPLS are compared in terms of the prediction accuracy of protein content and the calculation speed of each algorithm. The conclusions include that SIMPLISMA-KPLS can be used as an alternative sample selection method for model transfer. Although it has accuracy similar to Kennard-Stone (KS), it differs from KS in that it employs concentration information in the selection program. This ensures that analyte information is involved in the analysis and that the spectra (X) of the selected samples are interrelated with the concentration (y). It can also be used for simultaneous outlier sample elimination by validation of the calibration. According to the statistics on running time, the sample selection process is more rapid when using KPLS. The quick SIMPLISMA-KPLS algorithm is beneficial for improving the speed of online measurement using NIR spectroscopy.
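
    The abstract does not spell out the SIMPLISMA-KPLS steps, so the sketch below instead shows the Kennard-Stone (KS) comparator mentioned above, which selects samples from the spectra X alone; the synthetic spectra and subset size are placeholders.

```python
# Sketch of Kennard-Stone selection, the comparator method named above; unlike
# SIMPLISMA-KPLS it uses only the spectra X, not the concentrations y.
import numpy as np

def kennard_stone(X, n_select):
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # pairwise distances
    selected = list(np.unravel_index(np.argmax(d), d.shape))     # start with the two farthest samples
    remaining = [i for i in range(len(X)) if i not in selected]
    while len(selected) < n_select:
        # pick the remaining sample whose nearest selected sample is farthest away
        nearest = d[np.ix_(remaining, selected)].min(axis=1)
        pick = remaining[int(np.argmax(nearest))]
        selected.append(pick)
        remaining.remove(pick)
    return selected

# usage: choose 20 calibration-transfer samples from 80 NIR spectra (synthetic here)
rng = np.random.default_rng(3)
spectra = rng.normal(size=(80, 700))
subset = kennard_stone(spectra, 20)
```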

  6. Universal nucleic acids sample preparation method for cells, spores and their mixture

    Science.gov (United States)

    Bavykin, Sergei [Darien, IL

    2011-01-18

    The present invention relates to a method for extracting nucleic acids from biological samples. More specifically, the invention relates to a universal method for extracting nucleic acids from unidentified biological samples. An advantage of the presently invented method is its ability to effectively and efficiently extract nucleic acids from a variety of different cell types including, but not limited to, prokaryotic or eukaryotic cells and/or recalcitrant organisms (i.e., spores). Unlike prior art methods, which are focused on extracting nucleic acids from vegetative cells or spores, the present invention effectively extracts nucleic acids from spores, multiple cell types or mixtures thereof using a single method. Importantly, the invented method has demonstrated an ability to extract nucleic acids from spores and vegetative bacterial cells with similar levels of effectiveness. The invented method employs a multi-step protocol which erodes the cell structure of the biological sample, then isolates, labels and fragments the nucleic acids, and purifies the labeled samples from the excess of dye.

  7. Choosing a suitable sample size in descriptive sampling

    International Nuclear Information System (INIS)

    Lee, Yong Kyun; Choi, Dong Hoon; Cha, Kyung Joon

    2010-01-01

    Descriptive sampling (DS) is an alternative to crude Monte Carlo sampling (CMCS) in finding solutions to structural reliability problems. It is known to be an effective sampling method in approximating the distribution of a random variable because it uses the deterministic selection of sample values and their random permutation. However, because this method is difficult to apply to complex simulations, the sample size is occasionally determined without thorough consideration. Input sample variability may cause the sample size to change between runs, leading to poor simulation results. This paper proposes a numerical method for choosing a suitable sample size for use in DS. Using this method, one can estimate a more accurate probability of failure in a reliability problem while running a minimal number of simulations. The method is then applied to several examples and compared with CMCS and conventional DS to validate its usefulness and efficiency
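
    A minimal sketch of descriptive sampling for a single input variable, contrasted with crude Monte Carlo: the quantiles are placed deterministically at (i - 0.5)/n and then randomly permuted, as described above. The exponential variate and sample size are examples only, not taken from the paper.

```python
# Sketch: descriptive sampling (DS) versus crude Monte Carlo (CMCS) for one variable.
import numpy as np

rng = np.random.default_rng(11)
n = 200
lam = 2.0                                      # exponential rate, chosen as an example input

# crude Monte Carlo: fully random quantiles
u_cmcs = rng.random(n)
x_cmcs = -np.log(1.0 - u_cmcs) / lam

# descriptive sampling: deterministic quantiles (i - 0.5)/n, randomly permuted
u_ds = (np.arange(1, n + 1) - 0.5) / n
x_ds = -np.log(1.0 - rng.permutation(u_ds)) / lam

print("true mean 0.500, CMCS mean %.3f, DS mean %.3f" % (x_cmcs.mean(), x_ds.mean()))
```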

  8. Damage evolution analysis of coal samples under cyclic loading based on single-link cluster method

    Science.gov (United States)

    Zhang, Zhibo; Wang, Enyuan; Li, Nan; Li, Xuelong; Wang, Xiaoran; Li, Zhonghui

    2018-05-01

    In this paper, the acoustic emission (AE) response of coal samples under cyclic loading is measured. The results show that there is a good positive relation between AE parameters and stress. The AE signal of coal samples under cyclic loading exhibits an obvious Kaiser effect. The single-link cluster (SLC) method is applied to analyze the spatial evolution characteristics of AE events and the damage evolution process of coal samples. It is found that the subset scale of the SLC structure becomes smaller and smaller as the number of loading cycles increases, and there is a negative linear relationship between the subset scale and the degree of damage. The spatial correlation length ξ of the SLC structure is calculated. The results show that ξ fluctuates around a certain value from the second to the fifth loading cycle, but clearly increases in the sixth loading cycle. Based on the criterion of microcrack density, the coal sample failure process is a transformation from small-scale damage to large-scale damage, which is the reason for the changes in the spatial correlation length. Through this systematic analysis, the SLC method is shown to be an effective way to study the damage evolution of coal samples under cyclic loading, and it will provide important reference values for studying coal bursts.
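
    A hedged sketch of the single-link cluster step described above, assuming located AE events are available as (x, y, z) coordinates; the linking distance, event locations and function name are placeholders rather than the authors' settings.

```python
# Illustrative sketch (assumed workflow, not the paper's code): single-link clustering of
# located AE events, from which a subset scale such as the mean cluster size can be tracked
# across loading cycles.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def slc_subset_sizes(event_xyz, link_distance_mm=5.0):
    """event_xyz: (n, 3) AE source locations; returns the size of each single-link cluster."""
    Z = linkage(event_xyz, method="single")                   # single-link hierarchy
    labels = fcluster(Z, t=link_distance_mm, criterion="distance")
    return np.bincount(labels)[1:]                            # cluster sizes (labels start at 1)

# toy usage: events from one loading cycle inside a 50 mm sample
rng = np.random.default_rng(5)
events = rng.uniform(0, 50, size=(120, 3))
sizes = slc_subset_sizes(events)
print("number of clusters:", sizes.size, "mean subset scale:", sizes.mean())
```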

  9. Fluidics platform and method for sample preparation and analysis

    Science.gov (United States)

    Benner, W. Henry; Dzenitis, John M.; Bennet, William J.; Baker, Brian R.

    2014-08-19

    Herein provided are a fluidics platform and method for sample preparation and analysis. The fluidics platform is capable of analyzing DNA from blood samples using amplification assays such as polymerase-chain-reaction assays and loop-mediated-isothermal-amplification assays. The fluidics platform can also be used for other types of assays and analyses. In some embodiments, a sample in a sealed tube can be inserted directly. The subsequent isolation, detection, and analyses can be performed without a user's intervention. The disclosed platform may also comprise a sample preparation system with a magnetic actuator, a heater, and an air-drying mechanism, and fluid manipulation processes for extraction, washing, elution, assay assembly, assay detection, and cleaning after reactions and between samples.

  10. Sampling and examination methods used for TMI-2 samples

    International Nuclear Information System (INIS)

    Marley, A.W.; Akers, D.W.; McIsaac, C.V.

    1988-01-01

    The purpose of this paper is to summarize the sampling and examination techniques that were used in the collection and analysis of TMI-2 samples. Samples ranging from auxiliary building air to core debris were collected and analyzed. Handling of the larger samples and many of the smaller samples had to be done remotely and many standard laboratory analytical techniques were modified to accommodate the extremely high radiation fields associated with these samples. The TMI-2 samples presented unique problems with sampling and the laboratory analysis of prior molten fuel debris. 14 refs., 8 figs

  11. [Sample preparation methods for chromatographic analysis of organic components in atmospheric particulate matter].

    Science.gov (United States)

    Hao, Liang; Wu, Dapeng; Guan, Yafeng

    2014-09-01

    The determination of the organic composition of atmospheric particulate matter (PM) is of great importance in understanding how PM affects human health, environment, climate, and ecosystems. Organic components are also the scientific basis for emission source tracking, PM regulation and risk management. Therefore, the molecular characterization of the organic fraction of PM has become one of the priority research issues in the field of environmental analysis. Due to the extreme complexity of PM samples, chromatographic methods have been the principal choice. The common procedure for the analysis of organic components in PM includes several steps: sample collection on fiber filters, sample preparation (transforming the sample into a form suitable for chromatographic analysis), and analysis by chromatographic methods. Among these steps, the sample preparation methods largely determine the throughput and the data quality. Solvent extraction methods followed by sample pretreatment (e.g., pre-separation, derivatization, pre-concentration) have long been used for PM sample analysis, and thermal desorption methods have mainly focused on the analysis of non-polar organic components in PM. In this paper, the sample preparation methods prior to chromatographic analysis of organic components in PM are reviewed comprehensively, and the corresponding merits and limitations of each method are briefly discussed.

  12. A quantitative method to detect explosives and selected semivolatiles in soil samples by Fourier transform infrared spectroscopy

    International Nuclear Information System (INIS)

    Clapper-Gowdy, M.; Dermirgian, J.; Robitaille, G.

    1995-01-01

    This paper describes a novel Fourier transform infrared (FTIR) spectroscopic method that can be used to rapidly screen soil samples from potentially hazardous waste sites. Samples are heated in a thermal desorption unit and the resultant vapors are collected and analyzed in a long-path gas cell mounted in a FTIR. Laboratory analysis of a soil sample by FTIR takes approximately 10 minutes. This method has been developed to identify and quantify microgram concentrations of explosives in soil samples and is directly applicable to the detection of selected volatile organics, semivolatile organics, and pesticides

  13. BMAA extraction of cyanobacteria samples: which method to choose?

    Science.gov (United States)

    Lage, Sandra; Burian, Alfred; Rasmussen, Ulla; Costa, Pedro Reis; Annadotter, Heléne; Godhe, Anna; Rydberg, Sara

    2016-01-01

    β-N-Methylamino-L-alanine (BMAA), a neurotoxin reportedly produced by cyanobacteria, diatoms and dinoflagellates, is proposed to be linked to the development of neurological diseases. BMAA has been found in aquatic and terrestrial ecosystems worldwide, both in its phytoplankton producers and in several invertebrate and vertebrate organisms that bioaccumulate it. LC-MS/MS is the most frequently used analytical technique in BMAA research due to its high selectivity, though consensus is lacking as to the best extraction method to apply. This study accordingly surveys the efficiency of three extraction methods regularly used in BMAA research to extract BMAA from cyanobacteria samples. The results obtained provide insights into possible reasons for the BMAA concentration discrepancies in previous publications. In addition and according to the method validation guidelines for analysing cyanotoxins, the TCA protein precipitation method, followed by AQC derivatization and LC-MS/MS analysis, is now validated for extracting protein-bound (after protein hydrolysis) and free BMAA from cyanobacteria matrix. BMAA biological variability was also tested through the extraction of diatom and cyanobacteria species, revealing a high variance in BMAA levels (0.0080-2.5797 μg g(-1) DW).

  14. Sampling methods for amphibians in streams in the Pacific Northwest.

    Science.gov (United States)

    R. Bruce Bury; Paul Stephen. Corn

    1991-01-01

    Methods describing how to sample aquatic and semiaquatic amphibians in small streams and headwater habitats in the Pacific Northwest are presented. We developed a technique that samples 10-meter stretches of selected streams, which was adequate to detect presence or absence of amphibian species and provided sample sizes statistically sufficient to compare abundance of...

  15. Efficiency of Picture Description and Storytelling Methods in Language Sampling According to the Mean Length of Utterance Index

    Directory of Open Access Journals (Sweden)

    Salime Jafari

    2012-10-01

    Full Text Available Background and Aim: Due to the limitations of standardized tests for Persian speakers with language disorders, spontaneous language sampling is an important part of the language assessment protocol. Therefore, selection of a language sampling method that provides information on linguistic competence in a short time is important. In this study, we compared language samples elicited with picture description and storytelling methods in order to determine the effectiveness of the two methods. Methods: In this study, 30 first-grade elementary school girls were selected by simple sampling. To investigate the picture description method, we used two illustrated stories with four pictures each. Language samples were also collected through storytelling of a famous children's story. To determine the effectiveness of the two methods, two indices, duration of sampling and mean length of utterance (MLU), were compared. Results: There was no significant difference in MLU between the picture description and storytelling methods (p>0.05). However, the duration of sampling was shorter for the picture description method than for the storytelling method (p<0.05). Conclusion: The findings show that the two methods of picture description and storytelling have the same potential in language sampling. Since the picture description method can provide language samples of the same complexity in a shorter time than storytelling, it can be used as a beneficial method for clinical purposes.
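
    As a worked illustration of the kind of paired comparison described above, the sketch below runs paired t-tests on MLU and sampling duration for the two elicitation methods. The data are hypothetical, not the study's measurements, and the study's own statistical procedure is not reproduced here; Python with SciPy is assumed.

      import numpy as np
      from scipy import stats

      # Hypothetical paired measurements for the same 30 children (illustrative only).
      rng = np.random.default_rng(1)
      mlu_picture = rng.normal(4.2, 0.6, 30)
      mlu_story   = mlu_picture + rng.normal(0.0, 0.4, 30)   # similar MLU by design
      dur_picture = rng.normal(6.0, 1.0, 30)                  # minutes
      dur_story   = dur_picture + rng.normal(3.0, 1.0, 30)    # storytelling takes longer

      for label, a, b in [("MLU", mlu_picture, mlu_story),
                          ("duration", dur_picture, dur_story)]:
          t, p = stats.ttest_rel(a, b)       # paired t-test across the two methods
          print(f"{label}: t = {t:.2f}, p = {p:.3f}")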

  16. Evaluation of sampling methods for Bacillus spore-contaminated HVAC filters

    OpenAIRE

    Calfee, M. Worth; Rose, Laura J.; Tufts, Jenia; Morse, Stephen; Clayton, Matt; Touati, Abderrahmane; Griffin-Gatchalian, Nicole; Slone, Christina; McSweeney, Neal

    2013-01-01

    The objective of this study was to compare an extraction-based sampling method to two vacuum-based sampling methods (vacuum sock and 37 mm cassette filter) with regards to their ability to recover Bacillus atrophaeus spores (surrogate for Bacillus anthracis) from pleated heating, ventilation, and air conditioning (HVAC) filters that are typically found in commercial and residential buildings. Electrostatic and mechanical HVAC filters were tested, both without and after loading with dust to 50...

  17. Method of quantitative analysis of fluorine in environmental samples using a pure-Ge detector

    International Nuclear Information System (INIS)

    Sera, K.; Terasaki, K.; Saitoh, Y.; Itoh, J.; Futatsugawa, S.; Murao, S.; Sakurai, S.

    2004-01-01

    We recently developed and reported a three-detector measuring system making use of a pure-Ge detector combined with two Si(Li) detectors. The efficiency curve of the pure-Ge detector was determined as efficiencies relative to those of the existing Si(Li) detectors, and its accuracy was confirmed by analyzing a few samples whose elemental concentrations were known. It was found that detection of fluorine becomes possible by analyzing prompt γ-rays, and the detection limit was found to be less than 0.1 ppm for water samples. In this work, a method of quantitative analysis of fluorine has been established in order to investigate environmental contamination by fluorine. The method is based on the fact that both characteristic x-rays from many elements and the 110 keV prompt γ-rays from fluorine can be detected in the same spectrum. The present method is applied to analyses of a few environmental samples such as tea leaves, feed for domestic animals and human bone. The results are consistent with those obtained by other methods, and the present method is found to be quite useful and convenient for studies of regional pollution by fluorine. (author)

  18. Linear model correction: A method for transferring a near-infrared multivariate calibration model without standard samples

    Science.gov (United States)

    Liu, Yan; Cai, Wensheng; Shao, Xueguang

    2016-12-01

    Calibration transfer is essential for practical applications of near-infrared (NIR) spectroscopy because the measurements of the spectra may be performed on different instruments and the difference between the instruments must be corrected. For most calibration transfer methods, standard samples are necessary to construct the transfer model using the spectra of the samples measured on two instruments, referred to as the master and slave instruments, respectively. In this work, a method named linear model correction (LMC) is proposed for calibration transfer without standard samples. The method is based on the fact that, for samples with similar physical and chemical properties, the spectra measured on different instruments are linearly correlated. As a consequence, the coefficients of the linear models constructed from the spectra measured on different instruments are similar in profile. Therefore, by using a constrained optimization method, the coefficients of the master model can be transferred into those of the slave model with only a few spectra measured on the slave instrument. Two NIR datasets of corn and plant leaf samples measured with different instruments are used to test the performance of the method. The results show that, for both datasets, the spectra can be correctly predicted using the transferred partial least squares (PLS) models. Because standard samples are not necessary in the method, it may be more useful in practical applications.
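
    The gist of adapting a master model to a slave instrument with only a few slave spectra can be conveyed with a simple regularized least-squares sketch: the slave coefficients are pulled toward the master coefficients while fitting the available slave measurements. This is a minimal illustration of the general idea, not the LMC algorithm itself; the data and the regularization weight are hypothetical.

      import numpy as np

      def transfer_coefficients(b_master, X_slave, y_slave, lam=1.0):
          """Minimize ||y - X b||^2 + lam * ||b - b_master||^2 (closed-form solution)."""
          n_vars = X_slave.shape[1]
          A = X_slave.T @ X_slave + lam * np.eye(n_vars)
          rhs = X_slave.T @ y_slave + lam * b_master
          return np.linalg.solve(A, rhs)

      # Hypothetical example: 200-variable spectra, 5 transfer samples on the slave.
      rng = np.random.default_rng(0)
      b_master = rng.normal(0, 1, 200)
      X_slave = rng.normal(0, 1, (5, 200))
      y_slave = X_slave @ (b_master * 1.05) + rng.normal(0, 0.01, 5)  # slave differs slightly
      b_slave = transfer_coefficients(b_master, X_slave, y_slave, lam=10.0)
      print(np.corrcoef(b_master, b_slave)[0, 1])   # coefficient profiles remain similar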

  19. A passive guard for low thermal conductivity measurement of small samples by the hot plate method

    International Nuclear Information System (INIS)

    Jannot, Yves; Godefroy, Justine; Degiovanni, Alain; Grigorova-Moutiers, Veneta

    2017-01-01

    Hot plate methods under steady-state conditions are based on a 1D model to estimate the thermal conductivity, using measurements of the temperatures T0 and T1 of the two sides of the sample and of the heat flux crossing it. To be consistent with the hypothesis of 1D heat flux, either a guarded hot plate apparatus is used, or the temperature is measured at the centre of the sample. The latter method can be used only if the ratio thickness/width of the sample is sufficiently low, while the guarded hot plate method requires large-width samples (typical cross-section of 0.6 × 0.6 m²). That is why neither method can be used for small-width samples. The method presented in this paper is based on an optimal choice of the temperatures T0 and T1 compared to the ambient temperature Ta, enabling the estimation of the thermal conductivity with a centered hot plate method by applying the 1D heat flux model. It will be shown that these optimal values do not depend on the size or on the thermal conductivity of the samples (in the range 0.015–0.2 W m⁻¹ K⁻¹), but only on Ta. The experimental results obtained validate the method for several reference samples for values of the ratio thickness/width up to 0.3, thus enabling the measurement of the thermal conductivity of samples having a small cross-section, down to 0.045 × 0.045 m². (paper)
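
    The 1D steady-state relation underlying hot plate measurements reduces to dividing the heat flux density by the temperature gradient across the sample. The sketch below illustrates that relation with hypothetical numbers; it does not reproduce the optimal-temperature selection described in the paper.

      # Minimal sketch of the steady-state 1D hot plate relation (hypothetical values).
      def thermal_conductivity(heat_flux_w_per_m2, thickness_m, t_hot_c, t_cold_c):
          """lambda = phi * e / (T1 - T0) for 1D steady-state conduction through the sample."""
          return heat_flux_w_per_m2 * thickness_m / (t_hot_c - t_cold_c)

      print(thermal_conductivity(60.0, 0.02, 35.0, 15.0))  # ~0.06 W m^-1 K^-1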

  20. Development, validation and testing of a skin sampling method for assessment of metal exposure.

    Science.gov (United States)

    Erfani, Behnaz; Midander, Klara; Lidén, Carola; Julander, Anneli

    2017-07-01

    Nickel, cobalt and chromium are frequent skin sensitizers. Skin exposure results in eczema in sensitized individuals, the risk being related to the skin dose. To develop a self-sampling method for quantification of skin exposure to metals, to validate the method, and to assess its feasibility. Defined metal doses (0.01-5 µg) were applied to the fingers of 5 participants. Skin areas (2 cm²) were sampled with 1% HNO3, either as 0.1 ml on a swab, or as 0.5 ml on a wipe. Furthermore, 17 participants performed self-sampling by swab after 2 h of leisure activity. Samples were extracted in 1% HNO3 and analysed by inductively coupled plasma mass spectrometry. The sampling efficiency by swab was 46%, as compared with 93% for acid wipe sampling, for all tested doses. Most metal from the skin dose was detected in the first swab (33-43%). Despite lower sampling efficiency by swab, skin doses of metals following 2 h of leisure activity without hand washing were quantified in all participants, and ranged from 0.0016 to 0.15 µg/cm², from 0.00014 to 0.0020 µg/cm² and from 0.00048 to 0.027 µg/cm² for nickel, cobalt, and chromium, respectively. The results indicate a future potential of skin sampling by swab to detect and monitor metals on skin by self-sampling. This will contribute to better knowledge of metal skin exposure among dermatitis patients, workers, and the general population. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Sampling Methods and the Accredited Population in Athletic Training Education Research

    Science.gov (United States)

    Carr, W. David; Volberding, Jennifer

    2009-01-01

    Context: We describe methods of sampling the widely-studied, yet poorly defined, population of accredited athletic training education programs (ATEPs). Objective: There are two purposes to this study; first to describe the incidence and types of sampling methods used in athletic training education research, and second to clearly define the…

  2. Cadmium and lead determination by ICPMS: Method optimization and application in carabao milk samples

    Directory of Open Access Journals (Sweden)

    Riza A. Magbitang

    2012-06-01

    Full Text Available A method utilizing inductively coupled plasma mass spectrometry (ICPMS) as the element-selective detector with microwave-assisted nitric acid digestion as the sample pre-treatment technique was developed for the simultaneous determination of cadmium (Cd) and lead (Pb) in milk samples. The estimated detection limits were 0.09 µg kg⁻¹ and 0.33 µg kg⁻¹ for Cd and Pb, respectively. The method was linear in the concentration range 0.01 to 500 µg kg⁻¹ with correlation coefficients of 0.999 for both analytes. The method was validated using certified reference material BCR 150 and the determined values for Cd and Pb were 18.24 ± 0.18 µg kg⁻¹ and 807.57 ± 7.07 µg kg⁻¹, respectively. Further validation using another certified reference material, NIST 1643e, resulted in determined concentrations of 6.48 ± 0.10 µg L⁻¹ for Cd and 21.96 ± 0.87 µg L⁻¹ for Pb. These determined values agree well with the certified values in the reference materials. The method was applied to processed and raw carabao milk samples collected in Nueva Ecija, Philippines. The Cd levels determined in the samples were in the range 0.11 ± 0.07 to 5.17 ± 0.13 µg kg⁻¹ for the processed milk samples, and 0.11 ± 0.07 to 0.45 ± 0.09 µg kg⁻¹ for the raw milk samples. The concentrations of Pb were in the range 0.49 ± 0.21 to 5.82 ± 0.17 µg kg⁻¹ for the processed milk samples, and 0.72 ± 0.18 to 6.79 ± 0.20 µg kg⁻¹ for the raw milk samples.
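
    For illustration, a detection limit of the kind quoted above can be estimated from a calibration line and the scatter of blank measurements, here using the common 3σ/slope convention. The calibration data and blank standard deviation below are hypothetical, and the authors' exact procedure may differ.

      import numpy as np

      # Hypothetical calibration data (concentration in ug/kg vs instrument counts).
      conc   = np.array([0.0, 0.5, 1.0, 5.0, 10.0, 50.0])
      signal = np.array([12.0, 260.0, 515.0, 2540.0, 5070.0, 25300.0])

      slope, intercept = np.polyfit(conc, signal, 1)   # linear calibration fit
      blank_sd = 4.0                       # SD of repeated blank measurements (hypothetical)
      lod = 3 * blank_sd / slope           # common 3-sigma detection-limit convention
      print(f"slope = {slope:.1f} counts per ug/kg, LOD = {lod:.3f} ug/kg")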

  3. Method and apparatus for continuous sampling

    International Nuclear Information System (INIS)

    Marcussen, C.

    1982-01-01

    An apparatus and method for continuously sampling a pulverous material flow includes means for extracting a representative subflow from a pulverous material flow. A screw conveyor is provided to cause the extracted subflow to be pushed upwardly through a duct to an overflow. Means for transmitting a radiation beam transversely to the subflow in the duct, and means for sensing the transmitted beam through opposite pairs of windows in the duct are provided to measure the concentration of one or more constituents in the subflow. (author)

  4. Comparison of Relative Bias, Precision, and Efficiency of Sampling Methods for Natural Enemies of Soybean Aphid (Hemiptera: Aphididae).

    Science.gov (United States)

    Bannerman, J A; Costamagna, A C; McCornack, B P; Ragsdale, D W

    2015-06-01

    Generalist natural enemies play an important role in controlling soybean aphid, Aphis glycines (Hemiptera: Aphididae), in North America. Several sampling methods are used to monitor natural enemy populations in soybean, but there has been little work investigating their relative bias, precision, and efficiency. We compare five sampling methods: quadrats, whole-plant counts, sweep-netting, walking transects, and yellow sticky cards to determine the most practical methods for sampling the three most prominent species, which included Harmonia axyridis (Pallas), Coccinella septempunctata L. (Coleoptera: Coccinellidae), and Orius insidiosus (Say) (Hemiptera: Anthocoridae). We show an important time-by-sampling-method interaction, indicated by diverging community similarities within and between sampling methods as the growing season progressed. Similarly, correlations between sampling methods for the three most abundant species over multiple time periods indicated differences in relative bias between sampling methods and suggest that bias is not consistent throughout the growing season, particularly for sticky cards and whole-plant samples. Furthermore, we show that sticky cards produce strongly biased capture rates relative to the other four sampling methods. Precision and efficiency differed between sampling methods; sticky cards produced the most precise (but highly biased) results for adult natural enemies, while walking transects and whole-plant counts were the most efficient methods for detecting coccinellids and O. insidiosus, respectively. Based on bias, precision, and efficiency considerations, the most practical sampling methods for monitoring in soybean include walking transects for coccinellid detection and whole-plant counts for detection of small predators like O. insidiosus. Sweep-netting and quadrat samples are also useful for some applications, when efficiency is not paramount. © The Authors 2015. Published by Oxford University Press on behalf of
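
    A minimal way to express relative bias and precision for competing sampling methods is sketched below: each method's mean count is compared with a reference method, and precision is summarized as a coefficient of variation. The counts and the choice of reference are hypothetical and do not reproduce the study's analysis.

      import numpy as np

      # Hypothetical counts of one species per sample, by sampling method.
      counts = {
          "whole_plant": np.array([3, 5, 4, 6, 2, 5]),
          "sweep_net":   np.array([8, 12, 9, 15, 7, 11]),
          "sticky_card": np.array([20, 35, 18, 40, 25, 30]),
      }
      reference = "whole_plant"   # assume whole-plant counts approximate true density

      ref_mean = counts[reference].mean()
      for method, x in counts.items():
          rel_bias = (x.mean() - ref_mean) / ref_mean      # relative bias vs the reference method
          cv = x.std(ddof=1) / x.mean()                    # coefficient of variation (precision)
          print(f"{method:12s} mean={x.mean():5.1f}  rel_bias={rel_bias:+.2f}  CV={cv:.2f}")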

  5. Terrestrial gamma radiation baseline mapping using ultra low density sampling methods

    International Nuclear Information System (INIS)

    Kleinschmidt, R.; Watson, D.

    2016-01-01

    Baseline terrestrial gamma radiation maps are indispensable for providing basic reference information that may be used in assessing the impact of a radiation-related incident, performing epidemiological studies, remediating land contaminated with radioactive materials, assessment of land use applications and resource prospectivity. For a large land mass, such as Queensland, Australia (over 1.7 million km²), it is prohibitively expensive and practically difficult to undertake detailed in-situ radiometric surveys of this scale. It is proposed that an existing, ultra-low density sampling program already undertaken for the purpose of a nationwide soil survey project be utilised to develop a baseline terrestrial gamma radiation map. Geoelement data derived from the National Geochemistry Survey of Australia (NGSA) was used to construct a baseline terrestrial gamma air kerma rate map, delineated by major drainage catchments, for Queensland. Three drainage catchments (sampled at the catchment outlet) spanning low, medium and high radioelement concentrations were selected for validation of the methodology using radiometric techniques including in-situ measurements and soil sampling for high resolution gamma spectrometry, and comparative non-radiometric analysis. A Queensland mean terrestrial air kerma rate, as calculated from the NGSA outlet sediment uranium, thorium and potassium concentrations, of 49 ± 69 nGy h⁻¹ (n = 311, 3σ 99% confidence level) is proposed as being suitable for use as a generic terrestrial air kerma rate background range. Validation results indicate that catchment outlet measurements are representative of the range of results obtained across the catchment and that the NGSA geoelement data is suitable for calculation and mapping of terrestrial air kerma rate. - Highlights: • A baseline terrestrial air kerma map of Queensland, Australia was developed using geochemical data from a major drainage catchment ultra-low density sampling program
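
    The conversion from soil radioelement concentrations to a terrestrial air kerma rate can be sketched with widely used UNSCEAR-style dose-rate coefficients and activity conversion factors; these particular coefficients are assumptions made here for illustration and are not necessarily the ones used in the study.

      # Sketch: terrestrial air kerma rate from soil U, Th, K concentrations.
      # The conversion factors below are commonly quoted UNSCEAR-style values and
      # are assumptions in this sketch, not values taken from the report.
      def air_kerma_nGy_per_h(u_ppm, th_ppm, k_percent):
          u_bq  = 12.35 * u_ppm      # Bq/kg of U-238 per ppm U
          th_bq = 4.06  * th_ppm     # Bq/kg of Th-232 per ppm Th
          k_bq  = 313.0 * k_percent  # Bq/kg of K-40 per % K
          return 0.462 * u_bq + 0.604 * th_bq + 0.0417 * k_bq

      print(air_kerma_nGy_per_h(2.0, 8.0, 1.5))  # hypothetical catchment-outlet sample, ~50 nGy/h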

  6. Comparison of alkali fusion and acid digestion methods for radiochemical separation of Uranium from dietary samples

    International Nuclear Information System (INIS)

    Kamesh Viswanathan, B.; Arunachalam, Kantha D.; Sathesh Kumar, A.; Jayakrishana, K.; Shanmugamsundaram, H.; Rao, D.D.

    2014-01-01

    Several methods exist for the separation and measurement of uranium in dietary samples, such as neutron activation analysis (NAA), alpha spectrometric determination, inductively coupled plasma mass spectrometry (ICP-MS) and fluorimetry. For qualitative determination of activity, NAA and alpha spectrometry are considered superior for evaluating the isotopes of uranium (²³⁸U, ²³⁴U and ²³⁵U). In the case of alpha spectrometry, the samples have to undergo radiochemical analysis to separate uranium from other elements prior to detection. In our studies, uranium was determined in food matrices by acid digestion (AD) and alkali fusion (AF) methods. The recovery yields of uranium in food matrices were compared in order to obtain consistent yields. The average activity levels of ²³⁸U and ²³⁴U in food samples were calculated based on the recovery yield of ²³²U in the samples. The average recovery of ²³²U was 22 ± 8% for the AD method and 14.9 ± 1.3% for the AF method. The spread about the mean is larger for the AD method than for the AF method. The lowest recovery of ²³²U was found for the AF method; this is attributed to the interference of other elements in the sample during electroplating. Experimental results showed that uranium separation by the AD method gives better recovery than the AF method, whereas the consistency of the ²³²U recovery was better for the AF method. However, for both methods the recovery can be termed poor, and rigorous follow-up studies are needed to achieve consistently higher recoveries (>50%) in these types of biological samples. There are reports indicating satisfactory recoveries of around 80% with ²³²U as tracer in the food matrices
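
    The role of the ²³²U tracer can be illustrated with a short yield-correction sketch: the chemical recovery is estimated from the tracer peak and then used to correct the analyte activity. All numbers are hypothetical, and decay, blank and ingrowth corrections are ignored for brevity.

      # Sketch of tracer-based recovery correction in alpha spectrometry
      # (hypothetical numbers; simplified, ignores decay and blank corrections).
      def recovery_yield(net_counts_tracer, eff, count_time_s, tracer_added_bq):
          measured_bq = net_counts_tracer / (eff * count_time_s)
          return measured_bq / tracer_added_bq

      def corrected_activity(net_counts_analyte, eff, count_time_s, yield_frac, sample_mass_kg):
          # analyte activity concentration in Bq/kg, corrected for chemical recovery
          return net_counts_analyte / (eff * count_time_s * yield_frac * sample_mass_kg)

      y = recovery_yield(net_counts_tracer=180, eff=0.25, count_time_s=60000, tracer_added_bq=0.05)
      print(y)                                            # ~0.24, i.e. 24 % chemical recovery
      print(corrected_activity(90, 0.25, 60000, y, 0.5))  # Bq/kg of the analyte, hypothetical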

  7. Active Search on Carcasses versus Pitfall Traps: a Comparison of Sampling Methods.

    Science.gov (United States)

    Zanetti, N I; Camina, R; Visciarelli, E C; Centeno, N D

    2016-04-01

    The study of insect succession in cadavers and the classification of arthropods have mostly been done by placing a carcass in a cage, protected from vertebrate scavengers, which is then visited periodically. An alternative is to use specific traps. Few studies on carrion ecology and forensic entomology involving the carcasses of large vertebrates have employed pitfall traps. The aims of this study were to compare both sampling methods (active search on a carcass and pitfall trapping) for each coleopteran family, and to establish whether there is a discrepancy (underestimation and/or overestimation) in the presence of each family by either method. A great discrepancy was found for almost all families, with some of them being more abundant in samples obtained through active search on carcasses and others in samples from traps, whereas two families did not show any bias towards a given sampling method. The fact that families may be underestimated or overestimated by the type of sampling technique highlights the importance of combining both methods, active search on carcasses and pitfall traps, in order to obtain more complete information on decomposition, carrion habitat and cadaveric families or species. Furthermore, a hypothesis is advanced on the reasons why either sampling method underestimates, and thus shows bias against, certain families. Information is also provided on which sampling technique would be more appropriate to detect a particular family.

  8. A NEW METHOD FOR NON DESTRUCTIVE ESTIMATION OF Jc IN YBaCuO CERAMIC SAMPLES

    Directory of Open Access Journals (Sweden)

    Giancarlo Cordeiro Costa

    2014-12-01

    Full Text Available This work presents a new method for the estimation of Jc as a bulk characteristic of YBCO blocks. The experimental magnetic interaction force between a SmCo permanent magnet and a YBCO block was compared with finite element method (FEM) simulation results, allowing us to search for the best-fitting value of the critical current of the superconducting sample. As the FEM simulations were based on the Bean model, the critical current density was taken as an unknown parameter. This is a non-destructive estimation method, since there is no need to break off even a small piece of the sample for analysis.

  9. Perilymph sampling from the cochlear apex: a reliable method to obtain higher purity perilymph samples from scala tympani.

    Science.gov (United States)

    Salt, Alec N; Hale, Shane A; Plonkte, Stefan K R

    2006-05-15

    Measurements of drug levels in the fluids of the inner ear are required to establish kinetic parameters and to determine the influence of specific local delivery protocols. For most substances, this requires cochlear fluid samples to be obtained for analysis. When auditory function is of primary interest, the drug level in the perilymph of scala tympani (ST) is most relevant, since drug in this scala has ready access to the auditory sensory cells. In many prior studies, ST perilymph samples have been obtained from the basal turn, either by aspiration through the round window membrane (RWM) or through an opening in the bony wall. A number of studies have demonstrated that such samples are likely to be contaminated with cerebrospinal fluid (CSF). CSF enters the basal turn of ST through the cochlear aqueduct when the bony capsule is perforated or when fluid is aspirated. The degree of sample contamination has, however, not been widely appreciated. Recent studies have shown that perilymph samples taken through the round window membrane are highly contaminated with CSF, with samples greater than 2 µL in volume containing more CSF than perilymph. In spite of this knowledge, many groups continue to sample from the base of the cochlea, as it is a well-established method. We have developed an alternative, technically simple method to increase the proportion of ST perilymph in a fluid sample. The sample is taken from the apex of the cochlea, a site that is distant from the cochlear aqueduct. A previous problem with sampling through a perforation in the bone was that the native perilymph rapidly leaked out, driven by CSF pressure, and was lost to the middle ear space. We therefore developed a procedure to collect all the fluid that emerged from the apex after perforation. We evaluated the method using the marker ion trimethylphenylammonium (TMPA). TMPA was applied to the perilymph of guinea pigs either by RW irrigation or by microinjection into the apical turn. The

  10. A Mixed Methods Sampling Methodology for a Multisite Case Study

    Science.gov (United States)

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

  11. Materials and Methods for Streamlined Laboratory Analysis of Environmental Samples, FY 2016 Report

    Energy Technology Data Exchange (ETDEWEB)

    Addleman, Raymond S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Naes, Benjamin E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McNamara, Bruce K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Olsen, Khris B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chouyyok, Wilaiwan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Willingham, David G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Spigner, Angel C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-30

    The International Atomic Energy Agency (IAEA) relies upon laboratory analysis of environmental samples (typically referred to as “swipes”) collected during on-site inspections of safeguarded facilities to support the detection and deterrence of undeclared activities. Unfortunately, chemical processing and assay of the samples is slow and expensive. A rapid, effective, and simple extraction process and analysis method is needed to provide certified results with improved timeliness at reduced costs (principally in the form of reduced labor), while maintaining or improving sensitivity and efficacy. To address these safeguard needs the Pacific Northwest National Laboratory (PNNL) explored and demonstrated improved methods for environmental sample (ES) analysis. Improvements for both bulk and particle analysis were explored. To facilitate continuity and adoption, the new sampling materials and processing methods will be compatible with existing IAEA protocols for ES analysis. PNNL collaborated with Oak Ridge National Laboratory (ORNL), which performed independent validation of the new bulk analysis methods and compared performance to traditional IAEA’s Network of Analytical Laboratories (NWAL) protocol. ORNL efforts are reported separately. This report describes PNNL’s FY 2016 progress, which was focused on analytical application supporting environmental monitoring of uranium enrichment plants and nuclear fuel processing. In the future the technology could be applied to other safeguard applications and analytes related to fuel manufacturing, reprocessing, etc. PNNL’s FY 2016 efforts were broken into two tasks and a summary of progress, accomplishments and highlights are provided below. Principal progress and accomplishments on Task 1, Optimize Materials and Methods for ICP-MS Environmental Sample Analysis, are listed below. • Completed initial procedure for rapid uranium extraction from ES swipes based upon carbonate-peroxide chemistry (delivered to ORNL for

  12. Comparison of microstickies measurement methods. Part II, Results and discussion

    Science.gov (United States)

    Mahendra R. Doshi; Angeles Blanco; Carlos Negro; Concepcion Monte; Gilles M. Dorris; Carlos C. Castro; Axel Hamann; R. Daniel Haynes; Carl Houtman; Karen Scallon; Hans-Joachim Putz; Hans Johansson; R. A. Venditti; K. Copeland; H.-M. Chang

    2003-01-01

    In part I of the article we discussed sample preparation procedure and described various methods used for the measurement of microstickies. Some of the important features of different methods are highlighted in Table 1. Temperatures used in the measurement methods vary from room temperature in some cases, 45 °C to 65 °C in other cases. Sample size ranges from as low as...

  13. New sampling method in continuous energy Monte Carlo calculation for pebble bed reactors

    International Nuclear Information System (INIS)

    Murata, Isao; Takahashi, Akito; Mori, Takamasa; Nakagawa, Masayuki.

    1997-01-01

    A pebble bed reactor generally has double heterogeneity consisting of two kinds of spherical fuel element. In the core, there exist many fuel balls piled up randomly at a high packing fraction, and each fuel ball contains a large number of small fuel particles which are also distributed randomly. In this study, to realize precise neutron transport calculation of such reactors with the continuous energy Monte Carlo method, a new sampling method has been developed. The new method has been implemented in the general purpose Monte Carlo code MCNP to develop a modified version, MCNP-BALL. The method was validated by calculating the inventory of spherical fuel elements arranged successively by sampling during the transport calculation, and also by performing criticality calculations in ordered packing models. From the results, it was confirmed that the inventory of spherical fuel elements could be reproduced using MCNP-BALL within a sufficient accuracy of 0.2%. The comparison of criticality calculations in ordered packing models between MCNP-BALL and the reference method shows excellent agreement in neutron spectrum as well as multiplication factor. MCNP-BALL enables us to analyze pebble bed type cores such as PROTEUS precisely with the continuous energy Monte Carlo method. (author)
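
    The flavour of sampling randomly placed fuel particles inside a spherical fuel ball can be conveyed with a simple rejection-sampling sketch; this is an illustration only and is not the packing algorithm implemented in MCNP-BALL. Dimensions are hypothetical.

      import random, math

      # Sketch: rejection sampling of non-overlapping fuel-particle centres inside a
      # spherical fuel ball (illustrative only, not the MCNP-BALL algorithm).
      def sample_particle_centres(n_particles, ball_radius, particle_radius, max_tries=100000):
          centres = []
          tries = 0
          while len(centres) < n_particles and tries < max_tries:
              tries += 1
              # Uniform point in a sphere of radius (ball_radius - particle_radius)
              r = (ball_radius - particle_radius) * random.random() ** (1.0 / 3.0)
              theta = math.acos(2.0 * random.random() - 1.0)
              phi = 2.0 * math.pi * random.random()
              p = (r * math.sin(theta) * math.cos(phi),
                   r * math.sin(theta) * math.sin(phi),
                   r * math.cos(theta))
              # Reject if the candidate overlaps any previously accepted particle
              if all(math.dist(p, q) >= 2.0 * particle_radius for q in centres):
                  centres.append(p)
          return centres

      print(len(sample_particle_centres(500, 2.5, 0.05)))   # hypothetical dimensions (cm)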

  14. Towards a new method for the quantification of metabolites in the biological sample

    International Nuclear Information System (INIS)

    Neugnot, B.

    2005-03-01

    The quantification of metabolites is a key step in drug development. The aim of this Ph.D. work was to study the feasibility of a new method for this quantification, in the biological sample, without the drawbacks (cost, time, ethics) of the classical quantification methods based on metabolite synthesis or administration of the radiolabelled drug to man. Our strategy consists of determining the response factor, in mass spectrometry, of the metabolites. This approach is based on tritium labelling of the metabolites, ex vivo, by isotopic exchange. The labelling step was studied with deuterium. Metabolites of a model drug, recovered from in vitro or urinary samples, were labelled in three ways (Crabtree's catalyst ID2, deuterated trifluoroacetic acid or rhodium chloride ID20). The transposition to tritium labelling was then studied, and the first results are very promising for the ultimate validation of the method. (author)

  15. Some refinements on the comparison of areal sampling methods via simulation

    Science.gov (United States)

    Jeffrey Gove

    2017-01-01

    The design of forest inventories and development of new sampling methods useful in such inventories normally have a two-fold target of design unbiasedness and minimum variance in mind. Many considerations such as costs go into the choices of sampling method for operational and other levels of inventory. However, the variance in terms of meeting a specified level of...

  16. Methods of pre-concentration of radionuclides from large volume samples

    International Nuclear Information System (INIS)

    Olahova, K.; Matel, L.; Rosskopfova, O.

    2006-01-01

    The development of radioanalytical methods for low-level radionuclides in environmental samples is presented. In particular, emphasis is placed on the introduction of extraction chromatography as a tool for improving the quality of results as well as reducing the analysis time. However, the advantageous application of extraction chromatography often depends on the effective use of suitable preconcentration techniques, such as co-precipitation, to reduce the amount of matrix components which accompany the analytes of interest. On-going investigations in this field relevant to the determination of environmental levels of actinides and ⁹⁰Sr are discussed. (authors)

  17. DEFENSE WASTE PROCESSING FACILITY ANALYTICAL METHOD VERIFICATION FOR THE SLUDGE BATCH 5 QUALIFICATION SAMPLE

    International Nuclear Information System (INIS)

    Click, D; Tommy Edwards, T; Henry Ajo, H

    2008-01-01

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) performs confirmation of the applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room temperature HF-HNO3 acid dissolution (i.e., DWPF Cold Chem Method, see Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestion of Sludge Batch 5 (SB5) SRAT Receipt and SB5 SRAT Product samples. The SB5 SRAT Receipt and SB5 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB5 Batch composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 4 (SB4), to form the SB5 Blend composition. The results for any one particular element should not be used in any way to identify the form or speciation of a particular element in the sludge or used to estimate ratios of compounds in the sludge. A statistical comparison of the data validates the use of the DWPF CC method for SB5 Batch composition. However, the difficulty that was encountered in using the CC method for SB4 brings into question the adequacy of CC for the SB5 Blend. Also, it should be noted that visible solids remained in the final diluted solutions of all samples digested by this method at SRNL (8 samples total), which is typical for the DWPF CC method but not seen in the other methods. Recommendations to the DWPF for application to SB5 based on studies to date: (1) A dissolution study should be performed on the WAPS

  18. Developmental validation of a Nextera XT mitogenome Illumina MiSeq sequencing method for high-quality samples.

    Science.gov (United States)

    Peck, Michelle A; Sturk-Andreaggi, Kimberly; Thomas, Jacqueline T; Oliver, Robert S; Barritt-Ross, Suzanne; Marshall, Charla

    2018-05-01

    Generating mitochondrial genome (mitogenome) data from reference samples in a rapid and efficient manner is critical to harnessing the greater power of discrimination of the entire mitochondrial DNA (mtDNA) marker. The method of long-range target enrichment, Nextera XT library preparation, and Illumina sequencing on the MiSeq is a well-established technique for generating mitogenome data from high-quality samples. To this end, a validation was conducted for this mitogenome method processing up to 24 samples simultaneously along with analysis in the CLC Genomics Workbench and utilizing the AQME (AFDIL-QIAGEN mtDNA Expert) tool to generate forensic profiles. This validation followed the Federal Bureau of Investigation's Quality Assurance Standards (QAS) for forensic DNA testing laboratories and the Scientific Working Group on DNA Analysis Methods (SWGDAM) validation guidelines. The evaluation of control DNA, non-probative samples, blank controls, mixtures, and nonhuman samples demonstrated the validity of this method. Specifically, the sensitivity was established at ≥25 pg of nuclear DNA input for accurate mitogenome profile generation. Unreproducible low-level variants were observed in samples with low amplicon yields. Further, variant quality was shown to be a useful metric for identifying sequencing error and crosstalk. Success of this method was demonstrated with a variety of reference sample substrates and extract types. These studies further demonstrate the advantages of using NGS techniques by highlighting the quantitative nature of heteroplasmy detection. The results presented herein from more than 175 samples processed in ten sequencing runs, show this mitogenome sequencing method and analysis strategy to be valid for the generation of reference data. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Evaluation of surface sampling method performance for Bacillus Spores on clean and dirty outdoor surfaces.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, Mollye C.; Einfeld, Wayne; Boucher, Raymond M.; Brown, Gary Stephen; Tezak, Matthew Stephen

    2011-06-01

    Recovery of Bacillus atrophaeus spores from grime-treated and clean surfaces was measured in a controlled chamber study to assess sampling method performance. Outdoor surfaces investigated by wipe and vacuum sampling methods included stainless steel, glass, marble and concrete. Bacillus atrophaeus spores were used as a surrogate for Bacillus anthracis spores in this study designed to assess whether grime-coated surfaces significantly affected surface sampling method performance when compared to clean surfaces. A series of chamber tests were carried out in which known amounts of spores were allowed to gravitationally settle onto both clean and dirty surfaces. Reference coupons were co-located with test coupons in all chamber experiments to provide a quantitative measure of initial surface concentrations of spores on all surfaces, thereby allowing sampling recovery calculations. Results from these tests, carried out under both low and high humidity conditions, show that spore recovery from grime-coated surfaces is the same as or better than spore recovery from clean surfaces. Statistically significant differences between method performance for grime-coated and clean surfaces were observed in only about half of the chamber tests conducted.
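
    Recovery calculations of the kind enabled by the co-located reference coupons amount to dividing counts recovered from a test coupon by counts on its reference coupon, then comparing grime-coated and clean surfaces. The CFU values below are hypothetical, and a simple t-test stands in for whatever statistics the study actually used.

      import numpy as np
      from scipy import stats

      # Hypothetical CFU counts from co-located test and reference coupons.
      test_clean  = np.array([310, 280, 295, 330, 305])
      ref_clean   = np.array([400, 390, 410, 405, 395])
      test_grime  = np.array([325, 300, 315, 340, 310])
      ref_grime   = np.array([398, 402, 395, 407, 400])

      rec_clean = test_clean / ref_clean     # per-coupon recovery fractions, clean surfaces
      rec_grime = test_grime / ref_grime     # per-coupon recovery fractions, grime-coated surfaces

      t, p = stats.ttest_ind(rec_clean, rec_grime)
      print(f"clean: {rec_clean.mean():.2f}, grime: {rec_grime.mean():.2f}, p = {p:.3f}")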

  20. Sampling methods in archaeomagnetic dating: A comparison using case studies from Wörterberg, Eisenerz and Gams Valley (Austria)

    Science.gov (United States)

    Trapanese, A.; Batt, C. M.; Schnepp, E.

    The aim of this research was to review the relative merits of different methods of taking samples for archaeomagnetic dating. To allow different methods to be investigated, two archaeological structures and one modern fireplace were sampled in Austria. On each structure a variety of sampling methods were used: the tube and disc techniques of Clark et al. (Clark, A.J., Tarling, D.H., Noel, M., 1988. Developments in archaeomagnetic dating in Great Britain. Journal of Archaeological Science 15, 645-667), the drill core technique, the mould plastered hand block method of Thellier, and a modification of it. All samples were oriented with a magnetic compass and sun compass, where weather conditions allowed. Approximately 12 discs, tubes, drill cores or plaster hand blocks were collected from each structure, with one mould plaster hand block being collected and cut into specimens. The natural remanent magnetisation (NRM) of the samples was measured and stepwise alternating field (AF) or thermal demagnetisation was applied. Samples were measured either in the UK or in Austria, which allowed the comparison of results between magnetometers with different sensitivity. The tubes and plastered hand block specimens showed good agreement in directional results, and the samples obtained showed good stability. The discs proved to be unreliable as both NRM and the characteristic remanent magnetisation (ChRM) distribution were very scattered. The failure of some methods may be related to the suitability of the material sampled, for example if it was disturbed before sampling, had been insufficiently heated or did not contain appropriate magnetic minerals to retain a remanent magnetisation. Caution is also recommended for laboratory procedures as the cutting of poorly consolidated specimens may disturb the material and therefore the remanent magnetisation. Criteria and guidelines were established to aid researchers in selecting the most appropriate method for a particular

  1. Global metabolite analysis of yeast: evaluation of sample preparation methods

    DEFF Research Database (Denmark)

    Villas-Bôas, Silas Granato; Højer-Pedersen, Jesper; Åkesson, Mats Fredrik

    2005-01-01

    Sample preparation is considered one of the limiting steps in microbial metabolome analysis. Eukaryotes and prokaryotes behave very differently during the several steps of classical sample preparation methods for analysis of metabolites. Even within the eukaryote kingdom there is a vast diversity...

  2. Surface Sampling Collection and Culture Methods for Escherichia coli in Household Environments with High Fecal Contamination.

    Science.gov (United States)

    Exum, Natalie G; Kosek, Margaret N; Davis, Meghan F; Schwab, Kellogg J

    2017-08-22

    Empiric quantification of environmental fecal contamination is an important step toward understanding the impact that water, sanitation, and hygiene interventions have on reducing enteric infections. There is a need to standardize the methods used for surface sampling in field studies that examine fecal contamination in low-income settings. The dry cloth method presented in this manuscript improves upon the more commonly used swabbing technique that has been shown in the literature to have a low sampling efficiency. The recovery efficiency of a dry electrostatic cloth sampling method was evaluated using Escherichia coli and then applied to household surfaces in Iquitos, Peru, where there is high fecal contamination and enteric infection. Side-by-side measurements were taken from various floor locations within a household at the same time over a three-month period to compare for consistency of quantification of E. coli bacteria. The dry cloth sampling method in the laboratory setting showed 105% (95% Confidence Interval: 98%, 113%) E. coli recovery efficiency off of the cloths. The field application demonstrated strong agreement of side-by-side results (Pearson correlation coefficient for dirt surfaces was 0.83, with a coefficient of 0.53 for a second set of samples). The method can be utilized in households with high bacterial loads using either continuous (quantitative) or categorical (semi-quantitative) data. The standardization of this low-cost, dry electrostatic cloth sampling method can be used to measure differences between households in intervention and non-intervention arms of randomized trials.

  3. Utility of the microculture method for Leishmania detection in non-invasive samples obtained from a blood bank.

    Science.gov (United States)

    Ates, Sezen Canim; Bagirova, Malahat; Allahverdiyev, Adil M; Kocazeybek, Bekir; Kosan, Erdogan

    2013-10-01

    In recent years, donor blood has taken an important place in the epidemiology of leishmaniasis. According to the WHO, patients who become symptomatic represent only 5-20% of infected individuals, the remainder staying asymptomatic. In this study, for the detection of Leishmania infection in donor blood samples, 343 samples were obtained from the Capa Red Crescent Blood Center and primarily analyzed by microscopic and serological methods. Subsequently, the traditional culture (NNN), immunochromatographic test (ICT) and polymerase chain reaction (PCR) methods were applied to the 21 samples that were found positive by at least one method. Buffy coat (BC) samples from the 343 blood donors were analyzed: 15 (4.3%) were positive by a microculture method (MCM) and 4 (1.1%) by smear. Of the sera of these 343 samples, 9 (2.6%) were determined positive by ELISA and 7 (2%) by IFAT. Thus, 21 (6.1%) of the 343 subjects studied by smear, MCM, IFAT and ELISA techniques were identified as positive for leishmaniasis by at least one of the techniques, and the sensitivity of each method was assessed. According to our data, the sensitivities of the methods were MCM (71%), smear (19%), IFAT (33%), ELISA (42%), NNN (4%), PCR (14%) and ICT (4%). Thus, with this study, the sensitivity of a MCM was examined for the first time in blood donors by comparing MCM with the methods used in the diagnosis of leishmaniasis. As a result, MCM was found to be the most sensitive method for detection of Leishmania parasites in samples obtained from a blood bank. In addition, the presence of Leishmania parasites was detected in donor blood in Istanbul, a non-endemic region of Turkey, and these results are of vital importance for the health of blood recipients. Copyright © 2013 Elsevier B.V. All rights reserved.

  4. Comparison of four sampling methods for the detection of Salmonella in broiler litter.

    Science.gov (United States)

    Buhr, R J; Richardson, L J; Cason, J A; Cox, N A; Fairchild, B D

    2007-01-01

    Experiments were conducted to compare litter sampling methods for the detection of Salmonella. In experiment 1, chicks were challenged orally with a suspension of nalidixic acid-resistant Salmonella and wing banded, and additional nonchallenged chicks were placed into each of 2 challenge pens. Nonchallenged chicks were placed into each nonchallenge pen located adjacent to the challenge pens. At 7, 8, 10, and 11 wk of age the litter was sampled using 4 methods: fecal droppings, litter grab, drag swab, and sock. For the challenge pens, Salmonella-positive samples were detected in 3 of 16 fecal samples, 6 of 16 litter grab samples, 7 of 16 drag swab samples, and 7 of 16 sock samples. Samples from the nonchallenge pens were Salmonella positive in 2 of 16 litter grab samples, 9 of 16 drag swab samples, and 9 of 16 sock samples. In experiment 2, chicks were challenged with Salmonella, and the litter in the challenge and adjacent nonchallenge pens was sampled at 4, 6, and 8 wk of age with broilers remaining in all pens. For the challenge pens, Salmonella was detected in 10 of 36 fecal samples, 20 of 36 litter grab samples, 14 of 36 drag swab samples, and 26 of 36 sock samples. Samples from the adjacent nonchallenge pens were positive for Salmonella in 6 of 36 fecal droppings samples, 4 of 36 litter grab samples, 7 of 36 drag swab samples, and 19 of 36 sock samples. Sock samples had the highest rates of Salmonella detection. In experiment 3, the litter from a Salmonella-challenged flock was sampled at 7, 8, and 9 wk by socks and drag swabs. In addition, comparisons with drag swabs that were stepped on during sampling were made. Both socks (24 of 36, 67%) and drag swabs that were stepped on (25 of 36, 69%) showed significantly more Salmonella-positive samples than the traditional drag swab method (16 of 36, 44%). Drag swabs that were stepped on had comparable Salmonella detection level to that for socks. Litter sampling methods that incorporate stepping on the sample
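
    The detection rates reported for experiment 3 can be summarized with binomial confidence intervals, as sketched below using the Wilson score interval; this interval choice is an assumption for illustration and is not necessarily the analysis performed in the study.

      import math

      def wilson_ci(k, n, z=1.96):
          """95% Wilson score interval for a binomial proportion k/n."""
          p = k / n
          denom = 1 + z**2 / n
          centre = (p + z**2 / (2 * n)) / denom
          half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
          return centre - half, centre + half

      # Detection counts reported in experiment 3 of the abstract.
      for method, k in {"sock": 24, "drag swab (stepped on)": 25, "drag swab": 16}.items():
          lo, hi = wilson_ci(k, 36)
          print(f"{method:25s} {k}/36 = {k/36:.2f}  95% CI [{lo:.2f}, {hi:.2f}]")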

  5. Development of methods to measure hemoglobin adducts by gel electrophoresis - Preliminary results

    International Nuclear Information System (INIS)

    Sun, J.D.; McBride, S.M.

    1988-01-01

    Chemical adducts formed on blood hemoglobin may be a useful biomarker for assessing human exposures to these compounds. This paper reports preliminary results in the development of methods to measure such adducts that may be generally applicable to a wide variety of chemicals. Male F344/N rats were intraperitoneally injected with ¹⁴C-BaP dissolved in corn oil. Twenty-four hours later, the rats were sacrificed. Blood samples were collected and globin was isolated. Globin protein was then cleaved into peptide fragments using cyanogen bromide and the fragments separated using 2-dimensional gel electrophoresis. The results showed that the adducted ¹⁴C-globin fragments migrated to different areas of the gel than did unadducted fragments. Further research is being conducted to develop methods that will allow quantitation of separated adducted globin fragments from human blood samples without the use of a radiolabel. (author)

  6. Test of methods for retrospective activity size distribution determination from filter samples

    International Nuclear Information System (INIS)

    Meisenberg, Oliver; Tschiersch, Jochen

    2015-01-01

    Determining the activity size distribution of radioactive aerosol particles requires sophisticated and heavy equipment, which makes measurements at a large number of sites difficult and expensive. Therefore, three methods for retrospective determination of size distributions from aerosol filter samples in the laboratory were tested for their applicability. Extraction into a carrier liquid with subsequent nebulisation showed size distributions with a slight but correctable bias towards larger diameters compared with the original size distribution. Yields on the order of 1% could be achieved. Sonication-assisted extraction into a carrier liquid caused a coagulation mode to appear in the size distribution. Sonication-assisted extraction into the air did not show acceptable results due to small yields. The method of extraction into a carrier liquid without sonication was applied to aerosol samples from Chernobyl in order to calculate inhalation dose coefficients for ¹³⁷Cs based on the individual size distribution. The effective dose coefficient is about half of that calculated with a default reference size distribution. - Highlights: • Activity size distributions can be recovered after aerosol sampling on filters. • Extraction into a carrier liquid and subsequent nebulisation is appropriate. • This facilitates the determination of activity size distributions for individuals. • Size distributions from this method can be used for individual dose coefficients. • Dose coefficients were calculated for the workers at the new Chernobyl shelter

  7. A performance comparison of sampling methods in the assessment of species composition patterns and environment–vegetation relationships in species-rich grasslands

    OpenAIRE

    Grzegorz Swacha; Zoltán Botta-Dukát; Zygmunt Kącki; Daniel Pruchniewicz; Ludwik Żołnierz

    2017-01-01

    The influence that different sampling methods have on the results and the interpretation of vegetation analysis has been much debated, but little is yet known about how the spatial arrangement of samples affects patterns of species composition and environment–vegetation relationships within the same vegetation type. We compared three data sets of the same sample size obtained by three standard sampling methods: preferential, random, and systematic. These different sampling methods were applied...

  8. Application of the neutron activation analysis method to the multielemental determination of food samples

    International Nuclear Information System (INIS)

    Maihara, V.A.

    1985-01-01

    The thermal neutron activation analysis method was applied to the determination of elements present at low concentrations and trace levels in samples of bread and milk powder. The non-destructive analyses were based on gamma-ray spectrometric measurements of samples and standards irradiated for periods varying from a few minutes to eight hours in a thermal neutron flux of about 10¹² n cm⁻² s⁻¹. The concentrations obtained for milk powder were compared with data obtained by other authors from different countries. For bread, that comparison was not possible, because data on trace analysis in bread samples were not found. In addition, the results obtained for the various brands of bread and milk by means of non-destructive and destructive analyses were compared using Student's t criterion. Some basic considerations about the 'Detection Limit' were made, mainly in relation to its application to the technique used in the present work. The detection and determination limits of the trace elements analysed by destructive and non-destructive techniques in bread and milk powder samples were determined using the Currie and Girardi methods. The precision of the analyses and the results obtained for the detection limits of the analysed trace elements are discussed. (Author)
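
    Detection limits of the Currie type mentioned above are commonly computed from the background (blank) counts with the well-known expressions L_C ≈ 2.33√B and L_D ≈ 2.71 + 4.65√B. The sketch below applies these standard formulas to a hypothetical background count; the exact formulation used in the thesis may differ.

      import math

      # Sketch of Currie's detection limits from a blank/background count
      # (standard paired-blank formulation; results in counts, not concentration).
      def currie_limits(background_counts):
          b = background_counts
          critical_level = 2.33 * math.sqrt(b)           # L_C: decision threshold
          detection_limit = 2.71 + 4.65 * math.sqrt(b)   # L_D: a priori detection limit
          return critical_level, detection_limit

      lc, ld = currie_limits(400.0)     # hypothetical background of 400 counts
      print(f"L_C = {lc:.1f} counts, L_D = {ld:.1f} counts")
      # A concentration limit follows by dividing L_D by the counts obtained per unit
      # mass of the element from a standard irradiated and counted identically.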

  9. Evaluation of methods to sample fecal indicator bacteria in foreshore sand and pore water at freshwater beaches.

    Science.gov (United States)

    Vogel, Laura J; Edge, Thomas A; O'Carroll, Denis M; Solo-Gabriele, Helena M; Kushnir, Caitlin S E; Robinson, Clare E

    2017-09-15

    Fecal indicator bacteria (FIB) are known to accumulate in foreshore beach sand and pore water (referred to as foreshore reservoir) where they act as a non-point source for contaminating adjacent surface waters. While guidelines exist for sampling surface waters at recreational beaches, there is no widely-accepted method to collect sand/sediment or pore water samples for FIB enumeration. The effect of different sampling strategies in quantifying the abundance of FIB in the foreshore reservoir is unclear. Sampling was conducted at six freshwater beaches with different sand types to evaluate sampling methods for characterizing the abundance of E. coli in the foreshore reservoir as well as the partitioning of E. coli between different components in the foreshore reservoir (pore water, saturated sand, unsaturated sand). Methods were evaluated for collection of pore water (drive point, shovel, and careful excavation), unsaturated sand (top 1 cm, top 5 cm), and saturated sand (sediment core, shovel, and careful excavation). Ankle-depth surface water samples were also collected for comparison. Pore water sampled with a shovel resulted in the highest observed E. coli concentrations (only statistically significant at fine sand beaches) and lowest variability compared to other sampling methods. Collection of the top 1 cm of unsaturated sand resulted in higher and more variable concentrations than the top 5 cm of sand. There were no statistical differences in E. coli concentrations when using different methods to sample the saturated sand. Overall, the unsaturated sand had the highest amount of E. coli when compared to saturated sand and pore water (considered on a bulk volumetric basis). The findings presented will help determine the appropriate sampling strategy for characterizing FIB abundance in the foreshore reservoir as a means of predicting its potential impact on nearshore surface water quality and public health risk. Copyright © 2017 Elsevier Ltd. All rights

  10. On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods

    Science.gov (United States)

    Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.

    2003-01-01

    Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
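
    One standard way to exploit cheap sensitivity derivatives, illustrated below, is to use a first-order Taylor expansion of the output as a control variate whose mean is known exactly; this is a generic sketch of the idea with a toy model, not the specific scheme of the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy model output and its analytic derivative at the input mean (hypothetical example).
      f = lambda x: np.exp(0.3 * x) + 0.1 * x**2
      mu, sigma = 1.0, 0.5
      f_mu = f(mu)
      df_mu = 0.3 * np.exp(0.3 * mu) + 0.2 * mu      # derivative of f at mu

      x = rng.normal(mu, sigma, 10000)
      y = f(x)

      # Control variate: first-order Taylor expansion, whose exact mean is f(mu).
      c = f_mu + df_mu * (x - mu)
      beta = np.cov(y, c)[0, 1] / np.var(c, ddof=1)
      y_cv = y - beta * (c - f_mu)                    # same mean, reduced variance

      print("plain MC        :", y.mean(), y.std(ddof=1) / np.sqrt(len(y)))
      print("with derivatives:", y_cv.mean(), y_cv.std(ddof=1) / np.sqrt(len(y_cv)))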

  11. Usefulness of FTA® cards as a Pneumocystis-DNA extraction method in bronchoalveolar lavage samples.

    Science.gov (United States)

    Rodiño, Jenniffer M; Aguilar, Yudy A; Rueda, Zulma Vanessa; Vélez, Lázaro A

    2016-01-01

    FTA® cards (Fast Technology for Analysis of Nucleic Acids) are an alternative DNA extraction method in bronchoalveolar lavage (BAL) samples for Pneumocystis jirovecii molecular analyses. The goal was to evaluate the usefulness of FTA® cards to detect P. jirovecii-DNA by PCR in BAL samples compared to silica adsorption chromatography (SAC). This study used 134 BAL samples from immunocompromised patients previously studied to establish microbiological aetiology of pneumonia, among them 15 cases of Pneumocystis pneumonia (PCP) documented by staining and 119 with other alternative diagnoses. The FTA® system and SAC were used for DNA extraction and then amplified by nested PCR to detect P. jirovecii. Performance and concordance of the two DNA extraction methods compared to P. jirovecii microscopy were calculated. The influence of the macroscopic characteristics, transportation of samples and the duration of the FTA® card storage (1, 7, 10 or 12 months) were also evaluated. Among 134 BAL samples, 56% were positive for P. jirovecii-DNA by SAC and 27% by FTA®. All 15 diagnosed by microscopy were detected by FTA® and SAC. Specificity of the FTA® system and SAC were 82.4% and 49.6%, respectively. Compared to SAC, positivity by FTA® decreased with the presence of blood in BAL (62% vs 13.5%). The agreement between samples at 7, 10 and 12 months was 92.5% for FTA®. Positive cases by FTA® remained the same after shipment by mail. Results suggest that FTA® is a practical, safe and economical method to preserve P. jirovecii-DNA in BAL samples for molecular studies.

  12. Evaluation of biological samples for specimen banking and biomonitoring by nuclear methods

    International Nuclear Information System (INIS)

    Stone, S.F.; Zeisler, R.

    1984-01-01

    In a pilot program for environmental specimen banking, human livers and marine mussels (Mytilus edulis) were sampled, analyzed and banked. Nuclear methods played a major role in the evaluation of the samples by providing concentration data for up to 37 major, mineral, and trace elements. Instrumental neutron activation analysis was complemented by neutron-capture prompt gamma activation analysis, radiochemical separations and, for the mussels, by instrumental X-ray fluorescence analysis. A cryogenic homogenization procedure was applied for sample preparation and evaluated. Assessment of accuracy was made by analyzing Standard Reference Materials and by intercomparing the techniques. Results are reported for 66 individual human liver specimens, collected at three locations in the United States, and for batches of 65 mussels from a collection made at Narragansett Bay, RI. 19 references, 23 figures, 4 tables

  13. Methods for efficient analysis of tocopherols, tocotrienols and their metabolites in animal samples with HPLC-EC

    Directory of Open Access Journals (Sweden)

    Mao-Jung Lee

    2018-01-01

    Tocopherols and tocotrienols, collectively known as vitamin E, have received a great deal of attention because of their interesting biological activities. In the present study, we reexamined and improved previous methods of sample preparation and the conditions of high-performance liquid chromatography for more accurate quantification of tocopherols, tocotrienols and their major chain-degradation metabolites. For the analysis of serum tocopherols/tocotrienols, we reconfirmed our method of mixing serum with ethanol followed by hexane extraction. For the analysis of tissue samples, we improved our methods by extracting tocopherols/tocotrienols directly from tissue homogenate with hexane. For the analysis of total amounts (conjugated and unconjugated forms) of side-chain degradation metabolites, the samples need to be deconjugated by incubating with β-glucuronidase and sulfatase; serum samples can be directly used for the incubation, whereas for tissue homogenates a pre-deproteination step is needed. The present methods are sensitive, convenient and are suitable for the determination of different forms of vitamin E and their metabolites in animal and human studies. Results from the analysis of serum, liver, kidney, lung and urine samples from mice that had been treated with mixtures of tocotrienols and tocopherols are presented as examples.

  14. Analytical results from Tank 38H criticality Sample HTF-093

    International Nuclear Information System (INIS)

    Wilmarth, W.R.

    2000-01-01

    Resumption of processing in the 242-16H Evaporator could cause salt dissolution in the Waste Concentration Receipt Tank (Tank 38H). Therefore, High Level Waste personnel sampled the tank at the salt surface. Results of elemental analysis of the dried sludge solids from this sample (HTF-093) show significant quantities of neutron poisons (i.e., sodium, iron, and manganese) present to mitigate the potential for nuclear criticality. Comparison of this sample with the previous chemical and radiometric analyses of H-Area Evaporator samples shows high poison to actinide ratios

  15. Soil Particle Size Analysis by Laser Diffractometry: Result Comparison with Pipette Method

    Science.gov (United States)

    Šinkovičová, Miroslava; Igaz, Dušan; Kondrlová, Elena; Jarošová, Miriam

    2017-10-01

    Soil texture, as a basic soil physical property, provides fundamental information on the soil grain size distribution and the representation of individual grain size fractions. Several methods of particle size measurement, based on different physical principles, are currently available. The pipette method, based on the different sedimentation velocities of particles with different diameters, is considered one of the standard methods for determining the distribution of individual grain size fractions. Following technical advancements, optical methods such as laser diffraction can nowadays also be used to determine the grain size distribution in soil. A review of domestic and international literature on this topic shows that results obtained by laser diffractometry do not correspond with those obtained by the pipette method. The main aim of this paper was to analyse 132 samples of medium-fine soil, taken from the Nitra River catchment in Slovakia at depths of 15-20 cm and 40-45 cm, using two laser analysers: ANALYSETTE 22 MicroTec plus (Fritsch GmbH) and Mastersizer 2000 (Malvern Instruments Ltd). The results obtained by laser diffractometry were compared with the pipette method, and regression relationships using linear, exponential, power and polynomial trends were derived. The regressions with the three highest coefficients of determination (R2) were further investigated; the tightest fit was observed for the polynomial regression. In view of the results obtained, we recommend using a regression-based estimate of the clay fraction representation when the analysis is done by laser diffractometry. The advantages of the laser diffraction method include the short analysis time, the small sample amount required, its applicability to various grain size fractions and soil type classification systems, and the wide range of fractions determined. Therefore, it is necessary to focus on this issue further.

  16. Method for Measuring Thermal Conductivity of Small Samples Having Very Low Thermal Conductivity

    Science.gov (United States)

    Miller, Robert A.; Kuczmarski, Maria a.

    2009-01-01

    This paper describes the development of a hot plate method capable of using air as a standard reference material for the steady-state measurement of the thermal conductivity of very small test samples having thermal conductivity on the order of air. As with other approaches, care is taken to ensure that the heat flow through the test sample is essentially one-dimensional. However, unlike other approaches, no attempt is made to use heated guards to block the flow of heat from the hot plate to the surroundings. It is argued that since large correction factors must be applied to account for guard imperfections when sample dimensions are small, it may be preferable to simply measure and correct for the heat that flows from the heater disc to directions other than into the sample. Experimental measurements taken in a prototype apparatus, combined with extensive computational modeling of the heat transfer in the apparatus, show that sufficiently accurate measurements can be obtained to allow determination of the thermal conductivity of low thermal conductivity materials. Suggestions are made for further improvements in the method based on results from regression analyses of the generated data.
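
    The record above argues for measuring, rather than guarding against, the heat that leaves the heater disc in directions other than into the sample. As a hedged illustration of the resulting data reduction, the sketch below applies Fourier's law to invented numbers after subtracting an assumed parasitic heat flow; the values and variable names are placeholders, not measurements from the paper.

        # Illustrative numbers only: steady-state hot-plate reduction in which the
        # parasitic heat flow is measured/modeled and subtracted rather than guarded out.
        Q_heater = 0.050        # total electrical power supplied to the heater disc, W
        Q_parasitic = 0.032     # heat estimated to flow to the surroundings, W
        A = 1.0e-4              # sample cross-sectional area, m^2 (10 mm x 10 mm)
        L = 2.0e-3              # sample thickness, m
        dT = 10.0               # temperature difference across the sample, K

        Q_sample = Q_heater - Q_parasitic      # one-dimensional heat flow through the sample
        k = Q_sample * L / (A * dT)            # Fourier's law, W/(m*K)
        print(f"estimated thermal conductivity: {k:.3f} W/(m*K)")  # ~0.036, close to air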

  17. Evaluation of sampling methods for measuring exposure to volatile inorganic acids in workplace air. Part 2: Sampling capacity and breakthrough tests for sodium carbonate-impregnated filters.

    Science.gov (United States)

    Demange, Martine; Oury, Véronique; Rousset, Davy

    2011-11-01

    In France, the MétroPol 009 method used to measure workplace exposure to inorganic acids, such as HF, HCl, and HNO3, consists of a closed-face cassette fitted with a prefilter to collect particles, and two sodium carbonate-impregnated filters to collect acid vapor. This method was compared with other European methods during the development of a three-part standard (ISO 21438) on the determination of inorganic acids in workplace air by ion chromatography. Results of this work, presented in a companion paper, led to a need to go deeper into the performance of the MétroPol 009 method regarding evaluation of the breakthrough of the acids, both alone and in mixtures, interference from particulate salts, the amount of sodium carbonate required to impregnate the sampling filter, the influence of sampler components, and so on. Results enabled improvements to be made to the sampling device with respect to the required amount of sodium carbonate to sample high HCl or HNO3 concentrations (500 μL of 5% Na2CO3 on each of two impregnated filters). In addition, a PVC-A filter used as a prefilter in a sampling device showed a propensity to retain HNO3 vapor so a PTFE filter was considered more suitable for use as a prefilter. Neither the material of the sampling cassette (polystyrene or polypropylene) nor the sampling flowrate (1 L/min or 2 L/min) influenced the performance of the sampling device, as a recovery of about 100% was achieved in all experiments for HNO3, HCl, and HF, as well as HNO3+HF and HNO3+HCl mixtures, over a wide range of concentrations. However, this work points to the possibility of interference between an acid and salts of other acids. For instance, interference can occur through interaction of HNO3 with chloride salts: the stronger the acid, the greater the interference. Methods based on impregnated filters are reliable for quantitative recovery of inorganic volatile acids in workplace atmosphere but are valuable only in the absence of interferents.

  18. [DOE method for evaluating environmental and waste management samples: Revision 1, Addendum 1

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S.C.

    1995-04-01

    The US Department of Energy's (DOE's) environmental and waste management (EM) sampling and analysis activities require that large numbers of samples be analyzed for materials characterization, environmental surveillance, and site-remediation programs. The present document, DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), is a supplemental resource for analyzing many of these samples.

  19. [DOE method for evaluating environmental and waste management samples: Revision 1, Addendum 1

    International Nuclear Information System (INIS)

    Goheen, S.C.

    1995-04-01

    The US Department of Energy's (DOE's) environmental and waste management (EM) sampling and analysis activities require that large numbers of samples be analyzed for materials characterization, environmental surveillance, and site-remediation programs. The present document, DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), is a supplemental resource for analyzing many of these samples

  20. Multiple predictor smoothing methods for sensitivity analysis: Example results

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described in the first part of this presentation: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. In this, the second and concluding part of the presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
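
    As a minimal illustration of how a nonparametric smoother can rank inputs in a sampling-based sensitivity analysis, the sketch below fits a LOESS curve of the output against each input separately and reports the share of output variance the smooth explains. It uses the lowess function from statsmodels on an invented toy model; the stepwise multi-predictor procedures described in the record (additive models, projection pursuit, recursive partitioning) are not reproduced here.

        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        rng = np.random.default_rng(1)

        # Toy sampling-based sensitivity study: y depends nonlinearly on x1, weakly on x2.
        n = 500
        x1 = rng.uniform(-1, 1, n)
        x2 = rng.uniform(-1, 1, n)
        y = np.sin(np.pi * x1) ** 2 + 0.1 * x2 + rng.normal(0, 0.05, n)

        def variance_explained(x, y):
            """Share of output variance captured by a LOESS fit of y on a single input."""
            fit = lowess(y, x, frac=0.3, return_sorted=False)  # fitted values at the samples
            return 1.0 - np.var(y - fit) / np.var(y)

        print("x1:", variance_explained(x1, y))  # large: strong nonlinear effect
        print("x2:", variance_explained(x2, y))  # small: weak linear effect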

  1. Recent results of the investigation of a micro-fluidic sampling chip and sampling system for hot cell aqueous processing streams

    International Nuclear Information System (INIS)

    Tripp, J.; Smith, T.; Law, J.

    2013-01-01

    A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially sampling technologies were evaluated and micro-fluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. Different sampling volumes have been tested. It appears that the 10 μl volume has produced data that had much smaller relative standard deviations than the 2 μl volume. In addition, the production of a less expensive, mass produced sampling chip was investigated to avoid chip reuse thus increasing sampling reproducibility/accuracy. The micro-fluidic-based robotic sampling system's mechanical elements were tested to ensure analytical reproducibility and the optimum robotic handling of micro-fluidic sampling chips. (authors)

  2. Method of separate determination of high-ohmic sample resistance and contact resistance

    Directory of Open Access Journals (Sweden)

    Vadim A. Golubiatnikov

    2015-09-01

    A method for the separate determination of the volume resistance and the contact resistance of a two-pole sample is suggested. The method is applicable to high-ohmic semiconductor samples: semi-insulating gallium arsenide, detector-grade cadmium-zinc telluride (CZT), etc. It is based on illuminating the near-contact region with monochromatic radiation of variable intensity from light-emitting diodes whose quantum energies exceed the band gap of the material. The dependence of the sample photocurrent on the light-emitting-diode current is measured and its linear portion identified. Extrapolating this linear portion to the Y-axis gives the cut-off current. Since the bias voltage is known, the sample volume resistance is easily calculated; then, using the dark current value, one can determine the total contact resistance. The method was tested on n-type semi-insulating GaAs. The contact resistance was found to be approximately equal to the sample volume resistance. Thus, the influence of the contacts must be taken into account when electrophysical data are analyzed.
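
    The arithmetic implied by the description above is straightforward: fit the linear portion of the photocurrent versus LED-current curve, take its y-intercept as the cut-off current, divide the bias voltage by it to get the volume resistance, and subtract that from the dark resistance to get the contact resistance. The sketch below walks through those steps with invented numbers; it is an illustration of the described procedure, not data from the paper.

        import numpy as np

        # Hypothetical measurement: sample photocurrent (A) vs LED drive current (mA)
        # at a fixed bias voltage; the last points form the linear portion.
        led_current = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])           # mA
        photo_current = np.array([0.8, 1.3, 1.75, 2.0, 2.2, 2.4]) * 1e-6   # A

        V_bias = 10.0      # V, known bias voltage
        I_dark = 0.5e-6    # A, dark current of the same sample

        # Fit only the linear portion (here: the last four points), extrapolate to zero LED current.
        slope, intercept = np.polyfit(led_current[2:], photo_current[2:], 1)
        I_cutoff = intercept                      # y-axis intercept = "cut-off" current

        R_volume = V_bias / I_cutoff              # bulk (volume) resistance of the sample
        R_total = V_bias / I_dark                 # dark resistance = volume + contacts in series
        R_contacts = R_total - R_volume

        print(f"R_volume   ~ {R_volume:.3e} Ohm")
        print(f"R_contacts ~ {R_contacts:.3e} Ohm")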

  3. DEVELOPMENT OF SAMPLING AND ANALYTICAL METHODS FOR THE MEASUREMENT OF NITROUS OXIDE FROM FOSSIL FUEL COMBUSTION SOURCES

    Science.gov (United States)

    The report documents the technical approach and results achieved while developing a grab sampling method and an automated, on-line gas chromatography method suitable to characterize nitrous oxide (N2O) emissions from fossil fuel combustion sources. The two methods developed have...

  4. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  5. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics.

    Science.gov (United States)

    Feng, Shu; Gale, Michael J; Fay, Jonathan D; Faridi, Ambar; Titus, Hope E; Garg, Anupam K; Michaels, Keith V; Erker, Laura R; Peters, Dawn; Smith, Travis B; Pennesi, Mark E

    2015-09-01

    To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population.

  6. Methods to maximise recovery of environmental DNA from water samples.

    Directory of Open Access Journals (Sweden)

    Rheyda Hinlo

    The environmental DNA (eDNA) method is a detection technique that is rapidly gaining credibility as a sensitive tool useful in the surveillance and monitoring of invasive and threatened species. Because eDNA analysis often deals with small quantities of short and degraded DNA fragments, methods that maximize eDNA recovery are required to increase detectability. In this study, we performed experiments at different stages of the eDNA analysis to show which combinations of methods give the best recovery rate for eDNA. Using Oriental weatherloach (Misgurnus anguillicaudatus) as a study species, we show that various combinations of DNA capture, preservation and extraction methods can significantly affect DNA yield. Filtration using cellulose nitrate filter paper preserved in ethanol or stored in a -20°C freezer and extracted with the Qiagen DNeasy kit outperformed other combinations in terms of cost and efficiency of DNA recovery. Our results support the recommendation to filter water samples within 24 hours but if this is not possible, our results suggest that refrigeration may be a better option than freezing for short-term storage (i.e., 3-5 days). This information is useful in designing eDNA detection of low-density invasive or threatened species, where small variations in DNA recovery can signify the difference between detection success or failure.

  7. Standard methods for sampling freshwater fishes: Opportunities for international collaboration

    Science.gov (United States)

    Bonar, Scott A.; Mercado-Silva, Norman; Hubert, Wayne A.; Beard, Douglas; Dave, Göran; Kubečka, Jan; Graeb, Brian D. S.; Lester, Nigel P.; Porath, Mark T.; Winfield, Ian J.

    2017-01-01

    With publication of Standard Methods for Sampling North American Freshwater Fishes in 2009, the American Fisheries Society (AFS) recommended standard procedures for North America. To explore interest in standardizing at intercontinental scales, a symposium attended by international specialists in freshwater fish sampling was convened at the 145th Annual AFS Meeting in Portland, Oregon, in August 2015. Participants represented all continents except Australia and Antarctica and were employed by state and federal agencies, universities, nongovernmental organizations, and consulting businesses. Currently, standardization is practiced mostly in North America and Europe. Participants described how standardization has been important for management of long-term data sets, promoting fundamental scientific understanding, and assessing efficacy of large spatial scale management strategies. Academics indicated that standardization has been useful in fisheries education because time previously used to teach how sampling methods are developed is now more devoted to diagnosis and treatment of problem fish communities. Researchers reported that standardization allowed increased sample size for method validation and calibration. Group consensus was to retain continental standards where they currently exist but to further explore international and intercontinental standardization, specifically identifying where synergies and bridges exist, and identify means to collaborate with scientists where standardization is limited but interest and need occur.

  8. Sampling methods for the study of pneumococcal carriage: a systematic review.

    Science.gov (United States)

    Gladstone, R A; Jefferies, J M; Faust, S N; Clarke, S C

    2012-11-06

    Streptococcus pneumoniae is an important pathogen worldwide. Accurate sampling of S. pneumoniae carriage is central to surveillance studies before and following conjugate vaccination programmes to combat pneumococcal disease. Any bias introduced during sampling will affect downstream recovery and typing. Many variables exist for the method of collection and initial processing, which can make inter-laboratory or international comparisons of data complex. In February 2003, a World Health Organisation working group published a standard method for the detection of pneumococcal carriage for vaccine trials to reduce or eliminate variability. We sought to describe the variables associated with the sampling of S. pneumoniae from collection to storage in the context of the methods recommended by the WHO and those used in pneumococcal carriage studies since its publication. A search of published literature in the online PubMed database was performed on the 1st June 2012, to identify published studies that collected pneumococcal carriage isolates, conducted after the publication of the WHO standard method. After undertaking a systematic analysis of the literature, we show that a number of differences in pneumococcal sampling protocol continue to exist between studies since the WHO publication. The majority of studies sample from the nasopharynx, but the choice of swab and swab transport media is more variable between studies. At present there is insufficient experimental data that supports the optimal sensitivity of any standard method. This may have contributed to incomplete adoption of the primary stages of the WHO detection protocol, alongside pragmatic or logistical issues associated with study design. Consequently studies may not provide a true estimate of pneumococcal carriage. Optimal sampling of carriage could therefore lead to improvements in downstream analysis, the evaluation of pneumococcal vaccine impact, and extrapolation to pneumococcal disease control.

  9. A faster sample preparation method for determination of polonium-210 in fish

    International Nuclear Information System (INIS)

    Sadi, B.B.; Jing Chen; Kochermin, Vera; Godwin Tung; Sorina Chiorean

    2016-01-01

    In order to facilitate Health Canada’s study on background radiation levels in country foods, an in-house radio-analytical method has been developed for determination of polonium-210 (210Po) in fish samples. The method was validated by measurement of 210Po in a certified reference material. It was also evaluated by comparing 210Po concentrations in a number of fish samples by another method. The in-house method offers faster sample dissolution using an automated digestion system compared to currently used wet-ashing on a hot plate. It also utilizes pre-packed Sr-resin® cartridges for rapid and reproducible separation of 210Po versus time-consuming manually packed Sr-resin® columns. (author)

  10. Using the sampling method to propagate uncertainties of physical parameters in systems with fissile material

    International Nuclear Information System (INIS)

    Campolina, Daniel de Almeida Magalhães

    2015-01-01

    ... manufacturers, there will be a reduction in the value compared to the conservative model. The results made it possible to verify the correct functioning of the program developed and show the potential of the sampling method for propagation of uncertainties, especially when many uncertainties are evaluated together in the same input. (author)

  11. Evaluation of sampling methods for toxicological testing of indoor air particulate matter.

    Science.gov (United States)

    Tirkkonen, Jenni; Täubel, Martin; Hirvonen, Maija-Riitta; Leppänen, Hanna; Lindsley, William G; Chen, Bean T; Hyvärinen, Anne; Huttunen, Kati

    2016-09-01

    There is a need for toxicity tests capable of recognizing indoor environments with compromised air quality, especially in the context of moisture damage. One of the key issues is sampling, which should both provide meaningful material for analyses and fulfill requirements imposed by practitioners using toxicity tests for health risk assessment. We aimed to evaluate different existing methods of sampling indoor particulate matter (PM) to develop a suitable sampling strategy for a toxicological assay. During three sampling campaigns in moisture-damaged and non-damaged school buildings, we evaluated one passive and three active sampling methods: the Settled Dust Box (SDB), the Button Aerosol Sampler, the Harvard Impactor and the National Institute for Occupational Safety and Health (NIOSH) Bioaerosol Cyclone Sampler. Mouse RAW264.7 macrophages were exposed to particle suspensions and cell metabolic activity (CMA), production of nitric oxide (NO) and tumor necrosis factor (TNFα) were determined after 24 h of exposure. The repeatability of the toxicological analyses was very good for all tested sampler types. Variability within the schools was found to be high especially between different classrooms in the moisture-damaged school. Passively collected settled dust and PM collected actively with the NIOSH Sampler (Stage 1) caused a clear response in exposed cells. The results suggested the higher relative immunotoxicological activity of dust from the moisture-damaged school. The NIOSH Sampler is a promising candidate for the collection of size-fractionated PM to be used in toxicity testing. The applicability of such sampling strategy in grading moisture damage severity in buildings needs to be developed further in a larger cohort of buildings.

  12. A generalized transmission method for gamma-efficiency determinations in soil samples

    International Nuclear Information System (INIS)

    Bolivar, J.P.; Garcia-Tenorio, R.; Garcia-Leon, M.

    1994-01-01

    In this paper, a generalization of the γ-ray transmission method which is useful for measurements on soil samples, for example, is presented. The correction factor, f, is given, which is a function of the apparent density of the soil and the γ-ray energy. With this method, the need for individual determinations of f, for each energy and apparent soil density is avoided. Although the method has been developed for soils, the general philosophy can be applied to other sample matrices, such as water or vegetables for example. (author)
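
    For context, a widely used per-sample transmission correction (the Cutshall-type factor) is sketched below; the record above generalizes this idea by expressing the correction factor as a function of apparent soil density and gamma-ray energy so that an individual transmission measurement is not needed for every sample. The formula and numbers below are a generic illustration, not the paper's parametrization of f.

        import numpy as np

        def self_absorption_correction(I_sample, I_blank):
            """
            Transmission-based self-absorption correction factor (Cutshall-type form).
            I_sample: gamma intensity transmitted through the filled sample container.
            I_blank : intensity transmitted through an empty (or reference) container.
            Returns the factor by which a measured peak count rate is multiplied.
            """
            T = I_sample / I_blank          # transmission at the energy of interest
            mu_t = -np.log(T)               # effective optical thickness of the sample
            return mu_t / (1.0 - np.exp(-mu_t))

        # Example: 60% transmission at a given energy for a given apparent soil density.
        print(self_absorption_correction(0.60, 1.00))   # ~1.28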

  13. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    Science.gov (United States)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as the concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize their patterns through a dissimilarity matrix based on modified cross-sample entropy; three-dimensional perceptual maps of the results are then provided through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper: multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals a clear clustering both in synthetic data and in 18 indices from diverse stock markets. This implies that time series generated by the same model are more likely to have similar irregularity than others, and that differences between stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions: Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China together with Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.
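
    A minimal sketch of the overall workflow: build a symmetric dissimilarity matrix between time series and embed it in three dimensions with metric MDS on a precomputed dissimilarity. The placeholder distance used below is not the modified cross-sample entropy from the record; it only marks where the MDS-KCSE or MDS-PCSE values would be inserted.

        import numpy as np
        from sklearn.manifold import MDS

        rng = np.random.default_rng(2)
        series = [np.cumsum(rng.normal(0, 1, 500)) for _ in range(6)]   # 6 synthetic "indices"

        n = len(series)
        D = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                # Placeholder dissimilarity; in the paper this slot would hold the
                # Kronecker-delta or permutation cross-sample entropy between series i and j.
                D[i, j] = np.abs(np.diff(series[i]) - np.diff(series[j])).mean()

        embedding = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
        coords = embedding.fit_transform(D)
        print(coords.shape)   # (6, 3): coordinates for a three-dimensional perceptual map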

  14. Exploring Technostress: Results of a Large Sample Factor Analysis

    OpenAIRE

    Jonušauskas, Steponas; Raišienė, Agota Giedrė

    2016-01-01

    With reference to the results of a large-sample factor analysis, the article aims to propose a frame for examining technostress in a population. A survey and principal component analysis of a sample consisting of 1013 individuals who use ICT in their everyday work were implemented in the research. Thirteen factors combine 68 questions and explain 59.13 per cent of the dispersion in the answers. Based on the factor analysis, the questionnaire was reframed and prepared to reasonably analyze the respondents’ an...

  15. A method for determination of mass per unit area inhomogeneity of thin samples in XRF analysis

    International Nuclear Information System (INIS)

    Sitko, R.; Jurczyk, J.

    1999-01-01

    The authors present a simple method for determining possible mass-per-unit-area inhomogeneity of thin samples in wavelength-dispersive XRF analysis, after prior examination of the intensity distribution of the exciting radiation on the sample's surface. Investigations were carried out using microsamples of mono- and polycrystals as examples. Samples were prepared by digesting the analysed material directly on the substrate. The results obtained are presented graphically. (author)

  16. Air sampling procedures to evaluate microbial contamination: a comparison between active and passive methods in operating theatres

    Directory of Open Access Journals (Sweden)

    Napoli Christian

    2012-08-01

    Background: Since air can play a central role as a reservoir for microorganisms, in controlled environments such as operating theatres regular microbial monitoring is useful to measure air quality and identify critical situations. The aim of this study is to assess microbial contamination levels in operating theatres using both an active and a passive sampling method and then to assess if there is a correlation between the results of the two different sampling methods. Methods: The study was performed in 32 turbulent air flow operating theatres of a University Hospital in Southern Italy. Active sampling was carried out using the Surface Air System and passive sampling with settle plates, in accordance with ISO 14698. The Total Viable Count (TVC) was evaluated at rest (in the morning before the beginning of surgical activity) and in operational (during surgery). Results: The mean TVC at rest was 12.4 CFU/m3 and 722.5 CFU/m2/h for active and passive samplings respectively. The mean in operational TVC was 93.8 CFU/m3 (SD = 52.69; range = 22-256) and 10496.5 CFU/m2/h (SD = 7460.5; range = 1415.5-25479.7) for active and passive samplings respectively. Statistical analysis confirmed that the two methods correlate in a comparable way with the quality of air. Conclusion: It is possible to conclude that both methods can be used for general monitoring of air contamination, such as routine surveillance programs. However, the choice must be made between one or the other to obtain specific information.

  17. Trace elements detection in whole food samples by Neutron Activation Analysis, k0-method

    International Nuclear Information System (INIS)

    Sathler, Márcia Maia; Menezes, Maria Ângela de Barros Correia; Salles, Paula Maria Borges de

    2017-01-01

    Inorganic elements, from natural and anthropogenic sources are present in foods in different concentrations. With the increase in anthropogenic activities, there was also a considerable increase in the emission of these elements in the environment, leading to the need of monitoring the elemental composition of foods available for consumption. Numerous techniques have been used to detect inorganic elements in biological and environmental matrices, always aiming at reaching lower detection limits in order to evaluate the trace element content in the sample. Neutron activation analysis (INAA), applying the k0-method, produces accurate and precise results without the need of chemical preparation of the samples – that could cause their contamination. This study evaluated the presence of inorganic elements in whole foods samples, mainly elements on trace levels. For this purpose, seven samples of different types of whole foods were irradiated in the TRIGA MARK I IPR-R1 research reactor - located at CDTN/CNEN, in Belo Horizonte, MG. It was possible to detect twenty two elements above the limit of detection in, at least, one of the samples analyzed. This study reaffirms the INAA k0-method as a safe and efficient technique for detecting trace elements in food samples. (author)

  18. An extension of command shaping methods for controlling residual vibration using frequency sampling

    Science.gov (United States)

    Singer, Neil C.; Seering, Warren P.

    1992-01-01

    The authors present an extension to the impulse shaping technique for commanding machines to move with reduced residual vibration. The extension, called frequency sampling, is a method for generating constraints that are used to obtain shaping sequences which minimize residual vibration in systems such as robots whose resonant frequencies change during motion. The authors present a review of impulse shaping methods, a development of the proposed extension, and a comparison of results of tests conducted on a simple model of the space shuttle robot arm. Frequency shaping provides a method for minimizing the impulse sequence duration required to give the desired insensitivity.
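
    The basic building block behind impulse (input) shaping is a short impulse sequence convolved with the command; the two-impulse zero-vibration (ZV) shaper below is the textbook single-mode case, shown here only to make the idea concrete. The frequency-sampling constraints described in the record extend this to systems whose resonant frequencies change during motion; the mode frequency and damping used below are assumed values.

        import numpy as np

        def zv_shaper(freq_hz, zeta):
            """
            Two-impulse zero-vibration (ZV) shaper for a single vibratory mode -- the
            basic constraint set that frequency-sampling methods generalize to systems
            whose resonant frequency changes during the motion.
            Returns impulse amplitudes and times (s).
            """
            wd = 2 * np.pi * freq_hz * np.sqrt(1 - zeta ** 2)   # damped natural frequency
            K = np.exp(-zeta * np.pi / np.sqrt(1 - zeta ** 2))
            amps = np.array([1.0, K]) / (1.0 + K)
            times = np.array([0.0, np.pi / wd])
            return amps, times

        amps, times = zv_shaper(freq_hz=0.5, zeta=0.05)   # hypothetical flexible-arm mode
        print(amps, times)
        # A shaped command is the convolution of this impulse sequence with the raw command.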

  19. A new method for automatic discontinuity traces sampling on rock mass 3D model

    Science.gov (United States)

    Umili, G.; Ferrero, A.; Einstein, H. H.

    2013-02-01

    A new automatic method for discontinuity traces mapping and sampling on a rock mass digital model is described in this work. The implemented procedure allows one to automatically identify discontinuity traces on a Digital Surface Model: traces are detected directly as surface breaklines, by means of maximum and minimum principal curvature values of the vertices that constitute the model surface. Color influence and user errors, that usually characterize the trace mapping on images, are eliminated. Also trace sampling procedures based on circular windows and circular scanlines have been implemented: they are used to infer trace data and to calculate values of mean trace length, expected discontinuity diameter and intensity of rock discontinuities. The method is tested on a case study: results obtained applying the automatic procedure on the DSM of a rock face are compared to those obtained performing a manual sampling on the orthophotograph of the same rock face.

  20. Novel degenerate PCR method for whole genome amplification applied to Peru Margin (ODP Leg 201) subsurface samples

    Directory of Open Access Journals (Sweden)

    Amanda eMartino

    2012-01-01

    A degenerate PCR-based method of whole-genome amplification, designed to work fluidly with 454 sequencing technology, was developed and tested for use on deep marine subsurface DNA samples. The method, which we have called Random Amplification Metagenomic PCR (RAMP), involves the use of specific primers from Roche 454 amplicon sequencing, modified by the addition of a degenerate region at the 3’ end. It utilizes a PCR reaction, which resulted in no amplification from blanks, even after 50 cycles of PCR. After efforts to optimize experimental conditions, the method was tested with DNA extracted from cultured E. coli cells, and genome coverage was estimated after sequencing on three different occasions. Coverage did not vary greatly with the different experimental conditions tested, and was around 62% with a sequencing effort equivalent to a theoretical genome coverage of 14.10X. The GC content of the sequenced amplification product was within 2% of the predicted values for this strain of E. coli. The method was also applied to DNA extracted from marine subsurface samples from ODP Leg 201 site 1229 (Peru Margin), and results of a taxonomic analysis revealed microbial communities dominated by Proteobacteria, Chloroflexi, Firmicutes, Euryarchaeota, and Crenarchaeota, among others. These results were similar to those obtained previously for those samples; however, variations in the proportions of taxa show that community analysis can be sensitive to both the amplification technique used and the method of assigning sequences to taxonomic groups. Overall, we find that RAMP represents a valid methodology for amplifying metagenomes from low biomass samples.

  1. Validation of curve-fitting method for blood retention of 99mTc-GSA. Comparison with blood sampling method

    International Nuclear Information System (INIS)

    Ha-Kawa, Sang Kil; Suga, Yutaka; Kouda, Katsuyasu; Ikeda, Koshi; Tanaka, Yoshimasa

    1997-01-01

    We investigated a curve-fitting method for the rate of blood retention of 99mTc-galactosyl serum albumin (GSA) as a substitute for the blood sampling method. Seven healthy volunteers and 27 patients with liver disease underwent 99mTc-GSA scanning. After normalization of the y-intercept to 100 percent, a biexponential regression curve fitted to the precordial time-activity curve provided the percent injected dose (%ID) of 99mTc-GSA in the blood without blood sampling. The discrepancy between the %ID obtained by the curve-fitting method and that obtained from multiple blood samples was minimal in normal volunteers (3.1±2.1%, mean±standard deviation, n=77 samplings). A slightly greater discrepancy was observed in patients with liver disease (7.5±6.1%, n=135 samplings). The %ID at 15 min after injection obtained from the fitted curve was significantly greater in patients with liver cirrhosis than in the controls (53.2±11.6%, n=13, vs. 31.9±2.8%, n=7). The %ID of 99mTc-GSA also correlated with the plasma retention rate for indocyanine green (r=-0.869). The curve-fitting method thus gave a good estimate of the blood retention of 99mTc-GSA and could be a substitute for the blood sampling method. (author)
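
    A hedged sketch of the curve-fitting step described above: fit a biexponential to a precordial time-activity curve, normalize its y-intercept to 100%, and read off the %ID at 15 minutes. The time-activity values below are invented for illustration and are not patient data.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical precordial time-activity data (minutes, counts) -- not patient data.
        t = np.array([1, 2, 3, 5, 7, 10, 15, 20, 25, 30], dtype=float)
        counts = np.array([95, 88, 80, 68, 59, 49, 38, 32, 28, 25], dtype=float)

        def biexp(t, a1, k1, a2, k2):
            # Two-compartment washout of the precordial signal.
            return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

        popt, _ = curve_fit(biexp, t, counts, p0=[60, 0.2, 40, 0.02], maxfev=10000)
        a1, k1, a2, k2 = popt

        y0 = a1 + a2                                  # extrapolated value at t = 0
        # Normalizing the intercept to 100 gives percent injected dose remaining in blood.
        print(f"%ID at 15 min ~ {biexp(15.0, *popt) / y0 * 100:.1f}")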

  2. Performance and separation occurrence of binary probit regression estimator using maximum likelihood method and Firths approach under different sample size

    Science.gov (United States)

    Lusiana, Evellin Dewi

    2017-12-01

    The parameters of the binary probit regression model are commonly estimated using the Maximum Likelihood Estimation (MLE) method. However, the MLE method has a limitation if the binary data contain separation. Separation is the condition where one or several independent variables exactly group the categories of the binary response. It causes the MLE estimators to become non-convergent, so that they cannot be used in modeling. One way to resolve separation is to use Firth's approach instead. This research has two aims. First, to identify the chance of separation occurring in the binary probit regression model under the MLE method and Firth's approach. Second, to compare the performance of the binary probit regression estimators obtained by the MLE method and Firth's approach using the RMSE criterion. Both are performed using a simulation method under different sample sizes. The results showed that the chance of separation occurring with the MLE method for small sample sizes is higher than with Firth's approach. On the other hand, for larger sample sizes, the probability decreases and is relatively similar between the MLE method and Firth's approach. Meanwhile, Firth's estimators have smaller RMSE than the MLE estimators, especially for smaller sample sizes, but for larger sample sizes the RMSEs are not much different. This means that Firth's estimators outperform the MLE estimator.
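
    To make the notion of separation concrete, the sketch below checks whether a single predictor perfectly splits a binary response, which is the situation in which the probit MLE fails to converge. It is a toy illustration of the problem, not the paper's simulation design, and it does not implement Firth's penalized likelihood.

        import numpy as np

        def has_separation(x, y):
            """
            Crude check for complete separation by a single predictor: all responses
            y = 0 lie on one side of some threshold of x and all y = 1 on the other.
            """
            x0 = x[y == 0]
            x1 = x[y == 1]
            if len(x0) == 0 or len(x1) == 0:
                return True   # degenerate case: the response is constant
            return x0.max() < x1.min() or x1.max() < x0.min()

        rng = np.random.default_rng(3)
        n = 20                                    # small sample: separation is more likely
        x = rng.normal(size=n)
        y_sep = (x > 0).astype(int)               # perfectly separated response
        y_ok = (x + rng.normal(scale=1.5, size=n) > 0).astype(int)   # noisy response

        print(has_separation(x, y_sep))   # True  -> probit/logit MLE diverges
        print(has_separation(x, y_ok))    # usually False -> MLE typically converges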

  3. Simulated Tempering Distributed Replica Sampling, Virtual Replica Exchange, and Other Generalized-Ensemble Methods for Conformational Sampling.

    Science.gov (United States)

    Rauscher, Sarah; Neale, Chris; Pomès, Régis

    2009-10-13

    Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.
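
    The acceptance rule underlying simulated-tempering style methods is a Metropolis criterion for moving the current configuration between temperature levels, using per-level weight factors. The sketch below shows that rule with assumed energies and weights; it is a generic illustration, not the STDR or VREX implementations described in the record.

        import numpy as np

        rng = np.random.default_rng(4)

        def accept_temperature_move(E, beta_old, beta_new, g_old, g_new):
            """
            Metropolis criterion for a simulated-tempering move between two temperatures.
            E      : current potential energy of the configuration
            beta_* : inverse temperatures 1/(kB*T)
            g_*    : weight factors for the two temperature levels (estimated in practice)
            """
            log_ratio = -(beta_new - beta_old) * E + (g_new - g_old)
            return np.log(rng.uniform()) < log_ratio

        # Illustrative numbers only: a configuration with E = -500 (assumed kJ/mol)
        # attempting to move from 300 K to 310 K with roughly tuned weights.
        kB = 0.0083145            # kJ/(mol*K)
        beta_300, beta_310 = 1 / (kB * 300), 1 / (kB * 310)
        print(accept_temperature_move(-500.0, beta_300, beta_310, g_old=0.0, g_new=6.0))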

  4. Evaluation of the point-centred-quarter method of sampling ...

    African Journals Online (AJOL)

    ...point-centred-quarter method. The parameter which was most efficiently sampled was species composition (relative density), with 90% replicate similarity being achieved with 100 point-centred-quarters. However, this technique cannot be recommended, even ...

  5. An Optimized Method for Quantification of Pathogenic Leptospira in Environmental Water Samples.

    Science.gov (United States)

    Riediger, Irina N; Hoffmaster, Alex R; Casanovas-Massana, Arnau; Biondo, Alexander W; Ko, Albert I; Stoddard, Robyn A

    2016-01-01

    Leptospirosis is a zoonotic disease usually acquired by contact with water contaminated with the urine of infected animals. However, few molecular methods have been used to monitor or quantify pathogenic Leptospira in environmental water samples. Here we optimized a DNA extraction method for the quantification of leptospires using a previously described Taqman-based qPCR method targeting lipL32, a gene unique to and highly conserved in pathogenic Leptospira. QIAamp DNA mini, MO BIO PowerWater DNA and PowerSoil DNA Isolation kits were evaluated to extract DNA from sewage, pond, river and ultrapure water samples spiked with leptospires. Performance of each kit varied with sample type. Sample processing methods were further evaluated and optimized using the PowerSoil DNA kit due to its performance on turbid water samples and reproducibility. Centrifugation speeds, water volumes and use of Escherichia coli as a carrier were compared to improve DNA recovery. All matrices showed strong linearity over a range of concentrations from 10⁶ to 10⁰ leptospires/mL and low limits of detection. The optimized protocol for quantification of pathogenic Leptospira in environmental waters (river, pond and sewage) consists of concentrating 40 mL samples by centrifugation at 15,000×g for 20 minutes at 4°C, followed by DNA extraction with the PowerSoil DNA Isolation kit. Although the method described herein needs to be validated in environmental studies, it potentially provides the opportunity for effective, timely and sensitive assessment of environmental leptospiral burden.
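
    Quantification with a Taqman qPCR assay such as the lipL32 assay described above typically proceeds through a log-linear standard curve. The sketch below fits such a curve and back-calculates an unknown; the Cq values, the fitted slope and the example concentration are invented for illustration.

        import numpy as np

        # Hypothetical lipL32 qPCR standard curve: Cq values for 10-fold dilutions of
        # a quantified leptospiral DNA standard (10^6 down to 10^0 genome equivalents/mL).
        log10_conc = np.arange(6, -1, -1)                        # 6, 5, ..., 0
        cq = np.array([17.1, 20.5, 23.9, 27.2, 30.6, 34.0, 37.3])

        slope, intercept = np.polyfit(log10_conc, cq, 1)         # Cq = slope*log10(conc) + intercept
        efficiency = 10 ** (-1.0 / slope) - 1.0                  # amplification efficiency
        print(f"slope {slope:.2f}, efficiency {efficiency:.1%}")

        def quantify(cq_unknown):
            """Back-calculate leptospires/mL in a processed water sample from its Cq."""
            return 10 ** ((cq_unknown - intercept) / slope)

        print(f"unknown with Cq 29.0 ~ {quantify(29.0):.0f} leptospires/mL")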

  6. Results Of Analytical Sample Crosschecks For Next Generation Solvent Extraction Samples Isopar L Concentration And pH

    International Nuclear Information System (INIS)

    Peters, T.; Fink, S.

    2011-01-01

    As part of the implementation process for the Next Generation Cesium Extraction Solvent (NGCS), SRNL and F/H Lab performed a series of analytical cross-checks to ensure that the components in the NGCS solvent system do not constitute an undue analytical challenge. For measurement of entrained Isopar® L in aqueous solutions, both labs performed similarly with results more reliable at higher concentrations (near 50 mg/L). Low bias occurred in both labs, as seen previously for comparable blind studies for the baseline solvent system. SRNL recommends considering the use of Teflon™ caps on all sample containers used for this purpose. For pH measurements, the labs showed reasonable agreement but considerable positive bias for dilute boric acid solutions. SRNL recommends considering an alternate analytical method for qualification of boric acid concentrations.

  7. A self-sampling method to obtain large volumes of undiluted cervicovaginal secretions.

    Science.gov (United States)

    Boskey, Elizabeth R; Moench, Thomas R; Hees, Paul S; Cone, Richard A

    2003-02-01

    Studies of vaginal physiology and pathophysiology sometimes require larger volumes of undiluted cervicovaginal secretions than can be obtained by current methods. A convenient method for self-sampling these secretions outside a clinical setting can facilitate such studies of reproductive health. The goal was to develop a vaginal self-sampling method for collecting large volumes of undiluted cervicovaginal secretions. A menstrual collection device (the Instead cup) was inserted briefly into the vagina to collect secretions that were then retrieved from the cup by centrifugation in a 50-ml conical tube. All 16 women asked to perform this procedure found it feasible and acceptable. Among 27 samples, an average of 0.5 g of secretions (range, 0.1-1.5 g) was collected. This is a rapid and convenient self-sampling method for obtaining relatively large volumes of undiluted cervicovaginal secretions. It should prove suitable for a wide range of assays, including those involving sexually transmitted diseases, microbicides, vaginal physiology, immunology, and pathophysiology.

  8. Methods for collecting algal samples as part of the National Water-Quality Assessment Program

    Science.gov (United States)

    Porter, Stephen D.; Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic algae (periphyton) and phytoplankton communities are characterized in the U.S. Geological Survey's National Water-Quality Assessment Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. This multidisciplinary approach provides multiple lines of evidence for evaluating water-quality status and trends, and for refining an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. Water quality can be characterized by evaluating the results of qualitative and quantitative measurements of the algal community. Qualitative periphyton samples are collected to develop a list of taxa present in the sampling reach. Quantitative periphyton samples are collected to measure algal community structure within selected habitats. These samples of benthic algal communities are collected from natural substrates, using the sampling methods that are most appropriate for the habitat conditions. Phytoplankton samples may be collected in large nonwadeable streams and rivers to meet specific program objectives. Estimates of algal biomass (chlorophyll content and ash-free dry mass) also are optional measures that may be useful for interpreting water-quality conditions. A nationally consistent approach provides guidance on site, reach, and habitat selection, as well as information on methods and equipment for qualitative and quantitative sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data locally, regionally, and nationally.

  9. A Proteomics Sample Preparation Method for Mature, Recalcitrant Leaves of Perennial Plants

    Science.gov (United States)

    Na, Zhang; Chengying, Lao; Bo, Wang; Dingxiang, Peng; Lijun, Liu

    2014-01-01

    Sample preparation is key to the success of proteomics studies. In the present study, two sample preparation methods were tested for their suitability on the mature, recalcitrant leaves of six representative perennial plants (grape, plum, pear, peach, orange, and ramie). An improved sample preparation method was obtained: Tris and Triton X-100 were added together instead of CHAPS to the lysis buffer, and a 20% TCA-water solution and 100% precooled acetone were added after the protein extraction for the further purification of protein. This method effectively eliminates nonprotein impurities and obtains a clear two-dimensional gel electrophoresis array. The method facilitates the separation of high-molecular-weight proteins and increases the resolution of low-abundance proteins. This method provides a widely applicable and economically feasible technology for the proteomic study of the mature, recalcitrant leaves of perennial plants. PMID:25028960

  10. A proteomics sample preparation method for mature, recalcitrant leaves of perennial plants.

    Directory of Open Access Journals (Sweden)

    Deng Gang

    Sample preparation is key to the success of proteomics studies. In the present study, two sample preparation methods were tested for their suitability on the mature, recalcitrant leaves of six representative perennial plants (grape, plum, pear, peach, orange, and ramie). An improved sample preparation method was obtained: Tris and Triton X-100 were added together instead of CHAPS to the lysis buffer, and a 20% TCA-water solution and 100% precooled acetone were added after the protein extraction for the further purification of protein. This method effectively eliminates nonprotein impurities and obtains a clear two-dimensional gel electrophoresis array. The method facilitates the separation of high-molecular-weight proteins and increases the resolution of low-abundance proteins. This method provides a widely applicable and economically feasible technology for the proteomic study of the mature, recalcitrant leaves of perennial plants.

  11. Results of EDS uranium samples characterization after hydrogen loading

    International Nuclear Information System (INIS)

    Chicea, D.; Dash, J.

    2003-01-01

    Several experiments on loading natural uranium foils with hydrogen were performed. Electrolysis was used for loading hydrogen into uranium, because it is the most efficient way of H loading. The composition of the surface and near surface of the samples was determined using an Oxford EDS spectrometer on a Scanning Electron Microscope manufactured by ISI. Images were taken at several magnifications up to 3.4 kX. Results reveal that when low current density was used, the surface patterns changed from granules on the surface having a typical size of 2-4 microns to pits under the surface having a typical size under one micron. When high current density was used, the surface changed and presented deep fissures. The deep fissures are the result of the mechanical strain induced by the lattice expansion caused by hydrogen absorption. The surface composition was determined before and after hydrogen loading. Uranium, thorium, platinum and carbon concentrations were measured. The experiments suggest that the amount of thorium on the uranium sample increases with the total electric charge transported through the electrolyte. Carbon concentration was found to decrease on the surface of the sample as the total electric charge transported through the electrolyte increased. Platinum is used as the anode in electrolysis experiments primarily because it does not dissolve in the electrolyte and therefore is not expected to be electro-deposited on the cathode surface. The results of the platinum concentration measurements on the surface of the samples loaded with hydrogen reveal that the platinum concentration increased dramatically as the current density increased, creating platinum spots on the cathode surface. Work is in progress on the subject. (authors)

  12. Use of Monte Carlo Bootstrap Method in the Analysis of Sample Sufficiency for Radioecological Data

    International Nuclear Information System (INIS)

    Silva, A. N. C. da; Amaral, R. S.; Araujo Santos Jr, J.; Wilson Vieira, J.; Lima, F. R. de A.

    2015-01-01

    There are operational difficulties in obtaining samples for radioecological studies. Population data may no longer be available during the study, and obtaining new samples may not be possible. These problems sometimes force the researcher to work with a small number of data points. It is therefore difficult to know whether the number of samples will be sufficient to estimate the desired parameter, and the analysis of sample sufficiency becomes critical. Classical statistical methods are not well suited to analyzing sample sufficiency in radioecology, because naturally occurring radionuclides have a random distribution in soil, and outliers and gaps with missing values usually arise. The present work was developed with the aim of applying the Monte Carlo bootstrap method to the analysis of sample sufficiency with quantitative estimation of a single variable, such as the specific activity of a natural radioisotope present in plants. The pseudo-population was a small sample with 14 values of the specific activity of 226Ra in forage palm (Opuntia spp.). A computational procedure was implemented in the R software to calculate the number of sample values. The resampling process with replacement took the 14 values of the original sample and produced 10,000 bootstrap samples for each round. The estimated average θ was then calculated for samples with 2, 5, 8, 11 and 14 values randomly selected. The results showed that if the researcher works with only 11 sample values, the average parameter will be within a confidence interval with 90% probability. (Author)
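
    A minimal sketch of the described resampling scheme: draw bootstrap subsamples of increasing size with replacement and watch the spread of the bootstrapped mean narrow as the subsample size approaches that of the original sample. The activity values below are invented placeholders, not the 14 measurements used in the study.

        import numpy as np

        rng = np.random.default_rng(5)

        # Hypothetical specific activities of 226Ra in forage palm (Bq/kg); a stand-in for
        # the paper's 14-value pseudo-population, which is not reproduced here.
        sample = np.array([8.1, 7.4, 9.2, 6.8, 10.5, 7.9, 8.8,
                           9.6, 7.1, 8.4, 9.9, 6.5, 8.0, 9.3])

        def bootstrap_mean_ci(data, n_values, n_boot=10_000, level=0.90):
            """Bootstrap the mean using subsamples of size n_values drawn with replacement."""
            means = np.array([rng.choice(data, size=n_values, replace=True).mean()
                              for _ in range(n_boot)])
            lo, hi = np.percentile(means, [(1 - level) / 2 * 100, (1 + level) / 2 * 100])
            return means.mean(), lo, hi

        for n_values in (2, 5, 8, 11, 14):
            mean, lo, hi = bootstrap_mean_ci(sample, n_values)
            print(f"n = {n_values:2d}: mean ~ {mean:.2f}, 90% interval ({lo:.2f}, {hi:.2f})")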

  13. A Literature Study of Matrix Element Influenced to the Result of Analysis Using Absorption Atomic Spectroscopy Method (AAS)

    International Nuclear Information System (INIS)

    Tyas-Djuhariningrum

    2004-01-01

    Gold sample analysis results can deviate by more than 10% from the true value because of matrix elements, so the behaviour of the matrix elements needs to be studied in order to reduce this deviation. In rock samples, matrix elements can cause self-quenching, self-absorption and ionization processes, leading to errors in the analytical result. In geochemical processes in rocks, elements of the same group of the periodic system tend to occur together because of their similar characteristics. In Atomic Absorption Spectroscopy analysis, these associated elements can absorb the primary radiation at similar wavelengths, which can cause deviations in the interpretation of results. The aim of the study is to predict the influence of matrix elements in rock samples and to apply standard methods for reducing the deviation. Quantitatively, the primary light intensity absorbed is proportional to the concentration of atoms in the sample, and the relationship between photon intensity and concentration in parts per million (ppm) is linear. Three methods are used to eliminate the influence of matrix elements: the external standard method, the internal standard method, and the standard addition method. The external standard method is used for all matrix elements, the internal standard method for eliminating matrix elements with similar characteristics, and the standard addition method for eliminating matrix elements in Au and Pt samples. The accuracies of these three standard methods are about 95-97%. (author)
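
    Of the three correction strategies mentioned, the standard addition method lends itself to a compact numerical illustration: spiking the sample with known amounts of analyte and extrapolating the calibration line to zero absorbance cancels multiplicative matrix effects, because the matrix is identical in every measured solution. The absorbance values below are invented for illustration.

        import numpy as np

        # Hypothetical AAS standard-addition data for Au: absorbance of the sample alone
        # and after spiking with known amounts of analyte (mg/L added).
        added = np.array([0.0, 1.0, 2.0, 3.0])          # concentration added to the sample
        absorbance = np.array([0.120, 0.218, 0.320, 0.421])

        # Linear fit A = m*c_added + b; extrapolating to A = 0 gives -c_added = c_sample,
        # so the unknown concentration in the measured solution is b/m.
        m, b = np.polyfit(added, absorbance, 1)
        c_sample = b / m
        print(f"sample concentration ~ {c_sample:.2f} mg/L")   # ~1.2 here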

  14. Sampling in Atypical Endometrial Hyperplasia: Which Method Results in the Lowest Underestimation of Endometrial Cancer? A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Bourdel, Nicolas; Chauvet, Pauline; Tognazza, Enrica; Pereira, Bruno; Botchorishvili, Revaz; Canis, Michel

    2016-01-01

    Our objective was to identify the most accurate method of endometrial sampling for the diagnosis of complex atypical hyperplasia (CAH), and the related risk of underestimation of endometrial cancer. We conducted a systematic literature search in PubMed and EMBASE (January 1999-September 2013) to identify all registered articles on this subject. Studies were selected with a 2-step method. First, titles and abstracts were analyzed by 2 reviewers, and 69 relevant articles were selected for full reading. Then, the full articles were evaluated to determine whether full inclusion criteria were met. We selected 27 studies, taking into consideration the comparison between histology of endometrial hyperplasia obtained by diagnostic tests of interest (uterine curettage, hysteroscopically guided biopsy, or hysteroscopic endometrial resection) and subsequent results of hysterectomy. Analysis of the studies reviewed focused on 1106 patients with a preoperative diagnosis of atypical endometrial hyperplasia. The mean risk of finding endometrial cancer at hysterectomy after atypical endometrial hyperplasia diagnosed by uterine curettage was 32.7% (95% confidence interval [CI], 26.2-39.9), with a risk of 45.3% (95% CI, 32.8-58.5) after hysteroscopically guided biopsy and 5.8% (95% CI, 0.8-31.7) after hysteroscopic resection. In total, the risk of underestimation of endometrial cancer reaches a very high rate in patients with CAH using the classic method of evaluation (i.e., uterine curettage or hysteroscopically guided biopsy). This rate of underdiagnosed endometrial cancer leads to the risk of inappropriate surgical procedures (31.7% of tubal conservation in the data available and no abdominal exploration in 24.6% of the cases). Hysteroscopic resection seems to reduce the risk of underdiagnosed endometrial cancer. Copyright © 2016 AAGL. Published by Elsevier Inc. All rights reserved.

  15. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column...... and diminishes environmental variance. This method was compared with a traditional egg collection method where eggs are collected directly from the medium. Within each method the observed and expected standard deviations of egg-to-adult viability were compared, whereby the difference in the randomness...... and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila....

  16. Adaptive cluster sampling: An efficient method for assessing inconspicuous species

    Science.gov (United States)

    Andrea M. Silletti; Joan Walker

    2003-01-01

    Restorationists typically evaluate the success of a project by estimating the population sizes of species that have been planted or seeded. Because a total census is rarely feasible, they must rely on sampling methods for population estimates. However, traditional random sampling designs may be inefficient for species that, for one reason or another, are challenging to...

  17. A distance limited method for sampling downed coarse woody debris

    Science.gov (United States)

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams

    2012-01-01

    A new sampling method for downed coarse woody debris is proposed based on limiting the perpendicular distance from individual pieces to a randomly chosen sample point. Two approaches are presented that allow different protocols to be used to determine field measurements; estimators for each protocol are also developed. Both protocols are compared via simulation against...

  18. A method to determine density in wood samples using attenuation of 59.5 KeV gamma radiation

    International Nuclear Information System (INIS)

    Dinator, M.I.; Morales, J.R.; Aliaga, N.; Karsulovic, J.T.; Sanchez, J.; Leon, L.A.

    1996-01-01

    A nondestructive method to determine the density of wood samples is presented. The photon mass attenuation coefficient in samples of Pino Radiata was measured at 59.5 keV with a radioactive source of Am-241. The value of 0.192 ± 0.002 cm²/g was obtained with a gamma spectroscopy system and later used in the determination of the mass density of sixteen samples of the same species. Comparison of these results with those of the gravimetric method through a linear regression showed a slope of 1.001 and a correlation factor of 0.94. (author)
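
    As a worked illustration of the underlying relation (not the authors' data reduction), narrow-beam transmission follows I = I0·exp(−μm·ρ·t), so the density can be solved from the measured transmission once the mass attenuation coefficient is known. Count rates and thickness below are hypothetical; only the 0.192 cm²/g coefficient is taken from the abstract.

```python
# Worked illustration of the attenuation relation behind the method above.
# Narrow-beam transmission obeys I = I0 * exp(-mu_m * rho * t), hence
# rho = ln(I0 / I) / (mu_m * t).  Only the attenuation coefficient is taken
# from the abstract; count rates and thickness are invented.
import math

mu_m = 0.192       # cm^2/g, mass attenuation coefficient at 59.5 keV (from the study)
t = 2.0            # cm, hypothetical sample thickness along the beam
I0 = 15400.0       # counts/s, hypothetical unattenuated rate
I = 12750.0        # counts/s, hypothetical transmitted rate

rho = math.log(I0 / I) / (mu_m * t)
print(f"Estimated wood density: {rho:.3f} g/cm^3")
```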

  19. Hierarchical Coupling of First-Principles Molecular Dynamics with Advanced Sampling Methods.

    Science.gov (United States)

    Sevgen, Emre; Giberti, Federico; Sidky, Hythem; Whitmer, Jonathan K; Galli, Giulia; Gygi, Francois; de Pablo, Juan J

    2018-05-14

    We present a seamless coupling of a suite of codes designed to perform advanced sampling simulations, with a first-principles molecular dynamics (MD) engine. As an illustrative example, we discuss results for the free energy and potential surfaces of the alanine dipeptide obtained using both local and hybrid density functionals (DFT), and we compare them with those of a widely used classical force field, Amber99sb. In our calculations, the efficiency of first-principles MD using hybrid functionals is augmented by hierarchical sampling, where hybrid free energy calculations are initiated using estimates obtained with local functionals. We find that the free energy surfaces obtained from classical and first-principles calculations differ. Compared to DFT results, the classical force field overestimates the internal energy contribution of high free energy states, and it underestimates the entropic contribution along the entire free energy profile. Using the string method, we illustrate how these differences lead to different transition pathways connecting the metastable minima of the alanine dipeptide. In larger peptides, those differences would lead to qualitatively different results for the equilibrium structure and conformation of these molecules.

  20. Comparing two sampling methods to engage hard-to-reach communities in research priority setting.

    Science.gov (United States)

    Valerio, Melissa A; Rodriguez, Natalia; Winkler, Paula; Lopez, Jaime; Dennison, Meagen; Liang, Yuanyuan; Turner, Barbara J

    2016-10-28

    Effective community-partnered and patient-centered outcomes research needs to address community priorities. However, optimal sampling methods to engage stakeholders from hard-to-reach, vulnerable communities to generate research priorities have not been identified. In two similar rural, largely Hispanic communities, a community advisory board guided recruitment of stakeholders affected by chronic pain using a different method in each community: 1) snowball sampling, a chain- referral method or 2) purposive sampling to recruit diverse stakeholders. In both communities, three groups of stakeholders attended a series of three facilitated meetings to orient, brainstorm, and prioritize ideas (9 meetings/community). Using mixed methods analysis, we compared stakeholder recruitment and retention as well as priorities from both communities' stakeholders on mean ratings of their ideas based on importance and feasibility for implementation in their community. Of 65 eligible stakeholders in one community recruited by snowball sampling, 55 (85 %) consented, 52 (95 %) attended the first meeting, and 36 (65 %) attended all 3 meetings. In the second community, the purposive sampling method was supplemented by convenience sampling to increase recruitment. Of 69 stakeholders recruited by this combined strategy, 62 (90 %) consented, 36 (58 %) attended the first meeting, and 26 (42 %) attended all 3 meetings. Snowball sampling recruited more Hispanics and disabled persons (all P research, focusing on non-pharmacologic interventions for management of chronic pain. Ratings on importance and feasibility for community implementation differed only on the importance of massage services (P = 0.045) which was higher for the purposive/convenience sampling group and for city improvements/transportation services (P = 0.004) which was higher for the snowball sampling group. In each of the two similar hard-to-reach communities, a community advisory board partnered with researchers

  1. Sampling for Patient Exit Interviews: Assessment of Methods Using Mathematical Derivation and Computer Simulations.

    Science.gov (United States)

    Geldsetzer, Pascal; Fink, Günther; Vaikath, Maria; Bärnighausen, Till

    2018-02-01

    (1) To evaluate the operational efficiency of various sampling methods for patient exit interviews; (2) to discuss under what circumstances each method yields an unbiased sample; and (3) to propose a new, operationally efficient, and unbiased sampling method. Literature review, mathematical derivation, and Monte Carlo simulations. Our simulations show that in patient exit interviews it is most operationally efficient if the interviewer, after completing an interview, selects the next patient exiting the clinical consultation. We demonstrate mathematically that this method yields a biased sample: patients who spend a longer time with the clinician are overrepresented. This bias can be removed by selecting the next patient who enters, rather than exits, the consultation room. We show that this sampling method is operationally more efficient than alternative methods (systematic and simple random sampling) in most primary health care settings. Under the assumption that the order in which patients enter the consultation room is unrelated to the length of time spent with the clinician and the interviewer, selecting the next patient entering the consultation room tends to be the operationally most efficient unbiased sampling method for patient exit interviews. © 2016 The Authors. Health Services Research published by Wiley Periodicals, Inc. on behalf of Health Research and Educational Trust.
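
    The bias argument is easy to reproduce with a small Monte Carlo sketch. Assuming a single clinician seeing patients back-to-back with exponentially distributed consultation times (both assumptions mine, not the paper's), selecting the next patient to exit over-samples long consultations, while selecting the next patient to enter does not:

```python
# Monte Carlo sketch (not the authors' code) of the bias described above: with
# one clinician seeing patients back-to-back, "interview the next patient to
# EXIT" over-samples long consultations, while "interview the next patient to
# ENTER" does not.  Consultation and interview times are invented.
import numpy as np

rng = np.random.default_rng(0)
n_patients = 100_000
durations = rng.exponential(10.0, n_patients)          # consultation length, minutes
starts = np.concatenate(([0.0], np.cumsum(durations)[:-1]))
exits = starts + durations
interview_len = 8.0                                    # minutes per exit interview

def sampled_durations(rule):
    """Consultation lengths of the patients one interviewer ends up selecting."""
    t, i, picked = 0.0, 0, []
    while i < n_patients:
        if rule == "next_exit":
            while i < n_patients and exits[i] <= t:    # these patients already left
                i += 1
        else:  # "next_enter"
            while i < n_patients and starts[i] < t:    # wait for the next patient to enter
                i += 1
        if i >= n_patients:
            break
        picked.append(durations[i])
        t = exits[i] + interview_len                   # interview runs after the consultation
        i += 1
    return np.array(picked)

print("true mean consultation: ", round(durations.mean(), 2))
print("next-exit sampling:     ", round(sampled_durations("next_exit").mean(), 2))   # biased upward
print("next-enter sampling:    ", round(sampled_durations("next_enter").mean(), 2))  # ~unbiased
```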

  2. Sampling Methods for Detection and Monitoring of the Asian Citrus Psyllid (Hemiptera: Psyllidae).

    Science.gov (United States)

    Monzo, C; Arevalo, H A; Jones, M M; Vanaclocha, P; Croxton, S D; Qureshi, J A; Stansly, P A

    2015-06-01

    The Asian citrus psyllid (ACP), Diaphorina citri Kuwayama is a key pest of citrus due to its role as vector of citrus greening disease or "huanglongbing." ACP monitoring is considered an indispensable tool for management of vector and disease. In the present study, datasets collected between 2009 and 2013 from 245 citrus blocks were used to evaluate precision, sensitivity for detection, and efficiency of five sampling methods. The number of samples needed to reach a 0.25 standard error-mean ratio was estimated using Taylor's power law and used to compare precision among sampling methods. Comparison of detection sensitivity and time expenditure (cost) between stem-tap and other sampling methodologies conducted consecutively at the same location were also assessed. Stem-tap sampling was the most efficient sampling method when ACP densities were moderate to high and served as the basis for comparison with all other methods. Protocols that grouped trees near randomly selected locations across the block were more efficient than sampling trees at random across the block. Sweep net sampling was similar to stem-taps in number of captures per sampled unit, but less precise at any ACP density. Yellow sticky traps were 14 times more sensitive than stem-taps but much more time consuming and thus less efficient except at very low population densities. Visual sampling was efficient for detecting and monitoring ACP at low densities. Suction sampling was time consuming and taxing but the most sensitive of all methods for detection of sparse populations. This information can be used to optimize ACP monitoring efforts. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
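
    The precision criterion quoted above is normally turned into a sample-size formula through Taylor's power law, s² = a·mᵇ, which gives n = a·m^(b−2)/D² for a target standard error-to-mean ratio D. The coefficients in the sketch below are invented, not the study's fitted values:

```python
# Hedged sketch of the precision calculation referenced above.  Taylor's power
# law models the sample variance as s^2 = a * m**b; for a target standard
# error-to-mean ratio D this gives n = a * m**(b - 2) / D**2 samples.
# The coefficients a and b are invented, not the study's fitted values.
a, b = 2.5, 1.4          # hypothetical Taylor's power law coefficients
D = 0.25                 # target SE/mean ratio used in the study

def samples_needed(mean_density):
    """Required number of sample units at a given mean ACP density per unit."""
    return a * mean_density ** (b - 2) / D ** 2

for m in (0.1, 0.5, 1.0, 5.0):                  # hypothetical mean densities
    print(f"mean density {m:4.1f} per tap  ->  n ≈ {samples_needed(m):7.1f} samples")
```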

  3. Tank 241-AW-105, grab samples, analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final report for tank 241-AW-105 grab samples. Twenty grab samples were collected from risers 10A and 15A on August 20 and 21, 1996, of which eight were designated for the K Basin sludge compatibility and mixing studies. This document presents the analytical results for the remaining twelve samples. Analyses were performed in accordance with the Compatibility Grab Sampling and Analysis Plan (TSAP) and the Data Quality Objectives for the Tank Farms Waste Compatibility Program (DQO). The results for the previous sampling of this tank were reported in WHC-SD-WM-DP-149, Rev. 0, 60-Day Waste Compatibility Safety Issue and Final Results for Tank 241-AW-105, Grab Samples 5AW-95-1, 5AW-95-2 and 5AW-95-3. Three supernate samples exceeded the TOC notification limit (30,000 µg C/g dry weight). Appropriate notifications were made. No immediate notifications were required for any other analyte. The TSAP requested analyses for polychlorinated biphenyls (PCB) for all liquid and centrifuged solid subsamples. The PCB analysis of the liquid samples has been delayed and will be presented in a revision to this document.

  4. A direct sampling method for inverse electromagnetic medium scattering

    KAUST Repository

    Ito, Kazufumi; Jin, Bangti; Zou, Jun

    2013-01-01

    In this paper, we study the inverse electromagnetic medium scattering problem of estimating the support and shape of medium scatterers from scattered electric/magnetic near-field data. We shall develop a novel direct sampling method based

  5. Evaluation of three sampling methods to monitor outcomes of antiretroviral treatment programmes in low- and middle-income countries.

    Science.gov (United States)

    Tassie, Jean-Michel; Malateste, Karen; Pujades-Rodríguez, Mar; Poulet, Elisabeth; Bennett, Diane; Harries, Anthony; Mahy, Mary; Schechter, Mauro; Souteyrand, Yves; Dabis, François

    2010-11-10

    Retention of patients on antiretroviral therapy (ART) over time is a proxy for quality of care and an outcome indicator to monitor ART programs. Using existing databases (Antiretroviral in Lower Income Countries of the International Databases to Evaluate AIDS and Médecins Sans Frontières), we evaluated three sampling approaches to simplify the generation of outcome indicators. We used individual patient data from 27 ART sites and included 27,201 ART-naive adults (≥15 years) who initiated ART in 2005. For each site, we generated two outcome indicators at 12 months, retention on ART and proportion of patients lost to follow-up (LFU), first using all patient data and then within a smaller group of patients selected using three sampling methods (random, systematic and consecutive sampling). For each method and each site, 500 samples were generated, and the average result was compared with the unsampled value. The 95% sampling distribution (SD) was expressed as the 2.5(th) and 97.5(th) percentile values from the 500 samples. Overall, retention on ART was 76.5% (range 58.9-88.6) and the proportion of patients LFU, 13.5% (range 0.8-31.9). Estimates of retention from sampling (n = 5696) were 76.5% (SD 75.4-77.7) for random, 76.5% (75.3-77.5) for systematic and 76.0% (74.1-78.2) for the consecutive method. Estimates for the proportion of patients LFU were 13.5% (12.6-14.5), 13.5% (12.6-14.3) and 14.0% (12.5-15.5), respectively. With consecutive sampling, 50% of sites had SD within ±5% of the unsampled site value. Our results suggest that random, systematic or consecutive sampling methods are feasible for monitoring ART indicators at national level. However, sampling may not produce precise estimates in some sites.

  6. Application of mercurometric analysis methods to radioactive (and/or toxic) samples: Pycnometry and porosimetry

    International Nuclear Information System (INIS)

    Sannen, L.

    1991-01-01

    The analytical tools and methods used in the laboratory of High and Medium Activity of the Nuclear Research Centre in Mol to determine the density and the open porosity of radioactive (and/or toxic) samples are described. The density is determined by a vacuum pycnometer with plunger displacement. This home-made apparatus has been automated up to a high degree so that operation is easily performed in the remote handling conditions of a hot cell environment. The amount of mercury displaced by the sample is measured. The accuracy is better than 0.2 %. The porosimeter is a commercial apparatus which was modified to improve the hot cell compatibility and to provide fast processing of the data. The open porosity and its pore size distribution are determined from the measurement of the amount of mercury intruded into the sample under increasing pressure. The paper describes both instruments and the working methods. Also included are some examples of measurement results. (author). 5 figs

  7. Evaluation of Legionella Air Contamination in Healthcare Facilities by Different Sampling Methods: An Italian Multicenter Study.

    Science.gov (United States)

    Montagna, Maria Teresa; De Giglio, Osvalda; Cristina, Maria Luisa; Napoli, Christian; Pacifico, Claudia; Agodi, Antonella; Baldovin, Tatjana; Casini, Beatrice; Coniglio, Maria Anna; D'Errico, Marcello Mario; Delia, Santi Antonino; Deriu, Maria Grazia; Guida, Marco; Laganà, Pasqualina; Liguori, Giorgio; Moro, Matteo; Mura, Ida; Pennino, Francesca; Privitera, Gaetano; Romano Spica, Vincenzo; Sembeni, Silvia; Spagnolo, Anna Maria; Tardivo, Stefano; Torre, Ida; Valeriani, Federica; Albertini, Roberto; Pasquarella, Cesira

    2017-06-22

    Healthcare facilities (HF) represent an at-risk environment for legionellosis transmission occurring after inhalation of contaminated aerosols. In general, the control of water is preferred to that of air because, to date, there are no standardized sampling protocols. Legionella air contamination was investigated in the bathrooms of 11 HF by active sampling (Surface Air System and Coriolis ® μ) and passive sampling using settling plates. During the 8-hour sampling, hot tap water was sampled three times. All air samples were evaluated using culture-based methods, whereas liquid samples collected using the Coriolis ® μ were also analyzed by real-time PCR. Legionella presence in the air and water was then compared by sequence-based typing (SBT) methods. Air contamination was found in four HF (36.4%) by at least one of the culturable methods. The culturable investigation by Coriolis ® μ did not yield Legionella in any enrolled HF. However, molecular investigation using Coriolis ® μ resulted in eight HF testing positive for Legionella in the air. Comparison of Legionella air and water contamination indicated that Legionella water concentration could be predictive of its presence in the air. Furthermore, a molecular study of 12 L. pneumophila strains confirmed a match between the Legionella strains from air and water samples by SBT for three out of four HF that tested positive for Legionella by at least one of the culturable methods. Overall, our study shows that Legionella air detection cannot replace water sampling because the absence of microorganisms from the air does not necessarily represent their absence from water; nevertheless, air sampling may provide useful information for risk assessment. The liquid impingement technique appears to have the greatest capacity for collecting airborne Legionella if combined with molecular investigations.

  9. Two methods of self-sampling compared to clinician sampling to detect reproductive tract infections in Gugulethu, South Africa

    NARCIS (Netherlands)

    van de Wijgert, Janneke; Altini, Lydia; Jones, Heidi; de Kock, Alana; Young, Taryn; Williamson, Anna-Lise; Hoosen, Anwar; Coetzee, Nicol

    2006-01-01

    To assess the validity, feasibility, and acceptability of 2 methods of self-sampling compared to clinician sampling during a speculum examination. To improve screening for reproductive tract infections (RTIs) in resource-poor settings. In a public clinic in Cape Town, 450 women underwent a speculum

  10. The use of physical methods for elemental analysis of ecological samples

    International Nuclear Information System (INIS)

    Kudryashov, V.I.; Zhuravleva, E.L.; Maslov, O.D.

    1996-01-01

    The possibility of applying different X-ray and instrumental activation methods to the elemental analysis of rock, ice, snow, water, soil and other natural samples was investigated. The content of some elements in ice samples from the glaciers of the Pamirs-Alaj mountain system for the period 1973-1984 has been determined. Recommendations are given for the choice of analysis methods for the purpose of environmental control. (author). 10 refs., 6 figs., 1 tab

  11. Improved LC-MS/MS method for the quantification of hepcidin-25 in clinical samples.

    Science.gov (United States)

    Abbas, Ioana M; Hoffmann, Holger; Montes-Bayón, María; Weller, Michael G

    2018-06-01

    Mass spectrometry-based methods play a crucial role in the quantification of the main iron metabolism regulator hepcidin by singling out the bioactive 25-residue peptide from the other naturally occurring N-truncated isoforms (hepcidin-20, -22, -24), which seem to be inactive in iron homeostasis. However, several difficulties arise in the MS analysis of hepcidin due to the "sticky" character of the peptide and the lack of suitable standards. Here, we propose the use of amino- and fluoro-silanized autosampler vials to reduce hepcidin interaction to laboratory glassware surfaces after testing several types of vials for the preparation of stock solutions and serum samples for isotope dilution liquid chromatography-tandem mass spectrometry (ID-LC-MS/MS). Furthermore, we have investigated two sample preparation strategies and two chromatographic separation conditions with the aim of developing a LC-MS/MS method for the sensitive and reliable quantification of hepcidin-25 in serum samples. A chromatographic separation based on usual acidic mobile phases was compared with a novel approach involving the separation of hepcidin-25 with solvents at high pH containing 0.1% of ammonia. Both methods were applied to clinical samples in an intra-laboratory comparison of two LC-MS/MS methods using the same hepcidin-25 calibrators with good correlation of the results. Finally, we recommend a LC-MS/MS-based quantification method with a dynamic range of 0.5-40 μg/L for the assessment of hepcidin-25 in human serum that uses TFA-based mobile phases and silanized glass vials. Graphical abstract Structure of hepcidin-25 (Protein Data Bank, PDB ID 2KEF).

  12. Application of WSP method in analysis of environmental samples

    International Nuclear Information System (INIS)

    Stacho, M.; Slugen, V.; Hinca, R.; Sojak, S.; Krnac, S.

    2014-01-01

    Detection of activity in natural samples is specific, mainly because of its low level and the high background interference. Background interference can be reduced by using a low-background chamber. A measurement geometry in the shape of a Marinelli beaker is commonly used because of the low level of activity in natural samples. The Peak Net Area (PNA) method is the world-wide accepted technique for the analysis of gamma-ray spectra. It is based on calculating the net area of the full-energy peak and therefore takes into account only a fraction of the measured gamma-ray spectrum. The Whole Spectrum Processing (WSP) approach to gamma analysis, on the other hand, makes it possible to use the entire information contained in the spectrum. This significantly raises the efficiency and improves the energy resolution of the analysis. A principal step in applying WSP is building up a suitable response operator. Problems appear when suitable standard calibration sources are unavailable, which may occur in the case of large-volume samples and/or in the analysis of a high energy range. A combined experimental and mathematical calibration may be a suitable solution. Many different detectors have been used to register gamma rays and their energy. HPGe detectors produce the highest resolution commonly available today; they are therefore the detectors most often used in the activity analysis of natural samples. Scintillation detectors analysed using the PNA method can also be used in simple cases, but for complicated spectra they are practically inapplicable. The WSP approach improves the resolution of scintillation detectors and expands their applicability. The WSP method allowed a significant improvement in energy resolution and the separation of the 137Cs 661 keV peak from the 214Bi 609 keV peak. On the other hand, the statistical fluctuations in the lower part of the spectrum, highlighted by background subtraction, mean that this part is still not reliably analyzable. (authors)

  13. A fast learning method for large scale and multi-class samples of SVM

    Science.gov (United States)

    Fan, Yu; Guo, Huiming

    2017-06-01

    A fast learning method for multi-class classification SVM (Support Vector Machine) based on a binary tree is presented to address the low learning efficiency of SVM when processing large-scale multi-class samples. This paper adopts a bottom-up method to set up the binary-tree hierarchy; according to the obtained hierarchy, each node's sub-classifier learns from the corresponding samples. During learning, several class clusters are generated after a first clustering of the training samples. Central points are extracted from those class clusters that contain only one type of sample. For clusters containing two types of samples, the cluster numbers of their positive and negative samples are set according to their degree of mixture, a secondary clustering is undertaken, and central points are then extracted from the resulting sub-class clusters. Sub-classifiers are obtained by learning from the reduced sample set formed by integrating the extracted central points. Simulation experiments show that this fast learning method, based on multi-level clustering, can guarantee high classification accuracy, greatly reduce the number of samples and effectively improve learning efficiency.
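
    The core sample-reduction idea, replacing each class with a modest number of cluster centres before training the SVM, can be sketched as follows. scikit-learn's KMeans stands in for the clustering step and the binary-tree construction is omitted, so this illustrates the principle rather than the authors' exact algorithm.

```python
# Sketch of the sample-reduction idea above: cluster each class and train the
# SVM on the cluster centres instead of on every point.  scikit-learn's KMeans
# stands in for the clustering step, and the binary-tree hierarchy is omitted,
# so this illustrates the principle rather than the authors' exact algorithm.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=8_000, n_features=10, n_informative=6,
                           n_classes=4, n_clusters_per_class=2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Replace each class by a small set of cluster centres.
centres, labels = [], []
for c in np.unique(y_tr):
    km = KMeans(n_clusters=50, n_init=5, random_state=0).fit(X_tr[y_tr == c])
    centres.append(km.cluster_centers_)
    labels.append(np.full(50, c))
X_red, y_red = np.vstack(centres), np.concatenate(labels)

full = SVC(kernel="rbf").fit(X_tr, y_tr)        # baseline: all training samples
reduced = SVC(kernel="rbf").fit(X_red, y_red)   # trained on 200 centres only
print("accuracy, full training set:   ", round(full.score(X_te, y_te), 3))
print("accuracy, reduced training set:", round(reduced.score(X_te, y_te), 3))
```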

  14. Improving the Accuracy of the Hyperspectral Model for Apple Canopy Water Content Prediction using the Equidistant Sampling Method.

    Science.gov (United States)

    Zhao, Huan-San; Zhu, Xi-Cun; Li, Cheng; Wei, Yu; Zhao, Geng-Xing; Jiang, Yuan-Mao

    2017-09-11

    The influence of the equidistant sampling method on a hyperspectral model for the accurate prediction of apple tree canopy water content was explored. The relationship between spectral reflectance and water content was examined using the sample partition methods of equidistant sampling and random sampling, and a stepwise regression model of the apple canopy water content was established. The results showed that the random sampling model was Y = 0.4797 - 721787.3883 × Z3 - 766567.1103 × Z5 - 771392.9030 × Z6; the equidistant sampling model was Y = 0.4613 - 480610.4213 × Z2 - 552189.0450 × Z5 - 1006181.8358 × Z6. After verification, the equidistant sampling method was shown to offer superior prediction ability. The calibration set coefficient of determination of 0.6599 and validation set coefficient of determination of 0.8221 were higher than those of the random sampling model by 9.20% and 10.90%, respectively. The root mean square error (RMSE) of 0.0365 and relative error (RE) of 0.0626 were lower than those of the random sampling model by 17.23% and 17.09%, respectively. Dividing the calibration set and validation set by the equidistant sampling method can improve the prediction accuracy of the hyperspectral model of apple canopy water content.
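
    A minimal sketch of the two partitioning schemes being compared: samples are ranked by the target value and every k-th one is assigned to the validation set ("equidistant"), versus a plain random split. The data and the exact ranking rule are assumptions for illustration; the paper's implementation may differ in detail.

```python
# Minimal sketch of the two partitioning schemes being compared.  Samples are
# ranked by the target value and every k-th one goes to the validation set
# ("equidistant"), versus a plain random split.  The data and the exact ranking
# rule are assumptions for illustration; the paper's implementation may differ.
import numpy as np

rng = np.random.default_rng(1)
water_content = rng.uniform(0.40, 0.60, 120)       # hypothetical canopy water contents

def equidistant_split(y, k=4):
    order = np.argsort(y)                          # rank samples by the target value
    val_idx = order[::k]                           # every k-th ranked sample -> validation
    cal_idx = np.setdiff1d(order, val_idx)
    return cal_idx, val_idx

def random_split(y, frac=0.25):
    idx = rng.permutation(len(y))
    n_val = int(frac * len(y))
    return idx[n_val:], idx[:n_val]

for name, (cal, val) in [("equidistant", equidistant_split(water_content)),
                         ("random", random_split(water_content))]:
    print(f"{name:11s}  calibration range "
          f"{water_content[cal].min():.3f}-{water_content[cal].max():.3f},  "
          f"validation range {water_content[val].min():.3f}-{water_content[val].max():.3f}")
```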

  15. Comparison of fine particle measurements from a direct-reading instrument and a gravimetric sampling method.

    Science.gov (United States)

    Kim, Jee Young; Magari, Shannon R; Herrick, Robert F; Smith, Thomas J; Christiani, David C

    2004-11-01

    Particulate air pollution, specifically the fine particle fraction (PM2.5), has been associated with increased cardiopulmonary morbidity and mortality in general population studies. Occupational exposure to fine particulate matter can exceed ambient levels by a large factor. Due to increased interest in the health effects of particulate matter, many particle sampling methods have been developed. In this study, two such measurement methods were used simultaneously and compared. PM2.5 was sampled using a filter-based gravimetric sampling method and a direct-reading instrument, the TSI Inc. model 8520 DUSTTRAK aerosol monitor. Both sampling methods were used to determine the PM2.5 exposure in a group of boilermakers exposed to welding fumes and residual fuel oil ash. The geometric mean PM2.5 concentration was 0.30 mg/m3 (GSD 3.25) and 0.31 mg/m3 (GSD 2.90) from the DUSTTRAK and gravimetric method, respectively. The Spearman rank correlation coefficient for the gravimetric and DUSTTRAK PM2.5 concentrations was 0.68. Linear regression models indicated that loge DUSTTRAK PM2.5 concentrations significantly predicted loge gravimetric PM2.5 concentrations (p gravimetric PM2.5 concentrations was found to be modified by surrogate measures for seasonal variation and type of aerosol. PM2.5 measurements from the DUSTTRAK are well correlated and highly predictive of measurements from the gravimetric sampling method for the aerosols in these work environments. However, results from this study suggest that aerosol particle characteristics may affect the relationship between the gravimetric and DUSTTRAK PM2.5 measurements. Recalibration of the DUSTTRAK for the specific aerosol, as recommended by the manufacturer, may be necessary to produce valid measures of airborne particulate matter.

  16. Statistical methods for detecting differentially abundant features in clinical metagenomic samples.

    Directory of Open Access Journals (Sweden)

    James Robert White

    2009-04-01

    Numerous studies are currently underway to characterize the microbial communities inhabiting our world. These studies aim to dramatically expand our understanding of the microbial biosphere and, more importantly, hope to reveal the secrets of the complex symbiotic relationship between us and our commensal bacterial microflora. An important prerequisite for such discoveries are computational tools that are able to rapidly and accurately compare large datasets generated from complex bacterial communities to identify features that distinguish them. We present a statistical method for comparing clinical metagenomic samples from two treatment populations on the basis of count data (e.g. as obtained through sequencing) to detect differentially abundant features. Our method, Metastats, employs the false discovery rate to improve specificity in high-complexity environments, and separately handles sparsely-sampled features using Fisher's exact test. Under a variety of simulations, we show that Metastats performs well compared to previously used methods, and significantly outperforms other methods for features with sparse counts. We demonstrate the utility of our method on several datasets including a 16S rRNA survey of obese and lean human gut microbiomes, COG functional profiles of infant and mature gut microbiomes, and bacterial and viral metabolic subsystem data inferred from random sequencing of 85 metagenomes. The application of our method to the obesity dataset reveals differences between obese and lean subjects not reported in the original study. For the COG and subsystem datasets, we provide the first statistically rigorous assessment of the differences between these populations. The methods described in this paper are the first to address clinical metagenomic datasets comprising samples from multiple subjects. Our methods are robust across datasets of varied complexity and sampling level. While designed for metagenomic applications, our software
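
    The sparse-count handling described above can be sketched with a Fisher's exact test on pooled counts per group plus Benjamini-Hochberg adjustment. This is an illustration of the idea, not the Metastats implementation, and the count matrix is simulated.

```python
# Sketch of the sparse-feature handling described above: pooled counts for each
# feature are compared between the two groups with Fisher's exact test, and the
# p-values are adjusted with Benjamini-Hochberg FDR.  This is an illustration,
# not the Metastats implementation, and the count matrix is simulated.
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(3)
n_features, n_subj = 20, 8
group1 = rng.poisson(2, size=(n_features, n_subj))      # counts: features x subjects
group2 = rng.poisson(2, size=(n_features, n_subj))
group2[:3] = rng.poisson(10, size=(3, n_subj))          # three truly enriched features

def bh_adjust(pvals):
    """Benjamini-Hochberg adjusted p-values."""
    p = np.asarray(pvals, dtype=float)
    order = np.argsort(p)
    scaled = p[order] * len(p) / (np.arange(len(p)) + 1)
    scaled = np.minimum.accumulate(scaled[::-1])[::-1]   # enforce monotonicity
    adjusted = np.empty_like(p)
    adjusted[order] = np.minimum(scaled, 1.0)
    return adjusted

tot1, tot2 = group1.sum(), group2.sum()
pvals = []
for f in range(n_features):
    a, b = group1[f].sum(), group2[f].sum()
    table = [[a, tot1 - a], [b, tot2 - b]]               # feature vs. remaining reads
    pvals.append(fisher_exact(table)[1])

qvals = bh_adjust(pvals)
print("features called differentially abundant (q < 0.05):", np.where(qvals < 0.05)[0])
```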

  17. Effect of the absolute statistic on gene-sampling gene-set analysis methods.

    Science.gov (United States)

    Nam, Dougu

    2017-06-01

    Gene-set enrichment analysis and its modified versions have commonly been used for identifying altered functions or pathways in disease from microarray data. In particular, the simple gene-sampling gene-set analysis methods have been heavily used for datasets with only a few sample replicates. The biggest problem with this approach is the highly inflated false-positive rate. In this paper, the effect of absolute gene statistic on gene-sampling gene-set analysis methods is systematically investigated. Thus far, the absolute gene statistic has merely been regarded as a supplementary method for capturing the bidirectional changes in each gene set. Here, it is shown that incorporating the absolute gene statistic in gene-sampling gene-set analysis substantially reduces the false-positive rate and improves the overall discriminatory ability. Its effect was investigated by power, false-positive rate, and receiver operating curve for a number of simulated and real datasets. The performances of gene-set analysis methods in one-tailed (genome-wide association study) and two-tailed (gene expression data) tests were also compared and discussed.

  18. Microbial diversity in fecal samples depends on DNA extraction method

    DEFF Research Database (Denmark)

    Mirsepasi, Hengameh; Persson, Søren; Struve, Carsten

    2014-01-01

    BACKGROUND: There are challenges when extracting bacterial DNA from specimens for molecular diagnostics, since fecal samples also contain DNA from human cells and many different substances derived from food, cell residues and medication that can inhibit downstream PCR. The purpose of the study...... was to evaluate two different DNA extraction methods in order to choose the most efficient method for studying intestinal bacterial diversity using Denaturing Gradient Gel Electrophoresis (DGGE). FINDINGS: In this study, a semi-automatic DNA extraction system (easyMag®, BioMérieux, Marcy I'Etoile, France...... by easyMag® from the same fecal samples. Furthermore, DNA extracts obtained using easyMag® seemed to contain inhibitory compounds, since in order to perform a successful PCR-analysis, the sample should be diluted at least 10 times. DGGE performed on PCR from DNA extracted by QIAamp DNA Stool Mini Kit DNA

  19. The economic impact of poor sample quality in clinical chemistry laboratories: results from a global survey.

    Science.gov (United States)

    Erdal, Erik P; Mitra, Debanjali; Khangulov, Victor S; Church, Stephen; Plokhoy, Elizabeth

    2017-03-01

    Background Despite advances in clinical chemistry testing, poor blood sample quality continues to impact laboratory operations and the quality of results. While previous studies have identified the preanalytical causes of lower sample quality, few studies have examined the economic impact of poor sample quality on the laboratory. Specifically, the costs associated with workarounds related to fibrin and gel contaminants remain largely unexplored. Methods A quantitative survey of clinical chemistry laboratory stakeholders across 10 international regions, including countries in North America, Europe and Oceania, was conducted to examine current blood sample testing practices, sample quality issues and practices to remediate poor sample quality. Survey data were used to estimate costs incurred by laboratories to mitigate sample quality issues. Results Responses from 164 participants were included in the analysis, which was focused on three specific issues: fibrin strands, fibrin masses and gel globules. Fibrin strands were the most commonly reported issue, with an overall incidence rate of ∼3%. Further, 65% of respondents indicated that these issues contribute to analyzer probe clogging, and the majority of laboratories had visual inspection and manual remediation practices in place to address fibrin- and gel-related quality problems (55% and 70%, respectively). Probe maintenance/replacement, visual inspection and manual remediation were estimated to carry significant costs for the laboratories surveyed. Annual cost associated with lower sample quality and remediation related to fibrin and/or gel globules for an average US laboratory was estimated to be $100,247. Conclusions Measures to improve blood sample quality present an important step towards improved laboratory operations.

  20. Detection and monitoring of invasive exotic plants: a comparison of four sampling methods

    Science.gov (United States)

    Cynthia D. Huebner

    2007-01-01

    The ability to detect and monitor exotic invasive plants is likely to vary depending on the sampling method employed. Methods with strong qualitative thoroughness for species detection often lack the intensity necessary to monitor vegetation change. Four sampling methods (systematic plot, stratified-random plot, modified Whittaker, and timed meander) in hemlock and red...

  1. Suitability of the line intersect method for sampling hardwood logging residues

    Science.gov (United States)

    A. Jeff Martin

    1976-01-01

    The line intersect method of sampling logging residues was tested in Appalachian hardwoods and was found to provide unbiased estimates of the volume of residue in cubic feet per acre. Thirty-two chains of sample line were established on each of sixteen 1-acre plots on cutover areas in a variety of conditions. Estimates from these samples were then compared to actual...
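
    The abstract does not reproduce the estimator, but the widely cited Van Wagner form for line intersect sampling puts the residue volume per unit area at V = π²·Σdᵢ²/(8L), with dᵢ the piece diameters at the intersection points and L the total length of sample line in consistent units. The sketch below uses invented diameters and converts the 32 chains of line to metres.

```python
# Hedged sketch of the Van Wagner line-intersect estimator (not taken from the
# paper): V = pi^2 * sum(d_i^2) / (8 * L), volume of downed residue per unit
# area, with diameters d_i at the intersection points and total line length L
# in consistent units.  Diameters are invented; 1 chain is taken as 20.12 m.
import math

diameters_cm = [4.2, 7.5, 10.1, 3.3, 12.8, 6.0]   # hypothetical intersected pieces
line_length_m = 32 * 20.12                         # 32 chains of sample line

sum_d2_m2 = sum((d / 100.0) ** 2 for d in diameters_cm)        # work in metres
vol_per_m2 = math.pi ** 2 * sum_d2_m2 / (8.0 * line_length_m)  # m^3 per m^2
print(f"Residue volume ≈ {vol_per_m2 * 10_000:.2f} m^3/ha")
```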

  2. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    International Nuclear Information System (INIS)

    Suermann, J.F.

    1996-04-01

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits

  3. Study on a pattern classification method of soil quality based on simplified learning sample dataset

    Science.gov (United States)

    Zhang, Jiahua; Liu, S.; Hu, Y.; Tian, Y.

    2011-01-01

    Based on the massive amount of soil information involved in current soil quality grade evaluation, this paper constructs an intelligent classification approach for soil quality grade based on classical sampling techniques and a disordered multi-class Logistic regression model. As a case study, the learning sample capacity was determined under a given confidence level and estimation accuracy, and the c-means algorithm was used to automatically extract a simplified learning sample dataset from the cultivated soil quality grade evaluation database for the study area, Longchuan county in Guangdong province. A disordered Logistic classifier model was then built, and the calculation and analysis steps of intelligent soil quality grade classification are given. The results indicate that the soil quality grade can be effectively learned and predicted from the extracted simplified dataset with this method, which changes the traditional method of soil quality grade evaluation. © 2011 IEEE.

  4. Finish-Kazakhstan cooperation on an aerosols sampling - testing of a new methods for nuclear monitoring improvement

    International Nuclear Information System (INIS)

    Tarvajnen, M.; Akhmetov, M.A.; Ptitskaya, L.D.; Osintsev, A.Yu.; Zhantikin, T.M.; Eligbaeva, G.

    2001-01-01

    Aerosol sampling is a powerful method for monitoring air radioactivity of both natural and artificial origin. To date, the IAEA has not made use of aerosol sampling studies. To study the possibility of applying this method to radiation monitoring, the state authorities of Finland and the Republic of Kazakhstan - the Radiation and Nuclear Safety Authority (STUK) and the Kazakhstan Atomic Energy Committee - jointly carried out field tests in Kazakhstan. The test began in Kurchatov in April 2000 - at the request of the IAEA working team on Iraq - close to the former Semipalatinsk test site, and ended in Astana in August 2001. The main aim of the field test was to study the feasibility and appropriateness of the aerosol sampling concept and technology under the local environmental conditions. The paper also discusses the roles of the participating parties in the field test as well as the main results and conclusions. The experience gained will allow the aerosol sampling method to be developed for application in strengthening IAEA international safeguards measures.

  5. Small Body GN and C Research Report: G-SAMPLE - An In-Flight Dynamical Method for Identifying Sample Mass [External Release Version

    Science.gov (United States)

    Carson, John M., III; Bayard, David S.

    2006-01-01

    G-SAMPLE is an in-flight dynamical method for use by sample collection missions to identify the presence and quantity of collected sample material. The G-SAMPLE method implements a maximum-likelihood estimator to identify the collected sample mass, based on onboard force sensor measurements, thruster firings, and a dynamics model of the spacecraft. With G-SAMPLE, sample mass identification becomes a computation rather than an extra hardware requirement; the added cost of cameras or other sensors for sample mass detection is avoided. Realistic simulation examples are provided for a spacecraft configuration with a sample collection device mounted on the end of an extended boom. In one representative example, a 1000 gram sample mass is estimated to within 110 grams (95% confidence) under realistic assumptions of thruster profile error, spacecraft parameter uncertainty, and sensor noise. For convenience to future mission design, an overall sample-mass estimation error budget is developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.

  6. Quantitative portable gamma spectroscopy sample analysis for non-standard sample geometries

    International Nuclear Information System (INIS)

    Enghauser, M.W.; Ebara, S.B.

    1997-01-01

    Utilizing a portable spectroscopy system, a quantitative method for the analysis of samples containing a mixture of fission and activation products in nonstandard geometries was developed. The method can be used with various sample and shielding configurations where analysis on a laboratory-based gamma spectroscopy system is impractical. The portable gamma spectroscopy method involves calibration of the detector and modeling of the sample and shielding to identify and quantify the radionuclides present in the sample. The method utilizes the intrinsic efficiency of the detector and the unattenuated gamma fluence rate at the detector surface per unit activity from the sample to calculate the nuclide activity and Minimum Detectable Activity (MDA). For a complex geometry, a computer code written for shielding applications (MICROSHIELD) is utilized to determine the unattenuated gamma fluence rate per unit activity at the detector surface. The method is only applicable to nuclides which emit gamma rays and cannot be used for pure beta emitters. In addition, if sample self-absorption and shielding are significant, the attenuation will result in high MDAs for nuclides which solely emit low-energy gamma rays. The following presents the analysis technique and verification results demonstrating the accuracy of the method.
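
    A hedged sketch of the quantification step described above: the activity follows from the net peak count rate divided by the product of intrinsic efficiency, detector frontal area and the modelled fluence rate per unit activity, with a Currie-style expression for the MDA. All numbers are invented, and the fluence term would in practice come from a shielding code such as MICROSHIELD.

```python
# Hedged sketch of the quantification step: activity from the net peak count
# rate and a modelled counts-per-becquerel conversion, plus a Currie-style MDA.
# All numbers are invented; the fluence-per-unit-activity term would in practice
# come from a shielding code such as MICROSHIELD.
import math

net_rate = 4.8           # counts/s in the full-energy peak (hypothetical)
live_time = 600.0        # s, counting time
bkg_counts = 900.0       # counts in the continuum under the peak (hypothetical)
eps_intrinsic = 0.25     # counts per photon incident on the crystal (hypothetical)
fluence_per_bq = 2.0e-6  # photons/cm^2/s at the detector face per Bq (modelled)
det_area = 20.0          # cm^2, frontal area of the crystal (hypothetical)

counts_per_bq = eps_intrinsic * fluence_per_bq * det_area   # (counts/s) per Bq
activity = net_rate / counts_per_bq
mda = (2.71 + 4.65 * math.sqrt(bkg_counts)) / (counts_per_bq * live_time)
print(f"Activity ≈ {activity:.3e} Bq,   MDA ≈ {mda:.3e} Bq")
```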

  7. Effect of sample preparation methods on photometric determination of the tellurium and cobalt content in the samples of copper concentrates

    Directory of Open Access Journals (Sweden)

    Viktoriya Butenko

    2016-03-01

    The methods for determining cobalt and nickel in copper concentrates currently used in factory laboratories are very labor-intensive and time-consuming. The limiting stage of the analysis is the preliminary chemical sample preparation. Carrying out the decomposition of industrial samples with concentrated mineral acids in open systems does not allow the metrological characteristics of the methods to be improved; for this reason, improving the sample preparation methods is quite relevant and of practical interest. The work was dedicated to determining the optimal conditions for the preliminary chemical preparation of copper concentrate samples for the subsequent determination of cobalt and tellurium in the obtained solution using a tellurium-spectrophotometric method. Decomposition of the samples was carried out by dissolution in individual mineral acids and their mixtures with heating in an open system, as well as by using ultrasonication and microwave radiation in a closed system. In order to select the optimal conditions for the decomposition of the samples in a closed system, the phase contact time and the ultrasonic generator's power were varied. Intensification of the decomposition of copper concentrates with nitric acid (1:1), ultrasound and microwave radiation allowed cobalt and tellurium to be transferred quantitatively into solution in 20 and 30 min, respectively. This reduced the amount of reagents used and improved the accuracy of determination by running the process under strictly identical conditions.

  8. Optimization of fecal cytology in the dog: comparison of three sampling methods.

    Science.gov (United States)

    Frezoulis, Petros S; Angelidou, Elisavet; Diakou, Anastasia; Rallis, Timoleon S; Mylonakis, Mathios E

    2017-09-01

    Dry-mount fecal cytology (FC) is a component of the diagnostic evaluation of gastrointestinal diseases. There is limited information on the possible effect of the sampling method on the cytologic findings of healthy dogs or dogs admitted with diarrhea. We aimed to: (1) establish sampling method-specific expected values of selected cytologic parameters (isolated or clustered epithelial cells, neutrophils, lymphocytes, macrophages, spore-forming rods) in clinically healthy dogs; (2) investigate if the detection of cytologic abnormalities differs among methods in dogs admitted with diarrhea; and (3) investigate if there is any association between FC abnormalities and the anatomic origin (small- or large-bowel diarrhea) or the chronicity of diarrhea. Sampling with digital examination (DE), rectal scraping (RS), and rectal lavage (RL) was prospectively assessed in 37 healthy and 34 diarrheic dogs. The median numbers of isolated ( p = 0.000) or clustered ( p = 0.002) epithelial cells, and of lymphocytes ( p = 0.000), differed among the 3 methods in healthy dogs. In the diarrheic dogs, the RL method was the least sensitive in detecting neutrophils, and isolated or clustered epithelial cells. Cytologic abnormalities were not associated with the origin or the chronicity of diarrhea. Sampling methods differed in their sensitivity to detect abnormalities in FC; DE or RS may be of higher sensitivity compared to RL. Anatomic origin or chronicity of diarrhea do not seem to affect the detection of cytologic abnormalities.

  9. SEAMIST trademark soil sampling for tritiated water: First year's results

    International Nuclear Information System (INIS)

    Mallon, B.; Martins, S.A.; Houpis, J.L.; Lowry, W.; Cremer, C.D.

    1992-01-01

    SEAMIST trademark is a recently developed sampling system that enables one to measure various soil parameters by means of an inverted, removable, impermeable membrane tube inserted in a borehole. This membrane tube can have various measuring devices installed on it, such as gas ports, adsorbent pads, and electrical sensors. These membrane tubes are made of a laminated polymer. The Lawrence Livermore National Laboratory in Livermore, California, has installed two of these systems to monitor tritium in soil resulting from a leak in an underground storage tank. One tube is equipped with gas ports to sample soil vapor and the other with adsorbent pads to sample soil moisture. Borehole stability was maintained using either sand-filled or air-inflated tubes. Both system implementations yielded concentrations or activities that compared well with the measured concentrations of tritium in the soil taken during borehole construction. In addition, an analysis of the data suggests that both systems prevented the vertical migration of tritium in the boreholes. Also, a neutron probe was successfully used in a blank membrane inserted in one of the boreholes to monitor the moisture in the soil without exposing the probe to the tritium. The neutron log showed excellent agreement with the soil moisture content measured in soil samples taken during borehole construction. This paper describes the two SEAMIST trademark systems used and presents sampling results and comparisons

  10. Method validation and uncertainty evaluation of organically bound tritium analysis in environmental sample.

    Science.gov (United States)

    Huang, Yan-Jun; Zeng, Fan; Zhang, Bing; Chen, Chao-Feng; Qin, Hong-Juan; Wu, Lian-Sheng; Guo, Gui-Yin; Yang, Li-Tao; Shang-Guan, Zhi-Hong

    2014-08-01

    The analytical method for organically bound tritium (OBT) was developed in our laboratory. Optimized operating conditions and parameters were established for sample drying, special combustion, distillation, and measurement on a liquid scintillation spectrometer (LSC). Selected types of OBT samples, such as rice, corn, rapeseed, fresh lettuce and pork, were analyzed to validate the method in terms of recovery rate reproducibility and minimum detection concentration, and the uncertainty for a typical low-level environmental sample was evaluated. The combustion water recovery rate of the different dried environmental samples was kept at about 80%, and the minimum detection concentration of OBT ranged from 0.61 to 0.89 Bq/kg (dry weight), depending on the hydrogen content. This shows that the method is suitable for OBT analysis of environmental samples, with a stable recovery rate, and that the combustion water yield of a sample weighing about 40 g provides a sufficient quantity for measurement on the LSC. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Note: A new method for directly reducing the sampling jitter noise of the digital phasemeter

    Science.gov (United States)

    Liang, Yu-Rong

    2018-03-01

    The sampling jitter noise is one non-negligible noise source of the digital phasemeter used for space gravitational wave detection missions. This note provides a new method for directly reducing the sampling jitter noise of the digital phasemeter, by adding a dedicated signal of which the frequency, amplitude, and initial phase should be pre-set. In contrast to the phase correction using the pilot-tone in the work of Burnett, Gerberding et al., Liang et al., Ales et al., Gerberding et al., and Ware et al. [M.Sc. thesis, Luleå University of Technology, 2010; Classical Quantum Gravity 30, 235029 (2013); Rev. Sci. Instrum. 86, 016106 (2015); Rev. Sci. Instrum. 86, 084502 (2015); Rev. Sci. Instrum. 86, 074501 (2015); and Proceedings of the Earth Science Technology Conference (NASA, USA, 2006)], the new method is intrinsically additive noise suppression. The experiment results validate that the new method directly reduces the sampling jitter noise without data post-processing and provides the same phase measurement noise level (10-6 rad/Hz1/2 at 0.1 Hz) as the pilot-tone correction.

  12. Comparison of Three Sample Preparation Methods for Analysis of Chemical Warfare Agent Stimulants in Water

    International Nuclear Information System (INIS)

    Alessandro Sassolini

    2015-01-01

    Analytical chemistry in a CBRNe (Chemical, Biological, Radiological, Nuclear, explosive) context requires not only high-quality data; quickness, ruggedness and robustness are also mandatory. In this work, three sample preparation methods were compared using several organophosphorus pesticides as test compounds, used as simulants of nerve CWA (Chemical Warfare Agents), in order to choose the one with the best characteristics. The best results were obtained with Dispersive Liquid-Liquid Micro-Extraction (DLLME), relatively new in the CBRNe field, giving uncertainties for the different simulants between 8 and 15% and quantification limits between 0.01 and 0.08 μg/l. To optimize this extraction method, different organochlorinated solvents were also tested, but no relevant difference was observed in these tests. In this work, all samples were analyzed using gas chromatography coupled with a mass spectrometer (GC-MS), and the DLLME samples were also analyzed with a gas chromatograph coupled with a nitrogen-phosphorus detector (NPD) to evaluate a low-cost and rugged instrument suited to field analytical methods, with good performance in terms of uncertainty and sensitivity, even if poorer than that of mass spectrometry. (author)

  13. New kinetic-spectrophotometric method for monitoring the concentration of iodine in river and city water samples.

    Science.gov (United States)

    Farmany, A; Khosravi, A; Abbasi, S; Cheraghi, J; Hushmandfar, R; Sobhanardakani, S; Noorizadeh, H; Mortazavi, S S

    2013-01-01

    A new kinetic method has been developed for the determination of iodine in water samples. The method is based on the catalytic effect of I(-) on the oxidation of Indigo Carmine (IC) by KBrO(3) in sulfuric acid medium. The optimum conditions obtained are 0.16 M sulfuric acid, 1 × 10(-3) M IC, 1 × 10(-2) M KBrO(3), a reaction temperature of 35°C, and a reaction time of 80 s at 612 nm. Under the optimized conditions, the method allowed the quantification of I(-) in the range of 12-375 ng/mL with a detection limit of 0.46 ng/mL. The method was applied to the determination of iodine in river and city water samples with satisfactory results.
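
    As a generic illustration of how a catalytic-kinetic calibration of this kind is usually evaluated, the sketch below fits a straight line to concentration/response pairs, estimates a detection limit from the 3·s(blank)/slope criterion, and back-calculates an unknown concentration. All numbers are invented placeholders, not data from the study.

        import numpy as np

        # Hypothetical calibration points: iodide concentration [ng/mL] vs. response
        conc = np.array([12, 50, 100, 200, 300, 375], dtype=float)
        resp = np.array([0.021, 0.080, 0.155, 0.310, 0.462, 0.571])

        slope, intercept = np.polyfit(conc, resp, 1)   # least-squares straight line

        # Detection limit from replicate blank responses (3-sigma criterion)
        blank = np.array([0.0003, 0.0007, 0.0005, 0.0004, 0.0006])
        lod = 3 * blank.std(ddof=1) / slope

        # Inverse prediction: concentration of an unknown sample from its response
        unknown_conc = (0.250 - intercept) / slope

        print(f"slope={slope:.5f}, intercept={intercept:.5f}")
        print(f"LOD = {lod:.2f} ng/mL, unknown = {unknown_conc:.1f} ng/mL")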

  14. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    Science.gov (United States)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
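
    For readers unfamiliar with acceptance sampling by variables, the sketch below shows the textbook one-sided decision rule (the k-method with unknown standard deviation): accept the lot if (USL − mean)/s ≥ k. The plan parameters and measurements are placeholders, not the plans or distributions evaluated in the NESC tests.

        import numpy as np

        def variables_acceptance(measurements, usl, k):
            """One-sided variables sampling plan, sigma unknown (k-method):
            accept the lot if (USL - sample mean) / sample std >= k."""
            x = np.asarray(measurements, dtype=float)
            q = (usl - x.mean()) / x.std(ddof=1)
            return q >= k, q

        # Hypothetical plan (n = 10, k = 1.72) applied to hypothetical measurements
        data = [9.1, 9.4, 8.8, 9.0, 9.3, 9.2, 8.9, 9.5, 9.1, 9.0]
        accept, q = variables_acceptance(data, usl=10.0, k=1.72)
        print(f"Q = {q:.2f} -> {'accept' if accept else 'reject'} the lot")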

  15. Sampling methods and non-destructive examination techniques for large radioactive waste packages

    International Nuclear Information System (INIS)

    Green, T.H.; Smith, D.L.; Burgoyne, K.E.; Maxwell, D.J.; Norris, G.H.; Billington, D.M.; Pipe, R.G.; Smith, J.E.; Inman, C.M.

    1992-01-01

    Progress is reported on work undertaken to evaluate quality checking methods for radioactive wastes. A sampling rig was designed, fabricated and used to develop techniques for the destructive sampling of cemented simulant waste using remotely operated equipment. An engineered system for the containment of cooling water was designed and manufactured and successfully demonstrated with the drum and coring equipment mounted in both vertical and horizontal orientations. The preferred in-cell orientation was found to be with the drum and coring machinery mounted in a horizontal position. Small powdered samples can be taken from cemented homogeneous waste cores using a hollow drill/vacuum section technique with the preferred subsampling technique being to discard the outer 10 mm layer to obtain a representative sample of the cement core. Cement blends can be dissolved using fusion techniques and the resulting solutions are stable to gelling for periods in excess of one year. Although hydrochloric acid and nitric acid are promising solvents for dissolution of cement blends, the resultant solutions tend to form silicic acid gels. An estimate of the beta-emitter content of cemented waste packages can be obtained by a combination of non-destructive and destructive techniques. The errors will probably be in excess of +/-60 % at the 95 % confidence level. Real-time X-ray video-imaging techniques have been used to analyse drums of uncompressed, hand-compressed, in-drum compacted and high-force compacted (i.e. supercompacted) simulant waste. The results have confirmed the applicability of this technique for NDT of low-level waste. 8 refs., 12 figs., 3 tabs

  16. A Comparison of Multivariate and Pre-Processing Methods for Quantitative Laser-Induced Breakdown Spectroscopy of Geologic Samples

    Science.gov (United States)

    Anderson, R. B.; Morris, R. V.; Clegg, S. M.; Bell, J. F., III; Humphries, S. D.; Wiens, R. C.

    2011-01-01

    The ChemCam instrument selected for the Curiosity rover is capable of remote laser-induced breakdown spectroscopy (LIBS).[1] We used a remote LIBS instrument similar to ChemCam to analyze 197 geologic slab samples and 32 pressed-powder geostandards. The slab samples are well-characterized and have been used to validate the calibration of previous instruments on Mars missions, including CRISM [2], OMEGA [3], the MER Pancam [4], Mini-TES [5], and Moessbauer [6] instruments and the Phoenix SSI [7]. The resulting dataset was used to compare multivariate methods for quantitative LIBS and to determine the effect of grain size on calculations. Three multivariate methods - partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs - were used to generate models and extract the quantitative composition of unknown samples. PLS can be used to predict one element (PLS1) or multiple elements (PLS2) at a time, as can the neural network methods. Although MLP and CC ANNs were successful in some cases, PLS generally produced the most accurate and precise results.
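
    As an illustration of the kind of multivariate calibration compared above, the sketch below trains a PLS2 model that maps spectra to element concentrations, with synthetic data standing in for the LIBS spectra and reference compositions; it assumes scikit-learn is available and is not the authors' pipeline.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)

        # Synthetic stand-in data: 200 "spectra" of 500 channels, 3 "element" targets
        n_samples, n_channels, n_elements = 200, 500, 3
        loadings = rng.normal(size=(n_channels, n_elements))
        concentrations = rng.uniform(0, 100, size=(n_samples, n_elements))
        spectra = concentrations @ loadings.T + rng.normal(scale=5.0,
                                                           size=(n_samples, n_channels))

        X_train, X_test, y_train, y_test = train_test_split(
            spectra, concentrations, test_size=0.25, random_state=0)

        pls = PLSRegression(n_components=8)   # PLS2: predicts all elements at once
        pls.fit(X_train, y_train)
        print("R^2 on held-out samples:", pls.score(X_test, y_test))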

  17. Appropriate xenon-inhalation speed in xenon-enhanced CT using the end-tidal gas-sampling method

    International Nuclear Information System (INIS)

    Suga, Sadao; Toya, Shigeo; Kawase, Takeshi; Koyama, Hideki; Shiga, Hayao

    1986-01-01

    This report describes some problems that arise when end-tidal xenon gas is substituted for the arterial xenon concentration in xenon-enhanced CT. The authors used a newly developed xenon inhalator with a xenon-gas-concentration analyzer and performed xenon-enhanced CT by means of the ''arterio-venous shunt'' method and the ''end-tidal gas-sampling'' method simultaneously. By the former method, the arterial build-up rate (K) was obtained directly from the CT slices of a blood circuit passing through the phantom. By the latter method, it was calculated from the xenon concentration of end-tidal gas sampled from the mask. The speed of xenon supply was varied between 0.6 - 1.2 L/min in 11 patients with or without a cerebral lesion. The results revealed that rapid xenon inhalation caused a discrepancy in the arterial K between the ''shunt'' method and the ''end-tidal'' method. This discrepancy may be attributable to the mixing of inhaled gas and expired gas in respiratory dead space, such as the nasal cavity or the mask. The cerebral blood flow was underestimated because of the higher arterial K in the latter method. Too slow an inhalation, however, was time-wasting and increased body motion in the subanesthetic state. Therefore, an inhalation speed at which the arterial K was as much as 0.2 was ideal for the end-tidal xenon concentration to represent the arterial K in the ''end-tidal gas-sampling'' method. When attention is given to this point, this method may offer a reliable absolute value in xenon-enhanced CT. (author)

  18. Investigate the capability of INAA absolute method to determine the concentrations of 238U and 232Th in rock samples

    International Nuclear Information System (INIS)

    Alnour, I.A.

    2014-01-01

    This work aimed to study the capability of the INAA absolute method in determining the elemental concentration of 238U and 232Th in rock samples. The INAA absolute method was implemented at the PUSPATI TRIGA Mark II research reactor, Malaysian Nuclear Agency (NM). The accuracy of the INAA absolute method was assessed by analyzing the IAEA certified reference material (CRM) Soil-7. The analytical results showed that the deviations between experimental and certified values were mostly less than 10%, with Z-scores in most cases less than 1. In general, the results for the analysed CRM Soil-7 show good agreement between certified and experimental values, which means that the INAA absolute method can be used accurately for elemental analysis of uranium and thorium in various types of samples. The concentrations of 238U and 232Th ranged from 1.77 to 24.25 and from 0.88 to 95.50 ppm, respectively. The highest 238U value was recorded for granite rock sample G17 and the highest 232Th value for sample G9, whereas the lowest values were 1.77 ppm of 238U, recorded in sandstone rock, and 0.88 ppm of 232Th, for gabbro. Moreover, a comparison of the 238U and 232Th results obtained by the INAA absolute method shows an acceptable level of consistency with those obtained by the INAA relative method. (author)
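
    As a reminder of how agreement with a certified reference material is usually scored, the sketch below computes the relative deviation and a Z-score, z = (measured − certified)/sqrt(u_meas² + u_cert²). The values are placeholders, not the Soil-7 results of the study.

        import math

        def z_score(measured, u_measured, certified, u_certified):
            """Z-score against a certified value, combining both uncertainties."""
            return (measured - certified) / math.sqrt(u_measured**2 + u_certified**2)

        # Hypothetical result for one element in a CRM (all values in ppm)
        measured, u_measured = 11.4, 0.6
        certified, u_certified = 11.0, 0.5

        deviation = 100 * (measured - certified) / certified
        z = z_score(measured, u_measured, certified, u_certified)
        print(f"relative deviation = {deviation:.1f} %, z = {z:.2f}")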

  19. Comparison of vapor sampling system (VSS) and in situ vapor sampling (ISVS) methods on Tanks C-107, BY-108, and S-102. Revision 1

    International Nuclear Information System (INIS)

    Huckaby, J.L.; Edwards, J.A.; Evans, J.C.

    1996-08-01

    This report discusses comparison tests for two methods of collecting vapor samples from the Hanford Site high-level radioactive waste tank headspaces. The two sampling methods compared are the truck-mounted vapor sampling system (VSS) and the cart-mounted in situ vapor sampling (ISVS) system. Three tanks were sampled by both the VSS and ISVS methods from the same access risers within the same 8-hour period. These tanks have diverse headspace compositions, and they represent the highest known levels of several key vapor analytes

  20. Monte Carlo Methods Development and Applications in Conformational Sampling of Proteins

    DEFF Research Database (Denmark)

    Tian, Pengfei

    quantitative insights into their thermodynamic and mechanistic properties that are difficult to probe in laboratory experiments. However, despite the rapid progress in the development of molecular simulation, there are still two limiting factors, (1), the current molecular mechanics force fields alone...... sampling methods to address these two problems. First of all, a novel technique has been developed for reliably estimating diffusion coefficients for use in the enhanced sampling of molecular simulations. A broad applicability of this method is illustrated by studying various simulation problems...

  1. A METHOD FOR PREPARING A SUBSTRATE BY APPLYING A SAMPLE TO BE ANALYSED

    DEFF Research Database (Denmark)

    2017-01-01

    The invention relates to a method for preparing a substrate (105a) comprising a sample reception area (110) and a sensing area (111). The method comprises the steps of: 1) applying a sample on the sample reception area; 2) rotating the substrate around a predetermined axis; 3) during rotation......, at least part of the liquid travels from the sample reception area to the sensing area due to capillary forces acting between the liquid and the substrate; and 4) removing the wave of particles and liquid formed at one end of the substrate. The sensing area is closer to the predetermined axis than...... the sample reception area. The sample comprises a liquid part and particles suspended therein....

  2. Influence of Sample Size on Automatic Positional Accuracy Assessment Methods for Urban Areas

    Directory of Open Access Journals (Sweden)

    Francisco J. Ariza-López

    2018-05-01

    Full Text Available In recent years, new approaches aimed at increasing the automation level of positional accuracy assessment processes for spatial data have been developed. However, in such cases, an aspect as significant as sample size has not yet been addressed. In this paper, we study the influence of sample size when estimating the planimetric positional accuracy of urban databases by means of an automatic assessment using a polygon-based methodology. Our study is based on a simulation process, which extracts pairs of homologous polygons from the assessed and reference data sources and applies two buffer-based methods. The parameter used for determining the different sizes (which range from 5 km up to 100 km) has been the length of the polygons' perimeter, and for each sample size 1000 simulations were run. After completing the simulation process, the comparisons between the estimated distribution functions for each sample and the population distribution function were carried out by means of the Kolmogorov–Smirnov test. Results show a significant reduction in the variability of estimations when sample size increased from 5 km to 100 km.
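
    The comparison step described above, testing whether the distribution estimated from a finite sample matches the population distribution, can be reproduced in outline with a two-sample Kolmogorov-Smirnov test, as in the sketch below; it assumes SciPy, and the simulated positional errors are placeholders for the buffer-based accuracy measures of the study.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)

        # "Population" of positional errors and a smaller sample drawn from it
        population = rng.normal(loc=1.5, scale=0.4, size=100_000)  # e.g. metres
        sample = rng.choice(population, size=200, replace=False)

        # Two-sample KS test: does the sample's empirical CDF match the population's?
        statistic, p_value = stats.ks_2samp(sample, population)
        print(f"KS statistic = {statistic:.3f}, p-value = {p_value:.3f}")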

  3. Simultaneous Determination of Tetracycline Residues in Bovine Milk Samples by Solid Phase Extraction and HPLC-FL Method

    Directory of Open Access Journals (Sweden)

    Mehra Mesgari Abbasi

    2011-06-01

    Full Text Available Introduction: Tetracyclines (TCs) are widely used in animal husbandry and their residues in milk may result in harmful effects on humans. The aim of this study was to investigate the presence of TC residues in various bovine milk samples from local markets of Ardabil, Iran. Methods: One hundred and fourteen pasteurized, sterilized and raw milk samples were collected from markets of Ardabil. Tetracycline, oxytetracycline and chlortetracycline (TCs) residues were extracted by a solid phase extraction method. Determination of TC residues was performed by a high performance liquid chromatography (HPLC) method using a fluorescence detector. Results: The mean total TC residue in all samples (114 samples) was 97.6 ± 16.9 ng/g, and that of the pasteurized, sterilized and raw milk samples was 87.1 ± 17.7, 112.0 ± 57.3 and 154.0 ± 66.3 ng/g, respectively. Of all the samples, 25.4%, and of the pasteurized, sterilized and raw milk samples 24.4%, 30% and 28.6%, respectively, had TC residues higher than the recommended maximum level (100 ng/g). Conclusion: This study indicates the presence of tetracycline residues above the allowed amount. Regulatory authorities should ensure a proper withdrawal period before milking the animals, and definite supervision of the application of these drugs is necessary.

  4. Surface plasmon resonance based sensing of different chemical and biological samples using admittance loci method

    Science.gov (United States)

    Brahmachari, Kaushik; Ghosh, Sharmila; Ray, Mina

    2013-06-01

    The admittance loci method plays an important role in the design of multilayer thin film structures. In this paper, the admittance loci method has been explored theoretically for sensing of various chemical and biological samples based on the surface plasmon resonance (SPR) phenomenon. A dielectric multilayer structure consisting of a borosilicate glass (BSG) substrate, calcium fluoride (CaF2) and zirconium dioxide (ZrO2) along with different dielectric layers has been investigated. Moreover, admittance loci as well as SPR curves of a metal-dielectric multilayer structure consisting of the BSG substrate, gold metal film and various dielectric samples have been simulated in the MATLAB environment. To validate the proposed simulation results, calibration curves have also been provided.

  5. Estimation of AUC or Partial AUC under Test-Result-Dependent Sampling.

    Science.gov (United States)

    Wang, Xiaofei; Ma, Junling; George, Stephen; Zhou, Haibo

    2012-01-01

    The area under the ROC curve (AUC) and partial area under the ROC curve (pAUC) are summary measures used to assess the accuracy of a biomarker in discriminating true disease status. The standard sampling approach used in biomarker validation studies is often inefficient and costly, especially when ascertaining the true disease status is costly and invasive. To improve efficiency and reduce the cost of biomarker validation studies, we consider a test-result-dependent sampling (TDS) scheme, in which subject selection for determining the disease state is dependent on the result of a biomarker assay. We first estimate the test-result distribution using data arising from the TDS design. With the estimated empirical test-result distribution, we propose consistent nonparametric estimators for AUC and pAUC and establish the asymptotic properties of the proposed estimators. Simulation studies show that the proposed estimators have good finite sample properties and that the TDS design yields more efficient AUC and pAUC estimates than a simple random sampling (SRS) design. A data example based on an ongoing cancer clinical trial is provided to illustrate the TDS design and the proposed estimators. This work can find broad applications in design and analysis of biomarker validation studies.
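
    Under simple random sampling, the standard nonparametric AUC estimator is the Mann-Whitney statistic sketched below; the TDS estimators proposed in the paper additionally reweight by the estimated test-result distribution, which is not reproduced here. The biomarker scores and disease labels are invented.

        import numpy as np

        def auc_mann_whitney(scores, labels):
            """Nonparametric AUC: probability that a diseased subject scores higher
            than a non-diseased subject, with ties counted as 1/2."""
            scores = np.asarray(scores, dtype=float)
            labels = np.asarray(labels, dtype=int)
            pos, neg = scores[labels == 1], scores[labels == 0]
            diff = pos[:, None] - neg[None, :]
            return np.mean(diff > 0) + 0.5 * np.mean(diff == 0)

        # Hypothetical biomarker values and true disease states
        scores = [0.20, 0.90, 0.40, 0.80, 0.70, 0.10, 0.60, 0.35]
        labels = [0, 1, 0, 1, 1, 0, 1, 0]
        print("AUC =", auc_mann_whitney(scores, labels))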

  6. Novel Degenerate PCR Method for Whole-Genome Amplification Applied to Peru Margin (ODP Leg 201) Subsurface Samples

    Science.gov (United States)

    Martino, Amanda J.; Rhodes, Matthew E.; Biddle, Jennifer F.; Brandt, Leah D.; Tomsho, Lynn P.; House, Christopher H.

    2011-01-01

    A degenerate polymerase chain reaction (PCR)-based method of whole-genome amplification, designed to work fluidly with 454 sequencing technology, was developed and tested for use on deep marine subsurface DNA samples. While optimized here for use with Roche 454 technology, the general framework presented may be applicable to other next generation sequencing systems as well (e.g., Illumina, Ion Torrent). The method, which we have called random amplification metagenomic PCR (RAMP), involves the use of specific primers from Roche 454 amplicon sequencing, modified by the addition of a degenerate region at the 3′ end. It utilizes a PCR reaction, which resulted in no amplification from blanks, even after 50 cycles of PCR. After efforts to optimize experimental conditions, the method was tested with DNA extracted from cultured E. coli cells, and genome coverage was estimated after sequencing on three different occasions. Coverage did not vary greatly with the different experimental conditions tested, and was around 62% with a sequencing effort equivalent to a theoretical genome coverage of 14.10×. The GC content of the sequenced amplification product was within 2% of the predicted values for this strain of E. coli. The method was also applied to DNA extracted from marine subsurface samples from ODP Leg 201 site 1229 (Peru Margin), and results of a taxonomic analysis revealed microbial communities dominated by Proteobacteria, Chloroflexi, Firmicutes, Euryarchaeota, and Crenarchaeota, among others. These results were similar to those obtained previously for those samples; however, variations in the proportions of taxa identified illustrates well the generally accepted view that community analysis is sensitive to both the amplification technique used and the method of assigning sequences to taxonomic groups. Overall, we find that RAMP represents a valid methodology for amplifying metagenomes from low-biomass samples. PMID:22319519

  7. A rapid method for estimation of Pu-isotopes in urine samples using high volume centrifuge.

    Science.gov (United States)

    Kumar, Ranjeet; Rao, D D; Dubla, Rupali; Yadav, J R

    2017-07-01

    The conventional radio-analytical technique used for estimation of Pu-isotopes in urine samples involves anion exchange/TEVA column separation followed by alpha spectrometry. This sequence of analysis takes nearly 3-4 days to complete. Excreta analysis results are often required urgently, particularly in repeat and incidental/emergency situations. Therefore, there is a need to reduce the analysis time for the estimation of Pu-isotopes in bioassay samples. This paper gives the details of the standardization of a rapid method for estimation of Pu-isotopes in urine samples using a multi-purpose centrifuge and TEVA resin followed by alpha spectrometry. The rapid method involves oxidation of urine samples and co-precipitation of plutonium along with calcium phosphate, followed by sample preparation using a high volume centrifuge and separation of Pu using TEVA resin. The Pu fraction was electrodeposited and the activity estimated by alpha spectrometry, with 236Pu tracer used for recovery determination. Ten routine urine samples of radiation workers were analyzed and consistent radiochemical tracer recovery was obtained in the range 47-88%, with a mean and standard deviation of 64.4% and 11.3%, respectively. With this newly standardized technique, the whole analytical procedure is completed within 9 h (one working day). Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Trace element detection in whole food samples by Neutron Activation Analysis, k0-method

    Energy Technology Data Exchange (ETDEWEB)

    Sathler, Márcia Maia; Menezes, Maria Ângela de Barros Correia, E-mail: maia.sathler@gmail.com, E-mail: menezes@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Salles, Paula Maria Borges de, E-mail: pauladesalles@yahoo.com.br [Universidade Federal de Minas Gerais (DEN/UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2017-07-01

    Inorganic elements from natural and anthropogenic sources are present in foods in different concentrations. With the increase in anthropogenic activities, there has also been a considerable increase in the emission of these elements into the environment, leading to the need to monitor the elemental composition of foods available for consumption. Numerous techniques have been used to detect inorganic elements in biological and environmental matrices, always aiming at reaching lower detection limits in order to evaluate the trace element content of the sample. Instrumental neutron activation analysis (INAA), applying the k0-method, produces accurate and precise results without the need for chemical preparation of the samples, which could cause their contamination. This study evaluated the presence of inorganic elements in whole food samples, mainly elements at trace levels. For this purpose, seven samples of different types of whole foods were irradiated in the TRIGA MARK I IPR-R1 research reactor, located at CDTN/CNEN, in Belo Horizonte, MG. It was possible to detect twenty-two elements above the limit of detection in at least one of the samples analyzed. This study reaffirms INAA, k0-method, as a safe and efficient technique for detecting trace elements in food samples. (author)

  9. The Alaska Commercial Fisheries Water Quality Sampling Methods and Procedures Manual

    Energy Technology Data Exchange (ETDEWEB)

    Folley, G.; Pearson, L.; Crosby, C. [Alaska Dept. of Environmental Conservation, Soldotna, AK (United States); DeCola, E.; Robertson, T. [Nuka Research and Planning Group, Seldovia, AK (United States)

    2006-07-01

    A comprehensive water quality sampling program was conducted in response to the oil spill that occurred when the M/V Selendang Ayu ship ran aground near a major fishing port at Unalaska Island, Alaska in December 2004. In particular, the sampling program focused on the threat of spilled oil to the local commercial fisheries resources. Spill scientists were unable to confidently model the movement of oil away from the wreck because of limited oceanographic data. In order to determine which fish species were at risk of oil contamination, a real-time assessment of how and where the oil was moving was needed, because the wreck became a continual source of oil release for several weeks after the initial grounding. The newly developed methods and procedures used to detect whole oil during the sampling program will be presented in the Alaska Commercial Fisheries Water Quality Sampling Methods and Procedures Manual which is currently under development. The purpose of the manual is to provide instructions to spill managers while they try to determine where spilled oil has or has not been encountered. The manual will include a meaningful data set that can be analyzed in real time to assess oil movement and concentration. Sections on oil properties and processes will be included along with scientific water quality sampling methods for whole and dissolved phase oil to assess potential contamination of commercial fishery resources and gear in Alaska waters during an oil spill. The manual will present a general discussion of factors that should be considered when designing a sampling program after a spill. In order to implement Alaska's improved seafood safety measures, the spatial scope of spilled oil must be known. A water quality sampling program can provide state and federal fishery managers and food safety inspectors with important information as they identify at-risk fisheries. 11 refs., 7 figs.

  10. Recent advances in sample preparation techniques and methods of sulfonamides detection - A review.

    Science.gov (United States)

    Dmitrienko, Stanislava G; Kochuk, Elena V; Apyari, Vladimir V; Tolmacheva, Veronika V; Zolotov, Yury A

    2014-11-19

    Sulfonamides (SAs) have been the most widely used antimicrobial drugs for more than 70 years, and their residues in foodstuffs and environmental samples pose serious health hazards. For this reason, sensitive and specific methods for the quantification of these compounds in numerous matrices have been developed. This review intends to provide an updated overview of the recent trends over the past five years in sample preparation techniques and methods for detecting SAs. Examples of the sample preparation techniques, including liquid-liquid and solid-phase extraction, dispersive liquid-liquid microextraction and QuEChERS, are given. Different methods of detecting the SAs present in food and feed and in environmental, pharmaceutical and biological samples are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Note: A simple image processing based fiducial auto-alignment method for sample registration.

    Science.gov (United States)

    Robertson, Wesley D; Porto, Lucas R; Ip, Candice J X; Nantel, Megan K T; Tellkamp, Friedjof; Lu, Yinfei; Miller, R J Dwayne

    2015-08-01

    A simple method for the location and auto-alignment of sample fiducials for sample registration using widely available MATLAB/LabVIEW software is demonstrated. The method is robust, easily implemented, and applicable to a wide variety of experiment types for improved reproducibility and increased setup speed. The software uses image processing to locate and measure the diameter and center point of circular fiducials for distance self-calibration and iterative alignment and can be used with most imaging systems. The method is demonstrated to be fast and reliable in locating and aligning sample fiducials, provided here by a nanofabricated array, with accuracy within the optical resolution of the imaging system. The software was further demonstrated to register, load, and sample the dynamically wetted array.
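
    A rough Python/OpenCV analogue of the circle-fiducial step is sketched below: locate circular fiducials in a camera image, return their centres and diameters, and use the known fiducial diameter for distance self-calibration. It assumes OpenCV is installed and is not the MATLAB/LabVIEW code of the note; the file name, Hough parameters and fiducial diameter are placeholders.

        import cv2
        import numpy as np

        def find_fiducials(image_path, known_diameter_um):
            """Return fiducial centres [px], diameters [px] and the um-per-pixel scale."""
            gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
            if gray is None:
                raise FileNotFoundError(image_path)
            gray = cv2.medianBlur(gray, 5)
            circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                                       param1=100, param2=30, minRadius=10, maxRadius=80)
            if circles is None:
                return [], [], None
            circles = np.round(circles[0]).astype(int)         # rows of (x, y, r)
            centres = [(x, y) for x, y, r in circles]
            diameters = [2 * r for _, _, r in circles]
            um_per_px = known_diameter_um / np.mean(diameters)  # distance self-calibration
            return centres, diameters, um_per_px

        centres, diameters, scale = find_fiducials("fiducial_array.png",
                                                   known_diameter_um=100.0)
        print(centres, diameters, scale)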

  12. Final Report for X-ray Diffraction Sample Preparation Method Development

    Energy Technology Data Exchange (ETDEWEB)

    Ely, T. M. [Hanford Site (HNF), Richland, WA (United States); Meznarich, H. K. [Hanford Site (HNF), Richland, WA (United States); Valero, T. [Hanford Site (HNF), Richland, WA (United States)

    2018-01-30

    WRPS-1500790, “X-ray Diffraction Saltcake Sample Preparation Method Development Plan/Procedure,” was originally prepared with the intent of improving the specimen preparation methodology used to generate saltcake specimens suitable for XRD-based solid phase characterization. At the time that this test plan document was originally developed, packed powder in cavity supports with collodion binder was the established XRD specimen preparation method. An alternate specimen preparation method less vulnerable, if not completely invulnerable to preferred orientation effects, was desired as a replacement for the method.

  13. Development and evaluation of a new method for sampling and monitoring the symphylid population in pineapple.

    Science.gov (United States)

    Soler, Alain; Gaude, Jean-Marie; Marie-Alphonsine, Paul-Alex; Vinatier, Fabrice; Dole, Bernard; Govindin, Jean-Claude; Fournier, Patrick; Queneherve, Patrick

    2011-09-01

    Symphylids (Hanseniella sp.) are polyphagous soilborne parasites. Today, symphylid populations on pineapple are monitored by observing root symptoms and the presence of symphylids at the bottom of basal leaves. The authors developed a reliable method, with a bait-and-trap device, to monitor symphylid populations in pineapple or fallow crops. The spatial distribution of the symphylid populations was evaluated using variance/mean ratios and spatial analyses based on Moran's and Geary's indices. The method has been tested for monitoring symphylid populations at different developmental stages of pineapple. Adding potato baits to the soil samples increased the trapping efficiency for symphylids when compared with the 'soil only' and 'bait only' methods. The handling of the samples is also facilitated by the new device. Results showed that the vertical distribution of symphylids may be uniform deep into the soil profile under pineapple, down to 50 cm. Results also showed that symphylid populations are highly aggregated, developing in spots about 4-6 m wide. The new method allows better and easier evaluation of symphylid populations. It may be very useful in the evaluation of new IPM methods to control symphylids under pineapple. Copyright © 2011 Society of Chemical Industry.
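
    For reference, the two aggregation statistics mentioned above can be computed as in the sketch below: the variance-to-mean ratio of trap counts (values above 1 suggest aggregation) and Moran's I for spatial autocorrelation over the sampling grid. The counts and coordinates are invented, and the inverse-distance weight matrix is a simple illustrative choice, not necessarily the weighting used in the study.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical symphylid counts at points of a 6 x 6 grid with 2 m spacing
        xy = np.array([(i * 2.0, j * 2.0) for i in range(6) for j in range(6)])
        hotspot = np.hypot(xy[:, 0] - 5.0, xy[:, 1] - 5.0) < 4.0
        counts = rng.poisson(lam=np.where(hotspot, 12.0, 1.0))

        # Index of dispersion: variance/mean > 1 indicates an aggregated distribution
        print("variance/mean ratio:", counts.var(ddof=1) / counts.mean())

        # Moran's I with inverse-distance spatial weights (w_ii = 0)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
        w = np.zeros_like(d)
        w[d > 0] = 1.0 / d[d > 0]
        z = counts - counts.mean()
        moran_i = (len(counts) / w.sum()) * (z @ w @ z) / (z @ z)
        print("Moran's I:", moran_i)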

  14. Trace element analysis of environmental samples by multiple prompt gamma-ray analysis method

    International Nuclear Information System (INIS)

    Oshima, Masumi; Matsuo, Motoyuki; Shozugawa, Katsumi

    2011-01-01

    The multiple γ-ray detection method has been proved to be a high-resolution and high-sensitivity method in application to nuclide quantification. The neutron prompt γ-ray analysis method is successfully extended by combining it with the γ-ray detection method, which is called Multiple prompt γ-ray analysis, MPGA. In this review we show the principle of this method and its characteristics. Several examples of its application to environmental samples, especially river sediments in the urban area and sea sediment samples are also described. (author)

  15. A comparison of results for samples collected with bailers constructed of different materials

    International Nuclear Information System (INIS)

    Thomey, N.; Ogle, R.; Jackson, J.

    1992-01-01

    A bailer is one of the most common sampling devices used to collect ground water samples. Bailers constructed from various materials are available; teflon, polyvinyl chloride (PVC), polyethylene, and stainless steel are all commonly used. It is widely recognized that sample results can be affected by the material from which the bailer is constructed. Teflon and stainless steel are usually recommended based upon their inert properties. The cost of these bailers is significantly higher than other types. For the purposes of petroleum storage tank investigations, sampling devices that would not compromise sample quality but be more economical than teflon or stainless steel would be especially desirable. Water samples were collected using the different types of bailers; teflon, stainless steel, PVC, and polyethylene. Split samples were analyzed for benzene, toluene, ethylbenzene, total xylenes, and Total Petroleum Hydrocarbons. The analytical results were compared to determine if differences were due to normal analytical variances or due to interaction of the sample with the sampling device. No difference was noted in the results which were obtained

  16. A Frequency Domain Design Method For Sampled-Data Compensators

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Jannerup, Ole Erik

    1990-01-01

    A new approach to the design of a sampled-data compensator in the frequency domain is investigated. The starting point is a continuous-time compensator for the continuous-time system which satisfy specific design criteria. The new design method will graphically show how the discrete...
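
    As a concrete illustration of going from a continuous-time compensator to a sampled-data one, the sketch below discretizes an assumed continuous-time lead compensator with the bilinear (Tustin) transform using SciPy. This is a generic route, not necessarily the frequency-domain graphical procedure of the paper, and the compensator coefficients and sampling period are placeholders.

        from scipy import signal

        # Assumed continuous-time lead compensator C(s) = K (s + z) / (s + p)
        K, z, p = 5.0, 2.0, 20.0
        num = [K, K * z]
        den = [1.0, p]

        Ts = 0.01   # sampling period [s] (assumed)
        numd, dend, _ = signal.cont2discrete((num, den), Ts, method="bilinear")

        print("discrete numerator:  ", numd.ravel())
        print("discrete denominator:", dend)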

  17. Evaluation of Two Surface Sampling Methods for Microbiological and Chemical Analyses To Assess the Presence of Biofilms in Food Companies.

    Science.gov (United States)

    Maes, Sharon; Huu, Son Nguyen; Heyndrickx, Marc; Weyenberg, Stephanie van; Steenackers, Hans; Verplaetse, Alex; Vackier, Thijs; Sampers, Imca; Raes, Katleen; Reu, Koen De

    2017-12-01

    Biofilms are an important source of contamination in food companies, yet the composition of biofilms in practice is still mostly unknown. The chemical and microbiological characterization of surface samples taken after cleaning and disinfection is very important to distinguish free-living bacteria from the attached bacteria in biofilms. In this study, sampling methods that are potentially useful for both chemical and microbiological analyses of surface samples were evaluated. In the manufacturing facilities of eight Belgian food companies, surfaces were sampled after cleaning and disinfection using two sampling methods: the scraper-flocked swab method and the sponge stick method. Microbiological and chemical analyses were performed on these samples to evaluate the suitability of the sampling methods for the quantification of extracellular polymeric substance components and microorganisms originating from biofilms in these facilities. The scraper-flocked swab method was most suitable for chemical analyses of the samples because the material in these swabs did not interfere with determination of the chemical components. For microbiological enumerations, the sponge stick method was slightly but not significantly more effective than the scraper-flocked swab method. In all but one of the facilities, at least 20% of the sampled surfaces had more than 10² CFU/100 cm². Proteins were found in 20% of the chemically analyzed surface samples, and carbohydrates and uronic acids were found in 15 and 8% of the samples, respectively. When chemical and microbiological results were combined, 17% of the sampled surfaces were contaminated with both microorganisms and at least one of the analyzed chemical components; thus, these surfaces were characterized as carrying biofilm. Overall, microbiological contamination in the food industry is highly variable by food sector and even within a facility at various sampling points and sampling times.

  18. Comparison of alkaline fusion and acid digestion methods for the determination of rhenium in rock and soil samples by ICP-MS

    International Nuclear Information System (INIS)

    Uchida, Shigeo; Tagami, Keiko; Tabei, Ken

    2005-01-01

    A simple acid digestion method was studied in order to analyze many samples at once to understand Re behavior in the terrestrial environment, because, under normal laboratory conditions, digestion methods generally used, such as Carius tube digestions, Teflon vessel digestions and alkaline fusions, can handle only a small number of samples at one time to ensure complete sample digestion. In this study, the Re results for reference materials (RMs) obtained by the acid digestion method were compared with those by the alkaline fusion digestion method to get applicability of the acid digestion method for Re determination in soil by inductively coupled plasma mass spectrometry. Alkaline fusion was chosen for the comparison because it is known to have the highest capability to dissolve Re in geological materials among digestion methods. The average total Re recoveries measured using the 185 Re spike for RMs, such as rock, soil and sediment, were 90.6 ± 4.0% for alkaline fusion and 92.2 ± 7.3% for acid digestion, showing no differences between them. However, Re results obtained by the acid digestion method were usually slightly lower than those by the alkaline fusion (Student's t-test, P -1 , the acid digestion method could dissolve about 80% of the sample Re. Although the acid digestion method is unable to dissolve all Re in the sample, however, the Re discharged to soils could be more extractable than the Re in the dissolution-resistant part; thus, the acid digestion method could be useful for obtaining Re levels in soil samples

  19. Effect of DNA extraction methods and sampling techniques on the apparent structure of cow and sheep rumen microbial communities.

    Directory of Open Access Journals (Sweden)

    Gemma Henderson

    Full Text Available Molecular microbial ecology techniques are widely used to study the composition of the rumen microbiota and to increase understanding of the roles they play. Therefore, sampling and DNA extraction methods that result in adequate yields of microbial DNA that also accurately represents the microbial community are crucial. Fifteen different methods were used to extract DNA from cow and sheep rumen samples. The DNA yield and quality, and its suitability for downstream PCR amplifications varied considerably, depending on the DNA extraction method used. DNA extracts from nine extraction methods that passed these first quality criteria were evaluated further by quantitative PCR enumeration of microbial marker loci. Absolute microbial numbers, determined on the same rumen samples, differed by more than 100-fold, depending on the DNA extraction method used. The apparent compositions of the archaeal, bacterial, ciliate protozoal, and fungal communities in identical rumen samples were assessed using 454 Titanium pyrosequencing. Significant differences in microbial community composition were observed between extraction methods, for example in the relative abundances of members of the phyla Bacteroidetes and Firmicutes. Microbial communities in parallel samples collected from cows by oral stomach-tubing or through a rumen fistula, and in liquid and solid rumen digesta fractions, were compared using one of the DNA extraction methods. Community representations were generally similar, regardless of the rumen sampling technique used, but significant differences in the abundances of some microbial taxa such as the Clostridiales and the Methanobrevibacter ruminantium clade were observed. The apparent microbial community composition differed between rumen sample fractions, and Prevotellaceae were most abundant in the liquid fraction. DNA extraction methods that involved phenol-chloroform extraction and mechanical lysis steps tended to be more comparable. However

  20. Improved Methods of Carnivore Faecal Sample Preservation, DNA Extraction and Quantification for Accurate Genotyping of Wild Tigers

    Science.gov (United States)

    Harika, Katakam; Mahla, Ranjeet Singh; Shivaji, Sisinthy

    2012-01-01

    Background Non-invasively collected samples allow a variety of genetic studies on endangered and elusive species. However, due to low amplification success and high genotyping error rates, fewer samples can be identified up to the individual level. The number of PCRs needed to obtain reliable genotypes also increases noticeably. Methods We developed a quantitative PCR assay to measure and grade amplifiable nuclear DNA in feline faecal extracts. We determined DNA degradation in experimentally aged faecal samples and tested a suite of pre-PCR protocols to considerably improve DNA retrieval. Results Average DNA concentrations of Grade I, II and III extracts were 982 pg/µl, 9.5 pg/µl and 0.4 pg/µl, respectively. Nearly 10% of extracts had no amplifiable DNA. Microsatellite PCR success and allelic dropout rates were 92% and 1.5% in Grade I, 79% and 5% in Grade II, and 54% and 16% in Grade III, respectively. Our results on experimentally aged faecal samples showed that ageing has a significant effect on the quantity and quality of amplifiable DNA; DNA degradation occurs within 3 days of exposure to direct sunlight. DNA concentrations of Day 1 samples stored by the ethanol and silica methods for a month varied significantly from fresh Day 1 extracts. DNA concentrations of fresh tiger and leopard faecal extracts without addition of carrier RNA were 816.5 pg/µl (±115.5) and 690.1 pg/µl (±207.1), while concentrations with addition of carrier RNA were 49414.5 pg/µl (±9370.6) and 20982.7 pg/µl (±6835.8), respectively. Conclusions Our results indicate that carnivore faecal samples should be collected as freshly as possible, are better preserved by the two-step method and should be extracted with the addition of carrier RNA. We recommend quantification of template DNA as this facilitates several downstream protocols. PMID:23071624

  1. Inside-sediment partitioning of PAH, PCB and organochlorine compounds and inferences on sampling and normalization methods

    International Nuclear Information System (INIS)

    Opel, Oliver; Palm, Wolf-Ulrich; Steffen, Dieter; Ruck, Wolfgang K.L.

    2011-01-01

    Comparability of sediment analyses for semivolatile organic substances is still low. Neither screening of the sediments nor organic-carbon based normalization is sufficient to obtain comparable results. We are showing the interdependency of grain-size effects with inside-sediment organic-matter distribution for PAH, PCB and organochlorine compounds. Surface sediment samples collected by Van-Veen grab were sieved and analyzed for 16 PAH, 6 PCB and 18 organochlorine pesticides (OCP) as well as organic-matter content. Since bulk concentrations are influenced by grain-size effects themselves, we used a novel normalization method based on the sum of concentrations in the separate grain-size fractions of the sediments. By calculating relative normalized concentrations, it was possible to clearly show underlying mechanisms throughout a heterogeneous set of samples. Furthermore, we were able to show that, for comparability, screening at <125 μm is best suited and can be further improved by additional organic-carbon normalization. - Research highlights: → New method for the comparison of heterogeneous sets of sediment samples. → Assessment of organic pollutants partitioning mechanisms in sediments. → Proposed method for more comparable sediment sampling. - Inside-sediment partitioning mechanisms are shown using a new mathematical approach and discussed in terms of sediment sampling and normalization.

  2. Reliability of the k0-standardization method using a geological sample analysed in a proficiency test

    Energy Technology Data Exchange (ETDEWEB)

    Pelaes, Ana Clara O.; Menezes, Maria Ângela de B.C., E-mail: anacpelaes@gmail.com, E-mail: menezes@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2017-11-01

    Neutron Activation Analysis (NAA) is an analytical technique for determining the elemental chemical composition of samples of several matrices; it has been applied by the Laboratory for Neutron Activation Analysis, located at Centro de Desenvolvimento da Tecnologia Nuclear/Comissão Nacional de Energia Nuclear (Nuclear Technology Development Center/Brazilian Commission for Nuclear Energy), CDTN/CNEN, since the start-up of the TRIGA MARK I IPR-R1 reactor in 1960. Among the methods of application of the technique, the k0-standardization method, which was established at CDTN in 1995 and was re-established and optimized in 2003, is the most efficient. In order to verify the reproducibility of the results generated by the application of the k0-standardization method at CDTN, aliquots of a geological sample sent by WEPAL (Wageningen Evaluating Programs for Analytical Laboratories) were analysed and the results were compared with those obtained through the Intercomparison of Results organized by the International Atomic Energy Agency in 2015. WEPAL is an institution accredited for the organisation of interlaboratory studies, preparing and organizing proficiency testing schemes all over the world. Therefore, the comparison with the results provided aims to contribute to the continuous improvement of the quality of the results obtained by the CDTN. The objective of this study was to verify the reliability of the method applied two years after the intercomparison round. (author)

  3. Standard methods for sampling freshwater fishes: opportunities for international collaboration

    OpenAIRE

    Bonar, Scott A.; Mercado-Silva, Norman; Hubert, Wayne A.; Beard, T. Douglas; Dave, Göran; Kubečka, Jan; Graeb, Brian D.S.; Lester, Nigel P.; Porath, Mark; Winfield, Ian J.

    2017-01-01

    With publication of Standard Methods for Sampling North American Freshwater Fishes in 2009, the American Fisheries Society (AFS) recommended standard procedures for North America. To explore interest in standardizing at intercontinental scales, a symposium attended by international specialists in freshwater fish sampling was convened at the 145th Annual AFS Meeting in Portland, Oregon, in August 2015. Participants represented all continents except Australia and Antarctica and were employed by...

  4. A Proposal of New Spherical Particle Modeling Method Based on Stochastic Sampling of Particle Locations in Monte Carlo Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Kim, Do Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jea Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    Due to its high computational efficiency and user convenience, the implicit method has received attention; however, it is noted that the implicit methods in previous studies have low accuracy at high packing fractions. In this study, a new implicit method for modeling spherical-particle-distributed media in MC simulation, which can be used at any packing fraction with high accuracy, is proposed. A new concept for the spherical particle sampling was developed to solve the problems of the previous implicit methods. The sampling method was verified by simulation in infinite and finite media. The results show that the implicit particle modeling with the proposed method was performed accurately over the whole range of packing fractions. It is expected that the proposed method can be efficiently utilized for spherical-particle-distributed media such as fusion reactor blankets, VHTR reactors, and shielding analyses.
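
    To make the idea of stochastically sampling non-overlapping particle centres concrete, the sketch below implements plain random sequential addition of equal spheres in a box. It is a generic illustration only, not the new implicit sampling scheme of the paper (which is designed to stay accurate at high packing fractions, where naive rejection sampling of this kind becomes inefficient); the box size, radius and target packing fraction are arbitrary.

        import numpy as np

        def random_sequential_spheres(box, radius, target_pf, rng, max_tries=200_000):
            """Sample non-overlapping sphere centres in a cubic box by rejection
            (random sequential addition) until a target packing fraction is reached."""
            centres = []
            v_sphere = 4.0 / 3.0 * np.pi * radius**3
            n_target = int(target_pf * box**3 / v_sphere)
            tries = 0
            while len(centres) < n_target and tries < max_tries:
                tries += 1
                c = rng.uniform(radius, box - radius, size=3)  # keep sphere inside box
                if all(np.linalg.norm(c - e) >= 2 * radius for e in centres):
                    centres.append(c)
            return np.array(centres)

        rng = np.random.default_rng(7)
        centres = random_sequential_spheres(box=10.0, radius=0.5, target_pf=0.15, rng=rng)
        pf = len(centres) * (4.0 / 3.0) * np.pi * 0.5**3 / 10.0**3
        print(f"placed {len(centres)} spheres, packing fraction = {pf:.3f}")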

  5. Method to make accurate concentration and isotopic measurements for small gas samples

    Science.gov (United States)

    Palmer, M. R.; Wahl, E.; Cunningham, K. L.

    2013-12-01

    Carbon isotopic ratio measurements of CO2 and CH4 provide valuable insight into carbon cycle processes. However, many of these studies, like soil gas, soil flux, and water head space experiments, provide very small gas sample volumes, too small for direct measurement by current constant-flow Cavity Ring-Down (CRDS) isotopic analyzers. Previously, we addressed this issue by developing a sample introduction module which enabled the isotopic ratio measurement of 40 ml samples or smaller. However, the system, called the Small Sample Isotope Module (SSIM), does dilute the sample during the delivery with inert carrier gas, which causes a ~5% reduction in concentration. The isotopic ratio measurements are not affected by this small dilution, but researchers are naturally interested in accurate concentration measurements. We present the accuracy and precision of a new method of using this delivery module which we call 'double injection.' Two portions of the 40 ml sample (20 ml each) are introduced to the analyzer; the first injection flushes out the diluting gas and the second injection is measured. The accuracy of this new method is demonstrated by comparing the concentration and isotopic ratio measurements for a gas sampled directly and that same gas measured through the SSIM. The data show that the CO2 concentration measurements were the same within instrument precision. The isotopic ratio precision (1σ) of repeated measurements was 0.16 permil for CO2 and 1.15 permil for CH4 at ambient concentrations. This new method provides a significant enhancement in the information provided by small samples.

  6. Multicommuted flow injection method for fast photometric determination of phenolic compounds in commercial virgin olive oil samples.

    Science.gov (United States)

    Lara-Ortega, Felipe J; Sainz-Gonzalo, Francisco J; Gilbert-López, Bienvenida; García-Reyes, Juan F; Molina-Díaz, Antonio

    2016-01-15

    A multicommuted flow injection method has been developed for the determination of phenolic species in virgin olive oil samples. The method is based on the inhibitory effect of antioxidants on the formation of a stable, colored radical cation (DMPD(•+)) from the colorless compound N,N-dimethyl-p-phenylenediamine in acidic medium in the presence of Fe(III) as oxidant. The signal inhibition by phenolic species and other antioxidants is proportional to their concentration in the olive oil sample. Absorbance was recorded at 515 nm by means of a modular fiber optic spectrometer. Oleuropein was used as the standard for phenol determination and 6-hydroxy-2,5,7,8-tetramethylchroman-2-carboxylic acid (trolox) was the reference standard used for calculation of the total antioxidant content. Linear response was observed within the range of 250-1000 mg/kg oleuropein, which was in accordance with the phenolic contents observed in commercial extra virgin olive oil in the present study. Fast, low-volume liquid-liquid extraction of the samples using 60% MeOH was performed prior to their insertion into the multicommuted flow system. The five three-way solenoid valves used for multicommuted liquid handling were controlled by a homemade electronic interface and Java-written software. The proposed approach was applied to different commercial extra virgin olive oil samples and the results were consistent with those obtained by the Folin-Ciocalteu (FC) method. The total time required for sample preparation and analysis is drastically reduced with the present approach: its throughput is 8 samples/h, in contrast to 1 sample/h for the conventional FC method. The present method is easy to implement in routine analysis and can be regarded as a feasible alternative to the FC method. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Post-sampling release of free fatty acids - effects of heat stabilization and methods of euthanasia.

    Science.gov (United States)

    Jernerén, Fredrik; Söderquist, Marcus; Karlsson, Oskar

    2015-01-01

    The field of lipid research has made progress and it is now possible to study the lipidome of cells and organelles. A basic requirement of a successful lipid study is adequate pre-analytical sample handling, as some lipids can be unstable and postmortem changes can cause substantial accumulation of free fatty acids (FFAs). The aim of the present study was to investigate the effects of conductive heat stabilization and euthanasia methods on FFA levels in the rat brain and liver using liquid chromatography tandem mass spectrometry. The analysis of brain homogenates clearly demonstrated phospholipase activity and time-dependent post-sampling changes in the lipid pool of snap frozen non-stabilized tissue. There was a significant increase in FFAs already at 2min, which continued over time. Heat stabilization was shown to be an efficient method to reduce phospholipase activity and ex vivo lipolysis. Post-sampling effects due to tissue thawing and sample preparation induced a massive release of FFAs (up to 3700%) from non-stabilized liver and brain tissues compared to heat stabilized tissue. Furthermore, the choice of euthanasia method significantly influenced the levels of FFAs in the brain. The FFAs were decreased by 15-44% in the group of animals euthanized by pentobarbital injection compared with CO2 inhalation or decapitation. Our results highlight the importance of considering euthanasia methods and pre-analytical treatment in lipid analysis, factors which may otherwise interfere with the outcome of the experiments. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Bats from Fazenda Intervales, Southeastern Brazil: species account and comparison between different sampling methods

    Directory of Open Access Journals (Sweden)

    Christine V. Portfors

    2000-06-01

    Full Text Available Assessing the composition of an area's bat fauna is typically accomplished by using captures or by monitoring echolocation calls with bat detectors. The two methods may not provide the same data regarding species composition. Mist nets and harp traps may be biased towards sampling low flying species, and bat detectors biased towards detecting high intensity echolocators. A comparison of the bat fauna of Fazenda Intervales, southeastern Brazil, as revealed by mist nets and harp trap captures, checking roosts and by monitoring echolocation calls of flying bats illustrates this point. A total of 17 species of bats was sampled. Fourteen bat species were captured and the echolocation calls of 12 species were recorded, three of them not revealed by mist nets or harp traps. The different sampling methods provided different pictures of the bat fauna. Phyllostomid bats dominated the catches in mist nets, but in the field their echolocation calls were never detected. No single sampling approach provided a complete assessment of the bat fauna in the study area. In general, bats producing low intensity echolocation calls, such as phyllostomids, are more easily assessed by netting, and bats producing high intensity echolocation calls are better surveyed by bat detectors. The results demonstrate that a combined and varied approach to sampling is required for a complete assessment of the bat fauna of an area.

  9. Solvent-assisted dispersive solid-phase extraction: A sample preparation method for trace detection of diazinon in urine and environmental water samples.

    Science.gov (United States)

    Aladaghlo, Zolfaghar; Fakhari, Alireza; Behbahani, Mohammad

    2016-09-02

    In this research, a sample preparation method termed solvent-assisted dispersive solid-phase extraction (SA-DSPE) was applied. The method is based on dispersion of the sorbent into the aqueous sample to maximize the interaction surface. In this approach, dispersion of the sorbent at a very low milligram level was achieved by injecting a solution of the sorbent and disperser solvent into the aqueous sample. A cloudy solution was created by the dispersion of the sorbent in the bulk aqueous sample. After pre-concentration of the diazinon, the cloudy solution was centrifuged and the diazinon in the sediment phase was dissolved in ethanol and determined by gas chromatography with flame ionization detection. Under the optimized conditions (pH of solution = 7.0; sorbent: benzophenone, 2%; disperser solvent: ethanol, 500 μL; centrifugation at 4000 rpm for 3 min), the method detection limit for diazinon was 0.2, 0.3, 0.3 and 0.3 μg L(-1) for distilled water, lake water, waste water and urine samples, respectively. Furthermore, the pre-concentration factor was 363.8, 356.1, 360.7 and 353.38 in distilled water, waste water, lake water and urine samples, respectively. SA-DSPE was successfully used for trace monitoring of diazinon in urine, lake and waste water samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Analysis of small sample size studies using nonparametric bootstrap test with pooled resampling method.

    Science.gov (United States)

    Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A

    2017-06-30

    Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists have questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has limitations with small sample sizes. We used a pooled method in a nonparametric bootstrap test that may overcome the problems related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling method to the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means than the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability under all conditions except Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives. The nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling method for comparing paired or unpaired means and for validating one-way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
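
    A minimal sketch of the pooled-resampling idea for two groups is given below: the groups are pooled under the null hypothesis, bootstrap samples of the original group sizes are drawn from the pooled data, and the observed t-statistic is compared against the resulting null distribution. This follows the general logic described in the abstract rather than reproducing the authors' exact procedure; it assumes SciPy, and the data are invented.

        import numpy as np
        from scipy import stats

        def pooled_bootstrap_t_test(x, y, n_boot=10_000, seed=0):
            """Two-sample bootstrap test with pooled resampling under H0: equal means."""
            rng = np.random.default_rng(seed)
            x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
            t_obs = stats.ttest_ind(x, y, equal_var=False).statistic
            pooled = np.concatenate([x, y])
            t_null = np.empty(n_boot)
            for b in range(n_boot):
                xb = rng.choice(pooled, size=len(x), replace=True)
                yb = rng.choice(pooled, size=len(y), replace=True)
                t_null[b] = stats.ttest_ind(xb, yb, equal_var=False).statistic
            p_value = np.mean(np.abs(t_null) >= abs(t_obs))   # two-sided
            return t_obs, p_value

        # Hypothetical small, skewed samples
        group_a = [1.2, 0.8, 3.5, 0.9, 1.1, 0.7]
        group_b = [2.4, 3.1, 4.0, 2.8, 5.2, 3.3]
        t_obs, p = pooled_bootstrap_t_test(group_a, group_b)
        print(f"t = {t_obs:.2f}, bootstrap p = {p:.4f}")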

  11. Characterization of solid heterogeneous waste fuel - the effect of sampling and preparation method; Karaktaerisering av fasta inhomogena avfallsbraenslen - inverkan av metoder foer provtagning och provberedning

    Energy Technology Data Exchange (ETDEWEB)

    Wikstroem-Blomqvist, Evalena; Franke, Jolanta; Johansson, Ingvar

    2007-12-15

    The aim of the project is to evaluate the possibilities for simplifying the methods used during sampling and laboratory preparation of heterogeneous waste materials. Existing methods for solid fuel materials are summarized and evaluated in the project. As a result, two new simplified methods have been suggested, one for field sampling and one for laboratory preparation. One large challenge in waste sampling is to achieve a representative sample, given the considerable heterogeneity of the material: how do you perform a sampling campaign that gives representative results without excessive cost? The single most important source of error is the sampling procedure itself, accounting for about 80% of the total error, while sample reduction and laboratory work represent only about 15% and 5%, respectively. Thus, to minimize the total error it is very important that the sampling is well planned in a testing programme, since in the end a very small analytical sample (1 gram) should reflect a large heterogeneous sample population of thousands of tons. In this project two sampling campaigns, in the fall of 2006 and the early winter of 2007, were conducted at the Renova waste-fired power plant in Gothenburg, Sweden. The first campaign consisted of three different sample sizes with different numbers of sub-samples: one reference sample (50 tons, 48 sub-samples), two samples of 16 tons with 8 sub-samples each, and two samples of 4 tons with 2 sub-samples each. During the second sampling campaign, four additional 4-ton samples were taken to repeat and thus evaluate the simplified sampling method. The project concludes that the simplified sampling method, consisting of only two sub-samples and a total sample volume of 4 tons, gives results of as good quality and precision as the more complicated methods tested; moreover, the two sampling campaigns generated equivalent results. The preparation methods used in the laboratory can as well be

  12. A comparative proteomics method for multiple samples based on a 18O-reference strategy and a quantitation and identification-decoupled strategy.

    Science.gov (United States)

    Wang, Hongbin; Zhang, Yongqian; Gui, Shuqi; Zhang, Yong; Lu, Fuping; Deng, Yulin

    2017-08-15

    Comparisons across large numbers of samples are frequently necessary in quantitative proteomics. Many quantitative methods used in proteomics are based on stable isotope labeling, but most of these are only useful for comparing two samples. For up to eight samples, the iTRAQ labeling technique can be used; for greater numbers of samples, the label-free method has been used, but this method has been criticized for low reproducibility and accuracy. An ingenious strategy has been introduced that compares each sample against an 18O-labeled reference sample created by pooling equal amounts of all samples. However, it is necessary to use proportion-known protein mixtures to investigate and evaluate this new strategy. Another problem in comparative proteomics of multiple samples is the poor coincidence and reproducibility of protein identification results across samples. In the present study, a method combining the 18O-reference strategy with a quantitation and identification-decoupled strategy was investigated with proportion-known protein mixtures. The results clearly demonstrated that the 18O-reference strategy had greater accuracy and reliability than other previously used comparison methods based on transferred comparison or label-free strategies. In the decoupling strategy, the quantification data acquired by LC-MS and the identification data acquired by LC-MS/MS are matched and correlated, according to retention time and accurate mass, to identify differentially expressed proteins. This strategy made protein identification possible for all samples using a single pooled sample, and therefore gave good reproducibility in protein identification across multiple samples, and allowed peptide identification to be optimized separately so as to identify more proteins. Copyright © 2017 Elsevier B.V. All rights reserved.
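
    The matching step of the decoupled strategy can be pictured with a small sketch like the one below, which pairs LC-MS quantification features with LC-MS/MS identifications inside an accurate-mass (ppm) tolerance and a retention-time window. The data classes, field names, and tolerance values are assumptions for illustration, not the authors' software.

```python
# Illustrative sketch (assumed data layout, not the authors' software): match
# LC-MS quantification features to LC-MS/MS peptide identifications using an
# accurate-mass tolerance (ppm) and a retention-time window, as in the
# quantitation/identification-decoupled strategy described above.
from dataclasses import dataclass

@dataclass
class Feature:          # quantified LC-MS feature
    mz: float           # measured m/z
    rt: float           # retention time in minutes
    intensity: float

@dataclass
class Identification:   # peptide ID from the pooled LC-MS/MS run
    peptide: str
    mz: float
    rt: float

def match(features, ids, ppm_tol=10.0, rt_tol=0.5):
    """Return (feature, identification) pairs within the mass and RT windows."""
    pairs = []
    for f in features:
        for i in ids:
            ppm_err = abs(f.mz - i.mz) / i.mz * 1e6
            if ppm_err <= ppm_tol and abs(f.rt - i.rt) <= rt_tol:
                pairs.append((f, i))
    return pairs

# Toy usage with one feature and one identification:
features = [Feature(mz=523.776, rt=34.2, intensity=1.8e6)]
ids = [Identification(peptide="LVNEVTEFAK", mz=523.775, rt=34.0)]
print(match(features, ids))
```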

  13. Rapid screening method for plutonium in mixed waste samples

    International Nuclear Information System (INIS)

    Somers, W.; Culp, T.; Miller, R.

    1987-01-01

    A waste stream sampling program was undertaken to determine which waste streams contained hazardous constituents and would therefore be regulated as hazardous waste under the Resource Conservation and Recovery Act. The waste streams also had the potential to contain radioactive material, either plutonium, americium, or depleted uranium. Because of this potential for contamination with radioactive material, a method of rapidly screening the liquid samples for radioactive material was required. A counting technique was devised to count a small aliquot of a sample, determine the plutonium concentration, and allow the samples to be shipped the same day they were collected. This technique utilized the low-energy photons (x-rays) that accompany α decay. This direct, non-destructive x-ray analysis was applied to quantitatively determine Pu-239 concentrations in industrial samples. The samples contained a Pu-239/Am-241 mixture; the ratio and/or concentrations of these two radionuclides were not constant. A computer program was designed and implemented to calculate Pu-239 activity and concentration (g/ml), using the 59.5 keV Am-241 peak to determine Am-241's contribution to the 17 keV region. The Am-241 contribution was subtracted, yielding the net counts in the 17 keV region due to Pu. 2 figs., 1 tab
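
    The subtraction step can be sketched roughly as below; the stripping ratio (Am-241 counts expected in the 17 keV region per count in the 59.5 keV peak), the Pu calibration factor, and the counting parameters are hypothetical placeholders rather than values from the report.

```python
# Illustrative sketch of the Am-241 stripping step described above. The
# stripping ratio and the Pu calibration factor are hypothetical placeholders
# that would normally come from calibration with known standards.
def pu239_concentration(counts_17kev, counts_59kev,
                        am_stripping_ratio=0.35,   # assumed calibration value
                        pu_counts_per_gram=1.2e5,  # assumed counts/s per g of Pu-239
                        count_time_s=600.0,
                        sample_volume_ml=10.0):
    """Return an estimated Pu-239 concentration (g/ml) after removing the
    Am-241 contribution to the 17 keV x-ray region."""
    am_contribution = am_stripping_ratio * counts_59kev   # Am-241 counts in 17 keV region
    net_pu_counts = max(counts_17kev - am_contribution, 0.0)
    pu_grams = net_pu_counts / count_time_s / pu_counts_per_gram
    return pu_grams / sample_volume_ml

# Example with made-up gross counts in the two regions of interest:
print(pu239_concentration(counts_17kev=5400, counts_59kev=8200))
```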

  14. Molecular analyses of two bacterial sampling methods in ligature-induced periodontitis in rats.

    Science.gov (United States)

    Fontana, Carla Raquel; Grecco, Clovis; Bagnato, Vanderlei Salvador; de Freitas, Laura Marise; Boussios, Constantinos I; Soukos, Nikolaos S

    2018-02-01

    The prevalence profile of periodontal pathogens in dental plaque can vary as a function of the detection method; however, the sampling technique may also play a role in determining dental plaque microbial profiles. We sought to determine the bacterial composition obtained with two sampling methods, one well established and a new one proposed here. In this study, a ligature-induced periodontitis model was used in 30 rats. Twenty-seven days later, the ligatures were removed and microbiological samples were obtained directly from the ligatures as well as from the periodontal pockets using absorbent paper points. Microbial analysis was performed using DNA probes to a panel of 40 periodontal species in the checkerboard assay. The bacterial composition patterns were similar for both sampling methods. However, detection levels for all species were markedly higher for ligatures than for paper points. Ligature samples yielded higher bacterial counts than paper points, suggesting that the technique used to induce periodontitis could also be applied for sampling in rats. Our findings may be helpful in designing studies of induced periodontal disease-associated microbiota.

  15. TMI-2 core debris analytical methods and results

    International Nuclear Information System (INIS)

    Akers, D.W.; Cook, B.A.

    1984-01-01

    A series of six grab samples was taken from the debris bed of the TMI-2 core in early September 1983. Five of these samples were sent to the Idaho National Engineering Laboratory for analysis. Presented are the analysis strategy for the samples and some of the data obtained from the early stages of their examination (i.e., particle size analysis, gamma spectrometry results, and fissile/fertile material analysis)

  16. Method for estimating modulation transfer function from sample images.

    Science.gov (United States)

    Saiga, Rino; Takeuchi, Akihisa; Uesugi, Kentaro; Terada, Yasuko; Suzuki, Yoshio; Mizutani, Ryuta

    2018-02-01

    The modulation transfer function (MTF) represents the frequency domain response of imaging modalities. Here, we report a method for estimating the MTF from sample images. Test images were generated from a number of images, including those taken with an electron microscope and with an observation satellite. These original images were convolved with point spread functions (PSFs) including those of circular apertures. The resultant test images were subjected to a Fourier transformation. The logarithm of the squared norm of the Fourier transform was plotted against the squared distance from the origin. Linear correlations were observed in the logarithmic plots, indicating that the PSF of the test images can be approximated with a Gaussian. The MTF was then calculated from the Gaussian-approximated PSF. The obtained MTF closely coincided with the MTF predicted from the original PSF. The MTF of an x-ray microtomographic section of a fly brain was also estimated with this method. The obtained MTF showed good agreement with the MTF determined from an edge profile of an aluminum test object. We suggest that this approach is an alternative way of estimating the MTF, independently of the image type. Copyright © 2017 Elsevier Ltd. All rights reserved.
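
    A minimal numerical sketch of this estimation idea, using a synthetic noise image and an assumed Gaussian blur rather than the authors' data, might look like the following: blur an image with a known Gaussian PSF, fit the logarithm of the squared Fourier norm against squared spatial frequency, and recover the Gaussian MTF from the fitted slope.

```python
# Minimal sketch of the estimation idea described above (the synthetic test
# image and parameter choices are assumptions for illustration).
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((256, 256))                      # stand-in "sample image"

sigma_true = 2.0                                  # PSF width in pixels
fy = np.fft.fftfreq(img.shape[0])[:, None]        # cycles / pixel
fx = np.fft.fftfreq(img.shape[1])[None, :]
f2 = fx**2 + fy**2
mtf_true = np.exp(-2 * np.pi**2 * sigma_true**2 * f2)   # Gaussian MTF
blurred = np.fft.ifft2(np.fft.fft2(img) * mtf_true).real

# Log power spectrum vs squared frequency; a linear fit gives the Gaussian width.
power = np.abs(np.fft.fft2(blurred))**2
mask = (f2 > 0) & (f2 < 0.05)                     # avoid DC and the noisy tail
slope, _ = np.polyfit(f2[mask], np.log(power[mask]), 1)
sigma_est = np.sqrt(-slope / (4 * np.pi**2))      # log|F|^2 slope = -4*pi^2*sigma^2

print(f"estimated sigma ≈ {sigma_est:.2f} px (true {sigma_true})")
mtf_est = np.exp(-2 * np.pi**2 * sigma_est**2 * f2)     # estimated MTF
```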

  17. A scalable method for parallelizing sampling-based motion planning algorithms

    KAUST Repository

    Jacobs, Sam Ade; Manavi, Kasra; Burgos, Juan; Denny, Jory; Thomas, Shawna; Amato, Nancy M.

    2012-01-01

    This paper describes a scalable method for parallelizing sampling-based motion planning algorithms. It subdivides configuration space (C-space) into (possibly overlapping) regions and independently, in parallel, uses standard (sequential) sampling-based planners to construct roadmaps in each region. Next, in parallel, regional roadmaps in adjacent regions are connected to form a global roadmap. By subdividing the space and restricting the locality of connection attempts, we reduce the work and inter-processor communication associated with nearest neighbor calculation, a critical bottleneck for scalability in existing parallel motion planning methods. We show that our method is general enough to handle a variety of planning schemes, including the widely used Probabilistic Roadmap (PRM) and Rapidly-exploring Random Trees (RRT) algorithms. We compare our approach to two other existing parallel algorithms and demonstrate that our approach achieves better and more scalable performance. Our approach achieves almost linear scalability on a 2400 core LINUX cluster and on a 153,216 core Cray XE6 petascale machine. © 2012 IEEE.
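
    A toy sketch of the region-subdivision idea is shown below, using a 2D C-space on a uniform grid, no obstacles, and Python's process pool standing in for a cluster; it illustrates the scheme rather than reproducing the authors' implementation. Each region's roadmap is built independently, so the per-region calls can be dispatched to separate processors, and only adjacent regions attempt connections afterwards.

```python
# Conceptual sketch of region-subdivided parallel roadmap construction
# (toy 2D C-space, uniform grid regions, collision checking omitted).
from concurrent.futures import ProcessPoolExecutor
import itertools
import math
import random

REGIONS = 4            # subdivide [0,1]^2 into REGIONS x REGIONS cells
SAMPLES_PER_REGION = 20
CONNECT_RADIUS = 0.2

def build_regional_roadmap(cell):
    """Sample configurations inside one region and connect nearby pairs."""
    i, j = cell
    rng = random.Random(i * REGIONS + j)
    lo_x, lo_y = i / REGIONS, j / REGIONS
    nodes = [(lo_x + rng.random() / REGIONS, lo_y + rng.random() / REGIONS)
             for _ in range(SAMPLES_PER_REGION)]
    edges = [(a, b) for a, b in itertools.combinations(nodes, 2)
             if math.dist(a, b) <= CONNECT_RADIUS]
    return cell, nodes, edges

def stitch(roadmap_a, roadmap_b):
    """Connect two adjacent regional roadmaps with short inter-region edges."""
    (_, nodes_a, _), (_, nodes_b, _) = roadmap_a, roadmap_b
    return [(a, b) for a in nodes_a for b in nodes_b
            if math.dist(a, b) <= CONNECT_RADIUS]

def main():
    cells = list(itertools.product(range(REGIONS), repeat=2))
    with ProcessPoolExecutor() as pool:                 # parallel regional phase
        regional = {rm[0]: rm for rm in pool.map(build_regional_roadmap, cells)}

    # Connection phase: only adjacent regions attempt connections, keeping
    # nearest-neighbor work and inter-processor communication local.
    global_edges = [e for rm in regional.values() for e in rm[2]]
    for (i, j) in cells:
        for (di, dj) in ((1, 0), (0, 1)):
            neighbor = (i + di, j + dj)
            if neighbor in regional:
                global_edges += stitch(regional[(i, j)], regional[neighbor])

    nodes = sum(len(rm[1]) for rm in regional.values())
    print(f"{nodes} nodes, {len(global_edges)} edges in the global roadmap")

if __name__ == "__main__":
    main()
```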

  19. Detecting Renibacterium salmoninarum in wild brown trout by use of multiple organ samples and diagnostic methods

    Science.gov (United States)

    Guomundsdottir, S.; Applegate, Lynn M.; Arnason, I.O.; Kristmundsson, A.; Purcell, Maureen K.; Elliott, Diane G.

    2017-01-01

    Renibacterium salmoninarum, the causative agent of salmonid bacterial kidney disease (BKD), is endemic in many wild trout species in northerly regions. The aim of the present study was to determine the optimal R. salmoninarum sampling and testing strategy for wild brown trout (Salmo trutta L.) populations in Iceland. Fish were netted in a lake and multiple organs—kidney, spleen, gills, oesophagus and mid-gut—were sampled and subjected to five detection tests, i.e. culture, polyclonal enzyme-linked immunosorbent assay (pELISA) and three different PCR tests. The results showed that each fish had encountered R. salmoninarum, but there were marked differences among the results depending on the organ and the test. The bacterium was not cultured from any kidney sample, while all kidney samples were positive by pELISA. At least one organ from 92.9% of the fish tested positive by PCR. The results demonstrate that the choice of tissue and diagnostic method can dramatically influence the outcome of R. salmoninarum surveys.

  20. An efficient modularized sample-based method to estimate the first-order Sobol' index

    International Nuclear Information System (INIS)

    Li, Chenzhao; Mahadevan, Sankaran

    2016-01-01

    The Sobol' index is a prominent methodology in global sensitivity analysis. This paper aims to estimate the Sobol' index directly from available input–output samples, even if the underlying model is unavailable. For this purpose, a new method to calculate the first-order Sobol' index is proposed. The innovation is that the conditional variance and mean in the formula of the first-order index are calculated at an unknown but existing location of the model inputs, instead of at an explicit user-defined location. The proposed method is modularized in two respects: 1) index calculations for different model inputs are separate and use the same set of samples; and 2) model input sampling, model evaluation, and index calculation are separate. Due to this modularization, the proposed method can compute the first-order index when only input–output samples are available and the underlying model is not, and its computational cost is not proportional to the dimension of the model inputs. In addition, the proposed method can also estimate the first-order index with correlated model inputs. Considering that the first-order index is a desirable metric for ranking model inputs but current methods can only handle independent model inputs, the proposed method helps fill this gap. - Highlights: • An efficient method to estimate the first-order Sobol' index. • Estimates the index from input–output samples directly. • Computational cost is not proportional to the number of model inputs. • Handles both uncorrelated and correlated model inputs.
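
    For intuition, a first-order index can be estimated from given input–output samples by binning one input and taking the variance of the conditional means, as sketched below. This simple binning estimator is shown only for illustration; it is not the specific location-based estimator proposed in the paper.

```python
# A simple given-data (binning) estimator of the first-order Sobol' index.
# It needs only input-output samples, so it works even when the model itself
# is unavailable, but it is not the estimator proposed in the paper above.
import numpy as np

def first_order_sobol(x_i, y, n_bins=20):
    """Estimate S_i = Var(E[Y | X_i]) / Var(Y) from samples of one input."""
    x_i, y = np.asarray(x_i), np.asarray(y)
    edges = np.quantile(x_i, np.linspace(0, 1, n_bins + 1))   # equal-count bins
    bins = np.clip(np.searchsorted(edges, x_i, side="right") - 1, 0, n_bins - 1)
    cond_means = np.array([y[bins == b].mean() for b in range(n_bins)])
    weights = np.array([(bins == b).mean() for b in range(n_bins)])
    var_cond_mean = np.sum(weights * (cond_means - y.mean()) ** 2)
    return var_cond_mean / y.var()

# Toy check on y = sin(x1) + 7*sin(x2)**2 with uniform inputs on [-pi, pi]:
rng = np.random.default_rng(0)
x = rng.uniform(-np.pi, np.pi, size=(100_000, 2))
y = np.sin(x[:, 0]) + 7 * np.sin(x[:, 1]) ** 2
for i in range(2):
    print(f"S_{i + 1} ≈ {first_order_sobol(x[:, i], y):.3f}")
```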