WorldWideScience

Sample records for sample applications irreg

  1. A Survey of Blue-Noise Sampling and Its Applications

    KAUST Repository

    Yan, Dongming; Guo, Jian-Wei; Wang, Bin; Zhang, Xiao-Peng; Wonka, Peter

    2015-01-01

    In this paper, we survey recent approaches to blue-noise sampling and discuss their beneficial applications. We discuss the sampling algorithms that use points as sampling primitives and classify the sampling algorithms based on various aspects, e.g., the sampling domain and the type of algorithm. We demonstrate several well-known applications that can be improved by recent blue-noise sampling techniques, as well as some new applications such as dynamic sampling and blue-noise remeshing.
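Blue-noise point sets are commonly characterized by a minimum-distance (Poisson-disk) property. As a rough illustration of that idea, and not an algorithm from the survey itself, the classic dart-throwing baseline can be sketched as follows (parameters are illustrative):

```python
import random
import math

def dart_throwing(radius, width=1.0, height=1.0, max_tries=5000, seed=0):
    """Naive dart throwing: accept a candidate point only if it keeps a
    minimum distance `radius` to every previously accepted point."""
    rng = random.Random(seed)
    points = []
    tries = 0
    while tries < max_tries:
        x, y = rng.uniform(0, width), rng.uniform(0, height)
        if all((x - px) ** 2 + (y - py) ** 2 >= radius ** 2 for px, py in points):
            points.append((x, y))
            tries = 0          # reset the failure counter after a successful dart
        else:
            tries += 1
    return points

samples = dart_throwing(0.1)
# every pair of points respects the minimum-distance (blue-noise) constraint
assert all(math.dist(p, q) >= 0.1
           for i, p in enumerate(samples) for q in samples[i + 1:])
```

Practical blue-noise generators replace this quadratic-cost rejection loop with spatial grids or tile-based constructions; the sketch only shows the defining constraint.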

  2. A Survey of Blue-Noise Sampling and Its Applications

    KAUST Repository

    Yan, Dongming

    2015-05-05

    In this paper, we survey recent approaches to blue-noise sampling and discuss their beneficial applications. We discuss the sampling algorithms that use points as sampling primitives and classify the sampling algorithms based on various aspects, e.g., the sampling domain and the type of algorithm. We demonstrate several well-known applications that can be improved by recent blue-noise sampling techniques, as well as some new applications such as dynamic sampling and blue-noise remeshing.

  3. Defense AT&L. Volume 37, Number 4, July-August 2008

    Science.gov (United States)

    2008-08-01

irritating tendency to overstay their welcome and overtake common sense, so we need to have some irregs too.” He let out a huge laugh, scattering...dynamics-based model of counterinsurgency that provides insights into irregular warfare. The awards will be presented to winners May 11 at the DoD...North Dakota, Oklahoma, Puerto Rico, Rhode Island, South Dakota, South Carolina, Tennessee, U.S. Virgin Islands, Vermont, West Virginia, and Wyoming

  4. Distance sampling methods and applications

    CERN Document Server

    Buckland, S T; Marques, T A; Oedekoven, C S

    2015-01-01

In this book, the authors cover the basic methods and advances within distance sampling that are most valuable to practitioners and in ecology more broadly. This is the fourth book dedicated to distance sampling. In the decade since the last book was published, there have been a number of new developments. The intervening years have also shown which advances are of most use. This self-contained book covers topics from the previous publications, while also including recent developments in method, software and application. Distance sampling refers to a suite of methods, including line and point transect sampling, in which animal density or abundance is estimated from a sample of distances to detected individuals. The book illustrates these methods through case studies; data sets and computer code are supplied to readers through the book’s accompanying website.  Some of the case studies use the software Distance, while others use R code. The book is in three parts.  The first part addresses basic methods, the ...
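The core of line-transect distance sampling can be illustrated with the half-normal detection function, which has a closed-form maximum-likelihood estimate. The sketch below is a minimal, hypothetical example with toy data, not material from the book:

```python
import math

def halfnormal_density(distances, line_length):
    """Line-transect density estimate with a half-normal detection
    function g(x) = exp(-x^2 / (2 sigma^2)).  The MLE of sigma^2 is
    mean(x^2); the effective strip half-width is mu = sigma*sqrt(pi/2),
    giving the estimator D = n / (2 * L * mu)."""
    n = len(distances)
    sigma = math.sqrt(sum(x * x for x in distances) / n)
    mu = sigma * math.sqrt(math.pi / 2.0)   # effective strip half-width
    return n / (2.0 * line_length * mu)

# toy data: 6 perpendicular distances (km) recorded along a 10 km transect
d = [0.02, 0.05, 0.07, 0.10, 0.12, 0.20]
density = halfnormal_density(d, line_length=10.0)   # animals per km^2
```

Real analyses (e.g. in the software Distance) additionally model detection on the line, truncation, and covariates; the closed form above is the simplest instructive case.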

  5. Applications of Liquid-Phase Microextraction in the Sample Preparation of Environmental Solid Samples

    Directory of Open Access Journals (Sweden)

    Helena Prosen

    2014-05-01

Solvent extraction remains one of the fundamental sample preparation techniques in the analysis of environmental solid samples, but organic solvents are toxic and environmentally harmful; therefore, one of the possible greening directions is its miniaturization. The present review covers the relevant research from the field of application of microextraction to the sample preparation of environmental solid samples (soil, sediments, sewage sludge, dust, etc.) published in the last decade. Several innovative liquid-phase microextraction (LPME) techniques that have emerged recently have also been applied as an aid in sample preparation of these samples: single-drop microextraction (SDME), hollow fiber-liquid phase microextraction (HF-LPME), dispersive liquid-liquid microextraction (DLLME). Besides the common organic solvents, surfactants and ionic liquids are also used. However, these techniques have to be combined with another technique to release the analytes from the solid sample into an aqueous solution. In the present review, the published methods were categorized into three groups: LPME in combination with a conventional solvent extraction; LPME in combination with an environmentally friendly extraction; LPME without previous extraction. The applicability of these approaches to the sample preparation for the determination of pollutants in solid environmental samples is discussed, with emphasis on their strengths, weak points and environmental impact.

  6. Applications of liquid-phase microextraction in the sample preparation of environmental solid samples.

    Science.gov (United States)

    Prosen, Helena

    2014-05-23

    Solvent extraction remains one of the fundamental sample preparation techniques in the analysis of environmental solid samples, but organic solvents are toxic and environmentally harmful, therefore one of the possible greening directions is its miniaturization. The present review covers the relevant research from the field of application of microextraction to the sample preparation of environmental solid samples (soil, sediments, sewage sludge, dust etc.) published in the last decade. Several innovative liquid-phase microextraction (LPME) techniques that have emerged recently have also been applied as an aid in sample preparation of these samples: single-drop microextraction (SDME), hollow fiber-liquid phase microextraction (HF-LPME), dispersive liquid-liquid microextraction (DLLME). Besides the common organic solvents, surfactants and ionic liquids are also used. However, these techniques have to be combined with another technique to release the analytes from the solid sample into an aqueous solution. In the present review, the published methods were categorized into three groups: LPME in combination with a conventional solvent extraction; LPME in combination with an environmentally friendly extraction; LPME without previous extraction. The applicability of these approaches to the sample preparation for the determination of pollutants in solid environmental samples is discussed, with emphasis on their strengths, weak points and environmental impact.

  7. Applications of Liquid-Phase Microextraction in the Sample Preparation of Environmental Solid Samples

    OpenAIRE

    Helena Prosen

    2014-01-01

    Solvent extraction remains one of the fundamental sample preparation techniques in the analysis of environmental solid samples, but organic solvents are toxic and environmentally harmful, therefore one of the possible greening directions is its miniaturization. The present review covers the relevant research from the field of application of microextraction to the sample preparation of environmental solid samples (soil, sediments, sewage sludge, dust etc.) published in the last decade. Several...

  8. Applicability of neutron activation analysis to geological samples

    Energy Technology Data Exchange (ETDEWEB)

    Ebihara, Mitsuru [Tokyo Metropolitan Univ., Graduate School of Science, Tokyo (Japan)

    2003-03-01

    The applicability of neutron activation analysis (NAA) to geological samples in space is discussed by referring to future space mission programs, by which the extraterrestrial samples are to be delivered to the earth for scientific inspections. It is concluded that both destructive and non-destructive NAA are highly effective in analyzing these samples. (author)

  9. Applicability of neutron activation analysis to geological samples

    International Nuclear Information System (INIS)

    Ebihara, Mitsuru

    2003-01-01

    The applicability of neutron activation analysis (NAA) to geological samples in space is discussed by referring to future space mission programs, by which the extraterrestrial samples are to be delivered to the earth for scientific inspections. It is concluded that both destructive and non-destructive NAA are highly effective in analyzing these samples. (author)

  10. Application of digital sampling techniques to particle identification

    International Nuclear Information System (INIS)

    Bardelli, L.; Poggi, G.; Bini, M.; Carraresi, L.; Pasquali, G.; Taccetti, N.

    2003-01-01

An application of digital sampling techniques is presented which can greatly simplify experiments involving sub-nanosecond time-mark determinations and energy measurements with nuclear detectors, used for Pulse Shape Analysis and Time of Flight measurements in heavy-ion experiments. In this work a 100 MSample/s, 12-bit analog-to-digital converter has been used: examples of this technique applied to Silicon and CsI(Tl) detectors in heavy-ion experiments involving particle identification via Pulse Shape Analysis and Time of Flight measurements are presented. The system is suited for applications to large detector arrays and to different kinds of detectors. Some preliminary results regarding the simulation of current signals in Silicon detectors are also discussed. (authors)
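One way a digitized waveform yields sub-sample time marks is by interpolating a fractional-threshold crossing between adjacent samples. The sketch below shows this generic leading-edge scheme on a 100 MSample/s record; it is an illustration of the principle, not the authors' exact algorithm:

```python
def time_mark(samples, dt_ns, fraction=0.3):
    """Sub-sample time mark on a digitized pulse: find where the waveform
    crosses `fraction` of its peak amplitude and refine the crossing time
    by linear interpolation between the two bracketing samples."""
    peak = max(samples)
    thr = fraction * peak
    for i in range(1, len(samples)):
        if samples[i - 1] < thr <= samples[i]:
            # linear interpolation between sample i-1 and sample i
            frac = (thr - samples[i - 1]) / (samples[i] - samples[i - 1])
            return (i - 1 + frac) * dt_ns
    raise ValueError("no threshold crossing found")

# 100 MSample/s -> 10 ns per sample; a toy digitized leading edge
wave = [0, 0, 10, 40, 80, 100, 90, 60]
t = time_mark(wave, dt_ns=10.0)   # time mark in ns
```

In practice a digital constant-fraction discriminator compares a delayed and an attenuated copy of the waveform, but the interpolation step that recovers sub-sample resolution is the same.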

  11. Recent advances in applications of nanomaterials for sample preparation.

    Science.gov (United States)

    Xu, Linnan; Qi, Xiaoyue; Li, Xianjiang; Bai, Yu; Liu, Huwei

    2016-01-01

Sample preparation is a key step for qualitative and quantitative analysis of trace analytes in complicated matrices. Along with the rapid development of nanotechnology in material science, numerous nanomaterials have been developed with particularly useful applications in analytical chemistry. Benefiting from their high specific areas, increased surface activities, and unprecedented physical/chemical properties, the potential of nanomaterials for rapid and efficient sample preparation has been exploited extensively. In this review, recent progress of novel nanomaterials applied in sample preparation is summarized and discussed. Both nanoparticles and nanoporous materials are evaluated for their unusual performance in sample preparation. Various compositions and functionalizations have extended the applications of nanomaterials in sample preparation, and distinct size and shape selectivity is generated from the diversified pore structures of nanoporous materials. Such great variety makes nanomaterials versatile tools in sample preparation for almost all categories of analytes. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. A low-volume cavity ring-down spectrometer for sample-limited applications

    Science.gov (United States)

    Stowasser, C.; Farinas, A. D.; Ware, J.; Wistisen, D. W.; Rella, C.; Wahl, E.; Crosson, E.; Blunier, T.

    2014-08-01

    In atmospheric and environmental sciences, optical spectrometers are used for the measurements of greenhouse gas mole fractions and the isotopic composition of water vapor or greenhouse gases. The large sample cell volumes (tens of milliliters to several liters) in commercially available spectrometers constrain the usefulness of such instruments for applications that are limited in sample size and/or need to track fast variations in the sample stream. In an effort to make spectrometers more suitable for sample-limited applications, we developed a low-volume analyzer capable of measuring mole fractions of methane and carbon monoxide based on a commercial cavity ring-down spectrometer. The instrument has a small sample cell (9.6 ml) and can selectively be operated at a sample cell pressure of 140, 45, or 20 Torr (effective internal volume of 1.8, 0.57, and 0.25 ml). We present the new sample cell design and the flow path configuration, which are optimized for small sample sizes. To quantify the spectrometer's usefulness for sample-limited applications, we determine the renewal rate of sample molecules within the low-volume spectrometer. Furthermore, we show that the performance of the low-volume spectrometer matches the performance of the standard commercial analyzers by investigating linearity, precision, and instrumental drift.
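The quoted effective internal volumes follow from ideal-gas scaling of the 9.6 ml cell volume by the ratio of cell pressure to ambient pressure; a quick check, assuming 760 Torr ambient pressure:

```python
def effective_volume_ml(cell_volume_ml, cell_pressure_torr, ambient_torr=760.0):
    """Amount of gas in the cell expressed as an equivalent volume at
    ambient pressure, via ideal-gas scaling: V_eff = V * P_cell / P_ambient."""
    return cell_volume_ml * cell_pressure_torr / ambient_torr

# the three operating points quoted in the abstract (9.6 ml cell)
v140 = effective_volume_ml(9.6, 140)   # ~1.8 ml
v45 = effective_volume_ml(9.6, 45)     # ~0.57 ml
v20 = effective_volume_ml(9.6, 20)     # ~0.25 ml
```

The lower the cell pressure, the fewer molecules are needed to fill it, which is exactly why reduced-pressure operation helps sample-limited applications.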

  13. Cryogenic Liquid Sample Acquisition System for Remote Space Applications

    Science.gov (United States)

    Mahaffy, Paul; Trainer, Melissa; Wegel, Don; Hawk, Douglas; Melek, Tony; Johnson, Christopher; Amato, Michael; Galloway, John

    2013-01-01

There is a need to autonomously acquire cryogenic hydrocarbon liquid samples from remote planetary locations, such as the lakes of Titan, for instruments such as mass spectrometers. Several problems had to be solved relative to collecting the right amount of cryogenic liquid sample into a warmer spacecraft, such as not allowing the sample to boil off or fractionate too early; controlling the intermediate and final pressures within carefully designed volumes; designing for various particulates and viscosities; designing to thermal, mass, and power-limited spacecraft interfaces; and reducing risk. Prior-art inlets for similar instruments in spaceflight were designed primarily for atmospheric gas sampling and are not useful for this front-end application. These cryogenic liquid sample acquisition system designs for remote space applications allow for remote, autonomous, controlled sample collection of a range of challenging cryogenic sample types. The design can control the size of the sample, prevent fractionation, control pressures at various stages, and allow for various liquid sample levels. It is capable of collecting repeated samples autonomously in the difficult low-temperature conditions often found in planetary missions. It is capable of collecting samples for use by instruments from difficult sample types such as cryogenic hydrocarbon (methane, ethane, and propane) mixtures with solid particulates such as found on Titan. The design with a warm actuated valve is compatible with various spacecraft thermal and structural interfaces. The design uses controlled volumes, heaters, inlet and vent tubes, a cryogenic valve seat, inlet screens, temperature and cryogenic liquid sensors, seals, and vents to accomplish its task.

  14. Application of secondary ion mass spectrometry (SIMS) to biological sample analysis

    International Nuclear Information System (INIS)

    Tamura, Hifumi

    1990-01-01

Some major issues and problems related to the analysis of biological samples are discussed, focusing on demonstrated and possible solutions and the application of secondary ion mass spectrometry (SIMS) to investigation of the composition of biological samples. The effective use of secondary electrons in combination with negative ions is most practical for the analysis of biological samples. Regardless of whether positive or negative ions are used, the electric potential at the surface of a sample stays around a constant value because of the absence of the accumulation of electric charges at the surface, leading to almost complete avoidance of the charging of the biological sample. A soft tissue sample can suffer damage to the tissue or migration of atoms in removing water from the sample. Some processes, including fixation and freeze drying, are available to prevent this. The application of SIMS to biological analysis is still in the basic research stage and further studies will be required to develop practical methods. Possible areas of its application include medicine, pathology, toxicology, pharmacology, plant physiology and other areas related to marine life and marine contamination. (N.K.)

  15. Applications of infrared photo-acoustic spectroscopy for wood samples

    Science.gov (United States)

    Mon-Lin Kuo; John F. McClelland; Siquan Luo; Po-Liang Chien; R.D. Walker; Chung-Yun Hse

    1988-01-01

    Various infrared (IR) spectroscopic techniques for the analysis of wood samples are briefly discussed. Theories and instrumentation of the newly developed photoacoustic spectroscopic (PAS) technique for measuring absorbance spectra of solids are presented. Some important applications of the PAS technique in wood science research are discussed. The application of the...

  16. Application-Specific Graph Sampling for Frequent Subgraph Mining and Community Detection

    Energy Technology Data Exchange (ETDEWEB)

    Purohit, Sumit; Choudhury, Sutanay; Holder, Lawrence B.

    2017-12-11

Graph mining is an important data analysis methodology, but struggles as the input graph size increases. The scalability and usability challenges posed by such large graphs make it imperative to sample the input graph and reduce its size. The critical challenge in sampling is to identify the appropriate algorithm to ensure the resulting analysis does not suffer heavily from the data reduction. Predicting the expected performance degradation for a given graph and sampling algorithm is also useful. In this paper, we present different sampling approaches for graph mining applications such as Frequent Subgraph Mining (FSM) and Community Detection (CD). We explore graph metrics such as PageRank, Triangles, and Diversity to sample a graph and conclude that for heterogeneous graphs Triangles and Diversity perform better than degree-based metrics. We also present two new sampling variations for targeted graph mining applications. We present empirical results to show that knowledge of the target application, along with input graph properties, can be used to select the best sampling algorithm. We also conclude that performance degradation is an abrupt, rather than gradual, phenomenon as the sample size decreases, and present empirical results showing that the degradation follows a logistic function.
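A minimal, hypothetical sketch of metric-driven graph sampling in the spirit described here (not the authors' implementation): rank nodes by a metric, in this case per-node triangle counts, and keep the subgraph induced by the top-k nodes:

```python
from itertools import combinations

def triangle_counts(adj):
    """Per-node triangle counts for an undirected graph represented
    as {node: set_of_neighbours}."""
    counts = {v: 0 for v in adj}
    for v in adj:
        for a, b in combinations(adj[v], 2):
            if b in adj[a]:        # v, a, b form a triangle
                counts[v] += 1
    return counts

def top_k_by_triangles(adj, k):
    """Keep the k nodes participating in the most triangles and return
    the induced subgraph -- one simple metric-driven sampler."""
    tc = triangle_counts(adj)
    keep = set(sorted(adj, key=lambda v: tc[v], reverse=True)[:k])
    return {v: adj[v] & keep for v in keep}

# toy graph: a triangle {0, 1, 2} plus a pendant path 2-3-4
g = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4}, 4: {3}}
sub = top_k_by_triangles(g, 3)    # retains the triangle, drops the path
```

A triangle-based sampler like this preserves locally dense structure, which is plausibly why the paper finds triangle metrics preferable to degree-based ones for FSM and CD; the paper's actual samplers are more elaborate.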

  17. ICP-MS applications for the analysis of geological materials and environmental samples

    International Nuclear Information System (INIS)

    Bendl, J.

    1997-01-01

This work deals with applications of inductively coupled plasma mass spectrometry (ICP-MS) to the analysis of geological materials and environmental samples. Instrumentation, calibration, alternatives for sample introduction, interferences, trace element analysis, rare earth elements, uranium and thorium, precious metals, isotopic analysis and environmental analysis are discussed.

  18. [Mass spectrometry technology and its application in analysis of biological samples].

    Science.gov (United States)

    Zhao, Long-Shan; Li, Qing; Guo, Chao-Wei; Chen, Xiao-Hui; Bi, Kai-Shun

    2012-02-01

With the excellent merits of wide analytical range, high sensitivity, small sample size, fast analysis speed, good repeatability, simple operation and low mobile phase consumption, as well as the capability of simultaneous isolation and identification, mass spectrometry techniques have become widely used in the areas of environmental science, the energy chemical industry, biological medicine, and so on. This article reviews the application of mass spectrometry technology in biological sample analysis in the latest three years, with a focus on new applications in pharmacokinetics and bioequivalence, toxicokinetics, pharmacokinetics-pharmacodynamics, population pharmacokinetics, identification and fragmentation pathways of drugs and their metabolites, and metabonomics, to provide references for further study of biological sample analysis.

  19. Applications of Asymptotic Sampling on High Dimensional Structural Dynamic Problems

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Bucher, Christian

    2011-01-01

The paper presents the application of asymptotic sampling to various structural models subjected to random excitations. A detailed study on the effect of different distributions of the so-called support points is performed. This study shows that the distribution of the support points has considerable […] is minimized. Next, the method is applied to different cases of linear and nonlinear systems with a large number of random variables representing the dynamic excitation. The results show that asymptotic sampling is capable of providing good approximations of low-failure-probability events for very high-dimensional reliability problems in structural dynamics.
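Asymptotic sampling estimates small failure probabilities by inflating the input standard deviation by 1/f (with f < 1), estimating the reliability index β(f) by crude Monte Carlo where failures are frequent, and extrapolating to f = 1 with the model β(f)/f = A + B/f². The toy sketch below uses a linear limit state whose true β is 4.0 by construction; the particular support points f and sample sizes are illustrative choices, not the paper's settings:

```python
import random
from statistics import NormalDist

def beta_mc(f, beta0=4.0, dim=10, n_mc=20000, seed=1):
    """Crude Monte Carlo estimate of the reliability index for the linear
    limit state g(u) = beta0 - sum(u)/sqrt(dim), with the standard
    deviation of the inputs inflated by 1/f so failures become frequent."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n_mc):
        s = sum(rng.gauss(0.0, 1.0 / f) for _ in range(dim))
        if beta0 - s / dim ** 0.5 <= 0.0:
            fails += 1
    return -NormalDist().inv_cdf(fails / n_mc)

# fit beta(f)/f = A + B/f**2 over a few support points, extrapolate to f = 1
fs = [0.35, 0.45, 0.55]
ys = [beta_mc(f) / f for f in fs]
xs = [1.0 / f ** 2 for f in fs]
xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
B = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)
A = ybar - B * xbar
beta_estimate = A + B   # extrapolated reliability index at f = 1 (true value: 4.0)
```

A direct Monte Carlo estimate of the same probability (Φ(−4) ≈ 3×10⁻⁵) would need millions of samples; the inflated-variance runs each need only thousands.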

  20. Vanishing auxiliary variables in PPS sampling - with applications in microscopy

    DEFF Research Database (Denmark)

    Andersen, Ina Trolle; Hahn, Ute; Jensen, Eva B. Vedel

Recently, non-uniform sampling has been suggested in microscopy to increase efficiency. More precisely, sampling proportional to size (PPS) has been introduced, where the probability of sampling a unit in the population is proportional to the value of an auxiliary variable. Unfortunately, vanishing auxiliary variables are a common phenomenon in microscopy and, accordingly, part of the population is not accessible using PPS sampling. We propose a modification of the design, for which an optimal solution can be found using a model-assisted approach. The optimal design has independent interest in sampling theory. We verify robustness of the new approach by numerical results, and we use real data to illustrate the applicability.
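The PPS mechanism and its associated Hansen-Hurwitz estimator can be sketched in a few lines (a generic textbook illustration, not the authors' modified design). Note that a unit whose auxiliary size is zero can never enter the sample; that is exactly the "vanishing auxiliary variable" problem the record addresses:

```python
import random

def pps_sample(sizes, n, seed=0):
    """Draw n units with replacement, with probability proportional to
    the auxiliary variable `sizes` (PPS sampling)."""
    rng = random.Random(seed)
    return rng.choices(range(len(sizes)), weights=sizes, k=n)

def hansen_hurwitz_total(y, sizes, idx):
    """Unbiased estimator of the population total of y under PPS:
    the mean of y_i / p_i over the sample, where p_i = size_i / sum(sizes)."""
    tot = sum(sizes)
    return sum(y[i] * tot / sizes[i] for i in idx) / len(idx)

# toy population where y is exactly proportional to size: the estimator
# then has zero variance and recovers the true total on every draw
sizes = [1, 2, 3, 4]
y = [2, 4, 6, 8]              # y_i = 2 * size_i, so the true total is 20
idx = pps_sample(sizes, n=5)
est = hansen_hurwitz_total(y, sizes, idx)
```

The efficiency gain of PPS comes precisely from this proportionality: the closer y is to being proportional to the auxiliary variable, the smaller the estimator's variance.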

  1. Measurement of extremely ²H-enriched water samples by laser spectrometry: application to batch electrolytic concentration of environmental tritium samples.

    Science.gov (United States)

    Wassenaar, L I; Kumar, B; Douence, C; Belachew, D L; Aggarwal, P K

    2016-02-15

Natural water samples artificially or experimentally enriched in deuterium (²H) at concentrations up to 10,000 ppm are required for various medical, environmental and hydrological tracer applications, but are difficult to measure using conventional stable isotope ratio mass spectrometry. Here we demonstrate that off-axis integrated cavity output spectroscopy (OA-ICOS) laser spectrometry, along with ²H-enriched laboratory calibration standards and appropriate analysis templates, allows for low-cost, fast, and accurate determinations of water samples having δ²H (VSMOW-SLAP) values up to at least 57,000 ‰ (~9000 ppm) at a processing rate of 60 samples per day. As one practical application, extremely ²H-enriched samples were measured by laser spectrometry and compared to the traditional ³H Spike-Proxy method in order to determine tritium enrichment factors in the batch electrolysis of environmental waters. Highly ²H-enriched samples were taken from different sets of electrolytically concentrated standards and low-level tritium samples, and all cases returned accurate and precise initial low-level ³H results. The ability to quickly and accurately measure extremely ²H-enriched waters by laser spectrometry will facilitate the use of deuterium as a tracer in numerous environmental and other applications. For low-level tritium operations, this new analytical ability facilitated a 10-20 % increase in sample productivity through the elimination of spike standards and gravimetrics, and provides immediate feedback on electrolytic enrichment cell performance. Copyright © 2016 John Wiley & Sons, Ltd.

  2. Modification and Application of a Leaf Blower-vac for Field Sampling of Arthropods.

    Science.gov (United States)

    Zou, Yi; van Telgen, Mario D; Chen, Junhui; Xiao, Haijun; de Kraker, Joop; Bianchi, Felix J J A; van der Werf, Wopke

    2016-08-10

Rice fields host a large diversity of arthropods, but investigating their population dynamics and interactions is challenging. Here we describe the modification and application of a leaf blower-vac for suction sampling of arthropod populations in rice. When used in combination with an enclosure, application of this sampling device provides absolute estimates of the populations of arthropods as numbers per standardized sampling area. The sampling efficiency depends critically on the sampling duration. In a mature rice crop, a two-minute sampling in an enclosure of 0.13 m² yields more than 90% of the arthropod population. The device also allows sampling of arthropods dwelling on the water surface or the soil in rice paddies, but it is not suitable for sampling fast-flying insects, such as predatory Odonata or larger hymenopterous parasitoids. The modified blower-vac is simple to construct, and cheaper and easier to handle than traditional suction sampling devices such as the D-vac. The low cost makes the modified blower-vac also accessible to researchers in developing countries.

  3. Theory of sampling and its application in tissue based diagnosis

    Directory of Open Access Journals (Sweden)

    Kayser Gian

    2009-02-01

Background: A general theory of sampling and its application in tissue-based diagnosis is presented. Sampling is defined as the extraction of information from certain limited spaces and its transformation into a statement or measure that is valid for the entire (reference) space. The procedure should be reproducible in time and space, i.e., give the same results when applied under similar circumstances. Sampling includes two different aspects: the procedure of sample selection and the efficiency of its performance. The practical performance of sample selection focuses on the search for localization of specific compartments within the basic space, and the search for presence of specific compartments. Methods: When a sampling procedure is applied in diagnostic processes, two different procedures can be distinguished: (I) the evaluation of the diagnostic significance of a certain object, which is the probability that the object can be grouped into a certain diagnosis, and (II) the probability to detect these basic units. Sampling can be performed without or with external knowledge, such as the size of searched objects, neighbourhood conditions, spatial distribution of objects, etc. If the sample size is much larger than the object size, the application of a translation-invariant transformation results in Krige's formula, which is widely used in the search for ores. Usually, sampling is performed in a series of area (space) selections of identical size. The size can be defined in relation to the reference space or according to interspatial relationship. The first method is called random sampling, the second stratified sampling. Results: Random sampling does not require knowledge about the reference space, and is used to estimate the number and size of objects. Estimated features include area (volume) fraction and numerical, boundary and surface densities. Stratified sampling requires the knowledge of objects (and their features) and evaluates spatial features in relation to …
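Random sampling for estimating an area (volume) fraction, as described under Results, reduces in its simplest form to classical point counting. A minimal sketch on a unit-square toy example (an illustration of the principle, not the paper's framework):

```python
import random

def point_count_fraction(is_inside, n_points, seed=0):
    """Estimate an area fraction by classical point counting: throw
    uniform random points into the unit square and record the hit rate."""
    rng = random.Random(seed)
    hits = sum(is_inside(rng.random(), rng.random()) for _ in range(n_points))
    return hits / n_points

# toy object: the lower-left quarter of the unit square (true fraction 0.25)
frac = point_count_fraction(lambda x, y: x < 0.5 and y < 0.5, 10000)
```

The hit rate is an unbiased estimator of the area fraction with binomial standard error sqrt(p(1-p)/n), which is what makes the design's reproducibility requirement quantifiable.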

  4. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample handling techniques and equipment. The world of sample handling has passed a threshold where older or 'old fashioned' traditional techniques no longer provide the ability to see the sample, due to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation and manipulation, newer, more sophisticated sample handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged which integrates sample preparation and analysis to enable on-line, near real-time analysis. Examples of those newer sample-handling methods will be discussed and current examples provided as alternatives to traditional sample handling. Two new techniques applying ultra-trace microwave-energy-enhanced sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition. A demonstration that applies to semiconductor materials will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of those methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. A new automated integrated method for handling samples for ultra-trace analysis has been developed. An on-line, near real-time measurement system will be described that enables many new automated sample handling and measurement capabilities.

  5. PIXE and its applications to biological samples

    International Nuclear Information System (INIS)

    Aldape, F.; Flores, M.J.

    1996-01-01

Throughout this century, industrialized society has seriously affected the ecology by introducing huge amounts of pollutants into the atmosphere as well as marine and soil environments. On the other hand, it is known that these pollutants, in excess of certain levels of concentration, not only put at risk the life of living beings but may also cause the extinction of some species. It is therefore of basic importance to substantially increase quantitative determinations of trace element concentrations in biological specimens in order to assess the effects of pollutants. It is in this field that PIXE plays a key role, where its unique analytical properties are decisive. Moreover, since the importance of this research has been recognized in many countries, many scientists have been encouraged to continue or initiate new research programmes aimed at solving the worldwide pollution problem. This document presents an overview of the papers reporting the application of PIXE analysis to biological samples during this last decade of the 20th century and recounts the number of PIXE laboratories dedicating their efforts to finding clues to the biological effects of pollutants introduced into living beings. Sample preparation methods, different kinds of samples under study and the use of complementary analytical techniques are also illustrated. (author). 108 refs

  6. Application of WSP method in analysis of environmental samples

    International Nuclear Information System (INIS)

    Stacho, M.; Slugen, V.; Hinca, R.; Sojak, S.; Krnac, S.

    2014-01-01

Detection of activity in natural samples is specific, especially because of its low level and high background interference. Background interference can be reduced using a low-background chamber. A measurement geometry in the shape of a Marinelli beaker is commonly used owing to the low level of activity in natural samples. The Peak Net Area (PNA) method is the widely accepted technique for the analysis of gamma-ray spectra. It is based on the net-area calculation of the full-energy peak and therefore takes into account only a fraction of the measured gamma-ray spectrum. On the other hand, the Whole Spectrum Processing (WSP) approach to gamma analysis makes it possible to use the entire information contained in the spectrum. This significantly raises the efficiency and improves the energy resolution of the analysis. A principal step for the WSP application is building up a suitable response operator. Problems appear when suitable standard calibration sources are unavailable, as may occur in the case of large-volume samples and/or in the analysis of a high energy range. Combined experimental and mathematical calibration may be a suitable solution. Many different detectors have been used to register gamma rays and their energy. HPGe detectors produce the highest resolution commonly available today; they are therefore the detectors most often used in natural-sample activity analysis. Scintillation detectors analysed using the PNA method can also be used in simple cases, but for complicated spectra they are practically inapplicable. The WSP approach improves the resolution of scintillation detectors and expands their applicability. The WSP method allowed a significant improvement of the energy resolution and the separation of the ¹³⁷Cs 661 keV peak from the ²¹⁴Bi 609 keV peak. On the other hand, the statistical fluctuations in the lower part of the spectrum, highlighted by background subtraction, mean that this part is still not reliably analyzable. (authors)

  7. Nuclear analytical techniques and their application to environmental samples

    International Nuclear Information System (INIS)

    Lieser, K.H.

    1986-01-01

    A survey is given on nuclear analytical techniques and their application to environmental samples. Measurement of the inherent radioactivity of elements or radionuclides allows determination of natural radioelements (e.g. Ra), man-made radioelements (e.g. Pu) and radionuclides in the environment. Activation analysis, in particular instrumental neutron activation analysis, is a very reliable and sensitive method for determination of a great number of trace elements in environmental samples, because the most abundant main constituents are not activated. Tracer techniques are very useful for studies of the behaviour and of chemical reactions of trace elements and compounds in the environment. Radioactive sources are mainly applied for excitation of characteristic X-rays (X-ray fluorescence analysis). (author)

  8. Synthesis and application of magnetic molecularly imprinted polymers in sample preparation.

    Science.gov (United States)

    Huang, Shuyao; Xu, Jianqiao; Zheng, Jiating; Zhu, Fang; Xie, Lijun; Ouyang, Gangfeng

    2018-04-12

    Magnetic molecularly imprinted polymers (MMIPs) have superior advantages in sample pretreatment because of their high selectivity for target analytes and the fast and easy isolation from samples. To meet the demand of both good magnetic property and good extraction performance, MMIPs with various structures, from traditional core-shell structures to novel composite structures with a larger specific surface area and more accessible binding sites, are fabricated by different preparation technologies. Moreover, as the molecularly imprinted polymer (MIP) layers determine the affinity, selectivity, and saturated adsorption amount of MMIPs, the development and innovation of the MIP layer are attracting attention and are reviewed here. Many studies that used MMIPs as sorbents in dispersive solid-phase extraction of complex samples, including environmental, food, and biofluid samples, are summarized. Graphical abstract The application of magnetic molecularly imprinted polymers (MIPs) in the sample preparation procedure improves the analytical performances for complex samples. MITs molecular imprinting technologies.

  9. Predicting the compressibility behaviour of tire shred samples for landfill applications.

    Science.gov (United States)

    Warith, M A; Rao, Sudhakar M

    2006-01-01

    Tire shreds have been used as an alternative to crushed stones (gravel) as drainage media in landfill leachate collection systems. The highly compressible nature of tire shreds (25-47% axial strain on vertical stress applications of 20-700 kPa) may reduce the thickness of the tire shred drainage layer to less than 300 mm (minimum design requirement) during the life of the municipal solid waste landfill. Hence there is a need to predict axial strains of tire shred samples in response to vertical stress applications so that the initial thickness of the tire shred drainage layer can be corrected for compression. The present study performs one-dimensional compressibility tests on four tire shred samples and compares the results with stress/strain curves from other studies. The stress/strain curves are developed into charts for choosing the correct initial thickness of tire shred layers that maintain the minimum thickness of 300 mm throughout the life of the landfill. The charts are developed for a range of vertical stresses based on the design height of the municipal waste cell and the bulk unit weight of municipal waste. Experimental results also showed that despite experiencing large axial strains, the average permeability of the tire shred sample consistently remained two to three orders of magnitude higher than the design performance criterion of 0.01 cm/s for landfill drainage layers. Laboratory experiments are, however, needed to verify whether long-term chemical and bio-chemical reactions between landfill leachate and the tire shred layer will deteriorate their mechanical functions (hydraulic conductivity, compressibility, strength) beyond permissible limits for geotechnical applications.
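
    The thickness correction the charts encode reduces to simple arithmetic: a layer placed at thickness h0 compresses to h0 · (1 − ε) under the design vertical stress, so retaining the 300 mm minimum requires h0 = 300 / (1 − ε). A minimal sketch, where the strain values are illustrative points within the 25-47% range reported, not the paper's design chart:

```python
# Correcting the as-built thickness of a tire shred drainage layer
# for compression under the design vertical stress.
MIN_THICKNESS_MM = 300.0

def initial_thickness(strain):
    """Placed thickness (mm) needed to retain the 300 mm minimum
    after an axial strain `strain` (fraction, e.g. 0.35 for 35%)."""
    return MIN_THICKNESS_MM / (1.0 - strain)

# Illustrative strains spanning the 25-47% range reported in the study.
for eps in (0.25, 0.35, 0.47):
    print(f"strain {eps:.0%}: place {initial_thickness(eps):.0f} mm")
```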

  10. Sample Development on Java Smart-Card Electronic Wallet Application

    OpenAIRE

    Toma Cristian

    2009-01-01

    In this paper, concepts such as the complete Java Card application, the life cycle of an applet, and a practical electronic wallet sample implemented in Java Card technology are highlighted. As a practical approach, it would be interesting to build applets for ID, driving license and health-insurance smart cards, for encrypting and digitally signing documents, for e-commerce, and for accessing critical resources in government and military fields. At the end of this article, a Java Card electronic wallet ...

  11. Monte Carlo Methods Development and Applications in Conformational Sampling of Proteins

    DEFF Research Database (Denmark)

    Tian, Pengfei

    quantitative insights into their thermodynamic and mechanistic properties that are difficult to probe in laboratory experiments. However, despite the rapid progress in the development of molecular simulation, there are still two limiting factors, (1), the current molecular mechanics force fields alone...... sampling methods to address these two problems. First of all, a novel technique has been developed for reliably estimating diffusion coefficients for use in the enhanced sampling of molecular simulations. A broad applicability of this method is illustrated by studying various simulation problems...

  12. Application of digital sampling techniques to particle identification in scintillation detectors

    International Nuclear Information System (INIS)

    Bardelli, L.; Bini, M.; Poggi, G.; Taccetti, N.

    2002-01-01

    In this paper, the use of a fast digitizing system for identification of fast charged particles with scintillation detectors is discussed. The three-layer phoswich detectors developed in the framework of the FIASCO experiment for the detection of light charged particles (LCP) and intermediate mass fragments (IMF) emitted in heavy-ion collisions at Fermi energies are briefly discussed. The standard analog electronics treatment of the signals for particle identification is illustrated. After a description of the digitizer designed to perform a fast digital sampling of the phoswich signals, the feasibility of particle identification on the sampled data is demonstrated. The results obtained with two different pulse shape discrimination analyses based on the digitally sampled data are compared with the standard analog signal treatment. The obtained results suggest, for the present application, the replacement of the analog methods with the digital sampling technique
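
    The kind of pulse shape discrimination that digital sampling enables can be illustrated with a charge-comparison ratio on sampled waveforms: the fraction of charge in a delayed "tail" gate separates pulses with different decay times. The decay constants, gate position and 4 ns sampling step below are invented for illustration; they are not the FIASCO phoswich parameters:

```python
import numpy as np

# Charge-comparison pulse-shape discrimination on digitally sampled
# signals (illustrative two-component model).
t = np.arange(0.0, 400.0, 4.0)       # sample times in ns

def pulse(tau):
    """Idealized exponential scintillation pulse, unit amplitude."""
    return np.exp(-t / tau)

def psd_ratio(samples, gate_ns=80.0):
    """Tail-to-total charge ratio computed from the sampled points."""
    return samples[t > gate_ns].sum() / samples.sum()

fast = pulse(tau=20.0)    # fast-decaying scintillation component
slow = pulse(tau=150.0)   # slow-decaying component

# A slower decay leaves more charge in the tail gate, so the ratio
# discriminates the two pulse shapes.
print(psd_ratio(fast) < psd_ratio(slow))
```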

  13. Control sample design using a geodemographic discriminator: An application of Super Profiles

    Science.gov (United States)

    Brown, Peter J. B.; McCulloch, Peter G.; Williams, Evelyn M. I.; Ashurst, Darren C.

    The development and application of an innovative sampling framework for use in a British study of the early detection of gastric cancer are described. The Super Profiles geodemographic discriminator is used in the identification of geographically distinct control and contrast areas from which samples of cancer registry case records may be drawn for comparison with the records of patients participating in the gastric cancer intervention project. Preliminary results of the application of the framework are presented and confirm its effectiveness in satisfactorily reflecting known patterns of variation in cancer occurrence by age, gender and social class. The method works well for cancers with a known and clear social gradient, such as lung and breast cancer, moderately well for gastric cancer and somewhat less well for oesophageal cancer, where the social class gradient is less clear.

  14. 78 FR 23896 - Notice of Funds Availability: Inviting Applications for the Quality Samples Program

    Science.gov (United States)

    2013-04-23

    ... proposals for the 2014 Quality Samples Program (QSP). The intended effect of this notice is to solicit... Strategy (UES) application Internet Web site. The UES allows applicants to submit a single consolidated and... of the FAS marketing programs, financial assistance programs, and market access programs. The...

  15. Measurement and application of purine derivatives: Creatinine ratio in spot urine samples of ruminants

    International Nuclear Information System (INIS)

    Chen, X.B.; Jayasuriya, M.C.N.; Makkar, H.P.S.

    2004-01-01

    The daily excretion of purine derivatives in urine has been used to estimate the supply of microbial protein to ruminant animals. The method provides a simple and non-invasive tool to indicate the nutritional status of farm animals. However, due to the need for complete collection of urine, the potential application at farm level is restricted. Research conducted under the FAO/IAEA Co-ordinated Research Project has indicated that it is possible to use the purine derivatives:creatinine ratio, measured in several spot urine samples collected within a day, as an index of microbial protein supply in a banding system for farm application. Some theoretical and experimental aspects of the measurement of the purine derivatives:creatinine ratio in spot urine samples and the possible application of the banding system at the farm level are discussed. (author)

  16. Sample preparation techniques based on combustion reactions in closed vessels - A brief overview and recent applications

    International Nuclear Information System (INIS)

    Flores, Erico M.M.; Barin, Juliano S.; Mesko, Marcia F.; Knapp, Guenter

    2007-01-01

    In this review, a general discussion of sample preparation techniques based on combustion reactions in closed vessels is presented. Applications for several kinds of samples are described, taking into account the literature data reported in the last 25 years. The operational conditions as well as the main characteristics and drawbacks are discussed for bomb combustion, oxygen flask and microwave-induced combustion (MIC) techniques. Recent applications of MIC techniques are discussed with special concern for samples not well digested by conventional microwave-assisted wet digestion, such as coal, and for the subsequent determination of halogens.

  17. Speeding Up Non-Parametric Bootstrap Computations for Statistics Based on Sample Moments in Small/Moderate Sample Size Applications.

    Directory of Open Access Journals (Sweden)

    Elias Chaibub Neto

    Full Text Available In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation in real and simulated data sets, when bootstrapping Pearson's sample correlation coefficient, and compared its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and number of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was remarkably/considerably faster for small/moderate sample sizes. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower than the straightforward implementation due to increased time expenditures in the generation of weight matrices via multinomial sampling.
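
    The multinomial-weighting trick described above carries over directly to NumPy; the sketch below is an illustrative transcription of the idea applied to Pearson's correlation, not the authors' R code, and the data are synthetic:

```python
import numpy as np

def multinomial_bootstrap_corr(x, y, n_boot, seed=None):
    """Vectorized non-parametric bootstrap of Pearson's correlation.

    Instead of materializing resampled datasets, draw multinomial
    counts over the n observations and evaluate weighted sample
    moments, so all replications reduce to matrix multiplications.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    # Each row of W holds one replication's weights (counts / n).
    W = rng.multinomial(n, np.full(n, 1.0 / n), size=n_boot) / n
    mx, my = W @ x, W @ y
    cov = W @ (x * y) - mx * my
    sx = np.sqrt(W @ (x * x) - mx * mx)
    sy = np.sqrt(W @ (y * y) - my * my)
    return cov / (sx * sy)

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = x + rng.normal(scale=0.5, size=50)
reps = multinomial_bootstrap_corr(x, y, n_boot=2000, seed=1)
```

    The same weighting applies to any statistic expressible in sample moments; only the moment expressions after the `W @` products change.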

  18. Application of Fourier analysis to the study of roughness profiles of eroded samples

    International Nuclear Information System (INIS)

    Bethencourt, M.; Botana, F.J.; Calvino, J.J.; Marcos, M.; Rodriguez-Chacon, M.A.

    1998-01-01

    Fourier transforms are applied to analyse surface roughness profiles recorded on samples coming from erosion-corrosion tests. The information retrieved using this method clearly complements that revealed by the more classical roughness amplitude parameters. The analysis procedure proposed here can be applied not only to characterise the surface of corroded samples but, in general, to evaluate the quality of any surface after the application of finishing treatments. (Author) 7 refs
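
    The Fourier treatment of a roughness profile can be sketched as follows. The profile is synthetic, a long-wavelength waviness plus a fine roughness plus noise, and the wavelengths and amplitudes are arbitrary, chosen only to show how the dominant spatial frequencies are recovered from the spectrum:

```python
import numpy as np

# Synthetic roughness profile: 64 um waviness + 8 um fine roughness
# + measurement noise (stand-in for a measured profile).
n, dx = 1024, 1.0                       # samples, step in micrometres
x = np.arange(n) * dx
profile = (2.0 * np.sin(2 * np.pi * x / 64.0)
           + 0.5 * np.sin(2 * np.pi * x / 8.0)
           + np.random.default_rng(0).normal(scale=0.1, size=n))

# One-sided amplitude spectrum and the corresponding spatial
# frequencies (cycles per micrometre).
amp = np.abs(np.fft.rfft(profile)) * 2.0 / n
freqs = np.fft.rfftfreq(n, d=dx)

# The two dominant spatial frequencies recover the input wavelengths.
peaks = freqs[np.argsort(amp)[-2:]]
wavelengths = sorted((1.0 / peaks).tolist())
print(wavelengths)   # -> [8.0, 64.0]
```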

  19. The application of variable sampling method in the audit testing of insurance companies' premium income

    Directory of Open Access Journals (Sweden)

    Jovković Biljana

    2012-12-01

    Full Text Available The aim of this paper is to present the procedure of audit sampling using the variable sampling methods for conducting the tests of income from insurance premiums in insurance company 'Takovo'. Since the incomes from the insurance premiums from vehicle insurance and third-party vehicle insurance have the dominant share of the insurance company's income, the application of this method will be shown in the audit examination of these incomes - incomes from VI and TPVI premiums. For investigating the applicability of these methods in testing the income of other insurance companies, we shall implement the method of variable sampling in the audit testing of the premium income from the three leading insurance companies in Serbia, 'Dunav', 'DDOR' and 'Delta Generali' Insurance.

  20. APPLICATION OF NEUTRON ACTIVATION ANALYSIS IN CHARACTERIZATION OF ENVIRONMENTAL SRM SAMPLES

    Directory of Open Access Journals (Sweden)

    Diah Dwiana Lestiani

    2010-06-01

    Full Text Available Neutron activation analysis (NAA) is an excellent, multi-elemental and sensitive nuclear technique, with detection limits down to the nanogram level. The application of NAA to the analysis of Standard Reference Material (SRM) National Institute of Standards and Technology (NIST) 1633b Coal Fly Ash and SRM NIST 1646a Estuarine Sediment was carried out for an NAA laboratory inter-comparison program. The samples were distributed by the Technology Centre for Nuclear Industry Material, National Nuclear Energy Agency, as coordinator of the inter-comparison program. The samples were irradiated in the rabbit facility of the G.A. Siwabessy reactor with a neutron flux of ~10¹³ n·cm⁻²·s⁻¹ and counted with an HPGe gamma-ray spectrometry detector. Several trace elements in these samples were detected. The concentrations of Al, Mg, K, Na and Ti in SRM NIST 1633b were 15.11, 7.35, 2.09, 0.192 and 0.756%, respectively, and the concentrations of As, Cr, Mn, Se, V, Sb, Co, Cs, La, Sc and Sm were 137.0, 195.6, 129.4, 9.61, 305.8, 5.45, 56.2, 11.18, 83.73, 41.1 and 19.13 mg/kg, respectively. In SRM NIST 1646a, the concentrations of Al and Na were 2.15 and 0.70%, and the concentrations of As, Cr, Co, La and Sc were 5.75, 36.3, 4.58, 15.67 and 4.00 mg/kg, respectively. The relative bias and u-test values of these results ranged from 0.4-11.3% and 0.15-2.25, respectively. An accuracy and precision evaluation based on International Atomic Energy Agency (IAEA) criteria was also applied. The results showed that the NAA technique is applicable to the analysis of environmental samples, and that the NAA laboratory in BATAN Bandung has a good performance.   Keywords: NAA, inter-comparison, estuarine sediment, coal fly ash, environmental samples

  1. Nuclear techniques for trace element analysis. PIXE and its applications to biomedical samples

    International Nuclear Information System (INIS)

    Cata-Danil, I.; Moro, R.; Gialanella, G.

    1996-01-01

    Problems in understanding the role of trace elements in the functioning of life processes are discussed. A brief review of the state of the PIXE technique is given. Principles and recent advances in beam systems, instrumentation and sample handling are covered. A rather comprehensive list of references regarding various methodological aspects and biomedical applications is given. Some applications are discussed. In particular, preliminary results of an investigation regarding pediatric obesity are presented. (author) 5 tabs., 21 refs

  2. Modern Trends in Neutron Activation Analysis. Applications to some African Environmental Samples

    International Nuclear Information System (INIS)

    Hassan, A.M.

    2009-01-01

    This review covers the results of several published articles dealing with modern trends in neutron activation analysis techniques, using several African research reactors, for environmental samples. The samples used have been collected from different areas in Egypt, South Africa, Ghana, Morocco, Nigeria, and Algeria. The neutron irradiation facilities and the advanced detection systems in each country are outlined. The prompt and delayed gamma-rays emitted due to neutron capture have been used to investigate the elemental constituents of such samples. Covered applications include exploration, mining, the industrial environment, air pollution, foodstuffs, soils and irrigation water samples. Some of the developed software programmes as well as modern methods of data analysis are presented. The thermal and epithermal neutron activation analysis techniques have been applied for the estimation of major, minor and trace elements in each material. Some of these data are presented with several comments.

  3. Application of bar codes to the automation of analytical sample data collection

    International Nuclear Information System (INIS)

    Jurgensen, H.A.

    1986-01-01

    The Health Protection Department at the Savannah River Plant collects 500 urine samples per day for tritium analyses. Prior to automation, all sample information was compiled manually. Bar code technology was chosen for automating this program because it provides a more accurate, efficient, and inexpensive method for data entry. The system has three major functions: sample labeling, accomplished at remote bar code label stations composed of an Intermec 8220 (Intermec Corp.) interfaced to an IBM-PC; data collection, performed on a central VAX 11/730 (Digital Equipment Corp.), where bar code readers are used to log in samples to be analyzed on liquid scintillation counters and the VAX 11/730 processes the data and generates reports; and data storage, on the VAX 11/730, backed up on the plant's central computer. A brief description of several other bar code applications at the Savannah River Plant is also presented

  4. Practicing IEF-PAGE of EPO: the impact of detergents and sample application methods on analytical performance in doping control.

    Science.gov (United States)

    Reichel, Christian

    2010-01-01

    Electrophoretic techniques, namely isoelectric focusing polyacrylamide gel electrophoresis (IEF-PAGE) and sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE) are key techniques used for confirming the doping-related abuse of recombinant erythropoietins and analogs. IEF-PAGE is performed on horizontal slab-gels with samples applied to the surface of the gel. Different sample application techniques can be employed, but application pieces and applicator strips are most frequently used. However, defective application pieces cause lane streaking during IEF of erythropoietin (EPO), which is especially pronounced in the acidic region of the gel. The effect is due to an incompatibility of the substance used for enhancing the wettability of the cellulose-based commercial product and is batch-dependent. A detailed mass spectrometric study was performed, which revealed that defective sample application pieces (bought between 2007 and 2010) contained a complex mixture of alcohol ethoxylates, alcohol ethoxysulfates, and alkyl sulfates (e.g. SDS). Anionic detergents, like the sulfates contained in these application pieces, are in general incompatible with IEF. Alternative application techniques proved partly useful. While homemade pieces made of blotting paper are a good alternative, the usage of applicator strips or shims is hampered by the risk of leaking wells, which lead to laterally diffused samples. Casting IEF-gels with wells appears to be the best solution, since sustained release of retained proteins from the application pieces can be avoided. Edge effects do not occur if wells are correctly filled with the samples. The evaluation of EPO-profiles with defects is prohibited by the technical document on EPO-analytics (TD2009EPO) of the World Anti-Doping Agency (WADA). Copyright © 2010 John Wiley & Sons, Ltd.

  5. Application of a Dual-Arm Robot in Complex Sample Preparation and Measurement Processes.

    Science.gov (United States)

    Fleischer, Heidi; Drews, Robert Ralf; Janson, Jessica; Chinna Patlolla, Bharath Reddy; Chu, Xianghua; Klos, Michael; Thurow, Kerstin

    2016-10-01

    Automation systems with applied robotics have already been established in industrial applications for many years. In the field of life sciences, a comparable high level of automation can be found in the areas of bioscreening and high-throughput screening. Strong deficits still exist in the development of flexible and universal fully automated systems in the field of analytical measurement. Reasons are the heterogeneous processes with complex structures, which include sample preparation and transport, analytical measurements using complex sensor systems, and suitable data analysis and evaluation. Furthermore, the use of nonstandard sample vessels with various shapes and volumes results in an increased complexity. The direct use of existing automation solutions from bioscreening applications is not possible. A flexible automation system for sample preparation, analysis, and data evaluation is presented in this article. It is applied for the determination of cholesterol in biliary endoprosthesis using gas chromatography-mass spectrometry (GC-MS). A dual-arm robot performs both transport and active manipulation tasks to ensure human-like operation. This general robotic concept also enables the use of manual laboratory devices and equipment and is thus suitable in areas with a high standardization grade. © 2016 Society for Laboratory Automation and Screening.

  6. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    Science.gov (United States)

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the usage of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets from noise. RANSAC could be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development and predictions for test set samples using applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate for the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries highlighting interesting dependencies of PV properties on MO compositions.
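
    RANSAC's consensus loop is compact enough to sketch. This toy version fits a line in the presence of gross outliers; the data, iteration count and inlier threshold are invented for illustration and the example stands in for, rather than reproduces, the QSAR machinery of the paper:

```python
import numpy as np

def ransac_line(x, y, n_iter=200, thresh=1.0, seed=None):
    """Minimal RANSAC for a 1-D linear model y = a*x + b.

    Repeatedly fit the model to a random minimal sample (2 points),
    count inliers within `thresh` of the fit, keep the model with the
    largest consensus set, and refit on its inliers.
    """
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(x), dtype=bool)
    for _ in range(n_iter):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
        inliers = np.abs(y - (a * x + b)) < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Least-squares refit on the consensus set.
    a, b = np.polyfit(x[best_inliers], y[best_inliers], 1)
    return a, b, best_inliers

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 40)
y = 2.0 * x + 1.0 + rng.normal(scale=0.2, size=40)
y[:5] += 15.0                         # inject gross outliers
a, b, inl = ransac_line(x, y, seed=1)
```

    The consensus set excludes the injected outliers, so the refit recovers the underlying slope and intercept despite the contamination.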

  7. Modification and application of a leaf blower-vac for field sampling of arthropods

    NARCIS (Netherlands)

    Zou, Yi; Telgen, van Mario D.; Chen, Junhui; Xiao, Haijun; Kraker, de Joop; Bianchi, Felix J.J.A.; Werf, van der Wopke

    2016-01-01

    Rice fields host a large diversity of arthropods, but investigating their population dynamics and interactions is challenging. Here we describe the modification and application of a leaf blower-vac for suction sampling of arthropod populations in rice. When used in combination with an enclosure,

  8. Characterization of PDMS samples with variation of its synthesis parameters for tunable optics applications

    Science.gov (United States)

    Marquez-Garcia, Josimar; Cruz-Félix, Angel S.; Santiago-Alvarado, Agustin; González-García, Jorge

    2017-09-01

    Nowadays the elastomer known as polydimethylsiloxane (PDMS, Sylgard 184) has, due to its physical properties, low cost and easy handling, become a frequently used material for the elaboration of optical components such as variable focal length liquid lenses, optical waveguides and solid elastic lenses. In recent years, we have been working on the characterization of this material for applications in visual sciences. In this work, we describe the elaboration of PDMS-made samples and present their physical and optical properties as a function of synthesis parameters such as the base:curing agent ratio and both curing time and temperature. Regarding mechanical properties, tensile and compression tests were carried out with a universal testing machine to obtain the respective stress-strain curves, and to obtain information on optical properties, UV-vis spectroscopy was applied to the samples to obtain transmittance and absorbance curves. The variation of the index of refraction was obtained with an Abbe refractometer. Results from the characterization will determine the proper synthesis parameters for the elaboration of tunable refractive surfaces for potential applications in robotics.

  9. Developing a cosmic ray muon sampling capability for muon tomography and monitoring applications

    International Nuclear Information System (INIS)

    Chatzidakis, S.; Chrysikopoulou, S.; Tsoukalas, L.H.

    2015-01-01

    In this study, a cosmic ray muon sampling capability using a phenomenological model that captures the main characteristics of the experimentally measured spectrum coupled with a set of statistical algorithms is developed. The “muon generator” produces muons with zenith angles in the range 0–90° and energies in the range 1–100 GeV and is suitable for Monte Carlo simulations with emphasis on muon tomographic and monitoring applications. The muon energy distribution is described by the Smith and Duller (1959) [35] phenomenological model. Statistical algorithms are then employed for generating random samples. The inverse transform provides a means to generate samples from the muon angular distribution, whereas the Acceptance–Rejection and Metropolis–Hastings algorithms are employed to provide the energy component. The predictions for muon energies 1–60 GeV and zenith angles 0–90° are validated with a series of actual spectrum measurements and with estimates from the software library CRY. The results confirm the validity of the phenomenological model and the applicability of the statistical algorithms to generate polyenergetic–polydirectional muons. The response of the algorithms and the impact of critical parameters on computation time and computed results were investigated. Final output from the proposed “muon generator” is a look-up table that contains the sampled muon angles and energies and can be easily integrated into Monte Carlo particle simulation codes such as Geant4 and MCNP.
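
    The sampling recipe described above, inverse transform for the zenith angle and acceptance-rejection for the energy, can be sketched with a deliberately simplified spectrum. The cos²θ angular intensity and the E^(-2.7) power-law energy density below are toy stand-ins, not the Smith and Duller phenomenological model used in the paper:

```python
import numpy as np

def sample_muons(n, seed=None):
    """Toy muon generator: inverse transform for the zenith angle,
    acceptance-rejection for the energy (simplified spectrum)."""
    rng = np.random.default_rng(seed)
    # Zenith angle: intensity ~ cos^2(theta) gives the closed-form
    # inverse transform cos(theta) = U^(1/3) for U ~ Uniform(0, 1).
    cos_theta = rng.random(n) ** (1.0 / 3.0)
    theta_deg = np.degrees(np.arccos(cos_theta))
    # Energy: f(E) ~ E^-2.7 on [1, 100] GeV; f is decreasing, so the
    # constant f(1) = 1 majorizes it for a uniform proposal.
    energies = np.empty(0)
    while energies.size < n:
        e = rng.uniform(1.0, 100.0, size=20000)
        u = rng.random(20000)
        energies = np.concatenate([energies, e[u < e ** -2.7]])
    return theta_deg, energies[:n]

theta, energy = sample_muons(2000, seed=0)
```

    The accepted (angle, energy) pairs play the role of the look-up table mentioned above: a pre-drawn sample that a transport code can consume as a source term.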

  10. Developing a cosmic ray muon sampling capability for muon tomography and monitoring applications

    Science.gov (United States)

    Chatzidakis, S.; Chrysikopoulou, S.; Tsoukalas, L. H.

    2015-12-01

    In this study, a cosmic ray muon sampling capability using a phenomenological model that captures the main characteristics of the experimentally measured spectrum coupled with a set of statistical algorithms is developed. The "muon generator" produces muons with zenith angles in the range 0-90° and energies in the range 1-100 GeV and is suitable for Monte Carlo simulations with emphasis on muon tomographic and monitoring applications. The muon energy distribution is described by the Smith and Duller (1959) [35] phenomenological model. Statistical algorithms are then employed for generating random samples. The inverse transform provides a means to generate samples from the muon angular distribution, whereas the Acceptance-Rejection and Metropolis-Hastings algorithms are employed to provide the energy component. The predictions for muon energies 1-60 GeV and zenith angles 0-90° are validated with a series of actual spectrum measurements and with estimates from the software library CRY. The results confirm the validity of the phenomenological model and the applicability of the statistical algorithms to generate polyenergetic-polydirectional muons. The response of the algorithms and the impact of critical parameters on computation time and computed results were investigated. Final output from the proposed "muon generator" is a look-up table that contains the sampled muon angles and energies and can be easily integrated into Monte Carlo particle simulation codes such as Geant4 and MCNP.

  11. Developing a cosmic ray muon sampling capability for muon tomography and monitoring applications

    Energy Technology Data Exchange (ETDEWEB)

    Chatzidakis, S., E-mail: schatzid@purdue.edu; Chrysikopoulou, S.; Tsoukalas, L.H.

    2015-12-21

    In this study, a cosmic ray muon sampling capability using a phenomenological model that captures the main characteristics of the experimentally measured spectrum coupled with a set of statistical algorithms is developed. The “muon generator” produces muons with zenith angles in the range 0–90° and energies in the range 1–100 GeV and is suitable for Monte Carlo simulations with emphasis on muon tomographic and monitoring applications. The muon energy distribution is described by the Smith and Duller (1959) [35] phenomenological model. Statistical algorithms are then employed for generating random samples. The inverse transform provides a means to generate samples from the muon angular distribution, whereas the Acceptance–Rejection and Metropolis–Hastings algorithms are employed to provide the energy component. The predictions for muon energies 1–60 GeV and zenith angles 0–90° are validated with a series of actual spectrum measurements and with estimates from the software library CRY. The results confirm the validity of the phenomenological model and the applicability of the statistical algorithms to generate polyenergetic–polydirectional muons. The response of the algorithms and the impact of critical parameters on computation time and computed results were investigated. Final output from the proposed “muon generator” is a look-up table that contains the sampled muon angles and energies and can be easily integrated into Monte Carlo particle simulation codes such as Geant4 and MCNP.

  12. Advanced sampling theory with applications how Michael ‘selected’ Amy

    CERN Document Server

    Singh, Sarjinder

    2003-01-01

    This book is a multi-purpose document. It can be used as a text by teachers, as a reference manual by researchers, and as a practical guide by statisticians. It covers 1165 references from different research journals through almost 1900 citations across 1194 pages, a large number of complete proofs of theorems, important results such as corollaries, and 324 unsolved exercises from several research papers. It includes 159 solved, data-based, real life numerical examples in disciplines such as Agriculture, Demography, Social Science, Applied Economics, Engineering, Medicine, and Survey Sampling. These solved examples are very useful for an understanding of the applications of advanced sampling theory in our daily life and in diverse fields of science. An additional 173 unsolved practical problems are given at the end of the chapters. University and college professors may find these useful when assigning exercises to students. Each exercise gives exposure to several complete research papers for researchers/stude...

  13. Pesticide residues in individual versus composite samples of apples after fine or coarse spray quality application

    NARCIS (Netherlands)

    Poulsen, M.; Wenneker, M.; Withagen, J.C.M.; Christensen, H.B.

    2012-01-01

    In this study, field trials on fine and coarse spray quality application of pesticides on apples were performed. The main objectives were to study the variation of pesticide residue levels in individual fruits versus composite samples, and the effect of standard fine spray quality application versus

  14. A general method to determine sampling windows for nonlinear mixed effects models with an application to population pharmacokinetic studies.

    Science.gov (United States)

    Foo, Lee Kien; McGree, James; Duffull, Stephen

    2012-01-01

    Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, a time interval for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of the population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for determination of sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The proposed method is applicable to determine sampling windows for any nonlinear mixed effects model although our work focuses on an application to population pharmacokinetic models. Copyright © 2012 John Wiley & Sons, Ltd.

  15. Application of Artificial Neural Networks to the Analysis of NORM Samples

    International Nuclear Information System (INIS)

    Moser, H.; Peyrés, V.; Mejuto, M.; García-Toraño, E.

    2015-01-01

    This work describes the application of artificial neural networks (ANNs) to analyze the raw data of gamma-ray spectra of NORM samples and decide whether the activity content of a certain nuclide is above or below the exemption limit of 1 Bq/g. The main advantage of using an ANN for this purpose is that the user needs no specialized knowledge in the field of gamma-ray spectrometry. In total, 635 spectra, covering varying activity concentrations and seven different materials at three densities each, were generated by Monte Carlo simulation to provide training material for the ANN. These spectra were created using the simulation code PENELOPE. Validation was carried out with a number of NORM samples previously characterized by conventional gamma-ray spectrometry with peak fitting

  16. Aerosol Sampling and Transport Efficiency Calculation (ASTEC) and application to Surtsey/DCH aerosol sampling system: Code version 1.0: Code description and user's manual

    International Nuclear Information System (INIS)

    Yamano, N.; Brockmann, J.E.

    1989-05-01

    This report describes the features and use of the Aerosol Sampling and Transport Efficiency Calculation (ASTEC) code. The ASTEC code has been developed to assess aerosol transport efficiency in source term experiments at Sandia National Laboratories. This code also has broad application to aerosol sampling and transport efficiency calculations in general, as well as to aerosol transport considerations in nuclear reactor safety issues. 32 refs., 31 figs., 7 tabs

  17. The application of x-ray fluorescence and diffraction to the characterization of environmental assessment samples

    International Nuclear Information System (INIS)

    Censullo, A.C.; Briden, F.E.

    1982-01-01

    Some results of tests on environmental assessment samples are reported. The utility of the J.W. Criss fundamental parameters computer program is evaluated for samples in which only one standard per element was used and where the standard matrix did not strictly resemble the unknown matrix. The environmental significance of a sample depends not only on its elemental composition, but also on the species or phases which the elements comprise. X-ray powder diffraction may be used to advantage for speciation, and multi-phase environmental assessment samples are amenable to XRD interpretation. Some results of the application of the Joint Committee on Powder Diffraction Standards computer interpretation to typical environmental samples are discussed; they were shown to contribute to the speciation of the complex samples that are encountered in environmental assessments

  18. Application of an ultrasonic focusing radiator for acoustic levitation of submillimeter samples

    Science.gov (United States)

    Lee, M. C.

    1981-01-01

    An acoustic apparatus has been specifically developed to handle samples of submillimeter size in a gaseous medium. This apparatus consists of an acoustic levitation device, deployment devices for small liquid and solid samples, heat sources for sample heat treatment, acoustic alignment devices, a cooling system and data-acquisition instrumentation. The levitation device includes a spherical aluminum dish of 12 in. diameter and 0.6 in. thickness, 130 pieces of PZT transducers attached to the back side of the dish and a spherical concave reflector situated in the vicinity of the center of curvature of the dish. The three lowest operating frequencies for the focusing-radiator levitation device are 75, 105 and 163 kHz, respectively. In comparison with other levitation apparatus, it possesses a large radiation pressure and a high lateral positional stability. This apparatus can be used most advantageously in the study of droplets and spherical shell systems, for instance, for fusion target applications.

  19. Contributions for the application of a phoswich detector on the analysis of environmental samples

    International Nuclear Information System (INIS)

    Dalaqua Junior, L.

    1989-01-01

    The characteristics of a phoswich detector and the parameters of the pulse shape discrimination system are evaluated with a view to application in environmental analysis by direct low-level gamma-ray spectrometry. The calibration curves and adjustments for pulse discrimination, detector resolution and homogeneity measurements are presented. Background reduction and the ²¹⁰Pb detection efficiency on evaporated sources are evaluated. The results obtained demonstrate the potential for application to the analysis of environmental samples, owing to a high detection efficiency and good geometry conditions for the measurements. (author) [pt

  20. Skin sample preparation by collagenase digestion for diclofenac quantification using LC-MS/MS after topical application.

    Science.gov (United States)

    Nirogi, Ramakrishna; Padala, Naga Surya Prakash; Boggavarapu, Rajesh Kumar; Kalaikadhiban, Ilayaraja; Ajjala, Devender Reddy; Bhyrapuneni, Gopinadh; Muddana, Nageswara Rao

    2016-06-01

    Skin is the target site for evaluating the pharmacokinetic parameters of topical applications, and sample preparation is one of the most influential steps in the bioanalysis of drugs in the skin. Evaluation of dermatopharmacokinetics at the preclinical stage is challenging due to the lack of a proper sample preparation method, so there is a need for an efficient sample preparation procedure for quantification of drugs in the skin using LC-MS/MS. Treating skin samples with collagenase followed by homogenization using a bead beater represents a best-fit method, resulting in a uniform homogenate and reproducible results. This new approach, involving enzymatic treatment and mechanical homogenization techniques, was evaluated for efficient preparation of skin samples in bioanalysis.

  1. 40 CFR 1065.245 - Sample flow meter for batch sampling.

    Science.gov (United States)

    2010-07-01

    (a) Application. Use a sample flow meter to determine sample flow... difference between a diluted exhaust sample flow meter and a dilution air meter to calculate raw exhaust flow...

  2. Reverse sample genome probing, a new technique for identification of bacteria in environmental samples by DNA hybridization, and its application to the identification of sulfate-reducing bacteria in oil field samples

    International Nuclear Information System (INIS)

    Voordouw, G.; Voordouw, J.K.; Karkhoff-Schweizer, R.R.; Fedorak, P.M.; Westlake, D.W.S.

    1991-01-01

    A novel method for identification of bacteria in environmental samples by DNA hybridization is presented. It is based on the fact that, even within a genus, the genomes of different bacteria may have little overall sequence homology. This allows the use of the labeled genomic DNA of a given bacterium (referred to as a standard) to probe for its presence and that of bacteria with highly homologous genomes in total DNA obtained from an environmental sample. Alternatively, total DNA extracted from the sample can be labeled and used to probe filters on which denatured chromosomal DNA from relevant bacterial standards has been spotted. The latter technique is referred to as reverse sample genome probing, since it is the reverse of the usual practice of deriving probes from reference bacteria for analyzing a DNA sample. Reverse sample genome probing allows identification of bacteria in a sample in a single step once a master filter with suitable standards has been developed. Application of reverse sample genome probing to the identification of sulfate-reducing bacteria in 31 samples obtained primarily from oil fields in the province of Alberta has indicated that there are at least 20 genotypically different sulfate-reducing bacteria in these samples

  3. Application of the neutron activation analysis method to the multielemental determination of food samples

    International Nuclear Information System (INIS)

    Maihara, V.A.

    1985-01-01

    The application of the thermal neutron activation analysis method for determining elements present at low concentrations and trace levels in bread and dried-milk samples, using non-destructive and chemical analyses, was studied. The non-destructive analyses were based on gamma spectrometry measurements of samples and standards irradiated in a thermal neutron flux on the order of 10¹² n cm⁻² s⁻¹. The irradiation time varied from a few minutes to 8 hours. The elements Na, Cl, Mn, Br, Fe, Zn, Rb, Sb, Cr and Sc were determined in bread samples, and Na, K, Cl, Ca, Mg, Br, Al, Zn, Rb, Sb and Cr in dried-milk samples. In the destructive analysis, the ²⁴Na radioisotope was separated by retention on a hydrated antimony pentoxide column from 8 N HCl after digestion of the organic matter. The bread was dissolved in concentrated HNO₃ and 70% HClO₄, and the dried milk in concentrated HNO₃ and H₂O₂. The ⁶⁴Cu, ⁶⁹ᵐZn and ¹⁴⁰La radioisotopes were also determined. The concentrations obtained for dried milk were compared with data obtained by other authors from different countries. Basic considerations are given on the detection limit as related to the technique used in this work. The detection limits for trace elements were determined using the Currie and Girardi methods. The accuracy of the results obtained for trace-element detection limits is discussed. (Author) [pt

  4. Application of high resolution x-ray spectrometry preceded by neutron activation for elemental analysis of soil samples

    International Nuclear Information System (INIS)

    Hernandez Rivero, A.; Capote Rodriguez, G.; Padilla Alvarez, R.; Herrera Peraza, E.

    1997-01-01

    Utilization of High Resolution X-Ray Spectrometry preceded by activation of the samples by irradiation with neutron fluxes (NAA-RX) is a relatively modern trend in the application of nuclear techniques. This method may advantageously complement the usual Neutron Activation Analysis by means of Gamma Spectrometry (NAA-G). In this work, results obtained by the application of NAA-RX to the non-destructive analysis of Cuban soil samples are discussed. The samples were irradiated with reactor neutron fluxes and the induced characteristic X-rays were measured using a Si(Li) detector. Concentrations of Fe, Zn and Eu as determined by NAA-RX are compared with both NAA-G and XRF data. For the processing of the X-ray and gamma spectra, the computer programs AXIL and ACTAN were used, respectively. (author) [es

  5. Application of high resolution x-ray spectrometry preceded by neutron activation for elemental analysis of soil samples

    International Nuclear Information System (INIS)

    Hernandez Rivero, A.; Capote Rodriguez, G.; Herrera Peraza, E.

    1996-01-01

    Utilization of High Resolution X-Ray Spectrometry preceded by activation of the samples by irradiation with neutron fluxes (NAA-RX) is a relatively modern trend in the application of nuclear techniques. This method may advantageously complement the usual Neutron Activation Analysis by means of Gamma Spectrometry (NAA-G). In this work, results obtained by the application of NAA-RX for non-destructive analysis of Cuban soil samples are discussed. The samples were irradiated with reactor neutron fluxes and the induced characteristic X-rays were measured using a Si(Li) detector. Concentrations of Fe, Zn and Eu as determined by NAA-RX are compared with both NAA-G and XRF data. For the processing of the X-ray and gamma spectra, the computer programs AXIL and ACTAN were used, respectively

  6. Application of semiempirical expressions to the alpha and beta radiometry of environmental depositions samples

    International Nuclear Information System (INIS)

    Perez Tamayo, L.

    1996-01-01

    Two semiempirical equations, the exponential beta-absorption law and the Bragg-Kleeman approximation, were applied as a complement to experimental corrections for beta backscattering and self-absorption of beta and alpha radiation in measurements of environmental deposition samples. In the first case, the validity of the mentioned corrections was verified, with an application boundary at masses greater than 300 Pb-210 (0.015 mg/cm²). In the second case, the Bragg-Kleeman approximation, combined with the experimental beta corrections, provides a criterion to determine the fundamental alpha and beta emitters in the samples, which turn out to be the Pb-210 group
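The exponential beta-absorption law behind the first correction can be illustrated with a small self-absorption estimate. The function and the numerical values below are illustrative assumptions, not values from the study:

```python
import math

def self_absorption_factor(mass_thickness, mu):
    """Average beta transmission through a uniformly mixed source,
    from the exponential absorption law I = I0 * exp(-mu * x).
    mass_thickness is in mg/cm^2 and mu (mass absorption coefficient,
    an assumed illustrative value here) in cm^2/mg; the factor is the
    mean of exp(-mu * x * s) over the depth fraction s in [0, 1]."""
    x = mu * mass_thickness
    if x < 1e-12:
        return 1.0
    return (1.0 - math.exp(-x)) / x

# Thicker deposits absorb more of their own beta emission:
thin = self_absorption_factor(10.0, 0.01)    # ~0.95
thick = self_absorption_factor(300.0, 0.01)  # ~0.32
```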

  7. Dried blood spot measurement: application in tacrolimus monitoring using limited sampling strategy and abbreviated AUC estimation.

    Science.gov (United States)

    Cheung, Chi Yuen; van der Heijden, Jaques; Hoogtanders, Karin; Christiaans, Maarten; Liu, Yan Lun; Chan, Yiu Han; Choi, Koon Shing; van de Plas, Afke; Shek, Chi Chung; Chau, Ka Foon; Li, Chun Sang; van Hooff, Johannes; Stolk, Leo

    2008-02-01

    Dried blood spot (DBS) sampling and high-performance liquid chromatography tandem mass spectrometry have been developed for monitoring tacrolimus levels. Our center favors the use of a limited sampling strategy and abbreviated formula to estimate the area under the concentration-time curve (AUC(0-12)). However, it is inconvenient for patients because they have to wait in the center for blood sampling. We investigated the application of the DBS method in tacrolimus level monitoring using the limited sampling strategy and abbreviated AUC estimation approach. Duplicate venous samples were obtained at each time point (C(0), C(2), and C(4)). To determine the stability of blood samples, one venous sample was sent to our laboratory immediately. The other duplicate venous samples, together with simultaneous fingerprick blood samples, were sent to the University of Maastricht in the Netherlands. Thirty-six patients were recruited and 108 sets of blood samples were collected. There was a highly significant relationship between AUC(0-12), estimated from venous blood samples, and fingerprick blood samples (r² = 0.96, P AUC(0-12) strategy as drug monitoring.

  8. Application of nuclear and allied techniques for the characterisation of forensic samples

    International Nuclear Information System (INIS)

    Sudersanan, M.; Kayasth, S.R.; Pant, D.R.; Chattopadhyay, N.; Bhattacharyya, C.N.

    2002-01-01

    Full text: Forensic science deals with the application of various techniques from physics, chemistry and biology to crime investigation. The legal implications of such analysis place considerable restrictions on the choice of analytical techniques. Moreover, the unknown nature of the materials, the limited availability of samples and the large number of elements to be analysed place considerable demands on the analytical chemist in selecting the appropriate technique. The availability of nuclear techniques has considerably enhanced the scope of forensic analysis. This paper deals with recent results on the use of nuclear and allied analytical techniques for forensic applications. One important type of sample of forensic importance pertains to the identification of gunshot residues. The use of nuclear techniques has considerably simplified the interpretation of results through the use of appropriate elements like Ba, Cu, Sb, Zn, As and Sn. The combination of non-nuclear techniques for elements like Pb and Ni, which are not easily amenable to analysis by NAA, and the use of appropriate separation procedures has established this method as a valid and versatile analytical procedure. In view of the presence of large amounts of extraneous materials like cloth and body tissues in these samples, and the limited availability of material, the procedures for sample collection, dissolution and analysis have been standardized. Analysis of unknown materials like powders and metallic pieces for the possible presence of nuclear materials, or as materials in illicit trafficking, has become important in recent years. The use of a multi-technique approach is important in this case. Use of non-destructive techniques like XRF and radioactive counting enables the preliminary identification of materials and the detection of radioactivity. Subsequent analysis by NAA or other appropriate analytical methods allows the characterization of the materials. Such

  9. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international  forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  10. A novel recursive Fourier transform for nonuniform sampled signals: application to heart rate variability spectrum estimation.

    Science.gov (United States)

    Holland, Alexander; Aboy, Mateo

    2009-07-01

    We present a novel method to iteratively calculate discrete Fourier transforms for discrete time signals with sample time intervals that may be widely nonuniform. The proposed recursive Fourier transform (RFT) does not require interpolation of the samples to uniform time intervals, and each iterative transform update of N frequencies has computational order N. Because of the inherent non-uniformity in the time between successive heart beats, an application particularly well suited for this transform is power spectral density (PSD) estimation for heart rate variability. We compare RFT based spectrum estimation with Lomb-Scargle Transform (LST) based estimation. PSD estimation based on the LST also does not require uniform time samples, but the LST has a computational order greater than N log(N). We conducted an assessment study involving the analysis of quasi-stationary signals with various levels of randomly missing heart beats. Our results indicate that the RFT leads to comparable estimation performance to the LST with significantly less computational overhead and complexity for applications requiring iterative spectrum estimations.
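The key idea of evaluating the Fourier sum at the actual, nonuniform sample times, instead of interpolating onto a uniform grid first, can be shown with a direct (non-recursive) evaluation. This is an illustrative sketch, not the authors' RFT; the signal and frequency grid are made up for the example:

```python
import cmath
import math
import random

def nonuniform_dft(times, values, freqs):
    """Evaluate the Fourier sum directly at arbitrary sample times,
    so no interpolation onto a uniform grid is needed."""
    n = len(times)
    return [sum(v * cmath.exp(-2j * math.pi * f * t)
                for t, v in zip(times, values)) / n
            for f in freqs]

# Irregularly spaced samples of a 1.2 Hz sinusoid (loosely analogous
# to beat-to-beat RR-interval data).
random.seed(0)
times = sorted(random.uniform(0.0, 10.0) for _ in range(400))
values = [math.sin(2 * math.pi * 1.2 * t) for t in times]

freqs = [0.1 * k for k in range(40)]   # 0.0 .. 3.9 Hz
power = [abs(c) ** 2 for c in nonuniform_dft(times, values, freqs)]
peak = freqs[power.index(max(power))]  # spectral peak, near 1.2 Hz
```

The direct sum costs O(N) per frequency per full evaluation; the paper's contribution is making the per-update cost O(N) across all N frequencies when samples arrive one at a time.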

  11. Fast neutron (14 MeV) attenuation analysis in saturated core samples and its application in well logging

    International Nuclear Information System (INIS)

    Amin Attarzadeh; Mohammad Kamal Ghassem Al Askari; Tagy Bayat

    2009-01-01

    To introduce the application of nuclear logging, it is appropriate to provide motivation for the use of nuclear measurement techniques in well logging. Important aspects for the geological sciences are, for instance, the grain and pore structure and porosity volume of the rocks, as well as the transport properties of a fluid in the porous medium. Nuclear measurements are, as a rule, non-intrusive: a measurement does not destroy the sample, and it does not interfere with the process to be measured. Non-intrusive measurements are also often much faster than traditional methods and can be applied in field measurements. A common type of nuclear measurement employs neutron irradiation, a powerful technique for geophysical analysis. In this research we describe the details of this technique and its applications to well logging and the oil industry. Experiments have been performed to investigate the possibility of using neutron attenuation measurements to determine the water and oil content of rock samples. A beam of 14 MeV neutrons produced by a 150 kV neutron generator was attenuated by different samples and subsequently detected with NE102 plastic scintillators (fast counters). Each sample was saturated with water and oil. The difference in neutron attenuation between dry and wet samples was compared with the fluid content determined by mass balance of the sample. In this experiment we were able to determine 3% moisture in a standard sample model (SiO₂) and to estimate porosity in geological samples saturated with different fluids. (Author)

  12. Sample Acquisition for Materials in Planetary Exploration (SAMPLE), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ORBITEC proposes to analyze, design, and develop a device for autonomous lunar surface/subsurface sampling and processing applications. The Sample Acquisition for...

  13. Field Sampling Plan for Closure of the Central Facilities Area Sewage Treatment Plant Lagoon 3 and Land Application Area

    International Nuclear Information System (INIS)

    Lewis, Michael George

    2016-01-01

    This field sampling plan describes sampling of the soil/liner of Lagoon 3 at the Central Facilities Area Sewage Treatment Plant. The lagoon is to be closed, and samples obtained from the soil/liner will provide information to determine if Lagoon 3 and the land application area can be closed in a manner that renders it safe to human health and the environment. Samples collected under this field sampling plan will be compared to Idaho National Laboratory background soil concentrations. If the concentrations of constituents of concern exceed the background level, they will be compared to Comprehensive Environmental Response, Compensation, and Liability Act preliminary remediation goals and Resource Conservation and Recovery Act levels. If the concentrations of constituents of concern are lower than the background levels, Resource Conservation and Recovery Act levels, or the preliminary remediation goals, then Lagoon 3 and the land application area will be closed. If the Resource Conservation and Recovery Act levels and/or the Comprehensive Environmental Response, Compensation, and Liability Act preliminary remediation goals are exceeded, additional sampling and action may be required.

  14. Field Sampling Plan for Closure of the Central Facilities Area Sewage Treatment Plant Lagoon 3 and Land Application Area

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Michael George [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-10-01

    This field sampling plan describes sampling of the soil/liner of Lagoon 3 at the Central Facilities Area Sewage Treatment Plant. The lagoon is to be closed, and samples obtained from the soil/liner will provide information to determine if Lagoon 3 and the land application area can be closed in a manner that renders it safe to human health and the environment. Samples collected under this field sampling plan will be compared to Idaho National Laboratory background soil concentrations. If the concentrations of constituents of concern exceed the background level, they will be compared to Comprehensive Environmental Response, Compensation, and Liability Act preliminary remediation goals and Resource Conservation and Recovery Act levels. If the concentrations of constituents of concern are lower than the background levels, Resource Conservation and Recovery Act levels, or the preliminary remediation goals, then Lagoon 3 and the land application area will be closed. If the Resource Conservation and Recovery Act levels and/or the Comprehensive Environmental Response, Compensation, and Liability Act preliminary remediation goals are exceeded, additional sampling and action may be required.

  15. Approaches to Sampling Gay, Bisexual, and Other Men Who Have Sex with Men from Geosocial-Networking Smartphone Applications: A Methodological Note

    Directory of Open Access Journals (Sweden)

    William C. Goedel

    2016-09-01

    Full text: Geosocial-networking smartphone applications utilize global positioning system (GPS) technologies to connect users based on their physical proximity. Many gay, bisexual, and other men who have sex with men (MSM) have smartphones, and these new mobile technologies have generated quicker and easier modes for MSM to meet potential partners. In doing so, these technologies may facilitate a user's ability to have multiple concurrent partners, thereby increasing their risk for acquiring HIV or other sexually transmitted infections. Researchers have sought to recruit users of these applications (e.g., Grindr, Jack'd, Scruff) into HIV prevention studies, primarily through advertising on the application. Given that these advertisements often broadly targeted large urban areas, these approaches have generated samples that are not representative of the population of users of the given application in a given area. As such, we propose a method to generate a spatially representative sample of MSM via direct messaging on a given application, using New York City and its geography as an example of this sampling and recruitment method. These methods can increase geographic representativeness and widen access to MSM who use geosocial-networking smartphone applications.

  16. The application of extraction chromatography to the determination of radionuclides in biological and environmental samples

    International Nuclear Information System (INIS)

    Testa, C.; Delle Site, A.

    1976-01-01

    The paper describes the application of extraction chromatography to the determination of several alpha and beta emitters in biological and environmental samples. Both column extraction chromatography and a batch extraction process have been used to isolate the radionuclides from the samples. The effect of several parameters (extractant concentration, support granulometry, stirring time, temperature, presence of a complexing agent) on the extraction and elution has been examined. The application of redox extraction chromatography is also described. A very simple and rapid determination of the activity retained on the column can be obtained by transferring the slurry to a counting vial and adding the scintillation liquid for direct detection of the α or β emission. The counting efficiencies obtained with this technique are compared with those obtained with ion exchange resins. The organic polymers used for extraction chromatography give about 100% counting efficiency. Conventional ion exchange resins cannot be used for this purpose because of their strong light absorption. (T.G.)

  17. Study on Big Database Construction and its Application of Sample Data Collected in China's First National Geographic Conditions Census Based on Remote Sensing Images

    Science.gov (United States)

    Cheng, T.; Zhou, X.; Jia, Y.; Yang, G.; Bai, J.

    2018-04-01

    In the project of China's First National Geographic Conditions Census, millions of sample data records were collected all over the country for interpreting land cover based on remote sensing images; the number of data files exceeds 12,000,000 and has continued to grow in the follow-on project of National Geographic Conditions Monitoring. At present, storing such big data in a database such as Oracle is the most effective approach, but an applicable method for managing and applying the sample data is even more significant. This paper studies a database construction method based on a relational database combined with a distributed file system, in which the vector data and file data are saved in different physical locations; the key issues and their solutions are discussed. On this basis, application methods for the sample data are studied and several use cases are analyzed, which lays the foundation for the data's application. In particular, sample data located in Shaanxi province were selected to verify the method. At the same time, taking the 10 first-level classes defined in the land cover classification system as an example, the spatial distribution and density characteristics of all kinds of sample data are analyzed. The results verify that the database construction method based on a relational database with a distributed file system is useful and applicable for searching, analyzing and further applying the sample data. Furthermore, the sample data collected in the project of China's First National Geographic Conditions Census could be useful in Earth observation and land cover quality assessment.

  18. A sample application of nuclear power human resources model

    International Nuclear Information System (INIS)

    Gurgen, A.; Ergun, S.

    2016-01-01

    One of the most important issues for a newcomer country initiating nuclear power plant projects is to have both quantitative and qualitative models for human resources development. For the quantitative model of human resources development for Turkey, the “Nuclear Power Human Resources (NPHR) Model” developed by the Los Alamos National Laboratory was used to determine the number of people that will be required from different professional or occupational fields in the planning of human resources for the Akkuyu, Sinop and third nuclear power plant projects. The number of people required in different professions was calculated for the Nuclear Energy Project Implementation Department, the regulatory authority, project companies, construction, the nuclear power plants and academia. In this study, a sample application of the human resources model is presented, giving the results of the first attempts to calculate Turkey's human resources needs. Keywords: Human Resources Development, Newcomer Country, NPHR Model

  19. Variable Sampling Composite Observer Based Frequency Locked Loop and its Application in Grid Connected System

    Directory of Open Access Journals (Sweden)

    ARUN, K.

    2016-05-01

    A modified digital signal processing procedure is described for the on-line estimation of the DC, fundamental and harmonic components of a periodic signal. A frequency locked loop (FLL) incorporated within the parallel structure of observers is proposed to accommodate a wide range of frequency drift. The frequency error generated under drifting frequencies is used to change the sampling frequency of the composite observer, so that the number of samples per cycle of the periodic waveform remains constant. A standard coupled oscillator with automatic gain control is used as the numerically controlled oscillator (NCO) to generate the enabling pulses for the digital observer. The NCO output is an integer multiple of the fundamental frequency, making it suitable for power quality applications. Another observer, with DC and second-harmonic blocks in the feedback path, acts as a filter and reduces the double-frequency content. A systematic study of the FLL is carried out and a method is proposed to design the controller. The performance of the FLL is validated through simulation and experimental studies. To illustrate applications of the new FLL, the estimation of individual harmonics from a nonlinear load and the design of a variable sampling resonant controller for a single-phase grid-connected inverter are presented.
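The core variable-sampling idea, keeping the number of samples per fundamental cycle constant by letting the frequency estimate drive the sampling clock, reduces to a one-line relation. The sketch below is an illustration only; the samples-per-cycle count and the frequencies are invented, not taken from the paper.

```python
# Variable-sampling sketch: the estimated fundamental frequency sets the
# observer's sampling rate so samples-per-cycle stays constant under drift.
SAMPLES_PER_CYCLE = 64  # illustrative choice

def sampling_frequency(f_estimated_hz):
    """New sampling rate (Hz) for the current fundamental-frequency estimate."""
    return SAMPLES_PER_CYCLE * f_estimated_hz

# Nominal 50 Hz grid, then a drift to 48.5 Hz:
fs_nominal = sampling_frequency(50.0)
fs_drifted = sampling_frequency(48.5)
print(fs_nominal, fs_drifted)      # 3200.0 3104.0
print(fs_drifted / 48.5)           # samples per cycle preserved: 64.0
```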

  20. Sample application of sensitivity/uncertainty analysis techniques to a groundwater transport problem. National Low-Level Waste Management Program

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rood, A.S.; Harris, G.A.; Maheras, S.J.; Kotecki, M.

    1991-06-01

    The primary objective of this document is to provide sample applications of selected sensitivity and uncertainty analysis techniques within the context of the radiological performance assessment process. These applications were drawn from the companion document Guidelines for Sensitivity and Uncertainty Analyses of Low-Level Radioactive Waste Performance Assessment Computer Codes (S. Maheras and M. Kotecki, DOE/LLW-100, 1990). Three techniques are illustrated in this document: one-factor-at-a-time (OFAT) analysis, fractional factorial design, and Latin hypercube sampling. The report also illustrates the differences in sensitivity and uncertainty analysis at the early and later stages of the performance assessment process, and potential pitfalls that can be encountered when applying the techniques. The emphasis is on application of the techniques rather than on the actual results, since the results are hypothetical and are not based on site-specific conditions.
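Of the three techniques named above, Latin hypercube sampling is the easiest to sketch: split each parameter's range into N equal strata, draw one point per stratum, and pair strata randomly across parameters. A minimal, illustrative Python version (not the report's implementation):

```python
import random

def latin_hypercube(n_samples, n_params, seed=0):
    """Basic Latin hypercube sample on the unit hypercube [0,1)^n_params.

    Each parameter's range is split into n_samples equal strata; every
    stratum is hit exactly once, with strata randomly permuted per parameter.
    """
    rng = random.Random(seed)
    columns = []
    for _ in range(n_params):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        # one uniform draw inside each stratum
        columns.append([(s + rng.random()) / n_samples for s in strata])
    # transpose: one row (tuple) per sample point
    return list(zip(*columns))

pts = latin_hypercube(10, 3)
print(len(pts))                                       # 10
print(all(0.0 <= x < 1.0 for p in pts for x in p))    # True
```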

  1. Flight Vehicle Control and Aerobiological Sampling Applications

    OpenAIRE

    Techy, Laszlo

    2009-01-01

    Aerobiological sampling using unmanned aerial vehicles (UAVs) is an exciting research field blending various scientific and engineering disciplines. The biological data collected using UAVs help to better understand the atmospheric transport of microorganisms. Autopilot-equipped UAVs can accurately sample along pre-defined flight plans and at precisely regulated altitudes. They can provide even greater utility when they are networked together in coordinated sampling missions: such measurements ...

  2. Standardization and application of indirect ELISA for diagnosis of Mycoplasma bovis in bovine blood serum samples

    Directory of Open Access Journals (Sweden)

    Samira Moraes Cunha de Mesquita

    2015-06-01

    ABSTRACT. Mesquita S.M.C., Mansur F.J., Nascimento E.R., Barreto M.L. & Kimura L.M.S. [Standardization and application of indirect ELISA for diagnosis of Mycoplasma bovis in bovine blood serum samples.] Padronização e aplicação de ELISA indireto para diagnóstico de Mycoplasma bovis em amostras de soro sanguíneo bovino. Revista Brasileira de Medicina Veterinária, 37(2):101-107, 2015. Universidade Federal Fluminense, Faculdade de Veterinária, Rua Vital Brazil Filho, 64, Vital Brazil, Niterói, RJ 24230-340, Brasil. E-mail: samira.veterinaria@gmail.com International researchers have presented results indicating frequent involvement of Mycoplasma spp. as a causative agent of mastitis in cattle, associating its presence with significant economic losses for farmers. Mycoplasma bovis is the most reported and most relevant species, because it causes the more severe disease. The level of antibodies against M. bovis remains high for several months and can be detected by ELISA. The aim of this work was to develop an indirect ELISA with whole-cell antigen of M. bovis (strain Donetta PG 45), with subsequent application to bovine blood serum samples for the detection of antibodies against M. bovis. The first stage of this work was the immunization of cows A and B by inoculating an immunogen against M. bovis to obtain hyperimmune blood serum; standardization of the ELISA then proceeded. A concentration of 2 mg of antigen/mL for coating the microtiter plates was chosen by statistical analyses. An optical density value of 0.2 was determined as the limit of reactivity discrimination of samples (the cut-off point). The hyperimmune blood serum sample of cow A (collected 30 days after immunization) was chosen as the positive control, and fetal calf serum was chosen as the negative control of the assay. In addition, the optimal dilutions found were 1:400 for blood serum samples and 1:10,000 for the conjugate, and the substrate used was the ortho…

  3. A 1.5--4 Kelvin detachable cold-sample transfer system: Application to inertially confined fusion with spin-polarized hydrogens fuels

    International Nuclear Information System (INIS)

    Alexander, N.; Barden, J.; Fan, Q.; Honig, A.

    1990-01-01

    A compact cold-transfer apparatus for engaging and retrieving samples at liquid helium temperatures (1.5-4 K), maintaining the samples at such temperatures for periods of hours, and subsequently inserting them into diverse apparatuses followed by disengagement, is described. The properties of several thermal-radiation-insulating shrouds, necessary for very low sample temperatures, are presented. The immediate intended application is transportable target shells containing highly spin-polarized deuterons in solid HD or D₂ for inertially confined fusion (ICF) experiments. The system is also valuable for unpolarized high-density fusion fuels, as well as for other applications which are discussed. 9 refs., 6 figs

  4. Uniform Sampling Table Method and its Applications II--Evaluating the Uniform Sampling by Experiment.

    Science.gov (United States)

    Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei

    2015-01-01

    A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of the measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites per row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs of different shape and size by four sampling methods. Gray correlation analysis was adopted to make a comprehensive evaluation against the standard method. The experimental results showed that the convenience score of the uniform sampling method was 1 (100%), the odds ratio of the occupation rate per row and column was infinite, the relative accuracy was 99.50-99.89%, the reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method is easy to operate, and the selected samples are distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility and can be put to use in drug analysis.
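The two quantitative indexes used above, relative accuracy and reproducibility RSD of pill weight, amount to simple statistics. The following sketch uses invented pill weights purely for illustration:

```python
from statistics import mean, stdev

def relative_accuracy(measured, reference):
    """Relative accuracy (%) of the measured mean against a reference value."""
    return 100.0 * (1.0 - abs(mean(measured) - reference) / reference)

def rsd_percent(values):
    """Relative standard deviation (%), the reproducibility index above."""
    return 100.0 * stdev(values) / mean(values)

weights = [0.251, 0.249, 0.250, 0.252, 0.248]  # g, illustrative pill weights
print(round(relative_accuracy(weights, 0.250), 2))  # 100.0
print(round(rsd_percent(weights), 2))               # 0.63
```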

  5. Method for analysing radium in powder samples and its application to uranium prospecting

    International Nuclear Information System (INIS)

    Gong Xinxi; Hu Minzhi.

    1987-01-01

    The decay daughters of Rn released from a powder sample (soil) in a sealed bottle are collected on a piece of copper, and the radium in the sample can be measured by counting α-particles with an alphameter for uranium prospecting; the technique is therefore called the radium method. This method has many advantages, such as high sensitivity (the lowest limit of detection for radium is 2.7 x 10⁻¹⁵ g per gram of sample), high efficiency, low cost and ease of use. On the basis of measuring more than 700 samples taken along 20 sections in 8 deposits, the results show that the radium method is better than γ-measurement and equal to the ²¹⁰Po method in its capability to discover anomalies. The author also summarizes the anomaly intensities of the radium method, the ²¹⁰Po method and γ-measurement at the surface above deep blind ores, with or without surficial mineralization, together with the shapes of their profiles and the variation of Ra/²¹⁰Po ratios. According to these distinguishing features, uranium mineralization located at depth and/or in shallow parts can be distinguished. The combined application of the radium, ²¹⁰Po and γ-measurement methods may be regarded as one of the important methods for anomaly assessment. Based on radium measurements of 771 stream sediment samples in an area of 100 km², it is demonstrated that the radium method can be used in the uranium reconnaissance and prospecting stages.

  6. Iridium oxide pH sensor for biomedical applications. Case urea-urease in real urine samples.

    Science.gov (United States)

    Prats-Alfonso, Elisabet; Abad, Llibertat; Casañ-Pastor, Nieves; Gonzalo-Ruiz, Javier; Baldrich, Eva

    2013-01-15

    This work demonstrates the implementation of iridium oxide films (IROF) grown on silicon-based thin-film platinum microelectrodes, their utilization as a pH sensor, and their successful formatting into a urea pH sensor. In this context, Pt electrodes were fabricated on silicon using standard photolithography and lift-off procedures, and IROF thin films were grown by a dynamic oxidation electrodeposition method (AEIROF). The AEIROF pH sensor reported showed a super-Nernstian (72.9±0.9 mV/pH) response between pH 3 and 11, with residual standard deviations of both repeatability and reproducibility below 5%, and a resolution of 0.03 pH units. For their application as urea pH sensors, AEIROF electrodes were reversibly modified with urease-coated magnetic microparticles (MP) using a magnet. The urea pH sensor provided fast detection of urea between 78 μM and 20 mM in saline solution, in sample volumes of just 50 μL. The applicability to urea determination in real urine samples is discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
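Once the super-Nernstian slope is known, converting a measured electrode potential to pH is a linear calibration. The sketch below uses the reported 72.9 mV/pH magnitude, but the sign convention and intercept are invented, so it illustrates the arithmetic only:

```python
# Linear pH calibration sketch: E = E0 + slope * pH.
# Slope magnitude from the abstract; sign and intercept are assumptions.
SLOPE_MV_PER_PH = -72.9
E0_MV = 510.0  # hypothetical intercept at pH 0

def potential_mv(ph):
    return E0_MV + SLOPE_MV_PER_PH * ph

def ph_from_potential(e_mv):
    return (e_mv - E0_MV) / SLOPE_MV_PER_PH

e7 = potential_mv(7.0)
print(round(e7, 1))                     # -0.3 (mV)
print(round(ph_from_potential(e7), 2))  # 7.0, round-trip check
```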

  7. Simultaneous radiochemical determination of plutonium, strontium, uranium, and iron nuclides and application to atmospheric deposition and aerosol samples

    International Nuclear Information System (INIS)

    Rosner, G.; Hoetzl, H.; Winkler, R.

    1990-01-01

    A procedure for the sequential radiochemical determination of plutonium, strontium, uranium and iron nuclides is described. The separation is carried out on a single anion exchange column. Pu(IV), U(VI) and Fe(III) are fixed on Bio Rad AG 1-X4 from 9 mol/l HCl, while the sample effluent is used for the determination of radiostrontium. Fe and U are eluted separately with 7 mol/l HNO₃, and Pu(III) is eluted with 1.2 mol/l HCl containing hydrogen peroxide. Subsequently, Pu and U are electrolysed and counted by alpha spectrometry. Radiostrontium is purified by the nitrate method and counted in a low-level beta proportional counter. Fe is purified by extraction and cation exchange, and ⁵⁵Fe is counted by X-ray spectrometry with a Si(Li) detector. The sample preparation and the application of the procedure to large samples, namely aerosols from 10⁵ m³ of air and monthly deposition samples from a 0.6 m² sampling area (10-100 l), are described. Chemical yields are 70 ± 20% for Pu, 80 ± 15% for Sr, 80-90% for U, and 75 ± 10% for Fe. As an example, the maximum airborne radionuclide concentrations determined with this procedure in fortnightly collected samples at Neuherberg after the Chernobyl accident were: ²³⁹⁺²⁴⁰Pu, 2.58; ²³⁸Pu, 1.40; ²³⁸U, 0.65; ²³⁴U, 0.67; ⁹⁰Sr, 7600; and ⁵⁵Fe, 990 µBq m⁻³. With appropriate changes in sample preparation, the procedure is applicable to other kinds of samples. (orig.)
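The step from net alpha counts to an airborne activity concentration, correcting for counting efficiency, radiochemical yield and sampled air volume, is generic counting arithmetic. All numbers in this sketch are hypothetical, not the paper's:

```python
# Counting-arithmetic sketch (illustrative values): activity concentration
# corrected for detector efficiency and radiochemical yield.
def activity_concentration_uBq_m3(net_counts, count_time_s,
                                  efficiency, chem_yield, air_volume_m3):
    """Activity concentration in microBq per m^3 of sampled air."""
    activity_bq = net_counts / (count_time_s * efficiency * chem_yield)
    return 1e6 * activity_bq / air_volume_m3

# e.g. 180 net counts in 60000 s, 25% efficiency, 70% Pu yield, 1e5 m^3 air
c = activity_concentration_uBq_m3(180, 60000, 0.25, 0.70, 1e5)
print(round(c, 2))  # 0.17
```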

  8. Online mixed sampling: An application in hidden populations

    Directory of Open Access Journals (Sweden)

    María Tatiana Gorjup

    2012-04-01

    Purpose: The objective of the article is to explore the possibilities offered by new technologies and virtual social networks for the recruitment of sampling units in hidden populations and as a support for the use of mixed methods. Design/methodology: The objective was to identify Argentinean entrepreneurs who started their businesses in Spain. The observation unit has the characteristics of a hidden population: (1) high geographic dispersion, which makes its members difficult to localize; (2) underestimation of the number of Argentinean residents in the official statistics; (3) Argentinean residents in an illegal situation; and (4) in some cases, the factors that drove the emigration were negative, making respondents reluctant to answer. In this context, the researchers used (1) an online virtual sampling and (2) traditional snowball sampling. The online virtual sampling was carried out using a social network (Facebook), through which 52 virtual groups of 'Argentineans living in Spain' were identified. Subsequently, each member was contacted by an individual message which explained the aim of the research and invited them to participate in the study. Findings: Through this study, it was possible to prove that the use of virtual groups in social networks makes it possible to detect observation units that are not registered officially (administrative registers, censuses, etc.). This finding contributed to increasing the scope and size of the sample; it favoured the design of the qualitative sample and the triangulation of the results, and therefore increased the validity of the findings on the hidden population. Originality/value: The article presents an experience of applying virtual sampling and mixed methods to the study of hidden populations; in particular, it analysed Argentinean immigrant entrepreneurs using virtual groups as a source of information.

  9. Simultaneous radiochemical determination of plutonium, strontium, uranium, and iron nuclides and application to atmospheric deposition and aerosol samples

    Energy Technology Data Exchange (ETDEWEB)

    Rosner, G.; Hoetzl, H.; Winkler, R. (Gesellschaft fuer Strahlen- und Umweltforschung mbH Muenchen, Neuherberg (Germany, F.R.). Inst. fuer Strahlenschutz)

    1990-11-01

    A procedure for the sequential radiochemical determination of plutonium, strontium, uranium and iron nuclides is described. The separation is carried out on a single anion exchange column. Pu(IV), U(VI) and Fe(III) are fixed on Bio Rad AG 1-X4 from 9 mol/l HCl, while the sample effluent is used for the determination of radiostrontium. Fe and U are eluted separately with 7 mol/l HNO₃, and Pu(III) is eluted with 1.2 mol/l HCl containing hydrogen peroxide. Subsequently, Pu and U are electrolysed and counted by alpha spectrometry. Radiostrontium is purified by the nitrate method and counted in a low-level beta proportional counter. Fe is purified by extraction and cation exchange, and ⁵⁵Fe is counted by X-ray spectrometry with a Si(Li) detector. The sample preparation and the application of the procedure to large samples, namely aerosols from 10⁵ m³ of air and monthly deposition samples from a 0.6 m² sampling area (10-100 l), are described. Chemical yields are 70 ± 20% for Pu, 80 ± 15% for Sr, 80-90% for U, and 75 ± 10% for Fe. As an example, the maximum airborne radionuclide concentrations determined with this procedure in fortnightly collected samples at Neuherberg after the Chernobyl accident were: ²³⁹⁺²⁴⁰Pu, 2.58; ²³⁸Pu, 1.40; ²³⁸U, 0.65; ²³⁴U, 0.67; ⁹⁰Sr, 7600; and ⁵⁵Fe, 990 µBq m⁻³. With appropriate changes in sample preparation, the procedure is applicable to other kinds of samples. (orig.)

  10. Probe depth matters in dermal microdialysis sampling of benzoic acid after topical application

    DEFF Research Database (Denmark)

    Holmgaard, R; Benfeldt, E; Bangsgaard, N

    2012-01-01

    …-2 mm) and deep (>2 mm) positioning of the linear MD probe in the dermis of human abdominal skin, followed by topical application of 4 mg/ml of benzoic acid (BA) in skin chambers overlying the probes. Dialysate was sampled every hour for 12 h and analysed for BA content by high-performance liquid chromatography. Probe depth was measured by 20-MHz ultrasound scanning. The area under the time-versus-concentration curve (AUC) describes the drug exposure in the tissue during the experiment and is a relevant parameter to compare for the 3 dermal probe depths investigated. The AUC(0-12) were: superficial … significantly different from each other (p value …). The paper demonstrates that there is an inverse relationship between the depth of the probe in the dermis and the amount of drug sampled following topical penetration ex vivo. The result is of relevance to the in vivo situation, and it can …

  11. Sampling strong tracking nonlinear unscented Kalman filter and its application in eye tracking

    International Nuclear Information System (INIS)

    Zu-Tao, Zhang; Jia-Shu, Zhang

    2010-01-01

    The unscented Kalman filter is a well-known, well-developed method for nonlinear motion estimation and tracking. However, the standard unscented Kalman filter has inherent drawbacks, such as numerical instability and high computational cost in practical applications. In this paper, we present a novel sampling strong tracking nonlinear unscented Kalman filter, aiming to overcome the difficulty of nonlinear eye tracking. In the proposed filter, a simplified unscented transform sampling strategy with n + 2 sigma points provides computational efficiency, and the suboptimal fading factor of strong tracking filtering is introduced to improve the robustness and accuracy of eye tracking. Compared with related unscented Kalman filters for eye tracking, the proposed filter has potential advantages in robustness, convergence speed, and tracking accuracy. The final experimental results show the validity of our method for eye tracking under realistic conditions.
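For orientation, the textbook unscented transform generates 2n+1 sigma points from the state mean and covariance; the paper's simplified strategy uses only n+2 points and is not reproduced here. A minimal sketch for a diagonal covariance, where the matrix square root is elementwise:

```python
import math

# Standard 2n+1 sigma-point unscented transform, restricted to a diagonal
# covariance so the "matrix square root" is a per-component sqrt.
def sigma_points_diag(mean, var, kappa=1.0):
    n = len(mean)
    pts = [list(mean)]
    weights = [kappa / (n + kappa)]
    for i in range(n):
        spread = math.sqrt((n + kappa) * var[i])
        for sgn in (+1.0, -1.0):
            p = list(mean)
            p[i] += sgn * spread
            pts.append(p)
            weights.append(1.0 / (2.0 * (n + kappa)))
    return pts, weights

def unscented_mean(f, mean, var, kappa=1.0):
    """Weighted mean of f over the sigma points."""
    pts, w = sigma_points_diag(mean, var, kappa)
    return sum(wi * f(p) for wi, p in zip(w, pts))

# For a linear map the unscented mean is exact: f(mean) = 2*1 + 3 = 5
m = unscented_mean(lambda p: 2.0 * p[0] + p[1], [1.0, 3.0], [0.5, 0.2])
print(round(m, 6))  # 5.0
```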

  12. An innovative approach to sampling complex industrial emissions for use in animal toxicity tests: application to iron casting operations.

    Science.gov (United States)

    Palmer, W G; Scholz, R C; Moorman, W J

    1983-03-01

    Sampling of complex mixtures of airborne contaminants for chronic animal toxicity tests often involves numerous sampling devices, requires extensive sampling time, and yields collected materials in forms unsuitable for administration to animals. A method is described which uses a high-volume wet venturi scrubber to collect the respirable fractions of emissions from iron foundry casting operations. The construction and operation of the sampler are presented, along with collection efficiency data and its application to the preparation of large quantities of samples to be administered to animals by intratracheal instillation.

  13. Large-volume injection of sample diluents not miscible with the mobile phase as an alternative approach in sample preparation for bioanalysis: an application for fenspiride bioequivalence.

    Science.gov (United States)

    Medvedovici, Andrei; Udrescu, Stefan; Albu, Florin; Tache, Florentin; David, Victor

    2011-09-01

    Liquid-liquid extraction of target compounds from biological matrices, followed by the injection of a large volume of the organic layer into a chromatographic column operated under reversed-phase (RP) conditions, successfully combines selectivity with a straightforward procedure to enhance sensitivity, compared with the usual approach involving solvent evaporation and residue re-dissolution. Large-volume injection of samples in diluents that are not miscible with the mobile phase was recently introduced into chromatographic practice. The risk of random errors produced during the manipulation of samples is also substantially reduced. A bioanalytical method designed for the bioequivalence of fenspiride-containing pharmaceutical formulations was based on a sample preparation procedure involving extraction of the target analyte and the internal standard (trimetazidine) from alkalinized plasma samples into 1-octanol. A volume of 75 µl of the octanol layer was directly injected on a Zorbax SB C18 Rapid Resolution column (50 mm length × 4.6 mm internal diameter × 1.8 µm particle size), with the RP separation carried out under gradient elution conditions. Detection was made through positive ESI and MS/MS. Aspects related to method development and validation are discussed. The bioanalytical method was successfully applied to assess the bioequivalence of a modified-release pharmaceutical formulation containing 80 mg fenspiride hydrochloride during two different studies carried out as single-dose administration under fasting and fed conditions (four arms) and as multiple-dose administration, respectively. The quality attributes assigned to the bioanalytical method, as resulting from its application to the bioequivalence studies, are highlighted and fully demonstrate that sample preparation based on large-volume injection of immiscible diluents has increased potential for application in bioanalysis.

  14. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...

  15. Innovation leading the way: application of lean manufacturing to sample management.

    Science.gov (United States)

    Allen, M; Wigglesworth, M J

    2009-06-01

    Historically, sample management focused successfully on providing compound quality and on tracking distribution across a geographically diverse organization. However, if a competitive advantage is to be delivered in a changing environment of outsourcing, efficiency and customer service must now improve, or the function faces reconstruction. The authors used discrete event simulation to model the compound process from chemistry to assay and applied lean manufacturing techniques to analyze and improve these processes. In doing so, they identified a value-adding process time of just 11 min within a procedure that took days. Modeling also allowed analysis of the equipment and human resources necessary to meet the expected demand within an acceptable cycle time. The layout and location of the sample management and screening departments are key to allowing process integration, creating rapid flow of work, and delivering these efficiencies. Following this analysis and minor process changes, the authors demonstrated for 2 programs that solid compounds can be converted to assay-ready plates in less than 4 h. In addition, it is now possible to deliver assay data from these compounds within the same working day, allowing chemistry teams more flexibility and more time to execute the next chemistry round. Additional application of lean manufacturing principles has the potential to further decrease cycle times while using fewer resources.

  16. Applicability of Non-Invasive Sampling in Population Genetic Study of Taiwanese Macaques (Macaca cyclopis)

    Directory of Open Access Journals (Sweden)

    Jui-Hua Chu

    2006-12-01

    This paper presents a pilot study conducted to test the applicability of a non-invasive sampling approach in population genetic studies of Taiwanese macaques (Macaca cyclopis). Monkey feces were collected in the field and used as non-invasive DNA sources. The PCR success rates of both microsatellite and mitochondrial DNA markers were examined. Compared with other non-invasive genetic sampling studies of different mammal species, the success rate of microsatellite PCR amplification was low (42.4%, N = 181), while that of mtDNA PCR amplification was acceptable (66.5%, N = 334). The low PCR success rate and poor PCR repeatability of microsatellite alleles, due to allelic dropout and false alleles, make it difficult to obtain a reliable microsatellite data set. However, these difficulties may be overcome by new techniques.

  17. Enhanced conformational sampling using enveloping distribution sampling.

    Science.gov (United States)

    Lin, Zhixiong; van Gunsteren, Wilfred F

    2013-10-14

    To lessen the problem of insufficient conformational sampling in biomolecular simulations is still a major challenge in computational biochemistry. In this article, an application of the method of enveloping distribution sampling (EDS) is proposed that addresses this challenge and its sampling efficiency is demonstrated in simulations of a hexa-β-peptide whose conformational equilibrium encompasses two different helical folds, i.e., a right-handed 2.7(10∕12)-helix and a left-handed 3(14)-helix, separated by a high energy barrier. Standard MD simulations of this peptide using the GROMOS 53A6 force field did not reach convergence of the free enthalpy difference between the two helices even after 500 ns of simulation time. The use of soft-core non-bonded interactions in the centre of the peptide did enhance the number of transitions between the helices, but at the same time led to neglect of relevant helical configurations. In the simulations of a two-state EDS reference Hamiltonian that envelops both the physical peptide and the soft-core peptide, sampling of the conformational space of the physical peptide ensures that physically relevant conformations can be visited, and sampling of the conformational space of the soft-core peptide helps to enhance the transitions between the two helices. The EDS simulations sampled many more transitions between the two helices and showed much faster convergence of the relative free enthalpy of the two helices compared with the standard MD simulations with only a slightly larger computational effort to determine optimized EDS parameters. Combined with various methods to smoothen the potential energy surface, the proposed EDS application will be a powerful technique to enhance the sampling efficiency in biomolecular simulations.
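The EDS reference Hamiltonian that "envelops" the end states is typically written as V_R = -(1/(βs)) ln Σ_i exp(-βs(V_i - E_i^R)), with smoothness parameter s and per-state energy offsets E_i^R. The toy calculation below uses illustrative β, s, energies and offsets; it shows the key property that the envelope lies at or below the lower-energy state, which is what lets both folds be sampled:

```python
import math

# Minimal EDS reference-energy sketch for two end states.
# beta = 1/kT; s and the offsets dE are EDS parameters (values illustrative).
def eds_reference_energy(energies, offsets, beta, s):
    terms = [math.exp(-beta * s * (e - de)) for e, de in zip(energies, offsets)]
    return -(1.0 / (beta * s)) * math.log(sum(terms))

E1, E2 = -120.0, -95.0  # instantaneous end-state energies (illustrative)
e_ref = eds_reference_energy([E1, E2], [0.0, 0.0], beta=1.0, s=1.0)
print(round(e_ref, 3))        # -120.0, dominated by the lower-energy state
print(e_ref <= min(E1, E2))   # True: the envelope never lies above either state
```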

  18. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which indicates the signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose an algorithm which generates random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed.
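A minimal example of a constrained random sampling pattern generator: random points on an integer time grid with a minimum spacing between consecutive samples, a typical ADC-imposed constraint. The constraint form and parameters are invented for illustration and are not the paper's algorithm:

```python
import random

def constrained_pattern(n_points, grid_size, min_gap, seed=1):
    """Random sampling pattern on an integer time grid in which consecutive
    sampling points are at least `min_gap` grid periods apart."""
    rng = random.Random(seed)
    pattern, t = [], 0
    while len(pattern) < n_points:
        t += min_gap + rng.randrange(0, 2 * min_gap)  # jittered increment >= min_gap
        if t >= grid_size:
            raise ValueError("grid too short for the requested pattern")
        pattern.append(t)
    return pattern

p = constrained_pattern(n_points=20, grid_size=2000, min_gap=8)
gaps = [b - a for a, b in zip(p, p[1:])]
print(len(p), min(gaps) >= 8)  # 20 True
```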

  19. Optimal sampling theory and population modelling - Application to determination of the influence of the microgravity environment on drug distribution and elimination

    Science.gov (United States)

    Drusano, George L.

    1991-01-01

    The optimal sampling theory is evaluated in applications to studies related to the distribution and elimination of several drugs (including ceftazidime, piperacillin, and ciprofloxacin), using the SAMPLE module of the ADAPT II package of programs developed by D'Argenio and Schumitzky (1979, 1988) and comparing the pharmacokinetic parameter values with results obtained by traditional ten-sample design. The impact of the use of optimal sampling was demonstrated in conjunction with NONMEM (Sheiner et al., 1977) approach, in which the population is taken as the unit of analysis, allowing even fragmentary patient data sets to contribute to population parameter estimates. It is shown that this technique is applicable in both the single-dose and the multiple-dose environments. The ability to study real patients made it possible to show that there was a bimodal distribution in ciprofloxacin nonrenal clearance.

  20. [A comparison of convenience sampling and purposive sampling].

    Science.gov (United States)

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, the opportunity to participate is not equal for all qualified individuals in the target population, and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on the study purpose, with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable, and sample size is determined by data saturation, not by statistical power analysis.

  1. Uranium tailings sampling manual

    International Nuclear Information System (INIS)

    Feenstra, S.; Reades, D.W.; Cherry, J.A.; Chambers, D.B.; Case, G.G.; Ibbotson, B.G.

    1985-01-01

The purpose of this manual is to describe the requisite sampling procedures for the application of uniform high-quality standards to detailed geotechnical, hydrogeological, geochemical and air quality measurements at Canadian uranium tailings disposal sites. The selection and implementation of applicable sampling procedures for such measurements at uranium tailings disposal sites are complicated by two primary factors. Firstly, the physical and chemical nature of uranium mine tailings and effluent is considerably different from natural soil materials and natural waters. Consequently, many conventional methods for the collection and analysis of natural soils and waters are not directly applicable to tailings. Secondly, there is a wide range in the physical and chemical nature of uranium tailings. The composition of the ore, the milling process, the nature of tailings deposition, and effluent treatment vary considerably and are highly site-specific. Therefore, the definition and implementation of sampling programs for uranium tailings disposal sites require considerable evaluation, and often innovation, to ensure that appropriate sampling and analysis methods are used which provide the flexibility to take into account site-specific considerations. The following chapters describe the objective and scope of a sampling program, preliminary data collection, and the procedures for sampling of tailings solids, surface water and seepage, tailings pore-water, and wind-blown dust and radon.

  2. Application of the neutron activation analysis method to the multielemental determination of food samples

    International Nuclear Information System (INIS)

    Maihara, V.A.

    1985-01-01

The thermal neutron activation analysis method was applied to the determination of elements present at low concentrations and trace levels in samples of bread and milk powder. The non-destructive analyses were based on gamma-ray spectrometric measurements of samples and standards irradiated for periods varying from a few minutes to eight hours in a thermal neutron flux of about 10^12 n cm^-2 s^-1. The concentrations obtained for milk powder were compared with data obtained by other authors from different countries. For bread, this comparison was not possible because no data on trace analysis in bread samples were found. In addition, the results obtained for the various brands of bread and milk by means of non-destructive and destructive analyses were compared using Student's t criterion. Some basic considerations about the 'detection limit' are presented, mainly in relation to its application in the technique used in the present work. The detection and determination limits of the trace elements analysed by destructive and non-destructive techniques in bread and milk powder samples were determined using the Currie and Girardi methods. The precision of the analyses and the results obtained for the detection limits of the analysed trace elements are discussed. (Author) [pt

  3. Electromagnetic shower development and applications to sampling calorimeters

    International Nuclear Information System (INIS)

    Prescott, C.Y.

    1984-07-01

    The application of electromagnetic theory to particle interactions is an old subject which represented one of the early successes in the study of particle interactions and fundamental forces. The ability to describe properties of electron, positron, and photon interactions has led to applications in numerous experimental devices used in high energy experiments. The subject is now considered to be relatively mature, but applications continue to evolve as new ideas are tried and new techniques become available. This report is a review of the underlying processes, a discussion of the application to electromagnetic calorimetry, discussions of some scaling laws and approximations that serve to guide designs of experimental devices, and examples where these principles are put to work. 13 references, 10 figures, 2 tables

  5. On sample size of the Kruskal-Wallis test with application to a mouse peritoneal cavity study.

    Science.gov (United States)

    Fan, Chunpeng; Zhang, Donghui; Zhang, Cun-Hui

    2011-03-01

    As the nonparametric generalization of the one-way analysis of variance model, the Kruskal-Wallis test applies when the goal is to test the difference between multiple samples and the underlying population distributions are nonnormal or unknown. Although the Kruskal-Wallis test has been widely used for data analysis, power and sample size methods for this test have been investigated to a much lesser extent. This article proposes new power and sample size calculation methods for the Kruskal-Wallis test based on the pilot study in either a completely nonparametric model or a semiparametric location model. No assumption is made on the shape of the underlying population distributions. Simulation results show that, in terms of sample size calculation for the Kruskal-Wallis test, the proposed methods are more reliable and preferable to some more traditional methods. A mouse peritoneal cavity study is used to demonstrate the application of the methods. © 2010, The International Biometric Society.
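The H statistic underlying the test can be sketched in a few lines of pure Python (this illustrates the Kruskal-Wallis statistic itself, not the paper's power and sample size methods; the data below are invented for illustration):

```python
# Kruskal-Wallis H statistic computed from pooled mid-ranks.
# Illustrative only -- the data below are made up, not from the study.

def kruskal_wallis_h(*groups):
    pooled = sorted(x for g in groups for x in g)
    n_total = len(pooled)
    # Assign each distinct value its average (mid) rank, 1-based.
    ranks, i = {}, 0
    while i < n_total:
        j = i
        while j < n_total and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2.0
        i = j
    # H = 12 / (N (N + 1)) * sum(R_i^2 / n_i) - 3 (N + 1)
    rank_term = sum(sum(ranks[x] for x in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n_total * (n_total + 1)) * rank_term - 3 * (n_total + 1)

h = kruskal_wallis_h([2.1, 2.5, 3.0], [3.5, 3.9, 4.1], [2.9, 3.1, 3.3])
print(round(h, 3))  # 6.489
```

Under the null hypothesis, H is approximately chi-squared distributed with (number of groups - 1) degrees of freedom, which is how a p-value would be obtained.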

  6. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

Classical Respondent-Driven Sampling (RDS) estimators are based on a Markov Process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.

  7. Quantitative portable gamma spectroscopy sample analysis for non-standard sample geometries

    International Nuclear Information System (INIS)

    Enghauser, M.W.; Ebara, S.B.

    1997-01-01

Utilizing a portable spectroscopy system, a quantitative method for analysis of samples containing a mixture of fission and activation products in nonstandard geometries was developed. The method can be used with various sample and shielding configurations where analysis on a laboratory-based gamma spectroscopy system is impractical. The portable gamma spectroscopy method involves calibration of the detector and modeling of the sample and shielding to identify and quantify the radionuclides present in the sample. The method utilizes the intrinsic efficiency of the detector and the unattenuated gamma fluence rate at the detector surface per unit activity from the sample to calculate the nuclide activity and Minimum Detectable Activity (MDA). For a complex geometry, a computer code written for shielding applications (MICROSHIELD) is utilized to determine the unattenuated gamma fluence rate per unit activity at the detector surface. Lastly, the method is only applicable to nuclides which emit gamma rays and cannot be used for pure beta emitters. In addition, if sample self-absorption and shielding are significant, the attenuation will result in high MDAs for nuclides which solely emit low-energy gamma rays. The following presents the analysis technique along with verification results demonstrating the accuracy of the method.
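For reference, the MDA estimate that analyses like this typically report can be sketched as follows. The abstract does not give its exact formula; the expression below is the widely used Currie-style approximation MDA = (2.71 + 4.65·√B) / (ε·y·t), with B background counts, ε detection efficiency, y gamma yield per decay, and t count time. All numbers are illustrative assumptions:

```python
import math

# Currie-style Minimum Detectable Activity (Bq) -- hypothetical inputs.
def mda_bq(background_counts, efficiency, gamma_yield, live_time_s):
    # Counts needed for detection at the usual 95% confidence levels.
    detectable_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    # Convert counts to activity via efficiency, yield, and count time.
    return detectable_counts / (efficiency * gamma_yield * live_time_s)

print(round(mda_bq(background_counts=400, efficiency=0.02,
                   gamma_yield=0.85, live_time_s=600), 2))  # 9.38
```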

  8. Application of bootstrap sampling in gamma-ray astronomy: Time variability in pulsed emission from crab pulsar

    International Nuclear Information System (INIS)

    Ozel, M.E.; Mayer-Hasselwander, H.

    1985-01-01

This paper discusses the bootstrap scheme, which fits many astronomical applications well. It is based on the well-known sampling plan called "sampling with replacement". Digital computers make the method very practical for the investigation of various trends present in a limited set of data, which is usually a small fraction of the total population. The authors apply the method and demonstrate its feasibility. The study indicates that the discrete nature of high-energy gamma-ray data makes the bootstrap method especially attractive for gamma-ray astronomy. The present analysis shows that the ratio of pulse strengths is variable at the 99.8% confidence level.
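The core of the scheme is easy to demonstrate: resample the data "with replacement" many times and read a confidence interval off the resampled statistics. A minimal sketch with invented count data (not the Crab pulsar data):

```python
from statistics import mean
import random

def bootstrap_ci(data, stat, n_resamples=2000, alpha=0.05, seed=42):
    """Percentile bootstrap CI: resample with replacement, sort the statistics."""
    rng = random.Random(seed)
    estimates = sorted(
        stat([rng.choice(data) for _ in range(len(data))])
        for _ in range(n_resamples)
    )
    return (estimates[int(alpha / 2 * n_resamples)],
            estimates[int((1 - alpha / 2) * n_resamples) - 1])

counts = [12, 15, 9, 20, 14, 11, 17, 13, 16, 10]  # hypothetical pulse counts
lo, hi = bootstrap_ci(counts, mean)
print(f"95% CI for the mean: [{lo:.1f}, {hi:.1f}]")
```

Because each resample is the same size as the original data but drawn with replacement, roughly a third of the observations are omitted from any given resample, which is what generates the spread of the statistic.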

  9. Network and adaptive sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Combining the two statistical techniques of network sampling and adaptive sampling, this book illustrates the advantages of using them in tandem to effectively capture sparsely located elements in unknown pockets. It shows how network sampling is a reliable guide in capturing inaccessible entities through linked auxiliaries. The text also explores how adaptive sampling is strengthened in information content through subsidiary sampling with devices to mitigate unmanageable expanding sample sizes. Empirical data illustrates the applicability of both methods.

  10. Environmental sampling

    International Nuclear Information System (INIS)

    Puckett, J.M.

    1998-01-01

Environmental Sampling (ES) is a technology option that can have application in transparency in nuclear nonproliferation. The basic process is to take a sample from the environment, e.g., soil, water, vegetation, or dust and debris from a surface, and through very careful sample preparation and analysis, determine the types, elemental concentration, and isotopic composition of actinides in the sample. The sample is prepared and the analysis performed in a clean chemistry laboratory (CCL). This ES capability is part of the IAEA Strengthened Safeguards System. Such a laboratory is planned to be built by JAERI at Tokai and will give Japan an intrinsic ES capability. This paper presents options for the use of ES as a transparency measure for nuclear nonproliferation.

  11. Estimation of effective temperatures in quantum annealers for sampling applications: A case study with possible applications in deep learning

    Science.gov (United States)

    Benedetti, Marcello; Realpe-Gómez, John; Biswas, Rupak; Perdomo-Ortiz, Alejandro

    2016-08-01

An increase in the efficiency of sampling from Boltzmann distributions would have a significant impact on deep learning and other machine-learning applications. Recently, quantum annealers have been proposed as a potential candidate to speed up this task, but several limitations still bar these state-of-the-art technologies from being used effectively. One of the main limitations is that, while the device may indeed sample from a Boltzmann-like distribution, quantum dynamical arguments suggest it will do so with an instance-dependent effective temperature, different from its physical temperature. Unless this unknown temperature can be unveiled, it might not be possible to effectively use a quantum annealer for Boltzmann sampling. In this work, we propose a strategy to overcome this challenge with a simple effective-temperature estimation algorithm. We provide a systematic study assessing the impact of the effective temperatures in the learning of a special class of a restricted Boltzmann machine embedded on quantum hardware, which can serve as a building block for deep-learning architectures. We also provide a comparison to k-step contrastive divergence (CD-k) with k up to 100. Although assuming a suitable fixed effective temperature also allows us to outperform one-step contrastive divergence (CD-1), only when using an instance-dependent effective temperature do we find a performance close to that of CD-100 for the case studied here.
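The role of the effective temperature is easy to see in miniature: a Boltzmann distribution over the same energies flattens as T grows, so samples drawn at an unknown effective temperature follow a different distribution than the one assumed during training. A toy sketch with invented energies:

```python
import math

def boltzmann(energies, temperature):
    """Normalized Boltzmann probabilities p(s) proportional to exp(-E(s) / T)."""
    weights = [math.exp(-e / temperature) for e in energies]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

energies = [0.0, 0.5, 1.0]  # illustrative state energies
p_cold = boltzmann(energies, 1.0)
p_hot = boltzmann(energies, 2.0)  # flatter: less mass on the ground state
print([round(p, 3) for p in p_cold])
print([round(p, 3) for p in p_hot])
```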

  12. Developing Water Sampling Standards

    Science.gov (United States)

    Environmental Science and Technology, 1974

    1974-01-01

Participants in the D-19 symposium on aquatic sampling and measurement for water pollution assessment were informed that determining the extent of waste water stream pollution is not a cut-and-dried procedure. Topics discussed include field sampling, representative sampling from storm sewers, suggested sampler features, and application of improved…

  13. Application of the spectrometric and radiochemical techniques in analyzing environmental samples from the Bulgarian Black Sea region

    International Nuclear Information System (INIS)

    Veleva, B.S.; Mungov, G.; Galabov, N.; Kolarova, M.; Guenchev, T.

    1999-01-01

Development of appropriate methods and techniques for marine and atmospheric radioactivity measurements in the NIMH-BAS during the last 5 years is presented. Approaches for pre-concentration of radionuclides from atmospheric and sea water samples, followed by reliable radiochemical methods for radionuclide separation and low-level counting, are discussed. Dissolved radiocesium concentrations measured since 1993 show some decrease over the years, and spatial variations probably due to the hydrophysical features of the sampling sites - the lower concentrations measured during 1995 and 1998 correspond to the lower salinity. Application of the radiochemical separation of plutonium, thorium, 90Sr and americium to samples from the Bulgarian Black Sea coastal region is reported. (author)

  14. In vitro evaluation of new biocompatible coatings for solid-phase microextraction: implications for drug analysis and in vivo sampling applications.

    Science.gov (United States)

    Vuckovic, Dajana; Shirey, Robert; Chen, Yong; Sidisky, Len; Aurand, Craig; Stenerson, Katherine; Pawliszyn, Janusz

    2009-04-13

A new line of solid-phase microextraction (SPME) coatings suitable for use with liquid chromatography applications was recently developed to address the limitations of the currently available coatings. The proposed coatings were immobilized on the metal fiber core and consisted of a mixture of a proprietary biocompatible binder and various types of coated silica (octadecyl, polar embedded and cyano) particles. The aim of this research was to perform an in vitro assessment of these new SPME fibers in order to evaluate their suitability for drug analysis and in vivo SPME applications. The main parameters examined were extraction efficiency, solvent resistance, preconditioning, dependence of extraction kinetics on coating thickness, carryover, linear range and inter-fiber reproducibility. The performance of the proposed coatings was compared against the commercial Carbowax-TPR (CW-TPR) coating, when applicable. The fibers were evaluated for the extraction of drugs of different classes (carbamazepine, propranolol, pseudoephedrine, ranitidine and diazepam) from plasma and urine. The analyses were performed using liquid chromatography-tandem mass spectrometry. The results show that the fibers perform very well for the extraction of biological fluids with no sample pre-treatment required and can also be used for in vivo sampling of flowing blood. A coating thickness of 45 μm was found to be a good compromise between extraction capacity and extraction kinetics. Due to the high extraction efficiency of these coatings, pre-equilibrium SPME with very short extraction times (2 min) can be employed to increase sample throughput. Inter-fiber reproducibility for the drugs examined in plasma was a significant improvement over the polypyrrole coatings reported in the literature, and permits single-fiber use for in vivo applications.

  15. An international comparison of the Ohio department of aging-resident satisfaction survey: applicability in a U.S. and Canadian sample.

    Science.gov (United States)

    Cooke, Heather A; Yamashita, Takashi; Brown, J Scott; Straker, Jane K; Baiton Wilkinson, Susan

    2013-12-01

    The majority of resident satisfaction surveys available for use in assisted living settings have been developed in the United States; however, empirical assessment of their measurement properties remains limited and sporadic, as does knowledge regarding their applicability for use in settings outside of the United States. This study further examines the psychometric properties of the Ohio Department of Aging-Resident Satisfaction Survey (ODA-RSS) and explores its applicability within a sample of Canadian assisted living facilities. Data were collected from 9,739 residential care facility (RCF) residents in Ohio, United States and 938 assisted-living residents in British Columbia, Canada. Confirmatory factor analysis was used to assess the instrument's psychometric properties within the 2 samples. Although the ODA-RSS appears well suited for assessing resident satisfaction in Ohio RCFs, it is less so in British Columbia assisted living settings. Adequate reliability and validity were observed for all 8 measurable instrument domains in the Ohio sample, but only 4 (Care and Services, Employee Relations, Employee Responsiveness, and Communications) in the British Columbia sample. The ODA-RSS performs best in an environment that encompasses a wide range of RCF types. In settings where greater uniformity and standardization exist, more nuanced questions may be required to detect variation between facilities. It is not sufficient to assume that rigorous development and empirical testing of a tool ensures its applicability in states or countries other than that in which it was initially developed.

  16. Spectral BRDF measurements of metallic samples for laser processing applications

    International Nuclear Information System (INIS)

    Vitali, L; Fustinoni, D; Gramazio, P; Niro, A

    2015-01-01

    The spectral bidirectional reflectance distribution function (BRDF) of metals plays an important role in industrial processing involving laser-surface interaction. In particular, in laser metal machining, absorbance is strongly dependent on the radiation incidence angle as well as on finishing and contamination grade of the surface, and in turn it can considerably affect processing results. Very recently, laser radiation is also used to structure metallic surfaces, in order to produce many particular optical effects, ranging from a high level polishing to angular color shifting. Of course, full knowledge of the spectral BRDF of these structured layers makes it possible to infer reflectance or color for any irradiation and viewing angles. In this paper, we present Vis-NIR spectral BRDF measurements of laser-polished metallic, opaque, flat samples commonly employed in such applications. The resulting optical properties seem to be dependent on the atmospheric composition during the polishing process in addition to the roughness. The measurements are carried out with a Perkin Elmer Lambda 950 double-beam spectrophotometer, equipped with the Absolute Reflectance/Transmittance Analyzer (ARTA) motorized goniometer. (paper)

  17. Application of a free parameter model to plastic scintillation samples

    Energy Technology Data Exchange (ETDEWEB)

    Tarancon Sanz, Alex, E-mail: alex.tarancon@ub.edu [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain); Kossert, Karsten, E-mail: Karsten.Kossert@ptb.de [Physikalisch-Technische Bundesanstalt (PTB), Bundesallee 100, 38116 Braunschweig (Germany)

    2011-08-21

In liquid scintillation (LS) counting, the CIEMAT/NIST efficiency tracing method and the triple-to-double coincidence ratio (TDCR) method have proved their worth for reliable activity measurements of a number of radionuclides. In this paper, an extended approach to apply a free-parameter model to samples containing a mixture of solid plastic scintillation microspheres and radioactive aqueous solutions is presented. Several beta-emitting radionuclides were measured in a TDCR system at PTB. For the application of the free parameter model, the energy loss in the aqueous phase must be taken into account, since this portion of the particle energy does not contribute to the creation of scintillation light. The energy deposit in the aqueous phase is determined by means of Monte Carlo calculations applying the PENELOPE software package. To this end, great efforts were made to model the geometry of the samples. Finally, a new geometry parameter was defined, which was determined by means of a tracer radionuclide with known activity. This makes the analysis of experimental TDCR data of other radionuclides possible. The deviations between the determined activity concentrations and reference values were found to be lower than 3%. The outcome of this research work is also important for a better understanding of liquid scintillation counting. In particular, the influence of (inverse) micelles, i.e. the aqueous spaces embedded in the organic scintillation cocktail, can be investigated. The new approach makes clear that it is important to take the energy loss in the aqueous phase into account. In particular for radionuclides emitting low-energy electrons (e.g. M-Auger electrons from 125I), this effect can be very important.

  18. High speed network sampling

    OpenAIRE

    Rindalsholt, Ole Arild

    2005-01-01

Master's thesis in network and system administration. Classical sampling methods play an important role in the current practice of Internet measurement. With today's high-speed networks, routers cannot manage to generate complete Netflow data for every packet; they have to perform restricted sampling. This thesis summarizes some of the most important sampling schemes and their applications before diving into an analysis of the effect of sampling Netflow records.

  19. Determination of the detection limit and decision threshold for ionizing radiation measurements. Part 2: Fundamentals and application to counting measurements with the influence of sample treatment

    International Nuclear Information System (INIS)

    2000-01-01

This part of ISO 11929 addresses the field of ionizing radiation measurements in which events (in particular pulses) on samples are counted after treating them (e.g. aliquotation, solution, enrichment, separation). It considers, besides the random character of radioactive decay and of pulse counting, all other influences arising from sample treatment (e.g. weighing, enrichment, calibration or the instability of the test setup). ISO 11929 consists of the following parts, under the general title Determination of the detection limit and decision threshold for ionizing radiation measurements: Part 1: Fundamentals and application to counting measurements without the influence of sample treatment; Part 2: Fundamentals and application to counting measurements with the influence of sample treatment; Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment; Part 4: Fundamentals and application to measurements by use of linear scale analogue ratemeters, without the influence of sample treatment. This part of ISO 11929 was prepared in parallel with other International Standards prepared by WG 2 (now WG 17): ISO 11932:1996, Activity measurements of solid materials considered for recycling, re-use or disposal as non-radioactive waste, and ISO 11929-1, ISO 11929-3 and ISO 11929-4, and is consequently complementary to these documents.

  20. Sample preparation of environmental samples using benzene synthesis followed by high-performance LSC

    International Nuclear Information System (INIS)

    Filippis, S. De; Noakes, J.E.

    1991-01-01

Liquid scintillation counting (LSC) techniques have been widely employed as the detection method for determining environmental levels of tritium and 14C. Since anthropogenic and nonanthropogenic inputs to the environment are a concern, sampling the environment surrounding a nuclear power facility or fuel reprocessing operation requires the collection of many different sample types, including agricultural products, water, biota, aquatic life, soil, and vegetation. These sample types are not suitable for the direct detection of tritium or 14C by liquid scintillation techniques. Each sample type must first be prepared in order to obtain the carbon or hydrogen component of interest and present it in a chemical form that is compatible with common chemicals used in scintillation counting applications. Converting the sample of interest to chemically pure benzene as a sample preparation technique has been widely accepted for processing samples for radiocarbon age-dating applications. The synthesized benzene is composed of the carbon or hydrogen atoms from the original sample and is ideal as a solvent for LSC, with excellent photo-optical properties. Benzene synthesis followed by low-background scintillation counting can be applied to the preparation and measurement of environmental samples, yielding good detection sensitivities, high radionuclide counting efficiency, and shorter preparation time. The method of benzene synthesis provides a unique approach to the preparation of a wide variety of environmental sample types using similar chemistry for all samples.

  1. Choice of Sample Split in Out-of-Sample Forecast Evaluation

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Timmermann, Allan

Out-of-sample tests of forecast performance depend on how a given data set is split into estimation and evaluation periods, yet no guidance exists on how to choose the split point. Empirical forecast evaluation results can therefore be difficult to interpret, particularly when several values of the split point may have been considered, while conversely the power of forecast evaluation tests is strongest with long out-of-sample periods. To deal with size distortions, we propose a test statistic that is robust to the effect of considering multiple sample split points. Empirical applications to predictability of stock returns and inflation demonstrate that out-of-sample forecast evaluation results can critically depend on how the sample split is determined.
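The paper's starting observation is easy to reproduce in a toy setting: forecasting each out-of-sample value with the in-sample mean gives an MSE that shifts with the chosen split point. The series below is synthetic, invented for illustration:

```python
def oos_mse(series, split):
    """Out-of-sample MSE of the in-sample-mean forecast for a given split point."""
    mean_in = sum(series[:split]) / split
    out = series[split:]
    return sum((y - mean_in) ** 2 for y in out) / len(out)

series = [1.0, 2.0, 1.5, 3.0, 2.5, 4.0, 3.5, 5.0, 4.5, 6.0]
for split in (3, 4, 5):
    print(split, round(oos_mse(series, split), 3))
```

The printed MSE differs across the three splits even though the series and forecasting rule are unchanged, which is the interpretation problem the paper addresses.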

  2. Pesticide residues in individual versus composite samples of apples after fine or coarse spray quality application

    DEFF Research Database (Denmark)

    Poulsen, Mette E.; Wenneker, Marcel; Withagen, Jacques

    2012-01-01

In this study, field trials on fine and coarse spray quality application of pesticides on apples were performed. The main objectives were to study the variation of pesticide residue levels in individual fruits versus composite samples, and the effect of standard fine versus coarse spray quality application. None of the results for the pesticide residues measured in individual apples exceeded the EU Maximum Residue Levels (MRLs). However, there was a large variation in the residue levels in the apples, with levels from 0.01 to 1.4 mg kg−1 for captan, the pesticide with the highest variation, and from 0.01 to 0.2 mg kg−1 for pyraclostrobin, the pesticide with the lowest variation. Residues of fenoxycarb and indoxacarb were only found in a few apples, probably due to the early application time of these two compounds. The evaluation of the effect of spray quality did not show any major difference between fine and coarse spray quality application.

  3. "On-off" switchable tool for food sample preparation: merging molecularly imprinting technology with stimuli-responsive blocks. Current status, challenges and highlighted applications.

    Science.gov (United States)

    Garcia, Raquel; Gomes da Silva, Marco D R; Cabrita, Maria João

    2018-01-01

Sample preparation remains a great challenge in the analytical workflow, representing the most time-consuming and laborious step in analytical procedures. Ideally, sample pre-treatment procedures should be more selective, cheap, quick and environmentally friendly. Molecular imprinting technology is a powerful tool in the development of highly selective sample preparation methodologies, enabling preconcentration of analytes from a complex food matrix. Indeed, the design and development of molecularly imprinted polymer-based functional materials that merge enhanced selectivity with a controllable and switchable mode of action triggered by a specific stimulus constitutes a hot research topic in the field of food analysis. Thus, combining stimuli-responsive mechanisms with imprinting technology, a new generation of materials is emerging. The application of these smart materials in sample preparation is at an early stage of development; nevertheless, new improvements will give a new impetus to the demanding field of food sample preparation. This review presents the new trends in the advancement of food sample preparation using these smart materials and highlights the most relevant applications in this particular area of knowledge. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Application of lot quality assurance sampling for leprosy elimination monitoring--examination of some critical factors.

    Science.gov (United States)

    Gupte, M D; Murthy, B N; Mahmood, K; Meeralakshmi, S; Nagaraju, B; Prabhakaran, R

    2004-04-01

    The concept of elimination of an infectious disease differs from eradication and, in a way, from control as well. In disease elimination programmes the desired reduced level of prevalence is set as the target to be achieved within a practical time frame. Elimination can be considered at national or regional levels. Prevalence levels depend on the occurrence of new cases and thus may fluctuate. There are no ready pragmatic methods to monitor the progress of leprosy elimination programmes. We therefore tried to explore newer methods to answer these demands. With the lowering of prevalence of leprosy to the desired level of 1 case per 10000 population at the global level, the programme administrators' concern will shift to smaller areas, e.g. national and sub-national levels. For monitoring this situation, we earlier observed that lot quality assurance sampling (LQAS), a quality control tool in industry, was useful in initially high-endemic areas. However, critical factors such as the geographical distribution of cases and the adoption of a cluster sampling design instead of a simple random sampling design deserve attention before LQAS can be generally recommended. The present exercise was aimed at validating the applicability of LQAS, with these modifications, for monitoring leprosy elimination in Tamil Nadu state, which was highly endemic for leprosy. A representative sample of 64000 people drawn from eight districts of Tamil Nadu state, India, with a maximum allowable number of 25 cases, was considered, using LQAS methodology to test whether leprosy prevalence was at or below 7 per 10000 population. The expected number of cases for each district was obtained assuming a Poisson distribution. Goodness of fit between the observed and expected cases (closeness of the expected number of cases to those observed) was tested with a chi-squared test. The enhancing factor (design effect) for sample size was obtained by computing the intraclass correlation. The survey actually
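The LQAS decision rule described above can be sketched numerically. The figures below (n = 64000 people, decision limit of 25 cases, threshold prevalence 7 per 10000, Poisson-distributed counts) are taken from the abstract; the helper function is an illustrative stand-in, not the authors' code.

```python
from math import exp

def poisson_cdf(k, mu):
    """P(X <= k) for a Poisson(mu) variable, summed term by term."""
    term, total = exp(-mu), exp(-mu)
    for i in range(1, k + 1):
        term *= mu / i
        total += term
    return total

# Figures from the abstract: n = 64000 sampled, decision limit
# d = 25 cases, threshold prevalence 7 per 10000.
n, d, threshold = 64000, 25, 7 / 10000

# Expected cases if prevalence sits exactly at the threshold.
mu = n * threshold  # 44.8

# Probability the lot is (wrongly) accepted when prevalence equals
# the threshold, i.e. P(observed cases <= d).
p_accept = poisson_cdf(d, mu)
print(f"P(accept | prevalence = 7/10000) = {p_accept:.4f}")
```

With the decision limit well below the expected 44.8 cases, the rule almost never accepts an area whose prevalence is still at the threshold, which is the property LQAS exploits.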

  5. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
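The frequentist special case mentioned above, where all observed samples must be acceptable, reduces to a classic zero-failure sample-size calculation. A minimal sketch, assuming independent sampling from a large population (the paper's Bayesian two-group model itself is not reproduced here):

```python
import math

def zero_failure_sample_size(p, confidence):
    """Smallest n such that observing n acceptable items (zero failures)
    gives `confidence` that the true acceptable fraction is >= p,
    assuming independent draws from a large population.
    From p**n <= 1 - confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(p))

# 95% confidence that at least 99% of items are acceptable:
print(zero_failure_sample_size(p=0.99, confidence=0.95))  # 299
```

The familiar "n = 59" rule for 95% confidence that at least 95% of items are acceptable falls out of the same formula.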

  6. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    Science.gov (United States)

    Almutairy, Meznah; Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.
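The two sampling schemes compared in this record can be illustrated on a toy sequence. This is a hypothetical sketch of fixed sampling (keep every w-th k-mer position) and minimizer sampling (keep the lexicographically smallest k-mer in each window of w consecutive k-mers), not the authors' implementation:

```python
def fixed_sample(seq, k, w):
    """Fixed sampling: keep every w-th k-mer starting position."""
    return {i for i in range(0, len(seq) - k + 1, w)}

def minimizer_sample(seq, k, w):
    """Minimizer sampling: in every window of w consecutive k-mers,
    keep the position of the lexicographically smallest k-mer."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    picked = set()
    for start in range(len(kmers) - w + 1):
        window = range(start, start + w)
        picked.add(min(window, key=lambda i: kmers[i]))
    return picked

seq = "ACGTACGTGGTACCA"
fixed = fixed_sample(seq, k=4, w=3)
mins = minimizer_sample(seq, k=4, w=3)
print(sorted(fixed), sorted(mins))
```

Note the property the paper relies on: fixed sampling keeps a predictable 1-in-w fraction of positions, while minimizer sampling keeps a data-dependent (typically denser) set but lets queries be sampled the same way.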

  7. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    Directory of Open Access Journals (Sweden)

    Meznah Almutairy

    Full Text Available Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.

  8. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches

    Science.gov (United States)

    Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method. PMID:29389989

  9. Quantitative NMR measurements on core samples

    International Nuclear Information System (INIS)

    Olsen, Dan

    1997-01-01

    Within the frame of an EFP-95 project, NMR methods for porosity determination in 2D, and for fluid saturation determination in 1D and 2D, have been developed. The three methods have been developed and tested on cleaned core samples of chalk from the Danish North Sea. The main restriction on the use of the methods is the inherently short T2 relaxation constants of rock samples. Referring to measurements conducted at 200 MHz, the 2D porosity determination method is applicable to sample material with T2 relaxation constants down to 5 ms. The 1D fluid saturation determination method is applicable to sample material with T2 relaxation constants down to 3 ms, while the 2D fluid saturation determination method is applicable to material with T2 relaxation constants down to 8 ms. In the case of the 2D methods these constraints as a minimum enable work on the majority of chalk samples of Maastrichtian age. The 1D fluid saturation determination method is in addition applicable to at least some chalk samples of Danian and pre-Maastrichtian age. The spatial resolution of the 2D porosity determination method, the 1D fluid saturation methods, and the 2D fluid saturation method is respectively 0.8 mm, 0.8 mm and 2 mm. Reproducibility of pixel values is 2%-points for all three methods. (au)

  10. A dansyl based fluorescence chemosensor for Hg2+ and its application in the complicated environment samples

    Science.gov (United States)

    Zhou, Shuai; Zhou, Ze-Quan; Zhao, Xuan-Xuan; Xiao, Yu-Hao; Xi, Gang; Liu, Jin-Ting; Zhao, Bao-Xiang

    2015-09-01

    We have developed a novel fluorescent chemosensor (DAM) based on dansyl and morpholine units for the detection of mercury ion with excellent selectivity and sensitivity. In the presence of Hg2+ in a mixture solution of HEPES buffer (pH 7.5, 20 mM) and MeCN (2/8, v/v) at room temperature, the fluorescence of DAM was almost completely quenched from green to colorless with a fast response time. Moreover, DAM also showed excellent anti-interference capability even in the presence of a large amount of interfering ions. It is worth noting that DAM could be used to detect Hg2+ specifically in Yellow River samples, which significantly implies the potential applications of DAM in complicated environment samples.

  11. Cost-constrained optimal sampling for system identification in pharmacokinetics applications with population priors and nuisance parameters.

    Science.gov (United States)

    Sorzano, Carlos Oscars S; Pérez-De-La-Cruz Moreno, Maria Angeles; Burguet-Castell, Jordi; Montejo, Consuelo; Ros, Antonio Aguilar

    2015-06-01

    Pharmacokinetics (PK) applications can be seen as a special case of nonlinear, causal systems with memory. There are cases in which prior knowledge exists about the distribution of the system parameters in a population. However, for a specific patient in a clinical setting, we need to determine her system parameters so that the therapy can be personalized. This system identification is in many cases performed by measuring drug concentrations in plasma. The objective of this work is to provide an irregular sampling strategy that minimizes the uncertainty about the system parameters with a fixed number of samples (cost constrained). We use Monte Carlo simulations to estimate the average Fisher information matrix associated to the PK problem, and then estimate the sampling points that minimize the maximum uncertainty associated to the system parameters (a minimax criterion). The minimization is performed employing a genetic algorithm. We show that such a sampling scheme can be designed in a way that is adapted to a particular patient, that it can accommodate any dosing regimen, and that it allows flexible therapeutic strategies. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  12. Outpatient Tinnitus Clinic, Self-Help Web Platform, or Mobile Application to Recruit Tinnitus Study Samples?

    Directory of Open Access Journals (Sweden)

    Thomas Probst

    2017-04-01

    Full Text Available For understanding the heterogeneity of tinnitus, large samples are required. However, investigations on how samples recruited by different methods differ from each other are lacking. In the present study, three large samples, each recruited by different means, were compared: N = 5017 individuals registered at a self-help web platform for tinnitus (crowdsourcing platform Tinnitus Talk), N = 867 users of a smart mobile application for tinnitus (crowdsensing platform TrackYourTinnitus), and N = 3786 patients contacting an outpatient tinnitus clinic (Tinnitus Center of the University Hospital Regensburg). The three samples were compared regarding age, gender, and duration of tinnitus (months or years perceiving tinnitus; subjective report) using chi-squared tests. The three samples differed significantly from each other in age, gender and tinnitus duration (p < 0.05). Users of the TrackYourTinnitus crowdsensing platform were younger, users of the Tinnitus Talk crowdsourcing platform were more often female, and users of both newer technologies (crowdsourcing and crowdsensing) more frequently had acute/subacute tinnitus (<3 months and 4–6 months) as well as a very long tinnitus duration (>20 years). The implication of these findings for clinical research is that newer technologies such as crowdsourcing and crowdsensing platforms offer the possibility to reach individuals who are hard to contact through an outpatient tinnitus clinic. Depending on the aims and the inclusion/exclusion criteria of a given study, different recruiting strategies (clinic and/or newer technologies) offer different advantages and disadvantages. In general, the representativeness of study results might be increased when tinnitus study samples are recruited in the clinic as well as via crowdsourcing and crowdsensing.
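The chi-squared comparison used in this study can be sketched as follows. The row totals below match the three sample sizes in the abstract, but the female/male splits are hypothetical (the abstract reports only that the samples differ significantly), and the statistic is computed by hand rather than with a statistics package:

```python
def chi_squared(table):
    """Pearson chi-squared statistic for an r x c contingency table."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    total = sum(row)
    stat = 0.0
    for i, r in enumerate(table):
        for j, obs in enumerate(r):
            expected = row[i] * col[j] / total
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical (female, male) counts per recruitment route; only the
# row totals (5017, 867, 3786) come from the abstract.
table = [
    [2600, 2417],   # Tinnitus Talk (crowdsourcing)
    [340, 527],     # TrackYourTinnitus (crowdsensing)
    [1500, 2286],   # outpatient clinic
]
stat = chi_squared(table)
df = (len(table) - 1) * (len(table[0]) - 1)  # 2 degrees of freedom
print(f"chi2 = {stat:.1f} on {df} df")
```

Comparing the statistic against the chi-squared distribution with 2 degrees of freedom then gives the p-value reported as p < 0.05 in the abstract.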

  13. Genetic identification of Iberian rodent species using both mitochondrial and nuclear loci: application to noninvasive sampling.

    Science.gov (United States)

    Barbosa, S; Pauperio, J; Searle, J B; Alves, P C

    2013-01-01

    Species identification through noninvasive sampling is increasingly used in animal conservation genetics, given that it obviates the need to handle free-living individuals. Noninvasive sampling is particularly valuable for elusive and small species such as rodents. Although rodents are not usually assumed to be the most obvious target for conservation, of the 21 species or near-species present in Iberia, three are considered endangered and declining, while several others are poorly studied. Here, we develop a genetic tool for identifying all rodent species in Iberia by noninvasive genetic sampling. To achieve this purpose, we selected one mitochondrial gene [cytochrome b (cyt-b)] and one nuclear gene [interphotoreceptor retinoid-binding protein (IRBP)], which we first sequenced using tissue samples. Both genes allow for the phylogenetic distinction of all species except the sibling species Microtus lusitanicus and Microtus duodecimcostatus. Overall, cyt-b showed higher resolution than IRBP, revealing a clear barcoding gap. To allow these markers to be applied to noninvasive samples, we selected a short highly diagnostic fragment from each gene, which we used to obtain sequences from faeces and bones from owl pellets. Amplification success for the cyt-b and IRBP fragment was 85% and 43% in faecal and 88% and 64% in owl-pellet DNA extractions, respectively. The method allows the unambiguous identification of the great majority of Iberian rodent species from noninvasive samples, with application in studies of distribution, spatial ecology and population dynamics, and for conservation. © 2012 Blackwell Publishing Ltd.

  14. Sample presentation, sources of error and future perspectives on the application of vibrational spectroscopy in the wine industry.

    Science.gov (United States)

    Cozzolino, Daniel

    2015-03-30

    Vibrational spectroscopy encompasses a number of techniques and methods including ultra-violet, visible, Fourier transform infrared or mid infrared, near infrared and Raman spectroscopy. The use and application of spectroscopy generates spectra containing hundreds of variables (absorbances at each wavenumber or wavelength), resulting in the production of large data sets representing the chemical and biochemical wine fingerprint. Multivariate data analysis techniques are then required to handle the large amount of data generated, in order to interpret the spectra in a meaningful way and to develop a specific application. This paper focuses on developments in sample presentation and the main sources of error when vibrational spectroscopy methods are applied in wine analysis. Recent and novel applications will be discussed as examples of these developments. © 2014 Society of Chemical Industry.

  15. Field sampling and data analysis methods for development of ecological land classifications: an application on the Manistee National Forest.

    Science.gov (United States)

    George E. Host; Carl W. Ramm; Eunice A. Padley; Kurt S. Pregitzer; James B. Hart; David T. Cleland

    1992-01-01

    Presents technical documentation for development of an Ecological Classification System for the Manistee National Forest in northwest Lower Michigan, and suggests procedures applicable to other ecological land classification projects. Includes discussion of sampling design, field data collection, data summarization and analyses, development of classification units,...

  16. Development and Application of a Sample Holder for In Situ Gaseous TEM Studies of Membrane Electrode Assemblies for Polymer Electrolyte Fuel Cells.

    Science.gov (United States)

    Kamino, Takeo; Yaguchi, Toshie; Shimizu, Takahiro

    2017-10-01

    Polymer electrolyte fuel cells hold great potential for stationary and mobile applications due to high power density and low operating temperature. However, the structural changes during electrochemical reactions are not well understood. In this article, we detail the development of a sample holder equipped with gas injectors and electric conductors and its application to a membrane electrode assembly of a polymer electrolyte fuel cell. Hydrogen and oxygen gases were sprayed simultaneously on the surfaces of the anode and cathode catalysts of the membrane electrode assembly sample, respectively, and observation of the structural changes in the catalysts was carried out along with measurement of the generated voltages.

  17. A dansyl based fluorescence chemosensor for Hg(2+) and its application in the complicated environment samples.

    Science.gov (United States)

    Zhou, Shuai; Zhou, Ze-Quan; Zhao, Xuan-Xuan; Xiao, Yu-Hao; Xi, Gang; Liu, Jin-Ting; Zhao, Bao-Xiang

    2015-09-05

    We have developed a novel fluorescent chemosensor (DAM) based on dansyl and morpholine units for the detection of mercury ion with excellent selectivity and sensitivity. In the presence of Hg(2+) in a mixture solution of HEPES buffer (pH 7.5, 20 mM) and MeCN (2/8, v/v) at room temperature, the fluorescence of DAM was almost completely quenched from green to colorless with a fast response time. Moreover, DAM also showed excellent anti-interference capability even in the presence of a large amount of interfering ions. It is worth noting that DAM could be used to detect Hg(2+) specifically in Yellow River samples, which significantly implies the potential applications of DAM in complicated environment samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Measurement of radioactivity in the environment - Soil - Part 2: Guidance for the selection of the sampling strategy, sampling and pre-treatment of samples

    International Nuclear Information System (INIS)

    2007-01-01

    This part of ISO 18589 specifies the general requirements, based on ISO 11074 and ISO/IEC 17025, for all steps in the planning (desk study and area reconnaissance) of the sampling and the preparation of samples for testing. It includes the selection of the sampling strategy, the outline of the sampling plan, the presentation of general sampling methods and equipment, as well as the methodology of the pre-treatment of samples adapted to the measurements of the activity of radionuclides in soil. This part of ISO 18589 is addressed to the people responsible for determining the radioactivity present in soil for the purpose of radiation protection. It is applicable to soil from gardens, farmland, urban or industrial sites, as well as soil not affected by human activities. This part of ISO 18589 is applicable to all laboratories regardless of the number of personnel or the range of the testing performed. When a laboratory does not undertake one or more of the activities covered by this part of ISO 18589, such as planning, sampling or testing, the corresponding requirements do not apply. Information is provided on scope, normative references, terms and definitions and symbols, principle, sampling strategy, sampling plan, sampling process, pre-treatment of samples and recorded information. Five annexes provide information on the selection of the sampling strategy according to the objectives and the radiological characterization of the site and sampling areas, a diagram of the evolution of sample characteristics from the sampling site to the laboratory, an example of a sampling plan for a site divided into three sampling areas, an example of a sampling record for a single/composite sample, and an example of a sample record for a soil profile with soil description. A bibliography is provided

  19. Experiences from Refurbishment of Metallography Hot Cells and Application of a New Preparation Concept for Materialography Samples

    International Nuclear Information System (INIS)

    Oberlander, B. C.; Espeland, M.; Solum, N. O.

    2001-01-01

    After more than 30 years of operation, the lead-shielded metallography hot cells needed a basic renewal and modernisation, not least of the specimen preparation equipment. Preparation in hot cells of radioactive samples for metallography and ceramography is challenging and time consuming. It demands a special design and quality of all in-cell equipment, and skill and patience from the operator. Essentials in the preparation process are simplicity and reliability of the machines, and good quality, reproducibility and efficiency in performance. Desirable are process automation, flexibility and an ALARA amount of radioactive waste produced per sample prepared. State-of-the-art preparation equipment for materialography seems to meet most of the demands; however, it cannot be used in hot cells without modifications. Therefore, IFE and Struers in Copenhagen modified a standard model of a Struers precision cutting machine and a microprocessor-controlled grinding and polishing machine for hot cell application. Hot cell utilisation of the microcomputer-controlled grinding and polishing machine and the existing automatic dosing equipment made the task of preparing radioactive samples more attractive. The new grinding and polishing system for hot cells provides good sample preparation quality and reproducibility at reduced preparation time and with a reduced amount of contaminated waste produced per sample prepared. The sample materials examined were irradiated cladding materials and fuels

  20. Optimized preparation of urine samples for two-dimensional electrophoresis and initial application to patient samples

    DEFF Research Database (Denmark)

    Lafitte, Daniel; Dussol, Bertrand; Andersen, Søren

    2002-01-01

    OBJECTIVE: We optimized the preparation of urinary samples to obtain a comprehensive map of the urinary proteins of healthy subjects, and then compared this map with those obtained from patient samples to show that the pattern was specific to their kidney disease. DESIGN AND METHODS: The urinary

  1. Development of a versatile sample preparation method and its application for rare-earth pattern and Nd isotope ratio analysis in nuclear forensics

    International Nuclear Information System (INIS)

    Krajko, J.

    2015-01-01

    An improved sample preparation procedure for trace levels of lanthanides in uranium-bearing samples was developed. The method involves a simple co-precipitation using an Fe(III) carrier in ammonium carbonate medium to remove the uranium matrix. The procedure is an effective initial pre-concentration step for the subsequent extraction chromatographic separations. The applicability of the method was demonstrated by the measurement of the REE pattern and the 143Nd/144Nd isotope ratio in uranium ore concentrate samples. (author)

  2. Sorption models and their application in environmental samples

    International Nuclear Information System (INIS)

    Kamel, Nariman H.M.

    2008-01-01

    Full text: Naturally occurring radioactive materials (NORM) are found in some environmental soils at levels not high enough to pose problems for human health. Health may, however, be affected by elevated NORM levels in some environmental soils. Four soil samples were obtained from certain coastal regions in Egypt. Naturally occurring radionuclides of the uranium (238U) series, the thorium (232Th) series and the radioactive isotope of potassium (40K) were measured. The soil samples were selected from situations where the radionuclide concentrations are significantly higher than the average level of other sites. They were chemically analyzed for uranium, silicon, aluminum and iron. The cation exchange capacity (CEC) was determined; it was found to be lower in the presence of Fe-silicates, suggesting that Fe-hydroxide had precipitated at the exchangeable edge sites of the clay minerals. The pH of the solid particles at which the net total surface charge is zero is known as the point of zero charge (PZC). The PZC is very important in determining the affinity of the soil samples for different cations and anions. The aim of this work is to determine the natural radiological hazards of radionuclides in four environmental coastal soil samples in Egypt. The point of zero surface charge was determined using titration tests. A sorption model was developed for this purpose. (author)
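Determining the point of zero charge from titration tests, as done in this record, amounts to locating the pH at which the net surface charge crosses zero. A minimal sketch with hypothetical titration data:

```python
def point_of_zero_charge(ph, net_charge):
    """Linearly interpolate the pH at which the net surface charge
    crosses zero, given titration data sorted by pH."""
    points = list(zip(ph, net_charge))
    for (p1, q1), (p2, q2) in zip(points, points[1:]):
        if q1 == 0:
            return p1
        if q1 * q2 < 0:  # sign change between the two points
            return p1 + (p2 - p1) * q1 / (q1 - q2)
    raise ValueError("no zero crossing in the titration data")

# Hypothetical titration data: pH and net surface charge (mmol/kg).
ph = [3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
q = [12.5, 8.1, 3.2, -1.4, -5.0, -8.8]
print(f"PZC ~ pH {point_of_zero_charge(ph, q):.2f}")
```

Below the PZC the surface is net positively charged and preferentially sorbs anions; above it, cations, which is why the abstract calls the PZC decisive for the soils' affinity for different ions.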

  3. Sampling and estimating recreational use.

    Science.gov (United States)

    Timothy G. Gregoire; Gregory J. Buhyoff

    1999-01-01

    Probability sampling methods applicable to estimate recreational use are presented. Both single- and multiple-access recreation sites are considered. One- and two-stage sampling methods are presented. Estimation of recreational use is presented in a series of examples.

  4. A simplified method to recover urinary vesicles for clinical applications, and sample banking.

    Science.gov (United States)

    Musante, Luca; Tataruch, Dorota; Gu, Dongfeng; Benito-Martin, Alberto; Calzaferri, Giulio; Aherne, Sinead; Holthofer, Harry

    2014-12-23

    Urinary extracellular vesicles provide a novel source of valuable biomarkers for kidney and urogenital diseases. Current isolation protocols include laborious, sequential centrifugation steps, which hampers their widespread research and clinical use. Furthermore, when large individual urine sample volumes or sizable target cohorts are to be processed (e.g. for biobanking), storage capacity is an additional problem. Thus, alternative methods are necessary to overcome such limitations. We have developed a practical vesicle isolation technique that yields easily manageable sample volumes in an exceptionally cost-efficient way, to facilitate their full utilization in less privileged environments and maximize the benefit of biobanking. Urinary vesicles were isolated by hydrostatic dialysis with minimal interference from soluble proteins or vesicle loss. Large volumes of urine were concentrated up to 1/100 of the original volume, and the dialysis step allowed equalization of urine physico-chemical characteristics. Vesicle fractions were found suitable for all applications, including RNA analysis. In yield, our hydrostatic filtration dialysis system outperforms conventional ultracentrifugation-based methods, and the labour-intensive and potentially hazardous ultracentrifugation steps are eliminated. Likewise, the need for trained laboratory personnel and heavy initial investment is avoided. Thus, our method qualifies as a method for laboratories working with urinary vesicles and for biobanking.

  5. Application of Compton suppression spectrometry in the improvement of nuclear analytical techniques for biological samples

    International Nuclear Information System (INIS)

    Ahmed, Y. A.; Ewa, I.O.B.; Funtua, I.I.; Jonah, S.A.; Landsberger, S.

    2007-01-01

    Compton Suppression Factors (SF) and Compton Reduction Factors (RF) of the UT Austin Compton suppression spectrometer, parameters characterizing the system performance, were measured using 137Cs and 60Co point sources. The system performance was evaluated as a function of energy and geometry. The (P/C), A(P/C), (P/T), Cp, and Ce were obtained for each of the parameters. The natural background reduction factor in the anticoincidence mode and that of the normal mode were calculated and their effect on the detection limit for biological samples evaluated. Applicability of the spectrometer and the method to biological samples was tested in the measurement of twenty-four elements (Ba, Sr, I, Br, Cu, V, Mg, Na, Cl, Mn, Ca, Sn, In, K, Mo, Cd, Zn, As, Sb, Ni, Rb, Cs, Fe, and Co) commonly found in food, milk, tea and tobacco items. They were determined from seven National Institute of Standards and Technology (NIST) certified reference materials (rice flour, oyster tissue, non-fat powdered milk, peach leaves, tomato leaves, apple leaves, and citrus leaves). Our results show good agreement with the NIST certified values, indicating that the method developed in the present study is suitable for the determination of the aforementioned elements in biological samples without undue interference problems

  6. An antithetic variate to facilitate upper-stem height measurements for critical height sampling with importance sampling

    Science.gov (United States)

    Thomas B. Lynch; Jeffrey H. Gove

    2013-01-01

    Critical height sampling (CHS) estimates cubic volume per unit area by multiplying the sum of critical heights measured on trees tallied in a horizontal point sample (HPS) by the HPS basal area factor. One of the barriers to practical application of CHS is the fact that trees near the field location of the point-sampling sample point have critical heights that occur...
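The CHS estimator summarized in this record (the HPS basal area factor times the sum of critical heights on tallied trees) is simple enough to sketch directly; the tally values below are hypothetical:

```python
def chs_volume_per_ha(critical_heights_m, baf_m2_per_ha):
    """Critical height sampling estimate of stand volume (m^3/ha):
    the horizontal point sample basal area factor multiplied by the
    sum of critical heights measured on the tallied ("in") trees."""
    return baf_m2_per_ha * sum(critical_heights_m)

# Hypothetical tally: critical heights (m) of the four trees "in"
# at one sample point, with a BAF of 4 m^2/ha.
heights = [11.2, 8.7, 14.1, 9.5]
print(chs_volume_per_ha(heights, baf_m2_per_ha=4.0), "m^3/ha")
```

The practical barrier the abstract raises is visible in the formula: every tallied tree contributes a critical height, including trees so close to the point that the critical height is hard to sight, which is what the antithetic-variate approach addresses.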

  7. β-NMR sample optimization

    CERN Document Server

    Zakoucka, Eva

    2013-01-01

    During my summer student programme I was working on sample optimization for a new β-NMR project at the ISOLDE facility. The β-NMR technique is well established in solid-state physics, and just recently it is being introduced for applications in biochemistry and the life sciences. The β-NMR collaboration will be applying for beam time to the INTC committee in September for three nuclei: Cu, Zn and Mg. Sample optimization for Mg was already performed last year during the summer student programme; therefore, sample optimization for Cu and Zn had to be completed as well for the project proposal. My part in the project was to perform thorough literature research on techniques for studying Cu and Zn complexes in native conditions, search for relevant binding candidates for Cu and Zn applicable for β-NMR, and eventually evaluate selected binding candidates using UV-VIS spectrometry.

  8. Quantitative portable gamma-spectroscopy sample analysis for non-standard sample geometries

    International Nuclear Information System (INIS)

    Ebara, S.B.

    1998-01-01

    Utilizing a portable spectroscopy system, a quantitative method was developed for the analysis of samples containing a mixture of fission and activation products in non-standard geometries. This method was not developed to replace other methods such as Monte Carlo or discrete ordinates, but rather to offer an alternative rapid solution. The method can be used with various sample and shielding configurations where analysis on a laboratory-based gamma-spectroscopy system is impractical. The portable gamma-spectroscopy method involves calibration of the detector and modeling of the sample and shielding to identify and quantify the radionuclides present in the sample. The method utilizes the intrinsic efficiency of the detector and the unattenuated gamma fluence rate at the detector surface per unit activity from the sample to calculate the nuclide activity and Minimum Detectable Activity (MDA). For a complex geometry, a computer code written for shielding applications (MICROSHIELD) is utilized to determine the unattenuated gamma fluence rate per unit activity at the detector surface. Lastly, the method is only applicable to nuclides which emit gamma-rays and cannot be used for pure beta or alpha emitters. In addition, if sample self-absorption and shielding are significant, the attenuation will result in high MDAs for nuclides which solely emit low-energy gamma-rays. The following presents the analysis technique and verification results using actual experimental data, rather than comparisons to other approximations such as Monte Carlo techniques, to demonstrate the accuracy of the method given a known geometry and source term. (author)
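The MDA mentioned in this record is conventionally computed with the Currie detection limit, L_D = 2.71 + 4.65·sqrt(B) counts, converted to activity via efficiency, counting time and gamma yield. The abstract does not give its exact formula, so this is a standard-textbook stand-in with hypothetical survey-style numbers:

```python
import math

def currie_mda(background_counts, count_time_s, efficiency, gamma_yield):
    """Currie minimum detectable activity (Bq) for a gamma line:
    L_D = 2.71 + 4.65 * sqrt(B) counts, divided by detection
    efficiency, counting time and gamma emission probability."""
    ld = 2.71 + 4.65 * math.sqrt(background_counts)
    return ld / (efficiency * count_time_s * gamma_yield)

# Hypothetical numbers: 400 background counts in the region of
# interest, 600 s count, 1% absolute efficiency, 85% gamma yield.
print(f"MDA = {currie_mda(400, 600, 0.01, 0.85):.2f} Bq")
```

The formula makes the abstract's caveat concrete: attenuation from self-absorption or shielding lowers the effective efficiency in the denominator, which drives the MDA up for low-energy gamma emitters.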

  9. Liquid scintillation alpha counting and spectrometry and its application to bone and tissue samples

    International Nuclear Information System (INIS)

    McDowell, W.J.; Weiss, J.F.

    1976-01-01

    Three methods for determination of alpha-emitting nuclides using liquid scintillation counting are compared, and the pertinent literature is reviewed. Data showing the application of each method to the measurement of plutonium concentration in tissue and bone samples are presented. Counting with a commercial beta-liquid scintillation counter and an aqueous-phase-accepting scintillator is shown to be accurate only in cases where the alpha activity is high (several hundred counts/min or more), only gross alpha counting is desired, and beta-gamma emitters are known to be absent from the sample or present at low levels compared with the alpha activity. Counting with the same equipment and an aqueous immiscible scintillator containing an extractant for the nuclide of interest (extractive scintillator) is shown to allow better control of alpha peak shift due to quenching, a significant reduction of beta-gamma interference, and, usually, a low background. The desirability of using a multichannel pulse-height analyzer in the above two counting methods is stressed. The use of equipment and procedures designed for alpha liquid scintillation counting is shown to allow alpha spectrometry with an energy resolution capability of 200 to 300 keV full-peak-width-at-half-peak-height and a background of 0.3 to 1.0 counts/min, or as low as 0.01 counts/min if pulse-shape discrimination methods are used. Methods for preparing animal bone and tissue samples for assay are described.

  10. PIXE analysis of thin samples

    International Nuclear Information System (INIS)

    Kiss, Ildiko; Koltay, Ede; Szabo, Gyula; Laszlo, S.; Meszaros, A.

    1985-01-01

    Particle-induced X-ray emission (PIXE) multielemental analysis of thin film samples is reported. Calibration methods for the K and L X-ray lines are discussed. The application of PIXE analysis to aerosol monitoring and multielement aerosol analysis is described. Results of PIXE analysis of samples from two locations in Hungary are compared with results for aerosol samples from Scandinavia and the USA. (D.Gy.)

  11. [Acceptance of lot sampling: its applicability to the evaluation of the primary care services portfolio].

    Science.gov (United States)

    López-Picazo Ferrer, J

    2001-05-15

    To determine the applicability of lot quality assurance sampling (LQAS) in the primary care service portfolio, comparing its results with those given by classic evaluation. Compliance with the minimum technical norms (MTN) of the diabetic care service was evaluated through the classic methodology (confidence 95%, accuracy 5%, representativeness of area, sample of 376 histories) and by LQAS (confidence 95%, power 80%, representativeness of primary care team (PCT), defining a lot by MTN and PCT, sample of 13 histories/PCT). Effort, information obtained and operative value were assessed. 44 PCTs from the Murcia Primary Care Region. Classic methodology: compliance with MTN ranged between 91.1% (diagnosis, 95% CI, 84.2-94.0) and 30% (visceral repercussion, 95% CI, 25.4-34.6). Objectives were reached for three MTN (diagnosis, history and EKG). LQAS: no MTN was accepted in all the PCTs, the most accepted in 42 PCTs (95.6%) and the least in 24 PCTs (55.6%). In 9 PCTs all were accepted (20.4%), and in 2 none were accepted (4.5%). Data were analysed through Pareto charts. Classic methodology offered accurate results but did not identify which centres failed to comply (general focus). LQAS was preferable for evaluating MTN, and probably coverage, because: 1) it uses small samples, which encourages internal quality-improvement initiatives; 2) it is easy and rapid to execute; 3) it identifies the PCTs and criteria where there is an opportunity for improvement (specific focus); and 4) it can be used operatively for monitoring.
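
The LQAS decision rule the record applies — accept or reject each lot of 13 histories — is a binomial acceptance-sampling calculation. A minimal sketch, with illustrative defect rates and risk levels rather than the paper's exact design:

```python
from math import comb

def binom_cdf(k, n, p):
    # P(X <= k) for X ~ Binomial(n, p).
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_threshold(n, p_good, p_bad, alpha=0.05, beta=0.20):
    """Smallest decision value d (accept the lot if non-compliant
    records <= d) such that a 'good' lot (defect rate p_good) is
    rejected with probability <= alpha; also reports whether a 'bad'
    lot (defect rate p_bad) slips through with probability <= beta."""
    for d in range(n + 1):
        if 1 - binom_cdf(d, n, p_good) <= alpha:      # producer's risk ok
            consumer_ok = binom_cdf(d, n, p_bad) <= beta
            return d, consumer_ok
    return n, False
```

With a sample of 13 histories per lot, the chosen `p_good`/`p_bad` pair determines how sharply the rule separates compliant from non-compliant PCTs.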

  12. 40 CFR 141.703 - Sampling locations.

    Science.gov (United States)

    2010-07-01

    ... samples prior to the point of filter backwash water addition. (d) Bank filtration. (1) Systems that... applicable, must collect source water samples in the surface water prior to bank filtration. (2) Systems that use bank filtration as pretreatment to a filtration plant must collect source water samples from the...

  13. 50 CFR 260.58 - Accessibility for sampling.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 7 2010-10-01 2010-10-01 false Accessibility for sampling. 260.58 Section... Fishery Products for Human Consumption Sampling § 260.58 Accessibility for sampling. Each applicant shall cause the processed products for which inspection is requested to be made accessible for proper sampling...

  14. Emigration Rates From Sample Surveys: An Application to Senegal.

    Science.gov (United States)

    Willekens, Frans; Zinn, Sabine; Leuchter, Matthias

    2017-12-01

    What is the emigration rate of a country, and how reliable is that figure? Answering these questions is not at all straightforward. Most data on international migration are census data on foreign-born population. These migrant stock data describe the immigrant population in destination countries but offer limited information on the rate at which people leave their country of origin. The emigration rate depends on the number leaving in a given period and the population at risk of leaving, weighted by the duration at risk. Emigration surveys provide a useful data source for estimating emigration rates, provided that the estimation method accounts for sample design. In this study, emigration rates and confidence intervals are estimated from a sample survey of households in the Dakar region in Senegal, which was part of the Migration between Africa and Europe survey. The sample was a stratified two-stage sample with oversampling of households with members abroad or return migrants. A combination of methods of survival analysis (time-to-event data) and replication variance estimation (bootstrapping) yields emigration rates and design-consistent confidence intervals that are representative for the study population.
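
The estimator the record describes — emigration events over duration-weighted person-time, with a replication (bootstrap) variance — can be sketched minimally. The household structure and function names are illustrative, and the stratified design's sampling weights are deliberately ignored here; the paper's design-consistent intervals require them.

```python
import random

def emigration_rate(households):
    # Occurrence/exposure rate: events divided by person-years at risk.
    events = sum(h["emigrants"] for h in households)
    person_years = sum(h["person_years"] for h in households)
    return events / person_years

def bootstrap_ci(households, n_boot=2000, alpha=0.05, seed=1):
    # Resample whole households (the sampling clusters) with replacement
    # and take percentile limits of the resampled rates.
    rng = random.Random(seed)
    rates = []
    for _ in range(n_boot):
        resample = [rng.choice(households) for _ in households]
        rates.append(emigration_rate(resample))
    rates.sort()
    lo = rates[int(alpha / 2 * n_boot)]
    hi = rates[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Resampling at the household level, not the individual level, is what keeps the interval consistent with the clustered two-stage design.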

  15. Autonomous spatially adaptive sampling in experiments based on curvature, statistical error and sample spacing with applications in LDA measurements

    Science.gov (United States)

    Theunissen, Raf; Kadosh, Jesse S.; Allen, Christian B.

    2015-06-01

    Spatially varying signals are typically sampled by collecting uniformly spaced samples irrespective of the signal content. For signals with inhomogeneous information content, this leads to unnecessarily dense sampling in regions of low interest or insufficient sample density at important features, or both. A new adaptive sampling technique is presented directing sample collection in proportion to local information content, capturing adequately the short-period features while sparsely sampling less dynamic regions. The proposed method incorporates a data-adapted sampling strategy on the basis of signal curvature, sample space-filling, variable experimental uncertainty and iterative improvement. Numerical assessment has indicated a reduction in the number of samples required to achieve a predefined uncertainty level overall while improving local accuracy for important features. The potential of the proposed method has been further demonstrated on the basis of Laser Doppler Anemometry experiments examining the wake behind a NACA0012 airfoil and the boundary layer characterisation of a flat plate.
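
The idea of directing sample density by local curvature plus a space-filling floor can be illustrated in one dimension. This is an illustrative reconstruction, not the authors' algorithm, which also folds in experimental uncertainty and iterates; it assumes a non-flat signal so the weights are positive.

```python
import numpy as np

def curvature_weighted_samples(x, y, n_new):
    """Place n_new sample locations with density proportional to the
    local |second derivative|, plus a small uniform floor so that
    flat regions are still sparsely covered (space-filling)."""
    curv = np.abs(np.gradient(np.gradient(y, x), x))
    weight = curv + 0.05 * curv.max()        # space-filling floor
    cdf = np.cumsum(weight)
    cdf /= cdf[-1]
    u = (np.arange(n_new) + 0.5) / n_new     # stratified uniform targets
    return np.interp(u, cdf, x)              # invert the weight CDF
```

For a signal with a sharp transition, most of the new locations land around the transition while the flat tails still receive a few points.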

  16. A comprehensive comparison of perpendicular distance sampling methods for sampling downed coarse woody debris

    Science.gov (United States)

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams

    2013-01-01

    Many new methods for sampling down coarse woody debris have been proposed in the last dozen or so years. One of the most promising in terms of field application, perpendicular distance sampling (PDS), has several variants that have been progressively introduced in the literature. In this study, we provide an overview of the different PDS variants and comprehensive...

  17. High-throughput analysis using non-depletive SPME: challenges and applications to the determination of free and total concentrations in small sample volumes.

    Science.gov (United States)

    Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz

    2018-01-18

    In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. In addition, growing interest in monitoring the kinetics and dynamics of miniaturized biosystems has spurred demand for novel methodologies for the analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin-film SPME was investigated. Results revealed substantial depletion and consequent disruption of the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast, reliable free and total concentration determinations without disrupting the system equilibrium.

  18. Trends in sample preparation 2002. Development and application. Book of abstracts

    International Nuclear Information System (INIS)

    Wenzl, T.; Eberl, M.; Zischka, M.; Knapp, G.

    2002-01-01

    This conference comprised topics dealing with sample preparation such as: sample decomposition, solvent extraction, derivatization techniques and uncertainty in sample preparation. In particular microwave assisted sample preparation techniques and equipment were discussed. The papers were organized under the general topics: trace element analysis, trace analysis of organic compounds, high performance instrumentation in sample preparation, speciation analysis and posters session. Those papers of INIS interest are cited individually. (nevyjel)

  19. Trends in sample preparation 2002. Development and application. Book of abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Wenzl, T.; Eberl, M.; Zischka, M.; Knapp, G. (eds.)

    2002-07-01

    This conference comprised topics dealing with sample preparation such as: sample decomposition, solvent extraction, derivatization techniques and uncertainty in sample preparation. In particular microwave assisted sample preparation techniques and equipment were discussed. The papers were organized under the general topics: trace element analysis, trace analysis of organic compounds, high performance instrumentation in sample preparation, speciation analysis and posters session. Those papers of INIS interest are cited individually. (nevyjel)

  20. National emission standards for hazardous air pollutants application for approval to construct rotary mode core-sampling truck and exhauster

    International Nuclear Information System (INIS)

    1993-05-01

    Characterization of wastes in the underground single-shell tanks and double-shell tanks on the Hanford Site is crucial in developing the final disposal options for the waste and closure strategy for the Hanford Site. Additionally, characterization of tank waste is important for the waste tank safety programs. The Hanford Federal Facility Agreement and Consent Order (also referred to as the Tri-Party Agreement) Milestone M-10-00 requires the obtaining and analyzing of at least two samples from each single-shell tank, and Milestone M-10-13 specifically requires the ability to sample hard saltcake. Existing equipment does not allow sampling of all single-shell tanks within established tank safety limits. Consequently, the US Department of Energy, Richland Operations Office has developed a rotary mode core-sampling system that uses nitrogen gas to cool and clear the drill bit. A rotary mode core-sampling truck will be used on approximately 80 single-shell tanks which contain saltcake wastes, and will provide crucial information on the contents of the tanks. This application is a request for approval to construct and operate the rotary mode core-sampling truck and exhauster in the 200 East and 200 West Area Tank Farms of the Hanford Site. This request is being made pursuant to 40 CFR 61, Subpart H

  1. Application of isotope dilution for the determination of thorium in biological samples by inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Igarashi, Yasuhito; Shiraishi, Kunio; Takaku, Yuichi; Masuda, Kimihiko; Seki, Riki; Yamamoto, Masayoshi.

    1992-01-01

    The applicability of isotope dilution-inductively coupled plasma mass spectrometry (ID-ICP-MS) was examined for Th in biological samples. A naturally occurring isotope of Th (Th-230) was used as the spiking isotope. The concentration of Th-230 in the final sample solution was about 50 - 60 pg/ml; an isotope ratio of 232/230 could be measured with a relative standard deviation of less than 2%. The error magnification depended on the amount of Th-232 concomitant with the Th-230. Though it was shown that one ng of Th-232 could be determined with reasonable precision with a tracer of the present purity, more care should be taken to reduce any sources of systematic error. (author)
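
The isotope dilution arithmetic behind the 232/230 measurement fits in one line: with a known Th-230 spike, the Th-232 content follows from the measured blend ratio. A minimal sketch assuming the usual IDMS mass balance (variable names are illustrative; the correction terms mirror the record's point that Th-232 accompanying the Th-230 tracer magnifies the error):

```python
def idms_atoms(ratio_measured, spike_atoms, spike_ratio=0.0, sample_ratio=0.0):
    """Th-232 atoms in the sample from the measured 232/230 ratio of
    the spiked blend.

    spike_atoms  -- atoms of the Th-230 spike added
    spike_ratio  -- 232/230 ratio of the spike itself (its Th-232 impurity)
    sample_ratio -- native 230/232 ratio of the sample (usually negligible)
    """
    # Mass balance: R_m = (X + r_sp*S) / (S + r_x*X), solved for X.
    return spike_atoms * (ratio_measured - spike_ratio) / \
        (1.0 - ratio_measured * sample_ratio)
```

With a pure spike and negligible native Th-230, this reduces to sample atoms = spike atoms × measured ratio, which is why the precision of the ratio measurement dominates the result.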

  2. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
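
One widely used normalization of the kind this review discusses is probabilistic quotient normalization (PQN), which scales each sample by its most probable dilution factor relative to a reference spectrum. A minimal sketch (PQN is one of several methods the review covers, not necessarily its recommendation):

```python
import numpy as np

def pqn_normalize(X):
    """Probabilistic quotient normalization.

    X: 2-D array, rows = samples, columns = metabolite features.
    Each row is divided by the median ratio of its features to a
    median reference spectrum, removing per-sample dilution effects.
    """
    X = np.asarray(X, dtype=float)
    ref = np.median(X, axis=0)            # median reference spectrum
    quotients = X / ref                   # feature-wise dilution estimates
    factors = np.median(quotients, axis=1)  # one robust factor per sample
    return X / factors[:, None]
```

Using the median of the quotients, rather than the total signal, keeps a few strongly changed metabolites from dragging the normalization factor with them.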

  3. Forensic application of total reflection X-ray fluorescence spectrometry for elemental characterization of ink samples

    International Nuclear Information System (INIS)

    Dhara, Sangita; Misra, N.L.; Maind, S.D.; Kumar, Sanjukta A.; Chattopadhyay, N.; Aggarwal, S.K.

    2010-01-01

    The possibility of applying Total Reflection X-ray Fluorescence for qualitative and quantitative differentiation of documents printed with rare earth tagged and untagged inks has been explored in this paper. For qualitative differentiation, a very small amount of ink was loosened from the printed documents by smoothly rubbing with a new clean blade without destroying the manuscript. 50 μL of Milli-Q water was put on this loose powder, on the manuscript, and was agitated by sucking and releasing the suspension two to three times with the help of a micropipette. The resultant dispersion was deposited on quartz sample support for Total Reflection X-ray Fluorescence measurements. The Total Reflection X-ray Fluorescence spectrum of tagged and untagged inks could be clearly differentiated. In order to see the applicability of Total Reflection X-ray Fluorescence for quantitative determinations of rare earths and also to countercheck such determinations in ink samples, the amounts of rare earth in painted papers with single rare earth tagged inks were determined by digesting the painted paper in HNO₃/HClO₄, mixing this solution with the internal standard and recording their Total Reflection X-ray Fluorescence spectra after calibration of the instrument. The results thus obtained were compared with those obtained by Inductively Coupled Plasma Mass Spectrometry and were found in good agreement. The average precision of the Total Reflection X-ray Fluorescence determinations was 5.5% (1σ) and the average deviation of Total Reflection X-ray Fluorescence determined values with that of Inductively Coupled Plasma Mass Spectrometry was 7.3%. These studies have shown that Total Reflection X-ray Fluorescence offers a promising and potential application in forensic work of this nature.

  4. Forensic application of total reflection X-ray fluorescence spectrometry for elemental characterization of ink samples

    Energy Technology Data Exchange (ETDEWEB)

    Dhara, Sangita [Fuel Chemistry Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Misra, N.L., E-mail: nlmisra@barc.gov.i [Fuel Chemistry Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Maind, S.D. [NAA Unit of Central Forensic Science Laboratory Hyderabad at Analytical Chemistry Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Kumar, Sanjukta A. [Analytical Chemistry Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Chattopadhyay, N. [NAA Unit of Central Forensic Science Laboratory Hyderabad at Analytical Chemistry Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India); Aggarwal, S.K. [Fuel Chemistry Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India)

    2010-02-15

    The possibility of applying Total Reflection X-ray Fluorescence for qualitative and quantitative differentiation of documents printed with rare earth tagged and untagged inks has been explored in this paper. For qualitative differentiation, a very small amount of ink was loosened from the printed documents by smoothly rubbing with a new clean blade without destroying the manuscript. 50 μL of Milli-Q water was put on this loose powder, on the manuscript, and was agitated by sucking and releasing the suspension two to three times with the help of a micropipette. The resultant dispersion was deposited on quartz sample support for Total Reflection X-ray Fluorescence measurements. The Total Reflection X-ray Fluorescence spectrum of tagged and untagged inks could be clearly differentiated. In order to see the applicability of Total Reflection X-ray Fluorescence for quantitative determinations of rare earths and also to countercheck such determinations in ink samples, the amounts of rare earth in painted papers with single rare earth tagged inks were determined by digesting the painted paper in HNO₃/HClO₄, mixing this solution with the internal standard and recording their Total Reflection X-ray Fluorescence spectra after calibration of the instrument. The results thus obtained were compared with those obtained by Inductively Coupled Plasma Mass Spectrometry and were found in good agreement. The average precision of the Total Reflection X-ray Fluorescence determinations was 5.5% (1σ) and the average deviation of Total Reflection X-ray Fluorescence determined values with that of Inductively Coupled Plasma Mass Spectrometry was 7.3%. These studies have shown that Total Reflection X-ray Fluorescence offers a promising and potential application in forensic work of this nature.

  5. The Applicability of the Risk Analysis System in Tax Audit Effectiveness (A Sample Application on Gaziantep Carpet Sector

    Directory of Open Access Journals (Sweden)

    Atilla Ahmet UĞUR

    2016-12-01

    Full Text Available Tax audit, an indispensable part of the correct collection of the taxes that fund a significant portion of public expenditure, is increasingly important in a tax system such as ours, which rests on self-declaration. That tax audit must be effective is accepted by everyone without doubt, but achieving that effectiveness has always been a problem. The effectiveness of tax audit matters for the fiscal policy of the country and therefore for general economic policy. Various options have been put forward to improve the efficiency of tax audit; one of them is the risk analysis method. Risk analysis collects all kinds of information, data and statistics, analyses taxpayers' activities by group and sector through the resulting risk analysis system, and in this way identifies risk areas. In this sense, this study presents a sample application of the risk analysis system to the Gaziantep carpet sector.

  6. Development of an Analytical Protocol for Determination of Cyanide in Human Biological Samples Based on Application of Ion Chromatography with Pulsed Amperometric Detection

    OpenAIRE

    Jaszczak, Ewa; Ruman, Marek; Narkowicz, Sylwia; Namieśnik, Jacek; Polkowska, Żaneta

    2017-01-01

    A simple and accurate ion chromatography (IC) method with pulsed amperometric detection (PAD) was proposed for the determination of cyanide ion in urine, sweat, and saliva samples. The sample pretreatment relies on alkaline digestion and application of Dionex OnGuard II H cartridge. Under the optimized conditions, the method showed good linearity in the range of 1–100 μg/L for urine, 5–100 μg/L for saliva, and 3–100 μg/L for sweat samples with determination coefficients (R) > 0.992. Low detec...

  7. 7 CFR 52.35 - Accessibility for sampling.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Accessibility for sampling. 52.35 Section 52.35... PROCESSED FOOD PRODUCTS 1 Regulations Governing Inspection and Certification Sampling § 52.35 Accessibility for sampling. Each applicant shall cause the processed products for which inspection is requested to...

  8. An econometric method for estimating population parameters from non-random samples: An application to clinical case finding.

    Science.gov (United States)

    Burger, Rulof P; McLaren, Zoë M

    2017-09-01

    The problem of sample selection complicates the process of drawing inference about populations. Selective sampling arises in many real world situations when agents such as doctors and customs officials search for targets with high values of a characteristic. We propose a new method for estimating population characteristics from these types of selected samples. We develop a model that captures key features of the agent's sampling decision. We use a generalized method of moments with instrumental variables and maximum likelihood to estimate the population prevalence of the characteristic of interest and the agents' accuracy in identifying targets. We apply this method to tuberculosis (TB), which is the leading infectious disease cause of death worldwide. We use a national database of TB test data from South Africa to examine testing for multidrug resistant TB (MDR-TB). Approximately one quarter of MDR-TB cases was undiagnosed between 2004 and 2010. The official estimate of 2.5% is therefore too low, and MDR-TB prevalence is as high as 3.5%. Signal-to-noise ratios are estimated to be between 0.5 and 1. Our approach is widely applicable because of the availability of routinely collected data and abundance of potential instruments. Using routinely collected data to monitor population prevalence can guide evidence-based policy making. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Use of a holder-vacuum tube device to save on-site hands in preparing urine samples for head-space gas-chromatography, and its application to determine the time allowance for sample sealing.

    Science.gov (United States)

    Kawai, Toshio; Sumino, Kimiaki; Ohashi, Fumiko; Ikeda, Masayuki

    2011-01-01

    To facilitate urine sample preparation prior to head-space gas-chromatographic (HS-GC) analysis. Urine samples containing one of five solvents (acetone, methanol, methyl ethyl ketone, methyl isobutyl ketone and toluene) at the levels of the biological exposure limits were aspirated into a vacuum tube via a holder, a device commercially available for venous blood collection (the vacuum tube method). The urine sample, 5 ml, was quantitatively transferred to a 20-ml head-space vial prior to HS-GC analysis. The loaded tubes were stored at +4 ℃ in the dark for up to 3 d. The vacuum tube method facilitated on-site preparation of urine samples for HS-GC with no significant loss of solvents from the sample and no need for skilled hands, while on-site sample preparation time was significantly reduced. Furthermore, no loss of solvents was detected during the 3-d storage, irrespective of whether the solvent was hydrophilic (acetone) or lipophilic (toluene). In a pilot application, the high performance of the vacuum tube method in sealing a sample in an air-tight space confirmed that no solvent is lost when sealing is completed within 5 min after urine voiding, and that the allowance time is as long as 30 min in the case of toluene in urine. The use of the holder-vacuum tube device not only saves hands-on work in transferring the sample to an air-tight space, but also facilitates sample storage prior to HS-GC analysis.

  10. Automated Sampling and Extraction of Krypton from Small Air Samples for Kr-85 Measurement Using Atom Trap Trace Analysis

    International Nuclear Information System (INIS)

    Hebel, S.; Hands, J.; Goering, F.; Kirchner, G.; Purtschert, R.

    2015-01-01

    Atom-Trap-Trace-Analysis (ATTA) provides the capability of measuring the Krypton-85 concentration in microlitre amounts of krypton extracted from air samples of about 1 litre. This sample size is sufficiently small to allow for a range of applications, including on-site spot sampling and continuous sampling over periods of several hours. All samples can be easily handled and transported to an off-site laboratory for ATTA measurement, or stored and analyzed on demand. Bayesian sampling methodologies can be applied by blending samples for bulk measurement and performing in-depth analysis as required. Prerequisite for measurement is the extraction of a pure krypton fraction from the sample. This paper introduces an extraction unit able to isolate the krypton in small ambient air samples with high speed, high efficiency and in a fully automated manner using a combination of cryogenic distillation and gas chromatography. Air samples are collected using an automated smart sampler developed in-house to achieve a constant sampling rate over adjustable time periods ranging from 5 minutes to 3 hours per sample. The smart sampler can be deployed in the field and operate on battery for one week to take up to 60 air samples. This high flexibility of sampling and the fast, robust sample preparation are a valuable tool for research and the application of Kr-85 measurements to novel Safeguards procedures. (author)

  11. Test of a sample container for shipment of small size plutonium samples with PAT-2

    International Nuclear Information System (INIS)

    Kuhn, E.; Aigner, H.; Deron, S.

    1981-11-01

    A light-weight container for the air transport of plutonium, to be designated PAT-2, has been developed in the USA and is presently undergoing licensing. The very limited effective space for bearing plutonium required the design of small sample canisters to meet the needs of international safeguards for the shipment of plutonium samples. The applicability of a small canister for the sampling of small powder and solution samples has been tested in an intralaboratory experiment. The results of the experiment, based on the concept of pre-weighed samples, show that the tested canister can successfully be used for the sampling of small PuO₂ powder samples of homogeneous source material, as well as for dried aliquots of plutonium nitrate solutions. (author)

  12. Effect of repeated application of 14C-carbaryl and of addition of glucose and cellulose to soil samples

    International Nuclear Information System (INIS)

    Hirata, R.; Luchini, L.C.; Mesquita, T.B.; Ruegg, E.F.

    1984-01-01

    The behaviour of the insecticide carbaryl was studied in samples of Gley Humic and Red-Yellow Latosol soils by means of radiometric techniques. To the Red-Yellow Latosol, two carbon sources - glucose and cellulose - and a mixture of glucose plus cellulose were added. Repeated applications of carbaryl to both soils markedly increased the rate of degradation, probably due to a rapid increase in the number of microorganisms using the pesticide as substrate. (M.A.C.) [pt

  13. Fitted temperature-corrected Compton cross sections for Monte Carlo applications and a sampling distribution

    International Nuclear Information System (INIS)

    Wienke, B.R.; Devaney, J.J.; Lathrop, B.L.

    1984-01-01

    Simple temperature-corrected cross sections, which replace the static Klein-Nishina set in a one-to-one manner, are developed for Monte Carlo applications. The reduced set is obtained from a nonlinear least-squares fit to the exact photon-Maxwellian electron cross sections by using a Klein-Nishina-like formula as the fitting equation. Two parameters are sufficient, and accurate to two decimal places, to explicitly fit the exact cross sections over a range of 0 to 100 keV in electron temperature and 0 to 1 MeV in incident photon energy. Since the fit equations are Klein-Nishina-like, existing Monte Carlo code algorithms using the Klein-Nishina formula can be trivially modified to accommodate corrections for a moving Maxwellian electron background. The simple two parameter scheme and other fits are presented and discussed and comparisons with exact predictions are exhibited. The fits are made to the total photon-Maxwellian electron cross section and the fitting parameters can be consistently used in both the energy conservation equation for photon-electron scattering and the differential cross section, as they are presently sampled in Monte Carlo photonics applications. The fit equations are motivated in a very natural manner by the asymptotic expansion of the exact photon-Maxwellian effective cross-section kernel. A probability distribution is also obtained for the corrected set of equations
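
The fitting machinery the record describes — two parameters adjusted by nonlinear least squares against exact cross sections — can be sketched generically. The placeholder model below is NOT the Klein-Nishina formula or the paper's fit equation, just an illustrative two-parameter shape, and the brute-force grid search stands in for a proper nonlinear least-squares solver:

```python
import numpy as np

def fit_two_params(model, E, sigma, a_grid, b_grid):
    """Brute-force least squares over a 2-D parameter grid:
    return (a, b, squared_error) minimizing sum((model - sigma)^2)."""
    best = (None, None, np.inf)
    for a in a_grid:
        for b in b_grid:
            err = np.sum((model(E, a, b) - sigma) ** 2)
            if err < best[2]:
                best = (a, b, err)
    return best

def model(E, a, b):
    # Illustrative placeholder: a cross section that softens with
    # photon energy E, with scale a and softening parameter b.
    return a / (1.0 + b * E)
```

In the paper's scheme the fitted function is Klein-Nishina-like, so the resulting parameters drop into existing Monte Carlo sampling routines with trivial code changes.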

  14. User-driven sampling strategies in image exploitation

    Science.gov (United States)

    Harvey, Neal; Porter, Reid

    2013-12-01

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. User-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. In preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.

  15. Application of factor analysis to chemically analyzed data in environmental samples after x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    El-Sayed, A.A.

    2005-01-01

    Factor analysis rests on the frequency distribution of, and the relationships among, a series of elements in specific environmental samples. Here it was applied to interpret the variance and covariance of the elements Si, Al, Ca, K, Fe, Ti and Mg in three common environmental materials: sediment, soil, and rock. The evaluations were performed on x-ray fluorescence measurements. The factorial statistical analysis shows that three factors account for the relationships among these elements in a given sample type, and together they explain the hidden relationships in the chemical data. Factor one represents weathering-type alteration and oxidation processes, dominant in soil and rock, which are characterized by the close covariance of a group of metals, such as iron and manganese, commonly derived from weathered and altered igneous rocks. Factors two and three represent other processes. In soil, the formation of aluminosilicates is revealed in factor two by the positive covariance of these elements, which also explains the joint occurrence of aluminum oxide, titanium oxide and silicon dioxide. The inverse relation between Ca, K, Fe and Mg indicates the presence of mineral salts, which may derive from fertilization and irrigation water. Factor three is the weakest factor available to explain the relationships among these elements in soil.
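The factor extraction described above can be sketched with a simple principal-factor approach: eigendecomposition of the correlation matrix of element concentrations. The element list comes from the abstract, but the data below are synthetic and the three-latent-factor structure is an assumption for illustration, not the paper's dataset.

```python
import numpy as np

rng = np.random.default_rng(0)
elements = ["Si", "Al", "Ca", "K", "Fe", "Ti", "Mg"]

# Synthetic concentrations driven by three latent "geochemical" factors,
# standing in for the XRF measurements described above.
n = 200
factors = rng.normal(size=(n, 3))
loadings_true = rng.normal(size=(3, len(elements)))
X = factors @ loadings_true + 0.3 * rng.normal(size=(n, len(elements)))

# Principal-factor extraction: eigendecomposition of the correlation matrix.
R = np.corrcoef(X, rowvar=False)
eigval, eigvec = np.linalg.eigh(R)           # eigh returns ascending order
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

# Loadings of the three strongest factors on each element, and the share
# of total variance they explain.
loadings = eigvec[:, :3] * np.sqrt(eigval[:3])
explained = eigval[:3].sum() / eigval.sum()
```

Inspecting which elements load together on each retained factor is what supports interpretations like "factor one = weathering/oxidation" in the record above.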

  16. Headspace needle-trap analysis of priority volatile organic compounds from aqueous samples: application to the analysis of natural and waste waters.

    Science.gov (United States)

    Alonso, Monica; Cerdan, Laura; Godayol, Anna; Anticó, Enriqueta; Sanchez, Juan M

    2011-11-11

    Combining headspace (HS) sampling with a needle-trap device (NTD) to determine priority volatile organic compounds (VOCs) in water samples results in improved sensitivity and efficiency when compared to conventional static HS sampling. A 22 gauge stainless steel, 51-mm needle packed with Tenax TA and Carboxen 1000 particles is used as the NTD. Three different HS-NTD sampling methodologies are evaluated and all give limits of detection for the target VOCs in the ng L⁻¹ range. Active (purge-and-trap) HS-NTD sampling is found to give the best sensitivity but requires exhaustive control of the sampling conditions. The use of the NTD to collect the headspace gas sample results in a combined adsorption/desorption mechanism. The testing of different temperatures for the HS thermostating reveals a greater desorption effect when the sample is allowed to diffuse, whether passively or actively, through the sorbent particles. The limits of detection obtained in the simplest sampling methodology, static HS-NTD (5 mL aqueous sample in 20 mL HS vials, thermostating at 50 °C for 30 min with agitation), are sufficiently low as to permit its application to the analysis of 18 priority VOCs in natural and waste waters. In all cases compounds were detected below regulated levels. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Sensitive thermal transitions of nanoscale polymer samples using the bimetallic effect: application to ultra-thin polythiophene.

    Science.gov (United States)

    Ahumada, O; Pérez-Madrigal, M M; Ramirez, J; Curcó, D; Esteves, C; Salvador-Matar, A; Luongo, G; Armelin, E; Puiggalí, J; Alemán, C

    2013-05-01

    A sensitive nanocalorimetric technology based on microcantilever sensors is presented. The technology, which combines very short response times with very small sample consumption, uses the bimetallic effect to detect thermal transitions. Specifically, abrupt variations in the Young's modulus and the thermal expansion coefficient produced by temperature changes have been employed to detect thermodynamic transitions. The technology has been used to determine the glass transition of poly(3-thiophene methyl acetate), a soluble semiconducting polymer with different nanotechnological applications. The glass transition temperature determined using microcantilevers coated with ultra-thin films of mass = 10(-13) g is 5.2 °C higher than that obtained using a conventional differential scanning calorimeter for bulk powder samples of mass = 5 × 10(-3) g. Atomistic molecular dynamics simulations on models that represent the bulk powder and the ultra-thin films have been carried out to provide understanding and rationalization of this feature. Simulations indicate that the film-air interface plays a crucial role in films with very small thickness, affecting both the organization of the molecular chains and the response of the molecules against the temperature.

  18. Sampling and mass spectrometry approaches for the detection of drugs and foreign contaminants in breath for homeland security applications

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Audrey Noreen [Michigan State Univ., East Lansing, MI (United States)

    2009-01-01

    Homeland security relies heavily on analytical chemistry to identify suspicious materials and persons. Traditionally this role has focused on attribution, determining the type and origin of an explosive, for example. But as technology advances, analytical chemistry can and will play an important role in the prevention and preemption of terrorist attacks. More sensitive and selective detection techniques can allow suspicious materials and persons to be identified even before a final destructive product is made. The work presented herein focuses on the use of commercial and novel detection techniques for application to the prevention of terrorist activities. Although drugs are not commonly thought of when discussing terrorism, narcoterrorism has become a significant threat in the 21st century. The role of the drug trade in the funding of terrorist groups is prevalent; thus, reducing the trafficking of illegal drugs can play a role in the prevention of terrorism by cutting off much needed funding. To do so, sensitive, specific, and robust analytical equipment is needed to quickly identify a suspected drug sample no matter what matrix it is in. Single Particle Aerosol Mass Spectrometry (SPAMS) is a novel technique that has previously been applied to biological and chemical detection. The current work applies SPAMS to drug analysis, identifying the active ingredients in single component, multi-component, and multi-tablet drug samples in a relatively non-destructive manner. In order to do so, a sampling apparatus was created to allow particle generation from drug tablets with on-line introduction to the SPAMS instrument. Rules trees were developed to automate the identification of drug samples on a single particle basis. A novel analytical scheme was also developed to identify suspect individuals based on chemical signatures in human breath. 
Human breath was sampled using an RTube™ and the trace volatile organic compounds (VOCs) were preconcentrated using solid

  19. Field Portable Low Temperature Porous Layer Open Tubular Cryoadsorption Headspace Sampling and Analysis Part II: Applications*

    Science.gov (United States)

    Harries, Megan; Bukovsky-Reyes, Santiago; Bruno, Thomas J.

    2016-01-01

    This paper details the sampling methods used with the field portable porous layer open tubular cryoadsorption (PLOT-cryo) approach, described in Part I of this two-part series, applied to several analytes of interest. We conducted tests with coumarin and 2,4,6-trinitrotoluene (two solutes that were used in initial development of PLOT-cryo technology), naphthalene, aviation turbine kerosene, and diesel fuel, on a variety of matrices and test beds. We demonstrated that these analytes can be easily detected and reliably identified using the portable unit for analyte collection. By leveraging efficiency-boosting temperature control and the high flow rate multiple capillary wafer, very short collection times (as low as 3 s) yielded accurate detection. For diesel fuel spiked on glass beads, we determined a method detection limit below 1 ppm. We observed greater variability among separate samples analyzed with the portable unit than previously documented in work using the laboratory-based PLOT-cryo technology. We identify three likely sources that may help explain the additional variation: the use of a compressed air source to generate suction, matrix geometry, and variability in the local vapor concentration around the sampling probe as solute depletion occurs both locally around the probe and in the test bed as a whole. This field-portable adaptation of the PLOT-cryo approach has numerous and diverse potential applications. PMID:26726934

  20. Field portable low temperature porous layer open tubular cryoadsorption headspace sampling and analysis part II: Applications.

    Science.gov (United States)

    Harries, Megan; Bukovsky-Reyes, Santiago; Bruno, Thomas J

    2016-01-15

    This paper details the sampling methods used with the field portable porous layer open tubular cryoadsorption (PLOT-cryo) approach, described in Part I of this two-part series, applied to several analytes of interest. We conducted tests with coumarin and 2,4,6-trinitrotoluene (two solutes that were used in initial development of PLOT-cryo technology), naphthalene, aviation turbine kerosene, and diesel fuel, on a variety of matrices and test beds. We demonstrated that these analytes can be easily detected and reliably identified using the portable unit for analyte collection. By leveraging efficiency-boosting temperature control and the high flow rate multiple capillary wafer, very short collection times (as low as 3 s) yielded accurate detection. For diesel fuel spiked on glass beads, we determined a method detection limit below 1 ppm. We observed greater variability among separate samples analyzed with the portable unit than previously documented in work using the laboratory-based PLOT-cryo technology. We identify three likely sources that may help explain the additional variation: the use of a compressed air source to generate suction, matrix geometry, and variability in the local vapor concentration around the sampling probe as solute depletion occurs both locally around the probe and in the test bed as a whole. This field-portable adaptation of the PLOT-cryo approach has numerous and diverse potential applications. Published by Elsevier B.V.

  1. Application of In-Segment Multiple Sampling in Object-Based Classification

    Directory of Open Access Journals (Sweden)

    Nataša Đurić

    2014-12-01

    When object-based analysis is applied to very high-resolution imagery, pixels within the segments reveal large spectral inhomogeneity; their distribution can be considered complex rather than normal. When normality is violated, the classification methods that rely on the assumption of normally distributed data are not as successful or accurate. It is hard to detect normality violations in small samples. The segmentation process produces segments that vary highly in size; samples can be very big or very small. This paper investigates whether the complexity within the segment can be addressed using multiple random sampling of segment pixels and multiple calculations of similarity measures. In order to analyze the effect sampling has on classification results, statistics and probability value equations of the non-parametric two-sample Kolmogorov-Smirnov test and the parametric Student's t-test are selected as similarity measures in the classification process. The performance of both classifiers was assessed on a WorldView-2 image for four land cover classes (roads, buildings, grass and trees) and compared to two commonly used object-based classifiers, k-Nearest Neighbor (k-NN) and Support Vector Machine (SVM). Both proposed classifiers showed a slight improvement in the overall classification accuracies and produced more accurate classification maps when compared to the ground truth image.
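The in-segment multiple sampling idea in this record can be sketched as follows: repeatedly draw random pixel subsamples from a segment, compare each against per-class reference samples with the two-sample Kolmogorov-Smirnov test, and assign the class with the lowest average KS statistic. The data, class names, and sizes here are synthetic assumptions, not the WorldView-2 setup.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)

# Synthetic per-class reference pixel values (e.g. one spectral band).
references = {
    "grass": rng.normal(0.0, 1.0, 500),
    "road":  rng.normal(3.0, 1.0, 500),
}

# A segment whose pixels actually come from the "road" distribution.
segment = rng.normal(3.0, 1.0, 400)

def classify(segment, references, n_draws=10, m=50):
    # Multiple random in-segment samples; average the KS statistic
    # against each class reference and pick the most similar class.
    scores = {}
    for label, ref in references.items():
        stats = []
        for _ in range(n_draws):
            sub = rng.choice(segment, size=m, replace=False)
            stats.append(ks_2samp(sub, ref).statistic)
        scores[label] = np.mean(stats)
    return min(scores, key=scores.get), scores

label, scores = classify(segment, references)
```

Because the KS test is non-parametric, this comparison does not assume the within-segment pixel distribution is normal, which is the motivation the abstract gives.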

  2. Urine sample collection protocols for bioassay samples

    Energy Technology Data Exchange (ETDEWEB)

    MacLellan, J.A.; McFadden, K.M.

    1992-11-01

    In vitro radiobioassay analyses are used to measure the amount of radioactive material excreted by personnel exposed to the potential intake of radioactive material. The analytical results are then used with various metabolic models to estimate the amount of radioactive material in the subject's body and the original intake of radioactive material. Proper application of these metabolic models requires knowledge of the excretion period. It is normal practice to design the bioassay program based on a 24-hour excretion sample. The Hanford bioassay program simulates a total 24-hour urine excretion sample with urine collection periods lasting from one-half hour before retiring to one-half hour after rising on two consecutive days. Urine passed during the specified periods is collected in three 1-L bottles. Because the daily excretion volume given in Publication 23 of the International Commission on Radiological Protection (ICRP 1975, p. 354) for Reference Man is 1.4 L, it was proposed to use only two 1-L bottles as a cost-saving measure. This raised the broader question of what should be the design capacity of a 24-hour urine sample kit.

  3. Urine sample collection protocols for bioassay samples

    Energy Technology Data Exchange (ETDEWEB)

    MacLellan, J.A.; McFadden, K.M.

    1992-11-01

    In vitro radiobioassay analyses are used to measure the amount of radioactive material excreted by personnel exposed to the potential intake of radioactive material. The analytical results are then used with various metabolic models to estimate the amount of radioactive material in the subject's body and the original intake of radioactive material. Proper application of these metabolic models requires knowledge of the excretion period. It is normal practice to design the bioassay program based on a 24-hour excretion sample. The Hanford bioassay program simulates a total 24-hour urine excretion sample with urine collection periods lasting from one-half hour before retiring to one-half hour after rising on two consecutive days. Urine passed during the specified periods is collected in three 1-L bottles. Because the daily excretion volume given in Publication 23 of the International Commission on Radiological Protection (ICRP 1975, p. 354) for Reference Man is 1.4 L, it was proposed to use only two 1-L bottles as a cost-saving measure. This raised the broader question of what should be the design capacity of a 24-hour urine sample kit.

  4. Application of Latin hypercube sampling to RADTRAN 4 truck accident risk sensitivity analysis

    International Nuclear Information System (INIS)

    Mills, G.S.; Neuhauser, K.S.; Kanipe, F.L.

    1994-01-01

    The sensitivity of calculated dose estimates to various RADTRAN 4 inputs is an available output for incident-free analysis because the defining equations are linear and sensitivity to each variable can be calculated in closed mathematical form. However, the necessary linearity is not characteristic of the equations used in calculation of accident dose risk, making a similar tabulation of sensitivity for RADTRAN 4 accident analysis impossible. Therefore, a study of sensitivity of accident risk results to variation of input parameters was performed using representative routes, isotopic inventories, and packagings. It was determined that, of the approximately two dozen RADTRAN 4 input parameters pertinent to accident analysis, only a subset of five or six has significant influence on typical analyses or is subject to random uncertainties. These five or six variables were selected as candidates for Latin Hypercube Sampling applications. To make the effect of input uncertainties on calculated accident risk more explicit, distributions and limits were determined for two variables which had approximately proportional effects on calculated doses: Pasquill Category probability (PSPROB) and link population density (LPOPD). These distributions and limits were used as input parameters to Sandia's Latin Hypercube Sampling code to generate 50 sets of RADTRAN 4 input parameters used together with point estimates of other necessary inputs to calculate 50 observations of estimated accident dose risk. Tabulations of the RADTRAN 4 accident risk input variables and their influence on output plus illustrative examples of the LHS calculations, for truck transport situations that are typical of past experience, will be presented.
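The Latin Hypercube Sampling step in this record can be sketched with a small NumPy implementation: one point per equal-probability stratum in each dimension, with strata randomly paired across dimensions. The two input names come from the abstract (PSPROB and LPOPD), but the uniform ranges below are made-up placeholders, not the study's actual distributions.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    # One sample per 1/n-wide stratum in each dimension; the random
    # permutations shuffle which strata are paired across dimensions.
    strata = np.column_stack([rng.permutation(n) for _ in range(d)])
    return (strata + rng.random((n, d))) / n     # points in [0, 1)^d

rng = np.random.default_rng(42)
unit = latin_hypercube(50, 2, rng)

# Scale to illustrative (assumed) ranges for the two sensitive inputs:
# Pasquill category probability and link population density (persons/km^2).
lo = np.array([0.05, 10.0])
hi = np.array([0.30, 3000.0])
designs = lo + unit * (hi - lo)                  # 50 RADTRAN-style input sets
```

Each row of `designs` would then be combined with point estimates of the remaining inputs to produce one of the 50 accident-risk observations mentioned above.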

  5. Securing While Sampling in Wireless Body Area Networks With Application to Electrocardiography.

    Science.gov (United States)

    Dautov, Ruslan; Tsouri, Gill R

    2016-01-01

    Stringent resource constraints and broadcast transmission in wireless body area networks raise serious security concerns when employed in biomedical applications. Protecting data transmission where any minor alteration is potentially harmful is of significant importance in healthcare. Traditional security methods based on public or private key infrastructure require considerable memory and computational resources, and present an implementation obstacle in compact sensor nodes. This paper proposes a lightweight encryption framework augmenting compressed sensing with wireless physical layer security. Augmenting compressed sensing to secure information is based on the use of the measurement matrix as an encryption key, and allows for incorporating security in addition to compression at the time of sampling an analog signal. The proposed approach eliminates the need for a separate encryption algorithm, as well as the predeployment of a key, thereby conserving the sensor node's limited resources. The proposed framework is evaluated using analysis, simulation, and experimentation applied to a wireless electrocardiogram setup consisting of a sensor node, an access point, and an eavesdropper performing a proximity attack. Results show that legitimate communication is reliable and secure given that the eavesdropper is located at a reasonable distance from the sensor node and the access point.
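The measurement-matrix-as-key idea can be sketched as follows: a shared secret seeds the Gaussian measurement matrix, so compression and "encryption" happen in the same matrix multiply, and only a receiver holding the seed can rebuild the matrix and reconstruct the sparse signal (here via a simple orthogonal matching pursuit). All sizes, the seeding scheme, and the toy signal are illustrative assumptions, not the paper's protocol.

```python
import numpy as np

def key_matrix(key, m, n):
    # The shared secret key seeds the Gaussian measurement matrix.
    rng = np.random.default_rng(key)
    return rng.normal(size=(m, n)) / np.sqrt(m)

def omp(Phi, y, k):
    # Orthogonal matching pursuit for a k-sparse signal: greedily pick
    # the column most correlated with the residual, then re-fit.
    support, r = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(Phi.T @ r)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        r = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

n, m, k, key = 64, 40, 3, 2016
x = np.zeros(n)
x[[5, 20, 41]] = [1.5, -2.0, 0.8]             # toy sparse "ECG feature" vector

Phi = key_matrix(key, m, n)
y = Phi @ x                                    # sample-and-encrypt in one step

x_good = omp(key_matrix(key, m, n), y, k)      # correct key: recovers x
x_bad = omp(key_matrix(key + 1, m, n), y, k)   # wrong key: useless estimate
```

The point of the sketch is that no separate cipher runs on the node: the sensing operation itself is keyed.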

  6. The effect of clustering on lot quality assurance sampling: a probabilistic model to calculate sample sizes for quality assessments.

    Science.gov (United States)

    Hedt-Gauthier, Bethany L; Mitsunaga, Tisha; Hund, Lauren; Olives, Casey; Pagano, Marcello

    2013-10-26

    Traditional Lot Quality Assurance Sampling (LQAS) designs assume observations are collected using simple random sampling. Alternatively, randomly sampling clusters of observations and then individuals within clusters reduces costs but decreases the precision of the classifications. In this paper, we develop a general framework for designing the cluster(C)-LQAS system and illustrate the method with the design of data quality assessments for the community health worker program in Rwanda. To determine sample size and decision rules for C-LQAS, we use the beta-binomial distribution to account for inflated risk of errors introduced by sampling clusters at the first stage. We present general theory and code for sample size calculations. The C-LQAS sample sizes provided in this paper constrain misclassification risks below user-specified limits. Multiple C-LQAS systems meet the specified risk requirements, but numerous considerations, including per-cluster versus per-individual sampling costs, help identify optimal systems for distinct applications. We show the utility of C-LQAS for data quality assessments, but the method generalizes to numerous applications. This paper provides the necessary technical detail and supplemental code to support the design of C-LQAS for specific programs.
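The role of the beta-binomial here can be sketched with a small risk calculation: clustering inflates variance, so the misclassification risk of a decision rule computed under a plain binomial understates the true risk. The parameterization of the beta in terms of an intra-cluster correlation, and all numbers below, are illustrative assumptions, not the paper's Rwanda design.

```python
from scipy.stats import binom, betabinom

def betabin_params(p, rho):
    # Beta(a, b) with mean p and over-dispersion controlled by an
    # assumed intra-cluster correlation rho (one common parameterization).
    a = p * (1 - rho) / rho
    b = (1 - p) * (1 - rho) / rho
    return a, b

n, d = 50, 10          # sample size; decision rule: classify "high" if X >= d
p_low = 0.10           # true prevalence on the "acceptable" side of the rule
rho = 0.2              # assumed intra-cluster correlation

# Risk of wrongly classifying as "high" when prevalence is actually low.
risk_srs = binom.sf(d - 1, n, p_low)             # simple random sampling
a, b = betabin_params(p_low, rho)
risk_clustered = betabinom.sf(d - 1, n, a, b)    # clustering inflates the tail
```

A C-LQAS design would raise n (or move d) until the clustered risk, not the binomial one, falls below the user-specified limit.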

  7. Application of Passive Sampling to Characterise the Fish Exometabolome

    Directory of Open Access Journals (Sweden)

    Mark R. Viant

    2017-02-01

    Full Text Available The endogenous metabolites excreted by organisms into their surrounding environment, termed the exometabolome, are important for many processes including chemical communication. In fish biology, such metabolites are also known to be informative markers of physiological status. While metabolomics is increasingly used to investigate the endogenous biochemistry of organisms, no non-targeted studies of the metabolic complexity of fish exometabolomes have been reported to date. In environmental chemistry, Chemcatcher® (Portsmouth, UK passive samplers have been developed to sample for micro-pollutants in water. Given the importance of the fish exometabolome, we sought to evaluate the capability of Chemcatcher® samplers to capture a broad spectrum of endogenous metabolites excreted by fish and to measure these using non-targeted direct infusion mass spectrometry metabolomics. The capabilities of C18 and styrene divinylbenzene reversed-phase sulfonated (SDB-RPS Empore™ disks for capturing non-polar and polar metabolites, respectively, were compared. Furthermore, we investigated real, complex metabolite mixtures excreted from two model fish species, rainbow trout (Oncorhynchus mykiss and three-spined stickleback (Gasterosteus aculeatus. In total, 344 biological samples and 28 QC samples were analysed, revealing 646 and 215 m/z peaks from trout and stickleback, respectively. The measured exometabolomes were principally affected by the type of Empore™ (Hemel Hempstead, UK disk and also by the sampling time. Many peaks were putatively annotated, including several bile acids (e.g., chenodeoxycholate, taurocholate, glycocholate, glycolithocholate, glycochenodeoxycholate, glycodeoxycholate. Collectively these observations show the ability of Chemcatcher® passive samplers to capture endogenous metabolites excreted from fish.

  8. Development of a polymerase chain reaction applicable to rapid and sensitive detection of Clonorchis sinensis eggs in human stool samples

    Science.gov (United States)

    Cho, Pyo Yun; Na, Byoung-Kuk; Mi Choi, Kyung; Kim, Jin Su; Cho, Shin-Hyeong; Lee, Won-Ja; Lim, Sung-Bin; Cha, Seok Ho; Park, Yun-Kyu; Pak, Jhang Ho; Lee, Hyeong-Woo; Hong, Sung-Jong; Kim, Tong-Soo

    2013-01-01

    Microscopic examination of eggs of parasitic helminths in stool samples has been the most widely used classical diagnostic method for infections, but tiny and low numbers of eggs in stool samples often hamper diagnosis of helminthic infections with classical microscopic examination. Moreover, it is also difficult to differentiate parasite eggs by the classical method, if they have similar morphological characteristics. In this study, we developed a rapid and sensitive polymerase chain reaction (PCR)-based molecular diagnostic method for detection of Clonorchis sinensis eggs in stool samples. Nine primers were designed based on the long-terminal repeat (LTR) of C. sinensis retrotransposon1 (CsRn1) gene, and seven PCR primer sets were paired. Polymerase chain reaction with each primer pair produced specific amplicons for C. sinensis, but not for other trematodes including Metagonimus yokogawai and Paragonimus westermani. Particularly, three primer sets were able to detect 10 C. sinensis eggs and were applicable to amplify specific amplicons from DNA samples purified from stool of C. sinensis-infected patients. This PCR method could be useful for diagnosis of C. sinensis infections in human stool samples with a high level of specificity and sensitivity. PMID:23916334

  9. FUZZY ACCEPTANCE SAMPLING AND CHARACTERISTIC CURVES

    Directory of Open Access Journals (Sweden)

    Ebru Turanoğlu

    2012-02-01

    Acceptance sampling is primarily used for the inspection of incoming or outgoing lots. Acceptance sampling refers to the application of specific sampling plans to a designated lot or sequence of lots. The parameters of acceptance sampling plans are sample sizes and acceptance numbers. In some cases, it may not be possible to define acceptance sampling parameters as crisp values. These parameters can be expressed by linguistic variables. The fuzzy set theory can be successfully used to cope with the vagueness in these linguistic expressions for acceptance sampling. In this paper, the main distributions of acceptance sampling plans are handled with fuzzy parameters and their acceptance probability functions are derived. Then the characteristic curves of acceptance sampling are examined under fuzziness. Illustrative examples are given.
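One way to picture fuzziness in an operating-characteristic (OC) curve is to treat the fraction nonconforming as interval-valued at a given alpha-cut, so the binomial acceptance probability becomes a band rather than a point. The plan parameters and the fuzzy interval below are illustrative assumptions, not the paper's worked examples.

```python
import numpy as np
from scipy.stats import binom

n, c = 80, 2                       # sampling plan: sample size n, acceptance number c

def oc(p):
    # Binomial operating-characteristic function: P(accept) = P(X <= c).
    return binom.cdf(c, n, p)

# Fuzzy fraction nonconforming "about 0.02": at a chosen alpha-cut it is
# the interval [p_lo, p_hi], so P(accept) becomes an interval too.
p_lo, p_hi = 0.015, 0.025
pa_band = (oc(p_hi), oc(p_lo))     # OC decreases in p, so the bounds swap

# Crisp OC curve over a range of quality levels, for comparison.
p_grid = np.linspace(0.001, 0.15, 100)
oc_curve = oc(p_grid)
```

Sweeping the alpha-cut from 0 to 1 shrinks the interval [p_lo, p_hi] toward the crisp value and traces out the fuzzy OC band examined in the record above.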

  10. Application of GIXF to forensic samples

    International Nuclear Information System (INIS)

    Ninomiya, Toshio; Nomura, Shigeaki; Taniguchi, Kazuo; Ikeda, Shigero.

    1995-01-01

    Grazing incidence X-ray fluorescence analysis (GIXF) has been applied to forensic samples: a counterfeit 100-dollar bill, fragments of polyvinyl tapes, a trace of semen, illegal drugs, fingerprints and fake V.S.O.P cognacs. Sr could not be detected on the magnet-responsive lettering of the counterfeit bill and Br was detected on the magnet-nonresponsive part of the counterfeit bill, while neither phenomenon was observed on a genuine bill. Zn was detected as a characteristic ingredient in a trace of semen. Br was detected in what were claimed to be pure methamphetamine crystals, and K, Ca, Fe, Zn etc. were detected in heroin powders. Pb was sharply detected in gunshot residues attached to the finger after gun-firing. Sulphur was abundant as a contaminant in fake V.S.O.P cognacs, whereas no S was detected in genuine V.S.O.P cognacs. (author)

  11. Application of GIXF to forensic samples

    Energy Technology Data Exchange (ETDEWEB)

    Ninomiya, Toshio [Hyogo Prefecture, Kobe (Japan). Forensic Science Lab.; Nomura, Shigeaki; Taniguchi, Kazuo; Ikeda, Shigero

    1995-03-01

    Grazing incidence X-ray fluorescence analysis (GIXF) has been applied to forensic samples: a counterfeit 100-dollar bill, fragments of polyvinyl tapes, a trace of semen, illegal drugs, fingerprints and fake V.S.O.P cognacs. Sr could not be detected on the magnet-responsive lettering of the counterfeit bill and Br was detected on the magnet-nonresponsive part of the counterfeit bill, while neither phenomenon was observed on a genuine bill. Zn was detected as a characteristic ingredient in a trace of semen. Br was detected in what were claimed to be pure methamphetamine crystals, and K, Ca, Fe, Zn etc. were detected in heroin powders. Pb was sharply detected in gunshot residues attached to the finger after gun-firing. Sulphur was abundant as a contaminant in fake V.S.O.P cognacs, whereas no S was detected in genuine V.S.O.P cognacs. (author).

  12. Application of DNA-DNA colony hybridization to the detection of catabolic genotypes in environmental samples

    International Nuclear Information System (INIS)

    Sayler, G.S.; Shields, M.S.; Tedford, E.T.; Breen, A.; Hooper, S.W.; Sirotkin, K.M.; Davis, J.W.

    1985-01-01

    The application of preexisting DNA hybridization techniques was investigated for potential in determining populations of specific gene sequences in environmental samples. Cross-hybridizations among two degradative plasmids, TOL and NAH, and two cloning vehicles, pLAFR1 and RSF1010, were determined. The detection limits for the TOL plasmid against a nonhomologous plasmid-bearing bacterial background was ascertained. The colony hybridization technique allowed detection of one colony containing TOL plasmid among 10(6) Escherichia coli colonies of nonhomologous DNA. Comparisons between population estimates derived from growth on selective substrates and from hybridizations were examined. Findings indicated that standard sole carbon source enumeration procedures for degradative populations lead to overestimations due to nonspecific growth of other bacteria on the microcontaminant carbon sources present in the media. Population estimates based on the selective growth of a microcosm population on two aromatic substrates (toluene and naphthalene) and estimates derived from DNA-DNA colony hybridizations, using the TOL or NAH plasmid as a probe, corresponded with estimates of substrate mineralization rates and past exposure to environmental contaminants. The applications of such techniques are hoped to eventually allow enumeration of any specific gene sequences in the environment, including both anabolic and catabolic genes. In addition, this procedure should prove useful in monitoring recombinant DNA clones released into environmental situations

  13. Green approaches in sample preparation of bioanalytical samples prior to chromatographic analysis.

    Science.gov (United States)

    Filippou, Olga; Bitas, Dimitrios; Samanidou, Victoria

    2017-02-01

    Sample preparation is considered to be the most challenging step of the analytical procedure, since it has an effect on the whole analytical methodology; therefore it contributes significantly to the greenness, or lack of it, of the entire process. Eliminating sample treatment steps while reducing the amount of sample required, strongly cutting the consumption of hazardous reagents and energy, maximizing safety for operators and the environment, and avoiding large volumes of organic solvents form the basis for greening sample preparation and analytical methods. In the last decade, the development and utilization of greener and sustainable microextraction techniques has emerged as an alternative to classical sample preparation procedures. In this review, the main green microextraction techniques (solid phase microextraction, stir bar sorptive extraction, hollow-fiber liquid phase microextraction, dispersive liquid-liquid microextraction, etc.) will be presented, with special attention to bioanalytical applications of these environment-friendly sample preparation techniques which comply with the green analytical chemistry principles. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Investigation and Applications of In-Source Oxidation in Liquid Sampling-Atmospheric Pressure Afterglow Microplasma Ionization (LS-APAG) Source.

    Science.gov (United States)

    Xie, Xiaobo; Wang, Zhenpeng; Li, Yafeng; Zhan, Lingpeng; Nie, Zongxiu

    2017-06-01

    A liquid sampling-atmospheric pressure afterglow microplasma ionization (LS-APAG) source is presented for the first time, which is embedded with both electrospray ionization (ESI) and atmospheric pressure afterglow microplasma ionization (APAG) techniques. This ion source is capable of analyzing compounds with diverse molecular weights and polarities. An unseparated mixture sample was detected as a proof-of-concept, giving complementary information (covering both polar and nonpolar species) with the two ionization modes. It should also be noted that molecular mass can be quickly identified by ESI with clean and simple spectra, while the structure can be directly studied using APAG with in-source oxidation. The ionization/oxidation mechanism and applications of the LS-APAG source have been further explored in the analysis of nonpolar alkanes and unsaturated fatty acids/esters. A unique [M + O - 3H]+ was observed in the case of individual alkanes (C5-C19) and complex hydrocarbon mixtures under optimized conditions. Moreover, branched alkanes generated significant in-source fragments, which could be further applied to the discrimination of isomeric alkanes. The technique also facilitates facile determination of double bond positions in unsaturated fatty acids/esters due to diagnostic fragments (the acid/ester-containing aldehyde and acid oxidation products) generated by on-line ozonolysis in APAG mode. Finally, some examples of in situ APAG analysis by gas sampling and surface sampling were given as well.

  15. Critical evaluation of sample pretreatment techniques.

    Science.gov (United States)

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  16. Optimization and application of octadecyl-modified monolithic silica for solid-phase extraction of drugs in whole blood samples.

    Science.gov (United States)

    Namera, Akira; Saito, Takeshi; Ota, Shigenori; Miyazaki, Shota; Oikawa, Hiroshi; Murata, Kazuhiro; Nagao, Masataka

    2017-09-29

    Monolithic silica in MonoSpin for solid-phase extraction of drugs from whole blood samples was developed to facilitate high-throughput analysis. Monolithic silicas of various pore sizes and octadecyl contents were synthesized, and their effects on recovery rates were evaluated. The silica monolith M18-200 (20 μm through-pore size, 10.4 nm mesopore size, and 17.3% carbon content) achieved the best recovery of the target analytes in whole blood samples. The extraction proceeded with centrifugal force at 1000 rpm for 2 min, and the eluate was directly injected into the liquid chromatography-mass spectrometry system without any tedious steps such as evaporation of extraction solvents. Under the optimized conditions, low detection limits of 0.5-2.0 ng/mL and calibration ranges up to 1000 ng/mL were obtained. The recoveries of the target drugs in whole blood were 76-108%, with relative standard deviations of less than 14.3%. These results indicate that the developed method based on monolithic silica is convenient, highly efficient, and applicable for detecting drugs in whole blood samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Urine sample preparation for proteomic analysis.

    Science.gov (United States)

    Olszowy, Pawel; Buszewski, Boguslaw

    2014-10-01

    Sample preparation for both environmental and, more importantly, biological matrices is a bottleneck of all kinds of analytical processes. In the case of proteomic analysis this step is even more important because of the number of side reactions that must be taken into consideration: the introduction of new post-translational modifications, protein hydrolysis, or even protein degradation can occur as side effects of sample processing. If protocols are evaluated appropriately, identification of such proteins poses no difficulty; however, if structural changes occur without sufficient attention, protein sequence coverage will be reduced or identification of such proteins may even become impossible. This review summarizes obstacles and achievements in the preparation of urine protein samples for proteome analysis using different mass spectrometry tools. The main aim is to present comprehensively the value of urine as a matrix, with emphasis on sample preparation and the application of urine mainly in the discovery of novel cancer biomarkers. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. DNA-based species identification for faecal samples: An application ...

    African Journals Online (AJOL)

    ... An application on the mammalian survey in Mountain Huangshan Scenic Spot. ... Noninvasive methods using genetic markers have been suggested as ways to ... molecular identification are the useful supplements for traditional field survey.

  19. Design unbiased estimation in line intersect sampling using segmented transects

    Science.gov (United States)

    David L.R. Affleck; Timothy G. Gregoire; Harry T. Valentine; Harry T. Valentine

    2005-01-01

    In many applications of line intersect sampling, transects consist of multiple, connected segments in a prescribed configuration. The relationship between the transect configuration and the selection probability of a population element is illustrated, and a consistent sampling protocol, applicable to populations composed of arbitrarily shaped elements, is proposed. It...

  20. Estimation after classification using lot quality assurance sampling: corrections for curtailed sampling with application to evaluating polio vaccination campaigns.

    Science.gov (United States)

    Olives, Casey; Valadez, Joseph J; Pagano, Marcello

    2014-03-01

    To assess the bias incurred when curtailment of Lot Quality Assurance Sampling (LQAS) is ignored, to present unbiased estimators, to consider the impact of cluster sampling by simulation and to apply our method to published polio immunization data from Nigeria. We present estimators of coverage when using two kinds of curtailed LQAS strategies: semicurtailed and curtailed. We study the proposed estimators with independent and clustered data using three field-tested LQAS designs for assessing polio vaccination coverage, with samples of size 60 and decision rules of 9, 21 and 33, and compare them to biased maximum likelihood estimators. Lastly, we present estimates of polio vaccination coverage from previously published data in 20 local government authorities (LGAs) from five Nigerian states. Simulations illustrate substantial bias if one ignores the curtailed sampling design. Proposed estimators show no bias. Clustering does not affect the bias of these estimators. Across simulations, standard errors show signs of inflation as clustering increases. Neither sampling strategy nor LQAS design influences estimates of polio vaccination coverage in 20 Nigerian LGAs. When coverage is low, semicurtailed LQAS strategies considerably reduce the sample size required to make a decision. Curtailed LQAS designs further reduce the sample size when coverage is high. Results presented dispel the misconception that curtailed LQAS data are unsuitable for estimation. These findings augment the utility of LQAS as a tool for monitoring vaccination efforts by demonstrating that unbiased estimation using curtailed designs is not only possible but these designs also reduce the sample size. © 2014 John Wiley & Sons Ltd.
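    The sample-size saving claimed for curtailed designs is easy to illustrate. The sketch below simulates a fully curtailed LQAS run (sample size 60, decision rule 33) that stops as soon as the pass/fail classification is determined; the parameters mirror the abstract's largest design, but the simulation itself is only an illustrative sketch, not the authors' estimator.

```python
import random

def curtailed_lqas(p, n=60, d=33, rng=random):
    """One fully curtailed LQAS sequence: sample subjects one at a time
    and stop as soon as the pass/fail classification is determined.
    Decision rule: 'pass' if at least d of n subjects are covered."""
    max_failures = n - d  # one more failure than this forces 'fail'
    successes = failures = 0
    while successes < d and failures <= max_failures:
        if rng.random() < p:
            successes += 1
        else:
            failures += 1
    return successes, successes + failures

rng = random.Random(42)
p_true = 0.8  # high coverage: a 'pass' decision is usually reached early
runs = [curtailed_lqas(p_true, rng=rng) for _ in range(5000)]
mean_n = sum(trials for _, trials in runs) / len(runs)
print(f"mean stopped sample size: {mean_n:.1f} (full design: 60)")
```

With high coverage the decision is typically reached well before all 60 subjects are sampled, which is the saving the abstract describes; the naive proportion at stopping is exactly the biased quantity the paper's estimators correct.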

  1. Selenium isotope studies in plants. Development and validation of a novel geochemical tool and its application to organic samples

    Energy Technology Data Exchange (ETDEWEB)

    Banning, Helena

    2016-03-12

    Selenium (Se), being both an essential nutrient and a toxin, enters the food chain mainly via plants. Selenium isotope signatures have proved to be an excellent redox tracer, making them a promising tool for exploring the Se cycle in plants. The analytical method is sensitive to the organic sample matrix and requires dedicated preparation methods, which were developed and validated in this study. Plant cultivation setups demonstrated the applicability of these methods for tracing plant-internal processes.

  2. A large-scale cryoelectronic system for biological sample banking

    Science.gov (United States)

    Shirley, Stephen G.; Durst, Christopher H. P.; Fuchs, Christian C.; Zimmermann, Heiko; Ihmig, Frank R.

    2009-11-01

    We describe a polymorphic electronic infrastructure for managing biological samples stored over liquid nitrogen. As part of this system we have developed new cryocontainers and carrier plates with attached Flash memory chips, so that a redundant and portable set of data travels with each sample. Our experimental investigations show that basic Flash operation and endurance are adequate for the application down to liquid nitrogen temperatures. This identification technology provides sample identification, documentation, and tracking that bring added value to each sample. The first application of the system is in a worldwide collaborative research effort towards the production of an AIDS vaccine. The functionality and versatility of the system can lead to an essential optimization of sample and data exchange for global clinical studies.

  3. Detection of Pseudomonas aeruginosa in sputum samples by ...

    African Journals Online (AJOL)

    samples obtained from CF patients may impede detection of microorganisms by FISH. The aim of this study was to test the application of biotin during FISH technique to reduce unspecific background fluorescence in sputum samples to facilitate and improve detection of P. aeruginosa. Sixty-three sputum samples from CF ...

  4. A user-friendly robotic sample preparation program for fully automated biological sample pipetting and dilution to benefit the regulated bioanalysis.

    Science.gov (United States)

    Jiang, Hao; Ouyang, Zheng; Zeng, Jianing; Yuan, Long; Zheng, Naiyu; Jemal, Mohammed; Arnold, Mark E

    2012-06-01

    Biological sample dilution is a rate-limiting step in bioanalytical sample preparation when the concentrations of samples are beyond standard curve ranges, especially when multiple dilution factors are needed in an analytical run. We have developed and validated a Microsoft Excel-based robotic sample preparation program (RSPP) that automatically transforms Watson worklist sample information (identification, sequence and dilution factor) to comma-separated value (CSV) files. The Freedom EVO liquid handler software imports and transforms the CSV files to executable worklists (.gwl files), allowing the robot to perform sample dilutions at variable dilution factors. The dynamic dilution range is 1- to 1000-fold and divided into three dilution steps: 1- to 10-, 11- to 100-, and 101- to 1000-fold. The whole process, including pipetting samples, diluting samples, and adding internal standard(s), is accomplished within 1 h for two racks of samples (96 samples/rack). This platform also supports online sample extraction (liquid-liquid extraction, solid-phase extraction, protein precipitation, etc.) using 96 multichannel arms. This fully automated and validated sample dilution and preparation process has been applied to several drug development programs. The results demonstrate that application of the RSPP for fully automated sample processing is efficient and rugged. The RSPP not only saved more than 50% of the time in sample pipetting and dilution but also reduced human errors. The generated bioanalytical data are accurate and precise; therefore, this application can be used in regulated bioanalysis.
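    The three-tier scheme described above (one step for 1- to 10-fold, two for 11- to 100-fold, three for 101- to 1000-fold) amounts to decomposing the overall dilution factor into serial steps of at most 10-fold each. The helper below is a hypothetical illustration of that decomposition, not code from the RSPP itself:

```python
def dilution_steps(factor):
    """Split an overall dilution factor into at most three serial steps
    of at most 10x each, mirroring the tiered scheme in the abstract:
    1-10x in one step, 11-100x in two, 101-1000x in three.
    (Hypothetical helper for illustration only.)"""
    if not 1 <= factor <= 1000:
        raise ValueError("factor outside the validated 1-1000x range")
    steps = []
    remaining = float(factor)
    while remaining > 10:
        steps.append(10.0)   # full 10-fold intermediate dilution
        remaining /= 10
    steps.append(remaining)  # final partial step
    return steps

print(dilution_steps(250))  # [10.0, 10.0, 2.5]
```

Each returned step is a single pipetting operation; multiplying the steps back together recovers the requested overall factor.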

  5. Development of an Analytical Protocol for Determination of Cyanide in Human Biological Samples Based on Application of Ion Chromatography with Pulsed Amperometric Detection

    Directory of Open Access Journals (Sweden)

    Ewa Jaszczak

    2017-01-01

    A simple and accurate ion chromatography (IC) method with pulsed amperometric detection (PAD) was proposed for the determination of cyanide ion in urine, sweat, and saliva samples. The sample pretreatment relies on alkaline digestion and application of a Dionex OnGuard II H cartridge. Under the optimized conditions, the method showed good linearity in the range of 1-100 μg/L for urine, 5-100 μg/L for saliva, and 3-100 μg/L for sweat samples, with determination coefficients R > 0.992. Low detection limits (LODs) of 1.8 μg/L, 5.1 μg/L, and 5.8 μg/L for urine, saliva, and sweat samples, respectively, and good repeatability (CV < 3%, n = 3) were obtained. The proposed method has been successfully applied to the analysis of human biological samples.

  6. Development of an Analytical Protocol for Determination of Cyanide in Human Biological Samples Based on Application of Ion Chromatography with Pulsed Amperometric Detection.

    Science.gov (United States)

    Jaszczak, Ewa; Ruman, Marek; Narkowicz, Sylwia; Namieśnik, Jacek; Polkowska, Żaneta

    2017-01-01

    A simple and accurate ion chromatography (IC) method with pulsed amperometric detection (PAD) was proposed for the determination of cyanide ion in urine, sweat, and saliva samples. The sample pretreatment relies on alkaline digestion and application of a Dionex OnGuard II H cartridge. Under the optimized conditions, the method showed good linearity in the range of 1-100 μg/L for urine, 5-100 μg/L for saliva, and 3-100 μg/L for sweat samples, with determination coefficients (R) > 0.992. Low detection limits (LODs) of 1.8 μg/L, 5.1 μg/L, and 5.8 μg/L for urine, saliva, and sweat samples, respectively, and good repeatability (CV < 3%, n = 3) were obtained. The proposed method has been successfully applied to the analysis of human biological samples.
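    Figures of merit like these come from an ordinary least-squares calibration line; a detection limit is conventionally taken as 3.3 times the residual standard deviation divided by the slope (the ICH convention). The sketch below uses made-up calibration points, not the paper's data:

```python
# Synthetic calibration points (concentration in ug/L vs. detector
# response); illustrative numbers only, not the paper's data.
conc   = [1, 5, 10, 25, 50, 100]
signal = [2.1, 10.3, 19.8, 50.2, 99.7, 200.5]

# ordinary least-squares slope and intercept
n = len(conc)
mean_x = sum(conc) / n
mean_y = sum(signal) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, signal))
         / sum((x - mean_x) ** 2 for x in conc))
intercept = mean_y - slope * mean_x

# residual standard deviation and an ICH-style detection limit
resid = [y - (intercept + slope * x) for x, y in zip(conc, signal)]
s_res = (sum(r * r for r in resid) / (n - 2)) ** 0.5
lod = 3.3 * s_res / slope
print(f"slope = {slope:.3f}, LOD = {lod:.2f} ug/L")
```

A reported LOD near the bottom of the linear range, as here, is what makes a method usable at trace concentrations.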

  7. 7 CFR 52.38a - Definitions of terms applicable to statistical sampling.

    Science.gov (United States)

    2010-01-01

    ... the number of defects (or defectives), which exceed the sample unit tolerance (“T”), in a series of... accumulation of defects (or defectives) allowed to exceed the sample unit tolerance (“T”) in any sample unit or consecutive group of sample units. (ii) CuSum value. The accumulated number of defects (or defectives) that...

  8. Long-Term Persistence of Pesticides and TPs in Archived Agricultural Soil Samples and Comparison with Pesticide Application.

    Science.gov (United States)

    Chiaia-Hernandez, Aurea C; Keller, Armin; Wächter, Daniel; Steinlin, Christine; Camenzuli, Louise; Hollender, Juliane; Krauss, Martin

    2017-09-19

    For polar and more degradable pesticides, few data on long-term persistence in soil under field conditions and real application practices exist. To assess the persistence of pesticides in soil, a multiple-compound screening method (log Kow 1.7-5.5) was developed based on pressurized liquid extraction, QuEChERS, and LC-HRMS. The method was applied to study 80 polar pesticides and >90 transformation products (TPs) in archived topsoil samples from the Swiss Soil Monitoring Network (NABO) from 1995 to 2008 with known pesticide application patterns. The results reveal large variations between crop types and field sites. For the majority of the sites, 10-15 pesticides were identified, with a detection rate of 45%, at concentrations between 1 and 330 μg/kg dw in soil. Furthermore, TPs were detected in 47% of the cases where the parent compound was applied. Overall, residues of about 80% of all applied pesticides could be detected, with half of these found as TPs, with a persistence of more than a decade.

  9. Two-Sample Tests for High-Dimensional Linear Regression with an Application to Detecting Interactions.

    Science.gov (United States)

    Xia, Yin; Cai, Tianxi; Cai, T Tony

    2018-01-01

    Motivated by applications in genomics, we consider in this paper global and multiple testing for the comparisons of two high-dimensional linear regression models. A procedure for testing the equality of the two regression vectors globally is proposed and shown to be particularly powerful against sparse alternatives. We then introduce a multiple testing procedure for identifying unequal coordinates while controlling the false discovery rate and false discovery proportion. Theoretical justifications are provided to guarantee the validity of the proposed tests and optimality results are established under sparsity assumptions on the regression coefficients. The proposed testing procedures are easy to implement. Numerical properties of the procedures are investigated through simulation and data analysis. The results show that the proposed tests maintain the desired error rates under the null and have good power under the alternative at moderate sample sizes. The procedures are applied to the Framingham Offspring study to investigate the interactions between smoking and cardiovascular related genetic mutations important for an inflammation marker.
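    The global test can be pictured as combining coordinate-wise comparisons of the two fitted coefficient vectors into a max-type statistic. The sketch below is a deliberately simplified low-dimensional stand-in (plain OLS with a Bonferroni-corrected maximum z-statistic); the paper's actual procedure uses debiased high-dimensional estimators and sharper critical values:

```python
import numpy as np
from math import erfc, sqrt

def two_sample_coef_test(X1, y1, X2, y2):
    """Coordinate-wise z-tests for equality of two OLS coefficient
    vectors, combined into a max-type global statistic with a
    Bonferroni p-value. A low-dimensional stand-in for the paper's
    high-dimensional debiased procedure."""
    def ols(X, y):
        XtX_inv = np.linalg.inv(X.T @ X)
        beta = XtX_inv @ X.T @ y
        resid = y - X @ beta
        sigma2 = resid @ resid / (len(y) - X.shape[1])
        return beta, sigma2 * np.diag(XtX_inv)  # estimates, variances

    b1, v1 = ols(X1, y1)
    b2, v2 = ols(X2, y2)
    z = (b1 - b2) / np.sqrt(v1 + v2)
    p_each = [erfc(abs(t) / sqrt(2)) for t in z]  # two-sided normal tails
    return z, min(1.0, len(z) * min(p_each))

rng = np.random.default_rng(0)
n, p = 200, 5
X1, X2 = rng.normal(size=(n, p)), rng.normal(size=(n, p))
beta = np.array([1.0, 0.5, 0.0, -0.5, 1.0])
delta = np.array([0.0, 0.0, 2.0, 0.0, 0.0])  # groups differ in coordinate 2
y1 = X1 @ beta + rng.normal(size=n)
y2 = X2 @ (beta + delta) + rng.normal(size=n)
z, p_global = two_sample_coef_test(X1, y1, X2, y2)
print("most discrepant coordinate:", int(np.argmax(np.abs(z))))
```

The same coordinate-wise z-values also drive the multiple-testing step: flagging the coordinates with the largest |z| is the analogue of the paper's identification of unequal coefficients, e.g. smoking-gene interactions.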

  10. Initial Results from an Energy-Aware Airborne Dynamic, Data-Driven Application System Performing Sampling in Coherent Boundary-Layer Structures

    Science.gov (United States)

    Frew, E.; Argrow, B. M.; Houston, A. L.; Weiss, C.

    2014-12-01

    The energy-aware airborne dynamic, data-driven application system (EA-DDDAS) performs persistent sampling in complex atmospheric conditions by exploiting wind energy using the dynamic data-driven application system paradigm. The main challenge for future airborne sampling missions is operation with tight integration of physical and computational resources over wireless communication networks, in complex atmospheric conditions. The physical resources considered here include sensor platforms, particularly mobile Doppler radar and unmanned aircraft, the complex conditions in which they operate, and the region of interest. Autonomous operation requires distributed computational effort connected by layered wireless communication. Onboard decision-making and coordination algorithms can be enhanced by atmospheric models that assimilate input from physics-based models and wind fields derived from multiple sources. These models are generally too complex to be run onboard the aircraft, so they need to be executed in ground vehicles in the field and connected over broadband or other wireless links back to the aircraft. Finally, the wind field environment drives strong interaction between the computational and physical systems, both as a challenge to autonomous path planning algorithms and as a novel energy source that can be exploited to improve system range and endurance. Implementation details of a complete EA-DDDAS will be provided, along with preliminary flight test results targeting coherent boundary-layer structures.

  11. Quantitative sample preparation of some heavy elements

    International Nuclear Information System (INIS)

    Jaffey, A.H.

    1977-01-01

    A discussion is given of some techniques that have been useful in quantitatively preparing and analyzing samples used in the half-life determinations of some plutonium and uranium isotopes. Application of these methods to the preparation of uranium and plutonium samples used in neutron experiments is discussed.

  12. Radioisotope Sample Measurement Techniques in Medicine and Biology. Proceedings of the Symposium on Radioisotope Sample Measurement Techniques

    International Nuclear Information System (INIS)

    1965-01-01

    The medical and biological applications of radioisotopes depend on two basically different types of measurements, those on living subjects in vivo and those on samples in vitro. The International Atomic Energy Agency has in the past held several meetings on in vivo measurement techniques, notably whole-body counting and radioisotope scanning. The present volume contains the Proceedings of the first Symposium the Agency has organized to discuss the various aspects of techniques for sample measurement in vitro. The range of these sample measurement techniques is very wide. The sample may weigh a few milligrams or several hundred grams, and may be in the gaseous, liquid or solid state. Its radioactive content may consist of a single, known radioisotope or several unknown ones. The concentration of radioactivity may be low, medium or high. The measurements may be made manually or automatically and any one of the many radiation detectors now available may be used. The 53 papers presented at the Symposium illustrate the great variety of methods now in use for radioactive-sample measurements. The first topic discussed is gamma-ray spectrometry, which finds an increasing number of applications in sample measurements. Other sections of the Proceedings deal with: the use of computers in gamma-ray spectrometry and multiple tracer techniques; recent developments in activation analysis where both gamma-ray spectrometry and computing techniques are applied; thin-layer and paper radiochromatographic techniques for use with low-energy beta-ray emitters; various aspects of liquid scintillation counting techniques in the measurement of alpha- and beta-ray emitters, including chemical and colour quenching; autoradiographic techniques; calibration of equipment; and standardization of radioisotopes. Finally, some applications of solid-state detectors are presented; this section may be regarded as a preview of important future developments. The meeting was attended by 203 participants.

  13. An Optimization Study on Listening Experiments to Improve the Comparability of Annoyance Ratings of Noise Samples from Different Experimental Sample Sets.

    Science.gov (United States)

    Di, Guoqing; Lu, Kuanguang; Shi, Xiaofan

    2018-03-08

    Annoyance ratings obtained from listening experiments are widely used in studies on the health effects of environmental noise. In listening experiments, participants usually give the annoyance rating of each noise sample according to its relative annoyance degree among all samples in the experimental sample set if there are no reference sound samples, which leads to poor comparability between experimental results obtained from different experimental sample sets. To solve this problem, this study proposed adding several pink noise samples with certain loudness levels into experimental sample sets as reference sound samples. On this basis, the standard curve between logarithmic mean annoyance and loudness level of pink noise was used to calibrate the experimental results, and the calibration procedures were described in detail. Furthermore, as a case study, six different types of noise sample sets were selected for listening experiments using this method to examine its applicability. Results showed that the differences in the annoyance ratings of each identical noise sample from different experimental sample sets were markedly decreased after calibration. The determination coefficient (R²) of linear fitting functions between psychoacoustic annoyance (PA) and mean annoyance (MA) of noise samples from different experimental sample sets increased obviously after calibration. The case study indicated that the method above is applicable to calibrating annoyance ratings obtained from different types of noise sample sets. After calibration, the comparability of annoyance ratings of noise samples from different experimental sample sets can be distinctly improved.
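    The calibration idea (map each session's ratings of the pink-noise reference samples onto their standard-curve values, and carry every other rating along with them) can be sketched as a fitted affine transform. The function and numbers below are illustrative only; the paper calibrates against a standard curve of logarithmic mean annoyance versus loudness level:

```python
def calibrate_ratings(ratings, ref_ids, standard_values):
    """Map raw annoyance ratings onto a common scale using the ratings
    given to reference (pink-noise) samples. Fits the least-squares
    affine transform sending this session's reference ratings to their
    standard-curve values, then applies it to every rating.
    `standard_values[i]` is the standard annoyance of reference i.
    (Illustrative sketch, not the paper's exact procedure.)"""
    xs = [ratings[i] for i in ref_ids]
    ys = standard_values
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return [a * r + b for r in ratings]

# This session rated three pink-noise references (indices 0-2) at 2, 4, 6,
# while the standard curve says they should score 3, 5, 7.
raw = [2.0, 4.0, 6.0, 3.5, 5.5]
cal = calibrate_ratings(raw, [0, 1, 2], [3.0, 5.0, 7.0])
print(cal)  # references land on 3, 5, 7; the other ratings shift with them
```

Because every session is anchored to the same references, ratings of the non-reference samples become comparable across sample sets.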

  14. Guidance for air sampling at nuclear facilities

    International Nuclear Information System (INIS)

    Breslin, A.J.

    1976-11-01

    The principal uses of air sampling at nuclear facilities are to monitor general levels of radioactive air contamination, identify sources of air contamination, evaluate the effectiveness of contaminant control equipment, determine exposures of individual workers, and provide automatic warning of hazardous concentrations of radioactivity. These applications of air sampling are discussed with respect to standards of occupational exposure, instrumentation, sample analysis, sampling protocol, and statistical treatment of concentration data. Emphasis is given to the influence of spatial and temporal variations of radionuclide concentration on the location, duration, and frequency of air sampling.

  15. Unbiased Sampling and Meshing of Isosurfaces

    KAUST Repository

    Yan, Dongming

    2014-05-07

    In this paper, we present a new technique to generate unbiased samples on isosurfaces. An isosurface, F(x, y, z) = c, of a function F is implicitly defined by trilinear interpolation of background grid points. The key idea of our approach is that of treating the isosurface within a grid cell as a graph (height) function in one of the three coordinate-axis directions, restricted to where the slope is not too high, and integrating/sampling from each of these three. We use this unbiased sampling algorithm for applications in Monte Carlo integration, Poisson-disk sampling, and isosurface meshing.

  16. Unbiased Sampling and Meshing of Isosurfaces

    KAUST Repository

    Yan, Dongming; Wallner, Johannes; Wonka, Peter

    2014-01-01

    In this paper, we present a new technique to generate unbiased samples on isosurfaces. An isosurface, F(x, y, z) = c, of a function F is implicitly defined by trilinear interpolation of background grid points. The key idea of our approach is that of treating the isosurface within a grid cell as a graph (height) function in one of the three coordinate-axis directions, restricted to where the slope is not too high, and integrating/sampling from each of these three. We use this unbiased sampling algorithm for applications in Monte Carlo integration, Poisson-disk sampling, and isosurface meshing.

  17. An improved sampling method of complex network

    Science.gov (United States)

    Gao, Qi; Ding, Xintong; Pan, Feng; Li, Weixing

    2014-12-01

    Sampling subnets is an important topic in complex network research, and the sampling method influences the structure and characteristics of the resulting subnet. Random multiple snowball with Cohen (RMSC) process sampling, which combines the advantages of random sampling and snowball sampling, is proposed in this paper. It can explore global information and discover local structure at the same time. Experiments indicate that this novel sampling method preserves the similarity between the sampled subnet and the original network in degree distribution, connectivity rate, and average shortest path. The method is applicable when prior knowledge about the degree distribution of the original network is insufficient.
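    A generic multiple-snowball sampler (several random seeds, each frontier expanded breadth-first through randomised neighbour lists) can be sketched as follows. This shows only the snowball component; the paper's RMSC method additionally interleaves random (Cohen-style) node selection:

```python
import random

def multi_snowball(adj, n_seeds, target, seed=0):
    """Multiple-snowball subnet sampling: start from several random
    seed nodes and grow each frontier breadth-first through randomised
    neighbour lists until `target` nodes have been collected.
    `adj` maps each node to the set of its neighbours."""
    rng = random.Random(seed)
    sampled = set(rng.sample(list(adj), n_seeds))
    frontier = list(sampled)
    while frontier and len(sampled) < target:
        current = frontier.pop(0)
        neighbours = [v for v in adj[current] if v not in sampled]
        rng.shuffle(neighbours)
        for v in neighbours:
            if len(sampled) >= target:
                break
            sampled.add(v)
            frontier.append(v)
    return sampled

# toy 20-node ring with symmetric chords
adj = {i: {(i - 1) % 20, (i + 1) % 20, (i + 5) % 20, (i - 5) % 20}
       for i in range(20)}
sub = multi_snowball(adj, n_seeds=2, target=10)
print(len(sub))  # 10
```

Comparing the degree distribution of `sub`'s induced subgraph with the original network's is then how similarity of the kind measured in the paper would be assessed.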

  18. On Invertible Sampling and Adaptive Security

    DEFF Research Database (Denmark)

    Ishai, Yuval; Kumarasubramanian, Abishek; Orlandi, Claudio

    2011-01-01

    functionalities was left open. We provide the first convincing evidence that the answer to this question is negative, namely that some (randomized) functionalities cannot be realized with adaptive security. We obtain this result by studying the following related invertible sampling problem: given an efficient sampling algorithm A, obtain another sampling algorithm B such that the output of B is computationally indistinguishable from the output of A, but B can be efficiently inverted (even if A cannot). This invertible sampling problem is independently motivated by other cryptographic applications. We show, under strong but well studied assumptions, that there exist efficient sampling algorithms A for which invertible sampling as above is impossible. At the same time, we show that a general feasibility result for adaptively secure MPC implies that invertible sampling is possible for every A, thereby...

  19. Teknik Sampling Snowball dalam Penelitian Lapangan

    Directory of Open Access Journals (Sweden)

    Nina Nurdiani

    2014-12-01

    Field research can draw on both qualitative and quantitative research methods, depending on the problems faced and the goals to be achieved. Successful data collection in field research depends on choosing the appropriate sampling technique, so as to obtain accurate and reliable data. Studies that address problems related to specific issues require a non-probability sampling technique, one of which is snowball sampling. This technique is useful for finding, identifying, selecting, and sampling within a network or chain of relationships. The procedure is implemented in phases through interviews and questionnaires. Snowball sampling has strengths and weaknesses in its application. Field research in the housing sector serves as the case study illustrating this sampling technique.

  20. ACTIVE LEARNING TO OVERCOME SAMPLE SELECTION BIAS: APPLICATION TO PHOTOMETRIC VARIABLE STAR CLASSIFICATION

    Energy Technology Data Exchange (ETDEWEB)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Butler, Nathaniel R.; Berian James, J. [Astronomy Department, University of California, Berkeley, CA 94720-7450 (United States); Brink, Henrik [Dark Cosmology Centre, Juliane Maries Vej 30, 2100 Copenhagen O (Denmark); Long, James P.; Rice, John, E-mail: jwrichar@stat.berkeley.edu [Statistics Department, University of California, Berkeley, CA 94720-7450 (United States)

    2012-01-10

    Despite the great promise of machine-learning algorithms to classify and predict astrophysical parameters for the vast numbers of astrophysical sources and transients observed in large-scale surveys, the peculiarities of the training data often manifest as strongly biased predictions on the data of interest. Typically, training sets are derived from historical surveys of brighter, more nearby objects than those from more extensive, deeper surveys (testing data). This sample selection bias can cause catastrophic errors in predictions on the testing data because (1) standard assumptions for machine-learned model selection procedures break down and (2) dense regions of testing space might be completely devoid of training data. We explore possible remedies to sample selection bias, including importance weighting, co-training, and active learning (AL). We argue that AL, where the data whose inclusion in the training set would most improve predictions on the testing set are queried for manual follow-up, is an effective approach and is appropriate for many astronomical applications. For a variable star classification problem on a well-studied set of stars from Hipparcos and Optical Gravitational Lensing Experiment, AL is the optimal method in terms of error rate on the testing data, beating the off-the-shelf classifier by 3.4% and the other proposed methods by at least 3.0%. To aid with manual labeling of variable stars, we developed a Web interface which allows for easy light curve visualization and querying of external databases. Finally, we apply AL to classify variable stars in the All Sky Automated Survey, finding dramatic improvement in our agreement with the ASAS Catalog of Variable Stars, from 65.5% to 79.5%, and a significant increase in the classifier's average confidence for the testing set, from 14.6% to 42.9%, after a few AL iterations.
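    The AL loop itself (query the pool point the current model is least sure about, label it, retrain) is compact. The sketch below uses a toy nearest-centroid classifier with a distance-margin uncertainty score; the paper instead uses random-forest classifiers on light-curve features, with human experts as the labeling oracle:

```python
import random

def centroid(points):
    dims = len(points[0])
    return tuple(sum(p[k] for p in points) / len(points) for k in range(dims))

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def active_learn(labeled, pool, oracle, n_queries):
    """Pool-based uncertainty sampling with a toy nearest-centroid
    classifier: at each step, query the pool point whose distances to
    the class centroids are most similar (smallest margin), get its
    label from `oracle`, and retrain on the enlarged training set."""
    labeled = dict(labeled)  # point -> class label
    pool = set(pool)
    for _ in range(n_queries):
        cents = [centroid([pt for pt, lab in labeled.items() if lab == c])
                 for c in sorted(set(labeled.values()))]
        def margin(pt):
            d = sorted(dist2(pt, c) for c in cents)
            return d[1] - d[0]  # small margin = model is unsure
        query = min(pool, key=margin)
        labeled[query] = oracle(query)
        pool.discard(query)
    return labeled

random.seed(1)
oracle = lambda pt: 0 if pt[0] < 0.5 else 1       # ground-truth labeler
labeled = {(0.1, 0.1): 0, (0.9, 0.9): 1}          # tiny initial training set
pool = [(random.random(), random.random()) for _ in range(50)]
model = active_learn(labeled, pool, oracle, n_queries=5)
print(len(model))  # 7 labeled points after five queries
```

Querying near-margin points is what directs labeling effort to the regions of feature space where the training set is least informative, which is the remedy for sample selection bias the abstract argues for.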

  1. ACTIVE LEARNING TO OVERCOME SAMPLE SELECTION BIAS: APPLICATION TO PHOTOMETRIC VARIABLE STAR CLASSIFICATION

    International Nuclear Information System (INIS)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Butler, Nathaniel R.; Berian James, J.; Brink, Henrik; Long, James P.; Rice, John

    2012-01-01

    Despite the great promise of machine-learning algorithms to classify and predict astrophysical parameters for the vast numbers of astrophysical sources and transients observed in large-scale surveys, the peculiarities of the training data often manifest as strongly biased predictions on the data of interest. Typically, training sets are derived from historical surveys of brighter, more nearby objects than those from more extensive, deeper surveys (testing data). This sample selection bias can cause catastrophic errors in predictions on the testing data because (1) standard assumptions for machine-learned model selection procedures break down and (2) dense regions of testing space might be completely devoid of training data. We explore possible remedies to sample selection bias, including importance weighting, co-training, and active learning (AL). We argue that AL—where the data whose inclusion in the training set would most improve predictions on the testing set are queried for manual follow-up—is an effective approach and is appropriate for many astronomical applications. For a variable star classification problem on a well-studied set of stars from Hipparcos and Optical Gravitational Lensing Experiment, AL is the optimal method in terms of error rate on the testing data, beating the off-the-shelf classifier by 3.4% and the other proposed methods by at least 3.0%. To aid with manual labeling of variable stars, we developed a Web interface which allows for easy light curve visualization and querying of external databases. Finally, we apply AL to classify variable stars in the All Sky Automated Survey, finding dramatic improvement in our agreement with the ASAS Catalog of Variable Stars, from 65.5% to 79.5%, and a significant increase in the classifier's average confidence for the testing set, from 14.6% to 42.9%, after a few AL iterations.

  2. Active Learning to Overcome Sample Selection Bias: Application to Photometric Variable Star Classification

    Science.gov (United States)

    Richards, Joseph W.; Starr, Dan L.; Brink, Henrik; Miller, Adam A.; Bloom, Joshua S.; Butler, Nathaniel R.; James, J. Berian; Long, James P.; Rice, John

    2012-01-01

    Despite the great promise of machine-learning algorithms to classify and predict astrophysical parameters for the vast numbers of astrophysical sources and transients observed in large-scale surveys, the peculiarities of the training data often manifest as strongly biased predictions on the data of interest. Typically, training sets are derived from historical surveys of brighter, more nearby objects than those from more extensive, deeper surveys (testing data). This sample selection bias can cause catastrophic errors in predictions on the testing data because (1) standard assumptions for machine-learned model selection procedures break down and (2) dense regions of testing space might be completely devoid of training data. We explore possible remedies to sample selection bias, including importance weighting, co-training, and active learning (AL). We argue that AL—where the data whose inclusion in the training set would most improve predictions on the testing set are queried for manual follow-up—is an effective approach and is appropriate for many astronomical applications. For a variable star classification problem on a well-studied set of stars from Hipparcos and Optical Gravitational Lensing Experiment, AL is the optimal method in terms of error rate on the testing data, beating the off-the-shelf classifier by 3.4% and the other proposed methods by at least 3.0%. To aid with manual labeling of variable stars, we developed a Web interface which allows for easy light curve visualization and querying of external databases. Finally, we apply AL to classify variable stars in the All Sky Automated Survey, finding dramatic improvement in our agreement with the ASAS Catalog of Variable Stars, from 65.5% to 79.5%, and a significant increase in the classifier's average confidence for the testing set, from 14.6% to 42.9%, after a few AL iterations.

  3. Application of a DNA-based luminescence switch-on method for the detection of mercury(II) ions in water samples from Hong Kong

    Science.gov (United States)

    He, Hong-Zhang; Leung, Ka-Ho; Fu, Wai-Chung; Shiu-Hin Chan, Daniel; Leung, Chung-Hang; Ma, Dik-Lung

    2012-12-01

    Mercury is a highly toxic environmental contaminant that damages the endocrine and central nervous systems. In view of the contamination of Hong Kong territorial waters with anthropogenic pollutants such as trace heavy metals, we have investigated the application of our recently developed DNA-based luminescence methodology for the rapid and sensitive detection of mercury(II) ions in real water samples. The assay was applied to water samples from Shing Mun River, Nam Sang Wai and Lamma Island sea water, representing natural river, wetland and sea water media, respectively. The results showed that the system could function effectively in real water samples under conditions of low turbidity and low metal ion concentrations. However, high turbidity and high metal ion concentrations increased the background signal and reduced the performance of this assay.

  4. Enhanced, targeted sampling of high-dimensional free-energy landscapes using variationally enhanced sampling, with an application to chignolin.

    Science.gov (United States)

    Shaffer, Patrick; Valsson, Omar; Parrinello, Michele

    2016-02-02

The capabilities of molecular simulations have been greatly extended by a number of widely used enhanced sampling methods that facilitate escaping from metastable states and crossing large barriers. Despite these developments, there are still many problems that remain out of reach for these methods, which has led to a vigorous effort in this area. One of the most important problems that remains unsolved is sampling high-dimensional free-energy landscapes and systems that are not easily described by a small number of collective variables. In this work we demonstrate a new way to compute free-energy landscapes of high dimensionality based on the previously introduced variationally enhanced sampling, and we apply it to the miniprotein chignolin.

  5. Enhanced, targeted sampling of high-dimensional free-energy landscapes using variationally enhanced sampling, with an application to chignolin

    Science.gov (United States)

    Shaffer, Patrick; Valsson, Omar; Parrinello, Michele

    2016-01-01

The capabilities of molecular simulations have been greatly extended by a number of widely used enhanced sampling methods that facilitate escaping from metastable states and crossing large barriers. Despite these developments, there are still many problems that remain out of reach for these methods, which has led to a vigorous effort in this area. One of the most important problems that remains unsolved is sampling high-dimensional free-energy landscapes and systems that are not easily described by a small number of collective variables. In this work we demonstrate a new way to compute free-energy landscapes of high dimensionality based on the previously introduced variationally enhanced sampling, and we apply it to the miniprotein chignolin. PMID:26787868

  6. Research and application of sampling and analysis method of sodium aerosol

    International Nuclear Information System (INIS)

    Yu Xiaochen; Guo Qingzhou; Wen Ximeng

    1998-01-01

A method for the sampling and analysis of sodium aerosol was developed. Vacuum sampling technology is used in the sampling process, and the analysis methods adopted are volumetric analysis and atomic absorption. When the absolute content of sodium is in the range of 0.1 mg to 1.0 mg, the deviation between the results of volumetric analysis and atomic absorption is less than 2%. The method has been applied successfully in a sodium aerosol removal device. The analysis range, accuracy and precision meet the requirements for sodium aerosol research.

  7. Application of ICP-MS in Environmental Sampling Analysis for Safeguards

    International Nuclear Information System (INIS)

    Eko Pudjadi; Petrus Zacharias; Budi Prayitno

    2004-01-01

Environmental samples measured by ICP-MS were analyzed for safeguards purposes. Two groups of isotopes in environmental sampling are used to trace the origin of nuclear materials and to verify the absence of undeclared nuclear activities: the uranium isotopes 234U, 235U, 236U and 238U, and the plutonium isotopes 239Pu, 240Pu, 241Pu and 242Pu. Uranium isotopes are used to verify the existence of nuclear power plants, enrichment plants or reprocessing plants. Plutonium isotopes are used to distinguish global fallout from nuclear weapons testing from releases due to nuclear facility accidents or military activities. The high sensitivity of ICP-MS can detect isotopic fingerprints and trace elements at ppb concentrations; ICP-MS has been applied to measure the 235U isotopic ratio and 240Pu/239Pu isotopic ratios. ICP-MS offers high precision and low operational cost in environmental sampling and can be considered for safeguards-based analysis in developing countries. (author)
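As the abstract notes, the 240Pu/239Pu isotopic ratio is the key discriminator between global fallout and other plutonium sources. The helper below is a hypothetical sketch of how such a measured ratio is commonly interpreted; the bands used (global fallout near 0.18, weapons-grade material below roughly 0.07) are approximate literature values, not figures from this study.

```python
def classify_pu_source(ratio_240_239: float) -> str:
    """Interpret a measured 240Pu/239Pu atom ratio.

    The bands are approximate, commonly cited literature values
    (illustrative assumptions, not taken from this study):
    weapons-grade Pu typically below ~0.07; global stratospheric
    fallout around ~0.18.
    """
    if ratio_240_239 < 0.07:
        return "weapons-grade plutonium"
    if 0.15 <= ratio_240_239 <= 0.21:
        return "consistent with global fallout"
    return "other source (e.g. reactor-grade or mixed)"
```

For example, a sediment ratio of 0.18 would read as consistent with global fallout, while 0.04 would point to weapons-grade material.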

  8. CAE to support a part of supporting technologies-its applications and verifying samples. Shien gijutsu no ittan wo sasaeru CAE sono torikumi to kensho rei

    Energy Technology Data Exchange (ETDEWEB)

    Kobayashi, H. (Polyplastic Co. Ltd., Tokyo (Japan))

    1992-01-01

Plastic CAE has recently been rapidly widening its fields of application. Since integrated program development has also proceeded worldwide, it is now possible to analyze shrinkage and warping based on holding pressure analysis and cooling analysis. Polyplastics Co. Ltd. paid attention early on to the usefulness of CAE technologies for engineering plastics, and in order to utilize them effectively has carried out examinations of CAE software development, improvement of CAE analysis technologies, confirmation and verification of CAE analysis precision, and resin data preparation and measuring methods. In this paper, these applications are introduced mainly through verification samples, such as output samples of FACE-2, the company's originally developed mold temperature control design support system; verification samples of warping deformation analysis; experimental results from a test mold (box type); and the reliability of measurement precision and quantitative analysis of the resin data used for analysis. 6 refs., 6 figs., 1 tab.

  9. Soil sampling for environmental contaminants

    International Nuclear Information System (INIS)

    2004-10-01

The Consultants Meeting on Sampling Strategies, Sampling and Storage of Soil for Environmental Monitoring of Contaminants was organized by the International Atomic Energy Agency to evaluate methods for soil sampling in radionuclide monitoring and heavy metal surveys for identification of punctual contamination (hot particles) in large area surveys and screening experiments. A group of experts was invited by the IAEA to discuss and recommend methods for representative soil sampling for different kinds of environmental issues. The ultimate sinks for all kinds of contaminants dispersed within the natural environment through human activities are sediment and soil. Soil is a particularly difficult matrix for environmental pollution studies as it is generally composed of a multitude of geological and biological materials resulting from weathering and degradation, including particles of different sizes with varying surface and chemical properties. There are many different soil types, categorized according to their content of biological matter, from sandy soils to loam and peat soils, which makes analytical characterization even more complicated. Soil sampling for environmental monitoring of pollutants, therefore, is still a matter of debate in the community of soil, environmental and analytical sciences. The scope of the consultants meeting included evaluating existing techniques with regard to their practicability, reliability and applicability to different purposes, developing strategies of representative soil sampling for cases not yet considered by current techniques, and recommending validated techniques applicable to laboratories in developing Member States. This TECDOC includes a critical survey of existing approaches and their feasibility to be applied in developing countries. The report is valuable for radioanalytical laboratories in Member States and would assist them in quality control and the accreditation process.

  10. Relationship between LIBS Ablation and Pit Volume for Geologic Samples: Applications for in situ Absolute Geochronology

    Science.gov (United States)

    Devismes, D.; Cohen, Barbara A.

    2014-01-01

In planetary sciences, in situ absolute geochronology is a scientific and engineering challenge. Currently, the age of the Martian surface can only be determined by crater density counting. However, this method has significant uncertainties and needs to be calibrated with absolute ages. We are developing an instrument to acquire in situ absolute geochronology based on the K-Ar method. The protocol is based on the laser ablation of a rock by hundreds of laser pulses. Laser Induced Breakdown Spectroscopy (LIBS) gives the potassium content of the ablated material and a mass spectrometer (quadrupole or ion trap) measures the quantity of 40Ar released. In order to accurately measure the quantity of released 40Ar in cases where Ar is an atmospheric constituent (e.g., Mars), the sample is first put into a chamber under high vacuum. The 40Ar quantity, the concentration of K and the estimate of the ablated mass are the parameters needed to give the age of the rocks. The main uncertainties of this method are directly linked to the measurement of the mass (typically some µg) and of the concentration of K by LIBS (up to 10%). Because the ablated mass is small compared to the mass of the sample, and because material is redeposited onto the sample after ablation, it is not possible to measure the ablated mass directly. Our current protocol measures the ablated volume and estimates the sample density to calculate the ablated mass. The precision and accuracy of this method may be improved by using knowledge of the sample's geologic properties to predict its response to laser ablation, i.e., understanding whether natural samples have a predictable relationship between laser energy deposited and resultant ablation volume. In contrast to most previous studies of laser ablation, theoretical equations are not highly applicable. 
The reasons are numerous, but the most important are: a) geologic rocks are complex, polymineralic materials; b) the conditions of ablation are unusual (for example
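The K-Ar protocol described here rests on the standard potassium-argon age equation, reproduced below as a reminder in textbook form (not quoted from the abstract):

```latex
% Standard K-Ar age equation (textbook form):
% t = age, \lambda = total decay constant of ^{40}K,
% \lambda_e = electron-capture branch producing ^{40}Ar,
% ^{40}Ar^*/^{40}K = measured radiogenic argon-to-potassium ratio.
t = \frac{1}{\lambda}
    \ln\!\left(1 + \frac{\lambda}{\lambda_e}\,
    \frac{{}^{40}\mathrm{Ar}^{*}}{{}^{40}\mathrm{K}}\right)
```

This makes explicit why the three quantities named in the abstract (the 40Ar quantity, the K concentration, and the ablated mass, which converts concentration to an absolute 40K amount) suffice to determine the age.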

  11. Development and application of a micro-digestion device for biological samples

    International Nuclear Information System (INIS)

    Bohlen, A. von; Klockenkaemper, R.; Messerschmidt, J.; Alt, F.

    2000-01-01

The analytical characterization of small amounts of a sample is of increasing importance for various research projects in biology, biochemistry and medicine. Reliable determinations of minor and trace elements in microsamples can be performed by total reflection X-ray fluorescence analysis (TXRF). This microanalytical method is suitable for direct multielement analyses of a tiny amount of a liquid or solid sample. Instead of a direct analysis, however, a complete digestion or mineralisation of the sample material prior to analysis can be recommended. It can be advantageous for a favorable presentation, for a preconcentration and/or homogenization of the material, and particularly for an accurate quantification. Unfortunately, commercially available digestion devices are optimized for sample amounts of 50 to 400 mg. For smaller amounts, a microdigestion device was constructed and adapted to a commercially available high-pressure ashing apparatus. Digestions of very different microsamples between a few μg and a few mg were carried out, followed by quantitative determinations of numerous elements. In addition, different Standard Reference Materials (SRMs) were analyzed. The homogeneity of these materials could be investigated by comparing the results found for microsamples with those obtained for samples of 200 mg, the latter after digestion in a conventional device. (author)

  12. Handling missing data in ranked set sampling

    CERN Document Server

    Bouza-Herrera, Carlos N

    2013-01-01

    The existence of missing observations is a very important aspect to be considered in the application of survey sampling, for example. In human populations they may be caused by a refusal of some interviewees to give the true value for the variable of interest. Traditionally, simple random sampling is used to select samples. Most statistical models are supported by the use of samples selected by means of this design. In recent decades, an alternative design has started being used, which, in many cases, shows an improvement in terms of accuracy compared with traditional sampling. It is called R
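The accuracy improvement of ranked set sampling over simple random sampling alluded to above can be illustrated with a small simulation. This is an illustrative sketch, not code from the book; it assumes perfect ranking, a standard normal population, and set size m = 3 (all choices ours).

```python
import random
import statistics

def srs_mean(draw, n):
    """Mean of a simple random sample of size n."""
    return statistics.fmean(draw() for _ in range(n))

def rss_mean(draw, m):
    """Balanced ranked set sample mean, set size m (one cycle, m units measured).
    For each rank i, draw a set of m units, rank them by value (mimicking
    perfect judgment ranking), and measure only the i-th order statistic."""
    measured = []
    for i in range(m):
        ranked = sorted(draw() for _ in range(m))
        measured.append(ranked[i])
    return statistics.fmean(measured)

random.seed(42)
draw = lambda: random.gauss(0.0, 1.0)
m, reps = 3, 4000
srs = [srs_mean(draw, m) for _ in range(reps)]
rss = [rss_mean(draw, m) for _ in range(reps)]
print(statistics.pvariance(srs), statistics.pvariance(rss))
```

Both estimators are unbiased for the population mean, but with the same number of measured units the ranked set sample mean shows a markedly smaller variance.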

  13. Application of immunoaffinity columns for different food item samples preparation in micotoxins determination

    Directory of Open Access Journals (Sweden)

    Ćurčić Marijana

    2016-01-01

Full Text Available In analytical methods used for mycotoxin monitoring, special attention is paid to sample preparation. The objective of this study was therefore to test the efficiency of immunoaffinity columns (IAC), based on solid-phase extraction principles, used for sample preparation in the determination of aflatoxins and ochratoxins. Aflatoxin and ochratoxin concentrations were determined in a total of 56 food samples: wheat, corn, rice, barley and other grains (19 samples); flour, flour products from grain and additives for the bakery industry (7 samples); fruits and vegetables (3 samples); hazelnut, walnut, almond and coconut flour (4 samples); roasted cocoa beans, peanuts, tea and coffee (16 samples); spices (4 samples); and meat and meat products (4 samples). The results obtained indicate the advantage of using IAC for sample preparation: enhanced specificity due to binding of the extracted molecules to incorporated specific antibodies, while other molecules in the sample that could interfere with further analysis are rinsed away. An additional advantage is the use of small amounts of organic solvents and, consequently, decreased exposure of the staff who conduct mycotoxin determination. Of special interest is the increase in method sensitivity, since the limit of quantification of the aflatoxin and ochratoxin determination method is lower than the maximum allowed concentrations of these toxins prescribed by the national rule book.

  14. Sampling free energy surfaces as slices by combining umbrella sampling and metadynamics.

    Science.gov (United States)

    Awasthi, Shalini; Kapil, Venkat; Nair, Nisanth N

    2016-06-15

Metadynamics (MTD) is a very powerful technique to sample high-dimensional free energy landscapes, and due to its self-guiding property, the method has been successful in studying complex reactions and conformational changes. MTD sampling is based on filling the free energy basins by biasing potentials; thus, for cases with flat, broad, and unbound free energy wells, the computational time to sample them becomes very large. To alleviate this problem, we combine the standard Umbrella Sampling (US) technique with MTD to sample orthogonal collective variables (CVs) in a simultaneous way. Within this scheme, we construct the equilibrium distribution of CVs from biased distributions obtained from independent MTD simulations with umbrella potentials. Reweighting is carried out by a procedure that combines US reweighting and Tiwary-Parrinello MTD reweighting within the Weighted Histogram Analysis Method (WHAM). The approach is ideal for a controlled sampling of a CV in a MTD simulation, making it computationally efficient in sampling flat, broad, and unbound free energy surfaces. This technique also allows for a distributed sampling of a high-dimensional free energy surface, further increasing the computational efficiency in sampling. We demonstrate the application of this technique in sampling high-dimensional surface for various chemical reactions using ab initio and QM/MM hybrid molecular dynamics simulations. Further, to carry out MTD bias reweighting for computing forward reaction barriers in ab initio or QM/MM simulations, we propose a computationally affordable approach that does not require recrossing trajectories. © 2016 Wiley Periodicals, Inc.
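At the core of the WHAM combination described above are the standard WHAM self-consistency equations, reproduced here in textbook form (the paper's combined US and MTD reweighting adds the Tiwary-Parrinello time-dependent bias on top of these):

```latex
% Standard WHAM equations for K umbrella windows (textbook form).
% n_i(\xi): biased histogram of window i, N_i: number of samples in
% window i, w_i(\xi): umbrella bias potential, f_i: window free energy,
% \beta = 1/(k_B T). The two equations are iterated to self-consistency.
P(\xi) = \frac{\sum_{i=1}^{K} n_i(\xi)}
              {\sum_{j=1}^{K} N_j \, e^{\beta\left(f_j - w_j(\xi)\right)}},
\qquad
e^{-\beta f_j} = \sum_{\xi} P(\xi)\, e^{-\beta w_j(\xi)}
```

The unbiased distribution P(ξ) then yields the free energy surface as F(ξ) = −kBT ln P(ξ).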

  15. Evaluation of personal air sampling pumps

    International Nuclear Information System (INIS)

    Ritter, P.D.; Novick, V.J.; Alvarez, J.L.; Huntsman, B.L.

    1987-01-01

    Personal air samplers are used to more conveniently obtain breathing zone samples from individuals over periods of several hours. Personal air sampling pumps must meet minimum performance levels under all working conditions to be suitable for use in radiation protection programs. In addition, the pumps should be simple to operate and as comfortable to wear as possible. Ten models of personal air sampling pumps were tested to evaluate their mechanical performance and physical characteristics. The pumps varied over a wide range in basic performance and operating features. Some of the pumps were found to have adequate performance for use in health physics air sampling applications. 3 references, 2 figures, 5 tables

  16. Edge Effects in Line Intersect Sampling With

    Science.gov (United States)

    David L. R. Affleck; Timothy G. Gregoire; Harry T. Valentine

    2005-01-01

    Transects consisting of multiple, connected segments with a prescribed configuration are commonly used in ecological applications of line intersect sampling. The transect configuration has implications for the probability with which population elements are selected and for how the selection probabilities can be modified by the boundary of the tract being sampled. As...

  17. Planetary Sample Caching System Design Options

    Science.gov (United States)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: 1) transfer of a raw sample from the tool to the SHEC subsystem and 2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit change-out as the mechanism for transferring tubes to, and samples in tubes from, the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.

  18. Determination of the detection limit and decision threshold for ionizing radiation measurements. Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment

    International Nuclear Information System (INIS)

    2000-01-01

This part of ISO 11929 addresses the field of ionizing radiation measurements in which events (in particular pulses) are counted by high resolution gamma spectrometry, registering a pulse-height distribution (acquisition of a multichannel spectrum), for example on samples. It considers exclusively the random character of radioactive decay and of pulse counting and ignores all other influences (e.g. arising from sample treatment, weighing, enrichment or the instability of the test setup). It assumes that the distance between neighbouring gamma-line peaks is not smaller than four times the full width at half maximum (FWHM) of the gamma line and that the background near the gamma line is nearly a straight line. Otherwise ISO 11929-1 or ISO 11929-2 should be used. ISO 11929 consists of the following parts, under the general title Determination of the detection limit and decision threshold for ionizing radiation measurements: Part 1: Fundamentals and application to counting measurements without the influence of sample treatment; Part 2: Fundamentals and application to counting measurements with the influence of sample treatment; Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment; Part 4: Fundamentals and application to measurements by use of linear-scale analogue ratemeters, without the influence of sample treatment. This part of ISO 11929 was prepared in parallel with other International Standards prepared by WG2 (now WG 17): ISO 11932:1996, Activity measurements of solid materials considered for recycling, re-use or disposal as non-radioactive waste, and ISO 11929-1, ISO 11929-2 and ISO 11929-4, and is, consequently, complementary to these documents.
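For the simplest counting model in the scope of Part 1 (a gross count with a separate background count, no sample-treatment influences), the decision threshold and detection limit can be sketched as below. This is a textbook simplification of the ISO 11929 approach, not the standard's full procedure, and the counts and times in the example are hypothetical.

```python
import math

def decision_threshold(n0, t0, tg, k=1.645):
    """Decision threshold y* (counts/s) for the net count rate.

    Simple counting model: gross count over time tg, background count n0
    over time t0; under the null hypothesis (no net activity) the net
    rate estimate has variance (n0/t0) * (1/tg + 1/t0).
    k = 1.645 corresponds to alpha = 5%.
    """
    r0 = n0 / t0
    return k * math.sqrt(r0 * (1.0 / tg + 1.0 / t0))

def detection_limit(n0, t0, tg, k=1.645):
    """Detection limit y# for the symmetric case k_alpha = k_beta = k.

    In this model the implicit equation y# = y* + k*u(y#) has the
    closed-form solution y# = k^2/tg + 2*y*.
    """
    return k * k / tg + 2.0 * decision_threshold(n0, t0, tg, k)

# Hypothetical example: 3600 background counts in 10000 s,
# and a 10000 s sample measurement.
ystar = decision_threshold(3600.0, 10000.0, 10000.0)
yhash = detection_limit(3600.0, 10000.0, 10000.0)
```

A measured net rate above y* is taken to indicate activity; y# is the smallest true net rate reliably detected at the chosen error probabilities.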

  19. Application of the neutron activation analysis method for determining trace elements in Brazilian food sample

    International Nuclear Information System (INIS)

    Maihara, V.A.; Vasconcellos, M.B.A.

    1988-01-01

Recently there has been an increase of awareness of the importance of trace elements in human health and disease, as well as rising concern about food contamination. The development of sensitive, accurate and precise methods is one of the most important requirements for the knowledge of trace element contents in foods and in biological samples. Neutron activation analysis is one of the most suitable techniques because a great number of elements can be determined at concentrations in the range of μg/g to ng/g. The present work is part of an IAEA Co-ordinated Research Programme on the applications of nuclear techniques for toxic elements in foodstuffs. Neutron activation analysis is applied to the analysis of bread, milk powder and rice, which are considered essential foods in the Brazilian diet. Some aspects of the activation analysis of biological matrices are discussed. (author) [pt

  20. Measurement of electro-sprayed 238 and 239+240 plutonium isotopes using 4π-alpha spectrometry. Application to environmental samples

    International Nuclear Information System (INIS)

    Charmoille-Roblot, M.

    1999-01-01

A new protocol for plutonium deposition using the electro-spray technique coupled with 4π-alpha spectrometry is proposed to improve the detection limit and shorten the counting time. In order to increase the detection efficiency, it was proposed to measure the electro-sprayed deposit of the 238 and 239+240 plutonium isotopes simultaneously on both sides of the source support, which must be as transparent as possible to alpha emissions, in a two-alpha-detector chamber. A radiochemical protocol was adapted to the electro-spray constraints and a very thin carbon foil was selected for 4π-alpha spectrometry. The method was applied to a batch of sediment samples and gave the same results as an electrodeposited source measured using conventional alpha spectrometry, with a 25% gain in counting time and 10% in the plutonium-238 detection limit. Validation and application of the technique were performed on reference samples. (author)

  1. 30 CFR 14.5 - Test samples.

    Science.gov (United States)

    2010-07-01

    ... MINING PRODUCTS REQUIREMENTS FOR THE APPROVAL OF FLAME-RESISTANT CONVEYOR BELTS General Provisions § 14.5 Test samples. Upon request by MSHA, the applicant must submit 3 precut, unrolled, flat conveyor belt...

  2. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

Full Text Available Non-probability sampling designs can be used in ethnobotanical surveys of medicinal plants. However, such methods do not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient, 95%; sampling error (d), 0.05; and a proportion (p), 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods necessarily have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using the data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
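The reported sample size of 290 is consistent with Cochran's formula for estimating a proportion, with a finite population correction. A minimal sketch, assuming z = 1.96 for the 95% confidence coefficient (a plausible reconstruction; the paper may have used a slightly different correction):

```python
import math

def sample_size(N, z=1.96, d=0.05, p=0.5):
    """Cochran's sample size for a proportion, n0 = z^2 p(1-p) / d^2,
    with the finite population correction n = n0 / (1 + (n0 - 1) / N)."""
    n0 = z * z * p * (1.0 - p) / (d * d)
    return math.ceil(n0 / (1.0 + (n0 - 1.0) / N))

print(sample_size(1179))  # → 290, matching the sample size reported above
```

With the conservative choice p = 0.5, the uncorrected n0 is 384.16; the correction for N = 1179 families brings it down to 290.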

  3. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
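The error-rate inflation the authors describe can be reproduced in miniature. The sketch below (ours, not from the paper) applies a nominal one-sided 5% test, using the normal critical value, to small samples from a skewed distribution and observes the actual rejection rate under the null; skewness drags the true rate well above the nominal level.

```python
import math
import random

def t_statistic(xs, mu0):
    """One-sample t statistic for H0: mean = mu0."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)
    return (mean - mu0) / math.sqrt(var / n)

random.seed(1)
n, trials, crit = 10, 20000, -1.645  # nominal one-sided 5% (normal critical value)
# Null is true: exponential(1) has mean exactly 1, but is strongly skewed,
# so for small n the t statistic's left tail is much heavier than normal.
rejections = sum(
    t_statistic([random.expovariate(1.0) for _ in range(n)], 1.0) < crit
    for _ in range(trials)
)
rate = rejections / trials
```

With n = 10 the observed rejection rate lands far above the declared 5%, the kind of excess false-positive behavior the abstract warns about.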

  4. The application of headspace gas chromatography coupled to tandem quadrupole mass spectrometry for the analysis of furan in baby food samples.

    Science.gov (United States)

    Pugajeva, Iveta; Rozentale, Irina; Viksna, Arturs; Bartkiene, Elena; Bartkevics, Vadims

    2016-12-01

A selective methodology employing a tandem quadrupole mass spectrometer coupled to a gas chromatograph with a headspace autosampler (HS-GC-MS/MS) was developed in this study. Application of the developed procedure resulted in a limit of detection of 0.021 μg/kg and a limit of quantification of 0.071 μg/kg. The mean recoveries during in-house validation ranged from 89% to 109%, and coefficients of variation for repeatability ranged from 4% to 11%. The proposed analytical method was applied to monitoring the furan content of 30 commercial baby food samples available on the Latvian retail market. The level of furan found in these samples varied from 0.45 to 81.9 μg/kg, indicating that infants whose sole diet comprises baby food sold in jars and cans are constantly exposed to furan. Samples containing vegetables and meat had higher levels of furan than those containing only fruits. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Application of Conventional and K0-Based Internal Monostandard NAA Using Reactor Neutrons for Compositional Analysis of Large Samples

    International Nuclear Information System (INIS)

    Reddy, A.V.R.; Acharya, R.; Swain, K. K.; Pujari, P.K.

    2018-01-01

homogeneous three large size samples of dross. An X-Z rotary scanning unit has been designed, fabricated and installed for counting large and not-so-homogeneous samples. As part of the intercomparison exercise under the CRP, a pottery sample obtained from IPEN, Peru, was analysed by IM-NAA. The results obtained are compared with those obtained by XRF. The report briefly describes the IM-NAA methodology standardized in our laboratory and its applications to the above-mentioned small as well as large and non-standard-geometry samples. At the end of the report, a list of publications and presentations that were the outcome of this CRP work is given. (author)

  6. Lights Will Guide You : Sample Preparation and Applications for Integrated Laser and Electron Microscopy

    Science.gov (United States)

    Karreman, M. A.

    2013-03-01

Correlative microscopy is the combined use of two different forms of microscopy in the study of a specimen, allowing for the exploitation of the advantages of both imaging tools. The integrated Laser and Electron Microscope (iLEM), developed at Utrecht University, combines a fluorescence microscope (FM) and a transmission electron microscope (TEM) in a single set-up. The region of interest in the specimen is labeled or tagged with a fluorescent probe and can easily be identified within a large field of view with the FM. Next, this same area is retraced in the TEM and can be studied at high resolution. The iLEM demands samples that can be imaged with both FM and TEM. Biological specimens, typically composed of light elements, generate low image contrast in the TEM. Therefore, these samples are often 'contrasted' with heavy metal stains. FM, on the other hand, images fluorescent samples. Sample preparation for correlative microscopy, and iLEM in particular, is complicated by the fact that the heavy metal stains employed for TEM quench the fluorescent signal of the probe that is imaged with FM. The first part of this thesis outlines preparation procedures for biological material yielding specimens that can be imaged with the iLEM. Here, approaches for the contrasting of thin sections of cells and tissue are introduced that do not affect the fluorescence signal of the probe that marks the region of interest. Furthermore, two novel procedures, VIS2FIXH and VIS2FIXFS, are described that allow for the chemical fixation of thin sections of cryo-immobilized material. These procedures greatly expedite the sample preparation process, and open up novel possibilities for the immuno-labeling of difficult antigens, e.g. proteins and lipids that are challenging to preserve. The second part of this thesis describes applications of iLEM in research in the fields of life and materials science. The iLEM was employed in the study of UVC-induced apoptosis (programmed cell death) of

  7. Detection and identification of Leishmania spp.: application of two hsp70-based PCR-RFLP protocols to clinical samples from the New World.

    Science.gov (United States)

    Montalvo, Ana M; Fraga, Jorge; Tirado, Dídier; Blandón, Gustavo; Alba, Annia; Van der Auwera, Gert; Vélez, Iván Darío; Muskus, Carlos

    2017-07-01

    Leishmaniasis is highly prevalent in New World countries, where several methods are available for detection and identification of Leishmania spp. Two hsp70-based PCR protocols (PCR-N and PCR-F) and their corresponding restriction fragment length polymorphism (RFLP) analyses were applied for detection and identification of Leishmania spp. in clinical samples collected in Colombia, Guatemala, and Honduras. A total of 93 cases were studied. The samples were classified as positive or suspected of leishmaniasis according to parasitological criteria. Molecular amplification of two different hsp70 gene fragments and further RFLP analysis for identification of Leishmania species were performed. Detection in parasitologically positive samples was higher using PCR-N than PCR-F. Across all samples studied, the main species identified were Leishmania panamensis, Leishmania braziliensis, and Leishmania infantum (chagasi). Although RFLP-N was more efficient for identification, RFLP-F is necessary for discrimination between L. panamensis and Leishmania guyanensis, which is of great importance in Colombia. Unexpectedly, one sample from this country revealed an RFLP pattern corresponding to Leishmania naiffi. Both molecular variants are applicable to the study of clinical samples originating in Colombia, Honduras, and Guatemala. Choosing the better tool for each setting depends on the species circulating. More studies are needed to confirm the presence of L. naiffi in Colombian territory.

  8. Accelerator mass spectrometry of ultra-small samples with applications in the biosciences

    International Nuclear Information System (INIS)

    Salehpour, Mehran; Håkansson, Karl; Possnert, Göran

    2013-01-01

    An overview is presented covering the biological accelerator mass spectrometry activities at Uppsala University. The research utilizes the Uppsala University Tandem Laboratory facilities, including a 5 MV Pelletron tandem accelerator and two stable isotope ratio mass spectrometers. In addition, a dedicated sample preparation laboratory for biological samples with natural activity is in use, as well as another laboratory specifically for 14 C-labeled samples. A variety of ongoing projects are described and presented. Examples are: (1) Ultra-small sample AMS. We routinely analyze samples with masses in the 5–10 μg C range; data are presented regarding the sample preparation method. (2) Bomb-peak biological dating of ultra-small samples. A long-term project is presented in which purified and cell-specific DNA from various parts of the human body, including the heart and the brain, is analyzed with the aim of extracting the regeneration rates of the various human cells. (3) Biological dating of various human biopsies, including atherosclerosis-related plaques, is presented. The average build-up time of surgically removed human carotid plaques has been measured and correlated with various data, including the level of insulin in the blood. (4) In addition to standard microdosing-type measurements using small pharmaceutical drugs, pre-clinical pharmacokinetic data from a macromolecular drug candidate are discussed.

  9. Accelerator mass spectrometry of ultra-small samples with applications in the biosciences

    Energy Technology Data Exchange (ETDEWEB)

    Salehpour, Mehran, E-mail: mehran.salehpour@physics.uu.se [Department of Physics and Astronomy, Ion Physics, PO Box 516, SE-751 20 Uppsala (Sweden); Hakansson, Karl; Possnert, Goeran [Department of Physics and Astronomy, Ion Physics, PO Box 516, SE-751 20 Uppsala (Sweden)

    2013-01-15

    An overview is presented covering the biological accelerator mass spectrometry activities at Uppsala University. The research utilizes the Uppsala University Tandem Laboratory facilities, including a 5 MV Pelletron tandem accelerator and two stable isotope ratio mass spectrometers. In addition, a dedicated sample preparation laboratory for biological samples with natural activity is in use, as well as another laboratory specifically for 14C-labeled samples. A variety of ongoing projects are described and presented. Examples are: (1) Ultra-small sample AMS. We routinely analyze samples with masses in the 5-10 μg C range; data are presented regarding the sample preparation method. (2) Bomb-peak biological dating of ultra-small samples. A long-term project is presented in which purified and cell-specific DNA from various parts of the human body, including the heart and the brain, is analyzed with the aim of extracting the regeneration rates of the various human cells. (3) Biological dating of various human biopsies, including atherosclerosis-related plaques, is presented. The average build-up time of surgically removed human carotid plaques has been measured and correlated with various data, including the level of insulin in the blood. (4) In addition to standard microdosing-type measurements using small pharmaceutical drugs, pre-clinical pharmacokinetic data from a macromolecular drug candidate are discussed.

  10. A Sample WebQuest Applicable in Teaching Topological Concepts

    Science.gov (United States)

    Yildiz, Sevda Goktepe; Korpeoglu, Seda Goktepe

    2016-01-01

    In recent years, WebQuests have received a great deal of attention and have been used effectively in teaching-learning process in various courses. In this study, a WebQuest that can be applicable in teaching topological concepts for undergraduate level students was prepared. A number of topological concepts, such as countability, infinity, and…

  11. Study of measurement of the alcohol biomarker phosphatidylethanol (PEth) in dried blood spot (DBS) samples and application of a volumetric DBS device.

    Science.gov (United States)

    Beck, Olof; Kenan Modén, Naama; Seferaj, Sabina; Lenk, Gabriel; Helander, Anders

    2018-04-01

    Phosphatidylethanol (PEth) is a group of phospholipids formed in cell membranes following alcohol consumption. PEth measurement in whole blood samples is established as a specific alcohol biomarker with clinical and medico-legal applications. This study further evaluated the usefulness of dried blood spot (DBS) samples collected on filter paper for PEth measurement. Specimens used were surplus volumes of venous whole blood sent for routine LC-MS/MS quantification of PEth 16:0/18:1, the major PEth homolog. DBS samples were prepared by pipetting blood onto Whatman 903 Protein Saver Cards and onto a volumetric DBS device (Capitainer). The imprecision (CV) of the DBS sample amount, based on area and weight measurements of spot punches, was 23-28%. Investigation of the relationship between blood hematocrit and PEth concentration yielded a linear, positive correlation: at around 1.0-1.5 μmol/L PEth 16:0/18:1, the PEth concentration increased by ~0.1 μmol/L for every 5% increase in hematocrit. There was close agreement between the PEth concentrations obtained with whole blood samples and the corresponding results using Whatman 903 (PEth_DBS = 1.026 PEth_WB + 0.013) and volumetric device (PEth_DBS = 1.045 PEth_WB + 0.016) DBS samples. The CV of PEth quantification in DBS samples at concentrations ≥0.05 μmol/L was ≤15%. The present results further confirm the usefulness of DBS samples for PEth measurement. Copyright © 2018 Elsevier B.V. All rights reserved.
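The agreement equations and CV figures quoted above are easy to reproduce in miniature. The sketch below uses synthetic paired measurements (not the study's data; the noise level and replicate values are invented for illustration) to fit the DBS-versus-whole-blood relation by least squares and to compute a coefficient of variation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical paired measurements: whole-blood PEth (umol/L) and the
# corresponding DBS result, simulated around the relation reported above.
peth_wb = rng.uniform(0.05, 2.0, size=50)
peth_dbs = 1.026 * peth_wb + 0.013 + rng.normal(0, 0.01, size=50)

# Least-squares fit of PEth_DBS = slope * PEth_WB + intercept
slope, intercept = np.polyfit(peth_wb, peth_dbs, 1)
print(f"PEth_DBS = {slope:.3f} * PEth_WB + {intercept:.3f}")

# Coefficient of variation (CV, %) of hypothetical replicate DBS quantifications
replicates = np.array([0.52, 0.48, 0.55, 0.50, 0.47])
cv = replicates.std(ddof=1) / replicates.mean() * 100
print(f"CV = {cv:.1f}%")
```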

  12. Evaluating the Applicability of Data-Driven Dietary Patterns to Independent Samples with a Focus on Measurement Tools for Pattern Similarity.

    Science.gov (United States)

    Castelló, Adela; Buijsse, Brian; Martín, Miguel; Ruiz, Amparo; Casas, Ana M; Baena-Cañada, Jose M; Pastor-Barriuso, Roberto; Antolín, Silvia; Ramos, Manuel; Muñoz, Monserrat; Lluch, Ana; de Juan-Ferré, Ana; Jara, Carlos; Lope, Virginia; Jimeno, María A; Arriola-Arellano, Esperanza; Díaz, Elena; Guillem, Vicente; Carrasco, Eva; Pérez-Gómez, Beatriz; Vioque, Jesús; Pollán, Marina

    2016-12-01

    Diet is a key modifiable risk factor for many chronic diseases, but it remains unclear whether dietary patterns from one study sample are generalizable to other independent populations. The primary objective of this study was to assess whether data-driven dietary patterns from one study sample are applicable to other populations. The secondary objective was to assess the validity of two criteria of pattern similarity. Six dietary patterns (three Western, plus Mediterranean, Prudent, and Healthy) from three published studies on breast cancer were reconstructed in a case-control study of 973 breast cancer patients and 973 controls. Three more internal patterns (Western, Prudent, and Mediterranean) were derived from this case-control study's own data. Applicability was assessed by comparing the six reconstructed patterns with the three internal dietary patterns, using the congruence coefficient (CC) between pattern loadings. In cases where any pair met either of two commonly used criteria for declaring patterns similar (CC ≥0.85 or a statistically significant [P<0.05] correlation), the similarity of the dietary patterns was double-checked by comparing their associations with risk for breast cancer, to assess whether those two criteria of similarity are actually reliable. Five of the six reconstructed dietary patterns showed high congruence (CC >0.9) with their corresponding dietary pattern derived from the case-control study's data. Similar associations with risk for breast cancer were found in all pairs of dietary patterns that had high CC, but not in all pairs of dietary patterns with statistically significant correlations. Similar dietary patterns can be found in independent samples. The P value of a correlation coefficient is less reliable than the CC as a criterion for declaring two dietary patterns similar. This study shows that diet scores based on a particular study are generalizable to other populations. Copyright © 2016 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
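The congruence coefficient between pattern loadings used above is Tucker's coefficient: the cosine between two loading vectors, φ = Σaᵢbᵢ / √(Σaᵢ² · Σbᵢ²). A minimal sketch, with hypothetical loading vectors over six food groups:

```python
import numpy as np

def congruence_coefficient(loadings_a, loadings_b):
    """Tucker's congruence coefficient between two factor-loading vectors."""
    a = np.asarray(loadings_a, dtype=float)
    b = np.asarray(loadings_b, dtype=float)
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

# Two hypothetical 'Western pattern' loading vectors from independent samples
western_study1 = [0.70, 0.65, 0.60, -0.30, -0.25, 0.10]
western_study2 = [0.68, 0.62, 0.58, -0.28, -0.22, 0.12]

cc = congruence_coefficient(western_study1, western_study2)
print(round(cc, 3))  # close to 1, well above the CC >= 0.85 similarity threshold
```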

  13. [Progress in sample preparation and analytical methods for trace polar small molecules in complex samples].

    Science.gov (United States)

    Zhang, Qianchun; Luo, Xialin; Li, Gongke; Xiao, Xiaohua

    2015-09-01

    Small polar molecules such as nucleosides, amines, and amino acids are important analytes in the biological, food, environmental, and other fields. It is necessary to develop efficient sample preparation and sensitive analytical methods for the rapid analysis of these polar small molecules in complex matrices. Some typical materials used in sample preparation, including silica, polymers, carbon, and boric acid, are introduced in this paper. Meanwhile, the applications and developments of analytical methods for polar small molecules, such as reversed-phase liquid chromatography and hydrophilic interaction chromatography, are also reviewed.

  14. Determination of Mn, Fe, Ni, Cu, Zn and Pb contents in samples of apple trees by radionuclide X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Bumbalova, A.; Havranek, E.; Harangozo, M.

    1982-01-01

    The applicability of the radionuclide X-ray fluorescence analysis (RXFA) for qualitative and quantitative evaluation of environmental plant samples is discussed and examples of determination of Mn, Fe, Ni, Cu, Zn, Pb in samples of apple trees are given. The instrumentation, the standard and sample preparation are also presented. (author)

  15. Sampling designs and methods for estimating fish-impingement losses at cooling-water intakes

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-01-01

    Several systems for estimating fish impingement at power plant cooling-water intakes are compared to determine the most statistically efficient sampling designs and methods. Compared to a simple random sampling scheme, the stratified systematic random sampling scheme, the systematic random sampling scheme, and the stratified random sampling scheme yield higher efficiencies and better estimators for the parameters in two models of fish impingement as a time-series process. Mathematical results and illustrative examples of the applications of the sampling schemes to simulated and real data are given. Some sampling designs applicable to fish-impingement studies are presented in appendixes
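The efficiency gain of stratified over simple random sampling reported above can be illustrated with a toy simulation. The diel impingement pattern below is invented for illustration (impingement is often higher at night), not taken from the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hourly impingement counts over 30 days, with a strong
# night/day pattern: Poisson mean 40 at night, 5 during the day.
hours = np.arange(30 * 24)
is_night = (hours % 24 < 6) | (hours % 24 >= 18)
population = rng.poisson(lam=np.where(is_night, 40, 5))
true_mean = population.mean()

def simple_random(pop, n):
    """Mean of a simple random sample of n hours."""
    return rng.choice(pop, size=n, replace=False).mean()

def stratified(pop, night_mask, n):
    """Stratified estimator with proportional allocation to night/day strata."""
    night, day = pop[night_mask], pop[~night_mask]
    n_night = n * len(night) // len(pop)
    w_night = len(night) / len(pop)
    return (w_night * rng.choice(night, size=n_night, replace=False).mean()
            + (1 - w_night) * rng.choice(day, size=n - n_night, replace=False).mean())

# Compare the sampling variability of the two estimators over many replications.
srs = [simple_random(population, 48) for _ in range(500)]
strat = [stratified(population, is_night, 48) for _ in range(500)]
print(f"SRS std error:        {np.std(srs):.2f}")
print(f"Stratified std error: {np.std(strat):.2f}")  # smaller: stratification removes the night/day variance
```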

  16. The MIDAS Touch: Mixed Data Sampling Regression Models

    OpenAIRE

    Ghysels, Eric; Santa-Clara, Pedro; Valkanov, Rossen

    2004-01-01

    We introduce Mixed Data Sampling (henceforth MIDAS) regression models. The regressions involve time series data sampled at different frequencies. Technically speaking, MIDAS models specify conditional expectations as a distributed lag of regressors recorded at some higher sampling frequency. We examine the asymptotic properties of MIDAS regression estimation and compare it with traditional distributed lag models. MIDAS regressions have wide applicability in macroeconomics and finance.
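The distributed-lag-of-high-frequency-regressors idea can be sketched in a few lines. The example below is an illustration, not the paper's estimator: it assumes the commonly used exponential Almon weight function and profiles its two parameters over a grid rather than using nonlinear least squares:

```python
import numpy as np

rng = np.random.default_rng(0)

def exp_almon_weights(theta1, theta2, n_lags):
    """Exponential Almon lag polynomial, normalised to sum to one."""
    k = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * k + theta2 * k**2)
    return w / w.sum()

# Simulate: a quarterly response y driven by 3 months of a monthly indicator x.
n_quarters, m = 200, 3
x_monthly = rng.normal(size=n_quarters * m)
x_blocks = x_monthly.reshape(n_quarters, m)[:, ::-1]   # most recent month first
true_w = exp_almon_weights(0.5, -0.1, m)
y = 1.0 + 2.0 * x_blocks @ true_w + rng.normal(0, 0.1, n_quarters)

# Profile (theta1, theta2) over a grid; estimate (intercept, slope) by OLS at each point.
best = None
for t1 in np.linspace(-1, 1, 21):
    for t2 in np.linspace(-0.5, 0.5, 21):
        z = x_blocks @ exp_almon_weights(t1, t2, m)    # weighted high-frequency regressor
        X = np.column_stack([np.ones(n_quarters), z])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        ssr = ((y - X @ beta) ** 2).sum()
        if best is None or ssr < best[0]:
            best = (ssr, beta)

print("intercept, slope:", np.round(best[1], 2))  # recovers roughly (1.0, 2.0)
```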

  17. Visual Sample Plan Version 7.0 User's Guide

    Energy Technology Data Exchange (ETDEWEB)

    Matzke, Brett D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Newburn, Lisa LN [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hathaway, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bramer, Lisa M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wilson, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dowson, Scott T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sego, Landon H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pulsipher, Brent A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-03-01

    This user's guide describes Visual Sample Plan (VSP) Version 7.0 and provides instructions for using the software. VSP selects the appropriate number and location of environmental samples to ensure that the results of statistical tests performed to provide input to risk decisions have the required confidence and performance. VSP Version 7.0 provides the sample-size equations or algorithms needed by specific statistical tests appropriate for specific environmental sampling objectives. It also provides data quality assessment and statistical analysis functions to support evaluation of the data and determine whether the data support decisions regarding sites suspected of contamination. The easy-to-use program is highly visual and graphic. VSP runs on personal computers with Microsoft Windows operating systems (XP, Vista, Windows 7, and Windows 8). Designed primarily for project managers and users without expertise in statistics, VSP is applicable to two- and three-dimensional populations to be sampled (e.g., rooms and buildings, surface soil, a defined layer of subsurface soil, water bodies, and other similar applications) for studies of environmental quality. VSP is also applicable for designing sampling plans for assessing chem/rad/bio threat and hazard identification within rooms and buildings, and for designing geophysical surveys for unexploded ordnance (UXO) identification.
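As an illustration of the kind of sample-size equation such software implements, the sketch below uses the textbook normal-approximation formula for a one-sample mean test, n = (z(1-alpha) + z(1-beta))^2 * sigma^2 / delta^2 plus a small-sample correction term. This is a generic formula offered for orientation, not VSP's exact implementation:

```python
import math
from statistics import NormalDist

def sample_size(sigma, delta, alpha=0.05, beta=0.10):
    """Normal-approximation sample size for a one-sample mean test:
    detect a mean shift of `delta` with false-positive rate `alpha` and
    false-negative rate `beta`, given standard deviation `sigma`."""
    z = NormalDist().inv_cdf
    z_a, z_b = z(1 - alpha), z(1 - beta)
    n = (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2
    return math.ceil(n + 0.5 * z_a ** 2)   # small-sample correction term

# e.g. sigma = 2 ppm, detect a 1 ppm exceedance of the action level
print(sample_size(sigma=2.0, delta=1.0))   # 36
```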

  18. Development of Large Sample Neutron Activation Technique for New Applications in Thailand

    International Nuclear Information System (INIS)

    Laoharojanaphand, S.; Tippayakul, C.; Wonglee, S.; Channuie, J.

    2018-01-01

    The development of Large Sample Neutron Activation Analysis (LSNAA) in Thailand is presented in this paper. The technique was first developed with rice as the test sample, using the Thai Research Reactor-1/Modification 1 (TRR-1/M1) as the neutron source. The first step was to select and characterize an appropriate irradiation facility for the research. An out-core irradiation facility (A4 position) was attempted first; the results obtained with the A4 facility were then used as guides for the subsequent experiments with the thermal column facility. The characterization of the thermal column was performed with Cu wire to determine the spatial flux distribution with and without a rice sample. The flux depression without the rice sample was observed to be less than 30%, while with the rice sample it increased to as much as 60%. Flux monitors internal to the rice sample were used to determine the average flux over the sample. The gamma self-shielding effect during gamma measurement was corrected using Monte Carlo simulation: the ratio between the efficiencies of the volume source and the point source at each energy point was calculated with the MCNPX code. The research team adopted the k0-NAA methodology to calculate element concentrations. The k0-NAA program developed by the IAEA was set up to simulate the conditions of the irradiation and measurement facilities used in this research. The element concentrations in the bulk rice sample were then calculated taking into account the flux depression and gamma efficiency corrections. At the moment, the results still show large discrepancies with the reference values; more research on the validation will be performed to identify sources of error. Moreover, this LSNAA technique was introduced for the activation analysis of the IAEA archaeological mock-up. The results are provided in this report. (author)

  19. Current trends in sample preparation for cosmetic analysis.

    Science.gov (United States)

    Zhong, Zhixiong; Li, Gongke

    2017-01-01

    The widespread applications of cosmetics in modern life make their analysis particularly important from a safety point of view. There is a wide variety of restricted ingredients and prohibited substances that primarily influence the safety of cosmetics. Sample preparation for cosmetic analysis is a crucial step as the complex matrices may seriously interfere with the determination of target analytes. In this review, some new developments (2010-2016) in sample preparation techniques for cosmetic analysis, including liquid-phase microextraction, solid-phase microextraction, matrix solid-phase dispersion, pressurized liquid extraction, cloud point extraction, ultrasound-assisted extraction, and microwave digestion, are presented. Furthermore, the research and progress in sample preparation techniques and their applications in the separation and purification of allowed ingredients and prohibited substances are reviewed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Analytical dual-energy microtomography: A new method for obtaining three-dimensional mineral phase images and its application to Hayabusa samples

    Science.gov (United States)

    Tsuchiyama, A.; Nakano, T.; Uesugi, K.; Uesugi, M.; Takeuchi, A.; Suzuki, Y.; Noguchi, R.; Matsumoto, T.; Matsuno, J.; Nagano, T.; Imai, Y.; Nakamura, T.; Ogami, T.; Noguchi, T.; Abe, M.; Yada, T.; Fujimura, A.

    2013-09-01

    We developed a novel technique called "analytical dual-energy microtomography" that uses the linear attenuation coefficients (LACs) of minerals at two different X-ray energies to nondestructively obtain three-dimensional (3D) images of mineral distribution in materials such as rock specimens. The two energies are above and below the absorption edge energy of an abundant element, which we call the "index element". The chemical compositions of minerals forming solid solution series can also be measured. The optimal size of a sample is of the order of the inverse of the LAC values at the X-ray energies used. We used synchrotron-based microtomography with an effective spatial resolution of >200 nm to apply this method to small particles (30-180 μm) collected from the surface of asteroid 25143 Itokawa by the Hayabusa mission of the Japan Aerospace Exploration Agency (JAXA). A 3D distribution of the minerals was successfully obtained by imaging the samples at X-ray energies of 7 and 8 keV, using Fe as the index element (the K-absorption edge of Fe is 7.11 keV). The optimal sample size in this case is of the order of 50 μm. The chemical compositions of the minerals, including the Fe/Mg ratios of ferromagnesian minerals and the Na/Ca ratios of plagioclase, were measured. This new method is potentially applicable to other small samples such as cosmic dust, lunar regolith, cometary dust (recovered by the Stardust mission of the National Aeronautics and Space Administration [NASA]), and samples from extraterrestrial bodies (those from future sample return missions such as the JAXA Hayabusa2 mission and the NASA OSIRIS-REx mission), although limitations exist for unequilibrated samples. Further, this technique is generally suited for studying materials in multicomponent systems with multiple phases across several research fields.
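The phase-identification step can be sketched as a lookup in (LAC at 7 keV, LAC at 8 keV) space: a voxel's pair of measured LACs is compared against reference values for candidate minerals, with the Fe K-edge producing a characteristic jump between the two energies for Fe-bearing phases. The reference numbers below are invented for illustration; real ones would be derived from tabulated mass attenuation coefficients and mineral densities:

```python
import numpy as np

# Hypothetical LAC values (cm^-1) at 7 and 8 keV for a few minerals.
# Fe-bearing phases show a large increase across the Fe K-edge (7.11 keV);
# Fe-free phases show the usual decrease of attenuation with energy.
reference = {
    "olivine":     (95.0, 180.0),
    "plagioclase": (40.0, 30.0),
    "troilite":    (150.0, 300.0),
}

def classify_voxel(lac_7kev, lac_8kev):
    """Assign a voxel to the nearest reference mineral in (LAC_7, LAC_8) space."""
    point = np.array([lac_7kev, lac_8kev])
    return min(reference, key=lambda m: np.linalg.norm(point - np.array(reference[m])))

print(classify_voxel(42.0, 31.0))  # -> plagioclase
```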

  1. Water pollution screening by large-volume injection of aqueous samples and application to GC/MS analysis of a river Elbe sample

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, S.; Efer, J.; Engewald, W. [Leipzig Univ. (Germany). Inst. fuer Analytische Chemie

    1997-03-01

    Large-volume sampling of aqueous samples in a programmed temperature vaporizer (PTV) injector was used successfully for the target and non-target analysis of real samples. In this still rarely applied method, e.g., 1 mL of the water sample to be analyzed is slowly injected directly into the PTV. The vaporized water is eliminated through the split vent, while the analytes are concentrated onto an adsorbent inside the insert and subsequently thermally desorbed. The capability of the method is demonstrated using a sample from the river Elbe. By coupling this method with a mass selective detector in SIM mode (target analysis), pollutants can be determined at concentrations down to 0.01 μg/L. Furthermore, PTV enrichment is an effective and time-saving method for non-target analysis in SCAN mode. In a sample from the river Elbe, over 20 compounds were identified. (orig.) With 3 figs., 2 tabs.

  2. Double-Shell Tank (DST) Ventilation System Vapor Sampling and Analysis Plan

    International Nuclear Information System (INIS)

    SASAKI, L.M.

    2000-01-01

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for vapor samples from the primary ventilation systems of the AN, AP, AW, and AY/AZ tank farms. Sampling will be performed in accordance with Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis (Air DQO) (Mulkey 1999). The sampling will verify whether current air emission estimates used in the permit application are correct and will provide information for future air permit applications. Vapor samples will be obtained from tank farm ventilation systems, downstream from the tanks and upstream of any filtration. Samples taken in support of the DQO will consist of SUMMA(trademark) canisters, triple sorbent traps (TSTs), sorbent tube trains (STTs), and polyurethane foam (PUF) samples. Particulate filter samples and tritium traps will be taken for radiation screening to allow the release of the samples for analysis. The following sections provide the general methodology and procedures to be used in the preparation, retrieval, transport, analysis, and reporting of results from the vapor samples

  3. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques are now dominant. Metabolomic studies also require the application of a proper analytical technique for the determination of endogenous metabolites present in a biological matrix at trace concentration levels. Owing to the reproducibility of data, precision, the relatively low cost of the appropriate analysis, the simplicity of the determination, and the possibility of direct combination of these techniques with other methods (on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning a systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Application of radioactivation analysis for determination of impurities in aluminium, raw materials and in samples from aluminium production process

    International Nuclear Information System (INIS)

    Vucina, L.J.

    1977-01-01

    Trace elements in aluminium, raw materials, and samples from different stages of aluminium production were determined by nondestructive neutron radioactivation analysis. The samples were taken from the Bayer process of alumina production (bauxite, red sludge, and alumina), from the components of the reduction cell (anode, cryolite, and AlF3), and from the final product, aluminium (purity 99.5-99.7%). Under the given set of conditions, ten trace elements (V, La, Ga, Mn, Co, Cr, Sc, Sb, Zn, and Fe) were determined in aluminium and followed through the production process. It was found that the main impurities in aluminium are iron (0.15%) and zinc (0.06%). It has been concluded that the purity of the produced aluminium depends mainly on the purity of the alumina used. A second important source of contamination of aluminium is the anode. The results obtained by radioactivation analysis for V, Mn, Cr, and Fe fall within the ranges of concentrations of these elements determined by other methods (volumetry, spectrophotometry, atomic absorption). Higher values for zinc were obtained by radioactivation analysis, probably due to unsatisfactory irradiation and measuring conditions for this element. The possibilities of applying radioactivation analysis to these kinds of samples are also discussed.

  5. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distributions can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include the sampling of resonance parameters, which are used for reactor calculations.
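The standard route to such correlated samples is a Cholesky factorisation of the covariance matrix; correlated log-normal variables then follow by exponentiating correlated normals. A minimal sketch (the means, standard deviations, and correlation below are illustrative, not resonance data):

```python
import numpy as np

rng = np.random.default_rng(7)

# Target: correlated pairs with given means, standard deviations, and correlation 0.8.
mean = np.array([1.0, 2.0])
std = np.array([0.2, 0.5])
corr = np.array([[1.0, 0.8],
                 [0.8, 1.0]])
cov = np.outer(std, std) * corr

# Cholesky factor L satisfies L @ L.T == cov, so z @ L.T turns
# independent N(0,1) draws z into draws with covariance cov.
L = np.linalg.cholesky(cov)
z = rng.standard_normal((100_000, 2))
normal_samples = mean + z @ L.T

# Correlated log-normal variables: exponentiate the correlated normals.
lognormal_samples = np.exp(normal_samples)

print(np.round(np.corrcoef(normal_samples.T)[0, 1], 2))  # ~0.8
```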

  6. The Statistics of Emission and Detection of Neutrons and Photons from Fissile Samples for Safeguard Applications

    International Nuclear Information System (INIS)

    Enqvist, Andreas

    2008-03-01

    One particular purpose of nuclear safeguards, in addition to accounting for known materials, is detecting, identifying, and quantifying unknown materials, to prevent accidental and clandestine transport and use of nuclear materials. This can be achieved in a non-destructive way through the various physical and statistical properties of particle emission and detection from such materials. This thesis addresses some fundamental aspects of nuclear materials and the way they can be detected and quantified by such methods. Factorial moments, or multiplicities, have long been used within the safeguards area. These are low-order moments of the underlying number distributions of emission and detection. One objective of the present work was to determine the full probability distribution and its dependence on the sample mass and the detection process. Derivation and analysis of the full probability distribution and its dependence on the above factors constitute the first part of the thesis. Another possibility for identifying unknown samples lies in the information in the 'fingerprints' (pulse shape distribution) left by a detected neutron or photon. A study of the statistical properties of the interaction of the incoming radiation (neutrons and photons) with the detectors constitutes the second part of the thesis. The interaction between fast neutrons and organic scintillation detectors is derived and compared to Monte Carlo simulations. An experimental approach is also addressed, in which cross-correlation measurements were made using liquid scintillation detectors. First, the dependence of the pulse height distribution on the energy and collision number of an incoming neutron was derived analytically and compared to numerical simulations. Then an algorithm was elaborated which can discriminate neutron pulses from photon pulses. The resulting cross-correlation graphs are analyzed, and it is discussed whether they can be used in applications to distinguish possible sample
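The factorial moments mentioned above are expectations of falling factorials of the count, E[N(N-1)...(N-r+1)]. A small sketch computing them from a number distribution, using a Poisson example (whose r-th factorial moment is exactly λ^r) as a check:

```python
import math

def factorial_moment(pmf, order):
    """r-th factorial moment E[N(N-1)...(N-r+1)] of a count distribution
    given as a dict {n: P(N=n)}. The falling factorial vanishes for n < r."""
    return sum(p * math.prod(range(n - order + 1, n + 1)) for n, p in pmf.items())

# Example: Poisson(lambda=2), truncated at n=40 (tail mass is negligible).
lam = 2.0
pmf = {n: math.exp(-lam) * lam**n / math.factorial(n) for n in range(41)}

for r in (1, 2, 3):
    print(r, round(factorial_moment(pmf, r), 3))  # lambda^r: 2, 4, 8
```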

  7. The Statistics of Emission and Detection of Neutrons and Photons from Fissile Samples for Safeguard Applications

    Energy Technology Data Exchange (ETDEWEB)

    Enqvist, Andreas

    2008-03-15

    One particular purpose of nuclear safeguards, in addition to accounting for known materials, is detecting, identifying, and quantifying unknown materials, to prevent accidental and clandestine transport and use of nuclear materials. This can be achieved in a non-destructive way through the various physical and statistical properties of particle emission and detection from such materials. This thesis addresses some fundamental aspects of nuclear materials and the way they can be detected and quantified by such methods. Factorial moments, or multiplicities, have long been used within the safeguards area. These are low-order moments of the underlying number distributions of emission and detection. One objective of the present work was to determine the full probability distribution and its dependence on the sample mass and the detection process. Derivation and analysis of the full probability distribution and its dependence on the above factors constitute the first part of the thesis. Another possibility for identifying unknown samples lies in the information in the 'fingerprints' (pulse shape distribution) left by a detected neutron or photon. A study of the statistical properties of the interaction of the incoming radiation (neutrons and photons) with the detectors constitutes the second part of the thesis. The interaction between fast neutrons and organic scintillation detectors is derived and compared to Monte Carlo simulations. An experimental approach is also addressed, in which cross-correlation measurements were made using liquid scintillation detectors. First, the dependence of the pulse height distribution on the energy and collision number of an incoming neutron was derived analytically and compared to numerical simulations. Then an algorithm was elaborated which can discriminate neutron pulses from photon pulses. The resulting cross-correlation graphs are analyzed, and it is discussed whether they can be used in applications to distinguish possible

  8. Sample preparation and EFTEM of Meat Samples for Nanoparticle Analysis in Food

    International Nuclear Information System (INIS)

    Lari, L; Dudkiewicz, A

    2014-01-01

    Nanoparticles are used in industry for personal care products and the preparation of food. In the latter application, their functions include the prevention of microbes' growth and increases in the food's nutritional value and sensory quality. EU regulations require a risk assessment of the nanoparticles used in foods and food contact materials before the products can reach the market. However, the limited availability of validated analytical methodologies for detection and characterisation of nanoparticles in food hampers appropriate risk assessment. As part of research on the evaluation of methods for screening and quantification of Ag nanoparticles in meat, we have tested a new TEM sample preparation alternative to resin embedding and cryo-sectioning. Energy filtered TEM analysis was applied to evaluate the thickness and uniformity of thin meat layers acquired with increasing sample input, demonstrating that the protocols used ensured good stability under the electron beam, reliable sample concentration and reproducibility.

  10. An overview of sample preparation procedures for LC-MS multiclass antibiotic determination in environmental and food samples.

    Science.gov (United States)

    Moreno-Bondi, María Cruz; Marazuela, María Dolores; Herranz, Sonia; Rodriguez, Erika

    2009-10-01

    Antibiotics are a class of pharmaceuticals that are of great interest due to the large volumes of these substances that are consumed in both human and veterinary medicine, and due to their status as the agents responsible for bacterial resistance. They can be present in foodstuffs and in environmental samples as multicomponent chemical mixtures that exhibit a wide range of mechanisms of action. Moreover, they can be transformed into different metabolites by the action of microorganisms, as well as by other physical or chemical means, resulting in mixtures with higher ecotoxicities and risks to human health than those of the individual compounds. Therefore, there is growing interest in the availability of multiclass methods for the analysis of antimicrobial mixtures in environmental and food samples at very low concentrations. Liquid chromatography (LC) has become the technique of choice for multiclass analysis, especially when coupled to mass spectrometry (LC-MS) and tandem MS (LC-MS(2)). However, due to the complexity of the matrix, in most cases an extraction step for sample clean-up and preconcentration is required before analysis in order to achieve the required sensitivities. This paper reviews the most recent developments and applications of multiclass antimicrobial determination in environmental and food matrices, emphasizing the practical aspects of sample preparation for the simultaneous extraction of antimicrobials from the selected samples. Future trends in the application of LC-MS-based techniques to multiclass antibiotic analysis are also presented.

  11. Study on the Application of the Combination of TMD Simulation and Umbrella Sampling in PMF Calculation for Molecular Conformational Transitions

    Directory of Open Access Journals (Sweden)

    Qing Wang

    2016-05-01

    Free energy calculations of the potential of mean force (PMF), based on the combination of targeted molecular dynamics (TMD) simulations and umbrella sampling as a function of physical coordinates, have been applied to explore the detailed pathways and the corresponding free energy profiles for the conformational transition processes of the butane molecule and the 35-residue villin headpiece subdomain (HP35). Accurate PMF profiles describing the dihedral rotation of butane under both the dihedral rotation coordinate and the root mean square deviation (RMSD) coordinate were obtained from different umbrella samplings based on the same TMD simulations. The initial structures for the umbrella samplings can be conveniently selected from the TMD trajectories. When this computational method was applied to the unfolding process of the HP35 protein, the PMF calculation along the coordinate of the radius of gyration (Rg) showed a gradual increase of free energy by about 1 kcal/mol, with energy fluctuations. The conformational transitions in the unfolding process of HP35 show that the spherical structure extends and the middle α-helix unfolds first, followed by the unfolding of the other α-helices. The computational method for PMF calculation based on the combination of TMD simulations and umbrella sampling provides a valuable strategy for investigating detailed conformational transition pathways of other allosteric processes.
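
The PMF idea in this record can be illustrated with a minimal sketch: for an unbiased equilibrium sample of a reaction coordinate, the PMF is recovered from the coordinate's histogram as F(x) = -kT ln P(x), up to an additive constant. This is only the histogram step; a real umbrella-sampling workflow additionally removes the harmonic window bias (e.g. with WHAM). The function name and binning scheme below are illustrative assumptions.

```python
import math
from collections import Counter

def pmf_from_samples(samples, bin_width=1.0, kT=1.0):
    """PMF from an unbiased equilibrium sample of a 1-D coordinate:
    F(x) = -kT * ln P(x), defined up to an additive constant."""
    counts = Counter(int(s // bin_width) for s in samples)
    total = sum(counts.values())
    return {b * bin_width: -kT * math.log(c / total)
            for b, c in sorted(counts.items())}

# A coordinate that spends 90% of its time near 0 and 10% near 1:
pmf = pmf_from_samples([0.1] * 90 + [1.1] * 10)
# the rarely visited state sits higher on the free energy profile
```

Because only probability ratios matter, shifting the whole profile by a constant changes nothing physically; profiles are usually reported relative to their minimum.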

  12. Application of robust NiTi-ZrO2-PEG SPME fiber in the determination of haloanisoles in cork stopper samples

    International Nuclear Information System (INIS)

    Budziak, Dilma; Martendal, Edmar; Carasek, Eduardo

    2008-01-01

    In this study, a novel solid-phase microextraction (SPME) fiber obtained using sol-gel technology was applied in the determination of off-flavor compounds (2,4,6-trichloroanisole (TCA), 2,4,6-tribromoanisole (TBA) and pentachloroanisole (PCA)) present in cork stopper samples. A NiTi alloy previously electrodeposited with zirconium oxide was used as the substrate for a poly(ethylene glycol) (PEG) coating. Scanning electron microscopy showed good uniformity of the coating and allowed the coating thickness to be estimated at around 17 μm. The main parameters influencing the extraction efficiency, such as cork sample mass, sodium chloride mass, extraction temperature and extraction time, were optimized using a full factorial design followed by a Doehlert design. The optimum conditions were: 20 min of extraction at 70 °C using 60 mg of the cork sample and 10 mL of water saturated with sodium chloride in a 20 mL amber vial with constant magnetic stirring. Satisfactory detection limits between 2.5 and 5.1 ng g-1 were obtained, as well as good precision (R.S.D. in the range of 5.8-12.0%). Recovery tests were performed on three different cork samples, and values between 83 and 119% were obtained. The proposed SPME fiber was compared with commercially available fibers and good results were achieved, demonstrating its applicability.

  13. Foundations and latest advances in replica exchange transition interface sampling

    Science.gov (United States)

    Cabriolu, Raffaela; Skjelbred Refsnes, Kristin M.; Bolhuis, Peter G.; van Erp, Titus S.

    2017-10-01

    Nearly 20 years ago, transition path sampling (TPS) emerged as an alternative method to free energy based approaches for the study of rare events such as nucleation, protein folding, chemical reactions, and phase transitions. TPS effectively performs Monte Carlo simulations with relatively short molecular dynamics trajectories, with the advantage of not having to alter the actual potential energy surface nor the underlying physical dynamics. Although the TPS approach also introduced a methodology to compute reaction rates, this approach was for a long time considered theoretically attractive, providing the exact same results as extensively long molecular dynamics simulations, but still expensive for most relevant applications. With the increase of computer power and improvements in the algorithmic methodology, quantitative path sampling is finding applications in more and more areas of research. In particular, the transition interface sampling (TIS) and the replica exchange TIS (RETIS) algorithms have, in turn, improved the efficiency of quantitative path sampling significantly, while maintaining the exact nature of the approach. Also, open-source software packages are making these methods, for which implementation is not straightforward, now available for a wider group of users. In addition, a blooming development takes place regarding both applications and algorithmic refinements. Therefore, it is timely to explore the wide panorama of the new developments in this field. This is the aim of this article, which focuses on the most efficient exact path sampling approach, RETIS, as well as its recent applications, extensions, and variations.

  14. Adaptive maximal poisson-disk sampling on surfaces

    KAUST Repository

    Yan, Dongming

    2012-01-01

    In this paper, we study the generation of maximal Poisson-disk sets with varying radii on surfaces. Based on the concepts of power diagram and regular triangulation, we present a geometric analysis of gaps in such disk sets on surfaces, which is the key ingredient of the adaptive maximal Poisson-disk sampling framework. Moreover, we adapt the presented sampling framework for remeshing applications. Several novel and efficient operators are developed for improving the sampling/meshing quality over the state-of-the-art. © 2012 ACM.
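
As background to records like this one, the basic Poisson-disk criterion (here uniform radius, in the plane, not the adaptive on-surface algorithm of the record) can be sketched with naive dart throwing: a candidate point is accepted only if it keeps a minimum distance r to every previously accepted sample. Function and parameter names are our own.

```python
import random

def poisson_disk_dart_throwing(r, n_trials=10000, seed=0):
    """Naive dart throwing in the unit square: accept a candidate point
    only if it lies at least r away from every accepted sample."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_trials):
        p = (rng.random(), rng.random())
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= r * r
               for q in samples):
            samples.append(p)
    return samples

pts = poisson_disk_dart_throwing(0.1)
# every pair of accepted points is separated by at least r
```

Dart throwing only approaches maximality slowly; practical methods (including the record's) track the remaining gaps explicitly so that sampling terminates once no uncovered region is left.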

  15. Interactive Sample Book (ISB)

    DEFF Research Database (Denmark)

    Heimdal, Elisabeth Jacobsen; Lenau, Torben Anker; Guglielmi, Michel

    2009-01-01

    ... supervisor Torben A. Lenau. Inspiration to use smart materials: interactive textiles are still quite an unknown phenomenon to many. It is thus often difficult to communicate what kind of potential lies within these materials. This is why the ISB project was started as a practice-based research project ... and senses in relation to integrated decoration and function, primarily for indoor applications. The result of the project will be a number of interactive textiles, to be gathered in an interactive sample book (ISB), in a similar way as the sample books of wallpapers one can take home from the shop and choose from ... In other words, it is a kind of display material, which in a simple manner can illustrate how different techniques and smart materials work. The sample book should display a number of possibilities where sensor technology, smart materials and textiles are mixed to such an extent that the textile ...

  16. AN APPLICATION OF FLOW INJECTION ANALYSIS WITH GAS DIFFUSION AND SPECTROPHOTOMETRIC DETECTION FOR THE MONITORING OF DISSOLVED SULPHIDE CONCENTRATION IN ENVIRONMENTAL SAMPLES

    Directory of Open Access Journals (Sweden)

    Malwina Cykowska

    2014-10-01

    The monitoring of sulphide concentration is very important from an environmental point of view because of the high toxicity of hydrogen sulphide. Moreover, hydrogen sulphide is an important pollution indicator. In many cases the determination of sulphide is very difficult due to the complicated matrix of some environmental samples, which means that most analytical methods cannot be used. Flow injection analysis avoids the matrix problem, which makes it suitable for a wide range of applications in analytical laboratories. In this paper, the determination of dissolved sulphide in environmental samples by gas-diffusion flow injection analysis with spectrophotometric detection is presented. The gas-diffusion separation used ensures the elimination of interferences caused by the sample matrix and enables the determination of sulphides in coloured and turbid samples. Studies to optimize the measurement conditions and to determine the values of the validation parameters (e.g. limit of detection, limit of quantification, precision, accuracy) were carried out. The obtained results confirm the usefulness of the method for monitoring the concentration of dissolved sulphides in water and waste water. Full automation and work in a closed system greatly reduce the time of analysis, minimize the consumption of sample and reagents and increase the safety of the analyst's work.

  17. BRDF of Salt Pan Regolith Samples

    Science.gov (United States)

    Georgiev, Georgi T.; Gatebe, Charles K.; Butler, James J.; King, Michael D.

    2008-01-01

    Laboratory Bi-directional Reflectance Distribution Function (BRDF) measurements of salt pan regolith samples are presented in this study in an effort to understand the role of spatial and spectral variability of the natural biome. The samples were obtained from Etosha Pan, Namibia (19.20 deg S, 15.93 deg E, alt. 1100 m). It is shown how the BRDF depends on the measurement geometry (incident and scattering angles) and on the sample particle sizes. As a demonstration of the application of the results, airborne BRDF measurements acquired with NASA's Cloud Absorption Radiometer (CAR) over the same general site where the regolith samples were collected are compared with the laboratory results. Good agreement between laboratory-measured and field-measured BRDF is reported.

  18. Ionic liquids: solvents and sorbents in sample preparation.

    Science.gov (United States)

    Clark, Kevin D; Emaus, Miranda N; Varona, Marcelino; Bowers, Ashley N; Anderson, Jared L

    2018-01-01

    The applications of ionic liquids (ILs) and IL-derived sorbents are rapidly expanding. By careful selection of the cation and anion components, the physicochemical properties of ILs can be altered to meet the requirements of specific applications. Reports of IL solvents possessing high selectivity for specific analytes are numerous and continue to motivate the development of new IL-based sample preparation methods that are faster, more selective, and environmentally benign compared to conventional organic solvents. The advantages of ILs have also been exploited in solid/polymer formats in which ordinarily nonspecific sorbents are functionalized with IL moieties in order to impart selectivity for an analyte or analyte class. Furthermore, new ILs that incorporate a paramagnetic component into the IL structure, known as magnetic ionic liquids (MILs), have emerged as useful solvents for bioanalytical applications. In this rapidly changing field, this Review focuses on the applications of ILs and IL-based sorbents in sample preparation with a special emphasis on liquid phase extraction techniques using ILs and MILs, IL-based solid-phase extraction, ILs in mass spectrometry, and biological applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Spatial-dependence recurrence sample entropy

    Science.gov (United States)

    Pham, Tuan D.; Yan, Hong

    2018-03-01

    Measuring complexity in terms of the predictability of time series is a major area of research in science and engineering, and its applications are spreading throughout many scientific disciplines, where the analysis of physiological signals is perhaps the most widely reported in literature. Sample entropy is a popular measure for quantifying signal irregularity. However, the sample entropy does not take sequential information, which is inherently useful, into its calculation of sample similarity. Here, we develop a method that is based on the mathematical principle of the sample entropy and enables the capture of sequential information of a time series in the context of spatial dependence provided by the binary-level co-occurrence matrix of a recurrence plot. Experimental results on time-series data of the Lorenz system, physiological signals of gait maturation in healthy children, and gait dynamics in Huntington's disease show the potential of the proposed method.
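
The standard sample entropy that this record builds on can be computed directly from its definition: SampEn(m, r) = -ln(A/B), where B counts pairs of length-m templates within tolerance r (Chebyshev distance) and A counts the same for length m+1. A minimal O(N^2) sketch, with parameter defaults chosen only for illustration:

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series: -ln(A/B), where B counts
    template pairs of length m within tolerance r (Chebyshev distance) and
    A counts the same for length m + 1; self-matches are excluded."""
    n = len(x)

    def count(mm):
        c = 0
        for i in range(n - mm):
            for j in range(i + 1, n - mm):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    c += 1
        return c

    B = count(m)
    A = count(m + 1)
    return -math.log(A / B) if A > 0 and B > 0 else float("inf")

# a perfectly regular series is highly predictable -> SampEn near 0
regular = [0, 1] * 50
print(sample_entropy(regular, m=2, r=0.1))
```

Note that only the multiset of template matches enters this calculation; the record's point is precisely that the sequential/spatial arrangement of matches, captured via a recurrence plot, carries additional information.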

  20. Rapid assessment of Schistosoma mansoni: the validity, applicability and cost-effectiveness of the Lot Quality Assurance Sampling method in Uganda.

    Science.gov (United States)

    Brooker, Simon; Kabatereine, Narcis B; Myatt, Mark; Russell Stothard, J; Fenwick, Alan

    2005-07-01

    Rapid and accurate identification of communities at highest risk of morbidity from schistosomiasis is key for sustainable control. Although school questionnaires can effectively and inexpensively identify communities with a high prevalence of Schistosoma haematobium, parasitological screening remains the preferred option for S. mansoni. To help reduce screening costs, we investigated the validity of Lot Quality Assurance Sampling (LQAS) in classifying schools according to categories of S. mansoni prevalence in Uganda, and explored its applicability and cost-effectiveness. First, we evaluated several sampling plans using computer simulation and then field tested one sampling plan in 34 schools in Uganda. Finally, the cost-effectiveness of different screening and control strategies (including mass treatment without prior screening) was determined, and sensitivity analysis was undertaken to assess the effect of infection levels and treatment costs. In identifying schools with prevalences ≥50%, computer simulations showed that LQAS had high levels of sensitivity and specificity (>90%) at small sample sizes; in the field, LQAS in which 15 children were sampled had excellent diagnostic performance (sensitivity: 100%, specificity: 96.4%, positive predictive value: 85.7% and negative predictive value: 92.3%). Screening using LQAS was more cost-effective than mass treating all schools (US$218 vs. US$482 per high-prevalence school treated). Threshold analysis indicated that parasitological screening and mass treatment would become equivalent for settings where prevalence is ≥50% in 75% of schools and for treatment costs of US$0.19 per schoolchild. We conclude that, in Uganda, LQAS provides a rapid, valid and cost-effective method for guiding decision makers in allocating finite resources for the control of schistosomiasis.
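
The operating characteristics of an LQAS plan like the one this record evaluates follow directly from the binomial distribution: sample n children and flag the school as high-prevalence if at least d test positive. The decision threshold d = 8 and the illustrative prevalences below are our assumptions, not the plan reported in the study.

```python
from math import comb

def prob_classified_high(p, n, d):
    """P(at least d of n sampled children test positive | true prevalence p),
    assuming independent draws from a large school population."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(d, n + 1))

# Illustrative plan: sample n = 15 children, flag the school if >= 8 positive.
n, d = 15, 8
sensitivity = prob_classified_high(0.6, n, d)  # truly high-prevalence school (60%)
false_pos = prob_classified_high(0.3, n, d)    # truly low-prevalence school (30%)
print(f"sensitivity ~ {sensitivity:.3f}, false-positive rate ~ {false_pos:.3f}")
```

Sweeping d for a fixed n traces out the trade-off between sensitivity and specificity, which is how candidate sampling plans are compared before field testing.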

  1. Biopolymers for sample collection, protection, and preservation.

    Science.gov (United States)

    Sorokulova, Iryna; Olsen, Eric; Vodyanoy, Vitaly

    2015-07-01

    One of the principal challenges in the collection of biological samples from air, water, and soil matrices is that the target agents are not stable enough to be transferred from the collection point to the laboratory of choice without experiencing significant degradation and loss of viability. At present, there is no method to transport biological samples over considerable distances safely, efficiently, and cost-effectively without the use of ice or refrigeration. Current techniques of protection and preservation of biological materials have serious drawbacks. Many known techniques of preservation cause structural damage, so that biological materials lose their structural integrity and viability. We review applications of a novel bacterial preservation process, which is nontoxic and water soluble and allows for the storage of samples without refrigeration. The method is capable of protecting the biological sample from the effects of the environment for extended periods of time and then allows for the easy release of the collected biological materials from the protective medium without structural or DNA damage. Strategies for sample collection, preservation, and shipment of bacterial and viral samples are described. The water-soluble polymer is used to immobilize the biological material by replacing the water molecules within the sample with molecules of the biopolymer. The cured polymer results in a solid protective film that is stable to many organic solvents, but quickly removed by the application of a water-based solution. The process of immobilization does not require the use of any additives, accelerators, or plasticizers and does not involve high temperature or radiation to promote polymerization.

  2. Prospects for the introduction of Wide Area Monitoring Using Environmental Sampling

    International Nuclear Information System (INIS)

    Wogman, N.A.

    2013-01-01

    Nuclear proliferation signatures released to the environment must be collected and distinguished from primordial and man-made backgrounds in soils, sediments, air, and surface and underground water. The delay time between the nuclear proliferation emissions and the date of the Wide-Area Environmental Sampling (WAES) analysis will determine which radionuclides would be analyzed based upon their half-lives. Various sampling and analysis technologies have been considered here for application to a WAES. Sampling procedures and equipment discussed are aimed at aquatic, airborne particulate, gas, vegetation, sediment and/or soil, and fauna media. Specific procedures must be selected based upon the application scenario; for example, sampling in the northern latitudes under freezing conditions, sampling at the equator under tropical rain-forest conditions, sampling in the mid-latitudes under desert conditions, and sampling in the marine environment require different equipment and procedures. The paper is followed by the slides of the presentation

  3. Sample preparation optimization in fecal metabolic profiling.

    Science.gov (United States)

    Deda, Olga; Chatziioannou, Anastasia Chrysovalantou; Fasoula, Stella; Palachanis, Dimitris; Raikos, Νicolaos; Theodoridis, Georgios A; Gika, Helen G

    2017-03-15

    Metabolomic analysis of feces can provide useful insight on the metabolic status, the health/disease state of the human/animal and the symbiosis with the gut microbiome. As a result, there is recently increased interest in the application of holistic analysis of feces for biomarker discovery. For metabolomics applications, the sample preparation process used prior to the analysis of fecal samples is of high importance, as it greatly affects the obtained metabolic profile, especially since feces, as a matrix, vary in their physicochemical characteristics and molecular content. However, there is still little information in the literature and a lack of a universal approach to sample treatment for fecal metabolic profiling. The scope of the present work was to study the conditions for sample preparation of rat feces, with the ultimate goal of the acquisition of comprehensive metabolic profiles, either untargeted, by NMR spectroscopy and GC-MS, or targeted, by HILIC-MS/MS. A fecal sample pooled from male and female Wistar rats was extracted under various conditions by modifying the pH value, the nature of the organic solvent and the sample weight to solvent volume ratio. It was found that the 1/2 (wf/vs) ratio provided the highest number of metabolites under neutral and basic conditions in both untargeted profiling techniques. Concerning LC-MS profiles, neutral acetonitrile and propanol provided higher signals and wide metabolite coverage, though extraction efficiency is metabolite dependent. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Sampling methods

    International Nuclear Information System (INIS)

    Loughran, R.J.; Wallbrink, P.J.; Walling, D.E.; Appleby, P.G.

    2002-01-01

    Methods for the collection of soil samples to determine levels of 137Cs and other fallout radionuclides, such as excess 210Pb and 7Be, will depend on the purposes (aims) of the project, site and soil characteristics, analytical capacity, the total number of samples that can be analysed and the sample mass required. The latter two will depend partly on detector type and capabilities. A variety of field methods have been developed for different field conditions and circumstances over the past twenty years, many of them inherited or adapted from soil science and sedimentology. The use of 137Cs in erosion studies has been widely developed, while the application of fallout 210Pb and 7Be is still developing. Although it is possible to measure these nuclides simultaneously, it is common for experiments to be designed around the use of 137Cs alone. Caesium studies typically involve comparison of the inventories found at eroded or sedimentation sites with that of a 'reference' site. An accurate characterization of the depth distribution of these fallout nuclides is often required in order to apply and/or calibrate the conversion models. However, depending on the tracer involved, the depth distribution, and thus the sampling resolution required to define it, differs. For example, a depth resolution of 1 cm is often adequate when using 137Cs. However, fallout 210Pb and 7Be commonly have very strong surface maxima that decrease exponentially with depth, and fine depth increments are required at or close to the soil surface. Consequently, different depth-incremental sampling methods are required when using different fallout radionuclides. Geomorphic investigations also frequently require determination of the depth distribution of fallout nuclides on slopes and at depositional sites, as well as their total inventories.

  5. Manual for the sampling of uranium mine tailings

    International Nuclear Information System (INIS)

    Feenstra, S.; Reades, D.W.; Cherry, J.A.; Chambers, D.B.; Case, G.G.; Ibbotson, B.G.

    1983-04-01

    The purpose of this manual is to describe the requisite sampling procedures to provide a basis for the application of uniform high-quality standards to detailed geotechnical, hydrogeological, geochemical and air quality measurements at Canadian uranium tailings disposal sites. The report describes the objective and scope of a sampling program, the preliminary data collection, and the procedures for sampling of tailings solids, surface water and seepage, tailings porewater, and wind-blown dust and radon

  6. Application of a series of artificial neural networks to on-site quantitative analysis of lead into real soil samples by laser induced breakdown spectroscopy

    International Nuclear Information System (INIS)

    El Haddad, J.; Bruyère, D.; Ismaël, A.; Gallou, G.; Laperche, V.; Michel, K.; Canioni, L.; Bousquet, B.

    2014-01-01

    Artificial neural networks were applied to process data from on-site LIBS analysis of soil samples. A first artificial neural network allowed the relative amounts of silicate, calcareous and ore matrices in soils to be retrieved. As a consequence, each soil sample was correctly located inside the ternary diagram characterized by these three matrices, as verified by ICP-AES. Then a series of artificial neural networks was applied to quantify lead in soil samples. More precisely, two models were designed for classification purposes, according to both the type of matrix and the range of lead concentrations. Then, three quantitative models were locally applied to three data subsets. This complete approach allowed a relative error of prediction close to 20% to be reached, considered satisfactory in the case of on-site analysis. - Highlights: • Application of a series of artificial neural networks (ANN) to quantitative LIBS • Matrix-based classification of the soil samples by ANN • Concentration-based classification of the soil samples by ANN • Series of quantitative ANN models dedicated to the analysis of data subsets • Relative error of prediction lower than 20% for LIBS analysis of soil samples

  7. Newly introduced sample preparation techniques: towards miniaturization.

    Science.gov (United States)

    Costa, Rosaria

    2014-01-01

    Sampling and sample preparation are of crucial importance in an analytical procedure, representing quite often a source of errors. The technique chosen for the isolation of analytes greatly affects the success of a chemical determination. On the other hand, growing concerns about environmental and human safety, along with the introduction of international regulations for quality control, have moved the interest of scientists towards specific needs. Newly introduced sample preparation techniques are challenged to meet new criteria: (i) miniaturization, (ii) higher sensitivity and selectivity, and (iii) automation. In this survey, the most recent techniques introduced in the field of sample preparation will be described and discussed, along with many examples of applications.

  8. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical
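
The volume-weighted systematic random sampling mentioned in this record rests on a simple rule: take a single uniform random start and then every k-th item, which gives every item the same inclusion probability 1/k. A minimal sketch (the function name and the slab example are illustrative, not the guidelines' exact procedure):

```python
import random

def systematic_sample(n_items, period, seed=0):
    """Systematic random sampling: one uniform random start in
    [0, period), then every period-th item thereafter; each item
    has equal inclusion probability 1/period."""
    rng = random.Random(seed)
    start = rng.randrange(period)
    return list(range(start, n_items, period))

# e.g. choose every 4th slab of a serially sectioned organ
slabs = systematic_sample(20, 4)
# sampled indices are evenly spaced with spacing = period
```

Because inclusion probabilities are uniform, totals (e.g. organ volume from point counts on the sampled slabs) can be estimated by multiplying the sampled total by the period.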

  9. Application of specific extraction chromatographic methods to the Rb-Sr, Sm-Nd isotope study of geological samples: The Hombreiro-Santa Eulalia Granite (Lugo, NW Spain)

    OpenAIRE

    Santos Zalduegui, J. F.; Pin, C.; Aranguren, A.; Gil Ibarguchi, José Ignacio

    1996-01-01

    The analytical application to geological samples of three new chromatographic resins (TRU-Spec®, Sr-Spec® and LN-Spec®) has been investigated. Seven samples of the Hombreiro massif (Lugo, NW Spain) have been studied, which yield a Rb-Sr age of 298 ± 5 Ma (Sr0 = 0.7086, MSWD = 7.64) for the magma crystallization. Sm-Nd data for the same massif give eNd values close to -2 at 300 Ma. This suggests that the origin of the magma might be related to the partial melting of immature sediments,...

  10. Application of FTA technology for sampling, recovery and molecular characterization of viral pathogens and virus-derived transgenes from plant tissues

    Science.gov (United States)

    Ndunguru, Joseph; Taylor, Nigel J; Yadav, Jitender; Aly, Haytham; Legg, James P; Aveling, Terry; Thompson, Graham; Fauquet, Claude M

    2005-01-01

    Background Plant viral diseases present major constraints to crop production. Effective sampling of the viruses infecting plants is required to facilitate their molecular study and is essential for the development of crop protection and improvement programs. Retaining the integrity of viral pathogens within sampled plant tissues is often a limiting factor in this process, most especially when sample sizes are large and when operating in developing countries and regions remote from laboratory facilities. FTA is a paper-based system designed to fix and store nucleic acids directly from fresh tissues pressed into the treated paper. We report here the use of FTA as an effective technology for sampling and retrieval of DNA and RNA viruses from plant tissues and their subsequent molecular analysis. Results DNA and RNA viruses were successfully recovered from leaf tissues of maize, cassava, tomato and tobacco pressed into FTA® Classic Cards. Viral nucleic acids eluted from FTA cards were found to be suitable for diagnostic molecular analysis by PCR-based techniques and restriction analysis, and for cloning and nucleotide sequencing in a manner equivalent to that offered by traditional isolation methods. Efficacy of the technology was demonstrated both from sampled greenhouse-grown plants and from leaf presses taken from crop plants growing in farmers' fields in East Africa. In addition, FTA technology was shown to be suitable for recovery of virus-derived transgene sequences integrated into the plant genome. Conclusion Results demonstrate that FTA is a practical, economical and sensitive method for sampling, storage and retrieval of viral pathogens and plant genomic sequences, when working under controlled conditions and in the field. Application of this technology has the potential to significantly increase the ability to bring modern analytical techniques to bear on the viral pathogens infecting crop plants. PMID:15904535

  11. Application of FTA technology for sampling, recovery and molecular characterization of viral pathogens and virus-derived transgenes from plant tissues.

    Science.gov (United States)

    Ndunguru, Joseph; Taylor, Nigel J; Yadav, Jitender; Aly, Haytham; Legg, James P; Aveling, Terry; Thompson, Graham; Fauquet, Claude M

    2005-05-18

    Plant viral diseases present major constraints to crop production. Effective sampling of the viruses infecting plants is required to facilitate their molecular study and is essential for the development of crop protection and improvement programs. Retaining the integrity of viral pathogens within sampled plant tissues is often a limiting factor in this process, most especially when sample sizes are large and when operating in developing countries and regions remote from laboratory facilities. FTA is a paper-based system designed to fix and store nucleic acids directly from fresh tissues pressed into the treated paper. We report here the use of FTA as an effective technology for sampling and retrieval of DNA and RNA viruses from plant tissues and their subsequent molecular analysis. DNA and RNA viruses were successfully recovered from leaf tissues of maize, cassava, tomato and tobacco pressed into FTA Classic Cards. Viral nucleic acids eluted from FTA cards were found to be suitable for diagnostic molecular analysis by PCR-based techniques and restriction analysis, and for cloning and nucleotide sequencing in a manner equivalent to that offered by traditional isolation methods. Efficacy of the technology was demonstrated both from sampled greenhouse-grown plants and from leaf presses taken from crop plants growing in farmers' fields in East Africa. In addition, FTA technology was shown to be suitable for recovery of virus-derived transgene sequences integrated into the plant genome. Results demonstrate that FTA is a practical, economical and sensitive method for sampling, storage and retrieval of viral pathogens and plant genomic sequences, when working under controlled conditions and in the field. Application of this technology has the potential to significantly increase the ability to bring modern analytical techniques to bear on the viral pathogens infecting crop plants.

  12. Application of FTA technology for sampling, recovery and molecular characterization of viral pathogens and virus-derived transgenes from plant tissues

    Directory of Open Access Journals (Sweden)

    Aveling Terry

    2005-05-01

    Full Text Available Abstract Background Plant viral diseases present major constraints to crop production. Effective sampling of the viruses infecting plants is required to facilitate their molecular study and is essential for the development of crop protection and improvement programs. Retaining the integrity of viral pathogens within sampled plant tissues is often a limiting factor in this process, most especially when sample sizes are large and when operating in developing countries and regions remote from laboratory facilities. FTA is a paper-based system designed to fix and store nucleic acids directly from fresh tissues pressed into the treated paper. We report here the use of FTA as an effective technology for sampling and retrieval of DNA and RNA viruses from plant tissues and their subsequent molecular analysis. Results DNA and RNA viruses were successfully recovered from leaf tissues of maize, cassava, tomato and tobacco pressed into FTA® Classic Cards. Viral nucleic acids eluted from FTA cards were found to be suitable for diagnostic molecular analysis by PCR-based techniques and restriction analysis, and for cloning and nucleotide sequencing in a manner equivalent to that offered by traditional isolation methods. Efficacy of the technology was demonstrated both from sampled greenhouse-grown plants and from leaf presses taken from crop plants growing in farmers' fields in East Africa. In addition, FTA technology was shown to be suitable for recovery of virus-derived transgene sequences integrated into the plant genome. Conclusion Results demonstrate that FTA is a practical, economical and sensitive method for sampling, storage and retrieval of viral pathogens and plant genomic sequences, when working under controlled conditions and in the field. Application of this technology has the potential to significantly increase the ability to bring modern analytical techniques to bear on the viral pathogens infecting crop plants.

  13. Application of slurry nebulization to trace elemental analysis of some biological samples by inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Mochizuki, T.; Sakashita, A.; Iwata, H.; Ishibashi, Y.; Gunji, N.

    1991-01-01

    The application of slurry nebulization/inductively coupled plasma mass spectrometry (ICP-MS) to trace elemental analysis of biological samples has been investigated. Three standard samples of the National Institute of Standards and Technology (NIST) were dispersed in 1% aqueous Triton X-100 solution by grinding with a planetary micronizing mill. The resulting slurries were nebulized into an ICP without any additional treatment. The 1% (m/v) slurry of the NIST bovine liver showed no significant influence on cone blockage and signal suppression/enhancement. Detection limits, precision and accuracy were discussed for the determination of 24 elements of interest in bovine liver, rice flour and pine needles. Detection limits ranged from 0.0001 μg g⁻¹ for U to 0.52 μg g⁻¹ for Zn at an effective integrating time of 10 s. For high-mass elements, low blank values were obtained, yielding excellent sub-μg g⁻¹ detection limits. Acceptable accuracy and precision were obtained for most of the elements in the NIST bovine liver and rice flour, even for the volatile elements, such as As, Se and Br. However, relatively poor accuracy was obtained for the analysis of pine needles. (orig.)

  14. Validation of a standard forensic anthropology examination protocol by measurement of applicability and reliability on exhumed and archive samples of known biological attribution.

    Science.gov (United States)

    Francisco, Raffaela Arrabaça; Evison, Martin Paul; Costa Junior, Moacyr Lobo da; Silveira, Teresa Cristina Pantozzi; Secchieri, José Marcelo; Guimarães, Marco Aurelio

    2017-10-01

    Forensic anthropology makes an important contribution to human identification and to assessment of the causes and mechanisms of death and body disposal in criminal and civil investigations, including those related to atrocity, disaster and trafficking victim identification. The methods used are comparative, relying on assignment of questioned material to categories observed in standard reference material of known attribution. Reference collections typically originate in Europe and North America, and are not necessarily representative of contemporary global populations. Methods based on them must be validated when applied to novel populations. This study describes the validation of a standardized forensic anthropology examination protocol by application to two contemporary Brazilian skeletal samples of known attribution. One sample (n=90) was collected from exhumations following 7-35 years of burial, and the second (n=30) was collected following successful routine casework investigations. The study presents measurement of (1) the applicability of each of the methods used and (2) the reliability with which the biographic parameters were assigned in each case. The results are discussed with reference to published assessments of methodological reliability regarding sex, age and, in particular, ancestry estimation. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Application of ICP-MS radionuclide analysis to 'Real World' samples of Department of Energy radioactive waste

    International Nuclear Information System (INIS)

    Meeks, A.M.; Giaquinto, J.M.; Keller, J.M.

    1998-01-01

    Disposal of Department of Energy (DOE) radioactive waste into repositories such as the Waste Isolation Pilot Plant (WIPP) and the Nevada Test Site (NTS) requires characterization to ensure regulatory and transportation requirements are met and to collect information regarding the chemistry of the waste for processing concerns. The recent addition of an inductively coupled plasma quadrupole mass spectrometer in a radioactive contaminated laboratory at the Oak Ridge National Laboratory (ORNL) has allowed evaluation of the advantages of using ICP-MS over traditional techniques for some of these characterization needs. The measurement of long-lived beta nuclides (i.e. ⁹⁹Tc) by ICP-MS has resulted in better detection limits and accuracy than the traditional counting techniques, as well as reducing the need for separation/purification techniques which increase personnel exposure to radiation. Using ICP-MS for the measurement of U isotopes versus the traditional Thermal Ionization Mass Spectrometry (TIMS) technique has reduced cost and time by more than half while still maintaining the accuracy needed to determine risk assessment of the waste tanks. In addition, the application of ICP-MS to ORNL waste tank characterization has provided the opportunity to estimate non-routine radionuclides (i.e. ¹³⁵Cs and ¹⁵¹Sm) and non-routine metals (i.e. Li, Ti, rare earths, etc.) using a rapid, low-cost screening method. These application methodologies and proficiencies on ORNL waste samples are summarized throughout the paper. (author)

  16. Efficient estimation for ergodic diffusions sampled at high frequency

    DEFF Research Database (Denmark)

    Sørensen, Michael

    A general theory of efficient estimation for ergodic diffusions sampled at high frequency is presented. High-frequency sampling is now possible in many applications, in particular in finance. The theory is formulated in terms of approximate martingale estimating functions and covers a large class...

  17. Atmospheric scanning electron microscope system with an open sample chamber: Configuration and applications

    Energy Technology Data Exchange (ETDEWEB)

    Nishiyama, Hidetoshi, E-mail: hinishiy@jeol.co.jp [JEOL Ltd., 3-1-2, Musashino, Akishima, Tokyo 196-8558 (Japan); Koizumi, Mitsuru, E-mail: koizumi@jeol.co.jp [JEOL Technics Ltd., 2-6-38 Musashino, Akishima, Tokyo 196-0021 (Japan); Ogawa, Koji, E-mail: kogawa@jeol.co.jp [JEOL Technics Ltd., 2-6-38 Musashino, Akishima, Tokyo 196-0021 (Japan); Kitamura, Shinich, E-mail: kitamura@jeol.co.jp [JEOL Ltd., 3-1-2, Musashino, Akishima, Tokyo 196-8558 (Japan); Konyuba, Yuji, E-mail: ykonyuub@jeol.co.jp [JEOL Ltd., 3-1-2, Musashino, Akishima, Tokyo 196-8558 (Japan); Watanabe, Yoshiyuki, E-mail: watanabeyoshiy@pref.yamagata.jp [Yamagata Research Institute of Technology, 2-2-1, Matsuei, Yamagata 990-2473 (Japan); Ohbayashi, Norihiko, E-mail: n.ohbayashi@m.tohoku.ac.jp [Laboratory of Membrane Trafficking Mechanisms, Department of Developmental Biology and Neurosciences, Graduate School of Life Sciences, Tohoku University, Aobayama, Aoba-ku, Sendai, Miyagi 980-8578 (Japan); Fukuda, Mitsunori, E-mail: nori@m.tohoku.ac.jp [Laboratory of Membrane Trafficking Mechanisms, Department of Developmental Biology and Neurosciences, Graduate School of Life Sciences, Tohoku University, Aobayama, Aoba-ku, Sendai, Miyagi 980-8578 (Japan); Suga, Mitsuo, E-mail: msuga@jeol.co.jp [JEOL Ltd., 3-1-2, Musashino, Akishima, Tokyo 196-8558 (Japan); Sato, Chikara, E-mail: ti-sato@aist.go.jp [Biomedical Research Institute, National Institute of Advanced Industrial Science and Technology (AIST), 1-1-4, Umezono, Tsukuba 305-8568 (Japan)

    2014-12-15

    An atmospheric scanning electron microscope (ASEM) with an open sample chamber and optical microscope (OM) is described and recent developments are reported. In this ClairScope system, the base of the open sample dish is sealed to the top of the inverted SEM column, allowing the liquid-immersed sample to be observed by OM from above and by SEM from below. The optical axes of the two microscopes are aligned, ensuring that the same sample areas are imaged to realize quasi-simultaneous correlative microscopy in solution. For example, the cathodoluminescence of ZnO particles was directly demonstrated. The improved system has (i) a fully motorized sample stage, (ii) a column protection system in the case of accidental window breakage, and (iii) an OM/SEM operation system controlled by a graphical user interface. The open sample chamber allows the external administration of reagents during sample observation. We monitored the influence of added NaCl on the random motion of silica particles in liquid. Further, using fluorescence as a transfection marker, the effect of small interfering RNA-mediated knockdown of endogenous Varp on Tyrp1 trafficking in melanocytes was examined. A temperature-regulated titanium ASEM dish allowed the dynamic observation of colloidal silver nanoparticles as they were heated to 240 °C and sintered. - Highlights: • Atmospheric SEM (ASEM) allows observation of samples in liquid or gas. • Open sample chamber allows in situ monitoring of evaporation and sintering processes. • In situ monitoring of processes during reagent administration is also accomplished. • A protection system for film breakage is developed for ASEM. • Usability of ASEM has been improved significantly, including GUI control.

  18. Atmospheric scanning electron microscope system with an open sample chamber: Configuration and applications

    International Nuclear Information System (INIS)

    Nishiyama, Hidetoshi; Koizumi, Mitsuru; Ogawa, Koji; Kitamura, Shinich; Konyuba, Yuji; Watanabe, Yoshiyuki; Ohbayashi, Norihiko; Fukuda, Mitsunori; Suga, Mitsuo; Sato, Chikara

    2014-01-01

    An atmospheric scanning electron microscope (ASEM) with an open sample chamber and optical microscope (OM) is described and recent developments are reported. In this ClairScope system, the base of the open sample dish is sealed to the top of the inverted SEM column, allowing the liquid-immersed sample to be observed by OM from above and by SEM from below. The optical axes of the two microscopes are aligned, ensuring that the same sample areas are imaged to realize quasi-simultaneous correlative microscopy in solution. For example, the cathodoluminescence of ZnO particles was directly demonstrated. The improved system has (i) a fully motorized sample stage, (ii) a column protection system in the case of accidental window breakage, and (iii) an OM/SEM operation system controlled by a graphical user interface. The open sample chamber allows the external administration of reagents during sample observation. We monitored the influence of added NaCl on the random motion of silica particles in liquid. Further, using fluorescence as a transfection marker, the effect of small interfering RNA-mediated knockdown of endogenous Varp on Tyrp1 trafficking in melanocytes was examined. A temperature-regulated titanium ASEM dish allowed the dynamic observation of colloidal silver nanoparticles as they were heated to 240 °C and sintered. - Highlights: • Atmospheric SEM (ASEM) allows observation of samples in liquid or gas. • Open sample chamber allows in situ monitoring of evaporation and sintering processes. • In situ monitoring of processes during reagent administration is also accomplished. • A protection system for film breakage is developed for ASEM. • Usability of ASEM has been improved significantly, including GUI control.

  19. Monte Carlo importance sampling optimization for system reliability applications

    International Nuclear Information System (INIS)

    Campioni, Luca; Vestrucci, Paolo

    2004-01-01

    This paper focuses on the reliability analysis of multicomponent systems by the importance sampling technique and, in particular, tackles the optimization aspect. A methodology based on the minimization of the variance at the component level is proposed for the class of systems consisting of independent components. The claim is that, by means of such a methodology, the optimal biasing can be achieved without resorting to the typical trial-and-error approach.

  20. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping

    2015-06-24

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.

  1. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping; Huang, Jianhua Z.; Zhang, Nan

    2015-01-01

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.

  2. Rapid Sampling from Sealed Containers

    International Nuclear Information System (INIS)

    Johnston, R.G.; Garcia, A.R.E.; Martinez, R.K.; Baca, E.T.

    1999-01-01

    The authors have developed several different types of tools for sampling from sealed containers. These tools allow the user to rapidly drill into a closed container, extract a sample of its contents (gas, liquid, or free-flowing powder), and permanently reseal the point of entry. This is accomplished without exposing the user or the environment to the container contents, even while drilling. The entire process is completed in less than 15 seconds for a 55 gallon drum. Almost any kind of container can be sampled (regardless of the materials) with wall thicknesses up to 1.3 cm and internal pressures up to 8 atm. Samples can be taken from the top, sides, or bottom of a container. The sampling tools are inexpensive, small, and easy to use. They work with any battery-powered hand drill. This allows considerable safety, speed, flexibility, and maneuverability. The tools also permit the user to rapidly attach plumbing, a pressure relief valve, alarms, or other instrumentation to a container. Possible applications include drum venting, liquid transfer, container flushing, waste characterization, monitoring, sampling for archival or quality control purposes, emergency sampling by rapid response teams, counter-terrorism, non-proliferation and treaty verification, and use by law enforcement personnel during drug or environmental raids

  3. Automatic Sample Changer for X-Ray Spectrometry

    International Nuclear Information System (INIS)

    Morales Tarre, Orlando; Diaz Castro, Maikel; Rivero Ramirez, Doris; Lopez Pino, Neivy

    2011-01-01

    The design and construction of an automatic sample changer for Nuclear Analysis Laboratory's X-ray spectrometer at InSTEC is presented by giving basic details about its mechanical structure, control circuits and the software application developed to interact with the data acquisition software of the multichannel analyzer. Results of some test experiments performed with the automatic sample changer are also discussed. The system is currently in use at InSTEC. (Author)

  4. Application of the efficiency tracing method to the liquid scintillation metrology of 3H and 14C dual-labelled samples

    International Nuclear Information System (INIS)

    Martin-Casallo, M. T.; Los Arcos, J. M.; Grau, A.

    1989-01-01

    Two calculation procedures have been tested for the application of the efficiency tracing method to the activity determination of ³H and ¹⁴C dual-labelled samples in liquid scintillation metrology. One procedure leads to the statement of a system of linear equations as a function of the quenching parameter, while the other uses a least-squares algorithm to fit the total count rate against the quenching parameter. The first procedure is strongly sensitive to the statistical uncertainty on the partial efficiencies and produces discrepancies which may reach more than 100% compared to the real values. The second procedure leads to more reliable results, showing discrepancies between 0.1% and 0.6% for the ³H activity and between 0.6% and 5% for the ¹⁴C activity, so that the efficiency tracing method can be applied to the metrology of ³H and ¹⁴C dual-labelled samples by means of this procedure. (Author) 7 refs
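The linear-equations procedure mentioned in the record above reduces, for each quench condition j, to a count-rate balance R_j = eff_H(j)·A_H + eff_C(j)·A_C; two quench conditions then give a 2×2 system for the two activities. A minimal sketch with hypothetical efficiency and count-rate values (the record does not give numbers):

```python
# Solve the 2x2 dual-label system by Cramer's rule (sketch).
# eff: [(eff_3H, eff_14C)] at two quench conditions; rates: measured
# total count rates at the same two conditions. All values hypothetical.

def dual_label_activities(eff, rates):
    (eH1, eC1), (eH2, eC2) = eff
    r1, r2 = rates
    det = eH1 * eC2 - eH2 * eC1       # must be nonzero: conditions differ
    a_h = (r1 * eC2 - r2 * eC1) / det
    a_c = (eH1 * r2 - eH2 * r1) / det
    return a_h, a_c

# Hypothetical efficiencies at two quench levels, with rates built from
# known activities A_H = 100, A_C = 50 so the solution can be checked:
eff = [(0.40, 0.90), (0.25, 0.80)]
rates = [0.40 * 100 + 0.90 * 50, 0.25 * 100 + 0.80 * 50]
print(dual_label_activities(eff, rates))  # recovers A_H ≈ 100, A_C ≈ 50
```

The record's observation that this route is sensitive to uncertainty in the partial efficiencies shows up here as the determinant: when the two quench conditions give nearly proportional efficiency pairs, det is small and errors in eff are strongly amplified.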

  5. A line-based vegetation sampling technique and its application in ...

    African Journals Online (AJOL)

    percentage cover, density and intercept frequency) and also provides plant size distributions, yet requires no more sampling effort than the line-intercept method. A field test of the three techniques in succulent karoo showed that the discriminating ...

  6. Application of the Sampling Selection Technique in Approaching Financial Audit

    Directory of Open Access Journals (Sweden)

    Victor Munteanu

    2018-03-01

    Full Text Available In his professional approach, the financial auditor has a wide range of working techniques at his disposal, including selection techniques. They are applied depending on the nature of the information available to the financial auditor, the manner in which it is presented - paper or electronic format - and, last but not least, the time available. Several techniques are applied, successively or in parallel, to increase the reliability of the expressed opinion and to provide the audit report with a solid basis of information. Sampling is used in the phase of controlling or clarifying an identified error. The main purpose is to corroborate or measure the degree of risk detected following a pertinent analysis. Since the auditor has neither the time nor the means to rebuild the information thoroughly, the sampling technique can provide an effective response to the need to make full use of it.

  7. Levels of Cadmium in Soil, Sediment and Water Samples from ...

    African Journals Online (AJOL)

    cce

    The agricultural application of phosphate fertilizers represents a direct ... The samples were put into clean plastic containers and sealed. The plastic ... dried samples were ground and homogenized in a porcelain mortar, sieved to 40 mesh size.

  8. Direct protein quantification in complex sample solutions by surface-engineered nanorod probes

    KAUST Repository

    Schrittwieser, Stefan

    2017-06-30

    Detecting biomarkers from complex sample solutions is the key objective of molecular diagnostics. Being able to do so in a simple approach that does not require laborious sample preparation, sophisticated equipment and trained staff is vital for point-of-care applications. Here, we report on the specific detection of the breast cancer biomarker sHER2 directly from serum and saliva samples by a nanorod-based homogeneous biosensing approach, which is easy to operate as it only requires mixing of the samples with the nanorod probes. By careful nanorod surface engineering and homogeneous assay design, we demonstrate that the formation of a protein corona around the nanoparticles does not limit the applicability of our detection method, but on the contrary enables us to conduct in-situ reference measurements, thus further strengthening the point-of-care applicability of our method. Making use of sandwich assays on top of the nanorods, we obtain a limit of detection of 110 pM and 470 pM in 10-fold diluted spiked saliva and serum samples, respectively. In conclusion, our results open up numerous applications in direct protein biomarker quantification, specifically in point-of-care settings where resources are limited and ease-of-use is of the essence.

  9. Direct protein quantification in complex sample solutions by surface-engineered nanorod probes

    KAUST Repository

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J.; Lentijo Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Schotter, Joerg

    2017-01-01

    Detecting biomarkers from complex sample solutions is the key objective of molecular diagnostics. Being able to do so in a simple approach that does not require laborious sample preparation, sophisticated equipment and trained staff is vital for point-of-care applications. Here, we report on the specific detection of the breast cancer biomarker sHER2 directly from serum and saliva samples by a nanorod-based homogeneous biosensing approach, which is easy to operate as it only requires mixing of the samples with the nanorod probes. By careful nanorod surface engineering and homogeneous assay design, we demonstrate that the formation of a protein corona around the nanoparticles does not limit the applicability of our detection method, but on the contrary enables us to conduct in-situ reference measurements, thus further strengthening the point-of-care applicability of our method. Making use of sandwich assays on top of the nanorods, we obtain a limit of detection of 110 pM and 470 pM in 10-fold diluted spiked saliva and serum samples, respectively. In conclusion, our results open up numerous applications in direct protein biomarker quantification, specifically in point-of-care settings where resources are limited and ease-of-use is of the essence.

  10. Stationary and related stochastic processes sample function properties and their applications

    CERN Document Server

    Cramér, Harald

    2004-01-01

    This graduate-level text offers a comprehensive account of the general theory of stationary processes, with special emphasis on the properties of sample functions. Assuming a familiarity with the basic features of modern probability theory, the text develops the foundations of the general theory of stochastic processes, examines processes with a continuous-time parameter, and applies the general theory to procedures key to the study of stationary processes. Additional topics include analytic properties of the sample functions and the problem of time distribution of the intersections between a

  11. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
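The without-replacement probabilities discussed in the record above are those of the hypergeometric distribution: the chance of drawing exactly k successes in n draws from a population of N items containing K successes. A minimal sketch (the urn numbers are an illustrative example, not from the article):

```python
from math import comb

# Hypergeometric pmf for sampling without replacement.
# N items, K of which are "successes"; draw n items; k successes drawn.

def hypergeom_pmf(k, N, K, n):
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Example: 5 red balls among 12; draw 4 without replacement.
p_two_red = hypergeom_pmf(2, N=12, K=5, n=4)
print(round(p_two_red, 4))  # -> 0.4242
```

The same value can be reached by multiplying conditional probabilities draw by draw and summing over the orderings, which is the comparison of methods the record reviews.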

  12. Neutron beam applications - Polymer study and sample environment development for HANARO SANS instrument

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hong Doo [Kyunghee University, Seoul (Korea); Char, Kook Heon [Seoul National University, Seoul (Korea)

    2000-04-01

    A new SANS instrument will be installed in the HANARO reactor in the near future, and in parallel it is necessary to develop the sample environment facilities. One of the basic items is the equipment to control the sample temperature of the cell block with an auto-sample changer. It is required to develop control software for this purpose. In addition, software for data acquisition and analysis for the SANS instrument must be developed and supplied in order for the instrument to function properly. The PS/PI block copolymer research at NIST will provide a general understanding of the SANS instrument and valuable instrument-related information, such as standard samples for SANS and the know-how of instrument building. The following are the results of this research. a. Construction of the sample cell block. b. Software to control the temperature and the auto-sample changer. c. Acquisition of the SANS data analysis routine and its modification for HANARO SANS. d. PS/PI block copolymer research at NIST. e. Calibration data of NIST and HANARO SANS for comparison. 39 figs., 2 tabs. (Author)

  13. Nonadiabatic transition path sampling

    International Nuclear Information System (INIS)

    Sherman, M. C.; Corcelli, S. A.

    2016-01-01

    Fewest-switches surface hopping (FSSH) is combined with transition path sampling (TPS) to produce a new method called nonadiabatic path sampling (NAPS). The NAPS method is validated on a model electron transfer system coupled to a Langevin bath. Numerically exact rate constants are computed using the reactive flux (RF) method over a broad range of solvent frictions that span from the energy diffusion (low friction) regime to the spatial diffusion (high friction) regime. The NAPS method is shown to quantitatively reproduce the RF benchmark rate constants over the full range of solvent friction. Integrating FSSH within the TPS framework expands the applicability of both approaches and creates a new method that will be helpful in determining detailed mechanisms for nonadiabatic reactions in the condensed-phase.

  14. Application of neutron activation analysis to trace element determinations in lung samples

    International Nuclear Information System (INIS)

    Rocero, Sizue Ota

    1992-01-01

    The purpose of this work was to apply the instrumental neutron activation analysis method to determine trace elements in lung samples from smokers and non-smokers. The lung tissue and lymph node samples from the pulmonary hilum were collected from autopsies by researchers from the Medicine College of the University of Sao Paulo, SP, Brazil. Adequate conditions for preparation and analysis of the samples were previously established. Sample preparation consisted of homogenization, lyophilization and sterilization in a 60Co source. The samples and standards were irradiated in the IEA-R1 reactor under a thermal neutron flux of 3.7 x 10^11 n cm^-2 s^-1 for 30 min to determine Cl, K, Mn and Na, and for 16 h under a flux of 10^19 n cm^-2 s^-1 for the determination of Au, Br, Ce, Co, Cr, Cs, Eu, Fe, Hf, La, Rb, Sb, Sc, Se, Th and Zn. Counting was carried out with a hyperpure germanium (HPGe) detector connected to a 4096-channel analyzer and a microcomputer. The results obtained for the lung sample analyses indicated good reproducibility of the method for most of the elements determined, with relative standard deviations lower than 10.5%. The accuracy of the method was evaluated by analyzing reference materials such as IAEA Animal Muscle H-4, NIST Bovine Liver 1577a, IUPAC Bowen's Kale and NIES Vehicle Exhaust Particulates. The results obtained from these analyses agreed with literature values for several elements, with relative errors less than 20%. Less precise and accurate results were obtained for elements with concentrations at the μg/kg level. Elemental concentrations obtained in the lung tissue analyses were within the range of reference values for normal subjects presented in the literature, except for the Cl concentrations for non-smokers, Hf in both groups and Sb for the smokers. By comparing results obtained for lung samples from smokers and non-smokers, the concentrations of Ce, Cr and Sb were higher in lungs from smokers and the other elements were

  15. Application of neutron activation analysis to trace elements determinations in lung samples

    International Nuclear Information System (INIS)

    Rogero, S.O.

    1991-01-01

    The purpose of this work was to apply the instrumental neutron activation analysis method to determine trace elements in lung samples from smokers and non-smokers. Samples of lung tissue and lymph nodes from the pulmonary hilum were collected from autopsies by researchers from the Faculdade de Medicina da USP. (author)

  16. Inverse sampled Bernoulli (ISB) procedure for estimating a population proportion, with nuclear material applications

    International Nuclear Information System (INIS)

    Wright, T.

    1982-01-01

    A new sampling procedure is introduced for estimating a population proportion. The procedure combines the ideas of inverse binomial sampling and Bernoulli sampling. An unbiased estimator is given together with its variance. The procedure can be viewed as a generalization of inverse binomial sampling.
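
    The abstract does not reproduce the ISB estimator itself, but the inverse binomial sampling it generalizes has a classical unbiased estimator, (r - 1)/(n - 1), where sampling stops at the r-th success after n total trials. The sketch below checks that estimator's unbiasedness by simulation; it is an illustration of the underlying idea, not the paper's combined ISB procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def inverse_binomial_estimate(p_true, r, rng):
    """Sample Bernoulli(p_true) trials until r successes are seen; return
    the classical unbiased estimator (r - 1) / (n - 1) of p, where n is
    the total number of trials performed."""
    failures = rng.negative_binomial(r, p_true)  # failures before the r-th success
    n = r + failures                             # total trials performed
    return (r - 1) / (n - 1)

p_true, r = 0.3, 20
estimates = [inverse_binomial_estimate(p_true, r, rng) for _ in range(20000)]
print(np.mean(estimates))  # close to 0.3
```

    Averaging many replications recovers the true proportion, which is the point of choosing (r - 1)/(n - 1) rather than the naive ratio r/n (the latter is biased upward under stopping-at-r sampling).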

  17. Tritium sampling and measurement

    International Nuclear Information System (INIS)

    Wood, M.J.; McElroy, R.G.; Surette, R.A.; Brown, R.M.

    1993-01-01

    Current methods for sampling and measuring tritium are described. Although the basic techniques have not changed significantly over the last 10 y, there have been several notable improvements in tritium measurement instrumentation. The design and quality of commercial ion-chamber-based and gas-flow-proportional-counter-based tritium-in-air monitors have improved, an indirect result of fusion-related research in the 1980s. For tritium-in-water analysis, commercial low-level liquid scintillation spectrometers capable of detecting tritium-in-water concentrations as low as 0.65 Bq L^-1 for counting times of 500 min are available. The most sensitive method for tritium-in-water analysis is still 3He mass spectrometry. Concentrations as low as 0.35 mBq L^-1 can be detected with current equipment. Passive tritium-oxide-in-air samplers are now being used for workplace monitoring and even in some environmental sampling applications. The reliability, convenience, and low cost of passive tritium-oxide-in-air samplers make them attractive options for many monitoring applications. Airflow proportional counters currently under development look promising for measuring tritium-in-air in the presence of high gamma and/or noble gas backgrounds. However, these detectors are currently limited by their poor performance at humidities over 30%. 133 refs

  18. Tank 241-AZ-101 Mixer Pump Test Vapor Sampling and Analysis Plan

    International Nuclear Information System (INIS)

    TEMPLETON, A.M.

    2000-01-01

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for vapor samples obtained during the operation of mixer pumps in tank 241-AZ-101. The primary purpose of the mixer pump test (MPT) is to demonstrate that the two 300-horsepower mixer pumps installed in tank 241-AZ-101 can mobilize the settled sludge so that it can be retrieved for treatment and vitrification. Sampling will be performed in accordance with Tank 241-AZ-101 Mixer Pump Test Data Quality Objective (Banning 1999) and Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis (Mulkey 1999). The sampling will verify whether the current air emission estimates used in the permit application are correct and will provide information for future air permit applications.

  19. Archive-cup insert for liquid-metal sampling

    International Nuclear Information System (INIS)

    Nelson, P.A.; Kolba, V.M.; Filewicz, E.C.; Holmes, J.T.

    1975-01-01

    An insert for collecting liquid-metal samples within a vertical casing including an elongated housing with an upper and a lower overflow seal of annular shape is described. The lower seal includes a centrally located pedestal on which a sample cup is disposed. Liquid metal enters the annulus of the upper seal and overflows into the cup which fills and overflows into the lower seal. Liquid-metal overflow from the lower seal is discharged from the insert. On cooling, the liquid metal trapped within the seals solidifies to hermetically isolate the metal sample within the cup. The device is particularly applicable for use with sampling systems on liquid metal-cooled reactors. (U.S.)

  20. High-Precision In Situ 87Sr/86Sr Analyses through Microsampling on Solid Samples: Applications to Earth and Life Sciences

    Directory of Open Access Journals (Sweden)

    Sara Di Salvo

    2018-01-01

    An analytical protocol for high-precision, in situ microscale isotopic investigations is presented here, which combines the use of a high-performing mechanical microsampling device and high-precision TIMS measurements on micro-Sr samples, allowing for excellent results in both accuracy and precision. The present paper is a detailed methodological description of the whole analytical procedure, from sampling to elemental purification and Sr-isotope measurement. The method offers the potential to attain isotope data at the microscale on a wide range of solid materials with minimally invasive sampling. In addition, we present three significant case studies for the geological and life sciences, as examples of the various applications of microscale 87Sr/86Sr isotope ratios, concerning (i) the pre-eruptive mechanisms triggering recent eruptions at Nisyros volcano (Greece), (ii) the dynamics involved in the initial magma ascent during the 2010 eruption of Eyjafjallajökull volcano (Iceland), which are usually related to the precursory signals of the eruption, and (iii) the environmental context of a MIS 3 cave bear, Ursus spelaeus. The studied cases show the robustness of the method, which can also be applied in other areas, such as cultural heritage, archaeology, petrology, and forensic sciences.

  1. Methods for the collection of subsurface samples during environmental site assessments

    International Nuclear Information System (INIS)

    Weinstock, E.A.

    1996-01-01

    This paper discusses numerous sample collection techniques that have been successfully employed during Phase 2 Assessments and presents case histories of their application. Pollutants of concern include PCE and petroleum. The collection of shallow soil samples is described using commercially available hand augers and hand-driven core samplers. These devices are modified with extensions to collect deeper samples from storm drains and leaching pools. The performance of soil gas surveys is described using both hand-driven sample probes and vehicle-mounted, hydraulically driven vapor probes. Once the soil vapor is collected at the ground surface, a sample of the media is either analyzed on-site using a field-operated detection device or delivered to a laboratory for analysis. Applications and case histories of the Geoprobe (trademark) sampling system, a form of direct-push technology, are described. This device uses vehicle-mounted, hydraulically driven sample probes. The probe can be advanced to depths as great as 100 feet below grade and can retrieve soil, soil gas, and groundwater samples.

  2. Study of sample-detector assemblies for application to in-situ measurement of radioactivity in liquid effluents

    International Nuclear Information System (INIS)

    Pendharkar, K.A.; Narayanan Kutty, K.; Krishnamony, S.

    1991-01-01

    This paper describes the experimental investigations carried out on four different types of sample-detector assemblies with a view to determining their detection limits and relative merits for application to in-situ measurement of radioactivity in liquid effluents. The four systems studied were: (1) gamma detection using 11 cm x 8 cm NaI (Tl) scintillation detector inserted in the cavity of a specially designed stainless steel chamber of capacity 15 liters, (2) gamma detection using a metal-walled G.M. counter in a similar manner, (3) beta detection using twin thin-walled G.M. counters immersed in liquid, and (4) end window G.M. counter positioned above the liquid surface in a shallow tray. The design features of an in-line monitor employing a 11 cm x 8 cm NaI (Tl) detector used for the routine monitoring of beta gamma activity concentrations in the low level effluents of the Tarapur Fuel Processing Plant are described. (author). 1 tab

  3. Efficient maximal Poisson-disk sampling and remeshing on surfaces

    KAUST Repository

    Guo, Jianwei; Yan, Dongming; Jia, Xiaohong; Zhang, Xiaopeng

    2015-01-01

    Poisson-disk sampling is one of the fundamental research problems in computer graphics that has many applications. In this paper, we study the problem of maximal Poisson-disk sampling on mesh surfaces. We present a simple approach that generalizes the 2D maximal sampling framework to surfaces. The key observation is to use a subdivided mesh as the sampling domain for conflict checking and void detection. Our approach improves the state-of-the-art approach in efficiency, quality and the memory consumption.

  4. Efficient maximal Poisson-disk sampling and remeshing on surfaces

    KAUST Repository

    Guo, Jianwei

    2015-02-01

    Poisson-disk sampling is one of the fundamental research problems in computer graphics that has many applications. In this paper, we study the problem of maximal Poisson-disk sampling on mesh surfaces. We present a simple approach that generalizes the 2D maximal sampling framework to surfaces. The key observation is to use a subdivided mesh as the sampling domain for conflict checking and void detection. Our approach improves the state-of-the-art approach in efficiency, quality and the memory consumption.
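
    As a toy illustration of the minimum-distance property that defines Poisson-disk sampling (not the surface algorithm of the paper, which works on a subdivided mesh), naive dart throwing in the unit square accepts a candidate point only if it lies at least one disk radius from every previously accepted sample:

```python
import numpy as np

def dart_throwing(radius, n_darts, rng):
    """Naive 2D Poisson-disk sampling in the unit square: accept a random
    dart only if it is at least `radius` away from every accepted sample.
    O(n^2) and not maximal -- for exposition only."""
    samples = []
    for _ in range(n_darts):
        p = rng.random(2)
        if all(np.hypot(*(p - q)) >= radius for q in samples):
            samples.append(p)
    return np.array(samples)

rng = np.random.default_rng(1)
pts = dart_throwing(0.05, 2000, rng)
# every pair of accepted samples is separated by at least the disk radius
```

    Maximal sampling additionally requires that no empty disk of the given radius remains anywhere in the domain; detecting such voids efficiently (here, via conflict checking on a subdivided mesh) is the hard part the paper addresses.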

  5. Sample preparation with solid phase microextraction and exhaustive extraction approaches: Comparison for challenging cases.

    Science.gov (United States)

    Boyacı, Ezel; Rodríguez-Lafuente, Ángel; Gorynski, Krzysztof; Mirnaghi, Fatemeh; Souza-Silva, Érica A; Hein, Dietmar; Pawliszyn, Janusz

    2015-05-11

    In chemical analysis, sample preparation is frequently considered the bottleneck of the entire analytical method. The success of the final method strongly depends on understanding the entire process of analysis of a particular type of analyte in a sample, namely: the physicochemical properties of the analytes (solubility, volatility, polarity, etc.), the environmental conditions, and the matrix components of the sample. Various sample preparation strategies have been developed based on exhaustive or non-exhaustive extraction of analytes from matrices. Undoubtedly, among all sample preparation approaches, liquid extraction, including liquid-liquid extraction (LLE) and solid phase extraction (SPE), are the most well-known, widely used, and commonly accepted methods by many international organizations and accredited laboratories. Both methods are well documented and there are many well-defined procedures, which make them, at first sight, the methods of choice. However, many challenging tasks, such as complex matrix applications, on-site and in vivo applications, and determination of matrix-bound and free concentrations of analytes, are not easily attainable with these classical approaches to sample preparation. In the last two decades, the introduction of solid phase microextraction (SPME) has brought significant progress in the sample preparation area by facilitating on-site and in vivo applications and time-weighted average (TWA) and instantaneous concentration determinations. Recently introduced matrix-compatible coatings for SPME facilitate direct extraction from complex matrices and fill the gap in direct sampling from challenging matrices. Following the introduction of SPME, numerous other microextraction approaches evolved to address the limitations of the above-mentioned techniques. There is not a single method that can be considered a universal solution for sample preparation. This review aims to show the main advantages and limitations of the above mentioned sample

  6. A scenario tree model for the Canadian Notifiable Avian Influenza Surveillance System and its application to estimation of probability of freedom and sample size determination.

    Science.gov (United States)

    Christensen, Jette; Stryhn, Henrik; Vallières, André; El Allaki, Farouk

    2011-05-01

    In 2008, Canada designed and implemented the Canadian Notifiable Avian Influenza Surveillance System (CanNAISS) with six surveillance activities in a phased-in approach. CanNAISS qualified as a surveillance system because it comprised more than one surveillance activity or component in 2008: passive surveillance; pre-slaughter surveillance; and voluntary enhanced notifiable avian influenza surveillance. Our objectives were to give a short overview of two active surveillance components in CanNAISS and to describe the CanNAISS scenario tree model and its application to estimating the probability of populations being free of NAI virus infection and to sample size determination. Our data from the pre-slaughter surveillance component included diagnostic test results from 6296 serum samples representing 601 commercial chicken and turkey farms collected from 25 August 2008 to 29 January 2009. In addition, we included data from a sub-population of farms with high biosecurity standards: 36,164 samples from 55 farms sampled repeatedly over the 24-month study period from January 2007 to December 2008. All submissions were negative for Notifiable Avian Influenza (NAI) virus infection. We developed the CanNAISS scenario tree model to estimate the surveillance component sensitivity and the probability of a population being free of NAI at design prevalences of 0.01 at the farm level and 0.3 within farms. We propose that a general model, such as the CanNAISS scenario tree model, may have broader application than more detailed models that require disease-specific input parameters, such as relative risk estimates. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
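
    The quantities named in the abstract — component sensitivity at given design prevalences and the resulting probability of freedom — follow standard scenario-tree surveillance arithmetic. A minimal sketch is below; the test sensitivity, sample sizes, and prior are illustrative assumptions, not CanNAISS parameters, and a real scenario tree would add risk-group branches.

```python
def herd_sensitivity(se_test, p_within, n_sampled):
    """Probability that at least one sampled animal tests positive,
    given the farm is infected at within-farm design prevalence p_within."""
    return 1.0 - (1.0 - se_test * p_within) ** n_sampled

def surveillance_sensitivity(hse, p_farm, n_farms):
    """Probability the component detects infection, given the population
    is infected at farm-level design prevalence p_farm."""
    return 1.0 - (1.0 - hse * p_farm) ** n_farms

def prob_freedom(prior_free, sse):
    """Posterior probability of freedom after an all-negative round (Bayes)."""
    return prior_free / (prior_free + (1.0 - sse) * (1.0 - prior_free))

# assumed ELISA-like Se = 0.9 and 10 birds sampled per farm;
# design prevalences from the abstract: 0.3 within-farm, 0.01 farm-level
hse = herd_sensitivity(0.9, 0.3, n_sampled=10)
sse = surveillance_sensitivity(hse, 0.01, n_farms=601)
print(round(sse, 3), round(prob_freedom(0.5, sse), 3))
```

    The same relations can be inverted for sample size determination: fix a target surveillance sensitivity and solve for the number of farms or animals required.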

  7. Sampling Transition Pathways in Highly Correlated Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, David

    2004-10-20

    This research grant supported my group's efforts to apply and extend the method of transition path sampling that we invented during the late 1990s. This methodology is based upon a statistical mechanics of trajectory space. Traditional statistical mechanics focuses on state space, and with it, one can use Monte Carlo methods to facilitate importance sampling of states. With our formulation of a statistical mechanics of trajectory space, we have succeeded at creating algorithms by which importance sampling can be done for dynamical processes. In particular, we are able to study rare but important events without prior knowledge of transition states or mechanisms. In perhaps the most impressive application of transition path sampling, my group combined forces with Michele Parrinello and his coworkers to unravel the dynamics of autoionization of water [5]. This dynamics is the fundamental kinetic step of pH. Other applications concern the nature of dynamics far from equilibrium [1, 7], nucleation processes [2], cluster isomerization, melting and dissociation [3, 6], and molecular motors [10]. Research groups throughout the world are adopting transition path sampling. In part this has been the result of our efforts to provide pedagogical presentations of the technique [4, 8, 9], as well as to provide new procedures for interpreting trajectories of complex systems [11].

  8. Efficient Sample Tracking With OpenLabFramework

    DEFF Research Database (Denmark)

    List, Markus; Schmidt, Steffen; Trojnar, Jakub

    2014-01-01

    of samples created and need to be replaced with state-of-the-art laboratory information management systems. Such systems have been developed in large numbers, but they are often limited to specific research domains and types of data. One domain so far neglected is the management of libraries of vector clones...... and genetically engineered cell lines. OpenLabFramework is a newly developed web-application for sample tracking, particularly laid out to fill this gap, but with an open architecture allowing it to be extended for other biological materials and functional data. Its sample tracking mechanism is fully customizable...

  9. Drop-on-demand sample introduction system coupled with the flowing atmospheric-pressure afterglow for direct molecular analysis of complex liquid microvolume samples.

    Science.gov (United States)

    Schaper, J Niklas; Pfeuffer, Kevin P; Shelley, Jacob T; Bings, Nicolas H; Hieftje, Gary M

    2012-11-06

    One of the fastest developing fields in analytical spectrochemistry in recent years is ambient desorption/ionization mass spectrometry (ADI-MS). This burgeoning interest has been due to the demonstrated advantages of the method: simple mass spectra, little or no sample preparation, and applicability to samples in the solid, liquid, or gaseous state. One such ADI-MS source, the flowing atmospheric-pressure afterglow (FAPA), is capable of direct analysis of solids simply by aiming the source at the solid surface and sampling the produced ions into a mass spectrometer. However, direct introduction of significant volumes of liquid samples into this source has not been possible, as solvent loads can quench the afterglow and, thus, the formation of reagent ions. As a result, the analysis of liquid samples is preferably carried out by analyzing dried residues or by desorbing small amounts of liquid samples directly from the liquid surface. In the former case, reproducibility of sample introduction is crucial if quantitative results are desired. In the present study, introducing liquid samples as very small droplets overcomes the sample-positioning issues and keeps solvent intake low. A recently developed "drop-on-demand" (DOD) aerosol generator is capable of reproducibly producing very small volumes of liquid (∼17 pL). In this paper, the coupling of FAPA-MS and DOD is reported and applications are suggested. Analytes representing different classes of substances were tested and limits of detection were determined. Matrix tolerance was investigated for drugs of abuse and their metabolites by analyzing raw urine samples and quantifying without the use of internal standards. Limits of detection below 2 μg/mL, without sample pretreatment, were obtained.

  10. Authentication Assurance Level Application to the Inventory Sampling Measurement System

    International Nuclear Information System (INIS)

    Devaney, Mike M.; Kouzes, Richard T.; Hansen, Randy R.; Geelhood, Bruce D.

    2001-01-01

    This document concentrates on the identification of a standardized assessment approach for the verification of security functionality in specific equipment, the Inspection Sampling Measurement System (ISMS) being developed for MAYAK. Specifically, an Authentication Assurance Level 3 is proposed to be reached in authenticating the ISMS

  11. Enhanced AFCI Sampling, Analysis, and Safeguards Technology Review

    Energy Technology Data Exchange (ETDEWEB)

    John Svoboda

    2009-09-01

    The focus of this study includes the investigation of sampling technologies used in industry and their potential application to nuclear fuel processing. The goal is to identify innovative sampling methods using state-of-the-art techniques that could evolve into the next generation sampling and analysis system for metallic elements. Sampling and analysis of nuclear fuel recycling plant processes is required both to monitor the operations and to ensure Safeguards and Security goals are met. In addition, environmental regulations lead to additional samples and analyses to meet licensing requirements. The volume of samples taken by conventional means can restrain productivity while samples are analyzed, require process holding tanks that are sized to meet analytical issues rather than process issues (and that create a larger facility footprint), or, in some cases, simply overwhelm analytical laboratory capabilities. These issues only grow when process flowsheets propose new separations systems and new byproduct material for transmutation purposes. Novel means of streamlining both sampling and analysis are being evaluated to increase efficiency while meeting all requirements for information. This report addresses just a part of the effort to develop and study novel methods by focusing on the sampling and analysis of aqueous samples for metallic elements. It presents an overview of the sampling requirements, including frequency, sensitivity, accuracy, and programmatic drivers, to demonstrate the magnitude of the task. The sampling and analysis system needed for metallic element measurements is then discussed, and novel options being applied to other industrial analytical needs are presented. Inductively coupled plasma mass spectrometry (ICP-MS) instruments are the most versatile for metallic element analyses and are thus chosen as the focus of the study. Candidate novel means of process sampling, as well as modifications that are necessary to couple such instruments to

  12. Monoclonal antibody-based dipstick assay: a reliable field applicable technique for diagnosis of Schistosoma mansoni infection using human serum and urine samples.

    Science.gov (United States)

    Demerdash, Zeinab; Mohamed, Salwa; Hendawy, Mohamed; Rabia, Ibrahim; Attia, Mohy; Shaker, Zeinab; Diab, Tarek M

    2013-02-01

    A field-applicable diagnostic technique, the dipstick assay, was evaluated for its sensitivity and specificity in diagnosing human Schistosoma mansoni infection. A monoclonal antibody (mAb) against S. mansoni adult worm tegumental antigen (AWTA) was employed in dipstick and sandwich ELISA for detection of circulating schistosome antigen (CSA) in both serum and urine samples. Based on clinical and parasitological examinations, 60 S. mansoni-infected patients, 30 patients infected with parasites other than schistosomes, and 30 uninfected healthy individuals were selected. The sensitivity and specificity of the dipstick assay in urine samples were 86.7% and 90.0%, respectively, compared to 90.0% sensitivity and 91.7% specificity for sandwich ELISA. In serum samples, the sensitivity and specificity were 88.3% and 91.7% for the dipstick assay vs. 91.7% and 95.0% for sandwich ELISA, respectively. The diagnostic efficacy of the dipstick assay in urine and serum samples was 88.3% and 90.0%, while it was 90.8% and 93.3% for sandwich ELISA, respectively. The diagnostic indices of the dipstick assay and ELISA, either in serum or in urine, were statistically comparable (P>0.05). In conclusion, the dipstick assay offers a simple, rapid, non-invasive alternative technique for detecting CSA, or a complement to stool examination, especially in field studies.
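
    The reported indices are consistent with a standard 2x2 confusion table. The sketch below reconstructs illustrative counts from the urine dipstick percentages (assuming 60 infected and 60 control subjects, as in the study design) to show how sensitivity, specificity, and diagnostic efficacy relate:

```python
def diagnostic_indices(tp, fn, tn, fp):
    """Standard diagnostic indices from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)            # true positive rate
    specificity = tn / (tn + fp)            # true negative rate
    efficacy = (tp + tn) / (tp + fn + tn + fp)  # overall accuracy
    return sensitivity, specificity, efficacy

# counts consistent with the urine dipstick figures:
# 60 infected (52 detected), 60 controls (54 correctly negative)
sens, spec, eff = diagnostic_indices(tp=52, fn=8, tn=54, fp=6)
print(f"{sens:.1%} {spec:.1%} {eff:.1%}")  # 86.7% 90.0% 88.3%
```

    Note how the 88.3% "diagnostic efficacy" in the abstract is simply overall accuracy: the average of sensitivity and specificity weighted by group size.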

  13. Stratospheric Air Sub-sampler (SAS) and its application to analysis of Delta O-17(CO2) from small air samples collected with an AirCore

    NARCIS (Netherlands)

    Mrozek, Dorota Janina; van der Veen, Carina; Hofmann, Magdalena E. G.; Chen, Huilin; Kivi, Rigel; Heikkinen, Pauli; Rockmann, Thomas

    2016-01-01

    We present the set-up and a scientific application of the Stratospheric Air Sub-sampler (SAS), a device to collect and to store the vertical profile of air collected with an AirCore (Karion et al., 2010) in numerous sub-samples for later analysis in the laboratory. The SAS described here is a 20m

  14. Honest Importance Sampling with Multiple Markov Chains.

    Science.gov (United States)

    Tan, Aixin; Doss, Hani; Hobert, James P

    2015-01-01

    Importance sampling is a classical Monte Carlo technique in which a random sample from one probability density, π1, is used to estimate an expectation with respect to another, π. The importance sampling estimator is strongly consistent and, as long as two simple moment conditions are satisfied, it obeys a central limit theorem (CLT). Moreover, there is a simple consistent estimator for the asymptotic variance in the CLT, which makes for routine computation of standard errors. Importance sampling can also be used in the Markov chain Monte Carlo (MCMC) context. Indeed, if the random sample from π1 is replaced by a Harris ergodic Markov chain with invariant density π1, then the resulting estimator remains strongly consistent. There is a price to be paid however, as the computation of standard errors becomes more complicated. First, the two simple moment conditions that guarantee a CLT in the iid case are not enough in the MCMC context. Second, even when a CLT does hold, the asymptotic variance has a complex form and is difficult to estimate consistently. In this paper, we explain how to use regenerative simulation to overcome these problems. Actually, we consider a more general set up, where we assume that Markov chain samples from several probability densities, π1, …, πk, are available. We construct multiple-chain importance sampling estimators for which we obtain a CLT based on regeneration. We show that if the Markov chains converge to their respective target distributions at a geometric rate, then under moment conditions similar to those required in the iid case, the MCMC-based importance sampling estimator obeys a CLT. Furthermore, because the CLT is based on a regenerative process, there is a simple consistent estimator of the asymptotic variance. We illustrate the method with two applications in Bayesian sensitivity analysis. The first concerns one-way random effects models under different priors. The second involves Bayesian variable
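
    The iid baseline the abstract starts from — weighting draws from π1 to estimate an expectation under π, with a routine CLT-based standard error — can be sketched with a toy Gaussian example (this is the plain iid case, not the paper's multiple-chain MCMC estimator):

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# target pi = N(0, 1); proposal pi1 = N(0, 2) has heavier tails,
# so the weight moment conditions behind the CLT hold
n = 100_000
x = rng.normal(0.0, 2.0, size=n)
w = normal_pdf(x, 0.0, 1.0) / normal_pdf(x, 0.0, 2.0)  # importance weights

f = x ** 2
est = np.mean(w * f)                     # estimate of E_pi[X^2] = 1
se = np.std(w * f, ddof=1) / np.sqrt(n)  # routine iid standard error
```

    The paper's point is precisely that this simple `se` recipe fails when `x` comes from a Markov chain rather than iid draws: the moment conditions no longer suffice, and regenerative simulation is used to recover a consistent variance estimate.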

  15. Recombinant plasmid-based quantitative Real-Time PCR analysis of Salmonella enterica serotypes and its application to milk samples.

    Science.gov (United States)

    Gokduman, Kurtulus; Avsaroglu, M Dilek; Cakiris, Aris; Ustek, Duran; Gurakan, G Candan

    2016-03-01

    The aim of the current study was to develop a new, rapid, sensitive and quantitative Salmonella detection method using a Real-Time PCR technique based on an inexpensive, easy-to-produce, convenient and standardized recombinant plasmid positive control. To achieve this, two recombinant plasmids were constructed as reference molecules by cloning the two most commonly used Salmonella-specific target gene regions, invA and ttrRSBC. The more rapid detection enabled by the developed method (21 h) compared to the traditional culture method (90 h) allows the quantitative evaluation of Salmonella (quantification limits of 10^1 CFU/ml and 10^0 CFU/ml for the invA target and the ttrRSBC target, respectively), as illustrated using milk samples. Three advantages illustrated by the current study demonstrate the potential of the newly developed method for routine analyses in the medical, veterinary, food and water/environmental sectors: (I) the method provides fast analyses, including simultaneous detection and determination of correct pathogen counts; (II) the method is applicable to challenging samples, such as milk; (III) the method's positive controls (recombinant plasmids) are reproducible in large quantities without the need to construct new calibration curves. Copyright © 2016 Elsevier B.V. All rights reserved.
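
    Quantification against a plasmid-standard calibration curve is conventionally done by linear regression of Ct on log10 copy number, then inverting the curve for unknowns. A generic sketch follows; the slope and Ct values are synthetic illustrations, not values from the paper:

```python
import numpy as np

def fit_standard_curve(log10_copies, ct):
    """Least-squares fit of Ct = slope * log10(copies) + intercept.
    Efficiency of 1.0 corresponds to perfect doubling per cycle."""
    slope, intercept = np.polyfit(log10_copies, ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

def quantify(ct_unknown, slope, intercept):
    """Invert the calibration curve to estimate copy number."""
    return 10 ** ((ct_unknown - intercept) / slope)

# synthetic ten-fold dilution series of a plasmid standard (10^1..10^6 copies)
logs = np.arange(1, 7, dtype=float)
cts = 38.0 - 3.32 * logs  # slope of -3.32 cycles/decade ~= 100% efficiency
slope, intercept, eff = fit_standard_curve(logs, cts)
copies = quantify(ct_unknown=31.36, slope=slope, intercept=intercept)  # ~100 copies
```

    The recombinant-plasmid control described in the abstract supplies exactly this dilution series, so the same fitted curve can be reused across runs without reconstructing standards.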

  16. Non-uniform sampling and wide range angular spectrum method

    International Nuclear Information System (INIS)

    Kim, Yong-Hae; Byun, Chun-Won; Oh, Himchan; Lee, JaeWon; Pi, Jae-Eun; Heon Kim, Gi; Lee, Myung-Lae; Ryu, Hojun; Chu, Hye-Yong; Hwang, Chi-Sun

    2014-01-01

    A novel method is proposed for simulating free-space field propagation from a source plane to a destination plane that is applicable to both small and large propagation distances. The angular spectrum method (ASM) is widely used for simulating near-field propagation, but it causes a numerical error when the propagation distance is large, because of aliasing due to undersampling. The band-limited ASM satisfies the Nyquist condition on sampling by limiting the bandwidth of the propagation field to avoid aliasing errors, which extends the applicable propagation distance of the ASM. However, the band-limited ASM also introduces an error due to the decrease of the effective sampling number in Fourier space when the propagation distance is large. In the proposed wide-range ASM, we use non-uniform sampling in Fourier space to keep the effective sampling number constant even when the propagation distance is large. As a result, the wide-range ASM can produce simulation results with high accuracy for both far- and near-field propagation. For non-paraxial wave propagation, we applied the wide-range ASM to a shifted destination plane as well. (paper)
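
    The band-limited ASM baseline that the proposed method improves on can be sketched in a few lines of NumPy: propagate the angular spectrum with the exact transfer function, drop evanescent components, and mask frequencies beyond a distance-dependent band limit (in the style of Matsushima and Shimobaba). The wide-range non-uniform sampling itself is not reproduced here.

```python
import numpy as np

def angular_spectrum_propagate(u0, wavelength, dx, z):
    """Band-limited angular spectrum propagation of a square field u0
    (N x N samples, pixel pitch dx) over distance z."""
    n = u0.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                      # spatial frequencies [1/m]
    FX, FY = np.meshgrid(fx, fx, indexing="ij")
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    h = np.where(arg > 0, np.exp(1j * kz * z), 0.0)   # drop evanescent waves
    # band limit to suppress aliasing of the transfer function at large z
    df = 1.0 / (n * dx)
    f_limit = 1.0 / (wavelength * np.sqrt((2 * df * z) ** 2 + 1.0))
    h = h * ((np.abs(FX) < f_limit) & (np.abs(FY) < f_limit))
    return np.fft.ifft2(np.fft.fft2(u0) * h)

# Gaussian beam over 1 mm; all spectral content is propagating and in-band,
# so total energy should be conserved
n, dx, lam = 256, 10e-6, 633e-9
xs = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(xs, xs, indexing="ij")
u0 = np.exp(-(X**2 + Y**2) / (2 * (0.2e-3) ** 2)).astype(complex)
u1 = angular_spectrum_propagate(u0, lam, dx, 1e-3)
```

    The error mode the abstract describes appears when z grows: `f_limit` shrinks, fewer Fourier samples remain inside the band, and the effective sampling number drops — which is exactly what the wide-range ASM's non-uniform Fourier sampling is designed to counteract.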

  17. Application of neutron activation analysis technique in elemental determination of lichen samples

    International Nuclear Information System (INIS)

    Djoko Prakoso Dwi Atmodjo; Syukria Kurniawati; Woro Yatu Niken Syahfitri; Nana Suherman; Dadang Supriatna

    2010-01-01

    Lichen is one of the biological materials used as a pollution monitor that can give information about the level, direction, and history of various pollutants in the environment. Sample weights are small and the elemental content of lichens is on the order of ppm, so characterization requires advanced analytical techniques with high sensitivity that are capable of analyzing samples with weights of about 25 mg, such as neutron activation analysis. In this research, elements were determined in lichen samples obtained from the Kiaracondong and Holis areas in Bandung city, to understand the difference in industrial exposure level on the surrounding environment. Samples were irradiated in RSG GA Siwabessy, Serpong, at 15 MW for 1-2 minutes (short irradiation) and 60 minutes (long irradiation). The samples were then counted using an HPGe detector with GENIE 2000 software. The levels of Co, Cr, Cs, Fe, Mg, Mn, Sb, Sc, and V in lichen from the Kiaracondong area were in the ranges 0.55-0.86, 1.47-2.57, 0.87-1.19, 540-1005, 949-1674, 34.91-45.94, 0.08-0.14, 0.16-0.31, and ≤ 2.33 mg/kg, respectively, while for the Holis area they were 1.04-2.37, 4.41-10.36, 0.41-0.89, 3166-709, 1131-1422, 40.97-72.51, 0.33-0.50, 0.98-2.18, and 5.30-13.05 mg/kg, respectively. These results indicate that pollution exposure from the semi-industrial Holis area has a greater influence on the surrounding environment than that from the semi-industrial Kiaracondong area. (author)

  18. Hydrazine Determination in Sludge Samples by High Performance Liquid Chromatography

    Energy Technology Data Exchange (ETDEWEB)

    G. Elias; G. A. Park

    2006-02-01

    A high-performance liquid chromatographic method using ultraviolet (UV) detection was developed to detect and quantify hydrazine in a variety of environmental matrices. The method was developed primarily for sludge samples, but it is also applicable to soil and water samples. The hydrazine in the matrices was derivatized to its hydrazone with benzaldehyde. The derivatized hydrazones were separated using high performance liquid chromatography (HPLC) with a reversed-phase C-18 column in an isocratic mode with methanol-water (95:5, v/v), and detected by UV at 313 nm. The detection limit (25 ml) of the new analytical method is 0.0067 mg ml-1 of hydrazine. Hydrazine showed low recovery in soil samples because components in soil oxidized hydrazine. Sludge samples that contained relatively high soil content also showed lower recovery. The technique is relatively simple and cost-effective, and is applicable to hydrazine analysis in different environmental matrices.

  19. Pressure Stimulated Currents (PSC) in marble samples

    Directory of Open Access Journals (Sweden)

    F. Vallianatos

    2004-06-01

    The electrical behaviour of marble samples from Penteli Mountain was studied while they were subjected to uniaxial stress. The application of consecutive impulsive variations of uniaxial stress to thirty connatural samples produced Pressure Stimulated Currents (PSC). The linear relationship between the recorded PSC and the applied variation rate was investigated. The main results are the following: as long as the samples were under pressure corresponding to their elastic region, the maximum PSC value obeyed a linear law with respect to the pressure variation. In the plastic region, deviations were observed which were due to variations of Young's modulus. Furthermore, a special burst form of PSC recordings during failure is presented. The latter is emitted when irregular longitudinal splitting is observed during failure.

  20. 40 CFR Appendix II to Part 600 - Sample Fuel Economy Calculations

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Sample Fuel Economy Calculations II... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Pt. 600, App. II Appendix II to Part 600—Sample Fuel Economy Calculations (a) This sample fuel economy calculation is applicable to...

  1. Statistical sampling techniques as applied to OSE inspections

    International Nuclear Information System (INIS)

    Davis, J.J.; Cote, R.W.

    1987-01-01

    The need has been recognized for statistically valid methods for gathering information during OSE inspections, and for interpretation of results, both from performance testing and from records reviews, interviews, etc. Battelle Columbus Division, under contract to DOE OSE, has performed and is continuing to perform work in the area of statistical methodology for OSE inspections. This paper presents some of the sampling methodology currently being developed for use during OSE inspections. Topics include population definition, sample size requirements, level of confidence, and practical logistical constraints associated with the conduct of an inspection based on random sampling. Sequential sampling schemes and sampling from finite populations are also discussed. The methods described are applicable to various data gathering activities, ranging from the sampling and examination of classified documents to the sampling of Protective Force security inspectors for skill testing
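    As a hedged illustration of the finite-population sample-size arithmetic mentioned here (not Battelle's actual methodology), the smallest sample that detects at least one nonconforming item with a stated confidence follows directly from the hypergeometric distribution.

```python
from math import comb

def min_sample_size(population, defects, confidence=0.95):
    """Smallest n, drawn without replacement from `population` items of
    which `defects` are nonconforming, such that at least one
    nonconforming item appears with probability >= confidence."""
    for n in range(1, population + 1):
        # P(no nonconforming item in the sample), hypergeometric
        p_miss = comb(population - defects, n) / comb(population, n)
        if 1.0 - p_miss >= confidence:
            return n
    return population
```

    For example, with 100 documents of which 10 are nonconforming, 25 must be sampled to find at least one with 95% confidence.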

  2. Dynamic Headspace Sampling as an Initial Step for Sample Preparation in Chromatographic Analysis.

    Science.gov (United States)

    Wojnowski, Wojciech; Majchrzak, Tomasz; Dymerski, Tomasz; Gębicki, Jacek; Namieśnik, Jacek

    2017-11-01

    This work represents a brief summary of the use of dynamic headspace (DHS) as a technique for sample preparation in chromatographic analysis. Despite numerous developments in the area of analyte isolation and enrichment, DHS remains one of the fundamental methods used with GC. In our opinion, interest in this technique will not diminish significantly because it conforms to stipulations of green analytical chemistry. Moreover, DHS fulfills the need for methods that facilitate detection and determination of analytes present at ultratrace levels in complex matrices. The main focus of this work was placed on the theoretical fundamentals of this method. Also described herein were DHS development, the advantages and disadvantages of this technique compared with other headspace sampling techniques, and selected examples of its applications in food and environmental analyses.

  3. A computational platform for MALDI-TOF mass spectrometry data: application to serum and plasma samples.

    Science.gov (United States)

    Mantini, Dante; Petrucci, Francesca; Pieragostino, Damiana; Del Boccio, Piero; Sacchetta, Paolo; Candiano, Giovanni; Ghiggeri, Gian Marco; Lugaresi, Alessandra; Federici, Giorgio; Di Ilio, Carmine; Urbani, Andrea

    2010-01-03

    Mass spectrometry (MS) is becoming the gold standard for biomarker discovery. Several MS-based bioinformatics methods have been proposed for this application, but the divergence of the findings by different research groups on the same MS data suggests that the definition of a reliable method has not been achieved yet. In this work, we propose an integrated software platform, MASCAP, intended for comparative biomarker detection from MALDI-TOF MS data. MASCAP integrates denoising and feature extraction algorithms, which have already shown to provide consistent peaks across mass spectra; furthermore, it relies on statistical analysis and graphical tools to compare the results between groups. The effectiveness in mass spectrum processing is demonstrated using MALDI-TOF data, as well as SELDI-TOF data. The usefulness in detecting potential protein biomarkers is shown comparing MALDI-TOF mass spectra collected from serum and plasma samples belonging to the same clinical population. The analysis approach implemented in MASCAP may simplify biomarker detection, by assisting the recognition of proteomic expression signatures of the disease. A MATLAB implementation of the software and the data used for its validation are available at http://www.unich.it/proteomica/bioinf. (c) 2009 Elsevier B.V. All rights reserved.
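    MASCAP itself is a MATLAB platform; as a stand-in for its denoising and feature-extraction stage, a minimal peak picker (moving-average smoothing, MAD-based noise estimate, local-maximum test) might look like the sketch below. The function name and thresholds are our own illustration, not the MASCAP algorithms.

```python
import numpy as np

def detect_peaks(mz, intensity, window=5, snr=3.0):
    """Smooth a spectrum with a moving average, estimate the noise level
    with the median absolute deviation, and return the m/z values of
    local maxima exceeding `snr` times that level."""
    kernel = np.ones(window) / window
    smooth = np.convolve(intensity, kernel, mode="same")
    noise = np.median(np.abs(smooth - np.median(smooth))) + 1e-12
    peaks = []
    for i in range(1, len(smooth) - 1):
        if smooth[i] > smooth[i - 1] and smooth[i] >= smooth[i + 1] and smooth[i] > snr * noise:
            peaks.append(mz[i])
    return peaks
```

    On a synthetic spectrum with two Gaussian peaks, the picker returns exactly the two apex positions.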

  4. The GSF anticoincidence-shielded Ge(Li) gamma-ray spectrometer and its application to the analysis of environmental samples

    International Nuclear Information System (INIS)

    Hoetzl, H.; Winkler, R.

    1981-01-01

    A high-efficiency gamma-ray spectrometer has been designed and built to provide simultaneous anticoincidence and coincidence spectrometry of low-level environmental samples. The spectrometer consists of a large-volume Ge(Li) detector as the main detector and a well-type NaI(Tl) guard detector. The Ge(Li) detector is a closed-end coaxial detector housed in a crystal of the vertical dip-stick type. Its relative photopeak efficiency is 27.5%. The guard counter is a 23-cm-dia. by 23-cm-long NaI(Tl) crystal with a 7.8-cm-dia. by 18-cm-deep centre well. The passive shield consists of a 10-cm lead shield with copper and cadmium lining. The electronics is designed to operate independently and simultaneously in the anticoincidence mode as well as in the coincidence or in the normal passive shield mode. When operating in the anticoincidence mode the Compton edge of 137Cs is reduced by a factor of 7.7 to provide a peak-to-Compton edge ratio of 480:1. Bulk samples up to about 300 cm³ can be measured on the top of the detector end cap inside the well of the NaI(Tl) crystal. The lower limit of detection (1000 min counting time, 95% confidence level) for 137Cs is 1.6 pCi in a 3.8-cm-dia. by 3.5-cm-high sample geometry. The design of the spectrometer, its properties and the application to investigations on the migration of radionuclides in the soil, the analysis of radioactive emissions of coal-fired power plants and to fallout studies are described. (author)

  5. Plasma emission induced by an Nd-YAG laser at low pressure on solid organic sample, its mechanism, and analytical application

    International Nuclear Information System (INIS)

    Suliyanti, Maria Margaretha; Sardy, Sar; Kusnowo, Anung; Hedwig, Rinda; Abdulmadjid, Syahrun Nur; Kurniawan, Koo Hendrik; Lie, T.J.; Pardede, Marincan; Kagawa, Kiichiro; Tjia, M.O.

    2005-01-01

    An Nd-YAG laser (1064 nm, 120 mJ, 8 ns) was focused on various types of solid organic samples such as a black acrylic plate, a black polyvinyl chloride plastic sheet, and a methoxy polyaniline film coated on the surface of a glass substrate, under a surrounding air pressure of 2 Torr. A modulated plasma technique was used to study the mechanism of excitation of the emission of the organic material. As a result, we conclude that ablated atoms and molecules are excited by a shock-wave mechanism, similar to the case of hard samples such as metal. The ablation speed of hydrogen emission (H I 656.2 nm) was examined and the results show that the release speed of the ablated atoms is relatively low (less than Mach 10) and persists for a longer period of time (around 1 μs); this phenomenon can be understood by assuming that the soft target absorbs recoil energy, causing a low release speed of ablated atoms which would form the shock wave. This was overcome by placing a subtarget on the back of the soft sample so as to enhance the repelling force, thus increasing the release speed of the atoms. A possible application of the low-pressure plasma on an organic solid was demonstrated in the detection of chlorine in a black polyvinyl chloride plastic sheet

  6. Using lot quality-assurance sampling and area sampling to identify priority areas for trachoma control: Viet Nam.

    Science.gov (United States)

    Myatt, Mark; Mai, Nguyen Phuong; Quynh, Nguyen Quang; Nga, Nguyen Huy; Tai, Ha Huy; Long, Nguyen Hung; Minh, Tran Hung; Limburg, Hans

    2005-10-01

    To report on the use of lot quality-assurance sampling (LQAS) surveys undertaken within an area-sampling framework to identify priority areas for intervention with trachoma control activities in Viet Nam. The LQAS survey method for the rapid assessment of the prevalence of active trachoma was adapted for use in Viet Nam with the aim of classifying individual communes by the prevalence of active trachoma among children in primary school. School-based sampling was used; school sites to be sampled were selected using an area-sampling approach. A total of 719 communes in 41 districts in 18 provinces were surveyed. Survey staff found the LQAS survey method both simple and rapid to use after initial problems with area-sampling methods were identified and remedied. The method yielded a finer spatial resolution of prevalence than had been previously achieved in Viet Nam using semiquantitative rapid assessment surveys and multistage cluster-sampled surveys. When used with area-sampling techniques, the LQAS survey method has the potential to form the basis of survey instruments that can be used to efficiently target resources for interventions against active trachoma. With additional work, such methods could provide a generally applicable tool for effective programme planning and for the certification of the elimination of trachoma as a blinding disease.
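    The statistical core of an LQAS rule can be sketched independently of the Viet Nam survey details. Assuming a simple binomial model and a "flag the commune if more than d of n sampled children have signs" decision rule (our illustration, not the authors' exact design), the two misclassification probabilities are:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))

def lqas_errors(n, d, p_low, p_high):
    """For an LQAS rule that flags a commune when more than d of n
    sampled children are positive, return (alpha, beta):
    alpha = P(flag | true prevalence p_low)   -- false alarm
    beta  = P(no flag | true prevalence p_high) -- missed high-prevalence area."""
    alpha = 1.0 - binom_cdf(d, n, p_low)
    beta = binom_cdf(d, n, p_high)
    return alpha, beta
```

    Tabulating alpha and beta over candidate (n, d) pairs is how a classification rule with acceptable error rates is chosen.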

  7. The method of Sample Management in Neutron Activation Analysis Laboratory-Serpong

    International Nuclear Information System (INIS)

    Elisabeth-Ratnawati

    2005-01-01

    In a testing laboratory using the neutron activation analysis method, sample preparation is the main factor and cannot be neglected. Errors in sample preparation can give results with lower accuracy. This article explains the scheme of sample preparation, i.e. sample receipt administration, separation of samples, preparation of liquid and solid samples, sample grouping, irradiation, sample counting, and holding of samples post-irradiation. If sample management is properly applied based on Standard Operating Procedures, each sample has good traceability. Optimizing sample management requires trained and skilled personnel and good facilities. (author)

  8. Importance sampling of rare events in chaotic systems

    DEFF Research Database (Denmark)

    Leitão, Jorge C.; Parente Lopes, João M.Viana; Altmann, Eduardo G.

    2017-01-01

    Finding and sampling rare trajectories in dynamical systems is a difficult computational task underlying numerous problems and applications. In this paper we show how to construct Metropolis-Hastings Monte-Carlo methods that can efficiently sample rare trajectories in the (extremely rough) phase space of chaotic systems. As examples of our general framework we compute the distribution of finite-time Lyapunov exponents (in different chaotic maps) and the distribution of escape times (in transient-chaos problems). Our methods sample exponentially rare states in polynomial number of samples (in both low- and high-dimensional systems). An open-source software that implements our algorithms and reproduces our results can be found in reference [J. Leitao, A library to sample chaotic systems, 2017, https://github.com/jorgecarleitao/chaospp].
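    A toy version of Metropolis-Hastings sampling of rare trajectories can be written down for the logistic map. This is a sketch under our own assumptions (map, exponent horizon, proposal width, bias strength), not the chaospp implementation: initial conditions are sampled with weight exp(-beta * lambda), which for beta > 0 biases the chain toward trajectories with atypically small finite-time Lyapunov exponents.

```python
import math
import random

def ftle(x0, n=50, r=4.0):
    """Finite-time Lyapunov exponent of the logistic map x -> r x (1 - x)."""
    x, s = x0, 0.0
    for _ in range(n):
        s += math.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)  # log |f'(x)|
        x = r * x * (1.0 - x)
    return s / n

def mh_sample(beta, steps=4000, prop=1e-3, seed=1):
    """Metropolis-Hastings over initial conditions with target weight
    exp(-beta * ftle(x0)); a Gaussian proposal is wrapped onto [0, 1)."""
    rng = random.Random(seed)
    x = rng.random()
    lam = ftle(x)
    out = []
    for _ in range(steps):
        y = (x + rng.gauss(0.0, prop)) % 1.0
        lam_y = ftle(y)
        if math.log(rng.random() + 1e-300) < -beta * (lam_y - lam):
            x, lam = y, lam_y
        out.append(lam)
    return out
```

    With beta = 0 the chain reproduces the typical exponent (ln 2 for r = 4); with a positive beta it drifts toward rarer, smaller exponents.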

  9. Large scale sample management and data analysis via MIRACLE

    DEFF Research Database (Denmark)

    Block, Ines; List, Markus; Pedersen, Marlene Lemvig

    Reverse-phase protein arrays (RPPAs) allow sensitive quantification of relative protein abundance in thousands of samples in parallel. In the past years the technology advanced based on improved methods and protocols concerning sample preparation and printing, antibody selection, optimization of staining conditions and mode of signal analysis. However, the sample management and data analysis still pose challenges because of the high number of samples, sample dilutions, customized array patterns, and various programs necessary for array construction and data processing. We developed a comprehensive and user-friendly web application called MIRACLE (MIcroarray R-based Analysis of Complex Lysate Experiments), which bridges the gap between sample management and array analysis by conveniently keeping track of the sample information from lysate preparation, through array construction and signal...

  10. Microfunnel-supported liquid-phase microextraction: application to extraction and determination of Irgarol 1051 and diuron in the Persian Gulf seawater samples.

    Science.gov (United States)

    Saleh, Abolfazl; Sheijooni Fumani, Neda; Molaei, Saeideh

    2014-08-22

    In the present work, a microfunnel-supported liquid-phase microextraction method (MF-LPME) based on applying a low-density organic solvent was developed for the determination of antifoulants (Irgarol 1051, diuron and 3,4-dichloroaniline) in seawater samples. In this method, a home-designed MF device was used for facile loading and retrieval of the organic solvent during the extraction procedure. The extraction was carried out by introducing 400 μL of toluene via syringe into the MF device placed on the surface of the sample solution (300 mL) containing the analytes. After the extraction, the extractant layer was narrowed into the capillary part of the MF by pushing the device into the sample, withdrawn using a syringe, and evaporated by nitrogen purging. The residue was redissolved in 50 μL of methanol, diluted to 100 μL with deionized water and injected into the high performance liquid chromatograph with UV detection (HPLC-UV). Several factors influencing the extraction, such as the type and volume of extraction solvent, sample pH, extraction time and ionic strength, were investigated and optimized. Under the optimized conditions, the limits of detection in seawater were 1.4, 4.8 and 1.0 ng L-1 for 3,4-dichloroaniline (DCA), diuron and Irgarol 1051, respectively. Enrichment factors of 333, 150 and 373 were obtained for DCA, diuron and Irgarol 1051, respectively. The precision of the technique was evaluated in terms of repeatability, which was less than 12.0% (n=5). The applicability of the proposed method was evaluated by the extraction and determination of antifoulants in seawater samples collected from the harbors of Bushehr on the northern Persian Gulf coast. Copyright © 2014 Elsevier B.V. All rights reserved.
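    Two of the figures of merit quoted above, the enrichment factor and the detection limit, are simple to compute. The sketch below assumes a linear calibration and the common 3-sigma LOD convention; it is our own illustration, not the authors' procedure.

```python
import numpy as np

def enrichment_factor(c_extract, c_sample):
    """Enrichment factor: analyte concentration in the final extract
    divided by its initial concentration in the bulk sample."""
    return c_extract / c_sample

def detection_limit(conc, signal, blank_sd):
    """3-sigma limit of detection from a linear calibration:
    LOD = 3 * (standard deviation of the blank) / slope."""
    slope, _ = np.polyfit(conc, signal, 1)
    return 3.0 * blank_sd / slope
```

    For instance, an extract concentration of 37.3 units obtained from a 0.1-unit sample corresponds to the reported enrichment factor of 373.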

  11. Simulation and experimental studies of three-dimensional (3D) image reconstruction from insufficient sampling data based on compressed-sensing theory for potential applications to dental cone-beam CT

    International Nuclear Information System (INIS)

    Je, U.K.; Lee, M.S.; Cho, H.S.; Hong, D.K.; Park, Y.O.; Park, C.K.; Cho, H.M.; Choi, S.I.; Woo, T.H.

    2015-01-01

    In practical applications of three-dimensional (3D) tomographic imaging, there are often challenges for image reconstruction from insufficient sampling data. In computed tomography (CT), for example, image reconstruction from sparse views and/or limited-angle (<360°) views would enable fast scanning with reduced imaging doses to the patient. In this study, we investigated and implemented a reconstruction algorithm based on the compressed-sensing (CS) theory, which exploits the sparseness of the gradient image with substantially high accuracy, for potential applications to low-dose, highly accurate dental cone-beam CT (CBCT). We performed systematic simulation studies to investigate the image characteristics and also performed experimental work by applying the algorithm to a commercially-available dental CBCT system to demonstrate its effectiveness for image reconstruction in insufficient sampling problems. We successfully reconstructed CBCT images of superior accuracy from insufficient sampling data and evaluated the reconstruction quality quantitatively. Both the simulation and experimental demonstrations of CS-based reconstruction from insufficient data indicate that the CS-based algorithm can be applied directly to current dental CBCT systems for reducing the imaging doses and further improving the image quality
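    The CS reconstruction exploits sparseness of the gradient image. A minimal 1-D analogue (our sketch, not the authors' CBCT algorithm) recovers a gradient-sparse signal from underdetermined measurements by subgradient descent on a total-variation-regularized least-squares objective; the step size and regularization weight are illustrative choices.

```python
import numpy as np

def tv_reconstruct(A, b, lam=0.01, step=0.05, iters=3000):
    """Subgradient descent on ||A x - b||^2 + lam * sum_i |x[i+1] - x[i]|,
    a 1-D analogue of the gradient-sparsity (total-variation) objective
    used in CS-based CT reconstruction."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = 2.0 * A.T @ (A @ x - b)   # data-fidelity gradient
        d = np.sign(np.diff(x))          # subgradient of the TV term
        tv = np.zeros_like(x)
        tv[:-1] -= d
        tv[1:] += d
        x -= step * (grad + lam * tv)
    return x
```

    Even with only half as many measurements as unknowns, the iteration drives the data residual far below its starting value for a piecewise-constant (gradient-sparse) signal.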

  12. Sensitivity Sampling Over Dynamic Geometric Data Streams with Applications to $k$-Clustering

    OpenAIRE

    Song, Zhao; Yang, Lin F.; Zhong, Peilin

    2018-01-01

    Sensitivity-based sampling is crucial for constructing nearly-optimal coresets for $k$-means / median clustering. In this paper, we provide a novel data structure that enables sensitivity sampling over a dynamic data stream, where points from a high dimensional discrete Euclidean space can be either inserted or deleted. Based on this data structure, we provide a one-pass coreset construction for $k$-means clustering using space $\widetilde{O}(k\,\mathrm{poly}(d))$ over $d$-dimen...
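    The idea of sensitivity sampling can be illustrated in the simplest static case, a 1-means coreset; the sensitivity bound and weighting below are a textbook-style sketch, not the paper's dynamic-stream data structure.

```python
import numpy as np

def sensitivity_coreset(points, m, seed=0):
    """Weighted coreset for 1-means via sensitivity sampling: sample
    points with probability proportional to an upper bound on their
    sensitivity (their cost share around the centroid plus a uniform
    term) and reweight by 1/(m * p_i) so weighted costs stay unbiased."""
    rng = np.random.default_rng(seed)
    n = len(points)
    c = points.mean(axis=0)
    d2 = ((points - c) ** 2).sum(axis=1)
    s = d2 / d2.sum() + 1.0 / n        # sensitivity upper bound (up to constants)
    p = s / s.sum()
    idx = rng.choice(n, size=m, p=p)
    return points[idx], 1.0 / (m * p[idx])
```

    The weighted coreset cost of an arbitrary candidate center then approximates the full-data cost.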

  13. Legacy sample disposition project. Volume 2: Final report

    International Nuclear Information System (INIS)

    Gurley, R.N.; Shifty, K.L.

    1998-02-01

    This report describes the legacy sample disposition project at the Idaho Engineering and Environmental Laboratory (INEEL), which assessed Site-wide facilities/areas to locate legacy samples and owner organizations and then characterized and dispositioned these samples. This project resulted from an Idaho Department of Environmental Quality inspection of selected areas of the INEEL in January 1996, which identified some samples at the Test Reactor Area and Idaho Chemical Processing Plant that had not been characterized and dispositioned according to Resource Conservation and Recovery Act (RCRA) requirements. The objective of the project was to manage legacy samples in accordance with all applicable environmental and safety requirements. A systems engineering approach was used throughout the project, which included collecting the legacy sample information and developing a system for amending and retrieving the information. All legacy samples were dispositioned by the end of 1997. Closure of the legacy sample issue was achieved through these actions

  14. Applicability of solid-phase microextraction combined with gas chromatography atomic emission detection (GC-MIP AED) for the determination of butyltin compounds in sediment samples

    Energy Technology Data Exchange (ETDEWEB)

    Carpinteiro, J.; Rodriguez, I.; Cela, R. [Universidad de Santiago de Compostela, Departamento de Quimica Analitica, Nutricion y Bromatologia, Instituto de Investigacion y Analisis Alimentario, Santiago de Compostela 15782 (Spain)

    2004-11-01

    The performance of solid-phase microextraction (SPME) applied to the determination of butyltin compounds in sediment samples is systematically evaluated. Matrix effects and influence of blank signals on the detection limits of the method are studied in detail. The interval of linear response is also evaluated in order to assess the applicability of the method to sediments polluted with butyltin compounds over a large range of concentrations. Advantages and drawbacks of including an SPME step, instead of the classic liquid-liquid extraction of the derivatized analytes, in the determination of butyltin compounds in sediment samples are considered in terms of achieved detection limits and experimental effort. Analytes were extracted from the samples by sonication using glacial acetic acid. An aliquot of the centrifuged extract was placed on a vial where compounds were ethylated and concentrated on a PDMS fiber using the headspace mode. Determinations were carried out using GC-MIP AED. (orig.)

  15. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    Application of the Fourier Transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and 15N-edited NOESY-HSQC spectra of a 13C,15N-labeled ubiquitin sample. The obtained results revealed the general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time
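    The key point, evaluating the FT directly at chosen frequencies instead of on a uniform FFT grid, has a simple 1-D analogue: compute the Fourier sum from randomly sampled time points. The function below is our own illustration of that principle, not the authors' 2D processing code.

```python
import numpy as np

def ft_random_sampling(t, signal, freqs):
    """Evaluate the Fourier sum of a signal known only at the randomly
    chosen (non-uniform) times t, at arbitrary frequencies freqs --
    no uniform FFT grid is required."""
    return np.array([np.sum(signal * np.exp(-2j * np.pi * f * t)) for f in freqs])
```

    A single complex exponential sampled at random times still produces a clear maximum at its true frequency, because off-resonance contributions add with incoherent phases.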

  16. Application of solid-phase micro extraction for the determination of pesticides in vegetable samples by gas chromatography with an electron capture detector

    International Nuclear Information System (INIS)

    Chai, Mee Kin; Tan, Guan Huat; Kumari, Asha

    2008-01-01

    A solid-phase micro extraction (SPME) method has been developed for the determination of 9 pesticides in samples of 2 vegetables, cucumber and tomato, based on direct immersion mode and subsequent desorption into the injection port of a gas chromatograph with an electron capture detector (GC-ECD). The main factors affecting the SPME process, such as extraction time and temperature, desorption time and temperature, the effect of salt addition and fiber depth in the liner, were studied and optimized. The analytical procedure proposed consisted of a 30-minute ultrasonic extraction of the target compounds from 1.0 g vegetable samples with 5 mL of distilled water. The samples were then filtered and topped up with distilled water to 10 mL. The analytes in this aqueous extract were extracted for 15 minutes with a 100 μm thickness polydimethylsiloxane SPME fiber. Relative standard deviations for triplicate analyses of samples were less than 10 %. The recoveries of the pesticides studied in cucumber and tomato ranged from 52 % to 82 %, and the RSDs were below 10 %. Therefore, the proposed method is applicable to the analysis of pesticides in vegetable matrices. SPME has been shown to be a simple extraction technique with a number of advantages, such as solvent-free extraction, simplicity and compatibility with the chromatographic analytical system. (author)

  17. Advanced sampling techniques for hand-held FT-IR instrumentation

    Science.gov (United States)

    Arnó, Josep; Frunzi, Michael; Weber, Chris; Levy, Dustin

    2013-05-01

    FT-IR spectroscopy is the technology of choice to identify solid and liquid phase unknown samples. The challenging ConOps in emergency response and military field applications require a significant redesign of the stationary FT-IR bench-top instruments typically used in laboratories. Specifically, field portable units require high levels of resistance against mechanical shock and chemical attack, ease of use in restrictive gear, extreme reliability, quick and easy interpretation of results, and reduced size. In the last 20 years, FT-IR instruments have been re-engineered to fit in small suitcases for field portable use and recently further miniaturized for handheld operation. This article introduces the HazMatID™ Elite, a FT-IR instrument designed to balance the portability advantages of a handheld device with the performance challenges associated with miniaturization. In this paper, special focus will be given to the HazMatID Elite's sampling interfaces optimized to collect and interrogate different types of samples: accumulated material using the on-board ATR press, dispersed powders using the ClearSampler™ tool, and the touch-to-sample sensor for direct liquid sampling. The application of the novel sample swipe accessory (ClearSampler) to collect material from surfaces will be discussed in some detail. The accessory was tested and evaluated for the detection of explosive residues before and after detonation. Experimental results derived from these investigations will be described in an effort to outline the advantages of this technology over existing sampling methods.

  18. [Confirming Indicators of Qualitative Results by Chromatography-mass Spectrometry in Biological Samples].

    Science.gov (United States)

    Liu, S D; Zhang, D M; Zhang, W; Zhang, W F

    2017-04-01

    Because of the existence of complex matrices, the confirming indicators of qualitative results for toxic substances in biological samples by chromatography-mass spectrometry are different from those in non-biological samples. Even among biological samples, the confirming indicators differ across application areas. This paper reviews the similarities and differences of confirming indicators for analytes in biological samples by chromatography-mass spectrometry in the field of forensic toxicological analysis and in other application areas. These confirming indicators include retention time (RT), relative retention time (RRT), signal-to-noise ratio (S/N), characteristic ions, relative abundance of characteristic ions, parent ion-daughter ion pairs and the abundance ratio of ion pairs, etc. Copyright© by the Editorial Department of Journal of Forensic Medicine.

  19. Influence of the mechanical sample treatment on the thermally stimulated exoelectron emission in aspect of the application for sample dating

    International Nuclear Information System (INIS)

    Zastawny, Andrzej; Bialon, Jan

    1999-01-01

    The examination focused on typical, contemporary ceramics irradiated by a 90Sr-90Y beta-particle source. According to the measurements, mechanical treatment of the sample in the form of abrading and washing in alcohol did not affect the TSEE glow curve above a temperature of 130 deg. C. Because the TSEE peaks that can be exploited for dating must lie above 300 deg. C, the mechanical and washing preparation of the samples should not affect the measurements

  20. Simulation of sampling effects in FPAs

    Science.gov (United States)

    Cook, Thomas H.; Hall, Charles S.; Smith, Frederick G.; Rogne, Timothy J.

    1991-09-01

    The use of multiplexers and large focal plane arrays in advanced thermal imaging systems has drawn renewed attention to sampling and aliasing issues in imaging applications. As evidenced by discussions in a recent workshop, there is no clear consensus among experts whether aliasing in sensor designs can be readily tolerated, or must be avoided at all cost. Further, there is no straightforward, analytical method that can answer the question, particularly when considering image interpreters as different as humans and autonomous target recognizers (ATR). However, the means exist for investigating sampling and aliasing issues through computer simulation. The U.S. Army Tank-Automotive Command (TACOM) Thermal Image Model (TTIM) provides realistic sensor imagery that can be evaluated by both human observers and ATRs. This paper briefly describes the history and current status of TTIM, explains the simulation of FPA sampling effects, presents validation results of the FPA sensor model, and demonstrates the utility of TTIM for investigating sampling effects in imagery.

  1. Study on the Method of Association Rules Mining Based on Genetic Algorithm and Application in Analysis of Seawater Samples

    Directory of Open Access Journals (Sweden)

    Qiuhong Sun

    2014-04-01

    Based on data mining research, data mining with genetic algorithms is addressed: the genetic algorithm is briefly introduced, and two important theoretical foundations of genetic algorithms, the schema (template) theorem and the implicit parallelism principle, are discussed. The focus is on applying genetic algorithms to association rule mining: improvements to the fitness function structure and the data encoding are proposed, and in particular, building on earlier studies, an improved adaptive Pc, Pm scheme is applied to the genetic algorithm, thereby improving the efficiency of the algorithm. Finally, a genetic-algorithm-based association rule mining algorithm is presented and applied to data mining in a seawater-sample database, demonstrating its effectiveness.
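    The adaptive Pc/Pm idea can be sketched with a small GA. Since the paper's rule encoding and fitness function are not given here, this illustration uses a plain bit-count fitness as a stand-in; in the association-rule setting the fitness would instead combine the support and confidence of the encoded rule. The adaptation follows the general spirit of fitness-dependent rates (above-average individuals are disrupted less): names and constants are our own assumptions.

```python
import random

def adaptive_ga(fitness, n_bits, pop_size=40, gens=60, seed=3):
    """GA with fitness-adaptive crossover (Pc) and mutation (Pm) rates:
    high-fitness individuals get low rates (to be preserved), while
    below-average individuals are explored with the maximum rates."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        fits = [fitness(ind) for ind in pop]
        f_max = max(fits)
        f_avg = sum(fits) / len(fits)
        spread = max(f_max - f_avg, 1e-9)

        def pick():  # binary tournament selection
            a, b = rng.randrange(pop_size), rng.randrange(pop_size)
            return pop[a] if fits[a] >= fits[b] else pop[b]

        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick()[:], pick()[:]
            f_pair = max(fitness(p1), fitness(p2))
            pc = 1.0 if f_pair < f_avg else (f_max - f_pair) / spread
            if rng.random() < pc:  # one-point crossover
                cut = rng.randrange(1, n_bits)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                f_c = fitness(child)
                pm = 0.5 if f_c < f_avg else 0.5 * (f_max - f_c) / spread
                for i in range(n_bits):
                    if rng.random() < pm / n_bits:
                        child[i] ^= 1  # bit-flip mutation
                nxt.append(child)
        pop = nxt[:pop_size]
    return max(pop, key=fitness)
```

    With `sum` as the stand-in fitness (maximize the number of 1-bits), the adaptive GA comfortably beats the best random individual.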

  2. Exploring biomolecular dynamics and interactions using advanced sampling methods

    International Nuclear Information System (INIS)

    Luitz, Manuel; Bomblies, Rainer; Ostermeir, Katja; Zacharias, Martin

    2015-01-01

    Molecular dynamics (MD) and Monte Carlo (MC) simulations have emerged as a valuable tool to investigate the statistical mechanics and kinetics of biomolecules and synthetic soft matter materials. However, major limitations for routine applications are the accuracy of the molecular mechanics force field and the maximum simulation time that can be achieved in current simulation studies. To improve the sampling, a number of advanced sampling approaches have been designed in recent years. In particular, variants of the parallel tempering replica-exchange methodology are widely used in many simulation studies. Recent methodological advancements and a discussion of specific aims and advantages are given. This includes improved free energy simulation approaches and conformational search applications. (topical review)

  3. Studying the sampling representativeness in the NPP ventilation ducts

    International Nuclear Information System (INIS)

    Sosnovskij, R.I.; Fedchenko, T.K.; Minin, S.A.

    2000-01-01

    Measurements of the gas and aerosol volumetric activity in NPP ventilation ducts are an important source of information on the ingress of radioactive contaminants into the environment. These measurements include sampling, sample transport and the measurements proper. The work is devoted to the calculation of the metrological characteristics of sampling systems for NPP gas-aerosol releases for different parameters of these systems and of the ventilation ducts. The results obtained are intended for use in designing such systems and in their metrological certification [ru

  4. Applications of free-jet, molecular beam, mass spectrometric sampling: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Milne, T. [ed.

    1995-03-01

    Over the past 35 years, the study of the behavior and uses of free-jet expansions for laboratory experiments has greatly expanded and matured. Not the least of these uses of free-jet expansions is that of extractive sampling from high-temperature, reactive systems. The conversion of the free-jet-expanded gases to molecular flow for direct introduction into the ion source of a mass spectrometer offers several advantages, to be illustrated in these pages. Two meetings on this subject were held in 1965 and 1972 in Missouri, sponsored by the Office of Naval Research and Midwest Research Institute. At these meetings rarefied-gas dynamicists came together with scientists using free-jet sampling for analytical purposes. After much too long a time, this workshop was convened to bring together modern practitioners of FJMBS (free-jet, molecular-beam mass spectrometry) and long-time students of the free-jet process itself, to assess the current state of the art and to forge a community that can foster the development of this novel analytical approach. This proceedings comprises 38 individually submitted papers. Individual papers are indexed separately on the Energy Data Base.

  5. Structural-morphological peculiarities of zirconium oxyhydrate with applicated ions

    International Nuclear Information System (INIS)

    Korshunova, N.K.; Sukharev, Yu.I.; Egorov, Yu.V.

    1976-01-01

    Some results of the investigation of applicated zirconium oxyhydrate by thermography and electronography are considered, as well as the results of microscopic and pycnometric investigations. Bichromate and polyvanadate ions were used as applicants. It is demonstrated that two kinds of granules are formed, globular and plane-scaly, depending on the method of applicated synthesis of the hydrated zirconium dioxide (HZD) samples and the nature of the applicants. It was established by electronographic methods that samples with plane-scaly morphology have an ordered structure. The influence of HZD granule morphology on the absorption-exchange properties was established: globular samples are the most sensitive to applicated additions and have better absorption characteristics. The character of the density variation as a function of applicant concentration in the solid phase is the same: in the region of applicant concentration 0.15-0.30 g ion/mol ZrO 2 the density is highest; sample density then decreases and increases once again at an applicant concentration of 0.5-0.6 g ion/mol ZrO 2

  6. Extending cluster Lot Quality Assurance Sampling designs for surveillance programs

    OpenAIRE

    Hund, Lauren; Pagano, Marcello

    2014-01-01

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance based on the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than ...
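
    The binary LQAS classification described above has a simple operating characteristic: an area is declared acceptable if at most d "failures" appear among n sampled units. A minimal sketch (names and the decision threshold are illustrative, not taken from the paper) computes the probability of that outcome for an area with true prevalence p, which is exactly the misclassification risk when p is actually poor:

    ```python
    from math import comb

    def prob_classified_acceptable(n, d, p):
        """Probability that an area with true failure prevalence p is
        classified 'acceptable' under the LQAS rule: accept if at most d
        failures are observed in a simple random sample of n units.
        This is the binomial CDF evaluated at d."""
        return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d + 1))
    ```

    Plotting this quantity against p gives the operating characteristic curve used to choose n and d so that both classification error risks stay below target levels.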

  7. Performance Comparison of Reconstruction Algorithms in Discrete Blind Multi-Coset Sampling

    DEFF Research Database (Denmark)

    Grigoryan, Ruben; Arildsen, Thomas; Tandur, Deepaknath

    2012-01-01

    This paper investigates the performance of different reconstruction algorithms in discrete blind multi-coset sampling. Multi-coset scheme is a promising compressed sensing architecture that can replace traditional Nyquist-rate sampling in the applications with multi-band frequency sparse signals...

  8. Integrating the Theory of Sampling into Underground Mine Grade Control Strategies

    Directory of Open Access Journals (Sweden)

    Simon C. Dominy

    2018-05-01

    Full Text Available Grade control in underground mines aims to deliver quality tonnes to the process plant via the accurate definition of ore and waste. It comprises a decision-making process including data collection and interpretation; local estimation; development and mining supervision; ore and waste destination tracking; and stockpile management. The foundation of any grade control programme is that of high-quality samples collected in a geological context. The requirement for quality samples has long been recognised, where they should be representative and fit-for-purpose. Once a sampling error is introduced, it propagates through all subsequent processes contributing to data uncertainty, which leads to poor decisions and financial loss. Proper application of the Theory of Sampling reduces errors during sample collection, preparation, and assaying. To achieve quality, sampling techniques must minimise delimitation, extraction, and preparation errors. Underground sampling methods include linear (chip and channel, grab (broken rock, and drill-based samples. Grade control staff should be well-trained and motivated, and operating staff should understand the critical need for grade control. Sampling must always be undertaken with a strong focus on safety and alternatives sought if the risk to humans is high. A quality control/quality assurance programme must be implemented, particularly when samples contribute to a reserve estimate. This paper assesses grade control sampling with emphasis on underground gold operations and presents recommendations for optimal practice through the application of the Theory of Sampling.

  9. Application of CRAFT (complete reduction to amplitude frequency table) in nonuniformly sampled (NUS) 2D NMR data processing.

    Science.gov (United States)

    Krishnamurthy, Krish; Hari, Natarajan

    2017-09-15

    The recently published CRAFT (complete reduction to amplitude frequency table) technique converts the raw FID data (i.e., time domain data) into a table of frequencies, amplitudes, decay rate constants, and phases. It offers an alternate approach to decimate time-domain data, with minimal preprocessing step. It has been shown that application of CRAFT technique to process the t 1 dimension of the 2D data significantly improved the detectable resolution by its ability to analyze without the use of ubiquitous apodization of extensively zero-filled data. It was noted earlier that CRAFT did not resolve sinusoids that were not already resolvable in time-domain (i.e., t 1 max dependent resolution). We present a combined NUS-IST-CRAFT approach wherein the NUS acquisition technique (sparse sampling technique) increases the intrinsic resolution in time-domain (by increasing t 1 max), IST fills the gap in the sparse sampling, and CRAFT processing extracts the information without loss due to any severe apodization. NUS and CRAFT are thus complementary techniques to improve intrinsic and usable resolution. We show that significant improvement can be achieved with this combination over conventional NUS-IST processing. With reasonable sensitivity, the models can be extended to significantly higher t 1 max to generate an indirect-DEPT spectrum that rivals the direct observe counterpart. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Sampling theorem for geometric moment determination and its application to a laser beam position detector.

    Science.gov (United States)

    Loce, R P; Jodoin, R E

    1990-09-10

    Using the tools of Fourier analysis, a sampling requirement is derived that assures that sufficient information is contained within the samples of a distribution to calculate accurately geometric moments of that distribution. The derivation follows the standard textbook derivation of the Whittaker-Shannon sampling theorem, which is used for reconstruction, but further insight leads to a coarser minimum sampling interval for moment determination. The need for fewer samples to determine moments agrees with intuition since less information should be required to determine a characteristic of a distribution compared with that required to construct the distribution. A formula for calculation of the moments from these samples is also derived. A numerical analysis is performed to quantify the accuracy of the calculated first moment for practical nonideal sampling conditions. The theory is applied to a high speed laser beam position detector, which uses the normalized first moment to measure raster line positional accuracy in a laser printer. The effects of the laser irradiance profile, sampling aperture, number of samples acquired, quantization, and noise are taken into account.
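
    For a discretely sampled one-dimensional profile, the normalized first moment used by such a position detector reduces to a weighted centroid of the samples. A minimal sketch under that assumption (names are illustrative; a uniform sampling interval dx is assumed):

    ```python
    def normalized_first_moment(samples, dx):
        """Centroid (normalized first moment) of a sampled 1-D irradiance
        profile: sum(x_k * I_k) / sum(I_k), with sample positions x_k = k*dx."""
        total = sum(samples)
        return dx * sum(k * s for k, s in enumerate(samples)) / total
    ```

    The theorem's point is that this sum remains accurate at sampling intervals coarser than the Whittaker-Shannon reconstruction limit, since only the moment, not the full profile, must be recovered.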

  11. Procedures for sampling radium-contaminated soils

    International Nuclear Information System (INIS)

    Fleischhauer, H.L.

    1985-10-01

    Two procedures for sampling the surface layer (0 to 15 centimeters) of radium-contaminated soil are recommended for use in remedial action projects. Both procedures adhere to the philosophy that soil samples should have constant geometry and constant volume in order to ensure uniformity. In the first procedure, a ''cookie cutter'' fashioned from pipe or steel plate, is driven to the desired depth by means of a slide hammer, and the sample extracted as a core or plug. The second procedure requires use of a template to outline the sampling area, from which the sample is obtained using a trowel or spoon. Sampling to the desired depth must then be performed incrementally. Selection of one procedure over the other is governed primarily by soil conditions, the cookie cutter being effective in nongravelly soils, and the template procedure appropriate for use in both gravelly and nongravelly soils. In any event, a minimum sample volume of 1000 cubic centimeters is recommended. The step-by-step procedures are accompanied by a description of the minimum requirements for sample documentation. Transport of the soil samples from the field is then addressed in a discussion of the federal regulations for shipping radioactive materials. Interpretation of those regulations, particularly in light of their application to remedial action soil-sampling programs, is provided in the form of guidance and suggested procedures. Due to the complex nature of the regulations, however, there is no guarantee that our interpretations of them are complete or entirely accurate. Preparation of soil samples for radium-226 analysis by means of gamma-ray spectroscopy is described

  12. Evaluation of complex gonioapparent samples using a bidirectional spectrometer.

    Science.gov (United States)

    Rogelj, Nina; Penttinen, Niko; Gunde, Marta Klanjšek

    2015-08-24

    Many applications use gonioapparent targets whose appearance depends on the irradiation and viewing angles; the strongest effects are produced by light diffraction. These targets, optically variable devices (OVDs), are used in both security and authentication applications. This study introduces a bidirectional spectrometer that enables the analysis of samples with the most complex angular and spectral properties. In our work, the spectrometer is evaluated with samples having very different types of reflection in terms of spectral and angular distributions. Furthermore, an OVD containing several different grating patches is evaluated. The device automatically adjusts the exposure time to provide maximum signal dynamics and is capable of steps as small as 0.01°. However, even 2° steps for the detector movement showed that this device is more than capable of characterizing even the most complex reflecting surfaces. This study presents sRGB visualizations, a discussion of bidirectional reflection, and accurate grating-period calculations for all of the grating samples used.

  13. Calculating Confidence, Uncertainty, and Numbers of Samples When Using Statistical Sampling Approaches to Characterize and Clear Contaminated Areas

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Matzke, Brett D.; Sego, Landon H.; Amidan, Brett G.

    2013-04-27

    … for FNR > 0
    2. qualitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0
    3. quantitative data (e.g., contaminant concentrations expressed as CFU/cm2) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0
    4. quantitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0.
    For Situation 2, the hotspot sampling approach provides for stating with Z% confidence that a hotspot of specified shape and size with detectable contamination will be found. Also for Situation 2, the CJR approach provides for stating with X% confidence that at least Y% of the decision area does not contain detectable contamination. Forms of these statements for the other three situations are discussed in Section 2.2. Statistical methods that account for FNR > 0 currently only exist for the hotspot sampling approach with qualitative data (or quantitative data converted to qualitative data). This report documents the current status of methods and formulas for the hotspot and CJR sampling approaches. Limitations of these methods are identified. Extensions of the methods that are applicable when FNR = 0 to account for FNR > 0, or to address other limitations, will be documented in future revisions of this report if future funding supports the development of such extensions. For quantitative data, this report also presents statistical methods and formulas for
    1. quantifying the uncertainty in measured sample results
    2. estimating the true surface concentration corresponding to a surface sample
    3. quantifying the uncertainty of the estimate of the true surface concentration.
    All of the methods and formulas discussed in the report were applied to example situations to illustrate application of the methods and interpretation of the results.

  14. Preparation and application of radioactive soil samples for intercomparison

    International Nuclear Information System (INIS)

    Gao Zequan; Li Zhou; Li Pengxiang; Wang Ruijun; Ren Xiaona

    2014-01-01

    This article summarizes the preparation process and intercomparison results for simulated environmental radioactive soil samples. The components of the matrix were SiO 2 , Al 2 O 3 , Fe 2 O 3 , MgO, CaO, NaCl, KCl and TiO 2 . All of the components were milled, oven-dried, sieved and then blended together. The homogeneity test was performed according to GB 15000.5-1994, and no significant differences were observed. The soils for 3 H analysis were natural soils spiked at a moisture content of 15%. Eight laboratories took part in this intercomparison. The results prove that the prepared simulated soils were suitable for inter-laboratory comparison. (authors)

  15. Critique of Hanford Waste Vitrification Plant off-gas sampling requirements

    International Nuclear Information System (INIS)

    Goles, R.W.

    1996-03-01

    Off-gas sampling and monitoring activities needed to support operations safety, process control, waste form qualification, and environmental protection requirements of the Hanford Waste Vitrification Plant (HWVP) have been evaluated. The locations of necessary sampling sites have been identified on the basis of plant requirements, and the applicability of Defense Waste Processing Facility (DWPF) reference sampling equipment to these HWVP requirements has been assessed for all sampling sites. Equipment deficiencies, if present, have been described and the bases for modifications and/or alternative approaches have been developed

  16. New adaptive sampling method in particle image velocimetry

    International Nuclear Information System (INIS)

    Yu, Kaikai; Xu, Jinglei; Tang, Lan; Mo, Jianwei

    2015-01-01

    This study proposes a new adaptive method that enables the number of interrogation windows and their positions in a particle image velocimetry (PIV) image interrogation algorithm to adapt to the seeding density. The proposed method relaxes the constraints of uniform sampling rate and uniform window size commonly adopted in the traditional PIV algorithm. In addition, the positions of the sampling points are redistributed on the basis of the spring force generated by the sampling points. The advantages include control of the number of interrogation windows according to the local seeding density and a smoother distribution of sampling points. The reliability of the adaptive sampling method is illustrated by processing synthetic and experimental images. The synthetic example attests to the advantages of the sampling method. Compared with the uniform interrogation technique in the experimental application, the spatial resolution is locally enhanced when the proposed sampling method is used. (technical design note)

  17. Addictive Potential of Internet Applications and Differential Correlates of Problematic Use in Internet Gamers versus Generalized Internet Users in a Representative Sample of Adolescents.

    Science.gov (United States)

    Rosenkranz, Tabea; Müller, Kai W; Dreier, Michael; Beutel, Manfred E; Wölfling, Klaus

    2017-01-01

    This paper examines the addictive potential of 8 different Internet applications, distinguishing male and female users. Moreover, differential correlates of problematic use are investigated in Internet gamers (IG) and generalized Internet users (GIU). In a representative sample of 5,667 adolescents aged 12-19 years, use of Internet applications, problematic Internet use, psychopathologic symptoms (emotional problems, hyperactivity/inattention, and psychosomatic complaints), personality (conscientiousness and extraversion), psychosocial correlates (perceived stress and self-efficacy), and coping strategies were assessed. The addictive potential of Internet applications was examined in boys and girls using regression analysis. MANOVAs were conducted to examine differential correlates of problematic Internet use between IG and GIU. Chatting and social networking most strongly predicted problematic Internet use in girls, while gaming was the strongest predictor in boys. Problematic IG exhibited multiple psychosocial problems compared to non-problematic IG. In problematic Internet users, GIU reported even higher psychosocial burden and displayed dysfunctional coping strategies more frequently than gamers. The results extend previous findings on the addictive potential of Internet applications and validate the proposed distinction between specific and generalized problematic Internet use. In addition to Internet gaming disorder, future studies should also focus on other highly addictive Internet applications, that is, chatting or social networking, regarding differential correlates of problematic use. © 2017 S. Karger AG, Basel.

  18. Performance of sampling methods to estimate log characteristics for wildlife.

    Science.gov (United States)

    Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton

    2004-01-01

    Accurate estimation of the characteristics of log resources, or coarse woody debris (CWD), is critical to effective management of wildlife and other forest resources. Despite the importance of logs as wildlife habitat, methods for sampling logs have traditionally focused on silvicultural and fire applications. These applications have emphasized estimates of log volume...

  19. Laser-induced breakdown spectroscopy for detection of heavy metals in environmental samples

    Science.gov (United States)

    Wisbrun, Richard W.; Schechter, Israel; Niessner, Reinhard; Schroeder, Hartmut

    1993-03-01

    The application of LIBS technology as a sensor for heavy metals in solid environmental samples has been studied. This specific application introduces some new problems in the LIBS analysis. Some of them are related to the particular distribution of contaminants in the grained samples. Other problems are related to mechanical properties of the samples and to general matrix effects, like the water and organic fibers content of the sample. An attempt has been made to optimize the experimental set-up for the various involved parameters. The understanding of these factors has enabled the adjustment of the technique to the substrates of interest. The special importance of the grain size and of the laser-induced aerosol production is pointed out. Calibration plots for the analysis of heavy metals in diverse sand and soil samples have been carried out. The detection limits are shown to be usually below the recent regulation restricted concentrations.

  20. Thermal quenching of thermoluminescence in quartz samples of various origin

    International Nuclear Information System (INIS)

    Subedi, B.; Oniya, E.; Polymeris, G.S.; Afouxenidis, D.; Tsirliganis, N.C.; Kitis, G.

    2011-01-01

    The effect of thermal quenching stands among the most important properties in the thermoluminescence (TL) of quartz on which many applications of TL are based. Since the quartz samples used in various applications are all of different origin it is useful to investigate whether the values of the thermal quenching parameters, i.e. the activation energy for thermal quenching W and a parameter C which describes the ratio of non-radiative to radiative luminescence transitions, evaluated mainly in specific quartz samples can be extrapolated to quartz samples of unknown origin as well as to quartz samples which are annealed at high temperatures. In the present work the TL glow curve of a series of un-annealed and annealed natural and synthetic quartz samples were studied as a function of the heating rate between 0.25 K/s and 16 K/s. Using an indirect fitting method it was found that the thermal quenching parameters W and C in most of the quartz samples are very similar to the values accepted in the literature. Furthermore, in some cases the thermal quenching parameters W and C are not the same for all TL glow-peaks in the same glow-curve. Finally, the strong external treatment of annealing the quartz samples at very high temperature can also influence at least one of the thermal quenching parameters.
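
    For reference, W and C are the parameters of the standard Mott-Seitz expression for the temperature-dependent luminescence efficiency. The abstract does not quote the formula, so this is the textbook relation the fitted values refer to, not an equation taken from the paper:

    ```latex
    \eta(T) = \frac{1}{1 + C\,\exp\!\left(-\dfrac{W}{k_{B}T}\right)}
    ```

    where W is the activation energy for thermal quenching, C the ratio of non-radiative to radiative transition probabilities, k_B the Boltzmann constant, and T the absolute temperature; higher heating rates shift TL peaks to higher T, where the efficiency is lower, which is why the heating-rate method can extract W and C.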

  1. System design description for sampling fuel in K basins

    International Nuclear Information System (INIS)

    Baker, R.B.

    1996-01-01

    This System Design Description provides: (1) statements of the Spent Nuclear Fuel Projects (SNFP) needs requiring sampling of fuel in the K East and K West Basins, (2) the sampling equipment functions and requirements, (3) a general work plan and the design logic being followed to develop the equipment, and (4) a summary description of the design for the sampling equipment. The report summarizes the integrated application of both the subject equipment and the canister sludge sampler in near-term characterization campaigns at K Basins

  2. Rapid extraction and assay of uranium from environmental surface samples

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, Christopher A.; Chouyyok, Wilaiwan; Speakman, Robert J.; Olsen, Khris B.; Addleman, Raymond Shane

    2017-10-01

    Extraction methods enabling faster removal and concentration of uranium compounds for improved trace and low-level assay are demonstrated for standard surface sampling material in support of nuclear safeguards efforts, health monitoring, and other nuclear analysis applications. A key problem with the existing surface sampling swipes is the requirement for complete digestion of sample and sampling matrix. This is a time-consuming and labour-intensive process that limits laboratory throughput, elevates costs, and increases background levels. Various extraction methods are explored for their potential to quickly and efficiently remove different chemical forms of uranium from standard surface sampling material. A combination of carbonate and peroxide solutions is shown to give the most rapid and complete form of uranyl compound extraction and dissolution. This rapid extraction process is demonstrated to be compatible with standard inductive coupled plasma mass spectrometry methods for uranium isotopic assay as well as screening techniques such as x-ray fluorescence. The general approach described has application beyond uranium to other analytes of nuclear forensic interest (e.g., rare earth elements and plutonium) as well as heavy metals for environmental and industrial hygiene monitoring.

  3. Spectrofluorimetric determination of gallium with N-(3-hydroxy-2-pyridyl) salicylaldimine and its application to the analysis of biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez Rojas, F.; Cano Pavon, J.M. [Department of Analytical Chemistry, Faculty of Science, University of Malaga, Malaga (Spain)

    1995-12-31

    The fluorimetric determination of gallium at the nanogram level, based on the formation of a fluorescent complex between Ga(III) and N-(3-hydroxy-2-pyridyl) salicylaldimine (3-OH-PSA), is proposed. With excitation at 397 nm, the chelate has a maximum emission at 498 nm. The reaction is carried out at acidic pH in aqueous-DMF medium (40% v/v DMF). The influence of the reaction variables is discussed. The detection limit is 0.9 ng ml-1 and the range of application is 1-125 ng ml-1. The relative error of the method is ±1.6%. The proposed method has been applied to the determination of gallium in biological samples. (Author) 12 refs.

  4. Sensitivity analysis using contribution to sample variance plot: Application to a water hammer model

    International Nuclear Information System (INIS)

    Tarantola, S.; Kopustinskas, V.; Bolado-Lavin, R.; Kaliatka, A.; Ušpuras, E.; Vaišnoras, M.

    2012-01-01

    This paper presents the “contribution to sample variance plot”, a natural extension of the “contribution to the sample mean plot”, which is a graphical tool for global sensitivity analysis originally proposed by Sinclair. These graphical tools have great potential to display sensitivity information graphically given a generic input sample and its related model realizations. The contribution to the sample variance can be obtained at no extra computational cost, i.e. from the same points used for deriving the contribution to the sample mean and/or scatter-plots. The proposed approach effectively instructs the analyst on how to achieve a targeted reduction of the variance by operating on the extremes of the input parameters' ranges. The approach is tested against a known benchmark for sensitivity studies, the Ishigami test function, and a numerical model simulating the behaviour of a water hammer effect in a piping system.
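
    The construction of both plots is straightforward given an existing Monte Carlo sample: sort the runs by one input parameter and accumulate each run's share of the output sum (for the mean plot) and of the squared deviations from the mean (for the variance plot). A minimal sketch under those assumptions (function and variable names are illustrative, not from the paper):

    ```python
    def contribution_curves(x, y):
        """Contribution to the sample mean (csm) and sample variance (csv)
        for one input parameter x, given the model outputs y of the same runs.

        Runs are sorted by x; the curves give the cumulative fraction of
        sum(y) and of sum((y - mean)^2) as x sweeps its range.
        """
        n = len(y)
        ybar = sum(y) / n
        order = sorted(range(n), key=lambda i: x[i])
        sum_y = sum(y)
        sum_sq = sum((v - ybar) ** 2 for v in y)
        csm, csv, cm, cv = [], [], 0.0, 0.0
        for i in order:
            cm += y[i] / sum_y
            cv += (y[i] - ybar) ** 2 / sum_sq
            csm.append(cm)
            csv.append(cv)
        return csm, csv
    ```

    A steep rise of the variance curve near one end of an input's range indicates that restricting that extreme would yield the targeted variance reduction the paper describes.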

  5. Equilibrium sampling by reweighting nonequilibrium simulation trajectories.

    Science.gov (United States)

    Yang, Cheng; Wan, Biao; Xu, Shun; Wang, Yanting; Zhou, Xin

    2016-03-01

    Based on equilibrium molecular simulations, it is usually difficult to efficiently visit the whole conformational space of complex systems, which are separated into some metastable regions by high free energy barriers. Nonequilibrium simulations could enhance transitions among these metastable regions and then be applied to sample equilibrium distributions in complex systems, since the associated nonequilibrium effects can be removed by employing the Jarzynski equality (JE). Here we present such a systematic method, named reweighted nonequilibrium ensemble dynamics (RNED), to efficiently sample equilibrium conformations. The RNED is a combination of the JE and our previous reweighted ensemble dynamics (RED) method. The original JE reproduces equilibrium from many nonequilibrium trajectories but requires that the initial distribution of these trajectories is equilibrium. The RED reweights many equilibrium trajectories from an arbitrary initial distribution to get the equilibrium distribution, whereas the RNED has both advantages of the two methods, reproducing equilibrium from many nonequilibrium simulation trajectories with an arbitrary initial conformational distribution. We illustrated the application of the RNED in a toy model and in a Lennard-Jones fluid to detect its liquid-solid phase coexistence. The results indicate that the RNED sufficiently extends the application of both the original JE and the RED in equilibrium sampling of complex systems.
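
    The JE-based reweighting step at the core of such schemes can be sketched as follows: each nonequilibrium trajectory contributes to an equilibrium average with weight exp(-beta*W), where W is the work performed along it. This is a generic Jarzynski-weighted estimator, not the full RNED method; the names and interface are assumptions:

    ```python
    import math

    def jarzynski_average(observable, work, beta=1.0):
        """Equilibrium estimate of <O> from nonequilibrium trajectories.

        observable: O evaluated at each trajectory endpoint.
        work:       nonequilibrium work W along each trajectory.
        Each endpoint is weighted by exp(-beta * W), which removes the
        nonequilibrium bias per the Jarzynski equality.
        """
        weights = [math.exp(-beta * w) for w in work]
        z = sum(weights)
        return sum(o * w for o, w in zip(observable, weights)) / z
    ```

    With zero work the estimator reduces to the plain sample mean; trajectories driven far from equilibrium acquire exponentially small weights, which is the well-known variance cost of JE-based estimators that RNED's reweighting aims to mitigate.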

  6. Th, Pa and U isotopes in an echinoderm, Encope grandis. An application to dating of some fossil samples from Southern Baja California

    Energy Technology Data Exchange (ETDEWEB)

    Omura, A [Kanazawa Univ. (Japan). Faculty of Science; Ku, T

    1979-03-01

    The application of 230 Th and 231 Pa growth methods to the hard tissues of living organisms, which are effective for radiometric age measurement of the late Quaternary period, has been limited to certain corals, and has therefore scarcely been utilized in areas other than coral reefs. Reef coral fossils (Porites) were obtained from terrace deposits of Magdalena Island in Southern Baja California, and the methods were applied to them. At the same time, the isotopic compositions of Th, Pa and U in the shells of the echinoderm Encope grandis, both fossil and living samples, were examined. The estimated ages were in agreement with those of the coral. This suggests that reliable 230 Th and 231 Pa ages of sea-urchin fossils have been presented for the first time and that the method is applicable to such fossils provided the necessary conditions are met. The results are highly significant, since the method may be used in areas other than coral reefs. (J.P.N.).

  7. Evaluation and application of static headspace-multicapillary column-gas chromatography-ion mobility spectrometry for complex sample analysis.

    Science.gov (United States)

    Denawaka, Chamila J; Fowlis, Ian A; Dean, John R

    2014-04-18

    An evaluation of static headspace-multicapillary column-gas chromatography-ion mobility spectrometry (SHS-MCC-GC-IMS) has been undertaken to assess its applicability for the determination of 32 volatile compounds (VCs). The key experimental variables of sample incubation time and temperature have been evaluated alongside the MCC-GC variables of column polarity, syringe temperature, injection temperature, injection volume, column temperature and carrier gas flow rate, coupled with the IMS variables of temperature and drift gas flow rate. This evaluation resulted in six sets of experimental variables being required to separate the 32 VCs. The optimum experimental variables for SHS-MCC-GC-IMS and the retention time and drift time operating parameters were determined; to normalise the operating parameters, the relative drift time and normalised reduced ion mobility for each VC were determined. In addition, a full theoretical explanation is provided of the formation of the monomer, dimer and trimer of a VC. Calibration data were obtained under the optimum operating conditions for each VC, alongside limit of detection (LOD) and limit of quantitation (LOQ) values. Typical detection limits ranged from 0.1 ng (bis(methylthio)methane, ethyl butanoate and (E)-2-nonenal) to 472 ng (isovaleric acid), with correlation coefficient (R(2)) data ranging from 0.9793 (for the dimer of octanal) to 0.9990 (for isobutyric acid). Finally, the developed protocols were applied to the analysis of malodour in sock samples. Initial work involved spiking an inert matrix and sock samples with appropriate concentrations of eight VCs. The average recovery from the inert matrix was 101±18% (n=8), while recoveries from the sock samples were lower, that is, 54±30% (n=8) for sock type 1 and 78±24% (n=6) for sock type 2. Finally, SHS-MCC-GC-IMS was applied to sock malodour in a field trial based on 11 volunteers (mixed gender) over a 3-week period. By applying the SHS-MCC-GC-IMS database, four VCs were

  8. On the assessment of extremely low breakdown probabilities by an inverse sampling procedure [gaseous insulation

    DEFF Research Database (Denmark)

    Thyregod, Poul; Vibholm, Svend

    1991-01-01

    First breakdown voltages obtained under the inverse sampling procedure, assuming a double exponential flashover probability function, are discussed. An inverse sampling procedure commences the voltage application at a very low level, followed by applications at stepwise increased levels until breakdown occurs. The relation between the flashover probability function and the corresponding distribution of first breakdown voltages under the inverse sampling procedure is derived, and it is shown how this relation may be utilized to assess the single-shot flashover probability corresponding to the observed average first breakdown voltage. Since the procedure is based on voltage applications in the neighbourhood of the quantile under investigation, the procedure is found to be insensitive to the underlying distributional assumptions.

  9. Method for spiking soil samples with organic compounds

    DEFF Research Database (Denmark)

    Brinch, Ulla C; Ekelund, Flemming; Jacobsen, Carsten S

    2002-01-01

    We examined the harmful side effects on indigenous soil microorganisms of two organic solvents, acetone and dichloromethane, that are normally used for spiking of soil with polycyclic aromatic hydrocarbons for experimental purposes. The solvents were applied in two contamination protocols to either … tagged with luxAB::Tn5. For both solvents, application to the whole sample resulted in severe side effects on both indigenous protozoa and bacteria. Application of dichloromethane to the whole soil volume immediately reduced the number of protozoa to below the detection limit. In one of the soils … higher than in control soil, probably due mainly to release from predation by indigenous protozoa. In order to minimize solvent effects on indigenous soil microorganisms when spiking native soil samples with compounds having a low water solubility, we propose a common protocol in which the contaminant …

  10. High-resolution X-ray diffraction with no sample preparation.

    Science.gov (United States)

    Hansford, G M; Turner, S M R; Degryse, P; Shortland, A J

    2017-07-01

    It is shown that energy-dispersive X-ray diffraction (EDXRD) implemented in a back-reflection geometry is extremely insensitive to sample morphology and positioning even in a high-resolution configuration. This technique allows high-quality X-ray diffraction analysis of samples that have not been prepared and is therefore completely non-destructive. The experimental technique was implemented on beamline B18 at the Diamond Light Source synchrotron in Oxfordshire, UK. The majority of the experiments in this study were performed with pre-characterized geological materials in order to elucidate the characteristics of this novel technique and to develop the analysis methods. Results are presented that demonstrate phase identification, the derivation of precise unit-cell parameters and extraction of microstructural information on unprepared rock samples and other sample types. A particular highlight was the identification of a specific polytype of a muscovite in an unprepared mica schist sample, avoiding the time-consuming and difficult preparation steps normally required to make this type of identification. The technique was also demonstrated in application to a small number of fossil and archaeological samples. Back-reflection EDXRD implemented in a high-resolution configuration shows great potential in the crystallographic analysis of cultural heritage artefacts for the purposes of scientific research such as provenancing, as well as contributing to the formulation of conservation strategies. Possibilities for moving the technique from the synchrotron into museums are discussed. The avoidance of the need to extract samples from high-value and rare objects is a highly significant advantage, applicable also in other potential research areas such as palaeontology, and the study of meteorites and planetary materials brought to Earth by sample-return missions.

  11. Capacitive deionization on-chip as a method for microfluidic sample preparation

    NARCIS (Netherlands)

    Roelofs, Susan Helena; Kim, Bumjoo; Eijkel, Jan C.T.; Han, Jongyoon; van den Berg, Albert; Odijk, Mathieu

    2015-01-01

    Desalination as a sample preparation step is essential for noise reduction and reproducibility of mass spectrometry measurements. A specific example is the analysis of proteins for medical research and clinical applications. Salts and buffers that are present in samples need to be removed before

  12. Construction of electron beam machine 350 keV/10 mA for multipurpose application of thin sample at P3TM-BATAN

    International Nuclear Information System (INIS)

    Darsono

    2004-01-01

    Research and development of electron beam technology in Indonesia, which started in 1984, is first briefly presented. BATAN assigned to the Yogyakarta Nuclear Center the project of constructing a 350 keV/10 mA electron beam machine for multipurpose applications, especially for thin samples, over a duration of five years. The main objectives of the project were the training of young scientists and demonstration of the operation and maintenance of the machine. Through experience with the low-energy ion accelerator (150 kV), the engineers learned many techniques needed to construct the system components, such as the E-gun, high voltage supply, vacuum system, beam optics, scanning horn and window, beam stopper, and conveyor, as well as the embedded control system. Because of the window cooling system, the uses of the machine are limited to irradiating thin samples of plastics, hydrogel, powder, or liquid. Future plans for modification of the machine are stated. (S. Ohno)

  13. A neural algorithm for the non-uniform and adaptive sampling of biomedical data.

    Science.gov (United States)

    Mesin, Luca

    2016-04-01

    Body sensors are finding increasing applications in self-monitoring for health care and in the remote surveillance of vulnerable people. The physiological data to be sampled can be non-stationary, with bursts of high amplitude and frequency content providing most of the information. Such data could be sampled efficiently with a non-uniform schedule that increases the sampling rate only during activity bursts. A real-time, adaptive algorithm is proposed to select the sampling rate in order to reduce the number of measured samples while still recording the main information. The algorithm is based on a neural network which predicts the subsequent samples and their uncertainties, requiring a measurement only when the risk of the prediction is larger than a selectable threshold. Four examples of application to biomedical data are discussed: electromyogram, electrocardiogram, electroencephalogram, and body acceleration. Sampling rates are reduced below the Nyquist limit while still preserving an accurate representation of the data and of their power spectral densities (PSD). For example, sampling at 60% of the Nyquist frequency, the percentage average rectified errors in estimating the signals are on the order of 10% and the PSD is fairly represented up to the highest frequencies. The method outperforms both uniform sampling and compressive sensing applied to the same data. The discussed method makes it possible to go beyond the Nyquist limit while still preserving the information content of non-stationary biomedical signals. It could find applications in body sensor networks to lower the number of wireless communications (saving sensor power) and to reduce the occupation of memory. Copyright © 2016 Elsevier Ltd. All rights reserved.
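
    The measure-only-when-risk-is-high idea above can be sketched in a few lines. The toy below is an illustration only: a simple linear extrapolator stands in for the paper's neural-network predictor, and the risk is taken as the extrapolation error against a synthetic burst signal (the threshold and signal are invented for the example).

```python
import numpy as np

def adaptive_sample(signal, threshold):
    """Non-uniform sampling: keep a new measurement only when the predictor's
    error exceeds `threshold`.  A linear extrapolator stands in for the
    neural-network predictor of the paper (illustrative only)."""
    taken = [0, 1]                                # bootstrap with two samples
    for t in range(2, len(signal)):
        t1, t2 = taken[-1], taken[-2]
        slope = (signal[t1] - signal[t2]) / (t1 - t2)
        pred = signal[t1] + slope * (t - t1)      # linear extrapolation
        if abs(pred - signal[t]) > threshold:     # "risk" too high -> measure
            taken.append(t)
    return taken

# synthetic burst signal: flat everywhere except a high-frequency burst
t = np.linspace(0.0, 1.0, 400)
x = np.where((t > 0.4) & (t < 0.6), np.sin(2 * np.pi * 40 * t), 0.0)
idx = adaptive_sample(x, threshold=0.05)
# samples concentrate inside the burst; the flat regions are barely sampled
```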

  14. Cadmium and lead determination by ICPMS: Method optimization and application in carabao milk samples

    Directory of Open Access Journals (Sweden)

    Riza A. Magbitang

    2012-06-01

    A method utilizing inductively coupled plasma mass spectrometry (ICPMS) as the element-selective detector, with microwave-assisted nitric acid digestion as the sample pre-treatment technique, was developed for the simultaneous determination of cadmium (Cd) and lead (Pb) in milk samples. The estimated detection limits were 0.09 µg kg-1 and 0.33 µg kg-1 for Cd and Pb, respectively. The method was linear in the concentration range 0.01 to 500 µg kg-1 with correlation coefficients of 0.999 for both analytes. The method was validated using certified reference material BCR 150, and the determined values for Cd and Pb were 18.24 ± 0.18 µg kg-1 and 807.57 ± 7.07 µg kg-1, respectively. Further validation using another certified reference material, NIST 1643e, resulted in determined concentrations of 6.48 ± 0.10 µg L-1 for Cd and 21.96 ± 0.87 µg L-1 for Pb. These determined values agree well with the certified values of the reference materials. The method was applied to processed and raw carabao milk samples collected in Nueva Ecija, Philippines. The Cd levels determined in the samples were in the range 0.11 ± 0.07 to 5.17 ± 0.13 µg kg-1 for the processed milk samples, and 0.11 ± 0.07 to 0.45 ± 0.09 µg kg-1 for the raw milk samples. The concentrations of Pb were in the range 0.49 ± 0.21 to 5.82 ± 0.17 µg kg-1 for the processed milk samples, and 0.72 ± 0.18 to 6.79 ± 0.20 µg kg-1 for the raw milk samples.
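
    As a generic illustration of how such calibration figures of merit are computed (the standards, counts and noise level below are synthetic, not the paper's data), one can fit a straight line through the calibration points, report R², and estimate a detection limit with the common 3·s(blank)/slope convention:

```python
import numpy as np

# synthetic calibration standards (invented values, not the paper's data)
conc = np.array([0.0, 1.0, 5.0, 10.0, 50.0, 100.0])      # concentration, ug/kg
rng = np.random.default_rng(1)
signal = 250.0 * conc + 30.0 + rng.normal(0.0, 40.0, conc.size)  # counts

slope, intercept = np.polyfit(conc, signal, 1)            # least-squares line
pred = slope * conc + intercept
r2 = 1.0 - np.sum((signal - pred) ** 2) / np.sum((signal - signal.mean()) ** 2)

s_blank = 40.0                       # sd of blank replicates (assumed)
lod = 3.0 * s_blank / slope          # one common LOD convention
unknown = (4800.0 - intercept) / slope   # invert the line for an unknown
```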

  15. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    Energy Technology Data Exchange (ETDEWEB)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  16. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    International Nuclear Information System (INIS)

    Shine, E. P.; Poirier, M. R.

    2013-01-01

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  17. Improvements to sample processing and measurement to enable more widespread environmental application of tritium.

    Science.gov (United States)

    Moran, James; Alexander, Thomas; Aalseth, Craig; Back, Henning; Mace, Emily; Overman, Cory; Seifert, Allen; Freeburg, Wilcox

    2017-08-01

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. We identify a current quantification limit of 92.2 TU which, combined with our small sample sizes, correlates to as little as 0.00133 Bq of total T activity. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of both natural and artificial T behavior in the environment. Copyright © 2017. Published by Elsevier Ltd.
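
    The quoted figures are mutually consistent under the standard conversion 1 TU ≈ 0.118 Bq per litre of water (the conversion factor is general background knowledge, not taken from the record):

```python
# Cross-check of the quoted quantification limit.
TU_TO_BQ_PER_L = 0.118              # 1 TU ~ 0.118 Bq per litre of water
sample_volume_l = 120e-3 / 1000.0   # 120 mg of water ~ 0.120 mL = 1.2e-4 L
activity_bq = 92.2 * TU_TO_BQ_PER_L * sample_volume_l
# ~0.0013 Bq, in line with the 0.00133 Bq quoted above
```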

  18. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the precision required, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques because the results of the study can be generalized to the target population.
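
    The proportion-based sample size estimate described above follows the standard formula n = z²·p·(1−p)/d²; a minimal sketch (the worst-case p = 0.5 and the 5% margin are illustrative choices, not values from the article):

```python
import math

def sample_size_proportion(p, margin, z=1.96):
    """Minimum n to estimate a proportion p within +/-margin at ~95% confidence
    (infinite-population formula)."""
    return math.ceil(z ** 2 * p * (1.0 - p) / margin ** 2)

def with_fpc(n, population):
    """Finite-population correction for small target populations."""
    return math.ceil(n / (1.0 + (n - 1.0) / population))

n = sample_size_proportion(0.5, 0.05)   # worst-case p = 0.5 -> n = 385
n_small = with_fpc(n, 1000)             # shrinks for a population of 1000
```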

  19. Successful application of FTA Classic Card technology and use of bacteriophage phi29 DNA polymerase for large-scale field sampling and cloning of complete maize streak virus genomes.

    Science.gov (United States)

    Owor, Betty E; Shepherd, Dionne N; Taylor, Nigel J; Edema, Richard; Monjane, Adérito L; Thomson, Jennifer A; Martin, Darren P; Varsani, Arvind

    2007-03-01

    Leaf samples from 155 maize streak virus (MSV)-infected maize plants were collected from 155 farmers' fields in 23 districts in Uganda in May/June 2005 by leaf-pressing infected samples onto FTA Classic Cards. Viral DNA was successfully extracted from cards stored at room temperature for 9 months. The diversity of 127 MSV isolates was analysed by PCR-generated RFLPs. Six representative isolates having different RFLP patterns and causing either severe, moderate or mild disease symptoms, were chosen for amplification from FTA cards by bacteriophage phi29 DNA polymerase using the TempliPhi system. Full-length genomes were inserted into a cloning vector using a unique restriction enzyme site, and sequenced. The 1.3-kb PCR product amplified directly from FTA-eluted DNA and used for RFLP analysis was also cloned and sequenced. Comparison of cloned whole genome sequences with those of the original PCR products indicated that the correct virus genome had been cloned and that no errors were introduced by the phi29 polymerase. This is the first successful large-scale application of FTA card technology to the field, and illustrates the ease with which large numbers of infected samples can be collected and stored for downstream molecular applications such as diversity analysis and cloning of potentially new virus genomes.

  20. Fast sampling from a Hidden Markov Model posterior for large data

    DEFF Research Database (Denmark)

    Bonnevie, Rasmus; Hansen, Lars Kai

    2014-01-01

    Hidden Markov Models are of interest in a broad set of applications including modern data driven systems involving very large data sets. However, approximate inference methods based on Bayesian averaging are precluded in such applications as each sampling step requires a full sweep over the data...
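
    For context, the exact posterior sampler whose full data sweep per draw motivates such approximations is forward filtering-backward sampling (FFBS). A minimal sketch on a synthetic two-state Gaussian example (all parameters are invented for illustration):

```python
import numpy as np

def ffbs(pi, A, lik, rng):
    """Forward filtering, backward sampling: draw one state path from the
    exact HMM posterior p(z_1:T | x_1:T), where lik[t, k] = p(x_t | z_t = k).
    Each draw costs a full O(T) sweep over the data."""
    T, K = lik.shape
    alpha = np.zeros((T, K))
    alpha[0] = pi * lik[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):                       # forward filter
        alpha[t] = (alpha[t - 1] @ A) * lik[t]
        alpha[t] /= alpha[t].sum()
    z = np.zeros(T, dtype=int)
    z[-1] = rng.choice(K, p=alpha[-1])
    for t in range(T - 2, -1, -1):              # backward sample
        w = alpha[t] * A[:, z[t + 1]]
        z[t] = rng.choice(K, p=w / w.sum())
    return z

rng = np.random.default_rng(0)
A = np.array([[0.95, 0.05], [0.05, 0.95]])      # sticky 2-state chain
means = np.array([0.0, 3.0])
states = [0] * 50 + [1] * 50                    # true hidden path
x = rng.normal(means[states], 1.0)              # observations
lik = np.exp(-0.5 * (x[:, None] - means[None, :]) ** 2)
z = ffbs(np.array([0.5, 0.5]), A, lik, rng)     # one posterior path draw
```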

  1. Electrofracturing test system and method of determining material characteristics of electrofractured material samples

    Science.gov (United States)

    Bauer, Stephen J.; Glover, Steven F.; Pfeifle, Tom; Su, Jiann-Cherng; Williamson, Kenneth Martin; Broome, Scott Thomas; Gardner, William Payton

    2017-08-01

    A device for electrofracturing a material sample and analyzing the material sample is disclosed. The device simulates an in situ electrofracturing environment so as to obtain electrofractured material characteristics representative of field applications while allowing permeability testing of the fractured sample under in situ conditions.

  2. Sampling based motion planning with reachable volumes: Application to manipulators and closed chain systems

    KAUST Repository

    McMahon, Troy

    2014-09-01

    © 2014 IEEE. Reachable volumes are a geometric representation of the regions the joints of a robot can reach. They can be used to generate constraint satisfying samples for problems including complicated linkage robots (e.g. closed chains and graspers). They can also be used to assist robot operators and to help in robot design. We show that reachable volumes have an O(1) complexity in unconstrained problems as well as in many constrained problems. We also show that reachable volumes can be computed in linear time and that reachable volume samples can be generated in linear time in problems without constraints. We experimentally validate reachable volume sampling, both with and without constraints on end effectors and/or internal joints. We show that reachable volume samples are less likely to be invalid due to self-collisions, making reachable volume sampling significantly more efficient for higher dimensional problems. We also show that these samples are easier to connect than others, resulting in better connected roadmaps. We demonstrate that our method can be applied to 262-dof, multi-loop, and tree-like linkages including combinations of planar, prismatic and spherical joints. In contrast, existing methods either cannot be used for these problems or do not produce good quality solutions.

  3. Sampling based motion planning with reachable volumes: Application to manipulators and closed chain systems

    KAUST Repository

    McMahon, Troy; Thomas, Shawna; Amato, Nancy M.

    2014-01-01

    © 2014 IEEE. Reachable volumes are a geometric representation of the regions the joints of a robot can reach. They can be used to generate constraint satisfying samples for problems including complicated linkage robots (e.g. closed chains and graspers). They can also be used to assist robot operators and to help in robot design. We show that reachable volumes have an O(1) complexity in unconstrained problems as well as in many constrained problems. We also show that reachable volumes can be computed in linear time and that reachable volume samples can be generated in linear time in problems without constraints. We experimentally validate reachable volume sampling, both with and without constraints on end effectors and/or internal joints. We show that reachable volume samples are less likely to be invalid due to self-collisions, making reachable volume sampling significantly more efficient for higher dimensional problems. We also show that these samples are easier to connect than others, resulting in better connected roadmaps. We demonstrate that our method can be applied to 262-dof, multi-loop, and tree-like linkages including combinations of planar, prismatic and spherical joints. In contrast, existing methods either cannot be used for these problems or do not produce good quality solutions.

  4. Sampled-data models for linear and nonlinear systems

    CERN Document Server

    Yuz, Juan I

    2014-01-01

    Sampled-data Models for Linear and Nonlinear Systems provides a fresh new look at a subject with which many researchers may think themselves familiar. Rather than emphasising the differences between sampled-data and continuous-time systems, the authors proceed from the premise that, with modern sampling rates being as high as they are, it is becoming more appropriate to emphasise connections and similarities. The text is driven by three motives: the ubiquity of computers in modern control and signal-processing equipment means that sampling of systems that really evolve continuously is unavoidable; although superficially straightforward, sampling can easily produce erroneous results when not treated properly; and the need for a thorough understanding of many aspects of sampling among researchers and engineers dealing with applications to which they are central. The authors tackle many misconceptions which, although appearing reasonable at first sight, are in fact either p...

  5. DOE methods for evaluating environmental and waste management samples

    International Nuclear Information System (INIS)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1993-03-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others

  6. DOE methods for evaluating environmental and waste management samples.

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S C; McCulloch, M; Thomas, B L; Riley, R G; Sklarew, D S; Mong, G M; Fadeff, S K [eds.; Pacific Northwest Lab., Richland, WA (United States)

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others.

  7. Apparatus for freeze drying of biologic and sediment samples

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    Freeze drying to obtain water from individual samples, though not complicated, usually requires considerable effort to maintain the cold traps on a 24-hr basis. In addition, the transfer of a sample from sample containers to freeze-dry flasks is usually made with some risk of contamination to the sample. If samples are large, 300 g to 600 g, usually several days are required to dry the samples. The use of an unattended system greatly improves personnel and drying efficiency. Commercial freeze dryers are not readily applicable to the problem of collecting water from individual samples, and lab-designed collectors required sample transfer and continual replenishment of the dry ice. A freeze-dry apparatus for collecting water from individual sediment and/or biological samples was constructed to determine the tritium concentrations in fish for dose calculations and the tritium distribution in sediment cores for water movement studies. The freeze-dry apparatus, which can handle eight samples simultaneously and conveniently, is set up for unattended 24-hr operation and is designed to avoid sample transfer problems

  8. Improvements to sample processing and measurement to enable more widespread environmental application of tritium

    Energy Technology Data Exchange (ETDEWEB)

    Moran, James; Alexander, Thomas; Aalseth, Craig; Back, Henning; Mace, Emily; Overman, Cory; Seifert, Allen; Freeburg, Wilcox

    2017-08-01

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of T behavior in the environment.

  9. Sampled data spectroscopy (SDS): A new technology for radiation instrumentation

    International Nuclear Information System (INIS)

    Odell, D.M.C.

    1992-01-01

    A new instrumentation architecture for radiation spectroscopy is in the early stages of development at Savannah River. Based upon the same digital sampling techniques used in sonar and radar, sampled data spectroscopy (SDS) has produced NaI/PMT spectra with resolution comparable to conventional PHA systems. This work has laid the foundation for extending SDS techniques to solid state detector applications as well. Two-dimensional SDS processes raw, unintegrated detector output pulses to produce both energy and shape information that is used to construct a conventional energy spectrum. System advantages include zero electronic deadtime to support very high count rates, elimination of pulse pile-up peaks, high noise immunity, and digital system stability and reliability. Small size and low power requirements make 2-D SDS an ideal technology for portable instrumentation and remote monitoring applications. Applications of potential interest at Savannah River include on-the-spot spill analysis, real-time waste stream monitoring, and personnel and area monitoring below background levels. A three-dimensional sampled data architecture is also being developed. Relying on image analysis and enhancement techniques, 3-D SDS identifies spectral peaks without determining the energy of any individual detector pulses. These techniques also open up a new avenue of exploration for reducing or removing Compton effects from the spectra of single detector systems. The intended application for this technique is waste characterization, where lower energy isotopes are often obscured by the Compton scattering from dominant isotopes such as Cs-137

  10. The Applicability of Wildlife Value Orientations Scales to a Muslim Student Sample in Malaysia

    NARCIS (Netherlands)

    Jacobs, M.H.; Zainal Abidin, Zulkhairi

    2016-01-01

    This article addresses the applicability of quantitative wildlife value orientation scales to Muslim students in Malaysia. As Malaysian culture is deeply influenced by Islamic ideology, this article presents a case for addressing the cross-cultural applicability of the scales.

  11. Sweat: a sample with limited present applications and promising future in metabolomics.

    Science.gov (United States)

    Mena-Bravo, A; Luque de Castro, M D

    2014-03-01

    Sweat is a biofluid that currently sees scant use as a clinical sample. This review aims to demonstrate the advantages of sweat over other biofluids such as blood or urine for routine clinical analyses, and its potential in metabolomics. With this aim, a critical discussion of sweat samplers and of equipment for the analysis of target compounds in this sample is made. Well-established routine analyses in sweat, such as that used to diagnose cystic fibrosis, and the advantages and disadvantages of sweat versus urine or blood for doping control, are also discussed. Methods for analytes such as essential metals and xenometals, ethanol and electrolytes in sweat in fact constitute targeted metabolomics approaches or belong to metabolomics subdisciplines such as metallomics, ionomics or xenometabolomics. The greater development of biomarkers based on genomics or proteomics, as omics older than metabolomics, is discussed, as is the potential role of metabolomics in systems biology, taking into account its emergent implementation. Normalization of the volume of sampled sweat constitutes a present unsolved shortcoming that deserves investigation. Foreseeable trends in this area are outlined. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. Application of importance sampling method in sliding failure simulation of caisson breakwaters

    Science.gov (United States)

    Wang, Yu-chi; Wang, Yuan-zhan; Li, Qing-mei; Chen, Liang-zhi

    2016-06-01

    It is assumed that a storm wave occurs once a year during the design period, and N histories of storm waves are generated on the basis of the wave spectrum corresponding to the N-year design period. The responses of the breakwater to the N histories of storm waves in the N-year design period are calculated by a mass-spring-dashpot model and taken as a set of samples. The failure probability of caisson breakwaters during the design period of N years is obtained by statistical analysis of many sets of samples. The key issue is to improve the efficiency of the common Monte Carlo simulation method in estimating the failure probability of caisson breakwaters over the complete life cycle. In this paper, the kernel method of importance sampling, which can greatly increase the efficiency of failure probability calculation for caisson breakwaters, is proposed to estimate the failure probability of caisson breakwaters over the complete life cycle. The effectiveness of the kernel method is investigated by an example. It is indicated that the calculation efficiency of the kernel method is over 10 times that of the common Monte Carlo simulation method.
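
    The variance-reduction idea behind importance sampling can be made concrete with a textbook construction (a shifted-Gaussian proposal for a standard-normal tail probability; this is a generic sketch, not the paper's kernel method or breakwater model):

```python
import numpy as np

# Importance-sampling sketch for a rare "failure" probability: P(X > 4)
# for X ~ N(0, 1), whose true value is about 3.167e-5.
rng = np.random.default_rng(42)
N = 100_000

# naive Monte Carlo: almost no samples land in the failure region
naive = (rng.standard_normal(N) > 4.0).mean()

# importance sampling: draw from a proposal centred on the failure region
y = rng.standard_normal(N) + 4.0          # proposal N(4, 1)
w = np.exp(8.0 - 4.0 * y)                 # density ratio phi(y) / phi(y - 4)
is_est = np.mean((y > 4.0) * w)
# is_est is within a few percent of the true value for this N;
# the naive estimate is essentially useless at the same cost
```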

  13. Methodology of simultaneous analysis of Uranium and Thorium by nuclear and atomic techniques. Application to the Uranium and Thorium dosing in mineralogic samples

    International Nuclear Information System (INIS)

    Fakhi, S.

    1988-01-01

    This work concerns the potential applications of the 100 kW nuclear reactor of the Strasbourg Nuclear Research Centre to neutron activation analysis of uranium and thorium. Uranium determination was carried out using 239-U, 239-Np, fission products or delayed neutrons; thorium was detected by means of 233-Th or 233-Pa. Detection of 239-U and 233-Th allows rapid, non-destructive analysis of uranium and thorium, with maximum sensitivities of 78 ng for uranium and 160 ng for thorium. Determination based on 239-Np and 233-Pa detection requires selective chemical separation of each of these radionuclides; liquid-liquid extraction permitted the development of rapid, quantitative separation methods, with sensitivities after extraction reaching 30 ng for uranium and 50 ng for thorium. The study of fission product separation led to La, Ce and Nd extractions, and their application to uranium determination gives satisfactory results. A rapid method with a sensitivity of 0.35 microgrammes was developed based on delayed neutron measurement. These methods were applied to the determination of uranium and thorium in samples from the Oklo mine in Gabon. Analyses of these samples by atomic absorption spectroscopy and by the proton-induced X-ray emission (PIXE) method confirm that the neutron activation analysis methods are reliable. 37 figs., 14 tabs., 50 refs

  14. Field Sample Preparation Method Development for Isotope Ratio Mass Spectrometry

    International Nuclear Information System (INIS)

    Leibman, C.; Weisbrod, K.; Yoshida, T.

    2015-01-01

    Non-proliferation and International Security (NA-241) established a working group of researchers from Los Alamos National Laboratory (LANL), Pacific Northwest National Laboratory (PNNL) and Savannah River National Laboratory (SRNL) to evaluate the utilization of in-field mass spectrometry for safeguards applications. A survey of commercial off-the-shelf (COTS) mass spectrometers (MS) revealed that no instrumentation exists capable of meeting all the potential safeguards requirements for performance, portability, and ease of use. Additionally, fieldable instruments are unlikely to meet the International Target Values (ITVs) for accuracy and precision for isotope ratio measurements achieved with laboratory methods. The major gaps identified for in-field actinide isotope ratio analysis were in the areas of: 1. sample preparation and/or sample introduction, 2. size reduction of mass analyzers and ionization sources, 3. system automation, and 4. decreased system cost. Development work in areas 2 through 4, enumerated above, continues in the private and public sector. LANL is focusing on developing sample preparation/sample introduction methods for use with the different sample types anticipated for safeguards applications. Addressing sample handling and sample preparation methods for MS analysis will enable use of new MS instrumentation as it becomes commercially available. As one example, we have developed a rapid sample preparation method for dissolution of uranium and plutonium oxides using ammonium bifluoride (ABF). ABF is a significantly safer and faster alternative to digestion with boiling combinations of highly concentrated mineral acids. Actinides digested with ABF yield fluorides, which can then be analyzed directly or chemically converted and separated using established column chromatography techniques as needed prior to isotope analysis. The reagent volumes and the sample processing steps associated with ABF sample digestion lend themselves to automation and field

  15. Automating Groundwater Sampling At Hanford, The Next Step

    International Nuclear Information System (INIS)

    Connell, C.W.; Conley, S.F.; Hildebrand, R.D.; Cunningham, D.E.

    2010-01-01

    Historically, groundwater monitoring activities at the Department of Energy's Hanford Site in southeastern Washington State have been very 'people intensive.' Approximately 1500 wells are sampled each year by field personnel, or 'samplers.' These individuals were issued pre-printed forms showing information about the well(s) for a particular sampling evolution, taken from two official electronic databases: the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS). The samplers used these hardcopy forms to document the groundwater samples and well water levels. After recording the entries in the field, the samplers turned the forms in at the end of the day, and other personnel posted the collected information onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. A pilot project for automating this extremely tedious process was launched in 2008. Initially, the automation was focused on water-level measurements; now, the effort is being extended to automate the metadata associated with collecting groundwater samples. The project allowed electronic forms produced in the field by samplers to be used in a workflow in which the data are transferred to the database and the electronic form is filed in managed records, thus eliminating manually completed forms. Eliminating the manual forms and streamlining the data entry not only improved the accuracy of the information recorded, but also enhanced the efficiency and sampling capacity of field office personnel.

  16. Reliability estimation system: its application to the nuclear geophysical sampling of ore deposits

    International Nuclear Information System (INIS)

    Khaykovich, I.M.; Savosin, S.I.

    1992-01-01

    The reliability estimation system accepted in the Soviet Union for sampling data in nuclear geophysics is based on unique requirements in metrology and methodology. It involves estimating characteristic errors in calibration, as well as errors in measurement and interpretation. This paper describes the methods of estimating the levels of systematic and random errors at each stage of the problem. Nuclear geophysical sampling data are considered reliable if there are no statistically significant systematic differences between ore intervals determined by this method and by geological control, or by other methods of sampling whose reliability has been verified, and if the difference between the random errors is statistically insignificant. The system allows one to obtain information on the parameters of ore intervals with a guaranteed random error and without systematic errors. (Author)
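    The reliability criterion described, no statistically significant systematic difference between paired determinations of the same ore intervals, can be sketched as a simple paired comparison. The grade data and the fixed critical value below are hypothetical illustrations, not the system's actual procedure:

```python
import math
import statistics as st

def paired_systematic_check(x, y, t_crit=2.0):
    """Test for a statistically significant systematic difference between
    paired determinations of the same ore intervals (e.g. nuclear
    geophysical sampling vs. geological control).
    Returns (mean difference, t statistic, significance flag)."""
    d = [a - b for a, b in zip(x, y)]
    mean_d = st.mean(d)                       # systematic (bias) component
    se = st.stdev(d) / math.sqrt(len(d))      # standard error of the mean
    t = mean_d / se
    return mean_d, t, abs(t) > t_crit

# Hypothetical paired grades (%): nuclear geophysical log vs. control assay.
nuc = [0.12, 0.30, 0.22, 0.45, 0.18, 0.27, 0.33, 0.15]
ctl = [0.11, 0.31, 0.21, 0.46, 0.19, 0.26, 0.34, 0.14]
mean_d, t, sig = paired_systematic_check(nuc, ctl)
print(sig)  # only small random scatter here, so no systematic bias flagged
```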

  17. Industrial variographic analysis for continuous sampling system validation

    DEFF Research Database (Denmark)

    Engström, Karin; Esbensen, Kim Harry

    2017-01-01

    Karin Engström, LKAB mining, Kiruna, Sweden, continues to present illuminative cases from process industry. Here she reveals more from her ongoing PhD project showing application of variographic characterisation for on-line continuous control of process sampling systems, including the one...

  18. Material sampling for rotor evaluation

    International Nuclear Information System (INIS)

    Mercaldi, D.; Parker, J.

    1990-01-01

    Decisions regarding continued operation of aging rotating machinery must often be made without adequate knowledge of rotor material conditions. Physical specimens of the material are not generally available due to lack of an appropriate sampling technique or the high cost and inconvenience of obtaining such samples. This is despite the fact that examination of such samples may be critical to effectively assess the degradation of mechanical properties of the components in service or to permit detailed examination of microstructure and surface flaws. Such information permits a reduction in the uncertainty of remaining life estimates for turbine rotors to avoid unnecessarily premature and costly rotor retirement decisions. This paper describes the operation and use of a recently developed material sampling device which machines and recovers an undeformed specimen from the surface of rotor bores or other components for metallurgical analysis. The removal of the thin, wafer-like sample has a negligible effect on the structural integrity of these components, due to the geometry and smooth surface finish of the resulting shallow depression. Samples measuring approximately 0.03 to 0.1 inches (0.76 to 2.5 mm) thick by 0.5 to 1.0 inch (1.3 to 2.5 cm) in diameter can be removed without mechanical deformation or thermal degradation of the sample or the remaining component material. The device is operated remotely from a control console and can be used externally or internally on any surface for which there is at least a three inch (7.6 cm) working clearance. Application of the device in two case studies of turbine-generator evaluations is presented

  19. A 172 μW Compressively Sampled Photoplethysmographic (PPG) Readout ASIC With Heart Rate Estimation Directly From Compressively Sampled Data.

    Science.gov (United States)

    Pamula, Venkata Rajesh; Valero-Sarmiento, Jose Manuel; Yan, Long; Bozkurt, Alper; Hoof, Chris Van; Helleputte, Nick Van; Yazicioglu, Refet Firat; Verhelst, Marian

    2017-06-01

    A compressive sampling (CS) photoplethysmographic (PPG) readout with embedded feature extraction to estimate heart rate (HR) directly from compressively sampled data is presented. It integrates a low-power analog front end with a digital back end that performs feature extraction to estimate the average HR over a 4 s interval directly from compressively sampled PPG data. The application-specific integrated circuit (ASIC) supports a uniform sampling mode (1x compression) as well as CS modes with compression ratios of 8x, 10x, and 30x. CS is performed by nonuniformly subsampling the PPG signal, while feature extraction is performed using least-squares spectral fitting through the Lomb-Scargle periodogram. The ASIC consumes 172 μW of power from a 1.2 V supply while reducing the relative LED driver power consumption by up to 30 times without significant loss of the information needed for accurate HR estimation.
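    The HR-estimation step, a least-squares spectral fit on nonuniformly subsampled data, can be sketched in a few lines. This uses an idealised noiseless PPG tone and SciPy's Lomb-Scargle routine standing in for the ASIC's on-chip implementation; the sampling rate and compression ratio are illustrative assumptions:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)

fs, hr_hz = 125.0, 1.2                        # nominal rate; true HR = 72 bpm
t_full = np.arange(0, 4.0, 1.0 / fs)          # 4 s analysis window
ppg = np.sin(2 * np.pi * hr_hz * t_full)      # idealised PPG fundamental

# Compressive sampling by nonuniform subsampling: keep ~1/10 of the
# samples at random positions (roughly the 10x compression mode).
keep = np.sort(rng.choice(t_full.size, size=t_full.size // 10, replace=False))
t_cs, y_cs = t_full[keep], ppg[keep]
y_cs = y_cs - y_cs.mean()                     # remove DC before fitting

# Least-squares spectral fit (Lomb-Scargle) on the nonuniform samples;
# lombscargle expects angular frequencies.
f_grid = np.linspace(0.5, 3.5, 600)           # 30-210 bpm search band
pgram = lombscargle(t_cs, y_cs, 2 * np.pi * f_grid)
hr_bpm = 60.0 * f_grid[np.argmax(pgram)]
print(round(hr_bpm))
```

The periodogram peaks at the tone frequency even though only a random tenth of the samples survive, which is why HR can be read off the compressed stream without full reconstruction.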

  20. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    Science.gov (United States)

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, specific guidance for making sample size decisions is lacking. Our objective was to guide the design of multiplier-method population size estimation studies using respondent-driven sampling surveys so as to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling, and we describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from both M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
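    The estimator itself is N = M / P. A sketch with a delta-method confidence interval shows why uncertainty blows up when P is small; the numbers and the design effect below are illustrative assumptions, not figures from the Harare study:

```python
import math

def multiplier_size_estimate(M, p_hat, n, design_effect=2.0):
    """Multiplier-method population size N = M / P with a delta-method
    95% CI. M: count of unique objects distributed (assumed exact);
    p_hat: proportion reporting receipt in the RDS survey of size n;
    design_effect: variance inflation relative to simple random sampling."""
    N_hat = M / p_hat
    var_p = design_effect * p_hat * (1 - p_hat) / n
    # Delta method: dN/dP = -M / P^2, so se(N) = M * se(P) / P^2.
    se_N = M * math.sqrt(var_p) / p_hat**2
    return N_hat, (N_hat - 1.96 * se_N, N_hat + 1.96 * se_N)

# Illustrative numbers only: 500 objects distributed, 25% receipt
# reported in a survey of 400 respondents.
N_hat, ci = multiplier_size_estimate(500, 0.25, 400)
print(round(N_hat), [round(x) for x in ci])
```

Because P enters the standard error as 1/P^2, halving P roughly quadruples the width of the interval, which is the quantitative basis for the abstract's advice to push P upward.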

  1. In-air micro-PIXE analysis of tissue samples

    International Nuclear Information System (INIS)

    Tanaka, A.; Ishii, K.; Komori, Y.

    2002-01-01

    Micro-PIXE is capable of providing spatial distributions of elements on the micrometer scale, and its application to biology is useful for elucidating cellular metabolism. Since, in this method, a sample target is usually irradiated with proton or α-particle beams in vacuum, beam heating results in evaporation of volatile elements and shrinking of the sample. In order to avoid these side effects, we previously developed a technique of in-air micro-PIXE analysis for samples of cultured cells. In addition, analysis of exposed tissue samples from living subjects is highly desirable in biological and medical research. Here, we describe a technique of in-air micro-PIXE analysis of such tissue samples. Target samples of exposed tissue slices from a Donryu rat, in which a tumor had been transplanted, were analyzed with proton micro-beams of 2.6 MeV. We report that the shape of cells and the distribution of volatile elements in the tissue sample remain unchanged when using a target preparation based on a freeze-drying method. (author)

  2. Strategies for achieving high sequencing accuracy for low diversity samples and avoiding sample bleeding using illumina platform.

    Science.gov (United States)

    Mitra, Abhishek; Skrzypczak, Magdalena; Ginalski, Krzysztof; Rowicka, Maga

    2015-01-01

    Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations are causing substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications, and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems with quality of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer's, for mixing low diversity samples with diverse samples and lowering cluster density, which in our experience consistently produces high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. 
Alternatively, we discuss how

  3. Strategies for achieving high sequencing accuracy for low diversity samples and avoiding sample bleeding using illumina platform.

    Directory of Open Access Journals (Sweden)

    Abhishek Mitra

    Full Text Available Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations are causing substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications, and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems with quality of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer's, for mixing low diversity samples with diverse samples and lowering cluster density, which in our experience consistently produces high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. 
Alternatively

  4. Applicability of portable spectrometer for activity measurement of contaminated water and soil samples

    International Nuclear Information System (INIS)

    Krishnan, Narayani; Rekha, A.K.; Anilkumar, S.; Sharma, D.N.

    2011-01-01

    Absolute activity measurement is often necessary to assess the impact of radioactive contamination from various incidents. A commercially available portable spectrometer-cum-dose-rate meter is used for the identification of the radionuclides involved and the associated dose rates. In this paper the authors discuss a study carried out on the applicability of a portable spectrometer for absolute radioactivity measurements in water and soil matrices. The portable spectrometer and the methodology developed for activity estimation have been used in many in situ applications. (author)

  5. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    Science.gov (United States)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillation (BAO) measurements of D_V(z) r_d^fid/r_d from the two-point correlation functions of galaxies in Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Big surveys such as BOSS usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data, which necessarily introduces the fiducial fluctuation signals of the data into the random samples and weakens the BAO signals if cosmic variance cannot be ignored. We propose a smooth function of the redshift distribution that fits the data well to populate the random galaxy samples. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of the BAO signals is improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of current cosmological parameter measurements, such improvements would be valuable for future measurements of galaxy clustering.
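    The proposed fix, drawing random-sample redshifts from a smooth fit to n(z) rather than from the data themselves, can be sketched as follows. A toy Gaussian n(z) and a polynomial fit stand in for the paper's actual fitting function:

```python
import numpy as np

rng = np.random.default_rng(2)

# Mock "data" redshifts: the measured distribution carries both the true
# smooth n(z) and fluctuations imprinted by large-scale structure.
z_data = rng.normal(0.35, 0.08, 5000).clip(0.1, 0.6)

# Naive approach: draw random-sample redshifts directly from the data,
# which copies the fluctuations into the randoms. Smooth approach: fit
# a smooth model to the histogram, then sample it via the inverse CDF.
edges = np.linspace(0.1, 0.6, 51)
counts, _ = np.histogram(z_data, bins=edges)
centers = 0.5 * (edges[:-1] + edges[1:])

coeffs = np.polyfit(centers, counts, deg=5)        # smooth fit to n(z)
n_smooth = np.clip(np.polyval(coeffs, centers), 0, None)

cdf = np.cumsum(n_smooth) / n_smooth.sum()          # inverse-CDF sampling
u = rng.random(20000)
z_random = np.interp(u, cdf, centers)               # smooth random redshifts
```

The random catalogue then traces only the smooth selection function, so the data-minus-randoms clustering estimate retains the fluctuations as signal instead of dividing them out.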

  6. Analysis of atmospheric particulate samples via instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Greenberg, R.R.

    1990-01-01

    Instrumental neutron activation analysis (INAA) is a powerful analytical technique for the elemental characterization of atmospheric particulate samples. It is a true multielement technique, with adequate sensitivity to determine 30 to 40 elements in a sample of atmospheric particulate material, and its nondestructive nature allows sample reanalysis by the same or a different analytical technique. In this paper, as an example of the applicability of INAA to the study of atmospheric particulate material, a study of emissions from municipal incinerators is described

  7. A metric for cross-sample comparisons using logit and probit

    DEFF Research Database (Denmark)

    Karlson, Kristian Bernt

    relative to an arbitrary scale, which makes the coefficients difficult both to interpret and to compare across groups or samples. Do differences in coefficients reflect true differences or differences in scales? This cross-sample comparison problem raises concerns for comparative research. However, we......* across groups or samples, making it suitable for situations met in real applications in comparative research. Our derivations also extend to the probit and to ordered and multinomial models. The new metric is implemented in the Stata command nlcorr....

  8. Accelerator mass spectrometry of small biological samples.

    Science.gov (United States)

    Salehpour, Mehran; Forsgard, Niklas; Possnert, Göran

    2008-12-01

    Accelerator mass spectrometry (AMS) is an ultra-sensitive technique for isotopic ratio measurements. In the biomedical field, AMS can be used to measure femtomolar concentrations of labeled drugs in body fluids, with direct applications in early drug development such as microdosing. Likewise, the regenerative properties of cells, which are of fundamental significance in stem-cell research, can be determined with an accuracy of a few years by AMS analysis of human DNA. However, AMS nominally requires about 1 mg of carbon per sample, which is not always available when dealing with specific body substances such as localized, organ-specific DNA samples. Consequently, it is of analytical interest to develop methods for the routine analysis of small samples in the range of a few tens of micrograms. We have used a 5 MV Pelletron tandem accelerator to study small biological samples using AMS. Different methods are presented and compared. A 12C-carrier sample preparation method is described which is potentially more sensitive and less susceptible to contamination than the standard procedures.

  9. Gas-driven pump for ground-water samples

    Science.gov (United States)

    Signor, Donald C.

    1978-01-01

    Observation wells installed for artificial-recharge research and other wells used in different groundwater programs are frequently cased with small-diameter steel pipe. To obtain samples from these small-diameter wells in order to monitor water quality, and to calibrate solute-transport models, a small-diameter pump with unique operating characteristics is required that causes minimal alteration of samples during field sampling. A small-diameter gas-driven pump was designed and built to obtain water samples from wells of two-inch diameter or larger. The pump is a double-piston type with the following characteristics: (1) the water sample is isolated from the operating gas, (2) no source of electricity is necessary, (3) operation is continuous, (4) use of compressed gas is efficient, and (5) operation is reliable over extended periods of time. Principles of operation, actual operation techniques, gas-use analyses, and operating experience are described. Complete working drawings and a component list are included. Recent modifications and pump construction for high-pressure applications also are described. (Woodard-USGS)

  10. CONVERGENCE OF INTERNATIONAL AUDIT STANDARDS AND AMERICAN AUDIT STANDARDS REGARDING SAMPLING

    Directory of Open Access Journals (Sweden)

    Chis Anca Oana

    2013-07-01

    Full Text Available Abstract: Sampling is widely used in market research, scientific analysis, market analysis, opinion polls and, not least, in the financial statement audit. What, in fact, is sampling, and how did it appear? Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Nowadays the technique is indispensable, as economic entities operate with sophisticated computer systems and large amounts of data. Economic globalization and the complexity of capital markets have made possible not only the harmonization of international accounting standards with national ones, but also the convergence of international accounting and auditing standards with the American regulations. International Standard on Auditing 530 and Statement on Auditing Standards 39 are the two principal international and American standards on audit sampling. This article discusses the origin of audit sampling, giving a brief history of the method and different definitions from the literature. The two standards are compared using Jaccard indicators in terms of their degree of similarity and dissimilarity on different issues. The Jaccard coefficient measures the degree of convergence of the international auditing standard (ISA 530) and the U.S. auditing standard (SAS 39). Both sets of standards address the sampling problem and present common ground with regard to accepted sampling techniques, factors influencing the audit sample, treatment of identified misstatements, and the circumstances in which sampling is appropriate. The study shows that both standards agree on the application of statistical and non-statistical sampling in auditing and that sampling is appropriate for tests of details and of controls, the factors affecting audit sampling being audit risk, audit objectives and the population's characteristics.
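    The Jaccard coefficient used in the comparison is straightforward to compute: J(A, B) = |A ∩ B| / |A ∪ B|. The sketch below uses hypothetical coded content elements, not the study's actual coding of the two standards:

```python
def jaccard(a, b):
    """Jaccard similarity J(A, B) = |A ∩ B| / |A ∪ B|, in [0, 1].
    1.0 means identical sets (full convergence), 0.0 means disjoint."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a or b) else 1.0

# Hypothetical coded content elements from the two standards, for
# illustration only (not the study's actual coding of ISA 530 / SAS 39).
isa_530 = {"statistical sampling", "nonstatistical sampling",
           "audit risk", "misstatement projection", "stratification"}
sas_39 = {"statistical sampling", "nonstatistical sampling",
          "audit risk", "misstatement projection", "tolerable rate"}

print(round(jaccard(isa_530, sas_39), 2))  # → 0.67
```

Here four of six distinct elements are shared, so J = 4/6 ≈ 0.67; a Jaccard dissimilarity can be read off as 1 − J.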

  11. Scheduling whole-air samples above the Trade Wind Inversion from SUAS using real-time sensors

    Science.gov (United States)

    Freer, J. E.; Greatwood, C.; Thomas, R.; Richardson, T.; Brownlow, R.; Lowry, D.; MacKenzie, A. R.; Nisbet, E. G.

    2015-12-01

    Small Unmanned Air Systems (SUAS) are increasingly being used for a range of science applications. Here we explore their use to schedule the sampling of air masses up to 2.5 km above ground using computer-controlled, bespoke octocopter platforms. Whole-air sampling is targeted above, within and below the Trade Wind Inversion (TWI). On-board sensors profiled the TWI characteristics in real time on ascent and hence guided the altitudes at which samples were taken on descent. The science driver for this research is investigation of the Southern Methane Anomaly and, more broadly, the hemispheric-scale transport of long-lived atmospheric tracers in the remote troposphere. Here we focus on the practical application of SUAS for this purpose, highlighting the need for mission planning, computer control, on-board sensors and logistics in deploying such technologies for out-of-line-of-sight applications. We show how such a platform can be deployed successfully, resulting in some 60 sampling flights within a 10-day period. Challenges remain regarding the routine and cost-effective deployment of such platforms, particularly regarding training and support. We present some initial results from the methane sampling and their implications for exploring and understanding the Southern Methane Anomaly.

  12. Measurement of cerebral blood flow the blood sampling method using 99mTc-ECD. Simultaneous scintigram scanning of arterial blood samples and the brain with a gamma camera

    International Nuclear Information System (INIS)

    Hachiya, Takenori; Inugami, Atsushi; Iida, Hidehiro; Mizuta, Yoshihiko; Kawakami, Takeshi; Inoue, Minoru

    1999-01-01

    To measure regional cerebral blood flow (rCBF) by the blood sampling method using 99mTc-ECD, we devised a method of measuring the radioactive concentration in an arterial blood sample with a gamma camera. In this method the head and a blood sample are placed within the same visual field to record the SPECT data of both specimens simultaneously. An evaluation of the counting-rate performance, applying a 30-hour decay method with 99mTc solution, showed that although this method is not comparable to a well-type scintillation counter, in clinical cases the radioactive concentration in the arterial blood sample remained well within the dynamic range. In addition, examination of the influence of scattered radiation from the brain by the dilution method showed that it was negligible at distances of more than 7.5 cm between the brain and the arterial blood sample; in the present study we placed a head-shaped phantom next to the sample. The results suggest that this method is suitable for clinical application, and because it does not require a well-type scintillation counter, it is expected to find wide application. (author)

  13. Chroni - an Android Application for Geochronologists to Access Archived Sample Analyses from the NSF-Funded Geochron.Org Data Repository.

    Science.gov (United States)

    Nettles, J. J.; Bowring, J. F.

    2014-12-01

    NSF requires data management plans as part of funding proposals and geochronologists, among other scientists, are archiving their data and results to the public cloud archives managed by the NSF-funded Integrated Earth Data Applications, or IEDA. GeoChron is a database for geochronology housed within IEDA. The software application U-Pb_Redux developed at the Cyber Infrastructure Research and Development Lab for the Earth Sciences (CIRDLES.org) at the College of Charleston provides seamless connectivity to GeoChron for uranium-lead (U-Pb) geochronologists to automatically upload and retrieve their data and results. U-Pb_Redux also manages publication-quality documents including report tables and graphs. CHRONI is a lightweight mobile application for Android devices that provides easy access to these archived data and results. With CHRONI, U-Pb geochronologists can view archived data and analyses downloaded from the Geochron database, or any other location, in a customizable format. CHRONI uses the same extensible markup language (XML) schema and documents used by U-Pb_Redux and GeoChron. Report Settings are special XML files that can be customized in U-Pb_Redux, stored in the cloud, and then accessed and used in CHRONI to create the same customized data display on the mobile device. In addition to providing geologists effortless and mobile access to archived data and analyses, CHRONI allows users to manage their GeoChron credentials, quickly download private and public files via a specified IEDA International Geo Sample Number (IGSN) or URL, and view specialized graphics associated with particular IGSNs. Future versions of CHRONI will be developed to support iOS compatible devices. CHRONI is an open source project under the Apache 2 license and is hosted at https://github.com/CIRDLES/CHRONI. We encourage community participation in its continued development.

  14. Bounds for Tail Probabilities of the Sample Variance

    Directory of Open Access Journals (Sweden)

    Van Zuijlen M

    2009-01-01

Full Text Available We provide bounds for tail probabilities of the sample variance. The bounds are expressed in terms of Hoeffding functions and are the sharpest known. They are designed with applications in mind, in auditing as well as in the processing of environmental data.

  15. Application of pulsed OSL to polymineral fine-grained samples

    International Nuclear Information System (INIS)

    Feathers, James K.; Casson, M. Aksel; Schmidt, Amanda Henck; Chithambo, Makaiko L.

    2012-01-01

    Pulsed OSL is applied to nine fine-grained sediment samples from Sichuan province, China, using stimulating pulses of 10 μs on and 240 μs off, with an infrared exposure prior to each OSL measurement. Comparison of fading rates between pulsed and non-pulsed signals, the latter also obtained with a preceding IR exposure, shows that fading is significant for mainly the non-pulsed signals. Presence of a pulsed IRSL and the magnitudes of b-value to correct for lower alpha efficiency suggest that pulsing does not fully remove a significant feldspar signal, only a fading component. Comparison with ages of quartz extracts shows that pulsed OSL ages are consistent, while CW-OSL ages are slightly older and CW-IRSL ages are much older. The older ages suggest a less well-bleached feldspar component.

  16. Comparative study of the characteristics of some suction devices for gas sampling applications

    International Nuclear Information System (INIS)

    Donguy, R.; Drouet, J.

    1959-06-01

Gas sampling (used to determine the characteristics of dusts or aerosols contained in a gas) needs a suction device. In order to select the right device and the right conditions of use, the characteristics and performances of various suction devices (helicoidal and centrifugal aspirators, air pumps, volumetric pumps) have been experimentally measured: flow rate, head loss, sampling volume and duration, aerosol and dust concentration, gas density, nature of the gas, suction circuit configuration, etc.

  17. Biomechanical analysis of a salt-modified polyvinyl alcohol hydrogel for knee meniscus applications, including comparison with human donor samples.

    Science.gov (United States)

    Hayes, Jennifer C; Curley, Colin; Tierney, Paul; Kennedy, James E

    2016-03-01

The primary objective of this research was the biomechanical analysis of a salt-modified polyvinyl alcohol hydrogel, in order to assess its potential for use as an artificial meniscal implant. Aqueous polyvinyl alcohol (PVA) was treated with a sodium sulphate (Na2SO4) solution to precipitate out the polyvinyl alcohol, resulting in a pliable hydrogel. The freeze-thaw process, a strictly physical method of crosslinking, was employed to crosslink the hydrogel. Development of a meniscal-shaped mould and sample housing unit allowed the production of meniscal-shaped hydrogels for direct comparison to human meniscal tissue. Results obtained show that compressive responses were slightly higher in PVA/Na2SO4 menisci, displaying maximum compressive loads of 2472N, 2482N and 2476N for samples having undergone 1, 3 and 5 freeze-thaw cycles respectively. When compared to the human meniscal tissue tested under the same conditions, an average maximum load of 2467.5N was observed. This suggests that the PVA/Na2SO4 menisci are mechanically comparable to the human meniscus. Biocompatibility analysis of PVA/Na2SO4 hydrogels revealed no acute cytotoxicity. The work described herein has innovative potential in load-bearing applications, specifically as an alternative to meniscectomy for the replacement of critically damaged meniscal tissue in the knee joint where repair is not viable. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Systematic adaptive cluster sampling for the assessment of rare tree species in Nepal

    NARCIS (Netherlands)

    Acharya, B.; Bhattarai, G.; Gier, de A.; Stein, A.

    2000-01-01

    Sampling to assess rare tree species poses methodic problems, because they may cluster and many plots with no such trees are to be expected. We used systematic adaptive cluster sampling (SACS) to sample three rare tree species in a forest area of about 40 ha in Nepal. We checked its applicability

  19. Fast Ordered Sampling of DNA Sequence Variants

    Directory of Open Access Journals (Sweden)

    Anthony J. Greenberg

    2018-05-01

Full Text Available Explosive growth in the amount of genomic data is matched by increasing power of consumer-grade computers. Even applications that require powerful servers can be quickly tested on desktop or laptop machines if we can generate representative samples from large data sets. I describe a fast and memory-efficient implementation of an on-line sampling method developed for tape drives 30 years ago. Focusing on genotype files, I test the performance of this technique on modern solid-state and spinning hard drives, and show that it performs well compared to a simple sampling scheme. I illustrate its utility by developing a method to quickly estimate genome-wide patterns of linkage disequilibrium (LD) decay with distance. I provide open-source software that samples loci from several variant format files, a separate program that performs LD decay estimates, and a C++ library that lets developers incorporate these methods into their own projects.

  20. Fast Ordered Sampling of DNA Sequence Variants.

    Science.gov (United States)

    Greenberg, Anthony J

    2018-05-04

    Explosive growth in the amount of genomic data is matched by increasing power of consumer-grade computers. Even applications that require powerful servers can be quickly tested on desktop or laptop machines if we can generate representative samples from large data sets. I describe a fast and memory-efficient implementation of an on-line sampling method developed for tape drives 30 years ago. Focusing on genotype files, I test the performance of this technique on modern solid-state and spinning hard drives, and show that it performs well compared to a simple sampling scheme. I illustrate its utility by developing a method to quickly estimate genome-wide patterns of linkage disequilibrium (LD) decay with distance. I provide open-source software that samples loci from several variant format files, a separate program that performs LD decay estimates, and a C++ library that lets developers incorporate these methods into their own projects. Copyright © 2018 Greenberg.
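
The on-line technique these records describe can be illustrated with the classic single-pass selection-sampling idea from the same family of tape-era algorithms. The sketch below is not the published implementation (which uses a faster skip-based variant); it is a minimal illustration, with hypothetical names, of how a uniform ordered sample can be drawn in one pass:

```python
import random

def ordered_sample(records, k, rng=random):
    """Draw a uniform random k-subset of `records` in a single pass,
    preserving their original order (classic selection sampling)."""
    n = len(records)
    chosen = []
    for seen, item in enumerate(records):
        # Keep this item with probability (still needed) / (still remaining);
        # this makes every k-subset of the n records equally likely.
        if rng.random() * (n - seen) < k - len(chosen):
            chosen.append(item)
        if len(chosen) == k:
            break
    return chosen
```

Because earlier records are never revisited, memory use stays at O(k) regardless of input size, which is what made such methods attractive for tape drives and makes them attractive for large variant files today.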

  1. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Directory of Open Access Journals (Sweden)

    Lauren Hund

Full Text Available Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.

  2. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Science.gov (United States)

    Hund, Lauren; Bedrick, Edward J; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.
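
The abstract notes that, when clustering can be ignored, the standard binomial model suffices for constructing LQAS decision rules. A minimal sketch of that baseline (the function names and thresholds here are illustrative, not taken from the paper) computes the two misclassification risks of a given design:

```python
from math import comb

def binom_tail(n, d, p):
    """P(X >= d) for X ~ Binomial(n, p): the probability that at least
    d of n sampled individuals are 'positive' (e.g. vaccinated)."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(d, n + 1))

def lqas_risks(n, d, p_low, p_high):
    """Risks of the rule 'accept the lot if at least d of n are positive':
      alpha - chance of rejecting a lot that meets the upper standard p_high
      beta  - chance of accepting a lot at the lower standard p_low
    """
    alpha = 1.0 - binom_tail(n, d, p_high)
    beta = binom_tail(n, d, p_low)
    return alpha, beta
```

For the classic vaccination design (n = 19, decision rule d = 13, coverage standards of 80% versus 50%), both risks come out below 10%, which is why such small lots are workable in the field.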

  3. Attenuation correction for the collimated gamma ray assay of cylindrical samples

    International Nuclear Information System (INIS)

    Patra, Sabyasachi; Agarwal, Chhavi; Goswami, A.; Gathibandhe, M.

    2015-01-01

The Hybrid Monte Carlo (HMC) method developed earlier for attenuation correction of non-collimated samples [Agarwal et al., 2008, Nucl. Instrum. Methods A 597, 198] has been extended to the segmented gamma ray assay of cylindrical samples. The method has been validated both experimentally and theoretically. For experimental validation, the results of HMC calculation have been compared with the experimentally obtained attenuation correction factors. The HMC attenuation correction factors have also been compared with the results obtained from the near-field and far-field formulae available in the literature at two sample-to-detector distances (10.3 cm and 20.4 cm). The method has been found to be valid at all sample-to-detector distances over a wide range of transmittance. On the other hand, the literature near-field and far-field formulae have been found to work over a limited range of sample-to-detector distances and transmittances. The HMC method has been further extended to circular collimated geometries where no analytical formula for attenuation correction exists. - Highlights: • Hybrid Monte Carlo method for attenuation correction developed for SGA system. • Method found to work for all sample-detector geometries for all transmittances. • The near-field formula applicable only beyond a certain sample-to-detector distance. • The far-field formula applicable only for higher transmittances (>18%). • Hybrid Monte Carlo method further extended to circular collimated geometry.
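
For intuition about what such an attenuation correction computes, here is a bare-bones Monte Carlo sketch for the far-field (parallel-beam) limit only. It is not the authors' hybrid method, which also handles near-field and collimated geometries; names and the geometry simplification are assumptions for illustration:

```python
import math
import random

def self_attenuation(mu, radius, n=50_000, rng=random):
    """Monte Carlo estimate of the mean gamma-ray transmission out of a
    cylindrical sample viewed far-field along +x (parallel-beam geometry).
    `mu` is the linear attenuation coefficient (1/cm), `radius` in cm.
    The attenuation correction factor is the reciprocal of this value."""
    total = 0.0
    for _ in range(n):
        # uniform emission point in the circular cross-section
        r = radius * math.sqrt(rng.random())
        theta = 2.0 * math.pi * rng.random()
        x, y = r * math.cos(theta), r * math.sin(theta)
        # chord length from the emission point to the +x surface
        path = math.sqrt(radius * radius - y * y) - x
        total += math.exp(-mu * path)
    return total / n
```

As mu times radius grows, the mean transmission falls and the required correction grows, which is the regime where the far-field formula cited in the abstract starts to break down.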

  4. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

Full Text Available Audit sampling involves the application of audit procedures to less than 100% of items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As an audit technique largely used, in both its statistical and nonstatistical form, the method is very important for auditors. It should be applied correctly for a fair view of financial statements, to satisfy the needs of all financial users. In order to be applied correctly the method must be understood by all its users and mainly by auditors. Otherwise the risk of not applying it correctly would cause loss of reputation and discredit, litigations and even prison. Since there is not a unitary practice and methodology for applying the technique, the risk of applying it incorrectly is pretty high. SWOT analysis is a technique that shows the advantages, disadvantages, threats and opportunities. We applied SWOT analysis in studying the sampling method, from the perspective of three players: the audit company, the audited entity and users of financial statements. The study shows that by applying the sampling method the audit company and the audited entity both save time, effort and money. The disadvantages of the method are the difficulty of applying and fully understanding it. Being largely used as an audit method and being a factor of a correct audit opinion, the sampling method’s advantages, disadvantages, threats and opportunities must be understood by auditors.

  5. Optimal experiment design in a filtering context with application to sampled network data

    OpenAIRE

    Singhal, Harsh; Michailidis, George

    2010-01-01

    We examine the problem of optimal design in the context of filtering multiple random walks. Specifically, we define the steady state E-optimal design criterion and show that the underlying optimization problem leads to a second order cone program. The developed methodology is applied to tracking network flow volumes using sampled data, where the design variable corresponds to controlling the sampling rate. The optimal design is numerically compared to a myopic and a naive strategy. Finally, w...

  6. Sample volume and alignment analysis for an optical particle counter sizer, and other applications

    International Nuclear Information System (INIS)

    Holve, D.J.; Davis, G.W.

    1985-01-01

    Optical methods for particle size distribution measurements in practical high temperature environments are approaching feasibility and offer significant advantages over conventional sampling methods. A key requirement of single particle counting techniques is the need to know features of the sample volume intensity distribution which in general are a function of the particle scattering properties and optical system geometry. In addition, the sample volume intensity distribution is sensitive to system alignment and thus calculations of alignment sensitivity are required for assessment of practical alignment tolerances. To this end, an analysis of sample volume characteristics for single particle counters in general has been developed. Results from the theory are compared with experimental measurements and shown to be in good agreement. A parametric sensitivity analysis is performed and a criterion for allowable optical misalignment is derived for conditions where beam steering caused by fluctuating refractive-index gradients is significant

  7. Standard practice for bulk sampling of liquid uranium hexafluoride

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2001-01-01

    1.1 This practice covers methods for withdrawing representative samples of liquid uranium hexafluoride (UF6) from bulk quantities of the material. Such samples are used for determining compliance with the applicable commercial specification, for example Specification C787 and Specification C996. 1.2 It is assumed that the bulk liquid UF6 being sampled comprises a single quality and quantity of material. This practice does not address any special additional arrangements that might be required for taking proportional or composite samples, or when the sampled bulk material is being added to UF6 residues already in a container (“heels recycle”). 1.3 The number of samples to be taken, their nominal sample weight, and their disposition shall be agreed upon between the parties. 1.4 The scope of this practice does not include provisions for preventing criticality incidents. 1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of th...

  8. CONVERGENCE OF INTERNATIONAL AUDIT STANDARDS AND AMERICAN AUDIT STANDARDS REGARDING SAMPLING

    OpenAIRE

    Chis Anca Oana; Danescu Tatiana

    2013-01-01

    Abstract: Sampling is widely used in market research, scientific analysis, market analysis, opinion polls and not least in the financial statement audit. We wonder what is actually sampling and how did it appear? Audit sampling involves the application of audit procedures to less than 100% of items within an account balance or class of transactions. Nowadays the technique is indispensable, the economic entities operating with sophisticated computer systems and large amounts of data. Economic ...

  9. Estimating Sample Size for Usability Testing

    Directory of Open Access Journals (Sweden)

    Alex Cazañas

    2017-02-01

Full Text Available One strategy used to assure that an interface meets user requirements is to conduct usability testing. When conducting such testing one of the unknowns is sample size. Since extensive testing is costly, minimizing the number of participants can contribute greatly to successful resource management of a project. Even though a significant number of models have been proposed to estimate sample size in usability testing, there is still no consensus on the optimal size. Several studies claim that 3 to 5 users suffice to uncover 80% of problems in a software interface. However, many other studies challenge this assertion. This study analyzed data collected from the user testing of a web application to verify the rule of thumb, commonly known as the “magic number 5”. The outcomes of the analysis showed that the 5-user rule significantly underestimates the required sample size to achieve reasonable levels of problem detection.
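
The "magic number 5" rests on the standard detection model P(detect) = 1 − (1 − p)^n, where p is the per-user probability of encountering a given problem. A small sketch (function names are illustrative) makes the abstract's point concrete: 5 users reach 80% detection only when p is large, while rarer problems need far more participants.

```python
import math

def detection_probability(p, n):
    """Probability that at least one of n independent users
    encounters a problem each hits with probability p."""
    return 1.0 - (1.0 - p) ** n

def users_needed(p, target):
    """Smallest n for which the detection probability reaches `target`."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))
```

With the often-quoted average of p = 0.31, `users_needed(0.31, 0.80)` returns 5, but for a rarer problem with p = 0.10 it returns 16, which is the sense in which the 5-user rule underestimates the required sample size.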

  10. Dielectric sample with two-layer charge distribution for space charge calibration purposes

    DEFF Research Database (Denmark)

    Holbøll, Joachim; Henriksen, Mogens; Rasmussen, C.

    2002-01-01

    In the present paper is described a dielectric test sample with two very narrow concentrations of bulk charges, achieved by two internal electrodes not affecting the acoustical properties of the sample, a fact important for optimal application of most space charge measuring systems. Space charge...

  11. Feynman diagrams sampling for quantum field theories on the QPACE 2 supercomputer

    Energy Technology Data Exchange (ETDEWEB)

    Rappl, Florian

    2016-08-01

This work discusses the application of Feynman diagram sampling in quantum field theories. The method uses a computer simulation to sample the diagrammatic space obtained in a series expansion. Running large physical simulations requires powerful computers, effectively splitting the thesis into two parts. The first part deals with the method of Feynman diagram sampling. Here the theoretical background of the method itself is discussed. Additionally, important statistical concepts and the theory of the strong force, quantum chromodynamics, are introduced. This sets the context of the simulations. We create and evaluate a variety of models to estimate the applicability of diagrammatic methods. The method is then applied to sample the perturbative expansion of the vertex correction. In the end we obtain the value for the anomalous magnetic moment of the electron. The second part looks at the QPACE 2 supercomputer. This includes a short introduction to supercomputers in general, as well as a closer look at the architecture and the cooling system of QPACE 2. Benchmark results for the InfiniBand network are presented. At the core of this part, a collection of best practices and useful programming concepts is outlined, which enables the development of efficient, yet easily portable, applications for the QPACE 2 system.

  12. Integrated sampling vs ion chromatography: Mathematical considerations

    International Nuclear Information System (INIS)

    Sundberg, L.L.

    1992-01-01

This paper presents some general purpose considerations that can be utilized when comparisons are made between the results of integrated sampling over several hours or days, and ion chromatography where sample collection times are measured in minutes. The discussion is geared toward the measurement of soluble transition metal ions in BWR feedwater. Under steady-state conditions, the concentrations reported by both techniques should be in reasonable agreement. Transient operations affect both types of measurements. A simplistic model, applicable to both sampling techniques, is presented that demonstrates the effect of transients which occur during the acquisition of a steady-state sample. For a common set of conditions, the integrated concentration is proportional to the concentration and duration of the transient, and inversely proportional to the sample collection time. The adjustment of the collection period during a known transient allows an estimation of peak transient concentration. Though the probability of sampling a random transient with the integrated sampling technique is very high, the magnitude is severely diluted with long integration times. Transient concentrations are magnified with ion chromatography, but the probability of sampling a transient is significantly lower using normal ion chromatography operations. Various data averaging techniques are discussed for integrated sampling and IC determinations. The use of time-weighted averages appears to offer advantages over arithmetic and geometric means for integrated sampling when the collection period is variable. For replicate steady-state ion chromatography determinations which bracket a transient sample, it may be advantageous to ignore the calculation of averages, and report the data as trending information only.
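
The "simplistic model" is not spelled out in the abstract, but its stated proportionalities (linear in the transient's concentration and duration, inverse in the collection time) are reproduced by assuming a square-wave transient riding on a steady-state baseline. The function below is that assumed sketch, with hypothetical names, not the paper's own formulation:

```python
def integrated_concentration(c_ss, c_transient, t_transient, t_collect):
    """Concentration reported by an integrated sample whose collection
    window of length t_collect spans one square-wave transient of
    concentration c_transient lasting t_transient, on a steady-state
    baseline c_ss: the transient's excess is diluted over the window."""
    excess = (c_transient - c_ss) * t_transient / t_collect
    return c_ss + excess
```

Rearranging for c_transient shows the point made in the abstract: if the collection period is shortened around a known transient, the peak transient concentration can be estimated from the integrated result.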

  13. Multiple category-lot quality assurance sampling: a new classification system with application to schistosomiasis control.

    Directory of Open Access Journals (Sweden)

    Casey Olives

Full Text Available Originally a binary classifier, Lot Quality Assurance Sampling (LQAS) has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%, >10% and <50%, ≥50%), and semi-curtailed sampling has been shown to effectively reduce the number of observations needed to reach a decision. To date the statistical underpinnings for Multiple Category-LQAS (MC-LQAS) have not received full treatment. We explore the analytical properties of MC-LQAS, and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa. We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n=15 and n=25 via a weighted kappa-statistic using S. mansoni data collected in 388 schools from four studies in East Africa. Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87). In three of the studies, the kappa-statistic for a design with n=15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50), the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error. This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools.

  14. Application of Artificial Neural Networks to the Analysis of NORM Samples; Aplicación de las Redes Neuronales al Análisis de Muestras NORM

    Energy Technology Data Exchange (ETDEWEB)

    Moser, H.; Peyrés, V.; Mejuto, M.; García-Toraño, E.

    2015-07-01

This work describes the application of artificial neural networks (ANNs) to analyze the raw data of gamma-ray spectra of NORM samples and decide whether the activity content of a certain nuclide is above or below the exemption limit of 1 Bq/g. The main advantage of using an ANN for this purpose is that the user needs no specialized knowledge in the field of gamma-ray spectrometry. In total, 635 spectra, covering varying activity concentrations, seven different materials and three densities each, were generated by Monte Carlo simulation to provide training material for the ANN. These spectra were created using the simulation code PENELOPE. Validation was carried out with a number of NORM samples previously characterized by conventional gamma-ray spectrometry with peak fitting.

  15. System for Earth Sample Registration SESAR: Services for IGSN Registration and Sample Metadata Management

    Science.gov (United States)

    Chan, S.; Lehnert, K. A.; Coleman, R. J.

    2011-12-01

    through web services. Based on valuable feedback from the user community, we will introduce enhancements that add greater flexibility to the system to accommodate the vast diversity of metadata that users want to store. Users will be able to create custom metadata fields and use these for the samples they register. Users will also be able to group samples into 'collections' to make retrieval for research projects or publications easier. An improved interface design will allow for better workflow transition and navigation throughout the application. In keeping up with the demands of a growing community, SESAR has also made process changes to ensure efficiency in system development. For example, we have implemented a release cycle to better track enhancements and fixes to the system, and an API library that facilitates reusability of code. Usage tracking, metrics and surveys capture information to guide the direction of future developments. A new set of administrative tools allows greater control of system management.

  16. ACS sampling system: design, implementation, and performance evaluation

    Science.gov (United States)

    Di Marcantonio, Paolo; Cirami, Roberto; Chiozzi, Gianluca

    2004-09-01

    By means of ACS (ALMA Common Software) framework we designed and implemented a sampling system which allows sampling of every Characteristic Component Property with a specific, user-defined, sustained frequency limited only by the hardware. Collected data are sent to various clients (one or more Java plotting widgets, a dedicated GUI or a COTS application) using the ACS/CORBA Notification Channel. The data transport is optimized: samples are cached locally and sent in packets with a lower and user-defined frequency to keep network load under control. Simultaneous sampling of the Properties of different Components is also possible. Together with the design and implementation issues we present the performance of the sampling system evaluated on two different platforms: on a VME based system using VxWorks RTOS (currently adopted by ALMA) and on a PC/104+ embedded platform using Red Hat 9 Linux operating system. The PC/104+ solution offers, as an alternative, a low cost PC compatible hardware environment with free and open operating system.

  17. Sampling in forests for radionuclide analysis. General and practical guidance

    Energy Technology Data Exchange (ETDEWEB)

    Aro, Lasse (Finnish Forest Research Inst. (METLA) (Finland)); Plamboeck, Agneta H. (Swedish Defence Research Agency (FOI) (Sweden)); Rantavaara, Aino; Vetikko, Virve (Radiation and Nuclear Safety Authority (STUK) (Finland)); Straalberg, Elisabeth (Inst. Energy Technology (IFE) (Norway))

    2009-01-15

    The NKS project FOREST was established to prepare a guide for sampling in forest ecosystems for radionuclide analysis. The aim of this guide is to improve the reliability of datasets generated in future studies by promoting the use of consistent, recommended practices, thorough documentation of field sampling regimes and robust preparation of samples from the forest ecosystem. The guide covers general aims of sampling, the description of major compartments of the forest ecosystem and outlines key factors to consider when planning sampling campaigns for radioecological field studies in forests. Recommended and known sampling methods for various sample types are also compiled and presented. The guide focuses on sampling practices that are applicable in various types of boreal forests, robust descriptions of sampling sites, and documentation of the origin and details of individual samples. The guide is intended for scientists, students, forestry experts and technicians who appreciate the need to use sound sampling procedures in forest radioecological projects. The guide will hopefully encourage readers to participate in field studies and sampling campaigns, using robust techniques, thereby fostering competence in sampling. (au)

  18. Sampling in forests for radionuclide analysis. General and practical guidance

    International Nuclear Information System (INIS)

    Aro, Lasse; Plamboeck, Agneta H.; Rantavaara, Aino; Vetikko, Virve; Straelberg, Elisabeth

    2009-01-01

    The NKS project FOREST was established to prepare a guide for sampling in forest ecosystems for radionuclide analysis. The aim of this guide is to improve the reliability of datasets generated in future studies by promoting the use of consistent, recommended practices, thorough documentation of field sampling regimes and robust preparation of samples from the forest ecosystem. The guide covers general aims of sampling, the description of major compartments of the forest ecosystem and outlines key factors to consider when planning sampling campaigns for radioecological field studies in forests. Recommended and known sampling methods for various sample types are also compiled and presented. The guide focuses on sampling practices that are applicable in various types of boreal forests, robust descriptions of sampling sites, and documentation of the origin and details of individual samples. The guide is intended for scientists, students, forestry experts and technicians who appreciate the need to use sound sampling procedures in forest radioecological projects. The guide will hopefully encourage readers to participate in field studies and sampling campaigns, using robust techniques, thereby fostering competence in sampling. (au)

  19. Large Sample Neutron Activation Analysis: A Challenge in Cultural Heritage Studies

    International Nuclear Information System (INIS)

    Stamatelatos, I.E.; Tzika, F.

    2007-01-01

Large sample neutron activation analysis complements and significantly extends the analytical tools available for cultural heritage and authentication studies, providing unique applications of non-destructive, multi-element analysis of materials that are too precious to damage for sampling purposes, representative sampling of heterogeneous materials, or even analysis of whole objects. In this work, correction factors for neutron self-shielding, gamma-ray attenuation and volume distribution of the activity in large volume samples composed of iron and ceramic material were derived. Moreover, the effect of inhomogeneity on the accuracy of the technique was examined.

  20. ANALYSIS OF MONTE CARLO SIMULATION SAMPLING TECHNIQUES ON SMALL SIGNAL STABILITY OF WIND GENERATOR- CONNECTED POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    TEMITOPE RAPHAEL AYODELE

    2016-04-01

    Monte Carlo simulation using the Simple Random Sampling (SRS) technique is popularly known for its ability to handle complex uncertainty problems. However, to produce a reasonable result it requires a huge sample size, which makes it computationally expensive, time consuming and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness and speed for a small signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (the Single Machine Infinite Bus and the IEEE 16-machine, 68-bus test system). The accuracy of the two sampling techniques is determined by comparing their different sample sizes with the IDEAL (conventional) result. The robustness is determined from the variance reduction observed when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. The results show that sample sizes generated from LHS produce the same result as the IDEAL values starting from a sample size of 100, indicating that about 100 random variables generated using the LHS method are enough to produce reasonable results for practical purposes in small signal stability applications. LHS also has the least variance when the experiment is repeated 100 times, signifying its robustness over SRS. A sample size of 100 from LHS reproduces the result of the conventional method with a sample size of 50,000; this reduced sample size gives LHS a computational speed advantage (about six times) over the conventional method.
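
    The variance-reduction advantage of LHS over SRS that the abstract reports can be illustrated with a much-simplified, one-dimensional sketch (a hypothetical toy integrand, not the paper's eigenvalue-based stability analysis):

```python
import random
import statistics

def srs(n):
    # Simple Random Sampling: n independent uniform draws on [0, 1)
    return [random.random() for _ in range(n)]

def lhs(n):
    # Latin Hypercube Sampling in one dimension: one draw from each of
    # n equal-width strata, so the sample covers [0, 1) evenly
    return [(i + random.random()) / n for i in range(n)]

def estimates(sampler, n, repeats):
    # Repeat the experiment and collect Monte Carlo estimates of E[x^2] = 1/3
    return [statistics.mean(x * x for x in sampler(n)) for _ in range(repeats)]

random.seed(42)
srs_est = estimates(srs, 100, 100)
lhs_est = estimates(lhs, 100, 100)
# The LHS estimates scatter far less around 1/3 than the SRS estimates do
```

For this smooth integrand the between-run spread of the LHS estimates is well over an order of magnitude smaller than for SRS at the same sample size, mirroring the robustness comparison reported in the article.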

  1. The Toggle Local Planner for sampling-based motion planning

    KAUST Repository

    Denny, Jory; Amato, Nancy M.

    2012-01-01

    Sampling-based solutions to the motion planning problem, such as the probabilistic roadmap method (PRM), have become commonplace in robotics applications. These solutions are the norm as the dimensionality of the planning space grows, i.e., d > 5

  2. Electrothermal vaporisation ICP-mass spectrometry (ETV-ICP-MS) for the determination and speciation of trace elements in solid samples - A review of real-life applications from the author's lab

    Energy Technology Data Exchange (ETDEWEB)

    Vanhaecke, Frank; Resano, Martin; Moens, Luc [Laboratory of Analytical Chemistry, Ghent University, Institute for Nuclear Sciences, Proeftuinstraat 86, 9000 Ghent (Belgium)

    2002-09-01

    The use of electrothermal vaporisation (ETV) from a graphite furnace as a means of sample introduction in inductively coupled plasma mass spectrometry (ICP-MS) permits the direct analysis of solid samples. A multi-step furnace temperature programme is used to separate the vaporisation of the target element(s) from that of the matrix components. Sometimes a chemical modifier is used to enable a higher thermal pre-treatment temperature, by avoiding premature analyte losses (stabilisation) or by promoting the selective volatilisation of matrix components. In almost all instances, accurate results can be obtained via external calibration or single standard addition using an aqueous standard solution. Absolute limits of detection are typically 1 pg, which corresponds to 1 ng/g for a typical sample mass of 1 mg. Real-life applications carried out in the author's lab are used to illustrate the utility of this approach. These applications aim at trace element determination in industrial and environmental materials. The industrial materials analysed include different types of plastics - Carilon, polyethylene, poly(ethylene terephthalate) and polyamide - and photo- and thermographic materials. As samples of environmental origin, plant material, animal tissue and sediments were investigated. Some applications aimed at multi-element determination, while in others the content of a single, but often challenging, element (e.g., Si or S) had to be measured. ETV-ICP-MS was also used in elemental speciation studies. Separation of Se-containing proteins was accomplished using polyacrylamide gel electrophoresis (PAGE); subsequent quantification of the Se content in the protein spots was carried out using ETV-ICP-MS. As the volatilisation of methylmercury and inorganic mercury could be separated in time, no chromatographic or electrophoretic separation procedure was required: ETV-ICP-MS as such sufficed for Hg speciation in fish tissue.

  3. Analysis of the Raven CPM Subtest Scores for a Sample of Gifted Children.

    Science.gov (United States)

    Kluever, Raymond C.; Green, Kathy E.

    The inter-subject/intra-subject subtest patterns (profiles) of the same sample of gifted children were examined based on factors found in a previous study of the Raven Coloured Progressive Matrices Test (CPM) that investigated structural properties with specific application to a sample of gifted children. The sample consisted of 166 children (78…

  4. Study of probe-sample distance for biomedical spectra measurement

    Directory of Open Access Journals (Sweden)

    Li Lei

    2011-11-01

    Background: Fiber-based optical spectroscopy has been widely used for biomedical applications. However, the effect of probe-sample distance on the collection efficiency has not been well investigated. Method: In this paper, we present a theoretical model to maximize the illumination and collection efficiency when designing fiber optic probes for biomedical spectral measurement. This model is in general applicable to probes with single or multiple fibers at an arbitrary incident angle. In order to demonstrate the theory, a fluorescence spectrometer was used to measure the fluorescence of human finger skin at various probe-sample distances; the fluorescence spectrum and the total fluorescence intensity were recorded. Results: The theoretical results show that for single-fiber probes, contact measurement always provides the best results, while for multi-fiber probes there is an optimal probe distance. When a 400-μm excitation fiber is used to deliver the light to the skin and six 400-μm fibers surrounding the excitation fiber are used to collect the fluorescence signal, the experimental results show that human finger skin has very strong fluorescence between 475 nm and 700 nm under 450 nm excitation. The fluorescence intensity depends heavily on the probe-sample distance, and there is an optimal probe distance. Conclusions: We investigated a number of probe-sample configurations and found that contact measurement could be the primary choice for single-fiber probes but is very inefficient for multi-fiber probes, for which there is an optimal probe-sample distance. By carefully choosing the probe-sample distance, the collection efficiency could be enhanced by 5-10 times. Our experiments demonstrated that the probe-sample distance dependence of collection efficiency in multi-fiber probes is in general agreement with our theory.

  5. A support vector density-based importance sampling for reliability assessment

    International Nuclear Information System (INIS)

    Dai, Hongzhe; Zhang, Hao; Wang, Wei

    2012-01-01

    An importance sampling method based on adaptive Markov chain simulation and support vector density estimation is developed in this paper for efficient structural reliability assessment. The methodology involves the generation of samples that can adaptively populate the important region via the adaptive Metropolis algorithm, and the construction of the importance sampling density by support vector density estimation. The use of the adaptive Metropolis algorithm effectively improves the convergence and stability of the classical Markov chain simulation, and the support vector density can approximate the sampling density with fewer samples than conventional kernel density estimation. The proposed importance sampling method can effectively reduce the number of structural analyses required to achieve a given accuracy. Examples involving both numerical and practical structural problems are given to illustrate the application and efficiency of the proposed methodology.
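
    The core idea, biasing the sampling density toward the important (failure) region and re-weighting by the likelihood ratio, can be sketched in a few lines. The example below is a plain importance sampler with a hand-picked Gaussian proposal, not the paper's adaptive Metropolis / support vector density construction:

```python
import math
import random

def tail_prob_importance_sampling(threshold, shift, n, seed=1):
    # Estimate p = P(X > threshold) for X ~ N(0, 1) using a proposal
    # N(shift, 1) centred near the failure region, so rare "failures" are
    # sampled often and then down-weighted by the likelihood ratio.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(shift, 1.0)          # draw from the proposal
        if y > threshold:                  # indicator of the failure event
            # likelihood ratio phi(y) / phi(y - shift)
            total += math.exp(-y * y / 2) / math.exp(-(y - shift) ** 2 / 2)
    return total / n

p_hat = tail_prob_importance_sampling(3.0, 3.0, 20_000)
p_true = 0.5 * math.erfc(3.0 / math.sqrt(2))   # exact N(0, 1) tail probability
```

Crude Monte Carlo would need millions of samples to resolve a probability of roughly 1.3e-3 to the same relative accuracy; here 20,000 weighted samples suffice.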

  6. Controlling a sample changer using the integrated counting system

    International Nuclear Information System (INIS)

    Deacon, S.; Stevens, M.P.

    1985-06-01

    Control of the Sample Changer from a counting system can be achieved by using a Scaler Timer type 6255 and Sample Changer Control Interface type 6263. The interface used, however, has quite complex circuitry. The application therefore lends itself to the use of another 6000 Series module, the Integrated Counting System (ICS). Using this unit, control is carried out through a control program written in BASIC for the Commodore PET (or any other device with an IEEE-488 interface). The ICS then controls the sample changer through an interface unit which is relatively simple. A brief description of how the ICS controls the sample changer is given. The control program is then described: first the running options are given, followed by a program description, listing and flowchart. (author)

  7. Controlling a sample changer using the integrated counting system

    Energy Technology Data Exchange (ETDEWEB)

    Deacon, S; Stevens, M P

    1985-06-01

    Control of the Sample Changer from a counting system can be achieved by using a Scaler Timer type 6255 and Sample Changer Control Interface type 6263. The interface used, however, has quite complex circuitry. The application therefore lends itself to the use of another 6000 Series module-the Integrated Counting System (ICS). Using this unit control is carried out through a control program written in BASIC for the Commodore PET (or any other device with an IEEE-488 interface). The ICS then controls the sample changer through an interface unit which is relatively simple. A brief description of how ICS controls the sample changer is given. The control program is then described; first the running options are given, followed by a program description listing and flowchart.

  8. Application of instrument neutron-activation analysis in a comparative investigation of soil samples

    Energy Technology Data Exchange (ETDEWEB)

    Dimitrov, D. (Institute po Kriminalistika i Kriminologiya, Sofia (Bulgaria))

    1983-01-01

    A quantitative measurement of the contents of 17 chemical elements in soil samples, collected using the existing network from the surface of five cultivated areas in Bulgaria has been carried out. The values obtained have been used to calculate the evaluations psub(i) of the dispersions and for the ordering of the chemical elements according to their importance in criminology. The possibility for criminological comparison of single soil samples using the contents of the five most important elements - Th, Fe, Sc, Ce and Mn has been shown.

  9. Bayesian stratified sampling to assess corpus utility

    Energy Technology Data Exchange (ETDEWEB)

    Hochberg, J.; Scovel, C.; Thomas, T.; Hall, S.

    1998-12-01

    This paper describes a method for asking statistical questions about a large text corpus. The authors exemplify the method by addressing the question, "What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?" They estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Bayesian analysis and stratified sampling are used to reduce the sampling uncertainty of the estimate from over 3,100 documents to fewer than 1,000. A possible application of the method is to establish baseline statistics used to estimate recall rates for information retrieval systems.
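
    The estimator behind such a design can be sketched with conjugate Beta-Binomial updates per stratum. The corpus strata, weights and counts below are invented for illustration:

```python
from fractions import Fraction

def beta_posterior(successes, n):
    # Beta(1, 1) prior; the posterior is Beta(1 + successes, 1 + failures).
    # Returns the posterior mean and variance of the stratum proportion.
    a = Fraction(1 + successes)
    b = Fraction(1 + n - successes)
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var

# Two hypothetical strata of a corpus: population weights and evaluated docs
strata = [
    {"weight": Fraction(3, 4), "real": 90, "n": 100},  # e.g. long documents
    {"weight": Fraction(1, 4), "real": 20, "n": 100},  # e.g. short documents
]

# Stratified posterior mean and variance of the overall proportion:
# weight the stratum means; weight the stratum variances by weight^2
mean = sum(s["weight"] * beta_posterior(s["real"], s["n"])[0] for s in strata)
var = sum(s["weight"] ** 2 * beta_posterior(s["real"], s["n"])[1] for s in strata)
```

Because the strata are homogeneous internally, the combined variance is smaller than a pooled sample of the same total size would give, which is the effect the paper exploits.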

  10. Trace element analysis of environmental samples by multiple prompt gamma-ray analysis method

    International Nuclear Information System (INIS)

    Oshima, Masumi; Matsuo, Motoyuki; Shozugawa, Katsumi

    2011-01-01

    The multiple γ-ray detection method has proven to be a high-resolution, high-sensitivity method for nuclide quantification. The neutron prompt γ-ray analysis method is successfully extended by combining it with multiple γ-ray detection, a combination called multiple prompt γ-ray analysis (MPGA). In this review we present the principle of this method and its characteristics. Several examples of its application to environmental samples, in particular river sediments from urban areas and sea sediment samples, are also described. (author)

  11. Multiple category-lot quality assurance sampling: a new classification system with application to schistosomiasis control.

    Science.gov (United States)

    Olives, Casey; Valadez, Joseph J; Brooker, Simon J; Pagano, Marcello

    2012-01-01

    Originally a binary classifier, Lot Quality Assurance Sampling (LQAS) has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%, >10% and <50%, ≥50%), yet the analytical properties of multiple category-LQAS (MC-LQAS) have not received full treatment. We explore the analytical properties of MC-LQAS, and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa. We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n=15 and n=25 via a weighted kappa-statistic using S. mansoni data collected in 388 schools from four studies in East Africa. Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87). In three of the studies, the kappa-statistic for a design with n=15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50), the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error. This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools.
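
    The operating characteristic (OC) curve of a single binary LQAS rule, the building block that MC-LQAS stacks into multiple categories, follows directly from the binomial distribution. A minimal sketch, where the sample size n=15 matches the abstract but the decision threshold d=1 and the prevalence values are illustrative, not the paper's validated designs:

```python
from math import comb

def oc_curve(p, n, d):
    # Probability that a binary LQAS rule classifies a school as "above
    # threshold": more than d of the n sampled children test positive,
    # given true prevalence p.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d + 1, n + 1))

# Chance of an "above 10%" classification with n=15, d=1 at two true levels
low = oc_curve(0.05, 15, 1)    # school truly at 5%: misclassification risk
high = oc_curve(0.30, 15, 1)   # school truly at 30%: almost always flagged
```

Plotting `oc_curve` over a grid of p values gives the full OC curve; MC-LQAS applies several such rules with different thresholds to carve the prevalence range into the multiple categories.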

  12. New materials for sample preparation techniques in bioanalysis.

    Science.gov (United States)

    Nazario, Carlos Eduardo Domingues; Fumes, Bruno Henrique; da Silva, Meire Ribeiro; Lanças, Fernando Mauro

    2017-02-01

    The analysis of biological samples is a complex and difficult task owing to two basic and complementary issues: the high complexity of most biological matrices and the need to determine minute quantities of active substances and contaminants in such complex samples. To succeed in this endeavor, samples are usually subjected to three steps of a comprehensive analytical methodology: sample preparation, analyte isolation (usually utilizing a chromatographic technique) and qualitative/quantitative analysis (usually with the aid of mass spectrometric tools). Owing to the complex nature of bio-samples, and the very low concentrations of the target analytes to be determined, selective sample preparation techniques are mandatory in order to overcome the difficulties imposed by these two constraints. During the last decade, new chemical synthesis approaches have been developed and optimized, such as sol-gel and molecular imprinting technologies, allowing the preparation of novel materials for sample preparation including graphene and its derivatives, magnetic materials, ionic liquids, molecularly imprinted polymers, and more. In this contribution we review these novel techniques and materials, as well as their application to the bioanalysis niche. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Tooth width predictions in a sample of Black South Africans.

    Science.gov (United States)

    Khan, M I; Seedat, A K; Hlongwa, P

    2007-07-01

    Space analysis during the mixed dentition requires prediction of the mesiodistal widths of the unerupted permanent canines and premolars, and prediction tables and equations may be used for this purpose. The Tanaka and Johnston prediction equations, which were derived from a North American White sample, are one widely used example. These equations may be inapplicable to other race groups due to racial tooth-size variability. The purpose of this study was therefore to derive prediction equations applicable to Black South African subjects. One hundred and ten pre-treatment study casts of Black South African subjects were analysed from the Department of Orthodontics' records at the University of Limpopo. The sample was equally divided by gender, with all subjects having a Class I molar relationship and relatively well aligned teeth. The mesiodistal widths of the maxillary and mandibular canines and premolars were measured with a digital vernier calliper and compared with the measurements predicted by the Tanaka and Johnston equations. The relationships between the measured and predicted values were analysed by correlation and regression analyses. The results indicated that the Tanaka and Johnston prediction equations were not fully applicable to the Black South African sample: they tended to underpredict in the male sample, while slight overprediction was observed in the female sample. Therefore, new equations were formulated and proposed that would be accurate for Black subjects.
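
    A Tanaka/Johnston-style prediction equation is essentially an ordinary least-squares line relating an already-erupted tooth dimension to the unerupted segment width. A sketch with invented measurements (the actual study fits gender-specific equations to 110 casts):

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = a + b*x, the form of a
    # Tanaka/Johnston-style prediction equation
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Invented data: sum of erupted mandibular incisor widths (mm) versus the
# measured canine + premolar segment width (mm) for five subjects
incisors = [21.0, 22.5, 23.0, 24.0, 25.5]
segment = [20.4, 21.2, 21.4, 22.0, 22.7]

a, b = fit_line(incisors, segment)
predicted = a + b * 23.5   # prediction for a new patient with 23.5 mm incisors
```

Re-fitting `a` and `b` on a population-specific sample is exactly how the new equations proposed in the study replace the North American coefficients.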

  14. Optimal sampling schemes for vegetation and geological field visits

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2012-07-01

    The presentation made to the Wits Statistics Department covered common classification methods used in the field of remote sensing, and the use of remote sensing to design optimal sampling schemes for field visits with applications in vegetation...

  15. IP Sample Plan #1 | NCI Technology Transfer Center | TTC

    Science.gov (United States)

    Sample letter that shows how Universities including co-investigators, consultants, and collaborators can describe a data and research tool sharing plan and procedures for exercising intellectual property rights. The letter is to be used as part of the University's application

  16. Application of instrument neutron-activation analysis in a comparative investigation of soil samples

    International Nuclear Information System (INIS)

    Dimitrov, D.

    1983-01-01

    A quantitative measurement of the contents of 17 chemical elements in soil samples, collected using the existing network from the surface of five cultivated areas in Bulgaria has been carried out. The values obtained have been used to calculate the evaluations psub(i) of the dispersions and for the ordering of the chemical elements according to their importance in criminology. The possibility for criminological comparison of single soil samples using the contents of the five most important elements - Th, Fe, Sc, Ce and Mn has been shown. (author)

  17. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2013-05-04

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation. However, often in applications the overall flow in the low-resolution simulation that an animator observes and intends to preserve is composed of even lower frequencies than the low resolution itself. In such cases, attempting to match the low-resolution simulation precisely is unnecessarily restrictive. We propose a new sampling technique to efficiently capture the overall flow of a fluid simulation, at the scale of the user's choice, in such a way that the sampled information is sufficient to represent what is virtually perceived and no more. Thus, by applying control based on the sampled data, we ensure that in the resulting high-resolution simulation, the overall flow is matched to the low-resolution simulation and the fine details on the high resolution are preserved. The samples we obtain have both spatial and temporal continuity that allows smooth keyframe matching and direct manipulation of visible elements such as smoke density through temporal blending of samples. We demonstrate that a user can easily configure a simulation with our system to achieve desired results. © 2013 Springer-Verlag Berlin Heidelberg.

  18. 12 CFR Appendix B to Part 707 - Model Clauses and Sample Forms

    Science.gov (United States)

    2010-01-01

    ...) B-8—Sample Form (Money Market Share Account Disclosures) B-9—Sample Form (Term Share (Certificate... money market accounts. These are some of the more common limitations applicable. The credit union... will assist members in reviewing and understanding the change. B-3Model Clauses for Pre-Maturity...

  19. Highly photoluminescent MoO{sub x} quantum dots: Facile synthesis and application in off-on Pi sensing in lake water samples

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, Sai Jin [Jiangxi Key Laboratory of Mass Spectrometry and Instrumentation, East China University of Technology (ECUT), Nanchang 330013 (China); School of Chemistry, Biology and Material Science, ECUT, Nanchang 330013 (China); Zhao, Xiao Jing; Zuo, Jun [School of Chemistry, Biology and Material Science, ECUT, Nanchang 330013 (China); Huang, Hai Qing [State Key Laboratory Breeding Base of Nuclear Resources and Environment, ECUT, Nanchang 330013 (China); Zhang, Li, E-mail: zhangli8@ncu.edu.cn [College of Chemistry, Nanchang University, Nanchang 330031 (China)

    2016-02-04

    Molybdenum oxide (MoO{sub x}) is a well-studied transition-metal semiconductor material with a wider band gap than MoS{sub 2}, which makes it a promising versatile probe in a variety of fields, such as gas sensing, catalysis and energy storage. However, few photoluminescent MoO{sub x} nanomaterials have been reported to date, let alone applied as photoluminescent probes. Herein, a one-pot method is developed for the facile synthesis of highly photoluminescent MoO{sub x} quantum dots (MoO{sub x} QDs), in which commercial molybdenum disulfide powder and hydrogen peroxide (H{sub 2}O{sub 2}) serve as the precursor and oxidant, respectively. Compared with current synthesis methods, the proposed one is rapid, one-pot, easily performed and environmentally friendly, and yields strong photoluminescence. The obtained MoO{sub x} QDs are further utilized as an efficient photoluminescent probe, and a new off-on sensor has been constructed for phosphate (Pi) determination in complicated lake water samples, exploiting the fact that the binding affinity of Eu{sup 3+} ions for the oxygen atoms of Pi is much higher than for those on the surface of the MoO{sub x} QDs. Under the optimal conditions, a good linear relationship was found between the enhanced photoluminescence intensity and Pi concentration in the range of 0.1–160.0 μM, with a detection limit of 56 nM (3σ/k). This first application of photoluminescent MoO{sub x} nanomaterials for ion photochemical sensing will open the door to employing MoO{sub x} nanomaterials as versatile probes in a variety of fields, such as chemi-/bio-sensors, cell imaging and biomedicine. - Highlights: • Though increasing effort has been devoted to MoO{sub x} nanomaterials synthesis, only a few reports mentioning its photoluminescence property are available, while even no evidence has shown its applications in chemical and biological sensing. • Herein, a one-pot method possessing the

  20. Safety evaluation of small samples for isotope production

    International Nuclear Information System (INIS)

    Sharma, Archana; Singh, Tej; Varde, P.V.

    2015-09-01

    Radioactive isotopes are widely used in basic and applied science and engineering, most notably as environmental and industrial tracers and for medical imaging procedures. Production of radioisotopes constitutes an important activity of the Indian nuclear program. Since its initial criticality, the DHRUVA reactor has facilitated the regular supply of most of the radioisotopes required in the country for applications in the fields of medicine, industry and agriculture. In-pile irradiation of samples requires a prior estimation of the sample reactivity load, heating rate, activity developed and the shielding thickness required for post-irradiation handling. This report highlights the contributions of the DHRUVA reactor and explains in detail the methodologies used in the safety evaluation of in-pile irradiation samples. (author)

  1. HASE - The Helsinki adaptive sample preparation line

    Energy Technology Data Exchange (ETDEWEB)

    Palonen, V., E-mail: vesa.palonen@helsinki.fi [Department of Physics, University of Helsinki, P.O. Box 43, FI-00014 (Finland); Pesonen, A. [Laboratory of Chronology, Finnish Museum of Natural History, P.O. Box 64, FI-00014 (Finland); Herranen, T.; Tikkanen, P. [Department of Physics, University of Helsinki, P.O. Box 43, FI-00014 (Finland); Oinonen, M. [Laboratory of Chronology, Finnish Museum of Natural History, P.O. Box 64, FI-00014 (Finland)

    2013-01-15

    We have designed and built an adaptive sample preparation line with separate modules for combustion, molecular sieve handling, CO{sub 2} gas cleaning, CO{sub 2} storage, and graphitization. The line is also connected to an elemental analyzer. Operation of the vacuum equipment, a flow controller, pressure sensors, ovens, and graphitization reactors is automated with a reliable NI-cRIO real-time system. Stepped combustion can be performed in two ovens at temperatures up to 900 °C. Depending on the application, CuO or O{sub 2}-flow combustion can be used. A flow controller is used to adjust the O{sub 2} flow and pressure during combustion. For environmental samples, a module for molecular sieve regeneration and sample desorption is attached to the line, replacing the combustion module. In the storage module, CO{sub 2} samples can be stored behind a gas-tight diaphragm valve, either for later graphitization or for measurement with separate equipment (an AMS gas ion source or a separate mass spectrometer). The graphitization module consists of four automated reactors, capable of graphitizing samples with masses from 3 mg down to 50 µg.

  2. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and the presence of several interferences. Sample preparation is a critical step and the main source of uncertainty in the analysis of environmental samples, and it is usually laborious, costly, time consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review presents an overview of the fundamentals, procedures, and applications of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  3. Hydrogeologic applications for historical records and images from rock samples collected at the Nevada National Security Site and vicinity, Nye County, Nevada - A supplement to Data Series 297

    Science.gov (United States)

    Wood, David B.

    2018-03-14

    Rock samples have been collected, analyzed, and interpreted from drilling and mining operations at the Nevada National Security Site for over one-half of a century. Records containing geologic and hydrologic analyses and interpretations have been compiled into a series of databases. Rock samples have been photographed and thin sections scanned. Records and images are preserved and available for public viewing and downloading at the U.S. Geological Survey ScienceBase, Mercury Core Library and Data Center Web site at https://www.sciencebase.gov/mercury/ and documented in U.S. Geological Survey Data Series 297. Example applications of these data and images are provided in this report.

  4. A mixed signal ECG processing platform with an adaptive sampling ADC for portable monitoring applications.

    Science.gov (United States)

    Kim, Hyejung; Van Hoof, Chris; Yazicioglu, Refet Firat

    2011-01-01

    This paper describes a mixed-signal ECG processing platform with a 12-bit ADC architecture that can adapt its sampling rate according to the input signal's rate of change. This enables the sampling of ECG signals at a significantly reduced data rate without loss of information. The presented adaptive sampling scheme reduces the ADC power consumption, enables the processing of ECG signals with lower power consumption, and reduces the power consumption of the radio while streaming the ECG signals. The test results show that running a CWT-based R-peak detection algorithm on the adaptively sampled ECG signals consumes only 45.6 μW and leads to 36% less overall system power consumption.
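
    The adaptive-sampling idea, spending samples only where the signal is changing, can be sketched in software. The threshold rule below is a simplification for illustration, not the slope-detection logic of the actual ADC:

```python
def adaptive_sample(signal, threshold):
    # Keep a sample only when it differs from the last kept sample by more
    # than `threshold`, mimicking an activity-dependent sampling ADC:
    # flat baseline costs almost no samples, sharp transients are captured.
    kept = [(0, signal[0])]
    for i, v in enumerate(signal[1:], start=1):
        if abs(v - kept[-1][1]) > threshold:
            kept.append((i, v))
    return kept

# Flat baseline with a sharp "R-peak"-like excursion in the middle
sig = [0.0] * 20 + [0.2, 0.9, 1.0, 0.6, 0.1] + [0.0] * 20
kept = adaptive_sample(sig, 0.15)
```

On this toy trace the sampler keeps only a handful of the 45 uniform samples while preserving the peak, which is the data-rate reduction the platform exploits.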

  5. Development of sample size allocation program using hypergeometric distribution

    International Nuclear Information System (INIS)

    Kim, Hyun Tae; Kwack, Eun Ho; Park, Wan Soo; Min, Kyung Soo; Park, Chan Sik

    1996-01-01

    The objective of this research is the development of a sample allocation program using the hypergeometric distribution, implemented with an object-oriented method. When the IAEA (International Atomic Energy Agency) performs an inspection, it simply applies a standard binomial distribution, which describes sampling with replacement, instead of the hypergeometric distribution, which describes sampling without replacement, when allocating samples to up to three verification methods. Because the objective of IAEA inspection is the timely detection of diversion of significant quantities of nuclear material, game theory is applied to its sampling plan, and it is necessary to use the hypergeometric distribution directly, or a close approximation to it, to secure statistical accuracy. The improved binomial approximation developed by J. L. Jaech and a correctly applied binomial approximation are both closer to the hypergeometric distribution in sample size calculation than the simply applied binomial approximation of the IAEA. Object-oriented programs for (1) sample allocation with a correctly applied standard binomial approximation, (2) sample allocation with the improved binomial approximation, and (3) sample allocation with the hypergeometric distribution were developed with Visual C++, and corresponding programs were developed in EXCEL (using Visual Basic for Applications). 8 tabs., 15 refs. (Author)
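
    The distinction the report turns on, sampling without replacement (hypergeometric) versus the binomial approximation (with replacement), is easy to demonstrate numerically; the population figures below are invented:

```python
from math import comb

def hypergeom_pmf(k, N, K, n):
    # P(k defectives in a sample of n drawn without replacement from a
    # population of N items containing K defectives)
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

def binom_pmf(k, n, p):
    # Binomial approximation: treats each draw as if it were with replacement
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Small population: the binomial approximation visibly overstates the
# chance of a sample containing no defectives
N, K, n = 50, 5, 10
h0 = hypergeom_pmf(0, N, K, n)   # exact, without replacement
b0 = binom_pmf(0, n, K / N)      # with-replacement approximation
```

For large populations the two probabilities converge, which is why the binomial shortcut is tempting; for the small strata typical of inspection plans the gap matters, motivating the direct use of the hypergeometric distribution.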

  6. Application of SIMS to the analysis of environmental samples

    International Nuclear Information System (INIS)

    Seyama, Haruhiko

    2003-01-01

    As an example of surface analysis of environmental samples, SIMS was applied to airborne particulates, fish otoliths (calcareous ear-stones) and biotites (rock-forming aluminosilicate minerals). Airborne particulates deposited on leaf surfaces were analyzed directly by fast atom bombardment (FAB)-SIMS using an O2 primary neutral beam. Some metal elements of aerosol origin, such as Pb, could be detected. Local areas of a thin section of an otolith were analyzed by FAB-SIMS. Line scans and images of secondary ions revealed seasonal periodicity in Sr, Na and K concentrations in the otolith that corresponded to the annual band structure. Surface alteration of acid-treated and naturally weathered biotites was studied by SIMS depth profiling using an O- primary ion beam. The depth profile of the acid-treated biotite showed the formation of an altered surface layer rich in Si. In contrast, a thick altered surface layer was not observed under natural weathering, and Al was retained on the surface

  7. A two-sample Bayesian t-test for microarray data

    Directory of Open Access Journals (Sweden)

    Dimmic Matthew W

    2006-03-01

    Abstract. Background: Determining whether a gene is differentially expressed in two different samples remains an important statistical problem. Prior work in this area has featured the use of t-tests with pooled estimates of the sample variance based on similarly expressed genes. These methods do not display consistent behavior across the entire range of pooling and can be biased when the prior hyperparameters are specified heuristically. Results: A two-sample Bayesian t-test is proposed for determining whether a gene is differentially expressed in two different samples. The test method is an extension of earlier work that made use of point estimates for the variance. The method proposed here explicitly calculates, in analytic form, the marginal distribution of the difference in the mean expression of two samples, obviating the need for point estimates of the variance without recourse to posterior simulation. The prior distribution involves a single hyperparameter that can be calculated in a statistically rigorous manner, making clear the connection between the prior degrees of freedom and the prior variance. Conclusion: The test is easy to understand and implement, and application to both real and simulated data shows that the method has equal or greater power compared to the previous method and demonstrates consistent Type I error rates. The test is generally applicable outside the microarray field to any situation where prior information about the variance is available, and is not limited to cases where estimates of the variance are based on many similar observations.
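
    The shrinkage idea behind such a test can be sketched with a moderated t-statistic in which the pooled sample variance is combined with a prior variance carrying prior degrees of freedom. This is a simplified stand-in for the paper's analytic marginal, and the function name and hyperparameter values below are invented for illustration.

```python
import math

def bayesian_t(x, y, d0=4.0, s0_sq=0.05):
    """Moderated two-sample t: the pooled variance is shrunk toward a
    prior variance s0_sq carrying d0 prior degrees of freedom, and the
    statistic follows a Student-t with d0 + n1 + n2 - 2 degrees of
    freedom.  (d0 and s0_sq are illustrative, not the paper's values.)"""
    n1, n2 = len(x), len(y)
    m1 = sum(x) / n1
    m2 = sum(y) / n2
    ss = sum((v - m1) ** 2 for v in x) + sum((v - m2) ** 2 for v in y)
    d = n1 + n2 - 2                              # sample degrees of freedom
    s_sq = ss / d                                # pooled sample variance
    s_post_sq = (d0 * s0_sq + d * s_sq) / (d0 + d)   # shrunken variance
    t = (m1 - m2) / math.sqrt(s_post_sq * (1 / n1 + 1 / n2))
    return t, d0 + d                             # statistic and its df

t, df = bayesian_t([2.1, 2.4, 1.9], [1.0, 1.2, 0.9])
print(f"t = {t:.2f} on {df:.0f} df")
```

    The augmented degrees of freedom are what give the test stable behavior for small per-gene sample sizes, where the naive pooled variance is itself very noisy.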

  8. Statistical Symbolic Execution with Informed Sampling

    Science.gov (United States)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have recently been proposed for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but suffer from scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose informed sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis, and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can still give meaningful results under the same time and memory limits.
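
    The combination of partial exact analysis and Bayesian estimation can be sketched as follows: paths already analyzed exactly contribute a known probability mass, and Monte Carlo samples over the remaining mass are summarized by a Beta posterior. All names, the Jeffreys Beta(0.5, 0.5) prior, and the toy numbers are assumptions for illustration, not Symbolic PathFinder's implementation.

```python
import random

def estimate_target_probability(sample_path, n_samples, p_exact_hit,
                                p_pruned, a=0.5, b=0.5, seed=0):
    """Estimate P(target event) when a mass `p_pruned` of the path space
    was analyzed exactly (contributing `p_exact_hit` of hit mass) and the
    remainder is sampled; the sampled hit rate gets a Beta(a, b) prior."""
    rng = random.Random(seed)
    hits = sum(sample_path(rng) for _ in range(n_samples))
    # Posterior mean of the hit rate over the un-pruned region:
    rate = (a + hits) / (a + b + n_samples)
    return p_exact_hit + (1.0 - p_pruned) * rate

# Toy program: the un-pruned region hits the target 30% of the time;
# an exactly analyzed region of mass 0.4 contributes 0.1 of hit mass,
# so the true answer is 0.1 + 0.6 * 0.3 = 0.28.
est = estimate_target_probability(
    sample_path=lambda rng: rng.random() < 0.3,
    n_samples=10_000, p_exact_hit=0.1, p_pruned=0.4)
print(f"estimated P(target) ~ {est:.3f}")
```

    Because the pruned mass is handled exactly, sampling variance only applies to the shrinking un-pruned remainder, which is the source of the improved convergence the abstract claims.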

  9. The Recreation Applications in Local Administrations: The Sample of Konya City

    Directory of Open Access Journals (Sweden)

    Murat KOÇYİĞİT

    2014-08-01

    The population increase in big cities, the crowding, and the phenomenon of urbanization also change people's expectations. Those who live in a big city and feel the atmosphere of metropolitan culture desire a healthier and greener urban environment, alongside the rising living standards brought by technological developments and easier transportation. They want to live more joyfully and healthily, with shopping malls, cinemas, and amusement centers built in city centers, as well as playgrounds, walking trails and running tracks, parks, and green spaces in which to relax. People want to reach such centers and recreational areas quickly, not after hour-long journeys; they want them close to home and the workplace, to be able to stop by a park, a recreational ground, or a picnic area even for a short time, to fish, to let off the day's steam, to refresh and relax. They also want recreational grounds close to their homes and to their children's nurseries and schools. While social activity areas continue to increase in Konya, large recreational grounds, parks, and green spaces are now being created that can offer all these amenities to citizens. In this context, the aim of this study is to examine the recreational applications carried out by Konya Metropolitan Municipality and, by researching the related facilities and application methods, to lay the groundwork for a study that can serve as a model for recreational applications in other metropolises. In the study, the existing recreational grounds constructed by Konya Metropolitan Municipality were examined and the potential of the recreation areas was investigated. The research data were obtained through content analysis applied to observations and documents. According to the findings of the research, the recreation areas constructed by Konya

  10. The role of graphene-based sorbents in modern sample preparation techniques.

    Science.gov (United States)

    de Toffoli, Ana Lúcia; Maciel, Edvaldo Vasconcelos Soares; Fumes, Bruno Henrique; Lanças, Fernando Mauro

    2018-01-01

    The application of graphene-based sorbents in sample preparation techniques has increased significantly since 2011. These materials have good physicochemical properties for use as sorbents and have shown excellent results in different sample preparation techniques. Graphene and its precursor, graphene oxide, are considered good candidates for improving the extraction and concentration of different classes of target compounds (e.g., parabens, polycyclic aromatic hydrocarbons, pyrethroids, triazines, and so on) present in complex matrices, and they have been employed in the analysis of different matrices (e.g., environmental, biological and food). In this review, we highlight the most important characteristics of graphene-based materials, their properties and synthesis routes, and the most important applications in both off-line and on-line sample preparation techniques. The discussion of the off-line approaches includes methods derived from conventional solid-phase extraction, focusing on the miniaturized magnetic and dispersive modes, as well as the microextraction techniques stir-bar sorptive extraction, solid-phase microextraction, and microextraction by packed sorbent. The on-line approaches focus on the use of graphene-based materials mainly in on-line solid-phase extraction, its variation called in-tube solid-phase microextraction, and on-line microdialysis systems. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Nonlinear Spatial Inversion Without Monte Carlo Sampling

    Science.gov (United States)

    Curtis, A.; Nawaz, A.

    2017-12-01

    High-dimensional, nonlinear inverse or inference problems usually have non-unique solutions. The distribution of solutions is described by probability distributions, which are usually found using Monte Carlo (MC) sampling methods. These take pseudo-random samples of models in parameter space, calculate the probability of each sample given available data and other information, and thus map out high- or low-probability regions of parameter space. However, such methods converge to the solution only as the number of samples tends to infinity; in practice, MC is found to be slow to converge, convergence is not guaranteed to be achieved in finite time, and detection of convergence requires the use of subjective criteria. We propose a method for Bayesian inversion of categorical variables such as geological facies or rock types in spatial problems, which requires no sampling at all. The method uses a 2-D hidden Markov model over a grid of cells, where observations represent localized data constraining the model in each cell. The data in our example application are seismic properties such as P- and S-wave impedances or rock density; our model parameters are the hidden states and represent the geological rock types in each cell. The observations at each location are assumed to depend on the facies at that location only - an assumption referred to as `localized likelihoods'. However, the facies at a location cannot be determined solely by the observation at that location, as it also depends on prior information concerning its correlation with the spatial distribution of facies elsewhere. Such prior information is included in the inversion in the form of a training image, which represents a conceptual depiction of the distribution of local geologies that might be expected, but other forms of prior information can be used in the method as desired. The method provides direct (pseudo-analytic) estimates of the posterior marginal probability distributions over each variable
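
    For a one-dimensional chain of cells (the paper's model is 2-D, so this is a simplified sketch, and the transition and likelihood numbers are made up), the posterior marginals over facies can be computed exactly, with no sampling, by the forward-backward algorithm:

```python
def forward_backward(trans, likelihood, prior):
    """Exact posterior marginals P(state_t | all observations) for a
    hidden Markov chain.  `trans[i][j]` is P(j | i); `likelihood[t][i]`
    is P(obs_t | state i); `prior[i]` is P(state_0 = i)."""
    T, S = len(likelihood), len(prior)
    fwd = [[0.0] * S for _ in range(T)]
    bwd = [[1.0] * S for _ in range(T)]
    for i in range(S):                           # forward pass
        fwd[0][i] = prior[i] * likelihood[0][i]
    for t in range(1, T):
        for j in range(S):
            fwd[t][j] = likelihood[t][j] * sum(
                fwd[t - 1][i] * trans[i][j] for i in range(S))
    for t in range(T - 2, -1, -1):               # backward pass
        for i in range(S):
            bwd[t][i] = sum(trans[i][j] * likelihood[t + 1][j] * bwd[t + 1][j]
                            for j in range(S))
    marginals = []
    for t in range(T):                           # combine and normalize
        w = [fwd[t][i] * bwd[t][i] for i in range(S)]
        z = sum(w)
        marginals.append([v / z for v in w])
    return marginals

# Two facies (0 = shale, 1 = sand), sticky transitions, and three cells
# whose impedance observations favor shale, are ambiguous, favor sand:
m = forward_backward(trans=[[0.9, 0.1], [0.1, 0.9]],
                     likelihood=[[0.8, 0.2], [0.5, 0.5], [0.2, 0.8]],
                     prior=[0.5, 0.5])
print([round(p[1], 3) for p in m])               # P(sand) per cell
```

    The middle cell's posterior stays at 0.5 because its neighbors pull it in opposite directions, illustrating how the spatial prior, not just the local observation, determines each marginal.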

  12. Enhanced Sampling and Analysis, Selection of Technology for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Svoboda, John; Meikrantz, David

    2010-02-01

    The focus of this study is the investigation of sampling technologies used in industry and their potential application to nuclear fuel processing. The goal is to identify innovative sampling methods using state-of-the-art techniques that could evolve into the next-generation sampling and analysis system for metallic elements. This report details the progress made in the first half of FY 2010 and includes a further consideration of the research focus and goals for this year. Our sampling options and focus for the next-generation sampling method are presented, along with the criteria used for choosing our path forward. We have decided to evaluate the feasibility of microcapillary-based chips to remotely collect, transfer, track, and supply microliters of sample solutions to analytical equipment in support of aqueous processes for used nuclear fuel cycles. Microchip vendors have been screened and a choice made for the development of a suitable microchip design, followed by production of samples for evaluation by ANL, LANL, and INL on an independent basis.

  13. OHB's Exploration Capabilities Overview Relevant to Mars Sample Return Mission

    Science.gov (United States)

    Jaime, A.; Gerth, I.; Rohrbeck, M.; Scheper, M.

    2018-04-01

    The presentation will give an overview of all the OHB past and current projects that are relevant to the Mars Sample Return (MSR) mission, including some valuable lessons learned that are applicable to the upcoming MSR mission.

  14. Vipie: web pipeline for parallel characterization of viral populations from multiple NGS samples.

    Science.gov (United States)

    Lin, Jake; Kramna, Lenka; Autio, Reija; Hyöty, Heikki; Nykter, Matti; Cinek, Ondrej

    2017-05-15

    Next generation sequencing (NGS) technology allows laboratories to investigate virome composition in clinical and environmental samples in a culture-independent way. There is a need for bioinformatic tools capable of parallel processing of virome sequencing data by exactly identical methods: this is especially important in studies of multifactorial diseases, or in parallel comparison of laboratory protocols. We have developed a web-based application allowing direct upload of sequences from multiple virome samples using custom parameters. The samples are then processed in parallel using an identical protocol, and can be easily reanalyzed. The pipeline performs de-novo assembly, taxonomic classification of viruses as well as sample analyses based on user-defined grouping categories. Tables of virus abundance are produced from cross-validation by remapping the sequencing reads to a union of all observed reference viruses. In addition, read sets and reports are created after processing unmapped reads against known human and bacterial ribosome references. Secured interactive results are dynamically plotted with population and diversity charts, clustered heatmaps and a sortable and searchable abundance table. The Vipie web application is a unique tool for multi-sample metagenomic analysis of viral data, producing searchable hits tables, interactive population maps, alpha diversity measures and clustered heatmaps that are grouped in applicable custom sample categories. Known references such as human genome and bacterial ribosomal genes are optionally removed from unmapped ('dark matter') reads. Secured results are accessible and shareable on modern browsers. Vipie is a freely available web-based tool whose code is open source.

  15. Total reflection X-ray fluorescence with synchrotron radiation applied to biological and environmental samples

    International Nuclear Information System (INIS)

    Simabuco, S.M.; Matsumoto, E.; Jesus, E.F.O.; Lopes, R.T.; Perez, C.; Nascimento Filho, V.F.; Costa, R.S.S.; Tavares do Carmo, M.G.; Saunders, C.

    2001-01-01

    Full text: Total reflection X-ray fluorescence has been applied to the analysis of trace elements in water and aqueous solutions, environmental samples, and biological materials after sample preparation, and to the surface analysis of silicon wafers. The present paper shows some results of applications to rainwater, atmospheric particulate material, colostrum, and nuclear samples. (author)

  16. Turbulent fluxes by "Conditional Eddy Sampling"

    Science.gov (United States)

    Siebicke, Lukas

    2015-04-01

    Turbulent flux measurements are key to understanding ecosystem-scale energy and matter exchange, including atmospheric trace gases. While the eddy covariance approach has evolved into an invaluable tool to quantify fluxes of e.g. CO2 and H2O continuously, it is limited to the very few atmospheric constituents for which sufficiently fast analyzers exist. High instrument cost, lack of field-readiness, and high power consumption (e.g. many recent laser-based systems requiring a strong vacuum) further impair application to other tracers. Alternative micrometeorological approaches such as conditional sampling might overcome major limitations. Although the idea of eddy accumulation was proposed by Desjardins as early as 1972 (Desjardins, 1977), at the time it could not be realized for trace gases. Major simplifications by Businger and Oncley (1990) led to its widespread application as 'Relaxed Eddy Accumulation' (REA). However, those simplifications (flux-gradient similarity with constant-flow-rate sampling irrespective of vertical wind velocity, and the introduction of a deadband around zero vertical wind velocity) degraded eddy accumulation to an indirect method, introducing issues of scalar similarity and an often-lacking suitable scalar flux proxy. Here we present a real implementation of a true eddy accumulation system according to the original concept. Key to our approach, which we call 'Conditional Eddy Sampling' (CES), is the mathematical formulation of conditional sampling in its true form as a direct eddy flux measurement, paired with a performant real implementation. Dedicated hardware controlled by near-real-time software allows full signal recovery at 10 or 20 Hz, very fast valve switching, instant flow rate control proportional to vertical wind velocity, virtually no deadband, and adaptive power management. Demonstrated system performance often exceeds the requirements for flux measurements by orders of magnitude. The system's exceptionally low power consumption is ideal
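
    The direct-measurement character of true eddy accumulation can be seen in a toy simulation: sampling air into an 'up' or 'down' reservoir at a flow rate proportional to |w| makes the reservoir volumes and mean concentrations reproduce mean(w·c) exactly, with no scalar-proxy assumption as in REA. The synthetic wind and CO2 series below are invented for illustration, and the mean vertical wind is not removed, so this demonstrates the accumulation identity rather than a full flux calculation.

```python
import random

def eddy_accumulation_flux(w, c):
    """True eddy accumulation: each sample is drawn at a rate
    proportional to |w| into an 'up' reservoir (w > 0) or a 'down'
    reservoir (w < 0).  The flux is recovered from reservoir volumes
    and mean concentrations and equals mean(w * c) by construction."""
    v_up = v_dn = m_up = m_dn = 0.0          # volumes and tracer masses
    for wi, ci in zip(w, c):
        if wi > 0:
            v_up += wi                        # sampled volume ∝ |w|
            m_up += wi * ci                   # tracer mass = volume * conc.
        elif wi < 0:
            v_dn += -wi
            m_dn += -wi * ci
    c_up = m_up / v_up                        # measured reservoir conc., up
    c_dn = m_dn / v_dn                        # measured reservoir conc., down
    return (v_up * c_up - v_dn * c_dn) / len(w)

rng = random.Random(1)
w = [rng.gauss(0.0, 0.5) for _ in range(5000)]            # vertical wind, m/s
c = [400.0 + 0.2 * wi + rng.gauss(0.0, 1.0) for wi in w]  # CO2 correlated w/ w
direct = sum(wi * ci for wi, ci in zip(w, c)) / len(w)    # direct mean(w*c)
accum = eddy_accumulation_flux(w, c)
print(f"direct covariance flux: {direct:.3f}, accumulation flux: {accum:.3f}")
```

    Because only two slow concentration measurements per averaging period are needed (one per reservoir), the approach works with analyzers far too slow for eddy covariance.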

  17. Surface plasmon resonance biosensors for highly sensitive detection in real samples

    Science.gov (United States)

    Sepúlveda, B.; Carrascosa, L. G.; Regatos, D.; Otte, M. A.; Fariña, D.; Lechuga, L. M.

    2009-08-01

    In this work we summarize the main results obtained with the portable surface plasmon resonance (SPR) device developed in our group (commercialized by SENSIA SL, Spain), highlighting its applicability for the real-time detection of extremely low concentrations of toxic pesticides in environmental water samples. In addition, we show applications in clinical diagnosis: on the one hand, the real-time and label-free detection of DNA hybridization and single-point mutations in the gene BRCA-1, related to the predisposition of women to develop an inherited breast cancer, and on the other hand, the analysis of protein biomarkers in biological samples (urine, serum) for early detection of diseases. Despite the large number of applications already proven, the SPR technology has two main drawbacks: (i) insufficient sensitivity for some specific applications (where pM-fM or single-molecule detection is needed) and (ii) low multiplexing capabilities. To overcome these drawbacks, we are working on several alternative configurations, such as the magneto-optical surface plasmon resonance (MOSPR) sensor, based on a combination of magneto-optical and ferromagnetic materials to improve the SPR sensitivity, and localized surface plasmon resonance (LSPR), based on nanostructures (nanoparticles, nanoholes, ...), for higher multiplexing capabilities.

  18. Wet-digestion of environmental sample using silver-mediated electrochemical method

    International Nuclear Information System (INIS)

    Kuwabara, Jun

    2010-01-01

    An application of the silver-mediated electrochemical method to environmental samples as an effective digestion method for iodine analysis was attempted. The usual digestion method for 129I in many types of environmental sample is combustion in a quartz glass tube, but the chemical yield of iodine in the combustion method decreases depending on the type of sample. The silver-mediated electrochemical method is expected to achieve very low loss of iodine. In this study, a dried kombu (Laminaria) sample was digested in an electrochemical cell. For 1 g of sample, digestion was completed in about 24 hours under electric conditions of <10 V and <2 A. After the digestion, oxidized species of iodine were reduced to iodide by adding sodium sulfite, and a precipitate of silver iodide was then obtained. (author)

  19. Analysis of special recovery samples by Pu(III) spectrophotometry

    International Nuclear Information System (INIS)

    Van Hare, D.R.

    1985-11-01

    A simple spectrophotometric method has been developed to determine the plutonium concentration of FB-Line Special Recovery samples. The method is applicable over the 1 to 150 g/L range, with an accuracy and precision of better than 1%. 9 refs., 3 figs., 3 tabs

  20. IP Sample Plan #4 | NCI Technology Transfer Center | TTC

    Science.gov (United States)

    Sample letter from Research Institutes and their principal investigator and consultants, describing a data and research tool sharing plan and procedures for sharing data, research materials, and patent and licensing of intellectual property. This letter is designed to be included as part of an application.