Sample records for sample processing methods

  1. Sample processing device and method

    DEFF Research Database (Denmark)


    A sample processing device is disclosed, which sample processing device comprises a first substrate and a second substrate, where the first substrate has a first surface comprising two area types, a first area type with a first contact angle with water and a second area type with a second contact...... a sample liquid comprising the sample and the first preparation system is adapted to receive a receiving liquid. In a particular embodiment, a magnetic sample transport component, such as a permanent magnet or an electromagnet, is arranged to move magnetic beads in between the first and second substrates....

  2. Sediment sampling and processing methods in Hungary, and possible improvements (United States)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy


    The importance of monitoring sediment processes is unquestionable: the sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking water wells have shortened, exposing them to eventual pollution events and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes, however, are much less known. Unlike the investigation of flow processes, sediment-related research is scarce, partly due to the outdated methodology and poor database background in this specific field. Sediment-related data, information and analyses form an important and integral part of civil engineering in relation to rivers all over the world. Concerning the second largest river of Europe, the Danube, it is widely known in the expert community, and has long been discussed at different expert forums, that the sediment balance of the river Danube has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station densities and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well; furthermore, the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. The frequency of suspended sediment sampling is very low along the river

  3. Development of an automated data processing method for sample to sample comparison of seized methamphetamines. (United States)

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun


    Information about sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small. Therefore, it would be more important to find links between samples than to pursue the other uses of methamphetamine profiling. Many Asian countries including Japan and South Korea have been using the method developed by the National Research Institute of Police Science of Japan. The method uses a gas chromatography-flame ionization detector (GC-FID), a DB-5 column and four internal standards. It was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. These modules collected the results into an Excel file and then corrected the retention time shift and response deviation generated during sample preparation and instrument analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical
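    The similarity step this abstract describes can be sketched in a few lines: after retention-time alignment, each sample is reduced to a vector of impurity-peak responses and pairs of samples are compared by Pearson correlation. A minimal illustration follows; the peak values are hypothetical, not taken from the study.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical normalized impurity-peak responses for two seizures,
# after retention-time alignment.
sample_a = [0.12, 0.85, 0.33, 0.05, 0.41]
sample_b = [0.11, 0.88, 0.31, 0.06, 0.40]
r = pearson(sample_a, sample_b)
print(round(r, 4))  # r > 0.99 would suggest a common origin
```

    In practice the aligned peak table would come from the GC-FID data rather than being typed in by hand.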

  4. On-line sample processing methods in flow analysis

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald


    -line dilution, derivatization, separation and preconcentration methods encompassing solid reactors, solvent extraction, sorbent extraction, precipitation/coprecipitation, hydride/vapor generation and digestion/leaching protocols as hyphenated to a plethora of detection devices is discussed in detail...

  5. Technical Evaluation of Sample-Processing, Collection, and Preservation Methods (United States)


    enhanced situational awareness of biological threats to the environment, human health, agriculture, and food supplies. Specifically mentioned is the...preparing for the possibility of biologically based attacks on military, civilian, or agricultural targets. To be fully prepared for this...from the various collected samples was extracted using an identical process—the Blood and Tissue Midi Preparation Kit (Qiagen, Inc.; Valencia, CA)—and

  6. Method of evaluation of process of red blood cell sedimentation based on photometry of droplet samples. (United States)

    Aristov, Alexander; Nosova, Ekaterina


    The paper focuses on research aimed at creating and testing a new approach to evaluating the processes of aggregation and sedimentation of red blood cells for use in clinical laboratory diagnostics. The proposed method is based on photometric analysis of a blood sample formed as a sessile drop. The results of clinical validation of this method are given in the paper. Analysis of the processes occurring in the sample in the form of a sessile drop during blood cell sedimentation is described. The results of experimental studies evaluating the effect of the droplet sample's focusing properties on light transmittance are presented. It is shown that this method significantly reduces the required sample volume and provides sufficiently high sensitivity to the studied processes.
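    The photometric quantities involved can be illustrated with the Beer-Lambert relations; the intensity readings below are hypothetical, intended only to show how transmittance rises as cells settle and the upper layer of the droplet clears.

```python
import math

def transmittance(i_transmitted, i_incident):
    """Fraction of incident light passing through the sample."""
    return i_transmitted / i_incident

def absorbance(t):
    """Absorbance from transmittance: A = -log10(T)."""
    return -math.log10(t)

i0 = 1000.0  # incident intensity, arbitrary units (hypothetical)
# Detector readings over time as red blood cells sediment (hypothetical).
readings = [120.0, 180.0, 300.0, 520.0, 760.0]
for i in readings:
    t = transmittance(i, i0)
    print(f"T={t:.2f}  A={absorbance(t):.3f}")
```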

  7. A method for disaggregating clay concretions and eliminating formalin smell in the processing of sediment samples

    DEFF Research Database (Denmark)

    Cedhagen, Tomas


    A complete handling procedure for processing sediment samples is described. It includes some improvements of conventional methods. The fixed sediment sample is mixed with a solution of the alkaline detergent AJAX® (Colgate-Palmolive). It is kept at 80-90 °C for 20-40 min. This treatment facilitates...... subsequent sorting as it disaggregates clay concretions and faecal pellets but leaves even fragile organisms clean and unaffected. The ammonia in the detergent eliminates the formalin smell....

  8. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR. (United States)

    Mobli, Mehdi; Hoch, Jeffrey C


    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. Copyright © 2014 Elsevier B.V. All rights reserved.
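    The Nyquist condition mentioned above can be made concrete: for complex (quadrature) detection, the largest admissible dwell time is the reciprocal of the spectral width, and any frequency outside the band folds back in. A small sketch with hypothetical numbers:

```python
def dwell_time(spectral_width_hz):
    """Largest uniform sampling interval (s) satisfying Nyquist for
    complex (quadrature) detection over the given spectral width."""
    return 1.0 / spectral_width_hz

def aliased_frequency(f_hz, spectral_width_hz):
    """Apparent frequency of a signal at f_hz when sampled at the
    Nyquist rate for spectral_width_hz: folded into [-sw/2, sw/2)."""
    sw = spectral_width_hz
    return (f_hz + sw / 2) % sw - sw / 2

sw = 10000.0  # 10 kHz spectral width (hypothetical)
print(dwell_time(sw))                 # 0.0001 s, i.e. a 100 us dwell
print(aliased_frequency(6000.0, sw))  # -4000.0 Hz: aliased
print(aliased_frequency(3000.0, sw))  # 3000.0 Hz: within the band
```

    Nonuniform sampling schemes relax exactly this constraint, which is why non-Fourier spectrum estimators are needed to process the resulting data.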

  9. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample (United States)

    Turner, Terry D.; Beller, Laurence S.; Clark, Michael L.; Klingler, Kerry M.


    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus are also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container.

  10. A new formulation of the linear sampling method: spatial resolution and post-processing

    Energy Technology Data Exchange (ETDEWEB)

    Piana, M [Dipartimento di Informatica, Universita di Verona, Ca' Vignal 2, 37134 Verona (Italy); Aramini, R; Brignone, M [Dipartimento di Matematica, Universita di Genova, via Dodecaneso 35, 16146 Genova (Italy); Coyle, J [Department of Mathematics, Monmouth University, 400 Cedar Avenue, West Long Branch, 07764 New Jersey (United States)]


    A new formulation of the linear sampling method is described, which requires the regularized solution of a single functional equation set in a direct sum of L² spaces. This new approach presents the following notable advantages: it is computationally more effective than the traditional implementation, since time consuming samplings of the Tikhonov minimum problem and of the generalized discrepancy equation are avoided; it allows a quantitative estimate of the spatial resolution achievable by the method; it facilitates a post-processing procedure for the optimal selection of the scatterer profile by means of edge detection techniques. The formulation is described in a two-dimensional framework and in the case of obstacle scattering, although generalizations to three dimensions and penetrable inhomogeneities are straightforward.


    Energy Technology Data Exchange (ETDEWEB)

    Click, D.; Jones, M.; Edwards, T.


    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) confirms the applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room temperature HF-HNO3 acid dissolution (i.e., DWPF Cold Chem (CC) Method, see DWPF Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). In addition to the CC method confirmation, the DWPF lab's mercury (Hg) digestion method was also evaluated for applicability to SB6 (see DWPF procedure 'Mercury System Operating Manual', Manual: SW4-15.204, Section 6.1, Revision 5, Effective date: 12-04-03). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestions of Sludge Batch 6 (SB6) SRAT Receipt and SB6 SRAT Product samples. For validation of the DWPF lab's Hg method, only SRAT Receipt material was used and compared to AR digestion results. The SB6 SRAT Receipt and SB6 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB6 Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 5 (SB5), to form the SB6 Blend composition. In addition to the 16 elements currently measured by the DWPF, this report includes Hg and thorium (Th) data (Th comprising ~2.5-3 wt% of the total solids in SRAT Receipt and SRAT Product, respectively) and provides specific details of the ICP-AES analysis of Th. Thorium was found to interfere with the U 367.007 nm emission line, and an inter-element correction (IEC) had to be applied to U
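    An inter-element correction of the kind described, where the interferent's apparent contribution at the analyte emission line is subtracted using an empirically determined factor, can be sketched as follows; the concentrations and the IEC factor here are hypothetical, not values from the report.

```python
def iec_corrected(apparent_analyte, interferent_conc, iec_factor):
    """Inter-element correction: subtract the interferent's apparent
    contribution at the analyte emission line (same concentration units)."""
    return apparent_analyte - iec_factor * interferent_conc

# Hypothetical numbers: apparent U at 367.007 nm inflated by Th emission.
apparent_u = 0.50   # wt% apparent U from the raw emission signal
th = 2.8            # wt% Th in the solids
k = 0.12            # hypothetical IEC factor (wt% apparent U per wt% Th)
print(round(iec_corrected(apparent_u, th, k), 3))  # 0.164
```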

  12. Evaluation of standard methods for collecting and processing fuel moisture samples (United States)

    Sally M. Haase; José Sánchez; David R. Weise


    A variety of techniques for collecting and processing samples to determine moisture content of wildland fuels in support of fire management activities were evaluated. The effects of using a chainsaw or handsaw to collect samples of largediameter wood, containers for storing and transporting collected samples, and quick-response ovens for estimating moisture content...
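    The underlying oven-dry moisture calculation is simple: fuel moisture content is conventionally reported on a dry-weight basis. A minimal sketch (the sample weights are hypothetical):

```python
def moisture_content_dry_basis(wet_g, dry_g):
    """Fuel moisture content (%) on an oven-dry-weight basis:
    100 * (wet weight - oven-dry weight) / oven-dry weight."""
    return (wet_g - dry_g) / dry_g * 100.0

# Hypothetical large-diameter wood sample: 150 g fresh, 100 g after drying.
print(moisture_content_dry_basis(150.0, 100.0))  # 50.0
```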

  13. Verification Of The Defense Waste Processing Facility's (DWPF) Process Digestion Methods For The Sludge Batch 8 Qualification Sample

    Energy Technology Data Exchange (ETDEWEB)

    Click, D. R.; Edwards, T. B.; Wiedenman, B. J.; Brown, L. W.


    This report contains the results and comparison of data generated from inductively coupled plasma – atomic emission spectroscopy (ICP-AES) analysis of Aqua Regia (AR), Sodium Peroxide/Sodium Hydroxide Fusion Dissolution (PF) and Cold Chem (CC) method digestions and Cold Vapor Atomic Absorption analysis of Hg digestions from the DWPF Hg digestion method of Sludge Batch 8 (SB8) Sludge Receipt and Adjustment Tank (SRAT) Receipt and SB8 SRAT Product samples. The SB8 SRAT Receipt and SB8 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB8 Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 7b (SB7b), to form the SB8 Blend composition.

  14. Validation of a novel rinse and filtration method for efficient processing of fresh produce samples for microbiological indicator enumeration. (United States)

    Heredia, Norma; Solís-Soto, Luisa; Venegas, Fabiola; Bartz, Faith E; de Aceituno, Anna Fabiszewski; Jaykus, Lee-Ann; Leon, Juan S; García, Santos


    Several methods have been described to prepare fresh produce samples for microbiological analysis, each with its own advantages and disadvantages. The aim of this study was to compare the performance of a novel combined rinse and membrane filtration method to two alternative sample preparation methods for the quantification of indicator microorganisms from fresh produce. Decontaminated cantaloupe melons and jalapeño peppers were surface inoculated with a cocktail containing 10⁶ CFU/ml Escherichia coli, Salmonella Typhimurium, and Enterococcus faecalis. Samples were processed using a rinse and filtration method, homogenization by stomacher, or a sponge-rubbing method, followed by quantification of bacterial load using culture methods. Recovery efficiencies of the three methods were compared. On inoculated cantaloupes, the rinse and filtration method had higher recovery of coliforms (0.95 log CFU/ml higher recovery, P = 0.0193) than the sponge-rubbing method. Similarly, on inoculated jalapeños, the rinse and filtration method had higher recovery of coliforms (0.84 log CFU/ml higher, P = 0.0130) and E. coli (1.46 log CFU/ml higher). The rinse and filtration method also outperformed the homogenization method for all three indicators (0.79 to 1.71 log CFU/ml higher, P values ranging from 0.0075 to 0.0002). The precision of the three methods was also compared. The precision of the rinse and filtration method was similar to that of the other methods for recovery of two of three indicators from cantaloupe (E. coli P = 0.7685, E. faecalis P = 0.1545) and was more precise for recovery of two of three indicators from jalapeño (coliforms P = 0.0026, E. coli P = 0.0243). Overall, the rinse and filtration method performed as well as, and sometimes better than, either of the compared methods. The rinse and filtration method may have logistical advantages when processing large numbers of samples, improving sampling efficiency and facilitating microbial detection.
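    The recovery comparisons above are made on a log10 CFU/ml scale. The sketch below shows the conversion and differencing, with hypothetical colony counts chosen to reproduce a difference of about 0.95 log.

```python
import math

def log_cfu_per_ml(cfu_count, volume_ml):
    """Convert a colony count in a given rinse volume to log10 CFU/ml."""
    return math.log10(cfu_count / volume_ml)

# Hypothetical coliform counts recovered from the same inoculated melon
# by the two methods, each in a 100 ml rinse volume.
rinse_filtration = log_cfu_per_ml(8.9e5, 100.0)  # rinse-and-filtration
sponge = log_cfu_per_ml(1.0e5, 100.0)            # sponge-rubbing
print(round(rinse_filtration - sponge, 2))  # 0.95 log higher recovery
```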

  15. Sampling system and method (United States)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee


    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  16. Sampling system and method

    Energy Technology Data Exchange (ETDEWEB)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee


    In one embodiment, the present disclosure provides an apparatus and method for supporting a tubing bundle during installation or removal. The apparatus includes a clamp for securing the tubing bundle to an external wireline. In various examples, the clamp is external to the tubing bundle or integral with the tubing bundle. According to one method, a tubing bundle and wireline are deployed together and the tubing bundle periodically secured to the wireline using a clamp. In another embodiment, the present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit. In a specific example, one or more clamps are used to connect the first and/or second conduits to an external wireline.

  17. Methods for Improving Human Gut Microbiome Data by Reducing Variability through Sample Processing and Storage of Stool. (United States)

    Gorzelak, Monika A; Gill, Sandeep K; Tasnim, Nishat; Ahmadi-Vand, Zahra; Jay, Michael; Gibson, Deanna L


    Gut microbiome community analysis is used to understand many diseases like inflammatory bowel disease, obesity, and diabetes. Sampling methods are an important consideration for human microbiome research, yet are not emphasized in many studies. In this study, we demonstrate that the preparation, handling, and storage of human faeces are critical processes that alter the outcomes of downstream DNA-based bacterial community analyses via qPCR. We found that stool subsampling resulted in large variability of gut microbiome data due to different microenvironments harbouring various taxa within an individual stool. However, we reduced intra-sample variability by homogenizing the entire stool sample in liquid nitrogen and subsampling from the resulting crushed powder prior to DNA extraction. We experimentally determined that the bacterial taxa detected varied with room temperature storage beyond 15 minutes and beyond three days of storage in a domestic frost-free freezer. While freeze-thawing only affected bacterial taxa abundance beyond four cycles, the use of samples stored in RNAlater should be avoided, as it reduced both overall DNA yields and the detection of bacterial taxa. Overall, we provide solutions for processing and storing human stool samples that reduce the variability of microbiome data. We recommend that stool is frozen within 15 minutes of being defecated, stored in a domestic frost-free freezer for less than three days, and homogenized prior to DNA extraction. Adoption of these simple protocols will have a significant and positive impact on future human microbiome research.

  18. Methods for Improving Human Gut Microbiome Data by Reducing Variability through Sample Processing and Storage of Stool.

    Directory of Open Access Journals (Sweden)

    Monika A Gorzelak

    Full Text Available Gut microbiome community analysis is used to understand many diseases like inflammatory bowel disease, obesity, and diabetes. Sampling methods are an important consideration for human microbiome research, yet are not emphasized in many studies. In this study, we demonstrate that the preparation, handling, and storage of human faeces are critical processes that alter the outcomes of downstream DNA-based bacterial community analyses via qPCR. We found that stool subsampling resulted in large variability of gut microbiome data due to different microenvironments harbouring various taxa within an individual stool. However, we reduced intra-sample variability by homogenizing the entire stool sample in liquid nitrogen and subsampling from the resulting crushed powder prior to DNA extraction. We experimentally determined that the bacterial taxa detected varied with room temperature storage beyond 15 minutes and beyond three days of storage in a domestic frost-free freezer. While freeze-thawing only affected bacterial taxa abundance beyond four cycles, the use of samples stored in RNAlater should be avoided, as it reduced both overall DNA yields and the detection of bacterial taxa. Overall, we provide solutions for processing and storing human stool samples that reduce the variability of microbiome data. We recommend that stool is frozen within 15 minutes of being defecated, stored in a domestic frost-free freezer for less than three days, and homogenized prior to DNA extraction. Adoption of these simple protocols will have a significant and positive impact on future human microbiome research.

  19. Radioactive air sampling methods

    CERN Document Server

    Maiello, Mark L


    Although the field of radioactive air sampling has matured and evolved over decades, it has lacked a single resource that assimilates technical and background information on its many facets. Edited by experts and with contributions from top practitioners and researchers, Radioactive Air Sampling Methods provides authoritative guidance on measuring airborne radioactivity from industrial, research, and nuclear power operations, as well as naturally occurring radioactivity in the environment. Designed for industrial hygienists, air quality experts, and health physicists, the book delves into the applied research advancing and transforming practice with improvements to measurement equipment, human dose modeling of inhaled radioactivity, and radiation safety regulations. To present a wide picture of the field, it covers the international and national standards that guide the quality of air sampling measurements and equipment. It discusses emergency response issues, including radioactive fallout and the assets used ...


    Energy Technology Data Exchange (ETDEWEB)

    Click, D.; Edwards, T.; Jones, M.; Wiedenman, B.


    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) confirms the applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room temperature HF-HNO3 acid dissolution (i.e., DWPF Cold Chem Method, see DWPF Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestions of Sludge Batch 7a (SB7a) SRAT Receipt and SB7a SRAT Product samples. The SB7a SRAT Receipt and SB7a SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB7a Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 6 (SB6), to form the SB7a Blend composition.

  1. Representative process sampling - in practice

    DEFF Research Database (Denmark)

    Esbensen, Kim; Friis-Pedersen, Hans Henrik; Julius, Lars Petersen


    Didactic data sets representing a range of real-world processes are used to illustrate "how to do" representative process sampling and process characterisation. The selected process data lead to diverse variogram expressions with different systematics (no range vs. important ranges; trends and/or...... presented cases of variography either solved the initial problems or served to understand the reasons and causes behind the specific process structures revealed in the variograms. Process Analytical Technologies (PAT) are not complete without process TOS....
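    The variogram computation that underlies this kind of process characterisation can be sketched directly: for each lag, it is half the mean squared difference between increments that distance apart. The increment values below are hypothetical.

```python
def variogram(series, max_lag):
    """Experimental variogram of a 1-D process series:
    v(j) = sum((x[i+j] - x[i])^2) / (2 * (N - j)) for lags j = 1..max_lag."""
    n = len(series)
    v = {}
    for j in range(1, max_lag + 1):
        diffs = [(series[i + j] - series[i]) ** 2 for i in range(n - j)]
        v[j] = sum(diffs) / (2 * (n - j))
    return v

# Hypothetical assay values from consecutive process increments.
x = [5.1, 5.3, 4.9, 5.6, 5.0, 5.4, 4.8, 5.5, 5.2, 5.7]
for lag, value in variogram(x, 4).items():
    print(lag, round(value, 3))
```

    Plotting v(j) against the lag j reveals the sill, range and trend features (or their absence) that the abstract refers to.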

  2. Sampling and sample processing in pesticide residue analysis. (United States)

    Lehotay, Steven J; Cook, Jo Marie


    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by the agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came, and thus do not provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest, and it encourages further studies on sampling and sample mass reduction to produce a test portion.

  3. A Comparison of Multivariate and Pre-Processing Methods for Quantitative Laser-Induced Breakdown Spectroscopy of Geologic Samples (United States)

    Anderson, R. B.; Morris, R. V.; Clegg, S. M.; Bell, J. F., III; Humphries, S. D.; Wiens, R. C.


    The ChemCam instrument selected for the Curiosity rover is capable of remote laser-induced breakdown spectroscopy (LIBS).[1] We used a remote LIBS instrument similar to ChemCam to analyze 197 geologic slab samples and 32 pressed-powder geostandards. The slab samples are well-characterized and have been used to validate the calibration of previous instruments on Mars missions, including CRISM [2], OMEGA [3], the MER Pancam [4], Mini-TES [5], and Moessbauer [6] instruments and the Phoenix SSI [7]. The resulting dataset was used to compare multivariate methods for quantitative LIBS and to determine the effect of grain size on calculations. Three multivariate methods - partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs - were used to generate models and extract the quantitative composition of unknown samples. PLS can be used to predict one element (PLS1) or multiple elements (PLS2) at a time, as can the neural network methods. Although MLP and CC ANNs were successful in some cases, PLS generally produced the most accurate and precise results.

  4. Neonatal blood gas sampling methods

    African Journals Online (AJOL)

    Blood gas sampling is part of everyday practice in the care of babies admitted to the neonatal intensive care unit, particularly for those receiving respiratory support. There is little published guidance that systematically evaluates the different methods of neonatal blood gas sampling, where each method has its individual ...


    National Research Council Canada - National Science Library



    .... Our article aims to study audit sampling in audit of financial statements. As an audit technique largely used, in both its statistical and nonstatistical form, the method is very important for auditors...


    Directory of Open Access Journals (Sweden)



    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As an audit technique largely used, in both its statistical and nonstatistical forms, the method is very important for auditors. It should be applied correctly to give a fair view of financial statements and to satisfy the needs of all users of financial information. To be applied correctly, the method must be understood by all its users, and mainly by auditors; otherwise, incorrect application risks loss of reputation and discredit, litigation and even prison. Since there is no unitary practice and methodology for applying the technique, the risk of applying it incorrectly is quite high. SWOT analysis is a technique that shows advantages, disadvantages, threats and opportunities. We applied SWOT analysis to the study of the sampling method from the perspective of three players: the audit firm, the audited entity and the users of financial statements. The study shows that by applying the sampling method both the audit firm and the audited entity save time, effort and money. The disadvantages of the method are the difficulty of applying and understanding it. Being largely used as an audit method and being a factor in a correct audit opinion, the sampling method's advantages, disadvantages, threats and opportunities must be understood by auditors.

  7. Distance sampling methods and applications

    CERN Document Server

    Buckland, S T; Marques, T A; Oedekoven, C S


    In this book, the authors cover the basic methods and advances within distance sampling that are most valuable to practitioners and in ecology more broadly. This is the fourth book dedicated to distance sampling. In the decade since the last book published, there have been a number of new developments. The intervening years have also shown which advances are of most use. This self-contained book covers topics from the previous publications, while also including recent developments in method, software and application. Distance sampling refers to a suite of methods, including line and point transect sampling, in which animal density or abundance is estimated from a sample of distances to detected individuals. The book illustrates these methods through case studies; data sets and computer code are supplied to readers through the book’s accompanying website.  Some of the case studies use the software Distance, while others use R code. The book is in three parts.  The first part addresses basic methods, the ...
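The core computation can be illustrated in its simplest form: perpendicular distances from a line transect combined with a half-normal detection function. This is a generic sketch of the idea, not code from the book's website; the function name and the toy data are invented.

```python
import math

def halfnormal_density(distances, transect_length):
    """Line-transect density estimate with a half-normal detection
    function g(x) = exp(-x^2 / (2*s^2)).

    The ML estimate of s^2 from perpendicular distances is mean(d^2);
    the effective strip half-width is mu = s*sqrt(pi/2), and the
    density estimate is D = n / (2 * mu * L).
    """
    n = len(distances)
    s2 = sum(d * d for d in distances) / n
    mu = math.sqrt(s2 * math.pi / 2.0)
    return n / (2.0 * mu * transect_length)

# 10 hypothetical detections (perpendicular distances in metres) on a 1 km line
d_hat = halfnormal_density(
    [2.0, 5.0, 1.0, 8.0, 3.0, 6.0, 4.0, 2.5, 7.0, 1.5], 1000.0)
print(d_hat)  # animals per square metre
```

In practice the detection function is chosen and checked against the data (the Distance software fits several candidate forms); the half-normal case is shown only because its effective strip half-width has a closed form.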

  8. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F


    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...

  9. Numerical simulations of regolith sampling processes (United States)

    Schäfer, Christoph M.; Scherrer, Samuel; Buchwald, Robert; Maindl, Thomas I.; Speith, Roland; Kley, Wilhelm


    We present recent improvements in the simulation of regolith sampling processes in microgravity using the numerical particle method smoothed particle hydrodynamics (SPH). We use an elastic-plastic soil constitutive model to capture the dynamical behaviour of regolith, including large deformations and failure flows. In the context of projected small-body (asteroid or small moon) sample return missions, we investigate the efficiency and feasibility of a particular material sampling method: brushes sweep material from the asteroid's surface into a collecting tray. We analyze the influence of different regolith material parameters, such as cohesion and angle of internal friction, on the sampling rate. Furthermore, we study the sampling process in two environments by varying the surface gravity (Earth's and Phobos') and we apply different rotation rates for the brushes. We find good agreement between our sampling simulations on Earth and experiments, and provide estimates of the influence of the material properties on the collecting rate.

  10. Improvements in methods of analyzing dust concentrations, and influence of the storage processes on dust concentrations in polar snow and ice samples

    Directory of Open Access Journals (Sweden)

    Takayuki Miyake


    Full Text Available We sought to improve the analytical methods employed when operating a laser particle counter and to evaluate the influence of storage processes on dust concentrations in polar snow and ice samples. We corrected the particle size ranges and threshold voltages using a new calibration curve, confirmed the analytical precision and the blank dust concentrations of the wipers used in the clean room, and managed variations in the laser sensor's sensitivity by measuring standard particles. Among the two bottle types and nine cap-liner (packing) types tested for dust analysis, 15 ml glass screw bottles without packing yielded the lowest blank dust concentration. Storage of samples of the Dome Fuji ice core (Antarctica) in a refrigerator for 1 year resulted in just a 4% decrease in dust concentration, which is within the analytical precision of the laser particle counter. Storage in a freezer resulted in an increase in dust concentrations and a decrease in the proportion of particles larger than 0.98 μm in diameter, suggesting a change in dust particle size during storage and an influence of the storage-bottle materials. The addition of dispersants to the Antarctic snow samples is of doubtful suitability when dust concentrations are analyzed after storage by refrigeration or freezing.

  11. Dynamic Method for Identifying Collected Sample Mass (United States)

    Carson, John


    G-Sample is designed for sample collection missions to identify the presence and quantity of sample material gathered by spacecraft equipped with end effectors. The software method uses a maximum-likelihood estimator to identify the collected sample's mass based on onboard force-sensor measurements, thruster firings, and a dynamics model of the spacecraft. This makes sample mass identification a computation rather than a process requiring additional hardware. Simulation examples of G-Sample are provided for spacecraft model configurations with a sample collection device mounted on the end of an extended boom. In the absence of thrust knowledge errors, the results indicate that G-Sample can identify the amount of collected sample mass to within 10 grams (with 95-percent confidence) by using a force sensor with a noise and quantization floor of 50 micrometers. These results hold even in the presence of realistic parametric uncertainty in actual spacecraft inertia, center-of-mass offset, and first flexibility modes. Thrust profile knowledge is shown to be a dominant sensitivity for G-Sample, entering in a nearly one-to-one relationship with the final mass estimation error. This means thrust profiles should be well characterized with onboard accelerometers prior to sample collection. An overall sample-mass estimation error budget has been developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
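The core idea — treating the collected mass as the unknown parameter in a model linking commanded accelerations to measured forces — reduces, in a toy scalar setting, to a closed-form least-squares estimate. The sketch below is illustrative only (numbers are invented; the actual G-Sample estimator accounts for full spacecraft dynamics, flexibility and sensor quantization):

```python
import random

def estimate_mass(accels, forces):
    """Scalar least-squares estimate of m in F = m*a: the closed-form
    maximum-likelihood solution under Gaussian force-sensor noise."""
    num = sum(a * f for a, f in zip(accels, forces))
    den = sum(a * a for a in accels)
    return num / den

# Simulated thruster firings: known accelerations, noisy force readings
random.seed(1)
true_mass = 0.25  # kg of collected sample, assumed for the demo
accels = [random.uniform(0.5, 2.0) for _ in range(500)]             # m/s^2
forces = [true_mass * a + random.gauss(0.0, 0.01) for a in accels]  # N
print(round(estimate_mass(accels, forces), 3))
```

Note how the estimate degrades one-to-one with systematic errors in the assumed accelerations — the same sensitivity to thrust-profile knowledge that the abstract reports.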

  12. A new selective liquid membrane extraction method for the determination of basic herbicides in agro-processed fruit juices and Ethiopian honey wine (Tej) samples. (United States)

    Megersa, Negussie; Kassahun, Samuel


    Supported liquid membrane (SLM) extraction was optimised for trace extraction and enrichment of selected triazine herbicides from a variety of agro-processed fruit juices and Ethiopian honey wine (Tej) samples. In the extraction process, a 1:1 mixture of n-undecane and di-n-hexylether was immobilised in a thin porous PTFE membrane that forms a barrier between two aqueous phases (the donor and acceptor phases) in a flow system. The extracts, constituting the selectively enriched analytes collected from the acceptor phase, were analysed by transfer to an HPLC-UV system using a diode array detector at 235 nm. High enrichment factors were obtained with very good repeatability, and the detection limit was lower than 3.00 µg l⁻¹ for ametryn in apple juice. The optimised method showed very good linearity over 50-500 µg l⁻¹, with a correlation coefficient of 0.990 or better for triplicate analysis. All chromatograms gave well-resolved peaks with no interfering peaks at the retention times of the selected triazines, showing the high selectivity of SLM extraction in combination with HPLC-UV for the selected matrices. The optimised method can be used as an alternative solventless extraction method for microgram-level extraction of other triazine herbicides and a variety of pesticides from liquid and semi-liquid environmental, biological and food matrices.
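The reported linearity figure is the kind of result obtained by fitting a least-squares calibration line to standards and checking the Pearson correlation coefficient; a minimal sketch with invented peak-area data (not values from the paper):

```python
def calibration_fit(conc, response):
    """Least-squares calibration line response = slope*conc + intercept,
    plus the Pearson correlation coefficient r used to judge linearity."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(response) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, response))
    sxx = sum((x - mx) ** 2 for x in conc)
    syy = sum((y - my) ** 2 for y in response)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r

# hypothetical peak areas for standards across the 50-500 ug/l range
conc = [50.0, 100.0, 200.0, 300.0, 400.0, 500.0]
area = [10.1, 20.3, 39.8, 60.5, 79.9, 100.2]
slope, intercept, r = calibration_fit(conc, area)
print(r > 0.990)
```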

  13. Towards Cost-efficient Sampling Methods

    CERN Document Server

    Peng, Luo; Chong, Wu


    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper presents two new sampling methods based on the observation that a small set of high-degree vertices can carry most of the structural information of a network. Both proposed methods are efficient at sampling high-degree nodes. The first improves on stratified random sampling: by classifying nodes according to the degree distribution, it selects high-degree nodes with higher probability. The second improves on snowball sampling so that targeted nodes can be sampled selectively at every sampling step. Moreover, the two proposed methods sample not only the nodes but also the edges directly connected to them. To demonstrate the two methods' applicability and accuracy, we compare them with existing sampling methods in...
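The premise that a few high-degree vertices carry most of a network's structural information can be illustrated with a generic degree-proportional node sampler (a sketch of the general idea, not the authors' two algorithms; the toy graph is invented):

```python
import random

def degree_proportional_sample(adjacency, k, seed=0):
    """Draw k distinct nodes with selection probability proportional to
    degree, and collect the edges incident to the chosen nodes."""
    rng = random.Random(seed)
    nodes = sorted(adjacency)
    weights = [len(adjacency[n]) for n in nodes]
    chosen = set()
    while len(chosen) < k:
        chosen.add(rng.choices(nodes, weights=weights)[0])
    edges = {tuple(sorted((n, m))) for n in chosen for m in adjacency[n]}
    return chosen, edges

# toy star graph: 'hub' has degree 5, every leaf degree 1
adj = {"hub": ["a", "b", "c", "d", "e"],
       "a": ["hub"], "b": ["hub"], "c": ["hub"],
       "d": ["hub"], "e": ["hub"]}
nodes, edges = degree_proportional_sample(adj, 2)
print(sorted(nodes))
```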

  14. Sampling of temporal networks: Methods and biases (United States)

    Rocha, Luis E. C.; Masuda, Naoki; Holme, Petter


    Temporal networks have been increasingly used to model a diversity of systems that evolve in time; for example, human contact structures over which dynamic processes such as epidemics take place. A fundamental aspect of real-life networks is that they are sampled within temporal and spatial frames. Furthermore, one might wish to subsample networks to reduce their size for better visualization or to perform computationally intensive simulations. The sampling method may affect the network structure and thus caution is necessary to generalize results based on samples. In this paper, we study four sampling strategies applied to a variety of real-life temporal networks. We quantify the biases generated by each sampling strategy on a number of relevant statistics such as link activity, temporal paths and epidemic spread. We find that some biases are common in a variety of networks and statistics, but one strategy, uniform sampling of nodes, shows improved performance in most scenarios. Given the particularities of temporal network data and the variety of network structures, we recommend that the choice of sampling methods be problem oriented to minimize the potential biases for the specific research questions on hand. Our results help researchers to better design network data collection protocols and to understand the limitations of sampled temporal network data.
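Uniform node sampling, the strategy the authors found most robust, amounts to retaining a random fraction of nodes and keeping only the timed contacts whose endpoints both survive; a minimal sketch on a toy event list (node names and events invented):

```python
import random

def uniform_node_subsample(events, fraction, seed=42):
    """Keep a uniform random fraction of nodes; retain only the timed
    contacts (t, u, v) in which both endpoints survive."""
    nodes = sorted({u for _, u, _ in events} | {v for _, _, v in events})
    rng = random.Random(seed)
    k = max(1, int(fraction * len(nodes)))
    kept = set(rng.sample(nodes, k))
    return [(t, u, v) for t, u, v in events if u in kept and v in kept]

events = [(0, "a", "b"), (1, "b", "c"), (2, "c", "d"), (3, "a", "d")]
sub = uniform_node_subsample(events, 0.5)
print(sub)
```

Statistics such as link activity or epidemic size computed on `sub` can then be compared against the full event list to quantify the sampling bias.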

  15. Radar Doppler Processing with Nonuniform Sampling.

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)


    Conventional signal processing to estimate radar Doppler frequency often assumes uniform pulse/sample spacing. This is for the convenience of the processing. More recent enhancements in processor capability allow optimal processing of nonuniform pulse/sample spacing, thereby overcoming some of the baggage that attends uniform sampling, such as Doppler ambiguity and SNR losses due to sidelobe control measures.
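One common way to handle nonuniform pulse spacing is to evaluate the DFT sum at the actual sample times and search for the peak response (closely related to the Lomb periodogram). The sketch below illustrates the idea on synthetic data; it is not Sandia's processing chain, and the tone frequency and pulse timing are invented:

```python
import cmath
import math
import random

def doppler_peak(times, samples, freq_grid):
    """Evaluate the DFT sum at the actual (nonuniform) sample times and
    return the frequency with the largest magnitude response."""
    def power(f):
        return abs(sum(x * cmath.exp(-2j * math.pi * f * t)
                       for t, x in zip(times, samples)))
    return max(freq_grid, key=power)

# nonuniformly spaced pulses over 1 s, noiseless tone at 37 Hz (assumed)
random.seed(0)
f_true = 37.0
times = sorted(random.uniform(0.0, 1.0) for _ in range(400))
samples = [math.cos(2.0 * math.pi * f_true * t) for t in times]
grid = [f / 10.0 for f in range(0, 1001)]  # 0 to 100 Hz, 0.1 Hz steps
print(doppler_peak(times, samples, grid))
```

Because the sample times are irregular, the spectrum has no fixed aliasing period — one of the Doppler-ambiguity benefits the abstract mentions.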

  16. Evaluation of sample processing methods for the polar contaminant analysis of sewage sludge using liquid chromatography - mass spectrometry (LC/MS

    Directory of Open Access Journals (Sweden)

    Rênnio F. de Sena


    Full Text Available Monitoring of sewage sludge has confirmed the presence of many polar anthropogenic pollutants since LC/MS techniques came into routine use. While advanced techniques may improve characterizations, flawed sample processing procedures may disturb or disguise the presence and fate of many target compounds in this type of complex matrix before the analytical process even starts. Freeze-drying and oven-drying, in combination with centrifugation or filtration, were evaluated as sample processing techniques, followed by visual pattern recognition of target compounds to assess the pretreatment processes. The results showed that oven-drying affected the sludge characterization, while freeze-drying led to fewer analytical misinterpretations.

  17. Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris (United States)

    Michael S. Williams; Jeffrey H. Gove


    Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...
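Unequal-probability designs of this kind are usually analyzed with the Horvitz-Thompson estimator, which weights each sampled log's volume by the inverse of its inclusion probability; a generic sketch with invented numbers (not the PDS-specific estimator):

```python
def horvitz_thompson_total(values, inclusion_probs):
    """Design-unbiased estimate of a population total under
    unequal-probability sampling: sum of value / inclusion probability."""
    return sum(v / p for v, p in zip(values, inclusion_probs))

# three sampled logs (volumes in m^3, invented) and their inclusion probs
est = horvitz_thompson_total([0.8, 1.5, 0.3], [0.2, 0.5, 0.1])
print(est)
```

The appeal of perpendicular distance sampling is precisely that it makes the inclusion probability proportional to log volume, so volume totals can be estimated without measuring every selected log.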

  18. (3) Simple processing method

    African Journals Online (AJOL)

    Adeyinka Odunsi

    Simple Processing Method for Recycling Poultry Waste into. Animal Feed Ingredient. *Komolafe, A. A. and Sonaiya, E. B. ... recycled and become consumables to livestock, thus entering the human food chain. Poultry waste is not ... on the concrete roof (20.5 m high) of the Faculty of Agriculture, Obafemi. Awolowo University ...

  19. A method of language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik; Hengeveld, Kees


    created with this method will reflect optimally the diversity of the languages of the world. On the basis of the internal structure of each genetic language tree a measure is computed that reflects the linguistic diversity in the language families represented by these trees. This measure is used...

  20. One single, fast and robust capillary electrophoresis method for the direct quantification of intact adenovirus particles in upstream and downstream processing samples

    NARCIS (Netherlands)

    van Tricht, Ewoud; Geurink, Lars; Backus, Harold; Germano, Marta; Somsen, Govert W.; Sänger–van de Griend, Cari E.


    During development of adenovirus-based vaccines, samples have to be analyzed in order to either monitor the production process or control the quality and safety of the product. An important quality attribute is the total concentration of intact adenoviruses, which currently is determined by

  1. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim


    Process sampling of moving streams of particulate matter, fluids and slurries (over time or space) or stationary one-dimensional (1-D) lots is often carried out according to existing tradition or protocol not taking the theory of sampling (TOS) into account. In many situations, sampling errors...

  2. New prior sampling methods for nested sampling - Development and testing (United States)

    Stokes, Barrie; Tuyl, Frank; Hudson, Irene


    Nested Sampling is a powerful algorithm for fitting models to data in the Bayesian setting, introduced by Skilling [1]. The nested sampling algorithm proceeds by carrying out a series of compressive steps, involving successively nested iso-likelihood boundaries, starting with the full prior distribution of the problem parameters. The "central problem" of nested sampling is to draw at each step a sample from the prior distribution whose likelihood is greater than the current likelihood threshold, i.e., a sample falling inside the current likelihood-restricted region. For both flat and informative priors this ultimately requires uniform sampling restricted to the likelihood-restricted region. We present two new methods of carrying out this sampling step, and illustrate their use with the lighthouse problem [2], a bivariate likelihood used by Gregory [3] and a trivariate Gaussian mixture likelihood. All the algorithm development and testing reported here has been done with Mathematica® [4].
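A minimal nested sampler on a toy problem makes the compressive scheme concrete; here the likelihood-restricted prior draw — the "central problem" described above — is done by naive rejection sampling, which is only viable for cheap, low-dimensional likelihoods (a generic sketch of Skilling's scheme, not the authors' new prior-sampling methods):

```python
import math
import random

def _logaddexp(a, b):
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling(loglike, prior_draw, n_live=100, n_iter=300, seed=0):
    """Minimal nested sampler: shrink the prior mass by ~exp(-1/n_live)
    per step, accumulating evidence Z = sum(width_i * L_i)."""
    rng = random.Random(seed)
    live = [prior_draw(rng) for _ in range(n_live)]
    live_ll = [loglike(x) for x in live]
    log_z = -math.inf
    log_w0 = math.log(1.0 - math.exp(-1.0 / n_live))
    for i in range(n_iter):
        worst = min(range(n_live), key=lambda j: live_ll[j])
        l_min = live_ll[worst]
        log_z = _logaddexp(log_z, log_w0 - i / n_live + l_min)
        while True:              # draw from the prior subject to L > l_min
            x = prior_draw(rng)
            lx = loglike(x)
            if lx > l_min:
                break
        live[worst], live_ll[worst] = x, lx
    log_x = -n_iter / n_live     # remaining prior mass: final live points
    for ll in live_ll:
        log_z = _logaddexp(log_z, log_x - math.log(n_live) + ll)
    return log_z

# Toy check: uniform prior on (0, 1], L(x) = 2x, so Z = integral 2x dx = 1
log_z = nested_sampling(lambda x: math.log(2.0 * x),
                        lambda rng: 1.0 - rng.random())
print(log_z)
```

The rejection loop is exactly the step the paper's new prior-sampling methods aim to replace: its acceptance rate decays as the likelihood-restricted region shrinks.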

  3. Quality evaluation of processed clay soil samples. (United States)

    Steiner-Asiedu, Matilda; Harrison, Obed Akwaa; Vuvor, Frederick; Tano-Debrah, Kwaku


    This study assessed the microbial quality of clay samples sold in two of the major Ghanaian markets. It was a cross-sectional assessment of processed clay and the effects its consumption has on the nutrition of consumers in the capital of Ghana. The items examined were processed clay soil samples. Staphylococcus spp. and fecal coliforms, including Klebsiella, Escherichia, Shigella and Enterobacter spp., were isolated from the clay samples. Samples from the Kaneshie market in Accra recorded the highest total viable count (6.5 log cfu/g) and staphylococcal count (5.8 log cfu/g). For fecal coliforms, Madina market samples had the highest count (6.5 log cfu/g) and also recorded the highest levels of yeasts and moulds. For Koforidua, the total viable count was highest in samples from the Zongo market (6.3 log cfu/g). Central market samples had the highest counts of fecal coliforms (4.6 log cfu/g) and yeasts and moulds (6.5 log cfu/g). The "Small" market recorded the highest staphylococcal count (6.2 log cfu/g). The water activity of the clay samples was low, ranging between 0.65±0.01 and 0.66±0.00 for samples collected from Koforidua and Accra, respectively. The clay samples were found to contain Klebsiella, Escherichia, Enterobacter and Shigella spp., Staphylococcus spp., and yeasts and moulds. These have health implications when the clay is consumed.

  4. Degradation process of lead chromate in paintings by Vincent van Gogh studied by means of synchrotron X-ray spectromicroscopy and related methods. 2. Original paint layer samples. (United States)

    Monico, Letizia; Van der Snickt, Geert; Janssens, Koen; De Nolf, Wout; Miliani, Costanza; Dik, Joris; Radepont, Marie; Hendriks, Ella; Geldof, Muriel; Cotte, Marine


    The darkening of the original yellow areas painted with the chrome yellow pigment (PbCrO(4), PbCrO(4)·xPbSO(4), or PbCrO(4)·xPbO) is a phenomenon widely observed on several paintings by Vincent van Gogh, such as the famous different versions of Sunflowers. During our previous investigations on artificially aged model samples of lead chromate, we established for the first time that darkening of chrome yellow is caused by reduction of PbCrO(4) to Cr(2)O(3)·2H(2)O (viridian green), likely accompanied by the presence of another Cr(III) compound, such as either Cr(2)(SO(4))(3)·H(2)O or (CH(3)CO(2))(7)Cr(3)(OH)(2) [chromium(III) acetate hydroxide]. In the second part of this work, in order to demonstrate that this reduction phenomenon effectively takes place in real paintings, we study original paint samples from two paintings of V. van Gogh. As with the model samples, in view of the thin superficial alteration layers that are present, high lateral resolution spectroscopic methods that make use of synchrotron radiation (SR), such as microscopic X-ray absorption near edge (μ-XANES) and X-ray fluorescence spectrometry (μ-XRF) were employed. Additionally, μ-Raman and mid-FTIR analyses were carried out to completely characterize the samples. On both paint microsamples, the local presence of reduced Cr was demonstrated by means of μ-XANES point measurements. The presence of Cr(III) was revealed in specific areas, in some cases correlated to the presence of Ba(sulfate) and/or to that of aluminum silicate compounds.

  5. Sample normalization methods in quantitative metabolomics. (United States)

    Wu, Yiman; Li, Liang


    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis, including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
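The simplest normalization of this kind, total-sum normalization, scales each sample's metabolite intensities by that sample's total signal; a minimal sketch with invented intensities (one approach among the several the review compares):

```python
def total_sum_normalize(profiles):
    """Scale each sample's metabolite intensities by the sample's total
    signal so that samples loaded at different total amounts become
    comparable."""
    normalized = []
    for sample in profiles:
        total = sum(sample)
        normalized.append([x / total for x in sample])
    return normalized

# two hypothetical samples, the second injected at twice the total amount
result = total_sum_normalize([[10.0, 30.0, 60.0], [20.0, 60.0, 120.0]])
print(result)
```

After normalization the two profiles coincide, showing that the two-fold difference in total amount no longer masks the (absent) biological difference.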

  6. Fluidics platform and method for sample preparation (United States)

    Benner, Henry W.; Dzenitis, John M.


    Provided herein are fluidics platforms and related methods for performing integrated sample collection and solid-phase extraction of a target component of the sample all in one tube. The fluidics platform comprises a pump, particles for solid-phase extraction and a particle-holding means. The method comprises contacting the sample with one or more reagents in a pump, coupling a particle-holding means to the pump and expelling the waste out of the pump while the particle-holding means retains the particles inside the pump. The fluidics platform and methods herein described allow solid-phase extraction without pipetting and centrifugation.

  7. Hyperspectral image processing methods (United States)

    Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...

  8. Influence of diluent and sample processing methods on the recovery of the biocontrol agent Pantoea agglomerans CPA-2 from different fruit surfaces. (United States)

    Torres, R; Viñas, I; Usall, J; Remón, D; Teixidó, N


    Determining the populations of biocontrol agents applied as a postharvest treatment on fruit surfaces is fundamental to the assessment of the microorganisms' ability to colonise and persist on fruit. To obtain maximum recovery, we must develop a methodology that involves both diluent and processing methods and that does not affect the viability of the microorganisms. The effect of diluent composition was evaluated using three diluents: phosphate buffer, peptone saline and buffered peptone saline. An additional study was performed to compare three processing methods (shaking plus sonication, stomaching and shaking plus centrifugation) on the recovery efficiency of Pantoea agglomerans strain CPA-2 from apples, oranges, nectarines and peaches treated with this biocontrol agent. Overall, slight differences occurred among diluents, although the phosphate buffer maintained the most ideal pH for CPA-2 growth (between 5.2 and 6.2). Stomaching, using the phosphate buffer as diluent, was the best procedure for recovering and enumerating the biocontrol agent; this fact suggested that no lethal effects from naturally occurring antimicrobial compounds present on the fruit skins and/or produced when the tissues were disrupted affected the recovery of the CPA-2 cells, regardless of fruit type. The growth pattern of CPA-2 on fruits maintained at 20°C and under cold conditions was similar to that obtained in previous studies, which confirms the excellent adaptation of this strain to conditions commonly used for fruit storage. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Intelligent adaptive sampling guided by Gaussian process inference (United States)

    Chen, Yuhang; Peng, Chaoyang


    With the aim of reducing sampling density while having minimal impact on surface reconstruction accuracy, an adaptive sampling method based on Gaussian process inference is proposed. In each iterative step, the current sampling points serve as the training data to predict surface topography and then a new sampling point is adaptively located and inserted at the position where the maximum inference uncertainty is estimated. The updated samples are trained in the next step. By such an iterative training-inference-sampling approach, the reconstructed topography can converge to the expected one efficiently. Demonstrations on different structured, freeform and roughness surfaces ascertain the effectiveness of the sampling strategy. It can lead to an accurate inference of the surface topography and a sufficient reduction of data points compared with conventional uniform sampling. Robustness against random surface features, measurement noise and sharp height changes is further discussed. Such an adaptive sampling method is extremely suitable for discrete point-by-point measurements.
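The insertion rule — place the next point where the GP's predictive variance is largest — can be sketched with a tiny pure-Python GP (RBF kernel with a hypothetical length-scale; a real implementation would use a GP library and also update the mean prediction):

```python
import math

def rbf(a, b, length=0.2):
    """Squared-exponential kernel; the length-scale is an assumed value."""
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting, for small systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k]
                              for k in range(r + 1, n))) / M[r][r]
    return x

def next_sample_point(xs, candidates, noise=1e-6):
    """Return the candidate with maximum GP predictive variance,
    var(c) = k(c,c) - k^T K^{-1} k  -- the most uncertain location."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    def variance(c):
        k = [rbf(c, a) for a in xs]
        v = solve(K, k)
        return rbf(c, c) - sum(ki * vi for ki, vi in zip(k, v))
    return max(candidates, key=variance)

xs = [0.0, 0.5, 1.0]                      # points measured so far
cands = [i / 100.0 for i in range(101)]   # candidate grid on [0, 1]
print(next_sample_point(xs, cands))       # lands midway between samples
```

Iterating this rule — measure, retrain, insert at the maximum-uncertainty point — is the training-inference-sampling loop the abstract describes.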

  10. Validation Process Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); English, Christine M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gesick, Joshua C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukkamala, Saikrishna [National Renewable Energy Lab. (NREL), Golden, CO (United States)


    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  11. Method and apparatus for sampling atmospheric mercury (United States)

    Trujillo, Patricio E.; Campbell, Evan E.; Eutsler, Bernard C.


    A method of simultaneously sampling particulate mercury, organic mercurial vapors, and metallic mercury vapor in the working and occupational environment and determining the amount of mercury derived from each such source in the sampled air. A known volume of air is passed through a sampling tube containing a filter for particulate mercury collection, a first adsorber for the selective adsorption of organic mercurial vapors, and a second adsorber for the adsorption of metallic mercury vapor. Carbon black molecular sieves are particularly useful as the selective adsorber for organic mercurial vapors. The amount of mercury adsorbed or collected in each section of the sampling tube is readily quantitatively determined by flameless atomic absorption spectrophotometry.

  12. New adaptive sampling method in particle image velocimetry (United States)

    Yu, Kaikai; Xu, Jinglei; Tang, Lan; Mo, Jianwei


    This study proposes a new adaptive method to enable the number of interrogation windows and their positions in a particle image velocimetry (PIV) image interrogation algorithm to become self-adapted according to the seeding density. The proposed method can relax the constraint of uniform sampling rate and uniform window size commonly adopted in the traditional PIV algorithm. In addition, the positions of the sampling points are redistributed on the basis of the spring force generated by the sampling points. The advantages include control of the number of interrogation windows according to the local seeding density and smoother distribution of sampling points. The reliability of the adaptive sampling method is illustrated by processing synthetic and experimental images. The synthetic example attests to the advantages of the sampling method. Compared with that of the uniform interrogation technique in the experimental application, the spatial resolution is locally enhanced when using the proposed sampling method.

  13. Sampling Methods in Cardiovascular Nursing Research: An Overview. (United States)

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie


    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study; they include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences between sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
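The probability designs named above differ only in how indices are drawn from the sampling frame; a short sketch of three of them using Python's standard library (the population and strata are invented):

```python
import random

def simple_random_sample(population, n, rng):
    """Each element has an equal, independent chance of selection."""
    return rng.sample(population, n)

def systematic_sample(population, n, rng):
    """Random start, then every k-th element."""
    k = len(population) // n
    start = rng.randrange(k)
    return population[start::k][:n]

def stratified_sample(strata, n_per_stratum, rng):
    """Independent simple random samples drawn within each stratum."""
    out = []
    for members in strata.values():
        out.extend(rng.sample(members, n_per_stratum))
    return out

rng = random.Random(7)
patients = list(range(100))  # invented sampling frame of patient IDs
print(len(simple_random_sample(patients, 10, rng)),
      len(systematic_sample(patients, 10, rng)),
      len(stratified_sample({"ward_a": patients[:50],
                             "ward_b": patients[50:]}, 5, rng)))
```

Cluster and multi-stage sampling follow the same pattern, with whole groups (e.g. clinics) drawn first and individuals drawn within the selected groups.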


    Directory of Open Access Journals (Sweden)

    Adaleta Perković


    Full Text Available Everyday procedures carried out in a petrophysical laboratory can be described as a complete cycle of business processes. The sample handling process is one of the most significant and demanding of these. It starts with sample receipt in the laboratory; subsequently, a series of analyses and measurements is carried out, yielding petrophysical parameters. The sample handling process ends with sample storage and the archiving of the obtained measurement data. A process model is used to describe these repeating activities. The sample handling process is presented graphically using an eEPC (extended Event-Driven Process Chain) diagram, which describes the process in terms of events. The resulting process model binds together static laboratory resources (measuring instruments, computers and data), speeds up the process by increasing user efficiency, and improves data and information exchange. Besides the flow of activities, the sample handling model includes information about the system components (laboratory equipment and software applications) that carry out the activities. The described model, with minor modifications and adaptations, can be used in any laboratory that deals with samples (the paper is published in Croatian).

  15. Oral processing of two milk chocolate samples. (United States)

    Carvalho-da-Silva, Ana Margarida; Van Damme, Isabella; Taylor, Will; Hort, Joanne; Wolf, Bettina


    Oral processing of two milk chocolates, identical in composition and viscosity, was investigated to understand the textural behaviour. Previous studies had shown differences in mouthcoating and related attributes such as time of clearance from the oral cavity to be most discriminating between the samples. Properties of panellists' saliva, with regard to protein concentration and profile before and after eating the two chocolates, were included in the analysis but did not reveal any correlation with texture perception. The microstructure of the chocolate samples following oral processing, which resembled an emulsion as the chocolate phase inverts in-mouth, was clearly different and the sample that was found to be more mouthcoating appeared less flocculated after 20 chews. The differences in flocculation behaviour were mirrored in the volume based particle size distributions acquired with a laser diffraction particle size analyser. The less mouthcoating and more flocculated sample showed a clear bimodal size distribution with peaks at around 40 and 500 μm, for 10 and 20 chews, compared to a smaller and then diminishing second peak for the other sample following 10 and 20 chews, respectively. The corresponding mean particle diameters after 20 chews were 184 ± 23 and 141 ± 10 μm for the less and more mouthcoating samples, respectively. Also, more of the mouthcoating sample had melted after both 10 and 20 chews (80 ± 8% compared to 72 ± 10% for 20 chews). Finally, the friction behaviour between a soft and hard surface (elastopolymer/steel) and at in-mouth temperature was investigated using a commercial tribology attachment on a rotational rheometer. Complex material behaviour was revealed. Observations included an unusual increase in friction coefficient at very low sliding speeds, initially overlapping for both samples, to a threefold higher value for the more mouthcoating sample. This was followed by a commonly observed decrease in friction coefficient with

  16. Sampling methods for terrestrial amphibians and reptiles. (United States)

    Paul Stephen Corn; R. Bruce. Bury


    Methods described for sampling amphibians and reptiles in Douglas-fir forests in the Pacific Northwest include pitfall trapping, time-constrained collecting, and surveys of coarse woody debris. The herpetofauna of this region differ in breeding and nonbreeding habitats and vagility, so that no single technique is sufficient for a community study. A combination of...

  17. Non-Contact Conductivity Measurement for Automated Sample Processing Systems (United States)

    Beegle, Luther W.; Kirby, James P.


    A new method has been developed for monitoring and control of automated sample processing and preparation, focusing especially on desalting of samples before chemical analysis (described in more detail in Automated Desalting Apparatus, (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of sample preparation and processing, but will also help optimize operational time and the use of consumables.
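    The trigger logic described above (flush to a low-conductivity baseline, then advance when the probes confirm the expected state) can be sketched as a small state machine. Everything below, including the step names and the thresholds in μS/cm, is invented for illustration and is not taken from the NASA system.

```python
# Hypothetical sketch of conductivity-triggered step sequencing for an
# automated desalting protocol. Step names and thresholds are illustrative.

LOW_THRESHOLD = 5.0     # uS/cm: system considered flushed/neutral below this
HIGH_THRESHOLD = 500.0  # uS/cm: strong acid or base present above this

def next_step(current_step, inlet_us_cm, outlet_us_cm):
    """Advance the protocol only when the probes confirm the expected state."""
    if current_step == "flush":
        # Flushing with de-ionized water: wait until the outlet is also clean.
        if inlet_us_cm < LOW_THRESHOLD and outlet_us_cm < LOW_THRESHOLD:
            return "acidify"
    elif current_step == "acidify":
        # Strong acid wash: high conductivity must reach the outlet probe.
        if outlet_us_cm > HIGH_THRESHOLD:
            return "flush_after_acid"
    elif current_step == "flush_after_acid":
        if inlet_us_cm < LOW_THRESHOLD and outlet_us_cm < LOW_THRESHOLD:
            return "basify"
    return current_step  # condition not met yet: keep flushing/washing
```

    Because each transition waits for the outlet probe, no step timer needs to be tuned, which is what lets the protocol minimize flush water and other consumables.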

  18. Static sampling of dynamic processes - a paradox? (United States)

    Mälicke, Mirko; Neuper, Malte; Jackisch, Conrad; Hassler, Sibylle; Zehe, Erwin


    Environmental systems monitoring aims at its core at the detection of spatio-temporal patterns of processes and system states, which is a pre-requisite for understanding and explaining their baffling heterogeneity. Most observation networks rely on distributed point sampling of states and fluxes of interest, combined with proxy-variables from either remote sensing or near-surface geophysics. The cardinal question of the appropriate experimental design of such a monitoring network has up to now been answered in many different ways. Suggested approaches range from sampling in a dense regular grid (using, for instance, the so-called 'green machine'), transects along typical catenas, clustering of several observation sensors in presumed functional units or HRUs, and arrangements of those clusters along presumed lateral flow paths, to, last but not least, nested, randomized stratified arrangements of sensors or samples. Common to all these approaches is that they provide a rather static spatial sampling, while state variables and their spatial covariance structure change dynamically in time. It is hence of key interest how much of our still incomplete understanding stems from inappropriate sampling and how much needs to be attributed to an inappropriate analysis of spatial data sets. We suggest that it is much more promising to analyze the spatial variability of processes, for instance changes in soil moisture values, than to investigate the spatial variability of soil moisture states themselves. This is because wetting of the soil, reflected in a soil moisture increase, is caused by a totally different meteorological driver - rainfall - than drying of the soil. We hence propose that the rising and falling limbs of soil moisture time series belong essentially to different ensembles, as they are influenced by different drivers. Positive and negative temporal changes in soil moisture need, hence, to be analyzed separately. We test this idea using the CAOS data set as a benchmark.
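    The proposal to treat rises and falls as separate ensembles can be illustrated in a few lines: split a soil moisture series into its positive and negative increments and summarize each set on its own. The series and function name below are hypothetical, not taken from the CAOS data set.

```python
# Illustrative split of a soil moisture time series into rising limbs
# (rainfall-driven) and falling limbs (drying-driven), analyzed separately.
from statistics import mean

def split_limbs(series):
    """Return (rises, falls): positive and negative step changes."""
    deltas = [b - a for a, b in zip(series, series[1:])]
    rises = [d for d in deltas if d > 0]   # driven by rainfall events
    falls = [d for d in deltas if d < 0]   # driven by evapotranspiration/drainage
    return rises, falls

# hypothetical volumetric soil moisture readings (m^3/m^3)
theta = [0.20, 0.21, 0.35, 0.33, 0.31, 0.30, 0.42, 0.40, 0.39]
rises, falls = split_limbs(theta)
mean_rise, mean_fall = mean(rises), mean(falls)  # summarized per ensemble
```

    Spatial covariance structure would then be estimated separately for the two ensembles rather than for the pooled moisture states.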

  19. A cryopreservation method for Pasteurella multocida from wetland samples (United States)

    Moore, Melody K.; Shadduck, D.J.; Goldberg, D.R.; Samuel, M.D.


    A cryopreservation method and improved isolation techniques for detection of Pasteurella multocida from wetland samples were developed. Wetland water samples were collected in the field, diluted in dimethyl sulfoxide (DMSO, final concentration 10%), and frozen at -180 C in a liquid nitrogen vapor shipper. Frozen samples were transported to the laboratory where they were subsequently thawed and processed in Pasteurella multocida selective broth (PMSB) to isolate P. multocida. This method allowed for consistent isolation of 2 to 18 organisms/ml from water seeded with known concentrations of P. multocida. The method compared favorably with the standard mouse inoculation method and allowed for preservation of the samples until they could be processed in the laboratory.

  20. Methods of Analysis by the U.S. Geological Survey National Water Quality Laboratory - Processing, Taxonomy, and Quality Control of Benthic Macroinvertebrate Samples (United States)


    Only a portion of colonial organisms, such as Bryozoa or Porifera, is sorted to document its presence in the sample. Vertebrates, exuviae... invertebrate eggs, microcrustaceans, and terrestrial organisms are not sorted. However, terrestrial insects that have an aquatic lifestage (for example...taxonomic principles and having a broad knowledge of all aquatic macroinvertebrate groups. Typically dichotomous keys are used to identify ...

  1. Metrology Sampling Strategies for Process Monitoring Applications

    KAUST Repository

    Vincent, Tyrone L.


    Shrinking process windows in very large scale integration semiconductor manufacturing have already necessitated the development of control systems capable of addressing sub-lot-level variation. Within-wafer control is the next milestone in the evolution of advanced process control from lot-based and wafer-based control. In order to adequately comprehend and control within-wafer spatial variation, inline measurements must be performed at multiple locations across the wafer. At the same time, economic pressures prompt a reduction in metrology, for both capital and cycle-time reasons. This paper explores the use of modeling and minimum-variance prediction as a method to select the sites for measurement on each wafer. The models are developed using the standard statistical tools of principal component analysis and canonical correlation analysis. The proposed selection method is validated using real manufacturing data, and results indicate that it is possible to significantly reduce the number of measurements with little loss in the information obtained for the process control systems. © 2011 IEEE.
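    As a rough illustration of site selection from historical data, the greedy sketch below picks the k sites whose past measurements correlate best, on average, with the remaining sites. This is a drastic simplification of the principal component / minimum-variance machinery in the paper, and all data and names are invented.

```python
# Toy greedy measurement-site selection from historical full-wafer maps.
# maps[s] is the list of historical measurements at site s.
from statistics import mean

def corr(x, y):
    """Pearson correlation of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def select_sites(maps, k):
    """Greedily pick k sites that best track the sites not yet chosen."""
    n = len(maps)
    chosen = []
    while len(chosen) < k:
        best, best_score = None, -1.0
        for s in range(n):
            if s in chosen:
                continue
            others = [t for t in range(n) if t != s and t not in chosen]
            # score: how well site s predicts the still-unchosen sites
            score = mean(abs(corr(maps[s], maps[t])) for t in others)
            if score > best_score:
                best, best_score = s, score
        chosen.append(best)
    return chosen
```

    A minimum-variance formulation would instead fit a predictive model (e.g. via PCA) and pick sites that minimize the prediction variance at the unmeasured locations; the greedy structure is the same.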

  2. Method for using polarization gating to measure a scattering sample (United States)

    Baba, Justin S.


    Described herein are systems, devices, and methods facilitating optical characterization of scattering samples. A polarized optical beam can be directed to pass through a sample to be tested. The optical beam exiting the sample can then be analyzed to determine its degree of polarization, from which other properties of the sample can be determined. In some cases, an apparatus can include a source of an optical beam, an input polarizer, a sample, an output polarizer, and a photodetector. In some cases, a signal from a photodetector can be processed through attenuation, variable offset, and variable gain.

  3. Present status of processing method

    Energy Technology Data Exchange (ETDEWEB)

    Kosako, Kazuaki [Sumitomo Atomic Energy Industries Ltd., Tokyo (Japan)


    The present status of processing methods for a high-energy nuclear data file was examined. The NJOY94 code is the only one available for the processing. In Japan, present processing using NJOY94 is oriented toward the production of traditional cross-section libraries, because a high-energy transport code that would use a high-energy cross-section library is not yet clearly established. (author)

  4. Applications of non-uniform sampling and processing. (United States)

    Hyberts, Sven G; Arthanari, Haribabu; Wagner, Gerhard


    Modern high-field NMR instruments provide unprecedented resolution. To make use of this resolving power in multidimensional NMR experiments, standard linear sampling through the indirect dimensions to the maximum optimal evolution times (~1.2 T2) is not practical because it would require extremely long measurement times. Thus, alternative sampling methods have been proposed during the past 20 years. Originally, random nonlinear sampling with an exponentially decreasing sampling density was suggested, and data were transformed with a maximum entropy algorithm (Barna et al., J Magn Reson 73:69-77, 1987). Numerous other procedures have been proposed in the meantime. It has become obvious that the quality of spectra depends crucially on the sampling schedules and the algorithms of data reconstruction. Here we use the forward maximum entropy (FM) reconstruction method to evaluate several alternative sampling schedules. At the current stage, multidimensional NMR spectra that do not have a serious dynamic-range problem, such as triple-resonance experiments used for sequential assignments, are readily recorded and faithfully reconstructed using non-uniform sampling. Thus, these experiments can all be recorded non-uniformly to utilize the power of modern instruments. On the other hand, for spectra with a large dynamic range, such as 3D and 4D NOESYs, choosing optimal sampling schedules and the best reconstruction method is crucial if one wants to recover very weak peaks. Thus, this chapter is focused on selecting the best sampling schedules and processing methods for high-dynamic-range spectra.
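    A minimal sketch of generating an exponentially weighted non-uniform sampling (NUS) schedule of the kind described above: early increments of the indirect dimension are sampled densely, late ones sparsely. All parameters here are illustrative; real schedules are tuned per experiment and spectrometer.

```python
# Sketch: pick n_samples of n_points indirect-dimension increments,
# weighted by exp(-t / T2_eff), without replacement.
import math
import random

def exponential_nus_schedule(n_points, n_samples, t2_fraction=0.33, seed=7):
    """Weighted draw of increment indices; density decays like exp(-t/T2_eff)."""
    rng = random.Random(seed)
    weights = [math.exp(-i / (t2_fraction * n_points)) for i in range(n_points)]
    chosen = set()
    while len(chosen) < n_samples:
        # weighted draw over the not-yet-chosen increments
        r = rng.random() * sum(w for i, w in enumerate(weights) if i not in chosen)
        acc = 0.0
        for i, w in enumerate(weights):
            if i in chosen:
                continue
            acc += w
            if acc >= r:
                chosen.add(i)
                break
    return sorted(chosen)

schedule = exponential_nus_schedule(256, 64)  # 25% sampling of 256 increments
```

    The resulting index list would then drive the pulse program's increment loop; reconstruction (maximum entropy, FM, etc.) fills in the skipped increments.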

  5. Sample to sample fluctuations in fragmentation and agglomeration processes

    CERN Document Server

    Olla, P


    The fluctuations in the particle size distribution for processes of fragmentation and aggregation are studied for stationary-state regimes. The system is described in terms of a stochastic process over an adequate tree structure. The RMS fluctuations appear to scale with the square root of the mean distribution, as in the case of sums of statistically independent events. Implications for the applicability of a mean-field description to fragmentation and agglomeration processes, and a possible relation with intermittency phenomena in three-dimensional turbulence, are discussed.
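    The square-root scaling can be checked with a toy model: for Poisson-distributed counts of independent events, the RMS fluctuation across samples divided by the square root of the mean stays near one at every scale. This only illustrates the scaling law for independent events; it is not a reproduction of the paper's tree-structured process.

```python
# Toy check: RMS fluctuation ~ sqrt(mean) for sums of independent events.
import math
import random
from statistics import mean, stdev

rng = random.Random(42)

def poisson(lam):
    """Knuth's algorithm; adequate for the moderate rates used here."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def rms_over_sqrt_mean(lam, n=2000):
    counts = [poisson(lam) for _ in range(n)]
    return stdev(counts) / mean(counts) ** 0.5

# for independent events the ratio stays ~1 regardless of the mean size
ratios = [rms_over_sqrt_mean(lam) for lam in (10, 50, 200)]
```

    A significant departure of this ratio from one would signal correlations between events, which is exactly what would undermine a mean-field description.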

  6. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling Subsystem: A Description of the Sampling Functionality (United States)

    Jandura, L.; Burke, K.; Kennedy, B.; Melko, J.; Okon, A.; Sunshine, D.


    The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory (MSL) is a rover-based sampling system scheduled to launch in 2011. The SA/SPaH consists of a powdering drill and a scooping, sieving, and portioning device mounted on a turret at the end of a robotic arm. Also on the turret is a dust removal tool for clearing the surface of scientific targets, and two science instruments mounted on vibration isolators. The SA/SPaH can acquire powder from rocks at depths of 20 to 50 mm and can also pick up loose regolith with its scoop. The acquired sample is sieved, portioned, and delivered to one of two instruments inside the rover for analysis. The functionality of the system will be described along with the targets the system can acquire and the sample that can be delivered. [Figure: Top view of the SA/SPaH on the rover]

  7. A random spatial sampling method in a rural developing nation. (United States)

    Kondo, Michelle C; Bream, Kent D W; Barg, Frances K; Branas, Charles C


    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, the use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Random spatial sampling methodology can be used to survey a random sample of the population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available.
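    A stripped-down version of the idea (random coordinates drawn within each stratum, then visited by GPS) can be sketched as follows. Real strata are GIS polygons rather than the rectangles used here, and all coordinates below are invented for illustration.

```python
# Sketch of stratified random spatial sampling: uniform random points
# within each stratum's bounding rectangle (a real survey would reject
# points falling outside the stratum polygon).
import random

def sample_points(strata, per_stratum, seed=1):
    """strata: {name: (lon_min, lat_min, lon_max, lat_max)}."""
    rng = random.Random(seed)
    points = {}
    for name, (x0, y0, x1, y1) in strata.items():
        points[name] = [(rng.uniform(x0, x1), rng.uniform(y0, y1))
                        for _ in range(per_stratum)]
    return points

# hypothetical strata bounding boxes (degrees lon/lat)
strata = {"village_a": (-90.6, 14.9, -90.5, 15.0),
          "village_b": (-90.8, 14.7, -90.7, 14.8)}
pts = sample_points(strata, per_stratum=5)
```

    Field teams would then navigate to each generated coordinate with a handheld GPS and select the nearest household, which is where the density bias toward built-up areas noted in the abstract enters.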

  8. Quality evaluation of processed clay soil samples | Steiner-Asiedu ...

    African Journals Online (AJOL)

    Introduction: This study assessed the microbial quality of clay samples sold on two of the major Ghanaian markets. Methods: The study was a cross-sectional evaluation of processed clay and the effects it has on the nutrition of consumers in the political capital of Ghana. The items for the examination were ...

  9. System and method for measuring fluorescence of a sample

    Energy Technology Data Exchange (ETDEWEB)

    Riot, Vincent J.


    The present disclosure provides a system and a method for measuring fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and the noises from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into digital domain for further processing and storage.
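    The integrator stage can be mimicked numerically: subtract the electronic offset from the photodiode signal and accumulate the difference over the measurement window. The waveform and offset values below are made up for illustration.

```python
# Numeric analogue of the analog front end: integrate (signal - offset)
# over the measurement window with a rectangular rule.

def integrate_fluorescence(signal, offset, dt):
    """Integral of (signal - offset) over the window; dt is the sample period."""
    return sum((s - offset) * dt for s in signal)

# 1 kHz samples: constant 0.2 V background plus a 10 ms fluorescence burst
signal = [0.2] * 5 + [1.2] * 10 + [0.2] * 5
integral = integrate_fluorescence(signal, offset=0.2, dt=0.001)  # V*s
```

    Matching the offset to the background fluorescence makes the background contribute nothing to the integral, which is the point of the electronic offset signal in the disclosure.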


  11. Microencapsulation and Electrostatic Processing Method (United States)

    Morrison, Dennis R. (Inventor); Mosier, Benjamin (Inventor)


    Methods are provided for forming spherical multilamellar microcapsules having alternating hydrophilic and hydrophobic liquid layers, surrounded by flexible, semi-permeable hydrophobic or hydrophilic outer membranes which can be tailored specifically to control the diffusion rate. The methods of the invention rely on low shear mixing and liquid-liquid diffusion process and are particularly well suited for forming microcapsules containing both hydrophilic and hydrophobic drugs. These methods can be carried out in the absence of gravity and do not rely on density-driven phase separation, mechanical mixing or solvent evaporation phases. The methods include the process of forming, washing and filtering microcapsules. In addition, the methods contemplate coating microcapsules with ancillary coatings using an electrostatic field and free fluid electrophoresis of the microcapsules. The microcapsules produced by such methods are particularly useful in the delivery of pharmaceutical compositions.

  12. System and Method for Isolation of Samples (United States)

    Zhang, Ye (Inventor); Wu, Honglu (Inventor)


    Systems and methods for isolating samples are provided. The system comprises a first membrane and a second membrane disposed within an enclosure. First and second reservoirs can also be disposed within the enclosure and adapted to contain one or more reagents therein. A first valve can be disposed within the enclosure and in fluid communication with the first reservoir, the second reservoir, or both. The first valve can also be in fluid communication with the first or second membranes or both. The first valve can be adapted to selectively regulate the flow of the reagents from the first reservoir, through at least one of the first and second membranes, and into the second reservoir.


    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T.; Hera, K.; Coleman, C.; Jones, M.; Wiedenman, B.


    Evaluation of the Defense Waste Processing Facility (DWPF) Chemical Process Cell (CPC) cycle time identified several opportunities to improve the CPC processing time. Of the opportunities, a focus area related to optimizing the equipment and efficiency of the sample turnaround time for the DWPF Analytical Laboratory was identified. The Mechanical Systems & Custom Equipment Development (MS&CED) Section of the Savannah River National Laboratory (SRNL) evaluated the possibility of using an Isolok® sampling valve as an alternative to the Hydragard® valve for taking process samples. Previous viability testing was conducted with favorable results using the Isolok sampler and reported in SRNL-STI-2010-00749 (1). This task has the potential to improve operability, reduce maintenance time and decrease CPC cycle time. This report summarizes the results from acceptance testing which was requested in Task Technical Request (TTR) HLW-DWPF-TTR-2010-0036 (2) and which was conducted as outlined in Task Technical and Quality Assurance Plan (TTQAP) SRNL-RP-2011-00145 (3). The Isolok to be tested is the same model which was tested, qualified, and installed in the Sludge Receipt Adjustment Tank (SRAT) sample system. RW-0333P QA requirements apply to this task. This task was to qualify the Isolok sampler for use in the DWPF Slurry Mix Evaporator (SME) sampling process. The Hydragard, which is the current baseline sampling method, was used for comparison to the Isolok sampling data. The Isolok sampler is an air-powered grab sampler used to 'pull' a sample volume from a process line. The operation of the sampler is shown in Figure 1. The image on the left shows the Isolok's spool extended into the process line and the image on the right shows the sampler retracted and then dispensing the liquid into the sampling container. To determine tank homogeneity, a Coliwasa sampler was used to grab samples at a high and low location within the mixing tank. Data from ...

  14. Fluidics platform and method for sample preparation and analysis (United States)

    Benner, W. Henry; Dzenitis, John M.; Bennet, William J.; Baker, Brian R.


    Herein provided are a fluidics platform and method for sample preparation and analysis. The fluidics platform is capable of analyzing DNA from blood samples using amplification assays such as polymerase-chain-reaction assays and loop-mediated-isothermal-amplification assays. The fluidics platform can also be used for other types of assays and analyses. In some embodiments, a sample in a sealed tube can be inserted directly. The subsequent isolation, detection, and analyses can be performed without a user's intervention. The disclosed platform also comprises a sample preparation system with a magnetic actuator, a heater, and an air-drying mechanism, and fluid manipulation processes for extraction, washing, elution, assay assembly, assay detection, and cleaning after reactions and between samples.

  15. Radiochemistry methods in DOE methods for evaluating environmental and waste management samples

    Energy Technology Data Exchange (ETDEWEB)

    Fadeff, S.K.; Goheen, S.C.


    Current standard sources of radiochemistry methods are often inappropriate for use in evaluating US Department of Energy environmental and waste management (DOE/EM) samples. Examples of current sources include EPA, ASTM, Standard Methods for the Examination of Water and Wastewater, and HASL-300. Applicability of these methods is limited to specific matrices (usually water), radiation levels (usually environmental levels), and analytes (a limited number). Radiochemistry methods in DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) attempt to fill the applicability gap that exists between standard methods and those needed for DOE/EM activities. The Radiochemistry chapter in DOE Methods includes an "analysis and reporting" guidance section as well as radiochemistry methods. A basis for identifying the DOE/EM radiochemistry needs is discussed. Within this needs framework, the applicability of standard methods and targeted new methods is identified. Sources of new methods (consolidated methods from DOE laboratories and submissions from individuals) and the methods review process are discussed. The processes involved in generating consolidated methods and editing individually submitted methods are compared. DOE Methods is a living document and continues to expand by adding various kinds of methods. Radiochemistry methods are highlighted in this paper. DOE Methods is intended to be a resource for methods applicable to DOE/EM problems. Although it is intended to support DOE, the guidance and methods are not necessarily exclusive to DOE. The document is available at no cost through the Laboratory Management Division of DOE, Office of Technology Development.

  16. Methods of Human Body Odor Sampling: The Effect of Freezing

    National Research Council Canada - National Science Library

    Lenochova, Pavlina; Roberts, S. Craig; Havlicek, Jan

    Body odor sampling is an essential tool in human chemical ecology research. However, methodologies of individual studies vary widely in terms of sampling material, length of sampling, and sample processing...

  17. Statistical sampling method, used in the audit

    Directory of Open Access Journals (Sweden)

    Gabriela-Felicia UNGUREANU


    Full Text Available The rapid increase in the size of U.S. companies from the early twentieth century created the need for audit procedures based on the selection of a part of the total population audited, in order to obtain reliable audit evidence characterizing the entire population of account balances or classes of transactions. Sampling is not used only in auditing - it is used in sample surveys, market analysis and medical research, wherever one wants to reach a conclusion about a large number of data by examining only a part of them. The difference is the “population” from which the sample is selected, i.e. the set of data about which a conclusion is to be drawn. Audit sampling applies only to certain types of audit procedures.
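    The core idea of the abstract, examining a random part of the population to characterize the whole, is easy to sketch. The balances, sample size, and misstatement set below are invented, and the simple rate projection shown is only one of several estimators used in audit practice.

```python
# Sketch of audit sampling: draw a simple random sample of accounts and
# project the sample misstatement rate onto the population.
import random

def project_misstatement(population, sample_size, misstated_ids, seed=3):
    """Estimate the number of misstated items in the whole population."""
    rng = random.Random(seed)
    ids = rng.sample(range(len(population)), sample_size)
    sample_rate = sum(1 for i in ids if i in misstated_ids) / sample_size
    return sample_rate * len(population)  # projected misstated items

balances = list(range(1000))           # 1,000 account balances
misstated = set(range(0, 1000, 50))    # 20 truly misstated accounts (2%)
estimate = project_misstatement(balances, sample_size=200, misstated_ids=misstated)
```

    An auditor would pair such a point estimate with a confidence bound before concluding whether the population misstatement could exceed materiality.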

  18. Tissue Sampling and Processing for Histopathology Evaluation. (United States)

    Slaoui, Mohamed; Bauchet, Anne-Laure; Fiette, Laurence


    Histological procedures aim at providing good-quality sections that can be used for light microscopic evaluation of tissue, applicable to identifying either spontaneous or disease-induced changes. Routinely, tissues are fixed with 10% neutral formalin, embedded in paraffin, and manually sectioned with a microtome to obtain 4-5 μm thick paraffin sections. Dewaxed sections are then stained with HE&S (hematoxylin-eosin and saffron) or used for other purposes (special stains, immunohistochemistry, in situ hybridization, etc.). During this processing, many steps and procedures are critical to ensure standard and interpretable sections. This chapter provides key recommendations to efficiently achieve this objective.

  19. Ion chromatographic determination of cyanate in saline gold processing samples. (United States)

    Black, S B; Schulz, R S


    An ion chromatographic method was developed for the determination of cyanate (CNO-) in saline gold processing samples. The method is based on the use of a very weak-eluting buffer (5 mM sodium borate) and a Dionex AS4A-SC anion-exchange column. This weak-eluting buffer facilitates the wide chromatographic separation of chloride (Cl-) from CNO-. After CNO- has been eluted, the switch to 1.8 mM Na2CO3-1.7 mM NaHCO3 buffer allows the fast elution of other major inorganic and organic anions. Validation of this method, including identification of interferences, has shown that this method is reliable, accurate, sensitive (detection limit, 0.1 mg/l CNO-) and reproducible.

  20. A Novel Fast Method for Point-sampled Model Simplification

    Directory of Open Access Journals (Sweden)

    Cao Zhi


    Full Text Available A novel fast simplification method for point-sampled statue models is proposed. Simplification for 3D model reconstruction is a hot topic in the field of 3D surface construction, but it is difficult because the point clouds of many 3D models are very large, so running times become very long. In this paper, a two-stage simplifying method is proposed. Firstly, a feature-preserving non-uniform simplification method for cloud points is presented, which reduces the data set to remove redundancy while keeping the features of the model. Secondly, an affinity-clustering simplifying method is used to classify each point as either a sharp point or a simple point. The advantages of Affinity Propagation clustering are the passing of messages among data points and fast processing speed. Together with re-sampling, it can dramatically reduce the duration of the process while keeping memory cost low. Both theoretical analysis and experimental results show that the simplification is efficient and that the details of the surface are preserved well.
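    A drastically simplified, standard-library-only sketch of non-uniform, feature-preserving reduction: keep one point per grid cell, with finer cells where a crude "feature" score is high. Here distance from the centroid stands in for a real curvature estimate, the affinity-propagation stage is omitted, and all names and thresholds are invented.

```python
# Toy non-uniform point-cloud simplification in 2D: coarse cells in flat
# regions, fine cells near presumed features, keeping one point per cell.

def simplify(points, coarse=1.0, fine=0.25, feature_radius=2.0):
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    cells = {}
    for x, y in points:
        r = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
        size = fine if r > feature_radius else coarse  # finer near "features"
        key = (size, int(x // size), int(y // size))
        cells.setdefault(key, (x, y))  # keep the first point seen per cell
    return list(cells.values())
```

    A real implementation would score each point by local normal variation (curvature) instead of distance from the centroid, but the mechanism of variable cell sizes is the same.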

  1. System and method for extracting a sample from a surface (United States)

    Van Berkel, Gary; Covey, Thomas


    A system and method is disclosed for extracting a sample from a sample surface. A sample is provided and a sample surface receives the sample which is deposited on the sample surface. A hydrophobic material is applied to the sample surface, and one or more devices are configured to dispense a liquid on the sample, the liquid dissolving the sample to form a dissolved sample material, and the one or more devices are configured to extract the dissolved sample material from the sample surface.

  2. Computational Methods for Conformational Sampling of Biomolecules

    DEFF Research Database (Denmark)

    Bottaro, Sandro

    Proteins play a fundamental role in virtually every process within living organisms. For example, some proteins act as enzymes, catalyzing a wide range of reactions necessary for life, others mediate the cell interaction with the surrounding environment and still others have regulatory functions....

  3. Challenging genosensors in food samples: The case of gluten determination in highly processed samples. (United States)

    Martín-Fernández, Begoña; de-los-Santos-Álvarez, Noemí; Martín-Clemente, Juan Pedro; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz


    Electrochemical genosensors have undergone an enormous development in the last decades, but only very few have achieved quantification of target content in highly processed food samples. The detection of allergens, and particularly gluten, is challenging because legislation establishes a threshold of 20 ppm for labeling as gluten-free, whereas most genosensors express the results as DNA concentration or DNA copies. This paper describes the first attempt to correlate the genosensor response with the wheat content in real samples, even in the case of highly processed food samples. A sandwich-based format was used, comprising a capture probe immobilized onto the screen-printed gold electrode and a signaling probe functionalized with fluorescein isothiocyanate (FITC), both hybridizing with the target. The hybridization event was electrochemically monitored by adding an anti-FITC peroxidase (antiFITC-HRP) and its substrate, tetramethylbenzidine. Binary model mixtures, as a reference material, and real samples have been analyzed. DNA from food was extracted and a fragment encoding the immunodominant peptide of α2-gliadin amplified by a tailored PCR. The sensor was able to selectively detect cereals toxic for celiac patients, such as different varieties of wheat, barley, rye and oats, as distinct from non-toxic plants. As little as 0.001% (10 mg/kg) of wheat flour in an inert matrix was reliably detected, which directly competes with the current method of choice for DNA detection, real-time PCR. A good correlation with the official immunoassay was found in highly processed food samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Recent Results of the Investigation of a Microfluidic Sampling Chip and Sampling System for Hot Cell Aqueous Processing Streams

    Energy Technology Data Exchange (ETDEWEB)

    Julia Tripp; Jack Law; Tara Smith


    A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially sampling technologies were evaluated and microfluidics sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. In addition, the production of a less expensive, mass produced sampling chip was investigated to avoid chip reuse thus increasing sampling reproducibility/accuracy. The microfluidic-based robotic sampling system’s mechanical elements were tested to ensure analytical reproducibility and the optimum robotic handling of microfluidic sampling chips.


    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, S.; Culligan, B.; Noyes, G.


    A new rapid method for the determination of actinides in soil and sediment samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used for samples up to 2 grams in emergency response situations. The actinides in soil method utilizes a rapid sodium hydroxide fusion method, a lanthanum fluoride soil matrix removal step, and a streamlined column separation process with stacked TEVA, TRU and DGA Resin cartridges. Lanthanum was separated rapidly and effectively from Am and Cm on DGA Resin. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha sources are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency soil samples received in the NRIP Emergency Response exercise administered by the National Institute for Standards and Technology (NIST) in April, 2009. The actinides in soil results were reported within 4-5 hours with excellent quality.

  6. The curvHDR method for gating flow cytometry samples

    Directory of Open Access Journals (Sweden)

    Wand Matthew P


    Full Text Available Abstract Background High-throughput flow cytometry experiments produce hundreds of large multivariate samples of cellular characteristics. These samples require specialized processing to obtain clinically meaningful measurements. A major component of this processing is a form of cell subsetting known as gating. Manual gating is time-consuming and subjective. Good automatic and semi-automatic gating algorithms are therefore very beneficial to high-throughput flow cytometry. Results We develop a statistical procedure, named curvHDR, for automatic and semi-automatic gating. The method combines the notions of significant high negative curvature regions and highest density regions and has the ability to adapt well to human-perceived gates. The underlying principles apply to dimensions of arbitrary size, although we focus on dimensions up to three. Accompanying software, compatible with contemporary flow cytometry informatics, is developed. Conclusion The method is seen to adapt well to nuances in the data and, to a reasonable extent, match human perception of useful gates. It offers large savings in human labour when processing high-throughput flow cytometry data whilst retaining a good degree of efficacy.

  7. Some connections between importance sampling and enhanced sampling methods in molecular dynamics (United States)

    Lie, H. C.; Quer, J.


    In molecular dynamics, enhanced sampling methods enable the collection of better statistics of rare events from a reference or target distribution. We show that a large class of these methods is based on the idea of importance sampling from mathematical statistics. We illustrate this connection by comparing the Hartmann-Schütte method for rare event simulation (J. Stat. Mech. Theor. Exp. 2012, P11004) and the Valsson-Parrinello method of variationally enhanced sampling [Phys. Rev. Lett. 113, 090601 (2014)]. We use this connection in order to discuss how recent results from the Monte Carlo methods literature can guide the development of enhanced sampling methods.
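The connection the authors draw can be illustrated with a generic importance-sampling estimator (a minimal sketch of the underlying principle, not of the Hartmann-Schütte or Valsson-Parrinello methods themselves): to estimate a rare-event probability under a target distribution, sample from a biased proposal that makes the event common and reweight each draw by the likelihood ratio of the two densities.

```python
import math
import random

def importance_estimate(n=100_000, threshold=4.0, shift=4.0, seed=1):
    """Estimate P(X > threshold) for X ~ N(0, 1) by drawing from the
    shifted proposal N(shift, 1) and reweighting by the likelihood ratio."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(shift, 1.0)            # draw from the biased proposal
        if y > threshold:
            # target density / proposal density (normalizing constants cancel)
            total += math.exp(-0.5 * y * y) / math.exp(-0.5 * (y - shift) ** 2)
    return total / n
```

With the proposal centred on the threshold, a hundred thousand draws give an accurate estimate of P(X > 4) ≈ 3.2e-5, an event that naive sampling from N(0, 1) would almost never observe at this sample size.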

  8. Progressive sample processing of band selection for hyperspectral imagery (United States)

    Liu, Keng-Hao; Chien, Hung-Chang; Chen, Shih-Yu


    Band selection (BS) is one of the most important topics in hyperspectral image (HSI) processing. The objective of BS is to find a set of representative bands that can represent the whole image with low inter-band redundancy. Many types of BS algorithm have been proposed in the past. However, most of them operate in an off-line manner, meaning they can only be applied to pre-collected data. Such off-line methods are of little use in time-critical applications, particularly disaster prevention and target detection. To tackle this issue, a new concept, called progressive sample processing (PSP), was proposed recently. PSP is an "on-line" framework in which an algorithm can process the currently collected data during data transmission under the band-interleaved-by-sample/pixel (BIS/BIP) protocol. This paper proposes an on-line BS method that integrates a sparse-based BS into the PSP framework, called PSP-BS. In PSP-BS, the BS result is updated recursively pixel by pixel, in the same way that a Kalman filter updates its state estimate recursively. The sparse regression is solved by the orthogonal matching pursuit (OMP) algorithm, and the recursive equations of PSP-BS are derived using matrix decomposition. Experiments conducted on a real hyperspectral image show that PSP-BS can progressively output the BS status with very low computing time. Convergence of the BS results during transmission can be achieved quickly by using a rearranged pixel transmission sequence. This significant advantage allows BS to be implemented in real time as the HSI data are transmitted pixel by pixel.
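The sparse-regression step at the heart of PSP-BS, orthogonal matching pursuit, can be sketched generically (a plain batch OMP solver, not the recursive per-pixel update derived in the paper):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily select k columns of A that best
    explain y, re-fitting coefficients on the chosen support at every step."""
    residual, support = y.astype(float).copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x
```

In a band-selection setting, the nonzero entries of the returned coefficient vector would mark the representative columns (bands) chosen from the dictionary.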

  9. 19 CFR 151.83 - Method of sampling. (United States)


    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling. For...

  10. 7 CFR 29.110 - Method of sampling. (United States)


    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling tobacco...

  11. an assessment of methods for sampling carabid beetles

    African Journals Online (AJOL)


    particular habitat where we sampled (rugged montane rain forest) pitfall trapping has no advantage over searching methods with respect to ease of operation, low cost or efficiency. However, despite its inefficiency, pitfall trapping cannot be left out of sampling protocols because the method sampled some species that were ...

  12. Method optimization for fecal sample collection and fecal DNA extraction. (United States)

    Mathay, Conny; Hamot, Gael; Henry, Estelle; Georges, Laura; Bellora, Camille; Lebrun, Laura; de Witt, Brian; Ammerlaan, Wim; Buschart, Anna; Wilmes, Paul; Betsou, Fay


    This is the third in a series of publications presenting formal method validation for biospecimen processing in the context of accreditation in laboratories and biobanks. We report here the optimization of a stool processing protocol validated for fitness-for-purpose in terms of downstream DNA-based analyses. Stool collection was initially optimized in terms of sample input quantity and supernatant volume using canine stool. Three DNA extraction methods (PerkinElmer MSM I®, Norgen Biotek All-In-One®, MoBio PowerMag®) and six collection container types were evaluated with human stool in terms of DNA quantity and quality: DNA yield and its reproducibility (by spectrophotometry, spectrofluorometry, and quantitative PCR), DNA purity (SPUD assay), and 16S rRNA gene sequence-based taxonomic signatures. The optimal MSM I protocol involves a 0.2 g stool sample and 1000 μL supernatant. The MSM I extraction was superior in terms of DNA quantity and quality when compared to the other two methods tested. Optimal results were obtained with plain Sarstedt tubes (without stabilizer, requiring immediate freezing and storage at -20°C or -80°C) and Genotek tubes (with stabilizer and room-temperature storage) in terms of DNA yields (total, human, bacterial, and double-stranded) according to spectrophotometry and spectrofluorometry, with low yield variability and good DNA purity. No inhibitors were identified at 25 ng/μL. The protocol was reproducible in terms of DNA yield among different stool aliquots. We validated a stool collection method suitable for downstream DNA metagenomic analysis. DNA extraction with the MSM I method using Genotek tubes was considered optimal, with simple logistics in terms of collection and shipment, and offers the possibility of automation. Laboratories and biobanks should ensure protocol conditions are systematically recorded in the scope of accreditation.

  13. DSMC multicomponent aerosol dynamics: Sampling algorithms and aerosol processes (United States)

    Palaniswaamy, Geethpriya

    The post-accident nuclear reactor primary and containment environments can be characterized by high temperatures and pressures, and fission products and nuclear aerosols. These aerosols evolve via natural transport processes as well as under the influence of engineered safety features. These aerosols can be hazardous and may pose risk to the public if released into the environment. Computations of their evolution, movement and distribution involve the study of various processes such as coagulation, deposition, condensation, etc., and are influenced by factors such as particle shape, charge, radioactivity and spatial inhomogeneity. These many factors make the numerical study of nuclear aerosol evolution computationally very complicated. The focus of this research is on the use of the Direct Simulation Monte Carlo (DSMC) technique to elucidate the role of various phenomena that influence the nuclear aerosol evolution. In this research, several aerosol processes such as coagulation, deposition, condensation, and source reinforcement are explored for a multi-component, aerosol dynamics problem in a spatially homogeneous medium. Among the various sampling algorithms explored the Metropolis sampling algorithm was found to be effective and fast. Several test problems and test cases are simulated using the DSMC technique. The DSMC results obtained are verified against the analytical and sectional results for appropriate test problems. Results show that the assumption of a single mean density is not appropriate due to the complicated effect of component densities on the aerosol processes. The methods developed and the insights gained will also be helpful in future research on the challenges associated with the description of fission product and aerosol releases.
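The Metropolis sampling algorithm found effective here can be shown in its generic random-walk form (a one-dimensional toy sampler, not the multicomponent aerosol implementation): propose a symmetric move and accept it with probability min(1, p(x')/p(x)).

```python
import math
import random

def metropolis(logp, x0, steps, step_size=1.0, seed=7):
    """Random-walk Metropolis sampler: propose a symmetric uniform move and
    accept it with probability min(1, p(x') / p(x)), working in log space."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(steps):
        proposal = x + rng.uniform(-step_size, step_size)
        # accept if log(u) < logp(x') - logp(x); tiny offset guards log(0)
        if math.log(rng.random() + 1e-300) < logp(proposal) - logp(x):
            x = proposal
        samples.append(x)
    return samples
```

For example, with logp(z) = -z²/2 the chain samples a standard normal distribution, and the empirical mean and variance of a long run approach 0 and 1.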

  14. On Angular Sampling Methods for 3-D Spatial Channel Models

    DEFF Research Database (Denmark)

    Fan, Wei; Jämsä, Tommi; Nielsen, Jesper Ødum


    This paper discusses generating three dimensional (3D) spatial channel models with emphasis on the angular sampling methods. Three angular sampling methods, i.e. modified uniform power sampling, modified uniform angular sampling, and random pairing methods are proposed and investigated in detail....... The random pairing method, which uses only twenty sinusoids in the ray-based model for generating the channels, presents good results if the spatial channel cluster is with a small elevation angle spread. For spatial clusters with large elevation angle spreads, however, the random pairing method would fail...

  15. Advanced methods for processing ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Carter, W.B. [Georgia Institute of Technology, Atlanta, GA (United States)


    Combustion chemical vapor deposition (combustion CVD) is being developed for the deposition of high temperature oxide coatings. The process is being evaluated as an alternative to more capital intensive conventional coating processes. The thrusts during this reporting period were the development of the combustion CVD process for depositing lanthanum monazite, the determination of the influence of aerosol size on coating morphology, the incorporation of combustion CVD coatings into thermal barrier coatings (TBCs) and related oxidation research, and continued work on the deposition of zirconia-yttria coatings.


    DEFF Research Database (Denmark)


    As a rule, electron beam welding takes place in a vacuum. However, this means that the workpieces in question have to be placed in a vacuum chamber and have to be removed therefrom after welding. This is time-consuming and a serious limitation of a process the greatest advantage of which is the ... exploiting the potential of electron beam processing to a greater degree than previously possible, for example by means of electron beam welding...

  17. Systems and methods for self-synchronized digital sampling (United States)

    Samson, Jr., John R. (Inventor)


    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.
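The essence of the claim, a sampling clock locked to shaft angle rather than to wall-clock time, can be sketched as follows (an illustrative simulation only; 'samples_per_rev' and the speed-profile representation are assumptions for the sketch, not details from the patent):

```python
def synchronous_sample_times(rpm_profile, samples_per_rev, dt=1e-4):
    """Integrate a (possibly varying) rotational speed profile and emit the
    time instants at which the shaft crosses each 1/samples_per_rev fraction
    of a revolution, i.e. a sampling clock locked to shaft angle, not time."""
    times, angle, t, next_tick = [], 0.0, 0.0, 0.0
    step = 1.0 / samples_per_rev             # revolutions between samples
    for rpm in rpm_profile:
        angle += (rpm / 60.0) * dt           # revolutions advanced in dt
        t += dt
        while angle >= next_tick + step:     # emit every angular tick crossed
            next_tick += step
            times.append(t)
    return times
```

At a constant 600 rpm with 8 samples per revolution this produces an 80 Hz sampling clock; if the speed profile varies, the tick times stretch and compress so that each revolution still contributes exactly 8 samples.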

  18. Double Shell Tank (DST) Process Waste Sampling Subsystem Specification

    Energy Technology Data Exchange (ETDEWEB)



    This specification establishes the performance requirements and provides references to the requisite codes and standards to be applied to the Double-Shell Tank (DST) Process Waste Sampling Subsystem which supports the first phase of Waste Feed Delivery.

  19. DOE methods for evaluating environmental and waste management samples.

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S C; McCulloch, M; Thomas, B L; Riley, R G; Sklarew, D S; Mong, G M; Fadeff, S K [eds.; Pacific Northwest Lab., Richland, WA (United States)


    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others.

  20. A random spatial sampling method in a rural developing nation (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas


    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  1. Data Set for the manuscript entitled, "Sample Processing Approach for Detection of Ricin in Surface Samples." (United States)

    U.S. Environmental Protection Agency — Figure. This dataset is associated with the following publication: Shah, S., S. Kane, A.M. Erler, and T. Alfaro. Sample Processing Approach for Detection of Ricin in...

  2. Experience sampling with elderly persons: an exploration of the method. (United States)

    Hnatiuk, S H


    The daily lives of a sample of elderly widows (greater than 69 years of age) were studied using the method of experience sampling developed by Csikszentmihalyi and his colleagues. The purpose of the study was to investigate the response of elderly people to experience sampling as a means of collecting information about their activities, thoughts, and moods during the course of one week. The method proved acceptable to the majority of participants and yielded reliable, valid data about their home lives, particularly from among the younger, more physically able women. Experience sampling was, within certain limits, a useful method of obtaining information from elderly people.


    Directory of Open Access Journals (Sweden)



    Full Text Available Marketing and statistical literature available to practitioners provides a wide range of sampling methods that can be implemented in the context of marketing research. The ranking sampling method is based on dividing the general population into several strata, i.e., subdivisions that are relatively homogeneous with respect to a certain characteristic. The sample is then composed by selecting from each stratum a certain number of components (proportional or non-proportional to the size of the stratum) until the pre-established sample size is reached. Using ranking sampling within marketing research requires the determination of some relevant statistical indicators - average, dispersion, sampling error etc. To that end, the paper contains a case study which illustrates the actual approach used in order to apply the ranking sample method within a marketing research made by a company which provides Internet connection services, on a particular category of customers – small and medium enterprises.
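The allocation step described above, selecting from each stratum a number of components proportional to stratum size and then combining the stratum means, can be sketched as follows (function names are illustrative, not from the article):

```python
def proportional_allocation(strata_sizes, sample_size):
    """Split a total sample size across strata in proportion to stratum size;
    largest-remainder rounding keeps the parts summing to sample_size."""
    total = sum(strata_sizes)
    raw = [sample_size * s / total for s in strata_sizes]
    alloc = [int(r) for r in raw]
    # hand leftover units to the strata with the largest fractional parts
    leftovers = sorted(range(len(raw)), key=lambda i: raw[i] - alloc[i],
                       reverse=True)
    for i in leftovers[: sample_size - sum(alloc)]:
        alloc[i] += 1
    return alloc

def stratified_mean(strata_sizes, stratum_means):
    """Population-mean estimate: stratum means weighted by stratum sizes."""
    total = sum(strata_sizes)
    return sum(n * m for n, m in zip(strata_sizes, stratum_means)) / total
```

For a population of 1000 split into strata of 500, 300, and 200, a sample of 100 is allocated as 50, 30, and 20 components respectively, and the weighted stratum means give the overall estimate.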


    It has been over 10 years since the low-flow ground water purging and sampling method was initially reported in the literature. The method grew from the recognition that well purging was necessary to collect representative samples, bailers could not achieve well purging, and high...

  5. Neonatal blood gas sampling methods | Goenka | South African ...

    African Journals Online (AJOL)

    There is little published guidance that systematically evaluates the different methods of neonatal blood gas sampling, where each method has its individual benefits and risks. This review critically surveys the available evidence to generate a comparison between arterial and capillary blood gas sampling, focusing on their ...

  6. An efficient method for sampling the essential subspace of proteins

    NARCIS (Netherlands)

    Amadei, A; Linssen, A.B M; de Groot, B.L.; van Aalten, D.M.F.; Berendsen, H.J.C.

    A method is presented for a more efficient sampling of the configurational space of proteins as compared to conventional sampling techniques such as molecular dynamics. The method is based on the large conformational changes in proteins revealed by the ''essential dynamics'' analysis. A form of

  7. Monitoring of Extraction Efficiency by a Sample Process Control Virus Added Immediately Upon Sample Receipt. (United States)

    Ruhanya, Vurayai; Diez-Valcarce, Marta; D'Agostino, Martin; Cook, Nigel; Hernández, Marta; Rodríguez-Lázaro, David


    When analysing food samples for enteric viruses, a sample process control virus (SPCV) must be added at the commencement of the analytical procedure, to verify that the analysis has been performed correctly. Samples can on occasion arrive at the laboratory late in the working day or week. The analyst may consequently have insufficient time to commence and complete the complex procedure, and the samples must consequently be stored. To maintain the validity of the analytical result, it will be necessary to consider storage as part of the process, and the analytical procedure as commencing on sample receipt. The aim of this study was to verify that an SPCV can be recovered after sample storage, and thus indicate the effective recovery of enteric viruses. Two types of samples (fresh and frozen raspberries) and two types of storage (refrigerated and frozen) were studied using Mengovirus vMC0 as SPCV. SPCV recovery was not significantly different (P > 0.5) regardless of sample type or duration of storage (up to 14 days at -20 °C). Accordingly, samples can be stored without a significant effect on the performance of the analysis. The results of this study should assist the analyst by demonstrating that they can verify that viruses can be extracted from food samples even if samples have been stored.

  8. A method and fortran program for quantitative sampling in paleontology (United States)

    Tipper, J.C.


    The Unit Sampling Method is a binomial sampling method applicable to the study of fauna preserved in rocks too well cemented to be disaggregated. Preliminary estimates of the probability of detecting each group in a single sampling unit can be converted to estimates of the group's volumetric abundance by means of correction curves obtained by a computer simulation technique. This paper describes the technique and gives the FORTRAN program. © 1976.
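A toy version of the correction-curve idea can be sketched as follows (a hedged illustration only: for a group occupying a volume fraction v of the rock, the chance of detecting it in a unit of n randomly drawn grains is 1 - (1 - v)^n; the paper's actual curves come from a fuller simulation of the sampling process):

```python
import random

def detection_probability(v, grains_per_unit, trials=20_000, seed=3):
    """Monte Carlo estimate of the chance that a group with volumetric
    abundance v appears at least once among the grains of one sampling unit."""
    rng = random.Random(seed)
    hits = sum(
        any(rng.random() < v for _ in range(grains_per_unit))
        for _ in range(trials)
    )
    return hits / trials
```

Tabulating this probability over a grid of abundances gives a correction curve: observe the detection rate across many units, then read off the abundance that would produce it.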

  9. Proteome sampling by the HLA class I antigen processing pathway. (United States)

    Hoof, Ilka; van Baarle, Debbie; Hildebrand, William H; Keşmir, Can


    The peptide repertoire that is presented by the set of HLA class I molecules of an individual is formed by the different players of the antigen processing pathway and the stringent binding environment of the HLA class I molecules. Peptide elution studies have shown that only a subset of the human proteome is sampled by the antigen processing machinery and represented on the cell surface. In our study, we quantified the role of each factor relevant in shaping the HLA class I peptide repertoire by combining peptide elution data, in silico predictions of antigen processing and presentation, and data on gene expression and protein abundance. Our results indicate that gene expression level, protein abundance, and rate of potential binding peptides per protein have a clear impact on sampling probability. Furthermore, once a protein is available for the antigen processing machinery in sufficient amounts, C-terminal processing efficiency and binding affinity to the HLA class I molecule determine the identity of the presented peptides. Having studied the impact of each of these factors separately, we subsequently combined all factors in a logistic regression model in order to quantify their relative impact. This model demonstrated the superiority of protein abundance over gene expression level in predicting sampling probability. Being able to discriminate between sampled and non-sampled proteins to a significant degree, our approach can potentially be used to predict the sampling probability of self proteins and of pathogen-derived proteins, which is of importance for the identification of autoimmune antigens and vaccination targets.
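The final combination step, a logistic regression over predictors such as protein abundance, expression level, and the rate of potential binding peptides, can be sketched generically with synthetic data (a minimal gradient-descent fit, not the authors' model or data):

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression: combine several predictors
    into a single probability-of-being-sampled model."""
    X1 = np.hstack([np.ones((X.shape[0], 1)), X])    # prepend intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))
        w -= lr * X1.T @ (p - y) / len(y)            # mean log-loss gradient
    return w

def predict_proba(w, X):
    """Predicted probability for new rows of predictors."""
    X1 = np.hstack([np.ones((X.shape[0], 1)), X])
    return 1.0 / (1.0 + np.exp(-X1 @ w))
```

Comparing the fitted coefficients of standardized predictors is one simple way to quantify their relative impact, which is the kind of comparison the study performs between protein abundance and gene expression.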

  10. Method and apparatus for imaging a sample on a device (United States)

    Trulson, Mark; Stern, David; Fiekowsky, Peter; Rava, Richard; Walton, Ian; Fodor, Stephen P. A.


    A method and apparatus for imaging a sample are provided. An electromagnetic radiation source generates excitation radiation which is sized by excitation optics to a line. The line is directed at a sample resting on a support and excites a plurality of regions on the sample. Collection optics collect response radiation reflected from the sample and image the reflected radiation. A detector senses the reflected radiation and is positioned to permit discrimination between radiation reflected from a certain focal plane in the sample and certain other planes within the sample.

  11. Evaluating Composite Sampling Methods of Bacillus spores at Low Concentrations

    Energy Technology Data Exchange (ETDEWEB)

    Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.


    Restoring facility operations after the 2001 Amerithrax attacks took over three months to complete, highlighting the need to reduce remediation time. The most time-intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite: a single cellulose sponge samples multiple coupons; 2) single medium multi-pass composite: a single cellulose sponge is used to sample multiple coupons; and 3) multi-medium post-sample composite: a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2, respectively). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted wallboard) and three grime-coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p-value < 0.0001) and coupon material (p-value = 0.0008). Recovery efficiency (RE) was higher overall using the post-sample composite (PSC) method compared to single medium composite from both clean and grime-coated materials. RE with the PSC method for concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, painted wallboard, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but significantly lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when

  12. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR

    NARCIS (Netherlands)

    Goessens, W H; Kluytmans, J A; den Toom, N; van Rijsoort-Vos, T H; Niesters, B G; Stolz, E; Verbrugh, H A; Quint, W G

    In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least 4%, i.e., 80 microliters, of

  13. Sampling Methods for Web and E-mail Surveys


    Fricker, RD


    London: SAGE Publications. Reprinted from The SAGE Handbook of Online Research Methods, N. Fielding, R.M. Lee and G. Blank, eds., chapter 11, London: SAGE Publications, 195-216. This chapter is a comprehensive overview of sampling methods for web and e-mail (‘Internet-based’) surveys. It reviews the various types of sampling method – both probability and nonprobability – and examines their applicability to Internet-based surveys. Issues related to Internet-based survey samp...

  14. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    Directory of Open Access Journals (Sweden)

    Tony J Popic

    Full Text Available Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km² area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.

  15. Multivariate Statistical Process Control Process Monitoring Methods and Applications

    CERN Document Server

    Ge, Zhiqiang


      Given their key position in the process control industry, process monitoring techniques have been extensively investigated by industrial practitioners and academic control researchers. Multivariate statistical process control (MSPC) is one of the most popular data-based methods for process monitoring and is widely used in various industrial areas. Effective routines for process monitoring can help operators run industrial processes efficiently at the same time as maintaining high product quality. Multivariate Statistical Process Control reviews the developments and improvements that have been made to MSPC over the last decade, and goes on to propose a series of new MSPC-based approaches for complex process monitoring. These new methods are demonstrated in several case studies from the chemical, biological, and semiconductor industrial areas.   Control and process engineers, and academic researchers in the process monitoring, process control and fault detection and isolation (FDI) disciplines will be inter...
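A core MSPC routine, monitoring new observations with Hotelling's T² computed from a PCA model fitted on normal-operation data, can be sketched as follows (a minimal illustration of the standard technique, not code from the book):

```python
import numpy as np

def fit_pca_monitor(X_normal, n_comp=2):
    """Fit a PCA model on normal-operation data; returns the pieces needed
    to compute Hotelling's T^2 for new observations."""
    mu = X_normal.mean(axis=0)
    Xc = X_normal - mu
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_comp].T                          # loading vectors (columns)
    var = (s[:n_comp] ** 2) / (len(X_normal) - 1)  # score variances
    return mu, P, var

def t2_statistic(x, mu, P, var):
    """Hotelling's T^2 of one observation in the retained PCA subspace."""
    t = P.T @ (x - mu)                         # scores of the new sample
    return float(np.sum(t * t / var))
```

In practice a control limit for T² is derived from an F-distribution, and observations exceeding it flag abnormal process behaviour; a residual (Q/SPE) statistic is usually monitored alongside it.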

  16. Effect of sample preparation methods on photometric determination of the tellurium and cobalt content in the samples of copper concentrates

    Directory of Open Access Journals (Sweden)

    Viktoriya Butenko


    Full Text Available Methods of determination of cobalt and nickel in copper concentrates currently used in factory laboratories are very labor-intensive and time-consuming. The limiting stage of the analysis is the preliminary chemical sample preparation. Carrying out the decomposition of industrial samples with concentrated mineral acids in open systems does not allow the metrological characteristics of the methods to be improved; for this reason, improving the methods of sample preparation is quite relevant and of practical interest. The work was dedicated to determining the optimal conditions of preliminary chemical preparation of copper concentrate samples for the subsequent determination of cobalt and tellurium in the obtained solution using the tellurium-spectrophotometric method. Decomposition of the samples was carried out by acid dissolution in individual mineral acids and their mixtures, by heating in an open system as well as by using ultrasonication and microwave radiation in a closed system. In order to select the optimal conditions for the decomposition of the samples in a closed system, the phase contact time and the ultrasonic generator's power were varied. Intensification of the decomposition of copper concentrates with nitric acid (1:1), ultrasound, and microwave radiation allowed cobalt and tellurium to be transferred quantitatively into solution in 20 and 30 min, respectively. This reduced the amount of reagents used and improved the accuracy of determination by running the process under strictly identical conditions.

  17. Learning process mapping heuristics under stochastic sampling overheads (United States)

    Ieumwananonthachai, Arthur; Wah, Benjamin W.


    A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to get the best possible heuristics while trading between the solution quality of the process mapping heuristics and their execution time. The statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance is presented using the more realistic assumption along with some methods that alleviate the additional complexity.

  18. [Respondent-Driven Sampling: a new sampling method to study visible and hidden populations]. (United States)

    Mantecón, Alejandro; Juan, Montse; Calafat, Amador; Becoña, Elisardo; Román, Encarna


    The paper introduces a variant of chain-referral sampling: respondent-driven sampling (RDS). This sampling method shows that approaches based on network analysis can be combined with the statistical validity of standard probability sampling methods. In this sense, RDS appears to be a mathematical improvement of snowball sampling oriented to the study of hidden populations. Here, however, we test its validity with populations that lack a sampling frame but can nonetheless be contacted without difficulty. The basics of RDS are explained through our research on young people (aged 14 to 25) who go clubbing, consume alcohol and other drugs, and have sex. Fieldwork was carried out between May and July 2007 in three Spanish regions: Baleares, Galicia and Comunidad Valenciana. The study shows the utility of this type of sampling when the population is accessible but no sampling frame exists. However, the sample obtained is not a statistically representative random sample of the target population. It must be acknowledged that the final sample is representative of a 'pseudo-population' that approximates the target population but is not identical to it.
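The wave-by-wave recruitment that RDS builds on can be sketched as a toy simulation (the contact network, seed choice, and coupon count below are invented for illustration, not taken from the study):

```python
import random

random.seed(42)

# Hypothetical contact network: each of 100 people knows 8 random peers.
network = {i: random.sample([j for j in range(100) if j != i], k=8)
           for i in range(100)}

def rds_sample(network, seeds, coupons=3, target=40):
    """Chain-referral recruitment: each respondent passes up to
    `coupons` coupons to peers not yet sampled, wave after wave."""
    sampled, wave = list(seeds), list(seeds)
    while wave and len(sampled) < target:
        next_wave = []
        for person in wave:
            recruits = [p for p in network[person] if p not in sampled][:coupons]
            sampled.extend(recruits)
            next_wave.extend(recruits)
        wave = next_wave
    return sampled[:target]

sample = rds_sample(network, seeds=[0, 1, 2])
```

In real RDS, recruitment chains are additionally weighted by respondents' reported network sizes to correct for unequal inclusion probabilities; this sketch only shows the referral mechanics.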

  19. Sampling and analysis methods for geothermal fluids and gases

    Energy Technology Data Exchange (ETDEWEB)

    Watson, J.C.


    The sampling procedures for geothermal fluids and gases include: sampling hot springs, fumaroles, etc.; sampling condensed brine and entrained gases; sampling steam-lines; low pressure separator systems; high pressure separator systems; two-phase sampling; downhole samplers; and miscellaneous methods. The recommended analytical methods compiled here cover physical properties, dissolved solids, and dissolved and entrained gases. The sequences of methods listed for each parameter are: wet chemical, gravimetric, colorimetric, electrode, atomic absorption, flame emission, x-ray fluorescence, inductively coupled plasma-atomic emission spectroscopy, ion exchange chromatography, spark source mass spectrometry, neutron activation analysis, and emission spectrometry. Material on correction of brine component concentrations for steam loss during flashing is presented. (MHR)

  20. Sampling design for spatially distributed hydrogeologic and environmental processes (United States)

    Christakos, G.; Olea, R.A.


    A methodology for the design of sampling networks over space is proposed. The methodology is based on spatial random field representations of nonhomogeneous natural processes, and on optimal spatial estimation techniques. One of the most important results of random field theory for physical sciences is its rationalization of correlations in spatial variability of natural processes. This correlation is extremely important both for interpreting spatially distributed observations and for predictive performance. The extent of site sampling and the types of data to be collected will depend on the relationship of subsurface variability to predictive uncertainty. While hypothesis formulation and initial identification of spatial variability characteristics are based on scientific understanding (such as knowledge of the physics of the underlying phenomena, geological interpretations, intuition and experience), the support offered by field data is statistically modelled. This model is not limited by the geometric nature of sampling and covers a wide range in subsurface uncertainties. A factorization scheme of the sampling error variance is derived, which possesses certain attractive properties allowing significant savings in computations. By means of this scheme, a practical sampling design procedure providing suitable indices of the sampling error variance is established. These indices can be used by way of multiobjective decision criteria to obtain the best sampling strategy. Neither the actual implementation of the in-situ sampling nor the solution of the large spatial estimation systems of equations is necessary. The required values of the accuracy parameters involved in the network design are derived using reference charts (readily available for various combinations of data configurations and spatial variability parameters) and certain simple yet accurate analytical formulas.
Insight is gained by applying the proposed sampling procedure to realistic examples related

  1. [Biological Advisory Subcommittee Sampling Methods : Results, Resolutions, and Correspondences : 2002 (United States)

    US Fish and Wildlife Service, Department of the Interior — This document contains a variety of information concerning Biological Advisory Subcommittee sampling methods at the Rocky Mountain Arsenal Refuge in 2002. Multiple...

  2. Methods for collection and analysis of water samples (United States)

    Rainwater, Frank Hays; Thatcher, Leland Lincoln


    This manual contains methods used by the U.S. Geological Survey to collect, preserve, and analyze water samples. Throughout, the emphasis is on obtaining analytical results that accurately describe the chemical composition of the water in situ. Among the topics discussed are selection of sampling sites, frequency of sampling, field equipment, preservatives and fixatives, analytical techniques of water analysis, and instruments. Seventy-seven laboratory and field procedures are given for determining fifty-three water properties.

  3. Improvements to sample processing and measurement to enable more widespread environmental application of tritium

    Energy Technology Data Exchange (ETDEWEB)

    Moran, James; Alexander, Thomas; Aalseth, Craig; Back, Henning; Mace, Emily; Overman, Cory; Seifert, Allen; Freeburg, Wilcox


    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of T behavior in the environment.

  4. Validity and reliability of the Experience-Sampling Method. (United States)

    Csikszentmihalyi, M; Larson, R


    To understand the dynamics of mental health, it is essential to develop measures for the frequency and patterning of mental processes in everyday life situations. The Experience-Sampling Method (ESM) is an attempt to provide a valid instrument to describe variations in self-reports of mental processes. It can be used to obtain empirical data on the following types of variables: a) frequency and patterning of daily activity, social interaction, and changes in location; b) frequency, intensity, and patterning of psychological states, i.e., emotional, cognitive, and conative dimensions of experience; c) frequency and patterning of thoughts, including quality and intensity of thought disturbance. The article reviews practical and methodological issues of the ESM and presents evidence for its short- and long-term reliability when used as an instrument for assessing the variables outlined above. It also presents evidence for validity by showing correlations between ESM measures on the one hand and physiological measures, one-time psychological tests, and behavioral indices on the other. A number of studies with normal and clinical populations that have used the ESM are reviewed to demonstrate the range of issues to which the technique can be usefully applied.

  5. Passive sampling methods for contaminated sediments: risk assessment and management. (United States)

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F


    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. © 2014

  6. Passive sampling methods for contaminated sediments: Risk assessment and management (United States)

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F


    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. Integr

  7. Field evaluation of personal sampling methods for multiple bioaerosols.

    Directory of Open Access Journals (Sweden)

    Chi-Hsun Wang

    Full Text Available Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed on a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to clog at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  8. Adaptive cluster sampling: An efficient method for assessing inconspicuous species (United States)

    Andrea M. Silletti; Joan Walker


    Restorationists typically evaluate the success of a project by estimating the population sizes of species that have been planted or seeded. Because a total census is rarely feasible, they must rely on sampling methods for population estimates. However, traditional random sampling designs may be inefficient for species that, for one reason or another, are challenging to...

  9. Proteome sampling by the HLA class I antigen processing pathway.

    Directory of Open Access Journals (Sweden)

    Ilka Hoof

    Full Text Available The peptide repertoire that is presented by the set of HLA class I molecules of an individual is formed by the different players of the antigen processing pathway and the stringent binding environment of the HLA class I molecules. Peptide elution studies have shown that only a subset of the human proteome is sampled by the antigen processing machinery and represented on the cell surface. In our study, we quantified the role of each factor relevant in shaping the HLA class I peptide repertoire by combining peptide elution data, in silico predictions of antigen processing and presentation, and data on gene expression and protein abundance. Our results indicate that gene expression level, protein abundance, and rate of potential binding peptides per protein have a clear impact on sampling probability. Furthermore, once a protein is available for the antigen processing machinery in sufficient amounts, C-terminal processing efficiency and binding affinity to the HLA class I molecule determine the identity of the presented peptides. Having studied the impact of each of these factors separately, we subsequently combined all factors in a logistic regression model in order to quantify their relative impact. This model demonstrated the superiority of protein abundance over gene expression level in predicting sampling probability. Being able to discriminate between sampled and non-sampled proteins to a significant degree, our approach can potentially be used to predict the sampling probability of self proteins and of pathogen-derived proteins, which is of importance for the identification of autoimmune antigens and vaccination targets.

  10. A simple capacitive method to evaluate ethanol fuel samples (United States)

    Vello, Tatiana P.; de Oliveira, Rafael F.; Silva, Gustavo O.; de Camargo, Davi H. S.; Bufon, Carlos C. B.


    Ethanol is a biofuel used worldwide. However, the presence of excessive water either during the distillation process or by fraudulent adulteration is a major concern in the use of ethanol fuel. High water levels may cause engine malfunction, in addition to being considered illegal. Here, we describe the development of a simple, fast and accurate platform based on nanostructured sensors to evaluate ethanol samples. The device fabrication is facile, based on standard microfabrication and thin-film deposition methods. The sensor operation relies on capacitance measurements employing a parallel plate capacitor containing a conformational aluminum oxide (Al2O3) thin layer (15 nm). The sensor operates over the full range water concentration, i.e., from approximately 0% to 100% vol. of water in ethanol, with water traces being detectable down to 0.5% vol. These characteristics make the proposed device unique with respect to other platforms. Finally, the good agreement between the sensor response and analyses performed by gas chromatography of ethanol biofuel endorses the accuracy of the proposed method. Due to the full operation range, the reported sensor has the technological potential for use as a point-of-care analytical tool at gas stations or in the chemical, pharmaceutical, and beverage industries, to mention a few.

  11. Martian Radiative Transfer Modeling Using the Optimal Spectral Sampling Method (United States)

    Eluszkiewicz, J.; Cady-Pereira, K.; Uymin, G.; Moncet, J.-L.


    The large volume of existing and planned infrared observations of Mars has prompted the development of a new martian radiative transfer model that could be used in the retrievals of atmospheric and surface properties. The model is based on the Optimal Spectral Sampling (OSS) method [1]. The method is a fast and accurate monochromatic technique applicable to a wide range of remote sensing platforms (from microwave to UV) and was originally developed for the real-time processing of infrared and microwave data acquired by instruments aboard the satellites forming part of the next-generation global weather satellite system NPOESS (National Polar-orbiting Operational Environmental Satellite System) [2]. As part of our ongoing research related to the radiative properties of the martian polar caps, we have begun the development of a martian OSS model with the goal of using it to perform self-consistent atmospheric corrections necessary to retrieve cap emissivity from the Thermal Emission Spectrometer (TES) spectra. While the caps will provide the initial focus area for applying the new model, it is hoped that the model will be of interest to the wider Mars remote sensing community.

  12. DOE methods for evaluating environmental and waste management samples

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K. [eds.


    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  13. Method and sample spinning apparatus for measuring the NMR spectrum of an orientationally disordered sample (United States)

    Pines, Alexander; Samoson, Ago


    An improved NMR apparatus and method are described which substantially improve the resolution of NMR measurements made on powdered, amorphous, or otherwise orientationally disordered samples. The apparatus spins the sample about an axis. The angle of the axis is mechanically varied such that the time averages of two or more Legendre polynomials are zero.
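The averaging condition can be made concrete: spinning at a fixed axis angle averages only P2(cos θ) to zero (the magic angle), while averaging P4 as well requires the axis angles at which P4(cos θ) vanishes. A sketch computing these angles from the standard Legendre polynomials (the angle values follow from the polynomials themselves, not from the patent text):

```python
import numpy as np

# P2 as a Legendre series: coefficient 1 on the degree-2 basis polynomial.
p2_roots = np.polynomial.legendre.Legendre([0, 0, 1]).roots()
p4_roots = np.polynomial.legendre.Legendre([0, 0, 0, 0, 1]).roots()

# Axis angles theta with P2(cos theta) = 0 and P4(cos theta) = 0.
magic_angle = np.degrees(np.arccos(p2_roots[p2_roots > 0][0]))  # ~54.74 deg
dor_angles = np.degrees(np.arccos(p4_roots[p4_roots > 0]))      # ~30.56, ~70.12 deg
```

Mechanically varying the spinning axis so that the time spent at each angle averages both P2 and P4 to zero is the idea behind double-rotation and dynamic-angle spinning NMR.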

  14. Methods for sample size determination in cluster randomized trials. (United States)

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra


    The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. © The Author 2015. Published by Oxford University Press on behalf of the International Epidemiological Association.

  15. Sampling and Timing: A Task for the Environmental Process

    NARCIS (Netherlands)

    Hilderink, G.H.; Broenink, Johannes F.


    Sampling and timing is considered a responsibility of the environment of controller software. In this paper we will illustrate a concept whereby an environmental process and multi-way events play an important role in applying timing for untimed CSP software architectures. We use this timing concept

  16. Soil separator and sampler and method of sampling (United States)

    O'Brien, Barry H [Idaho Falls, ID; Ritter, Paul D [Idaho Falls, ID


    A soil sampler includes a fluidized bed for receiving a soil sample. The fluidized bed may be in communication with a vacuum for drawing air through the fluidized bed and suspending particulate matter of the soil sample in the air. In a method of sampling, the air may be drawn across a filter, separating the particulate matter. Optionally, a baffle or a cyclone may be included within the fluidized bed for disentrainment, or dedusting, so only the finest particulate matter, including asbestos, will be trapped on the filter. The filter may be removable, and may be tested to determine the content of asbestos and other hazardous particulate matter in the soil sample.

  17. Using re-sampling methods in mortality studies.

    Directory of Open Access Journals (Sweden)

    Igor Itskovich

    Full Text Available Traditional methods of computing standardized mortality ratios (SMR) in mortality studies rely upon a number of conventional statistical propositions to estimate confidence intervals for the obtained values. Those propositions include a common but arbitrary choice of confidence level and the assumption that the observed number of deaths in the test sample is a purely random quantity. The latter assumption may not be fully justified for a series of periodic "overlapping" studies. We propose a new approach to evaluating the SMR, along with its confidence interval, based on a simple re-sampling technique. The proposed method is straightforward and requires neither the above assumptions nor the rigorous sample-selection techniques employed by modern re-sampling theory. Instead, we include in the re-sampling analysis all possible samples that correspond to the specified time window of the study. As a result, directly obtained confidence intervals for repeated overlapping studies may be tighter than those yielded by conventional methods. The proposed method is illustrated by evaluating mortality due to a hypothetical risk factor in a life insurance cohort. With this method, the SMR values can be forecast more precisely than with the traditional approach. As a result, the corresponding risk assessment would carry smaller uncertainties.
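For contrast with the authors' scheme, a minimal re-sampling confidence interval for an SMR = observed/expected deaths can be sketched as a plain bootstrap (cohort numbers are hypothetical, and this does not reproduce the paper's overlapping-window approach):

```python
import random

random.seed(1)

# Hypothetical insured cohort: per-person death indicator and expected deaths.
observed = [1] * 30 + [0] * 970      # 30 deaths among 1000 insureds
expected = [0.02] * 1000             # expected deaths from a standard table

def smr(obs, exp):
    return sum(obs) / sum(exp)

# Percentile bootstrap: resample the cohort with replacement and recompute
# the SMR each time.
boot = []
n = len(observed)
for _ in range(2000):
    idx = [random.randrange(n) for _ in range(n)]
    boot.append(smr([observed[i] for i in idx], [expected[i] for i in idx]))
boot.sort()
point = smr(observed, expected)
lo, hi = boot[49], boot[1949]        # approximate 95% percentile interval
```

The paper's point is that for overlapping periodic studies one can enumerate all window-consistent samples instead of drawing them at random, which can tighten the resulting interval.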


    Energy Technology Data Exchange (ETDEWEB)

    Martino, C.; Herman, D.; Pike, J.; Peters, T.


    Filtration within the Actinide Removal Process (ARP) currently limits the throughput in interim salt processing at the Savannah River Site. In this process, batches of salt solution with Monosodium Titanate (MST) sorbent are concentrated by crossflow filtration. The filtrate is subsequently processed to remove cesium in the Modular Caustic Side Solvent Extraction Unit (MCU) followed by disposal in saltstone grout. The concentrated MST slurry is washed and sent to the Defense Waste Processing Facility (DWPF) for vitrification. During recent ARP processing, there has been a degradation of filter performance manifested as the inability to maintain high filtrate flux throughout a multi-batch cycle. The objectives of this effort were to characterize the feed streams, to determine if solids (in addition to MST) are precipitating and causing the degraded performance of the filters, and to assess the particle size and rheological data to address potential filtration impacts. Equilibrium modelling with OLI Analyzer™ and OLI ESP™ was performed to determine chemical components at risk of precipitation and to simulate the ARP process. The performance of ARP filtration was evaluated to review potential causes of the observed filter behavior. Task activities for this study included extensive physical and chemical analysis of samples from the Late Wash Pump Tank (LWPT) and the Late Wash Hold Tank (LWHT) within ARP as well as samples of the tank farm feed from Tank 49H. The samples from the LWPT and LWHT were obtained from several stages of processing of Salt Batch 6D, Cycle 6, Batch 16.

  19. Degradation process of lead chromate in paintings by Vincent van Gogh studied by means of spectromicroscopic methods. 4. Artificial aging of model samples of co-precipitates of lead chromate and lead sulfate. (United States)

    Monico, Letizia; Janssens, Koen; Miliani, Costanza; Van der Snickt, Geert; Brunetti, Brunetto Giovanni; Cestelli Guidi, Mariangela; Radepont, Marie; Cotte, Marine


    Previous investigations about the darkening of chrome yellow pigments revealed that this form of alteration is attributable to a reduction of the original Cr(VI) to Cr(III), and that the presence of sulfur-containing compounds, most often sulfates, plays a key role during this process. We recently demonstrated that different crystal forms of chrome yellow pigments (PbCrO(4) and PbCr(1-x)S(x)O(4)) are present in paintings by Vincent van Gogh. In the present work, we show how both the chemical composition and the crystalline structure of lead chromate-based pigments influence their stability. For this purpose, oil model samples made with in-house synthesized powders of PbCrO(4) and PbCr(1-x)S(x)O(4) were artificially aged and characterized. We observed a profound darkening only for those paint models made with PbCr(1-x)S(x)O(4), rich in SO(4)(2-) (x ≥ 0.4), and orthorhombic phases (>30 wt %). Cr and S K-edge micro X-ray absorption near edge structure investigations revealed in an unequivocal manner the formation of up to about 60% of Cr(III)-species in the outer layer of the most altered samples; conversely, independent of the paint models' chemical composition, no change in the S-oxidation state was observed. Analyses employing UV-visible diffuse reflectance and Fourier transform infrared spectroscopy were performed on unaged and aged model samples in order to obtain additional information on the physicochemical changes induced by the aging treatment.

  20. Heat-capacity measurements on small samples: The hybrid method

    NARCIS (Netherlands)

    Klaasse, J.C.P.; Brück, E.H.


    A newly developed method is presented for measuring heat capacities of small samples, particularly where thermal isolation is not sufficient for the use of the traditional semiadiabatic heat-pulse technique. This "hybrid technique" is a modification of the heat-pulse method for cases in which the temperature

  1. Method of determining an electrical property of a test sample

    DEFF Research Database (Denmark)


    A method of obtaining an electrical property of a test sample, comprising a non-conductive area and a conductive or semi-conductive test area, by performing multiple measurements using a multi-point probe. The method comprising the steps of providing a magnetic field having field lines passing...... the electrical property of the test area....

  2. Passive sampling methods for contaminated sediments: Scientific rationale supporting use of freely dissolved concentrations

    DEFF Research Database (Denmark)

    Mayer, Philipp; Parkerton, Thomas F.; Adams, Rachel G.


    Passive sampling methods (PSMs) allow the quantification of the freely dissolved concentration (Cfree ) of an organic contaminant even in complex matrices such as sediments. Cfree is directly related to a contaminant's chemical activity, which drives spontaneous processes including diffusive upta...

  3. Efficiency of snake sampling methods in the Brazilian semiarid region. (United States)

    Mesquita, Paula C M D; Passos, Daniel C; Cechin, Sonia Z


    The choice of sampling methods is a crucial step in every herpetological field survey, and in countries where time and financial support are limited it is critical. The methods used to sample snakes often lack objective criteria, with tradition apparently carrying more weight than suitability; consequently, studies using non-standardized methods are common in the literature. We compared four methods commonly used to sample snake assemblages in a semiarid area of Brazil, evaluating the cost-benefit of each in terms of the number of individuals and species captured, time, and financial investment. We found that pitfall traps were the least effective method in all aspects evaluated and were not complementary to the other methods in terms of species abundance or assemblage structure. We conclude that methods can only be considered complementary if they are standardized to the objectives of the study. The use of pitfall traps in short-term surveys of snake fauna in areas with shrubby vegetation and stony soil is not recommended.

  4. Processing module operating methods, processing modules, and communications systems (United States)

    McCown, Steven Harvey; Derr, Kurt W.; Moore, Troy


    A processing module operating method includes using a processing module physically connected to a wireless communications device, requesting that the wireless communications device retrieve encrypted code from a web site and receiving the encrypted code from the wireless communications device. The wireless communications device is unable to decrypt the encrypted code. The method further includes using the processing module, decrypting the encrypted code, executing the decrypted code, and preventing the wireless communications device from accessing the decrypted code. Another processing module operating method includes using a processing module physically connected to a host device, executing an application within the processing module, allowing the application to exchange user interaction data communicated using a user interface of the host device with the host device, and allowing the application to use the host device as a communications device for exchanging information with a remote device distinct from the host device.
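The relay-and-isolate flow described above can be sketched as a toy (illustrative only; a real module would use vetted cryptography, not this SHA-256-based XOR keystream, and the key name and payload are hypothetical): the host transports an encrypted payload it cannot read, and only the module decrypts and executes it.

```python
# Toy sketch of the described flow: host relays ciphertext, module decrypts.
import hashlib

SECRET = b"module-only-key"          # hypothetical key, known only to the module

def keystream(key, n):
    """Derive n pseudo-random bytes from the key (toy construction)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_bytes(data, key):
    """XOR data against the keystream; applying it twice restores the input."""
    return bytes(d ^ k for d, k in zip(data, keystream(key, len(data))))

# "Web site" side: encrypt a code snippet that the host merely transports.
plaintext = b"result = 6 * 7"
encrypted = xor_bytes(plaintext, SECRET)

# Host side: sees only ciphertext and cannot decrypt it.
assert encrypted != plaintext

# Module side: decrypt and execute; the plaintext never leaves the module.
namespace = {}
exec(xor_bytes(encrypted, SECRET).decode(), namespace)
```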

  5. A multi-dimensional sampling method for locating small scatterers (United States)

    Song, Rencheng; Zhong, Yu; Chen, Xudong


    A multiple signal classification (MUSIC)-like multi-dimensional sampling method (MDSM) is introduced to locate small three-dimensional scatterers using electromagnetic waves. The indicator is built from the most stable part of the signal subspace of the multi-static response matrix on a set of combinatorial sampling nodes inside the domain of interest. It has two main advantages over conventional MUSIC methods. First, the MDSM is more robust against noise. Second, it can work with a single incidence even for multiple scatterers. Numerical simulations are presented to show the good performance of the proposed method.
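The subspace idea behind MUSIC-type indicators can be illustrated with a minimal sketch: build a synthetic multi-static response matrix, extract the noise subspace by SVD, and evaluate an indicator that peaks where the steering vector is orthogonal to that subspace. The array geometry, wavenumber, and 2-D scalar Green's function below are assumptions for a toy setup, not the paper's MDSM.

```python
# MUSIC-style localization sketch on synthetic data (toy setup).
import numpy as np

k = 2 * np.pi                        # wavenumber (wavelength = 1)
# Transceiver array along a line above the search domain.
tx = np.stack([np.linspace(-2, 2, 16), np.full(16, 3.0)], axis=1)

def green(p, q):
    """Scalar 2-D free-space Green's function (far-field form)."""
    r = np.linalg.norm(p - q)
    return np.exp(1j * k * r) / np.sqrt(r)

scatterers = [np.array([0.5, 0.0])]              # true (unknown) scatterer
G = np.array([[green(t, s) for s in scatterers] for t in tx])   # 16 x 1
K = G @ G.T                                      # Born-approximation MSR matrix

U, svals, _ = np.linalg.svd(K)
signal_dim = int(np.sum(svals > 1e-8 * svals[0]))
Un = U[:, signal_dim:]                           # noise subspace

# Indicator on a sampling grid: peaks where g(x) lies in the signal subspace.
best, best_val = None, -np.inf
for x in np.linspace(-1, 1, 41):
    for y in np.linspace(-1, 1, 41):
        g = np.array([green(t, np.array([x, y])) for t in tx])
        val = 1.0 / (np.linalg.norm(Un.conj().T @ g) + 1e-12)
        if val > best_val:
            best, best_val = (x, y), val
```

With one scatterer, `signal_dim` is 1 and the indicator maximum lands on the grid node at the true position.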

  6. Advanced Markov chain Monte Carlo methods learning from past samples

    CERN Document Server

    Liang, Faming; Carrol, Raymond J


    This book provides comprehensive coverage of simulation of complex systems using Monte Carlo methods. Developing algorithms that are immune to the local trap problem has long been considered the most important topic in MCMC research. Various advanced MCMC algorithms that address this problem have been developed, including the modified Gibbs sampler, methods based on auxiliary variables, and methods making use of past samples. The focus of this book is on the algorithms that make use of past samples. This book includes the multicanonical algorithm, dynamic weighting, dynamically weight
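For contrast with the advanced samplers above, the baseline they improve upon is plain random-walk Metropolis, which uses only the current sample. A minimal sketch (not one of the book's past-sample algorithms) targeting a standard normal density:

```python
# Minimal random-walk Metropolis sampler targeting N(0, 1).
import math
import random

random.seed(42)

def log_target(x):
    return -0.5 * x * x          # log of the N(0, 1) density, up to a constant

def metropolis(n_samples, step=1.0, x0=0.0):
    x, chain = x0, []
    for _ in range(n_samples):
        prop = x + random.gauss(0.0, step)
        # Accept with probability min(1, pi(prop) / pi(x)).
        if math.log(random.random()) < log_target(prop) - log_target(x):
            x = prop
        chain.append(x)
    return chain

chain = metropolis(50_000)
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

On a strongly multimodal target, such a chain can stall in one mode for long stretches; that is the "local trap" problem the book's past-sample-based methods address.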

  7. Perfluoroalkyl Acid Concentrations in Blood Samples Subjected to Transportation and Processing Delay

    DEFF Research Database (Denmark)

    Bach, Cathrine Carlsen; Henriksen, Tine Brink; Bossi, Rossana


    BACKGROUND: In studies of perfluoroalkyl acids, the validity and comparability of measured concentrations may be affected by differences in the handling of biospecimens. We aimed to investigate whether measured plasma levels of perfluoroalkyl acids differed between blood samples subjected to delay...... and transportation prior to processing and samples with immediate processing and freezing. METHODS: Pregnant women recruited at Aarhus University Hospital, Denmark, (n = 88) provided paired blood samples. For each pair of samples, one was immediately processed and plasma was frozen, and the other was delayed...... and transported as whole blood before processing and freezing of plasma (similar to the Danish National Birth Cohort). We measured 12 perfluoroalkyl acids and present results for compounds with more than 50% of samples above the lower limit of quantification. RESULTS: For samples taken in the winter, relative...

  8. Collecting Samples in Gale Crater, Mars; an Overview of the Mars Science Laboratory Sample Acquisition, Sample Processing and Handling System (United States)

    Anderson, R. C.; Jandura, L.; Okon, A. B.; Sunshine, D.; Roumeliotis, C.; Beegle, L. W.; Hurowitz, J.; Kennedy, B.; Limonadi, D.; McCloskey, S.; Robinson, M.; Seybold, C.; Brown, K.


    The Mars Science Laboratory Mission (MSL), scheduled to land on Mars in the summer of 2012, consists of a rover and a scientific payload designed to identify and assess the habitability, geological, and environmental histories of Gale crater. Unraveling the geologic history of the region and providing an assessment of present and past habitability requires an evaluation of the physical and chemical characteristics of the landing site; this includes providing an in-depth examination of the chemical and physical properties of Martian regolith and rocks. The MSL Sample Acquisition, Processing, and Handling (SA/SPaH) subsystem will be the first in-situ system designed to acquire interior rock and soil samples from Martian surface materials. These samples are processed and separated into fine particles and distributed to two onboard analytical science instruments, SAM (Sample Analysis at Mars Instrument Suite) and CheMin (Chemistry and Mineralogy), or to a sample analysis tray for visual inspection. The SA/SPaH subsystem is also responsible for the placement of the two contact instruments, the Alpha Particle X-Ray Spectrometer (APXS) and the Mars Hand Lens Imager (MAHLI), on rock and soil targets. Finally, there is a Dust Removal Tool (DRT) to remove dust particles from rock surfaces for subsequent analysis by the contact and/or mast-mounted instruments (e.g. the Mast Cameras (MastCam) and the Chemistry and Micro-Imaging instruments (ChemCam)).

  9. Designing a Repetitive Group Sampling Plan for Weibull Distributed Processes

    Directory of Open Access Journals (Sweden)

    Aijun Yan


    Full Text Available Acceptance sampling plans are useful tools to determine whether submitted lots should be accepted or rejected. An efficient and economical sampling plan is very desirable for the high quality levels required by production processes. The process capability index CL is an important quality parameter for measuring product quality. Utilizing the relationship between the CL index and the nonconforming rate, a repetitive group sampling (RGS) plan based on the CL index is developed in this paper for the case where the quality characteristic follows the Weibull distribution. The optimal parameters of the proposed RGS plan are determined by satisfying the commonly used producer's risk and consumer's risk simultaneously while minimizing the average sample number (ASN), and are tabulated for different combinations of acceptance quality level (AQL) and limiting quality level (LQL). The results show that the proposed plan outperforms the single sampling plan in terms of ASN. Finally, the proposed RGS plan is illustrated with an industrial example.
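The repetitive-sampling mechanics behind the ASN comparison can be illustrated with a minimal attributes (binomial) sketch: accept or reject when a group is decisive, otherwise draw a new group. The plan parameters and quality levels below are hypothetical, and this is not the paper's variables plan based on the CL index.

```python
# Sketch of repetitive group sampling (RGS) mechanics for a binomial model.
from math import comb

def binom_cdf(c, n, p):
    """P(defects <= c) for a sample of n items with defect rate p."""
    return sum(comb(n, d) * p**d * (1 - p)**(n - d) for d in range(c + 1))

def rgs_oc_asn(n, c_a, c_r, p):
    """Acceptance probability and average sample number for an RGS plan:
    accept if defects <= c_a, reject if defects > c_r, otherwise resample."""
    pa = binom_cdf(c_a, n, p)              # accept on a single group
    pr = 1.0 - binom_cdf(c_r, n, p)        # reject on a single group
    p_decide = pa + pr                     # probability a group is decisive
    oc = pa / p_decide                     # overall acceptance probability
    asn = n / p_decide                     # expected total number sampled
    return oc, asn

oc_aql, asn_aql = rgs_oc_asn(n=50, c_a=1, c_r=3, p=0.01)   # good quality lot
oc_lql, asn_lql = rgs_oc_asn(n=50, c_a=1, c_r=3, p=0.10)   # poor quality lot
```

Because indecisive groups trigger resampling, the ASN always exceeds the group size n; a well-chosen plan keeps that overhead small while discriminating sharply between AQL and LQL quality.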

  10. Sample preparation method for glass welding by ultrashort laser pulses yields higher seam strength. (United States)

    Cvecek, K; Miyamoto, I; Strauss, J; Wolf, M; Frick, T; Schmidt, M


    Glass welding by ultrashort laser pulses allows joining without the need of an absorber or a preheating and postheating process. However, cracks generated during the welding process substantially impair the joining strength of the welding seams. In this paper a sample preparation method is described that prevents the formation of cracks. The measured joining strength of samples prepared by this method is substantially higher than previously reported values.


    Directory of Open Access Journals (Sweden)

    O. Honcharova


    Full Text Available The article is devoted to the analysis of the process management approach. The main understandings of the process management approach are reviewed, and definitions of process and process management are given. Methods of business process improvement are also analyzed, among them fast-analysis solution technology (FAST), benchmarking, reprojecting, and reengineering. The main results of applying business process improvement are described in figures for reduced cycle time, costs, and errors. The tasks and main stages of business process reengineering are outlined, and the main efficiency results of business process reengineering and its success factors are determined.

  12. Sample size formulae for the Bayesian continual reassessment method. (United States)

    Cheung, Ying Kuen


    In the planning of a dose finding study, a primary design objective is to maintain high accuracy in terms of the probability of selecting the maximum tolerated dose. While numerous dose finding methods have been proposed in the literature, concrete guidance on sample size determination is lacking. With a motivation to provide quick and easy calculations during trial planning, we present closed form formulae for sample size determination associated with the use of the Bayesian continual reassessment method (CRM). We examine the sampling distribution of a nonparametric optimal design and exploit it as a proxy to empirically derive an accuracy index of the CRM using linear regression. We apply the formulae to determine the sample size of a phase I trial of PTEN-long in pancreatic cancer patients and demonstrate that the formulae give results very similar to simulation. The formulae are implemented by an R function 'getn' in the package 'dfcrm'. The results are developed for the Bayesian CRM and should be validated by simulation when used for other dose finding methods. The analytical formulae we propose give quick and accurate approximation of the required sample size for the CRM. The approach used to derive the formulae can be applied to obtain sample size formulae for other dose finding methods.
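The CRM that these sample size formulae serve can itself be sketched as a grid-based Bayesian simulation. This is not the paper's closed-form formulae or the `dfcrm` implementation; the skeleton, prior scale, and true toxicity probabilities below are assumptions for illustration.

```python
# Simulation sketch of a one-parameter Bayesian CRM dose-finding trial.
import numpy as np

rng = np.random.default_rng(0)

skeleton = np.array([0.05, 0.12, 0.25, 0.40, 0.55])   # prior toxicity guesses
target = 0.25                                          # target toxicity rate
true_tox = np.array([0.02, 0.08, 0.25, 0.45, 0.60])   # assumed truth

a_grid = np.linspace(-3, 3, 601)
prior = np.exp(-0.5 * a_grid**2 / 1.34)               # N(0, 1.34) prior on a
prior /= prior.sum()

def dose_tox(a):
    """Power ('empiric') model: p_i(a) = skeleton_i ** exp(a)."""
    return skeleton[None, :] ** np.exp(a)[:, None]

def run_trial(n=20, start=0):
    log_post = np.log(prior)
    dose = start
    for _ in range(n):
        tox = rng.random() < true_tox[dose]            # observe one patient
        p = dose_tox(a_grid)[:, dose]
        log_post += np.log(p if tox else 1 - p)        # Bayesian update
        w = np.exp(log_post - log_post.max())
        w /= w.sum()
        est = (w[:, None] * dose_tox(a_grid)).sum(axis=0)  # posterior mean tox
        dose = int(np.argmin(np.abs(est - target)))    # next/recommended dose
    return dose

mtd = run_trial()
```

Accuracy here is the probability that `mtd` equals the true maximum tolerated dose across repeated trials, which is exactly the quantity the paper's formulae let one size n for without simulation.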

  13. Digital image processing mathematical and computational methods

    CERN Document Server

    Blackledge, J M


    This authoritative text (the second part of a complete MSc course) provides mathematical methods required to describe images, image formation and different imaging systems, coupled with the principle techniques used for processing digital images. It is based on a course for postgraduates reading physics, electronic engineering, telecommunications engineering, information technology and computer science. This book relates the methods of processing and interpreting digital images to the 'physics' of imaging systems. Case studies reinforce the methods discussed, with examples of current research

  14. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA


    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity in, for instance, finance developed quite in parallel with its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). The objective of this paper was therefore to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). It serves as the first of a two-phase study, which sampled one hundred (100) risk analysts at a university in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk, by logistic regression and percentages, was conducted to investigate whether there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square of X2 = 8.181 (p = 0.300), indicating a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  15. Survival-time statistics for sample space reducing stochastic processes. (United States)

    Yadav, Avinash Chand


    Stochastic processes wherein the size of the state space changes as a function of time offer models for the emergence of scale-invariant features observed in complex systems. I consider such a sample-space reducing (SSR) stochastic process that results in a random sequence of strictly decreasing integers {x(t)}, 0 ≤ t ≤ τ, with boundary conditions x(0) = N and x(τ) = 1. This model is shown to be exactly solvable: P_N(τ), the probability that the process survives for time τ, is analytically evaluated. In the limit of large N, the asymptotic form of this probability distribution is Gaussian, with mean and variance both varying logarithmically with system size: ⟨τ⟩ ∼ ln N and σ_τ² ∼ ln N. Correspondence can be made between survival-time statistics in the SSR process and record statistics of independent and identically distributed random variables.
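The logarithmic scaling is easy to check by simulation. In the sketch below (parameters are hypothetical), each jump is uniform on the reduced sample space {1, ..., x-1}; for this uniform rule the expected survival time works out to the harmonic number H_{N-1}, consistent with the ⟨τ⟩ ∼ ln N scaling quoted above.

```python
# Simulation sketch of a sample-space reducing (SSR) process.
import random

random.seed(1)

def ssr_survival_time(N):
    """Number of uniform downward jumps needed to go from N to 1."""
    x, steps = N, 0
    while x > 1:
        x = random.randint(1, x - 1)   # uniform on the reduced sample space
        steps += 1
    return steps

N, trials = 64, 20_000
mean_tau = sum(ssr_survival_time(N) for _ in range(trials)) / trials
harmonic = sum(1.0 / i for i in range(1, N))   # H_{N-1} ~ ln N + 0.577...
```

The recursion T(x) = 1 + (1/(x-1)) Σ_{y<x} T(y) with T(1) = 0 gives T(N) = H_{N-1} exactly, so the simulated mean should sit on the harmonic number up to sampling noise.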


    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, S.; Culligan, B.


    A new rapid separation method for radiostrontium in emergency milk samples was developed at the Savannah River Site (SRS) Environmental Bioassay Laboratory (Aiken, SC, USA) that will allow rapid separation and measurement of Sr-90 within 8 hours. The new method uses calcium phosphate precipitation, nitric acid dissolution of the precipitate to coagulate residual fat/proteins, and a rapid strontium separation using Sr Resin (Eichrom Technologies, Darien, IL, USA) with vacuum-assisted flow rates. The method is much faster than previous methods that use calcination or cation-exchange pretreatment, has excellent chemical recovery, and effectively removes beta interferences. When a 100 ml sample aliquot is used, the method has a detection limit of 0.5 Bq/L, well below generic emergency action levels.

  17. Spin column extraction as a new sample preparation method in bioanalysis. (United States)

    Namera, Akira; Saito, Takashi


    Sample preparation is important for obtaining accurate data for qualification and quantification in bioanalysis. We have recently focused on monolithic silica for high-throughput analysis. The extraction steps - sample loading, washing, and elution - are executed by centrifugation using monolithic silica packed in a spin column. The format also offers possibilities such as on-column derivatization for the determination of amines or carboxylic acids in the sample. Spin column extraction reduces the sample preparation time required for the determination of drugs and other chemicals in biological materials and increases productivity in bioanalysis. We expect spin column extraction to become the mainstream method of sample processing in the future.

  18. Characterizing lentic freshwater fish assemblages using multiple sampling methods. (United States)

    Fischer, Jesse R; Quist, Michael C


    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to obtain reliable estimates of indices such as species richness. However, most research on lentic fish sampling methodology has targeted recreationally important species, and little information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48-1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and the number of individuals sampled using multiple gear combinations demonstrated no appreciable benefit over relatively few gears (e.g., to four) used in optimal seasons. Specifically, over 90 % of the species encountered with all gear-type and season combinations (N = 19) from the six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake vs. impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to the monitoring of lentic ecosystems and will provide researchers with increased reliability in the interpretations and decisions made using information on lentic fish assemblages.
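The way gear-specific detection probabilities combine is worth making explicit: if gear i detects a species with probability p_i, and detections are independent, the chance of detecting it with at least one gear is 1 - Π(1 - p_i). The per-gear values below are hypothetical illustrations, not the study's estimates.

```python
# Combined detection probability across independent gear/season combinations.
def combined_detection(probs):
    """P(detected by at least one gear) = 1 - product of miss probabilities."""
    miss = 1.0
    for p in probs:
        miss *= (1.0 - p)
    return 1.0 - miss

# Hypothetical per-gear detection probabilities for one species.
gears = {
    "fall night electrofishing": 0.70,
    "summer benthic trawl": 0.40,
    "summer modified-fyke net": 0.50,
    "summer mini-fyke net": 0.30,
}
p_all = combined_detection(gears.values())
```

This diminishing-returns structure is why a handful of complementary gears in their optimal seasons can approach the richness captured by the full gear set.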

  19. Health care process modelling: which method when? (United States)

    Jun, Gyuchan Thomas; Ward, James; Morris, Zoe; Clarkson, John


    The role of process modelling has been widely recognized for effective quality improvement. However, application in health care is somewhat limited since the health care community lacks knowledge about the broad range of methods and their applicability to health care. Therefore, the objectives of this paper are to present a summary description of a limited number of distinct modelling methods and to evaluate how health care workers perceive them. Various process modelling methods from several different disciplines were reviewed and characterized. Case studies in three different health care scenarios were carried out to model those processes and evaluate how health care workers perceive the usability and utility of the process models. Eight distinct modelling methods were identified and characterized by what the modelling elements in each explicitly represent. Flowcharts, which had been most extensively used by the participants, were most favoured in terms of their usability and utility. However, some alternative methods, although used by a much smaller number of participants, were considered helpful, specifically in understanding certain aspects of complex processes, e.g. communication diagrams for understanding interactions, swim lane activity diagrams for roles and responsibilities, and state transition diagrams for a patient-centred perspective. We believe that it is important to make the various process modelling methods more easily accessible to health care by providing clear guidelines or computer-based tool support for health care-specific process modelling. These supports can assist health care workers to apply initially unfamiliar, but ultimately more effective, modelling methods.

  20. Self-contained cryogenic gas sampling apparatus and method (United States)

    McManus, G.J.; Motes, B.G.; Bird, S.K.; Kotter, D.K.


    Apparatus for obtaining a whole gas sample is composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing it at selected times; a portable power source connected to supply operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling its interior to cryogenic temperatures. A method is described for obtaining an air sample using the apparatus described above by: placing the apparatus at the location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close it at a selected subsequent time; and, between the selected times, maintaining the vessel at a cryogenic temperature by heat exchange with the coolant. 3 figs.

  1. Methods for Sampling and Measurement of Compressed Air Contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Stroem, L.


    In order to improve the technique for measuring oil and water entrained in a compressed air stream, a laboratory study was made of several methods for sampling and measurement. For this purpose, water or oil was injected as an artificial contaminant in thin streams into a test loop carrying dry compressed air. Sampling was performed in a vertical run, downstream of the injection point. Wall-attached liquid, coarse droplet flow, and fine droplet flow were sampled separately. The results were compared with two-phase flow theory and direct observation of liquid behaviour. In a study of sample transport through narrow tubes, it was observed that, below a certain liquid loading, the sample did not move, the liquid remaining stationary on the tubing wall. The basic analysis of the collected samples was made by gravimetric methods. Adsorption tubes were used with success to measure water vapour. A humidity meter with an aluminium-oxide-type sensor was found to be unreliable. Oil could be measured selectively by a flame ionization detector, the sample being pretreated in an evaporation-condensation unit.

  2. Process tracing methods: foundation and guidelines

    DEFF Research Database (Denmark)

    Beach, Derek; Pedersen, Rasmus Brun

    Derek Beach and Rasmus Brun Pedersen have written the first practical guide for using process tracing in social science research. The book introduces a more refined definition of what process tracing methods are, differentiating it into three variants, showing the uses and limitations of each...... a set of tools for how the three variants of process tracing methods can be used in research, introducing a set of practical guidelines for each stage of the research process (working with theories, developing empirical tests, working with evidence, and case selection strategies, nesting case studies...


    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, S.; Culligan, B.


    The Savannah River Site Environmental Bioassay Lab participated in the 2008 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2008. A new rapid column separation method was used for analysis of actinides and {sup 90}Sr in the NRIP 2008 emergency water and urine samples. Significant method improvements were applied to reduce analytical times. As a result, much faster analysis times were achieved, less than 3 hours for determination of {sup 90}Sr and 3-4 hours for actinides. This represents a 25%-33% improvement in analysis times over NRIP 2007 and a {approx}100% improvement compared to NRIP 2006 reported times. Column flow rates were increased by a factor of two, with no significant adverse impact on the method performance. Larger sample aliquots, shorter count times, faster cerium fluoride microprecipitation and streamlined calcium phosphate precipitation were also employed. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and {sup 90}Sr analyses for NRIP 2008 emergency urine samples. High levels of potential matrix interferences may be present in emergency samples and rugged methods are essential. Extremely high levels of {sup 210}Po were found to have an adverse effect on the uranium results for the NRIP-08 urine samples, while uranium results for NRIP-08 water samples were not affected. This problem, which was not observed for NRIP-06 or NRIP-07 urine samples, was resolved by using an enhanced {sup 210}Po removal step, which will be described.


    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, S.; Culligan, B.


    The Savannah River Site Environmental Bioassay Lab participated in the 2007 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2007. A new rapid column separation method was applied directly to the NRIP 2007 emergency urine samples, with only minimal sample preparation to reduce preparation time. Calcium phosphate precipitation, previously used to pre-concentrate actinides and Sr-90 in NRIP 2006 urine and water samples, was not used for the NRIP 2007 urine samples. Instead, the raw urine was acidified and passed directly through the stacked resin columns (TEVA+TRU+SR Resins) to separate the actinides and strontium from the NRIP urine samples more quickly. This improvement reduced sample preparation time for the NRIP 2007 emergency urine analyses significantly. This approach works well for small volume urine samples expected during an emergency response event. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and strontium-90 analyses for NRIP 2007 urine samples.

  5. An Overview of the Mars Science Laboratory Sample Acquisition, Sample Processing and Handling System (United States)

    Beegle, L. W.; Anderson, R. C.; Hurowitz, J. A.; Jandura, L.; Limonadi, D.


    The Mars Science Laboratory Mission (MSL) landed on Mars on August 5, 2012. The rover and its scientific payload are designed to identify and assess the habitability, geological, and environmental histories of Gale crater. Unraveling the geologic history of the region and providing an assessment of present and past habitability requires an evaluation of the physical and chemical characteristics of the landing site; this includes providing an in-depth examination of the chemical and physical properties of Martian regolith and rocks. The MSL Sample Acquisition, Processing, and Handling (SA/SPaH) subsystem is the first in-situ system designed to acquire interior rock and soil samples from Martian surface materials. These samples are processed and separated into fine particles and distributed to two onboard analytical science instruments, SAM (Sample Analysis at Mars Instrument Suite) and CheMin (Chemistry and Mineralogy), or to a sample analysis tray for visual inspection. The SA/SPaH subsystem is also responsible for the placement of the two contact instruments, the Alpha Particle X-Ray Spectrometer (APXS) and the Mars Hand Lens Imager (MAHLI), on rock and soil targets. Finally, there is a Dust Removal Tool (DRT) to remove dust particles from rock surfaces for subsequent analysis by the contact and/or mast-mounted instruments (e.g. the Mast Cameras (MastCam) and the Chemistry and Micro-Imaging instruments (ChemCam)). It is expected that the SA/SPaH system will have produced a scooped sample and possibly a drilled sample in the first 90 sols of the mission. Results from these activities and the ongoing testing program will be presented.

  6. Comparison of sampling methods for radiocarbon dating of carbonyls in air samples via accelerator mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Schindler, Matthias; Kretschmer, Wolfgang; Scharf, Andreas; Tschekalinskij, Alexander


    Three new methods to sample and prepare various carbonyl compounds for radiocarbon measurements were developed and tested. Two of these procedures utilized the Strecker synthetic method to form amino acids from carbonyl compounds with either sodium cyanide or trimethylsilyl cyanide. The third procedure used semicarbazide to form crystalline carbazones with the carbonyl compounds. The resulting amino acids and semicarbazones were then separated and purified using thin layer chromatography. The separated compounds were then combusted to CO{sub 2} and reduced to graphite to determine {sup 14}C content by accelerator mass spectrometry (AMS). All of these methods were also compared with the standard carbonyl compound sampling method wherein a compound is derivatized with 2,4-dinitrophenylhydrazine and then separated by high-performance liquid chromatography (HPLC).


    Directory of Open Access Journals (Sweden)

    BRAD Raluca


    Full Text Available The paper presents an automated method for the assessment and classification of puckering defects detected during the preproduction control stage of the sewing machine or during product inspection. In this respect, we present the possible causes and remedies of wrinkle nonconformities. Subjective factors related to the control environment and operators during seam evaluation can be reduced using an automated system whose operation is based on image processing. Our implementation involves spectral image analysis using the Fourier transform and an unsupervised neural network, the Kohonen map, employed to classify material specimens (the input images) into five discrete degrees of quality, from grade 5 (best) to grade 1 (worst). The puckering features present in the learning and test images were pre-classified using the seam puckering quality standard. The network training stage consists of presenting five input vectors (derived from the down-sampled arrays) representing the puckering grades. Classification consists of providing an input vector derived from the image to be classified; a scalar product between the input vector and the weighted training images is computed, and the result is assigned to the one of the five classes to which the input image belongs. Using the Kohonen network, the puckering defects were correctly classified in 71.42% of cases.
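The spectral-feature-plus-scalar-product pipeline described above can be sketched on synthetic stand-ins for the seam images. This is a nearest-prototype simplification of the Kohonen map stage; the sinusoidal "grade" textures, feature crop size, and noise level are all assumptions for illustration.

```python
# Fourier-magnitude features plus scalar-product matching on synthetic data.
import numpy as np

rng = np.random.default_rng(0)

def features(img):
    """Down-sampled, normalized 2-D FFT magnitude spectrum as a vector."""
    spec = np.abs(np.fft.fft2(img))[:8, :8].ravel()
    return spec / np.linalg.norm(spec)

# Five synthetic textures: sinusoids of increasing spatial frequency as
# crude stand-ins for increasing wrinkle density (grade 5 down to grade 1).
x = np.arange(32)
grades = [np.sin(2 * np.pi * f * x / 32)[None, :].repeat(32, axis=0)
          for f in (1, 2, 3, 4, 5)]
prototypes = np.stack([features(g) for g in grades])   # training "weights"

def classify(img):
    # Scalar product with each prototype vector; the highest response wins.
    return int(np.argmax(prototypes @ features(img)))

# A noisy version of the grade-index-2 texture should keep its class.
noisy = grades[2] + 0.1 * rng.standard_normal((32, 32))
```

Because each synthetic texture concentrates its spectrum in a distinct frequency bin, the prototype vectors are nearly orthogonal and the scalar-product rule separates them cleanly even under added noise.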

  8. Cavitation Erosion Tests Performed by Indirect Vibratory Method on Stainless Steel Welded Samples with Hardened Surface

    Directory of Open Access Journals (Sweden)

    Marian-Dumitru Nedeloni


    Full Text Available The paper presents the results of cavitation erosion tests performed on two types of samples. The sample materials are frequently used for the manufacture and repair of hydro turbine components subjected to cavitation. The first sample was made by welding an austenitic stainless steel onto an austenitic-ferritic base material. The second sample was made similarly, but with a martensitic base material. After welding, a hardening treatment by surface peening was applied to both samples. The cavitation erosion tests were performed on vibratory equipment using the indirect method with a stationary specimen. The results show good cavitation erosion resistance for both samples.

  9. Method for Hot Real-Time Sampling of Gasification Products

    Energy Technology Data Exchange (ETDEWEB)

    Pomeroy, Marc D [National Renewable Energy Laboratory (NREL), Golden, CO (United States)


    The Thermochemical Process Development Unit (TCPDU) at the National Renewable Energy Laboratory (NREL) is a highly instrumented half-ton/day pilot-scale plant capable of demonstrating industrially relevant thermochemical technologies for lignocellulosic biomass conversion, including gasification. Gasification creates primarily syngas (a mixture of hydrogen and carbon monoxide) that can be utilized with synthesis catalysts to form transportation fuels and other valuable chemicals. Biomass-derived gasification products are a very complex mixture of chemical components that typically contain sulfur and nitrogen species that can act as catalyst poisons for tar reforming and synthesis catalysts. Real-time hot online sampling techniques, such as molecular beam mass spectrometry (MBMS) and gas chromatographs with sulfur- and nitrogen-specific detectors, can provide real-time analysis and operational indicators of performance. Sampling typically requires coated sampling lines to minimize trace sulfur interactions with steel surfaces; other materials used inline have also been shown to convert sulfur species into new components and must be minimized. Residence time within the sampling lines must also be kept to a minimum to reduce further reaction chemistry. Solids from ash and char contribute to plugging and must be filtered at temperature. Experience at NREL has shown several key factors to consider when designing and installing an analytical sampling system for biomass gasification products: minimizing sampling distance, effective filtering as close to the source as possible, proper line sizing, proper line materials or coatings, even heating of all components, minimizing pressure drops, and additional filtering or traps after pressure drops.

  10. Global metabolite analysis of yeast: evaluation of sample preparation methods

    DEFF Research Database (Denmark)

    Villas-Bôas, Silas Granato; Højer-Pedersen, Jesper; Åkesson, Mats Fredrik


    Sample preparation is considered one of the limiting steps in microbial metabolome analysis. Eukaryotes and prokaryotes behave very differently during the several steps of classical sample preparation methods for analysis of metabolites. Even within the eukaryote kingdom there is a vast diversity...... of cell structures that make it imprudent to blindly adopt protocols that were designed for a specific group of microorganisms. We have therefore reviewed and evaluated the whole sample preparation procedures for analysis of yeast metabolites. Our focus has been on the current needs in metabolome analysis......, which is the analysis of a large number of metabolites with very diverse chemical and physical properties. This work reports the leakage of intracellular metabolites observed during quenching yeast cells with cold methanol solution, the efficacy of six different methods for the extraction...

  11. Sampling and analysis methods for geothermal fluids and gases

    Energy Technology Data Exchange (ETDEWEB)

    Shannon, D. W.


    The data obtained for the first round robin sample collected at Mesa 6-2 wellhead, East Mesa Test Site, Imperial Valley are summarized. Test results are listed by method used for cross reference to the analytic methods section. Results obtained for radioactive isotopes present in the brine sample are tabulated. The data obtained for the second round robin sample collected from the Woolsey No. 1 first stage flash unit, San Diego Gas and Electric Niland Test Facility are presented in the same manner. Lists of the participants of the two round robins are given. Data from miscellaneous analyses are included. Summaries of values derived from the round robin raw data are presented. (MHR)

  12. A Frequency Domain Design Method For Sampled-Data Compensators

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Jannerup, Ole Erik


A new approach to the design of a sampled-data compensator in the frequency domain is investigated. The starting point is a continuous-time compensator for the continuous-time system which satisfies specific design criteria. The new design method will graphically show how the discrete...

  13. Effect of method of sample preparation on ruminal in situ ...

    African Journals Online (AJOL)

    The objective of this study was to investigate the effect of method of sample preparation on the degradation kinetics of herbage when applying the in situ technique. Ryegrass (Lolium multiflorum cv. Midmar) was harvested at three and four weeks after cutting and fertilizing with 200 kg nitrogen (N)/ha. Freshly cut herbage ...

  14. Neonatal blood gas sampling methods | Goenka | South African ...

    African Journals Online (AJOL)

    Indwelling arterial catheters are a practical, reliable and accurate method of measuring acid-base parameters, provided they are inserted and maintained with the proper care. Capillary blood gas sampling is accurate, and a good substitute for radial 'stab' arterial puncture, avoiding many of the complications of repeated ...

  15. A General Linear Method for Equating with Small Samples (United States)

    Albano, Anthony D.


    Research on equating with small samples has shown that methods with stronger assumptions and fewer statistical estimates can lead to decreased error in the estimated equating function. This article introduces a new approach to linear observed-score equating, one which provides flexible control over how form difficulty is assumed versus estimated…
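The simplest linear observed-score equating rule that the article generalizes can be sketched as follows. This is an illustrative mean-sigma version only; the synthetic score data, function name, and sample sizes are assumptions, and the article's flexible control over assumed versus estimated form difficulty is not reproduced:

```python
import numpy as np

def linear_equate(x, scores_x, scores_y):
    """Mean-sigma linear equating: map a Form X score onto the Form Y scale.

    Hypothetical illustration; the article's general linear method adds
    flexible weighting between assumed and estimated difficulty, which is
    not reproduced here.
    """
    mu_x, sd_x = np.mean(scores_x), np.std(scores_x, ddof=1)
    mu_y, sd_y = np.mean(scores_y), np.std(scores_y, ddof=1)
    return sd_y / sd_x * (x - mu_x) + mu_y

# Synthetic small-sample data (assumed, not from the article)
rng = np.random.default_rng(0)
form_x = rng.normal(20, 5, size=50)
form_y = rng.normal(22, 6, size=50)
print(linear_equate(20.0, form_x, form_y))
```

With small samples, every extra estimated moment adds sampling error, which is why methods with fewer estimates (e.g. mean-only equating) can have lower total error.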

  16. Protein precipitation methods for sample pretreatment of grass pea ...

    African Journals Online (AJOL)

    Protein precipitation methods for sample pretreatment of grass pea extracts. Negussie Wodajo, Ghirma Moges, Theodros Solomon. Abstract. Bull. Chem. Soc. Ethiop. 1996, 10(2), 129-134. Full Text: EMAIL FREE FULL TEXT EMAIL FREE FULL TEXT · DOWNLOAD FULL TEXT DOWNLOAD FULL TEXT. Article Metrics.

  17. Sample Selected Averaging Method for Analyzing the Event Related Potential (United States)

    Taguchi, Akira; Ono, Youhei; Kimura, Tomoaki

The event related potential (ERP) is often measured through the oddball task, in which subjects are presented with a “rare stimulus” and a “frequent stimulus”. The measured ERPs are analyzed by the averaging technique: the amplitude of the P300 component of the ERP becomes large when the “rare stimulus” is given. However, some measured ERPs do not contain the original features of the ERP, so it is necessary to reject unsuitable measured ERPs before averaging. In this paper, we propose a rejection method for unsuitable measured ERPs for the averaging technique. Moreover, we combine the proposed method with Woody's adaptive filter method.
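Trial rejection followed by averaging can be sketched as below. The rejection rule used here (correlation of each trial with the grand average above a threshold) is an assumed, illustrative criterion, not the paper's exact method, and the combination with Woody's filter is omitted:

```python
import numpy as np

def selected_average(trials, threshold=0.2):
    """Average ERP trials after rejecting unsuitable ones.

    Assumed rejection rule: keep a trial only if its correlation with the
    grand average exceeds `threshold`. Returns (average, number kept).
    """
    trials = np.asarray(trials, dtype=float)
    grand = trials.mean(axis=0)
    kept = [t for t in trials if np.corrcoef(t, grand)[0, 1] > threshold]
    return np.mean(kept, axis=0), len(kept)

# Synthetic trials: a P300-like bump plus noise, and one inverted artifact
rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 200)
template = np.exp(-((t - 0.3) ** 2) / 0.002)
good = [template + 0.05 * rng.normal(size=t.size) for _ in range(10)]
bad = [-template + 0.05 * rng.normal(size=t.size)]
avg, n_kept = selected_average(good + bad)
```

On this toy data the inverted trial correlates negatively with the grand average and is rejected, so the averaged waveform keeps the clean bump.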

  18. Comparison of DNA preservation methods for environmental bacterial community samples (United States)

    Gray, Michael A.; Pratte, Zoe A.; Kellogg, Christina A.


    Field collections of environmental samples, for example corals, for molecular microbial analyses present distinct challenges. The lack of laboratory facilities in remote locations is common, and preservation of microbial community DNA for later study is critical. A particular challenge is keeping samples frozen in transit. Five nucleic acid preservation methods that do not require cold storage were compared for effectiveness over time and ease of use. Mixed microbial communities of known composition were created and preserved by DNAgard™, RNAlater®, DMSO–EDTA–salt (DESS), FTA® cards, and FTA Elute® cards. Automated ribosomal intergenic spacer analysis and clone libraries were used to detect specific changes in the faux communities over weeks and months of storage. A previously known bias in FTA® cards that results in lower recovery of pure cultures of Gram-positive bacteria was also detected in mixed community samples. There appears to be a uniform bias across all five preservation methods against microorganisms with high G + C DNA. Overall, the liquid-based preservatives (DNAgard™, RNAlater®, and DESS) outperformed the card-based methods. No single liquid method clearly outperformed the others, leaving method choice to be based on experimental design, field facilities, shipping constraints, and allowable cost.

  19. Method and apparatus for processing algae (United States)

    Chew, Geoffrey; Reich, Alton J.; Dykes, Jr., H. Waite; Di Salvo, Roberto


    Methods and apparatus for processing algae are described in which a hydrophilic ionic liquid is used to lyse algae cells. The lysate separates into at least two layers including a lipid-containing hydrophobic layer and an ionic liquid-containing hydrophilic layer. A salt or salt solution may be used to remove water from the ionic liquid-containing layer before the ionic liquid is reused. The used salt may also be dried and/or concentrated and reused. The method can operate at relatively low lysis, processing, and recycling temperatures, which minimizes the environmental impact of algae processing while providing reusable biofuels and other useful products.

  20. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Suermann, J.F.


    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits.

  1. Reliability of different methods used for forming of working samples in the laboratory for seed testing

    Directory of Open Access Journals (Sweden)

    Opra Branislava


Full Text Available The testing of seed quality starts from the moment a sample is formed in a warehouse during processing or packaging of the seed. The seed sampling as the process of obtaining the working sample also encompasses each step undertaken during its testing in the laboratory. With the aim of appropriate forming of a seed sample in the laboratory, the usage of a seed divider is prescribed for large-seeded species (seeds the size of wheat or larger) (ISTA Rules, 1999). The aim of this paper was the comparison of different methods used for obtaining the working samples of maize and wheat seeds using conical, soil and centrifugal dividers. The number of seeds of added admixtures confirmed the reliability of working sample formation. To each maize sample (1000 g), 10 seeds of the following admixtures were added: Zea mays L. (red pericarp), Hordeum vulgare L., Triticum aestivum L., and Glycine max (L.) Merr. Two methods were used for formation of the maize seed working sample. To wheat samples (1000 g), 10 seeds of each of the following species were added: Avena sativa (hulled seeds), Hordeum vulgare L., Galium tricorne Stokes, and Polygonum lapathifolium L. For formation of wheat seed working samples, four methods were used. An optimum of 9, but not less than 7, seeds of admixture were due to be determined in the maize seed working sample, while for wheat, at least one seed of admixture was expected to be found in the working sample. The obtained results confirmed that the formation of the maize seed working samples was the most reliable when the centrifugal divider, the first method, was used (average of admixture: 9.37). Of the observed admixtures, the seed of Triticum aestivum L. was the most uniformly distributed, the first method also being used (6.93). The second method gains high average values satisfying the given criterion, but it should be used with previous homogenization of the sample being tested. The forming of wheat seed working samples is the most reliable if the

  2. Standard methods for sampling freshwater fishes: Opportunities for international collaboration (United States)

    Bonar, Scott A.; Mercado-Silva, Norman; Hubert, Wayne A.; Beard, Douglas; Dave, Göran; Kubečka, Jan; Graeb, Brian D. S.; Lester, Nigel P.; Porath, Mark T.; Winfield, Ian J.


    With publication of Standard Methods for Sampling North American Freshwater Fishes in 2009, the American Fisheries Society (AFS) recommended standard procedures for North America. To explore interest in standardizing at intercontinental scales, a symposium attended by international specialists in freshwater fish sampling was convened at the 145th Annual AFS Meeting in Portland, Oregon, in August 2015. Participants represented all continents except Australia and Antarctica and were employed by state and federal agencies, universities, nongovernmental organizations, and consulting businesses. Currently, standardization is practiced mostly in North America and Europe. Participants described how standardization has been important for management of long-term data sets, promoting fundamental scientific understanding, and assessing efficacy of large spatial scale management strategies. Academics indicated that standardization has been useful in fisheries education because time previously used to teach how sampling methods are developed is now more devoted to diagnosis and treatment of problem fish communities. Researchers reported that standardization allowed increased sample size for method validation and calibration. Group consensus was to retain continental standards where they currently exist but to further explore international and intercontinental standardization, specifically identifying where synergies and bridges exist, and identify means to collaborate with scientists where standardization is limited but interest and need occur.

  3. Soybean yield modeling using bootstrap methods for small samples

    Energy Technology Data Exchange (ETDEWEB)

    Dalposso, G.A.; Uribe-Opazo, M.A.; Johann, J.A.


One of the problems that occurs when working with regression models concerns sample size: since the statistical methods used in inferential analyses are asymptotic, if the sample is small the analysis may be compromised because the estimates will be biased. An alternative is to use the bootstrap methodology, which in its non-parametric version does not need to guess or know the probability distribution that generated the original sample. In this work we used a small set of soybean yield data and physical and chemical soil properties to determine a multiple linear regression model. Bootstrap methods were used for variable selection, identification of influential points, and determination of confidence intervals of the model parameters. The results showed that the bootstrap methods enabled us to select the physical and chemical soil properties that were significant in the construction of the soybean yield regression model, to construct the confidence intervals of the parameters, and to identify the points that had great influence on the estimated parameters. (Author)
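Non-parametric case resampling for regression coefficients can be sketched as follows, in the spirit of the abstract. The data, variable names, and replication count are assumptions for illustration, not the study's actual soil variables:

```python
import numpy as np

def bootstrap_coef_ci(x, y, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence intervals for OLS coefficients.

    Resamples (x, y) rows with replacement, refits by least squares each
    time, and takes percentiles of the replicated coefficients.
    """
    rng = np.random.default_rng(seed)
    X = np.column_stack([np.ones(len(y)), x])   # intercept + predictor(s)
    n = len(y)
    boots = np.empty((n_boot, X.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)        # resample rows with replacement
        boots[b] = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
    lo = np.percentile(boots, 100 * alpha / 2, axis=0)
    hi = np.percentile(boots, 100 * (1 - alpha / 2), axis=0)
    return lo, hi

# Hypothetical small sample: yield depends linearly on one soil property
rng = np.random.default_rng(7)
clay = rng.uniform(0.0, 1.0, size=20)
grain_yield = 2.0 + 3.0 * clay + rng.normal(0.0, 0.3, size=20)
lo, hi = bootstrap_coef_ci(clay, grain_yield)
```

A coefficient whose bootstrap interval excludes zero would be retained in the variable-selection step described in the abstract.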

  4. Parametric methods for spatial point processes

    DEFF Research Database (Denmark)

    Møller, Jesper

    (This text is submitted for the volume ‘A Handbook of Spatial Statistics' edited by A.E. Gelfand, P. Diggle, M. Fuentes, and P. Guttorp, to be published by Chapmand and Hall/CRC Press, and planned to appear as Chapter 4.4 with the title ‘Parametric methods'.) 1 Introduction This chapter considers...... inference procedures for parametric spatial point process models. The widespread use of sensible but ad hoc methods based on summary statistics of the kind studied in Chapter 4.3 have through the last two decades been supplied by likelihood based methods for parametric spatial point process models....... The increasing development of such likelihood based methods, whether frequentist or Bayesian, has lead to more objective and efficient statistical procedures. When checking a fitted parametric point process model, summary statistics and residual analysis (Chapter 4.5) play an important role in combination...

  5. Method for Hot Real-Time Sampling of Pyrolysis Vapors

    Energy Technology Data Exchange (ETDEWEB)

    Pomeroy, Marc D [National Renewable Energy Laboratory (NREL), Golden, CO (United States)


Biomass pyrolysis has been an increasing topic of research, in particular as a replacement for crude oil. This process utilizes moderate temperatures to thermally deconstruct the biomass; the vapors are then condensed into a mixture of liquid oxygenates to be used as fuel precursors. Pyrolysis oils contain more than 400 compounds, up to 60 percent of which do not re-volatilize for subsequent chemical analysis. Vapor chemical composition is further complicated by additional condensation reactions that occur during the condensation and collection of the product. Due to the complexity of the pyrolysis oil, and a desire to catalytically upgrade the vapor composition before condensation, online real-time analytical techniques such as Molecular Beam Mass Spectrometry (MBMS) are of great use. However, in order to properly sample hot pyrolysis vapors, many challenges must be overcome. Sampling must occur within a narrow range of temperatures to reduce product composition changes from overheating, or partial condensation or plugging of lines from condensed products. Residence times must be kept at a minimum to reduce further reaction chemistries. Pyrolysis vapors also form aerosols that are carried far downstream and can pass through filters, resulting in build-up in downstream locations. The co-produced bio-char and ash from the pyrolysis process can lead to plugging of the sample lines and must be filtered out at temperature, even with the use of cyclonic separators. A practical approach to design considerations and sampling, as well as lessons learned, are integrated into the hot analytical sampling system of the National Renewable Energy Laboratory's (NREL) Thermochemical Process Development Unit (TCPDU) to provide industrially relevant demonstrations of thermochemical transformations of biomass feedstocks at the pilot scale.

  6. The Sensitivity of Respondent-driven Sampling Method

    CERN Document Server

    Lu, Xin; Britton, Tom; Camitz, Martin; Kim, Beom Jun; Thorson, Anna; Liljeros, Fredrik


    Researchers in many scientific fields make inferences from individuals to larger groups. For many groups however, there is no list of members from which to take a random sample. Respondent-driven sampling (RDS) is a relatively new sampling methodology that circumvents this difficulty by using the social networks of the groups under study. The RDS method has been shown to provide unbiased estimates of population proportions given certain conditions. The method is now widely used in the study of HIV-related high-risk populations globally. In this paper, we test the RDS methodology by simulating RDS studies on the social networks of a large LGBT web community. The robustness of the RDS method is tested by violating, one by one, the conditions under which the method provides unbiased estimates. Results reveal that the risk of bias is large if networks are directed, or respondents choose to invite persons based on characteristics that are correlated with the study outcomes. If these two problems are absent, the RD...
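The abstract does not name a specific estimator; as a common illustration of how RDS corrects for network-driven inclusion probabilities, an inverse-degree (Volz-Heckathorn style) proportion estimate can be sketched as:

```python
def rds_vh_estimate(samples):
    """Inverse-degree RDS estimate of a population proportion.

    `samples` is a list of (has_trait, degree) pairs from the recruitment
    chains. Weighting each respondent by 1/degree corrects for the higher
    inclusion probability of well-connected respondents. Shown only as a
    standard illustration, not the paper's simulation framework.
    """
    weights = [1.0 / degree for _, degree in samples]
    num = sum(w for (trait, _), w in zip(samples, weights) if trait)
    return num / sum(weights)

# Toy chain: two low-degree members with the trait, one high-degree without
print(rds_vh_estimate([(True, 2), (True, 2), (False, 4)]))  # prints 0.8
```

The unweighted sample proportion would be 2/3; down-weighting the high-degree respondent moves the estimate to 0.8, which is exactly the correction that fails when networks are directed or recruitment depends on the outcome.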

  7. A direct sampling method to an inverse medium scattering problem

    KAUST Repository

    Ito, Kazufumi


    In this work we present a novel sampling method for time harmonic inverse medium scattering problems. It provides a simple tool to directly estimate the shape of the unknown scatterers (inhomogeneous media), and it is applicable even when the measured data are only available for one or two incident directions. A mathematical derivation is provided for its validation. Two- and three-dimensional numerical simulations are presented, which show that the method is accurate even with a few sets of scattered field data, computationally efficient, and very robust with respect to noises in the data. © 2012 IOP Publishing Ltd.

  8. Microextraction Methods for Preconcentration of Aluminium in Urine Samples

    Directory of Open Access Journals (Sweden)

    Farzad Farajbakhsh, Mohammad Amjadi, Jamshid Manzoori, Mohammad R. Ardalan, Abolghasem Jouyban


Full Text Available Background: Analysis of aluminium (Al) in urine samples is required in the management of a number of diseases, including patients with renal failure. This work aimed to present dispersive liquid-liquid microextraction (DLLME) and ultrasound-assisted emulsification microextraction (USAEME) methods for the preconcentration of ultra-trace amounts of aluminium in human urine prior to its determination by graphite furnace atomic absorption spectrometry (GFAAS). Methods: The microextraction methods were based on the complex formation of Al3+ with 8-hydroxyquinoline. The effects of various experimental parameters on the efficiencies of the methods and their optimum values were studied. Results: Under the optimal conditions, the limits of detection for USAEME-GFAAS and DLLME-GFAAS were 0.19 and 0.30 ng mL−1, respectively, and the corresponding relative standard deviations (RSD, n=5) for the determination of 40 ng mL−1 Al3+ were 5.9% and 4.9%. Conclusion: Both methods could be successfully applied to the analysis of ultra-trace concentrations of Al in urine samples of dialysis patients.

  9. Testing K. Patrick Method of Psychopathy Diagnosis in Russian Sample

    Directory of Open Access Journals (Sweden)

    Atadzhykova Y.A.,


Full Text Available The article is devoted to the development of a method of diagnosing psychopathy, or antisocial (dissocial) personality disorder. Modern researchers mostly use the methods of experiment, expert assessment, clinical interview, or different combinations of these for personality disorders, including psychopathy. However, nowadays there is a growing need for a psychopathy diagnosis method that is less labour-intensive, less expensive and more objective. One of the recently developed models of psychopathy is the triarchic conceptualization by C. Patrick, which offers a new way to operationalize and diagnose psychopathy. The authors tested this method in the Russian population, including both a general sample and a criminal offender sample consisting of individuals who have been suspected, accused or convicted of violent crimes. The subject of the current research is psychopathic traits measured by the tested method. We carried out statistical and content analyses of the data. Our study allowed us to conclude that the tested Russian version of the Triarchic Psychopathy Measure is effective enough to be used for research purposes. However, further research is required in order to validate this measure for practical use.

  10. Double sampling control chart for a first order autoregressive process

    Directory of Open Access Journals (Sweden)

    Fernando A. E. Claro


Full Text Available In this paper we propose the Double Sampling control chart for monitoring processes in which the observations follow a first order autoregressive model. We consider sampling intervals that are sufficiently long to meet the rational subgroup concept. The Double Sampling chart is substantially more efficient than the Shewhart chart and the Variable Sample Size chart. To study the properties of these charts we derived closed-form expressions for the average run length (ARL) taking into account the within-subgroup correlation. Numerical results show that this correlation has a significant impact on the chart properties.
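The impact of within-subgroup correlation on run length can be illustrated by Monte Carlo. This sketch simulates only a classical Shewhart x-bar chart with naive limits on AR(1) data (the paper's closed-form Double Sampling ARL expressions are not reproduced, and all parameter values are assumptions):

```python
import numpy as np

def simulate_arl(phi, n=5, L=3.0, n_runs=400, seed=42):
    """In-control ARL of a Shewhart x-bar chart when the observations
    follow an AR(1) process with coefficient `phi` (unit innovations).

    Limits use the stationary sd of a single observation, deliberately
    ignoring within-subgroup correlation, to show its effect on the ARL.
    """
    rng = np.random.default_rng(seed)
    sd = 1.0 / np.sqrt(1.0 - phi ** 2)   # stationary sd of the AR(1) process
    limit = L * sd / np.sqrt(n)          # naive 3-sigma limit for the mean
    run_lengths = []
    for _ in range(n_runs):
        x, t = 0.0, 0
        while True:
            total = 0.0
            for _ in range(n):           # one rational subgroup of size n
                x = phi * x + rng.normal()
                total += x
            t += 1
            if abs(total / n) > limit:
                run_lengths.append(t)
                break
    return float(np.mean(run_lengths))
```

With phi = 0 the in-control ARL is near the textbook 370; with phi = 0.5 the correlated observations inflate the variance of the subgroup mean, so the naive limits give far more false alarms and a much shorter ARL.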

  11. Sampling naturally contaminated broiler carcasses for Salmonella by three different methods (United States)

    Postchill neck skin (NS) maceration and whole carcass rinsing (WCR) are frequently used methods to detect salmonellae from commercially processed broilers. These are practical, nondestructive methods, but they are insensitive and may result in frequent false negatives (20 to 40%). NS samples only ...

  12. Electrodeposition as an alternate method for preparation of environmental samples for iodide by AMS

    Energy Technology Data Exchange (ETDEWEB)

    Adamic, M.L., E-mail: [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83402 (United States); Lister, T.E.; Dufek, E.J.; Jenson, D.D.; Olson, J.E. [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83402 (United States); Vockenhuber, C. [Laboratory of Ion Beam Physics, ETH Zurich, Otto-Stern-Weg 5, 8093 Zurich (Switzerland); Watrous, M.G. [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83402 (United States)


    This paper presents an evaluation of an alternate method for preparing environmental samples for {sup 129}I analysis by accelerator mass spectrometry (AMS) at Idaho National Laboratory. The optimal sample preparation method is characterized by ease of preparation, capability of processing very small quantities of iodide, and ease of loading into a cathode. Electrodeposition of iodide on a silver wire was evaluated using these criteria. This study indicates that the electrochemically-formed silver iodide deposits produce ion currents similar to those from precipitated silver iodide for the same sample mass. Precipitated silver iodide samples are usually mixed with niobium or silver powder prior to loading in a cathode. Using electrodeposition, the silver is already mixed with the sample and can simply be picked up with tweezers, placed in the sample die, and pressed into a cathode. The major advantage of this method is that the silver wire/electrodeposited silver iodide is much easier to load into a cathode.

  13. Angoff's delta method revisited: improving DIF detection under small samples. (United States)

    Magis, David; Facon, Bruno


    Most methods for detecting differential item functioning (DIF) are suitable when the sample sizes are sufficiently large to validate the null statistical distributions. There is no guarantee, however, that they will still perform adequately when there are few respondents in the focal group or in both the reference and the focal group. Angoff's delta plot is a potentially useful alternative for small-sample DIF investigation, but it suffers from an improper DIF flagging criterion. The purpose of this paper is to improve this classification rule under mild statistical assumptions. This improvement yields a modified delta plot with an adjusted DIF flagging criterion for small samples. A simulation study was conducted to compare the modified delta plot with both the classical delta plot approach and the Mantel-Haenszel method. It is concluded that the modified delta plot is consistently less conservative and more powerful than the usual delta plot, and is also less conservative and more powerful than the Mantel-Haenszel method as long as at least one group of respondents is small. ©2011 The British Psychological Society.
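The classical delta plot that the paper modifies can be sketched as follows. Item proportions correct are transformed to deltas, the major (principal) axis of the point cloud is fitted, and items far from it are flagged; the fixed threshold of 1.5 used below is exactly the criterion the paper replaces with an adjusted small-sample rule, which is not reproduced here. The item data are hypothetical:

```python
import numpy as np
from statistics import NormalDist

def delta_plot(p_ref, p_foc):
    """Perpendicular distances of items to the major axis of the delta plot.

    Deltas are 4*z(1-p) + 13; the major axis is the principal axis of the
    (reference delta, focal delta) cloud. Large distances suggest DIF.
    """
    nd = NormalDist()
    d_ref = np.array([4.0 * nd.inv_cdf(1.0 - p) + 13.0 for p in p_ref])
    d_foc = np.array([4.0 * nd.inv_cdf(1.0 - p) + 13.0 for p in p_foc])
    sx2, sy2 = d_ref.var(ddof=1), d_foc.var(ddof=1)
    sxy = np.cov(d_ref, d_foc, ddof=1)[0, 1]
    # slope and intercept of the major (principal) axis
    b = (sy2 - sx2 + np.sqrt((sy2 - sx2) ** 2 + 4.0 * sxy ** 2)) / (2.0 * sxy)
    a = d_foc.mean() - b * d_ref.mean()
    return np.abs(b * d_ref - d_foc + a) / np.sqrt(b ** 2 + 1.0)

# Hypothetical data: 15 items, item 8 much harder for the focal group
p_ref = np.linspace(0.2, 0.8, 15)
p_foc = p_ref.copy()
p_foc[7] = 0.1
dist = delta_plot(p_ref, p_foc)
flags = dist > 1.5   # classical fixed criterion
```

Because the delta transformation only needs proportions correct, the procedure remains computable for very small groups, which is what makes it attractive for small-sample DIF screening.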

  14. Study on Processing Method of Image Shadow

    Directory of Open Access Journals (Sweden)

    Wang Bo


Full Text Available In order to effectively remove the disturbance of shadows and enhance the robustness of computer visual image processing, this paper studies the detection and removal of image shadows. It studies shadow removal algorithms based on integration, on the illumination surface, and on texture, introduces their working principles and implementation methods respectively, and shows through tests that shadows can be processed effectively.


    Directory of Open Access Journals (Sweden)



Full Text Available The innovation process has a wide scope of coverage, manifesting itself in all activities carried out in companies and not being limited to products and technologies. It also covers information systems, economic methods, organizational structures, decision processes, etc. It is necessary to make this clarification since there is often a tendency to limit creativity and innovation to production, although lately there is a definite worldwide trend to promote other categories of inventions and innovations.

  16. The statistical process control methods - SPC

    Directory of Open Access Journals (Sweden)

    Floreková Ľubica


Full Text Available Methods of statistical evaluation of quality – SPC (item 20 of the documentation system of quality control of the ISO 9000 series) for various processes, products and services belong amongst the basic qualitative methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also enable us, based on the latter, to propose suitable interventions with the aim of improving these processes, products and services. The theoretical basis and applicability of the principles of: - diagnostics of cause and effect, - Pareto analysis and the Lorenz curve, - number distributions and frequency curves of random variable distributions, - Shewhart control charts, are presented in the contribution.
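One of the SPC tools listed above, the Shewhart x-bar chart, can be sketched in a few lines. This is a minimal illustration with assumed data; the pooled-sigma estimate omits the small c4 bias correction for brevity:

```python
import numpy as np

def xbar_limits(subgroups):
    """3-sigma control limits for a Shewhart x-bar chart.

    `subgroups` is a list of equal-size rational subgroups; sigma is
    estimated from the mean within-subgroup standard deviation.
    Returns (lower limit, center line, upper limit).
    """
    g = np.asarray(subgroups, dtype=float)
    n = g.shape[1]                        # subgroup size
    center = g.mean()                     # grand mean (center line)
    sigma = g.std(axis=1, ddof=1).mean()  # pooled within-subgroup sd
    half = 3.0 * sigma / np.sqrt(n)
    return center - half, center, center + half

# Toy subgroups (assumed measurements)
lcl, cl, ucl = xbar_limits([[1, 2, 3], [2, 3, 4], [3, 4, 5]])
```

Subgroup means falling outside (lcl, ucl) would signal an assignable cause, which is the intervention trigger the abstract refers to.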

  17. [Preparation of sub-standard samples and XRF analytical method of powder non-metallic minerals]. (United States)

    Kong, Qin; Chen, Lei; Wang, Ling


In order to solve the problem that standard samples of non-metallic minerals are not satisfactory in practical X-ray fluorescence spectrometry (XRF) analysis with pressed powder pellets, a method was studied for making sub-standard samples from standard samples of non-metallic minerals and determining how they can be adapted to the analysis of mineral powder samples, taking the K-feldspar ore in Ebian-Wudu, Sichuan as an example. Based on the characteristic analysis of the K-feldspar ore and the standard samples by X-ray diffraction (XRD) and chemical methods, combined with the principle that the sub-standard samples and unknown samples should be the same or similar, the experiment developed a method for the preparation of sub-standard samples: the two kinds of samples mentioned above should contain the same minerals and similar chemical components, undergo suitable mineral processing, and facilitate construction of the working curve. Under the optimum experimental conditions, a method for the determination of SiO2, Al2O3, Fe2O3, TiO2, CaO, MgO, K2O and Na2O in K-feldspar ore by XRF was established. The determination results are in good agreement with classical chemical methods, which indicates that this method is accurate.

  18. Process compilation methods for thin film devices (United States)

    Zaman, Mohammed Hasanuz

    This doctoral thesis presents the development of a systematic method of automatic generation of fabrication processes (or process flows) for thin film devices starting from schematics of the device structures. This new top-down design methodology combines formal mathematical flow construction methods with a set of library-specific available resources to generate flows compatible with a particular laboratory. Because this methodology combines laboratory resource libraries with a logical description of thin film device structure and generates a set of sequential fabrication processing instructions, this procedure is referred to as process compilation, in analogy to the procedure used for compilation of computer programs. Basically, the method developed uses a partially ordered set (poset) representation of the final device structure which describes the order between its various components expressed in the form of a directed graph. Each of these components are essentially fabricated "one at a time" in a sequential fashion. If the directed graph is acyclic, the sequence in which these components are fabricated is determined from the poset linear extensions, and the component sequence is finally expanded into the corresponding process flow. This graph-theoretic process flow construction method is powerful enough to formally prove the existence and multiplicity of flows thus creating a design space {cal D} suitable for optimization. The cardinality Vert{cal D}Vert for a device with N components can be large with a worst case Vert{cal D}Vert≤(N-1)! yielding in general a combinatorial explosion of solutions. The number of solutions is hence controlled through a-priori estimates of Vert{cal D}Vert and condensation (i.e., reduction) of the device component graph. The mathematical method has been implemented in a set of algorithms that are parts of the software tool MISTIC (Michigan Synthesis Tools for Integrated Circuits). 
MISTIC is a planar process compiler that generates
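
    The poset-based flow construction described above can be illustrated by brute-force enumeration of the linear extensions (valid fabrication orders) of a component DAG. This is a simplified sketch, not the MISTIC implementation; the four-component device and its precedence edges are hypothetical:

```python
from itertools import permutations

def linear_extensions(nodes, edges):
    """Enumerate all linear extensions (valid fabrication orders) of a
    component DAG given as precedence pairs (a, b): a must precede b."""
    exts = []
    for order in permutations(nodes):
        pos = {n: i for i, n in enumerate(order)}
        if all(pos[a] < pos[b] for a, b in edges):
            exts.append(order)
    return exts

# hypothetical 4-component device: substrate S first, then metal M and
# insulator I in either order, top contact C last
nodes = ["S", "M", "I", "C"]
edges = [("S", "M"), ("S", "I"), ("M", "C"), ("I", "C")]
flows = linear_extensions(nodes, edges)
print(len(flows))  # 2 valid process flows: S-M-I-C and S-I-M-C
```

    Brute force is only feasible for small N; counting linear extensions of a general poset is #P-hard, which is why the thesis controls the design space through a priori estimates and graph condensation.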

  19. Method for Sampling Alpha-Helical Protein Backbones

    Energy Technology Data Exchange (ETDEWEB)

    Fain, Boris; Levitt, Michael


    We present a novel technique of sampling the configurations of helical proteins. Assuming knowledge of native secondary structure, we employ assembly rules gathered from a database of existing structures to enumerate the geometrically possible 3-D arrangements of the constituent helices. We produce a library of possible folds for 25 helical protein cores. In each case the method finds significant numbers of conformations close to the native structure. In addition we assign coordinates to all atoms for 4 of the 25 proteins. In the context of database-driven exhaustive enumeration our method performs extremely well, yielding significant percentages of structures (0.02%-82%) within 6 Å of the native structure. The method's speed and efficiency make it a valuable contribution towards the goal of predicting protein structure.

  20. Comparisons of polybrominated diphenyl ether and hexabromocyclododecane concentrations in dust collected with two sampling methods and matched breast milk samples. (United States)

    Björklund, J A; Sellström, U; de Wit, C A; Aune, M; Lignell, S; Darnerud, P O


    Household dust from 19 Swedish homes was collected using two different sampling methods: from the occupant's own home vacuum cleaner after insertion of a new bag and using a researcher-collected method where settled house dust was collected from surfaces above floor level. The samples were analyzed for 16 polybrominated diphenyl ether (PBDE) congeners and total hexabromocyclododecane (HBCD). Significant correlations (r = 0.60-0.65, Spearman r = 0.47-0.54, P < 0.05) were found between samples collected with the two sampling methods for ∑OctaBDE and ∑DecaBDE but not for ∑PentaBDE or HBCD. Statistically significantly higher concentrations of all PBDE congeners were found in the researcher-collected dust than in the home vacuum cleaner bag dust (VCBD). For HBCD, however, the concentrations were significantly higher in the home VCBD samples. Analysis of the bags themselves indicated no or very low levels of PBDEs and HBCD. This indicates that there may be specific HBCD sources to the floor and/or that it may be present in the vacuum cleaners themselves. The BDE-47 concentrations in matched pairs of VCBD and breast milk samples were significantly correlated (r = 0.514, P = 0.029), indicating that one possible exposure route for this congener may be via dust ingestion. The statistically significant correlations found for several individual polybrominated diphenyl ether (PBDE) congeners, ∑OctaBDE and ∑DecaBDE between the two dust sampling methods in this study indicate that the same indoor sources contaminate both types of dust or that common processes govern the distribution of these compounds in the indoor environment. Therefore, either method is adequate for screening ∑OctaBDE and ∑DecaBDE in dust. The high variability seen between dust samples confirms results seen in other studies. For hexabromocyclododecane (HBCD), divergent results in the two dust types indicate differences in contamination sources to the floor than to above-floor surfaces. Thus, it is still unclear which dust
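
    The Spearman rank correlations used above to compare the two sampling methods can be reproduced in form with a short calculation; the paired concentrations below are invented for illustration and are not the study's measurements:

```python
def spearman_rho(x, y):
    """Spearman rank correlation for samples without ties:
    rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# hypothetical paired dust concentrations (ng/g) from the two methods
vacuum_bag = [120, 85, 310, 45, 200, 150, 95, 260]
surface    = [140, 80, 350, 60, 180, 170, 200, 300]
print(round(spearman_rho(vacuum_bag, surface), 3))  # → 0.857
```

    For data with tied values, the rank-based shortcut formula above is biased and Pearson correlation of mid-ranks should be used instead.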

  1. A survey of raw processing methods for kolanuts | Asamoah ...

    African Journals Online (AJOL)

    A survey was carried out in the Eastern and Ashanti Regions of Ghana to identify indigenous methods for the raw processing and handling of kolanuts. Using purposive and accidental sampling techniques and interviews, thirty-two individuals and eleven focus group discussions were undertaken at fourteen and ten ...

  2. Effect of Processing Methods on Nutrient Contents of Six Sweet ...

    African Journals Online (AJOL)

    A study was carried out to evaluate the effect of processing methods on nutrient contents of six fresh sweet potato varieties namely Carrot Dar, Japon, Zapallo, Mafuta, Polita and Sekondari commonly grown in three districts (Meatu, Sengerema and Missungwi) located along the lake zone of Tanzania. Fresh samples of ...

  3. Literature Review on Processing and Analytical Methods for ... (United States)

    The purpose of this report was to survey the open literature to determine the current state of the science regarding the processing and analytical methods currently available for recovery of F. tularensis from water and soil matrices, and to determine what gaps remain in the collective knowledge concerning F. tularensis identification from environmental samples.

  4. Comparison between powder and slices diffraction methods in teeth samples

    Energy Technology Data Exchange (ETDEWEB)

    Colaco, Marcos V.; Barroso, Regina C. [Universidade do Estado do Rio de Janeiro (IF/UERJ), RJ (Brazil). Inst. de Fisica. Dept. de Fisica Aplicada; Porto, Isabel M. [Universidade Estadual de Campinas (FOP/UNICAMP), Piracicaba, SP (Brazil). Fac. de Odontologia. Dept. de Morfologia; Gerlach, Raquel F. [Universidade de Sao Paulo (FORP/USP), Ribeirao Preto, SP (Brazil). Fac. de Odontologia. Dept. de Morfologia, Estomatologia e Fisiologia; Costa, Fanny N. [Coordenacao dos Programas de Pos-Graduacao de Engenharia (LIN/COPPE/UFRJ), RJ (Brazil). Lab. de Instrumentacao Nuclear


    Proposing different methods to obtain crystallographic information about biological materials is important, since the powder method is nondestructive and slices are an approximation of what would be an in vivo analysis. Effects of sample preparation cause differences in scattering profiles compared with the powder method. The main inorganic component of bones and teeth is a calcium phosphate mineral whose structure closely resembles hydroxyapatite (HAp). The hexagonal symmetry, however, seems to work well with the powder diffraction data, and the crystal structure of HAp is usually described in space group P63/m. Ten third molar teeth were analyzed: five teeth were separated into enamel, dentin and circumpulpal dentin powder, and five into slices. All the scattering profile measurements were carried out at the X-ray diffraction beamline (XRD1) at the National Synchrotron Light Laboratory - LNLS, Campinas, Brazil. The LNLS synchrotron light source is composed of a 1.37 GeV electron storage ring, delivering approximately 4x10^10 photons/s at 8 keV. A double-crystal Si(111) pre-monochromator, upstream of the beamline, was used to select a small energy bandwidth at 11 keV. Scattering signatures were obtained at intervals of 0.04 deg for angles from 24 deg to 52 deg. The human enamel experimental crystallite sizes obtained in this work were 30(3) nm (112 reflection) and 30(3) nm (300 reflection). These values were obtained from measurements of powdered enamel. The slices yielded 58(8) nm (112 reflection) and 37(7) nm (300 reflection). When comparing the slice enamel diffraction patterns with those generated by the powder specimens, a few differences emerge. This work shows differences between the powder and slice methods, separating sample characteristics from the method's influence. (author)

  5. Consent process for US-based family reference DNA samples. (United States)

    Katsanis, Sara H; Snyder, Lindsey; Arnholt, Kelly; Mundorff, Amy Z


    DNA collection from family members of the missing is a tenet for missing persons' and mass fatality investigations. Procedures for consenting family members are disparate, depending on the context supporting the reason for sample collection. While guidelines and best practices have been developed for handling mass fatalities and for identification of the missing, these guidelines do not address standard consent practices for living family members of potential victims. We examined the relevant U.S. laws, international guidelines and best practices, sampled consent forms currently used for DNA collection of family members, and drafted model language for a consent form to communicate the required and recommended information. We modeled the consent form on biobank consenting practices and tested the consent language among students and the general population for constructive feedback and readability. We also asked respondents to consider the options for DNA collection and either hypothetically agree or disagree. The model language presented here highlights information important to relay in consent processes and can serve as a foundation for future consent practices in mass fatalities and missing persons' investigations. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Simplified Processing Method for Meter Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Colotelo, Alison H. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Downs, Janelle L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ham, Kenneth D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Montgomery, Sadie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vernon, Christopher R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Parker, Steven A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    A simple, quick metered-data processing method that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but may also be useful for other large data sets. Intended for large data sets where the analyst has little information about the buildings.

  7. Evaluation method for process intensification alternatives

    NARCIS (Netherlands)

    Rivas, David Fernández; Castro-Hernández, Elena; Villanueva Perales, Angel Luis; van der Meer, Walter

    A method for the comparison of scenarios in the context of Process Intensification is presented, and is applied to cases reported in the literature, as well as several examples taken from selected industrial practices. A step by step calculation of different factors, all relevant in the chemical

  9. Natural Language Processing concepts and methods revisited




    The paper starts with the history of Natural Language Processing (NLP) and revisits the concepts and methods involved in NLP. It provides an overview of different classifiers and language modelling techniques. The paper also lists the fields where NLP is used and the software available to carry out NLP.

  10. Hand held sample tube manipulator, system and method (United States)

    Kenny, Donald V [Liberty Township, OH; Smith, Deborah L [Liberty Township, OH; Severance, Richard A [late of Columbus, OH


    A manipulator apparatus, system and method for measuring analytes present in sample tubes. The manipulator apparatus includes a housing having a central bore with an inlet end and outlet end; a plunger mechanism with at least a portion thereof slideably disposed for reciprocal movement within the central bore, the plunger mechanism having a tubular gas channel with an inlet end and an outlet end, the gas channel inlet end disposed in the same direction as said inlet end of the central bore, wherein the inlet end of said plunger mechanism is adapted for movement so as to expel a sample tube inserted in the bore at the outlet end of the housing, the inlet end of the plunger mechanism is adapted for connection to a gas supply; a first seal is disposed in the housing for sealing between the central bore and the plunger mechanism; a second seal is disposed at the outlet end of the housing for sealing between the central bore and a sample tube; a holder mounted on the housing for holding the sample tube; and a biasing mechanism for returning the plunger mechanism to a starting position.

  11. Ionomer-Membrane Water Processing Methods (United States)

    MacCallum, Taber K. (Inventor); Kelsey, Laura (Inventor)


    This disclosure provides water processing apparatuses, systems, and methods for recovering water from wastewater such as urine. The water processing apparatuses, systems, and methods can utilize membrane technology for extracting purified water in a single step. A containment unit can include an ionomer membrane, such as Nafion™, over a hydrophobic microporous membrane, such as polytetrafluoroethylene (PTFE). The containment unit can be filled with wastewater, and the hydrophobic microporous membrane can be impermeable to liquids and solids of the wastewater but permeable to gases and vapors of the wastewater, and the ionomer membrane can be permeable to water vapor but impermeable to one or more contaminants of the gases and vapors. The containment unit can be exposed to a dry purge gas to maintain a water vapor partial pressure differential to drive permeation of the water vapor, and the water vapor can be collected and processed into potable water.

  12. Development of a sodium dodecyl sulfate-polyacrylamide gel electrophoresis reference method for the analysis and identification of fish species in raw and heat-processed samples : A collaborative study

    DEFF Research Database (Denmark)

    Pineiro, C.; Barros-Velazquez, J.; Perez-Martin, R.I.


    A collaborative study was carried out in seven European labs with the aim of achieving a sodium dodecyl sulfate- polyacrylamide gel electrophoresis (SDS-PAGE) standard operation procedure to identify fish species in raw and cooked samples. Urea and SDS-containing solutions were evaluated...... seemed not to be influenced so much by the state of the sample (raw, cooked at 60 degrees C, cooked at 85 degrees C). Desalting, ultrafiltration or treatment with RNase/DNase did not improve the discriminatory power of the protein patterns. Commercial homogeneous 15% ExcelGels, especially when they were...... silver stained, yielded good results and afforded higher reproducibility, thus allowing a better matching of results among the laboratories participating in this collaborative study. Under the optimized technical conditions described above, all the fish species tested, either raw and cooked, yielded...

  13. Sample Size for Assessing Agreement between Two Methods of Measurement by Bland-Altman Method. (United States)

    Lu, Meng-Jie; Zhong, Wei-Hua; Liu, Yu-Xiu; Miao, Hua-Zhang; Li, Yong-Chang; Ji, Mu-Huo


    The Bland-Altman method has been widely used for assessing agreement between two methods of measurement. However, the problem of sample size estimation for it remains unsolved. We propose a new method of sample size estimation for Bland-Altman agreement assessment. According to the Bland-Altman method, the conclusion on agreement is made based on the width of the confidence interval for the LOAs (limits of agreement) in comparison to a predefined clinical agreement limit. Under the theory of statistical inference, the formulae for sample size estimation are derived, which depend on the pre-determined level of α, β, the mean and the standard deviation of differences between two measurements, and the predefined limits. With this new method, sample sizes are calculated under different parameter settings which occur frequently in method comparison studies, and Monte Carlo simulation is used to obtain the corresponding powers. The results of the Monte Carlo simulation showed that the achieved powers could coincide with the pre-determined level of powers, thus validating the correctness of the method. The method of sample size estimation can be applied in the Bland-Altman method to assess agreement between two methods of measurement.
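
    The decision rule described above can be sketched with the conventional mean ± 1.96·SD limits of agreement and the classic Bland-Altman approximation SE(LOA) ≈ s·√(3/n); the paired readings and the ±1.0-unit clinical limit below are hypothetical:

```python
import math

def bland_altman_loa(a, b, z=1.96):
    """Mean difference, limits of agreement, and approximate SE of each
    limit (SE ~ s*sqrt(3/n), per Bland & Altman) for paired readings."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    lower, upper = mean - z * sd, mean + z * sd
    se_loa = sd * math.sqrt(3.0 / n)
    return mean, (lower, upper), se_loa

# hypothetical paired readings from two instruments
a = [10.2, 11.5, 9.8, 12.1, 10.9, 11.0, 10.4, 11.8]
b = [10.0, 11.9, 9.5, 12.4, 10.6, 11.2, 10.1, 12.0]
mean, (lo, hi), se = bland_altman_loa(a, b)

# agreement is concluded if the CI around each limit stays inside a
# predefined clinical agreement limit, here ±1.0 units (hypothetical)
clinical_limit = 1.0
ok = (lo - 1.96 * se) > -clinical_limit and (hi + 1.96 * se) < clinical_limit
```

    The paper's contribution is to invert this rule: choosing n large enough that the CI width around the LOAs falls inside the clinical limit at the required power.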

  14. A direct sampling method for inverse electromagnetic medium scattering

    KAUST Repository

    Ito, Kazufumi


    In this paper, we study the inverse electromagnetic medium scattering problem of estimating the support and shape of medium scatterers from scattered electric/magnetic near-field data. We shall develop a novel direct sampling method based on an analysis of electromagnetic scattering and the behavior of the fundamental solution. It is applicable to a few incident fields and needs only to compute inner products of the measured scattered field with the fundamental solutions located at sampling points. Hence, it is strictly direct, computationally very efficient and highly robust to the presence of data noise. Two- and three-dimensional numerical experiments indicate that it can provide reliable support estimates for multiple scatterers in the case of both exact and highly noisy data. © 2013 IOP Publishing Ltd.
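
    The indicator described above (inner products of the measured scattered field with fundamental solutions centered at sampling points) can be sketched for the 2-D Helmholtz case. The point-scatterer data model below is a deliberate simplification for illustration, not the paper's full electromagnetic model:

```python
import numpy as np
from scipy.special import hankel1

k = 10.0  # wavenumber

def phi(x, y):
    """2-D Helmholtz fundamental solution (i/4) * H0^(1)(k*|x - y|)."""
    r = np.linalg.norm(x - y, axis=-1)
    return 0.25j * hankel1(0, k * r)

# receivers on a circle of radius 5 around the search domain
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
rx = 5.0 * np.stack([np.cos(theta), np.sin(theta)], axis=1)

# synthetic scattered field radiated by a small scatterer at z
z = np.array([0.4, -0.3])
u_meas = phi(rx, z)

# indicator: normalized inner product with the fundamental solution at p;
# by Cauchy-Schwarz it peaks where p matches the scatterer location
grid = np.linspace(-1, 1, 41)
best, best_val = None, -1.0
for px in grid:
    for py in grid:
        p = np.array([px, py])
        f = phi(rx, p)
        val = abs(np.vdot(f, u_meas)) / (np.linalg.norm(f) * np.linalg.norm(u_meas))
        if val > best_val:
            best, best_val = p, val
print(best)  # peaks at the grid point nearest the true scatterer
```

    Because the indicator needs only inner products against precomputable fundamental solutions, it is direct (no inversion step) and degrades gracefully under data noise, which is the robustness the abstract emphasizes.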

  15. Determination of pesticides and their metabolites in processed cereal samples. (United States)

    González-Curbelo, M Á; Hernández-Borges, J; Borges-Miquel, T M; Rodríguez-Delgado, M Á


    Fifteen pesticides including some of their metabolites (disulfoton sulfoxide, ethoprophos, cadusafos, dimethoate, terbufos, disulfoton, chlorpyrifos-methyl, malaoxon, fenitrothion, pirimiphos-methyl, malathion, chlorpyrifos, terbufos sulfone, disulfoton sulfone and fensulfothion) were analysed in milled toasted wheat and maize as well as in wheat flour and baby cereals. The QuEChERS (quick, easy, cheap, effective, rugged and safe) methodology was used and its dispersive solid-phase extraction procedure was optimised by means of an experimental design with the aim of reducing the amount of co-extracted lipids and obtaining a clean extract. Gas chromatography and nitrogen-phosphorus detection were used as the separation and detection techniques, respectively. The method was validated in terms of selectivity, recoveries, calibration, precision and accuracy as well as matrix effects. Limits of detection were between 0.07 and 34.8 µg kg⁻¹, with recoveries in the range of 71-110% (relative standard deviations were below 9%). A total of 40 samples of different origin were analysed. Residues of pirimiphos-methyl were found in six of the samples at concentrations in the range 0.08-0.47 mg kg⁻¹, which were below the MRLs established for this pesticide in cereal grains. Tandem mass spectrometry confirmation was also carried out in order to identify unequivocally the presence of this pesticide.

  16. Empirical comparison of neutron activation sample analysis methods (United States)

    Gillenwalters, Elizabeth

    The U.S. Geological Survey (USGS) operates a research reactor used mainly for neutron activation of samples, which are then shipped to industrial customers. Accurate nuclide identification and activity determination are crucial to remain in compliance with Code of Federal Regulations guidelines. This facility utilized a Canberra high purity germanium detector (HPGe) coupled with Canberra Genie™ 2000 (G2K) software for gamma spectroscopy. This study analyzed the current method of nuclide identification and activity determination of neutron activated materials utilized by the USGS reactor staff and made recommendations to improve the method. Additionally, analysis of attenuators, the effect of detector dead time on nuclide identification, and the validity of activity determination assumptions were investigated. The current method of activity determination utilized the G2K software to obtain the ratio of activity per nuclide identified. This determination was performed without the use of geometrically appropriate efficiency calibration curves. The ratio of activity per nuclide was used in conjunction with an overall exposure rate in mR/h obtained via a Fluke Biomedical hand-held ion chamber. The overall exposure rate was divided into individual nuclide amounts based on the G2K nuclide ratios. A gamma energy of 1 MeV and a gamma yield of 100% were assumed for all samples. Utilizing the gamma assumption and nuclide ratios, a calculation was performed to determine total sample activity in µCi (microcuries). An alternative method was proposed, which would eliminate the use of exposure rate and rely solely on the G2K software capabilities. The G2K software was energy and efficiency calibrated with efficiency curves developed for multiple geometries. The USGS reactor staff were trained to load appropriate calibration data into the G2K software prior to sample analysis. 
Comparison of the current method and the proposed method demonstrated that the activity value calculated with the 1 MeV

  17. Generalized Jones matrix method for homogeneous biaxial samples. (United States)

    Ortega-Quijano, Noé; Fade, Julien; Alouini, Mehdi


    The generalized Jones matrix (GJM) is a recently introduced tool to describe linear transformations of three-dimensional light fields. Based on this framework, a specific method for obtaining the GJM of uniaxial anisotropic media was recently presented. However, the GJM of biaxial media had not been tackled so far, as the previous method made use of a simplified rotation matrix that lacks a degree of freedom in the three-dimensional rotation, thus being not suitable for calculating the GJM of biaxial media. In this work we propose a general method to derive the GJM of arbitrarily-oriented homogeneous biaxial media. It is based on the differential generalized Jones matrix (dGJM), which is the three-dimensional counterpart of the conventional differential Jones matrix. We show that the dGJM provides a simple and elegant way to describe uniaxial and biaxial media, with the capacity to model multiple simultaneous optical effects. The practical usefulness of this method is illustrated by the GJM modeling of the polarimetric properties of a negative uniaxial KDP crystal and a biaxial KTP crystal for any three-dimensional sample orientation. The results show that this method constitutes an advantageous and straightforward way to model biaxial media, which show a growing relevance for many interesting applications.

  18. Collecting, archiving and processing DNA from wildlife samples using FTA® databasing paper

    Directory of Open Access Journals (Sweden)

    Burgoyne LA


    Background: Methods involving the analysis of nucleic acids have become widespread in the fields of traditional biology and ecology; however, the storage and transport of samples collected in the field to the laboratory in such a manner as to allow purification of intact nucleic acids can prove problematical. Results: FTA® databasing paper is widely used in human forensic analysis for the storage of biological samples and for purification of nucleic acids. The possible uses of FTA® databasing paper in the purification of DNA from samples of wildlife origin were examined, with particular reference to problems expected due to the nature of samples of wildlife origin. The processing of blood and tissue samples, the possibility of excess DNA in blood samples due to nucleated erythrocytes, and the analysis of degraded samples were all examined, as was the question of long-term storage of blood samples on FTA® paper. Examples of the end use of the purified DNA are given for all protocols, and the rationale behind the processing procedures is also explained to allow the end user to adjust the protocols as required. Conclusions: FTA® paper is eminently suitable for collection of, and purification of nucleic acids from, biological samples from a wide range of wildlife species. This technology makes the collection and storage of such samples much simpler.


    Directory of Open Access Journals (Sweden)

    Camelia COŞEREANU


    This paper presents an ornament used to decorate art furniture, whose classic realization requires a great deal of manual labor, patience and artistic talent. General elements of the way of obtaining the inlaid veneer named intarsia are presented, continuing with the styles of art furniture where it was used for decoration, and the classical method and artistic techniques used by the manufacturers. In contrast to the classic method, the modern one is presented, namely the process of veneer cutting by laser beam, together with the benefits of applying this method to obtain high-quality intarsia work. The modern method of laser cutting of veneers to obtain intarsia decoration should be an impulse for manufacturers to reinvigorate the export of art furniture, so much appreciated abroad in the past, but whose price reflected extremely costly workmanship.

  20. Microfabricated Devices for Sample Extraction, Concentrations, and Related Sample Processing Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Gang; Lin, Yuehe


    This is an invited book chapter. As with other analytical techniques, sample pretreatment, sample extraction, sample introduction, and related techniques are of extreme importance for micro-electro-mechanical systems (MEMS). Bio-MEMS devices and systems start with a sampling step. The biological sample then usually undergoes some kind of sample preparation before the actual analysis. These steps may involve extracting the target sample from its matrix, removing interferences from the sample, derivatizing the sample to detectable species, or performing a sample preconcentration step. The integration of components for sample pretreatment into microfluidic devices represents one of the remaining bottlenecks toward achieving true miniaturized total analysis systems (µTAS). This chapter provides a thorough state of the art of the developments in this field to date.

  1. Sample Size for Tablet Compression and Capsule Filling Events During Process Validation. (United States)

    Charoo, Naseem Ahmad; Durivage, Mark; Rahman, Ziyaur; Ayad, Mohamad Haitham


    During solid dosage form manufacturing, the uniformity of dosage units (UDU) is ensured by testing samples at 2 stages, that is, blend stage and tablet compression or capsule/powder filling stage. The aim of this work is to propose a sample size selection approach based on quality risk management principles for process performance qualification (PPQ) and continued process verification (CPV) stages by linking UDU to potential formulation and process risk factors. Bayes success run theorem appeared to be the most appropriate approach among various methods considered in this work for computing sample size for PPQ. The sample sizes for high-risk (reliability level of 99%), medium-risk (reliability level of 95%), and low-risk factors (reliability level of 90%) were estimated to be 299, 59, and 29, respectively. Risk-based assignment of reliability levels was supported by the fact that at low defect rate, the confidence to detect out-of-specification units would decrease which must be supplemented with an increase in sample size to enhance the confidence in estimation. Based on level of knowledge acquired during PPQ and the level of knowledge further required to comprehend process, sample size for CPV was calculated using Bayesian statistics to accomplish reduced sampling design for CPV. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
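
    The reported sample sizes (299, 59, 29) are consistent with the standard success-run relation n = ln(1 − C) / ln(R) at 95% confidence, which can be checked directly. This is a sketch of that relation, not necessarily the authors' exact Bayesian derivation:

```python
import math

def success_run_sample_size(reliability, confidence=0.95):
    """Success-run sample size: smallest n such that n consecutive
    passing units demonstrate `reliability` at the given `confidence`,
    i.e. the smallest n with reliability**n <= 1 - confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(reliability))

# risk-based reliability levels from the abstract
for risk, rel in [("high", 0.99), ("medium", 0.95), ("low", 0.90)]:
    print(risk, success_run_sample_size(rel))
# high 299, medium 59, low 29
```

    The inverse relationship is visible here: demonstrating a higher reliability (lower tolerable defect rate) requires far more defect-free samples, which is the rationale the abstract gives for assigning larger sample sizes to high-risk factors.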

  2. Verification of spectrophotometric method for nitrate analysis in water samples (United States)

    Kurniawati, Puji; Gusrianti, Reny; Dwisiwi, Bledug Bernanti; Purbaningtias, Tri Esti; Wiyantoko, Bayu


    The aim of this research was to verify the spectrophotometric method to analyze nitrate in water samples using the APHA 2012 Section 4500 NO3-B method. The verification parameters used were: linearity, method detection limit, level of quantitation, level of linearity, accuracy and precision. Linearity was obtained using 0 to 50 mg/L nitrate standard solutions, and the correlation coefficient of the standard calibration linear regression equation was 0.9981. The method detection limit (MDL) was determined to be 0.1294 mg/L and the limit of quantitation (LOQ) was 0.4117 mg/L. The level of linearity (LOL) was 50 mg/L, and nitrate concentrations from 10 to 50 mg/L were linear at a confidence level of 99%. The accuracy, determined through the recovery value, was 109.1907%. Precision was evaluated as the percent relative standard deviation (%RSD) of repeatability and was 1.0886%. The tested performance criteria showed that the methodology was verified under the laboratory conditions.
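
    The verification parameters above (linearity, recovery, %RSD) follow standard formulas that can be computed directly. The calibration and replicate data below are hypothetical illustrations, not the study's measurements:

```python
import math
import statistics as st

def pearson_r(x, y):
    """Correlation coefficient of the calibration line (linearity check)."""
    mx, my = st.mean(x), st.mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def recovery_pct(measured, spiked):
    """Accuracy expressed as percent recovery of a spiked concentration."""
    return 100.0 * measured / spiked

def rsd_pct(values):
    """Precision as percent relative standard deviation (%RSD)."""
    return 100.0 * st.stdev(values) / st.mean(values)

# hypothetical calibration: nitrate standards (mg/L) vs. absorbance
conc = [0, 10, 20, 30, 40, 50]
absorbance = [0.002, 0.110, 0.221, 0.329, 0.442, 0.548]
r = pearson_r(conc, absorbance)        # linearity: expect r close to 1

rec = recovery_pct(10.91, 10.0)        # hypothetical 10.0 mg/L spike
reps = [10.8, 10.9, 11.0, 10.9, 10.8]  # hypothetical repeatability replicates
prec = rsd_pct(reps)
```

    MDL and LOQ are conventionally derived from the standard deviation of low-level replicates (e.g. MDL = t·s, LOQ ≈ 10·s/slope), so the same replicate statistics feed those parameters as well.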

  3. Methods to maximise recovery of environmental DNA from water samples.

    Directory of Open Access Journals (Sweden)

    Rheyda Hinlo

    The environmental DNA (eDNA) method is a detection technique that is rapidly gaining credibility as a sensitive tool useful in the surveillance and monitoring of invasive and threatened species. Because eDNA analysis often deals with small quantities of short and degraded DNA fragments, methods that maximize eDNA recovery are required to increase detectability. In this study, we performed experiments at different stages of the eDNA analysis to show which combinations of methods give the best recovery rate for eDNA. Using Oriental weatherloach (Misgurnus anguillicaudatus) as a study species, we show that various combinations of DNA capture, preservation and extraction methods can significantly affect DNA yield. Filtration using cellulose nitrate filter paper preserved in ethanol or stored in a -20°C freezer and extracted with the Qiagen DNeasy kit outperformed other combinations in terms of cost and efficiency of DNA recovery. Our results support the recommendation to filter water samples within 24 hours but if this is not possible, our results suggest that refrigeration may be a better option than freezing for short-term storage (i.e., 3-5 days). This information is useful in designing eDNA detection of low-density invasive or threatened species, where small variations in DNA recovery can signify the difference between detection success or failure.

  4. Methods to maximise recovery of environmental DNA from water samples. (United States)

    Hinlo, Rheyda; Gleeson, Dianne; Lintermans, Mark; Furlan, Elise


    The environmental DNA (eDNA) method is a detection technique that is rapidly gaining credibility as a sensitive tool useful in the surveillance and monitoring of invasive and threatened species. Because eDNA analysis often deals with small quantities of short and degraded DNA fragments, methods that maximize eDNA recovery are required to increase detectability. In this study, we performed experiments at different stages of the eDNA analysis to show which combinations of methods give the best recovery rate for eDNA. Using Oriental weatherloach (Misgurnus anguillicaudatus) as a study species, we show that various combinations of DNA capture, preservation and extraction methods can significantly affect DNA yield. Filtration using cellulose nitrate filter paper preserved in ethanol or stored in a -20°C freezer and extracted with the Qiagen DNeasy kit outperformed other combinations in terms of cost and efficiency of DNA recovery. Our results support the recommendation to filter water samples within 24 hours but if this is not possible, our results suggest that refrigeration may be a better option than freezing for short-term storage (i.e., 3-5 days). This information is useful in designing eDNA detection of low-density invasive or threatened species, where small variations in DNA recovery can signify the difference between detection success or failure.

  5. Field evaluation of broiler gait score using different sampling methods

    Directory of Open Access Journals (Sweden)

    AFS Cordeiro


    Full Text Available Brazil is today the world's largest broiler meat exporter; however, in order to keep this position, it must comply with welfare regulations while maintaining low production costs. Locomotion problems restrain bird movements, limiting their access to drinking and feeding equipment, and therefore their survival and productivity. The objective of this study was to evaluate locomotion deficiency in broiler chickens reared under stressful temperature conditions using three different sampling methods and birds of three different ages. The experiment consisted in determining the gait score of 28, 35, 42 and 49-day-old broilers using three different known gait scoring methods: M1, birds were randomly selected, enclosed in a circle, and then stimulated to walk out of the circle; M2, ten birds were randomly selected and gait scored; and M3, birds were randomly selected, enclosed in a circle, and then observed while walking away from the circle without any stimulus to walk. Environmental temperature, relative humidity, and light intensity inside the poultry houses were recorded. No evidence of interaction between scoring method and age was found; however, both method and age influenced gait score. Gait score was found to be lower at 28 days of age. The evaluation using ten randomly selected birds within the house was the method that presented the least reliable results. Gait scores when birds were stimulated to walk were lower than when they were not stimulated, independently of age. The gait scores obtained with the three tested methods and ages were higher than those considered acceptable. The highest frequency of normal gait score (0) represented 50% of the flock. These results may be related to heat stress during rearing. Average gait score increased with average ambient temperature, relative humidity, and light intensity. The evaluation of gait score to detect locomotion problems of broilers under rearing conditions seems subjective and

  6. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples (United States)

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.


    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969
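
    The first study's simulation idea — cluster binary code profiles from a small sample and check assignment accuracy — can be sketched in a few lines. The following toy example is illustrative only: the profiles, sample size split, and simple two-cluster K-means are assumptions for demonstration, not the study's actual simulation design (which also tested hierarchical clustering and latent class analysis).

```python
import random

random.seed(7)

# Two hypothetical "true" profiles over 12 binary codes (e.g., presence or
# absence of themes in coded interviews); values are illustrative.
profile_a = [0.9, 0.8, 0.9, 0.1, 0.2, 0.1, 0.8, 0.9, 0.1, 0.2, 0.9, 0.1]
profile_b = [0.1, 0.2, 0.1, 0.9, 0.8, 0.9, 0.2, 0.1, 0.9, 0.8, 0.1, 0.9]

def draw(profile):
    return [1 if random.random() < p else 0 for p in profile]

# Sample of 50 participants (the smallest size tested), 25 per cluster.
data = [draw(profile_a) for _ in range(25)] + [draw(profile_b) for _ in range(25)]
truth = [0] * 25 + [1] * 25

def kmeans_binary(points, k=2, iters=20):
    """Plain Lloyd's K-means; squared Euclidean distance equals Hamming
    distance on binary vectors, so no special metric is needed."""
    centers = [points[0][:], points[-1][:]]  # deterministic start
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [min(range(k),
                      key=lambda c: sum((x - y) ** 2
                                        for x, y in zip(p, centers[c])))
                  for p in points]
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels

labels = kmeans_binary(data)
# Accuracy up to label permutation (two clusters, so check both labelings).
agree = sum(l == t for l, t in zip(labels, truth))
accuracy = max(agree, len(truth) - agree) / len(truth)
print(f"cluster-assignment accuracy on n=50: {accuracy:.2f}")
```

With well-separated profiles, even n = 50 recovers cluster membership reliably, which mirrors the study's finding that accuracy did not degrade at small sample sizes.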

  7. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples. (United States)

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G


    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.

  8. Efficient free energy calculations by combining two complementary tempering sampling methods (United States)

    Xie, Liangxu; Shen, Lin; Chen, Zhe-Ning; Yang, Mingjun


    Although energy barriers can be efficiently crossed in reaction coordinate (RC) guided sampling, this type of method suffers from identification of the correct RCs or requirements of high dimensionality of the defined RCs for a given system. If only the approximate RCs with significant barriers are used in the simulations, hidden energy barriers with small to medium height would exist in other degrees of freedom (DOFs) relevant to the target process and consequently cause the problem of insufficient sampling. To address sampling in this so-called hidden-barrier situation, here we propose an effective approach that combines temperature accelerated molecular dynamics (TAMD), an efficient RC-guided sampling method, with integrated tempering sampling (ITS), a generalized ensemble sampling method. In this combined ITS-TAMD method, the sampling along the major RCs with high energy barriers is guided by TAMD, and the sampling of the rest of the DOFs with lower but not negligible barriers is enhanced by ITS. The performance of ITS-TAMD has been examined on three systems whose processes involve hidden barriers. In comparison to the standalone TAMD or ITS approach, the present hybrid method shows three main improvements. (1) Sampling efficiency can be improved at least fivefold, even in the presence of hidden energy barriers. (2) The canonical distribution can be more accurately recovered, from which the thermodynamic properties along other collective variables can be computed correctly. (3) The robustness of the selection of major RCs suggests that the dimensionality of necessary RCs can be reduced. Our work shows more potential applications of the ITS-TAMD method as an efficient and powerful tool for the investigation of a broad range of interesting cases.

  9. Expanding the application of the tablet processing workstation to support the sample preparation of oral suspensions. (United States)

    Opio, Alex Manuel; Nickerson, Beverly; Xue, Gang; Warzeka, John; Norris, Ken


    Sample preparation is the most time-consuming part of the analytical method for powder for oral suspension (POS) assay, purity, and preservative analysis, as this involves multiple dilution and filtration steps. The Tablet Processing Workstation (TPW) was used to automate the sample preparation of a POS formulation. Although the TPW is typically used to automate the preparation of solid oral dosage forms and powders, it contains all of the necessary components to perform POS sample preparation. The TPW exhibited acceptable repeatability in testing 3 lots using 10 replicate preparations per lot. Acceptable linearity of the drug and preservative in the presence of excipients was demonstrated over the range corresponding to 50-150% of intent. Accuracy showed suitable recoveries for all points evaluated. TPW results were shown to correlate to results obtained with the manual method. The TPW method was used to prepare samples in support of manufacturing scale-up efforts. With the efficiencies gained using the TPW, it was possible to analyze a large number of samples generated during process development activities for the POS formulation with minimal human intervention. The extensive data enabled trending of the manufacturing development runs and helped to identify optimization strategies for the process. Copyright © 2011 Society for Laboratory Automation and Screening. Published by Elsevier Inc. All rights reserved.

  10. GREENSCOPE: A Method for Modeling Chemical Process ... (United States)

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Efficiency, and Energy, can evaluate processes with over a hundred different indicators. These indicators provide a means for realizing the principles of green chemistry and green engineering in the context of sustainability. Development of the methodology has centered around three focal points. One is a taxonomy of impacts that describes the indicators and provides absolute scales for their evaluation. The setting of best and worst limits for the indicators allows the user to know the status of the process under study in relation to understood values. Thus, existing or imagined processes can be evaluated according to their relative indicator scores, and process modifications can strive towards realizable targets. A second area of focus is in advancing definitions of data needs for the many indicators of the taxonomy. Each indicator has specific data necessary for its calculation. Values needed and data sources have been identified. These needs can be mapped according to the information source (e.g., input stream, output stream, external data, etc.) for each of the bases. The user can visualize data-indicator relationships on the way to choosing selected ones for evalua
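
    The idea of scoring an indicator on an absolute scale between best and worst limits can be sketched as follows. This is an illustrative reading of the description above, not EPA's implementation; the indicator name, limits, and measured value are hypothetical.

```python
def indicator_score(actual, worst, best):
    """Dimensionless percent score for a sustainability indicator:
    0% at the worst limit, 100% at the best limit."""
    score = 100.0 * (actual - worst) / (best - worst)
    return max(0.0, min(100.0, score))  # clamp to the defined scale

# Hypothetical mass-efficiency indicator: best limit = 0 kg waste per kg
# product, worst limit = 10; the measured process produces 2.5.
print(indicator_score(2.5, worst=10.0, best=0.0))  # → 75.0
```

Because the limits are absolute, two different processes (or a process before and after modification) can be compared directly by their percent scores, which is the point of the "understood values" mentioned above.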


    Directory of Open Access Journals (Sweden)



    “Audit sampling” involves applying audit procedures to less than 100% of the items within an account balance or class of transactions, such that all sampling units have a chance of selection. This allows the auditor to obtain and evaluate audit evidence about some characteristics of the selected items in order to form, or assist in forming, a conclusion concerning the population from which the sample was drawn. Audit sampling can use either a statistical or a non-statistical approach. (INTERNATIONAL STANDARD ON AUDITING 530 – AUDIT SAMPLING AND OTHER SELECTIVE TESTING PROCEDURES)


    Directory of Open Access Journals (Sweden)

    Cardos Vasile-Daniel


    “Audit sampling” involves applying audit procedures to less than 100% of the items within an account balance or class of transactions, such that all sampling units have a chance of selection. This allows the auditor to obtain and evaluate audit evidence about some characteristics of the selected items in order to form, or assist in forming, a conclusion concerning the population from which the sample was drawn. Audit sampling can use either a statistical or a non-statistical approach. (INTERNATIONAL STANDARD ON AUDITING 530 – AUDIT SAMPLING AND OTHER SELECTIVE TESTING PROCEDURES)

  13. Automated PCR setup for forensic casework samples using the Normalization Wizard and PCR Setup robotic methods. (United States)

    Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C


    Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a near complete automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the

  14. Method of remotely characterizing thermal properties of a sample (United States)

    Heyman, Joseph S. (Inventor); Heath, D. Michele (Inventor); Welch, Christopher (Inventor); Winfree, William P. (Inventor); Miller, William E. (Inventor)


    A sample in a wind tunnel is radiated from a thermal energy source outside of the wind tunnel. A thermal imager system, also located outside of the wind tunnel, reads surface radiations from the sample as a function of time. The produced thermal images are characteristic of the heat transferred from the sample to the flow across the sample. In turn, the measured rates of heat loss of the sample are characteristic of the flow and the sample.

  15. A comparison of methods for representing sparsely sampled random quantities.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua


    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative; that it minimally over-estimate the desired percentile range of the actual PDF. The presence of the two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical Tolerance Intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
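
    A small calculation shows why naive percentile estimates from sparse samples are unreliable and why conservative constructs such as tolerance intervals are needed. The sketch below uses the standard distribution-free result that the coverage of the sample range for n i.i.d. draws follows a Beta(n-1, 2) distribution; it is background statistics, not taken from the report, and the n = 10 example is illustrative.

```python
import random

random.seed(1)

def prob_range_covers(n, p):
    """P(the sample [min, max] covers at least fraction p of the PDF)
    for n i.i.d. continuous samples; coverage ~ Beta(n-1, 2)."""
    return 1.0 - n * p ** (n - 1) * (1 - p) - p ** n

n, p = 10, 0.95
analytic = prob_range_covers(n, p)

# Monte Carlo check with Uniform(0,1) samples, where the coverage of
# [min, max] is simply max - min.
trials = 20000
hits = sum(1 for _ in range(trials)
           if (lambda xs: max(xs) - min(xs) >= p)
              ([random.random() for _ in range(n)]))
mc = hits / trials

print(f"P(sample range covers {p:.0%} of the PDF) with n={n}: "
      f"analytic {analytic:.3f}, Monte Carlo {mc:.3f}")
```

For n = 10 the chance that even the full sample range bounds the central 95% of the distribution is under 9%, which motivates bounding the percentile range conservatively rather than estimating it directly.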

  16. An adaptive sampling and windowing interrogation method in PIV (United States)

    Theunissen, R.; Scarano, F.; Riethmuller, M. L.


    This study proposes a cross-correlation based PIV image interrogation algorithm that adapts the number of interrogation windows and their size to the image properties and to the flow conditions. The proposed methodology releases the constraint of uniform sampling rate (Cartesian mesh) and spatial resolution (uniform window size) commonly adopted in PIV interrogation. Especially in non-optimal experimental conditions where the flow seeding is inhomogeneous, this leads either to loss of robustness (too few particles per window) or measurement precision (too large or coarsely spaced interrogation windows). Two criteria are investigated, namely adaptation to the local signal content in the image and adaptation to local flow conditions. The implementation of the adaptive criteria within a recursive interrogation method is described. The location and size of the interrogation windows are locally adapted to the image signal (i.e., seeding density). Also the local window spacing (commonly set by the overlap factor) is put in relation with the spatial variation of the velocity field. The viability of the method is illustrated over two experimental cases where the limitation of a uniform interrogation approach appears clearly: a shock-wave-boundary layer interaction and an aircraft vortex wake. The examples show that the spatial sampling rate can be adapted to the actual flow features and that the interrogation window size can be arranged so as to follow the spatial distribution of seeding particle images and flow velocity fluctuations. In comparison with the uniform interrogation technique, the spatial resolution is locally enhanced while in poorly seeded regions the level of robustness of the analysis (signal-to-noise ratio) is kept almost constant.
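
    The kernel of any PIV interrogation, adaptive or not, is finding the displacement that maximizes the cross-correlation between two interrogation windows. The toy below demonstrates only that kernel on a synthetic particle blob; the paper's actual contribution, adapting window size and spacing to seeding density and flow gradients, is not implemented here, and all arrays and shifts are made up.

```python
def correlate_shift(win_a, win_b, max_shift=3):
    """Return the integer (dy, dx) maximizing the direct cross-correlation
    of win_b against win_a over a small search range."""
    h, w = len(win_a), len(win_a[0])
    best, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            s = 0.0
            for y in range(h):
                for x in range(w):
                    y2, x2 = y + dy, x + dx
                    if 0 <= y2 < h and 0 <= x2 < w:
                        s += win_a[y][x] * win_b[y2][x2]
            if best is None or s > best:
                best, best_shift = s, (dy, dx)
    return best_shift

# Synthetic 8x8 "exposures": a 3x3 bright blob, then the same blob
# displaced by (dy, dx) = (2, 1) in the second frame.
def blob(cy, cx):
    return [[1.0 if abs(y - cy) <= 1 and abs(x - cx) <= 1 else 0.0
             for x in range(8)] for y in range(8)]

frame_a, frame_b = blob(3, 3), blob(5, 4)
print(correlate_shift(frame_a, frame_b))  # → (2, 1)
```

In practice the correlation is computed via FFT and refined to sub-pixel accuracy; the adaptive scheme described above then varies the window size and spacing fed into this step.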

  17. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine. (United States)

    Hodgson, James A; Seyler, Tiffany H; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing


    Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs: N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR), using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (low inter-run CV), and it increases sample throughput while maintaining a low limit of detection. A sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMs output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle.

  18. Advances in Packaging Methods, Processes and Systems

    Directory of Open Access Journals (Sweden)

    Nitaigour Premchand Mahalik


    Full Text Available The food processing and packaging industry is becoming a multi-trillion dollar global business. The reason is that the recent increase in incomes in traditionally less economically developed countries has led to a rise in standards of living that includes a significantly higher consumption of packaged foods. As a result, food safety guidelines have been more stringent than ever. At the same time, the number of research and educational institutions—that is, the number of potential researchers and stakeholders—has increased in the recent past. This paper reviews recent developments in food processing and packaging (FPP, keeping in view the aforementioned advancements and bearing in mind that FPP is an interdisciplinary area in that materials, safety, systems, regulation, and supply chains play vital roles. In particular, the review covers processing and packaging principles, standards, interfaces, techniques, methods, and state-of-the-art technologies that are currently in use or in development. Recent advances such as smart packaging, non-destructive inspection methods, printing techniques, application of robotics and machineries, automation architecture, software systems and interfaces are reviewed.

  19. A review of blood sample handling and pre-processing for metabolomics studies. (United States)

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta


    Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there still exist fundamental needs for considering pre-analytical variability that can introduce bias to the subsequent analytical process, decrease the reliability of the results and, moreover, confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent results misinterpretation and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Geothermal water and gas: collected methods for sampling and analysis. Comment issue. [Compilation of methods

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, J.G.; Serne, R.J.; Shannon, D.W.; Woodruff, E.M.


    A collection of methods for sampling and analysis of geothermal fluids and gases is presented. Compilations of analytic options for constituents in water and gases are given. Also, a survey of published methods of laboratory water analysis is included. It is stated that no recommendation of the applicability of the methods to geothermal brines should be assumed since the intent of the table is to encourage and solicit comments and discussion leading to recommended analytical procedures for geothermal waters and research. (WHK)

  1. Contribution of Sample Processing to Variability and Accuracy of the Results of Pesticide Residue Analysis in Plant Commodities. (United States)

    Ambrus, Árpád; Buczkó, Judit; Hamow, Kamirán Á; Juhász, Viktor; Solymosné Majzik, Etelka; Szemánné Dobrik, Henriett; Szitás, Róbert


    Significant reduction of the concentration of some pesticide residues and a substantial increase in the uncertainty of the results derived from the homogenization of sample materials were reported in scientific papers long ago. Nevertheless, performance of methods is frequently evaluated on the basis of recovery tests only, which exclude sample processing. We studied the effect of sample processing on the accuracy and uncertainty of the measured residue values with lettuce, tomato, and maize grain samples, applying mixtures of selected pesticides. The results indicate that the method is simple and robust and applicable in any pesticide residue laboratory. The analytes remaining in the final extract are influenced by their physical-chemical properties, the nature of the sample material, the temperature of comminution of the sample, and the mass of the test portion extracted. Consequently, validation protocols should include testing the effect of sample processing, and the performance of the complete method should be regularly checked within internal quality control.

  2. Method for producing a thin sample band in a microchannel device (United States)

    Griffiths, Stewart K [Livermore, CA; Nilson, Robert H [Cardiff, CA


    The present invention improves the performance of microchannel systems for chemical and biological synthesis and analysis by providing a method and apparatus for producing a thin band of a species sample. Thin sample bands improve the resolution of microchannel separation processes, as well as many other processes requiring precise control of sample size and volume. The new method comprises a series of steps in which a species sample is manipulated by controlled transport through a junction formed at the intersection of four or more channels. A sample is first inserted into the end of one of these channels in the vicinity of the junction. Next, this sample is thinned by transport across the junction one or more times. During these thinning steps, flow enters the junction through one of the channels and exits through those remaining, providing a divergent flow field that progressively stretches and thins the band with each traverse of the junction. The thickness of the resulting sample band may be smaller than the channel width. Moreover, the thickness of the band may be varied and controlled by altering the method alone, without modification to the channel or junction geometries. The invention is applicable to both electroosmotic and electrophoretic transport, to combined electrokinetic transport, and to some special cases in which bulk fluid transport is driven by pressure gradients. It is further applicable to channels that are open, filled with a gel or filled with a porous or granular material.

  3. Operating cost budgeting methods: quantitative methods to improve the process

    Directory of Open Access Journals (Sweden)

    José Olegário Rodrigues da Silva

    Full Text Available Abstract Operating cost forecasts are used in economic feasibility studies of projects and in the budgeting process. Studies have pointed out that some companies are not satisfied with the budgeting process, and chief executive officers want updates more frequently. In these cases, the main problem lies in the costs versus benefits. Companies seek simple and cheap forecasting methods without, at the same time, conceding in terms of the quality of the resulting information. This study aims to compare operating cost forecasting models to identify the ones that are relatively easy to implement and produce smaller deviations. For this purpose, we applied ARIMA (autoregressive integrated moving average) and distributed dynamic lag models to data from a Brazilian petroleum company. The results suggest that the models have potential application, and that the multivariate models fitted the data better and proved a better way to forecast costs than the univariate models.
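
    The model-comparison workflow described above, fitting a time-series model on a training window and comparing its holdout error against a naive baseline, can be sketched with a toy example. This is not the study's ARIMA or distributed-lag analysis and uses no company data; the series, the simple AR(1)-with-drift model, and all numbers are illustrative assumptions.

```python
import random

random.seed(3)

# Synthetic monthly operating-cost series: trend + winter bump + noise.
costs = [100 + 0.8 * t + 8 * ((t % 12) in (0, 1, 11)) + random.gauss(0, 2)
         for t in range(60)]
train, test = costs[:48], costs[48:]

# Fit AR(1) with drift, y_t = a + b * y_{t-1}, by ordinary least squares.
x, y = train[:-1], train[1:]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
     / sum((xi - mx) ** 2 for xi in x))
a = my - b * mx

def mape(actual, pred):
    """Mean absolute percentage error, a common 'deviation' measure."""
    return 100 * sum(abs(p - v) / v for v, p in zip(actual, pred)) / len(actual)

# One-step-ahead forecasts over the holdout vs. a naive last-value forecast.
prev = train[-1]
ar_pred, naive_pred = [], []
for v in test:
    ar_pred.append(a + b * prev)
    naive_pred.append(prev)
    prev = v  # roll forward with the observed value

print(f"AR(1) MAPE: {mape(test, ar_pred):.2f}%  "
      f"naive MAPE: {mape(test, naive_pred):.2f}%")
```

In the study this comparison is run across candidate models (univariate and multivariate), and the model with the smallest holdout deviation that is still cheap to implement is preferred.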

  4. Pesticide-sampling equipment, sample-collection and processing procedures, and water-quality data at Chicod Creek, North Carolina, 1992 (United States)

    Manning, T.K.; Smith, K.E.; Wood, C.D.; Williams, J.B.


    Water-quality samples were collected from Chicod Creek in the Coastal Plain Province of North Carolina during the summer of 1992 as part of the U.S. Geological Survey's National Water-Quality Assessment Program. Chicod Creek is in the Albemarle-Pamlico drainage area, one of four study units designated to test equipment and procedures for collecting and processing samples for the solid-phase extraction of selected pesticides. The equipment and procedures were used to isolate 47 pesticides, including organonitrogen, carbamate, organochlorine, organophosphate, and other compounds, targeted to be analyzed by gas chromatography/mass spectrometry. Sample-collection and processing equipment, equipment-cleaning and set-up procedures, methods pertaining to collecting, splitting, and solid-phase extraction of samples, and water-quality data resulting from the field test are presented in this report. Most problems encountered during this intensive sampling exercise were operational difficulties relating to equipment used to process samples.


    Directory of Open Access Journals (Sweden)

    Saeid Karamzadeh


    Full Text Available In the past three decades, many different applications of Ground Penetrating Radar (GPR) have appeared in real life. This radar faces important challenges in civil as well as military applications. In this paper, the fundamentals of GPR systems are covered, and three important signal processing methods (the Wavelet Transform, the Matched Filter, and the Hilbert-Huang Transform) are compared to each other in order to obtain the most accurate information about objects that are subsurface or behind a wall.




    In this paper, we describe some central mathematical problems in medical imaging. The subject has been undergoing rapid changes driven by better hardware and software. Much of the software is based on novel methods utilizing geometric partial differential equations in conjunction with standard signal/image processing techniques as well as computer graphics facilitating man/machine interactions. As part of this enterprise, researchers have been trying to base biomedical engineering principles on rigorous mathematical foundations for the development of software methods to be integrated into complete therapy delivery systems. These systems support the more effective delivery of many image-guided procedures such as radiation therapy, biopsy, and minimally invasive surgery. We will show how mathematics may impact some of the main problems in this area, including image enhancement, registration, and segmentation. PMID:23645963

  7. Methods of control the machining process

    Directory of Open Access Journals (Sweden)

    Yu.V. Petrakov


    Full Text Available This paper presents control methods differentiated by the time at which the information used is received: a priori, a posteriori, and current. When a priori information is used, the cutting mode is determined by simulating the process of cutting the allowance, with the shapes of the workpiece and the part represented as wireframes. Control using current information calls for an adaptive control system and modernization of the CNC machine, with the input to the control unit computed using established optimization software. For control by a posteriori information, a method is proposed for correcting the shape-generating trajectory in the second pass based on measurement of the workpiece surface formed by the first pass. Programs have been developed that automatically generate the adjusted file for machining.

  8. Experimental application of contour method for determination of residual stress in subsurface layers of milled sample

    Directory of Open Access Journals (Sweden)

    Karel Horák


    Determination of residual stress close to the sample surface is in most cases performed by the hole-drilling method, X-ray diffraction, or neutron diffraction. Each of these methods has its benefits and disadvantages. For the diffraction methods, measurement speed is the main disadvantage; it is also very problematic to apply them to samples with mechanically deformed surfaces, for example those produced by standard machining operations, so the results are often confusing and hard to interpret. The hole-drilling method, on the other hand, is less sensitive to the quality of the sample surface than the diffraction methods, but its realization is quite expensive and equipment-demanding (strain gage rosettes, a miniature milling cutter, a high-speed milling machine, PC equipment, …). The recently introduced contour method, used for determination of residual stress inside the sample, is very fast, can be performed with almost common laboratory equipment, and combines the traditional approach with modern numerical methods based on FEM. The contour method was selected for determination of the residual stress below the milled surface, and the dependence of the residual stress value on the quality of the milling process is demonstrated.

  9. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods (United States)

    Coes, Alissa L.; Paretti, Nicholas V.; Foreman, William T.; Iverson, Jana L.; Alvarez, David A.


    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted to provide information for future investigators to use when selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low-flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19–23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all three methods. TOC detections and concentrations obtained by the three sampling methods, however, depend on multiple factors: this study found that stream discharge, constituent loading, and compound type all affected the TOC concentrations detected by each method, and detections and concentrations were also affected by the reporting limits, bias, recovery, and performance of each method.

  10. Parasitological stool sample exam by spontaneous sedimentation method using conical tubes: effectiveness, practice, and biosafety

    Directory of Open Access Journals (Sweden)

    Steveen Rios Ribeiro


    INTRODUCTION: Spontaneous sedimentation is an important procedure for stool examination. A modification of this technique using conical tubes was performed and evaluated. METHODS: Fifty fecal samples were processed in sedimentation glasses and in polypropylene conical tubes. Another 50 samples were used for quantitative evaluation of protozoan cysts. RESULTS: Although no significant differences occurred in the frequency of protozoa and helminths detected, significant differences in protozoan cyst counts did occur. CONCLUSIONS: The use of the tube provides a shorter sedimentation path for the sample, increases the concentration of parasites for microscopy analysis, minimizes the risk of contamination, reduces odor, and optimizes the workspace.

  11. Probing methane hydrate nucleation through the forward flux sampling method. (United States)

    Bi, Yuanfei; Li, Tianshu


    Understanding the nucleation of hydrate is the key to developing effective strategies for controlling methane hydrate formation. Here we present a computational study of methane hydrate nucleation, combining the forward flux sampling (FFS) method and the coarse-grained water model mW. To facilitate the application of FFS in studying the formation of methane hydrate, we developed an effective order parameter λ on the basis of a topological analysis of the tetrahedral network. The order parameter capitalizes on the signature of hydrate structure, i.e., polyhedral cages, and is capable of efficiently distinguishing hydrate from ice and liquid water while allowing the formation of different hydrate phases, i.e., sI, sII, and amorphous. Integrating the order parameter λ with FFS allows hydrate nucleation rates to be computed explicitly and yields an ensemble of nucleation trajectories under conditions where spontaneous hydrate nucleation is too slow to occur in direct simulation. The convergence of the obtained hydrate nucleation rate was found to depend crucially on the convergence of the spatial distribution of the spontaneously formed hydrate seeds obtained in the initial sampling of FFS. The validity of the approach is also verified by the agreement between the calculated nucleation rate and that inferred from direct simulation. Analyzing the large ensemble of hydrate nucleation trajectories, we show that hydrate formation at 220 K and 500 bar is initiated by nucleation events occurring in the vicinity of the water-methane interface, and facilitated by a gradual transition from amorphous to crystalline structure. The latter provides direct support for the proposed two-step nucleation mechanism of methane hydrate.
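    The FFS rate factorizes as k = Φ0 · ∏i P(λi+1 | λi): the flux through the first interface times the staged conditional crossing probabilities. A toy sketch on a one-dimensional random walk (a hypothetical stand-in for the hydrate system; real FFS restarts each stage from stored crossing configurations rather than from the bare interface value) illustrates the staged estimation:

```python
import random

def run_to(x, lo, hi, p_up, rng):
    """Advance a 1-D walk until it reaches hi (success) or falls back to lo."""
    while lo < x < hi:
        x += 1 if rng.random() < p_up else -1
    return x >= hi

def ffs_stage_probs(interfaces, basin, p_up, trials, seed=1):
    """Estimate P(lambda_{i+1} | lambda_i) stage by stage, FFS-style.

    Simplification: each stage restarts trials from the interface value
    itself instead of from an ensemble of stored crossing configurations.
    """
    rng = random.Random(seed)
    probs = []
    for lam, nxt in zip(interfaces, interfaces[1:]):
        hits = sum(run_to(lam, basin, nxt, p_up, rng) for _ in range(trials))
        probs.append(hits / trials)
    return probs

probs = ffs_stage_probs([2, 4, 6], basin=0, p_up=0.5, trials=2000)
flux0 = 0.01                 # hypothetical flux through the first interface
rate = flux0
for p in probs:
    rate *= p                # k = flux0 * product of stage probabilities
```

    For the unbiased walk the stage probabilities follow the gambler's-ruin values (1/2 and 2/3 here), so the estimator can be checked against an exact answer.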

  12. Comparison of two methods of tear sampling for protein quantification by Bradford method

    Directory of Open Access Journals (Sweden)

    Eliana Farias


    The aim of this study was to compare two methods of tear sampling for protein quantification. Tear samples were collected from 29 healthy dogs (58 eyes) using Schirmer tear test (STT) strips and microcapillary tubes. The samples were frozen at -80ºC and analyzed by the Bradford method. Results were analyzed by Student's t-test. The average protein concentration and standard deviation for tears collected with microcapillary tubes were 4.45 mg/mL ± 0.35 and 4.52 mg/mL ± 0.29 for the right and left eyes, respectively. The average protein concentration and standard deviation for tears collected with STT strips were 54.5 mg/mL ± 0.63 and 54.15 mg/mL ± 0.65 for the right and left eyes, respectively. Statistically significant differences (p < 0.001) were found between the methods. Under the conditions of this study, the average protein concentration obtained by the Bradford test from tear samples collected with STT strips was higher than that obtained with microcapillary tubes. Reference values for tear protein concentration should therefore be interpreted according to the method used to collect the tear samples.
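    The reported Student's t comparison can be sketched as follows; the individual readings below are hypothetical, invented only to mimic the reported means and standard deviations (mg/mL):

```python
from statistics import mean, stdev

def two_sample_t(a, b):
    """Student's two-sample t statistic (pooled, equal-variance form)."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

# Hypothetical readings (mg/mL), chosen to mimic the reported means/SDs:
capillary = [4.1, 4.5, 4.8, 4.3, 4.6]      # microcapillary tube, ~4.45 +/- 0.3
strip = [53.9, 54.8, 54.2, 55.1, 54.4]     # STT strip, ~54.5 +/- 0.6
t = two_sample_t(strip, capillary)         # large positive t: strip >> capillary
```

    The t statistic would then be compared against the t distribution with na + nb - 2 degrees of freedom to obtain the p-value.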

  13. Update on ESTCP Project ER-0918: Field Sampling and Sample Processing for Metals on DoD Ranges (United States)


    Low recovery of antimony is evident with conventional analysis; a new digestion process is needed. The approach is applicable to both metals and energetics. Experimental design (Task 1): multi-increment versus grab samples; number of increments per decision unit; … for digestate preparation. (A slide table comparing single-DU grab, multi-increment, and berm sample types across increment counts of 5–100, with Pb < 400, did not survive extraction.)

  14. Highly Effective DNA Extraction Method from Fresh, Frozen, Dried and Clotted Blood Samples

    Directory of Open Access Journals (Sweden)

    Jaleh Barar


    Introduction: Today, with the tremendous potential of genomics and other recent advances in science, reliable DNA extraction methods are more relevant than ever before. The ideal genomic DNA extraction process yields high quantities of pure, integral, and intact genomic DNA (gDNA) from the sample, with minimal co-extraction of inhibitors of downstream processes. Here, we report the development of a very rapid, less hazardous, high-throughput protocol for extracting high-quality DNA from blood samples. Methods: Dried, clotted, and ethylene diamine tetra-acetic acid (EDTA)-treated fresh and frozen blood samples were extracted using this method, and the quality and integrity of the extracted DNA were corroborated by agarose gel electrophoresis, PCR, and DNA digestion with a restriction enzyme. UV spectrophotometric and gel electrophoresis analysis showed a high A260/A280 ratio (>1.8) and high intactness of the DNA. Results: The PCR and DNA digestion experiments indicated that the final solutions of extracted DNA contained no inhibitory substances, confirming that the isolated DNA is of good quality. Conclusion: The high quality and quantity obtained with the current method, the absence of enzymatic processing, and the correspondingly low cost make it appropriate for DNA extraction not only from human but also from animal blood samples in any molecular biology lab.

  15. Digital Signal Processing Methods for Ultrasonic Echoes. (United States)

    Sinding, Kyle; Drapaca, Corina; Tittmann, Bernhard


    Digital signal processing has become an important component of the data analysis needed in industrial applications. In particular, for ultrasonic thickness measurements the signal-to-noise ratio plays a major role in the accurate calculation of the arrival time. For this application a band-pass filter is not sufficient, since the noise level cannot be decreased enough for a reliable thickness measurement to be performed. This paper demonstrates the abilities of two regularization methods - total variation and Tikhonov - to filter acoustic and ultrasonic signals. Both methods are compared to frequency-based filtering for digitally produced signals as well as signals produced by ultrasonic transducers. The total variation and Tikhonov filters are shown to recover signals from noisy acoustic data more accurately and faster than a band-pass filter, and the total variation filter reduces the noise significantly for signals with clear ultrasonic echoes. Signal-to-noise ratios were increased by over 400% using a simple parameter optimization. While frequency-based filtering is efficient for specific applications, this paper shows that noise reduction in ultrasonic systems can be much more efficient with regularization methods.
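    A minimal sketch of the Tikhonov variant (a second-difference smoothness penalty; not the authors' code, and a dense solver is used only because the illustrative traces are short):

```python
def tikhonov_smooth(y, lam):
    """Tikhonov-regularized smoothing: minimize ||x - y||^2 + lam*||D2 x||^2,
    where D2 is the second-difference operator. Solves (I + lam*D2^T D2) x = y
    by dense Gaussian elimination (fine for short traces)."""
    n = len(y)
    # Build A = I + lam * D2^T D2.
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        A[i][i] = 1.0
    for r in range(n - 2):                 # row r of D2: x[r] - 2 x[r+1] + x[r+2]
        row = [0.0] * n
        row[r], row[r + 1], row[r + 2] = 1.0, -2.0, 1.0
        for i in range(n):
            if row[i]:
                for j in range(n):
                    if row[j]:
                        A[i][j] += lam * row[i] * row[j]
    # Naive Gaussian elimination with partial pivoting, then back-substitution.
    b = list(y)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for j in range(col, n):
                A[r][j] -= f * A[col][j]
            b[r] -= f * b[col]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

# Sanity demo: a noise spike is spread out, while the total sum is preserved.
spike = [0.0] * 11
spike[5] = 1.0
smoothed = tikhonov_smooth(spike, 1.0)
```

    A constant trace passes through unchanged (the penalty of a constant is zero), which is a quick correctness check on the solver.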

  16. Method to resolve microphone and sample location errors in the two-microphone duct measurement method (United States)



    Utilizing the two-microphone impedance tube method, the normal incidence acoustic absorption and acoustic impedance can be measured for a given sample. This method relies on the measured transfer function between two microphones, and on knowledge of their precise location relative to each other and to the sample material. In this article, a method is proposed to accurately determine these locations. A third sensor is added at the end of the tube to simplify the measurement. First, a justification and investigation of the method is presented. Second, reference terminations are measured to evaluate the accuracy of the apparatus. Finally, comparisons are made between the new method and current methods for determining these distances, and the variations are discussed. From this, conclusions are drawn with regard to the applicability of and need for the new method, and the circumstances under which it is applicable. Results show that the method provides a reliable determination of both microphone locations, which is not possible using the current techniques. Errors due to inaccurate determination of these parameters between methods were on the order of 3% for R and 12% for Re Z.
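    The transfer-function relation underlying the two-microphone method can be sketched as follows; the geometry values are hypothetical, s is the microphone spacing, and x1 denotes the distance from the sample to the farther microphone:

```python
import cmath
import math

def reflection_coefficient(H12, f, s, x1, c=343.0):
    """Two-microphone transfer-function estimate of the normal-incidence
    reflection coefficient:
        R = (H12 - e^{-jks}) / (e^{jks} - H12) * e^{2jk*x1}
    with H12 = p2/p1 (mic 2 nearer the sample), mic spacing s, and
    distance x1 from the sample surface to the farther microphone."""
    k = 2 * math.pi * f / c
    num = H12 - cmath.exp(-1j * k * s)
    den = cmath.exp(1j * k * s) - H12
    return num / den * cmath.exp(2j * k * x1)

# Synthetic check: build standing-wave pressures for a known R, then recover it.
f, s, x1 = 1000.0, 0.05, 0.15            # Hz, m, m (hypothetical geometry)
k = 2 * math.pi * f / 343.0
R_true = 0.6 * cmath.exp(0.3j)
p = lambda x: cmath.exp(1j * k * x) + R_true * cmath.exp(-1j * k * x)
H12 = p(x1 - s) / p(x1)                  # transfer function between the two mics
R_est = reflection_coefficient(H12, f, s, x1)
alpha = 1.0 - abs(R_est) ** 2            # normal-incidence absorption coefficient
```

    The sensitivity of R and alpha to errors in s and x1 is exactly why the article's location-calibration procedure matters.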

  17. An assessment of methods for sampling carabid beetles

    African Journals Online (AJOL)


    Moist leaf litter was scooped onto white clothing (1 square metre beating sheet) and carabid beetles caught using a “pootah” (aspirator) or a pair of forceps. Resting beetles were sampled by manual searching under logs, stones and tree barks. Sampling effort was measured by time, each "sample" containing carabid.

  18. Interatomic force microscope and sample observing method therefor


    YAMANAKA, K; Kolosov, Oleg; Ogiso, H; Sato, H.; Koda, T


    PURPOSE: To provide a measurement technique for the interatomic force microscope in which sample irregularity can be well separated from frictional force. SOLUTION: An oscillating force is applied laterally between a sample 8 and a probe 4. The sample 8 is tilted laterally to excite a bending orthogonal oscillation, and the phase and amplitude of the cantilever's oscillation are detected.

  19. 19 CFR 151.70 - Method of sampling by Customs. (United States)


    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such a... clean yield if such a test is requested in accordance with the provisions of § 151.71(c), or if a second...

  20. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling Subsystem: A Description of the Sampling Functionality of the System after being on the Surface for Two Years. (United States)

    Beegle, L. W.; Anderson, R. C.; Abbey, W. J.


    The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory (MSL) is a rover-based sampling system. SA/SPaH consists of a powdering drill and a scooping, sieving, and portioning device mounted on a turret at the end of a robotic arm. The Curiosity rover performed several acquisitions and processings of solid samples during its first year of operation. Materials were processed and delivered to the two analytical instruments, Chemistry and Mineralogy (CheMin) and Sample Analysis at Mars (SAM), both of which require a specific particle size in the material delivered to them in order to determine its mineralogy and geochemistry. In this presentation, the functionality of the system will be explained, along with the in-situ targets the system has acquired and the samples that were delivered.

  1. Sample collection method and sequential sampling plan for the mite Oligonychus ununguis and the aphid Cinara laricifex on tamarack. Technical note No. 278

    Energy Technology Data Exchange (ETDEWEB)


    Spider mite and aphid infestations are a recurrent problem in a number of seed orchards in New Brunswick. A study was carried out to develop suitable sampling methods for routine assessment of black larch aphid (Cinara laricifex) and spruce spider mite (Oligonychus ununguis) populations on tamarack and black spruce. Tamarack was chosen because population levels of the black larch aphid and spruce spider mite have been traditionally high. Sample collection procedures, sample processing/counting, sequential sampling, and use of the sequential sampling plan are discussed.

  2. Sampling and measurement methods for diesel exhaust aerosol

    Energy Technology Data Exchange (ETDEWEB)

    Ristimaeki, J.


    Awareness of the adverse health effects of urban aerosols has increased general interest in aerosol sources. As diesel engines are a significant urban anthropogenic particle source, diesel aerosols have been under intense research during the last decades. This thesis discusses measurement issues related to diesel exhaust particles, focusing on effective density measurement with the ELPI-SMPS and TDA-ELPI methods, and presents some additional performance issues not discussed in the papers. As the emergence of volatile nanoparticles in diesel exhaust is sensitive to the prevailing circumstances, the dilution parameters must be properly controlled in laboratory measurements in order to obtain repeatable and reproducible results. In addition to the dilution parameters, the effect of ambient temperature on light-duty vehicle exhaust particulate emissions was studied. It was found that turbocharged diesel engines were relatively insensitive to changes in ambient temperature, whereas particle emissions from naturally aspirated gasoline vehicles increased significantly at low temperatures. The measurement of the effective density and mass of aerosol particles with a DMA and an impactor was studied and applied to the characterisation of diesel exhaust particles. The TDA-ELPI method was used to determine the volatile mass of diesel exhaust particles as a function of particle size. Based on the measurement results, condensation was suggested to be the main phenomenon driving volatile mass transfer to the exhaust particles. Identification of this process and the separation of volatile and solid mass may become important, as some health-effect studies suggest that the volatile fraction is a key component causing the biological effects of diesel exhaust particles. (orig.)

  3. The experimental research on response characteristics of coal samples under the uniaxial loading process (United States)

    Jia, Bing; Wei, Jian-Ping; Wen, Zhi-Hui; Wang, Yun-Gang; Jia, Lin-Xing


    In order to study the response characteristics of infrasound in coal samples under uniaxial loading, coal samples were collected from the GengCun mine. A coal-rock stress loading device, an acoustic emission test system, and an infrasound test system were used to record the infrasonic and acoustic emission signals during uniaxial loading. The results were analyzed by methods including wavelet filtering, threshold denoising, and time-frequency analysis. They showed that the infrasonic wave changed in distinct stages during loading: an initial stage with a certain number of infrasound events, a middle stage with few events, and a late stage with a gradual decrease. This is in good agreement with the changing characteristics of the acoustic emission. At the same time, the frequency of the infrasound is very low, so it can propagate over very long distances with little attenuation, and the infrasound signatures before the failure of the coal samples are distinct. A method of using these infrasound characteristics to predict the failure of coal samples is proposed, which is of great significance for the prediction of geological hazards in coal mines.

  4. Implementation of SMED method in wood processing

    Directory of Open Access Journals (Sweden)

    Vukićević Milan R.


    Problems in production are mainly tackled by management on the basis of the hardware component, i.e. by the introduction of work centres of the latest generation. In this way, continuity of quality, reduced energy consumption, humanization of work, etc. are ensured. However, the interaction between the technical-technological and organizational-economic aspects of production is neglected. New-generation equipment requires a modern approach to the planning, organization, and management of production, as well as to the economics of production; it is consequently very important to ensure the implementation of modern organizational methods in wood processing. This paper deals with the implementation of the SMED method (Single-Minute Exchange of Die) with the aim of rationalizing set-up and end-up operations. It is known that in conditions of discontinuous production, set-up and end-up time is a significant limiting factor on the flexibility of production systems.

  5. 40 CFR 761.243 - Standard wipe sample method and size. (United States)


    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Standard wipe sample method and size... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural gas...

  6. Quantitative evaluation method of arc sound spectrum based on sample entropy (United States)

    Yao, Ping; Zhou, Kang; Zhu, Qiang


    Arc sound analysis is an effective way to evaluate the stability of the arc welding process, but current methods cannot effectively quantify its disorder. By studying the characteristics of the arc sound signal, we found that low-frequency random mutations of arc sound power, caused by unstable factors such as spatter or short circuits, increased the complexity and randomness of the signal. The arc sound signals were then visualized in the time-frequency domain by means of the spectrogram, and it was found that the distribution of the maximum power spectral density (PSD) in the spectrogram is closely related to the stability of the arc welding process. A method based on sample entropy was proposed to further quantify this relation. Finally, considering factors such as the average of the maximum PSD and the standard deviation of the sample entropy, a compound quantitative evaluation indicator, arc sound sample entropy (ASSE), was proposed; it avoids the influence of different welding parameters on the quantitative results, so that the stability of the arc welding process can be quantitatively presented. Tests showed that the accuracy rate of the method was more than 90 percent.
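    The sample entropy statistic at the heart of ASSE can be sketched with a generic SampEn implementation (not the authors' code); a regular signal scores near zero, a disordered one scores high:

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B): B counts pairs of length-m templates within
    Chebyshev tolerance r, A the pairs still matching when extended to m+1.
    Both template lengths share the same index range so A/B is well defined.
    Low values indicate a regular signal; high values a disordered one."""
    n = len(x)
    def matches(mm):
        c = 0
        for i in range(n - m):
            for j in range(i + 1, n - m):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    c += 1
        return c
    B, A = matches(m), matches(m + 1)
    return -math.log(A / B) if A and B else float("inf")

rng = random.Random(0)
regular = [0.0, 1.0] * 30                    # perfectly periodic signal
noisy = [rng.random() for _ in range(60)]    # disordered signal
```

    In the ASSE setting the input would be a windowed arc sound power sequence rather than these synthetic series.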

  7. Data warehousing methods and processing infrastructure for brain recovery research. (United States)

    Gee, T; Kenny, S; Price, C J; Seghier, M L; Small, S L; Leff, A P; Pacurar, A; Strother, S C


    In order to accelerate translational neuroscience with the goal of improving clinical care it has become important to support rapid accumulation and analysis of large, heterogeneous neuroimaging samples and their metadata from both normal control and patient groups. We propose a multi-centre, multinational approach to accelerate the data mining of large samples and facilitate data-led clinical translation of neuroimaging results in stroke. Such data-driven approaches are likely to have an early impact on clinically relevant brain recovery while we simultaneously pursue the much more challenging model-based approaches that depend on a deep understanding of the complex neural circuitry and physiological processes that support brain function and recovery. We present a brief overview of three (potentially converging) approaches to neuroimaging data warehousing and processing that aim to support these diverse methods for facilitating prediction of cognitive and behavioral recovery after stroke, or other types of brain injury or disease.

  8. Materials and Methods for Streamlined Laboratory Analysis of Environmental Samples, FY 2016 Report

    Energy Technology Data Exchange (ETDEWEB)

    Addleman, Raymond S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Naes, Benjamin E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McNamara, Bruce K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Olsen, Khris B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chouyyok, Wilaiwan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Willingham, David G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Spigner, Angel C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    The International Atomic Energy Agency (IAEA) relies upon laboratory analysis of environmental samples (typically referred to as “swipes”) collected during on-site inspections of safeguarded facilities to support the detection and deterrence of undeclared activities. Unfortunately, chemical processing and assay of the samples is slow and expensive. A rapid, effective, and simple extraction process and analysis method is needed to provide certified results with improved timeliness at reduced costs (principally in the form of reduced labor), while maintaining or improving sensitivity and efficacy. To address these safeguards needs, the Pacific Northwest National Laboratory (PNNL) explored and demonstrated improved methods for environmental sample (ES) analysis. Improvements for both bulk and particle analysis were explored. To facilitate continuity and adoption, the new sampling materials and processing methods will be compatible with existing IAEA protocols for ES analysis. PNNL collaborated with Oak Ridge National Laboratory (ORNL), which performed independent validation of the new bulk analysis methods and compared their performance to the traditional IAEA Network of Analytical Laboratories (NWAL) protocol; the ORNL efforts are reported separately. This report describes PNNL’s FY 2016 progress, which was focused on analytical applications supporting environmental monitoring of uranium enrichment plants and nuclear fuel processing. In the future the technology could be applied to other safeguards applications and analytes related to fuel manufacturing, reprocessing, etc. PNNL’s FY 2016 efforts were broken into two tasks, and a summary of progress, accomplishments, and highlights is provided below. Principal progress and accomplishments on Task 1, Optimize Materials and Methods for ICP-MS Environmental Sample Analysis, are listed below. • Completed initial procedure for rapid uranium extraction from ES swipes based upon carbonate-peroxide chemistry (delivered to ORNL for

  9. sampling plans for monitoring quality control process at a plastic

    African Journals Online (AJOL)

    Dr Obe

    “A decrease in inspector performance will occur with increases in the length of inspection time.” The company estimates the inventory holding cost associated with the 5-day hold at N120,000.00 a year. For the near-zero-defect process, the potential impact of the defect provides clues to management about the type of system to.

  10. Communication Barriers in Quality Process: Sakarya University Sample (United States)

    Yalcin, Mehmet Ali


    Communication has an important role in life, and especially in education. Nowadays, people generally use technology for communication, and when technology is used in education and other activities, there may be communication barriers. The quality process also has an important role in higher education institutions. If a higher education…

  11. sampling plans for monitoring quality control process at a plastic

    African Journals Online (AJOL)

    Dr Obe

    Abstract. This paper explores the problem of high quality costs at a medium-sized firm manufacturing various types and sizes of plastic containers, using real-life data. In pursuance of their quality objectives, the company established a policy that dictates an expensive and time-consuming post-manufacturing process.

  12. Sampling Plans for Monitoring Quality Control Process at a Plastic ...

    African Journals Online (AJOL)

    This paper explores the problem of high quality costs at a medium-sized firm manufacturing various types and sizes of plastic containers, using real-life data. In pursuance of their quality objectives, the company established a policy that dictates an expensive and time-consuming post-manufacturing process. While the ...

  13. Effects of Heterogeneities, Sampling Frequencies, Tools and Methods on Uncertainties in Subsurface Contaminant Concentration Measurements (United States)

    Ezzedine, S. M.; McNab, W. W.


    Long-term monitoring (LTM) is particularly important for contaminants that are mitigated by the natural processes of dilution, dispersion, and degradation. At many sites, LTM can require decades of expensive sampling at tens or even hundreds of existing monitoring wells, resulting in hundreds of thousands, or millions, of dollars per year for sampling and data management. Contaminant sampling tools, methods, and frequencies are therefore chosen to minimize waste and data management costs while ensuring a reliable and informative time-history of contaminant measurements for regulatory compliance. The interplay between cause (subsurface heterogeneities, sampling techniques, measurement frequencies) and effect (unreliable data and measurement gaps) has been overlooked in many field applications, which can lead to inconsistencies in the time-histories of contaminant samples. In this study we address the relationship between cause and effect for different hydrogeological sampling settings: porous and fractured media. A numerical model has been developed using AMR-FEM to solve the physicochemical processes that take place in the aquifer and the monitoring well. In the latter, the flow is governed by the Navier-Stokes equations, while in the former it is governed by the diffusivity equation; the two are fully coupled to mimic stressed conditions and to assess the effect of a dynamic sampling tool on the formation surrounding the monitoring well. First, different sampling tools (e.g., the Easy Pump, the Snapper Grab Sampler) were simulated in a monitoring well screened in different homogeneous layered aquifers to assess their effect on the sampling measurements. Second, to make the computer runs more CPU-efficient, the flow in the monitoring well was replaced by its counterpart flow in a porous medium with infinite permeability, and the new model was used to simulate the effect of heterogeneities, sampling depth, sampling tool, and sampling frequencies on the

  14. Gap processing for adaptive maximal Poisson-disk sampling

    KAUST Repository

    Yan, Dongming


    In this article, we study the generation of maximal Poisson-disk sets with varying radii. First, we present a geometric analysis of gaps in such disk sets. This analysis is the basis for maximal and adaptive sampling in Euclidean space and on manifolds. Second, we propose efficient algorithms and data structures to detect gaps and update gaps when disks are inserted, deleted, moved, or when their radii are changed.We build on the concepts of regular triangulations and the power diagram. Third, we show how our analysis contributes to the state-of-the-art in surface remeshing. © 2013 ACM.
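    For contrast with the gap-driven maximal sampling studied in the article, the classic dart-throwing baseline (constant radius, no gap detection, hence generally not maximal) can be sketched as:

```python
import math
import random

def dart_throwing(width, height, radius, attempts, seed=0):
    """Naive dart-throwing Poisson-disk sampling: accept a candidate only if
    it lies at least `radius` from every accepted point. Unlike the gap-based
    method in the article, this gives no maximality guarantee and costs O(n)
    per candidate without a spatial acceleration structure."""
    rng = random.Random(seed)
    pts = []
    for _ in range(attempts):
        p = (rng.uniform(0, width), rng.uniform(0, height))
        if all(math.dist(p, q) >= radius for q in pts):
            pts.append(p)
    return pts

pts = dart_throwing(1.0, 1.0, 0.1, 3000)
```

    The article's gap analysis addresses exactly the leftover uncovered regions that dart throwing fills only with vanishing probability as the domain saturates.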

  15. Determination of Acyclovir in Human Plasma Samples by HPLC Method with UV Detection: Application to Single-Dose Pharmacokinetic Study

    Directory of Open Access Journals (Sweden)

    Dragica Zendelovska


    CONCLUSION: Good precision, accuracy, simplicity, sensitivity and shorter time of analysis of the method makes it particularly useful for processing of multiple samples in a limited period of time for pharmacokinetic study of acyclovir.

  16. Sampling Methods for Wallenius' and Fisher's Noncentral Hypergeometric Distributions

    DEFF Research Database (Denmark)

    Fog, Agner


    Several methods for generating variates with univariate and multivariate Wallenius' and Fisher's noncentral hypergeometric distributions are developed. Methods for the univariate distributions include: simulation of urn experiments, inversion by binary search, inversion by chop-down search from t...
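    The urn-experiment simulation mentioned first can be sketched for the univariate Wallenius distribution (a naive sketch; the methods developed in the work are far more efficient in difficult parameter regions):

```python
import random

def wallenius_urn(m1, m2, n, omega, rng):
    """One Wallenius' noncentral hypergeometric variate by direct urn
    simulation: n sequential draws without replacement, where each of the
    m1 'red' balls has weight omega and each of the m2 'white' balls weight 1.
    Returns the number of red balls drawn."""
    x = y = 0                            # red, white drawn so far
    for _ in range(n):
        w_red = (m1 - x) * omega
        w_white = m2 - y
        if rng.random() * (w_red + w_white) < w_red:
            x += 1
        else:
            y += 1
    return x

rng = random.Random(42)
central = [wallenius_urn(10, 10, 10, 1.0, rng) for _ in range(4000)]  # omega=1
biased = [wallenius_urn(10, 10, 10, 5.0, rng) for _ in range(4000)]   # omega=5
```

    With omega = 1 the urn reduces to the central hypergeometric distribution (mean n·m1/(m1+m2)), which provides a quick check on the simulation.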

  17. Reliability of a method of sampling stream invertebrates

    CSIR Research Space (South Africa)

    Chutter, FM


    Full Text Available In field ecological studies inferences must often be drawn from dissimilarities in numbers and species of organisms found in biological samples collected at different times and under various conditions....

  18. Summary Report for Evaluation of Compost Sample Drying Methods

    National Research Council Canada - National Science Library

    Frye, Russell


    .... Previous work in Support of these efforts developed a compost sample preparation scheme, consisting of air drying followed by milling, to reduce analytical variability in the heterogeneous compost matrix...

  19. A Review of Biological Agent Sampling Methods and ... (United States)

    Report: This study was conducted to evaluate current sampling and analytical capabilities, from a time and resource perspective, for a large-scale biological contamination incident. The analysis will be useful for strategically directing future research investment.

  20. Sampling method and location affect recovery of coliforms and Escherichia coli from broiler carcasses. (United States)

    Smith, D P


    Two experiments were conducted, the first to determine whether numbers of recovered bacteria differed due to sampling method used or due to location on carcass sampled (breast or leg quarters) and the second to determine if numbers of bacteria differed between the front (ventral) and back (dorsal) side of the carcass. In both experiments, eviscerated broiler carcasses were obtained from a commercial processing plant just before the final inside-outside bird washer. In experiment 1, carcasses (3 in each of 4 replicate trials) were separated into leg quarters and breast quarters (n = 48) and either rinsed or ground and stomached for microbiological sampling. In experiment 2, for 3 replicate trials of 4 carcasses each, necks, wings, and legs were manually removed; the remaining trunks were cut through the sides to produce front (ventral) and back (dorsal) halves (n = 24); and then rinsed. For both experiments, coliforms and Escherichia coli were enumerated. In experiment 1, significantly higher numbers of coliforms and E. coli were recovered by rinsing than by grinding from both breast and leg quarters. Leg quarters were found to have higher bacterial numbers than breasts from grind samples, but no quarter differences were found for rinse samples. In experiment 2, significantly higher numbers of coliforms and E. coli were recovered from the dorsal carcass half than from the ventral half. Bacterial counts of broiler carcasses are affected both by the sampling method used and by the carcass location sampled.

  1. Order–disorder–reorder process in thermally treated dolomite samples

    DEFF Research Database (Denmark)

    Zucchini, Azzurra; Comodi, Paola; Katerinopoulou, Anna


    A combined powder and single-crystal X-ray diffraction analysis of dolomite [CaMg(CO3)2] heated to 1,200 °C at 3 GPa was made to study the order–disorder–reorder process. The order/disorder transition is inferred to start below 1,100 °C, and complete disorder is attained at approximately 1,200 °C...

  2. Method of Biomaterials Processing into Energy Carrier

    Energy Technology Data Exchange (ETDEWEB)

    Nikoghosyan, S. [Surentechnology, St. Ancona, AP (Italy)


    A progressive method for processing biomass into an energy carrier in the form of briquettes (pellets) is described in the present work. It is based on blending the polymer Naralex into milled biomass at a ratio of 100:2. Naralex is a light-yellow powder with a particle size of 20 to 200 µm. The type of biomass does not affect the quality of the final product, the briquette. The mixture is activated on reaching a temperature of 180 °C, and the moisture content of the input biomass should not exceed 20%. A machine has been developed in accordance with the technology; it incorporates several design solutions for productivity, is equipped with automatic parameter control, and operates continuously, 24 hours a day. The final product has favourable physical-mechanical properties: solidity, water and moisture resistance, and heat emission up to 8000 kcal/h, and it burns without smoke or smell. The product is ecologically clean and certified according to all parameters provided in the standards. The finished briquette is durable and does not deteriorate even after long storage under conditions of high humidity. It also has a low cost.

  3. Filter-Aided Sample Preparation: The Versatile and Efficient Method for Proteomic Analysis. (United States)

    Wiśniewski, J R


    Filter-aided sample preparation (FASP) is a versatile and efficient way of processing protein extracts for bottom-up proteomic analysis. The method repurposes centrifugal ultrafiltration concentrators for removal of detergents, protein cleavage, and isolation of pure peptide fractions. FASP can be used for protein cleavage with different proteinases, either with single enzymes or in a mode of successive multienzyme digestion (MED)-FASP. The FASP methods are useful for processing samples ranging in size from submicrogram to several-milligram amounts of total protein. They also allow peptide fractionation, and isolation and quantitation of total RNA and DNA content. This chapter describes principles, limitations, and applications of FASP. Additionally, detailed FASP and MED-FASP protocols are provided. © 2017 Elsevier Inc. All rights reserved.

  4. Survey: interpolation methods for whole slide image processing. (United States)

    Roszkowiak, L; Korzynska, A; Zak, J; Pijanowska, D; Swiderska-Chadaj, Z; Markiewicz, T


    Evaluating whole slide images of histological and cytological samples is used in pathology for diagnostics, grading and prognosis. It is often necessary to rescale whole slide images of a very large size. Image resizing is one of the most common applications of interpolation. We collect the advantages and drawbacks of nine interpolation methods, and as a result of our analysis, we try to select one interpolation method as the preferred solution. To compare the performance of interpolation methods, test images were scaled and then rescaled to the original size using the same algorithm. The modified image was compared to the original image in various aspects. The time needed for calculations and results of quantification performance on modified images were also compared. For evaluation purposes, we used four general test images and 12 specialized biological immunohistochemically stained tissue sample images. The purpose of this survey is to determine which method of interpolation is the best to resize whole slide images, so they can be further processed using quantification methods. As a result, the interpolation method has to be selected depending on the task involving whole slide images. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
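The evaluation protocol described (scale down, rescale to the original size, compare with the original) can be sketched for the simplest of the nine methods, nearest-neighbour interpolation; the test image below is synthetic, not one of the survey's images:

```python
def resize_nearest(img, new_w, new_h):
    """Nearest-neighbour interpolation on a 2-D list of pixel values."""
    h, w = len(img), len(img[0])
    return [[img[y * h // new_h][x * w // new_w] for x in range(new_w)]
            for y in range(new_h)]

def mse(a, b):
    """Mean squared error between two equally sized images."""
    n = len(a) * len(a[0])
    return sum((pa - pb) ** 2
               for ra, rb in zip(a, b)
               for pa, pb in zip(ra, rb)) / n

# Synthetic 8x8 test image; downscale to 4x4, then rescale back to 8x8.
img = [[(3 * x + 5 * y) % 17 for x in range(8)] for y in range(8)]
round_trip = resize_nearest(resize_nearest(img, 4, 4), 8, 8)
err = mse(img, round_trip)
```

The same round-trip comparison applies to bilinear, bicubic and the other methods; only `resize_nearest` would be swapped out.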

  5. Demonstration Report for Visual Sample Plan (VSP) Verification Sampling Methods at the Navy/DRI Site (United States)


    Oasis montaj mapping software (version 7.2.1) and TtEC-developed software specifically produced to integrate and assess digital geophysical data...corrections. These processed data were output to Geosoft Oasis montaj mapping software for further processing (e.g., leveling), QC analysis, and gridding, and...coordinate system. All data processing parameters were stored in digital files (*.chk) and in the Oasis montaj log file (*.log). All DGM data met

  6. Method for preconcentrating a sample for subsequent analysis (United States)

    Zaromb, Solomon


    A system for analysis of trace concentration of contaminants in air includes a portable liquid chromatograph and a preconcentrator for the contaminants to be analyzed. The preconcentrator includes a sample bag having an inlet valve and an outlet valve for collecting an air sample. When the sample is collected the sample bag is connected in series with a sorbing apparatus in a recirculation loop. The sorbing apparatus has an inner gas-permeable container containing a sorbent material and an outer gas-impermeable container. The sample is circulated through the outer container and around the inner container for trapping and preconcentrating the contaminants in the sorbent material. The sorbent material may be a liquid having the same composition as the mobile phase of the chromatograph for direct injection thereinto. Alternatively, the sorbent material may be a porous, solid body, to which mobile phase liquid is added after preconcentration of the contaminants for dissolving the contaminants, the liquid solution then being withdrawn for injection into the chromatograph.

  7. Improvements in Sample Selection Methods for Image Classification

    Directory of Open Access Journals (Sweden)

    Thales Sehn Körting


    Full Text Available Traditional image classification algorithms are mainly divided into unsupervised and supervised paradigms. In the first paradigm, algorithms are designed to automatically estimate the classes’ distributions in the feature space. The second paradigm depends on the knowledge of a domain expert to identify representative examples from the image to be used for estimating the classification model. Recent improvements in human-computer interaction (HCI enable the construction of more intuitive graphic user interfaces (GUIs to help users obtain desired results. In remote sensing image classification, GUIs still need advancements. In this work, we describe our efforts to develop an improved GUI for selecting the representative samples needed to estimate the classification model. The idea is to identify changes in the common strategies for sample selection to create a user-driven sample selection, which focuses on different views of each sample, and to help domain experts identify explicit classification rules, which is a well-established technique in geographic object-based image analysis (GEOBIA. We also propose the use of the well-known nearest neighbor algorithm to identify similar samples and accelerate the classification.
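The nearest-neighbour suggestion step mentioned at the end can be sketched as follows; the two-dimensional features and class names are invented for illustration:

```python
import math

def nearest_label(sample, labeled):
    """Propagate the label of the closest already-labelled sample
    (1-nearest-neighbour in feature space), the idea used to suggest
    similar samples to the analyst and accelerate classification."""
    features, label = min(labeled, key=lambda fl: math.dist(sample, fl[0]))
    return label

# Hypothetical labelled samples in a 2-D feature space.
labeled = [((0.1, 0.2), "water"), ((0.9, 0.8), "vegetation")]
suggestion = nearest_label((0.15, 0.25), labeled)  # "water"
```

In a real GEOBIA workflow the features would be segment attributes (spectral means, texture, shape) rather than raw coordinates.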

  8. X-Ray Computed Tomography: The First Step in Mars Sample Return Processing (United States)

    Welzenbach, L. C.; Fries, M. D.; Grady, M. M.; Greenwood, R. C.; McCubbin, F. M.; Zeigler, R. A.; Smith, C. L.; Steele, A.


    The Mars 2020 rover mission will collect and cache samples from the martian surface for possible retrieval and subsequent return to Earth. If the samples are returned, that mission would likely present an opportunity to analyze returned Mars samples within a geologic context on Mars. In addition, it may provide definitive information about the existence of past or present life on Mars. Mars sample return presents unique challenges for the collection, containment, transport, curation and processing of samples [1]. Foremost in the processing of returned samples are the closely paired considerations of life detection and Planetary Protection. In order to achieve Mars Sample Return (MSR) science goals, reliable analyses will depend on overcoming some challenging signal/noise-related issues where sparse martian organic compounds must be reliably analyzed against the contamination background. While reliable analyses will depend on initial clean acquisition and robust documentation of all aspects of developing and managing the cache [2], there needs to be a reliable sample handling and analysis procedure that accounts for a variety of materials which may or may not contain evidence of past or present martian life. A recent report [3] suggests that a defined set of measurements should be made to effectively inform both science and Planetary Protection, when applied in the context of the two competing null hypotheses: 1) that there is no detectable life in the samples; or 2) that there is martian life in the samples. The defined measurements would include a phased approach that would be accepted by the community to preserve the bulk of the material, but provide unambiguous science data that can be used and interpreted by various disciplines. Foremost is the concern that the initial steps would ensure the pristine nature of the samples. Preliminary, non-invasive techniques such as computed X-ray tomography (XCT) have been suggested as the first method to interrogate and

  9. Methods of sampling airborne fungi in working environments of waste treatment facilities


    Kristýna Černá; Zdeňka Wittlingerová; Magdaléna Zimová; Zdeněk Janovský


    Objectives: The objective of the present study was to evaluate and compare the efficiency of a filter-based sampling method and a high-volume sampling method for sampling airborne culturable fungi present in waste sorting facilities. Material and Methods: The membrane filter method was compared with the surface air system method. The selected sampling methods were modified and tested in 2 plastic waste sorting facilities. Results: The total number of colony-forming units (CFU)/m3 of airborne fungi w...

  10. In Situ Visualization of the Phase Behavior of Oil Samples Under Refinery Process Conditions. (United States)

    Laborde-Boutet, Cedric; McCaffrey, William C


    To help address production issues in refineries caused by the fouling of process units and lines, we have developed a setup as well as a method to visualize the behavior of petroleum samples under process conditions. The experimental setup relies on a custom-built micro-reactor fitted with a sapphire window at the bottom, which is placed over the objective of an inverted microscope equipped with a cross-polarizer module. Using reflection microscopy enables the visualization of opaque samples, such as petroleum vacuum residues, or asphaltenes. The combination of the sapphire window from the micro-reactor with the cross-polarizer module of the microscope on the light path allows high-contrast imaging of isotropic and anisotropic media. While observations are carried out, the micro-reactor can be heated to the temperature range of cracking reactions (up to 450 °C), can be subjected to H2 pressure relevant to hydroconversion reactions (up to 16 MPa), and can stir the sample by magnetic coupling. Observations are typically carried out by taking snapshots of the sample under cross-polarized light at regular time intervals. Image analyses may not only provide information on the temperature, pressure, and reactive conditions yielding phase separation, but may also give an estimate of the evolution of the chemical (absorption/reflection spectra) and physical (refractive index) properties of the sample before the onset of phase separation.

  11. Statistical Methods and Tools for Hanford Staged Feed Tank Sampling

    Energy Technology Data Exchange (ETDEWEB)

    Fountain, Matthew S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brigantic, Robert T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Peterson, Reid A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).


    Frisch, H. P.


    SAMSAN was developed to aid the control system analyst by providing a self-consistent set of computer algorithms that support large order control system design and evaluation studies, with an emphasis placed on sampled system analysis. Control system analysts have access to a vast array of published algorithms to solve an equally large spectrum of controls related computational problems. The analyst usually spends considerable time and effort bringing these published algorithms to an integrated operational status and often finds them less general than desired. SAMSAN reduces the burden on the analyst by providing a set of algorithms that have been well tested and documented, and that can be readily integrated for solving control system problems. Algorithm selection for SAMSAN has been biased toward numerical accuracy for large order systems, with computational speed and portability being considered important but not paramount. In addition to containing relevant subroutines from EISPACK for eigen-analysis and from LINPACK for the solution of linear systems and related problems, SAMSAN contains the following not so generally available capabilities: 1) Reduction of a real non-symmetric matrix to block diagonal form via a real similarity transformation matrix which is well conditioned with respect to inversion, 2) Solution of the generalized eigenvalue problem with balancing and grading, 3) Computation of all zeros of the determinant of a matrix of polynomials, 4) Matrix exponentiation and the evaluation of integrals involving the matrix exponential, with option to first block diagonalize, 5) Root locus and frequency response for single variable transfer functions in the S, Z, and W domains, 6) Several methods of computing zeros for linear systems, and 7) The ability to generate documentation "on demand". All matrix operations in the SAMSAN algorithms assume non-symmetric matrices with real double precision elements. There is no fixed size limit on any matrix in any
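The matrix-exponentiation capability (item 4) can be illustrated with a toy Taylor-series version; this is only a didactic sketch, not SAMSAN's algorithm, and production codes prefer scaling-and-squaring with Padé approximants for numerical robustness:

```python
def mat_mul(a, b):
    """Multiply two square matrices stored as nested lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(a, terms=25):
    """Truncated Taylor series exp(A) = I + A + A^2/2! + ..."""
    n = len(a)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # I
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = [[t / k for t in row] for row in mat_mul(term, a)]
        result = [[r + t for r, t in zip(rr, tr)]
                  for rr, tr in zip(result, term)]
    return result

# exp([[0, 1], [0, 0]]) = [[1, 1], [0, 1]] since the matrix is nilpotent.
e = mat_exp([[0.0, 1.0], [0.0, 0.0]])
```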

  13. Comparison of indoor air sampling and dust collection methods for fungal exposure assessment using quantitative PCR (United States)

    Evaluating fungal contamination indoors is complicated because of the many different sampling methods utilized. In this study, fungal contamination was evaluated using five sampling methods and four matrices for results. The five sampling methods were a 48 hour indoor air sample ...

  14. 7 CFR 28.46 - Method of submitting samples and types. (United States)


    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Method of submitting samples and types. 28.46 Section... Standards Act Sample Or Type Comparison § 28.46 Method of submitting samples and types. The method of submitting samples and types for comparison shall be the same as that prescribed in this subpart for...

  15. Using statistical methods of quality management in logistics processes

    Directory of Open Access Journals (Sweden)

    Tkachenko Alla


    Full Text Available The purpose of the paper is to study the application of statistical methods of logistics process quality management at a large industrial enterprise and testing the theoretical studies. The analysis of the publications shows that a significant number of works by both Ukrainian and foreign authors have been dedicated to the research of quality management, while statistical methods of quality management have only been thoroughly analyzed by a small number of researchers, since these methods are referred to as classical, that is, those that are considered well-known and do not require special attention of modern scholars. In the authors’ opinion, the logistics process is a process of transformation and movement of material and accompanying flows by ensuring management freedom under the conditions of sequential interdependencies; standardization; synchronization; sharing information, and consistency of incentives, using innovative methods and models. In our study, we have shown that the management of logistics processes should use such statistical methods of quality management as descriptive statistics, experiment planning, hypotheses testing, measurement analysis, process opportunities analysis, regression analysis, reliability analysis, sampling, modeling, maps of statistical process control, specification of statistical tolerance, and time series analysis. The proposed statistical methods of logistics process quality management have been tested at the large industrial enterprise JSC "Dniepropetrovsk Aggregate Plant", which specializes in manufacturing hydraulic control valves. The findings suggest that the main purpose in the sphere of logistics process quality is the continuous improvement of the mining equipment production quality through the use of innovative processes, advanced management systems and information technology. This will enable the enterprise to meet the requirements and expectations of their customers.
It has been proved that the
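One of the listed tools, the statistical process control chart, can be sketched as follows; this simplified individuals chart uses the sample standard deviation instead of the conventional moving-range estimate of sigma, and the measurements are hypothetical:

```python
import statistics

def control_limits(values):
    """Centre line and +/-3-sigma limits for a Shewhart-style
    individuals chart (simplified: sample SD instead of the
    moving-range estimate of process sigma)."""
    centre = statistics.fmean(values)
    sigma = statistics.stdev(values)
    return centre - 3 * sigma, centre, centre + 3 * sigma

# Hypothetical valve-dimension measurements from successive units.
measurements = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3]
lcl, cl, ucl = control_limits(measurements)
alarms = [m for m in measurements if not lcl <= m <= ucl]
```

Points outside the limits (`alarms`) signal special-cause variation and would trigger investigation of the process.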

  16. Three sampling methods for visibility measures of landscape perception

    NARCIS (Netherlands)

    Weitkamp, S.G.; Bregt, A.K.; Lammeren, van R.J.A.; Berg, van den A.E.


    The character of a landscape can be seen as the outcome of people's perception of their physical environment, which is important for spatial planning and decision making. Three modes of landscape perception are proposed: view from a viewpoint, view from a road, and view of an area. Three sampling

  17. Microbial diversity in fecal samples depends on DNA extraction method

    DEFF Research Database (Denmark)

    Mirsepasi, Hengameh; Persson, Søren; Struve, Carsten


    BACKGROUND: There are challenges, when extracting bacterial DNA from specimens for molecular diagnostics, since fecal samples also contain DNA from human cells and many different substances derived from food, cell residues and medication that can inhibit downstream PCR. The purpose of the study w...

  18. Modern methods of sample preparation for GC analysis

    NARCIS (Netherlands)

    de Koning, S.; Janssen, H.-G.; Brinkman, U.A.Th.


    Today, a wide variety of techniques is available for the preparation of (semi-) solid, liquid and gaseous samples, prior to their instrumental analysis by means of capillary gas chromatography (GC) or, increasingly, comprehensive two-dimensional GC (GC × GC). In the past two decades, a large number

  19. An adaptive household sampling method for rural African communities

    African Journals Online (AJOL)

    Utilizing Google Earth images and a Graphical Information System (GIS) map of Berekuso, sampling units were defined as 15-degree wedge-shaped sectors ... of Berekuso, and produced generalizable results for median household size, median age of residents, sources of potable water and toilet types, among others.

  20. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system (United States)

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
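A minimal binomial calculation of the kind described, choosing the number of sampled units so that at least one tracer is captured with a target confidence, might look like this (the tracer rate is illustrative, not from the study):

```python
import math

def min_sample_units(p_tracer, confidence=0.95):
    """Smallest n such that P(at least one tracer in the sample)
    = 1 - (1 - p)^n reaches the target confidence, assuming each
    sampled unit independently contains a tracer with probability p."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_tracer))

# Hypothetical rate: one embedded tracer per 1000 grain units.
n = min_sample_units(0.001)
```

Multinomial extensions follow the same logic when several tracer codes (e.g. from different lots) must each be recovered.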

  1. A simple sample preparation method for measuring amoxicillin in human plasma by hollow fiber centrifugal ultrafiltration. (United States)

    Dong, Wei-Chong; Hou, Zi-Li; Jiang, Xin-Hui; Jiang, Ye


    A simple sample preparation method has been developed for the determination of amoxicillin in human plasma by hollow fiber centrifugal ultrafiltration (HF-CF-UF). A 400-μL plasma sample was placed directly into the HF-CF-UF device, which consisted of a slim glass tube and a U-shaped hollow fiber. After centrifugation at 1.25 × 10³ g for 10 min, the filtrate was withdrawn from the hollow fiber and 20 µL was injected directly into the high-performance liquid chromatograph (HPLC) for analysis. The calibration curve was linear over the range of 0.1-20 µg/mL (r = 0.9996) and the limit of detection was as low as 0.025 µg/mL. The average recovery and absolute recovery were 99.9% and 84.5%, respectively. Both the intra-day and inter-day precisions (relative standard deviation) were less than 3.1% at three concentrations (0.25, 2.5 and 10 µg/mL). The sample preparation process is greatly simplified: after a single centrifugal ultrafiltration step, the filtrate can be injected directly into the HPLC. The present method is simple, sensitive and accurate. It could be effective for the analysis of biological samples with high protein contents, especially for the biopharmaceutical analysis of drugs that traditionally require isolation techniques such as protein precipitation for sample preparation.

  2. Sample preparation method considerations for integrated transcriptomic and proteomic analysis of tumors. (United States)

    Bhat, Anupama Rajan; Gupta, Manoj Kumar; Krithivasan, Priya; Dhas, Kunal; Nair, Jayalakshmi; Reddy, Ram Bhupal; Sudheendra, Holalugunda Vittalamurthy; Chavan, Sandip; Vardhan, Harsha; Darsi, Sujatha; Balakrishnan, Lavanya; Katragadda, Shanmukh; Kekatpure, Vikram; Suresh, Amritha; Tata, Pramila; Panda, Binay; Kuriakose, Moni A; Sirdeshmukh, Ravi


    Sample processing protocols that enable compatible recovery of differentially expressed transcripts and proteins are necessary for integration of the multiomics data applied in the analysis of tumors. In this pilot study, we compared two different isolation methods for extracting RNA and protein from laryngopharyngeal tumor tissues and the corresponding adjacent normal sections. In Method 1, RNA and protein were isolated from a single tissue section sequentially and in Method 2, the extraction was carried out using two different sections and two independent and parallel protocols for RNA and protein. RNA and protein from both methods were subjected to RNA-seq and iTRAQ-based LC-MS/MS analysis, respectively. Analysis of data revealed that a higher number of differentially expressed transcripts and proteins were concordant in their regulation trends in Method 1 as compared to Method 2. Cross-method comparison of concordant entities revealed that RNA and protein extraction from the same tissue section (Method 1) recovered more concordant entities that are missed in the other extraction method (Method 2) indicating heterogeneity in distribution of these entities in different tissue sections. Method 1 could thus be the method of choice for integrated analysis of transcriptome and proteome data. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling (United States)

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah


    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  4. Recent results of the investigation of a micro-fluidic sampling chip and sampling system for hot cell aqueous processing streams

    Energy Technology Data Exchange (ETDEWEB)

    Tripp, J.; Smith, T.; Law, J. [Idaho National Laboratory: P.O. Box 1625, Idaho Falls, ID 83415 (United States)


    A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially sampling technologies were evaluated and micro-fluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. Different sampling volumes have been tested. It appears that the 10 μl volume has produced data that had much smaller relative standard deviations than the 2 μl volume. In addition, the production of a less expensive, mass produced sampling chip was investigated to avoid chip reuse thus increasing sampling reproducibility/accuracy. The micro-fluidic-based robotic sampling system's mechanical elements were tested to ensure analytical reproducibility and the optimum robotic handling of micro-fluidic sampling chips. (authors)
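The relative standard deviation used above to compare the 10 μl and 2 μl sampling volumes is straightforward to compute; the replicate readings below are hypothetical:

```python
import statistics

def rsd_percent(replicates):
    """Relative standard deviation in percent, the figure of merit
    used to compare sampling volumes."""
    return 100 * statistics.stdev(replicates) / statistics.fmean(replicates)

# Hypothetical replicate concentration readings for each chip volume.
vol_10ul = [50.2, 49.8, 50.1, 50.0]
vol_2ul = [48.0, 53.5, 46.9, 52.1]
```

With data like these, the larger volume would show the much smaller RSD, matching the qualitative finding reported for the 10 μl chips.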

  5. An Optimized Method for Quantification of Pathogenic Leptospira in Environmental Water Samples. (United States)

    Riediger, Irina N; Hoffmaster, Alex R; Casanovas-Massana, Arnau; Biondo, Alexander W; Ko, Albert I; Stoddard, Robyn A


    Leptospirosis is a zoonotic disease usually acquired by contact with water contaminated with the urine of infected animals. However, few molecular methods have been used to monitor or quantify pathogenic Leptospira in environmental water samples. Here we optimized a DNA extraction method for the quantification of leptospires using a previously described TaqMan-based qPCR method targeting lipL32, a gene unique to and highly conserved in pathogenic Leptospira. QIAamp DNA mini, MO BIO PowerWater DNA and PowerSoil DNA Isolation kits were evaluated to extract DNA from sewage, pond, river and ultrapure water samples spiked with leptospires. Performance of each kit varied with sample type. Sample processing methods were further evaluated and optimized using the PowerSoil DNA kit due to its performance on turbid water samples and its reproducibility. Centrifugation speeds, water volumes and the use of Escherichia coli as a carrier were compared to improve DNA recovery. All matrices showed strong linearity over concentrations from 10⁶ to 10⁰ leptospires/mL, with low limits of detection. The optimized method for quantifying Leptospira in environmental waters (river, pond and sewage) consists of concentrating 40 mL samples by centrifugation at 15,000 × g for 20 minutes at 4 °C, followed by DNA extraction with the PowerSoil DNA Isolation kit. Although the method described herein needs to be validated in environmental studies, it potentially provides the opportunity for effective, timely and sensitive assessment of environmental leptospiral burden.

  6. Advanced Methods of Biomedical Signal Processing

    CERN Document Server

    Cerutti, Sergio


    This book grew out of the IEEE-EMBS Summer Schools on Biomedical Signal Processing, which have been held annually since 2002 to provide the participants state-of-the-art knowledge on emerging areas in biomedical engineering. Prominent experts in the areas of biomedical signal processing, biomedical data treatment, medicine, signal processing, system biology, and applied physiology introduce novel techniques and algorithms as well as their clinical or physiological applications. The book provides an overview of a compelling group of advanced biomedical signal processing techniques, such as mult

  7. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method. (United States)

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils


    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association of antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period where the actual data collection was carried out. In the cross-sectional study samples were taken from 681 Danish pig farms, during five weeks from February to March 2015. 
The evaluation showed that the sampling...
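    The evaluation step (3) relies on simulated sampling from register data. A minimal sketch of that idea, using an invented register of herd sizes in place of the meat-inspection data and plain random draws in place of the study's convenience scheme:

```python
import random, statistics

random.seed(11)
# hypothetical register: herd sizes for 4000 farms (stand-in for meat-inspection data)
population = [int(random.lognormvariate(6, 0.8)) for _ in range(4000)]
pop_mean = statistics.fmean(population)

# repeat the planned sample of 681 farms many times and record the sample means
n_farms = 681
sample_means = sorted(statistics.fmean(random.sample(population, n_farms))
                      for _ in range(500))
lo, hi = sample_means[12], sample_means[487]     # central ~95% band
print(round(pop_mean), round(lo), round(hi))
```

    If the mean (or any covariate distribution) of the farms actually sampled falls outside such a band, the sample is flagged as unrepresentative with respect to that covariate.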

  8. Precise confidence intervals of regression-based reference limits: Method comparisons and sample size requirements. (United States)

    Shieh, Gwowen


    Covariate-dependent reference limits have been extensively applied in biology and medicine for determining the substantial magnitude and relative importance of quantitative measurements. Confidence interval and sample size procedures are available for studying regression-based reference limits. However, the existing popular methods employ different technical simplifications and are applicable only in certain limited situations. This paper describes exact confidence intervals of regression-based reference limits and compares the exact approach with the approximate methods under a wide range of model configurations. Using the ratio between the widths of confidence interval and reference interval as the relative precision index, optimal sample size procedures are presented for precise interval estimation under expected ratio and tolerance probability considerations. Simulation results show that the approximate interval methods using normal distribution have inaccurate confidence limits. The exact confidence intervals dominate the approximate procedures in one- and two-sided coverage performance. Unlike the current simplifications, the proposed sample size procedures integrate all key factors including covariate features in the optimization process and are suitable for various regression-based reference limit studies with potentially diverse configurations. The exact interval estimation has theoretical and practical advantages over the approximate methods. The corresponding sample size procedures and computing algorithms are also presented to facilitate the data analysis and research design of regression-based reference limits. Copyright © 2017 Elsevier Ltd. All rights reserved.
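    A sketch of the quantities involved may help. Below is the approximate normal-theory version of a regression-based upper reference limit, its confidence interval, and the width-ratio precision index; the data, the covariate value x0, and the simplified SE formula are illustrative assumptions, not the paper's exact procedure.

```python
import math, random, statistics

random.seed(1)
# hypothetical data: a measurement y that drifts with covariate x (e.g. age)
xs = [random.uniform(20, 70) for _ in range(200)]
ys = [2.0 + 0.05 * x + random.gauss(0, 0.5) for x in xs]

n = len(xs)
xbar, ybar = statistics.fmean(xs), statistics.fmean(ys)
sxx = sum((x - xbar) ** 2 for x in xs)
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
intercept = ybar - slope * xbar
s = math.sqrt(sum((y - intercept - slope * x) ** 2
                  for x, y in zip(xs, ys)) / (n - 2))   # residual SD

z = 1.959964                         # normal 97.5% quantile
x0 = 50.0                            # covariate value of interest
upper_ref = intercept + slope * x0 + z * s   # approximate upper reference limit

# approximate SE of the estimated limit: leverage term plus a Var(s) term
h = 1 / n + (x0 - xbar) ** 2 / sxx
se_limit = s * math.sqrt(h + z * z / (2 * (n - 2)))
ci = (upper_ref - z * se_limit, upper_ref + z * se_limit)

# relative precision index: CI width over reference-interval width (2*z*s)
precision = (ci[1] - ci[0]) / (2 * z * s)
print(round(upper_ref, 2), round(precision, 3))
```

    The exact intervals described in the paper replace these normal-theory simplifications with exact distributional results; the sketch only shows where sample size n, leverage h, and residual SD s enter the precision.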

  9. Designing waveforms for temporal encoding using a frequency sampling method

    DEFF Research Database (Denmark)

    Gran, Fredrik; Jensen, Jørgen Arendt


    In this paper a method for designing waveforms for temporal encoding in medical ultrasound imaging is described. The method is based on least squares optimization and is used to design nonlinear frequency modulated signals for synthetic transmit aperture imaging. By using the proposed design method......, the amplitude spectrum of the transmitted waveform can be optimized, such that most of the energy is transmitted where the transducer has large amplification. To test the design method, a waveform was designed for a BK8804 linear array transducer. The resulting nonlinear frequency modulated waveform...... waveform, on the other hand, was designed so that only frequencies where the transducer had a large amplification were excited. Hereby, unnecessary heating of the transducer could be avoided and the signal-to-noise ratio could be increased. The experimental ultrasound scanner RASMUS was used to evaluate...
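    The idea of placing transmitted energy where the transducer amplifies can be illustrated with a generic frequency sampling construction (not the paper's least-squares NLFM design): specify the desired amplitude on DFT bins and inverse-transform. The passband bins below are arbitrary stand-ins for a measured transducer response.

```python
import cmath, math

def waveform_from_spectrum(amps, n):
    """Real length-n waveform whose DFT magnitudes match `amps` on the low bins.

    Zero phase is assumed; a conjugate-symmetric real spectrum yields a
    real time-domain signal.
    """
    spec = [0j] * n
    for k, a in enumerate(amps):
        spec[k] = complex(a)
        if k:
            spec[n - k] = complex(a)      # mirror bin for conjugate symmetry
    return [sum(spec[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

# hypothetical desired band: energy only on bins 4-8 (transducer passband stand-in)
amps = [0, 0, 0, 0, 1, 1, 1, 1, 1]
sig = waveform_from_spectrum(amps, 64)
print(len(sig))
```

    A least-squares design such as the paper's additionally shapes the phase so the energy is spread in time; with zero phase the result is a short pulse rather than an encoded waveform.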

  10. Rapid method for plutonium-241 determination in soil samples


    Piekarz, M.; Komosa, A.


    A simple and rapid procedure for the determination of plutonium isotopes in the environment is presented. The procedure combines alpha spectrometry, solvent extraction and liquid scintillation measurements to ensure that both alpha- and beta-emitting isotopes are determined. Of five tested extractants, bis-(2-ethylhexyl) phosphoric acid was found to be the best choice. The procedure was applied to soil samples contaminated with Chernobyl fallout.

  11. Review of Processing and Analytical Methods for Francisella ... (United States)

    The etiological agent of tularemia, Francisella tularensis, is a resilient organism within the environment and can be acquired many ways (infectious aerosols and dust, contaminated food and water, infected carcasses, and arthropod bites). However, isolating F. tularensis from environmental samples can be challenging due to its nutritionally fastidious and slow-growing nature. In order to determine the current state of the science regarding available processing and analytical methods for detection and recovery of F. tularensis from water and soil matrices, a review of the literature was conducted. During the review, culture, immunoassays, and genomic identification were the most commonly found methods for F. tularensis detection within environmental samples. Other methods included combined culture and genomic analysis for rapid quantification of viable microorganisms and use of one assay to identify multiple pathogens from a single sample. Gaps in the literature that were identified during this review suggest that further work to integrate culture and genomic identification would advance our ability to detect and to assess the viability of Francisella spp. The optimization of DNA extraction, whole genome amplification with inhibition-resistant polymerases, and multiagent microarray detection would also advance biothreat detection.

  12. Standard-Setting Methods as Measurement Processes (United States)

    Nichols, Paul; Twing, Jon; Mueller, Canda D.; O'Malley, Kimberly


    Some writers in the measurement literature have been skeptical of the meaningfulness of achievement standards and described the standard-setting process as blatantly arbitrary. We argue that standard setting is more appropriately conceived of as a measurement process similar to student assessment. The construct being measured is the panelists'…

  13. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys. (United States)

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R


    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. Our aim was to guide the design of multiplier-method population size estimation studies using respondent-driven sampling surveys so as to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey.
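    The estimator and its delta-method variance, which drive these sample size considerations, can be sketched numerically; M, P-hat, n, and the design effect below are invented values, not figures from the Harare study.

```python
import math

# hypothetical inputs for the multiplier method
M = 1200        # unique objects distributed (the service/object count)
p_hat = 0.15    # proportion in the RDS survey reporting receipt (P)
n = 500         # RDS survey sample size
deff = 2.0      # assumed design effect of the RDS survey

N_hat = M / p_hat                          # population size estimate

# delta method: Var(M/P) ~= M^2 * Var(P) / P^4, with Var(P) inflated by deff
var_p = deff * p_hat * (1 - p_hat) / n
se_N = M * math.sqrt(var_p) / p_hat ** 2
ci = (N_hat - 1.96 * se_N, N_hat + 1.96 * se_N)
print(round(N_hat), round(ci[0]), round(ci[1]))
```

    Re-running with a smaller p_hat shows the widening relative CI the abstract warns about, which is why distributing more unique objects (raising P) tightens the estimate.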

  14. Evaluation of sampling systems in iron ore concentrating and pelletizing processes - quantification of total sampling error (TSE) vs. apparent process variation

    DEFF Research Database (Denmark)

    Engström, Karin; Esbensen, Kim Harry


    Process sampling is involved in grade control in all parts of the production value chain in mineral processing. Reliable sampling and assaying is essential to ensure final product quality, but the need for representative sampling is not always taken into account. By continuous control...... analyses will form a basis for suggestions of possible improvements. The results show that variographic analysis is a powerful tool to evaluate both process variations and the variability of the sampling systems employed. The extensive access to time series data allow variographic characterization (quality...... control) of all critical measurement systems and locations. At the same time, periodicity and small changes in process variation can be detected and counteracted early, minimizing the risk for producing products out of specification....
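    The variographic characterization referred to above can be sketched for a one-dimensional process data series; the series here is synthetic, not plant data, and the lag unit is arbitrary.

```python
import random

random.seed(9)

def variogram(series, max_lag):
    """Experimental variogram V(j): half the mean squared increment at lag j."""
    n = len(series)
    return [sum((series[i + j] - series[i]) ** 2 for i in range(n - j))
            / (2 * (n - j)) for j in range(1, max_lag + 1)]

# hypothetical grade measurements: slow trend plus measurement noise
series = [0.002 * i + random.gauss(0, 0.1) for i in range(500)]
v = variogram(series, 20)
# V(1) approximates the nugget: combined sampling + analysis variability
print(round(v[0], 4), round(v[-1], 4))
```

    A nugget close to the sill signals that measurement-system error, rather than true process variation, dominates the apparent variability, which is the distinction such an evaluation quantifies.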

  15. Hospital Registration Process Reengineering Using Simulation Method

    Directory of Open Access Journals (Sweden)

    Qiang Su


    With increasing competition, many healthcare organizations have undergone tremendous reform in the last decade, aiming to increase efficiency, decrease waste, and reshape the way that care is delivered. This study focuses on the operational efficiency improvement of a hospital’s registration process. The operational efficiency related factors, including the service process, queue strategy, and queue parameters, were explored systematically and illustrated with a case study. Guided by the principle of business process reengineering (BPR), a simulation approach was employed for process redesign and performance optimization. As a result, the queue strategy is changed from multiple queues and multiple servers to a single queue and multiple servers with a prepare queue. Furthermore, through a series of simulation experiments, the length of the prepare queue and the corresponding registration process efficiency were quantitatively evaluated and optimized.
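    The core queue-strategy comparison can be sketched with a small discrete-event simulation; the arrival and service rates are invented, and the prepare queue itself is not modelled.

```python
import random, statistics

random.seed(7)

def mean_wait(arrivals, services, c, shared=True):
    """FCFS mean wait for c servers fed by one shared queue,
    or by c separate queues joined at random (no jockeying)."""
    free = [0.0] * c          # time at which each server becomes free
    waits = []
    for t, s in zip(arrivals, services):
        k = min(range(c), key=free.__getitem__) if shared else random.randrange(c)
        start = max(t, free[k])
        waits.append(start - t)
        free[k] = start + s
    return statistics.fmean(waits)

# hypothetical registration desk: 4 windows at roughly 90% utilisation
n, c = 20000, 4
t, arrivals = 0.0, []
for _ in range(n):
    t += random.expovariate(1.8)                  # ~1.8 patients per minute
    arrivals.append(t)
services = [random.expovariate(0.5) for _ in range(n)]   # mean 2 min service

single = mean_wait(arrivals, services, c, shared=True)
multi = mean_wait(arrivals, services, c, shared=False)
print(round(single, 2), round(multi, 2))
```

    With the same arrivals and services, the pooled single queue gives a markedly lower mean wait at high utilisation, which is the direction of the redesign described above.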

  16. Post-Decontamination Vapor Sampling and Analytical Test Methods (United States)


    Keywords: ...agent; CWA; simulants; nontraditional agent; NTA; toxic industrial chemical; TIC; toxic industrial material; TIM; coupon; contamination; decontamination process. ...Chemical contaminants can include chemical warfare agents (CWAs) or their simulants, nontraditional agents (NTAs), toxic industrial... ...include glove ports. The chamber may have certified fume hoods for the containment of toxic chemicals. All exhaust air must be filtered to...

  17. Liquid Chromatographic Method for Determination of Nisoldipine from Pharmaceutical Samples

    Directory of Open Access Journals (Sweden)

    Amit Gupta


    A simple and specific high performance thin layer chromatographic method was developed and validated for the determination of nisoldipine from a tablet dosage form. Determinations were carried out at 320 nm after extraction of the drug in methanol. The method uses aluminum plates pre-coated with silica gel 60F-254 as the stationary phase and cyclohexane-ethyl acetate-toluene (3:3:4, v/v/v) as the mobile phase. Linearity was established over a range of 400-2400 ng per zone. Both peak area ratio and peak height ratio showed an acceptable correlation coefficient, i.e. more than 0.99; however, we used peak area for validation purposes. Intra-day and inter-day precision was determined and found to be less than 6.0% RSD.

  18. Entrepreneurship education: Process, method, or both?

    Directory of Open Access Journals (Sweden)

    Dianne H.B. Welsh


    Transformative changes are happening in Higher Education Institutions worldwide in entrepreneurship education. These changes are conceptual as well as technological, due to the upheaval in the global, social, political, and technological environment. We argue that the process theory of Alfred North Whitehead best explains why entrepreneurship education does not always have the same results on our students in the classroom and after they graduate. In the education of entrepreneurs, we hold that change is the cornerstone of reality: our entrepreneurship students are in the process of becoming something they previously were not. Implications and comparisons of the process theory applied to entrepreneurship education are discussed.

  19. A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Berres, Anne Sabine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Adhinarayanan, Vignesh [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Turton, Terece [Univ. of Texas, Austin, TX (United States); Feng, Wu [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Rogers, David Honegger [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    Large simulation data requires a lot of time and computational resources to compute, store, analyze, visualize, and run user studies on. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance the energy consumption and the cognitive value of visualizations of the resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline using regular sampling as an example. The reasons for this choice are two-fold: a simple example reduces unnecessary complexity, as we know what to expect from the results, and it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test we ran, and we conducted user studies on Amazon Mechanical Turk (AMT) for a range of different results we produced through sampling.
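    The sampling stage of such a pipeline can be sketched on a structured stand-in field (unstructured grids need an extra interpolation step): keep every s-th point, reconstruct by nearest neighbour, and report the reduction factor and error.

```python
import math

def regular_sample(field, s):
    """Keep every s-th value in each dimension of a 2-D field."""
    return [row[::s] for row in field[::s]]

def upsample_nearest(small, s, shape):
    """Nearest-neighbour reconstruction back to the original shape."""
    rows, cols = shape
    return [[small[min(i // s, len(small) - 1)][min(j // s, len(small[0]) - 1)]
             for j in range(cols)] for i in range(rows)]

# hypothetical smooth scalar field standing in for simulation output
R = C = 64
field = [[math.sin(i / 8) * math.cos(j / 8) for j in range(C)] for i in range(R)]

s = 4
small = regular_sample(field, s)
recon = upsample_nearest(small, s, (R, C))
rmse = math.sqrt(sum((field[i][j] - recon[i][j]) ** 2
                     for i in range(R) for j in range(C)) / (R * C))
ratio = (R * C) / (len(small) * len(small[0]))   # data reduction factor
print(ratio, round(rmse, 3))
```

    In a study like the one above, the reduction factor maps to storage and energy savings, while the reconstruction error is what the user studies weigh against those savings.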

  20. Multilayer hybrid microfluidics: a digital-to-channel interface for sample processing and separations. (United States)

    Watson, Michael W L; Jebrail, Mais J; Wheeler, Aaron R


    Microchannels can separate analytes faster with higher resolution, higher efficiency and with lower reagent consumption than typical column techniques. Unfortunately, an impediment in the path toward fully integrated microchannel-based laboratories-on-a-chip is the integration of preseparation sample processing. In contrast, the alternative format of digital microfluidics (DMF), in which discrete droplets are manipulated on an array of electrodes, is well-suited for carrying out sequential chemical reactions such as those commonly employed in proteomic sample preparation. We recently reported a new paradigm of "hybrid microfluidics," integrating DMF with microchannels for in-line sample processing and separations. Here, we build on our initial efforts, introducing a second-generation hybrid microfluidic device architecture. In the new multilayer design, droplets are manipulated by DMF in the two-plate format, an improvement that facilitates dispensing samples from reservoirs, as well as droplet splitting and storage for subsequent analysis. To demonstrate the capabilities of the new method, we implemented an on-chip serial dilution experiment, as well as multistep enzymatic digestion. Given the myriad applications requiring preprocessing and chemical separations, the hybrid digital-channel format has the potential to become a powerful new tool for micro total analysis systems.
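    The on-chip serial dilution mentioned above reduces, arithmetically, to repeated equal-ratio dilution: merging a sample droplet with an equal diluent droplet and splitting halves the concentration. A trivial sketch (starting concentration invented):

```python
def serial_dilution(c0, factor, steps):
    """Concentrations after `steps` successive 1:factor dilutions."""
    concs = [c0]
    for _ in range(steps):
        concs.append(concs[-1] / factor)
    return concs

# merging with an equal-volume diluent droplet halves the concentration each step
print(serial_dilution(100.0, 2, 5))
```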

  1. Method for fractional solid-waste sampling and chemical analysis

    DEFF Research Database (Denmark)

    Riber, Christian; Rodushkin, I.; Spliid, Henrik


    Chemical characterization of solid waste is a demanding task due to the heterogeneity of the waste. This article describes how 45 material fractions hand-sorted from Danish household waste were subsampled and prepared for chemical analysis of 61 substances. All material fractions were subject...... of variance (20-85% of the overall variation). Only by increasing the sample size significantly can this variance be reduced. The accuracy and short-term reproducibility of the chemical characterization were good, as determined by the analysis of several relevant certified reference materials. Typically, six...

  2. Method for spiking soil samples with organic compounds

    DEFF Research Database (Denmark)

    Brinch, Ulla C; Ekelund, Flemming; Jacobsen, Carsten S


    We examined the harmful side effects on indigenous soil microorganisms of two organic solvents, acetone and dichloromethane, that are normally used for spiking of soil with polycyclic aromatic hydrocarbons for experimental purposes. The solvents were applied in two contamination protocols to either...... higher than in control soil, probably due mainly to release of predation from indigenous protozoa. In order to minimize solvent effects on indigenous soil microorganisms when spiking native soil samples with compounds having a low water solubility, we propose a common protocol in which the contaminant...

  3. Inverse RNA folding solution based on multi-objective genetic algorithm and Gibbs sampling method. (United States)

    Ganjtabesh, M; Zare-Mirakabad, F; Nowzari-Dalini, A


    In living systems, RNAs play important biological functions. The functional form of an RNA frequently requires a specific tertiary structure. The scaffold for this structure is provided by secondary structural elements, which are formed by hydrogen bonds within the molecule. Here, we concentrate on the inverse RNA folding problem. In this problem, an RNA secondary structure is given as a target structure and the goal is to design an RNA sequence whose structure is the same as (or very similar to) the given target structure. Different heuristic search methods have been proposed for this problem. One common feature among these methods is the use of a folding algorithm to evaluate the accuracy of the designed RNA sequence during the generation process. The well-known folding algorithms take O(n³) time, where n is the length of the RNA sequence. In this paper, we introduce a new algorithm called GGI-Fold, based on a multi-objective genetic algorithm and the Gibbs sampling method, for the inverse RNA folding problem. Our algorithm generates a sequence whose structure is the same as, or very similar to, the given target structure. The key feature of our method is that it never uses any folding algorithm to improve the quality of the generated sequences. We compare our algorithm with RNA-SSD for some biological test samples. In all test samples, our algorithm outperforms the RNA-SSD method by generating a sequence whose structure is more stable.
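    The generation phase has to start from sequences compatible with the target's base pairs. The sketch below enforces only Watson-Crick pairing on a dot-bracket target (no wobble pairs, and none of the genetic-algorithm or Gibbs sampling refinement that GGI-Fold performs):

```python
import random

PAIRS = [("G", "C"), ("C", "G"), ("A", "U"), ("U", "A")]
BASES = "ACGU"

def compatible_sequence(structure, rng=random):
    """Random sequence whose paired positions hold Watson-Crick pairs."""
    seq = [None] * len(structure)
    stack = []
    for i, ch in enumerate(structure):
        if ch == "(":
            stack.append(i)
        elif ch == ")":
            j = stack.pop()
            seq[j], seq[i] = rng.choice(PAIRS)
        else:
            seq[i] = rng.choice(BASES)
    if stack:
        raise ValueError("unbalanced structure")
    return "".join(seq)

random.seed(3)
target = "((((...))))..((...))"
print(compatible_sequence(target))
```

    A full design method then scores and mutates such candidates; the abstract's point is that GGI-Fold does this without ever invoking an O(n³) folding algorithm.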

  4. Al NMR: a novel NMR data processing program optimized for sparse sampling

    Energy Technology Data Exchange (ETDEWEB)

    Gledhill, John M.; Wand, A. Joshua, E-mail: [University of Pennsylvania, Graduate Group in Biochemistry and Molecular Biophysics, Perelman School of Medicine (United States)


    Sparse sampling in biomolecular multidimensional NMR offers increased acquisition speed and resolution and, if appropriate conditions are met, an increase in sensitivity. Sparse sampling of indirectly detected time domains combined with the direct truly multidimensional Fourier transform has elicited particular attention because of the ability to generate a final spectrum amenable to traditional analysis techniques. A number of sparse sampling schemes have been described including radial sampling, random sampling, concentric sampling and variations thereof. A fundamental feature of these sampling schemes is that the resulting time domain data array is not amenable to traditional Fourier transform based processing and phasing correction techniques. In addition, radial sampling approaches offer a number of advantages and capabilities that are also not accessible using standard NMR processing techniques. These include sensitivity enhancement, sub-matrix processing and determination of minimal sets of sampling angles. Here we describe a new software package (Al NMR) that enables these capabilities in the context of a general NMR data processing environment.
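    Of the schemes listed, radial sampling is easy to sketch: record only the grid points of the two indirect dimensions that lie along a few chosen angles. The grid size and angle set below are arbitrary.

```python
import math

def radial_schedule(grid, angles_deg):
    """Grid points of a grid x grid plane lying along each sampling angle."""
    pts = set()
    for a in angles_deg:
        di, dj = math.cos(math.radians(a)), math.sin(math.radians(a))
        for r in range(grid):
            i, j = round(r * di), round(r * dj)
            if 0 <= i < grid and 0 <= j < grid:
                pts.add((i, j))
    return sorted(pts)

sched = radial_schedule(64, [0, 30, 60, 90])
coverage = len(sched) / (64 * 64)
print(len(sched), round(coverage, 3))
```

    The fraction of the full grid actually acquired is what yields the speed-up, and choosing the minimal set of angles for a given spectrum is one of the capabilities the abstract attributes to radial approaches.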

  5. Generalized reciprocal method applied in processing seismic ...

    African Journals Online (AJOL)

    A geophysical investigation was carried out at Shika, near Zaria, using seismic refraction method; with the aim of analyzing the data obtained using the generalized reciprocal method (GRM). The technique is for delineating undulating refractors at any depth from in-line seismic refraction data consisting of forward and ...

  6. Soil processing method journal article supporting data (United States)

    U.S. Environmental Protection Agency — This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps. Optimization of the...

  7. The Alaska Commercial Fisheries Water Quality Sampling Methods and Procedures Manual

    Energy Technology Data Exchange (ETDEWEB)

    Folley, G.; Pearson, L.; Crosby, C. [Alaska Dept. of Environmental Conservation, Soldotna, AK (United States); DeCola, E.; Robertson, T. [Nuka Research and Planning Group, Seldovia, AK (United States)


    A comprehensive water quality sampling program was conducted in response to the oil spill that occurred when the M/V Selendang Ayu ship ran aground near a major fishing port at Unalaska Island, Alaska in December 2004. In particular, the sampling program focused on the threat of spilled oil to the local commercial fisheries resources. Spill scientists were unable to confidently model the movement of oil away from the wreck because of limited oceanographic data. In order to determine which fish species were at risk of oil contamination, a real-time assessment of how and where the oil was moving was needed, because the wreck became a continual source of oil release for several weeks after the initial grounding. The newly developed methods and procedures used to detect whole oil during the sampling program will be presented in the Alaska Commercial Fisheries Water Quality Sampling Methods and Procedures Manual which is currently under development. The purpose of the manual is to provide instructions to spill managers while they try to determine where spilled oil has or has not been encountered. The manual will include a meaningful data set that can be analyzed in real time to assess oil movement and concentration. Sections on oil properties and processes will be included along with scientific water quality sampling methods for whole and dissolved phase oil to assess potential contamination of commercial fishery resources and gear in Alaska waters during an oil spill. The manual will present a general discussion of factors that should be considered when designing a sampling program after a spill. In order to implement Alaska's improved seafood safety measures, the spatial scope of spilled oil must be known. A water quality sampling program can provide state and federal fishery managers and food safety inspectors with important information as they identify at-risk fisheries. 11 refs., 7 figs.

  8. Can an inadequate cervical cytology sample in ThinPrep be converted to a satisfactory sample by processing it with a SurePath preparation?

    Directory of Open Access Journals (Sweden)

    Sveinung Wergeland Sorbye


    Background: The Norwegian Cervical Cancer Screening Program recommends screening every 3 years for women between 25 and 69 years of age. There is a large difference in the percentage of unsatisfactory samples between laboratories that use different brands of liquid-based cytology. We wished to examine whether inadequate ThinPrep samples could be made satisfactory by processing them with the SurePath protocol. Materials and Methods: A total of 187 inadequate ThinPrep specimens from the Department of Clinical Pathology at University Hospital of North Norway were sent to Akershus University Hospital for conversion to SurePath medium. Ninety-one (48.7%) were processed through the automated “gynecologic” application for cervix cytology samples, and 96 (51.3%) were processed with the “nongynecological” automatic program. Results: Out of 187 samples that had been unsatisfactory by ThinPrep, 93 (49.7%) were satisfactory after being converted to SurePath. The rate of satisfactory cytology was 36.6% and 62.5% for samples run through the “gynecology” program and “nongynecology” program, respectively. Of the 93 samples that became satisfactory after conversion from ThinPrep to SurePath, 80 (86.0%) were screened as normal, while 13 samples (14.0%) were given an abnormal diagnosis, which included 5 atypical squamous cells of undetermined significance, 5 low-grade squamous intraepithelial lesions, 2 atypical glandular cells not otherwise specified, and 1 atypical squamous cells cannot exclude high-grade squamous intraepithelial lesion. A total of 2.1% (4/187) of the women received a diagnosis of cervical intraepithelial neoplasia 2 or higher at later follow-up. Conclusions: Converting cytology samples from ThinPrep to SurePath processing can reduce the number of unsatisfactory samples. The samples should be run through the “nongynecology” program to ensure an adequate number of cells.

  9. Comparison of sample preparation methods for detection of Chlamydia pneumoniae in bronchoalveolar lavage fluid by PCR. (United States)

    Maass, M; Dalhoff, K


    Amplification inhibitors can lead to false-negative results for PCR. In order to evaluate the reliability of PCR for the detection of Chlamydia pneumoniae, the presence of PCR inhibitors in 75 bronchoalveolar lavage specimens was assessed after treatment by various sample preparation methods. Specimens were collected from patients with acute respiratory infections, including four cases of proven C. pneumoniae infection. Substances inhibitory to the amplification of chlamydial DNA continued to be present in 12% of the samples treated according to the commonly used single-step proteinase K digestion and in 31% of the samples processed by heat treatment. However, the complexing of DNA-contaminating proteins and polysaccharides from digested specimens to cetyltrimethylammonium bromide (CTAB) followed by DNA extraction efficiently removed inhibitors from all experimental samples and provided subsequent identification of all positive clinical samples by PCR. The CTAB method and proteinase K treatment had comparable detection limits of approximately 0.01 inclusion-forming units. CTAB-based DNA purification of respiratory specimens is recommended to increase the diagnostic sensitivity of PCR and confidence in negative results. PMID:7814512

  10. Preparation of Biological Samples Containing Metoprolol and Bisoprolol for Applying Methods for Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Corina Mahu Ştefania


    Arterial hypertension is a complex disease with many serious complications, representing a leading cause of mortality. Selective beta-blockers such as metoprolol and bisoprolol are frequently used in the management of hypertension. Numerous analytical methods have been developed for the determination of these substances in biological fluids, such as liquid chromatography coupled with mass spectrometry, gas chromatography coupled with mass spectrometry, and high performance liquid chromatography. Due to the complex composition of biological fluids, sample pre-treatment is required before quantitative determination in order to remove proteins and potential interferences. The methods most commonly used for processing biological samples containing metoprolol and bisoprolol were identified through a thorough literature search using the PubMed, ScienceDirect, and Wiley Journals databases. Articles published between 2005 and 2015 were reviewed. Protein precipitation, liquid-liquid extraction and solid phase extraction are the main techniques for the extraction of these drugs from plasma, serum, whole blood and urine samples. In addition, numerous other techniques have been developed for the preparation of biological samples, such as dispersive liquid-liquid microextraction, carrier-mediated liquid phase microextraction, hollow fiber-protected liquid phase microextraction, and on-line molecularly imprinted solid phase extraction. The analysis of metoprolol and bisoprolol in human plasma, urine and other biological fluids provides important information in clinical and toxicological trials, thus requiring the application of appropriate extraction techniques for the detection of these antihypertensive substances at nanogram and picogram levels.

  11. System and method for laser assisted sample transfer to solution for chemical analysis (United States)

    Van Berkel, Gary J; Kertesz, Vilmos


    A system and method for laser desorption of an analyte from a specimen and capturing of the analyte in a suspended solvent to form a testing solution are described. The method can include providing a specimen supported by a desorption region of a specimen stage and desorbing an analyte from a target site of the specimen with a laser beam centered at a radiation wavelength (λ). The desorption region is transparent to the radiation wavelength (λ) and the sampling probe and a laser source emitting the laser beam are on opposite sides of a primary surface of the specimen stage. The system can also be arranged where the laser source and the sampling probe are on the same side of a primary surface of the specimen stage. The testing solution can then be analyzed using an analytical instrument or undergo further processing.

  12. The determination of nitrite by a graphene quantum dot fluorescence quenching method without sample pretreatment. (United States)

    Jin, Li; Wang, Ying; Liu, Fangtong; Yu, Shihua; Gao, Yan; Zhang, Jianpo


    A method for quantitative analysis of nitrite was achieved based on fluorescence quenching of graphene quantum dots. To obtain reliable results, the effects of pH, temperature and reaction time on this fluorescence quenching system were studied. Under optimized conditions, the decrease in fluorescence intensity of the graphene quantum dots (F0/F) showed a good linear relationship with nitrite concentration over 0.007692-0.38406 mmol/L and 0.03623-0.13043 μmol/L; the limits of detection were 9.8 μmol/L and 5.4 nmol/L, respectively. Variable temperature experiments, UV absorption spectra and thermodynamic calculations were used to determine the quenching mechanism, and indicated that it was an exothermic, spontaneous dynamic quenching process. The method was used to analyse urine samples, showing that it can be applied to biological samples. Copyright © 2017 John Wiley & Sons, Ltd.
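    The linear F0/F-versus-concentration behaviour reported is the classic Stern-Volmer form, F0/F = 1 + KSV·c. A sketch of fitting it and deriving a 3σ/slope detection limit on synthetic data (all constants invented, not the paper's values):

```python
import random, statistics

random.seed(5)
F0, Ksv = 1000.0, 12.0          # hypothetical intensity and quenching constant
conc = [0.0, 0.05, 0.10, 0.15, 0.20, 0.25, 0.30]      # mmol/L
# synthetic intensities following F0/F = 1 + Ksv*c, with 0.5% noise
F = [F0 / (1 + Ksv * c) * (1 + random.gauss(0, 0.005)) for c in conc]

ratio = [F0 / f for f in F]                  # quenching response F0/F
cbar, rbar = statistics.fmean(conc), statistics.fmean(ratio)
sxx = sum((c - cbar) ** 2 for c in conc)
slope = sum((c - cbar) * (r - rbar) for c, r in zip(conc, ratio)) / sxx
intercept = rbar - slope * cbar              # should sit near 1

# detection limit as 3 * sd(blank response) / slope
blanks = [1 / (1 + random.gauss(0, 0.005)) for _ in range(11)]
lod = 3 * statistics.stdev(blanks) / slope
print(round(slope, 1), round(intercept, 2))
```

    Fitting the ratio rather than the raw intensity is what makes the calibration line independent of the starting intensity F0.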

  13. Application of finite-element-methods in food processing

    DEFF Research Database (Denmark)

    Risum, Jørgen


    Presentation of the possible use of finite-element-methods in food processing. Examples from diffusion studies are given.

  14. [DOE method for evaluating environmental and waste management samples: Revision 1, Addendum 1

    Energy Technology Data Exchange (ETDEWEB)

    Goheen, S.C.


    The US Department of Energy's (DOE's) environmental and waste management (EM) sampling and analysis activities require that large numbers of samples be analyzed for materials characterization, environmental surveillance, and site-remediation programs. The present document, DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), is a supplemental resource for analyzing many of these samples.

  15. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining. (United States)


    Title 7 (Agriculture), vol. 2, revised as of 2010-01-01. Purchase of Grease Mohair and Mohair Top Samples, § 32.402 Samples of mohair top grades; method of obtaining. Samples certified as representative of the official standards of the...

  16. 7 CFR 32.400 - Samples of grease mohair grades; method of obtaining. (United States)


    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Samples of grease mohair grades; method of obtaining... STANDARD CONTAINER REGULATIONS PURCHASE OF GREASE MOHAIR AND MOHAIR TOP SAMPLES § 32.400 Samples of grease mohair grades; method of obtaining. Samples certified as representative of the official standards of the...

  17. 7 CFR 31.400 - Samples for wool and wool top grades; method of obtaining. (United States)


    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Samples for wool and wool top grades; method of... STANDARDS AND STANDARD CONTAINER REGULATIONS PURCHASE OF WOOL AND WOOL TOP SAMPLES § 31.400 Samples for wool and wool top grades; method of obtaining. Samples certified as representative of the official...

  18. Vegetation Sampling for Wetland Delineation: A Review and Synthesis of Methods and Sampling Issues (United States)


    Metrics for seedlings, saplings, and overstory vegetation are routinely collected in many forest inventory methods (Schreuder et al. 1993; McRoberts and ...), and such data form an element in several forest inventory programs; the alternative metrics to cover or frequency, however, are not regularly used in the majority of them. The Forest Service collects vegetation data using strata from long-term monitoring plots as part of its Forest Inventory and Analysis (FIA) and National Forest ...

  19. Signal Processing Methods Monitor Cranial Pressure (United States)


    Dr. Norden Huang, of Goddard Space Flight Center, invented a set of algorithms (called the Hilbert-Huang Transform, or HHT) for analyzing nonlinear and nonstationary signals that developed into a user-friendly signal processing technology for analyzing time-varying processes. At an auction managed by Ocean Tomo Federal Services LLC, licenses of 10 U.S. patents and 1 domestic patent application related to HHT were sold to DynaDx Corporation, of Mountain View, California. DynaDx is now using the licensed NASA technology for medical diagnosis and prediction of brain blood flow-related problems, such as stroke, dementia, and traumatic brain injury.

  20. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander


    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and estimation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100%. Given that most previous …
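
    The method-of-moments (Matheron) estimator used above computes, for each distance class, half the average squared difference between all pairs of observations whose separation falls in that class. A minimal sketch (synthetic, spatially uncorrelated data; the robust and residual-maximum-likelihood estimators the study also evaluates are not shown):

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Method-of-moments (Matheron) variogram estimator.

    coords: (n, 2) sampling locations; values: (n,) observations;
    bin_edges: distance-class boundaries. Returns (bin centers, semivariances).
    """
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)                      # each pair counted once
    dist = d[iu]
    sqdiff = (values[:, None] - values[None, :])[iu] ** 2
    gamma = np.empty(len(bin_edges) - 1)
    for k in range(len(gamma)):
        mask = (dist >= bin_edges[k]) & (dist < bin_edges[k + 1])
        gamma[k] = 0.5 * sqdiff[mask].mean() if mask.any() else np.nan
    centers = 0.5 * (bin_edges[:-1] + bin_edges[1:])
    return centers, gamma

# Example: 200 random locations on a 50 m plot with uncorrelated noise,
# so the semivariance should be roughly flat at the field variance.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 50, size=(200, 2))
z = rng.normal(size=200)
h, g = empirical_variogram(xy, z, np.linspace(0, 25, 6))
```

For correlated fields, the semivariance rises with distance until the range is reached; fitting a model (e.g. exponential) to (h, g) yields the nugget, sill, and range parameters discussed in the abstract.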

  1. Suspension trapping (STrap) sample preparation method for bottom-up proteomics analysis. (United States)

    Zougman, Alexandre; Selby, Peter J; Banks, Rosamonde E


    Despite recent developments in bottom-up proteomics, the need still exists for a fast, uncomplicated, and robust method for comprehensive sample processing, especially when applied to low protein amounts. The suspension trapping method combines the advantage of efficient SDS-based protein extraction with rapid detergent removal, reactor-type protein digestion, and peptide cleanup. Proteins are solubilized in SDS. The sample is acidified and introduced into the suspension trapping tip incorporating the depth filter and hydrophobic compartments, filled with the neutral pH methanolic solution. The instantly formed fine protein suspension is trapped in the depth filter stack-this crucial step is aimed at separating the particulate matter in space. SDS and other contaminants are removed in the flow-through, and a protease is introduced. Following the digestion, the peptides are cleaned up using the tip's hydrophobic part. The methodology allows processing of protein loads down to the low microgram/submicrogram levels. The detergent removal takes about 5 min, whereas the tryptic proteolysis of a cellular lysate is complete in as little as 30 min. We have successfully utilized the method for analysis of cellular lysates, enriched membrane preparations, and immunoprecipitates. We expect that due to its robustness and simplicity, the method will become an essential proteomics tool. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. An overview of medical image processing methods

    African Journals Online (AJOL)



    Jun 14, 2010 ... images through computer simulations has already increased the interest of many researchers. 3D image rendering usually refers to the analysis of the ..... Digital Image Processing. Reading, MA: Addison-Wesley Publishing Company. Gose E, Johnsonbaugh R, Jost S (1996). Pattern Recognition and.

  3. Tissue sampling methods and standards for vertebrate genomics

    Directory of Open Access Journals (Sweden)

    Wong Pamela BY


    The recent rise in the speed and efficiency of new sequencing technologies has facilitated high-throughput sequencing, assembly and analyses of genomes, advancing ongoing efforts to analyze genetic sequences across major vertebrate groups. Standardized procedures for acquiring high-quality DNA and RNA and establishing cell lines from target species will facilitate these initiatives. We provide a legal and methodological guide according to four standards of acquiring and storing tissue for the Genome 10K Project and similar initiatives as follows: four-star (banked tissue/cell cultures, RNA from multiple types of tissue for transcriptomes, and sufficient flash-frozen tissue for 1 mg of DNA, all from a single individual); three-star (RNA as above and frozen tissue for 1 mg of DNA); two-star (frozen tissue for at least 700 μg of DNA); and one-star (ethanol-preserved tissue for 700 μg of DNA or less of mixed quality). At a minimum, all tissues collected for the Genome 10K and other genomic projects should consider each species' natural history and follow institutional and legal requirements. Associated documentation should detail as much information as possible about provenance to ensure representative sampling and subsequent sequencing. Hopefully, the procedures outlined here will not only encourage success in the Genome 10K Project but also inspire the adaptation of standards by other genomic projects, including those involving other biota.

  4. The Marker State Space (MSS) method for classifying clinical samples.

    Directory of Open Access Journals (Sweden)

    Brian P Fallon

    The development of accurate clinical biomarkers has been challenging, in part due to the diversity between patients and diseases. One approach to account for this diversity is to use multiple markers to classify patients, based on the concept that each individual marker contributes information from its respective subclass of patients. Here we present a new strategy for developing biomarker panels that accounts for completely distinct patient subclasses. Marker State Space (MSS) defines "marker states" based on all possible patterns of high and low values among a panel of markers. Each marker state is defined as either a case state or a control state, and a sample is classified as case or control based on the state it occupies. MSS was used to define multi-marker panels that were robust in cross-validation and training-set/test-set analyses and that yielded classification accuracy similar to several other classification algorithms. A three-marker panel for discriminating pancreatic cancer patients from control subjects revealed subclasses of patients based on distinct marker states. MSS provides a straightforward approach for modeling highly divergent subclasses of patients, which may be adaptable for diverse applications.
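
    The marker-state idea above can be sketched directly: each sample maps to a tuple of high/low flags, each observed state is labeled case or control by majority vote over the training data, and a new sample inherits the label of the state it occupies. This is our illustrative reconstruction (hypothetical thresholds and data, not the paper's panel):

```python
def marker_state(values, thresholds):
    """Map a sample's marker values to its state: a tuple of 0/1 flags
    (1 = above that marker's threshold)."""
    return tuple(int(v > t) for v, t in zip(values, thresholds))

def train_mss(samples, labels, thresholds):
    """Label each observed state 'case' (1) or 'control' (0) by majority
    vote; states never observed in training default to 'control'."""
    counts = {}
    for s, y in zip(samples, labels):
        st = marker_state(s, thresholds)
        counts.setdefault(st, [0, 0])[y] += 1     # [control count, case count]
    return {st: int(c[1] > c[0]) for st, c in counts.items()}

def classify(sample, state_map, thresholds):
    return state_map.get(marker_state(sample, thresholds), 0)

# Toy three-marker panel (hypothetical thresholds and data):
thr = (1.0, 1.0, 1.0)
train = [(2.0, 0.1, 0.2), (1.5, 1.8, 0.3), (0.2, 0.4, 0.1), (0.3, 0.2, 0.4)]
y = [1, 1, 0, 0]
smap = train_mss(train, y, thr)
```

With k markers there are 2^k possible states, so in practice the panel must stay small enough for most states to be populated by training samples.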

  5. 7 CFR 51.308 - Methods of sampling and calculation of percentages. (United States)


    ..., CERTIFICATION, AND STANDARDS) United States Standards for Grades of Apples Methods of Sampling and Calculation of Percentages § 51.308 Methods of sampling and calculation of percentages. (a) When the numerical... 7 Agriculture 2 2010-01-01 2010-01-01 false Methods of sampling and calculation of percentages. 51...

  6. Effect of Processing Methods on the Proximate and Energy ...

    African Journals Online (AJOL)

    Four methods of processing were assessed to investigate the effect of processing methods on the digestibility, proximate and energy composition of Lablab purpureus (Rongai) beans. The processing methods were boiling (in water), fermentation, toasting and fermentation plus toasting. Some of the beans were boiled for 0, ...

  7. Determination of methylmercury in marine biota samples: method validation. (United States)

    Carrasco, Luis; Vassileva, Emilia


    Regulatory authorities are expected to measure concentrations of contaminants in foodstuffs, but the simple determination of the total amount is not sufficient for fully judging their impact on human health. In particular, the methylation of metals generally increases their toxicity; therefore, validated analytical methods producing reliable results for the assessment of methylated species are highly needed. Nowadays, there is no legal limit for methylmercury (MeHg) in food matrices; hence, no standardized method for the determination of MeHg exists within the international jurisdiction. Contemplating the possibility of a future legislative limit, a method for low-level determination of MeHg in marine biota matrices, based on aqueous-phase ethylation followed by purge and trap and gas chromatography (GC) coupled to pyrolysis-atomic fluorescence spectrometry (Py-AFS) detection, has been developed and validated. Five different extraction procedures, namely acid and alkaline leaching assisted by microwave and conventional oven heating, as well as enzymatic digestion, were evaluated in terms of their efficiency to extract MeHg from Scallop soft tissue IAEA-452 Certified Reference Material. Alkaline extraction with 25% (w/w) KOH in methanol, microwave-assisted extraction (MAE) with 5 M HCl and enzymatic digestion with protease XIV yielded the highest extraction recoveries. Standard addition or the introduction of a dilution step were successfully applied to overcome the matrix effects observed when microwave-assisted extraction using 25% (w/w) KOH in methanol or 25% (w/v) aqueous TMAH was used. ISO 17025 and Eurachem guidelines were followed to perform the validation of the methodology. Accordingly, blanks, selectivity, calibration curve, linearity (0.9995), working range (1-800 pg), recovery (97%), precision, traceability, limit of detection (0.45 pg), limit of quantification (0.85 pg) and expanded uncertainty (15.86%, k=2) were assessed with Fish protein Dorm-3 Certified

  8. Data processing method for neutron diffraction experiments

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Palomino, L.A. [Consejo Nacional de Investigaciones Cientificas y Tecnicas, Centro Atomico Bariloche, Instituto Balseiro, Comision Nacional de Energia Atomica, Universidad Nacional de Cuyo, 8400 Bariloche (Argentina); Dawidowski, J. [Consejo Nacional de Investigaciones Cientificas y Tecnicas, Centro Atomico Bariloche, Instituto Balseiro, Comision Nacional de Energia Atomica, Universidad Nacional de Cuyo, 8400 Bariloche (Argentina)]. E-mail:; Blostein, J.J. [Consejo Nacional de Investigaciones Cientificas y Tecnicas, Centro Atomico Bariloche, Instituto Balseiro, Comision Nacional de Energia Atomica, Universidad Nacional de Cuyo, 8400 Bariloche (Argentina); Cuello, G.J. [Institut Laue Langevin, Boite Postale 156, F-38042 Grenoble Cedex 9 (France)


    We present a procedure to perform multiple scattering, attenuation and efficiency corrections in reactor neutron diffraction experiments, based on a Monte Carlo code applied iteratively. We discuss the application of two procedures, the first based on Granada's synthetic model, useful for incoherent scatterers, and the second, based on the measured experimental distributions for coherent scatterers. Experiments on samples of polyethylene, light water, heavy water and Teflon of different sizes were performed and the correction procedures are tested. The problem of normalization in an absolute scale in diffraction experiments is addressed and results obtained from the present procedure are shown.

  9. Apollo experience report: Processing of lunar samples in a sterile nitrogen atmosphere (United States)

    Mcpherson, T. M.


    A sterile nitrogen atmosphere processing cabinet line was installed in the Lunar Receiving Laboratory to process returned lunar samples with minimum organic contamination. Design and operation of the cabinet line were complicated by the requirement for biological sterilization and isolation, which necessitated extensive filtration, leak-checking, and system sterilization before use. Industrial techniques were applied to lunar sample processing to meet requirements for time-critical experiments while handling a large flow of samples.

  10. Processes of aggression described by kinetic method

    Energy Technology Data Exchange (ETDEWEB)

    Aristov, V. V.; Ilyin, O. [Dorodnicyn Computing Centre of Russian Academy of Sciences, Vavilova str. 40, Moscow, 119333 (Russian Federation)


    In the last decades many investigations have been devoted to theoretical models in new areas concerning description of different biological, sociological and historical processes. In the present paper we suggest a model of the Nazi Germany invasion of Poland, France and USSR based on the kinetic theory. We model this process with the Cauchy boundary problem for the two-element kinetic equations with spatial initial conditions. The solution of the problem is given in the form of traveling wave. The propagation velocity of a frontline depends on the quotient between initial forces concentrations. Moreover it is obtained that the general solution of the model can be expressed in terms of quadratures and elementary functions. Finally it is shown that the frontline velocities are complied with the historical data.

  11. Processes of aggression described by kinetic method (United States)

    Aristov, V. V.; Ilyin, O.


    In the last decades many investigations have been devoted to theoretical models in new areas concerning description of different biological, sociological and historical processes. In the present paper we suggest a model of the Nazi Germany invasion of Poland, France and USSR based on the kinetic theory. We model this process with the Cauchy boundary problem for the two-element kinetic equations with spatial initial conditions. The solution of the problem is given in the form of traveling wave. The propagation velocity of a frontline depends on the quotient between initial forces concentrations. Moreover it is obtained that the general solution of the model can be expressed in terms of quadratures and elementary functions. Finally it is shown that the frontline velocities are complied with the historical data.

  12. Estimating the Expected Value of Sample Information Using the Probabilistic Sensitivity Analysis Sample: A Fast, Nonparametric Regression-Based Method. (United States)

    Strong, Mark; Oakley, Jeremy E; Brennan, Alan; Breeze, Penny


    Health economic decision-analytic models are used to estimate the expected net benefits of competing decision options. The true values of the input parameters of such models are rarely known with certainty, and it is often useful to quantify the value to the decision maker of reducing uncertainty through collecting new data. In the context of a particular decision problem, the value of a proposed research design can be quantified by its expected value of sample information (EVSI). EVSI is commonly estimated via a 2-level Monte Carlo procedure in which plausible data sets are generated in an outer loop, and then, conditional on these, the parameters of the decision model are updated via Bayes rule and sampled in an inner loop. At each iteration of the inner loop, the decision model is evaluated. This is computationally demanding and may be difficult if the posterior distribution of the model parameters conditional on sampled data is hard to sample from. We describe a fast nonparametric regression-based method for estimating per-patient EVSI that requires only the probabilistic sensitivity analysis sample (i.e., the set of samples drawn from the joint distribution of the parameters and the corresponding net benefits). The method avoids the need to sample from the posterior distributions of the parameters and avoids the need to rerun the model. The only requirement is that sample data sets can be generated. The method is applicable with a model of any complexity and with any specification of model parameter distribution. We demonstrate in a case study the superior efficiency of the regression method over the 2-level Monte Carlo method. © The Author(s) 2015.
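
    The regression-based EVSI estimator described above can be sketched on a toy two-option problem: regress each option's net benefit (from the PSA sample) on a low-dimensional summary of the simulated study data, then take the mean of the per-draw maximum of the fitted values minus the maximum of the mean net benefits. This is a simplified illustration with hypothetical model and numbers; a cubic polynomial stands in for the flexible nonparametric smoother of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
M = 5000

# Toy 2-option decision model: incremental net benefit depends on theta.
theta = rng.normal(0.5, 1.0, M)             # uncertain parameter (PSA sample)
nb = np.column_stack([np.zeros(M),           # option 0: baseline
                      1000.0 * theta])       # option 1: NB linear in theta

# Proposed study: n noisy observations of theta, summarised by their mean.
n, sigma = 20, 2.0
xbar = theta + rng.normal(0, sigma / np.sqrt(n), M)   # one summary per PSA draw

# Regress each option's net benefit on the data summary.
fitted = np.column_stack([
    np.polyval(np.polyfit(xbar, nb[:, d], 3), xbar) for d in range(2)
])

# EVSI = E[max_d g_d(T)] - max_d E[NB_d]
evsi = fitted.max(axis=1).mean() - nb.mean(axis=0).max()
```

Because OLS fitted values have the same mean as the response, the estimate is non-negative by construction and bounded above by the (perfect-information) EVPI, mirroring the theoretical ordering 0 ≤ EVSI ≤ EVPI.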

  13. The carbon isotope composition of CO2 respired by trunks: comparison of four sampling methods. (United States)

    Damesin, C; Barbaroux, C; Berveiller, D; Lelarge, C; Chaves, M; Maguas, C; Maia, R; Pontailler, J-Y


    The ¹³C natural abundance of CO₂ respired by plants has been used in the laboratory to examine the discrimination processes that occur during respiration. Currently, field measurements are being expanded to interpret the respiration δ¹³C signature measured at ecosystem and global levels. In this context, forests are particularly important to consider as they represent 80% of the continental biomass. The objective of this investigation was to compare four methods of sampling the CO₂ respired by trunks for the determination of its carbon isotope composition: three in situ methods using chambers placed on the trunk, and one destructive method using cores of woody tissues. The in situ methods were based either on a Keeling plot approach applied at the tissue level or on an initial flush of the chamber with nitrogen or with CO₂-free air. In parallel, we investigated the possibility of an apparent discrimination during tissue respiration by comparing the δ¹³C signature of the respired CO₂ and that of the organic matter. The study was performed on six tree species widely distributed in temperate and mediterranean areas. The four methods were not significantly different when overall means were considered. However, considering the individual data, the Keeling plot approach and the nitrogen flush methods gave fairly homogeneous results, whereas the CO₂-free air method produced more variable results. The core method was not correlated with any of the chamber methods. Regardless of the methodology, the respired CO₂ generally was enriched in ¹³C relative to the total organic matter. This apparent enrichment during respiration was variable, reaching as much as 3-5‰. This study showed that, on the whole, the different sampling techniques gave similar results, but one should be aware of the variability associated with each method. Copyright (c) 2005 John Wiley & Sons, Ltd.
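
    The Keeling plot approach mentioned above regresses the measured δ¹³C against 1/[CO2]; extrapolating to 1/[CO2] → 0 (the intercept) estimates the isotopic signature of the respired source. A minimal sketch with synthetic two-component mixing data (illustrative values, not from the study):

```python
import numpy as np

def keeling_intercept(co2_ppm, delta13c):
    """Keeling plot: regress delta13C against 1/[CO2]; the intercept at
    1/[CO2] -> 0 estimates the delta13C of the respired CO2 source."""
    slope, intercept = np.polyfit(1.0 / np.asarray(co2_ppm),
                                  np.asarray(delta13c), 1)
    return intercept

# Synthetic chamber data: background air (400 ppm at -8 permil) mixed with
# respired CO2 at -27 permil (illustrative values, not from the study).
source, bg_c, bg_d = -27.0, 400.0, -8.0
c = np.array([450.0, 500.0, 600.0, 800.0])     # CO2 build-up in the chamber
d = (bg_c * bg_d + (c - bg_c) * source) / c    # two-component isotope mixing
est = keeling_intercept(c, d)
```

The two-component mixing model makes δ exactly linear in 1/[CO2], which is why the intercept recovers the source signature; with real chamber data the fit quality (and the range of CO2 build-up) limits the precision of the estimate.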

  14. Productivity studies in fish processing: methods of establishing work performance

    National Research Council Canada - National Science Library

    Amaria, P.J


    Relative merits of "Performance Rating", "Westinghouse Leveling Plan", "Methods-Time Measurement" and "Work Sampling" techniques used for establishing standard work performance in the fishing industry...

  15. Method and algorithm for image processing (United States)

    He, George G.; Moon, Brain D.


    The present invention is a modified Radon transform. It is similar to the traditional Radon transform for the extraction of line parameters and similar to traditional slant stack for the intensity summation of pixels away from a given pixel, for example ray paths that span 360 degrees at a given grid in the time and offset domain. However, the present invention differs from these methods in that the intensity and direction of a composite intensity for each pixel are maintained separately instead of combined after the transformation. An advantage of this approach is elimination of the work required to extract the line parameters in the transformed domain. The advantage of the modified Radon transform method is amplified when many lines are present in the imagery or when the lines are just short segments, both of which occur in actual imagery.

  16. An efficient sampling algorithm for uncertain abnormal data detection in biomedical image processing and disease prediction. (United States)

    Liu, Fei; Zhang, Xi; Jia, Yan


    In this paper, we propose a computer information processing algorithm that can be used for biomedical image processing and disease prediction. A biomedical image is considered a data object in a multi-dimensional space. Each dimension is a feature that can be used for disease diagnosis. We introduce a new concept of the top (k1,k2) outlier. It can be used to detect abnormal data objects in the multi-dimensional space. This technique focuses on uncertain space, where each data object has several possible instances with distinct probabilities. We design an efficient sampling algorithm for the top (k1,k2) outlier in uncertain space. Some improvement techniques are used for acceleration. Experiments show our methods' high accuracy and high efficiency.

  17. Digital signal processor and processing method for GPS receivers (United States)

    Thomas, Jr., Jess B. (Inventor)


    A digital signal processor and processing method therefor for use in receivers of the NAVSTAR/GLOBAL POSITIONING SYSTEM (GPS) employs a digital carrier down-converter, digital code correlator and digital tracking processor. The digital carrier down-converter and code correlator consists of an all-digital, minimum bit implementation that utilizes digital chip and phase advancers, providing exceptional control and accuracy in feedback phase and in feedback delay. Roundoff and commensurability errors can be reduced to extremely small values (e.g., less than 100 nanochips and 100 nanocycles roundoff errors and 0.1 millichip and 1 millicycle commensurability errors). The digital tracking processor bases the fast feedback for phase and for group delay in the C/A, P.sub.1, and P.sub.2 channels on the L.sub.1 C/A carrier phase thereby maintaining lock at lower signal-to-noise ratios, reducing errors in feedback delays, reducing the frequency of cycle slips and in some cases obviating the need for quadrature processing in the P channels. Simple and reliable methods are employed for data bit synchronization, data bit removal and cycle counting. Improved precision in averaged output delay values is provided by carrier-aided data-compression techniques. The signal processor employs purely digital operations in the sense that exactly the same carrier phase and group delay measurements are obtained, to the last decimal place, every time the same sampled data (i.e., exactly the same bits) are processed.

  18. Single Image Super-resolution Using Active-sampling Gaussian Process Regression. (United States)

    Wang, Haijun; Gao, Xinbo; Zhang, Kaibing; Li, Jie


    As is well known, Gaussian process regression (GPR) has been successfully applied to example learning-based image super-resolution (SR). Despite its effectiveness, the applicability of the GPR model is limited by its remarkable computational cost when a large number of examples are available to a learning task. To alleviate this problem, we propose a novel example learning-based SR method, called active-sampling Gaussian process regression (AGPR). The newly proposed approach employs an active learning strategy to heuristically select more informative samples for training the regression parameters of the GPR model, which shows significant improvement in computational efficiency while keeping higher quality of the reconstructed image. Finally, we suggest an accelerating scheme to further reduce the time complexity of the proposed AGPR-based SR by using a pre-learned projection matrix. We objectively and subjectively demonstrate that the proposed method is superior to other competitors in producing much sharper edges and finer details.
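
    The active-sampling idea above can be sketched with a toy 1-D GPR: starting from a small training set, repeatedly add the candidate example about which the current GP is most uncertain (highest predictive variance). This is an illustrative variance-based selection sketch, not the authors' AGPR criterion:

```python
import numpy as np

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

def gpr_predict(xtr, ytr, xte, noise=1e-4):
    """GP posterior mean and variance at test points (unit prior variance)."""
    K = rbf(xtr, xtr) + noise * np.eye(len(xtr))
    Ks = rbf(xte, xtr)
    mean = Ks @ np.linalg.solve(K, ytr)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, var

# Active sampling: repeatedly add the pool point the GP is least sure about.
rng = np.random.default_rng(2)
pool = np.sort(rng.uniform(0, 1, 200))
target = np.sin(2 * np.pi * pool)
chosen = list(range(0, 200, 100))          # seed the training set with 2 points
for _ in range(8):
    _, var = gpr_predict(pool[chosen], target[chosen], pool)
    var[chosen] = -np.inf                  # never pick the same point twice
    chosen.append(int(np.argmax(var)))
mean, _ = gpr_predict(pool[chosen], target[chosen], pool)
err = np.max(np.abs(mean - target))
```

Variance-based selection naturally spreads the training points over the input space, so a handful of actively chosen examples can fit the whole curve; this is the computational saving the abstract refers to, since GPR training cost grows cubically with the training-set size.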

  19. A method for processing drilling muds

    Energy Technology Data Exchange (ETDEWEB)

    Mukhin, L.K.; Khramchenko, L.N.; Rybalchenko, V.S.; Zavorotnyy, V.L.


    The purpose of the invention is to increase the speed of bonding hydrogen sulfide with a simultaneous preservation of its absorptive capability and the rheological properties of the muds with a high content of solid phase. This is achieved through processing the drilling muds by introducing an additive which bonds the hydrogen sulfide, which is based on iron oxides, which are the dehydrated residue from the production of aminotoluenes through reduction of nitrotoluene in a volume of 5 to 50 percent by weight of the drilling mud.

  20. Schungite raw material quality evaluation using image processing method (United States)

    Chertov, Aleksandr N.; Gorbunova, Elena V.; Sadovnichii, Roman V.; Rozhkova, Natalia N.


    The high-carbon schungite rocks of Karelia are a promising mineral raw material for the production of active fillers for composite materials, radio shielding materials, silicon carbide, stable aqueous dispersions, sorbents, catalysts, carbon nanomaterials, and other products. The intensive evolution of radiometric separation and sorting methods, based on different physical phenomena occurring in the interaction of minerals and their constituent chemical elements with different types of radiation, opens new enrichment opportunities for schungite materials. This is especially pertinent to the optical method of enrichment, which is one of the radiometric methods. The present work is devoted to the research and development of preliminary quality assessment principles for raw schungite on the basis of image processing, and to the prospects of optical separation for schungite enrichment. The results of preliminary studies allow us to describe the selection criteria for separation of the raw material by the optical method, as well as to propose a method of assessing a quality indicator for schungite raw materials. All conceptual and theoretical fundamentals are corroborated by the results of experimental studies of schungite rock samples with breccia and vein textures of different sizes from the Maksovo deposit.

  1. Sampling

    CERN Document Server

    Thompson, Steven K


    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  2. [Comparative Analysis of Spectrophotometric Methods of the Protein Measurement in the Pectic Polysaccharide Samples]. (United States)

    Ponomareva, S A; Golovchenko, V V; Patova, O A; Vanchikova, E V; Ovodov, Y S


    To assess the reliability of determining the protein content in pectic polysaccharide samples by absorbance in the ultraviolet and visible regions of the spectrum, eleven techniques were compared: the Flores, Lowry, Bradford, Sedmak and Rueman (ninhydrin reaction) methods, ultraviolet spectrophotometry, the Benedict's reagent method, the Nessler's reagent method, the amide black method, the bicinchoninic reagent method and the biuret method. The data obtained show that the insufficient sensitivity of seven of the listed techniques does not allow their use for determination of the protein content in pectic polysaccharide samples. The Lowry, Bradford and Sedmak methods, and the Nessler's reagent method, however, may be used for determination of the protein content in pectic polysaccharide samples; the Bradford method is advisable for determining protein contaminant content when the protein content is less than 15%, and the Lowry method when it is more than 15%.

  3. A Combined Weighting Method Based on Hybrid of Interval Evidence Fusion and Random Sampling

    Directory of Open Access Journals (Sweden)

    Ying Yan


    Due to the complexity of systems and lack of expertise, epistemic uncertainties may be present in experts' judgments on the importance of certain indices during group decision-making. A novel combination weighting method is proposed to solve the index weighting problem when various uncertainties are present in expert comments. Based on the ideas of evidence theory, various types of uncertain evaluation information are uniformly expressed through interval evidence structures. A similarity matrix between interval evidences is constructed, and the experts' information is fused. Comment grades are quantified using interval numbers, and a cumulative probability function for evaluating the importance of indices is constructed based on the fused information. Finally, index weights are obtained by Monte Carlo random sampling. The method can process expert information with varying degrees of uncertainty and possesses good compatibility; it avoids the difficulty of effectively fusing high-conflict group decision-making information and the large information loss after fusion. Original expert judgments are retained rather objectively throughout the processing procedure. The cumulative probability function construction and random sampling processes do not require any human intervention or judgment, and can be implemented easily by computer programs, giving the method an apparent advantage in evaluation practices for fairly large index systems.
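
    The final Monte Carlo step described above can be sketched as follows: sample each index's importance uniformly within its interval, normalise each draw to sum to 1, and average over draws. This is a simplified stand-in with hypothetical intervals; the paper's evidence-fusion and cumulative-probability-function steps are not reproduced here:

```python
import numpy as np

def interval_weights(lo, hi, n_draws=10000, seed=0):
    """Monte Carlo index weighting from interval judgments.

    Samples each index's importance uniformly within [lo_i, hi_i],
    normalises every draw to sum to 1, and averages over all draws.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    draws = rng.uniform(lo, hi, size=(n_draws, len(lo)))
    draws /= draws.sum(axis=1, keepdims=True)   # each draw is a weight vector
    return draws.mean(axis=0)

# Hypothetical fused interval scores for three indices:
w = interval_weights([0.6, 0.3, 0.1], [0.8, 0.5, 0.2])
```

Because every draw is normalised before averaging, the resulting weights sum to 1, and wider intervals translate directly into more variability of the underlying draws.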

  4. A TIMS-based method for the high precision measurements of the three-isotope potassium composition of small samples

    DEFF Research Database (Denmark)

    Wielandt, Daniel Kim Peel; Bizzarro, Martin


    A novel thermal ionization mass spectrometry (TIMS) method for the three-isotope analysis of K has been developed, and ion chromatographic methods for the separation of K have been adapted for the processing of small samples. The precise measurement of K-isotopes is challenged by the presence of ...... extra-terrestrial materials....

  5. Sample processing and cDNA preparation for microbial metatranscriptomics in complex soil communities. (United States)

    Carvalhais, Lilia C; Schenk, Peer M


    Soil presents one of the most complex environments for microbial communities as it provides many microhabitats that allow coexistence of thousands of species with important ecosystem functions. These include biomass and nutrient cycling, mineralization, and detoxification. Culture-independent DNA-based methods, such as metagenomics, have revealed operational taxonomic units that suggest a high diversity of microbial species and associated functions in soil. An emerging but technically challenging area to profile the functions of microorganisms and their activities is mRNA-based metatranscriptomics. Here, we describe issues and important considerations of soil sample processing and cDNA preparation for metatranscriptomics from bacteria and archaea and provide a set of methods that can be used in the required experimental steps. © 2013 Elsevier Inc. All rights reserved.

  6. A long-term validation of the modernised DC-ARC-OES solid-sample method

    Energy Technology Data Exchange (ETDEWEB)

    Florian, K. [Dept. of Chemistry, Technical University of Kosice (Slovakia); Hassler, J.; Foerster, O. [Elektroschmelzwerk GmbH, Kempten (Germany)


    A validation procedure based on the ISO 17025 standard has been used to study and illustrate both the long-term stability of the calibration process of the DC-ARC solid-sample spectrometric method and the main validation criteria of the method. In calculating the validation characteristics that depend on linearity (calibration), the fulfilment of prerequisite criteria such as normality and homoscedasticity was also checked. To decide whether there are any trends in the time variation of the analytical signal, the Neumann trend test was also applied and evaluated. Finally, a comparison with similar validation data for the ETV-ICP-OES method was carried out. (orig.)

  7. Determination of methylmercury in marine biota samples with advanced mercury analyzer: method validation. (United States)

    Azemard, Sabine; Vassileva, Emilia


    In this paper, we present a simple, fast and cost-effective method for the determination of methylmercury (MeHg) in marine samples. All important parameters influencing the sample preparation process were investigated and optimized. Full validation of the method was performed in accordance with ISO-17025 (ISO/IEC, 2005) and Eurachem guidelines. Blanks, selectivity, working range (0.09-3.0 ng), recovery (92-108%), intermediate precision (1.7-4.5%), traceability, limit of detection (0.009 ng), limit of quantification (0.045 ng) and expanded uncertainty (15%, k=2) were assessed. An estimation of the uncertainty contribution of each parameter and a demonstration of the traceability of measurement results are provided as well. Furthermore, the selectivity of the method was studied by analyzing the same sample extracts by advanced mercury analyzer (AMA) and by gas chromatography-atomic fluorescence spectrometry (GC-AFS). Additional validation of the proposed procedure was achieved through participation in the IAEA-461 worldwide inter-laboratory comparison exercise. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Development of a new method for hydrogen isotope analysis of trace hydrocarbons in natural gas samples

    Directory of Open Access Journals (Sweden)

    Xibin Wang


    A new method has been developed for the analysis of the hydrogen isotopic composition of trace hydrocarbons in natural gas samples using solid-phase microextraction (SPME) combined with gas chromatography-isotope ratio mass spectrometry (GC/IRMS). In this study, the SPME technique was introduced to enrich trace, low-abundance hydrocarbons and was coupled to GC/IRMS for hydrogen isotopic analysis. The main parameters, including equilibration time, extraction temperature, and fiber type, were systematically optimized. The results demonstrated not only a high extraction yield but also the absence of hydrogen isotopic fractionation during the extraction process when the SPME device was fitted with a polydimethylsiloxane/divinylbenzene/carbon molecular sieve (PDMS/DVB/CAR) fiber. The SPME-GC/IRMS method was evaluated on natural gas samples collected from different sedimentary basins; the standard deviation (SD) of replicate measurements was better than 4‰, and hydrogen isotope values from C1 to C9 were obtained with satisfactory repeatability. The SPME-GC/IRMS method fitted with a PDMS/DVB/CAR fiber is well suited to the preconcentration of trace hydrocarbons and provides reliable hydrogen isotopic analysis of trace hydrocarbons in natural gas samples.

  9. Comparative study of methods for DNA preparation from olive oil samples to identify cultivar SSR alleles in commercial oil samples: possible forensic applications. (United States)

    Breton, Catherine; Claux, Delphine; Metton, Isabelle; Skorski, Gilbert; Bervillé, André


    Virgin olive oil is made from diverse cultivars, either mixed or single, which give the oils different tastes and typicity that may be further enhanced by the region in which the cultivars are grown. Olive oil labels correspond to chemical composition and acidity; they may also carry a protected origin indication, implying a given cultivar composition. To verify the main cultivars at the source of an olive oil sample, our method is based on DNA technology. DNA is present in all olive oil samples, even refined oil, but its quantity depends on the oil processing technology and the oil conservation conditions. Several supports for retaining DNA were therefore compared (silica extraction, hydroxyapatite, magnetic beads, and spin columns) to prepare DNA from variable amounts of oil. DNA prepared this way was usable for PCR amplification, especially with the magnetic beads, and further purification steps were evaluated. The final method used magnetic beads: DNA is released from the beads into a buffer and, once purified, was shown to contain no compounds inhibiting PCR amplification with SSR primers. Diluted aliquots of this solution were routinely and successfully amplified with different SSR primer sets, enabling confident detection of any alien alleles in oil samples. First applied to virgin oil samples of known composition, either single cultivars or mixtures of them, the method was then verified on commercial virgin oil bottles bought in supermarkets. Finally, we defined a protocol starting from 2 x 40 mL of virgin olive oil in which DNA was prepared routinely in about 5 h. It proved convenient to genotype several loci per sample together to check whether the alleles matched those of the expected cultivars. Forensic applications of the method are therefore anticipated, although it needs further improvement to work on all oil samples.

  10. Method For Brazing And Thermal Processing (United States)

    Milewski, John O.; Dave, Vivek R.; Christensen, Dane; Carpenter, II, Robert W.


    The present invention includes a method for brazing two objects or heat treating one object. First, the object or objects to be treated are selected and initial conditions establishing the relative geometry and material characteristics are determined. Then, a first design of an optical system for directing heat energy onto the object or objects is determined. The initial conditions and the first design of the optical system are input into an optical ray-tracing computer program, which is run to produce a representative output of the heat energy input distribution to the object or objects. The geometry of the object or objects, the material characteristics, and the optical system design are then adjusted until the desired heat input is obtained.

  11. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne


    Introduction Generating a nationally representative sample in low- and middle-income countries typically requires resource-intensive household-level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit-dial, national mobile phone survey in Ghana to calculate standardized response rates and assess the representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association for Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing the data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39%, respectively. Twenty-three calls were dialed to produce each eligible contact: nonresponse was substantial due to the automated calling system and the dialing of many unassigned or non-working numbers. Younger, urban, better-educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to those of other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in
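
    The outcome rates reported above follow AAPOR-style definitions. A simplified sketch is shown below; the completes and partials match the abstract, but the other disposition counts are invented, and the real AAPOR formulas distinguish many more disposition categories:

    ```python
    def call_outcome_rates(complete, partial, refusal, noncontact, unknown):
        """Simplified AAPOR-style rates: completes and partials count as
        responses; numbers of unknown eligibility are treated as eligible."""
        eligible = complete + partial + refusal + noncontact + unknown
        response = (complete + partial) / eligible
        cooperation = (complete + partial) / (complete + partial + refusal)
        refusal_rate = refusal / eligible
        contact = (complete + partial + refusal) / eligible
        return response, cooperation, refusal_rate, contact

    # Completes/partials from the abstract; remaining counts are illustrative
    rr, coop, ref, con = call_outcome_rates(9469, 3547, 1100, 18000, 10000)
    print(f"response={rr:.0%} cooperation={coop:.0%} contact={con:.0%}")
    ```

    Note how the cooperation rate exceeds the response rate: its denominator excludes the never-contacted numbers that dominate an automated random-digit-dial frame.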

  12. A method for the measurement of shielding effectiveness of planar samples requiring no sample edge preparation or contact


    Marvin, Andrew C.; Dawson, Linda; Flintoft, Ian Dand; Dawson, John F.


    A method is presented for the measurement of shielding effectiveness of planar materials with nonconducting surfaces such as carbon fiber composites. The method overcomes edge termination problems with such materials by absorbing edge-diffracted energy. A dynamic range of up to 100 dB has been demonstrated over a frequency range of 1-8.5 GHz, depending on the size of the sample under test. Comparison with ASTM D4935 and nested reverberation measurements of shielding effectiveness shows good a...

  13. Real-time recursive hyperspectral sample and band processing algorithm architecture and implementation

    CERN Document Server

    Chang, Chein-I


    This book explores recursive architectures for designing progressive hyperspectral imaging algorithms. In particular, it makes progressive imaging algorithms recursive by introducing the concept of Kalman filtering into algorithm design, so that hyperspectral imagery can be processed not only progressively, sample by sample or band by band, but also recursively via recursive equations. It can be considered a companion to the author's book Real-Time Progressive Hyperspectral Image Processing, published by Springer in 2016. The book explores recursive structures in algorithm architecture; implements recursive architectures in conjunction with progressive sample and band processing; derives Recursive Hyperspectral Sample Processing (RHSP) techniques according to the Band-Interleaved Sample/Pixel (BIS/BIP) acquisition format; and develops Recursive Hyperspectral Band Processing (RHBP) techniques according to the Band SeQuential (BSQ) acquisition format for hyperspectral data.

  14. Simultaneous determination of caffeine, theophylline and theobromine in food samples by a kinetic spectrophotometric method. (United States)

    Xia, Zhenzhen; Ni, Yongnian; Kokot, Serge


    A novel kinetic spectrophotometric method was developed for the simultaneous determination of caffeine, theobromine and theophylline in food samples. The method was based on the different kinetic characteristics of the reactions of the analytes with cerium sulphate in sulphuric acid and the associated change in absorbance at 320 nm. Experimental conditions, namely the effects of sulphuric acid, cerium sulphate and temperature, were optimised. Linear ranges (0.4-8.4 μg mL(-1)) were established for all three analytes, and the limits of detection were 0.30 μg mL(-1) (caffeine), 0.33 μg mL(-1) (theobromine) and 0.16 μg mL(-1) (theophylline). The recorded data were processed by partial least squares and artificial neural network methods, and the resulting mathematical models were used for prediction. The proposed method was applied to determine the analytes in commercial food samples, and there were no significant differences between its results and those obtained by high-performance liquid chromatography. Copyright © 2013 Elsevier Ltd. All rights reserved.
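
    The kinetic discrimination described above can be illustrated, as a simplified stand-in for the abstract's PLS and neural-network models, by resolving a two-component mixture from known unit-concentration kinetic profiles with ordinary least squares. The rate constants and concentrations below are invented for the sketch:

    ```python
    import math

    def fit_concentrations(profiles, mixture):
        """Ordinary least squares: express the mixture's absorbance-vs-time
        curve as a linear combination of unit-concentration profiles."""
        A = [[sum(pi[t] * pj[t] for t in range(len(mixture))) for pj in profiles]
             for pi in profiles]
        b = [sum(p[t] * mixture[t] for t in range(len(mixture))) for p in profiles]
        det = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # 2x2 Cramer's rule
        return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
                (b[1] * A[0][0] - b[0] * A[1][0]) / det]

    ts = [0.5 * i for i in range(20)]                 # sampling times (min)
    fast = [1 - math.exp(-0.8 * t) for t in ts]       # fast-reacting analyte
    slow = [1 - math.exp(-0.2 * t) for t in ts]       # slow-reacting analyte
    mix = [3.0 * f + 1.5 * s for f, s in zip(fast, slow)]
    print(fit_concentrations([fast, slow], mix))      # ≈ [3.0, 1.5]
    ```

    Because the two profiles rise at different rates, the normal equations are well conditioned and the noiseless mixture is resolved exactly; real data would add noise and favour the multivariate PLS/ANN models the paper actually uses.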

  15. Effect of processing and preservation method on the organoleptic ...

    African Journals Online (AJOL)

    Effect of processing and preservation method on the organoleptic and shelf life of meat products. ... Nigerian Journal of Animal Science ... Two meat products (Beef and Mutton) were subjected to four processing methods-frying, boiling, roasting and oven-drying to investigate the effects of processing and preservation ...

  16. Model-Based Methods in the Biopharmaceutical Process Lifecycle. (United States)

    Kroll, Paul; Hofer, Alexandra; Ulonska, Sophia; Kager, Julian; Herwig, Christoph


    Model-based methods are increasingly used in all areas of biopharmaceutical process technology. They can be applied in the field of experimental design, process characterization, process design, monitoring and control. Benefits of these methods are lower experimental effort, process transparency, clear rationality behind decisions and increased process robustness. The possibility of applying methods adopted from different scientific domains accelerates this trend further. In addition, model-based methods can help to implement regulatory requirements as suggested by recent Quality by Design and validation initiatives. The aim of this review is to give an overview of the state of the art of model-based methods, their applications, further challenges and possible solutions in the biopharmaceutical process life cycle. Today, despite these advantages, the potential of model-based methods is still not fully exhausted in bioprocess technology. This is due to a lack of (i) acceptance of the users, (ii) user-friendly tools provided by existing methods, (iii) implementation in existing process control systems and (iv) clear workflows to set up specific process models. We propose that model-based methods be applied throughout the lifecycle of a biopharmaceutical process, starting with the set-up of a process model, which is used for monitoring and control of process parameters, and ending with continuous and iterative process improvement via data mining techniques.

  17. Cloud condensation nuclei activity and droplet activation kinetics of wet processed regional dust samples and minerals

    Directory of Open Access Journals (Sweden)

    P. Kumar


    This study reports laboratory measurements of particle size distributions, cloud condensation nuclei (CCN activity, and droplet activation kinetics of wet generated aerosols from clays, calcite, quartz, and desert soil samples from Northern Africa, East Asia/China, and Northern America. The dependence of critical supersaturation, sc, on particle dry diameter, Ddry, is used to characterize particle-water interactions and assess the ability of Frenkel-Halsey-Hill adsorption activation theory (FHH-AT and Köhler theory (KT to describe the CCN activity of the considered samples. Wet generated regional dust samples produce unimodal size distributions with particle sizes as small as 40 nm, CCN activation consistent with KT, and exhibit hygroscopicity similar to inorganic salts. Wet generated clays and minerals produce a bimodal size distribution; the CCN activity of the smaller mode is consistent with KT, while the larger mode is less hydrophilic, follows activation by FHH-AT, and displays almost identical CCN activity to dry generated dust. Ion Chromatography (IC analysis performed on regional dust samples indicates a soluble fraction that cannot explain the CCN activity of dry or wet generated dust. A mass balance and hygroscopicity closure suggests that the small amount of ions (from low solubility compounds like calcite present in the dry dust dissolve in the aqueous suspension during the wet generation process and give rise to the observed small hygroscopic mode. Overall these results identify an artifact that may question the atmospheric relevance of dust CCN activity studies using the wet generation method.

    Based on the method of threshold droplet growth analysis, wet generated mineral aerosols display similar activation kinetics compared to ammonium sulfate calibration aerosol. Finally, a unified CCN activity framework that accounts for concurrent effects of solute and adsorption is developed to

  18. Comparative Study of Different Processing Methods for the ...

    African Journals Online (AJOL)

    The result of the two processing methods reduced the cyanide concentration to the barest minimum level required by World Health Organization (10mg/kg). The mechanical pressing-fermentation method removed more cyanide when compared to fermentation processing method. Keywords: Cyanide, Fermentation, Manihot ...

  19. Housing decision making methods for initiation development phase process (United States)

    Zainal, Rozlin; Kasim, Narimah; Sarpin, Norliana; Wee, Seow Ta; Shamsudin, Zarina


    Late delivery and defective ("sick") housing projects have been attributed to poor decision making. These problems stem from housing developers who prefer to create their own approaches based on their experience and expertise, in the simplest case by just applying the available standards and rules to decision making. This paper seeks to identify the decision-making methods used for housing development at the initiation phase in Malaysia. The research applied the Delphi method, using a questionnaire survey with 50 developers as the sample for the first stage of data collection. Only 34 developers contributed to the second stage of information gathering, and only 12 remained for the final data collection stage. The findings confirm that Malaysian developers prefer to make their investment decisions based on simple interpolation of historical data, using simple statistical or mathematical techniques to produce the required reports, and that they tend to skip several important decision-making functions at the primary development stage. These shortcomings were mainly due to time and financial constraints and a lack of statistical or mathematical expertise among the professional and management groups in developer organisations.

  20. Estimation of the Coefficient of Variation with Minimum Risk: A Sequential Method for Minimizing Sampling Error and Study Cost. (United States)

    Chattopadhyay, Bhargab; Kelley, Ken


    The coefficient of variation is an effect size measure with many potential uses in psychology and related disciplines. We propose a general theory for sequential estimation of the population coefficient of variation that considers both the sampling error and the study cost, importantly without specific distributional assumptions. Fixed-sample-size planning methods, commonly used in psychology and related fields, cannot simultaneously minimize both the sampling error and the study cost. The procedure we develop is the first sequential sampling procedure for estimating the coefficient of variation. We first present a method of planning a pilot sample size after the research goals are specified by the researcher. Then, after collecting a sample as large as the estimated pilot sample size, a check is performed to assess whether the conditions necessary to stop the data collection have been satisfied. If not, an additional observation is collected and the check is performed again. This process continues, sequentially, until a stopping rule involving a risk function is satisfied. Our method ensures that the sampling error and the study cost are considered simultaneously, so that the cost is not higher than necessary for the tolerable sampling error. We also demonstrate a variety of properties of the distribution of the final sample size for five different distributions under a variety of conditions with a Monte Carlo simulation study. In addition, we provide freely available functions via the MBESS package in R to implement the methods discussed.
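
    The pilot-then-check loop described above can be sketched as follows. The stopping rule here is a simplified stand-in for the authors' risk function: it uses the standard large-sample variance approximation for the CV of normal data, and the tolerance and pilot size are invented:

    ```python
    import math, random

    def sample_cv(xs):
        """Sample coefficient of variation: s / mean."""
        n = len(xs)
        m = sum(xs) / n
        s = math.sqrt(sum((x - m) ** 2 for x in xs) / (n - 1))
        return s / m

    def sequential_cv(draw, pilot_n=30, tol=1e-4, max_n=100000):
        """Collect a pilot sample, then add one observation at a time until
        the approximate variance of the CV estimator drops below tol."""
        xs = [draw() for _ in range(pilot_n)]
        while True:
            cv, n = sample_cv(xs), len(xs)
            # large-sample Var(cv_hat) ~= cv^2 * (0.5 + cv^2) / n  (normal data)
            if cv * cv * (0.5 + cv * cv) / n < tol or n >= max_n:
                return cv, n
            xs.append(draw())

    rng = random.Random(42)
    cv, n = sequential_cv(lambda: rng.gauss(100.0, 15.0))
    print(round(cv, 3), n)  # population CV is 0.15
    ```

    The final sample size is itself random, which is exactly the quantity whose distribution the authors study by Monte Carlo simulation.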

  1. A New Digital Signal Processing Method for Spectrum Interference Monitoring (United States)

    Angrisani, L.; Capriglione, D.; Ferrigno, L.; Miele, G.


    The frequency spectrum is a limited shared resource, used today by an ever-growing number of applications. Companies providing such services generally pay governments for the right to use a limited portion of the spectrum, and consequently expect assurance that their licensed spectrum is not affected by significant external interference. At the same time, they have to guarantee that their devices make efficient use of the spectrum and meet electromagnetic compatibility regulations. The competent authorities are therefore called on to control access to the spectrum through suitable management and monitoring policies, and manufacturers have to verify periodically that their apparatus works correctly. Several measurement solutions are on the market, generally real-time spectrum analyzers and measurement receivers. Both offer good metrological accuracy but have costs, dimensions and weights that make use in the field impractical. This paper presents a first step towards a digital-signal-processing-based measurement instrument able to meet the above needs, with particular attention given to the DSP-based measurement section of the instrument. To this end, an innovative measurement method for spectrum monitoring and management is proposed that performs an efficient sequential analysis based on sample-by-sample digital processing. Three main goals are pursued: (i) measurement performance comparable to that of other methods proposed in the literature; (ii) fast measurement time; and (iii) easy implementation on cost-effective measurement hardware.
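
    Sample-by-sample spectral measurement of this kind can be illustrated with the Goertzel algorithm, which updates a single DFT bin as each sample arrives, needing no full FFT buffer. This is an illustration of the general approach only; the paper's actual DSP chain is not specified here:

    ```python
    import math

    def goertzel(samples, k, n):
        """Power of DFT bin k over an n-sample block, updated one
        sample at a time with a two-state recursive resonator."""
        w = 2.0 * math.pi * k / n
        coeff = 2.0 * math.cos(w)
        s_prev = s_prev2 = 0.0
        for x in samples:                      # sample-by-sample update
            s = x + coeff * s_prev - s_prev2
            s_prev2, s_prev = s_prev, s
        # squared magnitude at bin k once the block completes
        return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

    n = 256
    tone = [math.sin(2 * math.pi * 8 * i / n) for i in range(n)]  # bin-8 tone
    print(goertzel(tone, 8, n) > goertzel(tone, 30, n))  # True
    ```

    For a unit-amplitude sine exactly on bin 8, the returned power equals (n/2)² = 16384, while off-bin power is essentially zero, so a monitoring loop can flag interference by thresholding per-bin power as samples stream in.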

  2. A quick method based on SIMPLISMA-KPLS for simultaneously selecting outlier samples and informative samples for model standardization in near infrared spectroscopy (United States)

    Li, Li-Na; Ma, Chang-Ming; Chang, Ming; Zhang, Ren-Cheng


    A novel method based on SIMPLe-to-use Interactive Self-modeling Mixture Analysis (SIMPLISMA) and Kernel Partial Least Squares (KPLS), named SIMPLISMA-KPLS, is proposed in this paper for the simultaneous selection of outlier samples and informative samples. It is a quick algorithm for model standardization (also known as model transfer) in near-infrared (NIR) spectroscopy. NIR data from corn, analysed for protein content, are used to evaluate the proposed method. Piecewise direct standardization (PDS) is employed for model transfer, and SIMPLISMA-PDS-KPLS is compared with KS-PDS-KPLS in terms of the prediction accuracy of protein content and the running speed of each algorithm. The conclusions are that SIMPLISMA-KPLS can serve as an alternative sample selection method for model transfer. Although its accuracy is similar to that of Kennard-Stone (KS), it differs from KS in employing concentration information in the selection program; this ensures that analyte information is involved in the analysis and that the spectra (X) of the selected samples are interrelated with the concentrations (y). It can also be used for simultaneous outlier elimination through validation of the calibration. The running-time statistics show that sample selection is faster with KPLS, and the speed of the SIMPLISMA-KPLS algorithm is beneficial for online measurement using NIR spectroscopy.
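
    The Kennard-Stone baseline that SIMPLISMA-KPLS is compared against can be sketched directly (SIMPLISMA-KPLS itself is not reproduced here). KS works purely on the spectra: it seeds with the most distant pair, then greedily adds the sample farthest from its nearest already-selected neighbour. The toy 2-D "spectra" below are invented:

    ```python
    def kennard_stone(X, k):
        """Select k samples by the Kennard-Stone max-min distance rule."""
        def d2(a, b):
            return sum((u - v) ** 2 for u, v in zip(a, b))
        n = len(X)
        # seed with the pair having the largest mutual distance
        i0, j0 = max(((i, j) for i in range(n) for j in range(i + 1, n)),
                     key=lambda p: d2(X[p[0]], X[p[1]]))
        chosen = [i0, j0]
        while len(chosen) < k:
            rest = [i for i in range(n) if i not in chosen]
            # farthest point from its nearest already-chosen neighbour
            nxt = max(rest, key=lambda i: min(d2(X[i], X[c]) for c in chosen))
            chosen.append(nxt)
        return chosen

    X = [[0, 0], [0.1, 0], [5, 5], [0, 4], [9, 1]]
    print(kennard_stone(X, 3))  # → [3, 4, 2]
    ```

    Note that the near-duplicate sample at index 1 is never picked: KS maximizes spread in X but, unlike SIMPLISMA-KPLS, has no way to use the concentration values y.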

  3. Development of sampling systems and special analyses for pressurized gasification processes; Paineistettujen kaasutusprosessien naeytteenottomenetelmien ja erityisanalytiikan kehittaeminen

    Energy Technology Data Exchange (ETDEWEB)

    Staahlberg, P.; Oesch, P.; Leppaemaeki, E.; Moilanen, A.; Nieminen, M.; Korhonen, J. [VTT Energy, Espoo (Finland)


    The reliability of sampling methods used for measuring impurities in gasification gas was studied, and new methods were developed for sampling and sample analysis. The aim of the method development was to improve the representativeness of the samples and to speed up the analysis of gas composition. The study focused on tar, nitrogen and sulphur compounds in the gasification gas. In the sampling reliability study, the effects on gas samples drawn from the process of probe and sampling-line materials suitable for high temperatures, and of solids deposited in the sampling devices, were studied. Measurements were carried out in the temperature range 250-850 deg C, both under real conditions and under conditions simulating gasification gas. The durability of samples during storage was also studied. The other main aim of the study was to increase the number of quickly measurable gas components by developing on-line analytical methods, based on GC, FTIR and FI (flow injection) techniques, for measuring nitrogen and sulphur compounds in gasification gas. As these methods are suitable only for gases free of condensing components that disturb the operation of the analysers (heavy tar compounds, water), a sampling system operating on the dilution principle was developed. The system operates at high pressures and temperatures and is suitable for gasification gases containing heavy tar compounds. The capability to analyse heavy tar compounds (molecular weight > 200 g/mol) was improved by increasing the number of compounds identified and calibrated with model substances and by developing analytical methods based on high-temperature GC analysis and the thermogravimetric method. (author)

  4. A new method for modeling coalescent processes with recombination. (United States)

    Wang, Ying; Zhou, Ying; Li, Linfeng; Chen, Xian; Liu, Yuting; Ma, Zhi-Ming; Xu, Shuhua


    Recombination plays an important role in the maintenance of genetic diversity in many types of organisms, especially diploid eukaryotes. Recombination can be studied and used to map diseases, but it adds a great deal of complexity to the genetic information, which makes the estimation of evolutionary parameters more difficult. After the coalescent process was formulated, models capable of describing recombination with graphs, such as ancestral recombination graphs (ARGs), were also developed. There are two typical classes of model for simulating ARGs: back-in-time models such as ms, and spatial models including Wiuf and Hein's, SMC, SMC', and MaCS. In this study, a new method for modeling coalescence with recombination, the Spatial Coalescent simulator (SC), was developed, which considerably improves the algorithm described by Wiuf and Hein. The present algorithm constructs the ARG spatially along the sequence, but it does not produce the redundant branches that are inevitable in Wiuf and Hein's algorithm. Interestingly, the distribution of ARGs generated by the new algorithm is identical to that generated by a typical back-in-time model as adopted by ms, an algorithm commonly used to model coalescence. It is demonstrated here that existing approximate methods such as the sequentially Markov coalescent (SMC), the related method SMC', and the Markovian coalescent simulator (MaCS) can be viewed as special cases of the present method. In simulations, the time to the most recent common ancestor (TMRCA) in the local trees of ARGs generated by the present algorithm was found to be closer to that produced by ms than the time produced by MaCS. Sample-consistent ARGs can be generated using the present method, which may significantly reduce the computational burden. In summary, the present method and algorithm may facilitate the estimation and description of recombination in population genomics and evolutionary biology.
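
    The coalescent process underlying all of these simulators can be sketched in its simplest, recombination-free form: with k lineages, the waiting time to the next merger is exponential with rate k(k-1)/2 (in coalescent time units), giving E[TMRCA] = 2(1 - 1/n) for n samples. The full ARG construction of SC is beyond a snippet:

    ```python
    import random

    def tmrca(n, rng):
        """Time to the MRCA of n lineages under the standard coalescent
        (no recombination): wait Exp(k*(k-1)/2), then merge two lineages."""
        t, k = 0.0, n
        while k > 1:
            rate = k * (k - 1) / 2.0
            t += rng.expovariate(rate)
            k -= 1
        return t

    rng = random.Random(7)
    mean = sum(tmrca(10, rng) for _ in range(20000)) / 20000
    print(round(mean, 2))  # theory: E[TMRCA] = 2*(1 - 1/10) = 1.8
    ```

    Adding recombination replaces this single tree with an ARG: recombination events split lineages so that different positions along the sequence follow different local trees, which is exactly the structure SC builds spatially along the sequence.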

  5. Microwave preservation method for DMSP, DMSO, and acrylate in unfiltered seawater and phytoplankton culture samples

    National Research Council Canada - National Science Library

    Kinsey, Joanna D; Kieber, David J


    ... T ), dimethylsulfoxide (DMSO T ), and acrylate (acrylate T ) concentrations in unfiltered samples to alleviate problems associated with the acidification method when applied to samples containing Phaeocystis . Microwave‐ and acid...

  6. Detection and monitoring of invasive exotic plants: a comparison of four sampling methods (United States)

    Cynthia D. Huebner


    The ability to detect and monitor exotic invasive plants is likely to vary depending on the sampling method employed. Methods with strong qualitative thoroughness for species detection often lack the intensity necessary to monitor vegetation change. Four sampling methods (systematic plot, stratified-random plot, modified Whittaker, and timed meander) in hemlock and red...

  7. Contemporary Sample Preparation Methods for the Detection of Ignitable Liquids in Suspect Arson Cases. (United States)

    Bertsch, W; Ren, Q


    The isolation of ignitable liquid components, usually petroleum-based distillates from fire debris, is an important step in deciding whether a fire is of natural or incendiary origin. Steady progress has been made to develop sample preparation methods capable of enriching target analytes in high yield and within a short period of time. Heated headspace enrichment methods are currently most widely used. There are several variations of this basic technique. Carbon-based adsorbents are most popular. They come in different forms and shapes, including a flat sheet of polymer, impregnated with carbon particles. The analyst cuts a small strip from this sheet and suspends it in the heated headspace above the debris sample. The volatiles adsorb onto the carbon surface, eventually reaching an equilibrium condition. The process is usually carried out in an oven. This convenient method, called the static method, has largely replaced the dynamic method, which uses a granular charcoal adsorbent. In the latter, the heated headspace is drawn over a short trap packed with charcoal, using a source of vacuum such as a pump or pushed along using pressurized nitrogen. The headspace volatiles in both the static and dynamic method are recovered by elution with a solvent, usually carbon disulfide. Recently, a promising variation of the static headspace method has been introduced. It is based on the use of a tiny amount of a polysiloxane polymer which has been coated onto the tip of a thin silica fiber. The fiber can be retracted into a syringe-type needle and the adsorbed headspace vapor can be conveniently introduced into the heated injector port of a gas chromatograph. No solvent is required. This technique, abbreviated SPME (for solid-phase microextraction) has many attractive advantages but it is not without some problems. Low boiling range accelerants, including water-soluble polar substances such as ethanol, are poorly retained on methylsiloxane type polymers. The recent

  8. Recent trends in analytical methods for the determination of amino acids in biological samples. (United States)

    Song, Yanting; Xu, Chang; Kuroki, Hiroshi; Liao, Yiyi; Tsunoda, Makoto


    Amino acids are widely distributed in biological fluids and involved in many biological processes, such as the synthesis of proteins, fatty acids, and ketone bodies. The altered levels of amino acids in biological fluids have been found to be closely related to several diseases, such as type 2 diabetes, kidney disease, liver disease, and cancer. Therefore, the development of analytical methods to measure amino acid concentrations in biological samples can contribute to research on the physiological actions of amino acids and the prediction, diagnosis and understanding of diseases. This review describes the analytical methods reported in 2012-2016 that utilized liquid chromatography and capillary electrophoresis coupled with ultraviolet, fluorescence, mass spectrometry, and electrochemical detection. Additionally, the relationship between amino acid concentrations and several diseases is also summarized. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Note on neural network sampling for Bayesian inference of mixture processes

    NARCIS (Netherlands)

    L.F. Hoogerheide (Lennart); H.K. van Dijk (Herman)


    In this paper we show some further experiments with neural network sampling, a class of sampling methods that make use of neural network approximations to (posterior) densities, introduced by Hoogerheide et al. (2007). We consider a method where a mixture of Student's t densities, which

  10. Single-Image Super-Resolution Using Active-Sampling Gaussian Process Regression. (United States)

    Wang, Haijun; Gao, Xinbo; Zhang, Kaibing; Li, Jie


    As is well known, Gaussian process regression (GPR) has been successfully applied to example learning-based image super-resolution (SR). Despite its effectiveness, the applicability of a GPR model is limited by its remarkable computational cost when a large number of examples are available to a learning task. To alleviate this problem of GPR-based SR, we propose a novel example learning-based SR method, called active-sampling GPR (AGPR). The newly proposed approach employs an active learning strategy to heuristically select the more informative samples for training the regression parameters of the GPR model, which significantly improves computational efficiency while maintaining higher quality of the reconstructed image. Finally, we suggest an accelerating scheme to further reduce the time complexity of the proposed AGPR-based SR by using a pre-learned projection matrix. We demonstrate objectively and subjectively that the proposed method is superior to its competitors, producing much sharper edges and finer details.
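    The bottleneck that motivates active sampling in the abstract above is the O(n³) linear solve in GPR training. The following is a minimal NumPy sketch of plain GP regression on a toy 1-D function (not the AGPR method itself; the data, kernel lengthscale, and noise level are illustrative assumptions):

```python
import numpy as np

def rbf(a, b, ell=0.15):
    # Squared-exponential kernel matrix between two 1-D point sets.
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

# Toy 1-D regression standing in for the patch-based SR mapping.
rng = np.random.default_rng(3)
x_train = rng.uniform(0, 1, 40)
y_train = np.sin(2 * np.pi * x_train) + 0.1 * rng.standard_normal(40)

K = rbf(x_train, x_train) + 1e-2 * np.eye(40)   # kernel + noise variance
alpha = np.linalg.solve(K, y_train)             # O(n^3): the cost AGPR attacks

x_test = np.array([0.25, 0.75])
y_pred = rbf(x_test, x_train) @ alpha           # GP posterior mean
print(y_pred)                                   # roughly sin(2*pi*x_test)
```

Active sampling keeps n small by training only on the most informative examples, shrinking the matrix handed to the solve above.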

  11. Method development for mass spectrometry based molecular characterization of fossil fuels and biological samples (United States)

    Mahat, Rajendra K.

    In an analytical (chemical) method development process, the sample preparation step usually determines the throughput and overall success of the analysis. Both targeted and non-targeted methods were developed for the mass spectrometry (MS) based analyses of fossil fuels (coal) and lipidomic analyses of a unique micro-organism, Gemmata obscuriglobus. In the non-targeted coal analysis using GC-MS, a microwave-assisted pressurized sample extraction method was compared with a traditional extraction method, Soxhlet extraction. In parallel, methods were developed to establish a comprehensive lipidomic profile and to confirm the presence of endotoxins (a.k.a. lipopolysaccharides, LPS) in Gemmata. The performance of pressurized heating techniques employing a hot-air oven and microwave irradiation was compared with that of the Soxhlet method in terms of percentage extraction efficiency and extracted analyte profiles (via GC-MS). Sub-bituminous (Powder River Range, Wyoming, USA) and bituminous (Fruitland formation, Colorado, USA) coal samples were tested. Overall, 30-40% higher extraction efficiencies (by weight) were obtained with a 4 hour hot-air oven extraction and a 20 min microwave-heated extraction in a pressurized container when compared to a 72 hour Soxhlet extraction. The pressurized methods are 25 times more economical in terms of the solvent/sample amount used and 216 times faster in terms of the time invested in the extraction process. Additionally, the same sets of compounds were identified by GC-MS for all the extraction methods used: n-alkanes and diterpanes in the sub-bituminous sample, and n-alkanes and alkyl aromatic compounds in the bituminous coal sample. G. obscuriglobus, a nucleated bacterium, is a micro-organism of high significance from the standpoints of evolutionary, cell, and environmental biology. Although lipidomics is an essential tool in microbiological systematics and chemotaxonomy, a complete lipid profile of this bacterium is still lacking. 
In addition, the presence of

  12. Evaluation of micro-colorimetric lipid determination method with samples prepared using sonication and accelerated solvent extraction methods. (United States)

    Billa, Nanditha; Hubin-Barrows, Dylan; Lahren, Tylor; Burkhard, Lawrence P


    Two common laboratory extraction techniques were evaluated for routine use with the micro-colorimetric lipid determination method developed by Van Handel (1985) [2] and recently validated for small samples by Inouye and Lotufo (2006) [1]. With the accelerated solvent extraction method using a chloroform:methanol solvent and the colorimetric lipid determination method, 28 of 30 samples had significant proportional bias (α=1%, determined using standard additions) and 1 of 30 samples had significant constant bias (α=1%, determined using Youden blank measurements). With sonic extraction, 0 of 6 samples had significant proportional bias (α=1%) and 1 of 6 samples had significant constant bias (α=1%). These results demonstrate that the accelerated solvent extraction method with the chloroform:methanol solvent system creates an interference with the colorimetric assay method; without accounting for this bias in the analysis, inaccurate measurements would be obtained. Published by Elsevier B.V.

  13. Development and validation of a cleanup method for hydrocarbon containing samples for the analysis of semivolatile organic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Hoppe, E.W.; Stromatt, R.W.; Campbell, J.A.; Steele, M.J.; Jones, J.E.


    Samples obtained from the Hanford single shell tanks (SSTs) are contaminated with normal paraffin hydrocarbon (NPH) as hydrostatic fluid from the sampling process or can be native to the tank waste. The contamination is usually high enough that a dilution of up to several orders of magnitude may be required before the sample can be analyzed by the conventional gas chromatography/mass spectrometry methodology. This can prevent detection and measurement of organic constituents that are present at lower concentration levels. To eliminate or minimize the problem, a sample cleanup method has been developed and validated and is presented in this document.

  14. Gap-Filling of Landsat 7 Imagery Using the Direct Sampling Method

    KAUST Repository

    Yin, Gaohong


    The failure of the Scan Line Corrector (SLC) on Landsat 7 imposed systematic data gaps on retrieved imagery and removed the capacity to provide spatially continuous fields. While a number of methods have been developed to fill these gaps, most of the proposed techniques are only applicable over relatively homogeneous areas. When they are applied to heterogeneous landscapes, retrieving image features and elements can become challenging. Here we present a gap-filling approach that is based on the adoption of the Direct Sampling multiple-point geostatistical method. The method employs a conditional stochastic resampling of known areas in a training image to simulate unknown locations. The approach is assessed across a range of both homogeneous and heterogeneous regions. Simulation results show that for homogeneous areas, satisfactory results can be obtained by simply adopting non-gap locations in the target image as baseline training data. For heterogeneous landscapes, bivariate simulations using an auxiliary variable acquired at a different date provide more accurate results than univariate simulations, especially as land cover complexity increases. Apart from recovering spatially continuous fields, one of the key advantages of the Direct Sampling method is its relatively straightforward implementation, which relies on few parameters.

  15. Gap-Filling of Landsat 7 Imagery Using the Direct Sampling Method

    Directory of Open Access Journals (Sweden)

    Gaohong Yin


    Full Text Available The failure of the Scan Line Corrector (SLC) on Landsat 7 imposed systematic data gaps on retrieved imagery and removed the capacity to provide spatially continuous fields. While a number of methods have been developed to fill these gaps, most of the proposed techniques are only applicable over relatively homogeneous areas. When they are applied to heterogeneous landscapes, retrieving image features and elements can become challenging. Here we present a gap-filling approach that is based on the adoption of the Direct Sampling multiple-point geostatistical method. The method employs a conditional stochastic resampling of known areas in a training image to simulate unknown locations. The approach is assessed across a range of both homogeneous and heterogeneous regions. Simulation results show that for homogeneous areas, satisfactory results can be obtained by simply adopting non-gap locations in the target image as baseline training data. For heterogeneous landscapes, bivariate simulations using an auxiliary variable acquired at a different date provide more accurate results than univariate simulations, especially as land cover complexity increases. Apart from recovering spatially continuous fields, one of the key advantages of the Direct Sampling method is its relatively straightforward implementation, which relies on few parameters.


    Directory of Open Access Journals (Sweden)

    Amjad Al-Nasser


    Full Text Available In this paper, we propose a new quality control chart for the sample mean based on the folded ranked set sampling (FRSS) method. The new charts are compared with the classical control charts using simple random sampling (SRS) and ranked set sampling (RSS). A simulation study shows that the FRSS control charts have a smaller average run length (ARL) compared with their counterpart charts using SRS and RSS.
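    The ARL criterion used in this comparison can be estimated by simulation. The sketch below does not implement FRSS; it estimates the ARL of a plain Shewhart X̄ chart under simple random sampling with 3-sigma limits (subgroup size, simulation counts, and the 1-sigma shift are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)

def average_run_length(shift=0.0, n=5, sims=1000):
    """Mean number of subgroups until an X-bar point falls outside the
    3-sigma control limits, for a process mean shift given in sigma units."""
    limit = 3.0 / np.sqrt(n)          # limits for the mean of n observations
    runs = []
    for _ in range(sims):
        t = 0
        while True:
            t += 1
            xbar = shift + rng.standard_normal(n).mean()
            if abs(xbar) > limit:     # out-of-control signal
                runs.append(t)
                break
    return float(np.mean(runs))

arl_in_control = average_run_length(0.0)   # theoretically about 370
arl_shifted = average_run_length(1.0)      # a shifted process signals sooner
print(arl_in_control, arl_shifted)
```

Chart variants such as FRSS are then judged by how much they shorten the out-of-control ARL while keeping the in-control ARL long.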

  17. A sample preparation process for LC-MS/MS analysis of total protein drug concentrations in monkey plasma samples with antibody. (United States)

    Ji, Qin C; Rodila, Ramona; El-Shourbagy, Tawakol A


    The determination of protein concentrations in plasma samples often provides essential information in biomedical research, clinical diagnostics, and pharmaceutical discovery and development. Binding assays such as ELISA determine meaningful free-analyte concentrations by using specific antigen or antibody reagents. Concurrently, mass spectrometric technology is becoming a promising complementary method to traditional binding assays. Mass spectrometric assays generally provide measurements of the total protein analyte concentration. However, it was found that antibodies may bind so strongly with the protein analyte that total concentrations cannot be determined. Thus, a sample preparation process was developed which included a novel "denaturing" step to dissociate binding between antibodies and the protein analyte prior to solid phase extraction of plasma samples and LC-MS/MS analysis. In so doing, the total protein analyte concentrations can be obtained. This sample preparation process was further studied by LC-MS analysis with a full mass range scan. It was found that the protein of interest and other plasma peptides were pre-concentrated, while plasma albumin was depleted in the extracts. This capability of the sample preparation process could provide additional advantages in proteomic research for biomarker discovery and validation. The performance of the assay with the novel denaturing step was further evaluated. The linear dynamic range was between 100.9 ng/mL and 53920.0 ng/mL, with a coefficient of determination (r²) ranging from 0.9979 to 0.9997. For LLOQ and ULOQ samples, the inter-assay CVs were 12.6% and 2.7% and the inter-assay mean accuracies were 103.7% and 99.5% of the theoretical concentrations, respectively. For QC samples, the inter-assay CV was between 2.1% and 4.9%, and the inter-assay mean accuracies were between 104.1% and 110.0% of the theoretical concentrations.

  18. A single-blood-sample method using inulin for estimating feline glomerular filtration rate. (United States)

    Katayama, M; Saito, J; Katayama, R; Yamagishi, N; Murayama, I; Miyano, A; Furuhama, K


    Application of a multisample method using inulin to estimate glomerular filtration rate (GFR) in cats is cumbersome. To establish a simplified procedure to estimate GFR in cats, a single-blood-sample method using inulin was compared with a conventional 3-sample method. Nine cats including 6 clinically healthy cats and 3 cats with spontaneous chronic kidney disease. Retrospective study. Inulin was administered as an intravenous bolus at 50 mg/kg to cats, and blood was collected at 60, 90, and 120 minutes later for the 3-sample method. Serum inulin concentrations were colorimetrically determined by an autoanalyzer method. The GFR in the single-blood-sample method was calculated from the dose injected, serum concentration, sampling time, and estimated volume of distribution on the basis of the data of the 3-sample method. An excellent correlation was observed (r = 0.99, P = .0001) between GFR values estimated by the single-blood-sample and 3-sample methods. The single-blood-sample method using inulin provides a practicable and ethical alternative for estimating glomerular filtration rate in cats. Copyright © 2012 by the American College of Veterinary Internal Medicine.
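    A single-sample clearance estimate of the kind described above can be sketched under a one-compartment assumption, C(t) = (dose/Vd)·exp(−(GFR/Vd)·t), solved for GFR. This is a generic pharmacokinetic sketch, not the study's formula, and the numbers below (assumed Vd, measured concentration) are purely illustrative:

```python
import math

def gfr_single_sample(dose_mg, vd_ml, conc_mg_per_ml, t_min):
    """Estimate clearance (GFR, mL/min) from one post-bolus sample,
    assuming one-compartment kinetics: C(t) = (dose/Vd) * exp(-(GFR/Vd)*t)."""
    c0 = dose_mg / vd_ml                       # back-extrapolated C at t = 0
    k = math.log(c0 / conc_mg_per_ml) / t_min  # elimination rate constant, 1/min
    return k * vd_ml                           # clearance = k * Vd

# Illustrative numbers only: 50 mg/kg bolus in a 4 kg cat, an assumed
# volume of distribution of 1000 mL, and 0.0902 mg/mL measured at 90 min.
print(round(gfr_single_sample(200.0, 1000.0, 0.0902, 90.0), 1))
```

In the study, Vd is not assumed but estimated from the 3-sample data, which is what makes the single-sample shortcut workable.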

  19. Evaluation of sampling methods for the detection of Salmonella in broiler flocks

    DEFF Research Database (Denmark)

    Skov, Marianne N.; Carstensen, B.; Tornoe, N.


    The present study compares four different sampling methods potentially applicable to the detection of Salmonella in broiler flocks, based on collection of faecal samples (i) by hand, 300 fresh faecal samples (ii) absorbed on five sheets of paper (iii) absorbed on five pairs of socks (elastic cotton...... horizontal or vertical) were found in the investigation. The results showed that the sock method (five pairs of socks) had a sensitivity comparable with the hand collection method (60 pools of five faecal samples); the paper collection method was inferior, as was the use of only one pair of socks. Estimation...


    Directory of Open Access Journals (Sweden)

    Simona Kunová, Miroslava Kačániová


    Full Text Available The aim of this article was to assess disinfection in a selected meat-processing plant by two methods: a swabbing method and 3M™ Petrifilm™ plates. Samples were collected over three months (September, October, November 2014). In total, 54 samples from selected surfaces were collected. Each month, six samples were collected by the imprint method with Petrifilm plates for the total count of microorganisms, six samples by the imprint method with Petrifilm plates for the number of coliforms, and six samples by the sterile swab method. Samples were collected from the workbench, conveyor belt, cutting blades, meat grinder, wall, and knife. Total viable counts (TVC) and coliform bacteria (CB) were determined in the samples. In September, the TVC values obtained by the swabbing method were higher than the allowable limit in sample no. 1 (3.70×10²) and in sample no. 4 (3.35×10¹). The TVC values obtained by Petrifilm plates were lower than 10 in all samples. The CB values obtained by Petrifilm plates were lower than 1 in all samples; the CB value obtained by the swabbing method was 1.6×10¹ in sample no. 6. In October, the TVC values obtained by Petrifilm plates and by the swabbing method were higher than the permissible limit (<10) in samples no. 2 and no. 4. The CB values obtained by Petrifilm plates were lower than 1 in all samples; the CB value obtained by the swabbing method was 3.65×10¹ in sample no. 3, which did not meet the requirements of the internal company standards. In November, the TVC values obtained by Petrifilm plates were lower than 10 in all samples, whereas the TVC values obtained by the swab method were 1.25×10¹ in sample no. 3 and 3.25×10¹ in sample no. 4; these samples were not in accordance with the requirements of the internal company standards. The CB values obtained in November by Petrifilm plates and by the swabbing method were, after disinfection, lower than 1 in all

  1. Contamination Controls for Analysis of Root Canal Samples by Molecular Methods: An Overlooked and Unsolved Problem. (United States)

    Figdor, David; Brundin, Malin


    It has been almost 20 years since molecular methods were first described for the analysis of the root canal microbial flora. Contamination control samples are essential to establish DNA decontamination before taking root canal samples, and this review assessed the studies that used them. Using PubMed, a search was conducted for studies using molecular microbial analysis for the investigation of endodontic samples. Studies were grouped according to the cleaning protocol, acquisition methods, and processing of control samples taken to check for contamination. Of 136 studies applying molecular analysis to root canal samples, 21 performed surface cleaning and checked for nucleotide contamination with control samples processed by polymerase chain reaction. Only 1 study described disinfection, sampling from the access cavity, processing by polymerase chain reaction, and the result: all samples contained contaminating bacterial DNA. Cleaning, disinfection, and checking for contamination are basic scientific prerequisites for this type of investigation; yet, this review identifies them as an overlooked issue. On the basis of this review, we call for improved scientific practice in this field. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.


    Directory of Open Access Journals (Sweden)

    Milan Radenkovic


    Full Text Available In this paper, the Kaikaku method is presented: its essence, principles, and the ways it can be implemented in real systems, with the main point being how the Kaikaku method influences quality. A practical example from the furniture industry illustrates one way to implement the Kaikaku method and how it influences quality improvement of the production process.

  3. Methods of sampling airborne fungi in working environments of waste treatment facilities

    Directory of Open Access Journals (Sweden)

    Kristýna Černá


    Full Text Available Objectives: The objective of the present study was to evaluate and compare the efficiency of a filter-based sampling method and a high-volume sampling method for sampling airborne culturable fungi present in waste sorting facilities. Material and Methods: The membrane filters (MF) method was compared with the surface air system (SAS) method. The selected sampling methods were modified and tested in 2 plastic waste sorting facilities. Results: The total number of colony-forming units (CFU/m³) of airborne fungi was dependent on the type of sampling device, on the time of sampling, which was carried out every hour from the beginning of the work shift, and on the type of cultivation medium (p < 0.001). Detected concentrations of airborne fungi ranged over 2×10²–1.7×10⁶ CFU/m³ when using the MF method, and 3×10²–6.4×10⁴ CFU/m³ when using the SAS method. Conclusions: Both methods showed comparable sensitivity to the fluctuations of the concentrations of airborne fungi during the work shifts. The SAS method is adequate for a fast indicative determination of the concentration of airborne fungi. The MF method is suitable for a thorough assessment of working environment contamination by airborne fungi. We therefore recommend the MF method for the implementation of a uniform standard methodology of airborne fungi sampling in working environments of waste treatment facilities.

  4. A Variable Sampling Interval Synthetic Xbar Chart for the Process Mean


    Lee, Lei Yong; Khoo, Michael Boon Chong; Teh, Sin Yin; Lee, Ming Ha


    The usual practice when using a control chart to monitor a process is to take samples from the process with a fixed sampling interval (FSI). In this paper, a synthetic X̄ control chart with a variable sampling interval (VSI) feature is proposed for monitoring changes in the process mean. The VSI synthetic X̄ chart integrates the VSI X̄ chart and the VSI conforming run length (CRL) chart. The proposed VSI synthetic X̄ chart is evaluated using the average time to signal (ATS) criterion. The o...

  5. A Fast Multiple Sampling Method for Low-Noise CMOS Image Sensors With Column-Parallel 12-bit SAR ADCs

    Directory of Open Access Journals (Sweden)

    Min-Kyu Kim


    Full Text Available This paper presents a fast multiple sampling method for low-noise CMOS image sensor (CIS) applications with column-parallel successive approximation register analog-to-digital converters (SAR ADCs). The 12-bit SAR ADC using the proposed multiple sampling method decreases the A/D conversion time by repeatedly converting a pixel output to 4 bits after the first 12-bit A/D conversion, reducing the noise of the CIS by one over the square root of the number of samplings. The area of the 12-bit SAR ADC is reduced by using a 10-bit capacitor digital-to-analog converter (DAC) with four scaled reference voltages. In addition, a simple up/down counter-based digital processing logic is proposed to perform the complex calculations required for multiple sampling and digital correlated double sampling. To verify the proposed multiple sampling method, a 256 × 128 pixel array CIS with 12-bit SAR ADCs was fabricated in a 0.18 μm CMOS process. The measurement results show that the proposed multiple sampling method reduces each A/D conversion time from 1.2 μs to 0.45 μs and random noise from 848.3 μV to 270.4 μV, achieving a dynamic range of 68.1 dB and an SNR of 39.2 dB.
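    The one-over-square-root-of-n noise reduction claimed for multiple sampling is just the statistics of averaging independent readings, which a short simulation makes concrete. This is a generic sketch, not a model of the SAR ADC itself; the 848.3 μV figure from the abstract is reused only as an illustrative single-sample noise level:

```python
import numpy as np

rng = np.random.default_rng(0)
signal = 0.5          # ideal pixel output, arbitrary units (V)
sigma = 848.3e-6      # assumed single-conversion random noise (V)

def noise_after_averaging(n_samples, trials=20000):
    # Average n noisy conversions per trial; return the spread across trials.
    readings = signal + sigma * rng.standard_normal((trials, n_samples))
    return readings.mean(axis=1).std()

for n in (1, 4, 16):
    # The measured spread shrinks roughly as 1/sqrt(n).
    print(n, noise_after_averaging(n))
```

Averaging 16 conversions cuts the random noise by about a factor of 4, at the cost of conversion time, which is the trade-off the fast 4-bit re-conversion scheme addresses.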

  6. Comparison of preprocessing methods and storage times for touch DNA samples. (United States)

    Dong, Hui; Wang, Jing; Zhang, Tao; Ge, Jian-Ye; Dong, Ying-Qiang; Sun, Qi-Fan; Liu, Chao; Li, Cai-Xia


    The aims were to select appropriate preprocessing methods for different substrates by comparing the effects of four preprocessing methods on touch DNA samples, and to determine the effect of various storage times on the results of touch DNA sample analysis. Hand touch DNA samples were used to investigate the detection and inspection results of DNA on different substrates. Four preprocessing methods, including the direct cutting method, the stubbing procedure, the double swab technique, and the vacuum cleaner method, were used in this study. DNA was extracted from mock samples with the four preprocessing methods, and the best-performing protocol was further used to compare performance after various storage times. DNA extracted from all samples was quantified and amplified using standard procedures. The amounts of DNA and the numbers of alleles detected on the porous substrates were greater than those on the non-porous substrates. The performance of the four preprocessing methods varied with the substrate: the direct cutting method displayed advantages for porous substrates, and the vacuum cleaner method was advantageous for non-porous substrates. No significant degradation trend was observed as the storage time increased. Different substrates require different preprocessing methods in order to obtain the highest DNA amount and allele number from touch DNA samples. This study provides a theoretical basis for the exploration of touch DNA samples and may be used as a reference when dealing with touch DNA samples in casework.

  7. Misrepresentation of hydro-erosional processes in rainfall simulations using disturbed soil samples (United States)

    Thomaz, Edivaldo L.; Pereira, Adalberto A.


    Interrill erosion is a primary soil erosion process which consists of soil detachment by raindrop impact and particle transport by shallow flow. Interrill erosion affects other soil erosion sub-processes, e.g., water infiltration, sealing, crusting, and rill initiation. It has been widely studied in laboratories, and the use of sieved, i.e., disturbed, soil has become a standard method in laboratory experiments. The aims of our study were to evaluate the hydro-erosional response of undisturbed and disturbed soils in a laboratory experiment, and to quantify the extent to which hydraulic variables change during a rainstorm. We used a splash pan of 0.3 m width, 0.45 m length, and 0.1 m depth. A rainfall simulation of 58 mm h⁻¹ lasting 30 min was conducted on seven replicates of undisturbed and disturbed soils. During the experiment, several hydro-physical parameters were measured, including splashed sediment, mean particle size, runoff, water infiltration, and soil moisture. We conclude that the use of disturbed soil samples results in overestimation of interrill processes. Of the nine assessed parameters, four displayed greater responses in the undisturbed soil: infiltration, topsoil shear strength, mean particle size of eroded particles, and soil moisture. In the disturbed soil, five assessed parameters displayed greater responses: wash sediment, final runoff coefficient, runoff, splash, and sediment yield. Therefore, contextual soil properties are most suitable for understanding soil erosion, as well as for defining soil erodibility.

  8. Empirically simulated study to compare and validate sampling methods used in aerial surveys of wildlife populations

    NARCIS (Netherlands)

    Khaemba, W.M.; Stein, A.; Rasch, D.; Leeuw, de J.; Georgiadis, N.


    This paper compares the distribution, sampling and estimation of abundance for two animal species in an African ecosystem by means of an intensive simulation of the sampling process under a geographical information system (GIS) environment. It focuses on systematic and random sampling designs,

  9. A Novel Method of Failure Sample Selection for Electrical Systems Using Ant Colony Optimization. (United States)

    Xiong, Jian; Tian, Shulin; Yang, Chenglin; Liu, Cheng


    The influence of failure propagation is ignored in failure sample selection based on the traditional testability demonstration experiment method. Traditional failure sample selection generally omits some failures during selection, which can pose serious risks in use because these omitted failures may lead to severe propagation failures. This paper proposes a new failure sample selection method to solve this problem. First, the method uses a directed graph and ant colony optimization (ACO) to obtain a subsequent failure propagation set (SFPS) based on a failure propagation model; we then propose a new failure sample selection method on the basis of the size of the SFPS. Compared with the traditional sampling plan, this method improves the coverage of tested failure samples, increases diagnostic capacity, and decreases the risk of use.
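    The core data structure here is the failure-propagation digraph, in which a failure's SFPS is the set of failures reachable from it. The sketch below substitutes a plain depth-first traversal for the paper's ACO search, on a small hypothetical graph, purely to illustrate the idea of ranking candidate failure samples by SFPS size:

```python
# Hypothetical failure-propagation edges: Fi -> failures it can trigger.
propagation = {
    "F1": ["F2", "F3"],
    "F2": ["F4"],
    "F3": ["F4", "F5"],
    "F4": [],
    "F5": [],
}

def sfps(failure):
    """Subsequent failure propagation set: all failures reachable from
    `failure` in the digraph (iterative depth-first search)."""
    seen, stack = set(), [failure]
    while stack:
        for nxt in propagation.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# Prioritize test samples by SFPS size: widely propagating failures first.
ranked = sorted(propagation, key=lambda f: len(sfps(f)), reverse=True)
print(ranked[0], sfps("F1"))
```

Selecting samples with large SFPSs is what keeps the dangerous, far-propagating failures from being omitted from the demonstration experiment.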

  10. Understanding scaling through history-dependent processes with collapsing sample space. (United States)

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan


    History-dependent processes are ubiquitous in natural and social systems. Many such stochastic processes, especially those that are associated with complex systems, become more constrained as they unfold, meaning that their sample space, or their set of possible outcomes, reduces as they age. We demonstrate that these sample-space-reducing (SSR) processes necessarily lead to Zipf's law in the rank distributions of their outcomes. We show that by adding noise to SSR processes the corresponding rank distributions remain exact power laws, p(x) ∝ x^(−λ), where the exponent directly corresponds to the mixing ratio of the SSR process and noise. This allows us to give a precise meaning to the scaling exponent in terms of the degree to which a given process reduces its sample space as it unfolds. Noisy SSR processes further allow us to explain a wide range of scaling exponents in frequency distributions, ranging from α = 2 to ∞. We discuss several applications showing how SSR processes can be used to understand Zipf's law in word frequencies, and how they are related to diffusion processes in directed networks and to aging processes such as fragmentation. SSR processes provide a new alternative for understanding the origin of scaling in complex systems without recourse to multiplicative, preferential, or self-organized critical processes.
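    The SSR mechanism is simple enough to simulate in a few lines: start at state N, repeatedly jump uniformly to a strictly lower state (the shrinking sample space), and tally visits. Visit frequencies then fall off as 1/rank, i.e., Zipf's law. The state count and number of runs below are arbitrary choices for the sketch:

```python
import random
from collections import Counter

random.seed(42)
N = 1000
counts = Counter()

# One SSR run: from N, jump uniformly to a lower state until 1 is reached,
# so the sample space shrinks at every step.
for _ in range(20000):
    x = N
    while x > 1:
        x = random.randint(1, x - 1)
        counts[x] += 1

# Visit probability of state i approaches 1/i, so counts[1]/counts[10] ≈ 10.
print(counts[1] / counts[10])
```

The noisy variant in the abstract mixes these constrained jumps with unconstrained uniform draws, which tunes the exponent λ away from 1.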

  11. [Comparison of the designing effects (DE) among different designs related to complex sampling methods]. (United States)

    Wang, Jian-Sheng; Feng, Guo-Shuang; Yu, Shi-Cheng; Ma, Lin-Mao; Zhou, Mai-Geng; Liu, Shi-Yao


    To compare the design effects (DE) among different complex sampling design programs. Data from the '2002 Chinese Nutrition and Health Survey' were used to generate the sampling population, and statistical simulation was used to estimate the DE values of six complex sampling design programs. The DE values were found to vary among the six programs, and were positively associated with sample size, with the number of sampling stages, and with coarser stratification. Reducing the number of sampling stages and refining the stratification categories could decrease the DE values and thereby improve the design efficiency.
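    The design effect being compared is the ratio of the variance of an estimator under the complex design to its variance under simple random sampling of the same size. A minimal simulation sketch (the population structure, cluster counts, and sample sizes are invented for illustration, not taken from the survey):

```python
import numpy as np

rng = np.random.default_rng(1)

# Clustered population: 200 clusters of 50, with a shared cluster effect,
# so observations within a cluster are correlated.
cluster_means = rng.normal(0.0, 1.0, 200)
pop = (cluster_means[:, None] + rng.normal(0.0, 1.0, (200, 50))).ravel()

def var_of_mean(sampler, reps=4000):
    # Monte Carlo variance of the sample mean under a given design.
    return np.var([sampler().mean() for _ in range(reps)])

def srs_sample():                  # simple random sample, n = 100
    return rng.choice(pop, 100, replace=False)

def cluster_sample():              # two whole clusters, also n = 100
    ids = rng.choice(200, 2, replace=False)
    return pop.reshape(200, 50)[ids].ravel()

deff = var_of_mean(cluster_sample) / var_of_mean(srs_sample)
print(deff)   # well above 1: clustering inflates variance at equal n
```

A DE of d means the complex design needs roughly d times the SRS sample size for the same precision, which is why shaving stages off the design pays.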

  12. Simulated Tempering Distributed Replica Sampling, Virtual Replica Exchange, and Other Generalized-Ensemble Methods for Conformational Sampling. (United States)

    Rauscher, Sarah; Neale, Chris; Pomès, Régis


    Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.
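    All of the RE-family methods compared above share the same Metropolis criterion for exchanging configurations between temperatures. The sketch below shows only that acceptance rule, with k set to 1 and illustrative energies; it is not an implementation of STDR or VREX:

```python
import math, random

random.seed(0)

def swap_accepted(e_i, e_j, t_i, t_j):
    """Metropolis criterion for exchanging the configurations held at two
    temperatures: p = min(1, exp((1/T_i - 1/T_j) * (E_i - E_j))), with k = 1."""
    delta = (1.0 / t_i - 1.0 / t_j) * (e_i - e_j)
    return delta >= 0 or random.random() < math.exp(delta)

# A higher-energy configuration at the cold replica is always handed to the
# hot one; the resulting walk in temperature carries it over energy barriers.
print(swap_accepted(e_i=-90.0, e_j=-100.0, t_i=1.0, t_j=2.0))
```

ST-based variants apply the analogous criterion to a single walker changing its own temperature, which is what removes the need for synchronized replica pairs.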

  13. A standardized method for sampling and extraction methods for quantifying microplastics in beach sand. (United States)

    Besley, Aiken; Vijver, Martina G; Behrens, Paul; Bosker, Thijs


    Microplastics are ubiquitous in the environment, are frequently ingested by organisms, and may potentially cause harm. A range of studies have found significant levels of microplastics in beach sand. However, there is a considerable amount of methodological variability among these studies. Methodological variation currently limits comparisons as there is no standard procedure for sampling or extraction of microplastics. We identify key sampling and extraction procedures across the literature through a detailed review. We find that sampling depth, sampling location, number of repeat extractions, and settling times are the critical parameters of variation. Next, using a case-study we determine whether and to what extent these differences impact study outcomes. By investigating the common practices identified in the literature with the case-study, we provide a standard operating procedure for sampling and extracting microplastics from beach sand. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Individual and pen-based oral fluid sampling: A welfare-friendly sampling method for group-housed gestating sows. (United States)

    Pol, Françoise; Dorenlor, Virginie; Eono, Florent; Eudier, Solveig; Eveno, Eric; Liégard-Vanhecke, Dorine; Rose, Nicolas; Fablet, Christelle


The aims of this study were to assess the feasibility of individual and pen-based oral fluid sampling (OFS) in 35 pig herds with group-housed sows, compare these methods to blood sampling, and assess the factors influencing the success of sampling. Individual samples were collected from at least 30 sows per herd. Pen-based OFS was performed using devices placed in at least three pens for 45 min. Information related to the farm, the sows, and their living conditions was collected. Factors significantly associated with the duration of sampling and the chewing behaviour of sows were identified by logistic regression. Individual OFS took 2 min 42 s on average; the type of floor, swab size, and operator were associated with a sampling time >2 min. Pen-based OFS was obtained from 112 devices (62.2%). The type of floor, parity, pen-level activity, and type of feeding were associated with chewing behaviour. Pen activity was associated with the latency to interact with the device. The type of floor, gestation stage, parity, group size, and latency to interact with the device were associated with a chewing time >10 min. After 15, 30 and 45 min of pen-based OFS, 48%, 60% and 65% of the sows were lying down, respectively. The time elapsed since the beginning of sampling, genetic type, and time elapsed since the last meal were associated with 50% of the sows lying down at one time point. The mean time to blood-sample the sows was 1 min 16 s, or 2 min 52 s if the number of operators required was factored into the sampling time. The genetic type, parity, and type of floor were significantly associated with a sampling time longer than 1 min 30 s. This study shows that individual OFS is easy to perform in group-housed sows by a single operator, even though straw-bedded animals take longer to sample than animals housed on slatted floors, and suggests some guidelines to optimise pen-based OFS success. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Estimation method for mathematical expectation of continuous variable upon ordered sample


    Domchenkov, O. A.


A method for estimating the mathematical expectation of a continuous variable based on analysis of the ordered sample is proposed. The method allows this class of estimates to be extended to nonlinear estimation classes.
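The paper's specific nonlinear estimator is not given in this record. As a hedged illustration, estimators built from an ordered sample are commonly L-estimators, i.e. weighted combinations of order statistics; equal weights recover the sample mean, and trimming gives a robust nonlinear variant. The helper names below are illustrative:

```python
def l_estimate(sample, weights=None):
    """Estimate E[X] as a weighted combination of order statistics
    (an L-estimator). Equal weights reproduce the sample mean."""
    xs = sorted(sample)
    n = len(xs)
    if weights is None:
        weights = [1.0 / n] * n
    return sum(w * x for w, x in zip(weights, xs))

def trimmed_mean(sample, trim=0.1):
    """Robust variant: drop the lowest and highest `trim` fraction of
    the ordered sample before averaging."""
    xs = sorted(sample)
    k = int(len(xs) * trim)
    core = xs[k:len(xs) - k] if k else xs
    return sum(core) / len(core)
```

With an outlier-laden sample such as `[1, 2, 3, 4, 100]`, the trimmed mean discards the extremes and returns an estimate driven by the central order statistics.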

  16. Methods of sampling airborne fungi in working environments of waste treatment facilities. (United States)

    Černá, Kristýna; Wittlingerová, Zdeňka; Zimová, Magdaléna; Janovský, Zdeněk


The objective of the present study was to evaluate and compare the efficiency of a filter-based sampling method and a high-volume sampling method for sampling airborne culturable fungi present in waste sorting facilities. The membrane filter (MF) method was compared with the surface air system (SAS) method. The selected sampling methods were modified and tested in two plastic waste sorting facilities. The total number of colony-forming units (CFU)/m³ of airborne fungi depended on the type of sampling device, on the time of sampling (carried out every hour from the beginning of the work shift), and on the type of cultivation medium. Concentrations of airborne fungi ranged from 2×10² to 1.7×10⁶ CFU/m³ with the MF method and from 3×10² to 6.4×10⁴ CFU/m³ with the SAS method. Both methods showed comparable sensitivity to the fluctuations of the concentrations of airborne fungi during the work shifts. The SAS method is adequate for a fast, indicative determination of the concentration of airborne fungi. The MF method is suitable for a thorough assessment of working-environment contamination by airborne fungi. We therefore recommend the MF method for the implementation of a uniform standard methodology of airborne fungi sampling in working environments of waste treatment facilities. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.

  17. [Evaluation of usefulness of different methods for detection of Cryptosporidium in human and animal stool samples]. (United States)

    Werner, Anna; Sulima, Paweł; Majewska, Anna C


There are many methods for the detection of Cryptosporidium oocysts. Most of them (more than 20) enable microscopic detection of Cryptosporidium oocysts in faecal smears. Such a variety of diagnostic methods may cause confusion when a laboratory has to choose an appropriate technique. This study evaluated the diagnostic usefulness of Cryptosporidium oocyst and coproantigen detection methods in the diagnosis of cryptosporidiosis in humans (266 stool specimens) and animals (205 from cattle, 160 from sheep, 30 from horses, 80 from cats, 227 from dogs and 11 from wild animals). The total numbers of human and animal stool specimens processed were 266 and 713, respectively. In this study the usefulness of several diagnostic methods was compared. The following techniques were taken into account: wet mounts, hematoxylin staining, four different specific staining methods (modified Ziehl-Neelsen, Kinyoun's, safranin-methylene blue, and carbol-methyl violet with tartrazine) and a commercially available kit based on enzyme-linked immunoassay (ProspecT Cryptosporidium Microplate Assay). The final number of positive specimens was 123, of which 77 were positive in all specific methods. The oocysts found in stool specimens were measured. Humans were infected with C. parvum and animals with C. parvum, C. andersoni or C. felis. The statistical analysis showed that the EIA test was better than microscopic methods for the identification of Cryptosporidium in faecal samples from humans and wild animals. Sensitivity and specificity are important factors in the choice of a proper diagnostic method for Cryptosporidium detection; however, other factors such as cost, simplicity and ease of interpretation of results are also important considerations.

  18. A process variant modeling method comparison : Experience report

    NARCIS (Netherlands)

    Aysolmaz, Banu; Yaldiz, Ali; Reijers, Hajo


    Various process variant modeling methods have been introduced in the literature to manage process diversity in a business context. In industrial settings, it is difficult to select a method suitable for the needs and limitations of the organization due to the limited number of examples and

  19. Comparative Study of Different Processing Methods for the ...

    African Journals Online (AJOL)



    Jun 7, 2014 ... wheat alternative for celiac disease patients. (FAO/WHO, 1991). Owing to the presence of the cyanogenic glycoside in cassava, various processing methods are employed to bring about a reduction in the toxicity of the roots. Studies on a wide variety of traditional cassava processing methods have been.

  20. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools...

  1. An improved adaptive sampling and experiment design method for aerodynamic optimization

    Directory of Open Access Journals (Sweden)

    Huang Jiangtao


Full Text Available An experiment design method is key to constructing a highly reliable surrogate model for numerical optimization in large-scale projects. Within such a method, the experimental design criterion directly affects the accuracy of the surrogate model and the efficiency of the optimization. To address the shortcomings of traditional experimental design, an improved adaptive sampling method is proposed in this paper. The surrogate model is first constructed from basic sparse samples. Supplementary sampling positions are then detected according to specified criteria, which introduce energy-function and curvature sampling criteria based on a radial basis function (RBF) network. The sampling detection criteria consider both the uniformity of the sample distribution and the description of the hypersurface curvature, so as to significantly improve the prediction accuracy of the surrogate model with far fewer samples. For a surrogate model constructed with sparse samples, sample uniformity is an important factor for interpolation accuracy in the initial stage of adaptive sampling and surrogate model training. As uniformity improves, the curvature description of the objective function surface gradually becomes more important. In consideration of these issues, a crowdness enhance function and a root mean square error (RMSE) feedback function are introduced into the criterion expression. Thus, a new sampling method called RMSE and crowdness enhance (RCE) adaptive sampling is established. The validity of the RCE adaptive sampling method is studied first on typical test functions and then on an airfoil/wing aerodynamic optimization design problem with a high-dimensional design space. The results show that the RCE adaptive sampling method not only reduces the required number of samples, but also effectively improves the prediction accuracy of the surrogate model, giving it broad prospects for application.
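The RCE criterion itself cannot be reconstructed from this abstract. The sketch below only illustrates the general idea it builds on, for an assumed toy 1-D objective: refit an RBF surrogate, then place the next sample where a combined uniformity (distance-to-nearest-sample) and curvature score is largest. The function names and the scoring formula are assumptions, not the paper's method:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def objective(x):
    # cheap analytic function standing in for an expensive CFD evaluation
    return np.sin(3.0 * x) + 0.5 * x

def adaptive_sample(n_init=5, n_add=10, lo=0.0, hi=3.0):
    """Greedy adaptive sampling sketch: at each step, refit an RBF
    surrogate and add the candidate point with the highest
    uniformity-times-curvature score."""
    x = np.linspace(lo, hi, n_init)
    surrogate = None
    for _ in range(n_add):
        y = objective(x)
        surrogate = RBFInterpolator(x[:, None], y)
        cand = np.linspace(lo, hi, 201)
        pred = surrogate(cand[:, None])
        # curvature proxy: |second finite difference| of the surrogate
        curv = np.abs(np.gradient(np.gradient(pred, cand), cand))
        # uniformity proxy: distance to the nearest existing sample
        dist = np.min(np.abs(cand[:, None] - x[None, :]), axis=1)
        score = dist * (1.0 + curv)
        x = np.sort(np.append(x, cand[np.argmax(score)]))
    return x, surrogate
```

Existing sample locations score zero (their nearest-sample distance vanishes), so the loop never duplicates a point; as the design fills in, the distance term evens out and the curvature term steers refinement toward the wiggly regions of the surface, mirroring the two-stage behaviour the abstract describes.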

  2. Process control and optimization with simple interval calculation method

    DEFF Research Database (Denmark)

    Pomerantsev, A.; Rodionova, O.; Høskuldsson, Agnar


Methods of process control and optimization are presented and illustrated with a real-world example. The optimization methods are based on the PLS block modeling as well as on the simple interval calculation methods of interval prediction and object status classification. It is proposed to employ...... for the quality improvement in the course of production. The latter is an active quality optimization, which takes into account the actual history of the process. The advocated approach is allied to the conventional method of multivariate statistical process control (MSPC) as it also employs the historical process...... the series of expanding PLS/SIC models in order to support the on-line process improvements. This method helps to predict the effect of planned actions on the product quality and thus enables passive quality control. We have also considered an optimization approach that proposes the correcting actions...

  3. Methods and devices for hyperpolarising and melting NMR samples in a cryostat

    DEFF Research Database (Denmark)

    Ardenkjaer-Larsen, Jan Henrik; Axelsson, Oskar H. E.; Golman, Klaes Koppel


    The present invention relates to devices and method for melting solid polarised sample while retaining a high level of polarisation. In an embodiment of the present invention a sample is polarised in a sample-retaining cup 9 in a strong magnetic field in a polarising means 3a, 3b, 3c in a cryosta...

  4. Optical Methods for Identifying Hard Clay Core Samples During Petrophysical Studies (United States)

    Morev, A. V.; Solovyeva, A. V.; Morev, V. A.


    X-ray phase analysis of the general mineralogical composition of core samples from one of the West Siberian fields was performed. Electronic absorption spectra of the clay core samples with an added indicator were studied. The speed and availability of applying the two methods in petrophysical laboratories during sample preparation for standard and special studies were estimated.

  5. Quantifying Uncertainties from Presence Data Sampling Methods for Species Distribution Modeling: Focused on Vegetation. (United States)

    Sung, S.; Kim, H. G.; Lee, D. K.; Park, J. H.; Mo, Y.; Kil, S.; Park, C.


The impact of climate change has been observed throughout the globe. Ecosystems experience rapid changes such as vegetation shifts and species extinctions. In this context, the Species Distribution Model (SDM) is one of the most popular methods for projecting the impact of climate change on ecosystems. An SDM is based on the niche of a given species, which means that presence point data are essential for finding the biological niche of that species. Running an SDM for plants requires certain considerations regarding the characteristics of vegetation. Normally, remote sensing techniques are used to produce vegetation data over large areas. As a consequence, the exact location of a presence point carries high uncertainty, since presence data are selected from polygon and raster datasets. Thus, sampling methods for modeling vegetation presence data should be carefully selected. In this study, we used three different sampling methods for the selection of vegetation presence data: random sampling, stratified sampling and site-index-based sampling. We used the R package BIOMOD2 to assess uncertainty from modeling. At the same time, we included BioCLIM variables and other environmental variables as input data. As a result, despite differences among the 10 SDMs, the sampling methods showed differences in ROC values: random sampling showed the lowest ROC value, while site-index-based sampling showed the highest. This study shows that the uncertainties arising from presence data sampling methods and SDMs can be quantified.
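As a minimal illustration of the first two strategies compared above (the site-index-based variant would additionally need site-quality data), presence cells can be drawn either completely at random or stratum by stratum; `cells` and `strata` below are hypothetical inputs, not the study's data:

```python
import random

def random_sampling(cells, n, seed=0):
    """Simple random sample of n presence cells."""
    rng = random.Random(seed)
    return rng.sample(cells, n)

def stratified_sampling(cells, strata, n_per_stratum, seed=0):
    """Draw an equal number of presence cells from each stratum
    (e.g. an elevation band or vegetation class per cell)."""
    rng = random.Random(seed)
    picked = []
    for s in sorted(set(strata)):
        pool = [c for c, cs in zip(cells, strata) if cs == s]
        picked.extend(rng.sample(pool, min(n_per_stratum, len(pool))))
    return picked
```

Stratified draws guarantee every environmental class contributes presence points, which is one plausible reason a stratified or site-aware design can outperform purely random selection in the ROC comparison reported above.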

  6. Method of treating contaminated HEPA filter media in pulp process (United States)

    Hu, Jian S.; Argyle, Mark D.; Demmer, Ricky L.; Mondok, Emilio P.


    A method for reducing contamination of HEPA filters with radioactive and/or hazardous materials is described. The method includes pre-processing of the filter for removing loose particles. Next, the filter medium is removed from the housing, and the housing is decontaminated. Finally, the filter medium is processed as pulp for removing contaminated particles by physical and/or chemical methods, including gravity, flotation, and dissolution of the particles. The decontaminated filter medium is then disposed of as non-RCRA waste; the particles are collected, stabilized, and disposed of according to well known methods of handling such materials; and the liquid medium in which the pulp was processed is recycled.

  7. New Principles of Process Control in Geotechnics by Acoustic Methods

    Directory of Open Access Journals (Sweden)

    Leššo, I.


Full Text Available The contribution describes a new solution for the control of the rotary drilling process, an elementary process in geotechnics. The article presents the first results of research on the use of acoustic methods for process identification in the optimal control of rotary drilling.

  8. Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods (United States)

    Soroush, Masoud; Weinberger, Charles B.


    This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…

  9. Effects of Processing Methods on Some Phytochemicals Present in ...

    African Journals Online (AJOL)

    The effects of extrusion cooking, roasting, aqueous and alkaline thermal process on the reduction of the levels of some phytochemicals in the seeds of white lupin (Lupinus albus L.) were studied. Aqueous thermal process was found to be better than the other processing methods in reducing the concentration of phytic acid.

  10. Effect of thermal processing methods on the proximate composition ...

    African Journals Online (AJOL)

    The nutritive value of raw and thermal processed castor oil seed (Ricinus communis) was investigated using the following parameters; proximate composition, gross energy, mineral constituents and ricin content. Three thermal processing methods; toasting, boiling and soaking-and-boiling were used in the processing of the ...

  11. Effects of processing methods on the antinutrional factor and the ...

    African Journals Online (AJOL)

    The effect of processing on phytic acid (PA) reduction and nutritional composition of sesame seed was investigated. Raw sesame seed (RASS) was compared with seeds processed by three different methods: roasted (ROSS), boiled (BOSS) and soaked (SOSS) sesame seeds. Processing had no significant (P>0.05) effects ...

  12. Stability of arsenic compounds in seafood samples during processing and storage by freezing

    DEFF Research Database (Denmark)

    Dahl, Lisbeth; Molin, Marianne; Amlund, Heidi


    was observed after processing or after storage by freezing. The content of tetramethylarsonium ion was generally low in all samples types, but increased significantly in all fried samples of both fresh and frozen seafood. Upon storage by freezing, the arsenobetaine content was reduced significantly, but only...

  13. Universal nucleic acids sample preparation method for cells, spores and their mixture (United States)

    Bavykin, Sergei [Darien, IL


The present invention relates to a method for extracting nucleic acids from biological samples. More specifically, the invention relates to a universal method for extracting nucleic acids from unidentified biological samples. An advantage of the presently invented method is its ability to effectively and efficiently extract nucleic acids from a variety of different cell types, including but not limited to prokaryotic or eukaryotic cells and/or recalcitrant organisms (i.e. spores). Unlike prior art methods, which are focused on extracting nucleic acids from vegetative cells or spores, the present invention effectively extracts nucleic acids from spores, multiple cell types or mixtures thereof using a single method. Importantly, the invented method has demonstrated an ability to extract nucleic acids from spores and vegetative bacterial cells with similar levels of effectiveness. The invented method employs a multi-step protocol which erodes the cell structure of the biological sample, then isolates, labels and fragments the nucleic acids, and finally purifies the labeled samples from excess dye.

  14. A Method for Microalgae Proteomics Analysis Based on Modified Filter-Aided Sample Preparation. (United States)

    Li, Song; Cao, Xupeng; Wang, Yan; Zhu, Zhen; Zhang, Haowei; Xue, Song; Tian, Jing


With the fast development of microalgal biofuel research, proteomics studies of microalgae have increased quickly. Filter-aided sample preparation (FASP) has been a widely used proteomics sample preparation method since 2009. Here, a method for microalgae proteomics analysis based on modified filter-aided sample preparation (mFASP) is described, designed to suit the characteristics of microalgal cells and to eliminate the error caused by over-alkylation. Using Chlamydomonas reinhardtii as the model, the prepared sample was tested by standard LC-MS/MS and compared with previous reports. The results showed that mFASP is suitable for most occasions in microalgae proteomics studies.

  15. Method and device for detecting a similarity in the shape of sampled signals

    NARCIS (Netherlands)

    Coenen, A.J.R.


    Method for detecting a similarity in shape between a first and a second sampled signal fragment. The method comprises the steps of: formation of a first fragment function from the first sampled signal fragment by means of inverse interpolation, definition of a domain interval, for example the time

  16. The mouthwash : A non-invasive sampling method to study cytokine gene polymorphisms

    NARCIS (Netherlands)

    Laine, ML; Farre, MA; Crusius, JBA; van Winkelhoff, AJ; Pena, AS

    Background: We describe a simple, non-invasive mouthwash sampling method for rapid DNA isolation to detect cytokine gene polymorphisms. In the present paper, interleukin-1 beta (IL-1B) and interleukin-1 receptor antagonist (IL-1RN) gene polymorphisms were studied. Methods: Two mouthwash samples and

  17. Sampling Methods and the Accredited Population in Athletic Training Education Research (United States)

    Carr, W. David; Volberding, Jennifer


    Context: We describe methods of sampling the widely-studied, yet poorly defined, population of accredited athletic training education programs (ATEPs). Objective: There are two purposes to this study; first to describe the incidence and types of sampling methods used in athletic training education research, and second to clearly define the…

  18. A simple method for determination of natural and depleted uranium in surface soil samples. (United States)

    Vukanac, I; Novković, D; Kandić, A; Djurasević, M; Milosević, Z


    A simple and efficient method for determination of uranium content in surface soil samples contaminated with depleted uranium, by gamma ray spectrometry is presented. The content of natural uranium and depleted uranium, as well as the activity ratio (235)U/(238)U of depleted uranium, were determined in contaminated surface soil samples by application of this method. Copyright 2009 Elsevier Ltd. All rights reserved.
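The (235)U/(238)U activity ratio used here to separate natural from depleted uranium can be estimated from standard isotopic abundances and half-lives. The sketch below uses textbook values; the 0.2% (235)U abundance assumed for depleted uranium is a typical figure, not a universal specification, and atom and weight percentages are treated as interchangeable at this precision:

```python
def activity_ratio(pct_235, pct_238,
                   t_half_235=7.04e8, t_half_238=4.468e9):
    """Activity ratio A(235U)/A(238U) from isotopic percentages and
    half-lives in years. Activity ~ N * ln2 / t_half, so the ln2
    factors cancel and only the half-life ratio remains."""
    return (pct_235 / pct_238) * (t_half_238 / t_half_235)

natural = activity_ratio(0.720, 99.275)   # natural uranium, ~0.046
depleted = activity_ratio(0.200, 99.800)  # typical depleted uranium
```

A measured activity ratio well below the natural value of about 0.046 is the gamma-spectrometric signature of depleted uranium contamination in the soil samples.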

  19. Genome Wide Sampling Sequencing for SNP Genotyping: Methods, Challenges and Future Development. (United States)

    Jiang, Zhihua; Wang, Hongyang; Michal, Jennifer J; Zhou, Xiang; Liu, Bang; Woods, Leah C Solberg; Fuchs, Rita A


    Genetic polymorphisms, particularly single nucleotide polymorphisms (SNPs), have been widely used to advance quantitative, functional and evolutionary genomics. Ideally, all genetic variants among individuals should be discovered when next generation sequencing (NGS) technologies and platforms are used for whole genome sequencing or resequencing. In order to improve the cost-effectiveness of the process, however, the research community has mainly focused on developing genome-wide sampling sequencing (GWSS) methods, a collection of reduced genome complexity sequencing, reduced genome representation sequencing and selective genome target sequencing. Here we review the major steps involved in library preparation, the types of adapters used for ligation and the primers designed for amplification of ligated products for sequencing. Unfortunately, currently available GWSS methods have their drawbacks, such as inconsistency in the number of reads per sample library, the number of sites/targets per individual, and the number of reads per site/target, all of which result in missing data. Suggestions are proposed here to improve library construction, genotype calling accuracy, genome-wide marker density and read mapping rate. In brief, optimized GWSS library preparation should generate a unique set of target sites with dense distribution along chromosomes and even coverage per site across all individuals.

  20. Direct infusion-SIM as fast and robust method for absolute protein quantification in complex samples

    Directory of Open Access Journals (Sweden)

    Christina Looße


Full Text Available Relative and absolute quantification of proteins in biological and clinical samples are common approaches in proteomics. Until now, targeted protein quantification has mainly been performed using a combination of HPLC-based peptide separation and selected reaction monitoring on triple quadrupole mass spectrometers. Here, we show for the first time the potential of absolute quantification using a direct infusion strategy combined with single ion monitoring (SIM) on a Q Exactive mass spectrometer. Using complex membrane fractions of Escherichia coli, we absolutely quantified the recombinantly expressed heterologous human cytochrome P450 monooxygenase 3A4 (CYP3A4), comparing direct infusion-SIM with conventional HPLC-SIM. Direct infusion-SIM deviated by only 14.7% (±4.1, s.e.m.) on average from HPLC-SIM, with a decreased processing and analysis time of 4.5 min per sample (which could be further decreased to 30 s), in contrast to 65 min for the LC-MS method. In summary, our simplified workflow using direct infusion-SIM provides a fast and robust method for the quantification of proteins in complex protein mixtures.

  1. Melting Temperature Mapping Method: A Novel Method for Rapid Identification of Unknown Pathogenic Microorganisms within Three Hours of Sample Collection. (United States)

    Niimi, Hideki; Ueno, Tomohiro; Hayashi, Shirou; Abe, Akihito; Tsurue, Takahiro; Mori, Masashi; Tabata, Homare; Minami, Hiroshi; Goto, Michihiko; Akiyama, Makoto; Yamamoto, Yoshihiro; Saito, Shigeru; Kitajima, Isao


    Acquiring the earliest possible identification of pathogenic microorganisms is critical for selecting the appropriate antimicrobial therapy in infected patients. We herein report the novel "melting temperature (Tm) mapping method" for rapidly identifying the dominant bacteria in a clinical sample from sterile sites. Employing only seven primer sets, more than 100 bacterial species can be identified. In particular, using the Difference Value, it is possible to identify samples suitable for Tm mapping identification. Moreover, this method can be used to rapidly diagnose the absence of bacteria in clinical samples. We tested the Tm mapping method using 200 whole blood samples obtained from patients with suspected sepsis, 85% (171/200) of which matched the culture results based on the detection level. A total of 130 samples were negative according to the Tm mapping method, 98% (128/130) of which were also negative based on the culture method. Meanwhile, 70 samples were positive according to the Tm mapping method, and of the 59 suitable for identification, 100% (59/59) exhibited a "match" or "broad match" with the culture or sequencing results. These findings were obtained within three hours of whole blood collection. The Tm mapping method is therefore useful for identifying infectious diseases requiring prompt treatment.

  2. Comparison of granulometric methods and sampling strategies used in marine habitat classification and Ecological Status assessment. (United States)

    Forde, James; Collins, Patrick Colman; Patterson, Adrian; Kennedy, Robert


    Sediment particle size analysis (PSA) is routinely used to support benthic macrofaunal community distribution data in habitat mapping and Ecological Status (ES) assessment. No optimal PSA Method to explain variability in multivariate macrofaunal distribution has been identified nor have the effects of changing sampling strategy been examined. Here, we use benthic macrofaunal and PSA grabs from two embayments in the south of Ireland. Four frequently used PSA Methods and two common sampling strategies are applied. A combination of laser particle sizing and wet/dry sieving without peroxide pre-treatment to remove organics was identified as the optimal Method for explaining macrofaunal distributions. ES classifications and EUNIS sediment classification were robust to changes in PSA Method. Fauna and PSA samples returned from the same grab sample significantly decreased macrofaunal variance explained by PSA and caused ES to be classified as lower. Employing the optimal PSA Method and sampling strategy will improve benthic monitoring. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Method and apparatus for maintaining multi-component sample gas constituents in vapor phase during sample extraction and cooling (United States)

    Farthing, William Earl [Pinson, AL; Felix, Larry Gordon [Pelham, AL; Snyder, Todd Robert [Birmingham, AL


An apparatus and method for diluting and cooling sample gas that is extracted from high-temperature and/or high-pressure industrial processes. Through a feedback process, a specialized, CFD-modeled dilution cooler is employed along with real-time estimations of the point at which condensation will occur within the dilution cooler to define a level of dilution and a diluted-gas temperature that result in a gas that can be conveyed to standard gas analyzers and that contains no condensed hydrocarbon compounds or condensed moisture.

  4. Method and apparatus maintaining multi-component sample gas constituents in vapor phase during sample extraction and cooling (United States)

    Farthing, William Earl; Felix, Larry Gordon; Snyder, Todd Robert


An apparatus and method for diluting and cooling sample gas that is extracted from high-temperature and/or high-pressure industrial processes. Through a feedback process, a specialized, CFD-modeled dilution cooler is employed along with real-time estimations of the point at which condensation will occur within the dilution cooler to define a level of dilution and a diluted-gas temperature that result in a gas that can be conveyed to standard gas analyzers and that contains no condensed hydrocarbon compounds or condensed moisture.

  5. Applicability of Greulich and Pyle and Demirijan aging methods to a sample of Italian population. (United States)

    Santoro, Valeria; Roca, Roberta; De Donno, Antonio; Fiandaca, Chiara; Pinto, Giorgia; Tafuri, Silvio; Introna, Francesco


Age estimation in forensics is essential in cases involving both living and dead subjects. For living subjects, age estimation may be used to establish an individual's status as a minor in cases involving adoption, criminal responsibility, child pornography, and those seeking asylum. Criteria for age estimation in the living have recently been put forth by The Study Group on Forensic Age Diagnostics. The group has proposed guidelines with a three-step procedure: a physical examination and anthropometrical analysis; dental analysis by orthopantomogram (OPG); and X-ray study of the left hand and wrist. The board of FASE highlighted advantages and limits of each method, and suggested practical solutions concerning the age estimation process for adults and subadults. The aim of this study was to verify the applicability of the Greulich and Pyle, and Demirjian techniques on a sample group of Italians, whose ages were known, in determining the skeletal and dental age, in addition to evaluating the reliability of these techniques. 535 subjects between the ages of 7 and 15 years were examined, each one undergoing both an orthopantomography (OPG) and radiography of the left wrist and hand. The data obtained underwent statistical analysis. The analyses have shown that a correlation exists between skeletal and dental age, and real age. Age estimation carried out using the Greulich and Pyle method has shown itself to be especially accurate on the Italian sample, particularly in the age ranges of 7-9 years and 10.4-11.5 years. The Greulich and Pyle method has shown itself to be reliable for the sample analyzed notwithstanding the ethnic differences between the original sample of reference and those analyzed in this study. Application of the Demirjian technique resulted in an overestimation of dental age. This difference is shown to be more highly significant in the higher age ranges. The combination of the Greulich and Pyle, and Demirjian methods have revealed a difference

  6. Quantitative method of determining beryllium or a compound thereof in a sample (United States)

    McCleskey, T. Mark; Ehler, Deborah S.; John, Kevin D.; Burrell, Anthony K.; Collis, Gavin E.; Minogue, Edel M.; Warner, Benjamin P.


    A method of determining beryllium or a compound thereof in a sample includes providing a sample suspected of comprising beryllium or a compound thereof, extracting beryllium or a compound thereof from the sample by dissolution in a solution, adding a fluorescent indicator to the solution to bind any beryllium or compound thereof to the fluorescent indicator, and determining the presence or amount of beryllium or a compound thereof in the sample by measuring fluorescence.
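
    Fluorescence-based quantitation of this kind typically rests on a linear calibration curve relating indicator fluorescence to analyte concentration. The sketch below illustrates only that step; the standard concentrations, intensities, and linear response are hypothetical, not values from the patent.

    ```python
    import numpy as np

    # Hypothetical calibration standards: Be concentration (ug/L) vs measured
    # indicator fluorescence (arbitrary units); a linear response is assumed.
    conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
    fluor = np.array([2.0, 12.0, 22.0, 42.0, 82.0])

    slope, intercept = np.polyfit(conc, fluor, 1)  # fit fluor = a*conc + b

    def be_concentration(signal):
        """Invert the linear calibration to estimate Be in a sample extract."""
        return (signal - intercept) / slope

    print(round(be_concentration(32.0), 2))  # a 32-unit reading -> 1.5 ug/L
    ```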

  7. Estimation of the sugar cane cultivated area from LANDSAT images using the two phase sampling method (United States)

    Parada, N. D. J. (Principal Investigator); Cappelletti, C. A.; Mendonca, F. J.; Lee, D. C. L.; Shimabukuro, Y. E.


    A two-phase sampling method and the optimal sampling segment dimensions for the estimation of sugar cane cultivated area were developed. This technique employs visual interpretation of LANDSAT images together with panchromatic aerial photographs considered as the ground truth. The estimates, taken as the mean of 100 simulated samples, represent 99.3% of the true value with a CV of approximately 1%; the relative efficiency of the two-phase design was 157% compared with a one-phase aerial photograph sample.
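
    The two-phase (double sampling) regression estimator described above can be sketched as follows: a large phase-1 sample carries only the cheap LANDSAT-style measurement, and a small phase-2 subsample adds the ground truth used to correct the phase-1 mean. All numbers and distributions below are simulated stand-ins, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated "true" cultivated area per segment (ground-truth analogue).
    N = 10_000
    truth = rng.gamma(shape=4.0, scale=25.0, size=N)
    # Cheap phase-1 measurement (LANDSAT visual interpretation): noisy truth.
    landsat = truth + rng.normal(0.0, 10.0, size=N)

    # Phase 1: large sample carrying only the LANDSAT interpretation.
    idx1 = rng.choice(N, size=1000, replace=False)
    x1 = landsat[idx1]

    # Phase 2: small subsample of phase 1 with ground truth (aerial photos).
    idx2 = rng.choice(idx1, size=100, replace=False)
    x2, y2 = landsat[idx2], truth[idx2]

    # Regression estimator: correct the phase-2 mean with the phase-1 mean.
    slope = np.cov(x2, y2)[0, 1] / np.var(x2, ddof=1)
    estimate = y2.mean() + slope * (x1.mean() - x2.mean())
    rel_error = abs(estimate - truth.mean()) / truth.mean()
    print(rel_error < 0.10)
    ```

    The regression correction is what makes the cheap phase-1 measurements pay off: the estimator's variance depends on the residual scatter of truth around the LANDSAT value, not on the full between-segment variance.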

  8. Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research. (United States)

    Palinkas, Lawrence A; Horwitz, Sarah M; Green, Carla A; Wisdom, Jennifer P; Duan, Naihua; Hoagwood, Kimberly


    Purposeful sampling is widely used in qualitative research for the identification and selection of information-rich cases related to the phenomenon of interest. Although there are several different purposeful sampling strategies, criterion sampling appears to be used most commonly in implementation research. However, combining sampling strategies may be more appropriate to the aims of implementation research and more consistent with recent developments in quantitative methods. This paper reviews the principles and practice of purposeful sampling in implementation research, summarizes types and categories of purposeful sampling strategies and provides a set of recommendations for use of single strategy or multistage strategy designs, particularly for state implementation research.

  9. Optical method and system for the characterization of laterally-patterned samples in integrated circuits (United States)

    Maris, Humphrey J.


    Disclosed is a method for characterizing a sample having a structure disposed on or within the sample, comprising the steps of applying a first pulse of light to a surface of the sample for creating a propagating strain pulse in the sample, applying a second pulse of light to the surface so that the second pulse of light interacts with the propagating strain pulse in the sample, sensing from a reflection of the second pulse a change in optical response of the sample, and relating a time of occurrence of the change in optical response to at least one dimension of the structure.

  10. Optimization of Initial Prostate Biopsy in Clinical Practice: Sampling, Labeling, and Specimen Processing (United States)

    Bjurlin, Marc A.; Carter, H. Ballentine; Schellhammer, Paul; Cookson, Michael S.; Gomella, Leonard G.; Troyer, Dean; Wheeler, Thomas M.; Schlossberg, Steven; Penson, David F.; Taneja, Samir S.


    Purpose: An optimal prostate biopsy in clinical practice is based on a balance between adequate detection of clinically significant prostate cancers (sensitivity), assuredness regarding the accuracy of negative sampling (negative predictive value [NPV]), limited detection of clinically insignificant cancers, and good concordance with whole-gland surgical pathology results to allow accurate risk stratification and disease localization for treatment selection. Inherent within this optimization is variation of the core number, location, labeling, and processing for pathologic evaluation. To date, there is no consensus in this regard. The purpose of this review is 3-fold: 1. To define the optimal number and location of biopsy cores during primary prostate biopsy among men with suspected prostate cancer, 2. To define the optimal method of labeling prostate biopsy cores for pathologic processing that will provide relevant and necessary clinical information for all potential clinical scenarios, and 3. To determine the maximal number of prostate biopsy cores allowable within a specimen jar that would not preclude accurate histologic evaluation of the tissue. Materials and Methods: A bibliographic search covering the period up to July 2012 was conducted using PubMed®. This search yielded approximately 550 articles. Articles were reviewed and categorized based on which of the three objectives of this review was addressed. Data were extracted, analyzed, and summarized. Recommendations based on this literature review and our clinical experience are provided. Results: The use of 10–12-core extended-sampling protocols increases cancer detection rates (CDRs) compared to traditional sextant sampling methods and reduces the likelihood that patients will require a repeat biopsy by increasing NPV, ultimately allowing more accurate risk stratification without increasing the likelihood of detecting insignificant cancers. As the number of cores increases above 12 cores, the increase in

  11. Optimization of the solvent-based dissolution method to sample volatile organic compound vapors for compound-specific isotope analysis. (United States)

    Bouchard, Daniel; Wanner, Philipp; Luo, Hong; McLoughlin, Patrick W; Henderson, James K; Pirkle, Robert J; Hunkeler, Daniel


    The methodology of the solvent-based dissolution method used to sample gas-phase volatile organic compounds (VOC) for compound-specific isotope analysis (CSIA) was optimized to lower the method detection limits for TCE and benzene. The sampling methodology previously evaluated by [1] consists of pulling air through a solvent to dissolve and accumulate the gaseous VOC. After the sampling process, the solvent can be treated in the same way as groundwater samples for routine CSIA by diluting an aliquot of the solvent into water to reach the required concentration of the targeted contaminant. Among the solvents tested, tetraethylene glycol dimethyl ether (TGDE) showed the best aptitude for the method. TGDE has a high affinity for TCE and benzene, efficiently dissolving the compounds during their transit through the solvent. The method detection limit for TCE (5±1 μg/m3) and benzene (1.7±0.5 μg/m3) is lower when using TGDE compared to methanol, which was used previously (385 μg/m3 for TCE and 130 μg/m3 for benzene) [2]. The method detection limit refers to the minimal gas-phase concentration in ambient air required to load sufficient VOC mass into TGDE to perform δ13C analysis. Due to a different analytical procedure, the method detection limit associated with δ37Cl analysis was found to be 156±6 μg/m3 for TCE. Furthermore, the experimental results validated the relationship between the gas-phase TCE concentration and the progressive accumulation of dissolved TCE in the solvent during the sampling process. Accordingly, based on the air-solvent partitioning coefficient, the sampling methodology (e.g. sampling rate, sampling duration, amount of solvent) and the final TCE concentration in the solvent, the TCE concentration in the gas phase prevailing during the sampling event can be determined. Moreover, the possibility to analyse the TCE concentration in the solvent after sampling (or other targeted VOCs) allows the field deployment of the sampling method
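
    The back-calculation of the ambient gas-phase concentration from the final solvent concentration can be sketched with a simple mass balance, assuming the VOC is trapped completely during sampling (the paper refines this with the air-solvent partitioning coefficient). All numeric inputs below are hypothetical.

    ```python
    def gas_phase_concentration(c_solvent_ug_per_l, v_solvent_l,
                                rate_l_per_min, duration_min):
        """Ambient gas-phase VOC concentration (ug/m3) back-calculated from
        the mass accumulated in the trapping solvent, assuming the VOC
        dissolves completely during sampling (a simplification; the paper
        refines this with the air-solvent partitioning coefficient)."""
        trapped_mass_ug = c_solvent_ug_per_l * v_solvent_l
        air_volume_m3 = rate_l_per_min * duration_min / 1000.0
        return trapped_mass_ug / air_volume_m3

    # Hypothetical numbers: 50 ug/L TCE in 10 mL of TGDE after pulling air
    # at 0.5 L/min for 60 min (0.03 m3 of air).
    print(round(gas_phase_concentration(50.0, 0.01, 0.5, 60.0), 1))
    ```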

  12. Molecular cancer classification using a meta-sample-based regularized robust coding method. (United States)

    Wang, Shu-Lin; Sun, Liuchao; Fang, Jianwen


    Previous studies have demonstrated that machine learning based molecular cancer classification using gene expression profiling (GEP) data is promising for the clinical diagnosis and treatment of cancer. Novel classification methods with high efficiency and prediction accuracy are still needed to deal with the high dimensionality and small sample size of typical GEP data. Recently the sparse representation (SR) method has been successfully applied to cancer classification. Nevertheless, its efficiency needs to be improved when analyzing large-scale GEP data. In this paper we present meta-sample-based regularized robust coding classification (MRRCC), a novel effective cancer classification technique that combines the idea of the meta-sample-based cluster method with the regularized robust coding (RRC) method. It assumes that the coding residual and the coding coefficient are respectively independent and identically distributed. Similar to meta-sample-based SR classification (MSRC), MRRCC extracts a set of meta-samples from the training samples and then encodes a testing sample as a sparse linear combination of these meta-samples. The representation fidelity is measured by the l2-norm or l1-norm of the coding residual. Extensive experiments on publicly available GEP datasets demonstrate that the proposed method is more efficient, while its prediction accuracy is equivalent to existing MSRC-based methods and better than other state-of-the-art dimension-reduction-based methods.
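
    The core idea, extracting meta-samples per class and encoding a test sample over them with a regularized coding whose residual decides the class, can be sketched as below. This is a simplified ridge-regularized l2 coding on toy data, not the paper's full RRC iteration; all data, dimensions, and parameters are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def meta_samples(X, k):
        """Extract k meta-samples (leading left singular vectors) from the
        training matrix X (genes x samples)."""
        U, _, _ = np.linalg.svd(X, full_matrices=False)
        return U[:, :k]

    def classify(x, metas_per_class, lam=0.1):
        """Encode x over each class's meta-samples with ridge-regularized
        least squares and pick the class with the smallest l2 residual."""
        best, best_res = None, np.inf
        for label, M in metas_per_class.items():
            a = np.linalg.solve(M.T @ M + lam * np.eye(M.shape[1]), M.T @ x)
            res = np.linalg.norm(x - M @ a)
            if res < best_res:
                best, best_res = label, res
        return best

    # Toy GEP data: two classes with different mean expression patterns.
    genes = 50
    mean0, mean1 = np.zeros(genes), np.full(genes, 2.0)
    X0 = mean0[:, None] + rng.normal(0, 0.3, (genes, 20))
    X1 = mean1[:, None] + rng.normal(0, 0.3, (genes, 20))
    metas = {0: meta_samples(X0, 3), 1: meta_samples(X1, 3)}

    test_sample = mean1 + rng.normal(0, 0.3, genes)  # a new class-1 profile
    print(classify(test_sample, metas))
    ```

    Working with a handful of meta-samples per class instead of all training samples is what gives the meta-sample family its efficiency on large GEP matrices.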

  13. Evaluation of a modified sampling method for molecular analysis of air microflora. (United States)

    Lech, T; Ziembinska-Buczynska, A


    Biodeterioration is a serious threat to the durability of economically and culturally important heritage materials. As a result of this phenomenon, priceless works of art, documents, and old prints undergo decomposition caused by microorganisms. It is therefore important to constantly monitor the presence and diversity of microorganisms in exposition rooms and storage areas of historical objects. In addition, the use of molecular biology tools in conservation studies enables detailed research and reduces the time needed to perform the analyses compared with conventional microbiological and conservation methods. The aim of this study was to adapt an indoor air sampling method for direct DNA extraction from microorganisms, including evaluation of the extracted DNA quality and concentration. The obtained DNA was used to study the diversity of mold fungi in indoor air using polymerase chain reaction-denaturing gradient gel electrophoresis in specific archive and museum environments. The research was conducted in 2 storage rooms of the National Archives in Krakow and in 1 exposition room of the Archaeological Museum in Krakow (Poland).

  14. Sample-Size Planning for More Accurate Statistical Power: A Method Adjusting Sample Effect Sizes for Publication Bias and Uncertainty. (United States)

    Anderson, Samantha F; Kelley, Ken; Maxwell, Scott E


    The sample size necessary to obtain a desired level of statistical power depends in part on the population value of the effect size, which is, by definition, unknown. A common approach to sample-size planning uses the sample effect size from a prior study as an estimate of the population value of the effect to be detected in the future study. Although this strategy is intuitively appealing, effect-size estimates, taken at face value, are typically not accurate estimates of the population effect size because of publication bias and uncertainty. We show that the use of this approach often results in underpowered studies, sometimes to an alarming degree. We present an alternative approach that adjusts sample effect sizes for bias and uncertainty, and we demonstrate its effectiveness for several experimental designs. Furthermore, we discuss an open-source R package, BUCSS, and user-friendly Web applications that we have made available to researchers so that they can easily implement our suggested methods.
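
    The consequence for planning can be illustrated with the standard normal-approximation sample-size formula for a two-sample comparison: shrinking a published effect size before planning (here an arbitrary 30% shrinkage, standing in for the principled publication-bias and uncertainty adjustment that BUCSS computes) substantially raises the required n.

    ```python
    from math import ceil
    from statistics import NormalDist

    def n_per_group(d, alpha=0.05, power=0.80):
        """Per-group sample size for a two-sample comparison of standardized
        effect size d, using the normal approximation to the t-test."""
        z = NormalDist()
        za = z.inv_cdf(1 - alpha / 2)
        zb = z.inv_cdf(power)
        return ceil(2 * ((za + zb) / d) ** 2)

    # Taking a published sample effect d = 0.5 at face value:
    naive = n_per_group(0.5)
    # Planning for a shrunken effect instead (illustrative 30% shrinkage,
    # a stand-in for the bias/uncertainty adjustment BUCSS performs):
    adjusted = n_per_group(0.5 * 0.7)
    print(naive, adjusted)
    ```

    The roughly twofold jump in n shows why taking sample effect sizes at face value so often yields underpowered studies.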

  15. S2I techniques for analog sampled-data signal processing

    DEFF Research Database (Denmark)

    Machado, Gerson A. S.; Toumazou, Chris; Saether, Geir E.


    Some recent developments in Analog-Sampled-Data Signal Processing (ASD SP) are reviewed. Following a brief review of the state-of-the-art in switched capacitor (SC) signal processing, the "current mode" switched-current (SI/S2I) technique is presented. New techniques for exploring niches in low...

  16. Comparability among four invertebrate sampling methods, Fountain Creek Basin, Colorado, 2010-2012 (United States)

    Zuellig, Robert E.; Bruce, James F.; Stogner, Sr., Robert W.; Brown, Krystal D.


    The U.S. Geological Survey, in cooperation with Colorado Springs City Engineering and Colorado Springs Utilities, designed a study to determine whether sampling method and sample timing resulted in comparable samples and assessments of biological condition. To accomplish this task, annual invertebrate samples were collected concurrently using four sampling methods at 15 U.S. Geological Survey streamflow gages in the Fountain Creek basin from 2010 to 2012. Collectively, the four methods are used by local (U.S. Geological Survey cooperative monitoring program) and State monitoring programs (Colorado Department of Public Health and Environment) in the Fountain Creek basin to produce two distinct sample types for each program that target single and multiple habitats. This study found distinguishable differences between single- and multi-habitat sample types using both community similarities and multi-metric index values, while methods from each program within a sample type were comparable. This indicates that the Colorado Department of Public Health and Environment methods were compatible with the cooperative monitoring program methods within multi- and single-habitat sample types. Comparisons between September and October samples found distinguishable differences based on community similarities for both sample types, whereas differences based on multi-metric index values were found only for single-habitat samples. At one site, differences between September and October index values from single-habitat samples resulted in opposing assessments of biological condition. Direct application of the results to inform the revision of the existing Fountain Creek basin U.S. Geological Survey cooperative monitoring program is discussed.

  17. A new enrichment method for isolation of Bacillus thuringiensis from diverse sample types. (United States)

    Patel, Ketan D; Bhanshali, Forum C; Chaudhary, Avani V; Ingle, Sanjay S


    New or more efficient methodologies based on different principles are needed, as no single method is suitable for isolating organisms from samples of diverse types and from various environments. In the present investigation, a growth kinetics study revealed a higher germination rate, a higher growth rate, and maximum sporulation of Bacillus thuringiensis (Bt) compared to other Bacillus species. Considering these facts, a simple and efficient enrichment method was devised which allowed propagation of spores and vegetative cells of Bt and thereby increased the Bt cell population proportionately. The new enrichment method yielded Bt from 44 out of 58 samples. In contrast, Bt was isolated from only 16 and 18 samples by the sodium acetate selection and dry heat pretreatment methods, respectively. Moreover, the percentages of Bt colonies isolated by the enrichment method were comparatively higher. Vegetative whole-cell protein profile analysis indicated isolation of a diverse population of Bt from various samples. Bt strains isolated by the enrichment method represented novel serovars and possibly a new cry2 gene.

  18. Unexpected toxicity to aquatic organisms of some aqueous bisphenol A samples treated by advanced oxidation processes. (United States)

    Tišler, Tatjana; Erjavec, Boštjan; Kaplan, Renata; Şenilă, Marin; Pintar, Albin


    In this study, photocatalytic and catalytic wet-air oxidation (CWAO) processes were used to examine removal efficiency of bisphenol A from aqueous samples over several titanate nanotube-based catalysts. Unexpected toxicity of bisphenol A (BPA) samples treated by means of the CWAO process to some tested species was determined. In addition, the CWAO effluent was recycled five- or 10-fold in order to increase the number of interactions between the liquid phase and catalyst. Consequently, the inductively coupled plasma mass spectrometry (ICP-MS) analysis indicated higher concentrations of some toxic metals like chromium, nickel, molybdenum, silver, and zinc in the recycled samples in comparison to both the single-pass sample and the photocatalytically treated solution. The highest toxicity of five- and 10-fold recycled solutions in the CWAO process was observed in water fleas, which could be correlated to high concentrations of chromium, nickel, and silver detected in tested samples. The obtained results clearly demonstrated that aqueous samples treated by means of advanced oxidation processes should always be analyzed using (i) chemical analyses to assess removal of BPA and total organic carbon from treated aqueous samples, as well as (ii) a battery of aquatic organisms from different taxonomic groups to determine possible toxicity.

  19. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J


    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  20. Kinetic studies in solid state reactions by sample-controlled methods and advanced analysis procedures


    Pérez-Maqueda, Luis A.; Criado, J. M.; Sánchez-Jiménez, P.E.; Perejón, Antonio


    A comparative study of conventional rising-temperature methods and sample-controlled methods, like constant rate thermal analysis (CRTA), is carried out by analyzing a set of solid state reactions using both approaches. It is shown that CRTA avoids the influence of heat and mass transfer phenomena for a wide range of sample sizes, leading to reliable kinetic parameters. On the other hand, conventional rising-temperature methods yield α–T plots dependent on experimental conditions, even when using...

  1. Large loop conformation sampling using the activation relaxation technique, ART-nouveau method. (United States)

    St-Pierre, Jean-François; Mousseau, Normand


    We present an adaptation of the ART-nouveau energy surface sampling method to the problem of loop structure prediction. This method, previously used to study protein folding pathways and peptide aggregation, is well suited to sampling the conformation space of large loops by targeting probable folding pathways instead of exhaustively sampling that space. The number of sampled conformations needed by ART nouveau to find the global energy minimum for a loop was found to scale linearly with the sequence length of the loop for loops between 8 and about 20 amino acids. Given the linear dependence of the per-conformation computation cost on loop sequence length, we estimate the total computational cost of sampling larger loops to scale quadratically, compared to the exponential scaling of exhaustive search methods. Copyright © 2012 Wiley Periodicals, Inc.
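
    The cost argument can be made concrete: if both the number of conformations needed and the cost per conformation grow linearly with loop length L, total cost grows as L². A toy relative-cost model:

    ```python
    def relative_total_cost(length, base_length=8):
        """Relative total sampling cost for a loop of the given length,
        assuming (per the scaling observed above) that both the number of
        conformations needed and the cost per conformation grow linearly
        with length, so total cost grows quadratically."""
        return (length / base_length) ** 2

    ratio = relative_total_cost(20) / relative_total_cost(10)
    print(ratio)  # doubling loop length quadruples the expected cost
    ```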

  2. Method validation and uncertainty evaluation of organically bound tritium analysis in environmental sample. (United States)

    Huang, Yan-Jun; Zeng, Fan; Zhang, Bing; Chen, Chao-Feng; Qin, Hong-Juan; Wu, Lian-Sheng; Guo, Gui-Yin; Yang, Li-Tao; Shang-Guan, Zhi-Hong


    An analytical method for organically bound tritium (OBT) was developed in our laboratory. Optimized operating conditions and parameters were established for sample drying, special combustion, distillation, and measurement on a liquid scintillation spectrometer (LSC). Selected types of OBT samples such as rice, corn, rapeseed, fresh lettuce and pork were analyzed to validate the method's recovery rate reproducibility and minimum detection concentration, and the uncertainty for a typical low-level environmental sample was evaluated. The combustion water recovery rate of the different dried environmental samples was kept at about 80%, and the minimum detection concentration of OBT ranged from 0.61 to 0.89 Bq/kg (dry weight), depending on the hydrogen content. The results show that this method is suitable for OBT analysis of environmental samples, with a stable recovery rate, and that the combustion water yield of a sample weighing about 40 g provides sufficient quantity for measurement on the LSC. Copyright © 2014 Elsevier Ltd. All rights reserved.
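
    A minimum detectable concentration of this kind is commonly derived from a Currie-style detection limit on the counting measurement. The sketch below is a generic illustration under assumed counting parameters (background rate, counting time, efficiency, dry sample mass) and a simplified conversion chain; it is not the paper's actual formulation or its values.

    ```python
    from math import sqrt

    def mdc_bq_per_kg_dry(background_cpm, count_time_min, efficiency,
                          dry_mass_kg):
        """Currie-style minimum detectable OBT concentration (Bq/kg dry).
        Generic illustration: the counting parameters and the simple
        conversion chain are assumptions, not the paper's formulation."""
        # Currie detection limit (counts) for paired background counting.
        ld_counts = 2.71 + 4.65 * sqrt(background_cpm * count_time_min)
        activity_bq = ld_counts / (count_time_min * 60.0 * efficiency)
        return activity_bq / dry_mass_kg

    # Assumed values: 1.5 cpm background, 500 min count, 25% efficiency,
    # 40 g (0.04 kg) of dry sample.
    print(round(mdc_bq_per_kg_dry(1.5, 500.0, 0.25, 0.04), 2))
    ```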

  3. Serum chromium levels sampled with steel needle versus plastic IV cannula. Does method matter?

    DEFF Research Database (Denmark)

    Penny, Jeannette Ø; Overgaard, Søren


    PURPOSE: Modern metal-on-metal (MoM) joint articulations release metal ions into the body. Research tries to establish how much this elevates metal ion levels and whether it causes adverse effects. The steel needle that samples the blood may introduce additional chromium to the sample, thereby causing bias. This study aimed to test that theory. METHODS: We compared serum chromium values for two sampling methods, steel needle and IV plastic cannula, as well as sampling sequence, in 16 healthy volunteers. RESULTS: We found statistically significant chromium contamination from the steel needle ... significant. CONCLUSION: The chromium contamination from the steel needle is low, and sampling method matters little in MoM populations. If using steel needles, we suggest discarding the first sample.

  4. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup


    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column ... and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila.

  5. Optoelectronic imaging of speckle using image processing method (United States)

    Wang, Jinjiang; Wang, Pengfei


    A detailed image-processing procedure for laser speckle interferometry is proposed as an example for a postgraduate course. Several image processing methods were used together in the optoelectronic imaging system: partial differential equations (PDEs) are used to reduce the effect of noise; thresholding segmentation is likewise based on the heat equation with PDEs; the central line is extracted based on the image skeleton, with branches removed automatically; the phase level is calculated by a spline interpolation method; and the fringe phase can then be unwrapped. Finally, the image processing method was used to automatically measure a bubble in rubber under negative pressure, which could be used in tire detection.

  6. Comparison of chlorzoxazone one-sample methods to estimate CYP2E1 activity in humans

    DEFF Research Database (Denmark)

    Kramer, Iza; Dalhoff, Kim; Clemmesen, Jens O


    OBJECTIVE: Comparison of a one-sample with a multi-sample method (the metabolic fractional clearance) to estimate CYP2E1 activity in humans. METHODS: Healthy, male Caucasians (n=19) were included. The multi-sample fractional clearance (Cl(fe)) of chlorzoxazone was compared with one-time-point clearance estimation (Cl(est)) at 3, 4, 5 and 6 h. Furthermore, the metabolite/drug ratios (MRs) estimated from one-time-point samples at 1, 2, 3, 4, 5 and 6 h were compared with Cl(fe). RESULTS: The concordance between Cl(est) and Cl(fe) was highest at 6 h. The minimal mean prediction error (MPE) of Cl... -dose-sample estimates, Cl(est) at 3 h or 6 h, and MR at 3 h, can serve as reliable markers of CYP2E1 activity. The one-sample clearance method is an accurate, renal function-independent measure of the intrinsic activity; it is simple to use and easily applicable to humans.

  7. Method matters: Experimental evidence for shorter avian sperm in faecal compared to abdominal massage samples.

    Directory of Open Access Journals (Sweden)

    Antje Girndt

    Full Text Available Birds are model organisms in sperm biology. Previous work in zebra finches suggested that sperm sampled from males' faeces and ejaculates do not differ in size. Here, we tested this assumption in a captive population of house sparrows, Passer domesticus. We compared sperm length in samples from three collection techniques: female dummy, faecal and abdominal massage samples. We found that sperm were significantly shorter in faecal than in abdominal massage samples, which was explained by shorter heads and midpieces, but not flagella. This result might indicate that faecally sampled sperm are less mature than sperm collected by abdominal massage. The female dummy method resulted in an insufficient number of experimental ejaculates because most males ignored it. In light of these results, we recommend abdominal massage as the preferred method for avian sperm sampling. Where avian sperm cannot be collected by abdominal massage alone, we advise controlling for the sperm sampling protocol statistically.

  8. Testing of the Defense Waste Processing Facility Cold Chemical Dissolution Method in Sludge Batch 9 Qualification

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Pareizs, J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Coleman, C. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Young, J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Brown, L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)


    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) tests the applicability of the digestion methods used by the DWPF Laboratory for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) Receipt samples and SRAT Product process control samples. DWPF SRAT samples are typically dissolved using a method referred to as the DWPF Cold Chemical or Cold Chem Method (CC) (see DWPF Procedure SW4-15.201). Testing indicated that the CC method produced mixed results. The CC method did not result in complete dissolution of either the SRAT Receipt or SRAT Product, with some fine, dark solids remaining. However, elemental analyses did not reveal extreme biases for the major elements in the sludge when compared with analyses obtained following dissolution by the hot aqua regia (AR) or sodium peroxide fusion (PF) methods. The CC elemental analyses agreed with the AR and PF methods well enough that the CC method should be adequate for routine process control analyses in the DWPF once much more extensive side-by-side tests of the CC and PF methods have been performed on the first 10 SRAT cycles of the Sludge Batch 9 (SB9) campaign. The DWPF Laboratory should continue with its plans for further tests of the CC method during these 10 SRAT cycles.

  9. Development of continuous pharmaceutical production processes supported by process systems engineering methods and tools

    DEFF Research Database (Denmark)

    Gernaey, Krist; Cervera Padrell, Albert Emili; Woodley, John


    The pharmaceutical industry is undergoing a radical transition towards continuous production processes. Systematic use of process systems engineering (PSE) methods and tools forms the key to achieving this transition in a structured and efficient way.

  10. Comparison of individual and pooled sampling methods for detecting bacterial pathogens of fish (United States)

    Mumford, Sonia; Patterson, Chris; Evered, J.; Brunson, Ray; Levine, J.; Winton, J.


    Examination of finfish populations for viral and bacterial pathogens is an important component of fish disease control programs worldwide. Two methods are commonly used for collecting tissue samples for bacteriological culture, the currently accepted standards for detection of bacterial fish pathogens. The method specified in the Office International des Epizooties Manual of Diagnostic Tests for Aquatic Animals permits combining renal and splenic tissues from as many as 5 fish into pooled samples. The American Fisheries Society (AFS) Blue Book/US Fish and Wildlife Service (USFWS) Inspection Manual specifies the use of a bacteriological loop for collecting samples from the kidney of individual fish. An alternative would be to more fully utilize the pooled samples taken for virology. If implemented, this approach would provide substantial savings in labor and materials. To compare the relative performance of the AFS/USFWS method and this alternative approach, cultures of Yersinia ruckeri were used to establish low-level infections in groups of rainbow trout (Oncorhynchus mykiss) that were sampled by both methods. Yersinia ruckeri was cultured from 22 of 37 groups by at least 1 method. The loop method yielded 18 positive groups, with 1 group positive in the loop samples but negative in the pooled samples. The pooled samples produced 21 positive groups, with 4 groups positive in the pooled samples but negative in the loop samples. There was statistically significant agreement (Spearman coefficient 0.80, P < 0.001) in the relative ability of the 2 sampling methods to permit detection of low-level bacterial infections of rainbow trout.
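
    The agreement statistic reported above can be reproduced in form with scipy's Spearman correlation applied to per-group detection outcomes; the binary outcomes below are invented for illustration, not the study's data.

    ```python
    from scipy.stats import spearmanr

    # Invented per-group detection outcomes (1 = Y. ruckeri cultured) for the
    # individual-loop method and the pooled-sample method across 12 groups.
    loop_method   = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1]
    pooled_method = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 0, 1]

    # Spearman rank correlation quantifies agreement between the two methods.
    rho, p_value = spearmanr(loop_method, pooled_method)
    print(round(rho, 2))
    ```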

  11. Method for Processing Liver Spheroids Using an Automatic Tissue Processor (United States)


    tissue processor (Leica Biosystems, Inc.; Nussloch, Germany). This instrument is used to prepare tissue for histologic study through a process of...microscopy because they make it possible to observe the microscopic anatomy of the samples. The Molecular Toxicology Branch intends to use the Leica...approximately 1 mm in diameter, which is significantly smaller than the traditional histology whole-tissue samples. Because of the smaller size of these

  12. Applicability of Demirjian's four methods and Willems method for age estimation in a sample of Turkish children. (United States)

    Akkaya, Nursel; Yilanci, Hümeyra Özge; Göksülük, Dinçer


    The aim of this study was to evaluate the applicability of five dental methods, including Demirjian's original, revised, four teeth, and alternate four teeth methods and the Willems method, for age estimation in a sample of Turkish children. Panoramic radiographs of 799 children (412 females, 387 males) aged between 2.20 and 15.99 years were examined by two observers. A repeated measures ANOVA was performed to compare dental methods among gender and age groups. All five methods overestimated the chronological age on average. Among these, the Willems method was found to be the most accurate, showing overestimations of 0.07 and 0.15 years for males and females, respectively. It was followed by Demirjian's four teeth methods, revised and original methods. According to the results, the Willems method can be recommended for dental age estimation of Turkish children in forensic applications. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  13. Probabilistic finite element stiffness of a laterally loaded monopile based on an improved asymptotic sampling method

    DEFF Research Database (Denmark)

    Vahdatirad, Mohammadjavad; Bayat, Mehdi; Andersen, Lars Vabbersgaard


    The mechanical responses of an offshore monopile foundation mounted in over-consolidated clay are calculated by employing a stochastic approach where a nonlinear p–y curve is incorporated with a finite element scheme. The random field theory is applied to represent a spatial variation for undrained...... shear strength of clay. Normal and Sobol sampling are employed to provide the asymptotic sampling method to generate the probability distribution of the foundation stiffnesses. Monte Carlo simulation is used as a benchmark. Asymptotic sampling accompanied with Sobol quasi random sampling demonstrates...... an efficient method for estimating the probability distribution of stiffnesses for the offshore monopile foundation....
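    The idea of pairing plain Monte Carlo with a quasi-random (low-discrepancy) sequence, as the abstract describes for Sobol sampling, can be sketched on a toy reliability problem. This is only an illustration, not the authors' asymptotic sampling procedure: a Halton sequence stands in for Sobol for brevity, and the limit state function is invented.

```python
import math
import random
from statistics import NormalDist

def halton(i, base):
    """i-th element (1-indexed) of the van der Corput sequence in the given base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def norm_ppf(u):
    """Inverse standard normal CDF (maps a uniform sample to a normal one)."""
    return NormalDist().inv_cdf(u)

def failure_prob_mc(g, n, rng):
    """Plain Monte Carlo estimate of P[g(x1, x2) < 0] for standard normal inputs."""
    fails = sum(1 for _ in range(n) if g(rng.gauss(0, 1), rng.gauss(0, 1)) < 0)
    return fails / n

def failure_prob_qmc(g, n):
    """Quasi-random estimate using a 2-D Halton sequence mapped to normals."""
    fails = 0
    for i in range(1, n + 1):
        x1 = norm_ppf(halton(i, 2))
        x2 = norm_ppf(halton(i, 3))
        if g(x1, x2) < 0:
            fails += 1
    return fails / n

# Toy limit state: failure when resistance minus load is negative.
# Exact Pf = Phi(-3 / sqrt(2)) since r - s ~ N(0, 2).
g = lambda r, s: (4.0 + r) - (1.0 + s)

rng = random.Random(42)
print(failure_prob_mc(g, 20000, rng), failure_prob_qmc(g, 20000))
```

Both estimates should settle near the exact value of about 0.017; the quasi-random estimate typically converges with fewer samples, which is the motivation for Sobol sampling in the paper.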

  14. Sampling the food processing environment: taking up the cudgel for preventive quality management in food processing environments. (United States)

    Wagner, Martin; Stessl, Beatrix


    The Listeria monitoring program for Austrian cheese factories was established in 1988. The basic idea is to control the introduction of L. monocytogenes into the food processing environment, preventing the pathogen from contaminating the food under processing. The Austrian Listeria monitoring program comprises four levels of investigation, dealing with routine monitoring of samples and consequences of finding a positive sample. Preventive quality control concepts attempt to detect a foodborne hazard along the food processing chain, prior to food delivery, retailing, and consumption. The implementation of a preventive food safety concept provokes a deepened insight by the manufacturers into problems concerning food safety. The development of preventive quality assurance strategies contributes to the national food safety status and protects public health.

  15. Extension of moment projection method to the fragmentation process

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Shaohua [Department of Mechanical Engineering, National University of Singapore, Engineering Block EA, Engineering Drive 1, 117576 (Singapore); Yapp, Edward K.Y.; Akroyd, Jethro; Mosbach, Sebastian [Department of Chemical Engineering and Biotechnology, University of Cambridge, New Museums Site, Pembroke Street, Cambridge, CB2 3RA (United Kingdom); Xu, Rong [School of Chemical and Biomedical Engineering, Nanyang Technological University, 62 Nanyang Drive, 637459 (Singapore); Yang, Wenming [Department of Mechanical Engineering, National University of Singapore, Engineering Block EA, Engineering Drive 1, 117576 (Singapore); Kraft, Markus, E-mail: [Department of Chemical Engineering and Biotechnology, University of Cambridge, New Museums Site, Pembroke Street, Cambridge, CB2 3RA (United Kingdom); School of Chemical and Biomedical Engineering, Nanyang Technological University, 62 Nanyang Drive, 637459 (Singapore)


    The method of moments is a simple but efficient method of solving the population balance equation which describes particle dynamics. Recently, the moment projection method (MPM) was proposed and validated for particle inception, coagulation, growth and, more importantly, shrinkage; here the method is extended to include the fragmentation process. The performance of MPM is tested for 13 different test cases for different fragmentation kernels, fragment distribution functions and initial conditions. Comparisons are made with the quadrature method of moments (QMOM), hybrid method of moments (HMOM) and a high-precision stochastic solution calculated using the established direct simulation algorithm (DSA), and the advantages of MPM are highlighted.

  16. Point process models for spatio-temporal distance sampling data from a large-scale survey of blue whales

    KAUST Repository

    Yuan, Yuan


    Distance sampling is a widely used method for estimating wildlife population abundance. The fact that conventional distance sampling methods are partly design-based constrains the spatial resolution at which animal density can be estimated using these methods. Estimates are usually obtained at survey stratum level. For an endangered species such as the blue whale, it is desirable to estimate density and abundance at a finer spatial scale than stratum. Temporal variation in the spatial structure is also important. We formulate the process generating distance sampling data as a thinned spatial point process and propose model-based inference using a spatial log-Gaussian Cox process. The method adopts a flexible stochastic partial differential equation (SPDE) approach to model spatial structure in density that is not accounted for by explanatory variables, and integrated nested Laplace approximation (INLA) for Bayesian inference. It allows simultaneous fitting of detection and density models and permits prediction of density at an arbitrarily fine scale. We estimate blue whale density in the Eastern Tropical Pacific Ocean from thirteen shipboard surveys conducted over 22 years. We find that higher blue whale density is associated with colder sea surface temperatures in space, and although there is some positive association between density and mean annual temperature, our estimates are consistent with no trend in density across years. Our analysis also indicates that there is substantial spatially structured variation in density that is not explained by available covariates.
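    The core of distance sampling as a thinned point process is a detection function that downweights animals far from the transect. The following minimal sketch (not the authors' INLA/SPDE model; all parameter values are invented) simulates a strip survey with a half-normal detection function and thins the true positions accordingly:

```python
import math
import random

def halfnormal_detection(d, sigma):
    """Half-normal detection function: probability of detecting an
    animal at perpendicular distance d from the transect line."""
    return math.exp(-d * d / (2.0 * sigma * sigma))

def simulate_survey(n_animals, half_width, sigma, rng):
    """Place animals uniformly in a strip beside the transect, then thin
    each one by its detection probability (the 'thinned point process').
    Returns (true distances, detected distances)."""
    truth, detected = [], []
    for _ in range(n_animals):
        d = rng.uniform(0.0, half_width)
        truth.append(d)
        if rng.random() < halfnormal_detection(d, sigma):
            detected.append(d)
    return truth, detected

rng = random.Random(7)
truth, seen = simulate_survey(5000, half_width=3.0, sigma=1.0, rng=rng)
# The detected fraction approximates the average detection probability
# over the strip (the integral of the detection function / half_width).
print(len(seen) / len(truth))
```

Model-based inference then fits the detection parameters and the underlying density surface jointly, which is what the log-Gaussian Cox process formulation enables.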

  17. Method for Vanadium Speciation in Aqueous Samples by HPLC-ICP ...

    African Journals Online (AJOL)

    Method for Vanadium Speciation in Aqueous Samples by HPLC-ICP-OES. M Hu, PP Coetzee. Abstract. A method for vanadium speciation is proposed. The method uses a low concentration eluent, 10 mmol L–1 EDTA and 14 mmol L–1 sodium carbonate, for the ion chromatographic separation of vanadium species at a ...

  18. Optimal sampling strategies to assess inulin clearance in children by the inulin single-injection method

    NARCIS (Netherlands)

    van Rossum, Lyonne K.; Mathot, Ron A. A.; Cransberg, Karlien; Vulto, Arnold G.


    Glomerular filtration rate in patients can be determined by estimating the plasma clearance of inulin with the single-injection method. In this method, a single bolus injection of inulin is administered and several blood samples are collected. For practical and convenient application of this method

  19. Methods, compounds and systems for detecting a microorganism in a sample

    Energy Technology Data Exchange (ETDEWEB)

    Colston, Jr, Bill W.; Fitch, J. Patrick; Gardner, Shea N.; Williams, Peter L.; Wagner, Mark C.


    Methods to identify a set of probe polynucleotides suitable for detecting a set of targets, and in particular methods for identifying primers suitable for detection of target microorganisms; related polynucleotides, sets of polynucleotides, and compositions; and related methods and systems for detection and/or identification of microorganisms in a sample.

  20. Effects of Processing Method and Consumers' Geo-Political ...

    African Journals Online (AJOL)

    Effects of Processing Method and Consumers' Geo-Political Background on the Scoring Pattern of Sensory Quality Attributes of Ugba , Fermented Seeds of African Oil Bean Tree ( Pentaclethra macrophylla Bentham)

  1. Effect of processing method on the Proximate composition, mineral ...

    African Journals Online (AJOL)

    Effect of processing method on the Proximate composition, mineral content and antinutritional factors of Taro (Colocasia esculenta, L.) growth in Ethiopia. T Adane, A Shimelis, R Negussie, B Tilahun, GD Haki ...

  2. Estimation for Non-Gaussian Locally Stationary Processes with Empirical Likelihood Method

    Directory of Open Access Journals (Sweden)

    Hiroaki Ogata


    An application of the empirical likelihood method to non-Gaussian locally stationary processes is presented. Based on the central limit theorem for locally stationary processes, we give the asymptotic distributions of the maximum empirical likelihood estimator and the empirical likelihood ratio statistics, respectively. It is shown that the empirical likelihood method enables us to make inferences on various important indices in time series analysis. Furthermore, we give a numerical study and investigate finite sample properties.



    Sanjay B Patil; Dr Shrikant K Bodhe


    In order to increase the average sugarcane yield per acre at minimum cost, farmers are adopting precision farming techniques. This paper covers the area measurement of sugarcane leaves based on an image processing method, which is useful for monitoring plant growth, analyzing fertilizer deficiency and environmental stress, and measuring disease severity. In the image processing method, leaf area is calculated through pixel-count statistics. Unit pixels in the same digital images represent the same size...
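    The pixel-count step can be sketched in a few lines. This is a minimal illustration of the general technique, not the authors' implementation; the toy image and the calibration factor (area per pixel, obtained in practice by imaging a reference object) are invented:

```python
def leaf_area(binary_image, mm2_per_pixel):
    """Estimate leaf area from a binarised image by pixel-count statistics.
    binary_image: 2-D list of 0/1 values where 1 marks a leaf pixel.
    mm2_per_pixel: calibration factor (area represented by one pixel)."""
    leaf_pixels = sum(sum(row) for row in binary_image)
    return leaf_pixels * mm2_per_pixel

# Toy 4x5 binary image with 6 leaf pixels at 0.25 mm^2 per pixel -> 1.5 mm^2
img = [
    [0, 0, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 0, 0],
    [0, 0, 0, 0, 0],
]
print(leaf_area(img, 0.25))  # -> 1.5
```

In a real pipeline the binarisation would come from thresholding a photograph, and the calibration factor from the camera geometry.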

  4. A method of image multi-resolution processing based on FPGA + DSP architecture (United States)

    Peng, Xiaohan; Zhong, Sheng; Lu, Hongqiang


    In real-time image processing, with the improvement of resolution and frame rate of camera imaging, not only the required processing capacity but also the required degree of process optimization is increasing. With regard to the FPGA + DSP architecture image processing system, there are three common methods to overcome this challenge. The first is using a higher performance DSP, for example one with a higher core frequency or with more cores. The second is optimizing the processing method, so that the algorithm accomplishes the same results in less time. Last but not least, pre-processing in the FPGA can make the image processing more efficient. A method of multi-resolution pre-processing by FPGA based on the FPGA + DSP architecture is proposed here. It takes advantage of built-in first in first out (FIFO) buffers and external synchronous dynamic random access memory (SDRAM) to buffer the images coming from the image detector, and provides down-sampled or cropped images to the DSP flexibly and efficiently according to the request parameters sent by the DSP. The DSP can thus process the reduced image instead of the whole image, greatly shortening the processing and transmission time. The method alleviates the DSP's image processing burden and also solves the problem that a single image-resolution-reduction method cannot meet the requirements of the DSP's image processing tasks.
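    The down-sampling half of such a pre-processing stage can be illustrated with a simple 2x2 block average. This is a hedged software sketch of the general operation (on an FPGA it would be implemented in streaming logic with line buffers); the frame data is invented:

```python
def downsample_2x(image):
    """Average 2x2 blocks of a grayscale image (list of lists),
    halving each dimension. This mirrors the kind of resolution
    reduction an FPGA could apply before handing frames to a DSP."""
    h, w = len(image), len(image[0])
    out = []
    for r in range(0, h - 1, 2):
        row = []
        for c in range(0, w - 1, 2):
            block = image[r][c] + image[r][c + 1] + image[r + 1][c] + image[r + 1][c + 1]
            row.append(block / 4.0)
        out.append(row)
    return out

frame = [
    [10, 20, 30, 40],
    [10, 20, 30, 40],
    [50, 60, 70, 80],
    [50, 60, 70, 80],
]
print(downsample_2x(frame))  # -> [[15.0, 35.0], [55.0, 75.0]]
```

Cropping ("cut-down images") is the complementary operation: forwarding only a requested window of each frame instead of reducing its resolution.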

  5. Comparing two sampling methods to engage hard-to-reach communities in research priority setting

    Directory of Open Access Journals (Sweden)

    Melissa A. Valerio


    Abstract. Background: Effective community-partnered and patient-centered outcomes research needs to address community priorities. However, optimal sampling methods to engage stakeholders from hard-to-reach, vulnerable communities to generate research priorities have not been identified. Methods: In two similar rural, largely Hispanic communities, a community advisory board guided recruitment of stakeholders affected by chronic pain using a different method in each community: (1) snowball sampling, a chain-referral method, or (2) purposive sampling to recruit diverse stakeholders. In both communities, three groups of stakeholders attended a series of three facilitated meetings to orient, brainstorm, and prioritize ideas (9 meetings/community). Using mixed methods analysis, we compared stakeholder recruitment and retention as well as priorities from both communities' stakeholders on mean ratings of their ideas based on importance and feasibility for implementation in their community. Results: Of 65 eligible stakeholders in one community recruited by snowball sampling, 55 (85%) consented, 52 (95%) attended the first meeting, and 36 (65%) attended all 3 meetings. In the second community, the purposive sampling method was supplemented by convenience sampling to increase recruitment. Of 69 stakeholders recruited by this combined strategy, 62 (90%) consented, 36 (58%) attended the first meeting, and 26 (42%) attended all 3 meetings. Snowball sampling recruited more Hispanics and disabled persons (all P < 0.05). Despite differing recruitment strategies, stakeholders from the two communities identified largely similar ideas for research, focusing on non-pharmacologic interventions for management of chronic pain. Ratings on importance and feasibility for community implementation differed only on the importance of massage services (P = 0.045), which was higher for the purposive/convenience sampling group, and for city improvements
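    The chain-referral mechanic of snowball sampling can be sketched as a breadth-first traversal of a referral network. This is a schematic illustration only; the names and the referral graph are invented, and real studies add eligibility screening and referral limits per participant:

```python
from collections import deque

def snowball_sample(contacts, seeds, max_n):
    """Chain-referral (snowball) sampling sketch: start from seed
    participants and follow referrals breadth-first until max_n
    stakeholders are recruited. `contacts` maps each person to the
    people they would refer."""
    recruited, queue = [], deque(seeds)
    seen = set(seeds)
    while queue and len(recruited) < max_n:
        person = queue.popleft()
        recruited.append(person)
        for referral in contacts.get(person, []):
            if referral not in seen:
                seen.add(referral)
                queue.append(referral)
    return recruited

# Hypothetical referral network
contacts = {
    "ana": ["ben", "carla"],
    "ben": ["dora"],
    "carla": ["dora", "eli"],
    "dora": [],
    "eli": [],
}
print(snowball_sample(contacts, ["ana"], max_n=4))  # -> ['ana', 'ben', 'carla', 'dora']
```

Purposive sampling, by contrast, selects participants directly against predefined diversity criteria rather than following the social network, which is why the two methods can reach different subpopulations.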

  6. [Sampling plan, weighting process and design effects of the Brazilian Oral Health Survey]. (United States)

    Silva, Nilza Nunes da; Roncalli, Angelo Giuseppe


    To present aspects of the sampling plan of the Brazilian Oral Health Survey (SBBrasil Project), with theoretical and operational issues that should be taken into account in the primary data analyses. The studied population was composed of five demographic groups from urban areas of Brazil in 2010. Two- and three-stage cluster sampling was used, adopting different primary units. Sample weighting and design effects (deff) were used to evaluate sample consistency. In total, 37,519 individuals were reached. Although the majority of deff estimates were acceptable, some domains showed distortions. The majority (90%) of the samples showed results in concordance with the precision proposed in the sampling plan. The measures to prevent losses and the effects of the cluster sampling process on the minimum sample sizes proved to be effective for the deff, which did not exceed 2, even for results derived from weighting. The samples achieved in the SBBrasil 2010 survey were close to the main proposals for accuracy of the design. Some probabilities proved to be unequal among the primary units of the same domain. Users of this database should bear this in mind, introducing sample weighting in calculations of point estimates, standard errors, confidence intervals and design effects.
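    The design effect quantifies how much cluster sampling inflates variance relative to simple random sampling. A standard approximation (not taken from the survey itself; the cluster size and intracluster correlation below are hypothetical) is deff = 1 + (m - 1) * ICC:

```python
def design_effect(cluster_size, icc):
    """Approximate design effect for cluster sampling:
    deff = 1 + (m - 1) * ICC, where m is the average cluster size
    and ICC is the intracluster correlation coefficient."""
    return 1.0 + (cluster_size - 1) * icc

def effective_sample_size(n, deff):
    """Number of simple-random-sample observations carrying
    the same information as n clustered observations."""
    return n / deff

# Hypothetical values: average cluster of 20 respondents, ICC = 0.05
deff = design_effect(cluster_size=20, icc=0.05)
print(deff, effective_sample_size(37519, deff))
```

A deff of 2, the bound mentioned in the abstract, means each clustered observation carries about half the information of an independent one, so the effective sample size is roughly half the nominal one.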

  7. Occurrence of Arcobacter in Iranian poultry and slaughterhouse samples implicates contamination by processing equipment and procedures. (United States)

    Khoshbakht, R; Tabatabaei, M; Shirzad Aski, H; Seifi, S


    1. The occurrence of Arcobacter spp. and three pathogenic species of Arcobacter in Iranian poultry carcasses was investigated at different steps of broiler processing to determine critical control points for reducing carcass contamination. 2. Samples were collected from (a) cloaca immediately before processing, (b) different points during processing and (c) different stations in a processing plant of a slaughterhouse in southern Iran. 3. After enrichment steps in Arcobacter selective broth, DNA of the samples was extracted and the three pathogenic species of Arcobacter were identified based on polymerase chain reaction (PCR) detection of 16S rRNA and species-specific PCR. 4. Out of a total of 540 samples, 244 (45%) were positive for Arcobacter spp. Arcobacter butzleri was more frequently detected (73%) than A. cryaerophilus (9%) and A. skirrowii (4.1%). In addition, co-colonisation (A. butzleri and A. cryaerophilus) occurred in 13.9% of the positive samples. 5. The results indicate a high prevalence of Arcobacter in the investigated slaughterhouse and broiler carcasses and that Arcobacter is not part of the normal flora of broilers. Evidence for the presence of Arcobacter in the environment and water of processing plants suggests that these are sources of contamination of poultry carcasses. In addition, contamination can spread between poultry meats in different parts and processes of the slaughterhouse (pre-scalding to after evisceration).

  8. Novel joint selection methods can reduce sample size for rheumatoid arthritis clinical trials with ultrasound endpoints. (United States)

    Allen, John C; Thumboo, Julian; Lye, Weng Kit; Conaghan, Philip G; Chew, Li-Ching; Tan, York Kiat


    To determine whether novel methods of selecting joints through (i) ultrasonography (individualized-ultrasound [IUS] method), or (ii) ultrasonography and clinical examination (individualized-composite-ultrasound [ICUS] method) translate into smaller rheumatoid arthritis (RA) clinical trial sample sizes when compared to existing methods utilizing predetermined joint sites for ultrasonography. Cohen's effect size (ES) was estimated (ES^) and a 95% CI (ES^L, ES^U) calculated on a mean change in 3-month total inflammatory score for each method. Corresponding 95% CIs [nL(ES^U), nU(ES^L)] were obtained on a post hoc sample size reflecting the uncertainty in ES^. Sample size calculations were based on a one-sample t-test as the patient numbers needed to provide 80% power at α = 0.05 to reject a null hypothesis H0 : ES = 0 versus alternative hypotheses H1 : ES = ES^, ES = ES^L and ES = ES^U. We aimed to provide point and interval estimates on projected sample sizes for future studies reflecting the uncertainty in our study ES^S. Twenty-four treated RA patients were followed up for 3 months. Utilizing the 12-joint approach and existing methods, the post hoc sample size (95% CI) was 22 (10-245). Corresponding sample sizes using ICUS and IUS were 11 (7-40) and 11 (6-38), respectively. Utilizing a seven-joint approach, the corresponding sample sizes using ICUS and IUS methods were nine (6-24) and 11 (6-35), respectively. Our pilot study suggests that sample size for RA clinical trials with ultrasound endpoints may be reduced using the novel methods, providing justification for larger studies to confirm these observations. © 2017 Asia Pacific League of Associations for Rheumatology and John Wiley & Sons Australia, Ltd.
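    The sample-size logic in the abstract (patient numbers for 80% power at alpha = 0.05 given an estimated effect size and its confidence limits) can be sketched with the usual normal approximation. This is a simplified stand-in for the one-sample t-test calculation the authors describe, and the effect-size values below are invented for illustration:

```python
import math
from statistics import NormalDist

def sample_size(es, alpha=0.05, power=0.80):
    """Normal-approximation sample size for a one-sample test of
    H0: ES = 0 against H1: ES = es (two-sided):
    n = ceil(((z_{1-alpha/2} + z_{power}) / ES)^2)."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)
    z_b = z.inv_cdf(power)
    return math.ceil(((z_a + z_b) / es) ** 2)

# As in the abstract's logic, an interval on n follows from the CI on ES:
# the upper n comes from the lower confidence limit on ES, and vice versa.
es_hat, es_lo, es_hi = 0.63, 0.20, 1.06   # hypothetical values
print(sample_size(es_hat), sample_size(es_hi), sample_size(es_lo))
```

Because n scales with 1/ES^2, even moderate uncertainty in the effect-size estimate produces the wide sample-size intervals reported in the abstract.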

  9. Integrated Modeling and Intelligent Control Methods of Grinding Process

    Directory of Open Access Journals (Sweden)

    Jie-sheng Wang


    The grinding process is a typical complex nonlinear multivariable process with strong coupling and large time delays. Based on data-driven modeling theory, the integrated modeling and intelligent control of the grinding process is carried out in this paper, which includes a soft-sensor model of economic and technical indexes, an optimized set-point model utilizing case-based reasoning, and a self-tuning PID decoupling controller. For forecasting the key technology indicators (grinding granularity and mill discharge rate) of the grinding process, an adaptive soft-sensor modeling method based on a wavelet neural network optimized by the improved shuffled frog leaping algorithm (ISFLA) is proposed. Then, a set-point optimization control strategy based on the case-based reasoning (CBR) method is adopted to obtain the optimized velocity set-points of ore feed and pump water feed in the grinding process control loops. Finally, an optimized self-tuning PID decoupling controller is used to control the grinding process. Simulation results and industrial application experiments clearly show the feasibility and effectiveness of the control methods, which satisfy the real-time control requirements of the grinding process.

  10. Surface processing methods for point sets using finite elements

    NARCIS (Netherlands)

    Clarenz, Ulrich; Rumpf, Martin; Telea, Alexandru


    We present a framework for processing point-based surfaces via partial differential equations (PDEs). Our framework efficiently and effectively brings well-known PDE-based processing techniques to the field of point-based surfaces. At the core of our method is a finite element discretization of PDEs

  11. Indigenous processing methods and raw materials of borde , an ...

    African Journals Online (AJOL)

    A study of village-level processing techniques and raw materials used for the production of borde was carried out using open-ended questionnaires and on the spot interviews with producers at six localities in southern Ethiopia. The major focus of the study was on indigenous processing methods, types and proportions of ...

  12. Process-driven architecture : Design techniques and methods

    NARCIS (Netherlands)

    Jaskiewicz, T.


    This paper explores the notion of process-driven architecture and, as a consequence, application of complex systems in the newly defined area of digital process-driven architectural design in order to formulate a suitable design method. Protospace software environment and SwarmCAD software

  13. Influence of heat processing methods on the nutrient composition ...

    African Journals Online (AJOL)

    Dr. J. T. Ekanem

    ... use of the processed seeds as food for humans and oil extracts for the manufacture of industrial products. Key words: Heat processing methods, Arachis hypogaea seeds, nutrient composition, lipid characterization.

  14. An assessment of oil processing methods and technology in Taraba ...

    African Journals Online (AJOL)

    Information obtained from the questionnaire were bio-data, oil bearing seeds materials, processing methods and technology, scale of operation, packaging and maintenance of the machine. Data collected were analyzed using descriptive statistics. Results: The study shows that oil processing was carried out in all the LGAs.

  15. Sampling frequency affects the processing of Actigraph raw acceleration data to activity counts

    DEFF Research Database (Denmark)

    Brond, J. C.; Arvidsson, D.


    ActiGraph acceleration data are processed through several steps (including band-pass filtering to attenuate unwanted signal frequencies) to generate the activity counts commonly used in physical activity research. We performed three experiments to investigate the effect of sampling frequency...... on the generation of activity counts. Ideal acceleration signals were produced in the MATLAB software. Thereafter, ActiGraph GT3X+ monitors were spun in a mechanical setup. Finally, 20 subjects performed walking and running wearing GT3X+ monitors. Acceleration data from all experiments were collected with different...... the amount of activity counts generated was less, indicating that raw data stored in the GT3X+ monitor is processed. Between 600 and 1,600 more counts per minute were generated with the sampling frequencies 40 and 100 Hz compared with 30 Hz during running. Sampling frequency affects the processing of Acti...

  16. Assessing impacts of DNA extraction methods on next generation sequencing of water and wastewater samples. (United States)

    Walden, Connie; Carbonero, Franck; Zhang, Wen


    Next Generation Sequencing (NGS) is increasingly affordable and easier to perform. However, standard protocols prior to the sequencing step are only available for few selected sample types. Here we investigated the impact of DNA extraction methods on the consistency of NGS results. Four commercial DNA extraction kits (QIAamp DNA Mini Kit, QIAamp DNA Stool Mini Kit, MO BIO Power Water Kit, and MO BIO Power Soil DNA Isolation Kit) were used on sample sources including lake water and wastewater, and sample types including planktonic and biofilm bacteria communities. Sampling locations included a lake water reservoir, a trickling filter, and a moving bed biofilm reactor (MBBR). Unique genera such as Gemmatimonadetes, Elusimicrobia, and Latescibacteria were found in multiple samples. The Stool Mini Kit was least efficient in terms of diversity in sampling results with freshwater lake samples, and surprisingly the Power Water Kit was the least efficient across all sample types examined. Detailed NGS beta diversity comparisons indicated that the Mini Kit and PowerSoil Kit are best suited for studies that extract DNA from a variety of water and wastewater samples. We ultimately recommend application of Mini Kit or PowerSoil Kit as an improvement to NGS protocols for these sampling environments. These results are a step toward achieving accurate comparability of complex samples from water and wastewater environments by applying a single DNA extraction method, further streamlining future investigations. Copyright © 2017 Elsevier B.V. All rights reserved.
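    Beta diversity comparisons of the kind mentioned above are often based on pairwise dissimilarities between community profiles; Bray-Curtis is a common choice. The sketch below is a generic illustration (not the metric or data from this study; the genus counts are invented):

```python
def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two abundance profiles,
    a common beta-diversity measure in sequencing studies:
    BC = sum(|x_i - y_i|) / sum(x_i + y_i)."""
    num = sum(abs(a - b) for a, b in zip(x, y))
    den = sum(a + b for a, b in zip(x, y))
    return num / den

# Hypothetical genus-level counts from two DNA extraction kits on the same sample
kit_a = [120, 30, 0, 50]
kit_b = [100, 45, 10, 45]
print(bray_curtis(kit_a, kit_b))  # 0 = identical communities, 1 = no shared taxa
```

A low dissimilarity between kits on the same source sample would support the paper's goal of comparability across extraction methods.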


    DEFF Research Database (Denmark)


    The invention relates to a method for preparing a substrate (105a) comprising a sample reception area (110) and a sensing area (111). The method comprises the steps of: 1) applying a sample on the sample reception area; 2) rotating the substrate around a predetermined axis; 3) during rotation......, at least part of the liquid travels from the sample reception area to the sensing area due to capillary forces acting between the liquid and the substrate; and 4) removing the wave of particles and liquid formed at one end of the substrate. The sensing area is closer to the predetermined axis than...... the sample reception area. The sample comprises a liquid part and particles suspended therein....

  18. Effect of Processing Methods on the Nutrients and Anti Nutrients ...

    African Journals Online (AJOL)

    The importance of the nutrient and antinutrient compositions of vegetables cannot be overemphasized. The effect of different processing methods on the nutrient and antinutrient compositions of leaves of wild lettuce (Lactuca taraxacifolia), i.e. the sweet type, was evaluated to determine the most appropriate methods for retaining its ...

  19. A generalised interpolating post–processing method for integral ...

    African Journals Online (AJOL)

    The interpolating post-processing method for integral equations has been demonstrated to be superior to the iteration method by Qun Lin, Shechua Zhang and Ningning Yan. They demonstrated that it is of order O(h^(2r+2)). This paper describes a generalization in the choice of h, the mesh size, which leads to a higher order of O ...
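    Convergence orders like O(h^(2r+2)) are typically verified numerically by computing errors at two mesh sizes and estimating the observed order. The sketch below uses synthetic errors (the constant and mesh sizes are invented) rather than the paper's actual integral-equation solver:

```python
import math

def observed_order(h1, e1, h2, e2):
    """Estimate the convergence order p from errors behaving like
    e(h) ~ C * h^p at two mesh sizes:
    p = log(e1 / e2) / log(h1 / h2)."""
    return math.log(e1 / e2) / math.log(h1 / h2)

# Synthetic errors from a method of order 2r+2 with r = 1, i.e. O(h^4)
C, p_true = 3.0, 4
h1, h2 = 0.1, 0.05
e1, e2 = C * h1 ** p_true, C * h2 ** p_true
print(observed_order(h1, e1, h2, e2))  # ~ 4.0
```

Applied to a real solver, the recovered p confirms (or refutes) the theoretical order claimed for the post-processing step.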

  20. Passive sampling methods for contaminated sediments: Scientific rationale supporting use of freely dissolved concentrations (United States)

    Mayer, Philipp; Parkerton, Thomas F; Adams, Rachel G; Cargill, John G; Gan, Jay; Gouin, Todd; Gschwend, Philip M; Hawthorne, Steven B; Helm, Paul; Witt, Gesine; You, Jing; Escher, Beate I


    Passive sampling methods (PSMs) allow the quantification of the freely dissolved concentration (Cfree) of an organic contaminant even in complex matrices such as sediments. Cfree is directly related to a contaminant's chemical activity, which drives spontaneous processes including diffusive uptake into benthic organisms and exchange with the overlying water column. Consequently, Cfree provides a more relevant dose metric than total sediment concentration. Recent developments in PSMs have significantly improved our ability to reliably measure even very low levels of Cfree. Application of PSMs in sediments is preferably conducted in the equilibrium regime, where freely dissolved concentrations in the sediment are well-linked to the measured concentration in the sampler via analyte-specific partition ratios. The equilibrium condition can then be assured by measuring a time series or a single time point using passive samplers with different surface to volume ratios. Sampling in the kinetic regime is also possible and generally involves the application of performance reference compounds for the calibration. Based on previous research on hydrophobic organic contaminants, it is concluded that Cfree allows a direct assessment of 1) contaminant exchange and equilibrium status between sediment and overlying water, 2) benthic bioaccumulation, and 3) potential toxicity to benthic organisms. Thus, the use of PSMs to measure Cfree provides an improved basis for the mechanistic understanding of fate and transport processes in sediments and has the potential to significantly improve risk assessment and management of contaminated sediments. Integr Environ Assess Manag 2014;10:197–209. © 2014 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of SETAC. PMID:24288295
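    The partitioning arithmetic behind Cfree can be sketched directly from the abstract's description. This is a schematic illustration with invented numbers, and the kinetic correction assumes simple first-order exchange, which is a common but simplifying model for performance reference compounds (PRCs):

```python
def c_free_equilibrium(c_sampler, k_pw):
    """Freely dissolved concentration from an equilibrium passive sampler:
    Cfree = Cp / Kpw, where Cp is the analyte concentration in the polymer
    and Kpw is the analyte-specific polymer-water partition ratio."""
    return c_sampler / k_pw

def c_free_kinetic(c_sampler, k_pw, prc_fraction_remaining):
    """Kinetic-regime correction using a PRC: under a first-order exchange
    assumption, the fraction of equilibrium reached by the analyte is
    estimated as 1 minus the fraction of PRC still in the sampler."""
    equilibrium_fraction = 1.0 - prc_fraction_remaining
    return c_sampler / (k_pw * equilibrium_fraction)

# Hypothetical numbers: 500 ng/g in the polymer, Kpw = 10^5 (mL water per g polymer)
print(c_free_equilibrium(500.0, 10 ** 5))     # ng/mL at equilibrium
print(c_free_kinetic(250.0, 10 ** 5, 0.5))    # kinetic sample, PRC half dissipated
```

The equilibrium check described in the abstract, comparing samplers with different surface-to-volume ratios or time points, is what justifies using the simpler equilibrium formula.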