WorldWideScience

Sample records for sampling technique based

  1. RF Sub-sampling Receiver Architecture based on Milieu Adapting Techniques

    DEFF Research Database (Denmark)

    Behjou, Nastaran; Larsen, Torben; Jensen, Ole Kiel

    2012-01-01

    A novel sub-sampling-based architecture is proposed which has the ability to reduce the problem of image distortion and to improve the signal-to-noise ratio significantly. The technique is based on sensing the environment and adapting the sampling rate of the receiver to the best possible...

  2. The role of graphene-based sorbents in modern sample preparation techniques.

    Science.gov (United States)

    de Toffoli, Ana Lúcia; Maciel, Edvaldo Vasconcelos Soares; Fumes, Bruno Henrique; Lanças, Fernando Mauro

    2018-01-01

    The application of graphene-based sorbents in sample preparation techniques has increased significantly since 2011. These materials have good physicochemical properties for use as sorbents and have shown excellent results in different sample preparation techniques. Graphene and its precursor graphene oxide have been considered good candidates to improve the extraction and concentration of different classes of target compounds (e.g., parabens, polycyclic aromatic hydrocarbons, pyrethroids, triazines, and so on) present in complex matrices. They have been employed in the analysis of different matrices (e.g., environmental, biological and food). In this review, we highlight the most important characteristics of graphene-based materials, their properties, synthesis routes, and the most important applications in both off-line and on-line sample preparation techniques. The discussion of the off-line approaches includes methods derived from conventional solid-phase extraction, focusing on the miniaturized magnetic and dispersive modes. The microextraction techniques called stir bar sorptive extraction, solid-phase microextraction, and microextraction by packed sorbent are also discussed. The on-line approaches focus on the use of graphene-based materials mainly in on-line solid-phase extraction, its variation called in-tube solid-phase microextraction, and on-line microdialysis systems. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Sample preparation techniques based on combustion reactions in closed vessels - A brief overview and recent applications

    International Nuclear Information System (INIS)

    Flores, Erico M.M.; Barin, Juliano S.; Mesko, Marcia F.; Knapp, Guenter

    2007-01-01

    In this review, a general discussion of sample preparation techniques based on combustion reactions in closed vessels is presented. Applications for several kinds of samples are described, taking into account the literature reported in the last 25 years. The operational conditions as well as the main characteristics and drawbacks are discussed for bomb combustion, oxygen flask and microwave-induced combustion (MIC) techniques. Recent applications of MIC techniques are discussed with special concern for samples not well digested by conventional microwave-assisted wet digestion, such as coal, and also for the subsequent determination of halogens.

  4. NAIL SAMPLING TECHNIQUE AND ITS INTERPRETATION

    Directory of Open Access Journals (Sweden)

    TZAR MN

    2011-01-01

    The clinical suspicion of onychomycosis, based on the appearance of the nails, requires culture for confirmation. This is because treatment requires prolonged use of systemic agents which may cause side effects. One of the common problems encountered is improper nail sampling technique, which results in loss of essential information. The unfamiliar terminologies used in reporting culture results may intimidate physicians, resulting in misinterpretation, and hamper treatment decisions. This article provides a simple guide on nail sampling technique and the interpretation of culture results.

  5. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling, particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
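    The alias method referred to in this abstract can be illustrated with a short sketch (a minimal, hypothetical Python implementation of Vose's alias method for discrete sampling; it is not the MCNP subroutine described in the paper):

```python
import random

def build_alias_table(probs):
    """Vose's alias method: O(n) preprocessing of a discrete
    probability distribution into probability/alias tables."""
    n = len(probs)
    prob, alias = [0.0] * n, [0] * n
    scaled = [p * n for p in probs]
    small = [i for i, p in enumerate(scaled) if p < 1.0]
    large = [i for i, p in enumerate(scaled) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l
        # donate probability mass from the large column to fill column s
        scaled[l] -= 1.0 - scaled[s]
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:  # leftovers are numerically 1.0
        prob[i] = 1.0
    return prob, alias

def alias_sample(prob, alias, rng=random):
    """O(1) draw: pick a column uniformly, then one biased coin flip."""
    i = rng.randrange(len(prob))
    return i if rng.random() < prob[i] else alias[i]
```

    Building the table costs O(n); each draw afterwards needs only one uniform index and one comparison, which is what makes the per-sample cost O(1) regardless of the number of voxels.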

  6. Whole arm manipulation planning based on feedback velocity fields and sampling-based techniques.

    Science.gov (United States)

    Talaei, B; Abdollahi, F; Talebi, H A; Omidi Karkani, E

    2013-09-01

    Changing the configuration of a cooperative whole arm manipulator is not easy while enclosing an object. This difficulty is mainly because of risk of jamming caused by kinematic constraints. To reduce this risk, this paper proposes a feedback manipulation planning algorithm that takes grasp kinematics into account. The idea is based on a vector field that imposes perturbation in object motion inducing directions when the movement is considerably along manipulator redundant directions. Obstacle avoidance problem is then considered by combining the algorithm with sampling-based techniques. As experimental results confirm, the proposed algorithm is effective in avoiding jamming as well as obstacles for a 6-DOF dual arm whole arm manipulator. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  7. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the required precision, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques because the results of the study can be generalized to the target population.
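    The sample size estimation described in this abstract, for a proportion at a given confidence level and margin of error, follows the standard formula n = z^2 p(1 - p) / d^2, with a finite population correction when the study population is small. A minimal sketch (the function names are our own):

```python
import math

# Common two-sided z-values for the usual confidence levels.
Z_VALUES = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}

def sample_size_proportion(p, margin, confidence=0.95):
    """n = z^2 * p * (1 - p) / d^2 for an expected proportion p
    and desired absolute precision (margin of error) d."""
    z = Z_VALUES[confidence]
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

def fpc_adjust(n, population):
    """Finite population correction: shrink n when the study
    population itself is small relative to the sample."""
    return math.ceil(n / (1 + (n - 1) / population))
```

    For p = 0.5 (the most conservative assumption), a 5% margin and 95% confidence, this gives the familiar n = 385.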

  8. Urine sampling techniques in symptomatic primary-care patients

    DEFF Research Database (Denmark)

    Holm, Anne; Aabenhus, Rune

    2016-01-01

    Background: Choice of urine sampling technique in urinary tract infection may impact diagnostic accuracy and thus lead to possible over- or undertreatment. Currently no evidence-based consensus exists regarding correct sampling technique of urine from women with symptoms of urinary tract infection... a randomized or paired design to compare the result of urine culture obtained with two or more collection techniques in adult, female, non-pregnant patients with symptoms of urinary tract infection. We evaluated quality of the studies and compared accuracy based on dichotomized outcomes. Results: We included... in infection rate between mid-stream-clean-catch, mid-stream-urine and random samples. Conclusions: At present, no evidence suggests that sampling technique affects the accuracy of the microbiological diagnosis in non-pregnant women with symptoms of urinary tract infection in primary care. However...

  9. Sample preparation for large-scale bioanalytical studies based on liquid chromatographic techniques.

    Science.gov (United States)

    Medvedovici, Andrei; Bacalum, Elena; David, Victor

    2018-01-01

    Quality of the analytical data obtained for large-scale and long-term bioanalytical studies based on liquid chromatography depends on a number of experimental factors, including the choice of sample preparation method. This review discusses this tedious part of bioanalytical studies, applied to large-scale samples and using liquid chromatography coupled with different detector types as the core analytical technique. The main sample preparation methods included in this paper are protein precipitation, liquid-liquid extraction, solid-phase extraction, derivatization and their variants. They are discussed in terms of analytical performance, fields of application, advantages and disadvantages. The cited literature covers mainly the analytical achievements of the last decade, although several earlier papers that have become more valuable over time are also included in this review. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    Directory of Open Access Journals (Sweden)

    Peter Feist

    2015-02-01

    Proteins regulate many cellular functions and analyzing the presence and abundance of proteins in biological samples are central focuses in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed.

  11. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    Science.gov (United States)

    Feist, Peter; Hummon, Amanda B.

    2015-01-01

    Proteins regulate many cellular functions and analyzing the presence and abundance of proteins in biological samples are central focuses in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed. PMID:25664860

  12. Proteomic challenges: sample preparation techniques for microgram-quantity protein analysis from biological samples.

    Science.gov (United States)

    Feist, Peter; Hummon, Amanda B

    2015-02-05

    Proteins regulate many cellular functions and analyzing the presence and abundance of proteins in biological samples are central focuses in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed.

  13. The particle analysis based on FT-TIMS technique for swipe sample under the frame of nuclear safeguard

    International Nuclear Information System (INIS)

    Yang Tianli; Liu Xuemei; Liu Zhao; Tang Lei; Long Kaiming

    2008-06-01

    Under the framework of nuclear safeguards, particle analysis of swipe samples is an advanced means of detecting undeclared uranium enrichment facilities and undeclared uranium enrichment activities. A technique for particle analysis of swipe samples based on fission track-thermal ionization mass spectrometry (FT-TIMS) has been established. The reliability of, and the experimental background for, selecting uranium-bearing particles from swipe samples by the FT method have been verified. In addition, the utilization coefficient of particles on the surface of the swipe sample has also been tested. This work provides technical support for applications in the area of nuclear verification. (authors)

  14. Two sampling techniques for game meat

    OpenAIRE

    van der Merwe, Maretha; Jooste, Piet J.; Hoffman, Louw C.; Calitz, Frikkie J.

    2013-01-01

    A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling...

  15. A line-based vegetation sampling technique and its application in ...

    African Journals Online (AJOL)

    percentage cover, density and intercept frequency) and also provides plant size distributions, yet requires no more sampling effort than the line-intercept method. A field test of the three techniques in succulent karoo showed that the discriminating ...

  16. Two sampling techniques for game meat

    Directory of Open Access Journals (Sweden)

    Maretha van der Merwe

    2013-03-01

    A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling techniques were conducted on the same game carcasses (n = 13) and analyses performed for aerobic plate count (APC), Escherichia coli and Staphylococcus aureus, for both techniques. A more representative result was obtained by swabbing and no damage was caused to the carcass. Conversely, the excision technique yielded fewer organisms and caused minor damage to the carcass. The recovery ratio from the swabbing technique improved 5.4 times for APC, 108.0 times for E. coli and 3.4 times for S. aureus over the results obtained from the excision technique. It was concluded that the sampling methods of excision and swabbing can be used to obtain bacterial profiles from both export and local carcasses and could be used to indicate whether game carcasses intended for the local market are possibly on par with game carcasses intended for the export market and therefore safe for human consumption.

  17. Two sampling techniques for game meat.

    Science.gov (United States)

    van der Merwe, Maretha; Jooste, Piet J; Hoffman, Louw C; Calitz, Frikkie J

    2013-03-20

    A study was conducted to compare the excision sampling technique used by the export market and the sampling technique preferred by European countries, namely the biotrace cattle and swine test. The measuring unit for the excision sampling was grams (g) and square centimetres (cm2) for the swabbing technique. The two techniques were compared after a pilot test was conducted on spiked approved beef carcasses (n = 12) that statistically proved the two measuring units correlated. The two sampling techniques were conducted on the same game carcasses (n = 13) and analyses performed for aerobic plate count (APC), Escherichia coli and Staphylococcus aureus, for both techniques. A more representative result was obtained by swabbing and no damage was caused to the carcass. Conversely, the excision technique yielded fewer organisms and caused minor damage to the carcass. The recovery ratio from the sampling technique improved 5.4 times for APC, 108.0 times for E. coli and 3.4 times for S. aureus over the results obtained from the excision technique. It was concluded that the sampling methods of excision and swabbing can be used to obtain bacterial profiles from both export and local carcasses and could be used to indicate whether game carcasses intended for the local market are possibly on par with game carcasses intended for the export market and therefore safe for human consumption.

  18. NAIL SAMPLING TECHNIQUE AND ITS INTERPRETATION

    OpenAIRE

    TZAR MN; LEELAVATHI M

    2011-01-01

    The clinical suspicion of onychomycosis, based on the appearance of the nails, requires culture for confirmation. This is because treatment requires prolonged use of systemic agents which may cause side effects. One of the common problems encountered is improper nail sampling technique, which results in loss of essential information. The unfamiliar terminologies used in reporting culture results may intimidate physicians, resulting in misinterpretation, and hamper treatment decisions. This article prov...

  19. Radioisotope Sample Measurement Techniques in Medicine and Biology. Proceedings of the Symposium on Radioisotope Sample Measurement Techniques

    International Nuclear Information System (INIS)

    1965-01-01

    The medical and biological applications of radioisotopes depend on two basically different types of measurements, those on living subjects in vivo and those on samples in vitro. The International Atomic Energy Agency has in the past held several meetings on in vivo measurement techniques, notably whole-body counting and radioisotope scanning. The present volume contains the Proceedings of the first Symposium the Agency has organized to discuss the various aspects of techniques for sample measurement in vitro. The range of these sample measurement techniques is very wide. The sample may weigh a few milligrams or several hundred grams, and may be in the gaseous, liquid or solid state. Its radioactive content may consist of a single, known radioisotope or several unknown ones. The concentration of radioactivity may be low, medium or high. The measurements may be made manually or automatically and any one of the many radiation detectors now available may be used. The 53 papers presented at the Symposium illustrate the great variety of methods now in use for radioactive-sample measurements. The first topic discussed is gamma-ray spectrometry, which finds an increasing number of applications in sample measurements. Other sections of the Proceedings deal with: the use of computers in gamma-ray spectrometry and multiple tracer techniques; recent developments in activation analysis where both gamma-ray spectrometry and computing techniques are applied; thin-layer and paper radiochromatographic techniques for use with low-energy beta-ray emitters; various aspects of liquid scintillation counting techniques in the measurement of alpha- and beta-ray emitters, including chemical and colour quenching; autoradiographic techniques; calibration of equipment; and standardization of radioisotopes. Finally, some applications of solid-state detectors are presented; this section may be regarded as a preview of important future developments. The meeting was attended by 203 participants.

  20. LVQ-SMOTE - Learning Vector Quantization based Synthetic Minority Over-sampling Technique for biomedical data.

    Science.gov (United States)

    Nakamura, Munehiro; Kajiwara, Yusuke; Otsuka, Atsushi; Kimura, Haruhiko

    2013-10-02

    Over-sampling methods based on the Synthetic Minority Over-sampling Technique (SMOTE) have been proposed for classification problems involving imbalanced biomedical data. However, the existing over-sampling methods achieve only slightly better, or sometimes worse, results than the simplest SMOTE. In order to improve the effectiveness of SMOTE, this paper presents a novel over-sampling method using codebooks obtained by learning vector quantization. In general, even when an existing SMOTE variant is applied to a biomedical dataset, the empty feature space is still so large that most classification algorithms do not perform well at estimating borderlines between classes. To tackle this problem, our over-sampling method generates synthetic samples which occupy more feature space than the other SMOTE algorithms. Briefly, our over-sampling method generates useful synthetic samples by referring to actual samples taken from real-world datasets. Experiments on eight real-world imbalanced datasets demonstrate that our proposed over-sampling method performs better than the simplest SMOTE on four of five standard classification algorithms. Moreover, the performance of our method increases if the latest SMOTE variant, called MWMOTE, is used in our algorithm. Experiments on datasets for β-turn type prediction show some important patterns that have not been seen in previous analyses. The proposed over-sampling method generates useful synthetic samples for the classification of imbalanced biomedical data. Besides, the proposed over-sampling method is basically compatible with standard classification algorithms and the existing over-sampling methods.
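    The core SMOTE step that this paper builds on, interpolating between a minority sample and one of its k nearest minority neighbours, can be sketched as follows (a plain-Python illustration of basic SMOTE under our own naming; the LVQ codebook extension proposed in the paper is not shown):

```python
import math
import random

def smote(minority, n_synthetic, k=5, rng=None):
    """Generate synthetic minority samples by linear interpolation
    between a minority point and one of its k nearest minority
    neighbours (the basic SMOTE step)."""
    rng = rng or random.Random()
    synthetic = []
    for _ in range(n_synthetic):
        base = rng.choice(minority)
        # k nearest minority neighbours of the chosen base point
        neighbours = sorted(
            (p for p in minority if p is not base),
            key=lambda p: math.dist(base, p),
        )[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # random position along the segment
        synthetic.append(tuple(b + gap * (q - b) for b, q in zip(base, nb)))
    return synthetic
```

    Because each synthetic point lies on a segment between two real minority points, the new samples stay inside the region already occupied by the minority class; the paper's criticism is precisely that this leaves much of the feature space empty.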

  1. Water sampling techniques for continuous monitoring of pesticides in water

    Directory of Open Access Journals (Sweden)

    Šunjka Dragana

    2017-01-01

    Good ecological and chemical status of water represents the most important aim of the Water Framework Directive 2000/60/EC, which implies compliance with water quality standards at the level of the entire river basin (2008/105/EC and 2013/39/EC). This especially refers to the control of pesticide residues in surface waters. In order to achieve the set goals, a continuous monitoring program that provides a comprehensive and interrelated overview of water status should be implemented. However, it demands the use of appropriate analysis techniques. Until now, the procedure for sampling and quantification of residual pesticides in the aquatic environment was based on the use of traditional sampling techniques that imply periodic collection of individual samples. However, this type of sampling provides only a snapshot of the presence of pollutants in water at the moment of sampling. As an alternative, the technique of passive sampling of pollutants in water, including pesticides, has been introduced. Different samplers are available for pesticide sampling in surface water, depending on the compounds. The technique itself is based on keeping a device in water over an extended period of time, varying from several days to several weeks depending on the kind of compound. In this manner, the average concentrations of pollutants dissolved in water during a time period (time-weighted average concentrations, TWA) are obtained, which enables monitoring of trends in areal and seasonal variations. The use of these techniques also leads to an increase in the sensitivity of analytical methods, considering that pre-concentration of analytes takes place within the sorption medium. However, the use of these techniques for determination of pesticide concentrations in real water environments requires calibration studies for the estimation of sampling rates (Rs). Rs is a volume of water per time, calculated as the product of overall mass transfer coefficient and area of
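    The quantities above can be illustrated numerically. The abstract defines the sampling rate Rs as the product of the overall mass transfer coefficient and the sampler area, and the time-weighted average concentration then follows from the standard passive-sampler relation C_TWA = m / (Rs * t), where m is the analyte mass accumulated in the sorbent. A minimal sketch (units and function names are our own assumptions):

```python
def sampling_rate(k_o, area):
    """Rs (L/day): overall mass transfer coefficient (L/(dm2*day))
    multiplied by sampler surface area (dm2). Units are illustrative."""
    return k_o * area

def twa_concentration(mass_ng, rs_l_per_day, days):
    """Time-weighted average water concentration (ng/L):
    C_TWA = m / (Rs * t), with m the mass accumulated in the sorbent."""
    return mass_ng / (rs_l_per_day * days)
```

    For example, 140 ng of a pesticide accumulated over a 14-day deployment at Rs = 0.5 L/day corresponds to a TWA concentration of 20 ng/L.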

  2. Critical evaluation of sample pretreatment techniques.

    Science.gov (United States)

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  3. Sample preparation techniques for (p, X) spectrometry

    International Nuclear Information System (INIS)

    Whitehead, N.E.

    1985-01-01

    Samples are ashed at low temperature using an oxygen plasma; a rotary evaporator and freeze drying speeded up the ashing. The newly designed apparatus was rated at only 10 W but was as efficient as a 200 W commercial machine; a circuit diagram is included. Samples of hair and biopsy samples of skin were analysed by the technique. A wool standard was prepared for interlaboratory comparison exercises. It was based on New Zealand merino sheep wool and was 2.9 kg in weight. A washing protocol was developed which preserves most of the trace element content. The wool was ground in liquid nitrogen using a plastic pestle and beaker driven by a rotary drill press. (author)

  4. Magnetic separation techniques in sample preparation for biological analysis: a review.

    Science.gov (United States)

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all analytical procedures, especially for the analysis of complex samples such as biological and environmental samples. In recent decades, owing to their superparamagnetic properties, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various sample preparation processes for biological analysis. In this paper, the recent advances in magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis are reviewed. The strategy of magnetic separation techniques is summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles are reviewed in detail. Characterization of magnetic materials is also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of proteins, nucleic acids, cells and bioactive compounds, and for the immobilization of enzymes, are described. Finally, existing problems and possible future trends of magnetic separation techniques for biological analysis are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Investigation of CPD and HMDS Sample Preparation Techniques for Cervical Cells in Developing Computer-Aided Screening System Based on FE-SEM/EDX

    Science.gov (United States)

    Ng, Siew Cheok; Abu Osman, Noor Azuan

    2014-01-01

    This paper investigated the effects of critical-point drying (CPD) and hexamethyldisilazane (HMDS) sample preparation techniques for cervical cells on field emission scanning electron microscopy and energy dispersive X-ray analysis (FE-SEM/EDX). We investigated the visualization of the cervical cell image and the elemental distribution on the cervical cell for the two sample preparation techniques. Using FE-SEM/EDX, cervical cell images were captured and the cell element compositions extracted for both sample preparation techniques. Cervical cell image quality, elemental composition, and processing time were considered when comparing performance. Qualitatively, the FE-SEM image based on the HMDS preparation technique had better image quality than the CPD technique in terms of the degree of cell spreading on the specimen and morphologic signs of cell deterioration (i.e., existence of plate and pellet drying artifacts and membrane blebs). Quantitatively, with mapping and line-scanning EDX analysis, carbon and oxygen element compositions in the HMDS technique were higher than in the CPD technique in terms of weight percentages. The HMDS technique has a shorter processing time than the CPD technique. The results indicate that FE-SEM imaging, elemental composition, and processing time for sample preparation with the HMDS technique were better than with the CPD technique as a cervical cell preparation technique for developing a computer-aided screening system. PMID:25610902

  6. Newly introduced sample preparation techniques: towards miniaturization.

    Science.gov (United States)

    Costa, Rosaria

    2014-01-01

    Sampling and sample preparation are of crucial importance in an analytical procedure, representing quite often a source of errors. The technique chosen for the isolation of analytes greatly affects the success of a chemical determination. On the other hand, growing concerns about environmental and human safety, along with the introduction of international regulations for quality control, have moved the interest of scientists towards specific needs. Newly introduced sample preparation techniques are challenged to meet new criteria: (i) miniaturization, (ii) higher sensitivity and selectivity, and (iii) automation. In this survey, the most recent techniques introduced in the field of sample preparation will be described and discussed, along with many examples of applications.

  7. Comparison of sampling techniques for use in SYVAC

    International Nuclear Information System (INIS)

    Dalrymple, G.J.

    1984-01-01

    The Stephen Howe review (reference TR-STH-1) recommended the use of a deterministic generator (DG) sampling technique for sampling the input values to the SYVAC (SYstems Variability Analysis Code) program. This technique was compared with Monte Carlo simple random sampling (MC), taking a 1000-run MC case of SYVAC as the reference. The results show that DG appears relatively inaccurate for most values of consequence when used with 11 sample intervals. If 22 sample intervals are used, DG generates cumulative distribution functions that are statistically similar to the reference distribution. 400 runs of DG or MC are adequate to generate a representative cumulative distribution function. The MC technique appears to perform better than DG for the same number of runs. However, DG predicts higher doses, and in view of the importance of generating data in the high-dose region, this sampling technique with 22 sample intervals is recommended for use in SYVAC. (author)
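    The comparison the abstract describes can be sketched numerically. Below is a minimal, hypothetical Python illustration (a toy consequence model, not SYVAC itself): the model is sampled by Monte Carlo simple random sampling and by a deterministic, equal-probability-interval generator with 11 or 22 intervals, and each empirical CDF is compared against a large reference run via its maximum CDF gap.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def consequence(x):
        # Toy stand-in for a consequence model: a smooth, monotone function
        # of a single uniform input (invented for illustration, not SYVAC).
        return np.exp(2.0 * x) / (1.0 + x)

    def mc_sample(n):
        # Monte Carlo simple random sampling of the input.
        return consequence(rng.uniform(0.0, 1.0, n))

    def dg_sample(n, intervals):
        # Deterministic-generator-style sampling: divide the input range into
        # equal-probability intervals and sample each interval's midpoint the
        # same number of times.
        edges = np.linspace(0.0, 1.0, intervals + 1)
        mids = 0.5 * (edges[:-1] + edges[1:])
        return consequence(np.tile(mids, n // intervals + 1)[:n])

    def max_cdf_gap(samples, reference):
        # Kolmogorov-Smirnov-style maximum gap between two empirical CDFs,
        # evaluated at the reference points.
        grid = np.sort(reference)
        ecdf = lambda s: np.searchsorted(np.sort(s), grid, side="right") / len(s)
        return float(np.max(np.abs(ecdf(samples) - ecdf(reference))))

    reference = mc_sample(100_000)    # stands in for the large reference case
    print("MC,  400 runs :", max_cdf_gap(mc_sample(400), reference))
    print("DG, 11 strata :", max_cdf_gap(dg_sample(400, 11), reference))
    print("DG, 22 strata :", max_cdf_gap(dg_sample(400, 22), reference))
    ```

    Finer deterministic strata track the reference CDF more closely, mirroring the 11-versus-22-interval finding above.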

  8. Review of online coupling of sample preparation techniques with liquid chromatography.

    Science.gov (United States)

    Pan, Jialiang; Zhang, Chengjiang; Zhang, Zhuomin; Li, Gongke

    2014-03-07

    Sample preparation is still considered the bottleneck of the whole analytical procedure, and efforts have been directed towards automation, improved sensitivity and accuracy, and low consumption of organic solvents. Development of online sample preparation (SP) techniques coupled with liquid chromatography (LC) is a promising way to achieve these goals and has attracted great attention. This article reviews recent advances in online SP-LC techniques. Various online SP techniques are described and summarized, including solid-phase-based extraction, liquid-phase-based extraction assisted with membranes, microwave-assisted extraction, ultrasonic-assisted extraction, accelerated solvent extraction and supercritical fluid extraction. In particular, the coupling approaches of online SP-LC systems and the corresponding interfaces, such as online injectors, autosamplers combined with transport units, desorption chambers and column switching, are discussed and reviewed in detail. Typical applications of online SP-LC techniques are summarized. Finally, the problems and expected trends in this field are discussed in order to encourage the further development of online SP-LC techniques. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Development of sampling techniques for ITER Type B radwaste

    International Nuclear Information System (INIS)

    Hong, Kwon Pyo; Kim, Sung Geun; Jung, Sang Hee; Oh, Wan Ho; Park, Myung Chul; Kim, Hee Moon; Ahn, Sang Bok

    2016-01-01

    There are several difficulties and limitations in sampling activities. As the Type B radwaste components are mostly metallic (mostly stainless steel) and bulky (∼ 1 m in size and ∼ 100 mm in thickness), it is difficult to take samples from the surface of Type B radwaste by remote operation. Moreover, sampling should be performed without the use of any liquid coolant to avoid the spread of contamination, and all sampling procedures are carried out in the hot cell red zone by remote operation. Three kinds of sampling techniques are being developed: core sampling, chip sampling, and wedge sampling, which are the candidate sampling techniques to be applied to the ITER hot cell. The object materials for sampling are stainless steel and Cu alloy blocks, chosen to simulate ITER Type B radwaste. The best of the three sampling techniques for ITER Type B radwaste will be suggested once the related experiments are finished

  10. Development of sampling techniques for ITER Type B radwaste

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Kwon Pyo; Kim, Sung Geun; Jung, Sang Hee; Oh, Wan Ho; Park, Myung Chul; Kim, Hee Moon; Ahn, Sang Bok [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    There are several difficulties and limitations in sampling activities. As the Type B radwaste components are mostly metallic (mostly stainless steel) and bulky (∼ 1 m in size and ∼ 100 mm in thickness), it is difficult to take samples from the surface of Type B radwaste by remote operation. Moreover, sampling should be performed without the use of any liquid coolant to avoid the spread of contamination, and all sampling procedures are carried out in the hot cell red zone by remote operation. Three kinds of sampling techniques are being developed: core sampling, chip sampling, and wedge sampling, which are the candidate sampling techniques to be applied to the ITER hot cell. The object materials for sampling are stainless steel and Cu alloy blocks, chosen to simulate ITER Type B radwaste. The best of the three sampling techniques for ITER Type B radwaste will be suggested once the related experiments are finished.

  11. Boat sampling technique for assessment of ageing of components

    International Nuclear Information System (INIS)

    Kumar, Kundan; Shyam, T.V.; Kayal, J.N.; Rupani, B.B.

    2006-01-01

    Boat sampling technique (BST) is a surface sampling technique developed for obtaining, in situ, metal samples from the surface of an operating component without affecting its operating service life. BST is non-destructive in nature, and the sample is obtained without plastic deformation or thermal degradation of the parent material. The shape and size of the sample depend upon the shape of the cutter and the surface geometry of the parent material. Miniature test specimens are generated from the sample and subjected to various tests, viz. metallurgical evaluation, metallographic evaluation, micro-hardness evaluation, sensitisation testing, small punch testing etc., to confirm the integrity and assess the safe operating life of the component. This paper highlights the design objectives of the boat sampling technique; the description of the sampling module, the sampling cutter and its performance evaluation, the cutting process, the boat samples and the operational sequence of the sampling module; the qualification of the sampling module, of the sampling technique and of the scooped region of the parent material; the sample retrieval system; and the inspection, testing and examination to be carried out on the boat samples and the scooped region. (author)

  12. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Cremers, D.A.; Archuleta, F.L.; Dilworth, H.C.

    1985-01-01

    Based on the data obtained in this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed.

  13. Non-terminal blood sampling techniques in guinea pigs.

    Science.gov (United States)

    Birck, Malene M; Tveden-Nyborg, Pernille; Lindblad, Maiken M; Lykkesfeldt, Jens

    2014-10-11

    Guinea pigs possess several biological similarities to humans and are validated experimental animal models(1-3). However, guinea pigs currently represent a relatively narrow area of research, and descriptive data on specific methodology are correspondingly scarce. The anatomical features of guinea pigs differ slightly from those of other rodent models, hence modifications of sampling techniques to accommodate species-specific differences, e.g., compared to mice and rats, are necessary to obtain sufficient, high-quality samples. As both long- and short-term in vivo studies often require repeated blood sampling, the choice of technique should be well considered in order to reduce stress and discomfort in the animals, but also to ensure survival as well as compliance with requirements for sample size and accessibility. Venous blood samples can be obtained at a number of sites in guinea pigs, e.g., the saphenous and jugular veins, each technique with its own advantages and disadvantages(4,5). Here, we present four different blood sampling techniques for either conscious or anaesthetized guinea pigs. The procedures are all non-terminal provided that sample volumes and number of samples do not exceed guidelines for blood collection in laboratory animals(6). All the described methods have been thoroughly tested and applied for repeated in vivo blood sampling in studies within our research facility.

  14. Differences in sampling techniques on total post-mortem tryptase.

    Science.gov (United States)

    Tse, R; Garland, J; Kesha, K; Elstub, H; Cala, A D; Ahn, Y; Stables, S; Palmiere, C

    2017-11-20

    The measurement of mast cell tryptase is commonly used to support the diagnosis of anaphylaxis. In the post-mortem setting, the literature recommends sampling from peripheral blood sources (femoral blood) but does not specify the exact sampling technique. Sampling techniques vary between pathologists, and it is unclear whether different sampling techniques have any impact on post-mortem tryptase levels. The aim of this study is to compare the difference in femoral total post-mortem tryptase levels between two sampling techniques. A 6-month retrospective study compared femoral total post-mortem tryptase levels between (1) aspirating femoral vessels with a needle and syringe prior to evisceration and (2) femoral vein cut-down during evisceration. Twenty cases were identified, with three cases excluded from analysis. There was a statistically significant difference (paired t test) in femoral total post-mortem tryptase between the two sampling methods. The clinical significance of this finding, and what factors may contribute to it, are unclear. When requesting post-mortem tryptase, the pathologist should consider documenting the exact blood collection site and the method used for collection. In addition, blood samples acquired by different techniques should not be mixed together and should be analyzed separately if possible.

  15. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques now dominate it. Metabolomic studies also require application of a proper analytical technique for the determination of endogenous metabolites present in biological matrices at trace concentration levels. Owing to the reproducibility of data, precision, relatively low cost of analysis, simplicity of determination, and the possibility of directly combining these techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction, to emphasize the use of newly synthesized sorbents, and to bring together studies concerning a systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Simulative Investigation on Spectral Efficiency of Unipolar Codes based OCDMA System using Importance Sampling Technique

    Science.gov (United States)

    Farhat, A.; Menif, M.; Rezig, H.

    2013-09-01

    This paper analyses the spectral efficiency of an Optical Code Division Multiple Access (OCDMA) system using the Importance Sampling (IS) technique. We consider three configurations of the OCDMA system, namely Direct Sequence (DS), Spectral Amplitude Coding (SAC) and Fast Frequency Hopping (FFH), that exploit Fiber Bragg Grating (FBG) based encoders/decoders. We evaluate the spectral efficiency of the considered system by taking into consideration the effect of different families of unipolar codes for both coherent and incoherent sources. The results show that the spectral efficiency of the OCDMA system with a coherent source is higher than in the incoherent case. We also demonstrate that DS-OCDMA outperforms both others in terms of spectral efficiency in all conditions.
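    The paper's simulation machinery is not reproduced here, but the Importance Sampling idea it relies on can be sketched in isolation. A minimal, hypothetical Python example (assumed Gaussian noise, unrelated to the OCDMA model): a rare tail probability, analogous to a decision-error rate, is estimated both by naive Monte Carlo and by IS with a sampling density shifted into the tail and likelihood-ratio reweighting.

    ```python
    import numpy as np
    from math import erf, sqrt

    rng = np.random.default_rng(0)
    threshold = 4.0        # decision threshold; exceeding it is a rare "error"
    n = 100_000

    # Exact tail probability P(X > threshold) for standard normal noise.
    exact = 0.5 * (1.0 - erf(threshold / sqrt(2.0)))

    # Naive Monte Carlo: almost no samples land in the tail.
    naive = float(np.mean(rng.standard_normal(n) > threshold))

    # Importance sampling: draw from a normal shifted into the tail, then
    # reweight each sample by the likelihood ratio f(y)/g(y).
    shift = threshold
    y = rng.standard_normal(n) + shift
    weights = np.exp(-shift * y + 0.5 * shift ** 2)   # N(0,1) pdf / N(shift,1) pdf
    is_est = float(np.mean((y > threshold) * weights))

    print(f"exact    : {exact:.3e}")
    print(f"naive MC : {naive:.3e}")
    print(f"IS       : {is_est:.3e}")
    ```

    With the same sample budget, the IS estimate concentrates its samples where the rare event happens, which is what makes very low error rates tractable to simulate.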

  17. Advances in paper-based sample pretreatment for point-of-care testing.

    Science.gov (United States)

    Tang, Rui Hua; Yang, Hui; Choi, Jane Ru; Gong, Yan; Feng, Shang Sheng; Pingguan-Murphy, Belinda; Huang, Qing Sheng; Shi, Jun Ling; Mei, Qi Bing; Xu, Feng

    2017-06-01

    In recent years, paper-based point-of-care testing (POCT) has been widely used in medical diagnostics, food safety and environmental monitoring. However, a high-cost, time-consuming and equipment-dependent sample pretreatment technique is generally required for raw sample processing, which is impractical for low-resource and disease-endemic areas. Therefore, there is an escalating demand for a cost-effective, simple and portable pretreatment technique to be coupled with the commonly used paper-based assays (e.g. lateral flow assays) in POCT. In this review, we focus on the importance of using paper as a platform for sample pretreatment. We first discuss the beneficial use of paper for sample pretreatment, including sample collection and storage, separation, extraction, and concentration. We then highlight the working principle and fabrication of each sample pretreatment device, the existing challenges, and future perspectives for developing paper-based sample pretreatment techniques.

  18. Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering

    Science.gov (United States)

    Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki

    2018-03-01

    We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
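    The balance-heuristic combination of a stratified strategy and an importance strategy can be illustrated outside the rendering context. The following is a small, hypothetical Python sketch (a 1-D integrand standing in for the occlusion integral; all names and parameters are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Integrand with two features: a narrow peak (suited to importance
    # sampling) on top of a broad linear ramp (suited to stratified sampling).
    def f(x):
        return np.exp(-((x - 0.5) ** 2) / (2 * 0.01 ** 2)) + x

    # Strategy A pdf: uniform on [0, 1].
    pdf_a = lambda x: np.ones_like(x)
    # Strategy B pdf: normal centred on the peak (tails outside [0, 1] are
    # negligible for sigma = 0.01, so no truncation correction is applied).
    mu, sigma = 0.5, 0.01
    pdf_b = lambda x: np.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

    n = 5000
    xa = (np.arange(n) + rng.uniform(size=n)) / n     # stratified uniform samples
    xb = rng.normal(mu, sigma, n)                     # importance samples

    def balance_weight(x, pdf_self, pdf_other):
        # Balance heuristic with equal sample counts:
        # w_self(x) = p_self(x) / (p_self(x) + p_other(x)).
        return pdf_self(x) / (pdf_self(x) + pdf_other(x))

    est = (np.mean(balance_weight(xa, pdf_a, pdf_b) * f(xa) / pdf_a(xa))
           + np.mean(balance_weight(xb, pdf_b, pdf_a) * f(xb) / pdf_b(xb)))
    print(f"MIS estimate of integral: {est:.4f}")
    ```

    Each sample is weighted by p_self/(p_self + p_other), so whichever strategy better matches a region of the integrand dominates there, and fewer samples are needed overall than with either strategy alone.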

  19. New materials for sample preparation techniques in bioanalysis.

    Science.gov (United States)

    Nazario, Carlos Eduardo Domingues; Fumes, Bruno Henrique; da Silva, Meire Ribeiro; Lanças, Fernando Mauro

    2017-02-01

    The analysis of biological samples is a complex and difficult task owing to two basic and complementary issues: the high complexity of most biological matrices and the need to determine minute quantities of active substances and contaminants in such complex samples. To succeed in this endeavor, samples are usually subjected to three steps of a comprehensive analytical methodological approach: sample preparation, analyte isolation (usually utilizing a chromatographic technique) and qualitative/quantitative analysis (usually with the aid of mass spectrometric tools). Owing to the complex nature of bio-samples and the very low concentration of the target analytes to be determined, selective sample preparation techniques are mandatory in order to overcome the difficulties imposed by these two constraints. During the last decade, new chemical synthesis approaches have been developed and optimized, such as sol-gel and molecular imprinting technologies, allowing the preparation of novel materials for sample preparation including graphene and its derivatives, magnetic materials, ionic liquids, molecularly imprinted polymers, and much more. In this contribution we review these novel techniques and materials, as well as their application to the bioanalysis niche. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.

    Science.gov (United States)

    Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C

    2016-09-01

    Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water, with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic carbon (VOC) analytes. The methods were divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While static sampling techniques exhibited extraction yields (approx. 10-20 %) sufficient to be reliably used down to approx. 100 ng L(-1), enrichment techniques displayed extraction yields of up to 80 %, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27 %. The choice among the different instrumental modes of operation (the aforementioned classes) was the most influential parameter in terms of extraction yields and MDLs. Individual methods within each class showed smaller deviations, and the least influence was observed when evaluating different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties, such as high polarity or the capability of specific molecular interactions. Graphical abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.
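    As a side note on the figures of merit used above: an MDL is commonly derived from replicate low-level measurements via the single-laboratory formula MDL = t(n-1, 0.99) × s. A brief Python sketch with invented replicate values:

    ```python
    import statistics

    # Seven replicate analyses of a low-level spiked standard, in ng/L
    # (values invented for illustration).
    replicates = [102.0, 95.5, 98.7, 101.2, 97.9, 99.4, 103.1]

    mean = statistics.mean(replicates)
    s = statistics.stdev(replicates)     # sample standard deviation

    # One-tailed Student's t at 99 % confidence for n - 1 = 6 degrees of freedom.
    T99_DF6 = 3.143

    mdl = T99_DF6 * s                    # single-laboratory MDL
    rsd = 100.0 * s / mean               # relative standard deviation, %

    print(f"mean = {mean:.1f} ng/L, s = {s:.2f} ng/L")
    print(f"MDL = {mdl:.1f} ng/L, RSD = {rsd:.1f} %")
    ```

    The t value is tabulated per degrees of freedom; with more replicates, s stabilizes and the multiplier shrinks, lowering the reported MDL.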

  1. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as the Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme in the analysis of highly irregular time series. For the cross-correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods in the univariate case, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross-correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and is suitable for large-scale application to paleo-data.
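    The kernel approach favoured above can be sketched compactly. The following hypothetical Python example (synthetic series, not the authors' data or code) generates an irregularly sampled process with a known decorrelation time and estimates its ACF by Gaussian-kernel weighting of observation pairs, with no interpolation step:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic, irregularly sampled series with known persistence: gamma-
    # distributed inter-observation times and an AR(1)-like process with
    # decorrelation time tau = 2 (all values invented for illustration).
    t = np.cumsum(rng.gamma(shape=1.0, scale=1.0, size=400))
    x = np.zeros_like(t)
    for i in range(1, len(t)):
        phi = np.exp(-(t[i] - t[i - 1]) / 2.0)
        x[i] = phi * x[i - 1] + np.sqrt(1.0 - phi ** 2) * rng.standard_normal()

    def kernel_acf(t, x, lag, h=0.5):
        # Gaussian-kernel ACF estimator for irregular sampling: every pair of
        # observations is weighted by how close its time separation is to the
        # requested lag, avoiding any interpolation of the series.
        z = (x - x.mean()) / x.std()
        dt = t[:, None] - t[None, :]
        w = np.exp(-0.5 * ((dt - lag) / h) ** 2)
        np.fill_diagonal(w, 0.0)           # exclude zero-lag (self) pairs
        return np.sum(w * np.outer(z, z)) / np.sum(w)

    for lag in (1.0, 2.0, 4.0):
        print(f"lag {lag}: kernel ACF = {kernel_acf(t, x, lag):.2f}, "
              f"theory exp(-lag/2) = {np.exp(-lag / 2.0):.2f}")
    ```

    Because the kernel works directly on the observed pairs, no artificial smoothness is introduced, which is the source of the interpolation bias discussed above.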

  2. SEM-based characterization techniques

    International Nuclear Information System (INIS)

    Russell, P.E.

    1986-01-01

    The scanning electron microscope is now a common instrument in materials characterization laboratories. The basic role of the SEM as a topographic imaging system has steadily been expanding to include a variety of SEM-based analytical techniques. These techniques cover the range from basic semiconductor materials characterization to live-time device characterization of operating LSI or VLSI devices. This paper introduces many of the more commonly used techniques, describes the modifications or additions to a conventional SEM required to utilize them, and gives examples of their use. First, the types of signals available from a sample being irradiated by an electron beam are reviewed. Then, where applicable, the types of spectroscopy or microscopy that have evolved to utilize the various signal types are described. This is followed by specific examples of the use of such techniques to solve problems related to semiconductor technology. Techniques emphasized include: x-ray fluorescence spectroscopy, electron beam induced current (EBIC), stroboscopic voltage analysis, cathodoluminescence and electron beam IC metrology. Current and future trends of some of these techniques, as related to the semiconductor industry, are discussed.

  3. Array-based techniques for fingerprinting medicinal herbs

    Directory of Open Access Journals (Sweden)

    Xue Charlie

    2011-05-01

    Poor quality control of medicinal herbs has led to instances of toxicity, poisoning and even deaths. The fundamental step in quality control of herbal medicine is accurate identification of herbs. Array-based techniques have recently been adapted to authenticate or identify herbal plants. This article reviews the current array-based techniques, e.g. oligonucleotide microarrays, gene-based probe microarrays, Suppression Subtractive Hybridization (SSH)-based arrays, Diversity Array Technology (DArT) and Subtracted Diversity Array (SDA). We further compare these techniques according to important parameters such as markers, polymorphism rates, restriction enzymes and sample type. The applicability of the array-based methods for fingerprinting depends on the availability of genomic and genetic information for the species to be fingerprinted. For species with little genome sequence information but high polymorphism rates, SDA techniques are particularly recommended because they require less labour and lower material cost.

  4. A comparative study of sampling techniques for monitoring carcass contamination

    NARCIS (Netherlands)

    Snijders, J.M.A.; Janssen, M.H.W.; Gerats, G.E.; Corstiaensen, G.P.

    1984-01-01

    Four bacteriological sampling techniques, i.e. the excision, double-swab, agar contact and modified agar contact techniques, were compared by sampling pig carcasses before and after chilling. As well as assessing the advantages and disadvantages of the techniques, particular attention was paid to

  5. Sample preparation for special PIE-techniques at ITU

    International Nuclear Information System (INIS)

    Toscano, E.H.; Manzel, R.

    2002-01-01

    Several sample preparation techniques were developed and installed in hot cells. The techniques were conceived to evaluate the performance of highly burnt fuel rods and include: (a) a device for the removal of the fuel, (b) a method for the preparation of the specimen ends for the welding of new end caps and for the careful cleaning of samples for Transmission Electron Microscopy and Glow Discharge Mass Spectroscopy, (c) a sample pressurisation device for long term creep tests, and (d) a diameter measuring device for creep or burst samples. Examples of the determination of the mechanical properties, the behaviour under transient conditions and for the assessment of the corrosion behaviour of high burnup cladding materials are presented. (author)

  6. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and the presence of several interferences. Sample preparation is a critical step and the main source of uncertainty in the analysis of environmental samples, and it is usually laborious, costly, time-consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedures, and applications of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  7. Evaluation of primary immunization coverage of infants under universal immunization programme in an urban area of Bangalore city using cluster sampling and lot quality assurance sampling techniques.

    Science.gov (United States)

    K, Punith; K, Lalitha; G, Suman; Bs, Pradeep; Kumar K, Jayanth

    2008-07-01

    Research question: Is the LQAS technique better than the cluster sampling technique in terms of resources to evaluate immunization coverage in an urban area? Objective: To assess and compare lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Study design: Population-based cross-sectional study. Setting: Areas under Mathikere Urban Health Center. Subjects: Children aged 12 months to 23 months. Sample size: 220 in cluster sampling, 76 in lot quality assurance sampling. Statistical analysis: Percentages and proportions, chi-square test. Results: (1) Using cluster sampling, the percentages of completely immunized, partially immunized and unimmunized children were 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, they were 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by the cluster sampling technique were not statistically different from the coverage values obtained by the lot quality assurance sampling technique. Considering the time and resources required, lot quality assurance sampling was found to be the better technique for evaluating primary immunization coverage in an urban area.
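    The LQAS side of the comparison rests on a simple binomial decision rule, which can be sketched as follows. A hypothetical Python example (parameters illustrative, not taken from this study): a lot of n sampled children is classified as adequately covered if at most d are unimmunized, and the two misclassification risks are computed exactly.

    ```python
    from math import comb

    def binom_cdf(k, n, p):
        # P(X <= k) for X ~ Binomial(n, p), computed exactly from the pmf.
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

    def lqas_risks(n, d, p_high=0.80, p_low=0.50):
        # LQAS decision rule: sample n children and classify the lot as having
        # adequate coverage if at most d of them are unimmunized.
        # alpha: risk of rejecting a lot whose true coverage is p_high.
        # beta : risk of accepting a lot whose true coverage is only p_low.
        alpha = 1.0 - binom_cdf(d, n, 1.0 - p_high)
        beta = binom_cdf(d, n, 1.0 - p_low)
        return alpha, beta

    # Illustrative design: n = 19, decision value d = 6, separating ~80 %
    # coverage from ~50 % coverage with both risks under 10 %.
    alpha, beta = lqas_risks(19, 6)
    print(f"n=19, d=6: alpha = {alpha:.3f}, beta = {beta:.3f}")
    ```

    The small per-lot sample size (here 19, versus 220 for the cluster survey) is what drives the time and resource savings the study reports.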

  8. Non-terminal blood sampling techniques in Guinea pigs

    DEFF Research Database (Denmark)

    Birck, Malene Muusfeldt; Tveden-Nyborg, Pernille; Lindblad, Maiken Marie

    2014-01-01

    Guinea pigs possess several biological similarities to humans and are validated experimental animal models(1-3). However, the use of guinea pigs currently represents a relatively narrow area of research and descriptive data on specific methodology is correspondingly scarce. The anatomical features...... of guinea pigs are slightly different from other rodent models, hence modulation of sampling techniques to accommodate for species-specific differences, e.g., compared to mice and rats, are necessary to obtain sufficient and high quality samples. As both long and short term in vivo studies often require...... repeated blood sampling the choice of technique should be well considered in order to reduce stress and discomfort in the animals but also to ensure survival as well as compliance with requirements of sample size and accessibility. Venous blood samples can be obtained at a number of sites in guinea pigs e...

  9. Symbol synchronization and sampling frequency synchronization techniques in real-time DDO-OFDM systems

    Science.gov (United States)

    Chen, Ming; He, Jing; Cao, Zizheng; Tang, Jin; Chen, Lin; Wu, Xian

    2014-09-01

    In this paper, we propose and experimentally demonstrate symbol synchronization and sampling frequency synchronization techniques in a real-time direct-detection optical orthogonal frequency division multiplexing (DDO-OFDM) system, over 100-km standard single-mode fiber (SSMF), using a cost-effective directly modulated distributed feedback (DFB) laser. The experimental results show that the proposed symbol synchronization based on a training sequence (TS) has low complexity and high accuracy even at a sampling frequency offset (SFO) of 5000 ppm. Meanwhile, the proposed pilot-assisted sampling frequency synchronization between the digital-to-analog converter (DAC) and analog-to-digital converter (ADC) is capable of estimating SFOs accurately; the technique can also compensate SFO effects within the small residual SFO caused by deviations in SFO estimation and a low-precision or unstable clock source. The two synchronization techniques are suitable for high-speed DDO-OFDM transmission systems.

  10. Evaluation of primary immunization coverage of infants under universal immunization programme in an urban area of Bangalore city using cluster sampling and lot quality assurance sampling techniques

    Directory of Open Access Journals (Sweden)

    Punith K

    2008-01-01

    Research Question: Is the LQAS technique better than the cluster sampling technique in terms of resources to evaluate the immunization coverage in an urban area? Objective: To assess and compare the lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Study Design: Population-based cross-sectional study. Study Setting: Areas under Mathikere Urban Health Center. Study Subjects: Children aged 12 months to 23 months. Sample Size: 220 in cluster sampling, 76 in lot quality assurance sampling. Statistical Analysis: Percentages and Proportions, Chi square Test. Results: (1) Using cluster sampling, the percentages of completely immunized, partially immunized and unimmunized children were 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, they were 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by the cluster sampling technique were not statistically different from the coverage values obtained by the lot quality assurance sampling technique. Considering the time and resources required, it was found that lot quality assurance sampling is the better technique for evaluating primary immunization coverage in an urban area.

  11. The novel programmable riometer for in-depth ionospheric and magnetospheric observations (PRIAMOS) using direct sampling DSP techniques

    OpenAIRE

    Dekoulis, G.; Honary, F.

    2005-01-01

    This paper describes the feasibility study and simulation results for the unique multi-frequency, multi-bandwidth, Programmable Riometer for in-depth Ionospheric And Magnetospheric ObservationS (PRIAMOS) based on direct sampling digital signal processing (DSP) techniques. This novel architecture is based on sampling the cosmic noise wavefront at the antenna. It eliminates the usage of any intermediate frequency (IF) mixer stages (-6 dB) and the noise balancing technique (-3 dB), providing a m...

  12. Application of digital sampling techniques to particle identification

    International Nuclear Information System (INIS)

    Bardelli, L.; Poggi, G.; Bini, M.; Carraresi, L.; Pasquali, G.; Taccetti, N.

    2003-01-01

    An application of digital sampling techniques is presented which can greatly simplify experiments involving sub-nanosecond time-mark determinations and energy measurements with nuclear detectors, used for Pulse Shape Analysis and Time of Flight measurements in heavy ion experiments. In this work a 100 M Sample/s, 12 bit analog to digital converter has been used: examples of this technique applied to Silicon and CsI(Tl) detectors in heavy-ions experiments involving particle identification via Pulse Shape analysis and Time of Flight measurements are presented. The system is suited for applications to large detector arrays and to different kinds of detectors. Some preliminary results regarding the simulation of current signals in Silicon detectors are also discussed. (authors)
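
Sub-nanosecond time marks from a 100 MSample/s stream rely on interpolating between samples rather than on the 10 ns sample clock itself. A minimal illustration of one common approach, linear interpolation of a threshold crossing (the function and waveform here are illustrative, not the authors' algorithm):

```python
def crossing_time(samples, threshold, dt):
    """Return the interpolated time at which the sampled waveform first
    rises through `threshold`; `dt` is the sampling period in seconds."""
    for i in range(1, len(samples)):
        if samples[i - 1] < threshold <= samples[i]:
            # fraction of the sampling interval at which the crossing occurs
            frac = (threshold - samples[i - 1]) / (samples[i] - samples[i - 1])
            return (i - 1 + frac) * dt
    return None

# 100 MSample/s -> dt = 10 ns; the crossing lands between two samples,
# yet the returned time mark has sub-sample resolution.
t = crossing_time([0.0, 1.0, 3.0, 7.0, 10.0], 5.0, dt=10e-9)
# t = 25 ns: halfway between the samples at 20 ns (value 3) and 30 ns (value 7)
```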

  13. Using Load Balancing to Scalably Parallelize Sampling-Based Motion Planning Algorithms

    KAUST Repository

    Fidel, Adam; Jacobs, Sam Ade; Sharma, Shishir; Amato, Nancy M.; Rauchwerger, Lawrence

    2014-01-01

    Motion planning, which is the problem of computing feasible paths in an environment for a movable object, has applications in many domains ranging from robotics, to intelligent CAD, to protein folding. The best methods for solving this PSPACE-hard problem are so-called sampling-based planners. Recent work introduced uniform spatial subdivision techniques for parallelizing sampling-based motion planning algorithms that scaled well. However, such methods are prone to load imbalance, as planning time depends on region characteristics and, for most problems, the heterogeneity of the subproblems increases as the number of processors increases. In this work, we introduce two techniques to address load imbalance in the parallelization of sampling-based motion planning algorithms: an adaptive work stealing approach and bulk-synchronous redistribution. We show that applying these techniques to representatives of the two major classes of parallel sampling-based motion planning algorithms, probabilistic roadmaps and rapidly-exploring random trees, results in a more scalable and load-balanced computation on more than 3,000 cores. © 2014 IEEE.

  14. Using Load Balancing to Scalably Parallelize Sampling-Based Motion Planning Algorithms

    KAUST Repository

    Fidel, Adam

    2014-05-01

    Motion planning, which is the problem of computing feasible paths in an environment for a movable object, has applications in many domains ranging from robotics, to intelligent CAD, to protein folding. The best methods for solving this PSPACE-hard problem are so-called sampling-based planners. Recent work introduced uniform spatial subdivision techniques for parallelizing sampling-based motion planning algorithms that scaled well. However, such methods are prone to load imbalance, as planning time depends on region characteristics and, for most problems, the heterogeneity of the subproblems increases as the number of processors increases. In this work, we introduce two techniques to address load imbalance in the parallelization of sampling-based motion planning algorithms: an adaptive work stealing approach and bulk-synchronous redistribution. We show that applying these techniques to representatives of the two major classes of parallel sampling-based motion planning algorithms, probabilistic roadmaps and rapidly-exploring random trees, results in a more scalable and load-balanced computation on more than 3,000 cores. © 2014 IEEE.
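
The adaptive work-stealing idea can be illustrated with a toy scheduler: each processor drains its own queue of unit-cost sampling tasks, and an idle processor steals half of the busiest queue. This is a simplified sketch of the general technique, not the implementation from the paper:

```python
from collections import deque

def makespan(loads, steal=True):
    """Time steps needed to finish all unit tasks; each worker completes one
    task per step, and (optionally) an idle worker steals half of the
    largest remaining queue before working."""
    queues = [deque([1] * n) for n in loads]
    t = 0
    while any(queues):
        t += 1
        for i, q in enumerate(queues):
            if not q and steal:
                victim = max(range(len(queues)), key=lambda j: len(queues[j]))
                for _ in range(len(queues[victim]) // 2):
                    q.append(queues[victim].pop())
            if q:
                q.popleft()
    return t

# One region is far more expensive than the others (load imbalance):
unbalanced = [16, 0, 0, 0]
# Without stealing the makespan equals the heaviest load; with stealing the
# work spreads over all four workers and finishes much earlier.
```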

  15. Consensus of heterogeneous multi-agent systems based on sampled data with a small sampling delay

    International Nuclear Information System (INIS)

    Wang Na; Wu Zhi-Hai; Peng Li

    2014-01-01

    In this paper, consensus problems of heterogeneous multi-agent systems based on sampled data with a small sampling delay are considered. First, a consensus protocol based on sampled data with a small sampling delay for heterogeneous multi-agent systems is proposed. Then, algebraic graph theory, the matrix method, the stability theory of linear systems, and some other techniques are employed to derive the necessary and sufficient conditions guaranteeing that heterogeneous multi-agent systems asymptotically achieve the stationary consensus. Finally, simulations are performed to demonstrate the correctness of the theoretical results. (interdisciplinary physics and related areas of science and technology)
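
The flavor of a sampled-data consensus protocol can be seen in a simplified homogeneous first-order case (the paper treats the harder heterogeneous, delayed setting): each agent updates only at sampling instants, using its neighbors' sampled states, and all states converge to a common value. The graph and gain below are illustrative.

```python
def consensus_step(x, adj, h):
    """One sampled-data update: x_i <- x_i + h * sum_j a_ij (x_j - x_i)."""
    n = len(x)
    return [x[i] + h * sum(adj[i][j] * (x[j] - x[i]) for j in range(n))
            for i in range(n)]

# Path graph on four agents; sampling-period gain h = 0.2 is small enough
# for stability given the maximum degree of 2.
adj = [[0, 1, 0, 0],
       [1, 0, 1, 0],
       [0, 1, 0, 1],
       [0, 0, 1, 0]]
x = [0.0, 1.0, 2.0, 9.0]
for _ in range(300):
    x = consensus_step(x, adj, 0.2)
# All agents end up near the initial average, 3.0 (the symmetric graph
# preserves the state sum at every sampling instant).
```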

  16. Adaptive Rate Sampling and Filtering Based on Level Crossing Sampling

    Directory of Open Access Journals (Sweden)

    Saeed Mian Qaisar

    2009-01-01

    The recent sophistication of mobile systems and sensor networks demands more and more processing resources. In order to maintain system autonomy, energy saving has become one of the most difficult industrial challenges in mobile computing. Most efforts to achieve this goal focus on improving embedded system design and battery technology, but very few studies exploit the time-varying nature of the input signal. This paper aims to achieve power efficiency by intelligently adapting the processing activity to the local characteristics of the input signal. This is done by completely rethinking the processing chain, adopting a nonconventional sampling scheme and adaptive rate filtering. The proposed approach, based on the LCSS (Level Crossing Sampling Scheme), presents two filtering techniques able to adapt their sampling rate and filter order by analyzing the input signal variations online. The principle is to intelligently exploit the signal's local characteristics, which are usually never considered, in order to filter only the relevant signal parts, employing filters of the relevant order. This idea leads to a drastic gain in computational efficiency, and hence in processing power, when compared to classical techniques.
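
The LCSS principle records a sample only when the signal crosses one of a set of predefined amplitude levels, so an idle signal produces no samples at all. A minimal sketch of such a sampler (the names and the interpolation choice are ours):

```python
def level_crossing_sample(signal, times, levels):
    """Return (time, level) pairs where the piecewise-linear signal crosses
    any of the given amplitude levels; crossing instants are interpolated."""
    out = []
    for k in range(1, len(signal)):
        lo, hi = sorted((signal[k - 1], signal[k]))
        for lev in levels:
            if lo < lev <= hi:
                frac = (lev - signal[k - 1]) / (signal[k] - signal[k - 1])
                out.append((times[k - 1] + frac * (times[k] - times[k - 1]), lev))
    return out

# An active segment generates samples; the constant middle segment generates
# none, which is where the processing-power saving comes from.
sig = [0.0, 2.0, 2.0, 2.0, 0.0]
ts = [0, 1, 2, 3, 4]
samples = level_crossing_sample(sig, ts, levels=[1.0])
# -> crossings at t = 0.5 (rising) and t = 3.5 (falling)
```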

  17. Sampling techniques for thrips (Thysanoptera: Thripidae) in preflowering tomato.

    Science.gov (United States)

    Joost, P Houston; Riley, David G

    2004-08-01

    Sampling techniques for thrips (Thysanoptera: Thripidae) were compared in preflowering tomato plants at the Coastal Plain Experiment Station in Tifton, GA, in 2000 and 2003, to determine the most effective method of determining abundance of thrips on tomato foliage early in the growing season. Three relative sampling techniques, including a standard insect aspirator, a 946-ml beat cup, and an insect vacuum device, were compared for accuracy to an absolute method and to themselves for precision and efficiency of sampling thrips. Thrips counts of all relative sampling methods were highly correlated (R > 0.92) to the absolute method. The aspirator method was the most accurate compared with the absolute sample according to regression analysis in 2000. In 2003, all sampling methods were considered accurate according to Dunnett's test, but thrips numbers were lower and sample variation was greater than in 2000. In 2000, the beat cup method had the lowest relative variation (RV) or best precision, at 1 and 8 d after transplant (DAT). Only the beat cup method had RV values <25 for all sampling dates. In 2003, the beat cup method had the lowest RV value at 15 and 21 DAT. The beat cup method also was the most efficient method for all sample dates in both years. Frankliniella fusca (Pergande) was the most abundant thrips species on the foliage of preflowering tomato in both years of study at this location. Overall, the best thrips sampling technique tested was the beat cup method in terms of precision and sampling efficiency.
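
Relative variation (RV), the precision measure used above, is simply the standard error expressed as a percentage of the mean; values below 25 are commonly taken as adequate for sampling plans. A sketch with illustrative counts (not the study's data):

```python
from statistics import mean, stdev

def relative_variation(counts):
    """RV = 100 * (standard error) / mean, the precision criterion used
    when comparing sampling methods (lower is more precise)."""
    m = mean(counts)
    se = stdev(counts) / len(counts) ** 0.5
    return 100.0 * se / m

beat_cup = [5, 6, 5, 6, 5, 6]        # tightly clustered counts -> low RV
aspirator = [1, 9, 2, 8, 3, 10]      # variable counts -> high RV
rv_beat = relative_variation(beat_cup)
rv_asp = relative_variation(aspirator)
# The method with RV < 25 would be preferred on precision grounds.
```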

  18. A cost-saving statistically based screening technique for focused sampling of a lead-contaminated site

    International Nuclear Information System (INIS)

    Moscati, A.F. Jr.; Hediger, E.M.; Rupp, M.J.

    1986-01-01

    High concentrations of lead in soils along an abandoned railroad line prompted a remedial investigation to characterize the extent of contamination across a 7-acre site. Contamination was thought to be spotty across the site reflecting its past use in battery recycling operations at discrete locations. A screening technique was employed to delineate the more highly contaminated areas by testing a statistically determined minimum number of random samples from each of seven discrete site areas. The approach not only quickly identified those site areas which would require more extensive grid sampling, but also provided a statistically defensible basis for excluding other site areas from further consideration, thus saving the cost of additional sample collection and analysis. The reduction in the number of samples collected in ''clean'' areas of the site ranged from 45 to 60%
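
The statistically determined minimum number of random samples per area can be sketched with a standard hot-spot argument: if a fraction p of an area is contaminated, the chance that n random samples all miss it is (1-p)^n, so n is chosen to push that miss probability below a chosen significance level. A hedged illustration (this is the generic formula, not necessarily the study's exact design):

```python
import math

def min_samples(p_contaminated, miss_prob=0.05):
    """Smallest n with (1 - p)^n <= miss_prob: enough random samples to hit
    a contaminated fraction p at least once with (1 - miss_prob) confidence."""
    return math.ceil(math.log(miss_prob) / math.log(1.0 - p_contaminated))

# If at least 10% of a discrete site area is contaminated, 29 random samples
# give a 95% chance of detecting contamination at least once.
n = min_samples(0.10)
```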

  19. Use of nuclear technique in samples for agricultural purposes

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Kerley A. P. de; Sperling, Eduardo Von, E-mail: kerley@ufmg.br, E-mail: kerleyfisica@yahoo.com.br [Department of Sanitary and Environmental Engineering Federal University of Minas Gerais, Belo Horizonte (Brazil); Menezes, Maria Angela B. C.; Jacomino, Vanusa M.F. [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2013-01-15

    Concern for the environment is growing, and it requires the determination of chemical elements over a large range of concentrations. The neutron activation analysis (NAA) technique determines the elemental composition of a sample by measuring the artificial radioactivity induced when the sample is submitted to a neutron flux. NAA is a sensitive and accurate technique with low detection limits. An example of the application of NAA was the measurement of concentrations of rare earth elements (REE) in waste samples of phosphogypsum (PG) and cerrado soil samples (clayey and sandy soils). Additionally, a soil reference material of the International Atomic Energy Agency (IAEA) was also analyzed. The REE concentration in PG samples (a total of 4,000 mg kg^-1) was two times higher than that found in national fertilizers, 154 times greater than the value found in the sandy soil (26 mg kg^-1) and 14 times greater than that in the clayey soil (280 mg kg^-1). The experimental results for the reference material were within the uncertainty of the certified values, confirming the accuracy of the method (95%). The determination of La, Ce, Pr, Nd, Pm, Sm, Eu, Tb, Dy, Ho, Er, Tm, Yb and Lu in the samples and reference material confirmed the versatility of the technique for REE determination in soil and phosphogypsum samples, which are matrices of agricultural interest. (author)

  20. Use of nuclear technique in samples for agricultural purposes

    International Nuclear Information System (INIS)

    Oliveira, Kerley A. P. de; Sperling, Eduardo Von; Menezes, Maria Angela B. C.; Jacomino, Vanusa M.F.

    2013-01-01

    Concern for the environment is growing, and it requires the determination of chemical elements over a large range of concentrations. The neutron activation analysis (NAA) technique determines the elemental composition of a sample by measuring the artificial radioactivity induced when the sample is submitted to a neutron flux. NAA is a sensitive and accurate technique with low detection limits. An example of the application of NAA was the measurement of concentrations of rare earth elements (REE) in waste samples of phosphogypsum (PG) and cerrado soil samples (clayey and sandy soils). Additionally, a soil reference material of the International Atomic Energy Agency (IAEA) was also analyzed. The REE concentration in PG samples (a total of 4,000 mg kg^-1) was two times higher than that found in national fertilizers, 154 times greater than the value found in the sandy soil (26 mg kg^-1) and 14 times greater than that in the clayey soil (280 mg kg^-1). The experimental results for the reference material were within the uncertainty of the certified values, confirming the accuracy of the method (95%). The determination of La, Ce, Pr, Nd, Pm, Sm, Eu, Tb, Dy, Ho, Er, Tm, Yb and Lu in the samples and reference material confirmed the versatility of the technique for REE determination in soil and phosphogypsum samples, which are matrices of agricultural interest. (author)
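
The enrichment factors quoted above follow directly from the reported concentrations, as a quick arithmetic check:

```python
# Concentrations reported in the abstract (mg/kg).
pg_total = 4000      # total REE in phosphogypsum
sandy = 26           # sandy cerrado soil
clayey = 280         # clayey cerrado soil

ratio_sandy = pg_total / sandy    # ~154, as quoted in the abstract
ratio_clayey = pg_total / clayey  # ~14, as quoted in the abstract
```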

  1. Development of analytical techniques for safeguards environmental samples at JAEA

    International Nuclear Information System (INIS)

    Sakurai, Satoshi; Magara, Masaaki; Usuda, Shigekazu; Watanabe, Kazuo; Esaka, Fumitaka; Hirayama, Fumio; Lee, Chi-Gyu; Yasuda, Kenichiro; Inagawa, Jun; Suzuki, Daisuke; Iguchi, Kazunari; Kokubu, Yoko S.; Miyamoto, Yutaka; Ohzu, Akira

    2007-01-01

    JAEA has been developing, under the auspices of the Ministry of Education, Culture, Sports, Science and Technology of Japan, analytical techniques for ultra-trace amounts of nuclear materials in environmental samples, in order to contribute to the strengthened safeguards system. Essential techniques for bulk and particle analysis, as well as screening, of environmental swipe samples have been established as ultra-trace analytical methods for uranium and plutonium. In January 2003, JAEA was qualified, including its quality control system, as a member of the IAEA network analytical laboratories for environmental samples. Since 2004, JAEA has conducted the analysis of domestic and IAEA samples, through which JAEA's analytical capability has been verified and improved. In parallel, advanced techniques have been developed in order to expand the applicability to samples of various elemental compositions and impurities and to improve analytical accuracy and efficiency. This paper summarizes the course of technical development in environmental sample analysis at JAEA and refers to recent trends of research and development in this field. (author)

  2. Nonactivation interaction techniques in the analysis of environmental samples

    International Nuclear Information System (INIS)

    Tolgyessy, J.

    1986-01-01

    Nonactivation interaction analytical methods are based on the interaction of nuclear and X-ray radiation with a sample, leading to absorption and backscattering of the radiation, to ionization of gases, or to excitation of fluorescent X-rays, but not to activation of the determined elements. From the point of view of environmental analysis, the most useful nonactivation interaction techniques are X-ray fluorescence with photon or charged particle excitation, ionization of gases by nuclear radiation, elastic scattering of charged particles, and backscattering of beta radiation. A significant advantage of these methods is that they are nondestructive. (author)

  3. Application of digital sampling techniques to particle identification in scintillation detectors

    International Nuclear Information System (INIS)

    Bardelli, L.; Bini, M.; Poggi, G.; Taccetti, N.

    2002-01-01

    In this paper, the use of a fast digitizing system for identification of fast charged particles with scintillation detectors is discussed. The three-layer phoswich detectors developed in the framework of the FIASCO experiment for the detection of light charged particles (LCP) and intermediate mass fragments (IMF) emitted in heavy-ion collisions at Fermi energies are briefly discussed. The standard analog electronics treatment of the signals for particle identification is illustrated. After a description of the digitizer designed to perform a fast digital sampling of the phoswich signals, the feasibility of particle identification on the sampled data is demonstrated. The results obtained with two different pulse shape discrimination analyses based on the digitally sampled data are compared with the standard analog signal treatment. The obtained results suggest, for the present application, the replacement of the analog methods with the digital sampling technique

  4. ANALYSIS OF MONTE CARLO SIMULATION SAMPLING TECHNIQUES ON SMALL SIGNAL STABILITY OF WIND GENERATOR- CONNECTED POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    TEMITOPE RAPHAEL AYODELE

    2016-04-01

    Monte Carlo simulation using the Simple Random Sampling (SRS) technique is popularly known for its ability to handle complex uncertainty problems. However, to produce a reasonable result it requires a huge sample size, which makes it computationally expensive, time consuming and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness and speed for a small signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (Single Machine Infinite Bus and the IEEE 16-machine 68-bus test system). The accuracy of the two sampling techniques is determined by comparing their different sample sizes with the IDEAL (conventional) result. The robustness is determined based on a significant variance reduction when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. The results show that sample sizes generated from LHS for the small signal stability application produce the same result as the IDEAL values starting from a sample size of 100. This shows that about 100 samples of a random variable generated using the LHS method are good enough to produce reasonable results for practical purposes in small signal stability applications. It is also revealed that LHS has the least variance when the experiment is repeated 100 times, which signifies the robustness of LHS over SRS. A sample size of 100 for LHS produces the same result as the conventional method with a sample size of 50,000. The reduced sample size required by LHS gives it a computational speed advantage (about six times) over the conventional method.
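
The variance advantage of LHS over SRS is easy to reproduce on a toy estimator: stratifying [0, 1) into n intervals and drawing one point per interval keeps each replicate's sample spread evenly, so repeated estimates fluctuate far less. A minimal one-dimensional sketch (the test function is ours, not the eigenvalue analysis from the paper):

```python
import random
from statistics import pvariance

def srs(n, rng):
    """Simple random sampling on [0, 1)."""
    return [rng.random() for _ in range(n)]

def lhs(n, rng):
    """Latin Hypercube Sampling on [0, 1): one point per stratum [i/n, (i+1)/n)."""
    pts = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(pts)   # random pairing across dimensions in the general case
    return pts

def estimator_variance(sampler, n=50, reps=200, seed=0):
    """Variance, over `reps` replicates, of the Monte Carlo estimate of
    E[x^2] = 1/3 under the given sampling scheme."""
    rng = random.Random(seed)
    estimates = [sum(x * x for x in sampler(n, rng)) / n for _ in range(reps)]
    return pvariance(estimates)

v_srs = estimator_variance(srs)
v_lhs = estimator_variance(lhs)
# v_lhs is far below v_srs at the same sample size, mirroring the paper's
# variance-reduction finding for LHS.
```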

  5. The electron transport problem sampling by Monte Carlo individual collision technique

    International Nuclear Information System (INIS)

    Androsenko, P.A.; Belousov, V.I.

    2005-01-01

    The problem of electron transport is of great interest in all fields of modern science, and Monte Carlo sampling has to be used to solve it. Electron transport is characterized by a large number of individual interactions. To simulate electron transport, the 'condensed history' technique may be used, where a large number of collisions are grouped into a single step to be sampled randomly. Another kind of Monte Carlo sampling is the individual collision technique, which gives the researcher incontestable advantages over the condensed history technique. For example, one does not need to specify parameters required by the condensed history technique such as the upper limit for electron energy, resolution, number of sub-steps, etc. The condensed history technique may also lose some very important electron tracks, because it is limited by the step parameters of particle movement and by weaknesses of its algorithms, for example the energy indexing algorithm. The individual collision technique has none of these disadvantages. This report presents some sampling algorithms of the new version of the BRAND code, in which the above-mentioned technique is used. All information on electrons was taken from ENDF-6 files, which are an important part of BRAND; these files have not been processed but were taken directly from the electron information source. Four kinds of interaction were considered: elastic interaction, Bremsstrahlung, atomic excitation and atomic electro-ionization. Some sampling results are presented in comparison with analogs. For example, the endovascular radiotherapy problem (P2) of QUADOS2002 is presented in comparison with other techniques that are usually used. (authors)
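
In the individual-collision approach every interaction is sampled explicitly: a free path drawn from the exponential distribution of the total cross section, then an interaction type drawn in proportion to the partial cross sections. A generic sketch with made-up cross-section values (not BRAND's actual data or units):

```python
import math
import random

def sample_collision(partial_sigmas, rng):
    """Sample (free_path, interaction_kind) for one individual collision.
    `partial_sigmas` maps interaction kind -> macroscopic cross section."""
    total = sum(partial_sigmas.values())
    free_path = -math.log(1.0 - rng.random()) / total   # exponential, mean 1/total
    u = rng.random() * total
    acc = 0.0
    for kind, sigma in partial_sigmas.items():
        acc += sigma
        if u <= acc:
            return free_path, kind
    return free_path, kind  # guard against floating-point round-off

# The four interaction kinds considered in the abstract, with toy weights:
sigmas = {"elastic": 4.0, "bremsstrahlung": 1.0, "excitation": 2.0, "ionization": 3.0}
rng = random.Random(12345)
draws = [sample_collision(sigmas, rng) for _ in range(20000)]
# Free paths average 1/total = 0.1 and kinds appear in ratio 4:1:2:3.
```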

  6. Solid Phase Microextraction and Related Techniques for Drugs in Biological Samples

    OpenAIRE

    Moein, Mohammad Mahdi; Said, Rana; Bassyouni, Fatma; Abdel-Rehim, Mohamed

    2014-01-01

    In drug discovery and development, the quantification of drugs in biological samples is an important task for the determination of the physiological performance of the investigated drugs. After sampling, the next step in the analytical process is sample preparation. Because of the low concentration levels of drug in plasma and the variety of the metabolites, the selected extraction technique should be virtually exhaustive. Recent developments of sample handling techniques are directed, from o...

  7. Monoclonal antibody-based dipstick assay: a reliable field applicable technique for diagnosis of Schistosoma mansoni infection using human serum and urine samples.

    Science.gov (United States)

    Demerdash, Zeinab; Mohamed, Salwa; Hendawy, Mohamed; Rabia, Ibrahim; Attia, Mohy; Shaker, Zeinab; Diab, Tarek M

    2013-02-01

    A field applicable diagnostic technique, the dipstick assay, was evaluated for its sensitivity and specificity in diagnosing human Schistosoma mansoni infection. A monoclonal antibody (mAb) against S. mansoni adult worm tegumental antigen (AWTA) was employed in dipstick and sandwich ELISA for detection of circulating schistosome antigen (CSA) in both serum and urine samples. Based on clinical and parasitological examinations, 60 S. mansoni-infected patients, 30 patients infected with parasites other than schistosomiasis, and 30 uninfected healthy individuals were selected. The sensitivity and specificity of the dipstick assay in urine samples were 86.7% and 90.0%, respectively, compared to 90.0% sensitivity and 91.7% specificity of sandwich ELISA. In serum samples, the sensitivity and specificity were 88.3% and 91.7% for the dipstick assay vs. 91.7% and 95.0% for sandwich ELISA, respectively. The diagnostic efficacy of the dipstick assay in urine and serum samples was 88.3% and 90.0%, while it was 90.8% and 93.3% for sandwich ELISA, respectively. The diagnostic indices of the dipstick assay and ELISA, either in serum or in urine, were statistically comparable (P>0.05). In conclusion, the dipstick assay offers a simple, rapid, non-invasive alternative for detecting CSA, or a complement to stool examination, especially in field studies.
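
The reported indices are standard functions of the underlying 2x2 counts, which can be reconstructed from the abstract (60 infected patients, 60 controls; the urine dipstick figures are shown):

```python
def diagnostic_indices(tp, fn, tn, fp):
    """Sensitivity, specificity and overall diagnostic efficacy (accuracy)
    from true/false positive and negative counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    efficacy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, efficacy

# Urine dipstick: 86.7% of 60 infected tested positive (52 true positives),
# 90.0% of 60 controls tested negative (54 true negatives).
sens, spec, eff = diagnostic_indices(tp=52, fn=8, tn=54, fp=6)
# -> 86.7% sensitivity, 90.0% specificity, 88.3% efficacy, matching the abstract
```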

  8. Manipulation of biological samples using micro and nano techniques.

    Science.gov (United States)

    Castillo, Jaime; Dimaki, Maria; Svendsen, Winnie Edith

    2009-01-01

    The constant interest in handling, integrating and understanding biological systems of interest for the biomedical field, the pharmaceutical industry and biomaterial researchers demands the use of techniques that allow the manipulation of biological samples while causing minimal or no damage to their natural structure. Thanks to the advances in micro- and nanofabrication during the last decades, several manipulation techniques offer the possibility to image, characterize and manipulate biological material in a controlled way. Using these techniques, the integration of biomaterials with remarkable properties with physical transducers has become possible, giving rise to new and highly sensitive biosensing devices. This article reviews the different techniques available to manipulate and integrate biological materials in a controlled manner, either by sliding them along a surface (2-D manipulation), by grabbing them and moving them to a new position (3-D manipulation), or by manipulating and relocating them through the application of external forces. The advantages and drawbacks are mentioned together with examples that reflect the state of the art of manipulation techniques for biological samples (171 references).

  9. Using machine learning to accelerate sampling-based inversion

    Science.gov (United States)

    Valentine, A. P.; Sambridge, M.

    2017-12-01

    In most cases, a complete solution to a geophysical inverse problem (including robust understanding of the uncertainties associated with the result) requires a sampling-based approach. However, the computational burden is high, and proves intractable for many problems of interest. There is therefore considerable value in developing techniques that can accelerate sampling procedures. The main computational cost lies in evaluation of the forward operator (e.g. calculation of synthetic seismograms) for each candidate model. Modern machine learning techniques, such as Gaussian Processes, offer a route for constructing a computationally cheap approximation to this calculation, which can replace the accurate solution during sampling. Importantly, the accuracy of the approximation can be refined as inversion proceeds, to ensure high-quality results. In this presentation, we describe and demonstrate this approach, which can be seen as an extension of popular current methods, such as the Neighbourhood Algorithm, and bridges the gap between prior- and posterior-sampling frameworks.
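
The idea of replacing the expensive forward operator with a cheap, refinable approximation inside the sampler can be sketched in one dimension. Here the "expensive" forward model is a stand-in function, the surrogate is plain linear interpolation rather than a Gaussian Process, and the refinement schedule is arbitrary; everything below is illustrative:

```python
import bisect
import math
import random

def forward(m):
    """Stand-in for an expensive forward solve (e.g. a synthetic seismogram)."""
    return m + 0.3 * math.sin(m)   # smooth and monotonic

class Surrogate:
    """Cheap replacement for the forward operator: linear interpolation
    through every exact evaluation seen so far (a GP would also carry
    an uncertainty estimate)."""
    def __init__(self):
        self.xs, self.ys = [], []
    def add(self, x, y):
        i = bisect.bisect_left(self.xs, x)
        if i < len(self.xs) and self.xs[i] == x:
            self.ys[i] = y
        else:
            self.xs.insert(i, x)
            self.ys.insert(i, y)
    def predict(self, x):
        i = bisect.bisect_left(self.xs, x)
        if i == 0:
            return self.ys[0]
        if i == len(self.xs):
            return self.ys[-1]
        x0, x1 = self.xs[i - 1], self.xs[i]
        w = (x - x0) / (x1 - x0)
        return (1.0 - w) * self.ys[i - 1] + w * self.ys[i]

def sample_posterior(d_obs, sigma=0.1, n=3000, step=0.4, refine_every=50, seed=7):
    """Metropolis sampling in which the surrogate replaces the forward solve;
    a periodic exact evaluation refines the surrogate as inversion proceeds."""
    rng = random.Random(seed)
    surr = Surrogate()
    for x in (-3.0, -1.5, 0.0, 1.5, 3.0):     # coarse initial training set
        surr.add(x, forward(x))
    def log_post(m):
        if not -3.0 <= m <= 3.0:              # uniform prior on [-3, 3]
            return float("-inf")
        r = surr.predict(m) - d_obs
        return -r * r / (2.0 * sigma * sigma)
    m = 0.0
    lp = log_post(m)
    chain = []
    for k in range(n):
        prop = m + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        if lp_prop >= lp or rng.random() < math.exp(lp_prop - lp):
            m, lp = prop, lp_prop
        if k % refine_every == 0:
            surr.add(m, forward(m))           # the only expensive calls
            lp = log_post(m)
        chain.append(m)
    return chain

chain = sample_posterior(d_obs=forward(1.0))
# The chain concentrates around the true model m = 1.0 while calling the
# expensive forward operator only ~65 times instead of 3000.
```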

  10. The electron transport problem sampling by Monte Carlo individual collision technique

    Energy Technology Data Exchange (ETDEWEB)

    Androsenko, P.A.; Belousov, V.I. [Obninsk State Technical Univ. of Nuclear Power Engineering, Kaluga region (Russian Federation)

    2005-07-01

    The problem of electron transport is of great interest in all fields of modern science, and Monte Carlo sampling has to be used to solve it. Electron transport is characterized by a large number of individual interactions. To simulate electron transport, the 'condensed history' technique may be used, where a large number of collisions are grouped into a single step to be sampled randomly. Another kind of Monte Carlo sampling is the individual collision technique, which gives the researcher incontestable advantages over the condensed history technique. For example, one does not need to specify parameters required by the condensed history technique such as the upper limit for electron energy, resolution, number of sub-steps, etc. The condensed history technique may also lose some very important electron tracks, because it is limited by the step parameters of particle movement and by weaknesses of its algorithms, for example the energy indexing algorithm. The individual collision technique has none of these disadvantages. This report presents some sampling algorithms of the new version of the BRAND code, in which the above-mentioned technique is used. All information on electrons was taken from ENDF-6 files, which are an important part of BRAND; these files have not been processed but were taken directly from the electron information source. Four kinds of interaction were considered: elastic interaction, Bremsstrahlung, atomic excitation and atomic electro-ionization. Some sampling results are presented in comparison with analogs. For example, the endovascular radiotherapy problem (P2) of QUADOS2002 is presented in comparison with other techniques that are usually used. (authors)

  11. Efficient sampling algorithms for Monte Carlo based treatment planning

    International Nuclear Information System (INIS)

    DeMarco, J.J.; Solberg, T.D.; Chetty, I.; Smathers, J.B.

    1998-01-01

    Efficient sampling algorithms are necessary for producing a fast Monte Carlo based treatment planning code. This study evaluates several aspects of a photon-based tracking scheme and the effect of optimal sampling algorithms on the efficiency of the code. Four areas were tested: pseudo-random number generation, generalized sampling of a discrete distribution, sampling from the exponential distribution, and delta scattering as applied to photon transport through a heterogeneous simulation geometry. Generalized sampling of a discrete distribution using the cutpoint method can produce speedup gains of one order of magnitude versus conventional sequential sampling. Photon transport modifications based upon the delta scattering method were implemented and compared with a conventional boundary and collision checking algorithm. The delta scattering algorithm is faster by a factor of six versus the conventional algorithm for a boundary size of 5 mm within a heterogeneous geometry. A comparison of portable pseudo-random number algorithms and exponential sampling techniques is also discussed
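
The cutpoint method mentioned above replaces a sequential scan of the cumulative distribution with a precomputed table of starting indices, so each draw costs roughly constant time. A hedged sketch of the classic construction (details of the paper's implementation may differ):

```python
class CutpointSampler:
    """Sample an index from a discrete distribution via the cutpoint method:
    a table of m starting positions into the CDF makes the per-draw linear
    search nearly constant-time, unlike a full sequential scan."""
    def __init__(self, probs, m=None):
        self.cdf, acc = [], 0.0
        for p in probs:
            acc += p
            self.cdf.append(acc)
        self.cdf[-1] = 1.0            # guard against round-off
        self.m = m or len(probs)
        self.cut, k = [], 0
        for j in range(self.m):       # smallest k with cdf[k] > j/m
            while self.cdf[k] <= j / self.m:
                k += 1
            self.cut.append(k)
    def sample(self, u):
        """Map a uniform u in [0, 1) to an outcome index."""
        k = self.cut[int(u * self.m)]
        while self.cdf[k] < u:
            k += 1
        return k

sampler = CutpointSampler([0.1, 0.2, 0.3, 0.4])
# The search for each draw starts at the cutpoint for u's subinterval
# instead of at index 0, e.g. u = 0.61 starts at (and returns) index 3.
```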

  12. Analytical research using synchrotron radiation based techniques

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2015-01-01

    There are many Synchrotron Radiation (SR) based techniques, such as X-ray Absorption Spectroscopy (XAS), X-ray Fluorescence Analysis (XRF), SR Fourier-transform Infrared (SR-FTIR) spectroscopy and Hard X-ray Photoelectron Spectroscopy (HAXPS), which are increasingly being employed worldwide in analytical research. With the advent of modern synchrotron sources these analytical techniques have been further revitalized, paving the way for new techniques such as microprobe XRF and XAS, and FTIR microscopy. The talk covers mainly two techniques, XRF and XAS, illustrating their capability in analytical research. XRF spectroscopy: XRF spectroscopy is an analytical technique which involves the detection of emitted characteristic X-rays following excitation of the elements within the sample. While electron, particle (proton or alpha particle), or X-ray beams can be employed as the exciting source for this analysis, the use of X-ray beams from a synchrotron source has been instrumental in the advancement of the technique in the areas of microprobe XRF imaging and trace-level compositional characterization of samples. Synchrotron radiation induced X-ray emission spectroscopy has become competitive with the earlier microprobe and nanoprobe techniques following the advancements in manipulating and detecting these X-rays. There are two important features that contribute to the superb elemental sensitivities of microprobe SR induced XRF: (i) the absence of the continuum (Bremsstrahlung) background radiation that is a feature of spectra obtained from charged particle beams, and (ii) the increased X-ray flux on the sample associated with the use of tunable third generation synchrotron facilities. Detection sensitivities have been reported in the ppb range, with values of 10^-17 g - 10^-14 g (depending on the particular element and matrix). Keeping its demand in mind, a microprobe XRF beamline has been set up by RRCAT at the Indus-2 synchrotron

  13. The Recent Developments in Sample Preparation for Mass Spectrometry-Based Metabolomics.

    Science.gov (United States)

    Gong, Zhi-Gang; Hu, Jing; Wu, Xi; Xu, Yong-Jiang

    2017-07-04

    Metabolomics is a critical member in systems biology. Although great progress has been achieved in metabolomics, there are still some problems in sample preparation, data processing and data interpretation. In this review, we intend to explore the roles, challenges and trends in sample preparation for mass spectrometry- (MS-) based metabolomics. The newly emerged sample preparation methods were also critically examined, including laser microdissection, in vivo sampling, dried blood spot, microwave, ultrasound and enzyme-assisted extraction, as well as microextraction techniques. Finally, we provide some conclusions and perspectives for sample preparation in MS-based metabolomics.

  14. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis

    Science.gov (United States)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-01

In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, comprising a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step, using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits, and trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were proved to be minor, and recoveries in the ranges of 95.0-101% and 97.0-104%, respectively, were achieved. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.

  15. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis.

    Science.gov (United States)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-05

In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, comprising a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step, using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits, and trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were proved to be minor. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Current STR-based techniques in forensic science

    Directory of Open Access Journals (Sweden)

    Phuvadol Thanakiatkrai

    2013-01-01

DNA analysis in forensic science is mainly based on short tandem repeat (STR) genotyping. The conventional analysis is a three-step process of DNA extraction, amplification and detection. An overview of the techniques currently in use and under active research for STR typing is presented, divided into STR amplification and detection. New techniques for forensic STR analysis focus on increasing sensitivity, resolution and discrimination power for suboptimal samples. These gains are achieved by shifting primer-binding sites, using high-fidelity and tolerant polymerases, and applying novel methods to STR detection. Examples in which STRs are used in criminal investigations are provided, and future research directions are discussed.

  17. Calibrating passive sampling and passive dosing techniques to lipid based concentrations

    DEFF Research Database (Denmark)

    Mayer, Philipp; Schmidt, Stine Nørgaard; Annika, A.

    2011-01-01

    Equilibrium sampling into various formats of the silicone polydimethylsiloxane (PDMS) is increasingly used to measure the exposure of hydrophobic organic chemicals in environmental matrices, and passive dosing from silicone is increasingly used to control and maintain their exposure in laboratory...... coated vials and with Head Space Solid Phase Microextraction (HS-SPME) yielded lipid based concentrations that were in good agreement with each other, but about a factor of two higher than measured lipid-normalized concentrations in the organisms. Passive dosing was applied to bioconcentration...

  18. Elemental analyses of groundwater: demonstrated advantage of low-flow sampling and trace-metal clean techniques over standard techniques

    Science.gov (United States)

    Creasey, C. L.; Flegal, A. R.

The combined use of both (1) low-flow purging and sampling and (2) trace-metal clean techniques provides more representative measurements of trace-element concentrations in groundwater than results derived with standard techniques. Low-flow purging and sampling provides relatively undisturbed groundwater samples that are more representative of in situ conditions, and trace-metal clean techniques limit the inadvertent introduction of contaminants during sampling, storage, and analysis. When these techniques are applied, the resulting trace-element concentrations are likely to be markedly lower than results based on standard sampling techniques. In a comparison of data derived from contaminated and control groundwater wells at a site in California, USA, trace-element concentrations from this study were 2-1000 times lower than those determined with the conventional techniques used to sample the same wells 5 months before and 1 month after the collections for this study. Specifically, the cadmium and chromium concentrations derived using standard sampling techniques exceeded the California Maximum Contaminant Levels (MCL), whereas in this investigation concentrations of both of those elements were substantially below their MCLs. Consequently, the combined use of low-flow and trace-metal clean techniques may preclude erroneous reports of trace-element contamination in groundwater.

  19. High-throughput droplet analysis and multiplex DNA detection in the microfluidic platform equipped with a robust sample-introduction technique

    International Nuclear Information System (INIS)

    Chen, Jinyang; Ji, Xinghu; He, Zhike

    2015-01-01

In this work, a simple, flexible and low-cost sample-introduction technique was developed and integrated with a droplet platform. The sample-introduction strategy connects a positive-pressure input device, a sample container and a microfluidic chip through Tygon tubing with a homemade polydimethylsiloxane (PDMS) adaptor, so the sample is delivered into the microchip from the sample container under positive pressure. The technique is robust and compatible enough to be integrated with T-junction, flow-focusing or valve-assisted droplet microchips. By choosing a PDMS adaptor of appropriate dimensions, the microchip can be flexibly equipped with various types of familiar sample containers, making sampling more straightforward by avoiding tedious sample transfer or loading. Convenient sample changing is achieved simply by moving the adaptor from one sample container to another. Benefiting from the proposed technique, a time-dependent concentration gradient was generated and applied to quantum dot (QD)-based fluorescence barcoding within the droplet chip. High-throughput droplet screening was preliminarily demonstrated through an investigation of the quenching efficiency of a ruthenium complex on QD fluorescence. More importantly, a multiplex DNA assay was successfully carried out in the integrated system, showing its practicability and potential in high-throughput biosensing. - Highlights: • A simple, robust and low-cost sample-introduction technique was developed. • Convenient and flexible sample changing was achieved in a microfluidic system. • A novel strategy of concentration-gradient generation was presented for barcoding. • High-throughput droplet screening could be realized in the integrated platform. • Multiplex DNA assay was successfully carried out in the droplet platform

  20. High-throughput droplet analysis and multiplex DNA detection in the microfluidic platform equipped with a robust sample-introduction technique

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Jinyang; Ji, Xinghu [Key Laboratory of Analytical Chemistry for Biology and Medicine (Ministry of Education), College of Chemistry and Molecular Sciences, Wuhan University, Wuhan 430072 (China); He, Zhike, E-mail: zhkhe@whu.edu.cn [Key Laboratory of Analytical Chemistry for Biology and Medicine (Ministry of Education), College of Chemistry and Molecular Sciences, Wuhan University, Wuhan 430072 (China); Suzhou Institute of Wuhan University, Suzhou 215123 (China)

    2015-08-12

In this work, a simple, flexible and low-cost sample-introduction technique was developed and integrated with a droplet platform. The sample-introduction strategy connects a positive-pressure input device, a sample container and a microfluidic chip through Tygon tubing with a homemade polydimethylsiloxane (PDMS) adaptor, so the sample is delivered into the microchip from the sample container under positive pressure. The technique is robust and compatible enough to be integrated with T-junction, flow-focusing or valve-assisted droplet microchips. By choosing a PDMS adaptor of appropriate dimensions, the microchip can be flexibly equipped with various types of familiar sample containers, making sampling more straightforward by avoiding tedious sample transfer or loading. Convenient sample changing is achieved simply by moving the adaptor from one sample container to another. Benefiting from the proposed technique, a time-dependent concentration gradient was generated and applied to quantum dot (QD)-based fluorescence barcoding within the droplet chip. High-throughput droplet screening was preliminarily demonstrated through an investigation of the quenching efficiency of a ruthenium complex on QD fluorescence. More importantly, a multiplex DNA assay was successfully carried out in the integrated system, showing its practicability and potential in high-throughput biosensing. - Highlights: • A simple, robust and low-cost sample-introduction technique was developed. • Convenient and flexible sample changing was achieved in a microfluidic system. • A novel strategy of concentration-gradient generation was presented for barcoding. • High-throughput droplet screening could be realized in the integrated platform. • Multiplex DNA assay was successfully carried out in the droplet platform.

  1. Application of the Sampling Selection Technique in Approaching Financial Audit

    Directory of Open Access Journals (Sweden)

    Victor Munteanu

    2018-03-01

In his professional approach, the financial auditor has a wide range of working techniques, including selection techniques. They are applied depending on the nature of the information available to the financial auditor, the manner in which it is presented - paper or electronic format - and, not least, the time available. Several techniques are applied, successively or in parallel, to increase the reliability of the expressed opinion and to give the audit report a solid basis of information. Sampling is used in the phase of control or clarification of identified errors, with the main purpose of corroborating or measuring the degree of risk detected following a pertinent analysis. Since the auditor has neither the time nor the means to reconstruct the information exhaustively, the sampling technique can provide an effective response to the need to make use of it.

  2. Sample Based Unit Liter Dose Estimates

    International Nuclear Information System (INIS)

    JENSEN, L.

    1999-01-01

The Tank Waste Characterization Program has taken many core samples, grab samples, and auger samples from the single-shell and double-shell tanks during the past 10 years. Consequently, the amount of sample data available has increased, both in terms of the quantity of sample results and the number of tanks characterized. More and better data are available than when the current radiological and toxicological source terms used in the Basis for Interim Operation (BIO) (FDH 1999) and the Final Safety Analysis Report (FSAR) (FDH 1999) were developed. The Nuclear Safety and Licensing (NS and L) organization wants to use the new data to upgrade the radiological and toxicological source terms used in the BIO and FSAR, and requested assistance in developing a statistically based process for deriving the source terms. This report describes the statistical techniques used and the assumptions made to support the development of a new radiological source term for liquid and solid wastes stored in single-shell and double-shell tanks

  3. Sample Based Unit Liter Dose Estimates

    International Nuclear Information System (INIS)

    JENSEN, L.

    2000-01-01

The Tank Waste Characterization Program has taken many core samples, grab samples, and auger samples from the single-shell and double-shell tanks during the past 10 years. Consequently, the amount of sample data available has increased, both in terms of the quantity of sample results and the number of tanks characterized. More and better data are available than when the current radiological and toxicological source terms used in the Basis for Interim Operation (BIO) (FDH 1999a) and the Final Safety Analysis Report (FSAR) (FDH 1999b) were developed. The Nuclear Safety and Licensing (NS and L) organization wants to use the new data to upgrade the radiological and toxicological source terms used in the BIO and FSAR, and requested assistance in producing a statistically based process for developing the source terms. This report describes the statistical techniques used and the assumptions made to support the development of a new radiological source term for liquid and solid wastes stored in single-shell and double-shell tanks. The results given in this report are a revision of similar results given in an earlier version of the document (Jensen and Wilmarth 1999). The main difference between the results in this document and the earlier version is that the dose conversion factors (DCFs) for converting μCi/g or μCi/L to Sv/L (sieverts per liter) have changed. There are now two DCFs, one based on ICRP-68 and one based on ICRP-71 (Brevick 2000)
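The unit liter dose calculation described above reduces to multiplying each nuclide's activity concentration (μCi/L) by a dose conversion factor (Sv/μCi) and summing. A minimal sketch of that arithmetic follows; the nuclides, concentrations, and DCF values are placeholders for illustration, not the report's actual ICRP-based factors:

```python
def unit_liter_dose(conc_uci_per_l, dcf_sv_per_uci):
    """Sv per liter = sum over nuclides of activity concentration (uCi/L)
    times dose conversion factor (Sv/uCi). All inputs are placeholders."""
    return sum(conc_uci_per_l[n] * dcf_sv_per_uci[n] for n in conc_uci_per_l)

conc = {"Cs-137": 120.0, "Sr-90": 45.0}     # uCi/L, hypothetical waste sample
dcf = {"Cs-137": 4.6e-2, "Sr-90": 1.1e-1}   # Sv/uCi, placeholder values
uld = unit_liter_dose(conc, dcf)            # 120*0.046 + 45*0.11 = 10.47 Sv/L
```

A statistically based source term would apply this conversion to an upper percentile of the sampled concentration distribution rather than a single measurement.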

  4. Assessment of the impact strength of the denture base resin polymerized by various processing techniques

    Directory of Open Access Journals (Sweden)

    Rajashree Jadhav

    2013-01-01

Aim: To measure the impact strength of denture base resins polymerized using short and long curing cycles by water bath, pressure cooker and microwave techniques. Materials and Methods: Sixty samples were made for impact strength testing. The sample dimensions were 60 mm × 12 mm × 3 mm, as standardized by the American Society for Testing and Materials (ASTM). A digital caliper was used to locate the midpoint of each sample. Impact strength was measured on a CEAST impact tester in the Izod configuration: the pendulum struck the sample, and the energy required to break it was measured in joules. Data were analyzed using Student's t-test. Results: There was a statistically significant difference between the impact strength of denture base resins polymerized by the long curing cycle and the short curing cycle in each technique, with the long curing cycle performing best. Conclusion: The polymerization technique plays an important role in the impact strength of the denture base resin. This research demonstrates that denture base resin polymerized by the microwave processing technique possessed the highest impact strength.
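The Student's t comparison the study relies on can be reproduced with a short pooled two-sample t computation. The impact-energy values below are hypothetical stand-ins chosen for illustration, not the study's measurements:

```python
def student_t(a, b):
    """Pooled two-sample Student's t statistic and degrees of freedom."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (ma - mb) / (sp2 * (1.0 / na + 1.0 / nb)) ** 0.5
    return t, na + nb - 2

# Hypothetical Izod impact energies in joules (NOT the study's data)
long_cycle = [0.42, 0.45, 0.44, 0.47, 0.43]
short_cycle = [0.36, 0.39, 0.37, 0.40, 0.38]
t_stat, dof = student_t(long_cycle, short_cycle)
```

With df = 8, the two-tailed 5% critical value is 2.306, so a computed t above that would indicate a significant difference between curing cycles.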

  5. Nuclear analytical techniques and their application to environmental samples

    International Nuclear Information System (INIS)

    Lieser, K.H.

    1986-01-01

    A survey is given on nuclear analytical techniques and their application to environmental samples. Measurement of the inherent radioactivity of elements or radionuclides allows determination of natural radioelements (e.g. Ra), man-made radioelements (e.g. Pu) and radionuclides in the environment. Activation analysis, in particular instrumental neutron activation analysis, is a very reliable and sensitive method for determination of a great number of trace elements in environmental samples, because the most abundant main constituents are not activated. Tracer techniques are very useful for studies of the behaviour and of chemical reactions of trace elements and compounds in the environment. Radioactive sources are mainly applied for excitation of characteristic X-rays (X-ray fluorescence analysis). (author)

  6. Refinement of NMR structures using implicit solvent and advanced sampling techniques.

    Science.gov (United States)

    Chen, Jianhan; Im, Wonpil; Brooks, Charles L

    2004-12-15

NMR biomolecular structure calculations exploit simulated annealing methods for conformational sampling and require a relatively high level of redundancy in the experimental restraints to determine quality three-dimensional structures. Recent advances in generalized Born (GB) implicit solvent models should make it possible to combine information from both experimental measurements and accurate empirical force fields to improve the quality of NMR-derived structures. In this paper, we study the influence of implicit solvent on the refinement of protein NMR structures and identify an optimal protocol for utilizing these improved force fields. To do so, we carry out structure refinement experiments on model proteins with published NMR structures, using the full NMR restraints and subsets of them. We also investigate the application of advanced sampling techniques to NMR structure refinement. Similar to the observations of Xia et al. (J. Biomol. NMR 2002, 22, 317-331), we find that the impact of implicit solvent is rather small when there is a sufficient number of experimental restraints (such as in the final stage of NMR structure determination), whether implicit solvent is used throughout the calculation or only in the final refinement step. The application of advanced sampling techniques also seems to have minimal impact in this case. However, when the experimental data are limited, we demonstrate that refinement with implicit solvent can substantially improve the quality of the structures. In particular, when combined with an advanced sampling technique, the replica exchange (REX) method, near-native structures can be rapidly moved toward the native basin. The REX method provides both enhanced sampling and automatic selection of the most native-like (lowest energy) structures. An optimal protocol based on our studies first generates an ensemble of initial structures that maximally satisfy the available experimental data with conventional NMR software using a simplified
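The replica exchange (REX) method mentioned above alternates independent Metropolis sampling at several temperatures with attempted configuration swaps accepted by a Metropolis criterion. A minimal sketch on a toy 1-D harmonic energy, not the authors' refinement protocol (all parameters here are illustrative):

```python
import math
import random

def rex_swap_accept(e_i, e_j, beta_i, beta_j, u=None):
    """REX Metropolis criterion: swap replicas i and j with probability
    min(1, exp[(beta_i - beta_j) * (E_i - E_j)])."""
    delta = (beta_i - beta_j) * (e_i - e_j)
    if delta >= 0:
        return True
    if u is None:
        u = random.random()
    return u < math.exp(delta)

def metropolis_step(x, beta, step, rng):
    """One Metropolis move on the toy energy E(x) = x**2."""
    y = x + rng.uniform(-step, step)
    if rng.random() < math.exp(min(0.0, -beta * (y * y - x * x))):
        return y
    return x

rng = random.Random(1)
betas = [2.0, 0.5]          # cold and hot inverse temperatures
xs = [3.0, 3.0]             # start both replicas far from the minimum
for sweep in range(2000):
    xs = [metropolis_step(x, b, 0.5, rng) for x, b in zip(xs, betas)]
    # periodic swap attempt between the cold and hot replicas
    if sweep % 10 == 0 and rex_swap_accept(xs[0] ** 2, xs[1] ** 2, *betas,
                                           u=rng.random()):
        xs[0], xs[1] = xs[1], xs[0]
```

The hot replica crosses barriers easily; accepted swaps hand its low-energy finds to the cold replica, which is the mechanism that moves near-native structures toward the native basin.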

  7. Toward greener analytical techniques for the absolute quantification of peptides in pharmaceutical and biological samples.

    Science.gov (United States)

    Van Eeckhaut, Ann; Mangelings, Debby

    2015-09-10

Peptide-based biopharmaceuticals represent one of the fastest growing classes of new drug molecules. New reaction types included in synthesis strategies to reduce the rapid metabolism of peptides, along with the availability of new formulation and delivery technologies, have resulted in increased marketing of peptide drug products. In this regard, the development of analytical methods for the quantification of peptides in pharmaceutical and biological samples is of utmost importance. From the sample preparation step to analysis by chromatographic or electrophoretic methods, many difficulties must be tackled. Recent developments in analytical techniques place ever more emphasis on green analytical techniques. This review discusses the progress made and the challenges observed in green analytical method development for the quantification of peptides in pharmaceutical and biological samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Micro and Nano Techniques for the Handling of Biological Samples

    DEFF Research Database (Denmark)

    Micro and Nano Techniques for the Handling of Biological Samples reviews the different techniques available to manipulate and integrate biological materials in a controlled manner, either by sliding them along a surface (2-D manipulation), or by gripping and moving them to a new position (3-D...

  9. Methodological integrative review of the work sampling technique used in nursing workload research.

    Science.gov (United States)

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, the work sampling methods used are diverse, making comparisons of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002 and 2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002 and 2012 reporting on research which used work sampling to examine nursing workload were included. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. The authors' suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.
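Although the review notes that work sampling lacks a standardized approach, the number of observations in such studies is commonly planned with the standard attribute-sampling formula n = z^2 * p * (1 - p) / e^2. A small sketch with illustrative numbers, not figures drawn from the reviewed studies:

```python
import math

def work_sampling_observations(p, e, z=1.96):
    """Observations needed so that an activity occurring a proportion p of
    the time is estimated within absolute error e at the confidence level
    implied by z (z = 1.96 for ~95%)."""
    return math.ceil(z * z * p * (1.0 - p) / (e * e))

# e.g. an activity expected ~30% of the time, estimated to within
# +/- 3 percentage points at 95% confidence
n_obs = work_sampling_observations(0.30, 0.03)   # 897 observations
```

Standardizing and reporting p, e, and z would make observation counts directly comparable across work sampling studies.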

  10. Survey of sampling-based methods for uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; Sallaberry, C.J.; Storlie, C.B.

    2006-01-01

Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (ii) generation of samples from uncertain analysis inputs, (iii) propagation of sampled inputs through an analysis, (iv) presentation of uncertainty analysis results, and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top-down coefficient of concordance, and variance decomposition
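The sample generation, propagation, and rank-transformation steps surveyed above can be illustrated end to end: draw a Latin-hypercube-style sample, push it through a toy model, and compute Spearman rank correlations as a simple sensitivity measure. A self-contained sketch; the model and sample sizes are arbitrary illustrations:

```python
import random

def latin_hypercube(n_samples, n_inputs, rng):
    """One stratified draw per stratum on [0, 1), independently permuted
    for each input (a basic Latin hypercube sample)."""
    cols = []
    for _ in range(n_inputs):
        strata = [(k + rng.random()) / n_samples for k in range(n_samples)]
        rng.shuffle(strata)
        cols.append(strata)
    return [list(row) for row in zip(*cols)]

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank + 1)
    return r

def spearman(xs, ys):
    """Spearman rank correlation = Pearson correlation of the ranks
    (no tie handling; fine for continuous samples)."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

rng = random.Random(7)
X = latin_hypercube(100, 2, rng)                 # step: sample generation
y = [3.0 * x1 - 0.5 * x2 for x1, x2 in X]        # step: propagation (toy model)
rho1 = spearman([x[0] for x in X], y)            # step: sensitivity results
rho2 = spearman([x[1] for x in X], y)
```

Here the rank correlations correctly flag the first input as dominant; a scatterplot of each input against y is the graphical counterpart named in the survey.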

  11. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J. PhD. (.; .); Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.

  12. A preclustering-based ensemble learning technique for acute appendicitis diagnoses.

    Science.gov (United States)

    Lee, Yen-Hsien; Hu, Paul Jen-Hwa; Cheng, Tsang-Hsiang; Huang, Te-Chia; Chuang, Wei-Yao

    2013-06-01

Acute appendicitis is a common medical condition whose effective, timely diagnosis can be difficult. A missed diagnosis not only puts the patient in danger but also requires additional resources for corrective treatments. An acute appendicitis diagnosis constitutes a classification problem, for which a further fundamental challenge is the skewed outcome-class distribution of instances in the training sample. A preclustering-based ensemble learning (PEL) technique aims to address the associated imbalanced-sample learning problems and thereby support the timely, accurate diagnosis of acute appendicitis. The proposed PEL technique employs undersampling to reduce the number of majority-class instances in a training sample, uses preclustering to group similar majority-class instances into multiple groups, and selects representative instances from each group to create more balanced samples. The PEL technique thereby reduces the potential information loss of random undersampling, and takes advantage of ensemble learning to improve performance. We empirically evaluate the proposed technique with 574 clinical cases obtained from a comprehensive tertiary hospital in southern Taiwan, using several prevalent techniques and a salient scoring system as benchmarks. The comparative results show that PEL is more effective and less biased than any of the benchmarks. The proposed PEL technique is more sensitive in identifying positive acute appendicitis than the commonly used Alvarado scoring system and exhibits higher specificity in identifying negative acute appendicitis. In addition, the sensitivity and specificity values of PEL are higher than those of the investigated benchmarks that follow the resampling approach. Our analysis suggests PEL benefits from the more representative majority-class instances in the training sample. According to our overall evaluation results, PEL records the best overall performance, and its area under the curve measure reaches 0.619.
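The undersample, precluster, and select-representatives pipeline described above can be sketched as follows. The k-means preclustering, synthetic data, and parameter choices are illustrative assumptions, not the authors' implementation:

```python
import random

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def centroid(group):
    n, d = len(group), len(group[0])
    return tuple(sum(p[i] for p in group) / n for i in range(d))

def kmeans(points, k, rng, iters=25):
    """Plain Lloyd's k-means; returns the non-empty clusters."""
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda j: dist2(p, centers[j]))].append(p)
        centers = [centroid(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return [c for c in clusters if c]

def precluster_undersample(majority, minority, k, rng):
    """Cluster the majority class, then keep the instances closest to each
    cluster centroid, in proportion to cluster size, until the kept set
    matches the minority-class count."""
    clusters = kmeans(majority, k, rng)
    target = len(minority)
    picked = []
    for c in clusters:
        quota = max(1, round(target * len(c) / len(majority)))
        ctr = centroid(c)
        picked.extend(sorted(c, key=lambda p: dist2(p, ctr))[:quota])
    if len(picked) < target:                      # top up after rounding
        leftovers = [p for p in majority if p not in picked]
        picked.extend(leftovers[: target - len(picked)])
    return picked[:target]

rng = random.Random(0)
majority = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(200)]  # negatives
minority = [(rng.gauss(3, 1), rng.gauss(3, 1)) for _ in range(20)]   # positives
balanced = minority + precluster_undersample(majority, minority, k=5, rng=rng)
```

Each balanced sample built this way would train one member of an ensemble; repeating with different seeds and voting across members supplies the ensemble-learning half of PEL.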

  13. Improved importance sampling technique for efficient simulation of digital communication systems

    Science.gov (United States)

    Lu, Dingqing; Yao, Kung

    1988-01-01

A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of importance sampling (IS) are introduced, and detailed derivations of the simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these derivations are applied to the specific, previously known conventional importance sampling (CIS) technique and to the new IIS technique. The derivation for a memoryless linear system with no signal is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results show the advantages of CIS over MC and of IIS over CIS for simulations of digital communication systems.
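The contrast among MC, CIS (input scaling), and IIS (input translation) can be illustrated on the standard toy problem of estimating a small Gaussian tail probability, which plays the role of a bit-error probability. A sketch under these simplifying assumptions (memoryless system, Gaussian statistics), not the paper's derivations:

```python
import math
import random

def tail_prob_mc(t, n, rng):
    """Plain Monte Carlo estimate of p = P(X > t), X ~ N(0, 1)."""
    return sum(rng.gauss(0.0, 1.0) > t for _ in range(n)) / n

def tail_prob_cis(t, n, rng, scale):
    """CIS-style estimator: sample a *scaled* density N(0, scale^2) and
    reweight by phi(x)/q(x) = scale * exp(x^2/(2*scale^2) - x^2/2)."""
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, scale)
        if x > t:
            total += scale * math.exp(x * x / (2 * scale * scale) - x * x / 2)
    return total / n

def tail_prob_iis(t, n, rng, shift):
    """IIS-style estimator: sample a *translated* density N(shift, 1) and
    reweight by phi(x)/phi(x - shift) = exp(shift^2/2 - shift*x)."""
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)
        if x > t:
            total += math.exp(shift * shift / 2 - shift * x)
    return total / n

rng = random.Random(42)
t = 4.0
exact = 0.5 * math.erfc(t / math.sqrt(2.0))    # ~3.17e-5
mc = tail_prob_mc(t, 20000, rng)               # usually 0 or a coarse 1/n step
cis = tail_prob_cis(t, 20000, rng, scale=2.0)  # reduced variance
iis = tail_prob_iis(t, 20000, rng, shift=t)    # lowest variance of the three
```

At p ~ 3e-5 plain MC with 20,000 samples typically sees zero or one "error", the scaled (CIS-style) estimator resolves p to roughly ten percent, and the translated (IIS-style) estimator with the shift placed at the threshold resolves it to a few percent, mirroring the IIS-over-CIS-over-MC ordering reported above.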

  14. Cone penetrometer tests and HydroPunch sampling: A screening technique for plume definition

    International Nuclear Information System (INIS)

    Smolley, M.; Kappmeyer, J.C.

    1991-01-01

Cone penetrometer tests and HydroPunch sampling were used to define the extent of volatile organic compounds in ground water. The investigation indicated that the combination of these techniques is effective for obtaining ground water samples for preliminary plume definition. HydroPunch samples can be collected in unconsolidated sediments, and the analytical results obtained from these samples are comparable to those obtained from adjacent monitoring wells. This sampling method is a rapid and cost-effective screening technique for characterizing the extent of contaminant plumes in soft sediment environments. Use of this screening technique allowed monitoring wells to be located at the plume boundary, thereby reducing the number of wells installed and the overall cost of the plume definition program

  15. Chance constrained problems: penalty reformulation and performance of sample approximation technique

    Czech Academy of Sciences Publication Activity Database

    Branda, Martin

    2012-01-01

    Roč. 48, č. 1 (2012), s. 105-122 ISSN 0023-5954 R&D Projects: GA ČR(CZ) GBP402/12/G097 Institutional research plan: CEZ:AV0Z10750506 Keywords : chance constrained problems * penalty functions * asymptotic equivalence * sample approximation technique * investment problem Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.619, year: 2012 http://library.utia.cas.cz/separaty/2012/E/branda-chance constrained problems penalty reformulation and performance of sample approximation technique.pdf

  16. Sampling methods and non-destructive examination techniques for large radioactive waste packages

    International Nuclear Information System (INIS)

    Green, T.H.; Smith, D.L.; Burgoyne, K.E.; Maxwell, D.J.; Norris, G.H.; Billington, D.M.; Pipe, R.G.; Smith, J.E.; Inman, C.M.

    1992-01-01

    Progress is reported on work undertaken to evaluate quality checking methods for radioactive wastes. A sampling rig was designed, fabricated and used to develop techniques for the destructive sampling of cemented simulant waste using remotely operated equipment. An engineered system for the containment of cooling water was designed and manufactured and successfully demonstrated with the drum and coring equipment mounted in both vertical and horizontal orientations. The preferred in-cell orientation was found to be with the drum and coring machinery mounted in a horizontal position. Small powdered samples can be taken from cemented homogeneous waste cores using a hollow drill/vacuum section technique with the preferred subsampling technique being to discard the outer 10 mm layer to obtain a representative sample of the cement core. Cement blends can be dissolved using fusion techniques and the resulting solutions are stable to gelling for periods in excess of one year. Although hydrochloric acid and nitric acid are promising solvents for dissolution of cement blends, the resultant solutions tend to form silicic acid gels. An estimate of the beta-emitter content of cemented waste packages can be obtained by a combination of non-destructive and destructive techniques. The errors will probably be in excess of +/-60 % at the 95 % confidence level. Real-time X-ray video-imaging techniques have been used to analyse drums of uncompressed, hand-compressed, in-drum compacted and high-force compacted (i.e. supercompacted) simulant waste. The results have confirmed the applicability of this technique for NDT of low-level waste. 8 refs., 12 figs., 3 tabs

  17. Analysis of soil samples from Gebeng area using NAA technique

    Science.gov (United States)

    Elias, Md Suhaimi; Wo, Yii Mei; Hamzah, Mohd Suhaimi; Shukor, Shakirah Abd; Rahman, Shamsiah Ab; Salim, Nazaratul Ashifa Abdullah; Azman, Muhamad Azfar; Hashim, Azian

    2017-01-01

Rapid development and urbanization will increase the number of residential and industrial areas. Without proper management and control of pollution, these will have an adverse effect on the environment and human life. The objective of this study was to identify and quantify key contaminants entering the environment of the Gebeng area as a result of industrial and human activities. The Gebeng area was gazetted as one of the industrial estates in Pahang state. Elemental pollution in the soil of the Gebeng area was assessed based on concentration levels, enrichment factors and the geo-accumulation index. The enrichment factors (EFs) were determined by the elemental ratioing method, whilst the geo-accumulation index (Igeo) was obtained by comparing current element concentrations to continental crustal averages. Twenty-seven soil samples were collected from the Gebeng area and analysed using the Neutron Activation Analysis (NAA) technique. The data showed a higher concentration of iron (Fe), due to its abundance in soil, compared with other elements. The enrichment factor results showed that the Gebeng area is enriched with As, Br, Hf, Sb, Th and U. Based on the geo-accumulation index (Igeo) classification, the soil quality of the Gebeng area can be classified as Class 0 (uncontaminated) to Class 3 (moderately to heavily contaminated).
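The two indices named in the abstract follow standard formulas: EF normalizes an element's concentration against a reference element (Fe here) in both sample and crust, and Igeo compares the sample against 1.5 times the crustal average. A minimal sketch in Python, using illustrative concentration values rather than the study's data:

```python
import math

# Hypothetical concentrations (mg/kg); "crust" holds continental crustal
# averages -- illustrative numbers, not the study's data.
crust = {"As": 1.8, "Fe": 50000.0, "Th": 10.5}
sample = {"As": 12.0, "Fe": 47000.0, "Th": 28.0}

def enrichment_factor(elem, ref="Fe"):
    """EF = (C_elem / C_ref)_sample / (C_elem / C_ref)_crust,
    with Fe as the normalizing reference element."""
    return (sample[elem] / sample[ref]) / (crust[elem] / crust[ref])

def igeo(elem):
    """Geo-accumulation index: Igeo = log2(C_sample / (1.5 * C_crust));
    the factor 1.5 absorbs natural background fluctuations."""
    return math.log2(sample[elem] / (1.5 * crust[elem]))

assert enrichment_factor("As") > 2   # As enriched relative to crust
assert 0 < igeo("Th") < 1            # uncontaminated to moderately contaminated
```

An Igeo of 0 or below corresponds to Class 0 (uncontaminated); each unit step of Igeo moves the sample up one contamination class.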

  18. A comparative study of sample dissolution techniques and plasma-based instruments for the precise and accurate quantification of REEs in mineral matrices

    International Nuclear Information System (INIS)

    Whitty-Léveillé, Laurence; Turgeon, Keven; Bazin, Claude; Larivière, Dominic

    2017-01-01

The recent commercialisation of inductively coupled plasma tandem mass spectrometric (ICP-MS/MS) instruments has provided analytical chemists with a new tool to properly quantify atomic composition in a variety of matrices with minimal sample preparation. In this article, we report on our assessment of the compatibility of three sample preparation techniques (open-vessel acid digestion, microwave digestion and alkaline fusion) for the quantification of rare earth elements (REEs) in mineral matrices. The combination of a high digestion temperature (1050 °C) and LiBO_2 as a flux was the most effective strategy for the digestion of all rare earth elements in mineral matrices and was compatible with ICP-MS/MS measurements. We also assessed the analytical performance of ICP-MS/MS against other plasma-based instrumentation (microwave induced plasma and inductively coupled plasma atomic emission spectroscopy, MIP-AES and ICP-AES respectively, and single quadrupole inductively coupled plasma mass spectrometry, ICP-MS). The comparative study showed that the concentrations obtained by ICP-MS/MS are in excellent agreement with the certified reference material values, and that the technique is much better suited for the quantification of REEs than the other analytical techniques tested, as these exhibited low detectability and/or spectral interferences for some elements/isotopes. Finally, the ruggedness of the proposed analytical protocol, which combines a rapid sample dissolution step performed by an automated fusion unit with an ICP-MS/MS as a detector, was established using various certified mineral matrices containing variable levels of REEs. - Highlights: • Three types of digestion methods were tested. • Four types of analytical techniques were compared. • Elimination of the spectral interferences encountered in ICP-MS was achieved by the use of Tandem ICP-MS. • Robustness of the analytical procedure was successfully evaluated on four types of certified reference material.

  19. A comparative study of sample dissolution techniques and plasma-based instruments for the precise and accurate quantification of REEs in mineral matrices

    Energy Technology Data Exchange (ETDEWEB)

    Whitty-Léveillé, Laurence; Turgeon, Keven [Département de génie des mines, de la métallurgie et des matériaux, Université Laval, Québec, QC (Canada); Département de chimie, Université Laval, Québec, QC (Canada); Bazin, Claude [Département de génie des mines, de la métallurgie et des matériaux, Université Laval, Québec, QC (Canada); Larivière, Dominic, E-mail: dominic.lariviere@chm.ulaval.ca [Département de chimie, Université Laval, Québec, QC (Canada)

    2017-04-08

The recent commercialisation of inductively coupled plasma tandem mass spectrometric (ICP-MS/MS) instruments has provided analytical chemists with a new tool to properly quantify atomic composition in a variety of matrices with minimal sample preparation. In this article, we report on our assessment of the compatibility of three sample preparation techniques (open-vessel acid digestion, microwave digestion and alkaline fusion) for the quantification of rare earth elements (REEs) in mineral matrices. The combination of a high digestion temperature (1050 °C) and LiBO{sub 2} as a flux was the most effective strategy for the digestion of all rare earth elements in mineral matrices and was compatible with ICP-MS/MS measurements. We also assessed the analytical performance of ICP-MS/MS against other plasma-based instrumentation (microwave induced plasma and inductively coupled plasma atomic emission spectroscopy, MIP-AES and ICP-AES respectively, and single quadrupole inductively coupled plasma mass spectrometry, ICP-MS). The comparative study showed that the concentrations obtained by ICP-MS/MS are in excellent agreement with the certified reference material values, and that the technique is much better suited for the quantification of REEs than the other analytical techniques tested, as these exhibited low detectability and/or spectral interferences for some elements/isotopes. Finally, the ruggedness of the proposed analytical protocol, which combines a rapid sample dissolution step performed by an automated fusion unit with an ICP-MS/MS as a detector, was established using various certified mineral matrices containing variable levels of REEs. - Highlights: • Three types of digestion methods were tested. • Four types of analytical techniques were compared. • Elimination of the spectral interferences encountered in ICP-MS was achieved by the use of Tandem ICP-MS. • Robustness of the analytical procedure was successfully evaluated on four types of certified reference material.

  20. A Comparison of Soil-Water Sampling Techniques

    Science.gov (United States)

    Tindall, J. A.; Figueroa-Johnson, M.; Friedel, M. J.

    2007-12-01

The representativeness of soil pore water extracted by suction lysimeters in ground-water monitoring studies is a problem that often confounds interpretation of measured data. Current soil-water sampling techniques cannot identify the soil volume from which a pore water sample is extracted, whether macroscopic, microscopic, or a preferential flowpath. This research was undertaken to compare suction lysimeter samples extracted from intact soil cores with samples obtained by direct extraction methods, to determine what portion of soil pore water is sampled by each method. Intact soil cores (30 centimeter (cm) diameter by 40 cm height) were extracted from two different sites - a sandy soil near Altamonte Springs, Florida and a clayey soil near Centralia in Boone County, Missouri. Isotopically labeled water (δ18O, analyzed by mass spectrometry) and bromide concentrations (KBr, measured using ion chromatography) from water samples taken by suction lysimeters were compared with samples obtained by the direct extraction methods of centrifugation and azeotropic distillation. Water samples collected by direct extraction were about 0.25 ‰ more negative (depleted) than suction lysimeter values from the sandy soil and about 2-7 ‰ more negative from the well-structured clayey soil. Results indicate that the majority of soil water in well-structured soil is strongly bound to soil grain surfaces and is not easily sampled by suction lysimeters. In cases where a sufficient volume of water has passed through the soil profile and displaced previous pore water, suction lysimeters will collect a representative sample of soil pore water from the sampled depth interval. It is suggested that for stable isotope studies monitoring precipitation and soil water, suction lysimeters should be installed at shallow depths (10 cm). Samples should also be coordinated with precipitation events. The data also indicate that each extraction method be used to sample a different

  1. Can groundwater sampling techniques used in monitoring wells influence methane concentrations and isotopes?

    Science.gov (United States)

    Rivard, Christine; Bordeleau, Geneviève; Lavoie, Denis; Lefebvre, René; Malet, Xavier

    2018-03-06

    Methane concentrations and isotopic composition in groundwater are the focus of a growing number of studies. However, concerns are often expressed regarding the integrity of samples, as methane is very volatile and may partially exsolve during sample lifting in the well and transfer to sampling containers. While issues concerning bottle-filling techniques have already been documented, this paper documents a comparison of methane concentration and isotopic composition obtained with three devices commonly used to retrieve water samples from dedicated observation wells. This work lies within the framework of a larger project carried out in the Saint-Édouard area (southern Québec, Canada), whose objective was to assess the risk to shallow groundwater quality related to potential shale gas exploitation. The selected sampling devices, which were tested on ten wells during three sampling campaigns, consist of an impeller pump, a bladder pump, and disposable sampling bags (HydraSleeve). The sampling bags were used both before and after pumping, to verify the appropriateness of a no-purge approach, compared to the low-flow approach involving pumping until stabilization of field physicochemical parameters. Results show that methane concentrations obtained with the selected sampling techniques are usually similar and that there is no systematic bias related to a specific technique. Nonetheless, concentrations can sometimes vary quite significantly (up to 3.5 times) for a given well and sampling event. Methane isotopic composition obtained with all sampling techniques is very similar, except in some cases where sampling bags were used before pumping (no-purge approach), in wells where multiple groundwater sources enter the borehole.

  2. Trace uranium analysis in geological sample by isotope dilution-alpha spectrometry and comparison with other techniques

    International Nuclear Information System (INIS)

    Shihomatsu, H.M.; Iyer, S.S.

    1988-12-01

Establishment of uranium determination in geological samples by the alpha spectrometric isotope dilution technique using a 233 U tracer is described in the present work. The various steps involved in the method, namely preparation of the sample, electrodeposition, alpha spectrometry, isotope dilution, calculation of the concentration and error statistics, are discussed in detail. The experimental parameters for the electrodeposition of uranium, such as current density, pH, concentration of the electrolyte solution, deposition time and electrode distance, were all optimised based on the efficiency of the deposition. The total accuracy and precision of the IDAS using the 233 U tracer in the determination of uranium in mineral and granite samples were of the order of 1 to 2% for the concentration range of 50-1500 ppm of U. Our results are compared with those obtained by other workers using similar and different techniques. (author) [pt

  3. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. Methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, Inductively-Coupled Plasma Mass Spectrometer or Thermal Ionization Mass Spectrometer are used to measure nuclear materials after chemical treatment of sample. In the particle analysis, Electron Probe Micro Analyzer and Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  4. Beyond simple small-angle X-ray scattering: developments in online complementary techniques and sample environments.

    Science.gov (United States)

    Bras, Wim; Koizumi, Satoshi; Terrill, Nicholas J

    2014-11-01

    Small- and wide-angle X-ray scattering (SAXS, WAXS) are standard tools in materials research. The simultaneous measurement of SAXS and WAXS data in time-resolved studies has gained popularity due to the complementary information obtained. Furthermore, the combination of these data with non X-ray based techniques, via either simultaneous or independent measurements, has advanced understanding of the driving forces that lead to the structures and morphologies of materials, which in turn give rise to their properties. The simultaneous measurement of different data regimes and types, using either X-rays or neutrons, and the desire to control parameters that initiate and control structural changes have led to greater demands on sample environments. Examples of developments in technique combinations and sample environment design are discussed, together with a brief speculation about promising future developments.

  5. Standardization of proton-induced x-ray emission technique for analysis of thick samples

    Science.gov (United States)

    Ali, Shad; Zeb, Johar; Ahad, Abdul; Ahmad, Ishfaq; Haneef, M.; Akbar, Jehan

    2015-09-01

    This paper describes the standardization of the proton-induced x-ray emission (PIXE) technique for finding the elemental composition of thick samples. For the standardization, three different samples of standard reference materials (SRMs) were analyzed using this technique and the data were compared with the already known data of these certified SRMs. These samples were selected in order to cover the maximum range of elements in the periodic table. Each sample was irradiated for three different values of collected beam charges at three different times. A proton beam of 2.57 MeV obtained using 5UDH-II Pelletron accelerator was used for excitation of x-rays from the sample. The acquired experimental data were analyzed using the GUPIXWIN software. The results show that the SRM data and the data obtained using the PIXE technique are in good agreement.

  6. Review of sample preparation techniques for the analysis of pesticide residues in soil.

    Science.gov (United States)

    Tadeo, José L; Pérez, Rosa Ana; Albero, Beatriz; García-Valcárcel, Ana I; Sánchez-Brunete, Consuelo

    2012-01-01

    This paper reviews the sample preparation techniques used for the analysis of pesticides in soil. The present status and recent advances made during the last 5 years in these methods are discussed. The analysis of pesticide residues in soil requires the extraction of analytes from this matrix, followed by a cleanup procedure, when necessary, prior to their instrumental determination. The optimization of sample preparation is a very important part of the method development that can reduce the analysis time, the amount of solvent, and the size of samples. This review considers all aspects of sample preparation, including extraction and cleanup. Classical extraction techniques, such as shaking, Soxhlet, and ultrasonic-assisted extraction, and modern techniques like pressurized liquid extraction, microwave-assisted extraction, solid-phase microextraction and QuEChERS (Quick, Easy, Cheap, Effective, Rugged, and Safe) are reviewed. The different cleanup strategies applied for the purification of soil extracts are also discussed. In addition, the application of these techniques to environmental studies is considered.

  7. Laboratory techniques for safe encapsulation of α-emitting powder samples

    International Nuclear Information System (INIS)

    Chamberlain, H.E.; Pottinger, J.S.

    1984-01-01

    Plutonium oxide powder samples can be encapsulated in thin plastic film to prevent spread of contamination in counting and X-ray diffraction equipment. The film has to be thin enough to transmit X-rays and α-particles. Techniques are described for the wrapping process and the precautions necessary to keep the sample processing line free of significant contamination. (author)

  8. Sampling phased array a new technique for signal processing and ultrasonic imaging

    OpenAIRE

    Bulavinov, A.; Joneit, D.; Kröning, M.; Bernus, L.; Dalichow, M.H.; Reddy, K.M.

    2006-01-01

Different signal processing and image reconstruction techniques are applied in ultrasonic non-destructive material evaluation. In recent years, rapid development in the fields of microelectronics and computer engineering has led to wide application of phased array systems. A new phased array technique, called "Sampling Phased Array", has been developed at the Fraunhofer Institute for non-destructive testing. It realizes a unique approach to the measurement and processing of ultrasonic signals. The sampling...

  9. Biosensor-based microRNA detection: techniques, design, performance, and challenges.

    Science.gov (United States)

    Johnson, Blake N; Mutharasan, Raj

    2014-04-07

    The current state of biosensor-based techniques for amplification-free microRNA (miRNA) detection is critically reviewed. Comparison with non-sensor and amplification-based molecular techniques (MTs), such as polymerase-based methods, is made in terms of transduction mechanism, associated protocol, and sensitivity. Challenges associated with miRNA hybridization thermodynamics which affect assay selectivity and amplification bias are briefly discussed. Electrochemical, electromechanical, and optical classes of miRNA biosensors are reviewed in terms of transduction mechanism, limit of detection (LOD), time-to-results (TTR), multiplexing potential, and measurement robustness. Current trends suggest that biosensor-based techniques (BTs) for miRNA assay will complement MTs due to the advantages of amplification-free detection, LOD being femtomolar (fM)-attomolar (aM), short TTR, multiplexing capability, and minimal sample preparation requirement. Areas of future importance in miRNA BT development are presented which include focus on achieving high measurement confidence and multiplexing capabilities.

  10. Sample application of sensitivity/uncertainty analysis techniques to a groundwater transport problem. National Low-Level Waste Management Program

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rood, A.S.; Harris, G.A.; Maheras, S.J.; Kotecki, M.

    1991-06-01

    The primary objective of this document is to provide sample applications of selected sensitivity and uncertainty analysis techniques within the context of the radiological performance assessment process. These applications were drawn from the companion document Guidelines for Sensitivity and Uncertainty Analyses of Low-Level Radioactive Waste Performance Assessment Computer Codes (S. Maheras and M. Kotecki, DOE/LLW-100, 1990). Three techniques are illustrated in this document: one-factor-at-a-time (OFAT) analysis, fractional factorial design, and Latin hypercube sampling. The report also illustrates the differences in sensitivity and uncertainty analysis at the early and latter stages of the performance assessment process, and potential pitfalls that can be encountered when applying the techniques. The emphasis is on application of the techniques as opposed to the actual results, since the results are hypothetical and are not based on site-specific conditions
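Of the three techniques the document illustrates, Latin hypercube sampling is the most compact to sketch: each input dimension is cut into equal-probability strata, one point is drawn per stratum, and the strata are permuted independently per dimension. A minimal sketch on the unit hypercube (an illustration, not the report's code):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """Latin hypercube sample on [0, 1)^d: each axis is cut into
    n_samples equal strata, one jittered point lands in each stratum,
    and the stratum order is permuted independently per dimension."""
    # One jittered point per stratum, same ordering in every column
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    # Independent permutations decouple the dimensions
    for d in range(n_dims):
        u[:, d] = u[rng.permutation(n_samples), d]
    return u

pts = latin_hypercube(10, 3, np.random.default_rng(0))
# Every column has exactly one point in each of the 10 strata
assert all(len(set((pts[:, d] * 10).astype(int))) == 10 for d in range(3))
```

The stratification guarantees that each input parameter's full range is covered even with a small number of model runs, which is why the technique is popular for performance assessment codes.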

  11. Stochastic bounded consensus tracking of leader-follower multi-agent systems with measurement noises based on sampled data with general sampling delay

    International Nuclear Information System (INIS)

    Wu Zhi-Hai; Peng Li; Xie Lin-Bo; Wen Ji-Wei

    2013-01-01

    In this paper we provide a unified framework for consensus tracking of leader-follower multi-agent systems with measurement noises based on sampled data with a general sampling delay. First, a stochastic bounded consensus tracking protocol based on sampled data with a general sampling delay is presented by employing the delay decomposition technique. Then, necessary and sufficient conditions are derived for guaranteeing leader-follower multi-agent systems with measurement noises and a time-varying reference state to achieve mean square bounded consensus tracking. The obtained results cover no sampling delay, a small sampling delay and a large sampling delay as three special cases. Last, simulations are provided to demonstrate the effectiveness of the theoretical results. (interdisciplinary physics and related areas of science and technology)

  12. Convergence and Efficiency of Adaptive Importance Sampling Techniques with Partial Biasing

    Science.gov (United States)

    Fort, G.; Jourdain, B.; Lelièvre, T.; Stoltz, G.

    2018-04-01

    We propose a new Monte Carlo method to efficiently sample a multimodal distribution (known up to a normalization constant). We consider a generalization of the discrete-time Self Healing Umbrella Sampling method, which can also be seen as a generalization of well-tempered metadynamics. The dynamics is based on an adaptive importance technique. The importance function relies on the weights (namely the relative probabilities) of disjoint sets which form a partition of the space. These weights are unknown but are learnt on the fly yielding an adaptive algorithm. In the context of computational statistical physics, the logarithm of these weights is, up to an additive constant, the free-energy, and the discrete valued function defining the partition is called the collective variable. The algorithm falls into the general class of Wang-Landau type methods, and is a generalization of the original Self Healing Umbrella Sampling method in two ways: (i) the updating strategy leads to a larger penalization strength of already visited sets in order to escape more quickly from metastable states, and (ii) the target distribution is biased using only a fraction of the free-energy, in order to increase the effective sample size and reduce the variance of importance sampling estimators. We prove the convergence of the algorithm and analyze numerically its efficiency on a toy example.
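As a rough illustration of the Wang-Landau-type weight learning the abstract describes (not the authors' algorithm, which additionally biases with only a fraction of the free energy), a toy Metropolis sampler on a one-dimensional bimodal density can penalize already-visited bins of a partition with a decreasing gain, helping the chain escape metastable states:

```python
import math
import random

random.seed(1)

def log_p(x):
    # Toy bimodal target: wells at x = -1 and x = +1, barrier at x = 0
    return -2.0 * (x * x - 1.0) ** 2

NBINS = 12  # partition of [-3, 3) into bins of width 0.5 (the "collective variable")

def bin_of(x):
    return min(NBINS - 1, max(0, int((x + 3.0) / 0.5)))

theta = [0.0] * NBINS   # running log-weight (free-energy) estimates
counts = [0] * NBINS
x = -1.0
for t in range(1, 20001):
    y = x + random.gauss(0.0, 0.5)
    if -3.0 < y < 3.0:
        # Metropolis step on the biased target log p(.) - theta[bin(.)]
        if math.log(random.random()) < (log_p(y) - theta[bin_of(y)]) - (log_p(x) - theta[bin_of(x)]):
            x = y
    theta[bin_of(x)] += 1.0 / t   # penalize the visited bin, with decreasing gain
    counts[bin_of(x)] += 1

# Both modes (bins near -1 and near +1) should have been visited
assert sum(counts[:6]) > 0 and sum(counts[6:]) > 0
```

After convergence, theta approximates (up to an additive constant) the free energy of the bins, and importance-sampling estimators can be reweighted accordingly; the paper's contribution is the faster penalization and partial-biasing refinements of this basic scheme.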

  13. Beyond simple small-angle X-ray scattering: developments in online complementary techniques and sample environments

    Directory of Open Access Journals (Sweden)

    Wim Bras

    2014-11-01

    Full Text Available Small- and wide-angle X-ray scattering (SAXS, WAXS are standard tools in materials research. The simultaneous measurement of SAXS and WAXS data in time-resolved studies has gained popularity due to the complementary information obtained. Furthermore, the combination of these data with non X-ray based techniques, via either simultaneous or independent measurements, has advanced understanding of the driving forces that lead to the structures and morphologies of materials, which in turn give rise to their properties. The simultaneous measurement of different data regimes and types, using either X-rays or neutrons, and the desire to control parameters that initiate and control structural changes have led to greater demands on sample environments. Examples of developments in technique combinations and sample environment design are discussed, together with a brief speculation about promising future developments.

  14. Techniques of sample attack used in soil and mineral analysis. Phase I

    International Nuclear Information System (INIS)

    Chiu, N.W.; Dean, J.R.; Sill, C.W.

    1984-07-01

    Several techniques of sample attack for the determination of radioisotopes are reviewed. These techniques include: 1) digestion with nitric or hydrochloric acid in Parr digestion bomb, 2) digestion with a mixture of nitric and hydrochloric acids, 3) digestion with a mixture of hydrofluoric, nitric and perchloric acids, and 4) fusion with sodium carbonate, potassium fluoride or alkali pyrosulfates. The effectiveness of these techniques to decompose various soils and minerals containing radioisotopes such as lead-210 uranium, thorium and radium-226 are discussed. The combined procedure of potassium fluoride fusion followed by alkali pyrosulfate fusion is recommended for radium-226, uranium and thorium analysis. This technique guarantees the complete dissolution of samples containing refractory materials such as silica, silicates, carbides, oxides and sulfates. For the lead-210 analysis, the procedure of digestion with a mixture of hydrofluoric, nitric and perchloric acids followed by fusion with alkali pyrosulfate is recommended. These two procedures are detailed. Schemes for the sequential separation of the radioisotopes from a dissolved sample solution are outlined. Procedures for radiochemical analysis are suggested

  15. Assessment of Natural Radioactivity in TENORM Samples Using Different Techniques

    International Nuclear Information System (INIS)

    Salman, Kh.A.; Shahein, A.Y.

    2009-01-01

In the petroleum oil industry, technologically-enhanced naturally occurring radioactive materials are produced. The presence of TENORM constitutes a significant radiological human health hazard. In the present work, the liquid scintillation counting technique was used to determine both 222 Rn and 226 Ra concentrations in TENORM samples, by measuring 222 Rn concentrations in each sample at different intervals of time after preparation. The radiation doses from the TENORM samples were estimated using a thermoluminescent detector (TLD-4000). The estimated radiation doses were found to be proportional both to the radiation doses measured on site and to the natural activity concentrations in the samples measured with LSC
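Measuring 222 Rn at different times after preparation works because the unsupported radon decays away while radium-supported radon grows in; two measurements then determine both activities. A sketch of that two-measurement inversion, using the standard decay/ingrowth law and hypothetical activity values:

```python
import math

LAMBDA_RN = math.log(2) / 3.823  # 222Rn decay constant, per day (t1/2 ~ 3.82 d)

def rn_activity(t_days, a_rn0, a_ra):
    """Total 222Rn activity at time t: unsupported decay plus ingrowth
    from 226Ra (assumed in secular equilibrium with its own long half-life)."""
    decay = math.exp(-LAMBDA_RN * t_days)
    return a_rn0 * decay + a_ra * (1.0 - decay)

def solve_rn_ra(t1, a1, t2, a2):
    """Recover initial 222Rn and 226Ra activities from two LSC
    measurements by solving the 2x2 linear system a_i = rn0*d_i + ra*(1-d_i)."""
    d1, d2 = math.exp(-LAMBDA_RN * t1), math.exp(-LAMBDA_RN * t2)
    a_ra = (a1 * d2 - a2 * d1) / (d2 - d1)
    a_rn0 = (a1 - a_ra * (1.0 - d1)) / d1
    return a_rn0, a_ra

# Round trip on synthetic activities (Bq/L, hypothetical)
a1 = rn_activity(1.0, 5.0, 2.0)
a2 = rn_activity(10.0, 5.0, 2.0)
rn0, ra = solve_rn_ra(1.0, a1, 10.0, a2)
assert abs(rn0 - 5.0) < 1e-9 and abs(ra - 2.0) < 1e-9
```

In practice more than two counting times are used and the parameters are fitted by least squares, but the two-point inversion shows why a time series separates the two radionuclides.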

  16. Multiple sensitive estimation and optimal sample size allocation in the item sum technique.

    Science.gov (United States)

    Perri, Pier Francesco; Rueda García, María Del Mar; Cobo Rodríguez, Beatriz

    2018-01-01

    For surveys of sensitive issues in life sciences, statistical procedures can be used to reduce nonresponse and social desirability response bias. Both of these phenomena provoke nonsampling errors that are difficult to deal with and can seriously flaw the validity of the analyses. The item sum technique (IST) is a very recent indirect questioning method derived from the item count technique that seeks to procure more reliable responses on quantitative items than direct questioning while preserving respondents' anonymity. This article addresses two important questions concerning the IST: (i) its implementation when two or more sensitive variables are investigated and efficient estimates of their unknown population means are required; (ii) the determination of the optimal sample size to achieve minimum variance estimates. These aspects are of great relevance for survey practitioners engaged in sensitive research and, to the best of our knowledge, were not studied so far. In this article, theoretical results for multiple estimation and optimal allocation are obtained under a generic sampling design and then particularized to simple random sampling and stratified sampling designs. Theoretical considerations are integrated with a number of simulation studies based on data from two real surveys and conducted to ascertain the efficiency gain derived from optimal allocation in different situations. One of the surveys concerns cannabis consumption among university students. Our findings highlight some methodological advances that can be obtained in life sciences IST surveys when optimal allocation is achieved. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
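Under simple random sampling within strata, the minimum-variance allocation the abstract refers to reduces, in its simplest form, to the classical Neyman allocation, with stratum sample sizes proportional to the product of stratum size and within-stratum standard deviation. A minimal sketch with hypothetical strata, not the paper's survey data:

```python
def neyman_allocation(n_total, strata):
    """Neyman allocation: n_h proportional to N_h * S_h, which minimizes
    the variance of the stratified mean estimator for a fixed total n."""
    weights = [N * S for N, S in strata]
    total = sum(weights)
    return [round(n_total * w / total) for w in weights]

# Hypothetical strata: (population size N_h, within-stratum std dev S_h)
strata = [(5000, 2.0), (3000, 6.0), (2000, 10.0)]
alloc = neyman_allocation(480, strata)
assert alloc == [100, 180, 200]
```

Note how the smallest but most variable stratum receives the largest sample; the paper extends this logic to several sensitive variables estimated jointly under the IST.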

  17. Determination of trace element contents in grass samples for cattle feeding using NAA techniques

    Energy Technology Data Exchange (ETDEWEB)

    Yusof, Alias Mohamad; Jagir Singh, Jasbir Kaur

    1987-09-01

An investigation of the trace element contents in six types of grass samples used for cattle feeding has been carried out using NAA techniques. The grass samples, Mardi Digit, African Star, Signal, Guinea, Setaria and Setaria Splendida, were found to contain at least 19 trace elements in varying concentrations. The results were compared with figures obtained from available sources to ascertain whether the grass samples studied would satisfy the minimum requirements for trace elements present in grass for cattle feeding. The preference regarding the suitability of the grass samples for cattle feeding was based on the availability and abundance of the trace elements, taking into account factors such as the degree of toxicity, inadequate amounts and contamination due to the presence of other trace elements not essential for cattle feeding.

  18. Determination of trace element contents in grass samples for cattle feeding using NAA techniques

    International Nuclear Information System (INIS)

    Alias Mohamad Yusof; Jasbir Kaur Jagir Singh

    1987-01-01

An investigation of the trace element contents in six types of grass samples used for cattle feeding has been carried out using NAA techniques. The grass samples, Mardi Digit, African Star, Signal, Guinea, Setaria and Setaria Splendida, were found to contain at least 19 trace elements in varying concentrations. The results were compared with figures obtained from available sources to ascertain whether the grass samples studied would satisfy the minimum requirements for trace elements present in grass for cattle feeding. The preference regarding the suitability of the grass samples for cattle feeding was based on the availability and abundance of the trace elements, taking into account factors such as the degree of toxicity, inadequate amounts and contamination due to the presence of other trace elements not essential for cattle feeding. (author)

  19. Trace elements and As speciation analysis of fly ash samples from an Indonesian coal power plant by means of neutron activation analysis and synchrotron based techniques

    International Nuclear Information System (INIS)

    Muhayatun Santoso; Diah Dwiana Lestiani; Endah Damastuti; Syukria Kurniawat; Bennett, J.W.; Juan Jose Leani; Mateusz Czyzycki; Alessandro Migliori; Germanos Karydas, Andreas

    2016-01-01

The elemental characterization of coal fly ash samples is required to estimate coal burning emissions into the environment and to assess their potential impact on the biosphere. Fly ash samples collected from a coal-fired power plant in Central Java, Indonesia were characterized by instrumental neutron activation analysis at two different facilities (BATAN, ANSTO) and by synchrotron-based techniques at Elettra, Italy. An assessment of thirty (30) elements and an investigation of the potential toxicity of As species in coal fly ash are presented. The results obtained are discussed and compared with those reported from other regions of the world. (author)

  20. Cleaning and Cleanliness Verification Techniques for Mars Returned Sample Handling

    Science.gov (United States)

    Mickelson, E. T.; Lindstrom, D. J.; Allton, J. H.; Hittle, J. D.

    2002-01-01

    Precision cleaning and cleanliness verification techniques are examined as a subset of a comprehensive contamination control strategy for a Mars sample return mission. Additional information is contained in the original extended abstract.

  1. Compressive Sampling based Image Coding for Resource-deficient Visual Communication.

    Science.gov (United States)

    Liu, Xianming; Zhai, Deming; Zhou, Jiantao; Zhang, Xinfeng; Zhao, Debin; Gao, Wen

    2016-04-14

In this paper, a new compressive sampling based image coding scheme is developed to achieve competitive coding efficiency at lower encoder computational complexity, while supporting error resilience. This technique is particularly suitable for visual communication with resource-deficient devices. At the encoder, a compact image representation is produced: a polyphase down-sampled version of the input image in which the conventional low-pass filter prior to down-sampling is replaced by a local random binary convolution kernel. The pixels of the resulting down-sampled pre-filtered image are local random measurements placed in the original spatial configuration. The advantages of local random measurements are twofold: 1) they preserve high-frequency image features that would otherwise be discarded by low-pass filtering; 2) they remain a conventional image and can therefore be coded by any standardized codec to remove statistical redundancy at larger scales. Moreover, measurements generated by different kernels can be considered multiple descriptions of the original image, so the proposed scheme also has the advantage of multiple description coding. At the decoder, a unified sparsity-based soft-decoding technique is developed to recover the original image from the received measurements in a compressive sensing framework. Experimental results demonstrate that the proposed scheme is competitive with existing methods, with a unique strength in recovering fine details and sharp edges at low bit-rates.
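The encoder step this abstract describes (random binary pre-filtering followed by polyphase down-sampling) can be sketched in a few lines. The kernel size, normalization, and border handling below are illustrative assumptions, not details taken from the paper.

```python
import random

def random_binary_kernel(size=3, seed=0):
    # Hypothetical local random binary (0/1) convolution kernel, normalized
    # so each measurement stays within the original pixel range.
    rng = random.Random(seed)
    k = [[rng.randint(0, 1) for _ in range(size)] for _ in range(size)]
    total = sum(map(sum, k)) or 1
    return [[v / total for v in row] for row in k]

def measure_and_downsample(img, kernel, factor=2):
    # Convolve with the random kernel (clamped borders), then keep every
    # `factor`-th pixel in each direction: the kept pixels are local random
    # measurements laid out in the original spatial configuration.
    h, w = len(img), len(img[0])
    ks, half = len(kernel), len(kernel) // 2
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            acc = 0.0
            for dy in range(ks):
                for dx in range(ks):
                    yy = min(max(y + dy - half, 0), h - 1)
                    xx = min(max(x + dx - half, 0), w - 1)
                    acc += kernel[dy][dx] * img[yy][xx]
            row.append(acc)
        out.append(row)
    return out
```

Because the output is still an ordinary (smaller) image, it can be fed to any standardized codec, which is the property the scheme relies on.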

  2. Field-based detection of biological samples for forensic analysis: Established techniques, novel tools, and future innovations.

    Science.gov (United States)

    Morrison, Jack; Watts, Giles; Hobbs, Glyn; Dawnay, Nick

    2018-04-01

Field-based forensic tests commonly provide information on the presence and identity of biological stains and can also support the identification of species. Such information can support downstream processing of forensic samples and generate rapid intelligence. These approaches have traditionally used chemical and immunological techniques to elicit the result, but some are known to suffer from a lack of specificity and sensitivity. The last 10 years have seen the development of field-based genetic profiling systems, with specific focus on moving the mainstay of forensic genetic analysis, namely STR profiling, out of the laboratory and into the hands of the non-laboratory user. In doing so it is now possible for enforcement officers to generate a crime scene DNA profile which can then be matched to a reference or database profile. The introduction of these novel genetic platforms also allows for further development of new molecular assays aimed at answering the more traditional questions relating to body fluid identity and species detection. The current drive for field-based molecular tools is a response to the needs of the criminal justice system and enforcement agencies, and promises a step-change in how forensic evidence is processed. However, the adoption of such systems by the law enforcement community does not represent a new strategy in the way forensic science has integrated previous novel approaches. Nor do they automatically represent a threat to the quality control and assurance practices that are central to the field.
This review examines the historical need for, and subsequent research and developmental breakthroughs in, field-based forensic analysis over the past two decades, with particular focus on genetic methods. Emerging technologies from a range of scientific fields that have potential applications in forensic analysis at the crime scene are identified, and the issues that arise from the shift from laboratory into operational field use are discussed.

  3. A Study of Assimilation Bias in Name-Based Sampling of Migrants

    Directory of Open Access Journals (Sweden)

    Schnell Rainer

    2014-06-01

The use of personal names for screening is an increasingly popular sampling technique for migrant populations. Although this is often an effective sampling procedure, very little is known about the properties of this method. Based on a large German survey, this article compares characteristics of respondents whose names have been correctly classified as belonging to a migrant population with respondents who are migrants but whose names have not been classified as belonging to a migrant population. Although significant differences were found for some variables, even with some large effect sizes, the overall bias introduced by name-based sampling (NBS) is small as long as procedures with small false-negative rates are employed.

  4. The development and use of parametric sampling techniques for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Dalrymple, G.J.; Broyd, T.W.

    1987-01-01

In order to enable evaluation of proposals for the underground disposal of low and intermediate level radioactive wastes in the United Kingdom, the Department of the Environment (DoE) research programme includes the development of computer-based methods for use in a multistage assessment process. To test the adequacy of the various methods of data acquisition and radiological assessment, a mock assessment exercise is currently being conducted by the Department. This paper outlines the proposed methodology, which provides for probabilistic modelling based upon the Atomic Energy of Canada Ltd SYVAC variability analysis approach, using new models (SYVAC 'A') and data appropriate to UK conditions for a deep horizontal tunnel repository concept. This chapter describes the choice of a suitable technique for sampling the data input to the SYVAC 'A' model, and techniques for analysing the predictions of dose and risk made by the model. The sensitivity of the model predictions (risk and dose to man) to the input parameters was compared for four different methods. All four methods identified the same geological parameters as the most important. (author)

  5. THE STUDY OF HEAVY METAL FROM ENVIRONMENTAL SAMPLES BY ATOMIC TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Ion V. POPESCU

    2011-05-01

Using the atomic absorption spectrometry (AAS) and energy dispersive X-ray fluorescence (EDXRF) techniques, we analyzed the contents of heavy metals (Cd, Cr, Ni, Pb, Ti, Sr, Co, Bi) in eight wild mushroom species and their soil substrates (48 samples of eight fungal species and 32 underlying soil samples), collected from ten forest sites of Dambovița County, Romania. It was determined that the elements, especially heavy metals, in the soil were characteristic of the acidic soils of Romanian forest lands and are influenced by industrial pollution. The analytical possibilities of the AAS and EDXRF techniques have been compared, and the heavy metal transfer from substrate to mushrooms has been studied. The coefficient of accumulation of essential and heavy metals has been calculated as well. Heavy metal contents of all analyzed mushrooms were generally higher than previously reported in the literature.

  6. Comparison between correlated sampling and the perturbation technique of MCNP5 for fixed-source problems

    International Nuclear Information System (INIS)

    He Tao; Su Bingjing

    2011-01-01

    Highlights: → The performance of the MCNP differential operator perturbation technique is compared with that of the MCNP correlated sampling method for three types of fixed-source problems. → In terms of precision, the MCNP perturbation technique outperforms correlated sampling for one type of problem but performs comparably with or even under-performs correlated sampling for the other two types of problems. → In terms of accuracy, the MCNP perturbation calculations may predict inaccurate results for some of the test problems. However, the accuracy can be improved if the midpoint correction technique is used. - Abstract: Correlated sampling and the differential operator perturbation technique are two methods that enable MCNP (Monte Carlo N-Particle) to simulate small response change between an original system and a perturbed system. In this work the performance of the MCNP differential operator perturbation technique is compared with that of the MCNP correlated sampling method for three types of fixed-source problems. In terms of precision of predicted response changes, the MCNP perturbation technique outperforms correlated sampling for the problem involving variation of nuclide concentrations in the same direction but performs comparably with or even underperforms correlated sampling for the other two types of problems that involve void or variation of nuclide concentrations in opposite directions. In terms of accuracy, the MCNP differential operator perturbation calculations may predict inaccurate results that deviate from the benchmarks well beyond their uncertainty ranges for some of the test problems. However, the accuracy of the MCNP differential operator perturbation can be improved if the midpoint correction technique is used.
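The idea behind correlated sampling, one of the two MCNP methods compared here, can be illustrated on a toy fixed-source problem: estimating the change in transmission through a purely absorbing slab when its total cross-section is perturbed. This is a minimal sketch of the general technique under stated assumptions, not MCNP's implementation.

```python
import math
import random

def transmission_diff(sigma, dsigma, length, n, correlated=True, seed=1):
    # Toy fixed-source problem: a particle is transmitted through a purely
    # absorbing slab if its sampled free path exceeds the slab thickness.
    # We estimate the *change* in transmission probability when the total
    # cross-section goes from sigma to sigma + dsigma. With correlated
    # sampling the same random number drives both histories, so the small
    # difference is estimated with far less variance than independent runs.
    rng = random.Random(seed)
    diff = 0.0
    for _ in range(n):
        u = rng.random()
        t_base = 1.0 if -math.log(u) / sigma > length else 0.0
        if not correlated:
            u = rng.random()  # independent history for the perturbed system
        t_pert = 1.0 if -math.log(u) / (sigma + dsigma) > length else 0.0
        diff += t_pert - t_base
    return diff / n
```

For this toy case the analytic answer is exp(-(sigma+dsigma)·L) - exp(-sigma·L); the correlated estimator tracks it closely because the two histories differ only for the narrow band of random numbers where the perturbation flips the transmission outcome.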

  7. A technique for extracting blood samples from mice in fire toxicity tests

    Science.gov (United States)

    Bucci, T. J.; Hilado, C. J.; Lopez, M. T.

    1976-01-01

    The extraction of adequate blood samples from moribund and dead mice has been a problem because of the small quantity of blood in each animal and the short time available between the animals' death and coagulation of the blood. These difficulties are particularly critical in fire toxicity tests because removal of the test animals while observing proper safety precautions for personnel is time-consuming. Techniques for extracting blood samples from mice were evaluated, and a technique was developed to obtain up to 0.8 ml of blood from a single mouse after death. The technique involves rapid exposure and cutting of the posterior vena cava and accumulation of blood in the peritoneal space. Blood samples of 0.5 ml or more from individual mice have been consistently obtained as much as 16 minutes after apparent death. Results of carboxyhemoglobin analyses of blood appeared reproducible and consistent with carbon monoxide concentrations in the exposure chamber.

  8. Comparison of sampling techniques for Rift Valley Fever virus ...

    African Journals Online (AJOL)

    We investigated mosquito sampling techniques with two types of traps and attractants at different time for trapping potential vectors for Rift Valley Fever virus. The study was conducted in six villages in Ngorongoro district in Tanzania from September to October 2012. A total of 1814 mosquitoes were collected, of which 738 ...

  9. Assessing the precision of a time-sampling-based study among GPs: balancing sample size and measurement frequency.

    Science.gov (United States)

    van Hassel, Daniël; van der Velden, Lud; de Bakker, Dinny; van der Hoek, Lucas; Batenburg, Ronald

    2017-12-04

Our research is based on a time-sampling technique, an innovative method for measuring the working hours of Dutch general practitioners (GPs), which was deployed in an earlier study. In that study, 1051 GPs were questioned about their activities in real time by sending them one SMS text message every 3 h during 1 week. The required sample size for such a study is important for health workforce planners to know if they want to apply this method to target groups that are hard to reach or when fewer resources are available. In this time-sampling method, however, a standard power analysis is not sufficient for calculating the required sample size, as it accounts only for sample fluctuation and not for the fluctuation of measurements taken from every participant. We investigated the impact of the number of participants and the frequency of measurements per participant upon the confidence intervals (CIs) for the hours worked per week. Statistical analyses of the time-use data we obtained from GPs were performed. Ninety-five percent CIs were calculated, using equations and simulation techniques, for various numbers of GPs included in the dataset and for various frequencies of measurements per participant. Our results showed that the one-tailed CI, including sample and measurement fluctuation, decreased from 21 to 3 h as the number of GPs increased from one to 50. Beyond that point, the precision continued to increase, but by less for each additional GP. Likewise, the analyses showed how the number of participants required decreases if more measurements per participant are taken. For example, one measurement per 3-h time slot during the week requires 300 GPs to achieve a CI of 1 h, while one measurement per hour requires 100 GPs to obtain the same result. The sample size needed for time-use research based on a time-sampling technique depends on the design and aim of the study. In this paper, we showed how the precision of the
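The interplay between sample fluctuation and measurement fluctuation that the abstract describes can be made concrete with a standard two-component variance formula. The two standard deviations below are illustrative assumptions, not values estimated in the study.

```python
import math

def ci_halfwidth(n_gps, n_meas, sd_between=7.0, sd_within=12.0, z=1.96):
    # 95% CI half-width (hours/week) for mean working hours when both
    # sample fluctuation (between-GP spread, sd_between) and measurement
    # fluctuation (within-GP spread per time-sampling observation,
    # sd_within) contribute. Both standard deviations are assumed numbers
    # chosen for illustration only.
    variance = sd_between ** 2 / n_gps + sd_within ** 2 / (n_gps * n_meas)
    return z * math.sqrt(variance)
```

For a fixed number of GPs, more messages per week shrink only the measurement-fluctuation term, so precision gains flatten out once the between-GP term dominates: the same trade-off between sample size and measurement frequency that the abstract reports.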

  10. Application of nuclear and allied techniques for the characterisation of forensic samples

    International Nuclear Information System (INIS)

    Sudersanan, M.; Kayasth, S.R.; Pant, D.R.; Chattopadhyay, N.; Bhattacharyya, C.N.

    2002-01-01

Full text: Forensic science deals with the application of techniques from physics, chemistry and biology to crime investigation. The legal implications of such analyses place considerable restrictions on the choice of analytical techniques. Moreover, the unknown nature of the materials, the limited availability of samples and the large number of elements to be analysed place considerable strain on the analytical chemist in selecting the appropriate technique. The availability of nuclear techniques has considerably enhanced the scope of forensic analysis. This paper deals with recent results on the use of nuclear and allied analytical techniques for forensic applications. One important type of sample of forensic importance pertains to the identification of gunshot residues. The use of nuclear techniques has considerably simplified the interpretation of results through the use of appropriate elements such as Ba, Cu, Sb, Zn, As and Sn. The combination of non-nuclear techniques for elements such as Pb and Ni, which are not easily amenable to analysis by NAA, with an appropriate separation procedure has made this method a valid and versatile analytical procedure. In view of the presence of large amounts of extraneous materials such as cloth and body tissue in these samples, and the limited availability of material, the procedures for sample collection, dissolution and analysis have been standardized. Analysis of unknown materials such as powders and metallic pieces, for the possible presence of nuclear materials or of materials in illicit trafficking, has become important in recent years. The use of a multi-technique approach is important in this case. Use of non-destructive techniques such as XRF and radioactive counting enables the preliminary identification of materials and the detection of radioactivity. Subsequent analysis by NAA or other appropriate analytical methods allows the characterization of the materials. Such

  11. Waste minimization in analytical chemistry through innovative sample preparation techniques

    International Nuclear Information System (INIS)

    Smith, L. L.

    1998-01-01

Because toxic solvents and other hazardous materials are commonly used in analytical methods, characterization procedures result in significant and costly amounts of waste. We are developing alternative analytical methods in the radiological and organic areas to reduce the volume or form of the hazardous waste produced during sample analysis. For the radiological area, we have examined high-pressure, closed-vessel microwave digestion as a way to minimize waste from sample preparation operations. Heated solutions of strong mineral acids can be avoided for sample digestion by using the microwave approach. Because reactivity increases with pressure, we examined the use of less hazardous solvents to leach selected contaminants from soil for subsequent analysis. We demonstrated the feasibility of this approach by extracting plutonium from a NET reference material using citric and tartaric acids with microwave digestion. Analytical results were comparable to traditional digestion methods, while hazardous waste was reduced by a factor of ten. We also evaluated the suitability of other natural acids, determined the extraction performance on a wider variety of soil types, and examined the extraction efficiency of other contaminants. For the organic area, we examined ways to minimize the wastes associated with the determination of polychlorinated biphenyls (PCBs) in environmental samples. Conventional methods for analyzing semivolatile organic compounds are labor intensive and require copious amounts of hazardous solvents. For soil and sediment samples, we have developed a method to analyze PCBs that is based on microscale extraction using benign solvents (e.g., water or hexane). The extraction is performed at elevated temperatures in stainless steel cells containing the sample and solvent. Gas chromatography-mass spectrometry (GC/MS) was used to quantitate the analytes in the isolated extract.
More recently, we developed a method utilizing solid-phase microextraction (SPME) for natural

  12. Quadrature demodulation based circuit implementation of pulse stream for ultrasonic signal FRI sparse sampling

    International Nuclear Information System (INIS)

    Shoupeng, Song; Zhou, Jiang

    2017-01-01

Converting an ultrasonic signal to an ultrasonic pulse stream is the key step of finite rate of innovation (FRI) sparse sampling. At present, ultrasonic pulse-stream-forming techniques are mainly based on digital algorithms; no hardware circuit that can achieve this has been reported. This paper proposes a new quadrature demodulation (QD) based circuit implementation method for forming an ultrasonic pulse stream. After an elaboration of FRI sparse sampling theory, the processing of the ultrasonic signal is explained, followed by a discussion and analysis of ultrasonic pulse-stream-forming methods. In contrast to ultrasonic signal envelope extraction techniques, a quadrature demodulation method (QDM) is proposed. Simulation experiments were performed to determine its performance at various signal-to-noise ratios (SNRs). The circuit was then designed, with a mixing module, an oscillator, a low-pass filter (LPF), and a root-of-square-sum module. Finally, application experiments were carried out on ultrasonic flaw testing of a pipeline sample. The experimental results indicate that the QDM can accurately convert an ultrasonic signal to an ultrasonic pulse stream and retrieve the original signal information, such as pulse width, amplitude, and time of arrival. This technique lays the foundation for ultrasonic signal FRI sparse sampling directly with hardware circuitry. (paper)
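A software analogue of the QD chain the abstract describes (mixer, oscillator, LPF, root of square sum) can be sketched as follows. The moving-average filter stands in for the LPF module, and all frequencies, signal parameters, and tap counts are illustrative assumptions.

```python
import math

def quadrature_demodulate(signal, fs, fc, taps=15):
    # Mix the band-pass signal with quadrature oscillators at the carrier
    # frequency fc, low-pass filter both branches (a moving average stands
    # in for the LPF; `taps` is chosen so the filter nulls the 2*fc mixing
    # product), then take the root of the sum of squares of the branches.
    n = len(signal)
    i_mix = [signal[k] * math.cos(2 * math.pi * fc * k / fs) for k in range(n)]
    q_mix = [signal[k] * math.sin(2 * math.pi * fc * k / fs) for k in range(n)]

    def moving_average(x):
        half = taps // 2
        out = []
        for k in range(n):
            lo, hi = max(0, k - half), min(n, k + half + 1)
            out.append(sum(x[lo:hi]) / (hi - lo))
        return out

    i_lp, q_lp = moving_average(i_mix), moving_average(q_mix)
    # The factor 2 undoes the 1/2 introduced by mixing (cos^2 = (1+cos2wt)/2).
    return [2.0 * math.hypot(i, q) for i, q in zip(i_lp, q_lp)]
```

Applied to a simulated Gaussian-windowed 5 MHz pulse sampled at 50 MHz, the output should peak near the pulse centre with roughly the envelope's amplitude, recovering the pulse width, amplitude, and time of arrival that the pulse stream encodes.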

  13. Determination of palladium in biological samples applying nuclear analytical techniques

    International Nuclear Information System (INIS)

    Cavalcante, Cassio Q.; Sato, Ivone M.; Salvador, Vera L. R.; Saiki, Mitiko

    2008-01-01

    This study presents Pd determinations in bovine tissue samples containing palladium prepared in the laboratory, and CCQM-P63 automotive catalyst materials of the Proficiency Test, using instrumental thermal and epithermal neutron activation analysis and energy dispersive X-ray fluorescence techniques. Solvent extraction and solid phase extraction procedures were also applied to separate Pd from interfering elements before the irradiation in the nuclear reactor. The results obtained by different techniques were compared against each other to examine sensitivity, precision and accuracy. (author)

  14. A new sampling technique for surface exposure dating using a portable electric rock cutter

    Directory of Open Access Journals (Sweden)

    Yusuke Suganuma

    2012-07-01

Surface exposure dating using in situ cosmogenic nuclides has contributed to our understanding of Earth-surface processes. The precision of the ages estimated by this method is affected by the sample geometry; therefore, high-accuracy measurement of the thickness and shape of the rock sample is crucial. However, it is sometimes difficult to meet these requirements with conventional sampling methods using a hammer and chisel. Here, we propose a new sampling technique using a portable electric rock cutter. This sampling technique is faster, produces more precisely shaped samples, and allows for a more precise age interpretation. A simple theoretical model demonstrates that the age error due to defective sample geometry increases as the total sample thickness increases, indicating the importance of precise sampling for surface exposure dating.

  15. Mantle biopsy: a technique for nondestructive tissue-sampling of freshwater mussels

    Science.gov (United States)

    David J. Berg; Wendell R. Haag; Sheldon I. Guttman; James B. Sickel

    1995-01-01

    Mantle biopsy is a means of obtaining tissue samples for genetic, physiological, and contaminant studies of bivalves; but the effects of this biopsy on survival have not been determined. We describe a simple technique for obtaining such samples from unionacean bivalves and how we compared survival among biopsied and control organisms in field experiments. Survival was...

  16. A Monte Carlo Sampling Technique for Multi-phonon Processes

    Energy Technology Data Exchange (ETDEWEB)

    Hoegberg, Thure

    1961-12-15

    A sampling technique for selecting scattering angle and energy gain in Monte Carlo calculations of neutron thermalization is described. It is supposed that the scattering is separated into processes involving different numbers of phonons. The number of phonons involved is first determined. Scattering angle and energy gain are then chosen by using special properties of the multi-phonon term.

  17. TRAN-STAT: statistics for environmental studies, Number 22. Comparison of soil-sampling techniques for plutonium at Rocky Flats

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Bernhardt, D.E.; Hahn, P.B.

    1983-01-01

A summary of a field soil sampling study conducted around the Rocky Flats, Colorado plant in May 1977 is presented. Several different soil sampling techniques that had been used in the area were applied at four different sites. One objective was to compare the average 239-240Pu concentration values obtained by the various soil sampling techniques used. There was also interest in determining whether there are differences in the reproducibility of the various techniques and how the techniques compared with the proposed EPA technique of sampling to 1 cm depth. Statistically significant differences in average concentrations between the techniques were found. The differences could be largely related to the differences in sampling depth, the primary physical variable between the techniques. The reproducibility of the techniques was evaluated by comparing coefficients of variation. Differences between coefficients of variation were not statistically significant. Average (median) coefficients ranged from 21 to 42 percent for the five sampling techniques. A laboratory study indicated that various sample treatment and particle sizing techniques could increase the concentration of plutonium in the less than 10 micrometer size fraction by up to a factor of about 4 compared to the 2 mm size fraction.

  18. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper, at that session, in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result over that if conventional source-sampling methods are used. However, this gain in reliability is substantially less than that observed in the model-problem results
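The general principle of stratified source-sampling (forcing the source samples to cover every stratum instead of drawing them all independently) can be illustrated on a one-dimensional integration toy problem. This is a generic sketch of the idea, not the Argonne eigenvalue implementation.

```python
import random

def estimate_mean(f, n, strata=1, seed=0):
    # Source-sampling of U(0,1): with strata > 1 the i-th sample is forced
    # into sub-interval (i mod strata), so every stratum receives the same
    # number of source points and the between-stratum component of the
    # variance is removed. A generic illustration of stratified
    # source-sampling, not the MCNP/Argonne eigenvalue machinery.
    rng = random.Random(seed)
    total = 0.0
    for i in range(n):
        s = i % strata
        total += f((s + rng.random()) / strata)
    return total / n
```

For f(u) = u^2 (true mean 1/3), stratifying 10 000 source samples into 100 strata leaves only the small within-stratum variance, so the stratified estimate is typically far tighter than plain independent sampling with the same sample count.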

  19. The Effect of Learning Based on Technology Model and Assessment Technique toward Thermodynamic Learning Achievement

    Science.gov (United States)

    Makahinda, T.

    2018-02-01

The purpose of this research is to find out the effect of a technology-based learning model and assessment technique on thermodynamics achievement, controlling for student intelligence. This is an experimental study. The sample was taken through cluster random sampling, with a total of 80 student respondents. The results show that, after controlling for student intelligence, the thermodynamics achievement of students taught with the environment-utilization learning model is higher than that of students taught with animated simulations. There is also an interaction effect between the technology-based learning model and the assessment technique on students' thermodynamics achievement, after controlling for student intelligence. Based on these findings, thermodynamics lectures should use the environment-utilization learning model together with the project assessment technique.

  20. Analytical techniques for measurement of 99Tc in environmental samples

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

Three new methods have been developed for measuring 99Tc in environmental samples. The most sensitive method is isotope dilution mass spectrometry, which allows measurement of about 1 x 10^-12 grams of 99Tc. Results of the analysis of five samples by this method compare very well with values obtained by a second, independent method, which involves counting of beta particles from 99Tc and internal conversion electrons from 97mTc. A third method involving electrothermal atomic absorption has also been developed. Although this method is not as sensitive as the first two techniques, the cost per analysis is expected to be considerably less for certain types of samples.

  1. Determination of some trace elements in biological samples using XRF and TXRF techniques

    International Nuclear Information System (INIS)

    Khuder, A.; Karjou, J.; Sawan, M. K.

    2006-07-01

XRF and TXRF techniques were successfully used for the multi-element determination of trace elements in whole blood and human head hair samples. This was achieved by direct analysis using the XRF technique with different collimation units, and by optimized chemical procedures for TXRF analysis. The light elements S and P were preferably determined by XRF with primary X-ray excitation, while K, Ca, Fe, and Br were determined with very good accuracy and precision using XRF with Cu and Mo secondary targets. A chemical procedure based on the preconcentration of trace elements by APDC proved superior for the determination of traces of Ni and Pb, in the ranges of 1.0-1.7 μg/dl and 11-23 μg/dl respectively, in whole blood samples by the TXRF technique; determination of other elements such as Cu and Zn was also achievable using this approach. Rb in whole blood samples was determined directly for TXRF analysis, after digestion of the samples in a PTFE bomb. (author)

  2. Recent trends in sorption-based sample preparation and liquid chromatography techniques for food analysis.

    Science.gov (United States)

    V Soares Maciel, Edvaldo; de Toffoli, Ana Lúcia; Lanças, Fernando Mauro

    2018-04-20

The accelerating rise of the world's population has increased food consumption, demanding more rigor in the control of residues and contaminants in food products marketed for human consumption. In view of the complexity of most food matrices, including fruits, vegetables, different types of meat, and beverages, among others, a sample preparation step is important to provide more reliable results when combined with HPLC separations. An adequate sample preparation step before the chromatographic analysis is mandatory for obtaining higher precision and accuracy and for improving the extraction of the target analytes, one of the priorities in analytical chemistry. The recent discovery of new materials such as ionic liquids, graphene-derived materials, molecularly imprinted polymers, restricted access media, magnetic nanoparticles, and carbonaceous nanomaterials has provided high sensitivity and selectivity in an extensive variety of applications. These materials, as well as their several possible combinations, have been demonstrated to be highly appropriate for the extraction of different analytes from complex samples such as food products. The main characteristics and applications of these new materials in food analysis are presented and discussed in this paper. Another topic discussed in this review covers the main advantages and limitations of sample preparation microtechniques, as well as their off-line and on-line combination with HPLC for food analysis. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Recent advances in sample preparation techniques and methods of sulfonamides detection - A review.

    Science.gov (United States)

    Dmitrienko, Stanislava G; Kochuk, Elena V; Apyari, Vladimir V; Tolmacheva, Veronika V; Zolotov, Yury A

    2014-11-19

    Sulfonamides (SAs) have been the most widely used antimicrobial drugs for more than 70 years, and their residues in foodstuffs and environmental samples pose serious health hazards. For this reason, sensitive and specific methods for the quantification of these compounds in numerous matrices have been developed. This review intends to provide an updated overview of the recent trends over the past five years in sample preparation techniques and methods for detecting SAs. Examples of the sample preparation techniques, including liquid-liquid and solid-phase extraction, dispersive liquid-liquid microextraction and QuEChERS, are given. Different methods of detecting the SAs present in food and feed and in environmental, pharmaceutical and biological samples are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Appreciating the difference between design-based and model-based sampling strategies in quantitative morphology of the nervous system.

    Science.gov (United States)

    Geuna, S

    2000-11-20

    Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterizes the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.

  5. Elastic moduli of normal and pathological human breast tissues: an inversion-technique-based investigation of 169 samples

    Science.gov (United States)

    Samani, Abbas; Zubovits, Judit; Plewes, Donald

    2007-03-01

    Understanding and quantifying the mechanical properties of breast tissues has been a subject of interest for the past two decades. This has been motivated in part by interest in modelling soft tissue response for surgery planning and virtual-reality-based surgical training. Interpreting elastography images for diagnostic purposes also requires a sound understanding of normal and pathological tissue mechanical properties. Reliable data on tissue elastic properties are very limited, and those which are available tend to be inconsistent, in part as a result of measurement methodology. We have developed specialized techniques to measure the tissue elasticity of normal breast tissues and tumour specimens and applied them to 169 fresh ex vivo breast tissue samples, including fat and fibroglandular tissue as well as a range of benign and malignant breast tumour types. Results show that, under small deformation conditions, the elastic moduli of normal breast fat and fibroglandular tissues are similar, while fibroadenomas were approximately twice as stiff. Fibrocystic disease and malignant tumours exhibited a 3-6-fold increase in stiffness, with high-grade invasive ductal carcinoma exhibiting up to a 13-fold increase in stiffness compared to fibroglandular tissue. A statistical analysis showed that differences between the elastic moduli of the majority of these tissues were statistically significant. Implications for the specificity advantages of elastography are reviewed.

  6. Elastic moduli of normal and pathological human breast tissues: an inversion-technique-based investigation of 169 samples

    International Nuclear Information System (INIS)

    Samani, Abbas; Zubovits, Judit; Plewes, Donald

    2007-01-01

    Understanding and quantifying the mechanical properties of breast tissues has been a subject of interest for the past two decades. This has been motivated in part by interest in modelling soft tissue response for surgery planning and virtual-reality-based surgical training. Interpreting elastography images for diagnostic purposes also requires a sound understanding of normal and pathological tissue mechanical properties. Reliable data on tissue elastic properties are very limited, and those which are available tend to be inconsistent, in part as a result of measurement methodology. We have developed specialized techniques to measure the tissue elasticity of normal breast tissues and tumour specimens and applied them to 169 fresh ex vivo breast tissue samples, including fat and fibroglandular tissue as well as a range of benign and malignant breast tumour types. Results show that, under small deformation conditions, the elastic moduli of normal breast fat and fibroglandular tissues are similar, while fibroadenomas were approximately twice as stiff. Fibrocystic disease and malignant tumours exhibited a 3-6-fold increase in stiffness, with high-grade invasive ductal carcinoma exhibiting up to a 13-fold increase in stiffness compared to fibroglandular tissue. A statistical analysis showed that differences between the elastic moduli of the majority of these tissues were statistically significant. Implications for the specificity advantages of elastography are reviewed.

  7. Elastic moduli of normal and pathological human breast tissues: an inversion-technique-based investigation of 169 samples

    Energy Technology Data Exchange (ETDEWEB)

    Samani, Abbas [Department of Medical Biophysics/Electrical and Computer Engineering, University of Western Ontario, Medical Sciences Building, London, Ontario, N6A 5C1 (Canada); Zubovits, Judit [Department of Anatomic Pathology, Sunnybrook Health Sciences Centre, 2075 Bayview Avenue, Toronto, Ontario, M4N 3M5 (Canada); Plewes, Donald [Department of Medical Biophysics, University of Toronto, 2075 Bayview Avenue, Toronto, Ontario, M4N 3M5 (Canada)]

    2007-03-21

    Understanding and quantifying the mechanical properties of breast tissues has been a subject of interest for the past two decades. This has been motivated in part by interest in modelling soft tissue response for surgery planning and virtual-reality-based surgical training. Interpreting elastography images for diagnostic purposes also requires a sound understanding of normal and pathological tissue mechanical properties. Reliable data on tissue elastic properties are very limited, and those which are available tend to be inconsistent, in part as a result of measurement methodology. We have developed specialized techniques to measure the tissue elasticity of normal breast tissues and tumour specimens and applied them to 169 fresh ex vivo breast tissue samples, including fat and fibroglandular tissue as well as a range of benign and malignant breast tumour types. Results show that, under small deformation conditions, the elastic moduli of normal breast fat and fibroglandular tissues are similar, while fibroadenomas were approximately twice as stiff. Fibrocystic disease and malignant tumours exhibited a 3-6-fold increase in stiffness, with high-grade invasive ductal carcinoma exhibiting up to a 13-fold increase in stiffness compared to fibroglandular tissue. A statistical analysis showed that differences between the elastic moduli of the majority of these tissues were statistically significant. Implications for the specificity advantages of elastography are reviewed.

  8. Radiation synthesized protein-based nanoparticles: A technique overview

    International Nuclear Information System (INIS)

    Varca, Gustavo H.C.; Perossi, Gabriela G.; Grasselli, Mariano; Lugão, Ademar B.

    2014-01-01

    Seeking alternative routes for protein engineering, a novel technique – radiation-induced synthesis of protein nanoparticles – to achieve size-controlled particles with preserved bioactivity has recently been reported. This work aimed to evaluate different process conditions in order to optimize and provide an overview of the technique using γ-irradiation. Papain was used as a model protease, and the samples were irradiated in a gamma cell irradiator in phosphate buffer (pH=7.0) containing ethanol (0-35%). The dose effect was evaluated by exposure to distinct γ-irradiation doses (2.5, 5, 7.5 and 10 kGy), and scale-up experiments involving distinct protein concentrations (12.5-50 mg mL^-1) were also performed. Characterization involved size monitoring using dynamic light scattering. Bityrosine detection was performed using fluorescence measurements in order to provide experimental evidence of the mechanism involved. The best dose effects with regard to size were achieved at 10 kGy, and no relevant changes were observed as a function of papain concentration, highlighting a very broad operational concentration range. Bityrosine changes were identified in the samples as a function of the process, confirming that such linkages play an important role in nanoparticle formation. - Highlights: • Synthesis of protein-based nanoparticles by γ-irradiation. • Optimization of the technique. • Overview of the mechanism involved in nanoparticle formation. • Engineered papain nanoparticles for biomedical applications

  9. Modified emission-transmission method for determining trace elements in solid samples using the XRF techniques

    International Nuclear Information System (INIS)

    Poblete, V.; Alvarez, M.; Hermosilla, M.

    2000-01-01

    This is a study of the analysis of trace elements in solid samples of medium thickness by the modified emission-transmission method, using the energy-dispersive X-ray fluorescence (EDXRF) technique. Absorption and enhancement effects are the main disadvantages of the EDXRF technique for the quantitative analysis of major and trace elements in solid samples. The implementation of this method and its application to a variety of samples were carried out using an infinitely thick multi-element target, from which the correction factors accounting for the absorption of all the analytes in the sample are calculated. The discontinuities in the mass absorption coefficient versus energy curves for each element, for medium-thick and homogeneous samples, are analyzed and corrected. The different theoretical and experimental variables are thoroughly tested using real samples, including certified material of known concentration. The simplicity of the calculation method and the results obtained demonstrate the method's precision, with possibilities for the non-destructive routine analysis of different solid samples using the EDXRF technique (author)

  10. New imaging technique based on diffraction of a focused x-ray beam

    Energy Technology Data Exchange (ETDEWEB)

    Kazimirov, A [Cornell High Energy Synchrotron Source (CHESS), Cornell University, Ithaca, NY 14853 (United States); Kohn, V G [Russian Research Center 'Kurchatov Institute', 123182 Moscow (Russian Federation); Cai, Z-H [Advanced Photon Source, 9700 S. Cass Avenue, Argonne, IL 60439 (United States)], E-mail: ayk7@cornell.edu

    2009-01-07

    We present the first experimental results from a new depth-sensitive diffraction imaging technique. It is based on the diffraction of a focused x-ray beam from a crystalline sample and recording of the intensity pattern on a high-resolution CCD detector positioned at the focal plane. Structural non-uniformity inside the sample results in a region of enhanced intensity in the diffraction pattern. The technique was applied to study silicon-on-insulator thin layers of various thicknesses, which revealed a complex strain profile within the layers. A circular Fresnel zone plate was used as the focusing optic. Incoherent diffuse scattering spreads out of the diffraction plane and results in intensity recorded outside the focal spot, providing a new approach to separately register x-rays scattered coherently and incoherently from the sample. (fast track communication)

  11. Recent Trends in Microextraction Techniques Employed in Analytical and Bioanalytical Sample Preparation

    Directory of Open Access Journals (Sweden)

    Abuzar Kabir

    2017-12-01

    Sample preparation has been recognized as a major step in the chemical analysis workflow. As such, substantial efforts have been made in recent years to simplify the overall sample preparation process. Major focuses of these efforts have included miniaturization of the extraction device; minimizing or eliminating toxic and hazardous organic solvent consumption; eliminating sample pre-treatment and post-treatment steps; reducing the sample volume requirement; reducing the extraction equilibrium time; and maximizing extraction efficiency. All these improved attributes are congruent with the Green Analytical Chemistry (GAC) principles. Classical sample preparation techniques such as solid-phase extraction (SPE) and liquid-liquid extraction (LLE) are being rapidly replaced with emerging miniaturized and environmentally friendly techniques such as Solid Phase Micro Extraction (SPME), Stir Bar Sorptive Extraction (SBSE), Micro Extraction by Packed Sorbent (MEPS), Fabric Phase Sorptive Extraction (FPSE), and Dispersive Liquid-Liquid Micro Extraction (DLLME). In addition to the development of many new generic extraction sorbents in recent years, a large number of molecularly imprinted polymers (MIPs) created using different template molecules have also enriched the large cache of microextraction sorbents. Application of nanoparticles as high-performance extraction sorbents has undoubtedly elevated the extraction efficiency and method sensitivity of modern chromatographic analyses to a new level. Combining magnetic nanoparticles with many microextraction sorbents has opened up new possibilities to extract target analytes from sample matrices containing high volumes of matrix interferents. The aim of the current review is to critically audit the progress of microextraction techniques in recent years, which has indisputably transformed analytical chemistry practices, from biological and therapeutic drug monitoring to the environmental field; from foods to phyto

  12. Development of core sampling technique for ITER Type B radwaste

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. G.; Hong, K. P.; Oh, W. H.; Park, M. C.; Jung, S. H.; Ahn, S. B. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Type B radwaste (intermediate-level and long-lived radioactive waste) imported from the ITER vacuum vessel is to be treated and stored in the basement of the hot cell building. The Type B radwaste treatment process is composed of buffer storage, cutting, sampling/tritium measurement, tritium removal, characterization, pre-packaging, inspection/decontamination, and storage, etc. The cut slices of Type B radwaste components generated by the cutting process undergo a sampling process before and after the tritium removal process. The purpose of sampling is to obtain small sample pieces in order to investigate the tritium content and concentration of the Type B radwaste. Core sampling, one of the candidate sampling techniques to be applied in the ITER hot cell, is suitable for metal of limited thickness (less than 50 mm) without the use of coolant. The materials tested were SS316L and CuCrZr, chosen to simulate ITER Type B radwaste. In core sampling, substantial secondary waste in the form of cutting chips will unavoidably be produced. Thus, the core sampling machine will have to be equipped with a disposal system such as suction equipment. Core sampling is considered an unfavorable method with respect to tool wear compared to conventional drilling.

  13. 3D-Laser-Scanning Technique Applied to Bulk Density Measurements of Apollo Lunar Samples

    Science.gov (United States)

    Macke, R. J.; Kent, J. J.; Kiefer, W. S.; Britt, D. T.

    2015-01-01

    In order to better interpret gravimetric data from orbiters such as GRAIL and LRO to understand the subsurface composition and structure of the lunar crust, it is important to have a reliable database of the density and porosity of lunar materials. To this end, we have been surveying these physical properties in both lunar meteorites and Apollo lunar samples. To measure porosity, both grain density and bulk density are required. For bulk density, our group has historically made extensive use of sub-mm bead immersion techniques, though several factors have made this technique problematic for our work with Apollo samples. Samples allocated for measurement are often smaller than optimal for the technique, leading to large error bars. Also, for some samples we were required to use pure alumina beads instead of our usual glass beads. The alumina beads were subject to undesirable static effects, producing unreliable results. Other investigators have tested the use of 3D laser scanners on meteorites for measuring bulk volumes. Early work, though promising, was plagued with difficulties including poor response on dark or reflective surfaces, difficulty reproducing sharp edges, and long processing times for producing shape models. Due to progress in technology, however, laser scanners have improved considerably in recent years. We tested this technique on 27 lunar samples in the Apollo collection using a scanner at NASA Johnson Space Center. We found it to be reliable and more precise than beads, with the added benefit that it involves no direct contact with the sample, enabling the study of particularly friable samples for which bead immersion is not possible.
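
The porosity determination described above combines the two density measurements in a simple way; a minimal sketch (the numeric values are hypothetical, for illustration only) is:

```python
def porosity(bulk_density, grain_density):
    """Porosity is the void volume fraction: phi = 1 - bulk/grain density."""
    return 1.0 - bulk_density / grain_density

# Hypothetical basaltic values in g/cm^3: bulk from laser-scanned volume
# and sample mass, grain density from pycnometry.
phi = porosity(bulk_density=2.7, grain_density=3.0)
print(round(phi, 3))  # -> 0.1, i.e. 10% porosity
```

Because the laser scan supplies the bulk volume without contacting the sample, the same relation applies to friable samples that cannot be measured by bead immersion.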

  14. Analysis of pure and malachite green doped polysulfone sample using FT-IR technique

    Science.gov (United States)

    Nayak, Rashmi J.; Khare, P. K.; Nayak, J. G.

    2018-05-01

    Samples of pure and malachite green-doped polysulfone in the form of foils were prepared by the isothermal immersion technique. For the preparation of the pure sample, 4 g of polysulfone was dissolved in 50 ml of dimethylformamide (DMF) solvent, while for the preparation of the doped samples, 10 mg, 50 mg and 100 mg of malachite green were mixed with 4 g of polysulfone, respectively. Fourier transform infrared spectroscopy (FT-IR) was used to study the structural characteristics of these pure and doped samples. The study shows that the transmittance intensity decreases as the doping ratio in pure polysulfone increases. The reduction in transmittance intensity is clearly apparent in the present case; moreover, the bands were broader, which indicates charge-transfer interaction between the donor and acceptor molecules.

  15. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim

    2015-05-21

    Base oils, blended for finished lubricant formulations, are classified by the American Petroleum Institute into five groups, viz., groups I-V. Groups I-III consist of petroleum-based hydrocarbons whereas groups IV and V are made of synthetic polymers. In the present study, five base oil samples belonging to groups I and III were extensively characterized using high performance liquid chromatography (HPLC), comprehensive two-dimensional gas chromatography (GC×GC), and Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated, and then the available information was combined to reveal compositional details of the base oil samples studied. HPLC showed the overwhelming presence of saturated over aromatic compounds in all five base oils. A similar trend was further corroborated using GC×GC, which yielded semiquantitative information on the compound classes present in the samples and provided further details on the carbon number distributions within these classes. In addition to the chromatography methods, FT-ICR MS supplemented the compositional information on the base oil samples by resolving the aromatic compounds into alkyl- and naphtheno-substituted families. APCI proved more effective than APPI for the ionization of the highly saturated base oil components. Furthermore, beyond detailed information on hydrocarbon molecules, FT-ICR MS revealed the presence of saturated and aromatic sulfur species in all base oil samples. The results presented herein offer a unique perspective into the detailed molecular structure of base oils typically used to formulate lubricants. © 2015 American Chemical Society.

  16. Use of a multigrid technique to study effects of limited sampling of heterogeneity on transport prediction

    International Nuclear Information System (INIS)

    Cole, C.R.; Foote, H.P.

    1987-02-01

    Reliable ground water transport prediction requires accurate spatial and temporal characterization of a hydrogeologic system. However, cost constraints and the desire to maintain site integrity by minimizing drilling can restrict the amount of spatial sampling that can be obtained to resolve the flow parameter variability associated with heterogeneities. This study quantifies the errors in subsurface transport predictions resulting from incomplete characterization of hydraulic conductivity heterogeneity. A multigrid technique was used to simulate two-dimensional flow velocity fields with high resolution. To obtain these velocity fields, the finite difference code MGRID, which implements a multigrid solution technique, was applied to compute stream functions on a 256-by-256 grid for a variety of hypothetical systems having detailed distributions of hydraulic conductivity. Spatial variability in hydraulic conductivity distributions was characterized by the components in the spectrum of spatial frequencies. A low-pass spatial filtering technique was applied to the base case hydraulic conductivity distribution to produce a data set with lower spatial frequency content. Arrival time curves were then calculated for the filtered hydraulic conductivity distribution and compared to base case results to judge the relative importance of the higher spatial frequency components. Results indicate a progression from multimode to single-mode arrival time curves as the number and extent of distinct flow pathways are reduced by low-pass filtering. This relationship between transport predictions and spatial frequencies was used to judge the consequences of sampling the hydraulic conductivity with reduced spatial resolution. 22 refs., 17 figs
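
The low-pass spatial filtering step described above can be sketched with a simple FFT-based filter; this is an illustrative reconstruction, not the code used in the study (the cutoff fraction, field size, and synthetic field are arbitrary):

```python
import numpy as np

def lowpass_filter(field, cutoff_frac):
    """Zero out spatial frequencies above cutoff_frac of the Nyquist limit,
    removing the fine-scale heterogeneity from a 2D parameter field."""
    F = np.fft.fft2(field)
    fy = np.fft.fftfreq(field.shape[0])
    fx = np.fft.fftfreq(field.shape[1])
    FY, FX = np.meshgrid(fy, fx, indexing="ij")
    radius = np.hypot(FX, FY)
    F[radius > cutoff_frac * 0.5] = 0.0  # Nyquist frequency = 0.5 cycles/sample
    return np.real(np.fft.ifft2(F))

rng = np.random.default_rng(0)
log_k = rng.normal(size=(256, 256))   # synthetic log-conductivity field
smooth = lowpass_filter(log_k, 0.25)  # keep only the lowest 25% of frequencies
print(smooth.var() < log_k.var())     # filtering removes high-frequency variance
```

Transport simulated on the filtered field can then be compared with the base case to gauge what a coarser characterization would miss.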

  17. Determination of elements in soil samples from Gebeng area using NAA technique

    International Nuclear Information System (INIS)

    Md Suhaimi Elias; Wo, Y.M.; Mohd Suhaimi Hamzah

    2016-01-01

    Rapid development and urbanization will increase the number of residential and industrial areas. Without proper management and control of pollution, these will have an adverse effect on the environment and human life. The objective of this study was to identify and quantify key contaminants entering the environment of the Gebeng area as a result of industrial and human activities. The Gebeng area has been gazetted as one of the industrial estates in Pahang state. Elemental pollution in the soil of the Gebeng area was assessed on the basis of concentration levels, enrichment factors and the geo-accumulation index. The enrichment factors (EFs) were determined by the element-ratioing method, whilst the geo-accumulation index (Igeo) was obtained by comparing current concentrations to the continental crustal average concentration of each element. Twenty-seven soil samples were collected from the Gebeng area and analysed using the Neutron Activation Analysis (NAA) technique. The data obtained showed a higher concentration of iron (Fe), due to its abundance in soil, compared to other elements. The enrichment factor results showed that the Gebeng area is enriched in As, Br, Hf, Sb, Th and U. Based on the geo-accumulation index (Igeo) classification, the soil quality of the Gebeng area can be classified as class 0 (uncontaminated) to class 3 (moderately to heavily contaminated). (author)
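
The two indices used above follow standard definitions; a minimal sketch (with hypothetical concentrations, and Fe assumed as the conservative reference element) is:

```python
import math

def enrichment_factor(c_x, c_ref, crust_x, crust_ref):
    """EF = (Cx/Cref)_sample / (Cx/Cref)_crust, where 'ref' is a
    conservative reference element such as Fe or Sc."""
    return (c_x / c_ref) / (crust_x / crust_ref)

def geoaccumulation_index(c_n, b_n):
    """Igeo = log2(Cn / (1.5 * Bn)); Bn is the crustal background value and
    the factor 1.5 allows for natural background fluctuations."""
    return math.log2(c_n / (1.5 * b_n))

# Hypothetical As concentrations in mg/kg (sample vs. crustal average),
# with Fe as the reference element.
ef = enrichment_factor(c_x=12.0, c_ref=40000.0, crust_x=1.8, crust_ref=50000.0)
igeo = geoaccumulation_index(c_n=12.0, b_n=1.8)
print(round(ef, 2), round(igeo, 2))  # Igeo between 2 and 3 -> class 2-3
```

An Igeo in the 2-3 range corresponds to the "moderately to heavily contaminated" class cited in the abstract.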

  18. Optical waveguide lightmode spectroscopy technique-based immunosensor development for aflatoxin B1 determination in spice paprika samples.

    Science.gov (United States)

    Majer-Baranyi, Krisztina; Zalán, Zsolt; Mörtl, Mária; Juracsek, Judit; Szendrő, István; Székács, András; Adányi, Nóra

    2016-11-15

    The optical waveguide lightmode spectroscopy (OWLS) technique has been applied to the label-free detection of aflatoxin B1 in a competitive immunoassay format, with the aim of comparing the analytical performance of the developed OWLS immunosensor with HPLC and enzyme-linked immunosorbent assay (ELISA) methods for the detection of aflatoxin in a spice paprika matrix. We also assessed the applicability of the QuEChERS method prior to ELISA measurements, and the results were compared to those obtained by traditional solvent extraction followed by immunoaffinity clean-up. The AFB1 content of sixty commercial spice paprika samples from different countries was measured with the developed and optimized OWLS immunosensor. Comparing the results from the indirect immunosensor to those obtained by HPLC or ELISA showed excellent correlation (with regression coefficients above 0.94), indicating that the competitive OWLS immunosensor has potential for the quick determination of aflatoxin B1 in paprika samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Statistical sampling techniques as applied to OSE inspections

    International Nuclear Information System (INIS)

    Davis, J.J.; Cote, R.W.

    1987-01-01

    The need has been recognized for statistically valid methods for gathering information during OSE inspections and for the interpretation of results, both from performance testing and from records reviews, interviews, etc. Battelle Columbus Division, under contract to DOE OSE, has performed and continues to perform work in the area of statistical methodology for OSE inspections. This paper presents some of the sampling methodology currently being developed for use during OSE inspections. Topics include population definition, sample size requirements, level of confidence, and practical logistical constraints associated with the conduct of an inspection based on random sampling. Sequential sampling schemes and sampling from finite populations are also discussed. The methods described are applicable to various data-gathering activities, ranging from the sampling and examination of classified documents to the sampling of Protective Force security inspectors for skill testing.
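
One common building block of such inspection sampling plans is the discovery-sampling size for a given confidence level; a minimal sketch (not taken from the Battelle work itself) is:

```python
import math

def discovery_sample_size(confidence, tolerable_rate):
    """Smallest n such that observing zero deviations in a random sample gives
    the stated confidence that the true deviation rate is below tolerable_rate:
    (1 - p)^n <= 1 - C  =>  n >= ln(1 - C) / ln(1 - p)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - tolerable_rate))

# 95% confidence that the deviation rate is under 5%:
print(discovery_sample_size(0.95, 0.05))  # -> 59
```

Tightening the tolerable rate to 1% at the same confidence raises the required sample to 299 items, which illustrates the practical logistical constraints the abstract mentions.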

  20. Separation Techniques for Quantification of Radionuclides in Environmental Samples

    Directory of Open Access Journals (Sweden)

    Dusan Galanda

    2009-01-01

    The reliable and quantitative measurement of radionuclides is important in order to determine environmental quality and radiation safety, and to monitor regulatory compliance. We examined soil samples from Podunajske Biskupice, near the city of Bratislava in the Slovak Republic, for the presence of several natural (238U, 232Th, 40K) and anthropogenic (137Cs, 90Sr, 239Pu, 240Pu, 241Am) radionuclides. The area is adjacent to a refinery and hazardous waste processing center, as well as the municipal incinerator plant, and so might possess an unusually high level of ecotoxic metals. We found that the levels of both naturally occurring and anthropogenic radionuclides fell within the expected ranges, indicating that these facilities pose no radiological threat to the local environment. During the course of our analysis, we modified existing techniques in order to allow us to handle the unusually large and complex samples that were needed to determine the levels of 239Pu, 240Pu, and 241Am activity. We also rated three commercial techniques for the separation of 90Sr from aqueous solutions and found that two of them, AnaLig Sr-01 and Empore Extraction Disks, were suitable for the quantitative and reliable separation of 90Sr, while the third, Sr-Spec Resin, was less so. The main criterion in evaluating these methods was the chemical recovery of 90Sr, which was lower than we had expected. We also considered speed of separation and additional steps needed to prepare the sample for separation.

  1. Accelerated Solvent Extraction: An Innovative Sample Extraction Technique for Natural Products

    International Nuclear Information System (INIS)

    Hazlina Ahmad Hassali; Azfar Hanif Abd Aziz; Rosniza Razali

    2015-01-01

    Accelerated solvent extraction (ASE) is one of the novel techniques developed for the extraction of phytochemicals from plants in order to shorten the extraction time, decrease solvent consumption, increase the extraction yield and enhance the quality of extracts. This technique combines elevated temperature and pressure with liquid solvents. This paper gives a brief overview of the accelerated solvent extraction technique for sample preparation and its application to the extraction of natural products. Through practical examples, the effects of operational parameters such as temperature, solvent volume and extraction time on the extraction yield and overall performance of ASE are discussed. It is demonstrated that the ASE technique allows reduced solvent consumption and shorter extraction times, while the extraction yields are even higher than those obtained with conventional methods. (author)

  2. Practical aspects of the resin bead technique for mass spectrometric sample loading

    International Nuclear Information System (INIS)

    Walker, R.L.; Pritchard, C.A.; Carter, J.A.; Smith, D.H.

    1976-07-01

    Using an anion resin bead as a loading vehicle for uranium and plutonium samples that are to be analyzed isotopically in a mass spectrometer has many advantages over conventional techniques. It is applicable to any laboratory routinely performing such analyses, but should be particularly relevant for Safeguards purposes. Because the techniques required differ markedly from those of conventional methods, this report describes them in detail to enable those unfamiliar with the technique to master it with a minimum of trouble

  3. Water stable isotope measurements of Antarctic samples by means of IRMS and WS-CRDS techniques

    Science.gov (United States)

    Michelini, Marzia; Bonazza, Mattia; Braida, Martina; Flora, Onelio; Dreossi, Giuliano; Stenni, Barbara

    2010-05-01

    In recent years there has been increasing interest in the scientific community in the application of stable isotope techniques to environmental problems such as drinking water safeguarding, groundwater management, climate change, and soil and paleoclimate studies. For example, water stable isotopes, being natural tracers of the hydrological cycle, have been extensively used as tools to characterize regional aquifers and to reconstruct past temperature changes from polar ice cores. Hence the need for improvements in analytical techniques: the high demand for information calls for technologies that can deliver a large number of analyses in a short time and at low cost. Furthermore, it is sometimes difficult to obtain large amounts of sample (as is the case for Antarctic ice cores or interstitial water), preventing replication of the analyses. Here, we present oxygen and hydrogen measurements performed on water samples covering a wide range of isotopic values (from very negative Antarctic precipitation to mid-latitude precipitation values), carried out both with the conventional Isotope Ratio Mass Spectrometry (IRMS) technique and with a new method based on laser absorption, Wavelength-Scanned Cavity Ring-Down Spectroscopy (WS-CRDS). This study focuses on improving the precision of the measurements carried out with WS-CRDS in order to apply this method extensively to Antarctic ice core paleoclimate studies. WS-CRDS is a variation of the CRDS technique developed in 1988 by O'Keefe and Deacon. In CRDS a pulse of light is injected into a cavity with highly reflective inner walls; with no sample in the cavity, the light decays only because the reflectivity of the walls is not perfect, whereas when a sample is injected there is additional absorption, and the difference between the decay times without and with the sample is proportional to the quantity
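
The ring-down relation sketched in the abstract — the cavity decay time shortens when an absorber is present — is commonly written as alpha = (1/c)(1/tau - 1/tau0); a minimal illustration with hypothetical decay times is:

```python
C_LIGHT = 2.998e8  # speed of light, m/s

def absorption_coefficient(tau_empty, tau_sample):
    """Ring-down spectroscopy: alpha = (1/c) * (1/tau - 1/tau0), where
    tau0 and tau are the cavity decay times without and with the sample."""
    return (1.0 / C_LIGHT) * (1.0 / tau_sample - 1.0 / tau_empty)

# Hypothetical decay times in seconds: the absorber shortens the ring-down.
alpha = absorption_coefficient(tau_empty=20e-6, tau_sample=15e-6)
print(f"{alpha:.2e} 1/m")  # proportional to absorber concentration (Beer-Lambert)
```

Because alpha depends only on the two decay times, the measurement is insensitive to shot-to-shot fluctuations in the laser intensity, which is part of what makes the technique attractive for small, precious samples such as ice cores.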

  4. Laser Based In Situ Techniques: Novel Methods for Generating Extreme Conditions in TEM Samples

    Energy Technology Data Exchange (ETDEWEB)

    Taheri, M; Lagrange, T; Reed, B; Armstrong, M; Campbell, G; DeHope, W; Kim, J; King, W; Masiel, D; Browning, N

    2008-02-25

    The Dynamic Transmission Electron Microscope (DTEM) is introduced as a novel tool for in situ processing of materials. Examples of various types of dynamic studies outline the advantages and differences of laser-based heating in the DTEM in comparison to conventional (resistive) heating in situ TEM methods. We demonstrate various unique capabilities of the drive laser, namely, in situ processing of nanoscale materials, rapid and high temperature phase transformations, and controlled thermal activation of materials. These experiments would otherwise be impossible without the use of the DTEM drive laser. Thus, the potential of the DTEM as a new technique to process and characterize the growth of a myriad of micro- and nanostructures is demonstrated.

  5. A fully blanketed early B star LTE model atmosphere using an opacity sampling technique

    International Nuclear Information System (INIS)

    Phillips, A.P.; Wright, S.L.

    1980-01-01

    A fully blanketed LTE model of a stellar atmosphere with T_e = 21914 K (θ_e = 0.23), log g = 4 is presented. The model includes an explicit representation of the opacity due to the strongest lines, and uses a statistical opacity sampling technique to represent the weaker line opacity. The sampling technique is subjected to several tests and the model is compared with an atmosphere calculated using the line-distribution function method. The limitations of the distribution function method and the particular opacity sampling method used here are discussed in the light of the results obtained. (author)
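
The opacity sampling idea used for the weaker lines can be illustrated with a toy model (all line strengths, widths and frequencies below are invented): rather than carrying a full opacity distribution function, the monochromatic opacity of a dense forest of weak lines is estimated from a handful of randomly placed frequency samples per coarse interval.

```python
# Toy statistical opacity sampling: evaluate the total line opacity at a few
# random frequencies in a bin and let those samples represent the weak-line
# opacity there. Line parameters are purely illustrative.
import math, random

def line_opacity(nu, centers, strength=1.0, width=0.5):
    """Total opacity at frequency nu from a set of Gaussian-profile lines."""
    return sum(strength * math.exp(-((nu - c) / width) ** 2) for c in centers)

def sampled_mean_opacity(nu_lo, nu_hi, centers, n_samples=20, seed=1):
    """Mean opacity of the bin estimated from random frequency samples."""
    rng = random.Random(seed)
    pts = [rng.uniform(nu_lo, nu_hi) for _ in range(n_samples)]
    return sum(line_opacity(p, centers) for p in pts) / n_samples

centers = [float(i) for i in range(100)]           # a dense forest of weak lines
kappa = sampled_mean_opacity(40.0, 60.0, centers)  # opacity estimate for one bin
```

The statistical error of the estimate shrinks as more sample points are used, which is the trade-off the abstract's tests quantify against the distribution-function method.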

  6. The application of statistical and/or non-statistical sampling techniques by internal audit functions in the South African banking industry

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2015-03-01

    Full Text Available This article explores the use by internal audit functions of audit sampling techniques in order to test the effectiveness of controls in the banking sector. The article focuses specifically on the use of statistical and/or non-statistical sampling techniques by internal auditors. The focus of the research for this article was internal audit functions in the banking sector of South Africa. The results discussed in the article indicate that audit sampling is still used frequently as an audit evidence-gathering technique. Non-statistical sampling techniques are used more frequently than statistical sampling techniques for the evaluation of the sample. In addition, both techniques are regarded as important for determining the sample size and selecting the sample items.

  7. Microwave Heating of Synthetic Skin Samples for Potential Treatment of Gout Using the Metal-Assisted and Microwave-Accelerated Decrystallization Technique.

    Science.gov (United States)

    Toker, Salih; Boone-Kukoyi, Zainab; Thompson, Nishone; Ajifa, Hillary; Clement, Travis; Ozturk, Birol; Aslan, Kadir

    2016-11-30

    Physical stability of synthetic skin samples during their exposure to microwave heating was investigated to demonstrate the use of the metal-assisted and microwave-accelerated decrystallization (MAMAD) technique for potential biomedical applications. In this regard, optical microscopy and temperature measurements were employed for the qualitative and quantitative assessment of damage to synthetic skin samples during 20 s intermittent microwave heating using a monomode microwave source (at 8 GHz, 2-20 W) for up to 120 s. The extent of damage to synthetic skin samples, assessed by the change in the surface area of the skin samples, was negligible for microwave power of ≤7 W, and more extensive damage (>50%) to skin samples occurred when they were exposed to >7 W in the initial temperature range of 20-39 °C. The initial temperature of the synthetic skin samples significantly affected the extent of change in their temperature during exposure to microwave heating. The proof-of-principle use of the MAMAD technique was demonstrated for the decrystallization of a model biological crystal (l-alanine) placed under synthetic skin samples in the presence of gold nanoparticles. Our results showed that the size (initial size ∼850 μm) of l-alanine crystals can be reduced by up to 60% in 120 s without damage to synthetic skin samples using the MAMAD technique. Finite-difference time-domain-based simulations of the electric field distribution of 8 GHz monomode microwave radiation showed that synthetic skin samples are predicted to absorb ∼92.2% of the microwave radiation.

  8. Microwave Heating of Synthetic Skin Samples for Potential Treatment of Gout Using the Metal-Assisted and Microwave-Accelerated Decrystallization Technique

    Science.gov (United States)

    2016-01-01

    Physical stability of synthetic skin samples during their exposure to microwave heating was investigated to demonstrate the use of the metal-assisted and microwave-accelerated decrystallization (MAMAD) technique for potential biomedical applications. In this regard, optical microscopy and temperature measurements were employed for the qualitative and quantitative assessment of damage to synthetic skin samples during 20 s intermittent microwave heating using a monomode microwave source (at 8 GHz, 2–20 W) for up to 120 s. The extent of damage to synthetic skin samples, assessed by the change in the surface area of the skin samples, was negligible for microwave power of ≤7 W, and more extensive damage (>50%) to skin samples occurred when they were exposed to >7 W in the initial temperature range of 20–39 °C. The initial temperature of the synthetic skin samples significantly affected the extent of change in their temperature during exposure to microwave heating. The proof-of-principle use of the MAMAD technique was demonstrated for the decrystallization of a model biological crystal (l-alanine) placed under synthetic skin samples in the presence of gold nanoparticles. Our results showed that the size (initial size ∼850 μm) of l-alanine crystals can be reduced by up to 60% in 120 s without damage to synthetic skin samples using the MAMAD technique. Finite-difference time-domain-based simulations of the electric field distribution of 8 GHz monomode microwave radiation showed that synthetic skin samples are predicted to absorb ∼92.2% of the microwave radiation. PMID:27917407

  9. Reverse sample genome probing, a new technique for identification of bacteria in environmental samples by DNA hybridization, and its application to the identification of sulfate-reducing bacteria in oil field samples

    International Nuclear Information System (INIS)

    Voordouw, G.; Voordouw, J.K.; Karkhoff-Schweizer, R.R.; Fedorak, P.M.; Westlake, D.W.S.

    1991-01-01

    A novel method for identification of bacteria in environmental samples by DNA hybridization is presented. It is based on the fact that, even within a genus, the genomes of different bacteria may have little overall sequence homology. This allows the use of the labeled genomic DNA of a given bacterium (referred to as a standard) to probe for its presence and that of bacteria with highly homologous genomes in total DNA obtained from an environmental sample. Alternatively, total DNA extracted from the sample can be labeled and used to probe filters on which denatured chromosomal DNA from relevant bacterial standards has been spotted. The latter technique is referred to as reverse sample genome probing, since it is the reverse of the usual practice of deriving probes from reference bacteria for analyzing a DNA sample. Reverse sample genome probing allows identification of bacteria in a sample in a single step once a master filter with suitable standards has been developed. Application of reverse sample genome probing to the identification of sulfate-reducing bacteria in 31 samples obtained primarily from oil fields in the province of Alberta has indicated that there are at least 20 genotypically different sulfate-reducing bacteria in these samples

  10. Determination of rock fragmentation based on a photographic technique

    International Nuclear Information System (INIS)

    Dehgan Banadaki, M.M.; Majdi, A.; Raessi Gahrooei, D.

    2002-01-01

    The paper presents a laboratory-scale physical blasting model along with a photographic approach to describe the size distribution of blasted rock materials. For this purpose, based on the Weibull probability distribution function, eight samples, each weighing 100 kg, were obtained. Four pictures of four different sections of each sample were taken. The pictures were then converted into graphic files, with the boundary of each piece of rock in the samples characterized. Errors caused by perspective were eliminated. The volume of each piece of the blasted rock material, and hence the sieve size required for each piece of rock to pass through, was calculated. Finally, the original blasted rock size distribution was compared with that obtained from the photographic method. The paper concludes by presenting an approach to convert the results of the photographic technique into the size distribution obtained by sieve analysis, with sufficient verification.
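
Fragment size distributions of blasted rock are commonly described by a Weibull (Rosin-Rammler) passing function. The sketch below is a generic illustration of that model; the characteristic size and uniformity exponent are invented values, not parameters from the paper.

```python
# Weibull (Rosin-Rammler) cumulative passing curve for blasted rock fragments.
# x_c is the characteristic size (63.2% passing) and n the uniformity index;
# both values here are purely illustrative.
import math

def passing_fraction(size_mm, x_c=50.0, n=1.2):
    """Cumulative fraction of fragments passing a sieve of the given size."""
    return 1.0 - math.exp(-((size_mm / x_c) ** n))

# passing fractions at a few hypothetical sieve sizes (mm)
curve = [(s, round(passing_fraction(s), 3)) for s in (10, 25, 50, 100, 200)]
```

Fitting such a curve to the sieve-equivalent sizes recovered from the photographs is one way to compare the photographic and sieve-analysis distributions.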

  11. X-ray spectrometry and X-ray microtomography techniques for soil and geological samples analysis

    International Nuclear Information System (INIS)

    Kubala-Kukuś, A.; Banaś, D.; Braziewicz, J.; Dziadowicz, M.; Kopeć, E.; Majewska, U.; Mazurek, M.; Pajek, M.; Sobisz, M.; Stabrawa, I.; Wudarczyk-Moćko, J.; Góźdź, S.

    2015-01-01

    A particular subject of X-ray fluorescence analysis is its application to studies of the multielemental composition of samples over a wide range of concentrations, including samples with different matrices, inhomogeneous samples and samples characterized by different grain sizes. Typical examples of these kinds of samples are soil and geological samples, for which XRF elemental analysis may be difficult due to XRF disturbing effects. In this paper the WDXRF technique was applied to the elemental analysis of different soil and geological samples (therapeutic mud, floral soil, brown soil, sandy soil, calcium aluminum cement). The sample morphology was analyzed using the X-ray microtomography technique. The paper discusses the differences between the compositions of the samples, the influence of sample preparation procedures on sample morphology and, finally, the quantitative analysis. The results of the studies were statistically tested (one-way ANOVA and correlation coefficients). For the determination of lead concentration in samples of sandy soil and cement-like matrix, the WDXRF spectrometer calibration was performed. The elemental analysis of the samples was complemented with knowledge of the chemical composition obtained by X-ray powder diffraction.

  12. X-ray spectrometry and X-ray microtomography techniques for soil and geological samples analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kubala-Kukuś, A.; Banaś, D.; Braziewicz, J. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Dziadowicz, M.; Kopeć, E. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Majewska, U. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Mazurek, M.; Pajek, M.; Sobisz, M.; Stabrawa, I. [Institute of Physics, Jan Kochanowski University, ul. Świetokrzyska 15, 25-406 Kielce (Poland); Wudarczyk-Moćko, J. [Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Góźdź, S. [Holycross Cancer Center, ul. Artwińskiego 3, 25-734 Kielce (Poland); Institute of Public Health, Jan Kochanowski University, IX Wieków Kielc 19, 25-317 Kielce (Poland)

    2015-12-01

    A particular subject of X-ray fluorescence analysis is its application to studies of the multielemental composition of samples over a wide range of concentrations, including samples with different matrices, inhomogeneous samples and samples characterized by different grain sizes. Typical examples of these kinds of samples are soil and geological samples, for which XRF elemental analysis may be difficult due to XRF disturbing effects. In this paper the WDXRF technique was applied to the elemental analysis of different soil and geological samples (therapeutic mud, floral soil, brown soil, sandy soil, calcium aluminum cement). The sample morphology was analyzed using the X-ray microtomography technique. The paper discusses the differences between the compositions of the samples, the influence of sample preparation procedures on sample morphology and, finally, the quantitative analysis. The results of the studies were statistically tested (one-way ANOVA and correlation coefficients). For the determination of lead concentration in samples of sandy soil and cement-like matrix, the WDXRF spectrometer calibration was performed. The elemental analysis of the samples was complemented with knowledge of the chemical composition obtained by X-ray powder diffraction.

  13. Sampling methods for rumen microbial counts by Real-Time PCR techniques

    Directory of Open Access Journals (Sweden)

    S. Puppo

    2010-02-01

    Full Text Available Fresh rumen samples were withdrawn from 4 cannulated buffalo females fed a fibrous diet in order to quantify bacterial concentration in the rumen by Real-Time PCR techniques. To obtain DNA of good quality from whole rumen fluid, eight different pre-filtration methods (M1-M8; cheesecloth, glass-fibre and nylon filters in combination with various centrifugation speeds: 1000, 5000 and 14,000 rpm) were tested. Genomic DNA extraction was performed either on fresh or frozen samples (-20°C). The quantitative bacterial analysis was carried out according to the Real-Time PCR procedure for Butyrivibrio fibrisolvens reported in the literature. M5 proved to be the best sampling procedure, yielding genomic DNA of suitable quality. No differences were revealed between fresh and frozen samples.

  14. Random sampling technique for ultra-fast computations of molecular opacities for exoplanet atmospheres

    Science.gov (United States)

    Min, M.

    2017-10-01

    Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line lists for these molecules. The line lists available today contain, for many species, up to several billion lines. Computation of the spectral line profile created by pressure and temperature broadening, the Voigt profile, of all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focuses the computation time on the strongest lines, while still maintaining the continuum contribution of the high number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (∼3.5 × 10⁵ lines per second per core on a standard current day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.
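
A minimal sketch of the strength-adaptive sampling idea follows. A Gaussian stands in for the Voigt profile, and all line parameters and grid choices are illustrative assumptions; the key points carried over from the abstract are that the number of random samples scales with line strength, and that a final renormalisation preserves each line's integrated opacity exactly.

```python
# Strength-adaptive random sampling of one spectral line onto a frequency grid.
# A Gaussian profile stands in for the Voigt function for simplicity.
import random

def sample_line(grid, strength, center, width, base_samples=4, seed=0):
    """Deposit one line's opacity onto grid['kappa'] by random sampling."""
    rng = random.Random(seed)
    n = max(1, int(base_samples * strength))    # more samples for stronger lines
    dnu = grid["nu"][1] - grid["nu"][0]
    raw = [0.0] * len(grid["nu"])
    for _ in range(n):
        nu = rng.gauss(center, width)           # draw frequencies from the profile itself
        i = int(round((nu - grid["nu"][0]) / dnu))
        if 0 <= i < len(raw):
            raw[i] += 1.0
    total = sum(raw) * dnu
    norm = strength / total if total > 0 else 0.0
    for i, v in enumerate(raw):                 # renormalise: integral equals strength
        grid["kappa"][i] += v * norm

grid = {"nu": [0.1 * i for i in range(200)], "kappa": [0.0] * 200}
sample_line(grid, strength=5.0, center=10.0, width=0.3)
```

Because of the renormalisation step, even a line sampled with very few points contributes its full integrated opacity, which is how the weak-line continuum survives at low cost.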

  15. Sample preparation techniques in trace element analysis by X-ray emission spectroscopy

    International Nuclear Information System (INIS)

    Valkovic, V.

    1983-11-01

    The report, written under a research contract with the IAEA, contains a detailed presentation of the most difficult problem encountered in trace element analysis by X-ray emission spectroscopy, namely sample preparation techniques. The following items are covered. Sampling - with specific consideration of aerosols, water, soil, biological materials, petroleum and its products, and the storage and handling of samples. Pretreatment of samples - preconcentration, ashing, solvent extraction, ion exchange and electrodeposition. Sample preparation for PIXE analysis - backings, target uniformity and homogeneity, effects of irradiation, internal standards and specific examples of preparation (aqueous, biological, blood serum and solid samples). Sample preparation for radioactive source or tube excitation - with specific examples (water, liquid and solid samples, soil, geological, plant and tissue samples). Finally, the problem of standards and reference materials, as well as that of interlaboratory comparisons, is discussed

  16. Simultaneous and integrated neutron-based techniques for material analysis of a metallic ancient flute

    International Nuclear Information System (INIS)

    Festa, G; Andreani, C; Pietropaolo, A; Grazzi, F; Scherillo, A; Barzagli, E; Sutton, L F; Bognetti, L; Bini, A; Schooneveld, E

    2013-01-01

    A metallic 19th century flute was studied by means of integrated and simultaneous neutron-based techniques: neutron diffraction, neutron radiative capture analysis and neutron radiography. This experiment follows benchmark measurements devoted to assessing the effectiveness of a multitask beamline concept for neutron-based investigation of materials. The aim of this study is to show the potential of the approach, using multiple and integrated neutron-based techniques, for musical instruments. Such samples, in the broad scenario of cultural heritage, represent an exciting research field and an interesting link between different disciplines such as nuclear physics, metallurgy and acoustics. (paper)

  17. 100 GHz pulse waveform measurement based on electro-optic sampling

    Science.gov (United States)

    Feng, Zhigang; Zhao, Kejia; Yang, Zhijun; Miao, Jingyuan; Chen, He

    2018-05-01

    We present an ultrafast pulse waveform measurement system based on an electro-optic sampling technique at 1560 nm and prepare LiTaO3-based electro-optic modulators with a coplanar waveguide structure. The transmission and reflection characteristics of electrical pulses on a coplanar waveguide terminated with an open circuit and a resistor are investigated by analyzing the corresponding time-domain pulse waveforms. We measure the output electrical pulse waveform of a 100 GHz photodiode and the obtained rise times of the impulse and step responses are 2.5 and 3.4 ps, respectively.

  18. Experimental technique to measure thoron generation rate of building material samples using RAD7 detector

    International Nuclear Information System (INIS)

    Csige, I.; Szabó, Zs.; Szabó, Cs.

    2013-01-01

    Thoron (²²⁰Rn) is the second most abundant radon isotope in our living environment. In some dwellings it is present in significant amounts, which calls for its identification and remediation. Indoor thoron originates mainly from building materials. In this work we have developed and tested an experimental technique to measure the thoron generation rate in building material samples using the RAD7 radon-thoron detector. The mathematical model of the measurement technique provides the thoron concentration response of the RAD7 as a function of the sample thickness. For experimental validation of the technique, an adobe building material sample was selected and the thoron concentration was measured at nineteen different sample thicknesses. By fitting the parameters of the model to the measurement results, both the generation rate and the diffusion length of thoron were estimated. We have also determined the optimal sample thickness for estimating the thoron generation rate from a single measurement. -- Highlights: • RAD7 is used for the determination of thoron generation rate (emanation). • The described model takes into account the thoron decay and attenuation. • The model describes well the experimental results. • A single point measurement method is offered at a determined sample thickness
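
The thickness-dependent response described above is often modelled, for a gas that diffuses and decays inside a slab, as saturating like tanh(d/L). The sketch below fits such a model to synthetic data by brute-force least squares; the functional form, parameter ranges and all numbers are assumptions for illustration, not the authors' model or data.

```python
# Hedged illustration: fit a tanh-saturation response C(d) = C_sat * tanh(d/L)
# to concentration-vs-thickness data, recovering the saturation level and the
# diffusion length L. All values below are synthetic.
import math

def model(d_cm, c_sat, L_cm):
    """Assumed detector response for a sample of thickness d_cm."""
    return c_sat * math.tanh(d_cm / L_cm)

# synthetic "measurements" generated from known parameters (800, 2.5)
data = [(d, model(d, 800.0, 2.5)) for d in (0.5, 1, 2, 4, 6, 10)]

def fit(points):
    """Brute-force least-squares search over a small parameter grid."""
    best = None
    for c_sat in range(100, 2001, 25):
        for L10 in range(5, 101):               # L from 0.5 to 10.0 cm
            L = L10 / 10.0
            sse = sum((c - model(d, c_sat, L)) ** 2 for d, c in points)
            if best is None or sse < best[0]:
                best = (sse, c_sat, L)
    return best[1], best[2]

c_sat_fit, L_fit = fit(data)  # recovers (800, 2.5) on this synthetic data
```

The same fitting logic applies whatever the exact diffusion model is: once L is known, the "optimal single thickness" mentioned in the highlights is the thickness at which the response is most sensitive to the generation rate.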

  19. A sampling and metagenomic sequencing-based methodology for monitoring antimicrobial resistance in swine herds

    DEFF Research Database (Denmark)

    Munk, Patrick; Dalhoff Andersen, Vibe; de Knegt, Leonardo

    2016-01-01

    Objectives Reliable methods for monitoring antimicrobial resistance (AMR) in livestock and other reservoirs are essential to understand the trends, transmission and importance of agricultural resistance. Quantification of AMR is mostly done using culture-based techniques, but metagenomic read mapping shows promise for quantitative resistance monitoring. Methods We evaluated the ability of: (i) MIC determination for Escherichia coli; (ii) cfu counting of E. coli; (iii) cfu counting of aerobic bacteria; and (iv) metagenomic shotgun sequencing to predict expected tetracycline resistance based... cultivation-based techniques in terms of predicting expected tetracycline resistance based on antimicrobial consumption. Our metagenomic approach had sufficient resolution to detect antimicrobial-induced changes to individual resistance gene abundances. Pen floor manure samples were found to represent rectal...

  20. Case-based reasoning diagnostic technique based on multi-attribute similarity

    Energy Technology Data Exchange (ETDEWEB)

    Makoto, Takahashi [Tohoku University, Miyagi (Japan); Akio, Gofuku [Okayama University, Okayamaa (Japan)

    2014-08-15

    A case-based diagnostic technique has been developed based on multi-attribute similarity. A specific feature of the developed system is the use of multiple attributes of process signals for similarity evaluation, to retrieve a similar case stored in a case base. The present technique has been applied to measurement data from Monju with some simulated anomalies. The results of numerical experiments showed that the present technique can be utilized as one of the methods in a hybrid-type diagnosis system.

  1. Sampling phased array - a new technique for ultrasonic signal processing and imaging

    OpenAIRE

    Verkooijen, J.; Boulavinov, A.

    2008-01-01

    Over the past 10 years, the improvement in the field of microelectronics and computer engineering has led to significant advances in ultrasonic signal processing and image construction techniques that are currently being applied to non-destructive material evaluation. A new phased array technique, called 'Sampling Phased Array', has been developed in the Fraunhofer Institute for Non-Destructive Testing([1]). It realises a unique approach to the measurement and processing of ultrasonic signals. Th...

  2. Two-compartment, two-sample technique for accurate estimation of effective renal plasma flow: Theoretical development and comparison with other methods

    International Nuclear Information System (INIS)

    Lear, J.L.; Feyerabend, A.; Gregory, C.

    1989-01-01

    Discordance between effective renal plasma flow (ERPF) measurements from radionuclide techniques that use single versus multiple plasma samples was investigated. In particular, the authors determined whether effects of variations in distribution volume (Vd) of iodine-131 iodohippurate on measurement of ERPF could be ignored, an assumption implicit in the single-sample technique. The influence of Vd on ERPF was found to be significant, a factor indicating an important and previously unappreciated source of error in the single-sample technique. Therefore, a new two-compartment, two-plasma-sample technique was developed on the basis of the observations that while variations in Vd occur from patient to patient, the relationship between intravascular and extravascular components of Vd and the rate of iodohippurate exchange between the components are stable throughout a wide range of physiologic and pathologic conditions. The new technique was applied in a series of 30 studies in 19 patients. Results were compared with those achieved with the reference, single-sample, and slope-intercept techniques. The new two-compartment, two-sample technique yielded estimates of ERPF that more closely agreed with the reference multiple-sample method than either the single-sample or slope-intercept techniques
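
The slope-intercept comparator mentioned above assumes a monoexponential plasma clearance; with two timed plasma samples it can be sketched as follows. The dose, sample times and concentrations are invented for illustration, and this is the generic textbook calculation, not the authors' two-compartment method.

```python
# Slope-intercept clearance sketch: assume C(t) = C0 * exp(-k t), estimate the
# rate constant k from two samples, back-extrapolate to the intercept C0 to get
# the distribution volume, and take clearance = k * Vd. Numbers are illustrative.
import math

def slope_intercept_clearance(dose, t1, c1, t2, c2):
    """Clearance estimate from an injected dose and two plasma samples.

    dose in counts, times in min, concentrations in counts/mL -> mL/min.
    """
    k = math.log(c1 / c2) / (t2 - t1)      # elimination rate constant, 1/min
    c0 = c1 * math.exp(k * t1)             # back-extrapolated intercept
    vd = dose / c0                         # apparent distribution volume, mL
    return k * vd

erpf = slope_intercept_clearance(dose=1.0e6, t1=20, c1=60.0, t2=40, c2=30.0)
```

The abstract's point is precisely that the single-compartment Vd appearing here is not constant across patients, which the two-compartment, two-sample method corrects for.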

  3. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

    The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems
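
A minimal Latin Hypercube Sampling routine of the kind underlying STRADE's designs is sketched below. This is a generic textbook implementation, not the STRADE code itself: each of the n strata of every variable is sampled exactly once, and strata are paired across variables by random permutation.

```python
# Generic Latin Hypercube Sampling on the unit hypercube [0, 1)^n_vars.
import random

def latin_hypercube(n_samples, n_vars, seed=42):
    rng = random.Random(seed)
    columns = []
    for _ in range(n_vars):
        perm = list(range(n_samples))
        rng.shuffle(perm)                                     # random stratum pairing
        columns.append([(p + rng.random()) / n_samples        # jitter within stratum
                        for p in perm])
    # transpose the columns into a list of sample points
    return [tuple(col[i] for col in columns) for i in range(n_samples)]

points = latin_hypercube(5, 2)  # 5 design points in 2 variables
```

Mapping each coordinate through the inverse CDF of the corresponding input distribution turns these unit-cube points into a stratified design for uncertainty propagation.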

  4. Surface plasmon resonance sensor based on gold nanoparticles and cold vapour generation technique for the detection of mercury in aqueous samples

    Science.gov (United States)

    Castillo, Jimmy; Chirinos, José; Gutiérrez, Héctor; La Cruz, Marie

    2017-09-01

    In this work, a surface plasmon resonance sensor based on gold nanoparticles was developed for the determination of Hg. The sensor follows the change in the signal from solutions in contact with atomic mercury previously generated by reaction with sodium borohydride. Mie theory predicts that an Hg film as thin as 5 nm induces a significant reduction of the surface plasmon resonance signal of 40 nm gold nanoparticles. This property was used for quantification purposes in the sensor. The device provides a limit of detection of 172 ng/L, which can be compared with the 91 ng/L obtained with atomic fluorescence, a common technique used for Hg quantification in drinking water. This result is relevant considering that it was not necessary to functionalize the nanoparticles or to use nanoparticles deposited on a substrate. Also, because Hg is released from the matrix, the surface plasmon resonance signal was not affected by concomitant elements in the sample.

  5. Silicon based ultrafast optical waveform sampling

    DEFF Research Database (Denmark)

    Ji, Hua; Galili, Michael; Pu, Minhao

    2010-01-01

    A 300 nm × 450 nm × 5 mm silicon nanowire is designed and fabricated for a four-wave-mixing based non-linear optical gate. Based on this silicon nanowire, an ultra-fast optical sampling system is successfully demonstrated using a free-running fiber laser with a carbon nanotube-based mode-locker as the sampling source. A clear eye-diagram of a 320 Gbit/s data signal is obtained. The temporal resolution of the sampling system is estimated to be 360 fs.

  6. All-optical optoacoustic microscopy based on probe beam deflection technique

    OpenAIRE

    Maswadi, Saher M.; Ibey, Bennett L.; Roth, Caleb C.; Tsyboulski, Dmitri A.; Beier, Hope T.; Glickman, Randolph D.; Oraevsky, Alexander A.

    2016-01-01

    Optoacoustic (OA) microscopy using an all-optical system based on the probe beam deflection technique (PBDT) for detection of laser-induced acoustic signals was investigated as an alternative to conventional piezoelectric transducers. PBDT provides a number of advantages for OA microscopy including (i) efficient coupling of laser excitation energy to the samples being imaged through the probing laser beam, (ii) undistorted coupling of acoustic waves to the detector without the need for separa...

  7. Neutron activation analysis technique and X-ray fluorescence in bovine liver sample

    International Nuclear Information System (INIS)

    Maihara, V.A.; Favaro, D.I.T.; Vasconcellos, M.B.A.; Sato, I.M.; Salvador, V.L.

    2002-01-01

    Many analytical techniques have been used in food and diet analysis to determine a great number of nutritional elements, at concentrations ranging from percent levels to ng g⁻¹, with high sensitivity and accuracy. Instrumental Neutron Activation Analysis (INAA) has been employed to certify many trace elements in biological reference materials. More recently, Wavelength-Dispersive X-Ray Fluorescence (WD-XRF) has also been used to determine some essential elements in food samples. INAA has been applied in nutrition studies in our laboratory at IPEN since the 1980s. For the development of analytical methodologies, the use of reference materials with the same characteristics as the analyzed sample is essential; several Brazilian laboratories cannot afford these materials due to their high cost. In this paper, preliminary results of commercial bovine liver sample analyses obtained by the INAA and WD-XRF methods are presented. This sample was prepared as a Brazilian candidate reference material for a group of laboratories participating in a research project sponsored by FAPESP. The concentrations of the elements Cl, K, Na, P and S and of the trace elements Br, Ca, Co, Cu, Fe, Mg, Mn, Mo, Rb, Se and Zn were determined by INAA and WD-XRF. For validation of both techniques, the NIST SRM 1577b Bovine Liver reference material was analyzed and the detection limits were calculated. The concentrations determined by the two analytical techniques were compared using Student's t-test; for Cl, Cu, Fe, K, Mg, Na, Rb and Zn the results show no statistical difference at the 95% significance level. (author)
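
The two-technique comparison described above can be sketched as a two-sample Student's t-test. The concentration values below are invented for illustration; only the pooled-variance t statistic itself is the standard textbook formula.

```python
# Pooled-variance two-sample t statistic (equal variances assumed), of the kind
# used to compare element concentrations measured by two techniques.
import math
import statistics

def t_statistic(a, b):
    """Two-sample Student's t statistic for the difference of means."""
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)   # pooled variance
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

inaa  = [184.0, 186.5, 185.2, 183.8]   # hypothetical Fe results, mg/kg
wdxrf = [185.1, 187.0, 184.6, 186.2]
t = t_statistic(inaa, wdxrf)
# |t| below the two-sided 95% critical value (about 2.45 for 6 d.o.f.)
# would indicate no significant difference between the techniques
```

In practice the statistic is compared against the critical value for n_a + n_b - 2 degrees of freedom at the chosen significance level.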

  8. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  9. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    Energy Technology Data Exchange (ETDEWEB)

    Laborda, Francisco, E-mail: flaborda@unizar.es; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-21

    The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider the nanoparticles as a new sort of analytes, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles, using ICP-MS but also coulometry, are on their way to gaining a position. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for

  10. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    International Nuclear Information System (INIS)

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-01

    The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider nanoparticles as a new sort of analyte, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles by ICP-MS, but also by coulometry, are on their way to gaining acceptance. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for

  11. Uranium content measurement in drinking water samples using track etch technique

    International Nuclear Information System (INIS)

    Kumar, Mukesh; Kumar, Ajay; Singh, Surinder; Mahajan, R.K.; Walia, T.P.S.

    2003-01-01

    The concentration of uranium has been assessed in drinking water samples collected from different locations in Bathinda district, Punjab, India. The water samples were taken from hand pumps and tube wells. Uranium was determined using the fission track technique. Uranium concentration in the water samples varies from 1.65±0.06 to 74.98±0.38 μg/l. These values are compared with the safe limit values recommended for drinking water. Most of the water samples are found to have uranium concentrations above the safe limit. Analysis of some heavy metals (Zn, Cd, Pb and Cu) in the water was also carried out in order to see whether any correlation exists between the concentration of uranium and these heavy metals. A weak positive correlation has been observed between the concentration of uranium and the heavy metals Pb, Cd and Cu

  12. Attempts to develop a new nuclear measurement technique of β-glucuronidase levels in biological samples

    International Nuclear Information System (INIS)

    Unak, T.; Avcibasi, U.; Yildirim, Y.; Cetinkaya, B.

    2003-01-01

    β-Glucuronidase is one of the most important hydrolytic enzymes in living systems and plays an essential role in the detoxification pathway of toxic materials incorporated into the metabolism. Some organs, especially the liver, and some tumour tissues have high levels of β-glucuronidase activity. As a result of the enzymatic activity of certain kinds of tumour cells, the radiolabelled glucuronide conjugates of cytotoxic, as well as radiotoxic, compounds have potentially very valuable diagnostic and therapeutic applications in cancer research. For this reason, sensitive measurement of β-glucuronidase levels in normal and tumour tissues is a very important step for these kinds of applications. In the classical method for measuring β-glucuronidase activity, the quantity of phenolphthalein liberated from its glucuronide conjugate, i.e. phenolphthalein-glucuronide, by β-glucuronidase is generally measured by the spectrophotometric technique. The lower detection limit of phenolphthalein by the spectrophotometric technique is about 1-3 mg. This means that β-glucuronidase levels cannot be detected in biological samples with lower levels of β-glucuronidase activity, and therefore the applications of the spectrophotometric technique in cancer research are very seriously limited. Starting from this consideration, we recently attempted to develop a new nuclear technique to measure much lower concentrations of β-glucuronidase in biological samples. To improve the detection limit, phenolphthalein-glucuronide and also phenyl-N-glucuronide were radioiodinated with 131I and their radioactivity was measured by the counting technique. The quantity of phenolphthalein or aniline radioiodinated with 131I and liberated by the deglucuronidation activity of β-glucuronidase was thus used in an attempt to measure levels lower than those accessible to the spectrophotometric technique. The results obtained clearly verified that 0.01 pg level of

  13. Composite Techniques Based Color Image Compression

    Directory of Open Access Journals (Sweden)

    Zainab Ibrahim Abood

    2017-03-01

    Compression of color images is now necessary for transmission and storage in databases, since color gives objects a pleasing and natural appearance; therefore, three composite-technique-based color image compression schemes are implemented to achieve images with high compression, no loss relative to the original image, better performance, and good image quality. These techniques are the composite stationary wavelet technique (S), the composite wavelet technique (W), and the composite multi-wavelet technique (M). For the high-energy sub-band of the 3rd level of each composite transform in each composite technique, the compression parameters are calculated. The best composite transform among the 27 types is the three-level multi-wavelet transform (MMM) in the M technique, which has the highest values of energy (En) and compression ratio (CR) and the lowest values of bits per pixel (bpp), time (T), and rate distortion R(D). Also, the values of the compression parameters of the color image are nearly the same as the average values of the compression parameters of the three bands of the same image.
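    The figures of merit above (CR and bpp) follow directly from bit counts. As a rough illustration, assuming an 8-bit-per-channel RGB image and an invented compressed size (the function name and example numbers are not from the paper):

    ```python
    def compression_metrics(original_bits: int, compressed_bits: int, n_pixels: int):
        """Compression ratio (CR) and bits per pixel (bpp) for an image codec."""
        cr = original_bits / compressed_bits   # how many times smaller the data got
        bpp = compressed_bits / n_pixels       # average storage cost per pixel
        return cr, bpp

    # Hypothetical example: a 512x512 RGB image (24 bits/pixel) compressed to 98,304 bits
    cr, bpp = compression_metrics(512 * 512 * 24, 98_304, 512 * 512)
    print(cr, bpp)  # → 64.0 0.375
    ```

    A higher CR and lower bpp together indicate a more compact representation, which is why the paper reports both.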

  14. Development of Large Sample Neutron Activation Technique for New Applications in Thailand

    International Nuclear Information System (INIS)

    Laoharojanaphand, S.; Tippayakul, C.; Wonglee, S.; Channuie, J.

    2018-01-01

    The development of Large Sample Neutron Activation Analysis (LSNAA) in Thailand is presented in this paper. The technique was first developed with rice as the test subject. The Thai Research Reactor-1/Modification 1 (TRR-1/M1) was used as the neutron source. The first step was to select and characterize an appropriate irradiation facility for the research. An out-core irradiation facility (A4 position) was attempted first, and the results obtained with it were then used as guides for the subsequent experiments with the thermal column facility. The characterization of the thermal column was performed with Cu wire to determine the spatial flux distribution without and with the rice sample. The flux depression without the rice sample was observed to be less than 30%, while the flux depression with the rice sample increased to as much as 60%. Flux monitors internal to the rice sample were used to determine the average flux over the sample. The gamma self-shielding effect during gamma measurement was corrected using Monte Carlo simulation: the ratio between the efficiencies of the volume source and the point source at each energy point was calculated with the MCNPX code. The research team adopted the k0-NAA methodology to calculate the element concentrations. The k0-NAA program, developed by the IAEA, was set up to simulate the conditions of the irradiation and measurement facilities used in this research. The element concentrations in the bulk rice sample were then calculated taking into account the flux depression and gamma efficiency corrections. At the moment, the results still show large discrepancies with the reference values; however, further validation work will be performed to identify sources of error. Moreover, this LSNAA technique was introduced for the activation analysis of the IAEA archaeological mock-up. The results are provided in this report. (author)

  15. Properties of hypothesis testing techniques and (Bayesian) model selection for exploration-based and theory-based (order-restricted) hypotheses.

    Science.gov (United States)

    Kuiper, Rebecca M; Nederhoff, Tim; Klugkist, Irene

    2015-05-01

    In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is investigated (a classical, exploration-based set of hypotheses containing equality constraints on the means, or a theory-based limited set of hypotheses with equality and/or order restrictions). A simulation study is conducted to examine the performance of these techniques. We demonstrate that, if one has specific, a priori specified hypotheses, confirmation (i.e., investigating theory-based hypotheses) has advantages over exploration (i.e., examining all possible equality-constrained hypotheses). Furthermore, examining reasonable order-restricted hypotheses has more power to detect the true effect/non-null hypothesis than evaluating only equality restrictions. Additionally, when investigating more than one theory-based hypothesis, model selection is preferred over hypothesis testing. Because of the first two results, we further examine the techniques that are able to evaluate order restrictions in a confirmatory fashion by examining their performance when the homogeneity of variance assumption is violated. Results show that the techniques are robust to heterogeneity when the sample sizes are equal. When the sample sizes are unequal, the performance is affected by heterogeneity. The size and direction of the deviations from the baseline, where there is no heterogeneity, depend on the effect size (of the means) and on the trend in the group variances with respect to the ordering of the group sizes. Importantly, the deviations are less pronounced when the group variances and sizes exhibit the same trend (e.g., are both increasing with group number). © 2014 The British Psychological Society.

  16. A cache-friendly sampling strategy for texture-based volume rendering on GPU

    Directory of Open Access Journals (Sweden)

    Junpeng Wang

    2017-06-01

    Texture-based volume rendering is a memory-intensive algorithm. Its performance relies heavily on the performance of the texture cache. However, most existing texture-based volume rendering methods blindly map computational resources to texture memory and result in incoherent memory access patterns, causing low cache hit rates in certain cases. The distance between samples taken by threads of an atomic scheduling unit (e.g., a warp of 32 threads in CUDA) is a crucial factor that affects texture cache performance. Based on this fact, we present a new sampling strategy, called Warp Marching, for the ray-casting algorithm of texture-based volume rendering. The effects of different sample organizations and different thread-pixel mappings in the ray-casting algorithm are thoroughly analyzed. Also, a pipelined color-blending approach is introduced, and the power of warp-level GPU operations is leveraged to improve the efficiency of parallel executions on the GPU. In addition, the rendering performance of Warp Marching is view-independent, and it outperforms existing empty-space-skipping techniques in scenarios that need to render large dynamic volumes at low image resolution. Through a series of micro-benchmarks and real-life data experiments, we rigorously analyze our sampling strategies and demonstrate significant performance enhancements over existing sampling methods.

  17. A comparison of approximation techniques for variance-based sensitivity analysis of biochemical reaction systems

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2010-05-01

    Background: Sensitivity analysis is an indispensable tool for the analysis of complex systems. In a recent paper, we introduced a thermodynamically consistent variance-based sensitivity analysis approach for studying the robustness and fragility properties of biochemical reaction systems under uncertainty in the standard chemical potentials of the activated complexes of the reactions and the standard chemical potentials of the molecular species. In that approach, key sensitivity indices were estimated by Monte Carlo sampling, which is computationally very demanding and impractical for large biochemical reaction systems. Computationally efficient algorithms are needed to make variance-based sensitivity analysis applicable to realistic cellular networks, modeled by biochemical reaction systems that consist of a large number of reactions and molecular species. Results: We present four techniques, derivative approximation (DA), polynomial approximation (PA), Gauss-Hermite integration (GHI), and orthonormal Hermite approximation (OHA), for analytically approximating the variance-based sensitivity indices associated with a biochemical reaction system. By using a well-known model of the mitogen-activated protein kinase signaling cascade as a case study, we numerically compare the approximation quality of these techniques against traditional Monte Carlo sampling. Our results indicate that, although DA is computationally the most attractive technique, special care should be exercised when using it for sensitivity analysis, since it may only be accurate at low levels of uncertainty. On the other hand, PA, GHI, and OHA are computationally more demanding than DA but can work well at high levels of uncertainty. GHI results in slightly better accuracy than PA, but it is more difficult to implement. OHA produces the most accurate approximation results and can be implemented in a straightforward manner. It turns out that the computational cost of the
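    The Monte Carlo baseline the authors compare against can be sketched with a generic pick-freeze estimator of a first-order variance-based sensitivity index. This is a toy additive model with uniform inputs, not the paper's biochemical implementation; the function and model are illustrative only:

    ```python
    import random

    def first_order_sobol(f, n=200_000, seed=3):
        """Pick-freeze Monte Carlo estimate of the first-order sensitivity index
        of input X1 for a model f(x1, x2) with independent U(0,1) inputs:
        S1 = Var(E[Y|X1]) / Var(Y), estimated via Cov(f(X1,X2), f(X1,X2'))."""
        rng = random.Random(seed)
        ya, yb = [], []
        for _ in range(n):
            x1 = rng.random()
            x2, x2p = rng.random(), rng.random()
            ya.append(f(x1, x2))
            yb.append(f(x1, x2p))   # X1 'frozen', X2 independently resampled
        mean = sum(ya) / n
        var = sum(y * y for y in ya) / n - mean * mean
        cov = sum(a * b for a, b in zip(ya, yb)) / n - mean * mean
        return cov / var

    # Additive test model Y = 3*X1 + X2: analytically S1 = 9/(9+1) = 0.9
    s1 = first_order_sobol(lambda x1, x2: 3.0 * x1 + x2)
    ```

    The large n needed for a stable estimate illustrates why Monte Carlo sampling becomes impractical when each model evaluation is an expensive reaction-network simulation.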

  18. A genetic algorithm-based framework for wavelength selection on sample categorization.

    Science.gov (United States)

    Anzanello, Michel J; Yamashita, Gabrielli; Marcelo, Marcelo; Fogliatto, Flávio S; Ortiz, Rafael S; Mariotti, Kristiane; Ferrão, Marco F

    2017-08-01

    In forensic and pharmaceutical scenarios, the application of chemometrics and optimization techniques has unveiled common and peculiar features of seized medicine and drug samples, helping investigative forces to track illegal operations. This paper proposes a novel framework aimed at identifying relevant subsets of attenuated total reflectance Fourier transform infrared (ATR-FTIR) wavelengths for classifying samples into two classes, for example authentic or forged categories in the case of medicines, or salt or base form in cocaine analysis. In the first step of the framework, the ATR-FTIR spectra were partitioned into equidistant intervals and the k-nearest neighbour (KNN) classification technique was applied to each interval to assign samples to the proper classes. In the next step, the selected intervals were refined through a genetic algorithm (GA) that identifies a limited number of wavelengths from the previously selected intervals with the aim of maximizing classification accuracy. When applied to Cialis®, Viagra®, and cocaine ATR-FTIR datasets, the proposed method substantially decreased the number of wavelengths needed for categorization and increased the classification accuracy. From a practical perspective, the proposed method provides investigative forces with valuable information towards monitoring illegal production of drugs and medicines. In addition, focusing on a reduced subset of wavelengths allows the development of portable devices capable of testing the authenticity of samples during police checking events, avoiding the need for later laboratory analyses and reducing equipment expenses. Theoretically, the proposed GA-based approach yields more refined solutions than current methods relying on interval approaches, which tend to insert irrelevant wavelengths into the retained intervals. Copyright © 2016 John Wiley & Sons, Ltd.
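    The GA refinement step can be sketched with a toy genetic algorithm that evolves binary wavelength masks to maximize leave-one-out 1-nearest-neighbour accuracy. Everything here is invented for illustration (synthetic "spectra", penalty weight, GA settings); it is not the authors' implementation:

    ```python
    import random

    rng = random.Random(42)

    # Synthetic 'spectra': 2 classes, 20 wavelengths, only two of them informative
    N_WAVES, INFORMATIVE = 20, (3, 12)

    def make_sample(label):
        x = [rng.gauss(0.0, 1.0) for _ in range(N_WAVES)]
        if label == 1:
            for i in INFORMATIVE:
                x[i] += 5.0          # the class difference lives at these wavelengths
        return x

    data = [(make_sample(lbl), lbl) for lbl in (0, 1) for _ in range(15)]

    def loo_1nn_accuracy(mask):
        """Leave-one-out 1-NN accuracy using only wavelengths switched on in mask."""
        idx = [i for i in range(N_WAVES) if mask[i]]
        if not idx:
            return 0.0
        hits = 0
        for j, (x, y) in enumerate(data):
            best, pred = float("inf"), None
            for k, (x2, y2) in enumerate(data):
                if k == j:
                    continue
                d = sum((x[i] - x2[i]) ** 2 for i in idx)
                if d < best:
                    best, pred = d, y2
            hits += pred == y
        return hits / len(data)

    def fitness(mask):  # reward accuracy, lightly penalise subset size
        return loo_1nn_accuracy(mask) - 0.001 * sum(mask)

    # Minimal GA over binary wavelength masks: elitism + crossover + mutation
    pop = [[rng.random() < 0.5 for _ in range(N_WAVES)] for _ in range(20)]
    for _ in range(30):
        pop.sort(key=fitness, reverse=True)
        parents, children = pop[:10], []
        while len(children) < 10:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, N_WAVES)
            child = a[:cut] + b[cut:]           # one-point crossover
            i = rng.randrange(N_WAVES)
            child[i] = not child[i]             # point mutation
            children.append(child)
        pop = parents + children

    best = max(pop, key=fitness)
    ```

    The size penalty in the fitness mirrors the paper's goal of keeping the retained wavelength subset small enough for portable devices.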

  19. Lot quality assurance sampling techniques in health surveys in developing countries: advantages and current constraints.

    Science.gov (United States)

    Lanata, C F; Black, R E

    1991-01-01

    Traditional survey methods, which are generally costly and time-consuming, usually provide information at the regional or national level only. The utilization of lot quality assurance sampling (LQAS) methodology, developed in industry for quality control, makes it possible to use small sample sizes when conducting surveys in small geographical or population-based areas (lots). This article describes the practical use of LQAS for conducting health surveys to monitor health programmes in developing countries. Following a brief description of the method, the article explains how to build a sample frame and conduct the sampling to apply LQAS under field conditions. A detailed description of the procedure for selecting a sampling unit to monitor the health programme and a sample size is given. The sampling schemes utilizing LQAS applicable to health surveys, such as simple- and double-sampling schemes, are discussed. The interpretation of the survey results and the planning of subsequent rounds of LQAS surveys are also discussed. When describing the applicability of LQAS in health surveys in developing countries, the article considers current limitations for its use by health planners in charge of health programmes, and suggests ways to overcome these limitations through future research. It is hoped that with increasing attention being given to industrial sampling plans in general, and LQAS in particular, their utilization to monitor health programmes will provide health planners in developing countries with powerful techniques to help them achieve their health programme targets.
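    The core of an LQAS plan is a binomial decision rule: sample n subjects from a lot and "accept" it if at most d failures are observed. The plan's operating characteristics follow directly from the binomial CDF; a sketch (the 19/3 plan shown is a textbook-style choice used only as an illustration, not a figure from this article):

    ```python
    from math import comb

    def accept_probability(n: int, d: int, p: float) -> float:
        """Probability of observing at most d 'failures' in a sample of n
        when the true failure proportion in the lot is p (binomial CDF)."""
        return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(d + 1))

    # Illustrative LQAS plan: sample n = 19 children per area, accept the
    # area if at most d = 3 are unvaccinated.
    n, d = 19, 3
    good = accept_probability(n, d, 0.10)  # well-performing area (10% unvaccinated)
    bad = accept_probability(n, d, 0.50)   # poorly performing area (50% unvaccinated)
    print(round(good, 3), round(bad, 3))   # → 0.885 0.002
    ```

    The wide gap between the two acceptance probabilities is what lets such small samples discriminate adequate from inadequate programme coverage at the lot level.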

  20. [A comparison of convenience sampling and purposive sampling].

    Science.gov (United States)

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, the opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation, not by statistical power analysis.
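    The claim that a larger sample raises statistical power can be made concrete with the standard normal-approximation power formula for a two-sided, two-sample comparison (a generic approximation, not taken from the article itself):

    ```python
    import math

    def two_sample_power(effect_size: float, n_per_group: int, alpha_z: float = 1.959964) -> float:
        """Approximate power of a two-sided two-sample z-test for a given
        standardized effect size (Cohen's d) and per-group sample size."""
        phi = lambda t: 0.5 * math.erfc(-t / math.sqrt(2))  # standard normal CDF
        z = effect_size / math.sqrt(2.0 / n_per_group)      # noncentrality
        return phi(z - alpha_z) + phi(-z - alpha_z)         # second term is negligible

    # Medium effect (d = 0.5): power grows with per-group sample size
    p30 = two_sample_power(0.5, 30)   # roughly 0.49
    p64 = two_sample_power(0.5, 64)   # roughly 0.81
    ```

    This is why, for a fixed effect size, enlarging even a convenience sample improves the chance of detecting a real difference, although it does nothing to repair the sample's lack of representativeness.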

  1. Direct sampling technique of bees on Vriesea philippocoburgii (Bromeliaceae, Tillandsioideae) flowers

    Directory of Open Access Journals (Sweden)

    Afonso Inácio Orth

    2004-11-01

    In our study on Vriesea philippocoburgii Wawra pollination, owing to the small proportion of flowers in anthesis on a single day and the damage caused to inflorescences by netting directly on flowers, we used the direct sampling technique (DST) of bees on flowers. This technique was applied to 40 flowering plants and resulted in the capture of 160 specimens belonging to nine genera of Apoidea, separated into 19 morphospecies. As the DST maintains the integrity of flowers for later bee visits, it can enhance survey performance, constituting an alternative methodology for collecting bees that visit flowering plants.

  2. Revisiting random walk based sampling in networks: evasion of burn-in period and frequent regenerations.

    Science.gov (United States)

    Avrachenkov, Konstantin; Borkar, Vivek S; Kadavankandy, Arun; Sreedharan, Jithin K

    2018-01-01

    In the framework of network sampling, random walk (RW) based estimation techniques provide many pragmatic solutions while uncovering the unknown network as little as possible. Despite several theoretical advances in this area, RW-based sampling techniques usually make the strong assumption that the samples are in the stationary regime, and hence are impelled to leave out the samples collected during the burn-in period. This work proposes two sampling schemes without a burn-in time constraint to estimate the average of an arbitrary function defined on the network nodes, for example, the average age of users in a social network. The central idea of the algorithms lies in exploiting regeneration of RWs at revisits to an aggregated super-node or to a set of nodes, and in strategies to enhance the frequency of such regenerations either by contracting the graph or by making the hitting set larger. Our first algorithm, which is based on reinforcement learning (RL), uses stochastic approximation to derive an estimator. This method can be seen as intermediate between purely stochastic Markov chain Monte Carlo iterations and deterministic relative value iterations. The second algorithm, which we call the Ratio with Tours (RT) estimator, is a modified form of respondent-driven sampling (RDS) that accommodates the idea of regeneration. We study the methods via simulations on real networks. We observe that the trajectories of the RL-estimator are much more stable than those of standard random walk based estimation procedures, and its error performance is comparable to that of respondent-driven sampling (RDS), which has a smaller asymptotic variance than many other estimators. Simulation studies also show that the mean squared error of the RT-estimator decays much faster than that of RDS with time. The newly developed RW-based estimators (RL- and RT-estimators) allow one to avoid the burn-in period, provide better control of stability along the sample path, and overall reduce the estimation time. Our
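    The regeneration idea behind a tour-based estimator can be sketched on a toy graph: each return of a simple random walk to a fixed node closes a tour, and a ratio of degree-weighted sums corrects the walk's bias toward high-degree nodes. This is an illustrative single-node variant, not the authors' RT-estimator; the graph and names are invented:

    ```python
    import random

    def tour_average(adj, start, n_tours=20_000, f=lambda v: v, seed=5):
        """Estimate the uniform average of f over the nodes of an undirected
        graph with a simple random walk. A RW's stationary distribution is
        proportional to node degree, so the ratio estimator
        (sum of f(v)/deg(v) over visits) / (sum of 1/deg(v) over visits)
        removes the degree bias; tours closed at returns to `start`
        give a natural stopping rule without a burn-in period."""
        rng = random.Random(seed)
        num = den = 0.0
        v, tours = start, 0
        while tours < n_tours:
            num += f(v) / len(adj[v])
            den += 1.0 / len(adj[v])
            v = rng.choice(adj[v])   # uniform step to a neighbour
            if v == start:
                tours += 1           # regeneration: a tour is complete
        return num / den

    # Toy graph (path 0-1-2-3 plus chord 0-2); true mean of node ids is 1.5
    adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
    est = tour_average(adj, start=0)
    ```

    Because tours are independent and identically distributed, confidence intervals can be built from per-tour sums, which is the essential advantage over a single long walk with a discarded burn-in.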

  3. Mercury in Environmental and Biological Samples Using Online Combustion with Sequential Atomic Absorption and Fluorescence Measurements: A Direct Comparison of Two Fundamental Techniques in Spectrometry

    Science.gov (United States)

    Cizdziel, James V.

    2011-01-01

    In this laboratory experiment, students quantitatively determine the concentration of an element (mercury) in an environmental or biological sample while comparing and contrasting the fundamental techniques of atomic absorption spectrometry (AAS) and atomic fluorescence spectrometry (AFS). A mercury analyzer based on sample combustion,…

  4. Identification of unknown sample using NAA, EDXRF, XRD techniques

    International Nuclear Information System (INIS)

    Dalvi, Aditi A.; Swain, K.K.; Chavan, Trupti; Remya Devi, P.S.; Wagh, D.N.; Verma, R.

    2015-01-01

    Analytical Chemistry Division (ACD), Bhabha Atomic Research Centre (BARC) receives samples from law enforcement agencies such as the Directorate of Revenue Intelligence and Customs for analysis. Five unknown grey powdered samples were received for identification and were suspected to be iridium (Ir). Identification of an unknown sample is always a challenging task, and suitable analytical techniques have to be judiciously utilized to arrive at a conclusion. Qualitative analysis was carried out using a Jordan Valley EX-3600 M energy-dispersive X-ray fluorescence (EDXRF) spectrometer at ACD, BARC. An SLP series LEO Si(Li) detector (active area: 30 mm2; thickness: 3.5 mm; resolution: 140 eV at 5.9 keV for Mn K X-rays) was used during the measurement, and only characteristic X-rays of Ir (Lα: 9.17 keV and Lβ: 10.70 keV) were seen in the X-ray spectrum. X-ray diffraction (XRD) measurement results indicated that the Ir was in the form of metal. To confirm the XRD data, neutron activation analysis (NAA) was carried out by irradiating samples and elemental standards (as comparators) in the graphite reflector position of the Advanced Heavy Water Reactor Critical Facility (AHWR CF) reactor, BARC, Mumbai. After a suitable decay period, gamma activity measurements were carried out using a 45% HPGe detector coupled to an 8k multichannel analyzer. The characteristic gamma line at 328.4 keV of the activation product 194Ir was used for quantification of iridium, and the relative method of NAA was used for concentration calculations. The NAA results confirmed that all the samples were iridium metal. (author)

  5. Development of SYVAC sampling techniques

    International Nuclear Information System (INIS)

    Prust, J.O.; Dalrymple, G.J.

    1985-04-01

    This report describes the requirements of a sampling scheme for use with the SYVAC radiological assessment model. The constraints on the number of samples that may be taken are considered. The conclusions from earlier studies using the deterministic generator sampling scheme are summarised. The method of Importance Sampling and a High Dose algorithm, which are designed to preferentially sample in the high-dose region of the parameter space, are reviewed in the light of experience gained from earlier studies and the requirements of a site assessment and sensitivity analyses. In addition, the use of an alternative numerical integration method for estimating risk is discussed. It is recommended that the method of Importance Sampling be developed and tested for use with SYVAC. An alternative numerical integration method is not recommended for investigation at this stage but should be the subject of future work. (author)
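    Importance sampling of the kind recommended here concentrates draws in the rare high-dose region and reweights them by the density ratio. A minimal one-dimensional sketch, where a Gaussian tail probability stands in for the dose integral (all names and the shifted proposal are illustrative, not SYVAC's scheme):

    ```python
    import math
    import random

    def tail_probability_is(threshold, n=100_000, shift=4.0, seed=1):
        """Estimate P(X > threshold) for X ~ N(0,1) by importance sampling:
        draw from the shifted proposal N(shift, 1), which places most samples
        in the rare region, and reweight each hit by the density ratio
        w(x) = phi(x) / phi(x - shift) = exp(shift^2/2 - shift*x)."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n):
            x = rng.gauss(shift, 1.0)
            if x > threshold:
                total += math.exp(shift * shift / 2.0 - shift * x)
        return total / n

    est = tail_probability_is(4.0)
    exact = 0.5 * math.erfc(4.0 / math.sqrt(2))  # true N(0,1) tail, about 3.2e-5
    ```

    Plain Monte Carlo would need tens of millions of draws to see this event a handful of times; the shifted proposal estimates it accurately with 10^5 draws, which is the motivation for preferential sampling of the high-dose region.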

  6. Evaluation of two membrane-based microextraction techniques for the determination of endocrine disruptors in aqueous samples by HPLC with diode array detection.

    Science.gov (United States)

    Luiz Oenning, Anderson; Lopes, Daniela; Neves Dias, Adriana; Merib, Josias; Carasek, Eduardo

    2017-11-01

    In this study, the viability of two membrane-based microextraction techniques for the determination of endocrine disruptors by high-performance liquid chromatography with diode array detection was evaluated: hollow fiber microporous membrane liquid-liquid extraction and hollow-fiber-supported dispersive liquid-liquid microextraction. The extraction efficiencies obtained for methylparaben, ethylparaben, bisphenol A, benzophenone, and 2-ethylhexyl-4-methoxycinnamate from aqueous matrices obtained using both approaches were compared and showed that hollow fiber microporous membrane liquid-liquid extraction exhibited higher extraction efficiency for most of the compounds studied. Therefore, a detailed optimization of the extraction procedure was carried out with this technique. The optimization of the extraction conditions and liquid desorption were performed by univariate analysis. The optimal conditions for the method were supported liquid membrane with 1-octanol for 10 s, sample pH 7, addition of 15% w/v of NaCl, extraction time of 30 min, and liquid desorption in 150 μL of acetonitrile/methanol (50:50 v/v) for 5 min. The linear correlation coefficients were higher than 0.9936. The limits of detection were 0.5-4.6 μg/L and the limits of quantification were 2-16 μg/L. The analyte relative recoveries were 67-116%, and the relative standard deviations were less than 15.5%. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Comparison of Techniques for Sampling Adult Necrophilous Insects From Pig Carcasses.

    Science.gov (United States)

    Cruise, Angela; Hatano, Eduardo; Watson, David W; Schal, Coby

    2018-02-06

    Studies of the pre-colonization interval and mechanisms driving necrophilous insect ecological succession depend on effective sampling of adult insects and knowledge of their diel and successional activity patterns. The number of insects trapped, their diversity, and their diel periodicity were compared across four sampling methods on neonate pigs. Sampling method, time of day, and decomposition age of the pigs significantly affected the number of insects sampled from pigs. We also found significant interactions between sampling method and decomposition day, and between time of sampling and decomposition day. No single method was superior to the others during all three decomposition days. Sampling times after noon yielded the largest samples during the first 2 d of decomposition. On day 3 of decomposition, however, all sampling times were equally effective. Therefore, to maximize insect collections from neonate pigs, the sampling method must vary by decomposition day. The suction trap collected the most species-rich samples, but sticky trap samples were the most diverse when both species richness and evenness were factored into a Shannon diversity index. Repeated sampling during the noon to 18:00 hours period was most effective for obtaining the maximum diversity of trapped insects. The integration of multiple sampling techniques would most effectively sample the necrophilous insect community. However, because all four tested methods were deficient at sampling beetle species, future work should focus on optimizing the most promising methods, alone or in combination, and incorporate hand collections of beetles. © The Author(s) 2018. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. PELE:  Protein Energy Landscape Exploration. A Novel Monte Carlo Based Technique.

    Science.gov (United States)

    Borrelli, Kenneth W; Vitalis, Andreas; Alcantara, Raul; Guallar, Victor

    2005-11-01

    Combining protein structure prediction algorithms and Metropolis Monte Carlo techniques, we provide a novel method to explore all-atom energy landscapes. The core of the technique is based on a steered localized perturbation followed by side-chain sampling as well as minimization cycles. The algorithm and its application to ligand diffusion are presented here. Ligand exit pathways are successfully modeled for different systems containing ligands of various sizes: carbon monoxide in myoglobin, camphor in cytochrome P450cam, and palmitic acid in the intestinal fatty-acid-binding protein. These initial applications reveal the potential of this new technique in mapping millisecond-time-scale processes. The computational cost associated with the exploration is significantly less than that of conventional MD simulations.
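    The accept/reject core of a Metropolis Monte Carlo exploration like the one PELE builds on can be sketched in one dimension. This is a generic Metropolis sampler on a toy harmonic "energy landscape", not the PELE algorithm itself (the perturbation/side-chain/minimization machinery is omitted):

    ```python
    import math
    import random

    def metropolis(energy, x0, n_steps, step=0.5, kT=1.0, seed=7):
        """Minimal Metropolis Monte Carlo: propose a local perturbation,
        accept it if it lowers the energy, otherwise accept with
        probability exp(-dE/kT); the visited states sample exp(-E/kT)."""
        rng = random.Random(seed)
        x, e = x0, energy(x0)
        samples = []
        for _ in range(n_steps):
            x_new = x + rng.uniform(-step, step)   # steered/local move
            e_new = energy(x_new)
            if e_new <= e or rng.random() < math.exp(-(e_new - e) / kT):
                x, e = x_new, e_new                # accept the perturbation
            samples.append(x)
        return samples

    # Harmonic 'energy landscape' with its minimum at x = 2
    samples = metropolis(lambda x: (x - 2.0) ** 2, x0=0.0, n_steps=50_000)
    tail = samples[5_000:]                          # discard initial relaxation
    mean = sum(tail) / len(tail)                    # should settle near 2
    ```

    In PELE each "step" is far richer (ligand translation/rotation, side-chain resampling, and minimization before the Metropolis test), but the acceptance criterion is the same.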

  9. Photonic devices based on patterning by two photon induced polymerization techniques

    Science.gov (United States)

    Fortunati, I.; Dainese, T.; Signorini, R.; Bozio, R.; Tagliazucca, V.; Dirè, S.; Lemercier, G.; Mulatier, J.-C.; Andraud, C.; Schiavuta, P.; Rinaldi, A.; Licoccia, S.; Bottazzo, J.; Franco Perez, A.; Guglielmi, M.; Brusatin, G.

    2008-04-01

    Two- and three-dimensional structures with micron and submicron resolution have been achieved in commercial resists, polymeric materials, and sol-gel materials by several lithographic techniques. In this context, silicon-based sol-gel materials are particularly interesting because of their versatility, their chemical and thermal stability, and the variety of active compounds that can be embedded in them. Compared with other micro- and nano-fabrication schemes, two-photon induced polymerization is unique in its 3D processing capability. The photopolymerization is performed with a laser beam in the near-IR region, where samples absorb and scatter less, giving rise to deeper penetration of the light. The use of ultrashort laser pulses allows nonlinear processes such as multiphoton absorption to be initiated at relatively low average power without thermally damaging the samples. In this work we report results on the photopolymerization process in hybrid organic-inorganic films based on photopolymerizable methacrylate-containing Si nanobuilding blocks. The films, obtained through sol-gel synthesis, are doped with a photo-initiator that enables radical polymerization of the methacrylic groups. The photo-initiator is activated by a femtosecond laser source at different input energies. The unexposed regions are developed with a suitable solvent, and the photopolymerized structures are characterized by microscopy techniques.

  10. Multi-element analysis of lubricant oil by WDXRF technique using thin-film sample preparation

    International Nuclear Information System (INIS)

    Scapin, M. A.; Salvador, V. L. R.; Lopes, C. D.; Sato, I. M.

    2006-01-01

    The quantitative analysis of chemical elements in matrices such as oils or gels represents a challenge for analytical chemists. Classical methods and instrumental techniques such as atomic absorption spectrometry (AAS) and inductively coupled plasma optical emission spectrometry (ICP-OES) require chemical treatments, mainly sample dissolution and degradation processes. The X-ray fluorescence technique allows a direct, multi-element analysis without previous sample treatment. In this work, a sensitive method for the determination of the elements Mg, Al, Si, P, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Mo, Ag, Sn, Ba and Pb in lubricating oil is presented. The wavelength-dispersive X-ray fluorescence (WDXRF) technique was used with a linear regression method and thin-film sample preparation. The validation of the methodology (repeatability and accuracy) was performed through analysis of the standard reference material SRM Alpha AESAR lot 703527D, applying the Chauvenet, Cochran, ANOVA and Z-score statistical tests. The method presents a relative standard deviation lower than 10% for all elements except Pb (RSD Pb 15%). The Z-score values for all elements were in the range -2 < Z < 2, indicating very good accuracy. (Full text)
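
The Z-score acceptance criterion used in such validations has a simple form, Z = (measured - certified)/σ, with |Z| < 2 conventionally taken as satisfactory. A sketch with hypothetical numbers (not the SRM values from the study):

```python
def z_score(measured, certified, sigma):
    """Z-score for accuracy checks; |Z| <= 2 is commonly taken as satisfactory."""
    return (measured - certified) / sigma

# Hypothetical WDXRF result vs. a certified reference value (mg/kg)
measured, certified, sigma = 101.5, 100.0, 1.0
z = z_score(measured, certified, sigma)
print("accepted" if -2 < z < 2 else "rejected")
```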

  11. Human mixed lymphocyte cultures. Evaluation of microculture technique utilizing the multiple automated sample harvester (MASH)

    Science.gov (United States)

    Thurman, G. B.; Strong, D. M.; Ahmed, A.; Green, S. S.; Sell, K. W.; Hartzman, R. J.; Bach, F. H.

    1973-01-01

    Use of lymphocyte cultures for in vitro studies such as pretransplant histocompatibility testing has established the need for standardization of this technique. A microculture technique has been developed that has facilitated the culturing of lymphocytes and increased the quantity of cultures feasible, while lowering the variation between replicate samples. Cultures were prepared for determination of tritiated thymidine incorporation using a Multiple Automated Sample Harvester (MASH). Using this system, the parameters that influence the in vitro responsiveness of human lymphocytes to allogeneic lymphocytes have been investigated. PMID:4271568

  12. [Influence of Natural Dissolved Organic Matter on the Passive Sampling Technique and its Application].

    Science.gov (United States)

    Yu, Shang-yun; Zhou, Yan-mei

    2015-08-01

    This paper studied the effects of different concentrations of natural dissolved organic matter (DOM) on the passive sampling technique. The results showed that the presence of DOM affected the organic pollutant adsorption ability of the membrane. For lg K(OW) values of 3-5, DOM had little impact on the adsorption of organic matter by the membrane; for lg K(OW) > 5.5, DOM significantly increased the adsorption capacity of the membrane. Meanwhile, the LDPE passive sampling technique was applied to monitor PAHs and PAEs in the pore water of three surface sediments in the Taizi River. All of the target pollutants were detected to varying degrees at each sampling point. Finally, the quotient method was used to assess the ecological risks of PAHs and PAEs. The results showed that fluoranthene exceeded the reference value for the aquatic ecosystem, indicating a significant ecological risk.
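
The quotient method mentioned at the end reduces to a ratio of the measured environmental concentration to a reference (effect) value, with RQ ≥ 1 flagging potential risk. A sketch with hypothetical concentrations (not the Taizi River data):

```python
def risk_quotient(measured_conc, reference_conc):
    """Quotient method: RQ = measured environmental concentration / reference value.
    RQ >= 1 flags potential ecological risk."""
    return measured_conc / reference_conc

# Hypothetical pore-water concentrations vs. reference values (same units, e.g. ng/L)
pollutants = {"fluoranthene": (120.0, 100.0), "phenanthrene": (30.0, 240.0)}
for name, (mec, ref) in pollutants.items():
    rq = risk_quotient(mec, ref)
    print(name, round(rq, 3), "risk" if rq >= 1 else "low risk")
```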

  13. Petrosal sinus sampling: technique and rationale.

    Science.gov (United States)

    Miller, D L; Doppman, J L

    1991-01-01

    Bilateral simultaneous sampling of the inferior petrosal sinuses is an extremely sensitive, specific, and accurate test for diagnosing Cushing disease and distinguishing between that entity and the ectopic ACTH syndrome. It is also valuable for lateralizing small hormone-producing adenomas within the pituitary gland. The inferior petrosal sinuses connect the cavernous sinuses with the ipsilateral internal jugular veins. The anatomy of the anastomoses between the inferior petrosal sinus, the internal jugular vein, and the venous plexuses at the base of the skull varies, but it is almost always possible to catheterize the inferior petrosal sinus. In addition, variations in size and anatomy are often present between the two inferior petrosal sinuses in a patient. Advance preparation is required for petrosal sinus sampling. Teamwork is a critical element, and each member of the staff should know what he or she will be doing during the procedure. The samples must be properly labeled, processed, and stored. Specific needles, guide wires, and catheters are recommended for this procedure. The procedure is performed with specific attention to the three areas of potential technical difficulty: catheterization of the common femoral veins, crossing the valve at the base of the left internal jugular vein, and selective catheterization of the inferior petrosal sinuses. There are specific methods for dealing with each of these areas. The sine qua non of correct catheter position in the inferior petrosal sinus is demonstration of reflux of contrast material into the ipsilateral cavernous sinus. Images must always be obtained to document correct catheter position. Special attention must be paid to two points to prevent potential complications: The patient must be given an adequate dose of heparin, and injection of contrast material into the inferior petrosal sinuses and surrounding veins must be done gently and carefully. 
When the procedure is performed as outlined, both inferior

  14. Molecular dynamics based enhanced sampling of collective variables with very large time steps

    Science.gov (United States)

    Chen, Pei-Yang; Tuckerman, Mark E.

    2018-01-01

    Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and, by extension, metadynamics, thus allowing simulations based on these methods to use similarly large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.
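
The standard impulse multiple time step (RESPA-style) splitting that this work improves upon can be sketched on a toy system with one stiff (fast) and one soft (slow) spring. This is the conventional scheme, not the resonance-free isokinetic variant from the paper, and all constants are hypothetical:

```python
def respa_step(x, v, f_fast, f_slow, dt, n_inner, m=1.0):
    """One outer RESPA step: half-kick with the slow force, n_inner
    velocity-Verlet sub-steps with the fast force, then a final
    half-kick with the slow force."""
    v += 0.5 * dt * f_slow(x) / m
    h = dt / n_inner
    for _ in range(n_inner):
        v += 0.5 * h * f_fast(x) / m
        x += h * v
        v += 0.5 * h * f_fast(x) / m
    v += 0.5 * dt * f_slow(x) / m
    return x, v

# Toy force split: a stiff fast spring and a soft slow spring on one particle
k_fast, k_slow = 100.0, 1.0
f_fast = lambda x: -k_fast * x
f_slow = lambda x: -k_slow * x

x, v = 1.0, 0.0
for _ in range(1000):
    x, v = respa_step(x, v, f_fast, f_slow, dt=0.05, n_inner=10)
# Total energy 0.5*v**2 + 0.5*(k_fast + k_slow)*x**2 stays near its initial 50.5
```

In a real simulation the "slow" forces are the expensive long-range ones; resonance limits how far the outer dt can be pushed, which is exactly the restriction the isokinetic methods remove.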

  15. Heating and thermal control of brazing technique to break contamination path for potential Mars sample return

    Science.gov (United States)

    Bao, Xiaoqi; Badescu, Mircea; Sherrit, Stewart; Bar-Cohen, Yoseph; Campos, Sergio

    2017-04-01

    The potential return of Mars sample material is of great interest to the planetary science community, as it would enable extensive analysis of samples with highly sensitive laboratory instruments. It is important to ensure that such a mission concept would not bring any living microbes, which may possibly exist on Mars, back to Earth's environment. In order to ensure the isolation of Mars microbes from Earth's atmosphere, a brazing sealing and sterilizing technique was proposed to break the Mars-to-Earth contamination path. Effectively heating the brazing zone in high-vacuum space and controlling the sample temperature to preserve sample integrity are key challenges to the implementation of this technique. The break-the-chain procedures for the container configurations being considered were simulated by multi-physics finite element models. Different heating methods, including induction and resistive/radiation heating, were evaluated. The temperature profiles of Martian samples in a proposed container structure were predicted. The results show that the sealing and sterilizing process can be controlled such that the sample temperature is maintained below the level that may cause damage, and that the brazing technique is a feasible approach to breaking the contamination path.

  16. Texture investigation in aluminium and iron - silicon samples by neutron diffraction technique

    International Nuclear Information System (INIS)

    Pugliese, R.; Yamasaki, J.M.

    1988-09-01

    By means of the neutron diffraction technique, the texture of 5% and 98% rolled aluminium and of the iron-silicon steel used in the cores of electric transformers has been determined. The measurements were performed using a neutron diffractometer installed at beam hole no. 6 of the IEA-R1 Nuclear Research Reactor. To avoid corrections for effects such as neutron absorption and sample illumination, the samples were given approximately spherical or octagonal-prism shapes, with dimensions not exceeding those of the neutron beam. The texture of the samples was analysed with the help of a computer program that evaluates the intensity of the diffracted neutron beam and plots the pole figures. (author) [pt

  17. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    1986-12-01

    The main purposes of this meeting were to establish the state of the art in the field, to identify new research and development required to provide an adequate framework for the analysis of environmental samples, and to assess needs and possibilities for international cooperation in problem areas. This technical report on the subject was prepared based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers.

  18. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy

    Science.gov (United States)

    Wahl, N.; Hennig, P.; Wieser, H. P.; Bangert, M.

    2017-07-01

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Run-times are measured on CPU and GPU platforms; dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviation (expectation value) of dose shows average global γ(3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity.
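
The contrast between scenario sampling and closed-form propagation can be illustrated in one dimension: propagate a Gaussian setup error through a Gaussian pencil-beam profile by random sampling, and compare against the exact Gaussian convolution (the APM idea reduced to a toy case; all widths and positions are hypothetical):

```python
import math
import random

def dose(x, shift, sigma_b=5.0):
    """Toy Gaussian pencil-beam lateral dose profile (arbitrary units)."""
    return math.exp(-(x - shift)**2 / (2 * sigma_b**2))

def expected_dose_analytic(x, sigma_b=5.0, sigma_s=2.0):
    """Closed-form expectation over a Gaussian setup shift ~ N(0, sigma_s^2):
    the convolution of two Gaussians, i.e. the analytic-propagation idea."""
    var = sigma_b**2 + sigma_s**2
    return math.sqrt(sigma_b**2 / var) * math.exp(-x**2 / (2 * var))

random.seed(0)
x_eval, sigma_s = 3.0, 2.0
samples = [dose(x_eval, random.gauss(0.0, sigma_s)) for _ in range(5000)]
mc_mean = sum(samples) / len(samples)
print(mc_mean, expected_dose_analytic(x_eval))
```

The 5000-sample Monte Carlo mean converges to the closed-form value, but the analytic expression costs a single evaluation, which is the run-time advantage the abstract reports.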

  19. State of the art of environmentally friendly sample preparation approaches for determination of PBDEs and metabolites in environmental and biological samples: A critical review.

    Science.gov (United States)

    Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C

    2016-01-28

    Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for the determination of persistent organic pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and their metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower solvent consumption and accelerate extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also discussed in this work, even though these are scarcely used for the determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent-assisted extraction techniques are preferred for leaching of PBDEs, and liquid phase microextraction techniques are mostly used for liquid samples. Likewise, green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Dosimetric characterization of BeO samples in alpha, beta and X radiation beams using luminescent techniques

    International Nuclear Information System (INIS)

    Groppo, Daniela Piai

    2013-01-01

    In the medical field, ionizing radiation is used for both therapeutic and diagnostic purposes over a wide range of radiation doses. In order to ensure that the objective is achieved in practice, detailed studies of detectors and devices in different types of radiation beams are necessary. In this work, a dosimetric characterization of BeO samples was performed using the techniques of thermoluminescence (TL) and optically stimulated luminescence (OSL), comparing their response to alpha, beta and X radiation and establishing an appropriate system for use in monitoring these radiation beams. The main results are: high sensitivity to beta radiation for both techniques; good reproducibility of the TL and OSL responses (coefficients of variation lower than 5%); and a maximum X-radiation energy dependence of 28% for the TL technique, but only 7% for the OSL technique, within the studied energy range. The dosimetric characteristics obtained in this work show the possibility of applying BeO samples to the dosimetry of alpha, beta and X radiation, within the studied dose ranges, using the TL and OSL techniques. From the results obtained, BeO samples show potential for beam dosimetry in diagnostic radiology and radiotherapy. (author)

  1. Applying a low energy HPGe detector gamma ray spectrometric technique for the evaluation of Pu/Am ratio in biological samples.

    Science.gov (United States)

    Singh, I S; Mishra, Lokpati; Yadav, J R; Nadar, M Y; Rao, D D; Pradeepkumar, K S

    2015-10-01

    The estimation of the Pu/(241)Am ratio in biological samples is an important input for the assessment of the internal dose received by workers. Radiochemical separation of Pu isotopes and (241)Am in a sample followed by alpha spectrometry is a widely used technique for the determination of the Pu/(241)Am ratio. However, this method is time-consuming, and a quick estimate is often required. In this work, the Pu/(241)Am ratio in biological samples was estimated with HPGe detector based measurements using the gamma/X-rays emitted by these radionuclides. The results were compared with those obtained from alpha spectrometry of the samples after radiochemical analysis and were found to be in good agreement. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. A Wearable Gait Phase Detection System Based on Force Myography Techniques

    Directory of Open Access Journals (Sweden)

    Xianta Jiang

    2018-04-01

    Full Text Available (1) Background: Quantitative evaluation of gait parameters can provide useful information for constructing individuals’ gait profiles, diagnosing gait abnormalities, and better planning of rehabilitation schemes to restore a normal gait pattern. Objective determination of gait phases in a gait cycle is a key requirement in gait analysis applications; (2) Methods: In this study, the feasibility of using a force myography-based technique for a wearable gait phase detection system is explored. In this regard, a force myography band is developed and tested with nine participants walking on a treadmill. The collected force myography data are first examined sample-by-sample and classified into four phases using Linear Discriminant Analysis. The gait phase events are then detected from these classified samples using a set of supervisory rules; (3) Results: The results show that the force myography band can correctly detect more than 99.9% of gait phases with zero insertions and only four deletions over 12,965 gait phase segments. The average temporal error of gait phase detection is 55.2 ms, which translates into a 2.1% error with respect to the corresponding labelled stride duration; (4) Conclusions: This proof-of-concept study demonstrates the feasibility of force myography techniques as viable solutions in developing wearable gait phase detection systems.
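
A much-reduced sketch of the classification step: a two-class linear discriminant on a single feature with pooled variance and equal priors, where the decision boundary falls at the midpoint of the class means. The study's classifier handles four phases and multi-channel FMG data; the readings below are hypothetical:

```python
import statistics

def fit_lda_1d(samples0, samples1):
    """Two-class LDA on one feature with a shared variance and equal priors:
    the linear discriminants are equal at the midpoint of the class means."""
    m0, m1 = statistics.mean(samples0), statistics.mean(samples1)
    threshold = 0.5 * (m0 + m1)
    label_hi = 1 if m1 > m0 else 0

    def predict(x):
        # Samples above the boundary get the higher-mean class label
        return label_hi if x > threshold else 1 - label_hi

    return predict

# Hypothetical FMG band readings: low pressure in swing (0), high in stance (1)
swing = [0.10, 0.15, 0.12, 0.08, 0.14]
stance = [0.80, 0.75, 0.90, 0.85, 0.78]
classify = fit_lda_1d(swing, stance)
print([classify(x) for x in (0.1, 0.5, 0.9)])
```

Each incoming sample is labelled independently, as in the paper's sample-by-sample stage; the supervisory rules that turn label streams into phase events are a separate layer.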

  3. Long-term monitoring of the Danube river-Sampling techniques, radionuclide metrology and radioecological assessment

    International Nuclear Information System (INIS)

    Maringer, F.J.; Gruber, V.; Hrachowitz, M.; Baumgartner, A.; Weilner, S.; Seidel, C.

    2009-01-01

    Sampling techniques and radiometric methods developed and applied in a comprehensive radioecological study of the Danube River are presented. Results and radiometric data are shown and discussed for sediment samples collected by sediment traps in Austria and, additionally, by grab sampling in the Danube during research cruises between Germany and the delta (Black Sea). The goal of the investigation is the protection of the public and the environment, especially the sustainable use and conservation of human freshwater resources against harmful radioactive exposure.

  4. Separation of arsenic species by capillary electrophoresis with sample-stacking techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Zu Liang; Naidu, Ravendra [Adelaide Laboratory, CSIRO Land and Water, PMB2, 5064, Glen Osmond, SA (Australia); Lin, Jin-Ming [Research Center for Eco-Environmental Sciences, Chinese Academy of Sciences, P.O. Box 2871, 100085, Beijing (China)

    2003-03-01

    A simple capillary zone electrophoresis procedure was developed for the separation of arsenic species (AsO{sub 2}{sup 2-}, AsO{sub 4}{sup 2-}, and dimethylarsinic acid, DMA). Both counter-electroosmotic and co-electroosmotic (EOF) modes were investigated for the separation of arsenic species with direct UV detection at 185 nm using 20 mmol L{sup -1} sodium phosphate as the electrolyte. The separation selectivity depends mainly on the separation mode and the electrolyte pH. Inorganic anions (Cl{sup -}, NO{sub 2}{sup -}, NO{sub 3}{sup -} and SO{sub 4}{sup 2-}) present in real samples did not interfere with arsenic speciation in either separation mode. To improve the detection limits, sample-stacking techniques, including large-volume sample stacking (LVSS) and field-amplified sample injection (FASI), were investigated for the preconcentration of As species in co-CZE mode. Detection limits below 1 {mu}mol L{sup -1} were achieved for As species using FASI. The proposed method was demonstrated for the separation and detection of As species in water. (orig.)

  5. Decomposition and (importance) sampling techniques for multi-stage stochastic linear programs

    Energy Technology Data Exchange (ETDEWEB)

    Infanger, G.

    1993-11-01

    The difficulty of solving large-scale multi-stage stochastic linear programs arises from the sheer number of scenarios associated with numerous stochastic parameters. The number of scenarios grows exponentially with the number of stages and problems get easily out of hand even for very moderate numbers of stochastic parameters per stage. Our method combines dual (Benders) decomposition with Monte Carlo sampling techniques. We employ importance sampling to efficiently obtain accurate estimates of both expected future costs and gradients and right-hand sides of cuts. The method enables us to solve practical large-scale problems with many stages and numerous stochastic parameters per stage. We discuss the theory of sharing and adjusting cuts between different scenarios in a stage. We derive probabilistic lower and upper bounds, where we use importance path sampling for the upper bound estimation. Initial numerical results turned out to be promising.
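
The importance-sampling idea used here for estimating expected costs and cut coefficients can be shown in miniature on a rare-event expectation, where naive Monte Carlo almost never sees the event but a shifted proposal with likelihood-ratio weights estimates it cheaply. This is a generic sketch, not the paper's Benders implementation, and all distributions are hypothetical:

```python
import math
import random

def norm_pdf(x, mu=0.0):
    """Standard-width normal density centred at mu."""
    return math.exp(-0.5 * (x - mu)**2) / math.sqrt(2 * math.pi)

random.seed(42)
n = 20000
# Rare-event probability P(X > 4) for X ~ N(0,1): naive sampling would need
# ~10^5 draws per hit, so sample from a proposal N(4,1) and reweight instead.
est = 0.0
for _ in range(n):
    y = random.gauss(4.0, 1.0)                 # draw from the proposal q
    if y > 4.0:
        est += norm_pdf(y) / norm_pdf(y, 4.0)  # importance weight p(y)/q(y)
est /= n
print(est)  # close to the true tail probability 1 - Phi(4) ≈ 3.17e-5
```

Shifting the sampling distribution toward the region that matters and correcting with likelihood ratios is the same mechanism that makes the cut and bound estimates in the decomposition accurate from few scenarios.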

  6. Development of analytical techniques for water and environmental samples (2)

    Energy Technology Data Exchange (ETDEWEB)

    Eum, Chul Hun; Jeon, Chi Wan; Jung, Kang Sup; Song, Kyung Sun; Kim, Sang Yeon [Korea Institute of Geology Mining and Materials, Taejon (Korea)

    1998-12-01

    The purpose of this study is to develop new analytical methods with good detection limits for toxic inorganic and organic compounds. During the second year of this project, analyses of CN, organic acids and particulate materials in environmental samples were carried out using several methods, including ion chromatography, SPE, SPME, GC/MS, GC/FID and SPLITT (split-flow thin cell) fractionation. The advantages and disadvantages of several distillation methods (KS, JIS, EPA) for CN analysis in wastewater were investigated. As a result, we proposed a new distillation apparatus for CN analysis, which proved to be simpler and faster and to give better recovery than the conventional apparatus. An ion chromatography/pulsed amperometric detection (IC/PAD) system was set up in place of colorimetry for CN detection to overcome matrix interference. SPE (solid phase extraction) and SPME (solid phase micro extraction), as liquid-solid extraction techniques, were applied to the analysis of phenols in wastewater. Optimum experimental conditions and the factors influencing the analytical results were determined. From these results, it could be concluded that the C{sub 18} cartridge and the polystyrene-divinylbenzene disk in the SPE method, and the polyacrylate fiber in SPME, were suitable solid-phase adsorbents for phenol. Optimum conditions for the simultaneous analysis of phenol derivatives were established. Continuous SPLITT fractionation (CSF) is a new preparative separation technique that is useful for the fractionation of particulate and macromolecular materials. CSF is carried out in a thin ribbon-like channel equipped with two splitters, at both the inlet and the outlet of the channel. In this work, we set up a new CSF system and tested it using polystyrene latex standard particles. We then fractionated particles contained in air and in underground water based on their sedimentation coefficients using CSF. (author). 27 refs., 13 tabs., 31 figs.

  7. Monitoring of persistent organic pollutants in seawater of the Pearl River Estuary with rapid on-site active SPME sampling technique

    International Nuclear Information System (INIS)

    Huang, Siming; He, Shuming; Xu, Hao; Wu, Peiyan; Jiang, Ruifen; Zhu, Fang; Luan, Tiangang; Ouyang, Gangfeng

    2015-01-01

    An on-site active solid-phase microextraction (SPME) sampling technique coupled with gas chromatography-mass spectrometry (GC-MS) was developed for sampling and monitoring 16 polycyclic aromatic hydrocarbons (PAHs) and 8 organochlorine pesticides (OCPs) in seawater. Laboratory experiments demonstrated that the sampling-rate calibration method was practical and could be used for the quantification of on-site sampling. The proposed method was employed for field tests covering large numbers of water samples in the Pearl River Estuary in the rainy and dry seasons. The on-site SPME sampling method avoids contamination of the sample and losses of analytes during sample transportation, as well as the use of solvent and time-consuming sample preparation. The results indicated that the technique, with the designed device, can meet the requirements of modern environmental water analysis. In addition, the sources, bioaccumulation and potential risk to humans of the PAHs and OCPs in the seawater of the Pearl River Estuary are discussed. - Highlights: • SPME on-site active sampling technique was developed and validated. • The technique was employed for field tests in the Pearl River Estuary. • 16 PAHs and 8 OCPs in the seawater of Pearl River Estuary were monitored. • The potential risk of the PAHs and OCPs in Pearl River Estuary were discussed. - An on-site active SPME sampling technique was developed and successfully applied for sampling and monitoring 16 PAHs and 8 OCPs in the Pearl River Estuary
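
Sampling-rate calibration of this kind commonly quantifies via the time-weighted-average relation C = n/(R_s · t): the water concentration follows from the mass extracted by the fiber, the laboratory-calibrated sampling rate, and the sampling time. A sketch with hypothetical values (the study's calibrated rates are not reproduced here):

```python
def twa_concentration(amount_ng, sampling_rate_ml_per_min, time_min):
    """Time-weighted average concentration from a passive/active sampler:
    C = n / (Rs * t), with n the mass extracted, Rs the calibrated
    sampling rate, and t the sampling time."""
    sampled_volume_ml = sampling_rate_ml_per_min * time_min
    return amount_ng / sampled_volume_ml  # ng/mL

# Hypothetical deployment: 12 ng of a PAH extracted at Rs = 0.5 mL/min for 60 min
c = twa_concentration(12.0, 0.5, 60.0)
print(c, "ng/mL")
```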

  8. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    International Nuclear Information System (INIS)

    Lindstrom, D.J.; Lindstrom, R.M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably

  9. Three Proposed Compendia for Genesis Solar Wind Samples: Science Results, Collector Materials Characterization and Cleaning Techniques

    Science.gov (United States)

    Allton, J. H.; Calaway, M. J.; Nyquist, L. E.; Jurewicz, A. J. G.; Burnett, D. S.

    2018-01-01

    Final Paper and not the abstract is attached. Introduction: Planetary materials and cosmochemistry research using Genesis solar wind samples (including the development and implementation of cleaning and analytical techniques) has matured sufficiently that compilations on several topics, if made publicly accessible, would be beneficial for researchers and reviewers. We propose here three compendia based on the content, organization and source of documents (e.g., published peer-reviewed articles, other publications, internal memos, archives). For planning purposes, suggestions are solicited from potential users of Genesis solar wind samples on the type of science content and/or organizational style that would be most useful to them. These compendia are proposed as living documents, to be periodically updated. Like the existing compendia described below, the curation compendia are akin to library or archival finding aids: they are guides to published or archival documents and should not be cited as primary sources.

  10. Geospatial techniques for developing a sampling frame of watersheds across a region

    Science.gov (United States)

    Gresswell, Robert E.; Bateman, Douglas S.; Lienkaemper, George; Guy, T.J.

    2004-01-01

    Current land-management decisions that affect the persistence of native salmonids are often influenced by studies of individual sites that are selected based on judgment and convenience. Although this approach is useful for some purposes, extrapolating results to areas that were not sampled is statistically inappropriate because the sampling design is usually biased. Therefore, in recent investigations of coastal cutthroat trout (Oncorhynchus clarki clarki) located above natural barriers to anadromous salmonids, we used a methodology for extending the statistical scope of inference. The purpose of this paper is to apply geospatial tools to identify a population of watersheds and develop a probability-based sampling design for coastal cutthroat trout in western Oregon, USA. The population of mid-size watersheds (500-5800 ha) west of the Cascade Range divide was derived from watershed delineations based on digital elevation models. Because a database with locations of isolated populations of coastal cutthroat trout did not exist, a sampling frame of isolated watersheds containing cutthroat trout had to be developed. After the sampling frame of watersheds was established, isolated watersheds with coastal cutthroat trout were stratified by ecoregion and by erosion potential based on dominant bedrock lithology (i.e., sedimentary and igneous). A stratified random sample of 60 watersheds was selected with proportional allocation in each stratum. By comparing the watershed drainage areas of streams in the general population to those in the sampling frame and the resulting sample (n = 60), we were able to evaluate how representative the subset of watersheds was in relation to the population of watersheds. Geospatial tools provided a relatively inexpensive means to generate the information necessary to develop a statistically robust, probability-based sampling design.
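
Stratified random sampling with proportional allocation, as used above, can be sketched directly: each stratum contributes to the sample in proportion to its share of the frame. The stratum labels and sizes below are hypothetical, not the study's frame:

```python
import random

def stratified_sample(frame, n_total, key, rng):
    """Stratified random sampling with proportional allocation: each
    stratum contributes round(n_total * stratum_size / frame_size) units."""
    strata = {}
    for unit in frame:
        strata.setdefault(key(unit), []).append(unit)
    sample = []
    for units in strata.values():
        n_h = round(n_total * len(units) / len(frame))
        sample.extend(rng.sample(units, n_h))
    return sample

# Hypothetical frame: 300 watersheds tagged by (ecoregion, lithology) stratum
rng = random.Random(7)
labels = ([("coastal", "sedimentary")] * 120 + [("coastal", "igneous")] * 60
          + [("cascade", "sedimentary")] * 90 + [("cascade", "igneous")] * 30)
frame = [{"id": i, "stratum": s} for i, s in enumerate(labels)]
picked = stratified_sample(frame, 60, key=lambda u: u["stratum"], rng=rng)
# 60 watersheds: 24, 12, 18 and 6 from the four strata, matching their shares
```

Rounding can make the allocations fail to sum exactly to n_total for awkward stratum sizes; production designs resolve such remainders explicitly (e.g., largest-remainder assignment).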

  11. Experimental study of laser ablation as sample introduction technique for inductively coupled plasma-mass spectrometry

    International Nuclear Information System (INIS)

    Van Winckel, S.

    2001-01-01

    The contribution consists of an abstract of a PhD thesis. In the PhD study, several complementary applications of laser-ablation were investigated in order to characterise experimentally laser ablation (LA) as a sample introduction technique for ICP-MS. Three applications of LA as a sample introduction technique are discussed: (1) the microchemical analysis of the patina of weathered marble; (2) the possibility to measure isotope ratios (in particular Pb isotope ratios in archaeological bronze artefacts); and (3) the determination of Si in Al as part of a dosimetric study of the BR2 reactor vessel

  12. High Throughput Sample Preparation and Analysis for DNA Sequencing, PCR and Combinatorial Screening of Catalysis Based on Capillary Array Technique

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yonghua [Iowa State Univ., Ames, IA (United States)

    2000-01-01

    Sample preparation has been one of the major bottlenecks for many high throughput analyses. The purpose of this research was to develop a new sample preparation and integration approach for DNA sequencing, PCR based DNA analysis and combinatorial screening of homogeneous catalysis based on multiplexed capillary electrophoresis with laser induced fluorescence or imaging UV absorption detection. The author first introduced a method to integrate the front-end tasks to DNA capillary-array sequencers. Protocols for directly sequencing plasmids from a single bacterial colony in fused-silica capillaries were developed. After the colony was picked, lysis was accomplished in situ in the plastic sample tube using either a thermocycler or a heating block. Upon heating, the plasmids were released while chromosomal DNA and membrane proteins were denatured and precipitated to the bottom of the tube. After adding enzyme and Sanger reagents, the resulting solution was aspirated into the reaction capillaries by a syringe pump, and cycle sequencing was initiated. No deleterious effect upon the reaction efficiency, the on-line purification system, or the capillary electrophoresis separation was observed, even though the crude lysate was used as the template. Multiplexed on-line DNA sequencing data from 8 parallel channels allowed base calling up to 620 bp with an accuracy of 98%. The entire system can be automatically regenerated for repeated operation. For PCR based DNA analysis, the author demonstrated that capillary electrophoresis with UV detection can be used for DNA analysis starting from clinical samples without purification. After the PCR reaction using cheek cell, blood or HIV-1 gag DNA, the reaction mixture was injected into the capillary either on-line or off-line by base stacking. The protocol was also applied to capillary array electrophoresis. The use of cheaper detection, and the elimination of purification of the DNA sample before or after the PCR reaction, will make this approach an

  13. Sample Data Synchronization and Harmonic Analysis Algorithm Based on Radial Basis Function Interpolation

    Directory of Open Access Journals (Sweden)

    Huaiqing Zhang

    2014-01-01

    Spectral leakage has a harmful effect on the accuracy of harmonic analysis under asynchronous sampling. This paper proposed a time quasi-synchronous sampling algorithm based on radial basis function (RBF) interpolation. First, the fundamental period is estimated by a zero-crossing technique with fourth-order Newton interpolation; then the sampling sequence is reconstructed by RBF interpolation. Finally, the harmonic parameters are calculated by FFT on the synchronized sampling data. Simulation results showed that the proposed algorithm has high accuracy in measuring distorted and noisy signals. Compared with local approximation schemes such as linear, quadratic, and fourth-order Newton interpolation, the RBF is a global approximation method that yields more accurate results, while its computation time is about the same as Newton's.
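
The pipeline (RBF resampling onto a period-synchronous grid, then FFT) can be sketched as below. The test signal, sampling rates, and the linear RBF kernel, chosen here because its 1-D interpolation matrix stays well conditioned, are illustrative assumptions, not the paper's exact choices:

```python
import numpy as np

def rbf_resample(t, y, t_new):
    """Global RBF interpolation with a linear kernel phi(r) = r."""
    A = np.abs(t[:, None] - t[None, :])       # interpolation matrix
    w = np.linalg.solve(A, y)                 # RBF weights
    B = np.abs(t_new[:, None] - t[None, :])   # evaluation matrix
    return B @ w

f0 = 50.0                      # fundamental frequency (Hz)
fs = 1310.0                    # asynchronous sampling rate (not a multiple of f0)
t = np.arange(64) / fs
y = np.sin(2 * np.pi * f0 * t) + 0.2 * np.sin(2 * np.pi * 3 * f0 * t)

# resample onto a grid spanning exactly two fundamental periods
N = 64
ts = np.linspace(0.0, 2 / f0, N, endpoint=False)
amp = np.abs(np.fft.rfft(rbf_resample(t, y, ts))) * 2 / N
# synchronous window: fundamental falls in bin 2, 3rd harmonic in bin 6
```

Because the FFT window now covers an integer number of periods, the fundamental and third-harmonic amplitudes land cleanly in their bins instead of leaking across the spectrum.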

  14. Failure Mechanism of Rock Bridge Based on Acoustic Emission Technique

    Directory of Open Access Journals (Sweden)

    Guoqing Chen

    2015-01-01

    Acoustic emission (AE) is widely used in many fields as a reliable nondestructive examination technology. Two experimental tests were carried out in a rock mechanics laboratory: (1) small-scale direct shear tests of rock bridges of different lengths and (2) a large-scale landslide model with a locked section. The relationship between AE event count and record time was analyzed during the tests, AE source location was performed, and the results were compared with the actual failure modes. In both the small-scale tests and the large-scale landslide model test, the AE technique accurately located the AE source points, reflecting the generation and propagation of internal cracks in the rock samples. The large-scale landslide model test with a locked section showed that a rock bridge in a rocky slope exhibits typical brittle failure behavior. Together, the two AE-based tests revealed the rock failure mechanism in rocky slopes and clarified the cause of high-speed, long-distance sliding of rocky slopes.

  15. Comparison of two novel in-syringe dispersive liquid-liquid microextraction techniques for the determination of iodide in water samples using spectrophotometry.

    Science.gov (United States)

    Kaykhaii, Massoud; Sargazi, Mona

    2014-01-01

    Two new, rapid methodologies have been developed and applied successfully for the determination of trace levels of iodide in real water samples. Both techniques are based on a combination of in-syringe dispersive liquid-liquid microextraction (IS-DLLME) and micro-volume UV-Vis spectrophotometry. In the first technique, iodide is oxidized with nitrous acid to the colorless anion ICl2(-) at a high concentration of hydrochloric acid. Rhodamine B is added and, by means of one-step IS-DLLME, the ion pair formed is extracted into toluene and measured spectrophotometrically, with acetone used as the dispersive solvent. The second method is based on IS-DLLME of the colored iodide/1,10-phenanthroline-iron(II) chelate ion pair into nitrobenzene, with methanol as the dispersive solvent. Optimal conditions for iodide extraction were determined for both approaches. The methods are compared in terms of analytical parameters such as precision, accuracy, speed and limit of detection. Both methods were successfully applied to the determination of iodide in tap and river water samples. Copyright © 2013 Elsevier B.V. All rights reserved.
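
The limit-of-detection comparison mentioned above is conventionally computed from the calibration slope and the blank noise (3-sigma criterion). A sketch with invented absorbance readings (the concentrations, absorbances, and blank values below are illustrative, not the paper's data):

```python
import numpy as np

# hypothetical calibration: absorbance vs iodide concentration (ug L-1)
conc = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])
absorbance = np.array([0.012, 0.101, 0.189, 0.280, 0.371, 0.459])

slope, intercept = np.polyfit(conc, absorbance, 1)   # linear calibration fit

# limit of detection from 10 replicate blank readings (3-sigma criterion)
blanks = np.array([0.011, 0.013, 0.012, 0.010, 0.014,
                   0.012, 0.011, 0.013, 0.012, 0.012])
lod = 3 * blanks.std(ddof=1) / slope                 # in ug L-1
```

The same two quantities (slope sensitivity and blank scatter) are what make one extraction chemistry outperform the other in detection limit.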

  16. Determination of metals in air samples using X-Ray fluorescence associated the APDC preconcentration technique

    Energy Technology Data Exchange (ETDEWEB)

    Nardes, Raysa C.; Santos, Ramon S.; Sanches, Francis A.C.R.A.; Gama Filho, Hamilton S.; Oliveira, Davi F.; Anjos, Marcelino J., E-mail: rc.nardes@gmail.com, E-mail: ramonziosp@yahoo.com.br, E-mail: francissanches@gmail.com, E-mail: hamiltongamafilho@hotmail.com, E-mail: davi.oliveira@uerj.br, E-mail: marcelin@uerj.br [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Instituto de Fisica. Departamento de Fisica Aplicada e Termodinamica

    2015-07-01

    Air pollution has become one of the leading factors degrading quality of life for people in large urban centers. Studies indicate that particulate matter suspended in the atmosphere is directly associated with risks to public health; in addition, it can damage fauna, flora, and public and cultural heritage. Inhalable particulate matter can cause the onset and/or worsening of chronic respiratory diseases and other effects, such as reduced physical endurance. In this study, we propose a new method to measure the concentration of total suspended particulate matter (TSP) in air, using an impinger as the air sampling apparatus, preconcentration with APDC, and the total reflection X-ray fluorescence (TXRF) technique to analyze the heavy metals present in the air. The samples were collected from five random points in the city of Rio de Janeiro/Brazil. The TXRF analyses were performed at the Brazilian Synchrotron Light Laboratory (LNLS). The technique proved viable: it detected five metallic elements important for environmental studies (Cr, Fe, Ni, Cu and Zn) and determined the elemental concentrations of air pollutants efficiently and at low cost. It can be concluded that analyzing metals in air samples with an impinger as the collection instrument, combined with the complexing agent APDC, is a viable, low-cost approach that detected five metallic elements associated with industrial emissions and urban traffic. (author)

  17. Determination of metals in air samples using X-Ray fluorescence associated the APDC preconcentration technique

    International Nuclear Information System (INIS)

    Nardes, Raysa C.; Santos, Ramon S.; Sanches, Francis A.C.R.A.; Gama Filho, Hamilton S.; Oliveira, Davi F.; Anjos, Marcelino J.

    2015-01-01

    Air pollution has become one of the leading factors degrading quality of life for people in large urban centers. Studies indicate that particulate matter suspended in the atmosphere is directly associated with risks to public health; in addition, it can damage fauna, flora, and public and cultural heritage. Inhalable particulate matter can cause the onset and/or worsening of chronic respiratory diseases and other effects, such as reduced physical endurance. In this study, we propose a new method to measure the concentration of total suspended particulate matter (TSP) in air, using an impinger as the air sampling apparatus, preconcentration with APDC, and the total reflection X-ray fluorescence (TXRF) technique to analyze the heavy metals present in the air. The samples were collected from five random points in the city of Rio de Janeiro/Brazil. The TXRF analyses were performed at the Brazilian Synchrotron Light Laboratory (LNLS). The technique proved viable: it detected five metallic elements important for environmental studies (Cr, Fe, Ni, Cu and Zn) and determined the elemental concentrations of air pollutants efficiently and at low cost. It can be concluded that analyzing metals in air samples with an impinger as the collection instrument, combined with the complexing agent APDC, is a viable, low-cost approach that detected five metallic elements associated with industrial emissions and urban traffic. (author)

  18. Diurnal activity of four species of thrips (Thysanoptera: Thripidae) and efficiencies of three nondestructive sampling techniques for thrips in mango inflorescences.

    Science.gov (United States)

    Aliakbarpour, H; Rawi, Che Salmah Md

    2010-06-01

    Thrips cause considerable economic loss to mango, Mangifera indica L., in Penang, Malaysia. Three nondestructive sampling techniques--shaking mango panicles over a moist plastic tray, washing the panicles with ethanol, and immobilization of thrips using CO2--were evaluated for their precision to determine the most effective technique for capturing mango flower thrips (Thysanoptera: Thripidae) in an orchard located at Balik Pulau, Penang, Malaysia, during two flowering seasons, from December 2008 to February 2009 and from August to September 2009. The efficiency of each of the three sampling techniques was compared with absolute population counts on whole panicles as a reference. Diurnal flight activity of thrips species was assessed using yellow sticky traps. All three sampling methods and the sticky traps were used at two-hour intervals from 0800 to 1800 hours to gain insight into the diurnal periodicity of thrips abundance in the orchard. Based on pooled data for the two seasons, the CO2 method was the most efficient procedure, extracting 80.7% of adults and 74.5% of larvae. The CO2 method had the lowest relative variation and was the most accurate procedure relative to the absolute method, as shown by regression analysis. All collection techniques showed that the numbers of all thrips species in mango panicles increased after 0800 hours, reaching a peak between 1200 and 1400 hours. Adult thrips captured on the sticky traps were most abundant between 0800-1000 and 1400-1600 hours. According to the results of this study, the CO2 method is recommended for sampling thrips in the field. It is a nondestructive sampling procedure that neither damages flowers nor diminishes fruit production. Management of thrips populations in mango orchards with insecticides would be most effectively carried out during their peak abundance on the flower panicles, from midday to 1400 hours.
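
The "relative variation" precision measure used to rank the three sampling methods is commonly defined as the standard error expressed as a percentage of the mean; lower is better. A sketch with hypothetical per-panicle counts (the numbers are invented for illustration):

```python
import math

def relative_variation(counts):
    """RV = 100 * SE / mean; lower RV means a more precise sampling method."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
    return 100 * math.sqrt(var / n) / mean                # SE as % of mean

# hypothetical adult thrips counts per panicle for two techniques
rv_co2 = relative_variation([10, 12, 11, 9, 13])
rv_shake = relative_variation([4, 12, 2, 15, 7])
# the method with the lower RV (here, the CO2-style counts) is the more precise one
```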

  19. Base Oils Biodegradability Prediction with Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Malika Trabelsi

    2010-02-01

    In this paper, we apply various data mining techniques, including continuous numeric and discrete classification prediction models of base oil biodegradability, with emphasis on improving prediction accuracy. The results show that highly biodegradable oils can be better predicted through numeric models. In contrast, classification models did not uncover a similar dichotomy. With the exception of Memory Based Reasoning and Decision Trees, the tested classification techniques achieved high classification accuracy. However, the Decision Trees technique helped uncover the most significant predictor. A simple classification rule derived from this predictor resulted in good classification accuracy. The application of this rule enables efficient classification of base oils into either low or high biodegradability classes with high accuracy. For the latter, a higher-precision biodegradability prediction can be obtained using continuous modeling techniques.
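
A single-predictor rule of the kind a decision tree induces can be derived by scanning cut points on the most significant predictor and keeping the one with the best training accuracy. The predictor values and labels below are illustrative, not the paper's data:

```python
def best_threshold(values, labels):
    """Scan candidate cut points on one predictor and return the threshold
    with the highest classification accuracy (labels are 'low'/'high')."""
    best_t, best_acc = None, -1.0
    for t in sorted(set(values)):
        preds = ["high" if v >= t else "low" for v in values]
        acc = sum(p == l for p, l in zip(preds, labels)) / len(labels)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

# hypothetical predictor (e.g. some structural descriptor of the oil)
values = [1.2, 2.0, 3.1, 10.4, 11.0, 12.5]
labels = ["low", "low", "low", "high", "high", "high"]
t, acc = best_threshold(values, labels)
```

The resulting rule "predict high biodegradability when the predictor is at or above the threshold" is exactly the kind of one-split classifier the abstract describes.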

  20. A radioanalytical technique using (n,2n) reaction for the elemental analysis of samples

    International Nuclear Information System (INIS)

    Labor, M.

    1985-11-01

    A technique to determine the elemental composition of samples is reported. The technique employs the internal standard method and involves the resolution of complex annihilation spectra. It has been applied to the determination of the mass of nitrogen, msub(N), and that of potassium, msub(K), in known masses of potassium nitrate. The percentage difference between the calculated and actual masses in 2 g and 3 g of potassium nitrate is 1.0 and 0.7, respectively, for potassium, and 1.0 for nitrogen. Using more simultaneous equations than strictly necessary when solving for msub(N) and msub(K) is one of the advantages of the technique. (author)
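
Using more equations than unknowns yields an overdetermined linear system that is solved in the least-squares sense, which averages out measurement noise. A sketch with invented sensitivity coefficients (the matrix and masses below are illustrative, not values from the paper):

```python
import numpy as np

# Each measured count rate is modelled as a linear combination of the
# nitrogen and potassium masses; rows hold calibration sensitivity
# coefficients (k_N, k_K) for four independent measurements.
K = np.array([[2.1, 0.80],
              [2.0, 0.90],
              [1.9, 0.85],
              [2.2, 0.75]])
m_true = np.array([1.5, 3.0])        # grams of N and K (synthetic data)
rates = K @ m_true                   # simulated measured signals

# Overdetermined (4 equations, 2 unknowns): least-squares solution
m_est, *_ = np.linalg.lstsq(K, rates, rcond=None)
```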

  1. A novel non-invasive diagnostic sampling technique for cutaneous leishmaniasis.

    Directory of Open Access Journals (Sweden)

    Yasaman Taslimi

    2017-07-01

    Accurate diagnosis of cutaneous leishmaniasis (CL) is important for chemotherapy and epidemiological studies. Common approaches for Leishmania detection involve the invasive collection of specimens for direct identification of amastigotes by microscopy and the culturing of promastigotes from infected tissues. Although these techniques are highly specific, they require highly skilled health workers and carry the inherent risks of all invasive procedures, such as pain and the risk of bacterial and fungal super-infection. It is therefore essential to reduce the discomfort, potential infection and scarring caused by invasive diagnostic approaches, especially for children. In this report, we present a novel non-invasive method, painless, rapid and user-friendly, that uses sequential tape strips for sampling and isolation of DNA from the surface of active and healed skin lesions of CL patients. A total of 119 patients suspected of suffering from cutaneous leishmaniasis with different clinical manifestations were recruited, and samples were collected both from their lesions and from uninfected areas. In addition, 15 fungal-infected lesions and 54 areas of healthy skin were examined. The duration of sampling is short (less than one minute) and species identification by PCR is highly specific and sensitive. The sequential tape stripping sampling method is a sensitive, non-invasive and cost-effective alternative to traditional diagnostic assays, and it is suitable for field studies as well as for use in health care centers.

  2. Use of X-ray diffraction technique and chemometrics to aid soil sampling strategies in traceability studies.

    Science.gov (United States)

    Bertacchini, Lucia; Durante, Caterina; Marchetti, Andrea; Sighinolfi, Simona; Silvestri, Michele; Cocchi, Marina

    2012-08-30

    The aim of this work is to assess the potential of the X-ray powder diffraction technique as a fingerprinting tool, i.e. as a preliminary means of assessing soil sample variability in terms of geochemical features, in the context of food geographical traceability. A correct approach to the sampling procedure is always a critical issue in scientific investigation. In particular, in food geographical traceability studies, where cause-effect relations between the soil of origin and the final foodstuff are sought, representative sampling of the territory under investigation is imperative. This research concerns a pilot study investigating field homogeneity with respect to both field extension and sampling depth, taking seasonal variability into account as well. Four Lambrusco production sites of the Modena district were considered. The X-ray diffraction spectra, collected on the powder of each soil sample, were treated as fingerprint profiles to be deciphered by multivariate and multi-way data analysis, namely PCA and PARAFAC. The differentiation pattern observed in the soil samples, obtained by this fast and non-destructive analytical approach, matches well with the results obtained by characterization with other, more costly analytical techniques, such as ICP/MS, GFAAS, FAAS, etc. Thus, the proposed approach furnishes a rational basis for reducing the number of soil samples to be collected for further analytical characterization (e.g. metal content, isotopic ratios of radiogenic elements, etc.), while maintaining an exhaustive description of the investigated production areas. Copyright © 2012 Elsevier B.V. All rights reserved.
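
Treating each diffraction profile as a fingerprint vector and projecting onto principal components, as the PCA step above does, can be sketched with plain SVD. The two synthetic "sites" and their peak positions below are invented for illustration:

```python
import numpy as np

def pca_scores(profiles, n_components=2):
    """Scores of diffraction profiles (one row per sample) on the first
    principal components; samples with similar mineralogy cluster together."""
    X = profiles - profiles.mean(axis=0)        # centre each 2-theta channel
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T

# two synthetic "production sites" with peaks at different 2-theta positions
rng = np.random.default_rng(0)
site_a = np.zeros(50); site_a[[10, 30]] = [5.0, 3.0]
site_b = np.zeros(50); site_b[[15, 35]] = [5.0, 3.0]
profiles = np.vstack(
    [site_a + 0.01 * rng.standard_normal(50) for _ in range(3)]
    + [site_b + 0.01 * rng.standard_normal(50) for _ in range(3)])
scores = pca_scores(profiles, n_components=1)
# the first component cleanly separates the two groups of samples
```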

  3. Use of an oscillation technique to measure effective cross-sections of fissionable samples in critical assemblies

    International Nuclear Information System (INIS)

    Tretiakoff, O.; Vidal, R.; Carre, J.C.; Robin, M.

    1964-01-01

    The authors describe the technique used to measure the effective absorption and neutron-yield cross-sections of a fissionable sample. These two values are determined by analysing the signals due to the variation in reactivity (over-all signal) and the local perturbation of the flux (local signal) produced by the oscillating sample. These signals are standardized by means of a set of samples containing well-known quantities of fissionable material (235U) and of an absorber, boron. The measurements are made for different neutron spectra characterized by the lattice parameters of the central zone within which the sample moves. This technique is used to study the effective cross-sections of uranium-plutonium alloys for different heavy-water and graphite lattices in the MINERVE and MARIUS critical assemblies. The same experiments are carried out on fuel samples at different irradiation levels in order to determine the evolution of the effective cross-sections as a function of spectrum and irradiation. (authors) [fr

  4. A comparative examination of several techniques for the routine determination of mercury in biological samples by neutron activation analysis

    International Nuclear Information System (INIS)

    Faanhof, A.; Das, H.A.

    1978-01-01

    A comparative examination of the most important techniques for the separation of mercury from irradiated biological material was made. Procedures for routine analysis and results for standard materials are given. Activation was performed at a thermal neutron flux of approximately 5×10¹² n cm⁻² s⁻¹ [...] offers a convenient solution to this problem. The variation of the neutron flux with irradiation position can be measured by applying thin iron rings as flux monitors. Losses of mercury due to uptake in the wall of the irradiation containers are negligible. The most powerful destruction technique for large samples is that based on a stainless-steel bomb. (T. I.)

  5. A cost-effective technique for integrating personal radiation dose assessment with personal gravimetric sampling

    International Nuclear Information System (INIS)

    Strydom, R.; Rolle, R.; Van der Linde, A.

    1992-01-01

    During recent years there has been an increasing awareness internationally of radiation levels in the mining and milling of radioactive ores, including those from non-uranium mines. A major aspect of radiation control is concerned with the measurement of radiation levels and the assessment of radiation doses incurred by individual workers. Current techniques available internationally for personnel monitoring of radiation exposures are expensive and there is a particular need to reduce the cost of personal radiation monitoring in South African gold mines because of the large labour force employed. In this regard the obvious benefits of integrating personal radiation monitoring with existing personal monitoring systems already in place in South African gold mines should be exploited. A system which can be utilized for this purpose is personal gravimetric sampling. A new cost-effective technique for personal radiation monitoring, which can be fully integrated with the personal gravimetric sampling strategy being implemented on mines, has been developed in South Africa. The basic principles of this technique and its potential in South African mines are described. 9 refs., 7 figs

  6. X-Ray Micro-Computed Tomography of Apollo Samples as a Curation Technique Enabling Better Research

    Science.gov (United States)

    Ziegler, R. A.; Almeida, N. V.; Sykes, D.; Smith, C. L.

    2014-01-01

    X-ray micro-computed tomography (micro-CT) is a technique that has been used to research meteorites for some time, and recently it is becoming a more common tool for the curation of meteorites and Apollo samples. Micro-CT is ideally suited to the characterization of astromaterials in the curation process as it can provide textural and compositional information at small spatial resolution rapidly, nondestructively, and without compromising the cleanliness of the samples (e.g., samples can be scanned sealed in Teflon bags). These data can then inform scientists and curators when making and processing future sample requests for meteorites and Apollo samples. Here we present some preliminary results on micro-CT scans of four Apollo regolith breccias. Methods: Portions of four Apollo samples were used in this study: 14321, 15205, 15405, and 60639. All samples were 8-10 cm in their longest dimension and approximately equant. These samples were micro-CT scanned on the Nikon HMXST 225 System at the Natural History Museum in London. Scans were made at 205-220 kV, 135-160 microamps beam current, with an effective voxel size of 21-44 microns. Results: Initial examination of the data identifies a variety of mineral clasts (including sub-voxel FeNi metal grains) and lithic clasts within the regolith breccias. Textural information within some of the lithic clasts was also discernible. Of particular interest was a large basalt clast (approx. 1.3 cc) found within sample 60639, which appears to have a sub-ophitic texture. Additionally, internal void space, e.g., fractures and voids, is readily identifiable. Discussion: It is clear from the preliminary data that micro-CT analyses are able to identify important "new" clasts within the Apollo breccias, and to better characterize previously described clasts or igneous samples.
    For example, the 60639 basalt clast was previously believed to be quite small based on its approx. 0.5 sq cm exposure on the surface of the main mass
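
A clast volume like the ~1.3 cc figure above follows directly from the segmented voxel count and the scan's voxel size. A sketch (the voxel count below is a made-up number chosen to be consistent with a ~1.3 cc clast at a 30-micron voxel size):

```python
def clast_volume_cc(n_voxels, voxel_size_um):
    """Volume of a segmented clast from a micro-CT voxel count.

    voxel_size_um is the isotropic voxel edge length in microns.
    """
    voxel_cc = (voxel_size_um * 1e-4) ** 3   # 1 micron = 1e-4 cm
    return n_voxels * voxel_cc

# at 30-micron voxels, a ~1.3 cc clast spans roughly 48 million voxels
volume = clast_volume_cc(48_000_000, 30)
```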

  7. 238U and 232Th Concentration in Rock Samples Using Alpha Autoradiography and Gamma Spectroscopy Techniques

    International Nuclear Information System (INIS)

    Hafez, A.F.; El-Farrash, A.H.; Yousef, H.A.

    2009-01-01

    The activity concentrations of uranium and thorium were measured for rock samples selected from the Dahab region at the southern tip of Sinai, in order to detect any harmful radiation that could affect tourists, Dahab being an important tourism area and economic resource in Egypt. The activity concentrations of uranium and thorium in the rock samples were measured using two techniques: alpha-autoradiography with LR-115 and CR-39 detectors, and gamma spectroscopy with a NaI(Tl) detector. The average activity concentrations of uranium and thorium obtained by alpha-autoradiography ranged from 6.41 to 49.31 Bq kg-1 and from 4.86 to 40.87 Bq kg-1, respectively, and those obtained with the gamma detector ranged from 6.70 to 49.50 Bq kg-1 and from 4.47 to 42.33 Bq kg-1, respectively. From the obtained data we can conclude that there is no radiological health hazard for humans or other living beings in the area under investigation. No large differences were found between the thorium-to-uranium ratios calculated with the two techniques.

  8. Atmospheric pressure surface sampling/ionization techniques for direct coupling of planar separations with mass spectrometry.

    Science.gov (United States)

    Pasilis, Sofie P; Van Berkel, Gary J

    2010-06-18

    Planar separations, which include thin layer chromatography and gel electrophoresis, are in widespread use as important and powerful tools for conducting separations of complex mixtures. To increase the utility of planar separations, new methods are needed that allow in situ characterization of the individual components of the separated mixtures. A large number of atmospheric pressure surface sampling and ionization techniques for use with mass spectrometry have emerged in the past several years, and several have been investigated as a means for mass spectrometric read-out of planar separations. In this article, we review the atmospheric pressure surface sampling and ionization techniques that have been used for the read-out of planar separation media. For each technique, we briefly explain the operational basics and discuss the analyte type for which it is appropriate and some specific applications from the literature. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  9. Laser-Assisted Sampling Techniques in Combination with ICP-MS: A Novel Approach for Particle Analysis at the IAEA Environmental Samples Laboratory

    International Nuclear Information System (INIS)

    Dzigal, N.; Chinea-Cano, E.

    2015-01-01

    Researchers have found many applications for lasers. About two decades ago, scientists started using lasers as sample introduction instruments for mass spectrometry measurements. Similarly, lasers as micro-dissection tools have been increasingly in demand in the fields of life sciences, materials science, forensics, etc. This presentation deals with the application of these laser-assisted techniques to the field of particle analysis. Historically, nanosecond lasers have been used to ablate material in materials science. Recently, it has been shown that, in the analysis of particulate materials, the disadvantages associated with nanosecond lasers, such as overheating and melting of the sample, are suppressed when femtosecond lasers are used. Further, owing to the ultrashort duration of a single laser shot, fs-LA allows a more controlled ablation, so the sample plasma is more homogeneous and fewer mass-fractionation events are detected. The use of laser micro-dissection devices enables the physical segmentation of micro-sized artefacts, previously performed by a laborious manual procedure. By combining the precision of laser cutting inherent to the LMD technique with a particle identification methodology, one can increase the efficiency of single-particle isolation. Besides increasing analysis throughput, this combination enhances the signal-to-noise ratio by effectively removing matrix particles. Specifically, this contribution describes the use of an Olympus+MMI laser microdissection device to improve the sample preparation of environmental swipe samples, and the installation of an Applied Spectra J200 fs-LA/LIBS (laser ablation / laser-induced breakdown spectroscopy) system as a sample introduction device for a quadrupole mass spectrometer, the iCap Q from Thermofisher Scientific, at the IAEA Environmental Samples Laboratory. Preliminary results of the ongoing efforts for the

  10. Thermophilic Campylobacter spp. in turkey samples: evaluation of two automated enzyme immunoassays and conventional microbiological techniques

    DEFF Research Database (Denmark)

    Borck, Birgitte; Stryhn, H.; Ersboll, A.K.

    2002-01-01

    Aims: To determine the sensitivity and specificity of two automated enzyme immunoassays (EIA), EiaFoss and Minividas, and a conventional microbiological culture technique for detecting thermophilic Campylobacter spp. in turkey samples. Methods and Results: A total of 286 samples (faecal, meat...

  11. Solving mercury (Hg) speciation in soil samples by synchrotron X-ray microspectroscopic techniques.

    Science.gov (United States)

    Terzano, Roberto; Santoro, Anna; Spagnuolo, Matteo; Vekemans, Bart; Medici, Luca; Janssens, Koen; Göttlicher, Jörg; Denecke, Melissa A; Mangold, Stefan; Ruggiero, Pacifico

    2010-08-01

    Direct mercury (Hg) speciation was assessed for soil samples with Hg concentrations ranging from 7 up to 240 mg kg(-1). Hg chemical forms were identified and quantified by sequential extractions and by bulk and micro-analytical techniques exploiting synchrotron-generated X-rays. In particular, microspectroscopic techniques such as mu-XRF, mu-XRD and mu-XANES were necessary to solve bulk Hg speciation in both soil fractions. The Hg species detected in the soil samples were metacinnabar (beta-HgS), cinnabar (alpha-HgS), corderoite (Hg(3)S(2)Cl(2)), and an amorphous phase containing Hg bound to chlorine and sulfur. The amount of metacinnabar and amorphous phases increased in the [...] fraction, where an association with soil components was observed. All the observed Hg species originated from the slow weathering of an inert Hg-containing waste material (K106, U.S. EPA) dumped in the area several years ago, which is changing into a relatively more dangerous source of pollution. Copyright 2010 Elsevier Ltd. All rights reserved.

  12. Simultaneous multicopter-based air sampling and sensing of meteorological variables

    Science.gov (United States)

    Brosy, Caroline; Krampf, Karina; Zeeman, Matthias; Wolf, Benjamin; Junkermann, Wolfgang; Schäfer, Klaus; Emeis, Stefan; Kunstmann, Harald

    2017-08-01

    The state and composition of the lowest part of the planetary boundary layer (PBL), i.e., the atmospheric surface layer (SL), reflects the interactions of external forcing, land surface, vegetation, human influence and the atmosphere. Vertical profiles of atmospheric variables in the SL at high spatial (meters) and temporal (1 Hz and better) resolution increase our understanding of these interactions but are still challenging to measure appropriately. Traditional ground-based observations include towers that often cover only a few measurement heights at a fixed location. At the same time, most remote sensing techniques and aircraft measurements have limitations to achieve sufficient detail close to the ground (up to 50 m). Vertical and horizontal transects of the PBL can be complemented by unmanned aerial vehicles (UAV). Our aim in this case study is to assess the use of a multicopter-type UAV for the spatial sampling of air and simultaneously the sensing of meteorological variables for the study of the surface exchange processes. To this end, a UAV was equipped with onboard air temperature and humidity sensors, while wind conditions were determined from the UAV's flight control sensors. Further, the UAV was used to systematically change the location of a sample inlet connected to a sample tube, allowing the observation of methane abundance using a ground-based analyzer. Vertical methane gradients of about 0.3 ppm were found during stable atmospheric conditions. Our results showed that both methane and meteorological conditions were in agreement with other observations at the site during the ScaleX-2015 campaign. The multicopter-type UAV was capable of simultaneous in situ sensing of meteorological state variables and sampling of air up to 50 m above the surface, which extended the vertical profile height of existing tower-based infrastructure by a factor of 5.

  13. Application of a microwave-based desolvation system for multi-elemental analysis of wine by inductively coupled plasma based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Grindlay, Guillermo [Department of Analytical Chemistry, Nutrition and Food Sciences, University of Alicante, P.O. Box 99, 03080 Alicante (Spain)], E-mail: guillermo.grindlay@ua.es; Mora, Juan; Maestre, Salvador; Gras, Luis [Department of Analytical Chemistry, Nutrition and Food Sciences, University of Alicante, P.O. Box 99, 03080 Alicante (Spain)

    2008-11-23

    Elemental wine analysis is often required from a nutritional, toxicological, origin and authenticity point of view. Inductively coupled plasma based techniques are usually employed for this analysis because of their multi-elemental capabilities and good limits of detection. However, the accurate analysis of wine samples strongly depends on their matrix composition (i.e. salts, ethanol, organic acids) since these lead to both spectral and non-spectral interferences. To mitigate ethanol (up to 10% w/w) related matrix effects in inductively coupled plasma atomic emission spectrometry (ICP-AES), a microwave-based desolvation system (MWDS) can be successfully employed. This finding suggests that the MWDS could be employed for elemental wine analysis. The goal of this work is to evaluate the applicability of the MWDS for elemental wine analysis in ICP-AES and inductively coupled plasma mass spectrometry (ICP-MS). For the sake of comparison, a conventional sample introduction system (i.e. pneumatic nebulizer attached to a spray chamber) was employed. Matrix effects, precision, accuracy and analysis throughput were selected as comparison criteria. For ICP-AES measurements, wine samples can be directly analyzed without any sample treatment (i.e. sample dilution or digestion) using pure aqueous standards, although internal standardization (IS) (i.e. with Sc) is required. The behaviour of the MWDS operating with organic solutions in ICP-MS has been characterized for the first time. In this technique the MWDS has shown its efficiency in mitigating ethanol related matrix effects up to concentrations of 1% (w/w). Therefore, wine samples must be diluted to bring the ethanol concentration down to this value. The results obtained have shown that the MWDS is a powerful device for the elemental analysis of wine samples in both ICP-AES and ICP-MS. 
In general, the MWDS has some attractive advantages for elemental wine analysis when compared to a conventional sample introduction system such

  14. Application of a microwave-based desolvation system for multi-elemental analysis of wine by inductively coupled plasma based techniques

    International Nuclear Information System (INIS)

    Grindlay, Guillermo; Mora, Juan; Maestre, Salvador; Gras, Luis

    2008-01-01

    Elemental wine analysis is often required from a nutritional, toxicological, origin and authenticity point of view. Inductively coupled plasma based techniques are usually employed for this analysis because of their multi-elemental capabilities and good limits of detection. However, the accurate analysis of wine samples strongly depends on their matrix composition (i.e. salts, ethanol, organic acids) since these lead to both spectral and non-spectral interferences. To mitigate ethanol (up to 10% w/w) related matrix effects in inductively coupled plasma atomic emission spectrometry (ICP-AES), a microwave-based desolvation system (MWDS) can be successfully employed. This finding suggests that the MWDS could be employed for elemental wine analysis. The goal of this work is to evaluate the applicability of the MWDS for elemental wine analysis in ICP-AES and inductively coupled plasma mass spectrometry (ICP-MS). For the sake of comparison, a conventional sample introduction system (i.e. pneumatic nebulizer attached to a spray chamber) was employed. Matrix effects, precision, accuracy and analysis throughput were selected as comparison criteria. For ICP-AES measurements, wine samples can be directly analyzed without any sample treatment (i.e. sample dilution or digestion) using pure aqueous standards, although internal standardization (IS) (i.e. with Sc) is required. The behaviour of the MWDS operating with organic solutions in ICP-MS has been characterized for the first time. In this technique the MWDS has shown its efficiency in mitigating ethanol related matrix effects up to concentrations of 1% (w/w). Therefore, wine samples must be diluted to bring the ethanol concentration down to this value. The results obtained have shown that the MWDS is a powerful device for the elemental analysis of wine samples in both ICP-AES and ICP-MS. 
In general, the MWDS has some attractive advantages for elemental wine analysis when compared to a conventional sample introduction system such

  15. Coherent optical adaptive technique improves the spatial resolution of STED microscopy in thick samples

    Science.gov (United States)

    Yan, Wei; Yang, Yanlong; Tan, Yu; Chen, Xun; Li, Yang; Qu, Junle; Ye, Tong

    2018-01-01

    Stimulated emission depletion microscopy (STED) is one of the far-field optical microscopy techniques that can provide sub-diffraction spatial resolution. The spatial resolution of STED microscopy is determined by the specially engineered beam profile of the depletion beam and its power. However, the beam profile of the depletion beam may be distorted due to aberrations of optical systems and inhomogeneity of specimens’ optical properties, resulting in a compromised spatial resolution. The situation deteriorates when thick samples are imaged. In the worst case, severe distortion of the depletion beam profile may cause complete loss of the super resolution effect, no matter how much depletion power is applied to specimens. Previously, several adaptive optics approaches have been explored to compensate aberrations of systems and specimens. However, it is hard to correct the complicated high-order optical aberrations of specimens. In this report, we demonstrate that the complicated distorted wavefront from a thick phantom sample can be measured by using the coherent optical adaptive technique (COAT). The full correction can effectively maintain and improve the spatial resolution in imaging thick samples. PMID:29400356

  16. Research on test of product based on spatial sampling criteria and variable step sampling mechanism

    Science.gov (United States)

    Li, Ruihong; Han, Yueping

    2014-09-01

    This paper presents an effective approach for online testing of the assembly structures inside products, using a multiple-views technique and an X-ray digital radiography system based on spatial sampling criteria and a variable-step sampling mechanism. Although a product may contain several objects to be tested, for each object there is a maximal rotary step within which the least structural size to be tested remains resolvable. In the offline learning process, the object is rotated by this step and imaged, repeatedly, until a complete cycle is covered, yielding an image sequence that contains the full structural information needed for recognition. The maximal rotary step is restricted by the least structural size and the inherent resolution of the imaging system. During the online inspection process, the program first finds the optimum solutions for all the different target parts in the standard sequence, i.e., their exact angles within one cycle. Since most structures in the product are larger than the least structure, the paper adopts a variable step-size sampling mechanism: the product is rotated through specific angles with different steps chosen according to the different objects inside it, and the views are then matched. Experimental results show that the variable step-size method greatly saves time compared with the traditional fixed-step inspection method while recognition accuracy is maintained.
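
    The abstract does not spell out how the maximal rotary step is derived; a minimal sketch, assuming the step is chosen so that the arc swept at the feature's radius does not exceed the least structural size (the s/r rule, the object names and all numbers here are illustrative assumptions, not the paper's formulas):

```python
import math

def max_rotary_step(least_size_mm: float, radius_mm: float) -> float:
    """Angle (radians) whose arc at the object's radius equals the least structural size."""
    return least_size_mm / radius_mm

def variable_step_angles(objects):
    """For each object (least_size, radius), list the sampling angles covering one cycle."""
    schedule = {}
    for name, (s, r) in objects.items():
        step = max_rotary_step(s, r)
        n = math.ceil(2 * math.pi / step)          # views needed to cover 360 degrees
        schedule[name] = [i * 2 * math.pi / n for i in range(n)]
    return schedule

# A coarse object needs far fewer views than a fine one at the same radius.
sched = variable_step_angles({"fine": (0.5, 40.0), "coarse": (5.0, 40.0)})
```

    The point of the variable step is visible in the schedule lengths: the coarse object is sampled at a few dozen angles, the fine one at an order of magnitude more.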

  17. Development and application of the analyzer-based imaging technique with hard synchrotron radiation

    International Nuclear Information System (INIS)

    Coan, P.

    2006-07-01

    The objective of this thesis is twofold: on one side, the application of analyser-based X-ray phase contrast imaging to study cartilage, bone and bone implants using ESRF synchrotron radiation sources; on the other, to contribute to the development of the phase contrast techniques from the theoretical and experimental points of view. Several human samples have been studied in vitro using the analyser-based imaging (ABI) technique. Examination included projection and computed tomography imaging and 3-dimensional volume rendering of hip, big toe and ankle articular joints. X-ray ABI images have been critically compared with those obtained with conventional techniques, including radiography, computed tomography, ultrasound, magnetic resonance and histology, the latter taken as the gold standard. Results show that only ABI was able to visualize or correctly estimate the early pathological status of the cartilage. The status of bone ingrowth in sheep implants has also been examined in vitro: ABI images made it possible to correctly distinguish between good and incomplete bone healing. Pioneering in-vivo ABI experiments on guinea pigs were also successfully performed, confirming the possible use of the technique to follow up the progression of joint diseases, bone/metal ingrowth and the efficacy of drug treatments. As part of the development of the phase contrast techniques, two objectives have been reached. First, it has been experimentally demonstrated for the first time that ABI and propagation based imaging (PBI) can be combined to create images with original features (hybrid imaging, HI). Secondly, a new simplified set-up capable of producing images with properties similar to those obtained with the ABI technique or HI has been proposed and experimentally tested. Finally, both ABI and HI have been theoretically studied with an innovative, wave-based simulation program, which was able to correctly reproduce the experimental results. (author)

  18. Nickel speciation in several serpentine (ultramafic) topsoils via bulk synchrotron-based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Siebecker, Matthew G.; Chaney, Rufus L.; Sparks, Donald L.

    2017-07-01

    Serpentine soils have elevated concentrations of trace metals including nickel, cobalt, and chromium compared to non-serpentine soils. Identifying the nickel-bearing minerals allows for prediction of the potential mobility of nickel. Synchrotron-based techniques can identify the solid-phase chemical forms of nickel with minimal sample treatment. Element concentrations are known to vary among soil particle sizes in serpentine soils. Sonication is a useful method to physically disperse sand, silt and clay particles in soils. Synchrotron-based techniques and sonication were employed to identify nickel species in discrete particle size fractions in several serpentine (ultramafic) topsoils to better understand solid-phase nickel geochemistry. Nickel commonly resided in primary serpentine parent material such as layered-phyllosilicate and chain-inosilicate minerals and was associated with iron oxides. In the clay fractions, nickel was associated with iron oxides and primary serpentine minerals, such as lizardite. Linear combination fitting (LCF) was used to characterize nickel species. Total metal concentration did not correlate with nickel speciation and is not an indicator of the major nickel species in the soil. Differences in soil texture were related to different nickel speciation for several particle size fractionated samples. A discussion on LCF illustrates the importance of choosing standards based not only on statistical methods such as Target Transformation but also on sample mineralogy and particle size. Results from the F-test (Hamilton test), which is an underutilized tool in the literature for LCF in soils, highlight its usefulness in determining the appropriate number of standards for LCF. 
EXAFS shell fitting illustrates that destructive interference commonly found for light and heavy elements in layered double hydroxides and in phyllosilicates also can occur in inosilicate minerals, causing similar structural features and leading to false positive results in
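
    The core of linear combination fitting is a non-negative least-squares decomposition of a measured spectrum into standard spectra; a minimal sketch using `scipy.optimize.nnls` (the synthetic Gaussian "spectra" and standard names here are illustrative assumptions, not data from the study):

```python
import numpy as np
from scipy.optimize import nnls

def lcf(spectrum, standards):
    """Non-negative least-squares fit of a spectrum to a dict of standard spectra.
    Returns fractional weights (normalized to sum to 1) and the fit residual norm."""
    names = list(standards)
    A = np.column_stack([standards[n] for n in names])
    w, resid = nnls(A, spectrum)
    total = w.sum()
    return {n: wi / total for n, wi in zip(names, w)}, resid

# Synthetic example: a "measured" spectrum built as 70% of one standard, 30% of another.
e = np.linspace(0.0, 10.0, 200)
stds = {"lizardite_Ni": np.exp(-(e - 4.0) ** 2), "Fe_oxide_Ni": np.exp(-(e - 6.0) ** 2)}
measured = 0.7 * stds["lizardite_Ni"] + 0.3 * stds["Fe_oxide_Ni"]
weights, resid = lcf(measured, stds)
```

    In practice the candidate standards would come from the sample's mineralogy and particle size, and an F-test on the residuals decides how many standards the fit actually supports, as the abstract notes.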

  19. New technique using [125I]labeled rose bengal for the quantification in blood samples of pipecuronium bromide, a muscle relaxant drug

    International Nuclear Information System (INIS)

    Schopfer, C.; Benakis, A.; Pittet, J.-F.; Tassonyi, E.

    1991-01-01

    A new technique involving the use of [125I]-labeled rose bengal for the quantification of pipecuronium bromide (a muscle relaxant drug) is presented. This technique, which is based on the ability of rose bengal to react with pipecuronium and form a complex which can be extracted into an organic solvent, involves two steps: the purification and labeling of rose bengal with 125I, and the quantification of pipecuronium. The specific activity of the compound (106 μCi/mg) allows for the quantification of pipecuronium in biological samples at concentrations as low as 5 ng/ml. (author)

  20. Sampling and sample preparation development for analytical and on-line measurement techniques of process liquids; Naeytteenoton ja kaesittelyn kehittaeminen prosessinesteiden analytiikan ja on-line mittaustekniikan tarpeisiin - MPKT 11

    Energy Technology Data Exchange (ETDEWEB)

    Karttunen, K. [Oulu Univ. (Finland)

    1998-12-31

    The main goal of the research project is to develop sampling and sample handling methods and techniques for the pulp and paper industry, to be used for analysis and on-line measurement purposes. The research focuses especially on the development of the classification and separation methods and techniques needed for liquid and colloidal substances, as well as for ion analysis. (orig.)

  1. Sampling and sample preparation development for analytical and on-line measurement techniques of process liquids; Naeytteenoton ja kaesittelyn kehittaeminen prosessinesteiden analytiikan ja on-line mittaustekniikan tarpeisiin - MPKT 11

    Energy Technology Data Exchange (ETDEWEB)

    Karttunen, K [Oulu Univ. (Finland)

    1999-12-31

    The main goal of the research project is to develop sampling and sample handling methods and techniques for the pulp and paper industry, to be used for analysis and on-line measurement purposes. The research focuses especially on the development of the classification and separation methods and techniques needed for liquid and colloidal substances, as well as for ion analysis. (orig.)

  2. Comparison of laser fluorimetry, high resolution gamma-ray spectrometry and neutron activation analysis techniques for determination of uranium content in soil samples

    International Nuclear Information System (INIS)

    Ghods, A.; Asgharizadeh, F.; Salimi, B.; Abbasi, A.

    2004-01-01

    Much concern is given nowadays to the exposure of the world population to natural radiation, especially to uranium, since 57% of that exposure is due to radon-222, a member of the uranium decay series. Most of the methods used for determining uranium in low concentrations require either tedious separation and preconcentration or access to special instrumentation for detection of uranium at this low level. This study compares three techniques and methods for uranium analysis among different soil samples with variable uranium contents. Two of these techniques, neutron activation analysis and high resolution gamma-ray spectrometry, are non-destructive, while the other, laser fluorimetry, relies on chemical extraction of uranium. Standard materials were also analyzed to control the quality and accuracy of the work. Although the techniques have quite different detection limits, the results obtained by high resolution gamma-ray spectrometry are based on the assumption of secular equilibrium between uranium and its daughters, which causes deviations whenever this condition is not met. For samples with a reasonable uranium content, neutron activation analysis would be a rapid and reliable technique, while for low uranium content laser fluorimetry would be the most appropriate and accurate technique

  3. Shrinkage-based diagonal Hotelling’s tests for high-dimensional small sample size data

    KAUST Repository

    Dong, Kai

    2015-09-16

    DNA sequencing techniques bring novel tools and also statistical challenges to genetic research. In addition to detecting differentially expressed genes, testing the significance of gene sets or pathway analysis has been recognized as an equally important problem. Owing to the “large p, small n” paradigm, the traditional Hotelling’s T² test suffers from the singularity problem and therefore is not valid in this setting. In this paper, we propose a shrinkage-based diagonal Hotelling’s test for both one-sample and two-sample cases. We also suggest several different ways to derive the approximate null distribution under different scenarios of p and n for our proposed shrinkage-based test. Simulation studies show that the proposed method performs comparably to existing competitors when n is moderate or large, but it is better when n is small. In addition, we analyze four gene expression data sets and they demonstrate the advantage of our proposed shrinkage-based diagonal Hotelling’s test.
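
    The diagonal variant avoids the singularity problem by replacing the full sample covariance with its diagonal, so nothing needs inverting when p > n; a minimal one-sample sketch, shrinking each feature's variance toward the median variance as an illustrative stand-in for the paper's shrinkage estimator (the λ weight and the median target are assumptions, not the authors' formulas):

```python
import numpy as np

def diagonal_hotelling_one_sample(X, mu0, lam=0.5):
    """One-sample diagonal Hotelling-type statistic for an (n, p) data matrix X.
    Variances are shrunk as var_shrunk = (1 - lam) * var + lam * median(var)."""
    n, p = X.shape
    xbar = X.mean(axis=0)
    s2 = X.var(axis=0, ddof=1)                        # per-feature sample variances
    s2_shrunk = (1 - lam) * s2 + lam * np.median(s2)  # keeps tiny variances off the denominator
    return n * np.sum((xbar - mu0) ** 2 / s2_shrunk)

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 200))           # n = 10 samples, p = 200 features (p >> n)
t_null = diagonal_hotelling_one_sample(X, np.zeros(200))
t_shift = diagonal_hotelling_one_sample(X + 1.0, np.zeros(200))  # a shifted mean inflates the statistic
```

    The full T² statistic would need the inverse of a 200×200 covariance estimated from 10 samples, which is singular; the diagonal statistic above is always well defined.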

  4. Shrinkage-based diagonal Hotelling’s tests for high-dimensional small sample size data

    KAUST Repository

    Dong, Kai; Pang, Herbert; Tong, Tiejun; Genton, Marc G.

    2015-01-01

    DNA sequencing techniques bring novel tools and also statistical challenges to genetic research. In addition to detecting differentially expressed genes, testing the significance of gene sets or pathway analysis has been recognized as an equally important problem. Owing to the “large p, small n” paradigm, the traditional Hotelling’s T² test suffers from the singularity problem and therefore is not valid in this setting. In this paper, we propose a shrinkage-based diagonal Hotelling’s test for both one-sample and two-sample cases. We also suggest several different ways to derive the approximate null distribution under different scenarios of p and n for our proposed shrinkage-based test. Simulation studies show that the proposed method performs comparably to existing competitors when n is moderate or large, but it is better when n is small. In addition, we analyze four gene expression data sets and they demonstrate the advantage of our proposed shrinkage-based diagonal Hotelling’s test.

  5. Applications of synchrotron-based micro-imaging techniques for the analysis of Cultural Heritage materials

    International Nuclear Information System (INIS)

    Cotte, Marine; Chilida, Javier; Walter, Philippe; Taniguchi, Yoko; Susini, Jean

    2009-01-01

    The analysis of Cultural Heritage objects is often technically challenging. When analyzing micro-fragments, the amount of matter is usually very tiny, hence requiring sensitive techniques. These samples, in particular painting fragments, may present multi-layered structures with layer thicknesses of ∼10 μm. This favors micro-imaging techniques with a good lateral resolution (about one micrometer), which allow each layer to be studied separately. Besides, samples are usually very complex in terms of chemistry, as they are made of mineral and organic matter, amorphous and crystallized phases, major and minor elements. Accordingly, a multi-modal approach is generally essential to resolve the chemical complexity of such hybrid materials. Different examples will be given to illustrate the various possibilities of synchrotron-based micro-imaging techniques, such as micro X-ray diffraction, micro X-ray fluorescence, micro X-ray absorption spectroscopy and micro FTIR spectroscopy. Focus will be made on paintings, but the whole range of museum objects (from soft matter like paper or wood to hard matter like metal and glass) will also be considered.

  6. Estimation of trace levels of plutonium in urine samples by fission track technique

    International Nuclear Information System (INIS)

    Sawant, P.D.; Prabhu, S.; Pendharkar, K.A.; Kalsi, P.C.

    2009-01-01

    Individual monitoring of radiation workers handling Pu in various nuclear installations requires the detection of trace levels of plutonium in bioassay samples. It is necessary to develop methods that can detect urinary excretion of Pu in the fraction-of-a-mBq range. Therefore, a sensitive method based on fission track analysis has been developed for the measurement of trace levels of Pu in bioassay samples. In this technique, chemically separated plutonium from the sample and a Pu standard were electrodeposited on planchettes, covered with Lexan solid state nuclear track detector (SSNTD) films and irradiated with thermal neutrons in the APSARA reactor of the Bhabha Atomic Research Centre, India. The fission track densities in the Lexan films of the sample and the standard were used to calculate the amount of Pu in the sample. The minimum amount of Pu that can be analyzed by this method using doubly distilled electronic grade (E.G.) reagents is about 12 μBq/L. (author)
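
    The quantification step reduces to comparing track densities against the co-irradiated standard: both see the same neutron fluence, so the induced track density scales linearly with the amount of Pu. A minimal sketch of that ratio, with all numbers illustrative (the real procedure also corrects for chemical recovery and counting geometry, which are omitted here):

```python
def pu_from_tracks(tracks_sample: float, tracks_standard: float, pu_standard_ubq: float) -> float:
    """Estimate Pu activity in a sample from fission track counts.
    Sample and standard are irradiated together, so activity scales with track density."""
    if tracks_standard <= 0:
        raise ValueError("standard must produce tracks")
    return pu_standard_ubq * tracks_sample / tracks_standard

# Illustrative: a sample shows 150 tracks vs 1200 tracks for a 100 uBq standard.
estimate = pu_from_tracks(150, 1200, 100.0)  # → 12.5 uBq
```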

  7. Fabrication Techniques of Stretchable and Cloth Electroadhesion Samples for Implementation on Devices with Space Application

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of this study is to determine materials and fabrication techniques for efficient space-rated electroadhesion (EA) samples. Liquid metals, including...

  8. Sampling free energy surfaces as slices by combining umbrella sampling and metadynamics.

    Science.gov (United States)

    Awasthi, Shalini; Kapil, Venkat; Nair, Nisanth N

    2016-06-15

    Metadynamics (MTD) is a very powerful technique to sample high-dimensional free energy landscapes, and due to its self-guiding property, the method has been successful in studying complex reactions and conformational changes. MTD sampling is based on filling the free energy basins by biasing potentials and thus for cases with flat, broad, and unbound free energy wells, the computational time to sample them becomes very large. To alleviate this problem, we combine the standard Umbrella Sampling (US) technique with MTD to sample orthogonal collective variables (CVs) in a simultaneous way. Within this scheme, we construct the equilibrium distribution of CVs from biased distributions obtained from independent MTD simulations with umbrella potentials. Reweighting is carried out by a procedure that combines US reweighting and Tiwary-Parrinello MTD reweighting within the Weighted Histogram Analysis Method (WHAM). The approach is ideal for a controlled sampling of a CV in a MTD simulation, making it computationally efficient in sampling flat, broad, and unbound free energy surfaces. This technique also allows for a distributed sampling of a high-dimensional free energy surface, further increasing the computational efficiency in sampling. We demonstrate the application of this technique in sampling high-dimensional surface for various chemical reactions using ab initio and QM/MM hybrid molecular dynamics simulations. Further, to carry out MTD bias reweighting for computing forward reaction barriers in ab initio or QM/MM simulations, we propose a computationally affordable approach that does not require recrossing trajectories. © 2016 Wiley Periodicals, Inc.
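
    The WHAM step that stitches the umbrella windows into one equilibrium distribution is standard; a minimal self-consistent WHAM iteration on a gridded 1D collective variable (the harmonic biases, grid, and toy free-energy surface are illustrative assumptions, and the paper's combined US+MTD scheme adds a metadynamics reweighting term not shown here):

```python
import numpy as np

def wham(counts, bias, n_iter=5000):
    """Weighted Histogram Analysis Method on a gridded 1D collective variable.
    counts[i, x]: histogram of umbrella window i; bias[i, x]: beta * w_i(x).
    Returns the unbiased, normalized probability P(x)."""
    N = counts.sum(axis=1)                     # total samples in each window
    f = np.zeros(len(N))                       # window free-energy shifts (kT units)
    for _ in range(n_iter):
        denom = (N[:, None] * np.exp(f[:, None] - bias)).sum(axis=0)
        P = counts.sum(axis=0) / denom         # WHAM estimate of the unbiased density
        f = -np.log((np.exp(-bias) * P).sum(axis=1))  # self-consistent shift update
    return P / P.sum()

# Toy check: three harmonic umbrellas over a known free-energy surface beta*F(x) = x^2.
x = np.linspace(-2.0, 2.0, 81)
p0 = np.exp(-x ** 2)
p0 /= p0.sum()
k, centers = 5.0, [-1.0, 0.0, 1.0]
bias = np.array([0.5 * k * (x - c) ** 2 for c in centers])
# Idealized noise-free biased histograms (1e4 samples per window).
counts = np.array([1e4 * p0 * np.exp(-b) / (p0 * np.exp(-b)).sum() for b in bias])
P = wham(counts, bias)                         # should recover p0
```

    With noise-free histograms the iteration converges back to the true distribution exactly; with real simulation data the same equations weight each window by its statistical content.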

  9. A review of analytical techniques for the determination of carbon-14 in environmental samples

    International Nuclear Information System (INIS)

    Milton, G.M.; Brown, R.M.

    1993-11-01

    This report contains a brief summary of analytical techniques commonly used for the determination of radiocarbon in a variety of environmental samples. Details of the applicable procedures developed and tested in the Environmental Research Branch at Chalk River Laboratories are appended

  10. Arsenic, Antimony, Chromium, and Thallium Speciation in Water and Sediment Samples with the LC-ICP-MS Technique

    Directory of Open Access Journals (Sweden)

    Magdalena Jabłońska-Czapla

    2015-01-01

    Chemical speciation is a very important subject in environmental protection, toxicology, and analytical chemistry, because the toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires more and more advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of those elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question of why speciation analytics is so important. The paper also provides numerous examples of hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis.

  11. Office-based narrow band imaging-guided flexible laryngoscopy tissue sampling: A cost-effectiveness analysis evaluating its impact on Taiwanese health insurance program.

    Science.gov (United States)

    Fang, Tuan-Jen; Li, Hsueh-Yu; Liao, Chun-Ta; Chiang, Hui-Chen; Chen, I-How

    2015-07-01

    Narrow band imaging (NBI)-guided flexible laryngoscopy tissue sampling for laryngopharyngeal lesions is a novel technique. Patients undergo the procedure in an office-based setting without being sedated, which is different from the conventional technique performed using direct laryngoscopy. Although the feasibility and effects of this procedure were established, its financial impact on the institution and the Taiwanese National Health Insurance program was not determined. This is a retrospective case-control study. From May 2010 to April 2011, 20 consecutive patients who underwent NBI flexible laryngoscopy tissue sampling were recruited. During the same period, another 20 age-, sex-, and lesion-matched cases were enrolled in the control group. The procedural courses and financial status were analyzed and compared between groups. The office-based NBI flexible laryngoscopy tissue sampling procedure took 27 minutes to complete, while 191 minutes were required for the conventional technique. Average reimbursement for each case was New Taiwan Dollar (NT$)1264 for patients undergoing office-based NBI flexible laryngoscopy tissue sampling, versus NT$10,913 for those undergoing conventional direct laryngoscopy in the operating room. The institution suffered a loss of at least NT$690 when performing NBI flexible laryngoscopy tissue sampling. Office-based NBI flexible laryngoscopy tissue sampling is a cost-saving procedure for patients and the Taiwanese National Health Insurance program, and it also saves procedure time. However, the net financial loss for the institution and physician would limit its popularization unless reimbursement patterns are changed. Copyright © 2013. Published by Elsevier B.V.

  12. Techniques for the detection of pathogenic Cryptococcus species in wood decay substrata and the evaluation of viability in stored samples

    Directory of Open Access Journals (Sweden)

    Christian Alvarez

    2013-02-01

    In this study, we evaluated several techniques for the detection of the yeast form of Cryptococcus in decaying wood and measured the viability of these fungi in environmental samples stored in the laboratory. Samples were collected from a tree known to be positive for Cryptococcus and were each inoculated on 10 Niger seed agar (NSA) plates. The conventional technique (CT) yielded a greater number of positive samples and indicated a higher fungal density [in colony forming units per gram of wood (CFU·g⁻¹)] compared to the humid swab technique (ST). However, the difference in positive and false negative results between the CT and ST was not significant. The threshold of detection for the CT was 0.05 × 10³ CFU·g⁻¹, while the threshold for the ST was greater than 0.1 × 10³ CFU·g⁻¹. No colonies were recovered using the dry swab technique. We also determined the viability of Cryptococcus in wood samples stored for 45 days at 25ºC using the CT and ST and found that the samples not only continued to yield a positive response, but also exhibited an increase in CFU·g⁻¹, suggesting that Cryptococcus is able to grow in stored environmental samples. The ST.1, in which samples collected with swabs were immediately plated on NSA medium, was more efficient and less laborious than either the CT or ST and required approximately 10 min to perform; however, additional studies are needed to validate this technique.

  13. A survey on OFDM channel estimation techniques based on denoising strategies

    Directory of Open Access Journals (Sweden)

    Pallaviram Sure

    2017-04-01

    Channel estimation forms the heart of any orthogonal frequency division multiplexing (OFDM) based wireless communication receiver. Frequency domain pilot aided channel estimation techniques are either least squares (LS) based or minimum mean square error (MMSE) based. LS based techniques are computationally less complex and, unlike MMSE ones, do not require a priori knowledge of channel statistics (KCS). However, the mean square error (MSE) performance of channel estimators incorporating MMSE based techniques is better than that obtained with LS based techniques. To enhance the MSE performance of LS based techniques, a variety of denoising strategies have been developed in the literature, which are applied to the LS estimated channel impulse response (CIR). The advantage of denoising-threshold based LS techniques is that they do not require KCS but still render near optimal MMSE performance, similar to MMSE based techniques. In this paper, a detailed survey of various existing denoising strategies, with a comparative discussion of these strategies, is presented.
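
    The denoising idea the survey covers can be sketched in a few lines: form the LS estimate at the pilot subcarriers, transform to the time-domain CIR, zero the taps below a noise-dependent threshold, and transform back. The channel length, SNR, all-pilot assumption, and the 3-sigma threshold rule below are illustrative assumptions, not taken from any particular surveyed scheme:

```python
import numpy as np

rng = np.random.default_rng(1)
N, L, snr_db = 64, 4, 15                       # subcarriers, channel taps, pilot SNR
h = (rng.normal(size=L) + 1j * rng.normal(size=L)) / np.sqrt(2 * L)
H = np.fft.fft(h, N)                           # true channel frequency response

# LS estimate at (all-pilot) subcarriers: H_ls = Y / X = H + noise.
sigma = np.sqrt(10 ** (-snr_db / 10))
H_ls = H + sigma / np.sqrt(2) * (rng.normal(size=N) + 1j * rng.normal(size=N))

# Denoising: the noise spreads evenly over all N CIR taps, while the channel
# energy sits in the first few, so small taps can be zeroed without KCS.
h_ls = np.fft.ifft(H_ls)
thresh = 3 * sigma / np.sqrt(N)                # ~3 sigma of the per-tap noise (assumed rule)
h_den = np.where(np.abs(h_ls) > thresh, h_ls, 0)
H_den = np.fft.fft(h_den)

mse_ls = np.mean(np.abs(H_ls - H) ** 2)
mse_den = np.mean(np.abs(H_den - H) ** 2)      # denoised MSE should be clearly lower
```

    Zeroing the noise-only taps discards roughly (N - L)/N of the noise power, which is the mechanism behind the near-MMSE performance claimed for threshold-based LS schemes.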

  14. Optical transmission testing based on asynchronous sampling techniques: images analysis containing chromatic dispersion using convolutional neural network

    Science.gov (United States)

    Mrozek, T.; Perlicki, K.; Tajmajer, T.; Wasilewski, P.

    2017-08-01

    The article presents an image analysis method, based on the asynchronous delay tap sampling (ADTS) technique, which is used for simultaneous monitoring of various impairments occurring in the physical layer of an optical network. The ADTS method enables the visualization of the optical signal in the form of characteristics (so-called phase portraits) that change their shape under the influence of impairments such as chromatic dispersion, polarization mode dispersion and ASE noise. Using this method, a simulation model was built with OptSim 4.0. After the simulation study, data were obtained in the form of images that were further analyzed using a convolutional neural network. The main goal of the study was to train a convolutional neural network to recognize the selected impairment (distortion), then to test its accuracy and estimate the impairment for a selected set of test images. The input data consisted of processed binary images in the form of two-dimensional matrices indexed by pixel position. This article focuses only on the analysis of images containing chromatic dispersion.

  15. Clinical application of microsampling versus conventional sampling techniques in the quantitative bioanalysis of antibiotics: a systematic review.

    Science.gov (United States)

    Guerra Valero, Yarmarly C; Wallis, Steven C; Lipman, Jeffrey; Stove, Christophe; Roberts, Jason A; Parker, Suzanne L

    2018-03-01

    Conventional sampling techniques for clinical pharmacokinetic studies often require the removal of large blood volumes from patients. This can result in a physiological or emotional burden, particularly for neonates or pediatric patients. Antibiotic pharmacokinetic studies are typically performed on healthy adults or general ward patients. These may not account for alterations to a patient's pathophysiology and can lead to suboptimal treatment. Microsampling offers an important opportunity for clinical pharmacokinetic studies in vulnerable patient populations, where smaller sample volumes can be collected. This systematic review provides a description of currently available microsampling techniques and an overview of studies reporting the quantitation and validation of antibiotics using microsampling. A comparison of microsampling to conventional sampling in clinical studies is included.

  16. Laser-based direct-write techniques for cell printing

    Energy Technology Data Exchange (ETDEWEB)

    Schiele, Nathan R; Corr, David T [Biomedical Engineering Department, Rensselaer Polytechnic Institute, Troy, NY (United States); Huang Yong [Department of Mechanical Engineering, Clemson University, Clemson, SC (United States); Raof, Nurazhani Abdul; Xie Yubing [College of Nanoscale Science and Engineering, University at Albany, SUNY, Albany, NY (United States); Chrisey, Douglas B, E-mail: schien@rpi.ed, E-mail: chrisd@rpi.ed [Material Science and Engineering Department, Rensselaer Polytechnic Institute, Troy, NY (United States)

    2010-09-15

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. (topical review)

  17. Laser-based direct-write techniques for cell printing

    International Nuclear Information System (INIS)

    Schiele, Nathan R; Corr, David T; Huang Yong; Raof, Nurazhani Abdul; Xie Yubing; Chrisey, Douglas B

    2010-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. (topical review)

  18. Detection of sunn pest-damaged wheat samples using visible/near-infrared spectroscopy based on pattern recognition.

    Science.gov (United States)

    Basati, Zahra; Jamshidi, Bahareh; Rasekh, Mansour; Abbaspour-Gilandeh, Yousef

    2018-05-30

    The presence of sunn pest-damaged grains in a wheat mass reduces the quality of flour and bread produced from it. Therefore, it is essential to assess the quality of the samples at wheat collection and storage centers and at flour mills. In this research, the capability of visible/near-infrared (Vis/NIR) spectroscopy combined with pattern recognition methods was investigated for discriminating wheat samples with different percentages of sunn pest-damaged grains. To this end, various samples belonging to five classes (healthy and 5%, 10%, 15% and 20% unhealthy) were analyzed using Vis/NIR spectroscopy (wavelength range of 350-1000 nm) with both supervised and unsupervised pattern recognition methods. Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were used as the unsupervised techniques, and soft independent modeling of class analogies (SIMCA) and partial least squares-discriminant analysis (PLS-DA) as the supervised methods. The results showed that Vis/NIR spectra of healthy samples were correctly clustered using both PCA and HCA. Due to the high overlap between the four unhealthy classes (5%, 10%, 15% and 20%), it was not possible to discriminate all the unhealthy samples into individual classes. However, when considering only the two main categories of healthy and unhealthy, an acceptable degree of separation between the classes was obtained after classification with the supervised pattern recognition methods SIMCA and PLS-DA. SIMCA based on PCA modeling correctly classified samples into the two classes of healthy and unhealthy with a classification accuracy of 100%. Moreover, the wavelengths of 839 nm, 918 nm and 995 nm had greater power than other wavelengths for discriminating the healthy and unhealthy classes. It was also concluded that PLS-DA provides excellent classification of healthy and unhealthy samples (R² = 0.973 and RMSECV = 0.057). Therefore, Vis/NIR spectroscopy based on pattern recognition techniques can serve as a rapid, non-destructive tool for assessing sunn pest damage in wheat.
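A minimal numeric sketch of this spectra-then-classify workflow: synthetic "spectra" with a small absorbance difference around the three reported wavelengths, PCA computed via SVD, and a nearest-centroid rule as a crude stand-in for SIMCA/PLS-DA (all data and parameters below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy Vis/NIR spectra: 40 "healthy" and 40 "unhealthy" samples, 200 wavelengths.
# Unhealthy spectra get a small absorbance shift around three bands near the
# discriminating wavelengths reported in the study.
wl = np.linspace(350, 1000, 200)
base = np.exp(-((wl - 700) / 180) ** 2)
shift = 0.08 * (np.exp(-((wl - 839) / 15) ** 2)
                + np.exp(-((wl - 918) / 15) ** 2)
                + np.exp(-((wl - 995) / 15) ** 2))
healthy = base + 0.01 * rng.standard_normal((40, 200))
unhealthy = base + shift + 0.01 * rng.standard_normal((40, 200))
Xtr = np.vstack([healthy[:30], unhealthy[:30]])
ytr = np.array([0] * 30 + [1] * 30)
Xte = np.vstack([healthy[30:], unhealthy[30:]])
yte = np.array([0] * 10 + [1] * 10)

# PCA via SVD on the mean-centered training spectra
mu = Xtr.mean(axis=0)
U, S, Vt = np.linalg.svd(Xtr - mu, full_matrices=False)
scores_tr = (Xtr - mu) @ Vt[:3].T       # first 3 principal components
scores_te = (Xte - mu) @ Vt[:3].T

# Nearest-centroid classification in PC space (stand-in for SIMCA/PLS-DA)
c0 = scores_tr[ytr == 0].mean(axis=0)
c1 = scores_tr[ytr == 1].mean(axis=0)
pred = (np.linalg.norm(scores_te - c1, axis=1)
        < np.linalg.norm(scores_te - c0, axis=1)).astype(int)
accuracy = (pred == yte).mean()
print(f"test accuracy: {accuracy:.2f}")
```

As in the study, the two-class (healthy vs unhealthy) problem is easy once the spectra are projected onto components that capture the band-level differences.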

  19. A comparative analysis of five chrome green pigments based on different spectroscopic techniques

    International Nuclear Information System (INIS)

    Desnica, V.; Furic, K.; Hochleitner, B.; Mantler, M.

    2003-01-01

    A detailed study of five chrome-based green pigments belonging to a large pigment collection at the Academy of Fine Arts in Vienna, Austria has been performed. The samples were analyzed and compared using two X-ray methods (X-ray fluorescence and X-ray diffraction) and two optical methods (Raman spectroscopy and infrared spectroscopy). The composition differences between the similarly denoted samples of the collection were determined, and significant differences in the sensitivity of the investigated methods to specific compounds were established. This discrepancy between the results obtained with the different techniques confirms the need for a combined use of the investigated methods

  20. Elemental analysis of brazing alloy samples by neutron activation technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Rofail, N.B.; Hassan, A.M.; El-Shershaby, A.; Walley El-Dine, N.

    1996-01-01

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 × 10¹¹ n/cm²/s in the reactor reflector, where the thermal to epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 × 10¹² n/cm²/s and a thermal to epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyperpure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab

  1. Elemental analysis of brazing alloy samples by neutron activation technique

    Energy Technology Data Exchange (ETDEWEB)

    Eissa, E A; Rofail, N B; Hassan, A M [Reactor and Neutron physics Department, Nuclear Research Centre, Atomic Energy Authority, Cairo (Egypt); El-Shershaby, A; Walley El-Dine, N [Physics Department, Faculty of Girls, Ain Shams University, Cairo (Egypt)

    1997-12-31

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 × 10¹¹ n/cm²/s in the reactor reflector, where the thermal to epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 × 10¹² n/cm²/s and a thermal to epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyperpure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab.

  2. Development and application of the analyzer-based imaging technique with hard synchrotron radiation; Developpement et application d'une technique d'imagerie par rayonnement synchrotron basee sur l'utilisation d'un cristal analyseur

    Energy Technology Data Exchange (ETDEWEB)

    Coan, P

    2006-07-15

    The objective of this thesis is twofold: on the one hand, to apply analyser-based X-ray phase contrast imaging to the study of cartilage, bone and bone implants using ESRF synchrotron radiation sources; on the other, to contribute to the theoretical and experimental development of phase contrast techniques. Several human samples have been studied in vitro using the analyser-based imaging (ABI) technique. Examination included projection and computed tomography imaging and 3-dimensional volume rendering of hip, big toe and ankle articular joints. X-ray ABI images have been critically compared with those obtained with conventional techniques, including radiography, computed tomography, ultrasound, magnetic resonance and histology, the latter taken as the gold standard. Results show that only ABI was able to visualize or correctly estimate the early pathological status of the cartilage. The status of bone ingrowth in sheep implants has also been examined in vitro: ABI images permitted correct distinction between good and incomplete bone healing. Pioneering in-vivo ABI experiments on guinea pigs were also successfully performed, confirming the possible use of the technique to follow up the progression of joint diseases, bone/metal ingrowth and the efficacy of drug treatments. As part of the development of phase contrast techniques, two objectives have been reached. First, it has been experimentally demonstrated for the first time that ABI and propagation-based imaging (PBI) can be combined to create images with original features (hybrid imaging, HI). Secondly, a new simplified set-up capable of producing images with properties similar to those obtained with the ABI technique or HI has been proposed and experimentally tested. Finally, both ABI and HI have been studied theoretically with an innovative, wave-based simulation program, which correctly reproduced the experimental results. (author)

  3. A multi-variate discrimination technique based on range-searching

    International Nuclear Information System (INIS)

    Carli, T.; Koblitz, B.

    2003-01-01

    We present a fast and transparent multi-variate event classification technique, called PDE-RS, which is based on sampling the signal and background densities in a multi-dimensional phase space using range-searching. The employed algorithm is presented in detail and its behaviour is studied with simple toy examples representing basic patterns of problems often encountered in High Energy Physics data analyses. In addition, an example relevant for the search for instanton-induced processes in deep-inelastic scattering at HERA is discussed. For all studied examples, the new method performs as well as artificial Neural Networks and furthermore has the advantage of requiring less computation time. This makes it possible to carefully select the best combination of observables which optimally separate the signal and background and for which the simulations describe the data best. Moreover, the systematic and statistical uncertainties can be easily evaluated. The method is therefore a powerful tool for finding a small number of signal events in the large data samples expected at future particle colliders
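The core of the density sampling described above can be sketched as a box count over stored Monte Carlo events. This toy version uses brute-force counting rather than the tree-based range search of the actual algorithm, and the event distributions are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 2-D phase space: signal Monte Carlo clustered at (1, 1), background at (0, 0)
signal_mc = rng.normal(loc=1.0, scale=0.5, size=(5000, 2))
background_mc = rng.normal(loc=0.0, scale=0.5, size=(5000, 2))

def pde_rs_discriminant(point, box=0.3):
    """Count MC events inside a box around `point` and return the
    local signal probability D = n_s / (n_s + n_b)."""
    count = lambda mc: int(np.all(np.abs(mc - point) < box, axis=1).sum())
    n_s, n_b = count(signal_mc), count(background_mc)
    return n_s / (n_s + n_b) if (n_s + n_b) else 0.5

d_sig = pde_rs_discriminant(np.array([1.0, 1.0]))
d_bkg = pde_rs_discriminant(np.array([0.0, 0.0]))
print(f"D near signal: {d_sig:.2f}, D near background: {d_bkg:.2f}")
```

A cut on D then separates signal-like from background-like events; the transparency of the method comes from D being a directly interpretable local density ratio.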

  4. Recent mass spectrometry-based techniques and considerations for disulfide bond characterization in proteins.

    Science.gov (United States)

    Lakbub, Jude C; Shipman, Joshua T; Desaire, Heather

    2018-04-01

    Disulfide bonds are important structural moieties of proteins: they ensure proper folding, provide stability, and ensure proper function. With the increasing use of proteins for biotherapeutics, particularly monoclonal antibodies, which are highly disulfide bonded, it is now important to confirm the correct disulfide bond connectivity and to verify the presence, or absence, of disulfide bond variants in the protein therapeutics. These studies help to ensure safety and efficacy. Hence, disulfide bonds are among the critical quality attributes of proteins that have to be monitored closely during the development of biotherapeutics. However, disulfide bond analysis is challenging because of the complexity of the biomolecules. Mass spectrometry (MS) has been the go-to analytical tool for the characterization of such complex biomolecules, and several methods have been reported to meet the challenging task of mapping disulfide bonds in proteins. In this review, we describe the relevant, recent MS-based techniques and provide important considerations needed for efficient disulfide bond analysis in proteins. The review focuses on methods for proper sample preparation, fragmentation techniques for disulfide bond analysis, recent disulfide bond mapping methods based on the fragmentation techniques, and automated algorithms designed for rapid analysis of disulfide bonds from liquid chromatography-MS/MS data. Researchers involved in method development for protein characterization can use the information herein to facilitate development of new MS-based methods for protein disulfide bond analysis. In addition, individuals characterizing biotherapeutics, especially by disulfide bond mapping in antibodies, can use this review to choose the best strategies for disulfide bond assignment of their biologic products. Graphical Abstract This review, describing characterization methods for disulfide bonds in proteins, focuses on three critical components: sample preparation, mass spectrometry fragmentation, and data analysis.

  5. TL and ESR based identification of gamma-irradiated frozen fish using different hydrolysis techniques

    International Nuclear Information System (INIS)

    Ahn, Jae-Jun; Akram, Kashif; Shahbaz, Hafiz Muhammad; Kwon, Joong-Ho

    2014-01-01

    Frozen fish fillets (walleye pollock and Japanese Spanish mackerel) were selected as samples for irradiation (0–10 kGy) detection trials using different hydrolysis methods. Photostimulated luminescence (PSL)-based screening analysis of the gamma-irradiated frozen fillets showed low sensitivity due to the limited silicate mineral content of the samples. The same limitation was found in thermoluminescence (TL) analysis of mineral samples isolated by the density separation method. However, acid (HCl) and alkali (KOH) hydrolysis methods were effective in recovering enough minerals to carry out TL analysis, which was reconfirmed through the normalization step by calculating the TL ratios (TL₁/TL₂). For improved electron spin resonance (ESR) analysis, alkali and enzyme (alcalase) hydrolysis methods were compared for separating minute bone fractions. The enzymatic method provided clearer radiation-specific hydroxyapatite radical signals than the alkaline method. Different hydrolysis methods could thus extend the application of TL and ESR techniques in identifying the irradiation history of frozen fish fillets. - Highlights: • Irradiation has the potential to improve the hygienic quality of raw and processed seafood. • Detection of irradiated food is important to enforce the applied regulations. • Different techniques were compared to separate silicate minerals from frozen fish. • Limitations were observed in TL analysis on minerals isolated by density separation. • Hydrolysis methods provided clearer identification using TL and ESR techniques

  6. Comparison of mobile and stationary spore-sampling techniques for estimating virulence frequencies in aerial barley powdery mildew populations

    DEFF Research Database (Denmark)

    Hovmøller, M.S.; Munk, L.; Østergård, Hanne

    1995-01-01

    Gene frequencies in samples of aerial populations of barley powdery mildew (Erysiphe graminis f.sp. hordei), which were collected in adjacent barley areas and in successive periods of time, were compared using mobile and stationary sampling techniques. Stationary samples were collected from trap ...

  7. Inference for multivariate regression model based on multiply imputed synthetic data generated via posterior predictive sampling

    Science.gov (United States)

    Moura, Ricardo; Sinha, Bimal; Coelho, Carlos A.

    2017-06-01

    The recent popularity of the use of synthetic data as a Statistical Disclosure Control technique has enabled the development of several methods of generating and analyzing such data, but these almost always rely on asymptotic distributions and are consequently not adequate for small-sample datasets. Thus, a likelihood-based exact inference procedure is derived for the matrix of regression coefficients of the multivariate regression model, for multiply imputed synthetic data generated via Posterior Predictive Sampling. Since it is based on exact distributions, this procedure may be used even on small-sample datasets. Simulation studies compare the results obtained from the proposed exact inferential procedure with those obtained from an adaptation of Reiter's combination rule to multiply imputed synthetic datasets, and an application to the 2000 Current Population Survey is discussed.
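A simplified sketch of generating multiply imputed synthetic data via posterior predictive sampling for a regression model (flat prior and a known-variance approximation for brevity; this does not reproduce the paper's exact inference procedure):

```python
import numpy as np

rng = np.random.default_rng(4)

# Original confidential data: y = X @ beta + noise
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.standard_normal(n)

# Approximate posterior of the coefficients (flat prior, plug-in variance)
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n - p)

def synthetic_copy():
    """One posterior predictive draw: sample beta, then new responses."""
    beta_draw = rng.multivariate_normal(beta_hat, sigma2 * XtX_inv)
    return X @ beta_draw + np.sqrt(sigma2) * rng.standard_normal(n)

m = 5  # number of multiply imputed synthetic datasets
synthetic = [synthetic_copy() for _ in range(m)]

# The analyst re-estimates beta on each synthetic copy and combines the estimates
betas = np.array([XtX_inv @ X.T @ ys for ys in synthetic])
print("combined estimate:", betas.mean(axis=0).round(2))
```

Only the synthetic responses would be released; the combined point estimate recovers the original-data fit, while valid interval estimation is exactly where combination rules (and the paper's exact procedure) come in.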

  8. Alpha Matting with KL-Divergence Based Sparse Sampling.

    Science.gov (United States)

    Karacan, Levent; Erdem, Aykut; Erdem, Erkut

    2017-06-22

    In this paper, we present a new sampling-based alpha matting approach for the accurate estimation of foreground and background layers of an image. Previous sampling-based methods typically rely on certain heuristics in collecting representative samples from known regions, and thus their performance deteriorates if the underlying assumptions are not satisfied. To alleviate this, we take an entirely new approach and formulate sampling as a sparse subset selection problem where we propose to pick a small set of candidate samples that best explains the unknown pixels. Moreover, we describe a new dissimilarity measure for comparing two samples which is based on the KL-divergence between the distributions of features extracted in the vicinity of the samples. The proposed framework is general and could easily be extended to video matting by additionally taking temporal information into account in the sampling process. Evaluation on standard benchmark datasets for image and video matting demonstrates that our approach provides more accurate results compared to the state-of-the-art methods.
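The KL-divergence dissimilarity between two samples' local feature distributions can be sketched with smoothed histograms. The one-dimensional "patch features" below are invented for illustration; the paper's actual feature set is not reproduced here:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-10):
    """D_KL(p || q) for two discrete distributions given as histograms."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

rng = np.random.default_rng(5)

# Feature histograms around three pixels: two from similar regions,
# one from a very different region (Laplace +1 smoothing avoids zero bins)
patch_a = np.histogram(rng.normal(0.30, 0.05, 500), bins=16, range=(0, 1))[0] + 1.0
patch_b = np.histogram(rng.normal(0.32, 0.05, 500), bins=16, range=(0, 1))[0] + 1.0
patch_c = np.histogram(rng.normal(0.80, 0.05, 500), bins=16, range=(0, 1))[0] + 1.0

d_similar = kl_divergence(patch_a, patch_b)
d_different = kl_divergence(patch_a, patch_c)
print(f"KL similar: {d_similar:.3f}, KL different: {d_different:.3f}")
```

Pairs drawn from similar regions yield a small divergence, so a KL-based measure ranks them as better-matched samples than pairs from unrelated regions.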

  9. Developing a hybrid dictionary-based bio-entity recognition technique

    Science.gov (United States)

    2015-01-01

    Background Bio-entity extraction is a pivotal component for information extraction from biomedical literature. The dictionary-based bio-entity extraction is the first generation of Named Entity Recognition (NER) techniques. Methods This paper presents a hybrid dictionary-based bio-entity extraction technique. The approach expands the bio-entity dictionary by combining different data sources and improves the recall rate through the shortest path edit distance algorithm. In addition, the proposed technique adopts text mining techniques in the merging stage of similar entities such as Part of Speech (POS) expansion, stemming, and the exploitation of the contextual cues to further improve the performance. Results The experimental results show that the proposed technique achieves the best or at least equivalent performance among compared techniques, GENIA, MESH, UMLS, and combinations of these three resources in F-measure. Conclusions The results imply that the performance of dictionary-based extraction techniques is largely influenced by information resources used to build the dictionary. In addition, the edit distance algorithm shows steady performance with three different dictionaries in precision whereas the context-only technique achieves a high-end performance with three different dictionaries in recall. PMID:26043907
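As an illustration of the edit-distance component, the sketch below uses the classic Levenshtein dynamic program (a simpler stand-in for the paper's shortest path edit distance) to match a candidate mention against hypothetical dictionary entries:

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via a single-row dynamic program."""
    m, n = len(a), len(b)
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,                          # deletion
                        dp[j - 1] + 1,                      # insertion
                        prev + (a[i - 1] != b[j - 1]))      # substitution
            prev = cur
    return dp[n]

# Candidate mention vs. (hypothetical) dictionary entries: pick the closest form
mention = "interleukin-2 receptor"
dictionary = ["interleukin 2 receptor", "interleukin-6", "IL-2R alpha chain"]
best = min(dictionary, key=lambda entry: edit_distance(mention, entry))
print(best, edit_distance(mention, best))
```

Tolerating a small edit distance lets surface variants (hyphenation, spacing, inflection) hit the dictionary, which is how such matching improves recall.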

  10. Developing a hybrid dictionary-based bio-entity recognition technique.

    Science.gov (United States)

    Song, Min; Yu, Hwanjo; Han, Wook-Shin

    2015-01-01

    Bio-entity extraction is a pivotal component for information extraction from biomedical literature. The dictionary-based bio-entity extraction is the first generation of Named Entity Recognition (NER) techniques. This paper presents a hybrid dictionary-based bio-entity extraction technique. The approach expands the bio-entity dictionary by combining different data sources and improves the recall rate through the shortest path edit distance algorithm. In addition, the proposed technique adopts text mining techniques in the merging stage of similar entities such as Part of Speech (POS) expansion, stemming, and the exploitation of the contextual cues to further improve the performance. The experimental results show that the proposed technique achieves the best or at least equivalent performance among compared techniques, GENIA, MESH, UMLS, and combinations of these three resources in F-measure. The results imply that the performance of dictionary-based extraction techniques is largely influenced by information resources used to build the dictionary. In addition, the edit distance algorithm shows steady performance with three different dictionaries in precision whereas the context-only technique achieves a high-end performance with three different dictionaries in recall.

  11. Effect of postmortem sampling technique on the clinical significance of autopsy blood cultures.

    Science.gov (United States)

    Hove, M; Pencil, S D

    1998-02-01

    Our objective was to investigate the value of postmortem autopsy blood cultures performed with an iodine-subclavian technique relative to the classical method of atrial heat searing and to antemortem blood cultures. The study consisted of a prospective autopsy series with each case serving as its own control relative to subsequent testing, and a retrospective survey of patients coming to autopsy who had both autopsy blood cultures and premortem blood cultures. A busy academic autopsy service (600 cases per year) at University of Texas Medical Branch Hospitals, Galveston, Texas, served as the setting for this work. The incidence of non-clinically relevant (false-positive) culture results was compared using different methods for collecting blood samples in a prospective series of 38 adult autopsy specimens. One hundred eleven adult autopsy specimens in which both postmortem and antemortem blood cultures were obtained were studied retrospectively. For both studies, positive culture results were scored as either clinically relevant or false positives based on analysis of the autopsy findings and the clinical summary. The rate of false-positive culture results obtained by the iodine-subclavian technique from blood drawn soon after death was statistically significantly lower (13%) than with the classical method of obtaining blood through the atrium after heat searing at the time of the autopsy (34%) in the same set of autopsy subjects. When autopsy results were compared with subjects' antemortem blood culture results, there was no significant difference in the rate of non-clinically relevant culture results in a paired retrospective series of antemortem blood cultures and postmortem blood cultures using the iodine-subclavian postmortem method (11.7% vs 13.5%). The results indicate that autopsy blood cultures obtained using the iodine-subclavian technique have reliability equivalent to that of antemortem blood cultures.

  12. Sample preparation techniques of biological material for isotope analysis

    International Nuclear Information System (INIS)

    Axmann, H.; Sebastianelli, A.; Arrillaga, J.L.

    1990-01-01

    Sample preparation is an essential step in all isotope-aided experiments, but it is often not given enough attention. The methods of sample preparation are very important for obtaining reliable and precise analytical data and for further interpretation of results. The size of a sample required for chemical analysis is usually very small (10 mg-1500 mg). On the other hand, the amount of harvested plant material from plots in a field experiment is often bulky (several kilograms), and the entire sample is too large for processing. In addition, while approaching maturity many crops show not only differences in physical consistency but also non-uniformity in ¹⁵N content among plant parts, requiring plant fractionation or separation into parts (vegetative and reproductive), e.g. shoots and spikes in the case of small grain cereals, shoots and pods in the case of grain legumes, and tops and roots or beets (including crown) in the case of sugar beet. In any case, the ultimate goal of these procedures is to obtain representative subsamples of the material harvested from greenhouse or field experiments for chemical analysis. Before harvesting an isotope-aided experiment, the method of sampling has to be selected. It should be based on the type of information required in relation to the objectives of the research and the availability of resources (staff, sample preparation equipment, analytical facilities, chemicals and supplies, etc.). 10 refs, 3 figs, 3 tabs

  13. Preparation of quality control samples for thyroid hormones T3 and T4 in radioimmunoassay techniques

    International Nuclear Information System (INIS)

    Ahmed, F.O.A.

    2006-03-01

    Today, radioimmunoassay (RIA) is one of the best techniques for quantitative analysis of very low concentrations of different substances. RIA is widely used in medical and research laboratories. To maintain high specificity and accuracy in RIA and other related techniques, quality controls must be introduced. In this dissertation, quality control samples for the thyroid hormones triiodothyronine (T3) and thyroxine (T4) were prepared using RIA techniques. Ready-made Chinese T4 and T3 RIA kits were used, and the IAEA statistical package was selected. (Author)

  14. Improved sample preparation and counting techniques for enhanced tritium measurement sensitivity

    Science.gov (United States)

    Moran, J.; Aalseth, C.; Bailey, V. L.; Mace, E. K.; Overman, C.; Seifert, A.; Wilcox Freeburg, E. D.

    2015-12-01

    Tritium (T) measurements offer insight into a wealth of environmental applications including hydrologic tracking, discerning ocean circulation patterns, and aging ice formations. However, the relatively short half-life of T (12.3 years) limits its effective age-dating range. Compounding this limitation is the decrease in atmospheric T content by over two orders of magnitude from the 1000-2000 TU measured in 1962, following the cessation of atmospheric weapons testing in the 1960's. We are developing sample preparation methods coupled to direct counting of T via ultra-low background proportional counters which, when combined, offer improved T measurement sensitivity (~4.5 mmoles of H2 equivalent) and will help expand the application of T age dating, despite the limitations above, to the smaller sample sizes linked to persistent environmental questions. For instance, this approach can be used to T-date ~2.2 mmoles of CH4 collected from sample-limited systems including microbial communities, soils, or subsurface aquifers, and can be combined with radiocarbon dating to distinguish the methane's formation age from the C age in a system. This approach can also expand investigations into soil organic C, where the improved sensitivity will permit resolution of soil C into more descriptive fractions and provide direct assessments of the stability of specific classes of organic matter in soil environments. We are employing a multiple-step sample preparation system whereby organic samples are first combusted, with the resulting CO2 and H2O being used as a feedstock to synthesize CH4. This CH4 is mixed with Ar and loaded directly into an ultra-low background proportional counter for measurement of T β decay in a shallow underground laboratory. Analysis of water samples requires only the addition of a geologic CO2 feedstock with the sample for methane synthesis. The chemical nature of the preparation techniques enables high sample throughput, with only the final measurement requiring T decay and total sample analysis time ranging from 2-5 weeks
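The age-dating arithmetic behind these measurements follows directly from the decay law with the 12.3-year half-life quoted above; a minimal sketch with made-up activities:

```python
import math

T_HALF = 12.3                     # tritium half-life in years (from the abstract)
LAMBDA = math.log(2) / T_HALF     # decay constant

def tritium_age(activity_now, activity_initial):
    """Apparent age from the decay law A = A0 * exp(-lambda * t)."""
    return math.log(activity_initial / activity_now) / LAMBDA

# Example: water that has decayed to one quarter of its initial tritium
# content is two half-lives old
age = tritium_age(activity_now=25.0, activity_initial=100.0)
print(f"{age:.1f} years")   # two half-lives
```

The same relation shows why low modern T levels squeeze the usable dating window: a small A relative to counting background quickly dominates the uncertainty in log(A0/A).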

  15. Quantifying ruminal nitrogen metabolism using the omasal sampling technique in cattle--a meta-analysis.

    Science.gov (United States)

    Broderick, G A; Huhtanen, P; Ahvenjärvi, S; Reynal, S M; Shingfield, K J

    2010-07-01

    Mixed model analysis of data from 32 studies (122 diets) was used to evaluate the precision and accuracy of the omasal sampling technique for quantifying ruminal-N metabolism and to assess the relationships between nonammonia-N flow at the omasal canal and milk protein yield. Data were derived from experiments in cattle fed North American diets (n=36) based on alfalfa silage, corn silage, and corn grain and Northern European diets (n=86) composed of grass silage and barley-based concentrates. In all studies, digesta flow was quantified using a triple-marker approach. Linear regressions were used to predict microbial-N flow to the omasum from intake of dry matter (DM), organic matter (OM), or total digestible nutrients. Efficiency of microbial-N synthesis increased with DM intake and there were trends for increased efficiency with elevated dietary concentrations of crude protein (CP) and rumen-degraded protein (RDP) but these effects were small. Regression of omasal rumen-undegraded protein (RUP) flow on CP intake indicated that an average 32% of dietary CP escaped and 68% was degraded in the rumen. The slope from regression of observed omasal flows of RUP on flows predicted by the National Research Council (2001) model indicated that NRC predicted greater RUP supply. Measured microbial-N flow was, on average, 26% greater than that predicted by the NRC model. Zero ruminal N-balance (omasal CP flow = CP intake) was obtained at dietary CP and RDP concentrations of 147 and 106 g/kg of DM, corresponding to ruminal ammonia-N and milk urea N concentrations of 7.1 and 8.3 mg/100 mL, respectively. Milk protein yield was positively related to the efficiency of microbial-N synthesis and measured RUP concentration. Improved efficiency of microbial-N synthesis and reduced ruminal CP degradability were positively associated with efficiency of capture of dietary N as milk N. In conclusion, the results of this study indicate that the omasal sampling technique yields valuable estimates
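The reported escape fraction is the slope from regressing omasal RUP flow on CP intake. A minimal ordinary-least-squares sketch (the numbers are illustrative only, not the study's data):

```python
def ols(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

# Illustrative numbers only: CP intake (kg/d) vs. omasal RUP flow (kg/d)
# lying exactly on a line whose slope is the 32% escape fraction
cp_intake = [2.0, 2.5, 3.0, 3.5, 4.0]
rup_flow = [0.32 * x + 0.05 for x in cp_intake]
intercept, slope = ols(cp_intake, rup_flow)
```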

  16. Performance evaluation of an importance sampling technique in a Jackson network

    Science.gov (United States)

    Mahdipour, Ebrahim; Rahmani, Amir Masoud; Setayeshi, Saeed

    2014-03-01

    Importance sampling is a technique commonly used to speed up Monte Carlo simulation of rare events. However, little is known about the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. The article applies strict deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite-state Markov process. We estimate the probability of network blocking for various sets of parameters, as well as the probability of customers missing their deadlines for different loads and deadline values. Finally, we show that the probability of total population overflow may be affected by the various deadline values, service rates and arrival rates.
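The rare-event estimation idea can be sketched with a minimal example: a tilted-Gaussian importance sampler for a tail probability. This illustrates the general technique only, not the paper's Jackson-network estimator.

```python
import math
import random

def importance_sampling_tail(threshold, n=100_000, shift=None, seed=1):
    """Estimate P(Z > threshold) for Z ~ N(0, 1) by sampling from the
    shifted proposal N(shift, 1) and reweighting each hit by the
    likelihood ratio phi(z) / phi(z - shift)."""
    if shift is None:
        shift = threshold  # center the proposal on the rare region
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        z = rng.gauss(shift, 1.0)
        if z > threshold:
            # phi(z) / phi(z - shift) = exp(-shift*z + shift^2 / 2)
            total += math.exp(-shift * z + 0.5 * shift * shift)
    return total / n

# Naive Monte Carlo would almost never hit {Z > 4}; the tilted sampler
# hits it roughly half the time. True value: 1 - Phi(4), about 3.17e-5.
est = importance_sampling_tail(4.0)
```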

  17. Using rule-based machine learning for candidate disease gene prioritization and sample classification of cancer gene expression data.

    Science.gov (United States)

    Glaab, Enrico; Bacardit, Jaume; Garibaldi, Jonathan M; Krasnogor, Natalio

    2012-01-01

    Microarray data analysis has been shown to provide an effective tool for studying cancer and genetic diseases. Although classical machine learning techniques have successfully been applied to find informative genes and to predict class labels for new samples, common restrictions of microarray analysis such as small sample sizes, a large attribute space and high noise levels still limit its scientific and clinical applications. Increasing the interpretability of prediction models while retaining a high accuracy would help to exploit the information content in microarray data more effectively. For this purpose, we evaluate our rule-based evolutionary machine learning systems, BioHEL and GAssist, on three public microarray cancer datasets, obtaining simple rule-based models for sample classification. A comparison with other benchmark microarray sample classifiers based on three diverse feature selection algorithms suggests that these evolutionary learning techniques can compete with state-of-the-art methods like support vector machines. The obtained models reach accuracies above 90% in two-level external cross-validation, with the added value of facilitating interpretation by using only combinations of simple if-then-else rules. As a further benefit, a literature mining analysis reveals that prioritizations of informative genes extracted from BioHEL's classification rule sets can outperform gene rankings obtained from a conventional ensemble feature selection in terms of the pointwise mutual information between relevant disease terms and the standardized names of top-ranked genes.
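A sketch of how a learned if-then-else rule set classifies a sample, first matching rule wins. The rules and gene names below are hypothetical illustrations, not output of BioHEL or GAssist.

```python
def classify(rules, sample, default="unknown"):
    """Apply an ordered rule list; each rule is (intervals, label) where
    intervals maps a gene name to an inclusive (low, high) expression range.
    The first rule whose conditions all hold decides the class."""
    for intervals, label in rules:
        if all(lo <= sample[gene] <= hi for gene, (lo, hi) in intervals.items()):
            return label
    return default

# Hypothetical two-rule model over made-up gene expression values
rules = [
    ({"GENE_A": (0.8, 2.0)}, "tumor"),
    ({"GENE_A": (0.0, 0.8), "GENE_B": (0.5, 1.5)}, "normal"),
]
```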

  18. Towards a plant-based technique to measure utilization of Karoo ...

    African Journals Online (AJOL)

    The sparseness of Karoo veld renders the destructive sampling of areas less efficient than clipping individual plants. However, sampling of whole plants and their separation into edible and inedible fractions is laborious and expensive. There is thus a need to develop suitable non-destructive techniques. Language: English.

  19. Respiratory motion sampling in 4DCT reconstruction for radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Chi Yuwei; Liang Jian; Qin Xu; Yan Di [Department of Radiation Oncology, Columbia University, New York, New York 10032 (United States); Department of Radiation Oncology, William Beaumont Hospital, Royal Oak, Michigan 48073 (United States)

    2012-04-15

    Purpose: Phase-based and amplitude-based sorting techniques are commonly used in four-dimensional CT (4DCT) reconstruction. However, the effect of these sorting techniques on 4D dose calculation has not been explored. In this study, the authors investigated a candidate 4DCT sorting technique by comparing its 4D dose calculation accuracy with that of phase-based and amplitude-based sorting techniques. Method: An optimization model was formed using the organ motion probability density function (PDF) in the 4D dose convolution. The objective function for optimization was defined as the maximum difference between the expected 4D dose in the organ of interest and the 4D dose calculated using a 4DCT sorted by a candidate sampling method. Sorting samples, as optimization variables, were selected on the respiratory motion PDF assessed during the CT scanning. Breathing curves obtained from patients' 4DCT scanning, as well as 3D dose distributions from treatment planning, were used in the study. Given the objective function, a residual error analysis was performed, and k-means clustering was found to be an effective sampling scheme that improves 4D dose calculation accuracy and is independent of the patient-specific dose distribution. Results: Patient data analysis demonstrated that k-means sampling was superior to the conventional phase-based and amplitude-based sorting and comparable to the optimal sampling results. For phase-based sorting, the residual error in 4D dose calculations may not be further reduced to an acceptable accuracy after a certain number of phases, while for amplitude-based sorting, k-means sampling, and the optimal sampling, the residual error in 4D dose calculations decreased rapidly as the number of 4DCT phases increased to 6. Conclusion: An innovative phase sorting method (the k-means method) is presented in this study. The method depends only on the tumor motion PDF. It could provide a way to refine phase sorting in 4DCT reconstruction and is effective
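The k-means sampling idea, choosing representative breathing amplitudes as cluster centers of the motion PDF, can be sketched with a plain 1-D k-means. The amplitude values are illustrative, not patient data.

```python
def kmeans_1d(values, k, iters=100):
    """Plain 1-D k-means: returns k sorted cluster centers, which would
    serve as the representative sampling points of the motion trace."""
    lo, hi = min(values), max(values)
    # spread initial centers evenly over the data range
    centers = [lo + (hi - lo) * (i + 0.5) / k for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # move each center to the mean of its cluster (keep empty clusters put)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Illustrative breathing amplitudes (arbitrary units) with three plateaus
amplitudes = [0.10, 0.12, 0.09, 0.50, 0.52, 0.48, 0.90, 0.88, 0.91]
centers = kmeans_1d(amplitudes, 3)
```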

  20. Feasibility of self-sampled dried blood spot and saliva samples sent by mail in a population-based study.

    Science.gov (United States)

    Sakhi, Amrit Kaur; Bastani, Nasser Ezzatkhah; Ellingjord-Dale, Merete; Gundersen, Thomas Erik; Blomhoff, Rune; Ursin, Giske

    2015-04-11

    In large epidemiological studies it is often challenging to obtain biological samples. Self-sampling by study participants using the dried blood spot (DBS) technique has been suggested to overcome this challenge. DBS is a type of biosampling where blood samples are obtained by a finger-prick lancet, blotted and dried on filter paper. However, the feasibility and efficacy of collecting DBS samples from study participants in large-scale epidemiological studies is not known. The aim of the present study was to test the feasibility and response rate of collecting self-sampled DBS and saliva samples in a population-based study of women above 50 years of age. We determined response proportions, number of phone calls to the study center with questions about sampling, and quality of the DBS. We recruited women through a study conducted within the Norwegian Breast Cancer Screening Program. Invitations, instructions and materials were sent to 4,597 women. The data collection took place over a 3-month period in the spring of 2009. Response proportions for the collection of DBS and saliva samples were 71.0% (3,263) and 70.9% (3,258), respectively. We received 312 phone calls (7% of the 4,597 women) with questions regarding sampling. Of the 3,263 individuals that returned DBS cards, 3,038 (93.1%) had been packaged and shipped according to instructions. A total of 3,032 DBS samples were sufficient for at least one biomarker analysis (i.e. 92.9% of DBS samples received by the laboratory). 2,418 (74.1%) of the DBS cards received by the laboratory were filled with blood according to the instructions (i.e. 10 completely filled spots with up to 7 punches per spot for up to 70 separate analyses). To assess the quality of the samples, we selected and measured two biomarkers (carotenoids and vitamin D). The biomarker levels were consistent with previous reports. Collecting self-sampled DBS and saliva samples through the postal services provides a low cost, effective and feasible

  1. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal

    2010-09-01

    The size of embedded software is increasing at a rapid pace. It is often challenging and time consuming to fit an amount of required software functionality within a given hardware resource budget. Code compression is a means to alleviate the problem by providing substantial savings in terms of code size. In this article we introduce a novel and efficient hardware-supported compression technique that is based on Huffman coding. Our technique reduces the size of the generated decoding table, which takes a large portion of the memory. It combines our previous techniques, the Instruction Splitting Technique and the Instruction Re-encoding Technique, into a new one called the Combined Compression Technique to improve the final compression ratio by taking advantage of both previous techniques. The Instruction Splitting Technique is instruction set architecture (ISA)-independent. It splits the instructions into portions of varying size (called patterns) before Huffman coding is applied. This technique improves the final compression ratio by more than 20% compared to other known schemes based on Huffman coding. The average compression ratios achieved using this technique are 48% and 50% for ARM and MIPS, respectively. The Instruction Re-encoding Technique is ISA-dependent. It investigates the benefits of re-encoding unused bits (we call them re-encodable bits) in the instruction format for a specific application to improve the compression ratio. Re-encoding those bits can reduce the size of decoding tables by up to 40%. Using this technique, we improve the final compression ratios in comparison to the first technique to 46% and 45% for ARM and MIPS, respectively (including all incurred overhead). The Combined Compression Technique improves the compression ratio to 45% and 42% for ARM and MIPS, respectively. In our compression technique, we have conducted evaluations using a representative set of applications and applied each technique to two major embedded processor architectures
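The table construction these schemes build on can be sketched with standard Huffman coding over symbol frequencies; this is the textbook algorithm, not the authors' pattern-splitting or re-encoding variants.

```python
import heapq

def huffman_codes(freqs):
    """Build a prefix code table {symbol: bitstring} via Huffman's algorithm.
    More frequent symbols receive shorter codes."""
    # Heap entries: (frequency, tiebreak counter, partial code table)
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)   # two least-frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        # merging prepends a branch bit to every code in each subtree
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

codes = huffman_codes({"a": 5, "b": 2, "c": 1, "d": 1})
```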

  2. Microfluidic Chip-based Nucleic Acid Testing using Gingival Crevicular Fluid as a New Technique for Detecting HIV-1 Infection

    Directory of Open Access Journals (Sweden)

    Alex Willyandre

    2013-05-01

    Transmission of HIV-1 by individuals in the window period, who test negative with conventional HIV-1 detection methods, poses serious problems for the community. Several diagnostic tools require specific laboratory equipment, precise timing of diagnosis, the presence of antibodies to HIV-1, or an invasive technique to obtain the sample for examination; they also suffer from long sample-processing times and limited accessibility in remote areas. Many attempts have been made to solve these problems and arrive at a new detection technique. This review aims to give information about current developments in techniques for the detection of HIV infection. Microfluidic chip-based nucleic acid testing has recently been introduced for the detection of HIV-1 infection. This review also covers the possible use of gingival crevicular fluid as a sample specimen that can be taken noninvasively from the individual. DOI: 10.14693/jdi.v18i2.63

  3. Accelerated Evaluation of Automated Vehicles Safety in Lane-Change Scenarios Based on Importance Sampling Techniques.

    Science.gov (United States)

    Zhao, Ding; Lam, Henry; Peng, Huei; Bao, Shan; LeBlanc, David J; Nobukawa, Kazutoshi; Pan, Christopher S

    2017-03-01

    Automated vehicles (AVs) must be thoroughly evaluated before their release and deployment. A widely used evaluation approach is the Naturalistic-Field Operational Test (N-FOT), which tests prototype vehicles directly on the public roads. Due to the low exposure to safety-critical scenarios, N-FOTs are time consuming and expensive to conduct. In this paper, we propose an accelerated evaluation approach for AVs. The results can be used to generate motions of the other primary vehicles to accelerate the verification of AVs in simulations and controlled experiments. Frontal collision due to unsafe cut-ins is the target crash type of this paper. Human-controlled vehicles making unsafe lane changes are modeled as the primary disturbance to AVs based on data collected by the University of Michigan Safety Pilot Model Deployment Program. The cut-in scenarios are generated based on skewed statistics of collected human driver behaviors, which generate risky testing scenarios while preserving the statistical information so that the safety benefits of AVs in nonaccelerated cases can be accurately estimated. The cross-entropy method is used to recursively search for the optimal skewing parameters. The frequencies of the occurrences of conflicts, crashes, and injuries are estimated for a modeled AV, and the achieved acceleration rate is around 2,000 to 20,000. In other words, in the accelerated simulations, driving for 1,000 miles will expose the AV to challenging scenarios that would take about 2 to 20 million miles of real-world driving to encounter. This technique thus has the potential to greatly reduce the development and validation time for AVs.
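The cross-entropy recursion for finding skewing parameters can be sketched on a toy rare-event problem, a Gaussian tail rather than lane-change statistics; every parameter below is illustrative.

```python
import math
import random

def ce_rare_event(threshold, n=2000, elite_frac=0.1, seed=11):
    """Cross-entropy search for the mean of a tilted N(mu, 1) proposal,
    followed by an importance-sampling estimate of P(Z > threshold)
    under the nominal N(0, 1) model."""
    rng = random.Random(seed)
    mu = 0.0
    for _ in range(20):  # CE iterations; typically converges in a handful
        samples = sorted(rng.gauss(mu, 1.0) for _ in range(n))
        # raise the level toward the target, capped at the true threshold
        gamma = min(threshold, samples[int((1 - elite_frac) * n)])
        elite = [z for z in samples if z >= gamma]
        # weight elites by the likelihood ratio back to the nominal N(0, 1)
        w = [math.exp(-mu * z + 0.5 * mu * mu) for z in elite]
        mu = sum(wi * zi for wi, zi in zip(w, elite)) / sum(w)
        if gamma >= threshold:
            break
    # final estimate under the tilted proposal found by the recursion
    m = 20 * n
    total = sum(math.exp(-mu * z + 0.5 * mu * mu)
                for z in (rng.gauss(mu, 1.0) for _ in range(m))
                if z > threshold)
    return total / m

est = ce_rare_event(4.0)  # true value 1 - Phi(4), about 3.17e-5
```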

  4. Derivative spectrum chromatographic method for the determination of trimethoprim in honey samples using an on-line solid-phase extraction technique.

    Science.gov (United States)

    Uchiyama, Kazuhisa; Kondo, Mari; Yokochi, Rika; Takeuchi, Yuri; Yamamoto, Atsushi; Inoue, Yoshinori

    2011-07-01

    A simple, selective and rapid analytical method for determination of trimethoprim (TMP) in honey samples was developed and validated. This method is based on an SPE technique followed by HPLC with photodiode array detection. After dilution and filtration, 500 μL aliquots of honey samples were directly injected into an on-line SPE-HPLC system. TMP was extracted on an RP SPE column, and separated on a hydrophilic interaction chromatography column during HPLC analysis. In the first detection step, the noise level of the photodiode array data was reduced with two-dimensional equalizer filtering, and then the smoothed data were subjected to derivative spectrum chromatography. On the second-derivative chromatogram at 254 nm, the limit of detection and the limit of quantification of TMP in a honey sample were 5 and 10 ng/g, respectively. The proposed method showed high accuracy (60-103%) with adequate sensitivity for TMP monitoring in honey samples. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. A review of hydroxyapatite-based coating techniques: Sol-gel and electrochemical depositions on biocompatible metals.

    Science.gov (United States)

    Asri, R I M; Harun, W S W; Hassan, M A; Ghani, S A C; Buyong, Z

    2016-04-01

    New promising techniques for depositing biocompatible hydroxyapatite-based coatings on biocompatible metal substrates for biomedical applications have continuously been exploited for more than two decades. Currently, various experimental deposition processes have been employed. In this review, the two most frequently used deposition processes will be discussed: sol-gel dip coating and electrochemical deposition. This study examines the surface morphology, chemical composition, mechanical performance and biological response of sol-gel dip coatings as well as electrochemical depositions for two different sample conditions, with and without coating. The review shows that sol-gel dip coating and electrochemical deposition were able to produce uniform, homogeneous and highly adherent biocompatible coatings of controlled thickness, even on complex shapes. It has been accepted that both coating techniques improve bone strength and the initial osseointegration rate. The main advantages and limitations of these hydroxyapatite-based coating techniques are presented. Furthermore, the most significant challenges and critical issues are also highlighted. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Synchrotron radiation based analytical techniques (XAS and XRF)

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2014-01-01

    A brief description of the principles of X-ray absorption spectroscopy (XAS) and X-ray fluorescence (XRF) techniques is given in this article with emphasis on the advantages of using synchrotron radiation-based instrumentation/beamline. XAS technique is described in more detail to emphasize the strength of the technique as a local structural probe. (author)

  7. Comparing Four Touch-Based Interaction Techniques for an Image-Based Audience Response System

    NARCIS (Netherlands)

    Jorritsma, Wiard; Prins, Jonatan T.; van Ooijen, Peter M. A.

    2015-01-01

    This study aimed to determine the most appropriate touch-based interaction technique for I2Vote, an image-based audience response system for radiology education in which users need to accurately mark a target on a medical image. Four plausible techniques were identified: land-on, take-off,

  8. Comparison of sampling methods for animal manure

    NARCIS (Netherlands)

    Derikx, P.J.L.; Ogink, N.W.M.; Hoeksma, P.

    1997-01-01

    Currently available and recently developed sampling methods for slurry and solid manure were tested for bias and reproducibility in the determination of total phosphorus and nitrogen content of samples. Sampling methods were based on techniques in which samples were taken either during loading from

  9. Determination of trace elements in plant samples using XRF, PIXE and ICP-OES techniques

    International Nuclear Information System (INIS)

    Ahmed, Hassan Elzain Hassan

    2014-07-01

    The purpose of this study is to determine trace element concentrations (Ca, Cu, Cr, K, Fe, Mn, Sr, and Zn) in some Sudanese wild plants, namely Ziziphus Abyssinica and Grewia Tenax. X-ray fluorescence (XRF), particle-induced X-ray emission (PIXE) and inductively coupled plasma-optical emission spectroscopy (ICP-OES) techniques were used for element determination. A series of plant standard reference materials was used to check the reliability of the different employed techniques as well as to estimate possible correction factors for the concentrations of some elements that deviated significantly from their actual concentrations. The results showed that XRF, PIXE and ICP-OES are equally competitive methods for measuring Ca, K, Fe, Sr and Zn. PIXE and ICP-OES, however, tend to be the more appropriate methods for Cu determination in plant samples, and for Mn these two techniques are advisable rather than the XRF method. On the other hand, ICP-OES seems to be the superior technique over PIXE and XRF for Cr and Ni determination in plant samples. The effect of geographical location on trace element concentrations in plants was examined through the determination of elements in samples of Grewia Tenax collected from different locations. Most of the measured elements showed similar concentrations, indicating no significant impact of location on element content. In addition, two plants from different genetic families, namely Ziziphus Spina Christi and Ziziphus Abyssinica, were collected from the same location and screened for their trace element content. No differences were found between the two plants for Ca, K, Cu, Fe, and Sr. However, significant variations were observed for Mn and Zn concentrations, implying the possibility of using those two elements for plant taxonomy purposes. (Author)

  10. Low-mass molecular dynamics simulation: A simple and generic technique to enhance configurational sampling

    Energy Technology Data Exchange (ETDEWEB)

    Pang, Yuan-Ping, E-mail: pang@mayo.edu

    2014-09-26

    Highlights: • Reducing atomic masses by 10-fold vastly improves sampling in MD simulations. • CLN025 folded in 4 of 10 × 0.5-μs MD simulations when masses were reduced by 10-fold. • CLN025 folded as early as 96.2 ns in 1 of the 4 simulations that captured folding. • CLN025 did not fold in 10 × 0.5-μs MD simulations when standard masses were used. • Low-mass MD simulation is a simple and generic sampling enhancement technique. - Abstract: CLN025 is one of the smallest fast-folding proteins. Until now it has not been reported that CLN025 can autonomously fold to its native conformation in a classical, all-atom, and isothermal–isobaric molecular dynamics (MD) simulation. This article reports the autonomous and repeated folding of CLN025 from a fully extended backbone conformation to its native conformation in explicit solvent in multiple 500-ns MD simulations at 277 K and 1 atm with the first folding event occurring as early as 66.1 ns. These simulations were accomplished by using AMBER forcefield derivatives with atomic masses reduced by 10-fold on Apple Mac Pros. By contrast, no folding event was observed when the simulations were repeated using the original AMBER forcefields of FF12SB and FF14SB. The results demonstrate that low-mass MD simulation is a simple and generic technique to enhance configurational sampling. This technique may propel autonomous folding of a wide range of miniature proteins in classical, all-atom, and isothermal–isobaric MD simulations performed on commodity computers—an important step forward in quantitative biology.
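The sampling-enhancement principle rests on vibrational timescales scaling as the square root of mass, so a 10-fold mass reduction compresses dynamics by √10 ≈ 3.16. A toy velocity-Verlet integration of a 1-D harmonic oscillator (not a protein simulation) illustrates the scaling:

```python
import math

def oscillator_period(mass, k=1.0, dt=1e-3, max_steps=100_000):
    """Velocity-Verlet integration of a 1-D harmonic oscillator starting
    at x=1, v=0; returns the time between two successive upward zero
    crossings, i.e. the oscillation period (analytically 2*pi*sqrt(m/k))."""
    x, v, t = 1.0, 0.0, 0.0
    prev_x = x
    a = -k * x / mass
    crossings = []
    for _ in range(max_steps):
        x_new = x + v * dt + 0.5 * a * dt * dt
        a_new = -k * x_new / mass
        v += 0.5 * (a + a_new) * dt
        t += dt
        if prev_x < 0.0 <= x_new:  # upward zero crossing
            crossings.append(t)
            if len(crossings) == 2:
                return crossings[1] - crossings[0]
        prev_x, x, a = x_new, x_new, a_new
    raise RuntimeError("no full period within max_steps")

t_heavy = oscillator_period(1.0)   # standard mass
t_light = oscillator_period(0.1)   # mass reduced 10-fold
```

The same trajectory unfolds √10 times faster at one-tenth the mass, which is the intuition behind low-mass MD reaching folding events sooner.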

  11. Review of cleaning techniques and their effects on the chemical composition of foliar samples

    Energy Technology Data Exchange (ETDEWEB)

    Rossini Oliva, S.; Raitio, H.

    2003-07-01

    Chemical foliar analysis is a tool widely used to study tree nutrition and to monitor the impact and extent of air pollutants. This paper reviews a number of cleaning methods and the effects of cleaning on foliar chemistry. Cleaning may include mechanical techniques such as the use of dry or moistened tissues, shaking, blowing, and brushing, or various washing techniques with water or other solvents. Owing to the diversity of plant species, tissue differences, etc., there is no standard procedure for all kinds of samples. Analysis of uncleaned leaves is considered a good method for assessing the degree of air contamination, because it provides an estimate of the element content of the deposits on leaf surfaces, or when the analysis is aimed at investigating the transfer of elements along the food chain. Sample cleaning is recommended in order (1) to investigate the transfer rate of chemical elements from soil to plants, (2) to quantify the washoff of dry deposition from foliage and (3) to separate superficially absorbed and biomass-incorporated elements. Since there is no standard cleaning procedure for all kinds of samples and aims, it is advisable to conduct a pilot study in order to establish a cleaning procedure that provides reliable foliar data. (orig.)

  12. Detection of equine herpesvirus in horses with idiopathic keratoconjunctivitis and comparison of three sampling techniques.

    Science.gov (United States)

    Hollingsworth, Steven R; Pusterla, Nicola; Kass, Philip H; Good, Kathryn L; Brault, Stephanie A; Maggs, David J

    2015-09-01

    To determine the role of equine herpesvirus (EHV) in idiopathic keratoconjunctivitis in horses and to determine whether sample collection method affects detection of EHV DNA by quantitative polymerase chain reaction (qPCR). Twelve horses with idiopathic keratoconjunctivitis and six horses without signs of ophthalmic disease. Conjunctival swabs, corneal scrapings, and conjunctival biopsies were collected from 18 horses: 12 clinical cases with idiopathic keratoconjunctivitis and six euthanized controls. In horses with both eyes involved, the samples were taken from the eye judged to be more severely affected. Samples were tested with qPCR for EHV-1, EHV-2, EHV-4, and EHV-5 DNA. Quantity of EHV DNA and viral replicative activity were compared between the two populations and among the different sampling techniques; relative sensitivities of the sampling techniques were determined. Prevalence of EHV DNA as assessed by qPCR did not differ significantly between control horses and those with idiopathic keratoconjunctivitis. Sampling by conjunctival swab was more likely to yield viral DNA as assessed by qPCR than was conjunctival biopsy. EHV-1 and EHV-4 DNA were not detected in either normal or IKC-affected horses; EHV-2 DNA was detected in two of 12 affected horses but not in normal horses. EHV-5 DNA was commonly found in ophthalmically normal horses and horses with idiopathic keratoconjunctivitis. Because EHV-5 DNA was commonly found in control horses and in horses with idiopathic keratoconjunctivitis, qPCR was not useful for the etiological diagnosis of equine keratoconjunctivitis. Conjunctival swabs were significantly better at obtaining viral DNA samples than conjunctival biopsy in horses in which EHV-5 DNA was found. © 2015 American College of Veterinary Ophthalmologists.

  13. Efficient techniques for wave-based sound propagation in interactive applications

    Science.gov (United States)

    Mehra, Ravish

    Sound propagation techniques model the effect of the environment on sound waves and predict their behavior from point of emission at the source to the final point of arrival at the listener. Sound is a pressure wave produced by mechanical vibration of a surface that propagates through a medium such as air or water, and the problem of sound propagation can be formulated mathematically as a second-order partial differential equation called the wave equation. Accurate techniques based on solving the wave equation, also called the wave-based techniques, are too expensive computationally and memory-wise. Therefore, these techniques face many challenges in terms of their applicability in interactive applications including sound propagation in large environments, time-varying source and listener directivity, and high simulation cost for mid-frequencies. In this dissertation, we propose a set of efficient wave-based sound propagation techniques that solve these three challenges and enable the use of wave-based sound propagation in interactive applications. Firstly, we propose a novel equivalent source technique for interactive wave-based sound propagation in large scenes spanning hundreds of meters. It is based on the equivalent source theory used for solving radiation and scattering problems in acoustics and electromagnetics. Instead of using a volumetric or surface-based approach, this technique takes an object-centric approach to sound propagation. The proposed equivalent source technique generates realistic acoustic effects and takes orders of magnitude less runtime memory compared to prior wave-based techniques. Secondly, we present an efficient framework for handling time-varying source and listener directivity for interactive wave-based sound propagation. The source directivity is represented as a linear combination of elementary spherical harmonic sources. This spherical harmonic-based representation of source directivity can support analytical, data

  14. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    Science.gov (United States)

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

    Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are showing an increasing interest in the scientific community, especially in food safety, environmental, and forensic investigation areas where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at the early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications with particular attention to spray-based MS methods and food analysis issues. The survey will attempt to cover the state of the art from 2012 up to 2017.

  15. On-line nuclear ash gauge for coal based on gamma-ray transmission techniques

    International Nuclear Information System (INIS)

    Rizk, R.A.M.; El-Kateb, A.H.; Abdul-Kader, A.M.

    1999-01-01

    Developments and applications of on-line nuclear gauges in the coal industry are in high demand. A nuclear ash gauge for coal, based on γ-ray transmission techniques, is developed. Single and dual energy γ-ray beams are used to determine the ash content of coal. The percentage ash content as a function of the γ-ray intensities transmitted through coal samples is measured and sensitivity curves are obtained. An empirical formulation relating the ash content values to the γ-ray intensities is derived. Preliminary results show that both single and dual energy γ-ray transmission techniques can be used to give a rapid on-line estimation of the ash concentration in coal with low cost and reasonable accuracy, with the dual-energy technique being preferable. (author)
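The dual-energy principle can be sketched as a 2×2 Beer-Lambert system: the log-attenuation at each energy is a linear combination of the combustible-matter and ash mass per unit area. The attenuation coefficients and masses below are illustrative, not the gauge's calibration.

```python
import math

def component_masses(I01, I1, I02, I2, mu_c1, mu_a1, mu_c2, mu_a2):
    """Solve the 2x2 Beer-Lambert system
        ln(I0_j / I_j) = mu_c_j * m_c + mu_a_j * m_a   (energies j = 1, 2)
    for the combustible-matter (m_c) and ash (m_a) mass per unit area."""
    b1 = math.log(I01 / I1)
    b2 = math.log(I02 / I2)
    det = mu_c1 * mu_a2 - mu_c2 * mu_a1
    m_c = (b1 * mu_a2 - b2 * mu_a1) / det
    m_a = (mu_c1 * b2 - mu_c2 * b1) / det
    return m_c, m_a

def ash_percent(m_c, m_a):
    return 100.0 * m_a / (m_c + m_a)

# Forward-simulate transmitted intensities with made-up coefficients
# (cm^2/g) and masses (g/cm^2), then recover the composition.
I1 = math.exp(-(0.20 * 8.0 + 0.35 * 2.0))  # energy 1
I2 = math.exp(-(0.08 * 8.0 + 0.10 * 2.0))  # energy 2
m_c, m_a = component_masses(1.0, I1, 1.0, I2, 0.20, 0.35, 0.08, 0.10)
```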

  16. Non-Conventional Techniques for the Study of Phase Transitions in NiTi-Based Alloys

    Science.gov (United States)

    Nespoli, Adelaide; Villa, Elena; Passaretti, Francesca; Albertini, Franca; Cabassi, Riccardo; Pasquale, Massimo; Sasso, Carlo Paolo; Coïsson, Marco

    2014-07-01

    Differential scanning calorimetry and electrical resistance measurements are the two most common techniques for studying the phase transition path and temperatures of shape memory alloys (SMA) in the stress-free condition. Internal friction measurements are also known to be useful for this purpose. There are some further techniques that are seldom used for the basic characterization of SMA transitions: dilatometric analysis, magnetic measurements, and Seebeck coefficient studies. In this work, we discuss the suitability of these techniques for the study of NiTi-based phase transitions. Measurements were conducted on several Ni50-xTi50Cux samples with Cu contents ranging from 3 to 10 at.%, fully annealed at 850 °C for 1 h in vacuum and quenched in water at room temperature. Results show that all these techniques are sensitive to the phase transition, and they provide significant information about the existence of intermediate phases.

  17. Effects of sampling conditions on DNA-based estimates of American black bear abundance

    Science.gov (United States)

    Laufenberg, Jared S.; Van Manen, Frank T.; Clark, Joseph D.

    2013-01-01

    DNA-based capture-mark-recapture techniques are commonly used to estimate American black bear (Ursus americanus) population abundance (N). Although the technique is well established, many questions remain regarding study design. In particular, relationships among N, capture probability of heterogeneity mixtures A and B (pA and pB, respectively, or p, collectively), the proportion of each mixture (π), number of capture occasions (k), and probability of obtaining reliable estimates of N are not fully understood. We investigated these relationships using 1) an empirical dataset of DNA samples for which true N was unknown and 2) simulated datasets with known properties that represented a broader array of sampling conditions. For the empirical data analysis, we used the full closed population with heterogeneity data type in Program MARK to estimate N for a black bear population in Great Smoky Mountains National Park, Tennessee. We systematically reduced the number of those samples used in the analysis to evaluate the effect that changes in capture probabilities may have on parameter estimates. Model-averaged N for females and males were 161 (95% CI = 114–272) and 100 (95% CI = 74–167), respectively (pooled N = 261, 95% CI = 192–419), and the average weekly p was 0.09 for females and 0.12 for males. When we reduced the number of samples of the empirical data, support for heterogeneity models decreased. For the simulation analysis, we generated capture data with individual heterogeneity covering a range of sampling conditions commonly encountered in DNA-based capture-mark-recapture studies and examined the relationships between those conditions and accuracy (i.e., probability of obtaining an estimated N that is within 20% of true N), coverage (i.e., probability that 95% confidence interval includes true N), and precision (i.e., probability of obtaining a coefficient of variation ≤20%) of estimates using logistic regression. The capture probability
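The capture-mark-recapture idea behind such abundance estimates can be sketched with the two-occasion Chapman estimator, a far simpler model than Program MARK's closed-population heterogeneity mixtures; all numbers below are illustrative.

```python
import random

def chapman(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen estimator of abundance N,
    given n1 animals detected on occasion 1, n2 on occasion 2, and m2
    detected on both occasions."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

def simulate(N=250, p=0.3, seed=0):
    """Two independent capture occasions with equal detection probability p
    (no heterogeneity): each of N individuals is 'captured' with prob. p."""
    rng = random.Random(seed)
    occ1 = {i for i in range(N) if rng.random() < p}
    occ2 = {i for i in range(N) if rng.random() < p}
    return chapman(len(occ1), len(occ2), len(occ1 & occ2))
```

Averaged over many simulated surveys, the estimator recovers the true abundance closely; adding individual heterogeneity in p (as in the record above) is what degrades accuracy and motivates the mixture models.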

  18. Environmental gamma-ray measurements using in situ and core sampling techniques

    International Nuclear Information System (INIS)

    Dickson, H.W.; Kerr, G.D.; Perdue, P.T.; Abdullah, S.A.

    1976-01-01

Dose rates from natural radionuclides and ¹³⁷Cs in soils of the Oak Ridge area have been determined from in situ and core sample measurements. In situ γ-ray measurements were made with a transportable spectrometer. A tape of spectral data and a soil core sample from each site were returned to ORNL for further analysis. Information on soil composition, density and moisture content and on the distribution of cesium in the soil was obtained from the core samples. In situ spectra were analyzed by a computer program which identified and assigned energies to peaks, integrated the areas under the peaks, and calculated radionuclide concentrations based on a uniform distribution in the soil. The assumption of a uniform distribution was adequate only for natural radionuclides, but simple corrections can be made to the computer calculations for man-made radionuclides distributed on the surface or exponentially in the soil. For ¹³⁷Cs a correction was used based on an exponential function fitted to the distribution measured in core samples. At typical sites in Oak Ridge, the dose rate determined from these measurements was about 5 μrad/hr. (author)

  19. Trace uranium analysis in Indian coal samples using the fission track technique

    International Nuclear Information System (INIS)

    Jojo, P.J.; Rawat, A.; Kumar, Ashavani; Prasad, Rajendra

    1993-01-01

    The ever-growing demand for energy has resulted in the extensive use of fossil fuels, especially coal, for power generation. Coal and its by-products often contain significant amounts of radionuclides, including uranium, which is the ultimate source of the radioactive gas Radon-222. The present study gives the concentration of uranium in coal samples of different collieries in India, collected from various thermal power plants in the state of Uttar Pradesh. The estimates were made using the fission track technique. Latent damage tracks were not found to be uniformly distributed but showed sun bursts and clusters. Non-uniform distributions of trace elements are a very common phenomenon in rocks. The levels of uranium in the coal samples were found to vary from 2.0 to 4.9 ppm in uniform distributions and from 21.3 to 41.0 ppm in non-uniform distributions. Measurements were also made on fly ash samples where the average uranium concentration was found to be 8.4 and 49.3 ppm in uniform and non-uniform distributions, respectively. (author)

  20. Single-particle characterization of ice-nucleating particles and ice particles residuals sampled by three different techniques

    Science.gov (United States)

    Kandler, Konrad; Worringen, Annette; Benker, Nathalie; Dirsch, Thomas; Mertes, Stephan; Schenk, Ludwig; Kästner, Udo; Frank, Fabian; Nillius, Björn; Bundke, Ulrich; Rose, Diana; Curtius, Joachim; Kupiszewski, Piotr; Weingartner, Ernest; Vochezer, Paul; Schneider, Johannes; Schmidt, Susan; Weinbruch, Stephan; Ebert, Martin

    2015-04-01

During January/February 2013, a measurement campaign centered on atmospheric ice-nucleating particles (INP) and ice particle residuals (IPR) was carried out at the High Alpine Research Station Jungfraujoch. Three different techniques for separating INP and IPR from the non-ice-active particles are compared. The Ice Selective Inlet (ISI) and the Ice Counterflow Virtual Impactor (Ice-CVI) sample ice particles from mixed-phase clouds and allow for the analysis of the residuals. The combination of the Fast Ice Nucleus Chamber (FINCH) and the Ice Nuclei Pumped Counterflow Virtual Impactor (IN-PCVI) provides ice-activating conditions to aerosol particles and extracts the activated INP for analysis. Collected particles were analyzed by scanning electron microscopy and energy-dispersive X-ray microanalysis to determine size, chemical composition and mixing state. All INP/IPR-separating techniques showed considerable abundances (median 20–70 %) of instrumental contamination artifacts (ISI: Si-O spheres, probably calibration aerosol; Ice-CVI: Al-O particles; FINCH+IN-PCVI: steel particles). Potential sampling artifacts (e.g., pure soluble material) also occurred in the particle populations separated by all three techniques. Soot was a minor contributor. Lead was detected in less than 10 % of the particles, of which the majority were internal mixtures with other particle types. Sea salt and sulfates were identified by all three methods as INP/IPR. Most samples showed a maximum of the INP/IPR size distribution at 400 nm geometric diameter. In a few cases, a second super-micron maximum was identified. Soot/carbonaceous material and metal oxides were present mainly in the submicron range. ISI and FINCH yielded silicates and Ca-rich particles mainly with diameters above 1 µm, while the Ice-CVI also separated many submicron IPR. As strictly parallel sampling could not be performed, a part of the discrepancies between the different techniques may result from

  1. Using rule-based machine learning for candidate disease gene prioritization and sample classification of cancer gene expression data.

    Directory of Open Access Journals (Sweden)

    Enrico Glaab

    Full Text Available Microarray data analysis has been shown to provide an effective tool for studying cancer and genetic diseases. Although classical machine learning techniques have successfully been applied to find informative genes and to predict class labels for new samples, common restrictions of microarray analysis such as small sample sizes, a large attribute space and high noise levels still limit its scientific and clinical applications. Increasing the interpretability of prediction models while retaining a high accuracy would help to exploit the information content in microarray data more effectively. For this purpose, we evaluate our rule-based evolutionary machine learning systems, BioHEL and GAssist, on three public microarray cancer datasets, obtaining simple rule-based models for sample classification. A comparison with other benchmark microarray sample classifiers based on three diverse feature selection algorithms suggests that these evolutionary learning techniques can compete with state-of-the-art methods like support vector machines. The obtained models reach accuracies above 90% in two-level external cross-validation, with the added value of facilitating interpretation by using only combinations of simple if-then-else rules. As a further benefit, a literature mining analysis reveals that prioritizations of informative genes extracted from BioHEL's classification rule sets can outperform gene rankings obtained from a conventional ensemble feature selection in terms of the pointwise mutual information between relevant disease terms and the standardized names of top-ranked genes.
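The if-then-else rule sets this record describes can be sketched with a minimal ordered-rule classifier. The gene names, thresholds, and two-rule model below are hypothetical illustrations, not rules produced by BioHEL or GAssist:

```python
import operator

OPS = {">": operator.gt, "<": operator.lt}

def classify(sample, rules, default="normal"):
    """Apply an ordered list of if-then-else rules to a gene-expression
    sample (dict of gene -> value). A rule is (conditions, label), where
    conditions is a list of (gene, op, threshold) triples that must all
    hold; the first matching rule wins, else the default class applies."""
    for conditions, label in rules:
        if all(OPS[op](sample[g], t) for g, op, t in conditions):
            return label
    return default

# Hypothetical two-rule model of the kind the evolutionary learners produce
rules = [
    ([("GENE_A", ">", 2.5), ("GENE_B", "<", 1.0)], "tumor"),
    ([("GENE_C", ">", 3.0)], "tumor"),
]
print(classify({"GENE_A": 3.1, "GENE_B": 0.4, "GENE_C": 0.2}, rules))  # tumor
print(classify({"GENE_A": 1.0, "GENE_B": 2.0, "GENE_C": 0.5}, rules))  # normal
```

The interpretability benefit claimed in the abstract comes precisely from this form: each prediction can be traced to one short, human-readable rule.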

  2. Clustering of samples and elements based on multi-variable chemical data

    International Nuclear Information System (INIS)

    Op de Beeck, J.

    1984-01-01

Clustering and classification are defined in the context of multivariable chemical analysis data. Classical multivariate techniques, commonly used to interpret such data, are shown to be based on probabilistic and geometrical principles which are not justified for analytical data, since in that case one assumes or expects a system of more or less systematically related objects (samples) as defined by measurements on more or less systematically interdependent variables (elements). For the specific analytical problem of a data set comprising a large number of trace elements determined in a large number of samples, a deterministic cluster analysis can be used to develop the underlying classification structure. Three main steps can be distinguished: diagnostic evaluation and preprocessing of the raw input data; computation of a symmetric matrix with pairwise standardized dissimilarity values between all possible pairs of samples and/or elements; and an ultrametric clustering strategy to produce the final classification as a dendrogram. The software packages designed to perform these tasks are discussed and final results are given. Conclusions are formulated concerning the dangers of using multivariate, clustering and classification software packages as black boxes.
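The second and third steps (dissimilarity matrix, then ultrametric clustering to a dendrogram) can be sketched with a minimal agglomerative clusterer; single linkage is used here as one common choice, and the 4×4 dissimilarity matrix and sample names are invented for illustration:

```python
def single_linkage(D, labels):
    """Agglomerative single-linkage clustering of a symmetric pairwise
    dissimilarity matrix D; returns the merge steps (the dendrogram):
    each step is (sorted member labels, merge height)."""
    clusters = {i: [i] for i in range(len(labels))}
    merges = []
    while len(clusters) > 1:
        # find the closest pair of clusters (minimum pairwise dissimilarity)
        (a, b), d = min(
            (((a, b), min(D[i][j] for i in clusters[a] for j in clusters[b]))
             for a in clusters for b in clusters if a < b),
            key=lambda x: x[1])
        merges.append((sorted(labels[i] for i in clusters[a] + clusters[b]), d))
        clusters[a] += clusters.pop(b)
    return merges

# Hypothetical standardized dissimilarities between four samples
D = [[0.0, 0.2, 0.9, 0.8],
     [0.2, 0.0, 0.7, 0.9],
     [0.9, 0.7, 0.0, 0.1],
     [0.8, 0.9, 0.1, 0.0]]
for members, height in single_linkage(D, ["s1", "s2", "s3", "s4"]):
    print(members, height)
```

Reading the merge heights bottom-up gives exactly the dendrogram the abstract describes: s3/s4 join first, then s1/s2, then the two pairs.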

  3. A Sampling Based Approach to Spacecraft Autonomous Maneuvering with Safety Specifications

    Science.gov (United States)

    Starek, Joseph A.; Barbee, Brent W.; Pavone, Marco

    2015-01-01

This paper presents a method for safe spacecraft autonomous maneuvering that leverages robotic motion-planning techniques for spacecraft control. Specifically, the scenario we consider is an in-plane rendezvous of a chaser spacecraft in proximity to a target spacecraft at the origin of the Clohessy-Wiltshire-Hill frame. The trajectory for the chaser spacecraft is generated in a receding-horizon fashion by executing a sampling-based robotic motion-planning algorithm named Fast Marching Trees (FMT), which efficiently grows a tree of trajectories over a set of probabilistically drawn samples in the state space. To enforce safety, the tree is only grown over actively safe samples, for which there exists a one-burn collision-avoidance maneuver that circularizes the spacecraft orbit along a collision-free coasting arc and that can be executed under potential thruster failures. The overall approach establishes a provably correct framework for the systematic encoding of safety specifications into the spacecraft trajectory generation process and appears amenable to real-time implementation on orbit. Simulation results are presented for a two-fault-tolerant spacecraft during autonomous approach to a single client in Low Earth Orbit.
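The core idea of growing a tree only over safe samples can be sketched as follows. This is a greatly simplified, RRT-style nearest-neighbor planner in a 2D state space, not the FMT algorithm or the paper's orbital dynamics, and the keep-out sphere around the origin is a crude stand-in for the active-safety check; all numeric values are illustrative:

```python
import math
import random

def grow_tree(start, goal, n_samples, keep_out_radius, step, seed=0):
    """Grow a tree of states toward a goal over randomly drawn samples,
    rejecting any state inside a keep-out sphere around the target at the
    origin -- a simplified stand-in for growing only over safe samples."""
    rng = random.Random(seed)
    nodes = [start]
    parents = {0: None}
    for _ in range(n_samples):
        s = (rng.uniform(-10.0, 10.0), rng.uniform(-10.0, 10.0))
        if math.hypot(*s) < keep_out_radius:        # unsafe sample: discard
            continue
        i = min(range(len(nodes)), key=lambda j: math.dist(nodes[j], s))
        d = math.dist(nodes[i], s)
        ratio = min(1.0, step / d) if d else 0.0
        new = tuple(a + ratio * (b - a) for a, b in zip(nodes[i], s))
        if math.hypot(*new) < keep_out_radius:      # steered into keep-out
            continue
        parents[len(nodes)] = i
        nodes.append(new)
        if math.dist(new, goal) < step:             # close enough to goal
            break
    return nodes, parents

nodes, parents = grow_tree(start=(-8.0, -8.0), goal=(3.0, 0.0),
                           n_samples=500, keep_out_radius=2.0, step=1.0)
print(len(nodes), all(x * x + y * y >= 4.0 for x, y in nodes))
```

Every node in the returned tree satisfies the safety constraint by construction, which is the property the paper's provably correct framework generalizes to one-burn abort maneuvers under thruster failures.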

  4. Fluorescence-based Western blotting for quantitation of protein biomarkers in clinical samples.

    Science.gov (United States)

    Zellner, Maria; Babeluk, Rita; Diestinger, Michael; Pirchegger, Petra; Skeledzic, Senada; Oehler, Rudolf

    2008-09-01

Since most high-throughput techniques used in biomarker discovery are very time- and cost-intensive, highly specific and quantitative analytical alternatives are needed for routine analysis. Conventional Western blotting allows detection of specific proteins down to the level of single isotypes, but its quantitative accuracy is rather limited. We report a novel and improved quantitative Western blotting method. The use of fluorescently labelled secondary antibodies strongly extends the dynamic range of the quantitation and improves the correlation with the protein amount (r=0.997). By an additional fluorescent staining of all proteins immediately after their transfer to the blot membrane, it is possible to visualise simultaneously the antibody binding and the total protein profile. This allows for an accurate correction for protein load. Applying this normalisation, it could be demonstrated that fluorescence-based Western blotting is able to reproduce a quantitative analysis of two specific proteins in blood platelet samples from 44 subjects with different diseases, as initially conducted by 2D-DIGE. These results show that the proposed fluorescence-based Western blotting is an adequate technique for biomarker quantitation and suggest possible applications that go far beyond it.
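The total-protein normalisation step described above is a simple per-lane correction; a minimal sketch, with hypothetical band and total-protein intensities (the third lane is imagined to be overloaded by 25 %):

```python
def normalize_bands(band_intensity, total_protein, reference_lane=0):
    """Correct Western blot band intensities for protein load: scale each
    lane by its total-protein stain signal, then express the corrected
    band signals relative to a reference lane."""
    factors = [total_protein[reference_lane] / t for t in total_protein]
    loaded = [b * f for b, f in zip(band_intensity, factors)]
    return [x / loaded[reference_lane] for x in loaded]

# Hypothetical three-lane example: lane 3 carries 25% more total protein,
# so its apparent band signal is scaled down before comparison
res = normalize_bands([100.0, 150.0, 125.0], [1.0e6, 1.0e6, 1.25e6])
print([round(v, 3) for v in res])  # [1.0, 1.5, 1.0]
```

After correction, lanes 1 and 3 carry the same specific signal; only lane 2 shows a genuine 1.5-fold difference.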

  5. Extraction of Plutonium From Spiked INEEL Soil Samples Using the Ligand-Assisted Supercritical Fluid Extraction (LA-SFE) Technique

    International Nuclear Information System (INIS)

    Fox, R.V.; Mincher, B.J.; Holmes, R.G.G.

    1999-01-01

In order to investigate the effectiveness of ligand-assisted supercritical fluid extraction for the removal of transuranic contamination from soils, an Idaho National Engineering and Environmental Laboratory (INEEL) silty-clay soil sample was obtained from near the Radioactive Waste Management Complex area and subjected to three different chemical preparations before being spiked with plutonium. The spiked INEEL soil samples were subjected to a sequential aqueous extraction procedure to determine radionuclide partitioning in each sample. Results from those extractions demonstrate that plutonium consistently partitioned into the residual fraction across all three INEEL soil preparations, whereas americium partitioned 73% into the iron/manganese fraction for soil preparation A, with the balance partitioning into the residual fraction. Plutonium and americium were extracted from the INEEL soil samples using a ligand-assisted supercritical fluid extraction technique. Initial supercritical fluid extraction runs produced plutonium extraction efficiencies ranging from 14% to 19%. After a second round in which the initial extraction parameters were changed, plutonium extraction efficiencies increased to 60%, and as high as 80%, with the americium level in the post-extracted soil samples dropping to near the detection limits. A third round of experiments is currently underway. These results demonstrate that the ligand-assisted supercritical fluid extraction technique can effectively extract plutonium from the spiked INEEL soil preparations

  6. Impact of changing from staining to culture techniques on detection rates of Campylobacter spp. in routine stool samples in Chile.

    Science.gov (United States)

    Porte, Lorena; Varela, Carmen; Haecker, Thomas; Morales, Sara; Weitzel, Thomas

    2016-05-13

Campylobacter is a leading cause of bacterial gastroenteritis, but sensitive diagnostic methods such as culture are expensive and often not available in resource-limited settings. Therefore, direct staining techniques have been developed as a practical and economical alternative. We analyzed the impact of replacing Campylobacter staining with culture for routine stool examinations in a private hospital in Chile. From January to April 2014, a total of 750 consecutive stool samples were examined in parallel by Hucker stain and Campylobacter culture. Isolation rates of Campylobacter were determined and the performance of staining was evaluated against culture as the gold standard. In addition, isolation rates of Campylobacter and other enteric pathogens were compared to those of past years. Campylobacter was isolated by culture in 46 of 750 (6.1 %) stool samples. Direct staining identified only three samples as Campylobacter positive, reaching sensitivity and specificity values of 6.5 and 100 %, respectively. In comparison to the staining-based detection rates of previous years, we observed a significant increase of Campylobacter cases in our patients. The direct staining technique for Campylobacter had a very low sensitivity compared to culture. Staining methods might lead to a high rate of false-negative results and an underestimation of the importance of campylobacteriosis. With the inclusion of Campylobacter culture, this pathogen became a leading cause of intestinal infection in our patient population.
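The sensitivity and specificity figures above follow directly from the reported counts (46 culture positives among 750 samples, 3 of them stain-positive, no stain false positives), and can be reproduced in a few lines:

```python
def diagnostic_performance(tp, fn, tn, fp):
    """Sensitivity and specificity of an index test against a gold standard."""
    return tp / (tp + fn), tn / (tn + fp)

# Counts from the abstract: staining detected 3 of the 46 culture-positive
# samples and none of the 704 culture-negative samples were stain-positive.
sens, spec = diagnostic_performance(tp=3, fn=43, tn=704, fp=0)
print(f"sensitivity {sens:.1%}, specificity {spec:.0%}")  # sensitivity 6.5%, specificity 100%
```

This matches the 6.5 % and 100 % reported in the record.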

  7. The Development of an In-Situ TEM Technique for Studying Corrosion Behavior as Applied to Zirconium-Based Alloys

    Science.gov (United States)

    Harlow, Wayne

Zirconium-based alloys are commonly used for nuclear fuel rod cladding due to their low neutron cross section and good corrosion properties. However, corrosion is still a limiting factor in fuel rod lifespan, which restricts the burnup levels, and thus the efficiency, that can be achieved. While long-term corrosion behavior has been studied through both reactor and autoclave samples, oxide nucleation and growth behavior has not been extensively studied. This work develops a new technique to study the initial stages of corrosion in zirconium-based alloys and the microstructural effects on this process by developing an environmental-cell system for the TEM. Nanoscale oxidation parameters are developed, as is a new FIB technique to support this method. Precession diffraction is used in conjunction with in-situ TEM to observe the initial stages of corrosion in these alloys, and oxide thickness is estimated using low-loss EELS. In addition, the stress stabilization of tetragonal ZrO₂ is explored in the context of sample preparation for TEM. It was found that in-situ environmental TEM using an environmental cell replicates the oxidation behavior observed in autoclaved samples in both oxide structure and phases. Utilizing this technique, it was shown that cracking of the oxide layer in zirconium-based alloys is related to oxide relaxation, and not to thermal changes. The effect of secondary-phase particles on oxidation behavior did not yield significant results; however, a new method for studying initial oxidation rates using low-loss EELS was developed.

  8. Power system stabilizers based on modern control techniques

    Energy Technology Data Exchange (ETDEWEB)

    Malik, O P; Chen, G P; Zhang, Y; El-Metwally, K [Calgary Univ., AB (Canada). Dept. of Electrical and Computer Engineering

    1994-12-31

Developments in digital technology have made it feasible to develop and implement improved controllers based on sophisticated control techniques. Power system stabilizers based on adaptive control, fuzzy logic and artificial neural networks are being developed. Each of these control techniques possesses unique features and strengths. In this paper, the relative performance of power system stabilizers based on adaptive control, fuzzy logic and neural networks, both in simulation studies and in real-time tests on a physical model of a power system, is presented and compared to that of a fixed-parameter conventional power system stabilizer. (author) 16 refs., 45 figs., 3 tabs.

  9. A novel in-situ sampling and VFA sensor technique for anaerobic systems

    DEFF Research Database (Denmark)

    Pind, Peter Frode; Angelidaki, Irini; Ahring, Birgitte Kiær

    2002-01-01

    A key information for understanding and controlling the anaerobic biogas process is the concentration of Volatile Fatty Acids (VFA). However, access to this information has so far been limited to off-line measurements by manual time and labour consuming methods. We have developed a new technique ...... than 1000 samples on both a fullscale biogas plant and lab-scale reactors. The measuring range covers specific measurements of acetate, propionate, iso-/n-butyrate and iso-/n-valerate from 0.1 to 50 mM (6–3,000 mg)....

  10. Validation of single-sample doubly labeled water method

    International Nuclear Information System (INIS)

    Webster, M.D.; Weathers, W.W.

    1989-01-01

We have experimentally validated a single-sample variant of the doubly labeled water method for measuring metabolic rate and water turnover in a very small passerine bird, the verdin (Auriparus flaviceps). We measured CO₂ production using the Haldane gravimetric technique and compared these values with estimates derived from isotopic data. Doubly labeled water results based on the one-sample calculations differed from Haldane values by less than 0.5% on average (range -8.3 to 11.2%, n = 9). Water flux computed by the single-sample method differed by -1.5% on average from results for the same birds based on the standard, two-sample technique (range -13.7 to 2.0%, n = 9)

  11. An Electromagnetic Gauge Technique for Measuring Shocked Particle Velocity in Electrically Conductive Samples

    Science.gov (United States)

    Cheng, David; Yoshinaka, Akio

    2014-11-01

    Electromagnetic velocity (EMV) gauges are a class of film gauges which permit the direct in-situ measurement of shocked material flow velocity. The active sensing element, typically a metallic foil, requires exposure to a known external magnetic field in order to produce motional electromotive force (emf). Due to signal distortion caused by mutual inductance between sample and EMV gauge, this technique is typically limited to shock waves in non-conductive materials. In conductive samples, motional emf generated in the EMV gauge has to be extracted from the measured signal which results from the combined effects of both motional emf and voltage changes from induced currents. An electromagnetic technique is presented which analytically models the dynamics of induced current between a copper disk moving as a rigid body with constant 1D translational velocity toward an EMV gauge, where both disk and gauge are exposed to a uniform external static magnetic field. The disk is modelled as a magnetic dipole loop where its Foucault current is evaluated from the characteristics of the fields, whereas the EMV gauge is modelled as a circuit loop immersed in the field of the magnetic dipole loop, the intensity of which is calculated as a function of space and, implicitly, time. Equations of mutual induction are derived and the current induced in the EMV gauge loop is solved, allowing discrimination of the motional emf. Numerical analysis is provided for the step response of the induced EMV gauge current with respect to the Foucault current in the moving copper sample.

  12. Success and failure rates of tumor genotyping techniques in routine pathological samples with non-small-cell lung cancer.

    Science.gov (United States)

    Vanderlaan, Paul A; Yamaguchi, Norihiro; Folch, Erik; Boucher, David H; Kent, Michael S; Gangadharan, Sidharta P; Majid, Adnan; Goldstein, Michael A; Huberman, Mark S; Kocher, Olivier N; Costa, Daniel B

    2014-04-01

    Identification of some somatic molecular alterations in non-small-cell lung cancer (NSCLC) has become evidence-based practice. The success and failure rate of using commercially available tumor genotyping techniques in routine day-to-day NSCLC pathology samples is not well described. We sought to evaluate the success and failure rate of EGFR mutation, KRAS mutation, and ALK FISH in a cohort of lung cancers subjected to routine clinical tumor genotype. Clinicopathologic data, tumor genotype success and failure rates were retrospectively compiled and analyzed from 381 patient-tumor samples. From these 381 patients with lung cancer, the mean age was 65 years, 61.2% were women, 75.9% were white, 27.8% were never smokers, 73.8% had advanced NSCLC and 86.1% had adenocarcinoma histology. The tumor tissue was obtained from surgical specimens in 48.8%, core needle biopsies in 17.9%, and as cell blocks from aspirates or fluid in 33.3% of cases. Anatomic sites for tissue collection included lung (49.3%), lymph nodes (22.3%), pleura (11.8%), bone (6.0%), brain (6.0%), among others. The overall success rate for EGFR mutation analysis was 94.2%, for KRAS mutation 91.6% and for ALK FISH 91.6%. The highest failure rates were observed when the tissue was obtained from image-guided percutaneous transthoracic core-needle biopsies (31.8%, 27.3%, and 35.3% for EGFR, KRAS, and ALK tests, respectively) and bone specimens (23.1%, 15.4%, and 23.1%, respectively). In specimens obtained from bone, the failure rates were significantly higher for biopsies than resection specimens (40% vs. 0%, p=0.024 for EGFR) and for decalcified compared to non-decalcified samples (60% vs. 5.5%, p=0.021 for EGFR). 
Tumor genotype techniques are feasible in most samples, outside small image-guided percutaneous transthoracic core-needle biopsies and bone samples from core biopsies with decalcification, and therefore expansion of routine tumor genotype into the care of patients with NSCLC may not require special

  13. Active sampling technique to enhance chemical signature of buried explosives

    Science.gov (United States)

    Lovell, John S.; French, Patrick D.

    2004-09-01

    Deminers and dismounted countermine engineers commonly use metal detectors, ground penetrating radar and probes to locate mines. Many modern landmines have a very low metal content, which severely limits the effectiveness of metal detectors. Canines have also been used for landmine detection for decades. Experiments have shown that canines smell the explosives which are known to leak from most types of landmines. The fact that dogs can detect landmines indicates that vapor sensing is a viable approach to landmine detection. Several groups are currently developing systems to detect landmines by "sniffing" for the ultra-trace explosive vapors above the soil. The amount of material that is available to passive vapor sensing systems is limited to no more than the vapor in equilibrium with the explosive related chemicals (ERCs) distributed in the surface soils over and near the landmine. The low equilibrium vapor pressure of TNT in the soil/atmosphere boundary layer and the limited volume of the boundary layer air imply that passive chemical vapor sensing systems require sensitivities in the picogram range, or lower. ADA is working to overcome many of the limitations of passive sampling methods, by the use of an active sampling method that employs a high-powered (1,200+ joules) strobe lamp to create a highly amplified plume of vapor and/or ERC-bearing fine particulates. Initial investigations have demonstrated that this approach can amplify the detectability of TNT by two or three orders of magnitude. This new active sampling technique could be used with any suitable explosive sensor.

  14. Feasibility of self-sampled dried blood spot and saliva samples sent by mail in a population-based study

    International Nuclear Information System (INIS)

    Sakhi, Amrit Kaur; Bastani, Nasser Ezzatkhah; Ellingjord-Dale, Merete; Gundersen, Thomas Erik; Blomhoff, Rune; Ursin, Giske

    2015-01-01

In large epidemiological studies it is often challenging to obtain biological samples. Self-sampling by study participants using dried blood spots (DBS) technique has been suggested to overcome this challenge. DBS is a type of biosampling where blood samples are obtained by a finger-prick lancet, blotted and dried on filter paper. However, the feasibility and efficacy of collecting DBS samples from study participants in large-scale epidemiological studies is not known. The aim of the present study was to test the feasibility and response rate of collecting self-sampled DBS and saliva samples in a population-based study of women above 50 years of age. We determined response proportions, number of phone calls to the study center with questions about sampling, and quality of the DBS. We recruited women through a study conducted within the Norwegian Breast Cancer Screening Program. Invitations, instructions and materials were sent to 4,597 women. The data collection took place over a 3 month period in the spring of 2009. Response proportions for the collection of DBS and saliva samples were 71.0% (3,263) and 70.9% (3,258), respectively. We received 312 phone calls (7% of the 4,597 women) with questions regarding sampling. Of the 3,263 individuals that returned DBS cards, 3,038 (93.1%) had been packaged and shipped according to instructions. A total of 3,032 DBS samples were sufficient for at least one biomarker analysis (i.e. 92.9% of DBS samples received by the laboratory). 2,418 (74.1%) of the DBS cards received by the laboratory were filled with blood according to the instructions (i.e. 10 completely filled spots with up to 7 punches per spot for up to 70 separate analyses). To assess the quality of the samples, we selected and measured two biomarkers (carotenoids and vitamin D). The biomarker levels were consistent with previous reports. Collecting self-sampled DBS and saliva samples through the postal services provides a low cost, effective and feasible

  15. Sampling phased array, a new technique for ultrasonic signal processing and imaging now available to industry

    OpenAIRE

    Verkooijen, J.; Bulavinov, A.

    2008-01-01

    Over the past 10 years the improvement in the field of microelectronics and computer engineering has led to significant advances in ultrasonic signal processing and image construction techniques that are currently being applied to non-destructive material evaluation. A new phased array technique, called "Sampling Phased Array" has been developed in the Fraunhofer Institute for non-destructive testing [1]. It realizes a unique approach of measurement and processing of ultrasonic signals. The s...

  16. Determination of multi-element in marine sediment samples collected in Angola by the k0-NAA technique

    International Nuclear Information System (INIS)

    Teixeira, M.C.P.; Ho Manh Dung; Cao Dong Vu; Nguyen Thi Sy; Nguyen Thanh Binh; Vuong Huu Tan

    2006-01-01

Marine sediment samples were collected in Angola for a marine environmental pollution study. The k₀-standardization method of neutron activation analysis (k₀-NAA) on the Dalat research reactor was used to determine multiple elements in the Angola marine sediment samples. The samples were irradiated in cell 7-1 for short- and middle-lived nuclides and in the rotary specimen rack for long-lived nuclides. The irradiation facilities were characterized for neutron spectrum parameters, and the activated samples were measured on calibrated gamma-ray spectrometers using HPGe detectors. The analytical results for 9 marine sediment samples with 27 elements: Al, As, Br, Ca, Ce, Cl, Co, Cs, Dy, Fe, Hf, I, K, Mg, Mn, Na, Rb, Sb, Sc, Se, Sm, Th, Ti, U, V and Zn, in terms of mean concentration, standard deviation and content range, are shown in the report. Analytical quality assurance was performed by analysis of a Japanese certified reference material, the marine sediment NMIJ-CRM-7302a. These preliminary results revealed that the k₀-NAA technique on the Dalat research reactor is a good analytical technique for the determination of multiple elements in marine sediment samples. Some heavy metals and trace elements determined in this work are possibly connected to human activities at the sampling region. (author)

  17. A Global Sampling Based Image Matting Using Non-Negative Matrix Factorization

    Directory of Open Access Journals (Sweden)

    NAVEED ALAM

    2017-10-01

Full Text Available Image matting is a technique in which a foreground is separated from the background of a given image along with the pixel-wise opacity. This foreground can then be seamlessly composited into a different background to obtain a novel scene. This paper presents a global non-parametric sampling algorithm over image patches and utilizes a dimension-reduction technique known as NMF (Non-Negative Matrix Factorization). Some existing non-parametric approaches sample patches from large nearby foreground and background regions, but they fail to sample patches from the whole image because of the high memory and computational requirements. The use of NMF in the proposed algorithm allows a dimension reduction which reduces the computational cost and memory requirement. The use of NMF also allows the proposed approach to use the whole foreground and background regions of the image, reduces the patch complexity, and helps in efficient patch sampling. The use of patches allows the incorporation not only of the pixel colour but also of the local image structure. The use of local structures in the image is important to estimate a high-quality alpha matte, especially in images with highly textured regions. The proposed algorithm is evaluated on the standard data set and the obtained results are comparable to the state-of-the-art matting techniques.
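The dimension-reduction step at the heart of this record, NMF, can be sketched with the classic Lee-Seung multiplicative updates. This is a generic pure-Python NMF on a tiny matrix, not the paper's patch pipeline; in the matting context each column of V would be a vectorized image patch:

```python
import random

def nmf(V, r, iters=200, seed=0):
    """Factor a non-negative m x n matrix V as W @ H with inner dimension r,
    using Lee-Seung multiplicative updates for the Frobenius objective."""
    rng = random.Random(seed)
    m, n = len(V), len(V[0])
    W = [[rng.random() for _ in range(r)] for _ in range(m)]
    H = [[rng.random() for _ in range(n)] for _ in range(r)]
    eps = 1e-9
    for _ in range(iters):
        WH = [[sum(W[i][k] * H[k][j] for k in range(r)) for j in range(n)]
              for i in range(m)]
        for k in range(r):                  # H <- H * (W^T V) / (W^T W H)
            for j in range(n):
                num = sum(W[i][k] * V[i][j] for i in range(m))
                den = sum(W[i][k] * WH[i][j] for i in range(m)) + eps
                H[k][j] *= num / den
        WH = [[sum(W[i][k] * H[k][j] for k in range(r)) for j in range(n)]
              for i in range(m)]
        for i in range(m):                  # W <- W * (V H^T) / (W H H^T)
            for k in range(r):
                num = sum(V[i][j] * H[k][j] for j in range(n))
                den = sum(WH[i][j] * H[k][j] for j in range(n)) + eps
                W[i][k] *= num / den
    return W, H

V = [[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]]    # an exactly rank-1 matrix
W, H = nmf(V, r=1)
err = max(abs(V[i][j] - W[i][0] * H[0][j]) for i in range(3) for j in range(2))
print(err < 1e-3)  # True
```

Choosing r much smaller than the patch dimension is what delivers the memory and runtime savings the abstract claims.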

  18. A global sampling based image matting using non-negative matrix factorization

    International Nuclear Information System (INIS)

    Alam, N.; Sarim, M.; Shaikh, A.B.

    2017-01-01

Image matting is a technique in which a foreground is separated from the background of a given image along with the pixel-wise opacity. This foreground can then be seamlessly composited into a different background to obtain a novel scene. This paper presents a global non-parametric sampling algorithm over image patches and utilizes a dimension-reduction technique known as NMF (Non-Negative Matrix Factorization). Some existing non-parametric approaches sample patches from large nearby foreground and background regions, but they fail to sample patches from the whole image because of the high memory and computational requirements. The use of NMF in the proposed algorithm allows a dimension reduction which reduces the computational cost and memory requirement. The use of NMF also allows the proposed approach to use the whole foreground and background regions of the image, reduces the patch complexity, and helps in efficient patch sampling. The use of patches allows the incorporation not only of the pixel colour but also of the local image structure. The use of local structures in the image is important to estimate a high-quality alpha matte, especially in images with highly textured regions. The proposed algorithm is evaluated on the standard data set and the obtained results are comparable to the state-of-the-art matting techniques. (author)

  19. Critical assessment of the deposition based dosimetric technique for radon/thoron decay products

    International Nuclear Information System (INIS)

    Mayya, Y.S.

    2010-01-01

    Inhalation doses due to radon (222Rn) and thoron (220Rn) are predominantly contributed by their decay products rather than by the gases themselves. Decay-product measurements are carried out essentially either by short-term active measurements, such as air sampling on a substrate followed by alpha or beta counting, or by continuous active monitoring techniques based on silicon barrier detectors. However, owing to the non-availability of satisfactory passive measurement techniques for the progeny species, it has been usual practice to estimate the long-time-averaged progeny concentration from the measured gas concentration using an assumed equilibrium factor. To be accurate, one would have to measure the equilibrium factor in situ along with the gas concentration. Since this is not practical, the assigned-equilibrium-factor approach (0.4 indoors and 0.8 outdoors for 222Rn) has been an inevitable, though uncertain, part of dosimetric strategies in both occupational and public domains. In the case of thoron decay products, the equilibrium factor is of far more questionable validity. Thus, there is a need to shift from the gas-based dosimetric paradigm to one based on direct detection of the progeny species.
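
    The assigned-equilibrium-factor convention criticized above reduces to a one-line calculation. The sketch below illustrates that convention, not a recommended dosimetric procedure; the 100 Bq/m3 gas concentration is an arbitrary example value:

    ```python
    # Equilibrium-equivalent concentration (EEC) of radon progeny estimated
    # from a measured gas concentration and an assumed equilibrium factor F.
    # The F values are the conventional defaults cited in the abstract.
    def eec(gas_concentration_bq_m3: float, equilibrium_factor: float) -> float:
        """EEC (Bq/m3) = F x radon gas concentration (Bq/m3)."""
        return equilibrium_factor * gas_concentration_bq_m3

    indoor = eec(100.0, 0.4)    # assigned indoor F for 222Rn
    outdoor = eec(100.0, 0.8)   # assigned outdoor F for 222Rn
    print(indoor, outdoor)      # 40.0 80.0
    ```

    The abstract's point is precisely that any error in the assumed F propagates linearly into the estimated progeny concentration and hence into the dose.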

  20. Determination of lead at nanogram level in water samples by resonance light scattering technique using tetrabutyl ammonium bromide as a molecular probe

    Directory of Open Access Journals (Sweden)

    Yanru Yun

    2012-12-01

    Full Text Available A novel chemical method for the determination of trace lead in water samples, based on the resonance light scattering (RLS) technique, has been developed. In dilute phosphoric acid medium, in the presence of a large excess of I-, Pb(II) can form [PbI4]2-, which further reacts with tetrabutyl ammonium bromide (TBAB) to form an ion-association compound. This results in significant enhancement of the RLS intensity and the appearance of the corresponding RLS spectral characteristics. The maximum scattering peak of the system is at 402 nm. Under optimum conditions, there is a linear relationship between the relative RLS intensity and the concentration of Pb(II) in the range 0.04–1.8 μg/mL, with a low detection limit of 0.74 ng/mL for Pb(II). On this basis, a simple, rapid, and sensitive method has been developed for the determination of Pb(II) at the nanogram level by the RLS technique using a common spectrofluorimeter. The analytical system was successfully applied to determining trace amounts of Pb(II) in water samples, with results that agree well with those of atomic absorption spectrometry (AAS). DOI: http://dx.doi.org/10.4314/bcse.v26i1.1
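
    The reported linear range and its use for quantitation can be illustrated with a hypothetical calibration line; the intensity values below are invented for the sketch and are not data from the paper:

    ```python
    import numpy as np

    # Hypothetical calibration points within the reported linear range
    # (0.04-1.8 ug/mL): RLS intensity vs Pb(II) concentration.
    conc = np.array([0.04, 0.2, 0.5, 1.0, 1.5, 1.8])               # ug/mL
    intensity = np.array([12.0, 60.0, 150.0, 300.0, 450.0, 540.0])  # a.u.

    slope, intercept = np.polyfit(conc, intensity, 1)

    def pb_from_rls(i_rls: float) -> float:
        """Invert the calibration line to estimate Pb(II) in ug/mL."""
        return (i_rls - intercept) / slope

    print(round(pb_from_rls(300.0), 2))   # 1.0
    ```

    In practice the detection limit (0.74 ng/mL here) would be derived from the blank signal's standard deviation, typically as 3*sigma/slope.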

  1. Contests versus Norms: Implications of Contest-Based and Norm-Based Intervention Techniques.

    Science.gov (United States)

    Bergquist, Magnus; Nilsson, Andreas; Hansla, André

    2017-01-01

    Interventions using either contests or norms can promote environmental behavioral change. Yet research on the implications of contest-based and norm-based interventions is lacking. Based on Goal-framing theory, we suggest that a contest-based intervention frames a gain goal, promoting intensive but instrumental behavioral engagement. In contrast, the norm-based intervention was expected to frame a normative goal, activating normative obligations for targeted and non-targeted behavior and motivation to engage in pro-environmental behaviors in the future. In two studies, participants (n = 347) were randomly assigned to either a contest- or a norm-based intervention technique. Participants in the contest showed more intensive engagement in both studies. Participants in the norm-based intervention tended to report higher intentions for future energy conservation (Study 1) and higher personal norms for non-targeted pro-environmental behaviors (Study 2). These findings suggest that a contest-based intervention technique frames a gain goal, while a norm-based intervention frames a normative goal.

  2. Contests versus Norms: Implications of Contest-Based and Norm-Based Intervention Techniques

    Directory of Open Access Journals (Sweden)

    Magnus Bergquist

    2017-11-01

    Full Text Available Interventions using either contests or norms can promote environmental behavioral change. Yet research on the implications of contest-based and norm-based interventions is lacking. Based on Goal-framing theory, we suggest that a contest-based intervention frames a gain goal, promoting intensive but instrumental behavioral engagement. In contrast, the norm-based intervention was expected to frame a normative goal, activating normative obligations for targeted and non-targeted behavior and motivation to engage in pro-environmental behaviors in the future. In two studies, participants (n = 347) were randomly assigned to either a contest- or a norm-based intervention technique. Participants in the contest showed more intensive engagement in both studies. Participants in the norm-based intervention tended to report higher intentions for future energy conservation (Study 1) and higher personal norms for non-targeted pro-environmental behaviors (Study 2). These findings suggest that a contest-based intervention technique frames a gain goal, while a norm-based intervention frames a normative goal.

  3. Characterization of Some Iraqi Archaeological Samples Using IBA, Analytical X-ray and Other Complementary Techniques

    International Nuclear Information System (INIS)

    Al-Sarraj, Ziyad Shihab; Damboos, Hassan I; Roumie, Mohamad

    2012-01-01

    The present work aimed at investigating the compositions and microstructures of some archaeological samples dating back to various periods of the ancient Iraqi civilizations, using PIXE, XRF, XRD, and SEM techniques. The objects selected for the study (ceramics, glaze, etc.) were diverse in size and nature; therefore, a limited number of samples was cut from them with a small diamond wheel. A conventional powder metallurgy method was then used to prepare the samples. The dried samples were coated with a thin layer of carbon and analyzed using the ion beam accelerator of the LAEC. Three other groups of samples were also prepared for analysis by X-ray fluorescence (XRF), X-ray diffraction (XRD), and scanning electron microscopy (SEM). The chemical compositions determined by the various techniques showed good agreement, as did the phases, while the fine-structure analysis by optical and scanning microscopy exhibited features of a structure with intensified densification in the final stage of sintering, accompanied by a quasi-homogeneous distribution of closed pores. This leads to the conclusion that the sintering temperature used by the ancient Iraqis was sufficient, probably in the range 950-1200°C, and that the mixes and forming methods they used were suitable for obtaining well-sintered bodies with an even distribution of pores. A ring-shaped trace noticed in the SEM micrographs requires further study to explain.

  4. PCR-based identification of eight Lactobacillus species and 18 hr-HPV genotypes in fixed cervical samples of South African women at risk of HIV and BV.

    NARCIS (Netherlands)

    Dols, J.A.M.; Reid, G.; Kort, R.; Schuren, F.H.J.; Tempelman, H.; Bontekoe, T.R.; Korporaal, H.; van der Veer, E.M.; Smit, P.W.; Boon, M.E.

    2012-01-01

    Vaginal lactobacilli assessed by PCR-based microarray and PCR-based genotyping of HPV in South African women at risk for HIV and BV. Vaginal lactobacilli can be defined by microarray techniques in fixed cervical samples of South African women. Cervical brush samples suspended in the coagulant

  5. PCR-based identification of eight lactobacillus species and 18 hr-HPV genotypes in fixed cervical samples of south african women at risk of HIV and BV

    NARCIS (Netherlands)

    Dols, J.A.M.; Reid, G.; Kort, R.; Schuren, F.H.J.; Tempelman, H.; Bontekoe, T.R.; Korporaal, H.; Veer, E.M. van der; Smit, P.W.; Boon, M.E.

    2012-01-01

    Vaginal lactobacilli assessed by PCR-based microarray and PCR-based genotyping of HPV in South African women at risk for HIV and BV. Vaginal lactobacilli can be defined by microarray techniques in fixed cervical samples of South African women. Cervical brush samples suspended in the coagulant

  6. Potentiometric chip-based multipumping flow system for the simultaneous determination of fluoride, chloride, pH, and redox potential in water samples.

    Science.gov (United States)

    Chango, Gabriela; Palacio, Edwin; Cerdà, Víctor

    2018-08-15

    A simple potentiometric chip-based multipumping flow system (MPFS) has been developed for the simultaneous determination of fluoride, chloride, pH, and redox potential in water samples. The system was built around a poly(methyl methacrylate) microfluidic chip, combining the advantages of flow techniques with potentiometric detection. For this purpose, an automatic system was designed and built, optimizing the variables involved in the process, such as pH, ionic strength, stirring, and sample volume. The system was applied successfully to water samples, providing a versatile setup with an analysis frequency of 12 samples per hour. Good correlation between the chloride and fluoride concentrations measured with the ISEs and by ion chromatography suggests satisfactory reliability of the system. Copyright © 2018 Elsevier B.V. All rights reserved.
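
    Potentiometric detection with ion-selective electrodes, as used here, rests on the Nernst equation. The sketch below converts a measured electrode potential to a fluoride concentration; E0 and the slope are hypothetical calibration constants, not values from the paper:

    ```python
    # Nernstian response of an ion-selective electrode (ISE):
    # E = E0 + S * log10(a), so a = 10 ** ((E - E0) / S).
    E0 = 210.0       # mV, hypothetical standard potential
    SLOPE = -59.16   # mV/decade at 25 C for a monovalent anion such as F-

    def fluoride_molar(e_mv: float) -> float:
        """Activity (approx. concentration, mol/L) from potential (mV)."""
        return 10 ** ((e_mv - E0) / SLOPE)

    print(fluoride_molar(210.0))   # 1.0 mol/L when E equals E0
    ```

    A real calibration would fix E0 and the slope from standard solutions at constant ionic strength, which is why ionic strength appears among the optimized variables.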

  7. Fabrication of superconducting MgB2 nanostructures by an electron beam lithography-based technique

    Science.gov (United States)

    Portesi, C.; Borini, S.; Amato, G.; Monticone, E.

    2006-03-01

    In this work, we present results on the fabrication and characterization of magnesium diboride nanowires realized by an electron beam lithography (EBL)-based method. For fabricating the MgB2 thin films, an all-in-situ technique was used, based on the coevaporation of B and Mg by means of an e-gun and a resistive heater, respectively. Since the high temperatures required for the fabrication of good-quality MgB2 thin films do not allow a nanostructuring approach based on the lift-off technique, we structured the samples by combining EBL, optical lithography, and Ar milling. In this way, reproducible nanowires 1 μm long have been obtained. To illustrate the impact of the processing on the superconducting properties of the MgB2 film, we measured the temperature dependence of the resistance of a nanowire and compared it to that of the original magnesium diboride film. The electrical properties of the films are not degraded by the nanostructuring process, so superconducting nanodevices may be obtained by this method.

  8. Sport Sampling Is Associated With Improved Landing Technique in Youth Athletes.

    Science.gov (United States)

    DiStefano, Lindsay J; Beltz, Eleanor M; Root, Hayley J; Martinez, Jessica C; Houghton, Andrew; Taranto, Nicole; Pearce, Katherine; McConnell, Erin; Muscat, Courtney; Boyle, Steve; Trojian, Thomas H

    Sport sampling is recommended to promote fundamental movement skill acquisition and physical activity. In contrast, sport specialization is associated with musculoskeletal injury risk, burnout, and attrition from sport. There is limited evidence on the influence of sport sampling on neuromuscular control, which is associated with injury risk, in youth athletes. The hypothesis was that athletes who participated in only 1 sport during the previous year would demonstrate higher Landing Error Scoring System (LESS) scores than their counterparts. Cross-sectional study. Level 3. A total of 355 youth athletes (age range, 8-14 years) completed a test session with a jump-landing task, which was evaluated using the LESS. Participants were categorized as single sport (SS) or multisport (MS) based on their self-reported sport participation in the past year, and their duration of sport sampling (low, moderate, high) was determined from their sport participation history. Participants were dichotomized into good or poor control groups based on their LESS scores and compared across sport category and sampling duration (low, moderate, high). The MS group was 2.5 times (95% CI, 1.9-3.1) as likely to be categorized as having good control compared with the SS group (χ2(355) = 10.10). Those in the high sampling duration group were 5.8 times (95% CI, 3.1-8.5) and 5.4 times (95% CI, 4.0-6.8) as likely to be categorized as having good control compared with the moderate and low groups (χ2(216) = 11.20). Sport sampling at a young age is associated with improved neuromuscular control, which may reduce injury risk in youth athletes. Youth athletes should be encouraged to participate in multiple sports to enhance their neuromuscular control and promote long-term physical activity.

  9. Advances in modern sample preparation techniques using microwaves assisted chemistry for metal species determination (W1)

    International Nuclear Information System (INIS)

    Ponard, O.F.X.

    2002-01-01

    Full text: Sample preparation has long been the bottleneck of environmental analysis, for both total and species-specific analysis. Digestion, extraction, and preparation of the analytes rely on a series of chemical reactions. The introduction of microwave-assisted sample preparation was first viewed as a means to accelerate the kinetics of matrix digestion for total elements and to speed up sample preparation procedures. However, the extensive development and success of microwave digestion procedures in total elemental analysis has now given a broader insight into the perspectives offered by this technique. Microwave technologies now offer precise control of the temperature and, indirectly, of the reaction kinetics taking place during the sample preparation procedures. Microwave-assisted chemistry makes it possible to perform simultaneously the fundamental steps required for metal species extraction and derivatization. The number of sample preparation steps used for organotin or organomercury species has been reduced to one, and the total sample preparation time brought down from a few hours to some minutes. Further, developments in GC/ICP/MS techniques allow routine use of speciated isotope dilution methods as internal probes of the chemical reactions. These new approaches allow us to use the addition of labeled species for isotope dilution as a means to evaluate and follow the chemical processes taking place during the extraction procedure. Such procedures help us to understand and check the stability of the analytes during the chemistry of the sample preparation procedure and give some insight into the chemistry taking place during the extraction. Understanding the different mechanisms involved in the sample preparation steps will in turn allow us to further improve all these procedures and bring us to the horizon of 'on-line sample preparation and detection'. (author)

  10. Small sample whole-genome amplification

    Science.gov (United States)

    Hara, Christine; Nguyen, Christine; Wheeler, Elizabeth; Sorensen, Karen; Arroyo, Erin; Vrankovich, Greg; Christian, Allen

    2005-11-01

    Many challenges arise when trying to amplify and analyze human samples collected in the field, due to limitations in sample quantity and contamination of the starting material. Tests such as DNA fingerprinting and mitochondrial typing require a certain sample size and are carried out in large-volume reactions; in cases where insufficient sample is present, whole-genome amplification (WGA) can be used. WGA allows very small quantities of DNA to be amplified in a way that enables subsequent DNA-based tests to be performed. A limiting step in WGA is sample preparation. To minimize the necessary sample size, we have developed two modifications of WGA: the first allows for an increase in amplified product from small, nanoscale, purified samples through the use of carrier DNA, while the second is a single-step method for cleaning and amplifying samples all in one column. Conventional DNA cleanup involves binding the DNA to silica, washing away impurities, and then releasing the DNA for subsequent testing. We have eliminated losses associated with incomplete sample release, thereby decreasing the required amount of starting template for DNA testing. Both techniques address the limitations of sample size by providing ample copies of genomic samples. Carrier DNA, included in our WGA reactions, can be used when amplifying samples with the standard purification method, or in conjunction with our single-step DNA purification technique to potentially further decrease the amount of starting sample necessary for future forensic DNA-based assays.

  11. Technique for fast and efficient hierarchical clustering

    Science.gov (United States)

    Stork, Christopher

    2013-10-08

    A fast and efficient technique for hierarchical clustering of samples in a dataset includes compressing the dataset to reduce the number of variables within each of the samples. A nearest neighbor matrix is generated to identify nearest-neighbor pairs between the samples based on differences between the variables of the samples. The samples are arranged into a hierarchy that groups the samples based on the nearest neighbor matrix. The hierarchy is rendered to a display to graphically illustrate similarities or differences between the samples.
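
    The steps described (compress, build a nearest-neighbor structure, group into a hierarchy) can be approximated with standard tools. PCA and single-linkage clustering below are stand-ins for the unspecified compression and grouping methods, chosen because single linkage merges nearest-neighbor pairs first:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    # Two well-separated groups of samples with many variables each.
    data = np.vstack([rng.normal(0, 0.1, (10, 50)),
                      rng.normal(5, 0.1, (10, 50))])

    # Step 1: compress the dataset to fewer variables per sample.
    compressed = PCA(n_components=3).fit_transform(data)

    # Steps 2-3: build the hierarchy; "single" linkage repeatedly merges
    # the current nearest-neighbor pair of clusters.
    tree = linkage(compressed, method="single")

    # Cut the tree into two groups; samples 0-9 and 10-19 should separate.
    labels = fcluster(tree, t=2, criterion="maxclust")
    print(labels)
    ```

    Rendering the hierarchy (step 4) would use `scipy.cluster.hierarchy.dendrogram` on the same `tree`.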

  12. Hyphenated analytical techniques for materials characterisation

    International Nuclear Information System (INIS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-01-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectrometry and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical, mechanical, electrical and thermal properties, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy-based techniques, scanning probe-based techniques, and thermal analysis-based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the

  13. Hyphenated analytical techniques for materials characterisation

    Science.gov (United States)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectrometry and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical, mechanical, electrical and thermal properties, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy-based techniques, scanning probe-based techniques, and thermal analysis-based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the

  14. Imaging a Large Sample with Selective Plane Illumination Microscopy Based on Multiple Fluorescent Microsphere Tracking

    Science.gov (United States)

    Ryu, Inkeon; Kim, Daekeun

    2018-04-01

    A typical selective plane illumination microscopy (SPIM) image size is fundamentally limited by the field of view, a characteristic of the objective lens. If an image larger than the imaging area of the sample is to be obtained, image stitching, which combines step-scanned images into a single panoramic image, is required. However, accurately registering the step-scanned images is very difficult because the SPIM system uses a customized sample mount in which uncertainties exist for both translational and rotational motions. In this paper, an image registration technique based on multiple fluorescent microsphere tracking is proposed, which quantifies the constellations of, and measures the distances between, at least two fluorescent microspheres embedded in the sample. Image stitching results are demonstrated for optically cleared large tissue with various staining methods. Compensation for the effect of sample rotation occurring during translational motion in the sample mount is also discussed.
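
    A minimal sketch of the bead-based registration idea, assuming the microspheres are already matched between frames and only translation is recovered (the paper's rotation compensation is not shown); all coordinates are invented example values:

    ```python
    import numpy as np

    # (x, y) positions of the same fluorescent microspheres seen in two
    # overlapping step-scanned frames. With matched bead coordinates, the
    # least-squares translation between frames is just the mean offset.
    beads_a = np.array([[10.0, 12.0], [40.0, 8.0], [25.0, 30.0]])
    offset_true = np.array([3.2, -1.5])
    noise = np.random.default_rng(2).normal(0, 0.01, (3, 2))
    beads_b = beads_a + offset_true + noise   # same beads in the next frame

    shift = (beads_b - beads_a).mean(axis=0)  # recovered translation
    print(np.round(shift, 1))                 # ~ [ 3.2 -1.5]
    ```

    Recovering rotation as well would require at least two beads per frame and a rigid-transform fit (e.g. the Kabsch algorithm), which is consistent with the paper's requirement of multiple embedded microspheres.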

  15. Assessment of fracture risk: value of random population-based samples--the Geelong Osteoporosis Study.

    Science.gov (United States)

    Henry, M J; Pasco, J A; Seeman, E; Nicholson, G C; Sanders, K M; Kotowicz, M A

    2001-01-01

    Fracture risk is determined by bone mineral density (BMD). The T-score, a measure of fracture risk, is the position of an individual's BMD in relation to a reference range. The aim of this study was to determine the magnitude of change in the T-score when different sampling techniques were used to produce the reference range. Reference ranges were derived from three samples, drawn from the same region: (1) an age-stratified population-based random sample, (2) unselected volunteers, and (3) a selected healthy subset of the population-based sample with no diseases or drugs known to affect bone. T-scores were calculated using the three reference ranges for a cohort of women who had sustained a fracture and as a group had a low mean BMD (ages 35-72 yr; n = 484). For most comparisons, the T-scores for the fracture cohort were more negative using the population reference range. The difference in T-scores reached 1.0 SD. The proportion of the fracture cohort classified as having osteoporosis at the spine was 26, 14, and 23% when the population, volunteer, and healthy reference ranges were applied, respectively. The use of inappropriate reference ranges results in substantial changes to T-scores and may lead to inappropriate management.
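
    The T-score itself is a simple standardization, which makes the study's central point easy to reproduce: the same BMD maps to different T-scores under different reference ranges. The reference means and SDs below are hypothetical, not the Geelong values:

    ```python
    def t_score(bmd: float, ref_mean: float, ref_sd: float) -> float:
        """T-score: position of a BMD value within a reference range, in SDs."""
        return (bmd - ref_mean) / ref_sd

    # Hypothetical reference ranges (g/cm2); the same patient BMD yields
    # noticeably different T-scores depending on the range applied.
    bmd = 0.85
    pop_t = t_score(bmd, ref_mean=1.00, ref_sd=0.10)   # population-based range
    vol_t = t_score(bmd, ref_mean=0.95, ref_sd=0.12)   # volunteer-based range
    print(round(pop_t, 2), round(vol_t, 2))            # -1.5 -0.83
    ```

    Since osteoporosis is commonly defined as T ≤ -2.5, a shift of up to 1.0 SD between reference ranges, as reported in the study, can move a substantial fraction of patients across the diagnostic threshold.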

  16. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies.

    Science.gov (United States)

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/.

  17. Nasal base narrowing: the combined alar base excision technique.

    Science.gov (United States)

    Foda, Hossam M T

    2007-01-01

    To evaluate the role of the combined alar base excision technique in narrowing the nasal base and correcting excessive alar flare. The study included 60 cases presenting with a wide nasal base and excessive alar flaring. The surgical procedure combined an external alar wedge resection with an internal vestibular floor excision. All cases were followed up for a mean of 32 (range, 12-144) months. Nasal tip modification and correction of any preexisting caudal septal deformities were always completed before the nasal base narrowing. The mean width of the external alar wedge excised was 7.2 (range, 4-11) mm, whereas the mean width of the sill excision was 3.1 (range, 2-7) mm. Completing the internal excision first resulted in a more conservative external resection, thus avoiding any blunting of the alar-facial crease. No cases of postoperative bleeding, infection, or keloid formation were encountered, and the external alar wedge excision healed with an inconspicuous scar that was well hidden in the depth of the alar-facial crease. Finally, the risk of notching of the alar rim, which can occur at the junction of the external and internal excisions, was significantly reduced by adopting a 2-layered closure of the vestibular floor (P = .01). The combined alar base excision resulted in effective narrowing of the nasal base with elimination of excessive alar flare. Commonly feared complications, such as blunting of the alar-facial crease or notching of the alar rim, were avoided by using simple modifications in the technique of excision and closure.

  18. Simulation-based optimization parametric optimization techniques and reinforcement learning

    CERN Document Server

    Gosavi, Abhijit

    2003-01-01

    Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...

  19. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Fields (MRF), watershed segmentation, and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmentation is obtained based on the K-means clustering technique and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains regions of different intensity. The gradient values are calculated, and the watershed technique is applied. The DIS value is calculated for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge of likely region boundaries for the next step (MRF), which produces an image containing all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used, and the results are compared. The segmentation and edge detection yield one closed boundary per actual region in the image.
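
    The first stage, an edge-strength map derived from image gradients, can be sketched with plain Sobel filters; this is a generic stand-in for the paper's DIS computation, not its exact formula:

    ```python
    import numpy as np

    # Edge-strength (DIS-like) map from gradient magnitude via Sobel filters.
    def sobel_edges(img: np.ndarray) -> np.ndarray:
        kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
        ky = kx.T
        h, w = img.shape
        out = np.zeros((h - 2, w - 2))
        for i in range(h - 2):
            for j in range(w - 2):
                win = img[i:i + 3, j:j + 3]
                out[i, j] = np.hypot((win * kx).sum(), (win * ky).sum())
        return out

    # A step edge: left half dark, right half bright.
    img = np.zeros((8, 8))
    img[:, 4:] = 1.0
    edges = sobel_edges(img)
    print(edges.max())   # strong response only at the column-4 boundary
    ```

    In the described pipeline, such a map would feed both the initial K-means/minimum-distance segmentation and, as prior knowledge, the MRF region model.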

  20. Photon event distribution sampling: an image formation technique for scanning microscopes that permits tracking of sub-diffraction particles with high spatial and temporal resolutions.

    Science.gov (United States)

    Larkin, J D; Publicover, N G; Sutko, J L

    2011-01-01

    In photon event distribution sampling, an image formation technique for scanning microscopes, the maximum likelihood position of origin of each detected photon is acquired as a data set rather than binning photons in pixels. Subsequently, an intensity-related probability density function describing the uncertainty associated with the photon position measurement is applied to each position and individual photon intensity distributions are summed to form an image. Compared to pixel-based images, photon event distribution sampling images exhibit increased signal-to-noise and comparable spatial resolution. Photon event distribution sampling is superior to pixel-based image formation in recognizing the presence of structured (non-random) photon distributions at low photon counts and permits use of non-raster scanning patterns. A photon event distribution sampling based method for localizing single particles derived from a multi-variate normal distribution is more precise than statistical (Gaussian) fitting to pixel-based images. Using the multi-variate normal distribution method, non-raster scanning and a typical confocal microscope, localizations with 8 nm precision were achieved at 10 ms sampling rates with acquisition of ~200 photons per frame. Single nanometre precision was obtained with a greater number of photons per frame. In summary, photon event distribution sampling provides an efficient way to form images when low numbers of photons are involved and permits particle tracking with confocal point-scanning microscopes with nanometre precision deep within specimens. © 2010 The Authors Journal of Microscopy © 2010 The Royal Microscopical Society.
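
    The localization principle can be sketched in a few lines: for an isotropic Gaussian position uncertainty, the maximum-likelihood particle position is the mean of the detected photon positions, with precision improving as sigma/sqrt(N). All numbers below are illustrative, not values from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    true_pos = np.array([120.0, 80.0])   # nm, hypothetical particle position
    psf_sigma = 100.0                    # nm, per-photon position uncertainty

    # ~200 detected photons per frame, each with a measured position of origin.
    photons = rng.normal(true_pos, psf_sigma, size=(200, 2))

    # For an isotropic Gaussian, the MLE for the particle position is simply
    # the mean of the photon positions.
    estimate = photons.mean(axis=0)
    precision = psf_sigma / np.sqrt(len(photons))   # ~7 nm for 200 photons
    print(np.round(estimate), round(precision, 1))
    ```

    The sqrt(N) scaling matches the abstract: roughly 200 photons per frame give ~8 nm precision, and single-nanometre precision needs correspondingly more photons.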

  1. Estimation of plant sampling uncertainty: an example based on chemical analysis of moss samples.

    Science.gov (United States)

    Dołęgowska, Sabina

    2016-11-01

    In order to estimate the level of uncertainty arising from sampling, 54 samples (primary and duplicate) of the moss species Pleurozium schreberi (Brid.) Mitt. were collected within three forested areas (Wierna Rzeka, Piaski, Posłowice Range) in the Holy Cross Mountains (south-central Poland). During the fieldwork, each primary sample composed of 8 to 10 increments (subsamples) was taken over an area of 10 m², whereas duplicate samples were collected in the same way at a distance of 1-2 m. Subsequently, all samples were triple rinsed with deionized water, dried, milled, and digested (8 mL HNO₃ (1:1) + 1 mL 30% H₂O₂) in a closed microwave system Multiwave 3000. The prepared solutions were analyzed twice for Cu, Fe, Mn, and Zn using FAAS and GFAAS techniques. All datasets were checked for normality, and for normally distributed elements (Cu from Piaski, Zn from Posłowice, Fe and Zn from Wierna Rzeka) the sampling uncertainty was computed with (i) classical ANOVA, (ii) classical RANOVA, (iii) modified RANOVA, and (iv) range statistics. For the remaining elements, the sampling uncertainty was calculated with traditional and/or modified RANOVA (if the amount of outliers did not exceed 10%) or classical ANOVA after Box-Cox transformation (if the amount of outliers exceeded 10%). The highest concentrations of all elements were found in moss samples from Piaski, whereas the sampling uncertainty calculated with different statistical methods ranged from 4.1 to 22%.
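
    The duplicate-sample design above can be sketched numerically: with primary/duplicate pairs, the classical one-way ANOVA estimate of the within-target (sampling plus analytical) variance is half the mean squared pair difference. The concentrations below are hypothetical, for illustration only.

```python
import numpy as np

def duplicate_uncertainty(primary, duplicate):
    """Within-target standard deviation from primary/duplicate pairs
    (classical one-way ANOVA: s^2 = mean of squared pair differences / 2),
    also expressed as a relative uncertainty in percent."""
    primary = np.asarray(primary, float)
    duplicate = np.asarray(duplicate, float)
    d = primary - duplicate
    s2 = np.mean(d ** 2) / 2.0            # pooled within-pair variance
    s = np.sqrt(s2)
    rel = 100.0 * s / np.mean(np.concatenate([primary, duplicate]))
    return s, rel

# hypothetical Zn concentrations (mg/kg) at 6 sampling targets
prim = [41.2, 38.7, 44.1, 40.3, 39.8, 42.5]
dup  = [39.9, 40.1, 42.8, 41.5, 38.6, 43.7]
s, rel = duplicate_uncertainty(prim, dup)
```

    A relative value of a few percent, as computed here, is the kind of figure the 4.1-22% range above summarizes across elements and methods.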

  2. Indirect Fluorescent Antibody Technique based Prevalence of Surra in Equines

    Directory of Open Access Journals (Sweden)

    Ahsan Nadeem, Asim Aslam, Zafar Iqbal Chaudhary, Kamran Ashraf, Khalid Saeed, Nisar Ahmad, Ishtiaq Ahmed and Habib ur Rehman

    2011-04-01

    This project was carried out to determine the prevalence of trypanosomiasis in equines in District Gujranwala by using the indirect fluorescent antibody technique and the thin smear method. Blood samples were collected from a total of 200 horses and donkeys of different ages and either sex. Duplicate thin blood smears were prepared from each sample, and the remaining blood was centrifuged to separate the serum. Smears from each animal were processed for Giemsa staining and the indirect fluorescent antibody test (IFAT). Giemsa-stained smears revealed Trypanosome infection in 4/200 (2.0%) samples and IFAT in 12/200 (6.0%) animals.

  3. Output Information Based Fault-Tolerant Iterative Learning Control for Dual-Rate Sampling Process with Disturbances and Output Delay

    Directory of Open Access Journals (Sweden)

    Hongfeng Tao

    2018-01-01

    For a class of single-input single-output (SISO) dual-rate sampling processes with disturbances and output delay, this paper presents a robust fault-tolerant iterative learning control algorithm based on output information. Firstly, the dual-rate sampling process with output delay is transformed into a discrete state-space model with a slow sampling rate and no time delay by using lifting technology; then an output-information-based fault-tolerant iterative learning control scheme is designed and the control process is turned into an equivalent two-dimensional (2D) repetitive process. Moreover, based on repetitive process stability theory, sufficient conditions for the stability of the system and the design method of the robust controller are given in terms of the linear matrix inequality (LMI) technique. Finally, flow control simulations of two flow tanks in series demonstrate the feasibility and effectiveness of the proposed method.
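
    The LMI-based fault-tolerant design above is beyond a short sketch, but the underlying iterative learning idea, refining the input trajectory from one trial's tracking error to the next, can be illustrated with a minimal P-type ILC on a hypothetical first-order plant (the plant numbers and learning gain are assumptions, not the paper's design):

```python
import numpy as np

# First-order discrete plant y[t+1] = a*y[t] + b*u[t], run repeatedly
# over the same finite trajectory; P-type update u_{k+1} = u_k + gamma*e_k.
a, b, gamma = 0.3, 0.5, 1.0
T = 50
ref = np.sin(np.linspace(0, 2 * np.pi, T + 1))  # trajectory to track each trial

def run_trial(u):
    y = np.zeros(T + 1)
    for t in range(T):
        y[t + 1] = a * y[t] + b * u[t]
    return y

u = np.zeros(T)
errors = []
for k in range(30):                     # 30 learning trials
    y = run_trial(u)
    e = ref - y
    errors.append(np.max(np.abs(e[1:])))
    u = u + gamma * e[1:]               # learn from the next-step error
```

    Since |1 - gamma*b| < 1 here, the trial-to-trial error contracts and the tracking error shrinks essentially to zero over the trials, which is the behavior the 2D repetitive-process stability conditions guarantee in the full design.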

  4. 'Intelligent' approach to radioimmunoassay sample counting employing a microprocessor controlled sample counter

    International Nuclear Information System (INIS)

    Ekins, R.P.; Sufi, S.; Malan, P.G.

    1977-01-01

    The enormous impact on medical science in the last two decades of microanalytical techniques employing radioisotopic labels has, in turn, generated a large demand for automatic radioisotopic sample counters. Such instruments frequently comprise the most important item of capital equipment required in the use of radioimmunoassay and related techniques and often form a principal bottleneck in the flow of samples through a busy laboratory. It is therefore particularly imperative that such instruments should be used 'intelligently' and in an optimal fashion to avoid both the very large capital expenditure involved in the unnecessary proliferation of instruments and the time delays arising from their sub-optimal use. The majority of the current generation of radioactive sample counters nevertheless rely on primitive control mechanisms based on a simplistic statistical theory of radioactive sample counting which preclude their efficient and rational use. The fundamental principle upon which this approach is based is that it is useless to continue counting a radioactive sample for a time longer than that required to yield a significant increase in precision of the measurement. Thus, since substantial experimental errors occur during sample preparation, these errors should be assessed and must be related to the counting errors for that sample. It is the objective of this presentation to demonstrate that the combination of a realistic statistical assessment of radioactive sample measurement, together with the more sophisticated control mechanisms that modern microprocessor technology makes possible, may often enable savings in counter usage of the order of 5-10 fold to be made. (orig.) [de
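
    The fundamental principle stated above can be made concrete: for Poisson counting the relative counting error is 1/sqrt(N), so once the counting error is small compared with the sample-preparation error, counting longer buys almost no overall precision. A minimal sketch (the count rate and preparation error below are hypothetical):

```python
import math

def counting_time(rate_cps, assay_rsd, fraction=0.3):
    """Time needed so the Poisson counting error (1/sqrt(N)) is at most
    `fraction` of the non-counting assay error; counting beyond this
    barely improves the combined precision sqrt(assay^2 + counting^2)."""
    target_rsd = fraction * assay_rsd
    counts_needed = 1.0 / target_rsd ** 2    # from RSD = 1/sqrt(N)
    return counts_needed / rate_cps

# hypothetical: 50 cps sample, 5% sample-preparation (pipetting) error
t = counting_time(50.0, 0.05)
```

    Here roughly 4450 counts (about 90 s) already make the counting error a minor contributor; a counter that stops at this point rather than at a fixed preset count is what enables the 5-10 fold savings claimed above.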

  5. Newly developed liquid-based cytology. TACAS™: cytological appearance and HPV testing using liquid-based sample.

    Science.gov (United States)

    Kubushiro, Kaneyuki; Taoka, Hideki; Sakurai, Nobuyuki; Yamamoto, Yasuhiro; Kurasaki, Akiko; Asakawa, Yasuyuki; Iwahara, Minoru; Takahashi, Kei

    2011-09-01

    Cell profiles determined by the thin-layer advanced cytology assay system (TACAS™), a liquid-based cytology technique newly developed in Japan, were analyzed in this study. Hybrid capture 2 (HC-2) was also performed using the liquid-based samples prepared by TACAS to ascertain its ability to detect human papillomavirus (HPV). Cell collection samples from the uterine cervix were obtained from 359 patients and examined cytologically. An HC-2 assay for HPV was carried out on the cell specimens. All specimens were found to show background factors such as leukocytes. After excluding the 5 unsatisfactory cases, of the remaining 354 cases 82 (23.2%) were positive and 272 (76.8%) were negative for HPV. Cell specimens from 30 HPV-positive cases and 166 HPV-negative cases were subjected to 4 weeks of preservation at room temperature. When subsequently re-assayed, 28 cases (93.3%) in the former group were found to be HPV-positive and 164 cases (98.8%) in the latter group HPV-negative. These results supported the excellent reproducibility of TACAS for HPV testing. A reasonable inference from the foregoing analysis is that TACAS may be distinguished from other liquid-based cytological approaches, such as ThinPrep and SurePath, in that it can retain the cell backgrounds. Furthermore, this study raises the possibility that cell specimens prepared using TACAS could be preserved for at least 4 weeks prior to carrying out an HC-2 assay for HPV.

  6. Large Sample Neutron Activation Analysis of Heterogeneous Samples

    International Nuclear Information System (INIS)

    Stamatelatos, I.E.; Vasilopoulou, T.; Tzika, F.

    2018-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) technique was developed for non-destructive analysis of heterogeneous bulk samples. The technique combined collimated scanning with experimental measurements and Monte Carlo simulations for the identification of inhomogeneities in large-volume samples and the correction of their effect on the interpretation of gamma-spectrometry data. Corrections were applied for the effects of neutron self-shielding, gamma-ray attenuation, the geometrical factor and heterogeneous activity distribution within the sample. A benchmark experiment was performed to investigate the effect of heterogeneity on the accuracy of LSNAA. Moreover, a ceramic vase was analyzed as a whole, demonstrating the feasibility of the technique. The LSNAA results were compared against results obtained by INAA and a satisfactory agreement between the two methods was observed. This study showed that LSNAA is a technique capable of performing accurate non-destructive, multi-elemental compositional analysis of heterogeneous objects. It also revealed the great potential of the technique for the analysis of precious objects and artefacts that need to be preserved intact and cannot be damaged for sampling purposes. (author)

  7. Single-particle characterization of ice-nucleating particles and ice particle residuals sampled by three different techniques

    Science.gov (United States)

    Worringen, A.; Kandler, K.; Benker, N.; Dirsch, T.; Mertes, S.; Schenk, L.; Kästner, U.; Frank, F.; Nillius, B.; Bundke, U.; Rose, D.; Curtius, J.; Kupiszewski, P.; Weingartner, E.; Vochezer, P.; Schneider, J.; Schmidt, S.; Weinbruch, S.; Ebert, M.

    2015-04-01

    In the present work, three different techniques to separate ice-nucleating particles (INPs) as well as ice particle residuals (IPRs) from non-ice-active particles are compared. The Ice Selective Inlet (ISI) and the Ice Counterflow Virtual Impactor (Ice-CVI) sample ice particles from mixed-phase clouds and, after evaporation in the instrument, allow analysis of the residuals. The Fast Ice Nucleus Chamber (FINCH) coupled with the Ice Nuclei Pumped Counterflow Virtual Impactor (IN-PCVI) provides ice-activating conditions to aerosol particles and extracts the activated particles for analysis. The instruments were run during a joint field campaign which took place in January and February 2013 at the High Alpine Research Station Jungfraujoch (Switzerland). INPs and IPRs were analyzed offline by scanning electron microscopy and energy-dispersive X-ray microanalysis to determine their size, chemical composition and mixing state. Online analysis of the size and chemical composition of INPs activated in FINCH was performed by laser ablation mass spectrometry. With all three INP/IPR separation techniques, high abundances (median 20-70%) of instrumental contamination artifacts were observed (ISI: Si-O spheres, probably calibration aerosol; Ice-CVI: Al-O particles; FINCH + IN-PCVI: steel particles). After removal of the instrumental contamination particles, silicates, Ca-rich particles, carbonaceous material and metal oxides were the major INP/IPR particle types obtained by all three techniques. In addition, considerable amounts (median abundance mostly a few percent) of soluble material (e.g., sea salt, sulfates) were observed. As these soluble particles are often not expected to act as INPs/IPRs, we consider them potential measurement artifacts. Minor types of INP/IPR include soot and Pb-bearing particles. The Pb-bearing particles are mainly present as an internal mixture with other particle types. Most samples showed a maximum of the INP/IPR size distribution at 200

  8. Design and building of a homemade sample changer for automation of the irradiation in neutron activation analysis technique

    International Nuclear Information System (INIS)

    Gago, Javier; Hernandez, Yuri; Baltuano, Oscar; Bedregal, Patricia; Lopez, Yon; Urquizo, Rafael

    2014-01-01

    Because the RP-10 research reactor operates during weekends, it was necessary to design and build a sample changer for irradiation as part of the automation of the neutron activation analysis technique. The device consists of an aluminium turntable disk which can accommodate 19 polyethylene capsules containing samples to be sent, using the pneumatic transfer system, from the laboratory to the irradiation position. The system is operated by a control switchboard to send and return capsules at a variable preset time and by two different paths, allowing the determination of short-, medium- and long-lived radionuclides. Another mechanism, called an 'exchange valve', was designed for changing travel paths (pipelines), allowing the irradiated samples to be stored for a longer time in the reactor hall. The system design has allowed complete automation of this technique, enabling the irradiation of samples without the presence of an analyst. The design, construction and operation of the device are described in this article. (authors).

  9. A comparative study of 232Th and 238U activity estimation in soil samples by gamma spectrometry and neutron activation analysis technique

    International Nuclear Information System (INIS)

    Anilkumar, Rekha; Anilkumar, S.; Narayani, K.; Babu, D.A.R.; Sharma, D.N.

    2012-01-01

    Neutron activation analysis (NAA) is a well-established analytical technique. It has many advantages compared to other commonly used techniques. NAA can be performed in a variety of ways depending on the element, its activity level in the sample, interference from the sample matrix and other elements, etc. This technique is used to achieve high analytical sensitivity and low detection limits (ppm to ppb). The high sensitivity is due to irradiation at the high neutron flux available from research reactors, and the activity measurement is done using high-resolution HPGe detectors. In this paper, the activity estimation of soil samples using neutron activation and direct gamma spectrometry methods is compared. Even though the sample weights and sample preparation methods are different for these two methods, the estimated activity values are comparable. (author)

  10. Development and application of the analyzer-based imaging technique with hard synchrotron radiation

    Energy Technology Data Exchange (ETDEWEB)

    Coan, P

    2006-07-15

    The objective of this thesis is twofold: on one side, the application of analyser-based X-ray phase contrast imaging to study cartilage, bone and bone implants using ESRF synchrotron radiation sources, and on the other, a contribution to the development of phase contrast techniques from the theoretical and experimental points of view. Several human samples have been studied in vitro using the analyser-based imaging (ABI) technique. Examination included projection and computed tomography imaging and 3-dimensional volume rendering of hip, big toe and ankle articular joints. X-ray ABI images have been critically compared with those obtained with conventional techniques, including radiography, computed tomography, ultrasound, magnetic resonance and histology, the latter taken as the gold standard. Results show that only ABI was able to either visualize or correctly estimate the early pathological status of the cartilage. The status of bone ingrowth in sheep implants has also been examined in vitro: ABI images made it possible to correctly distinguish between good and incomplete bone healing. Pioneering in-vivo ABI experiments on guinea pigs were also successfully performed, confirming the possible use of the technique to follow up the progression of joint diseases, bone/metal ingrowth and the efficacy of drug treatments. As part of the development of the phase contrast techniques, two objectives have been reached. First, it has been experimentally demonstrated for the first time that ABI and propagation-based imaging (PBI) can be combined to create images with original features (hybrid imaging, HI). Secondly, a new simplified set-up capable of producing images with properties similar to those obtained with the ABI technique or HI has been proposed and experimentally tested. Finally, both ABI and HI have been theoretically studied with an innovative, wave-based simulation program, which was able to correctly reproduce experimental results. (author)

  11. Nuclear techniques for trace element analysis. PIXE and its applications to biomedical samples

    International Nuclear Information System (INIS)

    Cata-Danil, I.; Moro, R.; Gialanella, G.

    1996-01-01

    Problems in understanding the role of trace elements in the functioning of life processes are discussed. A brief review of the state of the PIXE technique is given. Principles and recent advances in beam systems, instrumentation and sample handling are covered. A rather comprehensive list of references regarding various methodological aspects and biomedical applications is given. Some applications are discussed. In particular, preliminary results of an investigation regarding pediatric obesity are presented. (author) 5 tabs., 21 refs

  12. Determination of elements in industrial waste sample and TENORM using XRF Technique in Nuclear Malaysia

    International Nuclear Information System (INIS)

    Paulus, W.; Sarimah Mahat; Meor Yusoff Meor Sulaiman

    2011-01-01

    Industrial waste such as aluminium dross, and TENORM waste such as oil sludge, were used as samples in this research. Determination of the main elements was carried out using X-ray fluorescence (XRF) in the Material Technology Group, Malaysian Nuclear Agency. Results show that the main elements in these samples are aluminium and silicon, respectively. This research thus shows that XRF can be considered one of the techniques that can be used in waste characterization; furthermore, it can help researchers and engineers in research related to waste treatment, especially radioactive waste. (author)

  13. Effect of sample preparation techniques on the concentrations and distributions of elements in biological tissues using µSRXRF: a comparative study

    International Nuclear Information System (INIS)

    Al-Ebraheem, A; Dao, E; Desouza, E; McNeill, F E; Farquharson, M J; Li, C; Wainman, B C

    2015-01-01

    Routine tissue sample preparation using chemical fixatives is known to preserve the morphology of the tissue being studied. A competing method, cryofixation followed by freeze drying, involves no chemical agents and maintains the biological function of the tissue. The possible effects of both sample preparation techniques on the distribution of bio-metals (specifically calcium (Ca), copper (Cu), zinc (Zn), and iron (Fe)) in human skin tissue samples were investigated. Micro synchrotron radiation x-ray fluorescence (μSRXRF) was used to map bio-metal distribution in epidermal and dermal layers of human skin samples from various locations of the body that had been prepared using both techniques. For Ca, Cu and Zn, there were statistically significant differences between the epidermis and dermis using the freeze drying technique (p = 0.02, p < 0.01, and p < 0.01, respectively). Also using the formalin fixed, paraffin embedded technique the levels of Ca, Cu and Zn were significantly different between the epidermis and dermis layers (p = 0.03, p < 0.01, and p < 0.01, respectively). However, the difference in levels of Fe between the epidermis and dermis was unclear and further analysis was required. The epidermis was further divided into two sub-layers, one mainly composed of the stratum corneum and the other, deeper layer, the stratum basale. It was found that the difference between the distribution of Fe in the two epidermal layers using the freeze drying technique was statistically significant (p = 0.012). This same region also showed a difference in Fe using the formalin fixed, paraffin embedded technique (p < 0.01). The formalin fixed, paraffin embedded technique also showed a difference between the deeper epidermal layer and the dermis (p < 0.01). It can be concluded that studies involving Ca, Cu and Zn might show similar results using both sample preparation techniques, however studies involving Fe would need more
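
    Significance testing of element levels between tissue layers, as reported above, can be illustrated with a simple two-sample permutation test on the difference of means. The concentrations below are invented for illustration; the abstract does not specify which statistical test the study actually used.

```python
import numpy as np

def perm_test(x, y, n_perm=5000, seed=0):
    """Two-sample permutation test on |mean(x) - mean(y)|: the p-value is the
    fraction of random relabelings with a difference at least as large."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    obs = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        d = abs(pooled[:len(x)].mean() - pooled[len(x):].mean())
        if d >= obs:
            count += 1
    return (count + 1) / (n_perm + 1)

# hypothetical element signals: clearly separated layers vs. overlapping ones
p_sep = perm_test([5.1, 5.4, 5.0, 5.3, 5.2], [3.0, 3.2, 2.9, 3.1, 3.3])
p_same = perm_test([5.1, 5.4, 5.0, 5.3, 5.2], [5.2, 5.0, 5.3, 5.1, 5.4])
```

    The separated groups give a small p-value (a significant layer difference, as for Ca, Cu and Zn above), while the overlapping groups do not.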

  14. Application of neutron activation analysis technique in elemental determination of lichen samples

    International Nuclear Information System (INIS)

    Djoko Prakoso Dwi Atmodjo; Syukria Kurniawati; Woro Yatu Niken Syahfitri; Nana Suherman; Dadang Supriatna

    2010-01-01

    Lichen is one of the biological materials used as a pollution monitor that can give information about the level, direction, and history of various pollutants in the environment. Sample weights are small and the elemental content of lichens is on the order of ppm, so their characterization requires advanced analytical techniques of high sensitivity capable of analyzing samples weighing ~25 mg, such as neutron activation analysis. In this research, determination of elements was done in lichen samples obtained from the Kiaracondong and Holis areas in Bandung city, to understand the difference in industrial exposure level on the surrounding environment. Samples were irradiated in RSG GA Siwabessy, Serpong, at 15 MW for 1-2 minutes (short irradiation) and 60 minutes (long irradiation). The samples were then counted using an HPGe detector with GENIE 2000 software. The levels of elements in lichen for the Kiaracondong area were Co, Cr, Cs, Fe, Mg, Mn, Sb, Sc, and V in the range of 0.55-0.86, 1.47-2.57, 0.87-1.19, 540-1005, 949-1674, 34.91-45.94, 0.08-0.14, 0.16-0.31, and ≤ 2.33 mg/kg, respectively, while for the Holis area they were 1.04-2.37, 4.41-10.36, 0.41-0.89, 3166-709, 1131-1422, 40.97-72.51, 0.33-0.50, 0.98-2.18, and 5.30-13.05 mg/kg, respectively. From these results, it is known that pollution exposure in the semi-industrial Holis area has a greater influence than in the semi-industrial Kiaracondong area. (author)

  15. Graphics for the multivariate two-sample problem

    International Nuclear Information System (INIS)

    Friedman, J.H.; Rafsky, L.C.

    1981-01-01

    Some graphical methods for comparing multivariate samples are presented. These methods are based on minimal spanning tree techniques developed for multivariate two-sample tests. The utility of these methods is illustrated through examples using both real and artificial data
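
    The minimal spanning tree statistic underlying these methods (the Friedman-Rafsky two-sample test) counts MST edges of the pooled sample that join points from different samples: few cross-sample edges suggest the two distributions differ, many suggest they are similar. A compact sketch with synthetic data (Prim's algorithm, illustrative parameters):

```python
import numpy as np

def mst_edges(points):
    """Prim's algorithm; returns the n-1 edges of the Euclidean MST."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    in_tree = np.zeros(n, bool)
    in_tree[0] = True
    best = d[0].copy()            # distance from each node to the tree
    parent = np.zeros(n, int)
    edges = []
    for _ in range(n - 1):
        j = int(np.argmin(np.where(in_tree, np.inf, best)))
        edges.append((int(parent[j]), j))
        in_tree[j] = True
        closer = d[j] < best      # update distances via the new tree node
        best[closer] = d[j][closer]
        parent[closer] = j
    return edges

def cross_edge_count(x, y):
    """Friedman-Rafsky statistic: MST edges joining the two samples."""
    pooled = np.vstack([x, y])
    labels = np.array([0] * len(x) + [1] * len(y))
    return sum(labels[a] != labels[b] for a, b in mst_edges(pooled))

rng = np.random.default_rng(0)
x = rng.normal(0, 1, size=(30, 2))
y = rng.normal(5, 1, size=(30, 2))   # well-separated second sample
few = cross_edge_count(x, y)
z = rng.normal(0, 1, size=(30, 2))   # same distribution as x
many = cross_edge_count(x, z)
```

    Coloring the MST edges by whether they cross between samples is also the basis of the graphical displays the abstract describes.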

  16. Influence of austempering heat treatment on mechanical and corrosion properties of ductile iron samples

    Directory of Open Access Journals (Sweden)

    M. Janjić

    2016-07-01

    Mechanical properties and corrosion resistance of metals are closely related to the microstructural characteristics of the material. The paper compares the results of these two sets of properties after investigating samples of base ductile iron and heat-treated samples of austempered ductile iron (ADI). The base material is pearlitic-ferritic iron alloyed with copper and nickel. To test the corrosion rate of the base material (ductile iron) and the heat-treated samples (ADI), electrochemical techniques of potentiostatic polarization were used (the technique of Tafel curve extrapolation and the potentiodynamic polarization technique).

  17. Effect of DNA extraction methods and sampling techniques on the apparent structure of cow and sheep rumen microbial communities.

    Directory of Open Access Journals (Sweden)

    Gemma Henderson

    Molecular microbial ecology techniques are widely used to study the composition of the rumen microbiota and to increase understanding of the roles they play. Therefore, sampling and DNA extraction methods that result in adequate yields of microbial DNA that also accurately represent the microbial community are crucial. Fifteen different methods were used to extract DNA from cow and sheep rumen samples. The DNA yield and quality, and its suitability for downstream PCR amplifications, varied considerably depending on the DNA extraction method used. DNA extracts from nine extraction methods that passed these first quality criteria were evaluated further by quantitative PCR enumeration of microbial marker loci. Absolute microbial numbers, determined on the same rumen samples, differed by more than 100-fold, depending on the DNA extraction method used. The apparent compositions of the archaeal, bacterial, ciliate protozoal, and fungal communities in identical rumen samples were assessed using 454 Titanium pyrosequencing. Significant differences in microbial community composition were observed between extraction methods, for example in the relative abundances of members of the phyla Bacteroidetes and Firmicutes. Microbial communities in parallel samples collected from cows by oral stomach-tubing or through a rumen fistula, and in liquid and solid rumen digesta fractions, were compared using one of the DNA extraction methods. Community representations were generally similar, regardless of the rumen sampling technique used, but significant differences in the abundances of some microbial taxa such as the Clostridiales and the Methanobrevibacter ruminantium clade were observed. The apparent microbial community composition differed between rumen sample fractions, and Prevotellaceae were most abundant in the liquid fraction. DNA extraction methods that involved phenol-chloroform extraction and mechanical lysis steps tended to be more comparable. However

  18. Investigation of an egyptian phosphate ore sample by neutron activation analysis technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Aly, R.A.; Rofail, N.B.; Hassan, A.M.

    1995-01-01

    A domestic phosphate ore sample has been analysed by means of prompt and delayed gamma-ray spectrometry following activation by the thermal neutron capture technique. The rabbit pneumatic transfer system (RPTS), the long irradiation facility and two Pu/Be (2.5 Ci each) neutron sources set up for prompt (n,gamma) measurements were applied. A high-purity germanium (HPGe) gamma-ray spectrometer with a personal computer analyzer (PCA) system was used for spectrum measurements. Programmes on the VAX computer were utilized for estimating the elemental concentrations of 22 out of 36 elements identified in this work. 2 tabs

  19. The role of model-based methods in the development of single scan techniques

    International Nuclear Information System (INIS)

    Laruelle, Marc

    2000-01-01

    Single scan techniques are highly desirable for clinical trials involving radiotracers because they increase logistical feasibility, improve patient compliance, and decrease the cost associated with the study. However, the information derived from single scans is usually biased by factors unrelated to the process of interest. Therefore, identification of these factors and evaluation of their impact on the proposed outcome measure is important. In this paper, the impact of confounding factors on single scan measurements is illustrated by discussing the effect of between-subject or between-condition differences in radiotracer plasma clearance on normalized activity ratios (specific to nonspecific ratios) in the tissue of interest. Computer simulations based on kinetic analyses are presented to demonstrate this effect. It is proposed that the presence of this and other confounding factors should not necessarily preclude clinical trials based on single scan techniques. First, knowledge of the distribution of plasma clearance values in a sample of the investigated population allows researchers to assign limits to this potential bias. This information can be integrated in the power analysis. Second, the impact of this problem will vary according to the characteristics of the radiotracer, and this information can be used in the development and selection of the radiotracer. Third, simple modification of the experimental design (such as administration of the radiotracer as a bolus, followed by constant infusion, rather than as a single bolus) might remove this potential confounding factor and allow appropriate quantification within the limits of a single scanning session. In conclusion, model-based kinetic characterization of radiotracer distribution and uptake is critical to the design and interpretation of clinical trials based on single scan techniques

  20. Comparison between two sampling methods by results obtained using petrographic techniques, specially developed for minerals of the Itataia uranium phosphate deposit, Ceara, Brazil

    International Nuclear Information System (INIS)

    Salas, H.T.; Murta, R.L.L.

    1985-01-01

    The results of a comparison of two sampling methods applied to a gallery of the uranium-phosphate ore body of Itataia, Ceará State, Brazil, along 235 metres of mineralized zone, are presented. The results were obtained through petrographic techniques especially developed and applied to both samplings. In the first, hand samples from systematic sampling at 2-metre intervals were studied, after which estimated mineralogical composition studies were carried out and some petrogenetic observations were verified for the first time. The second sampling was made at 20-metre intervals; 570 tons of ore were extracted and distributed in sections, and a sample representing each section was studied after crushing at -65 mesh. Their mineralogy was quantified and the degree of liberation of apatite calculated. Based on the mineralogical data obtained, it was possible to represent both samplings and to compare the main mineralogical groups (phosphates, carbonates and silicates). In spite of the different methods and methodology, and the fact that the stockwork-type mineralization is quite irregular, the results were satisfactory. (Author) [pt

  1. Compressed sensing of roller bearing fault based on multiple down-sampling strategy

    Science.gov (United States)

    Wang, Huaqing; Ke, Yanliang; Luo, Ganggang; Tang, Gang

    2016-02-01

    Roller bearings are essential components of rotating machinery and are often exposed to complex operating conditions, which can easily lead to their failures. Thus, to ensure normal production and the safety of machine operators, it is essential to detect the failures as soon as possible. However, it is a major challenge to maintain a balance between detection efficiency and big data acquisition given the limitations of sampling theory. To overcome these limitations, we try to preserve the information pertaining to roller bearing failures using a sampling rate far below the Nyquist sampling rate, which can ease the pressure generated by the large-scale data. The big data of a faulty roller bearing’s vibration signals is firstly reduced by a down-sample strategy while preserving the fault features by selecting peaks to represent the data segments in time domain. However, a problem arises in that the fault features may be weaker than before, since the noise may be mistaken for the peaks when the noise is stronger than the vibration signals, which makes the fault features unable to be extracted by commonly-used envelope analysis. Here we employ compressive sensing theory to overcome this problem, which can make a signal enhancement and reduce the sample sizes further. Moreover, it is capable of detecting fault features from a small number of samples based on orthogonal matching pursuit approach, which can overcome the shortcomings of the multiple down-sample algorithm. Experimental results validate the effectiveness of the proposed technique in detecting roller bearing faults.
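
    The orthogonal matching pursuit step mentioned above can be sketched in its generic form: greedily select the sensing-matrix column most correlated with the residual, then re-fit by least squares on the selected support. The random Gaussian sensing matrix and the synthetic sparse signal below are assumptions for illustration, not the paper's bearing data:

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal matching pursuit: pick the column most correlated with the
    residual, then re-fit by least squares on the chosen support, k times."""
    residual, support = y.copy(), []
    x = np.zeros(Phi.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(0)
n, m, k = 256, 64, 3                         # signal length, measurements, sparsity
Phi = rng.normal(size=(m, n)) / np.sqrt(m)   # random sensing matrix
x_true = np.zeros(n)
x_true[[10, 70, 200]] = [1.5, -2.0, 1.0]     # sparse signal (e.g. fault harmonics)
y = Phi @ x_true                             # far fewer measurements than samples
x_hat = omp(Phi, y, k)
```

    With far fewer measurements than signal samples, the sparse coefficients are recovered exactly in this noiseless setting, which is the property that lets the method detect fault features from a small number of samples.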

  2. Compressed sensing of roller bearing fault based on multiple down-sampling strategy

    International Nuclear Information System (INIS)

    Wang, Huaqing; Ke, Yanliang; Luo, Ganggang; Tang, Gang

    2016-01-01

    Roller bearings are essential components of rotating machinery and are often exposed to complex operating conditions, which can easily lead to their failures. Thus, to ensure normal production and the safety of machine operators, it is essential to detect the failures as soon as possible. However, it is a major challenge to maintain a balance between detection efficiency and big data acquisition given the limitations of sampling theory. To overcome these limitations, we try to preserve the information pertaining to roller bearing failures using a sampling rate far below the Nyquist sampling rate, which can ease the pressure generated by the large-scale data. The big data of a faulty roller bearing’s vibration signals is firstly reduced by a down-sample strategy while preserving the fault features by selecting peaks to represent the data segments in time domain. However, a problem arises in that the fault features may be weaker than before, since the noise may be mistaken for the peaks when the noise is stronger than the vibration signals, which makes the fault features unable to be extracted by commonly-used envelope analysis. Here we employ compressive sensing theory to overcome this problem, which can make a signal enhancement and reduce the sample sizes further. Moreover, it is capable of detecting fault features from a small number of samples based on orthogonal matching pursuit approach, which can overcome the shortcomings of the multiple down-sample algorithm. Experimental results validate the effectiveness of the proposed technique in detecting roller bearing faults. (paper)

  3. Enhancement of Edge-based Image Quality Measures Using Entropy for Histogram Equalization-based Contrast Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    H. T. R. Kurmasha

    2017-12-01

    Full Text Available An Edge-based image quality measure (IQM) technique for the assessment of histogram equalization (HE)-based contrast enhancement techniques has been proposed that outperforms the Absolute Mean Brightness Error (AMBE) and Entropy, the most commonly used IQMs for evaluating HE-based techniques, as well as the two prominent fidelity-based IQMs, the Multi-Scale Structural Similarity (MSSIM) and Information Fidelity Criterion-based (IFC) measures. The statistical evaluation results show that the Edge-based IQM, which was designed for detecting noise artifact distortion, has a Pearson Correlation Coefficient (PCC) > 0.86, while the others correlate poorly or only fairly with human opinion, considering Human Visual Perception (HVP). Based on HVP, this paper proposes an enhancement to the classic Edge-based IQM that takes into account brightness saturation distortion, the most prominent distortion in HE-based contrast enhancement techniques. It is tested and found to correlate significantly well with human opinion (PCC > 0.87, Spearman rank order correlation coefficient (SROCC) > 0.92, Root Mean Squared Error (RMSE) < 0.1054, and Outlier Ratio (OR) = 0%).
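
    The four agreement statistics the abstract reports (PCC, SROCC, RMSE, OR) are standard and easy to reproduce. The sketch below uses hypothetical IQM and subjective opinion scores, not data from the paper, and the 2-sigma outlier convention is one common choice, not necessarily the authors'.

```python
import numpy as np

# Hypothetical objective IQM scores vs. subjective opinion scores for ten
# HE-enhanced images -- illustrative numbers, not data from the paper.
iqm = np.array([0.91, 0.85, 0.40, 0.77, 0.60, 0.95, 0.30, 0.55, 0.70, 0.82])
mos = np.array([0.88, 0.80, 0.45, 0.75, 0.58, 0.92, 0.35, 0.50, 0.72, 0.78])

def pearson(a, b):
    """Pearson correlation coefficient (linear agreement)."""
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

def rank(a):
    """Ranks 1..n (values assumed distinct)."""
    r = np.empty(len(a))
    r[np.argsort(a)] = np.arange(1, len(a) + 1)
    return r

pcc = pearson(iqm, mos)                  # linear agreement
srocc = pearson(rank(iqm), rank(mos))    # monotonic (rank-order) agreement
rmse = float(np.sqrt(np.mean((iqm - mos) ** 2)))
resid = iqm - mos
# Outlier ratio: fraction of residuals beyond twice their standard
# deviation (one common convention).
outlier_ratio = float(np.mean(np.abs(resid) > 2 * resid.std()))
print(round(pcc, 3), round(srocc, 3), round(rmse, 4), outlier_ratio)
```

SROCC is just the Pearson coefficient computed on ranks, which is why a measure can have perfect rank-order agreement while its PCC stays below 1.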

  4. Localisation and identification of radioactive particles in solid samples by means of a nuclear track technique

    International Nuclear Information System (INIS)

    Boehnke, Antje; Treutler, Hanns-Christian; Freyer, Klaus; Schubert, Michael; Holger Weiss

    2005-01-01

    This study is aimed to develop a generally applicable methodology of investigation that can be used for the localisation of single alpha-active particles in solid samples, such as industrial dust or natural soils, sediments and rocks by autoradiography using solid-state nuclear track detectors. The developed technique allows the detection of local enrichments of alpha-emitters in any solid material. The results of such an investigation are of interest from technical, biological and environmental points of view. The idea behind the methodology is to locate the position of alpha-active spots in a sample by attaching the track detector to the sample in a defined manner, thoroughly described in the paper. The located alpha-active particles are subsequently analysed by an electron microscope and an electron microprobe. An example of the application of this methodology is also given. An ultra-fine -grained ore-processing residue, which causes serious environmental pollution in the respective mining district and thus limits possible land use and affects quality of life in the area, was examined using the described technique. The investigation revealed considerable amounts of alpha-active particles in this material

  5. Study on a pattern classification method of soil quality based on simplified learning sample dataset

    Science.gov (United States)

    Zhang, Jiahua; Liu, S.; Hu, Y.; Tian, Y.

    2011-01-01

    Based on the massive soil information involved in current soil quality grade evaluation, this paper constructs an intelligent classification approach for soil quality grade based on classical sampling techniques and a disordered multiclassification Logistic regression model. As a case study, the learning sample capacity under a given confidence level and estimation accuracy was determined, and a c-means algorithm was used to automatically extract a simplified learning sample dataset from the cultivated soil quality grade evaluation database of the study area, Longchuan County in Guangdong Province; a disordered Logistic classifier model was then built, and the calculation and analysis steps of intelligent soil quality grade classification were given. The results indicated that the soil quality grade can be effectively learned and predicted from the extracted simplified dataset through this method, which changes the traditional method of soil quality grade evaluation. © 2011 IEEE.
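
    The two-stage idea (cluster the database to a small representative learning set, then fit a multinomial logistic classifier) can be sketched as below. This is an illustrative stand-in, not the paper's procedure: k-means substitutes for c-means, the data are synthetic, and all sizes are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for a soil-quality database: features -> grade (0-3).
X, y = make_classification(n_samples=3000, n_features=8, n_informative=5,
                           n_classes=4, random_state=0)

# Simplify the learning set with k-means (a stand-in for the paper's
# c-means step): per class, keep the sample nearest each cluster centre.
keep = []
for c in np.unique(y):
    Xc = X[y == c]
    km = KMeans(n_clusters=30, n_init=4, random_state=0).fit(Xc)
    for centre in km.cluster_centers_:
        keep.append((c, Xc[np.argmin(np.linalg.norm(Xc - centre, axis=1))]))

ys = np.array([c for c, _ in keep])
Xs = np.array([v for _, v in keep])

# Multinomial ("disordered" multiclass) logistic regression trained on the
# simplified set, evaluated against the full database.
clf = LogisticRegression(max_iter=1000).fit(Xs, ys)
print("simplified set size:", len(Xs),
      "accuracy on full set:", round(clf.score(X, y), 3))
```

The point of the exercise is that 120 representative samples can stand in for 3000 with only a modest loss of predictive accuracy.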

  6. Statistical surrogate model based sampling criterion for stochastic global optimization of problems with constraints

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Su Gil; Jang, Jun Yong; Kim, Ji Hoon; Lee, Tae Hee [Hanyang University, Seoul (Korea, Republic of); Lee, Min Uk [Romax Technology Ltd., Seoul (Korea, Republic of); Choi, Jong Su; Hong, Sup [Korea Research Institute of Ships and Ocean Engineering, Daejeon (Korea, Republic of)

    2015-04-15

    Sequential surrogate model-based global optimization algorithms, such as super-EGO, have been developed to increase the efficiency of commonly used global optimization techniques as well as to ensure the accuracy of optimization. However, earlier studies have drawbacks because there are three phases in the optimization loop and empirical parameters. We propose a united sampling criterion to simplify the algorithm and to achieve the global optimum of problems with constraints without any empirical parameters. It is able to select the points located in a feasible region with high model uncertainty as well as the points along the boundary of the constraint at the lowest objective value. The mean squared error determines which criterion is more dominant between the infill sampling criterion and the boundary sampling criterion. Also, the method guarantees the accuracy of the surrogate model because the sample points are not located within extremely small regions as in super-EGO. The performance of the proposed method, in terms of the solvability of a problem, convergence properties, and efficiency, is validated through nonlinear numerical examples with disconnected feasible regions.

  7. An 'intelligent' approach to radioimmunoassay sample counting employing a microprocessor-controlled sample counter

    International Nuclear Information System (INIS)

    Ekins, R.P.; Sufi, S.; Malan, P.G.

    1978-01-01

    The enormous impact on medical science in the last two decades of microanalytical techniques employing radioisotopic labels has, in turn, generated a large demand for automatic radioisotopic sample counters. Such instruments frequently comprise the most important item of capital equipment required in the use of radioimmunoassay and related techniques and often form a principal bottleneck in the flow of samples through a busy laboratory. It is therefore imperative that such instruments should be used 'intelligently' and in an optimal fashion to avoid both the very large capital expenditure involved in the unnecessary proliferation of instruments and the time delays arising from their sub-optimal use. Most of the current generation of radioactive sample counters nevertheless rely on primitive control mechanisms based on a simplistic statistical theory of radioactive sample counting which preclude their efficient and rational use. The fundamental principle upon which this approach is based is that it is useless to continue counting a radioactive sample for a time longer than that required to yield a significant increase in precision of the measurement. Thus, since substantial experimental errors occur during sample preparation, these errors should be assessed and must be related to the counting errors for that sample. The objective of the paper is to demonstrate that the combination of a realistic statistical assessment of radioactive sample measurement, together with the more sophisticated control mechanisms that modern microprocessor technology make possible, may often enable savings in counter usage of the order of 5- to 10-fold to be made. (author)
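
    The principle the abstract states, that counting past a certain point buys no overall precision once preparation errors dominate, follows directly from Poisson counting statistics. A minimal sketch, with the count rate, pipetting error and error ratio all assumed values for illustration:

```python
def required_count_time(rate_cps, prep_cv, ratio=0.3):
    """Counting time after which the Poisson counting error is at most
    `ratio` times the relative sample-preparation error.

    With N = rate * t accumulated counts, the relative counting error is
    1/sqrt(N), so we need N >= (1 / (ratio * prep_cv))**2 counts."""
    n_needed = (1.0 / (ratio * prep_cv)) ** 2
    return n_needed / rate_cps

# Hypothetical assay tube: 2000 counts/s and a 5 % preparation (pipetting)
# error; counting much beyond this time adds almost no overall precision.
t = required_count_time(2000.0, 0.05)
print(round(t, 2), "s")
```

Here a little over two seconds suffices, while a fixed-time counter might spend a minute per tube; this is the kind of 5- to 10-fold saving the author has in mind.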

  8. Mean-Variance-Validation Technique for Sequential Kriging Metamodels

    International Nuclear Information System (INIS)

    Lee, Tae Hee; Kim, Ho Sung

    2010-01-01

    The rigorous validation of the accuracy of metamodels is an important topic in research on metamodel techniques. Although a leave-k-out cross-validation technique involves a considerably high computational cost, it cannot be used to measure the fidelity of metamodels. Recently, the mean 0 validation technique has been proposed to quantitatively determine the accuracy of metamodels. However, the use of the mean 0 validation criterion may lead to premature termination of a sampling process even if the kriging model is inaccurate. In this study, we propose a new validation technique based on the mean and variance of the response evaluated when a sequential sampling method, such as maximum entropy sampling, is used. The proposed validation technique is more efficient and accurate than the leave-k-out cross-validation technique, because instead of performing numerical integration, the kriging model is explicitly integrated to accurately evaluate the mean and variance of the response. The error in the proposed validation technique resembles a root mean squared error, thus it can be used to determine a stop criterion for sequential sampling of metamodels
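
    The sequential loop the abstract describes (fit a kriging model, query its predictive mean and variance, add the most informative sample, stop on an RMSE-like criterion) can be sketched with a Gaussian process as a stand-in for kriging. This is not the authors' method: the test function, scikit-learn model, fixed kernel and tolerance are all assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):                                   # black-box response (assumed)
    return np.sin(3 * x) + 0.5 * x

grid = np.linspace(0.0, 3.0, 200).reshape(-1, 1)   # candidate sites
X = np.array([[0.0], [1.5], [3.0]])                # initial design
y = f(X).ravel()

# Fixed kernel hyperparameters keep the sketch deterministic and simple.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), optimizer=None)

stds = []
for _ in range(15):
    gp.fit(X, y)
    mean, std = gp.predict(grid, return_std=True)  # predictive mean/variance
    stds.append(float(std.max()))
    if stds[-1] < 1e-3:        # RMSE-like stop criterion on the max std
        break
    i = int(np.argmax(std))    # maximum-entropy choice: largest variance
    X = np.vstack([X, grid[i]])
    y = np.append(y, f(grid[i][0]))

print("max predictive std: %.3f -> %.3f" % (stds[0], stds[-1]))
```

The maximum predictive standard deviation shrinks as samples accumulate, which is exactly the quantity such a validation criterion monitors to decide when to stop sampling.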

  9. An Observed Voting System Based On Biometric Technique

    Directory of Open Access Journals (Sweden)

    B. Devikiruba

    2015-08-01

    Full Text Available ABSTRACT This article describes a computational framework, able to run on almost any computer connected to an IP-based network, for studying biometric techniques. The paper discusses how a system protecting confidential information puts strong security demands on identification. Biometry provides a user-friendly method for this identification and is becoming a competitor to current identification mechanisms. The experimentation section focuses on biometric verification, specifically based on fingerprints. This article should be read as a warning to those thinking of using methods of identification without first examining the technical opportunities for compromising such mechanisms and the associated legal consequences. The development is based on the Java language, which makes it easy to extend the software packages used to test new control techniques.

  10. Coacervative extraction as a green technique for sample preparation for the analysis of organic compounds.

    Science.gov (United States)

    Melnyk, A; Wolska, L; Namieśnik, J

    2014-04-25

    One of the present trends in analytical chemistry is miniaturization, which is one way of applying green analytical chemistry. A particular emphasis is placed on eliminating the use of large amounts of organic solvents, which are toxic and harmful to the environment, while maintaining high efficiency of the extraction process, high recovery values and low limits of quantification (LOQ) and detection (LOD). These requirements are fulfilled by the coacervative extraction (CAE) technique. In this review, theoretical aspects of the coacervation process are presented along with environmental and bioanalytical applications of this technique, its advantages, limitations and competitiveness with other techniques. Due to its simplicity and rapidity, CAE is an excellent alternative to currently practiced procedures of sample preparation for the analysis of organic compounds. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Fast neutron and gamma-ray transmission technique in mixed samples. MCNP calculations

    International Nuclear Information System (INIS)

    Perez, N.; Padron, I.

    2001-01-01

    In this paper the moisture content of sand and the sulfur content of toluene have been determined using the simultaneous fast neutron/gamma transmission technique (FNGT). Monte Carlo calculations show that it is possible to apply this technique with accelerator-based and isotopic neutron sources for on-line analysis in product quality control, specifically in the building materials and petroleum industries. Particles from a 14 MeV neutron generator and from an Am-Be neutron source were used. The estimation of optimal system parameters such as efficiency, detection time, hazards and costs was performed in order to compare both neutron sources
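
    The reason two beams resolve two unknowns is that neutrons and gammas attenuate very differently with hydrogen-rich and heavy components, giving two independent exponential transmission equations. A minimal sketch of that inversion, with hypothetical mass-attenuation coefficients (not measured values):

```python
import numpy as np

# Hypothetical mass-attenuation coefficients (cm^2/g) -- illustrative
# numbers only: neutrons couple strongly to water, gammas to both phases.
#             sand   water
M = np.array([[0.030, 0.120],    # fast neutrons
              [0.077, 0.086]])   # gamma rays

rho_x_true = np.array([40.0, 5.0])     # areal densities (g/cm^2)

# "Measured" transmissions: I/I0 = exp(-mu_sand*t_sand - mu_water*t_water)
T = np.exp(-M @ rho_x_true)

# Invert: -ln(T) = M @ rho_x  ->  solve the 2x2 linear system.
rho_x = np.linalg.solve(M, -np.log(T))
moisture = rho_x[1] / rho_x.sum()
print("areal densities:", rho_x, " moisture fraction:", round(moisture, 3))
```

With both transmissions measured simultaneously, the linear solve recovers the sand and water areal densities, from which the moisture fraction follows directly.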

  12. Search for Fluid Inclusions in a Carbonaceous Chondrite Using a New X-Ray Micro-Tomography Technique Combined with FIB Sampling

    Science.gov (United States)

    Tsuchiyama, A.; Miyake, A.; Zolensky, M. E.; Uesugi, K.; Nakano, T.; Takeuchi, A.; Suzuki, Y.; Yoshida, K.

    2014-01-01

    Early solar system aqueous fluids are preserved in some H chondrites as aqueous fluid inclusions in halite (e.g., [1]). Although potential fluid inclusions are also expected in carbonaceous chondrites [2], they have not been definitively confirmed. In order to search for these fluid inclusions, we have developed a new X-ray micro-tomography technique combined with FIB sampling and applied this technique to a carbonaceous chondrite. Experimental: A polished thin section of the Sutter's Mill meteorite (CM) was observed with an optical microscope and FE-SEM (JEOL 7001F) to choose mineral grains of carbonates (mainly calcite) and sulfides (FeS and ZnS) 20-50 microns in typical size, which may have aqueous fluid inclusions. Then, a "house" similar to a cube with a roof (20-30 microns in size) was sampled from the mineral grain using FIB (FEI Quanta 200 3DS). The house was then attached to a thin W-needle by FIB and imaged by an SR-based imaging microtomography system with a Fresnel zone plate at beamline BL47XU, SPring-8, Japan. One sample was imaged at two X-ray energies, 7 and 8 keV, to identify mineral phases (dual-energy microtomography: [3]). The voxel (3D pixel) size was 50-80 nm, which gave an effective spatial resolution of approx. 200 nm. A terrestrial quartz sample with an aqueous fluid inclusion containing a bubble was also examined as a test sample by the same method. Results and discussion: A fluid inclusion of 5-8 microns in quartz was clearly identified in a CT image. A bubble of approx. 4 microns was also identified by refraction contrast, although the X-ray absorption difference between fluid and bubble is small. Volumes of the fluid and bubble were obtained from the 3D CT images. Fourteen grains of calcite, two grains of iron sulfide and one grain of (Zn,Fe)S were examined. Ten calcite, one iron sulfide and one (Zn,Fe)S grains have inclusions >1 micron in size (the maximum: approx. 5 microns). The shapes are spherical or irregular.
Tiny inclusions (tiny solid

  13. Sampling and examination methods used for TMI-2 samples

    International Nuclear Information System (INIS)

    Marley, A.W.; Akers, D.W.; McIsaac, C.V.

    1988-01-01

    The purpose of this paper is to summarize the sampling and examination techniques that were used in the collection and analysis of TMI-2 samples. Samples ranging from auxiliary building air to core debris were collected and analyzed. Handling of the larger samples and many of the smaller samples had to be done remotely and many standard laboratory analytical techniques were modified to accommodate the extremely high radiation fields associated with these samples. The TMI-2 samples presented unique problems with sampling and the laboratory analysis of prior molten fuel debris. 14 refs., 8 figs

  14. A Story-Based Simulation for Teaching Sampling Distributions

    Science.gov (United States)

    Turner, Stephen; Dabney, Alan R.

    2015-01-01

    Statistical inference relies heavily on the concept of sampling distributions. However, sampling distributions are difficult to teach. We present a series of short animations that are story-based, with associated assessments. We hope that our contribution can be useful as a tool to teach sampling distributions in the introductory statistics…

  15. Amorphous and liquid samples structure and density measurements at high pressure - high temperature using diffraction and imaging techniques

    Science.gov (United States)

    Guignot, N.; King, A.; Clark, A. N.; Perrillat, J. P.; Boulard, E.; Morard, G.; Deslandes, J. P.; Itié, J. P.; Ritter, X.; Sanchez-Valle, C.

    2016-12-01

    Determination of the density and structure of liquids such as iron alloys, silicates and carbonates is a key to understand deep Earth structure and dynamics. X-ray diffraction provided by large synchrotron facilities gives excellent results as long as the signal scattered from the sample can be isolated from its environment. Different techniques already exist; we present here the implementation and the first results given by the combined angle- and energy-dispersive structural analysis and refinement (CAESAR) technique introduced by Wang et al. in 2004, that has never been used in this context. It has several advantages in the study of liquids: 1/ the standard energy-dispersive technique (EDX), fast and compatible with large multi-anvil presses frames, is used for fast analysis free of signal pollution from the sample environment 2/ some limitations of the EDX technique (homogeneity of the sample, low resolution) are irrelevant in the case of liquid signals, others (wrong intensities, escape peaks artifacts, background subtraction) are solved by the CAESAR technique 3/ high Q data (up to 15 A-1 and more) can be obtained in a few hours (usually less than 2). We present here the facilities available on the PSICHE beamline (SOLEIL synchrotron, France) and a few results obtained using a Paris-Edinburgh (PE) press and a 1200 tons load capacity multi-anvil press with a (100) DIA compression module. X-ray microtomography, used in conjunction with a PE press featuring rotating anvils (RotoPEc, Philippe et al., 2013) is also very effective, by simply measuring the 3D volume of glass or liquid spheres at HPHT, thus providing density. This can be done in conjunction with the CAESAR technique and we illustrate this point. Finally, absorption profiles can be obtained via imaging techniques, providing another independent way to measure the density of these materials. References Y. Wang et al., A new technique for angle-dispersive powder diffraction using an energy

  16. Detection of Wuchereria bancrofti DNA in paired serum and urine samples using polymerase chain reaction-based systems

    Directory of Open Access Journals (Sweden)

    Camila Ximenes

    2014-12-01

    Full Text Available The Global Program for the Elimination of Lymphatic Filariasis (GPELF) aims to eliminate this disease by the year 2020. However, the development of more specific and sensitive tests is important for the success of the GPELF. The present study aimed to standardise polymerase chain reaction (PCR)-based systems for the diagnosis of filariasis in serum and urine. Twenty paired urine and serum samples from individuals already known to be positive for Wuchereria bancrofti were collected during the day. Conventional PCR and semi-nested PCR assays were optimised. The detection limit of the technique for purified W. bancrofti DNA extracted from adult worms was 10 fg for the internal system (WbF/Wb2) and 0.1 fg using semi-nested PCR. The specificity of the primers was confirmed experimentally by amplification of 1 ng of purified genomic DNA from other species of parasites. Evaluation of the paired urine and serum samples by the semi-nested PCR technique indicated that only two of the 20 tested individuals were positive, whereas the simple internal PCR system (WbF/Wb2), which has highly promising performance, revealed that all the patients were positive using both sample types. This study successfully demonstrated the possibility of using the PCR technique on urine for the diagnosis of W. bancrofti infection.

  17. The generalization ability of online SVM classification based on Markov sampling.

    Science.gov (United States)

    Xu, Jie; Yan Tang, Yuan; Zou, Bin; Xu, Zongben; Li, Luoqing; Lu, Yang

    2015-03-01

    In this paper, we consider online support vector machine (SVM) classification learning algorithms with uniformly ergodic Markov chain (u.e.M.c.) samples. We establish the bound on the misclassification error of an online SVM classification algorithm with u.e.M.c. samples based on reproducing kernel Hilbert spaces and obtain a satisfactory convergence rate. We also introduce a novel online SVM classification algorithm based on Markov sampling, and present the numerical studies on the learning ability of online SVM classification based on Markov sampling for benchmark repository. The numerical studies show that the learning performance of the online SVM classification algorithm based on Markov sampling is better than that of classical online SVM classification based on random sampling as the size of training samples is larger.
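
    The contrast the abstract draws, training on a Markov chain of samples rather than an iid draw, can be illustrated with a simplified sketch. The acceptance rule below (a hinge-loss ratio under a preliminary model) is an illustrative stand-in for the paper's Markov sampling procedure, and the synthetic dataset, model choice and chain length are all assumptions.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.datasets import make_classification

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=2000, n_features=10, random_state=1)
y_pm = 2 * y - 1                                  # labels in {-1, +1}

# Preliminary model trained on a small iid subset.
idx0 = rng.choice(len(X), 100, replace=False)
pre = LinearSVC(dual=False).fit(X[idx0], y[idx0])

def hinge(i):
    """Hinge loss of the preliminary model on sample i."""
    return max(0.0, 1.0 - y_pm[i] * pre.decision_function(X[i:i+1])[0])

# Markov sampling: from the current sample, accept a uniformly drawn
# candidate with probability exp(loss(current) - loss(candidate)), capped
# at 1, so the accepted samples form a Markov chain rather than an iid draw.
current = int(rng.integers(len(X)))
chosen = [current]
while len(chosen) < 200:
    cand = int(rng.integers(len(X)))
    if rng.random() < min(1.0, np.exp(hinge(current) - hinge(cand))):
        current = cand
        chosen.append(cand)

clf = LinearSVC(dual=False).fit(X[chosen], y[chosen])
print("accuracy of SVM trained on the Markov-sampled set:", clf.score(X, y))
```

The chain preferentially accepts samples the preliminary model handles with low loss, while the cap at 1 keeps the chain moving; the final classifier is then trained only on the 200 accepted samples.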

  18. Issues in the analysis of low content gold mining samples by fire assay technique

    Science.gov (United States)

    Cetean, Valentina

    2016-04-01

    The classical analysis of samples with low gold content - below 0.1 g/t (= 100 ppb, parts per billion) - whether ore or gold-bearing sediments, involves preparation of the sample by fire assay extraction, followed by chemical attack with aqua regia (hydrochloric and nitric acid) and measurement of the gold content by atomic absorption spectrometry or inductively coupled plasma mass spectrometry. The issues raised by this analysis are well known to laboratories worldwide, both commercial and research. The author's experience with this method of determining gold content, accumulated in such a laboratory in Romania (with more than 40 years of experience, although no longer operating since 2014), confirms that obtaining reliable results requires great attention, a large amount of work and the involvement of an experienced fire assayer. The conclusion for a research laboratory is that the most reliable and statistically valid results are reached for samples with more than 100 ppb gold content; the degree of confidence below this value is lower than 90%. Usually, for samples below 50 ppb, it does not exceed 50-70%, unless each stage is very strictly controlled, which involves additional hours allocated to successive extraction tests and knowing more precisely the other compounds (Cu, Sb, As, sulfur/sulphides, Te, organic matter, etc.) or impurities that appear in the sample.
    The most important operation is the preparation, namely: - grinding and splitting of the sample (which can cause uneven distribution of gold flakes between the duplicate samples to be analyzed); - pyro-metallurgical recovery of gold (the fire assay stage), involving precise temperature control in the furnace during all stages (fusion and cupellation) and adjusting the fire assay flux components to produce a successful fusion depending on the sample matrix and content; - reducing the sample weight to decrease the amount of impurities that can be concentrated in the lead button

  19. Soil sampling

    International Nuclear Information System (INIS)

    Fortunati, G.U.; Banfi, C.; Pasturenzi, M.

    1994-01-01

    This study attempts to survey the problems associated with techniques and strategies of soil sampling. Keeping in mind the well defined objectives of a sampling campaign, the aim was to highlight the most important aspect of representativeness of samples as a function of the available resources. Particular emphasis was given to the techniques and particularly to a description of the many types of samplers which are in use. The procedures and techniques employed during the investigations following the Seveso accident are described. (orig.)

  20. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground with a sounded wind profile in order to derive the cloud base height. This method is independent of cloud type, making it efficient for both low boundary layer and high clouds. In addition, using thermal imaging ensures extraction of cloud features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. Like all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer; upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep-convective clouds are present. Unlike techniques such as LCL, this method is not limited to boundary layer clouds, and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds will not be observed).
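
    The geometric core of combining imagery with a wind profile is simple: for a camera looking near zenith, a cloud at height h advected at wind speed u(h) sweeps across the image at angular rate omega ≈ u(h)/h, so matching the observed rate against the sounded profile yields the height. A sketch with hypothetical numbers (the profile, observed rate and zenith-view approximation are all assumptions, not the authors' data or exact retrieval):

```python
import numpy as np

# Sounded wind profile (hypothetical values): height (m) vs. speed (m/s).
heights = np.array([500., 1000., 1500., 2000., 2500., 3000.])
wind    = np.array([ 4.0,   6.0,   8.0,  10.0,  12.0,  14.0])

# Angular rate of cloud features tracked in the thermal image (rad/s).
# Near zenith, a cloud at height h moving at u(h) gives omega ~ u(h)/h.
omega_obs = 0.005

# Pick the height whose wind speed best explains the observed rate.
residual = np.abs(wind / heights - omega_obs)
h_base = float(heights[np.argmin(residual)])
print("estimated cloud base height:", h_base, "m")
```

In practice one would interpolate the profile and correct for viewing geometry, but the matching principle is the same.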

  1. Environmental pollutants monitoring network using nuclear techniques

    International Nuclear Information System (INIS)

    Cohen, D.D.

    1994-01-01

    The Australian Nuclear Science and Technology Organisation (ANSTO), in collaboration with the NSW Environment Protection Authority (EPA), Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 60,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and samples for 24 hours using a stretched Teflon filter for each day. Accelerator-based Ion Beam Analysis (IBA) techniques are well suited to analysing the thousands of filter papers a year that originate from such a large-scale aerosol sampling network. These techniques are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on a 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. This paper describes the four simultaneous accelerator-based IBA techniques used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. Each analysis requires only a few minutes of accelerator running time to complete. 15 refs., 9 figs

  2. Development of spatial scaling technique of forest health sample point information

    Science.gov (United States)

    Lee, J.; Ryu, J.; Choi, Y. Y.; Chung, H. I.; Kim, S. H.; Jeon, S. W.

    2017-12-01

    Most forest health assessments are limited to monitoring sampling sites. The monitoring of forest health in Britain was carried out mainly on five species (Norway spruce, Sitka spruce, Scots pine, oak, beech), with a database constructed using the Oracle database program. The Forest Health Assessment in Great Bay in the United States was conducted to identify the characteristics of the ecosystem populations of each area based on the evaluation of forest health by tree species, diameter at breast height, water pipe and density in summer and fall of 200. In the case of Korea, in the first evaluation report on forest health vitality, 1000 sample points were placed in the forests using a systematic method of arranging sample points at regular 4 km × 4 km intervals, and 29 items in four categories (tree health, vegetation, soil, and atmosphere) were surveyed. As mentioned above, existing research has been based on the monitoring of survey sample points, and it is difficult to collect information to support customized policies for regional survey sites. In the case of special forests, such as urban forests and major forests, policy and management appropriate to the forest characteristics are needed. Therefore, it is necessary to expand the survey points for the diagnosis and evaluation of customized forest health. For this reason, we constructed a method of spatial scaling through spatial interpolation according to the characteristics of each of the 29 indices in the diagnosis and evaluation section of the first forest health vitality report; PCA and correlation analysis are conducted to construct the significant indicators, weights are then selected for each index, and the evaluation of forest health is conducted through statistical grading.

  3. Nonlinear optical characterization of phosphate glasses based on ZnO using the Z-scan technique

    International Nuclear Information System (INIS)

    Mojdehi Masoumeh Shokati; Yunus Wan Mahmood Mat; Talib Zainal Abidin; Tamchek, N.; Fhan Khor Shing

    2013-01-01

    The nonlinear optical properties of a phosphate vitreous system [(ZnO)x-(MgO)30-x-(P2O5)70], where x = 8, 10, 15, 18, and 20 mol%, synthesized through the melt-quenching technique, have been investigated using the Z-scan technique. In the experiment, a continuous-wave laser with a wavelength of 405 nm was utilized to determine the sign and value of the nonlinear refractive (NLR) index and the absorption coefficient with the closed- and open-aperture Z-scan setups. The NLR index was found to increase with the ZnO concentration in the glass samples, on the order of 10^-10 cm^2·W^-1. The real and imaginary parts of the third-order nonlinear susceptibility were calculated from the NLR index (n2) and absorption coefficient (β) of the samples. The third-order nonlinear susceptibility characterizes the nonlinear refractive or absorptive behavior of the phosphate glasses for proper utilization in nonlinear optical devices. Based on the measurement, the positive sign of the NLR index shows a self-focusing phenomenon. The figures of merit for each sample were calculated to judge the potential of phosphate glasses for application in optical switching
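
    The standard closed-aperture Z-scan analysis (Sheik-Bahae et al.) extracts the on-axis phase shift from the peak-valley transmittance difference and converts it to n2; a positive phase shift gives the valley-then-peak trace that signals self-focusing. The sketch below uses hypothetical beam and sample parameters (Rayleigh range, intensity, effective length), not values from this paper.

```python
import numpy as np

def closed_aperture_T(z, z0, dphi0):
    """Sheik-Bahae closed-aperture Z-scan trace for a thin sample:
    T(x) = 1 + 4*dphi0*x / ((x^2 + 9)(x^2 + 1)), with x = z/z0."""
    x = z / z0
    return 1.0 + 4.0 * dphi0 * x / ((x**2 + 9.0) * (x**2 + 1.0))

# Hypothetical scan: Rayleigh range and on-axis phase shift are assumed.
z0, dphi0_true = 2.0e-3, 0.4                 # m, rad
z = np.linspace(-10e-3, 10e-3, 2001)
T = closed_aperture_T(z, z0, dphi0_true)

# Peak-valley difference; for a small aperture, dT_pv ~= 0.406*|dphi0|.
dT_pv = T.max() - T.min()
dphi0_est = dT_pv / 0.406

# n2 from dphi0 = 2*pi*n2*I0*L_eff/lambda (assumed beam/sample values).
lam, I0, L_eff = 405e-9, 1.0e7, 1.0e-3       # m, W/m^2, m
n2 = dphi0_est * lam / (2 * np.pi * I0 * L_eff)
print("dphi0 ~", round(dphi0_est, 3), " n2 ~", n2, "m^2/W")
```

For a positive n2, the trace dips before the focus and peaks after it; a negative n2 (self-defocusing) reverses the order, which is how the sign is read off the scan.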

  4. An offset tone based gain stabilization technique for mixed-signal RF measurement systems

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, Gopal, E-mail: gjos@barc.gov.in [BARC, Mumbai 400085 (India); Motiwala, Paresh D.; Randale, G.D.; Singh, Pitamber [BARC, Mumbai 400085 (India); Agarwal, Vivek; Kumar, Girish [IIT Bombay, Powai, Mumbai 400076 (India)

    2015-09-21

    This paper describes a gain stabilization technique for an RF signal measurement system. A sinusoidal signal of known amplitude and phase, close in frequency, is added to the main RF signal to be measured at the input of the analog section. The system stabilizes this offset tone in the digital domain, as it is sampled at the output of the analog section. This process generates a correction factor needed to stabilize the magnitude of the gain of the analog section for the main RF signal. With the help of a simple calibration procedure, the absolute amplitude of the main RF signal can be measured. The technique is especially suited to a system that processes signals around a single frequency, employs direct signal conversion into the digital domain, and performs subsequent processing steps in an FPGA. The inherent parallel signal processing of an FPGA-based implementation allows real-time stabilization of the gain. The effectiveness of the technique derives from the fact that the gain stabilization applied to the main RF signal measurement branch requires only a few components in the system to be inherently stable. A test setup, along with experimental results, is presented from the field of RF instrumentation for particle accelerators. Due to the availability of a phase-synchronized RF reference signal in these systems, the measured phase difference between the main RF and the RF reference is also stabilized using this technique. A scheme of the signal processing is presented in which a moving average filter is used not only to filter out unwanted frequencies, but also to separate the main RF signal from the offset tone. This is achieved by a suitable choice of sampling and offset tone frequencies. The presented signal processing scheme is suitable for a variety of RF measurement applications.
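
    The separation trick the abstract mentions rests on a property of the N-tap moving average: its frequency response has exact nulls at every integer multiple of fs/N. Placing the offset tone exactly one such bin away from the main signal lets one filter isolate either tone after digital down-conversion. A numerical sketch with assumed frequencies and amplitudes (not the paper's system parameters):

```python
import numpy as np

fs, N = 1.0e6, 100                 # sampling rate; moving-average length
f0   = 12 * fs / N                 # main signal: 120 kHz (multiple of fs/N)
foff = f0 + fs / N                 # offset tone, one bin away: 130 kHz
A, B = 0.80, 0.10                  # amplitudes: main and offset tone

n = np.arange(2048)
x = A * np.cos(2*np.pi*f0/fs*n) + B * np.cos(2*np.pi*foff/fs*n)

# Digital down-conversion at the main frequency, then an N-tap moving
# average: every unwanted component (offset tone and double-frequency
# images) lands on a multiple of fs/N and is rejected exactly.
y = x * np.exp(-2j*np.pi*f0/fs*n)
A_est = 2 * np.abs(np.convolve(y, np.ones(N)/N, mode="valid")[-1])

# Demodulating at the offset frequency isolates the offset tone instead.
y2 = x * np.exp(-2j*np.pi*foff/fs*n)
B_est = 2 * np.abs(np.convolve(y2, np.ones(N)/N, mode="valid")[-1])
print("main amplitude:", round(A_est, 6), " offset tone:", round(B_est, 6))
```

Both amplitudes come back exactly because every interfering component falls on a filter null, which is why the choice of sampling and offset-tone frequencies matters.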

  5. Autonomous spatially adaptive sampling in experiments based on curvature, statistical error and sample spacing with applications in LDA measurements

    Science.gov (United States)

    Theunissen, Raf; Kadosh, Jesse S.; Allen, Christian B.

    2015-06-01

    Spatially varying signals are typically sampled by collecting uniformly spaced samples irrespective of the signal content. For signals with inhomogeneous information content, this leads to unnecessarily dense sampling in regions of low interest, insufficient sample density at important features, or both. A new adaptive sampling technique is presented that directs sample collection in proportion to local information content, adequately capturing short-period features while sparsely sampling less dynamic regions. The proposed method combines a data-adapted sampling strategy based on signal curvature, sample space-filling, variable experimental uncertainty and iterative improvement. Numerical assessment indicated a reduction in the number of samples required to achieve a predefined overall uncertainty level while improving local accuracy for important features. The potential of the proposed method was further demonstrated with Laser Doppler Anemometry experiments examining the wake behind a NACA0012 airfoil and the boundary layer characterisation of a flat plate.
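    The sample-placement idea (curvature-weighted refinement with a space-filling term) can be sketched in a one-dimensional toy form; the signal, scoring rule and constants below are illustrative assumptions, not the authors' algorithm:

```python
import math

def f(x):  # hypothetical signal: a sharp feature at x = 0.3 on a gentle ramp
    return math.exp(-100 * (x - 0.3) ** 2) + 0.1 * x

def curvature(x0, x1, x2):
    """Second divided difference as a cheap local-curvature proxy."""
    h1, h2 = x1 - x0, x2 - x1
    return abs(2 * ((f(x2) - f(x1)) / h2 - (f(x1) - f(x0)) / h1) / (h1 + h2))

xs = [i / 10 for i in range(11)]      # coarse uniform starting grid
for _ in range(30):                   # add 30 samples adaptively
    xs.sort()
    best_i, best_score = 1, -1.0
    for i in range(1, len(xs) - 1):
        # score = curvature x local span: high curvature and sparse coverage win
        score = curvature(xs[i - 1], xs[i], xs[i + 1]) * (xs[i + 1] - xs[i - 1])
        if score > best_score:
            best_i, best_score = i, score
    xs.append(0.5 * (xs[best_i] + xs[best_i + 1]))  # refine the chosen region

xs.sort()
near = sum(1 for x in xs if 0.2 <= x <= 0.4)
print(near, "of", len(xs), "samples near the feature")
```

    The refinement concentrates new samples around the high-curvature feature while the smooth ramp keeps its coarse spacing.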

  6. Technique for preparation of transmission electron microscope specimens from wire samples of Al and Al-Al2O3 alloys

    DEFF Research Database (Denmark)

    Lindbo, Jørgen

    1966-01-01

    A technique for thinning 1 mm wire samples of aluminium and aluminium-alumina alloys for transmission electron microscopy is described. The essential feature of the technique, which involves spark machining and electropolishing in a polytetrafluoroethylene holder followed by chemical polishing...

  7. Fast Measurement of Soluble Solid Content in Mango Based on Visible and Infrared Spectroscopy Technique

    Science.gov (United States)

    Yu, Jiajia; He, Yong

    Mango is a popular tropical fruit, and its soluble solid content is an important quality index. In this study, a visible and short-wave near-infrared spectroscopy (VIS/SWNIR) technique was applied to investigate the feasibility of using VIS/SWNIR spectroscopy to measure the soluble solid content in mango and to validate the performance of selected sensitive bands. The calibration set was formed by 135 mango samples, while the remaining 45 mango samples formed the prediction set. The combination of partial least squares and back-propagation artificial neural networks (PLS-BP) was used to calculate the prediction model based on raw spectrum data. Based on PLS-BP, the determination coefficient for prediction (Rp) was 0.757, and the process is simple and easy to operate. Compared with the partial least squares (PLS) result, the performance of PLS-BP is better.

  8. Non destructive multi elemental analysis using prompt gamma neutron activation analysis techniques: Preliminary results for concrete sample

    Energy Technology Data Exchange (ETDEWEB)

    Dahing, Lahasen Normanshah [School of Applied Physics, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor, Malaysia and Malaysian Nuclear Agency (Nuklear Malaysia), Bangi 43000, Kajang (Malaysia); Yahya, Redzuan [School of Applied Physics, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor (Malaysia); Yahya, Roslan; Hassan, Hearie [Malaysian Nuclear Agency (Nuklear Malaysia), Bangi 43000, Kajang (Malaysia)

    2014-09-03

    In this study, the principle of prompt gamma neutron activation analysis has been used as a technique to determine the elements in a sample. The system consists of a collimated isotopic neutron source, Cf-252, with an HPGe detector and a multichannel analyser (MCA). Concrete samples with sizes of 10×10×10 cm{sup 3} and 15×15×15 cm{sup 3} were analysed. When neutrons enter and interact with elements in the concrete, neutron capture reactions occur and produce characteristic prompt gamma rays of the elements. The preliminary results of this study demonstrate that the major elements in the concrete, such as Si, Mg, Ca, Al, Fe and H, as well as other elements such as Cl, were determined by analysing the respective gamma-ray lines. The results obtained were compared with NAA and XRF techniques for reference and validation. The potential and capability of neutron-induced prompt gamma rays as a tool for qualitative multi-elemental analysis to identify the elements present in the concrete sample are discussed.

  9. Bases of technique of sprinting

    Directory of Open Access Journals (Sweden)

    Valeriy Druz

    2015-06-01

    Full Text Available Purpose: to determine the biomechanical patterns of body movement that provide the highest sprinting speed. Material and Methods: analysis of the scientific and methodological literature on the problem, the anthropometric characteristics of the surveyed sportsmen, and analysis of high-speed footage of the world's leading runners. Results: the biomechanical basis of sprinting technique consists of the acceleration and movement of the sportsman's general centre of body mass along a parabolic curve in the start phase, taking into account its initial height in the low-start pose. Its further movement follows a cycloidal trajectory formed by the pendulum motion of the extremities, which creates lift and makes the flight phase of a running step last longer than the support phase. Conclusions: the biomechanical regularities of sprinting technique obtained allow increasing the efficiency of sprint training.

  10. One-step derivatization and preconcentration microextraction technique for determination of bisphenol A in beverage samples by gas chromatography-mass spectrometry.

    Science.gov (United States)

    Fontana, Ariel R; Muñoz de Toro, Mónica; Altamirano, Jorgelina C

    2011-04-27

    A simple technique based on ultrasound-assisted emulsification microextraction with in situ derivatization (USAEME-ISD) is proposed for the one-step derivatization, extraction, and preconcentration of bisphenol A (BPA) in beverage samples prior to gas chromatography-mass spectrometry (GC-MS) analysis. BPA was derivatized in situ with acetic anhydride and simultaneously extracted and preconcentrated by USAEME. Variables affecting the extraction efficiency of BPA were evaluated. Under optimal experimental conditions, the detection limit (LOD) was 38 ng L(-1) with a relative standard deviation (RSD) value of 11.6%. The linear working range was 100-1250 ng L(-1), and the coefficient of estimation (r(2)) of the calibration curve was ≥0.9971. The robustness of the proposed methodology was demonstrated by a recovery study at two concentrations (125 and 500 ng L(-1)) in different beverage samples. This study achieved recoveries of ≥82%, showing acceptable robustness for the determination of nanogram-per-litre levels of BPA in samples of food-safety interest.

  11. Real-time PCR to supplement gold-standard culture-based detection of Legionella in environmental samples.

    Science.gov (United States)

    Collins, S; Jorgensen, F; Willis, C; Walker, J

    2015-10-01

    Culture remains the gold-standard for the enumeration of environmental Legionella. However, it has several drawbacks including long incubation and poor sensitivity, causing delays in response times to outbreaks of Legionnaires' disease. This study aimed to validate real-time PCR assays to quantify Legionella species (ssrA gene), Legionella pneumophila (mip gene) and Leg. pneumophila serogroup-1 (wzm gene) to support culture-based detection in a frontline public health laboratory. Each qPCR assay had 100% specificity, excellent sensitivity (5 GU/reaction) and reproducibility. Comparison of the assays to culture-based enumeration of Legionella from 200 environmental samples showed that they had a negative predictive value of 100%. Thirty-eight samples were positive for Legionella species by culture and qPCR. One hundred samples were negative by both methods, whereas 62 samples were negative by culture but positive by qPCR. The average log10 increase between culture and qPCR for Legionella spp. and Leg. pneumophila was 0·72 (P = 0·0002) and 0·51 (P = 0·006), respectively. The qPCR assays can be conducted on the same 1 l water sample as culture and thus can be used as a supplementary technique to screen out negative samples and allow more rapid indication of positive samples. The assay could prove informative in public health investigations to identify or rule out sources of Legionella as well as to specifically identify Leg. pneumophila serogroup 1 in a timely manner not possible with culture. © 2015 The Society for Applied Microbiology.

  12. Accounting for estimated IQ in neuropsychological test performance with regression-based techniques.

    Science.gov (United States)

    Testa, S Marc; Winicki, Jessica M; Pearlson, Godfrey D; Gordon, Barry; Schretlen, David J

    2009-11-01

    Regression-based normative techniques account for variability in test performance associated with multiple predictor variables and generate expected scores based on algebraic equations. Using this approach, we show that estimated IQ, based on oral word reading, accounts for 1-9% of the variability beyond that explained by individual differences in age, sex, race, and years of education for most cognitive measures. These results confirm that adding estimated "premorbid" IQ to demographic predictors in multiple regression models can incrementally improve the accuracy with which regression-based norms (RBNs) benchmark expected neuropsychological test performance in healthy adults. It remains to be seen whether the incremental variance in test performance explained by estimated "premorbid" IQ translates to improved diagnostic accuracy in patient samples. We describe these methods, and illustrate the step-by-step application of RBNs with two cases. We also discuss the rationale, assumptions, and caveats of this approach. More broadly, we note that adjusting test scores for age and other characteristics might actually decrease the accuracy with which test performance predicts absolute criteria, such as the ability to drive or live independently.
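    The regression-based norming procedure described here can be sketched as follows; the normative data, model coefficients and the examinee's scores below are synthetic, for illustration only:

```python
import random, statistics

random.seed(0)

# Synthetic normative sample (hypothetical; none of these numbers are from the paper).
# True model: score = 50 - 0.2*age + 1.5*education + 0.3*IQ + noise(sd = 5)
rows = []
for _ in range(300):
    age = random.uniform(20, 80)
    edu = random.uniform(8, 20)
    iq = random.gauss(100, 15)
    score = 50 - 0.2 * age + 1.5 * edu + 0.3 * iq + random.gauss(0, 5)
    rows.append((age, edu, iq, score))

def ols(X, y):
    """Least squares via the normal equations X'X b = X'y (Gaussian elimination)."""
    k, n = len(X[0]), len(X)
    A = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)] for i in range(k)]
    b = [sum(X[r][i] * y[r] for r in range(n)) for i in range(k)]
    for i in range(k):
        for j in range(i + 1, k):
            m = A[j][i] / A[i][i]
            A[j] = [A[j][c] - m * A[i][c] for c in range(k)]
            b[j] -= m * b[i]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

X = [(1.0, a, e, q) for a, e, q, _ in rows]
y = [s for *_, s in rows]
beta = ols(X, y)

# Residual spread of the normative sample defines the z-score denominator.
resid = [y[r] - sum(bi * xi for bi, xi in zip(beta, X[r])) for r in range(len(X))]
sd = statistics.stdev(resid)

# Regression-based norm for one examinee: age 65, 12 y education, estimated IQ 110.
case = (1.0, 65.0, 12.0, 110.0)
observed = 60.0
expected = sum(bi * xi for bi, xi in zip(beta, case))
z = (observed - expected) / sd  # demographically (and IQ-) adjusted z-score
print(round(expected, 1), round(z, 2))
```

    The adjusted z-score compares the observed score with the score *expected* for someone with this examinee's demographics and estimated premorbid IQ, which is the core of the RBN approach.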

  13. Soil classification basing on the spectral characteristics of topsoil samples

    Science.gov (United States)

    Liu, Huanjun; Zhang, Xiaokang; Zhang, Xinle

    2016-04-01

    Soil taxonomy plays an important role in soil utilization and management, but China has only a coarse soil map created from 1980s data. New technology, e.g. spectroscopy, could simplify soil classification. This study tries to classify soils based on the spectral characteristics of topsoil samples. 148 topsoil samples of typical soils, including Black soil, Chernozem, Blown soil and Meadow soil, were collected from the Songnen plain, Northeast China, and their laboratory spectral reflectance in the visible and near-infrared region (400-2500 nm) was processed with weighted moving averaging, resampling, and continuum removal. Spectral indices were extracted from the soil spectral characteristics, including the second absorption position of the spectral curve, the area of the first absorption valley, and the slope of the spectral curve at 500-600 nm and 1340-1360 nm. Then K-means clustering and a decision tree were used respectively to build soil classification models. The results indicated that 1) the second absorption positions of Black soil and Chernozem were located at 610 nm and 650 nm respectively; 2) the spectral curve of Meadow soil is similar to that of its adjacent soil, which could be due to soil erosion; 3) the decision tree model showed higher classification accuracy: the accuracies for Black soil, Chernozem, Blown soil and Meadow soil were 100%, 88%, 97% and 50% respectively, and the accuracy for Blown soil could be increased to 100% by adding one more spectral index (the area of the first two valleys) to the model, which shows that the model could be used for soil classification and soil mapping in the near future.
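    A minimal sketch of the rule-based part of such a classifier, using only the "second absorption position" index on synthetic spectra (the 610 nm and 650 nm dip positions come from the abstract; everything else here is invented for illustration):

```python
import math, random

random.seed(1)
WAVELENGTHS = list(range(400, 2501, 10))  # visible and near-infrared, nm

def synth_spectrum(dip_nm):
    """Hypothetical reflectance: smooth baseline minus a Gaussian absorption dip."""
    return [0.3 + 0.0001 * (w - 400)
            - 0.08 * math.exp(-((w - dip_nm) / 30.0) ** 2)
            + random.gauss(0, 0.002)
            for w in WAVELENGTHS]

def absorption_position(spectrum):
    """Spectral index: wavelength of the reflectance minimum."""
    i = min(range(len(spectrum)), key=lambda k: spectrum[k])
    return WAVELENGTHS[i]

def classify(spectrum):
    """One-split decision rule built on the absorption positions from the abstract."""
    return "Black soil" if absorption_position(spectrum) < 630 else "Chernozem"

samples = ([(synth_spectrum(610), "Black soil") for _ in range(20)]
           + [(synth_spectrum(650), "Chernozem") for _ in range(20)])
correct = sum(classify(s) == label for s, label in samples)
print(correct, "/", len(samples))
```

    A real decision tree would combine several indices (valley areas, curve slopes) as the authors do, but each split works exactly like the single threshold above.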

  14. Application of bias factor method using random sampling technique for prediction accuracy improvement of critical eigenvalue of BWR

    International Nuclear Information System (INIS)

    Ito, Motohiro; Endo, Tomohiro; Yamamoto, Akio; Kuroda, Yusuke; Yoshii, Takashi

    2017-01-01

    The bias factor method based on the random sampling technique is applied to the benchmark problem of Peach Bottom Unit 2. Validity and availability of the present method, i.e. correction of calculation results and reduction of uncertainty, are confirmed in addition to features and performance of the present method. In the present study, core characteristics in cycle 3 are corrected with the proposed method using predicted and 'measured' critical eigenvalues in cycles 1 and 2. As the source of uncertainty, variance-covariance of cross sections is considered. The calculation results indicate that bias between predicted and measured results, and uncertainty owing to cross section can be reduced. Extension to other uncertainties such as thermal hydraulics properties will be a future task. (author)
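    A toy numeric sketch of the bias-factor idea, with made-up eigenvalues and a single shared perturbation standing in for the cross-section variance-covariance sampling: bias observed in cycles 1 and 2 corrects the cycle-3 prediction, and the shared uncertainty largely cancels in the corrected quantity.

```python
import random, statistics

random.seed(2)

# Made-up eigenvalues: nominal calculations and "measured" criticals.
k_calc = {1: 1.0020, 2: 1.0035, 3: 1.0050}
k_meas = {1: 1.0000, 2: 1.0012}

# Random sampling of a cross-section-induced perturbation shared by all
# cycles, plus a small cycle-specific term.
samples = []
for _ in range(5000):
    common = random.gauss(0, 0.003)
    samples.append({c: k_calc[c] + common + random.gauss(0, 0.0005) for c in (1, 2, 3)})

# Bias factor from past cycles: mean ratio of measured to calculated.
f = 0.5 * (k_meas[1] / k_calc[1] + k_meas[2] / k_calc[2])
k3_corrected = f * k_calc[3]

# Uncertainty reduction: the shared perturbation cancels in the ratio used
# by the corrected prediction, so its spread is much smaller.
raw_sd = statistics.stdev(s[3] for s in samples)
corr_sd = statistics.stdev(s[3] / s[1] * k_meas[1] for s in samples)
print(round(k3_corrected, 4), round(raw_sd, 5), round(corr_sd, 5))
```

    The stronger the correlation between past and future cycles induced by the common cross-section uncertainty, the larger the reduction, which is the effect the paper quantifies with its random-sampling machinery.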

  15. Decomposition techniques

    Science.gov (United States)

    Chao, T.T.; Sanzolone, R.F.

    1992-01-01

    Sample decomposition is a fundamental and integral step in the procedure of geochemical analysis. It is often the limiting factor for sample throughput, especially with the recent application of fast, modern multi-element measurement instrumentation. The complexity of geological materials makes it necessary to choose a sample decomposition technique that is compatible with the specific objective of the analysis. When selecting a decomposition technique, consideration should be given to the chemical and mineralogical characteristics of the sample, the elements to be determined, precision and accuracy requirements, sample throughput, the technical capability of personnel, and time constraints. This paper addresses these concerns and discusses the attributes and limitations of many techniques of sample decomposition, along with examples of their application to geochemical analysis. The chemical properties of reagents in their function as decomposition agents are also reviewed. The section on acid dissolution techniques addresses the various inorganic acids that are used individually or in combination in both open and closed systems. Fluxes used in sample fusion are discussed. The promising microwave-oven technology and the emerging field of automation are also examined. A section on applications highlights the use of decomposition techniques for the determination of Au, platinum group elements (PGEs), Hg, U, hydride-forming elements, rare earth elements (REEs), and multi-elements in geological materials. Partial dissolution techniques used for geochemical exploration, which have been treated in detail elsewhere, are not discussed here; nor are fire-assaying for noble metals or decomposition techniques for X-ray fluorescence or nuclear methods. © 1992.

  16. Variable screening and ranking using sampling-based sensitivity measures

    International Nuclear Information System (INIS)

    Wu, Y-T.; Mohanty, Sitakanta

    2006-01-01

    This paper presents a methodology for screening out insignificant random variables and ranking significant random variables using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from tests of hypothesis, to classify random variables as significant or insignificant. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears particularly suitable for problems with large, complex models that have many random variables but relatively few significant ones.
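    One simple CDF-based sensitivity measure in this spirit (a sketch, not the paper's exact measures) conditions the input samples on the output falling in a tail region and scores each variable by the Kolmogorov-Smirnov distance between its conditional and unconditional empirical CDFs:

```python
import bisect, random

random.seed(3)

def performance(x):
    # x[0] and x[1] drive the output; x[2] is nearly irrelevant by construction
    return x[0] + x[1] ** 2 + 0.01 * x[2]

N = 4000
X = [[random.gauss(0, 1) for _ in range(3)] for _ in range(N)]
Y = [performance(x) for x in X]
thr = sorted(Y)[int(0.9 * N)]  # condition on the output's upper 10% tail

def ks_distance(full, cond):
    """Max gap between two empirical CDFs, evaluated on the full-sample grid."""
    a, c = sorted(full), sorted(cond)
    return max(abs(bisect.bisect_right(a, g) / len(a)
                   - bisect.bisect_right(c, g) / len(c)) for g in a)

scores = []
for j in range(3):
    cond = [x[j] for x, yv in zip(X, Y) if yv > thr]
    scores.append(ks_distance([x[j] for x in X], cond))

print([round(s, 3) for s in scores])  # x0 and x1 score high; x2 stays near the noise floor
```

    An acceptance limit for "insignificant", as in the paper, would then be the KS critical value for the given sample sizes: variables whose score stays below it are screened out.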

  17. Evaluation of diesel particulate matter sampling techniques

    CSIR Research Space (South Africa)

    Pretorius, CJ

    2011-09-01

    Full Text Available The study evaluated diesel particulate matter (DPM) sampling methods used in the South African mining industry. The three-piece cassette respirable, open face and stopper sampling methods were compared with the SKC DPM cassette method to find a...

  18. Skull base tumours part I: Imaging technique, anatomy and anterior skull base tumours

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Alexandra [Instituto Portugues de Oncologia Francisco Gentil, Centro de Lisboa, Servico de Radiologia, Rua Professor Lima Basto, 1093 Lisboa Codex (Portugal)], E-mail: borgesalexandra@clix.pt

    2008-06-15

    Advances in cross-sectional imaging, surgical technique and adjuvant treatment have largely contributed to ameliorate the prognosis, lessen the morbidity and mortality of patients with skull base tumours and to the growing medical investment in the management of these patients. Because clinical assessment of the skull base is limited, cross-sectional imaging became indispensable in the diagnosis, treatment planning and follow-up of patients with suspected skull base pathology and the radiologist is increasingly responsible for the fate of these patients. This review will focus on the advances in imaging technique; contribution to patient's management and on the imaging features of the most common tumours affecting the anterior skull base. Emphasis is given to a systematic approach to skull base pathology based upon an anatomic division taking into account the major tissue constituents in each skull base compartment. The most relevant information that should be conveyed to surgeons and radiation oncologists involved in patient's management will be discussed.

  19. Skull base tumours part I: Imaging technique, anatomy and anterior skull base tumours

    International Nuclear Information System (INIS)

    Borges, Alexandra

    2008-01-01

    Advances in cross-sectional imaging, surgical technique and adjuvant treatment have largely contributed to ameliorate the prognosis, lessen the morbidity and mortality of patients with skull base tumours and to the growing medical investment in the management of these patients. Because clinical assessment of the skull base is limited, cross-sectional imaging became indispensable in the diagnosis, treatment planning and follow-up of patients with suspected skull base pathology and the radiologist is increasingly responsible for the fate of these patients. This review will focus on the advances in imaging technique; contribution to patient's management and on the imaging features of the most common tumours affecting the anterior skull base. Emphasis is given to a systematic approach to skull base pathology based upon an anatomic division taking into account the major tissue constituents in each skull base compartment. The most relevant information that should be conveyed to surgeons and radiation oncologists involved in patient's management will be discussed

  20. Final Report: Laser-Based Optical Trap for Remote Sampling of Interplanetary and Atmospheric Particulate Matter

    Science.gov (United States)

    Stysley, Paul

    2016-01-01

    Applicability to Early Stage Innovation (NIAC): Cutting-edge and innovative technologies are needed to achieve the demanding requirements for NASA origins missions that require sample collection, as laid out in the NRC Decadal Survey. This proposal focused on fully understanding the state of remote laser optical trapping techniques for capturing particles and returning them to a target site. In future missions, a laser-based optical trapping system could be deployed on a lander that would target particles in the lower atmosphere and deliver them to the main instrument for analysis, providing remote access to otherwise inaccessible samples. Alternatively, for a planetary mission, the laser could combine ablation and trapping capabilities on targets typically too far away or too hard for traditional drilling sampling systems. For an interstellar mission, a remote laser system could gather particles continuously at a safe distance; this would avoid the necessity of having a spacecraft fly through a target cloud such as a comet tail. If properly designed and implemented, a laser-based optical trapping system could fundamentally change the way scientists design and implement NASA missions that require mass spectroscopy and particle collection.

  1. Comparison of three sampling instruments, Cytobrush, Curette and OralCDx, for liquid-based cytology of the oral mucosa.

    Science.gov (United States)

    Reboiras-López, M D; Pérez-Sayáns, M; Somoza-Martín, J M; Antúnez-López, J R; Gándara-Vila, P; Gayoso-Diz, P; Gándara-Rey, J M; García-García, A

    2012-01-01

    Exfoliative cytology of the oral cavity is a simple and noninvasive technique that permits the study of epithelial cells. Liquid-based cytology is an auxiliary diagnostic tool for improving the specificity and sensitivity of conventional cytology. The objective of our study was to compare the quality of normal oral mucosa cytology samples obtained using three different instruments, Cytobrush®, dermatological curette and Oral CDx® for liquid-based cytology. One hundred four cytological samples of oral cavity were analyzed. Samples were obtained from healthy volunteer subjects using all three instruments. The clinical and demographic variables were age, sex and smoking habits. We analyzed cellularity, quality of the preparation and types of cells in the samples. All preparations showed appropriate preparation quality. In all smears analyzed, cells were distributed uniformly and showed no mucus, bleeding, inflammatory exudate or artifacts. We found no correlation between the average number of cells and the type of instrument. The samples generally consisted of two types of cells: superficial and intermediate. No differences were found among the cytological preparations of these three instruments. We did not observe basal cells in any of the samples analyzed.

  2. Efficacy of liquid-based cytology versus conventional smears in FNA samples

    Directory of Open Access Journals (Sweden)

    Kalpalata Tripathy

    2015-01-01

    Conclusion: LBC performed on FNA samples can be a simple and valuable technique. Only in few selected cases, where background factor is an essential diagnostic clue, a combination of both CP and TP is necessary.

  3. A simple technique for measuring the superconducting critical temperature of small (>= 10 μg) samples

    International Nuclear Information System (INIS)

    Pereira, R.F.R.; Meyer, E.; Silveira, M.F. da.

    1983-01-01

    A simple technique for measuring the superconducting critical temperature of small (>=10μg) samples is described. The apparatus is built in the form of a probe, which can be introduced directly into a liquid He storage dewar and permits the determination of the critical temperature, with an imprecision of +- 0.05 K above 4.2 K, in about 10 minutes. (Author) [pt

  4. TL and ESR based identification of gamma-irradiated frozen fish using different hydrolysis techniques

    Science.gov (United States)

    Ahn, Jae-Jun; Akram, Kashif; Shahbaz, Hafiz Muhammad; Kwon, Joong-Ho

    2014-12-01

    Frozen fish fillets (walleye pollock and Japanese Spanish mackerel) were selected as samples for irradiation (0-10 kGy) detection trials using different hydrolysis methods. Photostimulated luminescence (PSL)-based screening analysis of the gamma-irradiated frozen fillets showed low sensitivity due to the limited silicate mineral content of the samples. The same limitation was found in thermoluminescence (TL) analysis of mineral samples isolated by the density separation method. However, acid (HCl) and alkali (KOH) hydrolysis methods were effective in recovering enough minerals to carry out TL analysis, which was reconfirmed through the normalization step by calculating the TL ratios (TL1/TL2). For improved electron spin resonance (ESR) analysis, alkali and enzyme (alcalase) hydrolysis methods were compared for separating minute bone fractions. The enzymatic method provided clearer radiation-specific hydroxyapatite radicals than the alkaline method. Different hydrolysis methods could thus extend the application of TL and ESR techniques in identifying the irradiation history of frozen fish fillets.

  5. Nasal base narrowing of the caucasian nose through the cerclage technique

    Directory of Open Access Journals (Sweden)

    Mocellin, Marcos

    2010-06-01

    Full Text Available Introduction: Several techniques can be performed to narrow the nasal base, such as vestibular and columellar skin resection, elliptical skin resection at the nostril sill, skin undermining and advancement (V-Y technique of Bernstein), and the use of cerclage sutures in the nasal base. Objective: To evaluate the cerclage technique performed on the nasal base, through endonasal rhinoplasty using the basic technique without delivery, in the Caucasian nose, reducing the inter-alar distance and correcting alar flare, with consequent improvement of nasal harmony within the whole face. Methods: A retrospective study based on the clinical records and photographs of 43 patients in whom cerclage of the nasal base was performed, with resection of a skin ellipse in the region of the vestibule and the nasal base (modified Weir technique), using colorless mononylon® 4-0 with a straight cutting needle. The study was conducted in 2008 and 2009 at the Hospital of the Paraná Institute of Otolaryngology - IPO in Curitiba, Paraná, Brazil. Patients had a follow-up ranging from 7 to 12 months. Results: In 100% of cases an improvement in nasal harmony was achieved by decreasing the inter-alar distance. Conclusion: Cerclage with minimal resection of vestibular skin and of the nasal base is an effective method for narrowing the nasal base in the Caucasian nose, with predictable results, and is easy to perform.

  6. Automated CBED processing: Sample thickness estimation based on analysis of zone-axis CBED pattern

    Energy Technology Data Exchange (ETDEWEB)

    Klinger, M., E-mail: klinger@post.cz; Němec, M.; Polívka, L.; Gärtnerová, V.; Jäger, A.

    2015-03-15

    An automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns each corresponding to different known thicknesses. The thickness of the most similar simulated pattern is then taken as the thickness estimate. The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns in two beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar{sup +} ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from the thickness estimation, it can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. - Highlights: • Automated TEM sample thickness estimation using zone-axis CBED is presented. • Computer vision and artificial intelligence are employed in CBED processing. • This approach reduces operator effort, analysis time and increases repeatability. • Individual parts can be employed in other analyses of CBED/diffraction pattern.
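    The final matching step, comparing an experimental pattern against a series simulated for known thicknesses and taking the most similar pattern as the thickness estimate, can be sketched as follows; the intensity model, noise level and thickness grid are illustrative assumptions, not the paper's simulation:

```python
import math, random

random.seed(4)

def simulate_disk(t, xi_g=60.0, npts=50):
    """Hypothetical two-beam-like rocking curve across one CBED disk, thickness t (nm)."""
    offsets = [i - npts // 2 for i in range(npts)]
    return [math.sin(math.pi * t / xi_g * math.sqrt(1 + (s / 10.0) ** 2)) ** 2
            / (1 + (s / 10.0) ** 2) for s in offsets]

def mismatch(a, b):
    """Sum of squared differences between unit-energy-normalized patterns."""
    na = math.sqrt(sum(v * v for v in a)) or 1.0
    nb = math.sqrt(sum(v * v for v in b)) or 1.0
    return sum((x / na - y / nb) ** 2 for x, y in zip(a, b))

# "Experimental" pattern: 83 nm thickness plus measurement noise.
true_t = 83.0
experimental = [v + random.gauss(0, 0.01) for v in simulate_disk(true_t)]

# Match against a series simulated for known thicknesses; the most similar
# simulated pattern gives the thickness estimate.
best_t = min(range(40, 141), key=lambda t: mismatch(experimental, simulate_disk(t)))
print(best_t)
```

    In the actual tool this comparison is performed disk by disk after the detection and coordinate-unification steps described in the abstract.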

  7. Sampling practices and analytical techniques used in the monitoring of steam and water in CEGB nuclear boilers

    International Nuclear Information System (INIS)

    Goodfellow, G.I.

    1978-01-01

    The steam and water in CEGB Magnox and AGR nuclear boilers are continuously monitored, using both laboratory techniques and on-line instrumentation, in order to maintain the chemical quality within pre-determined limits. The sampling systems in use and some of the difficulties associated with sampling requirements are discussed. The relative merits of chemical instruments installed either locally in various parts of the plant or in centralized instrument rooms are reviewed. The quality of water in nuclear boilers, as with all high-pressure steam-raising plant, is extremely high; consequently very sensitive analytical procedures are required, particularly for monitoring the feed-water of 'once-through' boiler systems. Considerable progress has been made in this field, and examples are given of some of the techniques developed for analyses at the 'μg/kg' level, together with some of the current problems. (author)

  8. A review on creatinine measurement techniques.

    Science.gov (United States)

    Mohabbati-Kalejahi, Elham; Azimirad, Vahid; Bahrami, Manouchehr; Ganbari, Ahmad

    2012-08-15

    This paper reviews recent global trends in creatinine measurement. Creatinine biosensors involve complex relationships between biology and micro-mechatronics to which the blood sample is subjected. Comparison between new and old methods shows that new techniques (e.g. molecularly imprinted polymer-based methods) outperform older methods (e.g. ELISA) in terms of stability and linear range. All methods, and their details for serum, plasma, urine and blood samples, are surveyed. They are categorized into five main approaches: optical, electrochemical, impedimetric, ion-selective field-effect transistor (ISFET)-based, and chromatographic. The response time, detection limit, linear range and selectivity of the reported sensors are discussed. The potentiometric measurement technique has the lowest response time, 4-10 s, and the lowest detection limit, 0.28 nmol L(-1), belongs to the chromatographic technique. Comparison between the various measurement techniques indicates that the best selectivity belongs to the MIP-based and chromatographic techniques. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Comparison of Two Methods for Estimating the Sampling-Related Uncertainty of Satellite Rainfall Averages Based on a Large Radar Data Set

    Science.gov (United States)

    Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.

    2002-01-01

    The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.
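    The phase-averaged subsampling idea used in such studies can be sketched on a synthetic rain series (the rain model and all numbers below are invented; only the evaluation scheme is illustrated): the dense record plays the role of the radar "truth", and satellite visits every h hours are emulated by subsampling at every possible phase.

```python
import math, random

random.seed(5)

# Hypothetical "radar" rain-rate series: 30 days at 1-h resolution, generated
# by a thresholded autocorrelated process so rainfall is intermittent.
hours = 30 * 24
rain, state = [], 0.0
for _ in range(hours):
    state = max(0.0, 0.9 * state + random.gauss(0, 1.0))
    rain.append(state if state > 0.8 else 0.0)

true_mean = sum(rain) / hours

def sampling_error(h):
    """RMS error of the 30-day mean estimated from one sample every h hours,
    averaged over all h possible sampling phases (overpass offsets)."""
    errs = []
    for phase in range(h):
        sub = rain[phase::h]
        errs.append(sum(sub) / len(sub) - true_mean)
    return math.sqrt(sum(e * e for e in errs) / len(errs))

for h in (1, 3, 6, 12):
    # relative sampling uncertainty; it typically grows with the interval h
    print(h, round(sampling_error(h) / true_mean, 3))
```

    Repeating this over many space-time domains and fitting the results against domain size, accumulation period and rain statistics yields the kind of scaling law the abstract describes.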

  10. A correlation-based pulse detection technique for gamma-ray/neutron detectors

    International Nuclear Information System (INIS)

    Faisal, Muhammad; Schiffer, Randolph T.; Flaska, Marek; Pozzi, Sara A.; Wentzloff, David D.

    2011-01-01

We present a correlation-based detection technique that significantly improves the probability of detection for low-energy pulses. We propose performing a normalized cross-correlation of the incoming pulse data with a predefined pulse template and using a threshold correlation value to trigger the detection of a pulse. This technique improves detector sensitivity by amplifying the signal component of the incoming pulse data and rejecting noise. Simulation results for various templates are presented. Finally, the performance of the correlation-based detection technique is compared to current state-of-the-art techniques.
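A minimal sketch of such a detector, assuming a hypothetical exponential-decay pulse template and Gaussian baseline noise (neither taken from the paper):

```python
import numpy as np

def detect_pulses(signal, template, threshold=0.8):
    """Indices where the normalized cross-correlation of the incoming data
    with the pulse template exceeds `threshold`.

    Each window is compared shape-wise (zero-mean, unit-norm) with the
    template, so a low-amplitude pulse still correlates strongly while
    broadband noise does not.
    """
    t = template - template.mean()
    t = t / np.linalg.norm(t)
    n = len(template)
    hits = []
    for i in range(len(signal) - n + 1):
        w = signal[i:i + n] - signal[i:i + n].mean()
        norm = np.linalg.norm(w)
        if norm > 0 and np.dot(w, t) / norm >= threshold:
            hits.append(i)
    return hits

# Hypothetical pulse template: fast rise, exponential decay.
k = np.arange(32)
template = (1 - np.exp(-k / 2.0)) * np.exp(-k / 8.0)

rng = np.random.default_rng(1)
signal = 0.02 * rng.standard_normal(500)   # baseline noise
signal[100:132] += 0.5 * template          # one weak pulse at sample 100
hits = detect_pulses(signal, template)
print(hits)
```

Because the correlation is normalized, the trigger depends on pulse shape rather than amplitude, which is what lets low-energy pulses clear the threshold while noise windows do not.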

  11. Soil analysis. Modern instrumental technique

    International Nuclear Information System (INIS)

    Smith, K.A.

    1993-01-01

    This book covers traditional methods of analysis and specialist monographs on individual instrumental techniques, which are usually not written with soil or plant analysis specifically in mind. The principles of the techniques are combined with discussions of sample preparation and matrix problems, and critical reviews of applications in soil science and related disciplines. Individual chapters are processed separately for inclusion in the appropriate data bases

  12. Techniques for sampling nuclear waste tank contents and in situ measurement of activity

    International Nuclear Information System (INIS)

    Lawrence, R.C.

    1978-04-01

A study was conducted to develop suitable sampling equipment and techniques for characterizing the mechanical properties of nuclear wastes; identifying effective means of measuring radiation levels, temperatures, and neutron fluxes in situ in wastes; and developing a waste core sampler. A portable, stainless steel probe was developed which is placed in the tank through a riser. This probe is built for the insertion of instrumentation that can measure the contents of the tank at any level and take temperature, radiation, and neutron activation readings with reliable accuracy. A simple and reliable instrument for the in situ extraction of waste materials ranging from liquid to concrete-like substances was also developed. This portable, stainless steel waste core sampler can remove up to one liter of radioactive waste from tanks for transportation to hot cell laboratories for analysis of hardness, chemical form, and isotopic content. A cask for transporting the waste samples from the tanks to the laboratory under radiation-protected conditions was also fabricated. This cask was designed with a "boot" or inner-seal liner to contain any radioactive wastes that might remain on the outside of the waste core sampling device

  13. Memory Based Machine Intelligence Techniques in VLSI hardware

    OpenAIRE

    James, Alex Pappachen

    2012-01-01

    We briefly introduce the memory based approaches to emulate machine intelligence in VLSI hardware, describing the challenges and advantages. Implementation of artificial intelligence techniques in VLSI hardware is a practical and difficult problem. Deep architectures, hierarchical temporal memories and memory networks are some of the contemporary approaches in this area of research. The techniques attempt to emulate low level intelligence tasks and aim at providing scalable solutions to high ...

  14. Toward decentralized analysis of mercury (II) in real samples. A critical review on nanotechnology-based methodologies

    International Nuclear Information System (INIS)

    Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo

    2013-01-01

Graphical abstract: -- Highlights: •Several methods based on nanotechnology achieve limits of detection in the pM and nM ranges for mercury (II) analysis. •Most of these methods are validated in filtered water samples and/or spiked samples. •Thiols in real samples compete with any sensor based on the binding of mercury (II) ions. •Future research should include the study of matrix interferences, including thiols and dissolved organic matter. -- Abstract: In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals have reached detection limits within the pM and nM ranges. Most of these developments have proved their suitability for detecting and quantifying mercury (II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies still lags behind standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is strongly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the negative influence of ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis

  15. Office-based narrow band imaging-guided flexible laryngoscopy tissue sampling: A cost-effectiveness analysis evaluating its impact on Taiwanese health insurance program

    OpenAIRE

    Fang, Tuan-Jen; Li, Hsueh-Yu; Liao, Chun-Ta; Chiang, Hui-Chen; Chen, I-How

    2015-01-01

    Narrow band imaging (NBI)-guided flexible laryngoscopy tissue sampling for laryngopharyngeal lesions is a novel technique. Patients underwent the procedure in an office-based setting without being sedated, which is different from the conventional technique performed using direct laryngoscopy. Although the feasibility and effects of this procedure were established, its financial impact on the institution and Taiwanese National Health Insurance program was not determined. Methods: This is a ...

  16. Recent Application of Solid Phase Based Techniques for Extraction and Preconcentration of Cyanotoxins in Environmental Matrices.

    Science.gov (United States)

    Mashile, Geaneth Pertunia; Nomngongo, Philiswa N

    2017-03-04

    Cyanotoxins are toxic and are found in eutrophic, municipal, and residential water supplies. For this reason, their occurrence in drinking water systems has become a global concern. Therefore, monitoring, control, risk assessment, and prevention of these contaminants in the environmental bodies are important subjects associated with public health. Thus, rapid, sensitive, selective, simple, and accurate analytical methods for the identification and determination of cyanotoxins are required. In this paper, the sampling methodologies and applications of solid phase-based sample preparation methods for the determination of cyanotoxins in environmental matrices are reviewed. The sample preparation techniques mainly include solid phase micro-extraction (SPME), solid phase extraction (SPE), and solid phase adsorption toxin tracking technology (SPATT). In addition, advantages and disadvantages and future prospects of these methods have been discussed.

  17. Image-based ELISA on an activated polypropylene microtest plate--a spectrophotometer-free low cost assay technique.

    Science.gov (United States)

    Parween, Shahila; Nahar, Pradip

    2013-10-15

In this communication, we report an ELISA technique on an activated polypropylene microtest plate (APPµTP) as an illustrative example of a low-cost diagnostic assay. The activated test zone in APPµTP binds a capture biomolecule through covalent linkage, thereby eliminating the non-specific binding often prevalent in adsorption-based techniques. The efficacy of APPµTP is demonstrated by detecting human immunoglobulin G (IgG), human immunoglobulin E (IgE) and Aspergillus fumigatus antibody in patients' sera. Detection is done by capturing an image of the assay solution with a desktop scanner and analyzing the color of the image. Human IgE quantification by color saturation in the image-based assay shows excellent correlation with the absorbance-based assay (Pearson correlation coefficient, r=0.992); the significance of the relationship is seen from its p value of 4.087e-11. The performance of APPµTP is also checked against microtiter-plate and paper-based ELISA. APPµTP can quantify an analyte as precisely as a microtiter plate with insignificant non-specific binding, a necessary prerequisite for an ELISA assay. In contrast, paper-ELISA shows high non-specific binding in control sera (false positives). Finally, we have carried out the ELISA steps on APPµTP using ultrasound waves in a sonicator bath, and the results show that even in 8 min it can convincingly differentiate a test sample from a control sample. In short, spectrophotometer-free image-based miniaturized ELISA on APPµTP is precise, reliable, rapid and sensitive, and could be a good substitute for conventional immunoassay procedures widely used in clinical and research laboratories. Copyright © 2013 Elsevier B.V. All rights reserved.
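The color-saturation readout can be illustrated with a small sketch. The RGB well colours and absorbance values below are invented for illustration; only the HSV saturation definition, (max - min)/max, is standard:

```python
import numpy as np

def saturation(rgb):
    """HSV saturation, (max - min) / max, of RGB triples scaled to [0, 1]."""
    rgb = np.asarray(rgb, dtype=float) / 255.0
    mx = rgb.max(axis=-1)
    mn = rgb.min(axis=-1)
    return np.where(mx == 0.0, 0.0, (mx - mn) / np.where(mx == 0.0, 1.0, mx))

# Invented mean RGB values of scanned assay wells along a dilution series
# (deeper colour at higher analyte level) and paired absorbance readings.
wells = [(250, 245, 240), (235, 210, 200), (220, 170, 150),
         (210, 130, 100), (200, 95, 60), (195, 70, 30)]
absorbance = [0.05, 0.21, 0.48, 0.80, 1.10, 1.35]

s = saturation(wells)
r = np.corrcoef(s, absorbance)[0, 1]   # Pearson correlation, image vs. absorbance
print(f"saturations: {np.round(s, 3)}  Pearson r = {r:.3f}")
```

A scanner-plus-saturation readout of this kind is what allows the assay to track the spectrophotometer without one.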

  18. Image Sampling with Quasicrystals

    Directory of Open Access Journals (Sweden)

    Mark Grundland

    2009-07-01

Full Text Available We investigate the use of quasicrystals in image sampling. Quasicrystals produce space-filling, non-periodic point sets that are uniformly discrete and relatively dense, thereby ensuring that the sample sites are evenly spread throughout the sampled image. Their self-similar structure can be attractive for creating sampling patterns endowed with a decorative symmetry. We present a brief general overview of the algebraic theory of cut-and-project quasicrystals based on the geometry of the golden ratio. To assess the practical utility of quasicrystal sampling, we evaluate the visual effects of a variety of non-adaptive image sampling strategies on photorealistic image reconstruction and non-photorealistic image rendering used in multiresolution image representations. For computer visualization of point sets used in image sampling, we introduce a mosaic rendering technique.
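A minimal sketch of the cut-and-project construction in one dimension, using the golden-ratio slope. This generic Fibonacci-chain example is a standard illustration of the method, not code from the paper:

```python
import math

PHI = (1 + math.sqrt(5)) / 2  # golden ratio

def cut_and_project(n_max):
    """1-D quasicrystal (Fibonacci chain) by the cut-and-project method.

    Take integer lattice points (a, b), keep those whose projection onto the
    'internal' direction falls inside a window (the projected unit cell), and
    return their projections onto the 'physical' direction. With slope 1/PHI
    the result is a non-periodic point set with only two distinct spacings.
    """
    c = PHI / math.hypot(PHI, 1)   # physical-direction unit vector (c, s)
    s = 1 / math.hypot(PHI, 1)
    pts = []
    for a in range(n_max):
        for b in range(n_max):
            perp = -s * a + c * b          # internal-space coordinate
            if 0 <= perp < c + s:          # acceptance window width = c + s
                pts.append(c * a + s * b)
    return sorted(pts)

pts = cut_and_project(30)
gaps = {round(y - x, 6) for x, y in zip(pts, pts[1:])}
print(len(pts), sorted(gaps))  # two spacing lengths, long/short ratio ~ PHI
```

The resulting point set is uniformly discrete and relatively dense, exactly the properties the abstract highlights; a 2-D sampling pattern is obtained analogously by projecting from a higher-dimensional lattice.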

  19. Characterization techniques for graphene-based materials in catalysis

    Directory of Open Access Journals (Sweden)

    Maocong Hu

    2017-06-01

Full Text Available Graphene-based materials have been studied in a wide range of applications, including catalysis, due to their outstanding electronic, thermal, and mechanical properties. The unprecedented features of graphene-based catalysts, which are believed to be responsible for their superior performance, have been characterized by many techniques. In this article, we comprehensively summarize the characterization methods covering bulk and surface structure analysis, chemisorption ability determination, and reaction mechanism investigation. We review the advantages/disadvantages of different techniques, including Raman spectroscopy, X-ray photoelectron spectroscopy (XPS), Fourier transform infrared spectroscopy (FTIR) and diffuse reflectance Fourier transform infrared spectroscopy (DRIFTS), X-ray diffraction (XRD), X-ray absorption near edge structure (XANES) and X-ray absorption fine structure (XAFS), atomic force microscopy (AFM), scanning electron microscopy (SEM), transmission electron microscopy (TEM), high-resolution transmission electron microscopy (HRTEM), ultraviolet-visible spectroscopy (UV-vis), X-ray fluorescence (XRF), inductively coupled plasma mass spectrometry (ICP-MS), thermogravimetric analysis (TGA), Brunauer–Emmett–Teller (BET) surface area analysis, and scanning tunneling microscopy (STM). The application of temperature-programmed reduction (TPR), CO chemisorption, and NH3/CO2 temperature-programmed desorption (TPD) is also briefly introduced. Finally, we discuss the challenges and provide possible suggestions on choosing characterization techniques. This review provides key information for the catalysis community to adopt suitable characterization techniques for their research.

  20. Mathematical Foundation Based Inter-Connectivity modelling of Thermal Image processing technique for Fire Protection

    Directory of Open Access Journals (Sweden)

    Sayantan Nath

    2015-09-01

Full Text Available In this paper, the integration of multiple image processing functions and their statistical parameters into an intelligent, alarm-series-based fire detection system is presented. A proper inter-connectivity mapping between the processing elements of the imagery, based on a classification factor for temperature monitoring and a multilevel intelligent alarm sequence, is introduced through an abstract canonical approach. The flow of image processing components at the core of an intelligent alarming system, with temperature-wise area segmentation as well as boundary detection, is not yet fully explored in present-day thermal imaging. From the analytical perspective of convolution-based functionality in thermal imaging, an abstract-algebra-based inter-mapping model is discussed that links event-calculus-supported DAGSVM classification, for step-by-step generation of an alarm series with a gradual monitoring technique, with the segmentation of temperature-affected regions and their boundaries in a thermographic image of coal. The connectedness of the multifunctional image processing operations of a compatible fire protection system with a proper monitoring sequence is investigated here. The core contribution of this study is a set of mathematical models, expressed in terms of partial derivatives, that relate the temperature-affected areas to their boundaries in the obtained thermal image. The thermal image of a coal sample was obtained in a real-life scenario with a self-assembled thermographic camera in this study. The amalgamation of area segmentation, boundary detection and the alarm series is described in abstract algebra. The principal objective of this paper is to understand the dependency pattern and working principles of the image processing components and to structure an inter-connected modelling technique for those components with the help of a mathematical foundation.
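The temperature-wise area segmentation and boundary detection step can be sketched generically (synthetic thermogram and hypothetical alarm thresholds; this is not the authors' DAGSVM-based pipeline):

```python
import numpy as np

def segment_and_boundary(temp, thresholds):
    """Label each pixel of a temperature map by alarm level and mark boundaries.

    `thresholds` are increasing temperatures separating alarm levels;
    np.digitize assigns level 0..len(thresholds). A pixel is a boundary pixel
    if any 4-neighbour carries a different level.
    """
    levels = np.digitize(temp, thresholds)
    boundary = np.zeros(levels.shape, dtype=bool)
    boundary[:-1, :] |= levels[:-1, :] != levels[1:, :]
    boundary[1:, :]  |= levels[1:, :] != levels[:-1, :]
    boundary[:, :-1] |= levels[:, :-1] != levels[:, 1:]
    boundary[:, 1:]  |= levels[:, 1:] != levels[:, :-1]
    return levels, boundary

# Synthetic thermogram: 20 C ambient with a 300 C hot spot in the centre.
y, x = np.mgrid[0:64, 0:64]
temp = 20 + 280 * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 50.0)
levels, boundary = segment_and_boundary(temp, thresholds=[60.0, 150.0, 250.0])
print(levels.max(), int(boundary.sum()))
```

Each alarm level corresponds to a nested region around the hot spot, and the boundary mask traces the level-set curves whose relationship to the areas the paper models with partial derivatives.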

  1. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    International Nuclear Information System (INIS)

    Campolina, Daniel; Lima, Paulo Rubens I.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2015-01-01

Sample size and computational uncertainty were varied in order to investigate the sampling efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate an LWR model and to map uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. The mean range, standard deviation range and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties over 10 n-sample replicates was adopted as the convergence criterion for the method. An estimate of 75 pcm uncertainty on the reactor k_eff was obtained by using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, in order to reduce the variance of the propagated uncertainty, it was found, for the example under investigation, that it is preferable to double the sample size rather than to double the number of particles followed by the Monte Carlo process in the MCNPX code. (author)
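The sampling-based propagation loop can be sketched with a stand-in model (the model function, nominal value and uncertainties below are all hypothetical; in the paper this role is played by the MCNPX transport calculation):

```python
import numpy as np

rng = np.random.default_rng(42)

def model(radius):
    # Hypothetical smooth response standing in for the MCNPX transport run.
    return 1.0 + 0.05 * np.log(radius)

def propagate(n_samples):
    """Draw the uncertain input, run the model, return the output 1-sigma."""
    radius = rng.normal(0.45, 0.01, n_samples)  # hypothetical nominal +/- 1 sigma
    return np.std(model(radius), ddof=1)

# Replicate the propagation to see how stable the uncertainty estimate is
# as a function of sample size (the paper's convergence check, in miniature).
results = {}
for n in (25, 100, 400):
    reps = [propagate(n) for _ in range(30)]
    results[n] = (np.mean(reps), np.std(reps))
    print(f"n = {n:3d}  mean sigma = {results[n][0]:.5f}  "
          f"spread = {results[n][1]:.5f}")
```

The spread of the estimated sigma across replicates shrinks as the sample size grows, which is the trade-off the abstract weighs against spending the same computer time on more particles per run.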

  2. Calibrated Phase-Shifting Digital Holographic Microscope Using a Sampling Moiré Technique

    Directory of Open Access Journals (Sweden)

    Peng Xia

    2018-05-01

Full Text Available A calibrated phase-shifting digital holographic microscope system capable of improving the quality of reconstructed images is proposed. Phase-shifting errors are introduced in phase-shifted holograms for numerous reasons, such as the non-linearity of piezoelectric transducers (PZTs), wavelength fluctuations in lasers, and environmental disturbances, leading to poor-quality reconstructions. In our system, in addition to the camera used to record object information, an extra camera is used to record interferograms, which are used to analyze phase-shifting errors using a sampling Moiré technique. The quality of the reconstructed object images can be improved by the phase-shifting error compensation algorithm. Both the numerical simulation and experiment demonstrate the effectiveness of the proposed system.

  3. A novel PMT test system based on waveform sampling

    Science.gov (United States)

    Yin, S.; Ma, L.; Ning, Z.; Qian, S.; Wang, Y.; Jiang, X.; Wang, Z.; Yu, B.; Gao, F.; Zhu, Y.; Wang, Z.

    2018-01-01

Compared with a traditional test system based on a QDC, TDC and scaler, a test system based on waveform sampling was constructed for signal sampling of the 8" R5912 and the 20" R12860 Hamamatsu PMTs in different energy states, from single to multiple photoelectrons. In order to achieve high throughput and to reduce the dead time in data processing, data acquisition software based on LabVIEW was developed and runs with a parallel mechanism. The analysis algorithm is realized in LabVIEW, and the spectra of charge, amplitude, signal width and rise time are analyzed offline. The results from the charge-to-digital converter, time-to-digital converter and waveform sampling are discussed in a detailed comparison.
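Offline extraction of charge, amplitude, rise time and width from a sampled waveform can be sketched as follows (synthetic PMT-like pulse; the authors' LabVIEW analysis itself is not reproduced here):

```python
import numpy as np

def cross_time(v, t, level, stop):
    """Time at which v first reaches `level`, by linear interpolation
    between the two samples bracketing the crossing (search up to `stop`)."""
    i = int(np.nonzero(v[:stop + 1] >= level)[0][0])
    if i == 0:
        return t[0]
    return t[i - 1] + (level - v[i - 1]) / (v[i] - v[i - 1]) * (t[i] - t[i - 1])

def pulse_parameters(wave, dt, baseline_samples=20):
    """Charge, amplitude, 10-90% rise time and FWHM of a digitized pulse."""
    base = wave[:baseline_samples].mean()     # baseline from pre-trigger samples
    v = wave - base
    t = np.arange(len(v)) * dt
    amp = v.max()
    peak = int(np.argmax(v))
    charge = v.sum() * dt                     # pulse integral (QDC equivalent)
    rise = cross_time(v, t, 0.9 * amp, peak) - cross_time(v, t, 0.1 * amp, peak)
    above = np.nonzero(v >= 0.5 * amp)[0]     # full width at half maximum
    fwhm = (above[-1] - above[0]) * dt
    return charge, amp, rise, fwhm

# Synthetic pulse: fast rise, slower exponential decay, 1 ns sampling period.
dt = 1.0  # ns
t = np.arange(200.0)
v = np.zeros_like(t)
m = t >= 50
v[m] = (1 - np.exp(-(t[m] - 50) / 2.0)) * np.exp(-(t[m] - 50) / 20.0)
charge, amp, rise, fwhm = pulse_parameters(v, dt)
print(f"charge={charge:.1f}  amp={amp:.3f}  rise={rise:.2f} ns  fwhm={fwhm:.1f} ns")
```

Because the full waveform is kept, all four spectra come from a single record, which is what lets a sampling-based system replace the separate QDC/TDC/scaler chain.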

  4. Development of a Univariate Membrane-Based Mid-Infrared Method for Protein Quantitation and Total Lipid Content Analysis of Biological Samples

    Directory of Open Access Journals (Sweden)

    Ivona Strug

    2014-01-01

Full Text Available Biological samples present a range of complexities, from homogeneous purified protein to multicomponent mixtures. Accurate qualification of such samples is paramount to downstream applications. We describe the development of an MIR spectroscopy-based analytical method offering simultaneous protein quantitation (0.25–5 mg/mL) and analysis of total lipid or detergent species, as well as the identification of other biomolecules present in biological samples. The method utilizes a hydrophilic PTFE membrane engineered for presentation of aqueous samples in a dried format compatible with fast infrared analysis. Unlike classical quantification techniques, the reported method is amino acid sequence independent and thus applicable to complex samples of unknown composition. By comparison with existing platforms, this MIR-based method enables direct quantification using a minimal sample volume (2 µL); it is well suited where repeat access and limited sample size are critical parameters. Further, accurate results can be derived without specialized training or knowledge of IR spectroscopy. Overall, the simplified application and analysis system provides a more cost-effective alternative to high-throughput IR systems for research laboratories with minimal throughput demands. In summary, the MIR-based system provides a viable alternative to current protein quantitation methods; it also uniquely offers simultaneous qualification of other components, notably lipids and detergents.

  5. Voltammetric technique, a panacea for analytical examination of environmental samples

    International Nuclear Information System (INIS)

    Zahir, E.; Mohiuddin, S.; Naqvi, I.I.

    2012-01-01

Voltammetric methods are generally recommended for trace metal analysis in environmental samples of marine origin such as mangroves, sediments and shrimps. Three different electro-analytical techniques, i.e. polarography, anodic stripping voltammetry (ASV) and adsorptive stripping voltammetry (ADSV), were used. Cd2+, Pb2+, Cu2+ and Mn2+ were determined by ASV, Cr6+ was analyzed by ADSV, and Fe2+, Zn2+, Ni2+ and Co2+ were determined by polarography. The pairs Fe2+/Zn2+ and Ni2+/Co2+ were determined in two separate runs, while Cd2+, Pb2+ and Cu2+ were analyzed in a single ASV run. The sensitivity and speciation capabilities of voltammetric methods were exploited. Analysis conditions were optimized, including the selection of supporting electrolyte, pH, working electrodes, sweep rate, etc. Stripping voltammetry was adopted for analysis at ultra-trace levels. Statistical parameters for analytical method development, such as selectivity factor, interference, repeatability (0.0065–0.130 µg/g), reproducibility (0.08125–1.625 µg/g), detection limits (0.032–5.06 µg/g), limits of quantification (0.081–12.652 µg/g) and sensitivities (5.636–2.15 nA mL/µg), were also determined. The percentage recoveries were found to be between 95 and 105% using certified reference materials. Real samples from the complex marine environment of the Karachi coastline were also analyzed. The standard addition method was employed wherever any matrix effect was evidenced. (author)

  6. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points must be generated to fill the design space, which can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper a new method, cokriging, which is an extension of kriging, is proposed for calculating structural reliability. The cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models with improved accuracy and efficiency for structural reliability problems and is a viable alternative to kriging.

  7. Recruitment Techniques and Strategies in a Community-Based Colorectal Cancer Screening Study of Men and Women of African Ancestry.

    Science.gov (United States)

    Davis, Stacy N; Govindaraju, Swapamthi; Jackson, Brittany; Williams, Kimberly R; Christy, Shannon M; Vadaparampil, Susan T; Quinn, Gwendolyn P; Shibata, David; Roetzheim, Richard; Meade, Cathy D; Gwede, Clement K

Recruiting ethnically diverse Black participants to an innovative, community-based research study to reduce colorectal cancer screening disparities requires multipronged recruitment techniques. This article describes active, passive, and snowball recruitment techniques, along with challenges and lessons learned in recruiting a diverse sample of Black participants. For each of the three recruitment techniques, data were collected on strategies, enrollment efficiency (participants enrolled/participants evaluated), and reasons for ineligibility. Five hundred sixty individuals were evaluated, and 330 individuals were enrolled. Active recruitment yielded the highest number of enrolled participants, followed by passive and snowball. Snowball recruitment was the most efficient technique, with an enrollment efficiency of 72.4%, followed by the passive (58.1%) and active (55.7%) techniques. There were significant differences in gender, education, country of origin, health insurance, and having a regular physician by recruitment technique. Multiple recruitment techniques should be employed to increase reach, diversity, and study participation rates among Blacks. Although each recruitment technique had a variable enrollment efficiency, the use of multipronged recruitment techniques can lead to successful enrollment of diverse Blacks into cancer prevention and control interventions.

  8. Sample preparation in alkaline media

    International Nuclear Information System (INIS)

    Nobrega, Joaquim A.; Santos, Mirian C.; Sousa, Rafael A. de; Cadore, Solange; Barnes, Ramon M.; Tatro, Mark

    2006-01-01

    The use of tetramethylammonium hydroxide, tertiary amines and strongly alkaline reagents for sample treatment involving extraction and digestion procedures is discussed in this review. The preparation of slurries is also discussed. Based on literature data, alkaline media offer a good alternative for sample preparation involving an appreciable group of analytes in different types of samples. These reagents are also successfully employed in tailored speciation procedures wherein there is a critical dependence on maintenance of chemical forms. The effects of these reagents on measurements performed using spectroanalytical techniques are discussed. Several undesirable effects on transport and atomization processes necessitate use of the method of standard additions to obtain accurate results. It is also evident that alkaline media can improve the performance of techniques such as inductively coupled plasma mass spectrometry and accessories, such as autosamplers coupled to graphite furnace atomic absorption spectrometers

  9. Use of a 137Cs re-sampling technique to investigate temporal changes in soil erosion and sediment mobilisation for a small forested catchment in southern Italy

    International Nuclear Information System (INIS)

    Porto, Paolo; Walling, Des E.; Alewell, Christine; Callegari, Giovanni; Mabit, Lionel; Mallimo, Nicola; Meusburger, Katrin; Zehringer, Markus

    2014-01-01

Soil erosion and both its on-site and off-site impacts are increasingly seen as a serious environmental problem across the world. The need for an improved evidence base on soil loss and soil redistribution rates has directed attention to the use of fallout radionuclides, particularly 137Cs, for documenting soil redistribution rates. This approach possesses important advantages over more traditional means of documenting soil erosion and soil redistribution. However, one key limitation of the approach is the time-averaged or lumped nature of the estimated erosion rates. In nearly all cases, these will relate to the period extending from the main period of bomb fallout to the time of sampling. Increasing concern over the impact of global change, particularly that related to changing land use and climate change, has frequently directed attention to the need to document changes in soil redistribution rates within this period. Re-sampling techniques, which should be distinguished from repeat-sampling techniques, have the potential to meet this requirement. As an example, the use of a re-sampling technique to derive estimates of the mean annual net soil loss from a small (1.38 ha) forested catchment in southern Italy is reported. The catchment was originally sampled in 1998, and samples were collected from points very close to the original sampling points again in 2013. This made it possible to compare the estimate of mean annual erosion for the period 1954–1998 with that for the period 1999–2013. The availability of measurements of sediment yield from the catchment for parts of the overall period made it possible to compare the results provided by the 137Cs re-sampling study with the estimates of sediment yield for the same periods. In order to compare the estimates of soil loss and sediment yield for the two different periods, it was necessary to establish the uncertainty associated with the individual estimates. In the absence of a generally accepted procedure

  10. [Amanitine determination as an example of peptide analysis in the biological samples with HPLC-MS technique].

    Science.gov (United States)

    Janus, Tomasz; Jasionowicz, Ewa; Potocka-Banaś, Barbara; Borowiak, Krzysztof

Routine toxicological analysis is mostly focused on the identification of inorganic and organic, chemically diverse compounds, generally of low mass, usually not greater than 500–600 Da. Peptide compounds with atomic mass higher than 900 Da form a specific analytical group. Several dozen of them are highly toxic substances well known in toxicological practice, for example mushroom toxins and animal venoms. In this paper, the authors use the example of alpha-amanitin to explain the analytical problems and different original solutions in identifying peptides in urine samples with a universal LC-MS/MS procedure. The analyzed material consisted of urine samples collected from patients with suspected mushroom intoxication, routinely diagnosed for amanitin determination. Ultrafiltration with centrifuge filter tubes (mass cutoff 3 kDa) was used. The filtrate was directly injected onto the chromatographic column and analyzed with a tandem mass spectrometric (MS/MS) detector. The separation of peptides, as organic amphoteric compounds, from biological material with the SPE technique is well known but requires dedicated, specific columns. The presented paper proved that with the fast and simple ultrafiltration technique amanitin can be effectively isolated from urine, and the procedure offers satisfactory detection sensitivity and eliminates the influence of the biological matrix on analytical results. Another problem which had to be solved was the non-characteristic fragmentation of peptides in the MS/MS procedure, which yields non-selective chromatograms. It is possible to use higher collision energies in the analytical procedure, which results in more characteristic mass spectra, although it offers lower sensitivity. The ultrafiltration technique as a sample preparation procedure is effective for the isolation of amanitin from the biological matrix. The monitoring of a selected mass corresponding to the transition with the loss of a water molecule offers

  11. Development of Energy Management System Based on Internet of Things Technique

    OpenAIRE

    Wen-Jye Shyr; Chia-Ming Lin and Hung-Yun Feng

    2017-01-01

The purpose of this study was to develop an energy management system for university campuses based on the Internet of Things (IoT) technique. The proposed IoT technique, based on WebAccess, is accessed via the Internet Explorer web browser and uses the TCP/IP protocol. A case study of an IoT-based lighting energy usage management system is presented. The structure of the proposed IoT system comprises a perception layer, equipment layer, control layer, application layer and network layer.

  12. Microextraction Techniques Coupled to Liquid Chromatography with Mass Spectrometry for the Determination of Organic Micropollutants in Environmental Water Samples

    Directory of Open Access Journals (Sweden)

    Mª Esther Torres Padrón

    2014-07-01

Full Text Available Until recently, sample preparation was carried out using traditional techniques, such as liquid–liquid extraction (LLE), that use large volumes of organic solvents. Solid-phase extraction (SPE) uses much less solvent than LLE, although the volume can still be significant. These preparation methods are expensive, time-consuming and environmentally unfriendly. Recently, a great effort has been made to develop new analytical methodologies able to perform direct analyses using miniaturised equipment, thereby achieving high enrichment factors, minimising solvent consumption and reducing waste. These microextraction techniques improve performance during sample preparation, particularly for complex environmental water samples such as wastewaters, surface and ground waters, tap waters, and sea and river waters. Liquid chromatography coupled to tandem mass spectrometry (LC/MS/MS) and time-of-flight mass spectrometry (TOF/MS) can be used when analysing a broad range of organic micropollutants. Before separating and detecting these compounds in environmental samples, the target analytes must be extracted and pre-concentrated to make them detectable. In this work, we review the most recent applications of microextraction preparation techniques used to determine organic micropollutants in different environmental water matrices: solid-phase microextraction (SPME), in-tube solid-phase microextraction (IT-SPME), stir bar sorptive extraction (SBSE) and liquid-phase microextraction (LPME). Several groups of compounds are considered organic micropollutants because they are released continuously into the environment. Many of these compounds are considered emerging contaminants: analytes that are generally not covered by existing regulations and are now detected more frequently in different environmental compartments. Pharmaceuticals, surfactants, personal care products and other chemicals are considered micropollutants. These

  13. Sample-interpolation timing: an optimized technique for the digital measurement of time of flight for γ rays and neutrons at relatively low sampling rates

    International Nuclear Information System (INIS)

    Aspinall, M D; Joyce, M J; Mackin, R O; Jarrah, Z; Boston, A J; Nolan, P J; Peyton, A J; Hawkes, N P

    2009-01-01

    A unique digital time pick-off method, known as sample-interpolation timing (SIT), is described. This method demonstrates the possibility of improved timing resolution for the digital measurement of time of flight compared with digital replica-analogue time pick-off methods for signals sampled at relatively low rates. Three analogue timing methods have been replicated in the digital domain (leading-edge, crossover and constant-fraction timing) for pulse data sampled at 8 GSa s⁻¹. Events arising from the 7Li(p,n)7Be reaction have been detected with an EJ-301 organic liquid scintillator and recorded with a fast digital sampling oscilloscope. Sample-interpolation timing was developed solely for the digital domain and thus performs more efficiently on digital signals compared with analogue time pick-off methods replicated digitally, especially for fast signals that are sampled at rates that current affordable and portable devices can achieve. Sample interpolation can be applied to any analogue timing method replicated digitally and thus also has the potential to exploit the generic capabilities of analogue techniques with the benefits of operating in the digital domain. A threshold in sampling rate with respect to the signal pulse width is observed beyond which further improvements in timing resolution are not attained. This advance is relevant to many applications in which time-of-flight measurement is essential.
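The core idea of interpolating a crossing time between low-rate samples can be illustrated with a minimal sketch: locate the pair of samples that bracket a timing threshold and linearly interpolate the crossing time between them. This linear version is only an illustrative stand-in for SIT (the abstract does not specify the interpolation scheme used); the pulse values and threshold below are invented.

```python
def interpolated_crossing_time(samples, dt, threshold):
    """Estimate the threshold-crossing time of a sampled rising pulse by
    linear interpolation between the two samples that bracket the
    threshold, yielding sub-sample timing resolution."""
    for i in range(1, len(samples)):
        if samples[i - 1] < threshold <= samples[i]:
            # Fraction of one sampling interval between samples i-1 and i.
            frac = (threshold - samples[i - 1]) / (samples[i] - samples[i - 1])
            return (i - 1 + frac) * dt
    return None  # pulse never crossed the threshold

# A pulse sampled at 1 ns intervals; the 0.5 threshold is crossed
# a quarter of the way between samples 2 and 3, i.e. at 2.25 ns.
pulse = [0.0, 0.1, 0.4, 0.8, 1.0]
t = interpolated_crossing_time(pulse, 1.0e-9, 0.5)
```

The same bracketing loop could feed a higher-order (e.g. cubic) interpolant; the linear form is the simplest case that already beats nearest-sample timing.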

  14. Accurate recapture identification for genetic mark–recapture studies with error-tolerant likelihood-based match calling and sample clustering

    Science.gov (United States)

    Sethi, Suresh; Linden, Daniel; Wenburg, John; Lewis, Cara; Lemons, Patrick R.; Fuller, Angela K.; Hare, Matthew P.

    2016-01-01

    Error-tolerant likelihood-based match calling presents a promising technique to accurately identify recapture events in genetic mark–recapture studies by combining probabilities of latent genotypes and probabilities of observed genotypes, which may contain genotyping errors. Combined with clustering algorithms to group samples into sets of recaptures based upon pairwise match calls, these tools can be used to reconstruct accurate capture histories for mark–recapture modelling. Here, we assess the performance of a recently introduced error-tolerant likelihood-based match-calling model and sample clustering algorithm for genetic mark–recapture studies. We assessed both biallelic (i.e. single nucleotide polymorphisms; SNP) and multiallelic (i.e. microsatellite; MSAT) markers using a combination of simulation analyses and case study data on Pacific walrus (Odobenus rosmarus divergens) and fishers (Pekania pennanti). A novel two-stage clustering approach is demonstrated for genetic mark–recapture applications. First, repeat captures within a sampling occasion are identified. Subsequently, recaptures across sampling occasions are identified. The likelihood-based matching protocol performed well in simulation trials, demonstrating utility for use in a wide range of genetic mark–recapture studies. Moderately sized SNP (64+) and MSAT (10–15) panels produced accurate match calls for recaptures and accurate non-match calls for samples from closely related individuals in the face of low to moderate genotyping error. Furthermore, matching performance remained stable or increased as the number of genetic markers increased, genotyping error notwithstanding.
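The error-tolerant idea behind likelihood-based match calling can be sketched with a deliberately simplified per-locus model (the published model is richer, integrating over latent genotypes and allele frequencies): loci disagree between true recaptures only through genotyping error, while unrelated individuals match at a locus only by chance. All rates and genotypes below are illustrative assumptions, not values from the study.

```python
import math

def log_likelihood_ratio(geno_a, geno_b, error_rate, random_match_p):
    """Log likelihood ratio that two observed multilocus genotypes come
    from the same individual versus two unrelated individuals.
    Per locus: recaptures mismatch only via genotyping error
    (probability error_rate); unrelated individuals match by chance
    with probability random_match_p. Positive values favour 'recapture'."""
    llr = 0.0
    for a, b in zip(geno_a, geno_b):
        if a == b:
            llr += math.log((1 - error_rate) / random_match_p)
        else:
            llr += math.log(error_rate / (1 - random_match_p))
    return llr

# 64 SNP loci with a single mismatched locus: under a 2% error rate the
# evidence still strongly favours a recapture, illustrating error tolerance.
a = ["AA"] * 64
b = ["AA"] * 63 + ["AB"]
llr = log_likelihood_ratio(a, b, error_rate=0.02, random_match_p=0.4)
```

A hard exact-match rule would reject this pair outright; the likelihood ratio instead trades the single mismatch off against 63 informative matches, which is the behaviour the study exploits.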

  15. Detection of creep damage in a nickel base superalloy using NDE techniques

    International Nuclear Information System (INIS)

    Carreon, H.; Mora, B.; Barrera, G.

    2009-10-01

    Due to elevated temperatures, excessive stresses and severe corrosion conditions, turbine engine components such as turbine buckets are subject to creep processes that limit their service life. The failure mechanism of a turbine bucket is related primarily to creep and corrosion and secondarily to thermal fatigue. As a result, it is desirable to assess the current condition of such turbine components. This study uses the eddy current nondestructive evaluation technique in an effort to monitor creep damage in a nickel base super-alloy turbine bucket after service. The experimental results show an important electrical conductivity variation in eddy current images in the creep damage zone of nickel base super-alloy samples cut from a turbine bucket. Thermoelectric power measurements were also conducted in order to obtain a direct correlation between the presence of material changes due to creep damage and the electrical conductivity measurements. This research work presents an alternative non-destructive method for detecting creep damage in a nickel base super-alloy turbine bucket. (Author)

  16. Control charts for location based on different sampling schemes

    NARCIS (Netherlands)

    Mehmood, R.; Riaz, M.; Does, R.J.M.M.

    2013-01-01

    Control charts are the most important statistical process control tool for monitoring variations in a process. A number of articles are available in the literature for the X̄ control chart based on simple random sampling, ranked set sampling, median-ranked set sampling (MRSS), extreme-ranked set

  17. Individual and pen-based oral fluid sampling: A welfare-friendly sampling method for group-housed gestating sows.

    Science.gov (United States)

    Pol, Françoise; Dorenlor, Virginie; Eono, Florent; Eudier, Solveig; Eveno, Eric; Liégard-Vanhecke, Dorine; Rose, Nicolas; Fablet, Christelle

    2017-11-01

    The aims of this study were to assess the feasibility of individual and pen-based oral fluid sampling (OFS) in 35 pig herds with group-housed sows, compare these methods to blood sampling, and assess the factors influencing the success of sampling. Individual samples were collected from at least 30 sows per herd. Pen-based OFS was performed using devices placed in at least three pens for 45 min. Information related to the farm, the sows, and their living conditions was collected. Factors significantly associated with the duration of sampling and the chewing behaviour of sows were identified by logistic regression. Individual OFS took 2 min 42 s on average; the type of floor, swab size, and operator were associated with a sampling time >2 min. Pen-based OFS was obtained from 112 devices (62.2%). The type of floor, parity, pen-level activity, and type of feeding were associated with chewing behaviour. Pen activity was associated with the latency to interact with the device. The type of floor, gestation stage, parity, group size, and latency to interact with the device were associated with a chewing time >10 min. After 15, 30 and 45 min of pen-based OFS, 48%, 60% and 65% of the sows were lying down, respectively. The time elapsed since the beginning of sampling, genetic type, and time elapsed since the last meal were associated with 50% of the sows lying down at one time point. The mean time to blood sample the sows was 1 min 16 s, or 2 min 52 s if the number of operators required was considered in the sampling time estimation. The genetic type, parity, and type of floor were significantly associated with a sampling time higher than 1 min 30 s. This study shows that individual OFS is easy to perform in group-housed sows by a single operator, even though straw-bedded animals take longer to sample than animals housed on slatted floors, and suggests some guidelines to optimise pen-based OFS success. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Solving mercury (Hg) speciation in soil samples by synchrotron X-ray microspectroscopic techniques

    International Nuclear Information System (INIS)

    Terzano, Roberto; Santoro, Anna; Spagnuolo, Matteo; Vekemans, Bart; Medici, Luca; Janssens, Koen; Goettlicher, Joerg; Denecke, Melissa A.; Mangold, Stefan; Ruggiero, Pacifico

    2010-01-01

    Direct mercury (Hg) speciation was assessed for soil samples with a Hg concentration ranging from 7 up to 240 mg kg⁻¹. Hg chemical forms were identified and quantified by sequential extractions and bulk- and micro-analytical techniques exploiting synchrotron generated X-rays. In particular, microspectroscopic techniques such as μ-XRF, μ-XRD and μ-XANES were necessary to solve bulk Hg speciation in both soil fractions, identifying metacinnabar, a mercury chloride-sulfide phase (Hg₃S₂Cl₂), and an amorphous phase containing Hg bound to chlorine and sulfur. The amount of metacinnabar and amorphous phases increased in the fraction <2 μm. No interaction among Hg-species and soil components was observed. All the observed Hg-species originated from the slow weathering of an inert Hg-containing waste material (K106, U.S. EPA) dumped in the area several years ago, which is changing into a relatively more dangerous source of pollution. - Direct mercury (Hg) speciation in chlor-alkali plant contaminated soils enabled the identification of potentially dangerous Hg-S/Cl amorphous species.

  19. Improved abdominal MRI in non-breath-holding children using a radial k-space sampling technique

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Hyuk; Choi, Young Hun; Cheon, Jung Eun; Lee, So Mi; Cho, Hyun Hae; Kim, Woo Sun; Kim, In One [Seoul National University Children' s Hospital, Department of Radiology, Seoul (Korea, Republic of); Shin, Su Mi [SMG-SNU Boramae Medical Center, Department of Radiology, Seoul (Korea, Republic of)

    2015-06-15

    Radial k-space sampling techniques have been shown to reduce motion artifacts in adult abdominal MRI. To compare a T2-weighted radial k-space sampling MRI pulse sequence (BLADE) with standard respiratory-triggered T2-weighted turbo spin echo (TSE) in pediatric abdominal imaging. Axial BLADE and respiratory-triggered TSE sequences were performed without fat suppression in 32 abdominal MR examinations in children. We retrospectively assessed overall image quality, the presence of respiratory, peristaltic and radial artifacts, and lesion conspicuity. We evaluated the signal uniformity of each sequence. BLADE showed improved overall image quality (3.35 ± 0.85 vs. 2.59 ± 0.59, P < 0.001), reduced respiratory motion artifact (0.51 ± 0.56 vs. 1.89 ± 0.68, P < 0.001), and improved lesion conspicuity (3.54 ± 0.88 vs. 2.92 ± 0.77, P = 0.006) compared to respiratory-triggered TSE sequences. The bowel motion artifact scores were similar for both sequences (1.65 ± 0.77 vs. 1.79 ± 0.74, P = 0.691). BLADE introduced a radial artifact that was not observed on the respiratory-triggered TSE images (1.10 ± 0.85 vs. 0, P < 0.001). BLADE was associated with diminished signal variation compared with respiratory-triggered TSE in the liver, spleen and air (P < 0.001). The radial k-space sampling technique improved image quality and reduced respiratory motion artifacts in young children compared with conventional respiratory-triggered TSE sequences. (orig.)

  20. Noise Reduction Effect of Multiple-Sampling-Based Signal-Readout Circuits for Ultra-Low Noise CMOS Image Sensors

    Directory of Open Access Journals (Sweden)

    Shoji Kawahito

    2016-11-01

    Full Text Available This paper discusses the noise reduction effect of multiple-sampling-based signal readout circuits for implementing ultra-low-noise image sensors. The correlated multiple sampling (CMS) technique has recently become an important technology for high-gain column readout circuits in low-noise CMOS image sensors (CISs). This paper reveals how the column CMS circuits, together with a pixel having a high-conversion-gain charge detector and low-noise transistor, realize deep sub-electron read noise levels, based on an analysis of noise components in the signal readout chain from a pixel to the column analog-to-digital converter (ADC). The noise measurement results of experimental CISs are compared with the noise analysis, and the effect of noise reduction versus the sampling number is discussed at the deep sub-electron level. Images taken with three CMS gains of two, 16, and 128 show a distinct advantage in image contrast for the gain of 128 (median noise: 0.29 e⁻rms) when compared with the CMS gain of two (2.4 e⁻rms) or 16 (1.1 e⁻rms).
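The way averaging over M samples suppresses uncorrelated read noise (roughly as 1/√M) can be checked with a small simulation. This is a sketch under simplifying assumptions, not the paper's circuit analysis: read noise is modelled as white Gaussian noise on each sample, and the 2.4 e⁻rms figure is reused only as an arbitrary noise scale.

```python
import random
import statistics

def cms_read(signal, read_noise, m, rng):
    """One correlated-multiple-sampling (CMS) read: average m samples of
    the reset level and m samples of the signal level, then take their
    difference. With white read noise, the noise on the difference
    shrinks roughly as 1/sqrt(m)."""
    reset = statistics.fmean(rng.gauss(0.0, read_noise) for _ in range(m))
    sig = statistics.fmean(signal + rng.gauss(0.0, read_noise) for _ in range(m))
    return sig - reset

# Measured output noise (rms) over many reads for three sampling numbers,
# echoing the paper's CMS gains of 2, 16 and 128.
rng = random.Random(7)
measured_noise = {
    m: statistics.pstdev(cms_read(0.0, 2.4, m, rng) for _ in range(2000))
    for m in (1, 16, 128)
}
```

Note the simulation only captures the white-noise component; in real CISs, 1/f noise limits how far the 1/√M reduction continues, which is why the measured curves in such papers flatten at large M.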

  1. Use of radiocarbon technique for archaelogic dating

    International Nuclear Information System (INIS)

    Chausson, Y.

    1986-01-01

    The nuclear technique based on measurement of the beta radiation emitted by radiocarbon is applied to the geochronological dating of organic samples from prehistoric fires and sambaqui shells. This paper describes the origin of the method, the technique used and its applications, the analysis method, the equipment and the experiments performed. (Author) [pt
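The dating step itself reduces to the radioactive decay law: the age follows from the ratio of the sample's measured beta activity to the modern standard. A minimal sketch, using the conventional Libby half-life of 5568 years (the abstract does not state which calibration constants were used):

```python
import math

# Libby mean life, about 8033 yr: the constant used for conventional
# radiocarbon ages (half-life 5568 yr divided by ln 2).
LIBBY_MEAN_LIFE = 5568 / math.log(2)

def radiocarbon_age(activity_ratio):
    """Conventional radiocarbon age (years BP) from the ratio of the
    sample's specific 14C beta activity to the modern standard,
    via t = tau * ln(A0 / A)."""
    return LIBBY_MEAN_LIFE * math.log(1.0 / activity_ratio)

# A sample retaining half its modern 14C activity dates to one
# half-life, i.e. about 5568 years BP.
age = radiocarbon_age(0.5)
```

Conventional ages computed this way are later calibrated against tree-ring or other records; the formula above is only the raw decay-law step.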

  2. Identifying content-based and relational techniques to change behaviour in motivational interviewing.

    Science.gov (United States)

    Hardcastle, Sarah J; Fortier, Michelle; Blake, Nicola; Hagger, Martin S

    2017-03-01

    Motivational interviewing (MI) is a complex intervention comprising multiple techniques aimed at changing health-related motivation and behaviour. However, MI techniques have not been systematically isolated and classified. This study aimed to identify the techniques unique to MI, classify them as content-related or relational, and evaluate the extent to which they overlap with techniques from the behaviour change technique taxonomy version 1 [BCTTv1; Michie, S., Richardson, M., Johnston, M., Abraham, C., Francis, J., Hardeman, W., … Wood, C. E. (2013). The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Annals of Behavioral Medicine, 46, 81-95]. Behaviour change experts (n = 3) content-analysed MI techniques based on Miller and Rollnick's [(2013). Motivational interviewing: Preparing people for change (3rd ed.). New York: Guilford Press] conceptualisation. Each technique was then coded for independence and uniqueness by independent experts (n = 10). The experts also compared each MI technique to those from the BCTTv1. Experts identified 38 distinct MI techniques with high agreement on clarity, uniqueness, preciseness, and distinctiveness ratings. Of the identified techniques, 16 were classified as relational techniques. The remaining 22 techniques were classified as content based. Sixteen of the MI techniques were identified as having substantial overlap with techniques from the BCTTv1. The isolation and classification of MI techniques will provide researchers with the necessary tools to clearly specify MI interventions and test the main and interactive effects of the techniques on health behaviour. The distinction between relational and content-based techniques within MI is also an important advance, recognising that changes in motivation and behaviour in MI are a function of both intervention content and the interpersonal style

  3. Comprehensive Non-Destructive Conservation Documentation of Lunar Samples Using High-Resolution Image-Based 3D Reconstructions and X-Ray CT Data

    Science.gov (United States)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K.; Zeigler, R. A.; Hanna, R. D.; Ketcham, R. A.

    2015-01-01

    Established contemporary conservation methods within the fields of Natural and Cultural Heritage encourage an interdisciplinary approach to preservation of heritage material (both tangible and intangible) that holds "Outstanding Universal Value" for our global community. NASA's lunar samples were acquired from the moon for the primary purpose of intensive scientific investigation. These samples, however, also carry cultural significance, as evidenced by the millions of people per year that visit lunar displays in museums and heritage centers around the world. Being both scientifically and culturally significant, the lunar samples require a unique conservation approach. Government mandate dictates that NASA's Astromaterials Acquisition and Curation Office develop and maintain protocols for "documentation, preservation, preparation and distribution of samples for research, education and public outreach" for both current and future collections of astromaterials. Documentation, considered the first stage within the conservation methodology, has seen many new techniques emerge since curation protocols for the lunar samples were first implemented, and the development of new documentation strategies for current and future astromaterials is beneficial to keeping curation protocols up to date. We have developed and tested a comprehensive non-destructive documentation technique using high-resolution image-based 3D reconstruction and X-ray CT (XCT) data in order to create interactive 3D models of lunar samples that would ultimately be served to both researchers and the public. These data enhance preliminary scientific investigations including targeted sample requests, and also provide a new visual platform for the public to experience and interact with the lunar samples. We intend to serve these data as they are acquired on NASA's Astromaterials Acquisition and Curation website at http://curator.jsc.nasa.gov/. Providing 3D interior and exterior documentation of astromaterial

  4. MITIE: Simultaneous RNA-Seq-based transcript identification and quantification in multiple samples.

    Science.gov (United States)

    Behr, Jonas; Kahles, André; Zhong, Yi; Sreedharan, Vipin T; Drewe, Philipp; Rätsch, Gunnar

    2013-10-15

    High-throughput sequencing of mRNA (RNA-Seq) has led to tremendous improvements in the detection of expressed genes and reconstruction of RNA transcripts. However, the extensive dynamic range of gene expression, technical limitations and biases, as well as the observed complexity of the transcriptional landscape, pose profound computational challenges for transcriptome reconstruction. We present the novel framework MITIE (Mixed Integer Transcript IdEntification) for simultaneous transcript reconstruction and quantification. We define a likelihood function based on the negative binomial distribution, use a regularization approach to select a few transcripts collectively explaining the observed read data and show how to find the optimal solution using Mixed Integer Programming. MITIE can (i) take advantage of known transcripts, (ii) reconstruct and quantify transcripts simultaneously in multiple samples, and (iii) resolve the location of multi-mapping reads. It is designed for genome- and assembly-based transcriptome reconstruction. We present an extensive study based on realistic simulated RNA-Seq data. When compared with state-of-the-art approaches, MITIE proves to be significantly more sensitive and overall more accurate. Moreover, MITIE yields substantial performance gains when used with multiple samples. We applied our system to 38 Drosophila melanogaster modENCODE RNA-Seq libraries and estimated the sensitivity of reconstructing omitted transcript annotations and the specificity with respect to annotated transcripts. Our results corroborate that a well-motivated objective paired with appropriate optimization techniques lead to significant improvements over the state-of-the-art in transcriptome reconstruction. MITIE is implemented in C++ and is available from http://bioweb.me/mitie under the GPL license.
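The kind of objective MITIE optimizes can be illustrated on a toy instance: choose a subset of candidate transcripts, and abundances for them, that best explain observed exon coverage, with a penalty on each transcript used so that a few transcripts collectively explain the data. This sketch brute-forces subsets and an integer abundance grid, which only works for tiny invented examples; MITIE solves a richer objective (negative binomial likelihood, multiple samples, multi-mapping reads) exactly via Mixed Integer Programming.

```python
import itertools

def sse(candidates, subset, abund, coverage):
    """Squared error between observed exon coverage and the coverage
    implied by the chosen transcripts (exon-indicator vectors) at the
    chosen abundances."""
    total = 0.0
    for e, cov in enumerate(coverage):
        model = sum(a * candidates[t][e] for t, a in zip(subset, abund))
        total += (cov - model) ** 2
    return total

def select_transcripts(candidates, coverage, lam=1.0, max_abund=12):
    """Toy regularized transcript selection: minimize
    squared coverage error + lam * (number of transcripts used)
    over all subsets and integer abundance assignments."""
    best = (float("inf"), (), ())
    for k in range(len(candidates) + 1):
        for subset in itertools.combinations(range(len(candidates)), k):
            for abund in itertools.product(range(max_abund + 1), repeat=k):
                score = sse(candidates, subset, abund, coverage) + lam * k
                if score < best[0]:
                    best = (score, subset, abund)
    return best

# Two true transcripts plus one decoy over four exons (invented data):
# the regularizer keeps the decoy out of the reported solution.
cands = [[1, 1, 0, 0], [0, 0, 1, 1], [1, 0, 1, 0]]
coverage = [10, 10, 5, 5]
score, subset, abundances = select_transcripts(cands, coverage)
```

The sparsity penalty `lam` plays the role of MITIE's regularization: without it, adding the decoy at zero or nonzero abundance could fit the data equally well.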

  5. Electromembrane extraction as a rapid and selective miniaturized sample preparation technique for biological fluids

    DEFF Research Database (Denmark)

    Gjelstad, Astrid; Pedersen-Bjergaard, Stig; Seip, Knut Fredrik

    2015-01-01

    This special report discusses the sample preparation method electromembrane extraction, which was introduced in 2006 as a rapid and selective miniaturized extraction method. The extraction principle is based on isolation of charged analytes extracted from an aqueous sample, across a thin film....... Technical aspects of electromembrane extraction, important extraction parameters as well as a handful of examples of applications from different biological samples and bioanalytical areas are discussed in the paper....

  6. An evaluation of sampling methods and supporting techniques for tackling lead in drinking water in Alberta Province

    Science.gov (United States)

    A collaborative project commenced in August 2013 with the aim of demonstrating a range of techniques that can be used in tackling the problems of lead in drinking water. The main project was completed in March 2014, with supplementary sampling exercises in mid-2014. It involved t...

  7. Sieve-based device for MALDI sample preparation. III. Its power for quantitative measurements.

    Science.gov (United States)

    Molin, Laura; Cristoni, Simone; Seraglia, Roberta; Traldi, Pietro

    2011-02-01

    The solid sample inhomogeneity is a weak point of traditional MALDI deposition techniques that reflects negatively on quantitative analysis. The recently developed sieve-based device (SBD) sample deposition method, based on the electrospraying of matrix/analyte solutions through a grounded sieve, allows the homogeneous deposition of microcrystals with dimensions smaller than that of the laser spot. In each microcrystal the matrix/analyte molar ratio can be considered constant. Then, by irradiating different portions of the microcrystal distribution an identical response is obtained. This result suggests the employment of SBD in the development of quantitative procedures. To this end, mixtures of different proteins of known molarity were analyzed, showing a good relationship between molarity and intensity ratios. This behaviour was also observed in the case of proteins with quite different ionic yields. The power of the developed method for quantitative evaluation was also tested by measuring the abundance of IGPP[Oxi]GPP[Oxi]GLMGPP (m/z 1219) present in the collagen-α-5(IV) chain precursor, differently expressed in urine from healthy subjects and diabetic-nephropathic patients, confirming its overexpression in the presence of nephropathy. The data obtained indicate that SBD is a particularly effective method for quantitative analysis in biological fluids of interest as well. Copyright © 2011 John Wiley & Sons, Ltd.

  8. A comparative study of 232Th and 238U activity estimation in soil samples by gamma spectrometry and Neutron Activation Analysis (NAA) technique

    International Nuclear Information System (INIS)

    Rekha, A.K.; Anilkumar, S.; Narayani, K.; Babu, D.A.R.

    2012-01-01

    Radioactivity in the environment is mainly due to naturally occurring radionuclides like uranium and thorium, with their daughter products, and potassium. Even though gamma spectrometry is the most commonly used non-destructive method for quantifying these naturally occurring radionuclides, Neutron Activation Analysis (NAA), a well-established analytical technique, can also be used. NAA, however, is a time-consuming process and needs proper standards, proper sample preparation, etc. In this paper, the 232Th and 238U activities estimated using gamma-ray spectrometry and the NAA technique are compared. In the direct gamma spectrometry method, the samples were analysed after sealing in a 250 ml container, whereas for NAA, about 300 mg of each sample was subjected to gamma spectrometry after irradiation. The 238U and 232Th activities (in Bq/kg) in the samples were estimated after the proper efficiency correction and were compared. The activities estimated by these two methods are in good agreement: the variation in 238U and 232Th activity values is within ±15%, which is acceptable for environmental samples.

  9. Determination of hydrogen content by neutron techniques

    International Nuclear Information System (INIS)

    Santisteban, J.R.; Granada, J.R.; Mayer, R.E.

    1997-01-01

    The commonly available techniques for the determination of hydrogen dissolved in solids are usually destructive to the sample. A new, nondestructive method for this kind of measurement has been developed at our laboratory, with the requirement of improved sensitivity for massive samples. This scattering method is based on the use of epithermal neutrons, and has been implemented through the design and construction of a spectrometer dedicated to that task. In addition, the traditional transmission method has been employed to determine hydrogen content in metals, using the full subthermal and thermal neutron energy ranges. A pulsed neutron source based on an electron LINAC is employed, together with time-of-flight techniques. In this work we present some results illustrating the sensitivity achieved by these neutron techniques in different systems and for a wide range of hydrogen concentrations. (author) [es
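The transmission method mentioned above rests on exponential neutron attenuation: hydrogen's large scattering cross section makes the transmitted intensity drop measurably with hydrogen content. A hedged sketch, assuming a hydrogen-free blank accounts for all matrix attenuation and using a rough, energy-independent free-atom hydrogen cross section (the real analysis is energy dependent and uses time-of-flight spectra):

```python
import math

BARN = 1.0e-24  # cm^2

def hydrogen_areal_density(sample_T, blank_T, sigma_h_barns=20.5):
    """Hydrogen areal density (atoms/cm^2) from neutron transmission,
    assuming T_sample = T_blank * exp(-N_H * sigma_H), i.e. the
    hydrogen-free blank captures all matrix attenuation. The 20.5 b
    cross section is a rough free-atom value, not a calibrated one."""
    return math.log(blank_T / sample_T) / (sigma_h_barns * BARN)

# If hydrogen alone reduces transmission from 0.80 (blank) to 0.60 (sample),
# the inferred hydrogen areal density is about 1.4e22 atoms/cm^2.
n_h = hydrogen_areal_density(0.60, 0.80)
```

Dividing the areal density by the sample thickness then gives a volumetric hydrogen concentration, which is the quantity usually quoted.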

  10. Effect of composition and technique of production, on the mechanical behaviour of based-zirconium metallic glasses

    International Nuclear Information System (INIS)

    Nowak, Sophie

    2009-01-01

    Metallic glasses are relatively new materials (about 50 years old), produced by quenching a molten alloy. The amorphous structure of these materials gives them unique properties: very high strength (fracture stress is about 1.7 GPa for Zr-based alloys) and an elastic deformation reaching 2%, but little or no ductility. The compositions that can produce both amorphous and bulk samples are limited. The work detailed in this manuscript shows the possibility of sintering, using SPS (Spark Plasma Sintering), amorphous powders obtained by atomization (Φaverage = 70 microns). The result is a fully densified and nearly fully amorphous sample. Optimization of this technique with the composition Zr57Cu20Al10Ni8Ti5 gave samples whose mechanical behaviour is close to that of the bulk metallic glass. However, partial crystallization of the material occurs, localized at the contact points of the particles; it could be reduced by refining the sintering model outlined in this manuscript. In view of these results, new compositions were designed and ribbons were produced. Characterization by nano-indentation reliably estimates the mechanical properties of these alloys. Finally, a new method for evaluating the activation volume, the elementary volume initiating plastic deformation, is presented. This technique is a statistical analysis of pseudo-creep tests performed by nano-indentation at room temperature. In conclusion, this work opens new perspectives for developing bulk samples over a broad range of compositions. (author)

  11. Probabilistic techniques using Monte Carlo sampling for multi- component system diagnostics

    International Nuclear Information System (INIS)

    Aumeier, S.E.; Lee, J.C.; Akcasu, A.Z.

    1995-01-01

    We outline the structure of a new approach to multi-component system fault diagnostics which utilizes detailed system simulation models, uncertain system observation data, statistical knowledge of system parameters, expert opinion, and component reliability data in an effort to identify incipient component performance degradations of arbitrary number and magnitude. The technique involves the use of multiple adaptive Kalman filters for fault estimation, the results of which are screened using standard hypothesis testing procedures to define a set of component events that could have transpired. Monte Carlo techniques are then used to Latin-hypercube sample each of these feasible component events in terms of uncertain component reliability data and filter estimates. The capabilities of the procedure are demonstrated through the analysis of a simulated small-magnitude binary component fault in a boiling water reactor balance of plant. The results show that the procedure has the potential to be a very effective tool for incipient component fault diagnosis
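Latin hypercube sampling itself is compact to implement: each dimension of the parameter space is cut into n equal-probability strata, each stratum is used exactly once, and the per-dimension assignments are shuffled independently. A generic sketch over the unit hypercube (not the paper's reliability-data parameterization, which would map these uniforms through the relevant distributions):

```python
import random

def latin_hypercube(n_samples, n_dims, rng=random):
    """Latin hypercube sample of the unit hypercube: each dimension is
    split into n_samples equal strata, each stratum is hit exactly once,
    and strata are paired across dimensions by independent shuffles."""
    points = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        for i, s in enumerate(strata):
            # A uniform draw within stratum s of dimension d.
            points[i][d] = (s + rng.random()) / n_samples
    return points

sample = latin_hypercube(10, 2, random.Random(42))
```

Compared with plain Monte Carlo, the stratification guarantees coverage of each marginal with far fewer samples, which is why it suits expensive simulation-based diagnostics like the one described.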

  12. Manipulation of biological samples using micro and nano techniques

    DEFF Research Database (Denmark)

    Castillo, Jaime; Dimaki, Maria; Svendsen, Winnie Edith

    2009-01-01

    to their natural structure. Thanks to the advances in micro- and nanofabrication during the last decades several manipulation techniques offer us the possibility to image, characterize and manipulate biological material in a controlled way. Using these techniques the integration of biomaterials with remarkable...

  13. Determination of silver, gold, zinc and copper in mineral samples by various techniques of instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Rodriguez R, N. I.; Rios M, C.; Pinedo V, J. L.; Yoho, M.; Landsberger, S.

    2015-09-01

    Using the method of instrumental neutron activation analysis, mineral exploration samples were analyzed in order to determine the concentrations of silver, gold, zinc and copper, these minerals being the main beneficiation products of the Tizapa and Cozamin mines. Samples were subjected to various techniques, where the type of radiation and counting methods were chosen based on the specific isotopic characteristics of each element. For calibration and determination of concentrations the comparator method was used: certified standards were subjected to the same conditions of irradiation and measurement as the prospecting samples. The irradiations were performed at the TRIGA Mark II research reactor of the University of Texas at Austin. The silver concentrations were determined by Cyclical Epithermal Neutron Activation Analysis. This method, in combination with the pneumatic transfer system, allowed good analytical precision and accuracy in prospecting for silver, from measurement of the 657.7 keV photo peak of the short half-life radionuclide 110Ag. For the determination of gold and zinc, Epithermal Neutron Activation Analysis was used; the photo peaks analyzed corresponded to the 411.8 keV energy of the radionuclide 198Au and the 438.6 keV energy of the metastable radionuclide 69mZn. On the other hand, copper quantification was based on analysis of the 1039.2 keV photo peak produced by the short half-life radionuclide 66Cu, by Thermal Neutron Activation Analysis. The measurement of the photo peaks corresponding to gold, zinc and copper was performed using a Compton suppression system, which improved the signal-to-noise ratio, so that better detection limits and lower uncertainties in the results were obtained. Comparing elemental concentrations, the highest values of silver, zinc and copper were found in samples from the Tizapa mine. Gold values were in the same range for both mines.
To evaluate the precision and accuracy of the methods used, various geological
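The comparator method described in the record above amounts to scaling a standard's known concentration by the ratio of decay-corrected counts per unit mass, since standard and sample see identical irradiation and counting conditions. A simplified sketch (real NAA practice also corrects for counting live time and irradiation timing; the numbers below, including the ~5.1 min 66Cu half-life rounded to 306 s, are illustrative):

```python
import math

def comparator_concentration(counts_sample, mass_sample_g,
                             counts_std, mass_std_g, conc_std,
                             half_life_s=0.0,
                             delay_sample_s=0.0, delay_std_s=0.0):
    """Elemental concentration by the NAA comparator method:
    conc = conc_std * (decay-corrected counts per gram, sample)
                    / (decay-corrected counts per gram, standard).
    Counts are corrected back to end of irradiation when a half-life
    is supplied; otherwise they are used as-is."""
    def corrected(counts, delay):
        if half_life_s:
            return counts * math.exp(math.log(2) * delay / half_life_s)
        return counts
    specific_sample = corrected(counts_sample, delay_sample_s) / mass_sample_g
    specific_std = corrected(counts_std, delay_std_s) / mass_std_g
    return conc_std * specific_sample / specific_std

# Equal 0.3 g masses; the sample was counted one half-life later than the
# standard, so its raw 500 counts correct to 1000 and the concentrations
# come out equal to the standard's 50 (units follow conc_std).
c = comparator_concentration(500, 0.3, 1000, 0.3, conc_std=50.0,
                             half_life_s=306,
                             delay_sample_s=306, delay_std_s=0)
```

The decay correction is what makes the method workable for short-lived products like 110Ag and 66Cu, where delays of minutes change the raw count rates substantially.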

  14. All-optical optoacoustic microscopy based on probe beam deflection technique

    Directory of Open Access Journals (Sweden)

    Saher M. Maswadi

    2016-09-01

    Full Text Available Optoacoustic (OA) microscopy using an all-optical system based on the probe beam deflection technique (PBDT) for detection of laser-induced acoustic signals was investigated as an alternative to conventional piezoelectric transducers. PBDT provides a number of advantages for OA microscopy including (i) efficient coupling of laser excitation energy to the samples being imaged through the probing laser beam, (ii) undistorted coupling of acoustic waves to the detector without the need for separation of the optical and acoustic paths, (iii) high sensitivity and (iv) ultrawide bandwidth. Because of the unimpeded optical path in PBDT, diffraction-limited lateral resolution can be readily achieved. The sensitivity of the current PBDT sensor of 22 μV/Pa and its noise equivalent pressure (NEP) of 11.4 Pa are comparable with these parameters of the optical micro-ring resonator and commercial piezoelectric ultrasonic transducers. Benefits of the present prototype OA microscope were demonstrated by successfully resolving micron-size details in histological sections of cardiac muscle.

  15. All-optical optoacoustic microscopy based on probe beam deflection technique.

    Science.gov (United States)

    Maswadi, Saher M; Ibey, Bennett L; Roth, Caleb C; Tsyboulski, Dmitri A; Beier, Hope T; Glickman, Randolph D; Oraevsky, Alexander A

    2016-09-01

    Optoacoustic (OA) microscopy using an all-optical system based on the probe beam deflection technique (PBDT) for detection of laser-induced acoustic signals was investigated as an alternative to conventional piezoelectric transducers. PBDT provides a number of advantages for OA microscopy including (i) efficient coupling of laser excitation energy to the samples being imaged through the probing laser beam, (ii) undistorted coupling of acoustic waves to the detector without the need for separation of the optical and acoustic paths, (iii) high sensitivity and (iv) ultrawide bandwidth. Because of the unimpeded optical path in PBDT, diffraction-limited lateral resolution can be readily achieved. The sensitivity of the current PBDT sensor of 22 μV/Pa and its noise equivalent pressure (NEP) of 11.4 Pa are comparable with these parameters of the optical micro-ring resonator and commercial piezoelectric ultrasonic transducers. Benefits of the present prototype OA microscope were demonstrated by successfully resolving micron-size details in histological sections of cardiac muscle.

  16. Characterization technique for detection of atom-size crystalline defects and strains using two-dimensional fast-Fourier-transform sampling Moiré method

    Science.gov (United States)

    Kodera, Masako; Wang, Qinghua; Ri, Shien; Tsuda, Hiroshi; Yoshioka, Akira; Sugiyama, Toru; Hamamoto, Takeshi; Miyashita, Naoto

    2018-04-01

    Recently, we have developed a two-dimensional (2D) fast-Fourier-transform (FFT) sampling Moiré technique to visually and quantitatively determine the locations of minute defects in a transmission electron microscopy (TEM) image. We applied this technique to defect detection in GaN high electron mobility transistor (HEMT) devices, and successfully and clearly visualized atom-size defects in AlGaN/GaN crystalline structures. The defect density obtained in the AlGaN/GaN structures is ∼10¹³ counts/cm². In addition, we have successfully confirmed that the distribution and number of defects closely depend on the process conditions. Thus, this technique is quite useful for device development. Moreover, the strain fields in an AlGaN/GaN crystal were effectively calculated with nm-scale resolution using this method. We also demonstrated that this sampling Moiré technique is applicable to silicon devices, which have principal directions different from those of AlGaN/GaN crystals. As a result, we believe that the 2D FFT sampling Moiré method has great potential for discovering as-yet-unknown relationships between the characteristics of a crystalline material and device performance.

  17. Comparison of sampling techniques for Bayesian parameter estimation

    Science.gov (United States)

    Allison, Rupert; Dunkley, Joanna

    2014-02-01

    The posterior probability distribution for a set of model parameters encodes all that the data have to tell us in the context of a given model; it is the fundamental quantity for Bayesian parameter estimation. In order to infer the posterior probability distribution we have to decide how to explore parameter space. Here we compare three prescriptions for how parameter space is navigated, discussing their relative merits. We consider Metropolis-Hastings sampling, nested sampling and affine-invariant ensemble Markov chain Monte Carlo (MCMC) sampling. We focus on their performance on toy-model Gaussian likelihoods and on a real-world cosmological data set. We outline the sampling algorithms themselves and elaborate on performance diagnostics such as convergence time, scope for parallelization, dimensional scaling, requisite tunings and suitability for non-Gaussian distributions. We find that nested sampling delivers high-fidelity estimates for posterior statistics at low computational cost, and should be adopted in favour of Metropolis-Hastings in many cases. Affine-invariant MCMC is competitive when computing clusters can be utilized for massive parallelization. Affine-invariant MCMC and existing extensions to nested sampling naturally probe multimodal and curving distributions.
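Of the three samplers compared, Metropolis-Hastings is the simplest to sketch. Below is a minimal random-walk variant run on a toy 1-D Gaussian log-posterior; the step size, seed and burn-in are illustrative choices, not the paper's settings:

```python
import math, random

def metropolis_hastings(log_post, x0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings: propose a Gaussian step around the
    current point and accept with probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n_steps):
        x_prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(x_prop)
        if math.log(rng.random()) < lp_prop - lp:   # accept/reject in log space
            x, lp = x_prop, lp_prop
        chain.append(x)
    return chain

# Toy 1-D Gaussian posterior with mean 2 and unit variance
chain = metropolis_hastings(lambda x: -0.5 * (x - 2.0) ** 2, 0.0, 20000)
mean_est = sum(chain[5000:]) / len(chain[5000:])   # posterior mean, burn-in discarded
```

The diagnostics the paper discusses (convergence time, tuning) show up even here: the proposal width `step` must be tuned to the posterior scale, which is exactly the kind of hand-tuning nested sampling and affine-invariant ensembles try to avoid.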

  18. Knowledge based systems advanced concepts, techniques and applications

    CERN Document Server

    1997-01-01

    The field of knowledge-based systems (KBS) has expanded enormously during recent years, and many important techniques and tools are currently available. Applications of KBS range from medicine to engineering and aerospace. This book provides a selected set of state-of-the-art contributions that present advanced techniques, tools and applications. These contributions have been prepared by a group of eminent researchers and professionals in the field. The theoretical topics covered include: knowledge acquisition, machine learning, genetic algorithms, knowledge management and processing under uncertainty

  19. Use of the electrodeposition technique in the preparation of samples of 237Np and its determination by alpha spectrometry

    International Nuclear Information System (INIS)

    Mertzig, W.; Matsuda, H.T.; Araujo, B.F. de; Araujo, J.A. de.

    1981-05-01

    The electroplating technique used to prepare 237 Np sources and their determination by alpha spectrometry is presented. The samples were prepared using a lucite electrolytic cell manufactured at IPEN, designed especially for trace amounts of actinides. A polished brass disk coated with a Ni film was used as the cathode and a fixed Pt wire as the anode. The electroplated samples were alpha counted using a surface barrier detector. The optimum conditions for quantitative deposition of 237 Np were established by studying the effects of parameters such as current density, pH, concentration of the electrolytic solution and electrodeposition time, using a carrier technique. After preliminary purification, the method is applied to control trace amounts of 237 Np in Purex process solutions. (Author) [pt

  20. Fractal Image Compression Based on High Entropy Values Technique

    Directory of Open Access Journals (Sweden)

    Douaa Younis Abbaas

    2018-04-01

    Full Text Available There have been many attempts to improve the encoding stage of FIC because it is time consuming. These attempts worked by reducing the size of the search pool for range-domain matching, but most of them led to poor quality or a lower compression ratio of the reconstructed image. This paper presents a method to improve the performance of the full search algorithm by combining FIC (lossy compression) with a lossless technique (in this case entropy coding). The entropy technique reduces the size of the domain pool (i.e., the number of domain blocks) based on the entropy value of each range block and domain block; the results of the full search algorithm and the proposed entropy-based algorithm are then compared to see which gives the better results (such as reduced encoding time with acceptable values of both compression quality parameters, C.R (Compression Ratio) and PSNR (Image Quality)). The experimental results show that the proposed entropy technique reduces the encoding time while keeping the compression ratio and reconstructed image quality acceptably good.
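The entropy-based pruning of the domain pool can be sketched as follows; the block contents and the tolerance are invented for illustration, and this is not the paper's exact algorithm:

```python
import math

def block_entropy(block):
    """Shannon entropy (bits) of the pixel intensities in a block."""
    hist = {}
    for p in block:
        hist[p] = hist.get(p, 0) + 1
    n = len(block)
    return -sum((c / n) * math.log2(c / n) for c in hist.values())

def prune_domain_pool(range_block, domain_pool, tol=1.0):
    """Keep only domain blocks whose entropy lies within `tol` bits of the
    range block's entropy, shrinking the range-domain search."""
    e_r = block_entropy(range_block)
    return [d for d in domain_pool if abs(block_entropy(d) - e_r) <= tol]

flat  = [10] * 16                      # entropy 0 bits
noisy = list(range(16))                # entropy 4 bits
half  = [10] * 8 + [11] * 8            # entropy 1 bit
kept = prune_domain_pool([12] * 15 + [13], [flat, noisy, half])
# Only `flat` and `half` remain: `noisy` is too far away in entropy
```

Because entropy is cheap to precompute per block, the filter runs once before the expensive affine-matching search, which is where the encoding-time savings come from.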

  1. Classification of sand samples according to radioactivity content by the use of euclidean and rough sets techniques

    International Nuclear Information System (INIS)

    Abd El-Monsef, M.M.; Kozae, A.M.; Seddeek, M.K.; Medhat, T.; Sharshar, T.; Badran, H.M.

    2004-01-01

    From the geological point of view, the origin and transport of black and normal sands is particularly important. Black and normal sands came to their places along the Mediterranean sea coast after transport by natural processes. Both types of sands have different radiological properties. This study therefore attempts to use mathematical methods to classify Egyptian sand samples collected from 42 locations in an area of 40 × 19 km² based on their radioactivity contents. Using all of the information resulting from the experimental measurements of radioactivity contents, as well as some other parameters, can be a time- and effort-consuming task, so the process of eliminating unnecessary attributes is of prime importance. This elimination of superfluous attributes that cannot affect the decision was carried out. Topological techniques were then applied to classify the information systems resulting from the radioactivity measurements, in both the Euclidean and quasi-discrete topological cases. While there are some applications of the former case in environmental radioactivity, the use of the quasi-discrete case in so-called rough set information analysis is new in such a study. The mathematical methods are summarized and the results and their radiological implications are discussed. Generally, the results indicate no radiological anomaly, and they support the previously suggested hypothesis about the presence of two types of sand in the studied area
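The attribute-elimination step of rough set analysis checks, for each attribute, whether dropping it leaves every indiscernibility class consistent with the decision. A toy sketch; the attributes, discretized levels and decisions are invented, not the study's data:

```python
def ind_classes(rows, attrs):
    """Partition object indices into indiscernibility classes: objects with
    identical values on `attrs` are indistinguishable."""
    classes = {}
    for k, row in enumerate(rows):
        classes.setdefault(tuple(row[a] for a in attrs), []).append(k)
    return list(classes.values())

def dispensable(rows, decision, attrs):
    """Attributes whose removal keeps every indiscernibility class
    consistent with the decision (candidates for elimination)."""
    out = []
    for a in attrs:
        rest = [b for b in attrs if b != a]
        if all(len({decision[k] for k in cls}) == 1
               for cls in ind_classes(rows, rest)):
            out.append(a)
    return out

# Toy information system: three radioactivity attributes, sand-type decision
rows = [{'U': 'hi', 'Th': 'hi', 'K': 'lo'},
        {'U': 'hi', 'Th': 'hi', 'K': 'hi'},
        {'U': 'lo', 'Th': 'lo', 'K': 'hi'}]
decision = ['black', 'normal', 'normal']
removable = dispensable(rows, decision, ['U', 'Th', 'K'])  # K is essential here
```

Here dropping U or Th still separates the two sand types, while dropping K merges objects with different decisions; K is therefore indispensable in this toy system.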

  2. Tracer techniques for urine volume determination and urine collection and sampling back-up system

    Science.gov (United States)

    Ramirez, R. V.

    1971-01-01

    The feasibility, functionality, and overall accuracy of using lithium as a chemical tracer in urine were investigated as a means of indirect determination of total urine volume by the atomic absorption spectrophotometry method. Experiments were conducted to investigate the parameters of instrumentation, tracer concentration, mixing times, and methods for incorporating the tracer material in the urine collection bag, and to refine and optimize the urine tracer technique to comply with the Skylab scheme and its operational parameters of ±2% volume error and ±1% accuracy in the amount of tracer added to each container. In addition, a back-up method for the urine collection and sampling system was developed and evaluated. This back-up method incorporates the tracer technique for volume determination in the event of failure of the primary urine collection and preservation system. One chemical preservative was selected and evaluated as a contingency preservative for the storage of urine in the event of failure of the urine cooling system.

  3. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2013-05-04

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation. However, often in applications the overall flow in the low-resolution simulation that an animator observes and intends to preserve is composed of even lower frequencies than the low resolution itself. In such cases, attempting to match the low-resolution simulation precisely is unnecessarily restrictive. We propose a new sampling technique to efficiently capture the overall flow of a fluid simulation, at the scale of the user's choice, in such a way that the sampled information is sufficient to represent what is virtually perceived and no more. Thus, by applying control based on the sampled data, we ensure that in the resulting high-resolution simulation, the overall flow is matched to the low-resolution simulation and the fine details on the high resolution are preserved. The samples we obtain have both spatial and temporal continuity that allows smooth keyframe matching and direct manipulation of visible elements such as smoke density through temporal blending of samples. We demonstrate that a user can easily configure a simulation with our system to achieve desired results. © 2013 Springer-Verlag Berlin Heidelberg.

  4. Pseudogenes and DNA-based diet analyses: A cautionary tale from a relatively well sampled predator-prey system

    DEFF Research Database (Denmark)

    Dunshea, G.; Barros, N. B.; Wells, R. S.

    2008-01-01

    Mitochondrial ribosomal DNA is commonly used in DNA-based dietary analyses. In such studies, these sequences are generally assumed to be the only version present in DNA of the organism of interest. However, nuclear pseudogenes that display variable similarity to the mitochondrial versions are common in many taxa. The presence of nuclear pseudogenes that co-amplify with their mitochondrial paralogues can lead to several possible confounding interpretations when applied to estimating animal diet. Here, we investigate the occurrence of nuclear pseudogenes in fecal samples taken from bottlenose dolphins (Tursiops truncatus) that were assayed for prey DNA with a universal primer technique. We found pseudogenes in 13 of 15 samples and 1-5 pseudogene haplotypes per sample representing 5-100% of all amplicons produced. The proportion of amplicons that were pseudogenes and the diversity of prey DNA...

  5. Quantifying Tip-Sample Interactions in Vacuum Using Cantilever-Based Sensors: An Analysis

    Science.gov (United States)

    Dagdeviren, Omur E.; Zhou, Chao; Altman, Eric I.; Schwarz, Udo D.

    2018-04-01

    Atomic force microscopy is an analytical characterization method that is able to image a sample's surface topography at high resolution while simultaneously probing a variety of different sample properties. Such properties include tip-sample interactions, the local measurement of which has gained much popularity in recent years. To this end, either the oscillation frequency or the oscillation amplitude and phase of the vibrating force-sensing cantilever are recorded as a function of tip-sample distance and subsequently converted into quantitative values for the force or interaction potential. Here, we theoretically and experimentally show that the force law obtained from such data acquired under vacuum conditions using the most commonly applied methods may deviate more than previously assumed from the actual interaction when the oscillation amplitude of the probe is of the order of the decay length of the force near the surface, which may result in a non-negligible error if correct absolute values are of importance. Caused by approximations made in the development of the mathematical reconstruction procedures, the related inaccuracies can be effectively suppressed by using oscillation amplitudes sufficiently larger than the decay length. To facilitate efficient data acquisition, we propose a technique that includes modulating the drive amplitude at a constant height from the surface while monitoring the oscillation amplitude and phase. Ultimately, such an amplitude-sweep-based force spectroscopy enables shorter data acquisition times and increased accuracy for quantitative chemical characterization compared to standard approaches that vary the tip-sample distance. An additional advantage is that since no feedback loop is active while executing the amplitude sweep, the force can be consistently recovered deep into the repulsive regime.

  6. Analysis of boron utilization in sample preparation for microorganisms detection by neutron radiography technique

    International Nuclear Information System (INIS)

    Wacha, Reinaldo; Crispim, Verginia R.

    2000-01-01

    The neutron radiography technique applied to the detection of microorganisms is studied as a new and faster alternative for the diagnosis of infectious agents. This work presents the parameters and effects involved in the use of boron as a conversion agent, which converts neutrons into alpha particles capable of generating latent tracks in a solid state nuclear track detector, CR-39. The collected samples are doped with boron by the incubation method, promoting a microorganism/boron interaction that allows the identification of the images of those microorganisms through their morphology. (author)

  7. A microhistological technique for analysis of food habits of mycophagous rodents.

    Science.gov (United States)

    Patrick W. McIntire; Andrew B. Carey

    1989-01-01

    We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...

  8. New Approach Based on Compressive Sampling for Sample Rate Enhancement in DASs for Low-Cost Sensing Nodes

    Directory of Open Access Journals (Sweden)

    Francesco Bonavolontà

    2014-10-01

    Full Text Available The paper deals with the problem of improving the maximum sample rate of analog-to-digital converters (ADCs) included in low-cost wireless sensing nodes. To this aim, the authors propose an efficient acquisition strategy based on the combined use of a high-resolution time-basis and compressive sampling. In particular, the high-resolution time-basis is adopted to provide a proper sequence of random sampling instants, and a suitable software procedure, based on the compressive sampling approach, is exploited to reconstruct the signal of interest from the acquired samples. Thanks to the proposed strategy, the effective sample rate of the reconstructed signal can be as high as the frequency of the considered time-basis, thus significantly improving the inherent ADC sample rate. Several tests are carried out in simulated and real conditions to assess the performance of the proposed acquisition strategy in terms of reconstruction error. In particular, the results obtained in experimental tests with the ADCs included in actual 8- and 32-bit microcontrollers highlight the possibility of achieving an effective sample rate up to 50 times higher than the original ADC sample rate.
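The reconstruction step can be illustrated with a standard greedy sparse solver: draw a few random sampling instants from a fine time grid, then recover a frequency-sparse signal by orthogonal matching pursuit. This is a generic sketch under assumed conditions (grid sizes, sparsity budget and the test signal are not the paper's procedure):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 128, 40, 4        # fine-grid length, measurements, sparsity budget

# Frequency-sparse test signal on a high-resolution time grid
t = np.arange(N)
x = np.cos(2 * np.pi * 5 * t / N) + 0.5 * np.cos(2 * np.pi * 12 * t / N)

# Random sampling instants drawn from the high-resolution time-basis
idx = np.sort(rng.choice(N, M, replace=False))
y = x[idx]

# Measurement matrix: real cos/sin Fourier atoms evaluated at the instants
freqs = np.arange(N // 2 + 1)
A = np.hstack([np.cos(2 * np.pi * np.outer(idx, freqs) / N),
               np.sin(2 * np.pi * np.outer(idx, freqs) / N)])

# Orthogonal matching pursuit: greedily pick atoms, refit by least squares
support, r = [], y.copy()
for _ in range(K):
    support.append(int(np.argmax(np.abs(A.T @ r))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    r = y - A[:, support] @ coef

# Synthesize the recovered signal on the full fine grid
A_full = np.hstack([np.cos(2 * np.pi * np.outer(t, freqs) / N),
                    np.sin(2 * np.pi * np.outer(t, freqs) / N)])
x_hat = A_full[:, support] @ coef
rel_err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
```

The effective rate here is that of the fine grid (`N` points) even though only `M` samples were taken, which mirrors the paper's claim that the reconstructed signal's sample rate is set by the time-basis, not by the ADC.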

  9. Optimization and analysis of a quantitative real-time PCR-based technique to determine microRNA expression in formalin-fixed paraffin-embedded samples

    Directory of Open Access Journals (Sweden)

    Reis Patricia P

    2010-06-01

    Full Text Available Abstract Background MicroRNAs (miRs) are non-coding RNA molecules involved in post-transcriptional regulation, with diverse functions in tissue development, differentiation, cell proliferation and apoptosis. miRs may be less prone to degradation during formalin fixation, facilitating miR expression studies in formalin-fixed paraffin-embedded (FFPE) tissue. Results Our study demonstrates that the TaqMan Human MicroRNA Array v1.0 (Early Access) platform is suitable for miR expression analysis in FFPE tissue with a high reproducibility (correlation coefficients of 0.95 between duplicates, p 35, we show that reproducibility between technical replicates, equivalent dilutions, and FFPE vs. frozen samples is best in the high abundance stratum. We also demonstrate that the miR expression profiles of FFPE samples are comparable to those of fresh-frozen samples, with a correlation of up to 0.87 (p Conclusion Our study thus demonstrates the utility, reproducibility, and optimization steps needed in miR expression studies using FFPE samples on a high-throughput quantitative PCR-based miR platform, opening up a realm of research possibilities for retrospective studies.

  10. Solid phase microextraction headspace sampling of chemical warfare agent contaminated samples : method development for GC-MS analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jackson Lepage, C.R.; Hancock, J.R. [Defence Research and Development Canada, Medicine Hat, AB (Canada); Wyatt, H.D.M. [Regina Univ., SK (Canada)

    2004-07-01

    Defence R and D Canada-Suffield (DRDC-Suffield) is responsible for analyzing samples that are suspected to contain chemical warfare agents, either collected by the Canadian Forces or by first-responders in the event of a terrorist attack in Canada. The analytical techniques used to identify the composition of the samples include gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), Fourier-transform infrared spectroscopy (FT-IR) and nuclear magnetic resonance spectroscopy. GC-MS and LC-MS generally require solvent extraction and reconcentration, thereby increasing sample handling. The authors examined analytical techniques which reduce or eliminate sample manipulation. In particular, this paper presented a screening method based on solid phase microextraction (SPME) headspace sampling and GC-MS analysis for chemical warfare agents such as mustard, sarin, soman, and cyclohexyl methylphosphonofluoridate in contaminated soil samples. SPME is a method which uses small adsorbent polymer coated silica fibers that trap vaporous or liquid analytes for GC or LC analysis. Collection efficiency can be increased by adjusting sampling time and temperature. This method was tested on two real-world samples, one from excavated chemical munitions and the second from a caustic decontamination mixture. 7 refs., 2 tabs., 3 figs.

  11. Rapid Fractionation and Isolation of Whole Blood Components in Samples Obtained from a Community-based Setting.

    Science.gov (United States)

    Weckle, Amy; Aiello, Allison E; Uddin, Monica; Galea, Sandro; Coulborn, Rebecca M; Soliven, Richelo; Meier, Helen; Wildman, Derek E

    2015-11-30

    Collection and processing of whole blood samples in a non-clinical setting offers a unique opportunity to evaluate community-dwelling individuals both with and without preexisting conditions. Rapid processing of these samples is essential to avoid degradation of key cellular components. Included here are methods for simultaneous peripheral blood mononuclear cell (PBMC), DNA, RNA and serum isolation from a single blood draw performed in the homes of consenting participants across a metropolitan area, with processing initiated within 2 hr of collection. We have used these techniques to process over 1,600 blood specimens yielding consistent, high quality material, which has subsequently been used in successful DNA methylation, genotyping, gene expression and flow cytometry analyses. Some of the methods employed are standard; however, when combined in the described manner, they enable efficient processing of samples from participants of population- and/or community-based studies who would not normally be evaluated in a clinical setting. Therefore, this protocol has the potential to obtain samples (and subsequently data) that are more representative of the general population.

  12. A PLL-based resampling technique for vibration analysis in variable-speed wind turbines with PMSG: A bearing fault case

    Science.gov (United States)

    Pezzani, Carlos M.; Bossio, José M.; Castellino, Ariel M.; Bossio, Guillermo R.; De Angelo, Cristian H.

    2017-02-01

    Condition monitoring in permanent magnet synchronous machines has gained interest due to their increasing use in applications such as electric traction and power generation. Particularly in wind power generation, non-invasive condition monitoring techniques are of great importance. Usually, in such applications access to the generator is complex and costly, while unexpected breakdowns result in high repair costs. This paper presents a technique which allows using vibration analysis for bearing fault detection in permanent magnet synchronous generators used in wind turbines. Given that in wind power applications the generator rotational speed may vary during normal operation, special sampling techniques are necessary to apply spectral analysis to the mechanical vibrations. In this work, a resampling technique based on order tracking without measuring the rotor position is proposed. To synchronize sampling with rotor position, an estimate of the rotor position obtained from the angle of the voltage vector is used. This angle is obtained from a phase-locked loop synchronized with the generator voltages. The proposed strategy is validated by laboratory experimental results obtained from a permanent magnet synchronous generator. Results with single-point defects in the outer race of a bearing under variable speed and load conditions are presented.
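The core of order-tracking resampling, interpolating the time-sampled vibration signal onto a uniform shaft-angle grid derived from the estimated rotor angle, can be sketched as follows (the speed ramp, fault order and sample counts are illustrative, and the PLL itself is not modelled):

```python
import numpy as np

def angular_resample(vib, theta, samples_per_rev=64):
    """Order tracking: map a vibration signal sampled uniformly in time onto
    a uniform shaft-angle grid, so spectral lines stay at fixed orders even
    when the speed varies. `theta` is the (monotonically increasing) rotor
    angle at each time sample, e.g. estimated from a PLL locked to the
    generator voltages."""
    n_rev = theta[-1] / (2 * np.pi)
    k = np.arange(int(n_rev * samples_per_rev))
    theta_uniform = k * 2 * np.pi / samples_per_rev
    return np.interp(theta_uniform, theta, vib)

# Hypothetical run-up: shaft speed ramps from 10 to 20 Hz while a defect
# excites the 3rd shaft order (3 vibration cycles per revolution)
t = np.linspace(0.0, 2.0, 4000)
theta = 2 * np.pi * (10.0 * t + 2.5 * t ** 2)   # integral of the speed ramp
vib = np.sin(3 * theta)                          # smeared in the time domain
resampled = angular_resample(vib, theta)
# In the angle domain the signal is a pure tone at 3 cycles per revolution
```

A plain FFT of `vib` smears the fault component across 30-60 Hz as the speed ramps; after angular resampling the same component collapses to a single spectral line at order 3, which is what makes envelope/spectral bearing diagnostics usable at variable speed.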

  13. Auto-correlation based intelligent technique for complex waveform presentation and measurement

    International Nuclear Information System (INIS)

    Rana, K P S; Singh, R; Sayann, K S

    2009-01-01

    Waveform acquisition and presentation form the heart of many measurement systems. In particular, the acquisition and presentation of repeating complex signals, such as sine sweep and frequency-modulated signals, introduce the challenge of waveform time period estimation and live waveform presentation. This paper presents an intelligent technique for waveform period estimation of both complex and simple waveforms, based on the normalized auto-correlation method. The proposed technique is demonstrated using LabVIEW-based intensive simulations on several simple and complex waveforms. Implementation of the technique is successfully demonstrated using LabVIEW-based virtual instrumentation. Sine sweep vibration waveforms are successfully presented and measured for vibrations generated by an electrodynamic shaker system. The proposed method is also suitable for digital storage oscilloscope (DSO) triggering, for complex signal acquisition and presentation. This intelligence can be embodied in the DSO, making it an intelligent measurement system catering to a wide variety of waveforms. The proposed technique, simulation results, robustness study and implementation results are presented in this paper.
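A normalized auto-correlation period estimator of the kind described can be sketched as follows; the zero-crossing heuristic for skipping the zero-lag lobe and the test waveform are illustrative choices, not the paper's exact algorithm:

```python
import math

def estimate_period(samples):
    """Estimate the period (in samples) of a repeating waveform as the lag
    of the strongest normalized auto-correlation peak beyond the zero-lag
    lobe (the lobe is skipped by waiting for the first zero crossing)."""
    n = len(samples)
    mean = sum(samples) / n
    x = [s - mean for s in samples]
    energy = sum(v * v for v in x)
    r = [sum(x[i] * x[i + lag] for i in range(n - lag)) / energy
         for lag in range(n // 2)]
    lag = 1
    while lag < len(r) and r[lag] > 0:   # skip the zero-lag lobe
        lag += 1
    return max(range(lag, len(r)), key=r.__getitem__)

# A 1000-sample record of a complex waveform repeating every 50 samples
sig = [math.sin(2 * math.pi * i / 50) + 0.3 * math.sin(2 * math.pi * 3 * i / 50)
       for i in range(1000)]
period = estimate_period(sig)
```

Because auto-correlation peaks at every multiple of the true period, taking the strongest peak (rather than the first one above a fixed threshold) makes the estimate robust to harmonics, which is why the method copes with complex waveforms where simple level-triggering fails.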

  14. Determination of toxic and trace elements in human hair and sediment samples by reactor neutron activation analysis technique based-on the k-zero method

    International Nuclear Information System (INIS)

    Ho Manh Dung; Nguyen Mong Sinh; Nguyen Thanh Binh; Cao Dong Vu; Nguyen Thi Si

    2004-01-01

    The analysis of human hair can evaluate the degree of exposure of the human body to environmental pollutants, food intake and metabolism. Likewise, the analysis of sediment can aid in reconstructing the history of changes, understanding human impact on the ecosystem, and suggesting possible remedial strategies. The k0-standardization method of neutron activation analysis (k0-NAA) on a research reactor can play an important role as a main analytical technique, with the advantages of sensitivity, precision, accuracy, multielement capability and routine use for these sample types. The project's aim is therefore to establish k0-NAA procedures on the Dalat research reactor for the analysis of human hair and sediment samples. The k0-NAA procedure on the Dalat research reactor is able to determine the elements Ag, Al, As, Br, Ca, Cl, Co, Cr, Cu, Fe, Hg, I, K, Mg, Mn, Na, S, Sb, Se, Sr, Ti, V and Zn in human hair, and As, Co, Cr, Cs, Fe, Hf, K, La, Mn, Na, Rb, Sb, Sc, Yb and Zn in sediment. (author)

  15. Comparison of Two Surface Contamination Sampling Techniques Conducted for the Characterization of Two Pajarito Site Manhattan Project National Historic Park Properties

    Energy Technology Data Exchange (ETDEWEB)

    Lopez, Tammy Ann [Montana Tech of the Univ. of Montana, Butte, MT (United States); Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-29

    Technical Area-18 (TA-18), also known as Pajarito Site, is located on Los Alamos National Laboratory property and has historic buildings that will be included in the Manhattan Project National Historic Park. Characterization studies of metal contamination were needed in two of the four buildings that are on the historic registry in this area, a “battleship” bunker building (TA-18-0002) and the Pond cabin (TA-18-0029). However, these two buildings have been exposed to the elements, are decades old, and have porous and rough surfaces (wood and concrete). Due to these conditions, it was questioned whether standard wipe sampling would be adequate to detect surface dust metal contamination in these buildings. Thus, micro-vacuum and surface wet wipe sampling techniques were performed side-by-side at both buildings and results were compared statistically. A two-tail paired t-test revealed that the micro-vacuum and wet wipe techniques were statistically different for both buildings. Further mathematical analysis revealed that the wet wipe technique picked up more metals from the surface than the microvacuum technique. Wet wipes revealed concentrations of beryllium and lead above internal housekeeping limits; however, using an yttrium normalization method with linear regression analysis between beryllium and yttrium revealed a correlation indicating that the beryllium levels were likely due to background and not operational contamination. PPE and administrative controls were implemented for National Park Service (NPS) and Department of Energy (DOE) tours as a result of this study. Overall, this study indicates that the micro-vacuum technique may not be an efficient technique to sample for metal dust contamination.
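The two-tail paired t-test used to compare the side-by-side techniques can be sketched as follows; the measurement values are invented for illustration, and the statistic is compared against a tabulated critical value rather than converted to a p-value:

```python
import math

def paired_t(a, b):
    """Two-tail paired t statistic for side-by-side measurements of the same
    surfaces with two sampling techniques; returns (t, degrees of freedom)."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)
    return mean_d / math.sqrt(var_d / n), n - 1

# Hypothetical side-by-side surface lead results (not the study's data)
wet_wipe  = [12.0, 8.5, 15.2, 9.8, 11.4, 7.9]
micro_vac = [ 6.1, 4.0,  9.3, 5.2,  6.8, 3.5]
t_stat, dof = paired_t(wet_wipe, micro_vac)
# Compare |t_stat| with the 5% two-tail critical value for 5 dof (2.571):
# exceeding it indicates the two techniques differ significantly
```

Pairing is the key design choice: each surface serves as its own control, so surface-to-surface variability (large on weathered wood and concrete) cancels out of the difference scores.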

  16. Technical Note: A novel rocket-based in situ collection technique for mesospheric and stratospheric aerosol particles

    Directory of Open Access Journals (Sweden)

    W. Reid

    2013-03-01

    Full Text Available A technique for collecting aerosol particles between altitudes of 17 and 85 km is described. Spin-stabilized collection probes are ejected from a sounding rocket allowing for multi-point measurements. Each probe is equipped with 110 collection samples that are 3 mm in diameter. The collection samples are one of three types: standard transmission electron microscopy carbon grids, glass fibre filter paper or silicone gel. Collection samples are exposed over a 50 m to 5 km height range with a total of 45 separate ranges. Post-flight electron microscopy will give size-resolved information on particle number, shape and elemental composition. Each collection probe is equipped with a suite of sensors to capture the probe's status during the fall. Parachute recovery systems along with GPS-based localization will ensure that each probe can be located and recovered for post-flight analysis.

  17. Line impedance estimation using model based identification technique

    DEFF Research Database (Denmark)

    Ciobotaru, Mihai; Agelidis, Vassilios; Teodorescu, Remus

    2011-01-01

    The estimation of the line impedance can be used by the control of numerous grid-connected systems, such as active filters, islanding detection techniques, non-linear current controllers, and detection of the on/off grid operation mode. Therefore, estimating the line impedance can add extra functions to the operation of grid-connected power converters. This paper describes a quasi-passive method for estimating the line impedance of the distribution electricity network. The method uses the model-based identification technique to obtain the resistive and inductive parts of the line impedance. The quasi...
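A model-based identification of the resistive and inductive parts can be sketched as an ordinary least-squares fit of v = R·i + L·di/dt to sampled data. The waveform, noise level and "true" values below are illustrative, and the paper's quasi-passive excitation scheme is not modelled here:

```python
import numpy as np

# Sampled line current (fundamental plus a 5th-harmonic component) and the
# corresponding voltage drop across the line impedance, with sensor noise
dt = 1e-4
t = np.arange(0.0, 0.04, dt)
R_true, L_true = 0.5, 2e-3                       # ohm, henry (assumed values)

i = 10 * np.sin(2 * np.pi * 50 * t) + 2 * np.sin(2 * np.pi * 250 * t)
di = np.gradient(i, dt)                          # numerical derivative
rng = np.random.default_rng(1)
v = R_true * i + L_true * di + rng.normal(0.0, 0.05, t.size)

# Model-based identification: regress v on [i, di/dt] by least squares
A = np.column_stack([i, di])
(R_est, L_est), *_ = np.linalg.lstsq(A, v, rcond=None)
```

The harmonic content matters: with a pure 50 Hz current the columns `i` and `di` carry little independent information, whereas an extra frequency component (injected or naturally present) makes R and L separately identifiable, which is the role of the excitation in the quasi-passive method.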

  18. Craniospinal radiotherapy in children: Electron- or photon-based technique of spinal irradiation

    International Nuclear Information System (INIS)

    Chojnacka, M.; Skowronska-Gardas, A.; Pedziwiatr, K.; Morawska-Kaczynska, M.; Zygmuntowicz-Pietka, A.; Semaniak, A.

    2010-01-01

    Background: The prone position and an electron-based technique for craniospinal irradiation (CSI) have been standard in our department for many years, but this immobilization makes it difficult for the anaesthesiologist to gain airway access. The increasing number of children treated under anaesthesia led us to reconsider our technique. Aim: The purpose of this study is to report our new photon-based technique for CSI, which can be applied in both the supine and the prone position, and to compare this technique with our electron-based technique. Materials and methods: Between November 2007 and May 2008, 11 children with brain tumours were treated with CSI in the prone position. For 9 patients two treatment plans were created: the first using photons and the second using electron beams for spinal irradiation. We prepared seven 3D-conformal photon plans and four forward-planned segmented field plans. We compared 20 treatment plans in terms of target dose homogeneity and sparing of organs at risk. Results: Segmented field plans achieved better dose homogeneity in the thecal sac volume than electron-based plans. Regarding doses to organs at risk, photon-based plans gave a lower dose to the thyroid but a higher one to the heart and liver. Conclusions: Our technique can be applied in both the supine and prone position and seems to be more feasible and precise than the electron technique. However, more homogeneous target coverage and higher precision of dose delivery with photons are obtained at the cost of slightly higher doses to the heart and liver. (authors)

  19. Graph based techniques for tag cloud generation

    DEFF Research Database (Denmark)

    Leginus, Martin; Dolog, Peter; Lage, Ricardo Gomes

    2013-01-01

    Tag clouds are one of the navigation aids for exploring documents; they also link documents through user-defined terms. We explore various graph-based techniques to improve tag cloud generation. Moreover, we introduce relevance measures based on underlying data, such as ratings or citation counts, for improved measurement of the relevance of tag clouds. We show that, on the given data sets, our approach outperforms the state-of-the-art baseline methods with respect to such relevance by 41% on the Movielens dataset and by 11% on the Bibsonomy dataset.

  20. Backstepping Based Formation Control of Quadrotors with the State Transformation Technique

    Directory of Open Access Journals (Sweden)

    Keun Uk Lee

    2017-11-01

    Full Text Available In this paper, a backstepping-based formation control of quadrotors with a state transformation technique is proposed. First, the dynamics of a quadrotor are derived using the Newton–Euler formulation. Next, a backstepping-based formation control for quadrotors using a state transformation technique is presented. In the position control, which is the basis of formation control, the reference attitude angles can be derived by employing a state transformation technique without the small-angle assumption or the simplified dynamics usually used. Stability analysis based on the Lyapunov theorem shows that the proposed formation controller renders the quadrotor formation error system asymptotically stable. Finally, we verify the performance of the proposed formation control method through comparison simulations.