WorldWideScience

Sample records for reference sample approach

  1. Self-reference and random sampling approach for label-free identification of DNA composition using plasmonic nanomaterials.

    Science.gov (United States)

    Freeman, Lindsay M; Pang, Lin; Fainman, Yeshaiahu

    2018-05-09

    The analysis of DNA has led to revolutionary advancements in the fields of medical diagnostics, genomics, prenatal screening, and forensic science, with the global DNA testing market expected to reach revenues of USD 10.04 billion per year by 2020. However, the current methods for DNA analysis remain dependent on fluorophores or conjugated proteins, leading to high costs associated with consumable materials and manual labor. Here, we demonstrate a potential label-free DNA composition detection method using surface-enhanced Raman spectroscopy (SERS) in which we identify the composition of cytosine and adenine within single strands of DNA. This approach depends on the fact that there is one phosphate backbone per nucleotide, which we use as a reference to compensate for systematic measurement variations. We utilize plasmonic nanomaterials with random Raman sampling to perform label-free detection of the nucleotide composition within DNA strands, generating a calibration curve from standard samples of DNA and demonstrating the capability of resolving the nucleotide composition. The work represents an innovative way to detect the nucleotide composition of DNA strands without attached labels, offering a highly sensitive and reproducible method that factors in random sampling to minimize error.
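
The backbone-normalization and calibration idea described above can be sketched in a few lines: each nucleotide peak is divided by the phosphate-backbone peak of the same spectrum (one backbone per nucleotide), and a linear calibration curve is fit to standards of known composition. This is a minimal stdlib-Python sketch; the peak values and the assumption of a linear response are illustrative, not data from the paper.

```python
def backbone_normalize(nucleotide_peak, backbone_peak):
    """Ratio of a nucleotide Raman peak to the phosphate-backbone
    reference peak of the same spectrum."""
    return nucleotide_peak / backbone_peak

def fit_calibration(fractions, ratios):
    """Ordinary least-squares fit of ratio = slope * fraction + intercept
    over standards of known nucleotide fraction."""
    n = len(fractions)
    mx = sum(fractions) / n
    my = sum(ratios) / n
    sxx = sum((x - mx) ** 2 for x in fractions)
    sxy = sum((x - mx) * (y - my) for x, y in zip(fractions, ratios))
    slope = sxy / sxx
    return slope, my - slope * mx

def predict_fraction(ratio, slope, intercept):
    """Invert the calibration line to estimate composition from a
    measured, backbone-normalized ratio."""
    return (ratio - intercept) / slope
```

Inverting the fitted line then maps each measured ratio of an unknown strand back to an estimated nucleotide fraction.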

  2. The Lyman alpha reference sample

    DEFF Research Database (Denmark)

    Hayes, M.; Östlin, G.; Schaerer, D.

    2013-01-01

    We report on new imaging observations of the Lyman alpha emission line (Lyα), performed with the Hubble Space Telescope, that comprise the backbone of the Lyman alpha Reference Sample. We present images of 14 starburst galaxies at redshifts 0.028

  3. Criteria to define a more relevant reference sample of titanium dioxide in the context of food: a multiscale approach.

    Science.gov (United States)

    Dudefoi, William; Terrisse, Hélène; Richard-Plouet, Mireille; Gautron, Eric; Popa, Florin; Humbert, Bernard; Ropers, Marie-Hélène

    2017-05-01

    Titanium dioxide (TiO₂) is a transition metal oxide widely used as a white pigment in various applications, including food. Because the International Agency for Research on Cancer has classified TiO₂ nanoparticles as potentially harmful to humans by inhalation, the presence of nanoparticles in food products needed to be confirmed by a set of independent studies. Seven samples of food-grade TiO₂ (E171) were extensively characterised for their size distribution, crystallinity and surface properties by the currently recommended methods. All investigated E171 samples contained a fraction of nanoparticles, although below the threshold above which labelling as a nanomaterial is required. On the basis of these results and a statistical analysis, E171 food-grade TiO₂ totally differs from the reference material P25, confirming the few published data on this kind of particle. Therefore, the reference material P25 does not appear to be the most suitable model to study the fate of food-grade TiO₂ in the gastrointestinal tract. The criteria currently proposed to obtain a representative food-grade sample of TiO₂ are the following: (1) crystalline phase anatase, (2) a powder with an isoelectric point very close to 4.1, (3) a fraction of nanoparticles between 15% and 45%, and (4) a low specific surface area, around 10 m² g⁻¹.

  4. ‘Sampling the reference set’ revisited

    NARCIS (Netherlands)

    Berkum, van E.E.M.; Linssen, H.N.; Overdijk, D.A.

    1998-01-01

    The confidence level of an inference table is defined as a weighted truth probability of the inference when sampling the reference set. The reference set is recognized by conditioning on the values of maximal partially ancillary statistics. In the sampling experiment values of incidental parameters

  5. Reference samples for the earth sciences

    Science.gov (United States)

    Flanagan, F.J.

    1974-01-01

    A revised list of reference samples of interest to geoscientists has been extended to include samples for the agronomist, the archaeologist and the environmentalist. In addition to the source from which standard samples may be obtained, references or pertinent notes for some samples are included. The number of rock reference samples is now almost adequate, and the variety of ore samples will soon be sufficient. There are very few samples for microprobe work. Oil shales will become more important because of the outlook for world petroleum resources. The dryland equivalent of a submarine basalt might be useful in studies of sea-floor spreading and of the geochemistry of basalts. The Na- and K-feldspars of BCS (British Chemical Standards-Bureau of Analysed Samples), NBS (National Bureau of Standards), and ANRT (Association Nationale de la Recherche Technique) could serve as trace-element standards if such data were available. Similarly, the present NBS flint and plastic clays, as well as their predecessors, might be useful for archaeological pottery studies. The International Decade for Ocean Exploration may stimulate the preparation of ocean-water standards for trace elements or pollutants and a standard for manganese nodules. © 1974.

  6. Large sample neutron activation analysis of a reference inhomogeneous sample

    International Nuclear Information System (INIS)

    Vasilopoulou, T.; Athens National Technical University, Athens; Tzika, F.; Stamatelatos, I.E.; Koster-Ammerlaan, M.J.J.

    2011-01-01

    A benchmark experiment was performed for Neutron Activation Analysis (NAA) of a large inhomogeneous sample. The reference sample was developed in-house and consisted of a SiO₂ matrix and an Al-Zn alloy 'inhomogeneity' body. Monte Carlo simulations were employed to derive appropriate correction factors for neutron self-shielding during irradiation as well as self-attenuation of gamma rays and sample geometry during counting. The large sample neutron activation analysis (LSNAA) results were compared against reference values and the trueness of the technique was evaluated. An agreement within ±10% was observed between LSNAA and reference elemental mass values, for all matrix and inhomogeneity elements except samarium, provided that the inhomogeneity body was fully simulated. However, in cases where the inhomogeneity was treated as not known, the results showed a reasonable agreement for most matrix elements, while large discrepancies were observed for the inhomogeneity elements. This study provided a quantification of the uncertainties associated with inhomogeneity in large sample analysis and contributed to the identification of the needs for future development of LSNAA facilities for analysis of inhomogeneous samples. (author)
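
The correction chain described above (Monte Carlo factors for neutron self-shielding, gamma self-attenuation and counting geometry, then comparison against reference masses) reduces to simple arithmetic once the factors are known. A hedged sketch with invented factor values; in the study the factors come from full simulations, not from anything this short.

```python
def corrected_mass(raw_mass, f_self_shielding, f_attenuation, f_geometry):
    """Apply Monte Carlo-derived correction factors (each <= 1 when the
    effect suppresses the detected signal) to a raw LSNAA mass estimate."""
    return raw_mass / (f_self_shielding * f_attenuation * f_geometry)

def relative_bias_percent(lsnaa_value, reference_value):
    """Signed deviation from the reference mass, in percent; the study
    above judged agreement within ±10% as satisfactory."""
    return 100.0 * (lsnaa_value - reference_value) / reference_value
```

For example, a raw estimate suppressed by 50% self-shielding is doubled by the correction before the trueness check.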

  7. Development of standard reference samples for diffractometry

    International Nuclear Information System (INIS)

    Galvao, Antonio de Sant'Ana

    2011-01-01

    In this work, samples of standard reference materials for diffractometry of polycrystals were developed. High-purity materials were submitted to mechanical and thermal treatments in order to attain the properties required of high-quality standard reference materials for powder diffraction, comparable to the internationally recognized materials produced by the US National Institute of Standards and Technology (NIST), but at lower cost. The characterization of the standard materials was performed by measurements with conventional X-ray diffractometers, high-resolution neutron diffraction and high-resolution synchrotron diffraction. The lattice parameters were calculated by extrapolation of the values obtained from each X-ray reflection against cos²θ by the least-squares method. The adjustments were compared to the values obtained by the Rietveld method, using the program GSAS. The materials thus obtained were α-alumina, yttrium oxide, silicon, cerium oxide, lanthanum hexaboride and lithium fluoride. The standard reference materials produced are of quality similar to or, in some cases, superior to the standard reference materials produced and commercialized by NIST. (author)
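
The extrapolation step described above can be sketched as an ordinary least-squares fit of the apparent lattice parameter against cos²θ: systematic aberrations shrink toward θ = 90°, so the intercept at cos²θ = 0 is taken as the refined lattice parameter. Stdlib Python; the linear error model in the test data is an illustrative assumption (real work would use a Nelson-Riley-type function and compare against a Rietveld refinement).

```python
import math

def extrapolate_lattice_parameter(two_thetas_deg, apparent_a):
    """Least-squares line a(θ) = a0 + k·cos²θ fitted over reflections;
    returns the intercept a0 at cos²θ = 0 (i.e. θ = 90°)."""
    xs = [math.cos(math.radians(tt / 2.0)) ** 2 for tt in two_thetas_deg]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(apparent_a) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, apparent_a))
    k = sxy / sxx          # slope: magnitude of the systematic drift
    return my - k * mx     # intercept: extrapolated lattice parameter a0
```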

  8. The Lyα Reference Sample

    DEFF Research Database (Denmark)

    Ostlin, Goran; Hayes, Matthew; Duval, Florent

    2014-01-01

    The Lyα Reference Sample (LARS) is a substantial program with the Hubble Space Telescope (HST) that provides a sample of local universe laboratory galaxies in which to study the detailed astrophysics of the visibility and strength of the Lyα line of neutral hydrogen. Lyα is the dominant spectral...... are produced (whether or not they escape), we demanded an Hα equivalent width W(Hα) ≥ 100 Å. The final sample of 14 galaxies covers far-UV (FUV, λ ~ 1500 Å) luminosities that overlap with those of high-z Lyα emitters (LAEs) and Lyman break galaxies (LBGs), making LARS a valid comparison sample. We present......) but strongly asymmetric Lyα emission. Spectroscopy from the Cosmic Origins Spectrograph on board HST centered on the brightest UV knot shows a moderate outflow in the neutral interstellar medium (probed by low ionization stage absorption features) and Lyα emission with an asymmetric profile. Radiative transfer...

  9. Use of reference samples for more accurate RBS analyses

    International Nuclear Information System (INIS)

    Lanford, W.A.; Pelicon, P.; Zorko, B.; Budnar, M.

    2002-01-01

    While one of the primary assets of RBS analysis is that it is quantitative without use of reference samples, for certain types of analyses the precision of the method can be improved by measuring RBS spectra of unknowns relative to the RBS spectra of a similar known sample. The advantage of such an approach is that one can reduce (or eliminate) the uncertainties that arise from error in the detector solid angle, beam current integration efficiency, scattering cross-section, and stopping powers. We have used this approach extensively to determine the composition (x) of homogeneous thin films of TaNx using as reference samples films of pure Ta. Our approach is to measure R = (Ta count)unknown/(Ta count)standard and use RUMP to determine the function x(R). Once the function x(R) has been determined, this approach makes it easy to analyze many samples quickly. Other analyses for which this approach has proved useful are determination of the composition (x) of WNx, SiOxHy and SiNxHy, using W, SiO₂ and amorphous Si as reference samples, respectively.
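
The ratio method can be sketched as a lookup: simulate (e.g., with RUMP) the count ratio R for a grid of compositions once, then interpolate x(R) for each measured unknown. The calibration numbers below are invented placeholders, not values from the paper.

```python
import bisect

# Hypothetical calibration table: simulated count ratios R (sorted
# ascending) and the compositions x they correspond to.
CAL_R = [0.50, 0.65, 0.80, 0.95]
CAL_X = [1.30, 1.10, 0.95, 0.85]

def composition_from_ratio(r, cal_r=CAL_R, cal_x=CAL_X):
    """Piecewise-linear interpolation of the calibration function x(R);
    ratios outside the table are clamped to its endpoints."""
    i = bisect.bisect_left(cal_r, r)
    if i == 0:
        return cal_x[0]
    if i == len(cal_r):
        return cal_x[-1]
    t = (r - cal_r[i - 1]) / (cal_r[i] - cal_r[i - 1])
    return cal_x[i - 1] + t * (cal_x[i] - cal_x[i - 1])
```

Once the table exists, each new sample costs only one ratio measurement and one interpolation, which is what makes the batch analysis quick.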

  10. A soil sampling reference site: The challenge in defining reference material for sampling

    International Nuclear Information System (INIS)

    De Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Fajgelj, Ales; Jacimovic, Radojko; Jeran, Zvonka; Sansone, Umberto; Perk, Marcel van der

    2008-01-01

    Within the framework of the international SOILSAMP project, funded and coordinated by the Italian Environmental Protection Agency, an agricultural area was established as a reference site suitable for performing soil sampling inter-comparison exercises. The reference site was characterized for trace element content in soil, in terms of the spatial and temporal variability of their mass fractions. Considering that the behaviour of long-lived radionuclides in soil can be expected to be similar to that of some stable trace elements, and that the distribution of these trace elements in soil can simulate the distribution of radionuclides, the reference site characterised in terms of trace elements can also be used to compare the soil sampling strategies developed for radionuclide investigations.

  11. A soil sampling reference site: The challenge in defining reference material for sampling

    Energy Technology Data Exchange (ETDEWEB)

    De Zorzi, Paolo [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, Rome 100-00128 (Italy)], E-mail: paolo.dezorzi@apat.it; Barbizzi, Sabrina; Belli, Maria [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, Rome 100-00128 (Italy); Fajgelj, Ales [International Atomic Energy Agency (IAEA), Agency' s Laboratories Seibersdorf, Vienna A-1400 (Austria); Jacimovic, Radojko; Jeran, Zvonka; Sansone, Umberto [Jozef Stefan Institute, Jamova 39, Ljubljana 1000 (Slovenia); Perk, Marcel van der [Department of Physical Geography, Utrecht University, P.O. Box 80115, TC Utrecht 3508 (Netherlands)

    2008-11-15

    Within the framework of the international SOILSAMP project, funded and coordinated by the Italian Environmental Protection Agency, an agricultural area was established as a reference site suitable for performing soil sampling inter-comparison exercises. The reference site was characterized for trace element content in soil, in terms of the spatial and temporal variability of their mass fractions. Considering that the behaviour of long-lived radionuclides in soil can be expected to be similar to that of some stable trace elements, and that the distribution of these trace elements in soil can simulate the distribution of radionuclides, the reference site characterised in terms of trace elements can also be used to compare the soil sampling strategies developed for radionuclide investigations.

  12. A soil sampling reference site: the challenge in defining reference material for sampling.

    Science.gov (United States)

    de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Fajgelj, Ales; Jacimovic, Radojko; Jeran, Zvonka; Sansone, Umberto; van der Perk, Marcel

    2008-11-01

    Within the framework of the international SOILSAMP project, funded and coordinated by the Italian Environmental Protection Agency, an agricultural area was established as a reference site suitable for performing soil sampling inter-comparison exercises. The reference site was characterized for trace element content in soil, in terms of the spatial and temporal variability of their mass fractions. Considering that the behaviour of long-lived radionuclides in soil can be expected to be similar to that of some stable trace elements, and that the distribution of these trace elements in soil can simulate the distribution of radionuclides, the reference site characterised in terms of trace elements can also be used to compare the soil sampling strategies developed for radionuclide investigations.

  13. Estimation of individual reference intervals in small sample sizes

    DEFF Research Database (Denmark)

    Hansen, Ase Marie; Garde, Anne Helene; Eller, Nanna Hurwitz

    2007-01-01

    In occupational health studies, the study groups most often comprise healthy subjects performing their work. Sampling is often planned in the most practical way, e.g., sampling of blood in the morning at the work site just after the work starts. Optimal use of reference intervals requires...... from various variables such as gender, age, BMI, alcohol, smoking, and menopause. The reference intervals were compared to reference intervals calculated using IFCC recommendations. Where comparable, the IFCC calculated reference intervals had a wider range compared to the variance component models...

  14. Reference Priors For Non-Normal Two-Sample Problems

    NARCIS (Netherlands)

    Fernández, C.; Steel, M.F.J.

    1997-01-01

    The reference prior algorithm (Berger and Bernardo, 1992) is applied to location-scale models with any regular sampling density. A number of two-sample problems are analyzed in this general context, extending the difference, ratio and product of Normal means problems outside Normality, while explicitly

  15. Phobos Sample Return: Next Approach

    Science.gov (United States)

    Zelenyi, Lev; Martynov, Maxim; Zakharov, Alexander; Korablev, Oleg; Ivanov, Alexey; Karabadzak, George

    The Martian moons still remain a mystery after numerous studies by Mars-orbiting spacecraft. Their study covers three major topics related to (1) the Solar system in general (formation and evolution, origin of planetary satellites, origin and evolution of life); (2) small bodies (captured asteroid, or remnants of Mars formation, or reaccreted Mars ejecta); (3) Mars (formation and evolution of Mars; Mars ejecta at the satellites). As reviewed by Galimov [2010], most of the above questions require sample return from a Martian moon, while some (e.g. the characterization of the organic matter) could also be answered by in situ experiments. There is the possibility to obtain a sample of Mars material by sampling Phobos: following Chappaz et al. [2012], a 200-g sample could contain 10⁻⁷ g of Mars surface material launched during the past 1 million years, or 5×10⁻⁵ g of Mars material launched during the past 10 million years, or 5×10¹⁰ individual particles from Mars, quantities suitable for accurate laboratory analyses. The studies of Phobos have been of high priority in the Russian program on planetary research for many years. The Phobos-88 mission consisted of two spacecraft (Phobos-1, Phobos-2) and aimed at approaching Phobos to within 50 m for remote studies, and also at the release of small landers (long-living DAS stations). This mission implemented its program only incompletely: it returned information about the Martian environment and atmosphere. The next project, Phobos Sample Return (Phobos-Grunt), initially planned for the early 2000s, was delayed several times owing to budget difficulties; the spacecraft failed to leave NEO in 2011. The recovery of the science goals of this mission and the delivery of samples of Phobos to Earth remain of the highest priority for the Russian scientific community. The next Phobos sample return mission, named Boomerang, was postponed following the ExoMars cooperation, but is considered the next in the line of planetary exploration, suitable for launch around 2022.

  16. Determination of toxic trace elements in body fluid reference samples

    International Nuclear Information System (INIS)

    Gills, T.E.; McClendon, L.T.; Maienthal, E.J.; Becker, D.A.; Durst, R.A.; LaFleur, P.D.

    1974-01-01

    The measurement of elemental concentrations in body fluids has been widely used to give an indication of exposure to certain toxic materials and/or a measure of body burden. To understand fully the toxicological effect of these trace elements on our physiological system, meaningful analytical data are required along with accurate standards or reference samples. The National Bureau of Standards has prepared for the National Institute for Occupational Safety and Health (NIOSH) a number of reference samples containing selected toxic trace elements in body fluids. The reference samples produced include mercury in urine at three concentration levels, five elements (Se, Cu, As, Ni and Cr) in freeze-dried urine at two levels, fluorine in freeze-dried urine at two levels, and lead in blood at two concentration levels. These reference samples have been found to be extremely useful for the evaluation of field and laboratory analytical methods for the analysis of toxic trace elements. In particular, the use of at least two calibration points (i.e., "normal" and "elevated" levels) for a given matrix provides a more positive calibration for most analytical techniques over the range of interest for occupational toxicological levels of exposure. (U.S.)
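
The two-level design mentioned at the end ("normal" and "elevated" reference points per matrix) is, at its simplest, a two-point linear calibration. A minimal sketch with invented concentrations and signals, purely for illustration:

```python
def two_point_calibration(c_normal, s_normal, c_elevated, s_elevated):
    """Line through the 'normal' (c1, s1) and 'elevated' (c2, s2)
    calibration points: signal = slope * concentration + intercept."""
    slope = (s_elevated - s_normal) / (c_elevated - c_normal)
    intercept = s_normal - slope * c_normal
    return slope, intercept

def concentration_from_signal(signal, slope, intercept):
    """Invert the calibration line for an unknown specimen."""
    return (signal - intercept) / slope
```

Anchoring the line at both ends of the occupational range is what makes the calibration "more positive" than a single-point check.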

  17. Estimation of reference intervals from small samples: an example using canine plasma creatinine.

    Science.gov (United States)

    Geffré, A; Braun, J P; Trumel, C; Concordet, D

    2009-12-01

    According to international recommendations, reference intervals should be determined from at least 120 reference individuals, which often are impossible to achieve in veterinary clinical pathology, especially for wild animals. When only a small number of reference subjects is available, the possible bias cannot be known and the normality of the distribution cannot be evaluated. A comparison of reference intervals estimated by different methods could be helpful. The purpose of this study was to compare reference limits determined from a large set of canine plasma creatinine reference values, and large subsets of this data, with estimates obtained from small samples selected randomly. Twenty sets each of 120 and 27 samples were randomly selected from a set of 1439 plasma creatinine results obtained from healthy dogs in another study. Reference intervals for the whole sample and for the large samples were determined by a nonparametric method. The estimated reference limits for the small samples were minimum and maximum, mean ± 2 SD of native and Box-Cox-transformed values, 2.5th and 97.5th percentiles by a robust method on native and Box-Cox-transformed values, and estimates from diagrams of cumulative distribution functions. The whole sample had a heavily skewed distribution, which approached Gaussian after Box-Cox transformation. The reference limits estimated from small samples were highly variable. The closest estimates to the 1439-result reference interval for 27-result subsamples were obtained by both parametric and robust methods after Box-Cox transformation but were grossly erroneous in some cases. For small samples, it is recommended that all values be reported graphically in a dot plot or histogram and that estimates of the reference limits be compared using different methods.
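
The estimators compared above (mean ± 2 SD, transformed-scale limits, nonparametric percentiles) are easy to state in code. A stdlib-Python sketch, using a log transform as a simple stand-in for the Box-Cox transform actually used in the study:

```python
import math
import statistics

def ref_limits_parametric(values):
    """mean ± 2 SD on the native scale (assumes near-Gaussian data)."""
    m = statistics.mean(values)
    s = statistics.stdev(values)
    return m - 2 * s, m + 2 * s

def ref_limits_log(values):
    """mean ± 2 SD after a log transform, back-transformed; a simple
    stand-in for the Box-Cox approach for right-skewed data."""
    logs = [math.log(v) for v in values]
    m = statistics.mean(logs)
    s = statistics.stdev(logs)
    return math.exp(m - 2 * s), math.exp(m + 2 * s)

def ref_limits_nonparametric(values):
    """2.5th and 97.5th percentiles; reliable only for large n,
    which is why it was reserved for the full and 120-result sets."""
    qs = statistics.quantiles(values, n=40, method='inclusive')
    return qs[0], qs[-1]
```

Comparing the three estimates on the same small sample, as the study recommends, exposes how sensitive the limits are to the method chosen.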

  18. Sample size of the reference sample in a case-augmented study.

    Science.gov (United States)

    Ghosh, Palash; Dewanji, Anup

    2017-05-01

    The case-augmented study, in which a case sample is augmented with a reference (random) sample from the source population with only covariates information known, is becoming popular in different areas of applied science such as pharmacovigilance, ecology, and econometrics. In general, the case sample is available from some source (for example, hospital database, case registry, etc.); however, the reference sample is required to be drawn from the corresponding source population. The required minimum size of the reference sample is an important issue in this regard. In this work, we address the minimum sample size calculation and discuss related issues. Copyright © 2017 John Wiley & Sons, Ltd.

  19. A structural SVM approach for reference parsing.

    Science.gov (United States)

    Zhang, Xiaoli; Zou, Jie; Le, Daniel X; Thoma, George R

    2011-06-09

    Automated extraction of bibliographic data, such as article titles, author names, abstracts, and references is essential to the affordable creation of large citation databases. References, typically appearing at the end of journal articles, can also provide valuable information for extracting other bibliographic data. Therefore, parsing individual references to extract author, title, journal, year, etc. is sometimes a necessary preprocessing step in building citation-indexing systems. The regular structure in references enables us to consider reference parsing a sequence learning problem and to study structural Support Vector Machine (structural SVM), a newly developed structured learning algorithm, for parsing references. In this study, we implemented structural SVM and used two types of contextual features to compare structural SVM with conventional SVM. Both methods achieve above 98% token classification accuracy and above 95% overall chunk-level accuracy for reference parsing. We also compared SVM and structural SVM to Conditional Random Field (CRF). The experimental results show that structural SVM and CRF achieve similar accuracies at token- and chunk-levels. When only basic observation features are used for each token, structural SVM achieves higher performance compared to SVM since it utilizes the contextual label features. However, when the contextual observation features from neighboring tokens are combined, SVM performance improves greatly, and is close to that of structural SVM after adding the second order contextual observation features. The comparison of these two methods with CRF using the same set of binary features shows that both structural SVM and CRF perform better than SVM, indicating their stronger sequence learning ability in reference parsing.
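
The contextual observation features discussed above (the token's own features plus those of neighbors one or two positions away) can be sketched as a feature-extraction function. The feature names are illustrative assumptions; the classifier that would consume these dictionaries (SVM, structural SVM, or CRF) is omitted.

```python
def token_features(tokens, i, order=1):
    """Observation features for token i plus contextual features from
    neighbors up to `order` positions away; order=2 corresponds to the
    'second order contextual observation features' in the abstract."""
    feats = {
        'tok': tokens[i].lower(),
        'is_digit': tokens[i].isdigit(),      # e.g. a year token
        'is_cap': tokens[i][:1].isupper(),    # e.g. a surname token
    }
    for d in range(1, order + 1):
        feats[f'prev{d}'] = tokens[i - d].lower() if i - d >= 0 else '<s>'
        feats[f'next{d}'] = tokens[i + d].lower() if i + d < len(tokens) else '</s>'
    return feats
```

Each reference string, tokenized, yields one such dictionary per token; labeling them AUTHOR/TITLE/JOURNAL/YEAR is then the sequence-learning task the abstract compares models on.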

  20. Determination of sampling constants in NBS geochemical standard reference materials

    International Nuclear Information System (INIS)

    Filby, R.H.; Bragg, A.E.; Grimm, C.A.

    1986-01-01

    Recently Filby et al. showed that, for several elements, National Bureau of Standards (NBS) Fly Ash standard reference material (SRM) 1633a was a suitable reference material for microanalysis. The sampling constant K_s = (S_s%)²·W̄, which relates the subsampling variance, (S_s%)², to the mean sample weight, W̄, could not be determined from these data because it was not possible to quantitate other sources of error in the experimental variances. K_s values for certified elements in geochemical SRMs provide important homogeneity information for microanalysis. For mineralogically homogeneous SRMs (i.e., small K_s values for associated elements) such as the proposed clays, it is necessary to determine K_s by analysis of very small sample aliquots to maximize the subsampling variance relative to other sources of error. This source of error and the blank correction for the sample container can be eliminated by determining K_s from radionuclide activities of weighed subsamples of a preirradiated SRM.
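
The sampling-constant relation K_s = (S_s%)²·W̄ can be computed directly from replicate subsample results once other error sources are negligible. A sketch with invented replicate data:

```python
import statistics

def sampling_constant(subsample_weights_mg, results):
    """Ingamells-style sampling constant K_s = (S_s%)^2 * W_mean, where
    S_s% is the relative standard deviation of replicate subsample
    results and W_mean the mean subsample weight. Assumes subsampling
    dominates the observed variance."""
    rsd_pct = 100.0 * statistics.stdev(results) / statistics.mean(results)
    return rsd_pct ** 2 * statistics.mean(subsample_weights_mg)
```

Small K_s for an element means small aliquots still subsample it representatively, which is exactly the homogeneity information microanalysts need.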

  1. Hafnium isotope ratios of nine GSJ reference samples

    International Nuclear Information System (INIS)

    Hanyu, Takeshi; Nakai, Shun'ichi; Tatsuta, Riichiro

    2005-01-01

    ¹⁷⁶Hf/¹⁷⁷Hf ratios of nine geochemical reference rocks from the Geological Survey of Japan, together with BIR-1 and BCR-2, were determined using multi-collector inductively coupled plasma mass spectrometry. Our data for BIR-1, BCR-2 and JB-1 are in agreement with those previously reported, demonstrating the appropriateness of the chemical procedure and isotopic measurement employed in this study. The reference rocks have a wide range of ¹⁷⁶Hf/¹⁷⁷Hf covering the field defined by various volcanic rocks, such as mid-ocean ridge basalts, ocean island basalts, and subduction-related volcanic rocks. They are therefore suitable as rock standards for Hf isotope measurement of geological samples. (author)

  2. Soil sampling strategies: Evaluation of different approaches

    International Nuclear Information System (INIS)

    De Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia

    2008-01-01

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies
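
The acceptance rule used in the intercomparison, a bias D below 2σ, reduces to a one-line check. A sketch, where σ is taken to be the uncertainty of the reference-site characterisation (an assumption; the abstract cites ISO 13258 for the exact calculation):

```python
def bias(sample_mean, reference_mean):
    """Bias D of a sampling strategy: deviation of its estimated mean
    mass fraction from the reference-site value."""
    return sample_mean - reference_mean

def bias_acceptable(sample_mean, reference_mean, sigma_ref):
    """Acceptance rule from the abstract: |D| must stay below 2σ."""
    return abs(bias(sample_mean, reference_mean)) < 2.0 * sigma_ref
```

Each of the 14 agencies' strategies is scored element by element with this check against the characterised site values.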

  3. Soil sampling strategies: Evaluation of different approaches

    Energy Technology Data Exchange (ETDEWEB)

    De Zorzi, Paolo [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, 100-00128 Roma (Italy)], E-mail: paolo.dezorzi@apat.it; Barbizzi, Sabrina; Belli, Maria [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Metrologia Ambientale, Via di Castel Romano, 100-00128 Roma (Italy); Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia [Agenzia Regionale per la Prevenzione e Protezione dell' Ambiente del Veneto, ARPA Veneto, U.O. Centro Qualita Dati, Via Spalato, 14-36045 Vicenza (Italy)

    2008-11-15

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies.

  4. Soil sampling strategies: evaluation of different approaches.

    Science.gov (United States)

    de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia

    2008-11-01

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site, previously characterised for metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns, type and number of samples collected, were used to assess the mean mass fraction values of some selected elements. The different strategies led in general to acceptable bias values (D) less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between different sampling strategies.

  5. Reference volumetric samples of gamma-spectroscopic sources

    International Nuclear Information System (INIS)

    Taskaev, E.; Taskaeva, M.; Grigorov, T.

    1993-01-01

    The purpose of this investigation is to determine the requirements for matrices of reference volumetric radiation sources necessary for detector calibration. The first stage of this determination consists in analysing some available organic and nonorganic materials. Different sorts of food, grass, plastics, minerals and building materials have been considered, taking into account the various procedures of their processing (grinding, screening, homogenizing) and their properties (hygroscopicity, storage life, resistance to oxidation during gamma sterilization). The procedures of source processing, sample preparation, matrix irradiation and homogenization have been determined. A rotation homogenizing device has been developed, enabling homogenization of the matrix activity irrespective of the vessel geometry. In total, 33 standard volumetric radioactive sources have been prepared: 14 on an organic matrix and 19 on a nonorganic matrix. (author)

  6. Characterisation of a reference site for quantifying uncertainties related to soil sampling

    International Nuclear Information System (INIS)

    Barbizzi, Sabrina; Zorzi, Paolo de; Belli, Maria; Pati, Alessandra; Sansone, Umberto; Stellato, Luisa; Barbina, Maria; Deluisa, Andrea; Menegon, Sandro; Coletti, Valter

    2004-01-01

    An integrated approach to quality assurance in soil sampling remains to be accomplished. The paper reports a methodology adopted to address problems related to quality assurance in soil sampling. The SOILSAMP project, funded by the Environmental Protection Agency of Italy (APAT), is aimed at (i) establishing protocols for soil sampling in different environments; (ii) assessing uncertainties associated with different soil sampling methods in order to select the 'fit-for-purpose' method; (iii) qualifying, in terms of trace element spatial variability, a reference site for national and international inter-comparison exercises. Preliminary results and considerations are illustrated.

  7. Identification and assembly of genomes and genetic elements in complex metagenomic samples without using reference genomes.

    Science.gov (United States)

    Nielsen, H Bjørn; Almeida, Mathieu; Juncker, Agnieszka Sierakowska; Rasmussen, Simon; Li, Junhua; Sunagawa, Shinichi; Plichta, Damian R; Gautier, Laurent; Pedersen, Anders G; Le Chatelier, Emmanuelle; Pelletier, Eric; Bonde, Ida; Nielsen, Trine; Manichanh, Chaysavanh; Arumugam, Manimozhiyan; Batto, Jean-Michel; Quintanilha Dos Santos, Marcelo B; Blom, Nikolaj; Borruel, Natalia; Burgdorf, Kristoffer S; Boumezbeur, Fouad; Casellas, Francesc; Doré, Joël; Dworzynski, Piotr; Guarner, Francisco; Hansen, Torben; Hildebrand, Falk; Kaas, Rolf S; Kennedy, Sean; Kristiansen, Karsten; Kultima, Jens Roat; Léonard, Pierre; Levenez, Florence; Lund, Ole; Moumen, Bouziane; Le Paslier, Denis; Pons, Nicolas; Pedersen, Oluf; Prifti, Edi; Qin, Junjie; Raes, Jeroen; Sørensen, Søren; Tap, Julien; Tims, Sebastian; Ussery, David W; Yamada, Takuji; Renault, Pierre; Sicheritz-Ponten, Thomas; Bork, Peer; Wang, Jun; Brunak, Søren; Ehrlich, S Dusko

    2014-08-01

    Most current approaches for analyzing metagenomic data rely on comparisons to reference genomes, but the microbial diversity of many environments extends far beyond what is covered by reference databases. De novo segregation of complex metagenomic data into specific biological entities, such as particular bacterial strains or viruses, remains a largely unsolved problem. Here we present a method, based on binning co-abundant genes across a series of metagenomic samples, that enables comprehensive discovery of new microbial organisms, viruses and co-inherited genetic entities and aids assembly of microbial genomes without the need for reference sequences. We demonstrate the method on data from 396 human gut microbiome samples and identify 7,381 co-abundance gene groups (CAGs), including 741 metagenomic species (MGS). We use these to assemble 238 high-quality microbial genomes and identify affiliations between MGS and hundreds of viruses or genetic entities. Our method provides the means for comprehensive profiling of the diversity within complex metagenomic samples.
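
    The co-abundance principle behind this method can be sketched in a few lines: genes belonging to the same genome rise and fall together across samples, so clustering genes by the correlation of their abundance profiles recovers candidate gene groups. The greedy seeding scheme and all gene names below are illustrative only, not the authors' actual algorithm:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length abundance profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    if sx == 0 or sy == 0:
        return 0.0
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def bin_by_coabundance(profiles, threshold=0.9):
    """Greedy canopy-style binning: seed a group with an unassigned gene,
    then pull in every gene whose abundance profile correlates with the
    seed above the threshold."""
    unassigned = set(profiles)
    groups = []
    for seed in sorted(profiles):
        if seed not in unassigned:
            continue
        group = [g for g in sorted(unassigned)
                 if pearson(profiles[seed], profiles[g]) >= threshold]
        for g in group:
            unassigned.discard(g)
        groups.append(group)
    return groups

# Toy data: geneA/geneB share one genome (scaled profiles), geneC does not.
profiles = {
    "geneA": [1.0, 4.0, 2.0, 8.0],
    "geneB": [2.0, 8.0, 4.0, 16.0],   # 2x geneA, correlation 1.0
    "geneC": [5.0, 1.0, 6.0, 0.5],
}
groups = bin_by_coabundance(profiles)
print(groups)
```

    The real method clusters millions of genes across hundreds of samples with far more robust statistics; this sketch only shows why co-abundance alone, with no reference genome, is enough to segregate genes into genome-like groups.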

  8. Statistical sampling approaches for soil monitoring

    NARCIS (Netherlands)

    Brus, D.J.

    2014-01-01

    This paper describes three statistical sampling approaches for regional soil monitoring, a design-based, a model-based and a hybrid approach. In the model-based approach a space-time model is exploited to predict global statistical parameters of interest such as the space-time mean. In the hybrid
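
    In the design-based approach mentioned above, inference rests on the randomization built into the sampling design itself. A minimal sketch, assuming simple random sampling without replacement: the sample mean estimates the regional mean, with a standard error carrying the finite-population correction. All numbers are invented for illustration:

```python
import math
import random
import statistics

def srs_estimate(population, n, seed=42):
    """Design-based estimate of the population mean from a simple random
    sample without replacement: sample mean plus its standard error with
    the finite-population correction (1 - n/N)."""
    rng = random.Random(seed)
    sample = rng.sample(population, n)
    mean = statistics.fmean(sample)
    var = statistics.variance(sample)          # sample variance s^2
    se = math.sqrt((1 - n / len(population)) * var / n)
    return mean, se

population = [float(v) for v in range(1, 101)]   # true mean 50.5
mean, se = srs_estimate(population, n=30)
print(round(mean, 2), round(se, 2))
```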

  9. Artificial radioactivity in the environmental samples as IAEA reference materials

    International Nuclear Information System (INIS)

    Salagean, M.; Pantelica, A.

    1998-01-01

    Uncontaminated by nuclear activities: IAEA-327, a podsolic soil collected in 1990 from the Moscow region and considered uncontaminated by radionuclides from the Chernobyl accident or by other nuclear activities. The results obtained by our laboratory are in good agreement with the certified IAEA data. Generally, the concentration of artificial radionuclides in the investigated samples is higher than that expected from the influence of global fallout in the intercomparison materials distributed before the Chernobyl accident. Concerning the nature of the investigated IAEA reference materials, very high concentration levels of cesium radionuclides were found, especially in the IAEA-373 (grass) and IAEA-375 (soil) samples collected in the vicinity of the Chernobyl Power Station after the nuclear accident in 1986. High levels of artificial radionuclide activity were also determined in samples collected in the neighbourhood of nuclear installations, especially in marine sediment (IAEA-135). It is of interest to point out the high concentration of cesium radionuclides in the IAEA-300 sediment collected in 1992 in the Baltic Sea in comparison with the IAEA-306 sediment collected there in 1986, which suggests an increase in the artificial radioactivity of the Baltic Sea by accumulation over time. Marine sediment is an important component of the marine ecosystem since it represents the final sink for any release of wastes into the sea. These certified radioactive materials are very useful to all laboratories engaged in radioactive pollution investigations of environmental samples. (authors)

  10. Validation of endogenous reference genes for qRT-PCR analysis of human visceral adipose samples.

    Science.gov (United States)

    Mehta, Rohini; Birerdinc, Aybike; Hossain, Noreen; Afendy, Arian; Chandhoke, Vikas; Younossi, Zobair; Baranova, Ancha

    2010-05-21

    Given the epidemic proportions of obesity worldwide and the concurrent prevalence of metabolic syndrome, there is an urgent need for better understanding the underlying mechanisms of metabolic syndrome, in particular, the gene expression differences which may participate in obesity, insulin resistance and the associated series of chronic liver conditions. Real-time PCR (qRT-PCR) is the standard method for studying changes in relative gene expression in different tissues and experimental conditions. However, variations in amount of starting material, enzymatic efficiency and presence of inhibitors can lead to quantification errors. Hence the need for accurate data normalization is vital. Among several known strategies for data normalization, the use of reference genes as an internal control is the most common approach. Recent studies have shown that both obesity and presence of insulin resistance influence the expression of commonly used reference genes in omental fat. In this study we validated candidate reference genes suitable for qRT-PCR profiling experiments using visceral adipose samples from obese and lean individuals. Cross-validation of expression stability of eight selected reference genes using three popular algorithms, GeNorm, NormFinder and BestKeeper, identified ACTB and RPII as the most stable reference genes. We recommend ACTB and RPII as stable reference genes most suitable for gene expression studies of human visceral adipose tissue. The use of these genes as a reference pair may further enhance the robustness of qRT-PCR in this model system.
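
    The stability idea behind tools such as GeNorm can be illustrated with its core statistic: for each candidate gene, take the standard deviation of the pairwise log2 expression ratios against every other candidate and average them. A truly stable gene keeps constant ratios against its peers and so earns a low M value. The toy expression matrix below is invented purely for illustration:

```python
import math
import statistics

def stability_M(expr, gene):
    """geNorm-style stability value M: the average, over all other
    candidate genes, of the standard deviation of the pairwise log2
    expression ratios across samples. Lower M means more stable."""
    spreads = []
    for other, values in expr.items():
        if other == gene:
            continue
        ratios = [math.log2(a / b) for a, b in zip(expr[gene], values)]
        spreads.append(statistics.stdev(ratios))
    return sum(spreads) / len(spreads)

# Toy matrix (gene -> per-sample expression); values are made up.
expr = {
    "ACTB":  [10.0, 11.0, 10.5, 10.2],
    "RPII":  [20.0, 22.0, 21.0, 20.4],  # tracks ACTB closely -> stable
    "GENE3": [10.0, 30.0,  5.0, 40.0],  # erratic -> unstable
}
m = {gene: stability_M(expr, gene) for gene in expr}
print({gene: round(v, 3) for gene, v in m.items()})
```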

  11. Validation of endogenous reference genes for qRT-PCR analysis of human visceral adipose samples

    Directory of Open Access Journals (Sweden)

    Afendy Arian

    2010-05-01

    Background: Given the epidemic proportions of obesity worldwide and the concurrent prevalence of metabolic syndrome, there is an urgent need for better understanding the underlying mechanisms of metabolic syndrome, in particular, the gene expression differences which may participate in obesity, insulin resistance and the associated series of chronic liver conditions. Real-time PCR (qRT-PCR) is the standard method for studying changes in relative gene expression in different tissues and experimental conditions. However, variations in amount of starting material, enzymatic efficiency and presence of inhibitors can lead to quantification errors. Hence the need for accurate data normalization is vital. Among several known strategies for data normalization, the use of reference genes as an internal control is the most common approach. Recent studies have shown that both obesity and presence of insulin resistance influence the expression of commonly used reference genes in omental fat. In this study we validated candidate reference genes suitable for qRT-PCR profiling experiments using visceral adipose samples from obese and lean individuals. Results: Cross-validation of expression stability of eight selected reference genes using three popular algorithms, GeNorm, NormFinder and BestKeeper, identified ACTB and RPII as the most stable reference genes. Conclusions: We recommend ACTB and RPII as stable reference genes most suitable for gene expression studies of human visceral adipose tissue. The use of these genes as a reference pair may further enhance the robustness of qRT-PCR in this model system.

  12. Adult health study reference papers. Selection of the sample. Characteristics of the sample

    Energy Technology Data Exchange (ETDEWEB)

    Beebe, G W; Fujisawa, Hideo; Yamasaki, Mitsuru

    1960-12-14

    The characteristics and selection of the clinical sample have been described in some detail to provide information on the comparability of the exposure groups with respect to factors excluded from the matching criteria and to provide basic descriptive information potentially relevant to individual studies that may be done within the framework of the Adult Health Study. The characteristics under review here are age, sex, many different aspects of residence, marital status, occupation and industry, details of location and shielding ATB, acute radiation signs and symptoms, and prior ABCC medical or pathology examinations. 5 references, 57 tables.

  13. Prediction of the Reference Evapotranspiration Using a Chaotic Approach

    Science.gov (United States)

    Wang, Wei-guang; Zou, Shan; Luo, Zhao-hui; Zhang, Wei; Kong, Jun

    2014-01-01

    Evapotranspiration is one of the most important hydrological variables in the context of water resources management. An attempt was made to understand and predict the dynamics of reference evapotranspiration from a nonlinear dynamical perspective in this study. The reference evapotranspiration data was calculated using the FAO Penman-Monteith equation with the observed daily meteorological data for the period 1966–2005 at four meteorological stations (i.e., Baotou, Zhangbei, Kaifeng, and Shaoguan) representing a wide range of climatic conditions of China. The correlation dimension method was employed to investigate the chaotic behavior of the reference evapotranspiration series. The existence of chaos in the reference evapotranspiration series at the four different locations was proved by the finite and low correlation dimension. A local approximation approach was employed to forecast the daily reference evapotranspiration series. Low root mean square error (RMSE) and mean absolute error (MAE) (for all locations lower than 0.31 and 0.24, resp.), high correlation coefficient (CC), and modified coefficient of efficiency (for all locations larger than 0.97 and 0.8, resp.) indicate that the predicted reference evapotranspiration agrees well with the observed one. The encouraging results indicate the suitability of the chaotic approach for understanding and predicting the dynamics of the reference evapotranspiration. PMID:25133221
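
    The correlation dimension method used above rests on the Grassberger-Procaccia correlation sum: the fraction of pairs of delay-embedded vectors closer than a radius r, whose log-log slope against r estimates the dimension. A minimal sketch, with a chaotic logistic map standing in for an evapotranspiration series (the embedding parameters and data are illustrative only):

```python
def correlation_sum(series, m, tau, r):
    """Correlation sum C(r): fraction of pairs of m-dimensional delay
    vectors (delay tau) closer than r in the maximum norm. The
    correlation dimension is the slope of log C(r) vs log r at small r."""
    vecs = [series[i:i + m * tau:tau]
            for i in range(len(series) - (m - 1) * tau)]
    n = len(vecs)
    close = sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        if max(abs(a - b) for a, b in zip(vecs[i], vecs[j])) < r
    )
    return 2.0 * close / (n * (n - 1))

# Toy series: the logistic map in its chaotic regime.
x, series = 0.4, []
for _ in range(500):
    x = 3.9 * x * (1 - x)
    series.append(x)

c_small = correlation_sum(series, m=2, tau=1, r=0.05)
c_big = correlation_sum(series, m=2, tau=1, r=0.2)
print(round(c_small, 4), round(c_big, 4))
```

    Evaluating C(r) over a range of radii and fitting the slope of log C(r) against log r yields the correlation dimension; a finite, low, non-integer value is the signature of chaos the paper relies on.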

  14. Prediction of the Reference Evapotranspiration Using a Chaotic Approach

    Directory of Open Access Journals (Sweden)

    Wei-guang Wang

    2014-01-01

    Evapotranspiration is one of the most important hydrological variables in the context of water resources management. An attempt was made to understand and predict the dynamics of reference evapotranspiration from a nonlinear dynamical perspective in this study. The reference evapotranspiration data was calculated using the FAO Penman-Monteith equation with the observed daily meteorological data for the period 1966–2005 at four meteorological stations (i.e., Baotou, Zhangbei, Kaifeng, and Shaoguan) representing a wide range of climatic conditions of China. The correlation dimension method was employed to investigate the chaotic behavior of the reference evapotranspiration series. The existence of chaos in the reference evapotranspiration series at the four different locations was proved by the finite and low correlation dimension. A local approximation approach was employed to forecast the daily reference evapotranspiration series. Low root mean square error (RMSE) and mean absolute error (MAE) (for all locations lower than 0.31 and 0.24, resp.), high correlation coefficient (CC), and modified coefficient of efficiency (for all locations larger than 0.97 and 0.8, resp.) indicate that the predicted reference evapotranspiration agrees well with the observed one. The encouraging results indicate the suitability of the chaotic approach for understanding and predicting the dynamics of the reference evapotranspiration.

  15. Reference Scenario Forecasting: A New Approach to Transport Project Assessment

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen; Skougaard, Britt Zoëga

    2010-01-01

    This paper presents a new approach to transport project assessment in terms of feasibility risk assessment and reference class forecasting. Normally, transport project assessment is based upon a cost-benefit approach where evaluation criteria such as net present values are obtained. Recent research...... construction cost estimates. Hereafter, a quantitative risk analysis is provided making use of Monte Carlo simulation. This stochastic approach facilitates random input parameters based upon reference class forecasting; hence, a parameter data fit has been performed in order to obtain validated probability...... forecasting (RSF) frame. The RSF is anchored in the cost-benefit analysis (CBA); thus, it provides decision-makers with a quantitative means of assessing the transport infrastructure project. First, the RSF method introduces uncertainties within the CBA by applying Optimism Bias uplifts on the preliminary...
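
    The quantitative step described above, uplifting the preliminary cost estimate by an Optimism Bias factor and then running a Monte Carlo simulation on the corrected figure, can be sketched in a toy form. The Gaussian cost distribution, the 15% spread and all project figures below are assumptions for illustration, not values from the RSF method:

```python
import random
import statistics

def simulate_npv(benefit, cost_estimate, uplift, n_draws=20000, seed=1):
    """Monte Carlo sketch of a feasibility risk assessment: the
    preliminary construction cost is raised by a reference-class
    'optimism bias' uplift, then drawn from a distribution centred on
    that corrected value; NPV = benefit - simulated cost."""
    rng = random.Random(seed)
    corrected = cost_estimate * (1 + uplift)
    npvs = [benefit - rng.gauss(corrected, 0.15 * corrected)
            for _ in range(n_draws)]
    p_negative = sum(1 for v in npvs if v < 0) / n_draws
    return statistics.fmean(npvs), p_negative

# Hypothetical project: benefit 130, estimated cost 100, 20% uplift.
mean_npv, p_loss = simulate_npv(130.0, 100.0, 0.20)
print(round(mean_npv, 1), round(p_loss, 3))
```

    Rather than a single deterministic net present value, the decision-maker receives a distribution and a probability of a negative outcome, which is the point of the feasibility risk assessment.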

  16. An integrated approach for multi-level sample size determination

    International Nuclear Information System (INIS)

    Lu, M.S.; Teichmann, T.; Sanborn, J.B.

    1997-01-01

    Inspection procedures involving the sampling of items in a population often require steps of increasingly sensitive measurements, with correspondingly smaller sample sizes; these are referred to as multilevel sampling schemes. In the case of nuclear safeguards inspections verifying that there has been no diversion of Special Nuclear Material (SNM), these procedures have been examined often and increasingly complex algorithms have been developed to implement them. The aim in this paper is to provide an integrated approach, and, in so doing, to describe a systematic, consistent method that proceeds logically from level to level with increasing accuracy. The authors emphasize that the methods discussed are generally consistent with those presented in the references mentioned, and yield comparable results when the error models are the same. However, because of its systematic, integrated approach the proposed method elucidates the conceptual understanding of what goes on, and, in many cases, simplifies the calculations. In nuclear safeguards inspections, an important aspect of verifying nuclear items to detect any possible diversion of nuclear fissile materials is the sampling of such items at various levels of sensitivity. The first step usually is sampling by "attributes", involving measurements of relatively low accuracy, followed by further levels of sampling involving greater accuracy. This process is discussed in some detail in the references given; also, the nomenclature is described. Here, the authors outline a coordinated step-by-step procedure for achieving such multilevel sampling, and they develop the relationships between the accuracy of measurement and the sample size required at each stage, i.e., at the various levels. The logic of the underlying procedures is carefully elucidated; the calculations involved and their implications are clearly described, and the process is put in a form that allows systematic generalization.
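
    The attribute-sampling level can be made concrete with the standard hypergeometric detection calculation: the probability that a random sample of n items catches at least one of M diverted items in a stratum of N, and the smallest n reaching a target detection probability. The stratum size and diversion scenario below are hypothetical, not taken from the paper:

```python
from math import comb

def detection_probability(N, M, n):
    """Probability that a random sample of n items out of N contains at
    least one of the M defective (diverted) items."""
    if n > N - M:
        return 1.0
    return 1.0 - comb(N - M, n) / comb(N, n)

def attribute_sample_size(N, M, target=0.95):
    """Smallest attribute-sampling size whose detection probability
    meets the target."""
    for n in range(1, N + 1):
        if detection_probability(N, M, n) >= target:
            return n
    return N

# Hypothetical stratum: 100 items, diversion scenario touching 10 of them.
n = attribute_sample_size(100, 10, target=0.95)
print(n, round(detection_probability(100, 10, n), 3))
```

    The multilevel scheme repeats this trade-off at each measurement level, with smaller samples as the measurements become more sensitive.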

  17. Serum free light chain reference values: a critical approach.

    Science.gov (United States)

    Altinier, Sara; Seguso, Mara; Zaninotto, Martina; Varagnolo, Mariacristina; Adami, Fausto; Angeli, Paolo; Plebani, Mario

    2013-05-01

    The clinical usefulness of serum free light chain (FLC) measurement in the management of patients with plasma cell proliferative disorders has been reported in several papers, and most clinical studies use the reference ranges declared by the manufacturer. The aim of the present study was to evaluate the reproducibility of the FLC immunoassay and to validate the reference range before introducing it in the routine setting. Internal quality control materials and a pool of fresh serum samples were used to evaluate imprecision; 162 fresh sera from healthy blood donors were analyzed to evaluate the reference range for FLCs. In order to verify the κ/λ FLC ratio, 43 sera from patients with polyclonal hypergammaglobulinemia were tested. The FLC immunoassay was performed using a nephelometer with the Freelite reagents. The imprecision studies performed using a serum pool tested with two different lots of reagents showed a mean CV of 16.09% for κFLC and of 16.72% for λFLC. Lower CVs and different mean values were found by calculating the results from each specific lot separately, while different results were obtained using the control materials provided by the manufacturer. In reference subjects, the 2.5-97.5th percentiles were found to be 4.52-22.33 and 4.84-21.88 mg/L for κFLC and λFLC, respectively. The range for the κ/λ ratio (0.65-2.36) was validated with the values obtained from subjects with polyclonal hypergammaglobulinemia. In retesting 15 samples from blood donor subjects with a different lot of reagents, mean percentage biases of 17.60% for κFLC and 15.26% for λFLC were obtained. These findings confirm the lot-to-lot variability of the FLC assays also in the measurement of polyclonal light chains, as well as the need to carefully validate the reference values. Published by Elsevier Inc.
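
    The 2.5-97.5th percentile ranges quoted above follow the usual nonparametric reference-interval convention, which is easy to sketch. The simulated κFLC values below are invented for illustration, not the study's data:

```python
import random
import statistics

def reference_interval(values, low=0.025, high=0.975):
    """Nonparametric reference interval: the 2.5th and 97.5th
    percentiles of the healthy-donor measurements."""
    cuts = statistics.quantiles(values, n=1000, method="inclusive")
    return cuts[int(low * 1000) - 1], cuts[int(high * 1000) - 1]

# Toy cohort of 162 'healthy donor' values, roughly lognormal around
# 10 mg/L (purely illustrative).
rng = random.Random(0)
kappa = [rng.lognormvariate(2.3, 0.35) for _ in range(162)]
lo, hi = reference_interval(kappa)
print(round(lo, 2), round(hi, 2))
```

    With only 162 donors, the extreme percentiles themselves carry sampling uncertainty, which is one reason the paper stresses validating manufacturer ranges locally.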

  18. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  19. Blending the Trends: A Holistic Approach to Reference Services

    Science.gov (United States)

    Dempsey, Megan

    2011-01-01

    The growing trends of tiered reference, roving librarians, and virtual reference offer academic libraries several options for providing the most effective reference service. Increased enrollment at community colleges has prompted a reconsideration of how librarians can balance reference, teaching, and faculty responsibilities. This article…

  20. A novel SNP analysis method to detect copy number alterations with an unbiased reference signal directly from tumor samples

    Directory of Open Access Journals (Sweden)

    LaFramboise William A

    2011-01-01

    Background: Genomic instability in cancer leads to abnormal genome copy number alterations (CNA) as a mechanism underlying tumorigenesis. Using microarrays and other technologies, tumor CNA are detected by comparing tumor sample CN to normal reference sample CN. While advances in microarray technology have improved detection of copy number alterations, the increase in the number of measured signals, noise from array probes, variations in signal-to-noise ratio across batches and disparity across laboratories lead to significant limitations for the accurate identification of CNA regions when comparing tumor and normal samples. Methods: To address these limitations, we designed a novel "Virtual Normal" algorithm (VN), which allowed for construction of an unbiased reference signal directly from test samples within an experiment using any publicly available normal reference set as a baseline, thus eliminating the need for an in-lab normal reference set. Results: The algorithm was tested using an optimal, paired tumor/normal data set as well as previously uncharacterized pediatric malignant gliomas for which a normal reference set was not available. Using Affymetrix 250K Sty microarrays, we demonstrated improved signal-to-noise ratio and detected significant copy number alterations using the VN algorithm that were validated by independent PCR analysis of the target CNA regions. Conclusions: We developed and validated an algorithm to provide a virtual normal reference signal directly from tumor samples and minimize noise in the derivation of the raw CN signal. The algorithm reduces the variability of assays performed across different reagent and array batches, methods of sample preservation, multiple personnel, and among different laboratories. This approach may be valuable when matched normal samples are unavailable or the paired normal specimens have been subjected to variations in methods of preservation.
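
    The core of the virtual-normal idea, deriving the reference signal from the test cohort itself, can be sketched by taking the per-probe median across samples as the baseline and reporting each sample's log2 ratio against it. This is a deliberate simplification of the published VN algorithm, which additionally anchors the baseline to a public normal reference set:

```python
import math
import statistics

def virtual_normal_log2(samples):
    """Median-baseline sketch of the 'virtual normal' idea: for each
    probe, the median signal across all test samples serves as the
    reference; each sample is reported as log2(signal / reference)."""
    n_probes = len(next(iter(samples.values())))
    ref = [statistics.median(v[p] for v in samples.values())
           for p in range(n_probes)]
    return {name: [math.log2(v[p] / ref[p]) for p in range(n_probes)]
            for name, v in samples.items()}

# Toy cohort: probes 2-3 are duplicated (CN 4 vs diploid 2) in tumor1 only,
# so the cross-sample median still reflects the diploid state.
samples = {
    "tumor1": [2.0, 2.0, 4.0, 4.0, 2.0],
    "tumor2": [2.0, 2.0, 2.0, 2.0, 2.0],
    "tumor3": [2.0, 2.0, 2.0, 2.0, 2.0],
}
ratios = virtual_normal_log2(samples)
print([round(r, 2) for r in ratios["tumor1"]])
```

    The sketch also shows the method's implicit assumption: an alteration shared by most of the cohort would leak into the median baseline, which is why the published algorithm still leans on external normal data.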

  1. Precision and Accuracy of k0-NAA Method for Analysis of Multi Elements in Reference Samples

    International Nuclear Information System (INIS)

    Sri-Wardani

    2004-01-01

    The accuracy and precision of the k0-NAA method were determined in the analysis of multiple elements contained in reference samples. The results for multiple elements in the SRM 1633b sample were obtained with a bias of up to 20%, but with good accuracy and precision. The results for As, Cd and Zn in the CCQM-P29 rice flour sample were very good, with biases of 0.5-5.6%. (author)

  2. A hybrid reference-guided de novo assembly approach for generating Cyclospora mitochondrion genomes.

    Science.gov (United States)

    Gopinath, G R; Cinar, H N; Murphy, H R; Durigan, M; Almeria, M; Tall, B D; DaSilva, A J

    2018-01-01

    Cyclospora cayetanensis is a coccidian parasite associated with large and complex foodborne outbreaks worldwide. Linking samples from cyclosporiasis patients during foodborne outbreaks with suspected contaminated food sources, using conventional epidemiological methods, has been a persistent challenge. To address this issue, development of new methods based on potential genomically-derived markers for strain-level identification has been a priority for the food safety research community. The absence of reference genomes to identify nucleotide and structural variants with a high degree of confidence has limited the application of using sequencing data for source tracking during outbreak investigations. In this work, we determined the quality of a high resolution, curated, public mitochondrial genome assembly to be used as a reference genome by applying bioinformatic analyses. Using this reference genome, three new mitochondrial genome assemblies were built starting with metagenomic reads generated by sequencing DNA extracted from oocysts present in stool samples from cyclosporiasis patients. Nucleotide variants were identified in the new and other publicly available genomes in comparison with the mitochondrial reference genome. A consolidated workflow, presented here, to generate new mitochondrion genomes using our reference-guided de novo assembly approach could be useful in facilitating the generation of other mitochondrion sequences, and in their application for subtyping C. cayetanensis strains during foodborne outbreak investigations.
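
    The reference-guided step of such an assembly can be illustrated with a toy pileup consensus: reads aligned to reference coordinates vote base-by-base, with the reference filling uncovered positions. This is a sketch of the general idea only, not the authors' workflow, and the sequences are invented:

```python
from collections import Counter

def consensus(reference, aligned_reads):
    """Reference-guided consensus sketch: each read is (start, sequence)
    in reference coordinates; at every position the most common base
    among covering reads wins, falling back to the reference base where
    there is no coverage."""
    piles = [Counter() for _ in reference]
    for start, seq in aligned_reads:
        for offset, base in enumerate(seq):
            piles[start + offset][base] += 1
    return "".join(pile.most_common(1)[0][0] if pile else ref_base
                   for pile, ref_base in zip(piles, reference))

# Toy reference and reads; three reads support an A at position 3,
# calling a variant against the reference T.
ref = "ACGTACGT"
reads = [(0, "ACGA"), (2, "GAAC"), (4, "ACGT"), (1, "CGA")]
print(consensus(ref, reads))
```

    Real pipelines add quality-aware base calling, indel handling and de novo assembly of regions that diverge from the reference, but the per-position voting above is the kernel of the reference-guided part.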

  3. A review of airborne particle sampling with special reference to long-lived radioactive dust

    International Nuclear Information System (INIS)

    Bigu, J.

    1990-03-01

    This report reviews some basic aspects related to the sampling of airborne particles with special reference to Long-Lived Radioactive Dust (LLRD). The report covers a number of areas of practical interest, such as the production of aerosols, the dynamics of suspended particles, the physical and chemical characteristics and properties of dust clouds, and the inhalation and measurement of dust. This is followed by a brief review of dust sampling instrumentation and a short account of the work done on LLRD in Canada, with a few references to work done outside the country. (34 figs., 7 tabs., 117 refs.)

  4. A Hybrid Approach to Proving Memory Reference Monotonicity

    KAUST Repository

    Oancea, Cosmin E.

    2013-01-01

    Array references indexed by non-linear expressions or subscript arrays represent a major obstacle to compiler analysis and to automatic parallelization. Most previously proposed solutions either enhance the static analysis repertoire to recognize more patterns, to infer array-value properties, and to refine the mathematical support, or apply expensive run time analysis of memory reference traces to disambiguate these accesses. This paper presents an automated solution based on static construction of access summaries, in which the reference non-linearity problem can be solved for a large number of reference patterns by extracting arbitrarily-shaped predicates that can (in)validate the reference monotonicity property and thus (dis)prove loop independence. Experiments on six benchmarks show that our general technique for dynamic validation of the monotonicity property can cover a large class of codes, incurs minimal run-time overhead and obtains good speedups. © 2013 Springer-Verlag.
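
    The monotonicity property at the heart of the approach is simple to state: if the subscript sequence of an indirectly-indexed reference is strictly monotonic, every iteration touches a distinct element, so the loop carries no dependence through that reference. A toy check (the paper itself operates at the compiler and run-time level, not in Python):

```python
def strictly_monotonic(subscripts):
    """True if the subscript sequence is strictly increasing or strictly
    decreasing, i.e. all accessed elements are pairwise distinct and the
    loop is independent with respect to this reference."""
    pairs = list(zip(subscripts, subscripts[1:]))
    increasing = all(a < b for a, b in pairs)
    decreasing = all(a > b for a, b in pairs)
    return increasing or decreasing

# Hypothetical run-time summaries of two indirect access patterns,
# e.g. the index arrays of A[idx[i]] in two different loops.
print(strictly_monotonic([3, 5, 8, 13]))    # distinct elements: parallelize
print(strictly_monotonic([3, 5, 5, 13]))    # repeated index: possible dependence
```

    The paper's contribution is proving or cheaply validating this property from statically constructed access summaries instead of inspecting full reference traces as above.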

  5. Evaluation of Skin Surface as an Alternative Source of Reference DNA Samples: A Pilot Study.

    Science.gov (United States)

    Albujja, Mohammed H; Bin Dukhyil, Abdul Aziz; Chaudhary, Abdul Rauf; Kassab, Ahmed Ch; Refaat, Ahmed M; Babu, Saranya Ramesh; Okla, Mohammad K; Kumar, Sachil

    2018-01-01

    An acceptable area for collecting a DNA reference sample is part of the development of forensic DNA analysis. The aim of this study was to evaluate skin surface cells (SSC) as an alternative source of reference DNA samples. From each volunteer (n = 10), six samples from skin surface areas (forearm and fingertips) and two traditional samples (blood and buccal cells) were collected. Genomic DNA was extracted, quantified and genotyped using standard techniques. The highest DNA concentration among SSC samples was obtained using the tape/forearm method of collection (2.1 ng/μL). Cotton swabs moistened with ethanol yielded higher quantities of DNA than swabs moistened with salicylic acid, and gave the highest percentage of full STR profiles (97%). This study supports the use of SSC as a noninvasive sampling technique and an extremely useful source of DNA reference samples among certain cultures where the use of buccal swabs can be considered socially unacceptable. © 2017 American Academy of Forensic Sciences.

  6. Optimizing sampling approaches along ecological gradients

    DEFF Research Database (Denmark)

    Schweiger, Andreas; Irl, Severin D. H.; Steinbauer, Manuel

    2016-01-01

    1. Natural scientists and especially ecologists use manipulative experiments or field observations along gradients to differentiate patterns driven by processes from those caused by random noise. A well-conceived sampling design is essential for identifying, analysing and reporting underlying...... patterns in a statistically solid and reproducible manner, given the normal restrictions in labour, time and money. However, a technical guideline about an adequate sampling design to maximize prediction success under restricted resources is lacking. This study aims at developing such a solid...... and reproducible guideline for sampling along gradients in all fields of ecology and science in general. 2. We conducted simulations with artificial data for five common response types known in ecology, each represented by a simple function (no response, linear, exponential, symmetric unimodal and asymmetric...

  7. Influence of secular trends and sample size on reference equations for lung function tests.

    Science.gov (United States)

    Quanjer, P H; Stocks, J; Cole, T J; Hall, G L; Stanojevic, S

    2011-03-01

    The aim of our study was to determine the contribution of secular trends and sample size to lung function reference equations, and establish the number of local subjects required to validate published reference values. 30 spirometry datasets collected between 1978 and 2009 provided data on healthy, white subjects: 19,291 males and 23,741 females aged 2.5-95 yrs. The best fit for forced expiratory volume in 1 s (FEV(1)), forced vital capacity (FVC) and FEV(1)/FVC as functions of age, height and sex were derived from the entire dataset using GAMLSS. Mean z-scores were calculated for individual datasets to determine inter-centre differences. This was repeated by subdividing one large dataset (3,683 males and 4,759 females) into 36 smaller subsets (comprising 18-227 individuals) to preclude differences due to population/technique. No secular trends were observed and differences between datasets comprising >1,000 subjects were small (maximum difference in FEV(1) and FVC from the overall mean: 0.30 to -0.22 z-scores). Subdividing one large dataset into smaller subsets reproduced the above sample size-related differences and revealed that at least 150 males and 150 females would be necessary to validate reference values to avoid spurious differences due to sampling error. Use of local controls to validate reference equations will rarely be practical due to the numbers required. Reference equations derived from large or collated datasets are recommended.
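
    The sample-size effect reported above, where small local validation samples produce spurious z-score offsets, can be reproduced in a toy simulation: mean z-scores of simulated "local" measurements against a known reference mean and SD scatter far more for n = 20 than for n = 150. All numbers below are invented for illustration:

```python
import random
import statistics

def mean_zscore(values, pred_mean, pred_sd):
    """Mean z-score of local measurements against a published reference
    equation; a value near 0 supports the reference."""
    return statistics.fmean((v - pred_mean) / pred_sd for v in values)

rng = random.Random(7)

def draw(n):
    """Hypothetical healthy FEV1 values: true mean 4.0 L, SD 0.5 L."""
    return [rng.gauss(4.0, 0.5) for _ in range(n)]

# Scatter of the mean z-score over 200 repeated 'local validations'.
spread_small = statistics.stdev(mean_zscore(draw(20), 4.0, 0.5)
                                for _ in range(200))
spread_large = statistics.stdev(mean_zscore(draw(150), 4.0, 0.5)
                                for _ in range(200))
print(round(spread_small, 3), round(spread_large, 3))
```

    The scatter shrinks roughly as 1/sqrt(n), which is why the authors arrive at a minimum of about 150 subjects per sex before an apparent offset from the reference equations becomes meaningful.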

  8. Determination of element concentrations in biological reference materials by solid sampling and other analytical methods

    International Nuclear Information System (INIS)

    Schauenburg, H.; Weigert, P.

    1992-01-01

    Using solid sampling with graphite furnace atomic absorption spectrometry (GFAAS), values for cadmium, copper, lead and zinc in six biological reference materials were obtained from up to four laboratories participating in three collaborative studies. These results are compared with those obtained with other methods used in routine analysis from laboratories of official food control. Under certain conditions solid sampling with GFAAS seems to be suitable for routine analysis as well as conventional methods. (orig.)

  9. A Hybrid Approach to Proving Memory Reference Monotonicity

    KAUST Repository

    Oancea, Cosmin E.; Rauchwerger, Lawrence

    2013-01-01

    Array references indexed by non-linear expressions or subscript arrays represent a major obstacle to compiler analysis and to automatic parallelization. Most previously proposed solutions either enhance the static analysis repertoire to recognize more

  10. Efficient approach for reliability-based optimization based on weighted importance sampling approach

    International Nuclear Information System (INIS)

    Yuan, Xiukai; Lu, Zhenzhou

    2014-01-01

    An efficient methodology is presented to perform the reliability-based optimization (RBO). It is based on an efficient weighted approach for constructing an approximation of the failure probability as an explicit function of the design variables, which is referred to as the 'failure probability function (FPF)'. It expresses the FPF as a weighted sum of sample values obtained in the simulation-based reliability analysis. The required computational effort for decoupling in each iteration is just a single reliability analysis. After the approximation of the FPF is established, the target RBO problem can be decoupled into a deterministic one. Meanwhile, the proposed weighted approach is combined with a decoupling approach and a sequential approximate optimization framework. Engineering examples are given to demonstrate the efficiency and accuracy of the presented methodology
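
    The FPF construction can be sketched for a one-dimensional toy problem: draw one batch of samples at a design point mu0, then reuse that same batch to estimate the failure probability at any other design value via importance-sampling weights. The limit state, the Gaussian design variable and all numbers are assumptions for illustration, not the paper's formulation:

```python
import math
import random

def pdf(x, mu):
    """Standard-deviation-1 Gaussian density centred at mu."""
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def g(x):
    """Hypothetical limit state: failure when x > 3, i.e. g(x) < 0."""
    return 3.0 - x

def failure_probability_function(samples, mu0):
    """Weighted sketch of the FPF: the batch drawn at mu0 is reweighted
    to estimate P_f(mu) = mean of I[g(x) < 0] * f_mu(x) / f_mu0(x),
    so every design value costs no new reliability analysis."""
    fails = [x for x in samples if g(x) < 0]
    n = len(samples)
    return lambda mu: sum(pdf(x, mu) / pdf(x, mu0) for x in fails) / n

rng = random.Random(3)
mu0 = 2.0                      # sample once, near the failure region
samples = [rng.gauss(mu0, 1.0) for _ in range(200000)]
pf = failure_probability_function(samples, mu0)

# Closed-form check: P(N(mu,1) > 3).
exact = lambda mu: 0.5 * math.erfc((3.0 - mu) / math.sqrt(2))
print(round(pf(1.0), 4), round(exact(1.0), 4))
```

    Once the FPF is available as an explicit (here, reweighted) function of the design variable, the reliability constraint can be handed to a deterministic optimizer, which is the decoupling the paper exploits.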

  11. Identification and assembly of genomes and genetic elements in complex metagenomic samples without using reference genomes

    DEFF Research Database (Denmark)

    Nielsen, Henrik Bjørn; Almeida, Mathieu; Juncker, Agnieszka

    2014-01-01

    of microbial genomes without the need for reference sequences. We demonstrate the method on data from 396 human gut microbiome samples and identify 7,381 co-abundance gene groups (CAGs), including 741 metagenomic species (MGS). We use these to assemble 238 high-quality microbial genomes and identify...

  12. Parent-Adolescent Cross-Informant Agreement in Clinically Referred Samples

    DEFF Research Database (Denmark)

    Rescorla, Leslie A; Ewing, Grace; Ivanova, Masha Y

    2017-01-01

    To conduct international comparisons of parent-adolescent cross-informant agreement in clinical samples, we analyzed ratings on the Child Behavior Checklist (CBCL) and Youth Self-Report (YSR) for 6,762 clinically referred adolescents ages 11-18 from 7 societies (M = 14.5 years, SD = 2.0 years; 51...

  13. A Generalized Approach to Forensic Dye Identification: Development and Utility of Reference Libraries.

    Science.gov (United States)

    Groves, Ethan; Palenik, Skip; Palenik, Christopher S

    2018-04-18

While color is arguably the most important optical property of evidential fibers, the actual dyestuffs responsible for its expression in them are, in forensic trace evidence examinations, rarely analyzed and still less often identified. This is due, primarily, to the exceedingly small quantities of dye present in a single fiber as well as to the fact that dye identification is a challenging analytical problem, even when large quantities are available for analysis. Among the practical reasons for this are the wide range of dyestuffs available (and the even larger number of trade names), the low total concentration of dyes in the finished product, the limited amount of sample typically available for analysis in forensic cases, and the complexity of the dye mixtures that may exist within a single fiber. Literature on the topic of dye analysis is often limited to a specific method, subset of dyestuffs, or an approach that is not applicable given the constraints of a forensic analysis. Here, we present a generalized approach to dye identification that (1) combines several robust analytical methods, (2) is broadly applicable to a wide range of dye chemistries, application classes, and fiber types, and (3) can be scaled down to forensic casework-sized samples. The approach is based on the development of a reference collection of 300 commercially relevant textile dyes that have been characterized by a variety of microanalytical methods (HPTLC, Raman microspectroscopy, infrared microspectroscopy, UV-Vis spectroscopy, and visible microspectrophotometry). Although there is no single approach that is applicable to all dyes on every type of fiber, a combination of these analytical methods has been applied using a reproducible approach that permits the use of reference libraries to constrain the identity of and, in many cases, identify the dye (or dyes) present in a textile fiber sample.

  14. Evaluation of Botanical Reference Materials for the Determination of Vanadium in Biological Samples

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Damsgaard, Else

    1982-01-01

    Three botanical reference materials prepared by the National Bureau of Standards have been studied by neutron activation analysis to evaluate their suitability with respect to the determination of vanadium in biological samples. Various decomposition methods were applied in connection with chemic....... A reference value of 1.15 mg/kg of this material is recommended, based on results from 3 different methods. All three materials are preferable to SRM 1571 Orchard Leaves, while Bowen's Kale remains the material of choice because of its lower concentration....

  15. Systematic approach for the identification of process reference models

    CSIR Research Space (South Africa)

    Van Der Merwe, A

    2009-02-01

    Full Text Available and make it economically viable. In the identification of core elements within the process reference model, the focus is often on the end-product and not on the procedure used to identify the elements. As often proved in development of projects, there is a...

  16. An approach to an acute emotional stress reference scale.

    Science.gov (United States)

    Garzon-Rey, J M; Arza, A; de-la-Camara, C; Lobo, A; Armario, A; Aguilo, J

    2017-06-16

The clinical diagnosis aims to identify the degree of affectation of the psycho-physical state of the patient as a guide to therapeutic intervention. In stress, the lack of a measurement tool based on a reference makes it difficult to assess this degree quantitatively. The aim was to define and perform a primary assessment of a standard reference for measuring acute emotional stress from the markers identified as indicators of its degree. Psychometric tests and biochemical variables are, in general, the stress measurements most accepted by the scientific community. Each of them probably responds to different and complementary processes related to the reaction to a stress stimulus. The proposed reference is a weighted mean of these indicators, with relative weights assigned in accordance with a principal components analysis. An experimental study was conducted on 40 healthy young people subjected to the psychosocial stress stimulus of the Trier Social Stress Test in order to perform a primary assessment and consistency check of the proposed reference. The proposed scale clearly differentiates between the induced relaxed and stressed states. Accepting the subjectivity of the definition and the lack of subsequent validation with new experimental data, the proposed standard differentiates between a relaxed state and an emotional stress state triggered by a moderate stress stimulus such as the Trier Social Stress Test. The scale is robust: variations in the percentage composition slightly affect the score but do not affect the valid differentiation between states.

  17. Evaluating Electronic Reference Services: Issues, Approaches and Criteria.

    Science.gov (United States)

    Novotny, Eric

    2001-01-01

    Discussion of electronic library reference services focuses on an overview of the chief methodologies available for conducting assessments of electronic services. Highlights include quantitative measures and benchmarks, including equity and access; quality measures; behavioral aspects of quality, including librarian-patron interaction; and future…

  18. In-Situ Systematic Error Correction for Digital Volume Correlation Using a Reference Sample

    KAUST Repository

    Wang, B.

    2017-11-27

The self-heating effect of a laboratory X-ray computed tomography (CT) scanner causes a slight change in its imaging geometry, which induces translation and dilatation (i.e., artificial displacement and strain) in reconstructed volume images recorded at different times. To realize high-accuracy internal full-field deformation measurements using digital volume correlation (DVC), these artificial displacements and strains associated with unstable CT imaging must be eliminated. In this work, an effective and easily implemented reference sample compensation (RSC) method is proposed for in-situ systematic error correction in DVC. The proposed method utilizes a stationary reference sample, which is placed beside the test sample to record the artificial displacement fields caused by the self-heating effect of CT scanners. The detected displacement fields are then fitted by a parametric polynomial model, which is used to remove the unwanted artificial deformations in the test sample. Rescan tests of a stationary sample and real uniaxial compression tests performed on copper foam specimens demonstrate the accuracy, efficacy, and practicality of the presented RSC method.

  19. In-Situ Systematic Error Correction for Digital Volume Correlation Using a Reference Sample

    KAUST Repository

    Wang, B.; Pan, B.; Lubineau, Gilles

    2017-01-01

The self-heating effect of a laboratory X-ray computed tomography (CT) scanner causes a slight change in its imaging geometry, which induces translation and dilatation (i.e., artificial displacement and strain) in reconstructed volume images recorded at different times. To realize high-accuracy internal full-field deformation measurements using digital volume correlation (DVC), these artificial displacements and strains associated with unstable CT imaging must be eliminated. In this work, an effective and easily implemented reference sample compensation (RSC) method is proposed for in-situ systematic error correction in DVC. The proposed method utilizes a stationary reference sample, which is placed beside the test sample to record the artificial displacement fields caused by the self-heating effect of CT scanners. The detected displacement fields are then fitted by a parametric polynomial model, which is used to remove the unwanted artificial deformations in the test sample. Rescan tests of a stationary sample and real uniaxial compression tests performed on copper foam specimens demonstrate the accuracy, efficacy, and practicality of the presented RSC method.
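The compensation step in records 18 and 19 reduces to: fit a smooth parametric model to the displacement field measured on the stationary reference sample, then subtract it from the test-sample field. A one-dimensional sketch with synthetic drift (all numbers below are illustrative assumptions, not the papers' data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Axial positions of DVC measurement points
z = np.linspace(0.0, 1.0, 50)

# Artificial displacement from scanner self-heating: rigid drift + dilatation,
# observed on the stationary reference sample with measurement noise
u_artifact = 0.08 + 0.05 * z
u_ref = u_artifact + rng.normal(0.0, 0.002, z.size)

# Fit a low-order parametric polynomial model to the reference displacements
coef = np.polynomial.polynomial.polyfit(z, u_ref, deg=2)

# Test-sample measurement = true deformation + the same artifact + noise
u_true = 0.01 * z**2
u_test = u_true + u_artifact + rng.normal(0.0, 0.002, z.size)

# Compensate: subtract the fitted artifact field from the test measurement
u_corrected = u_test - np.polynomial.polynomial.polyval(z, coef)

print(np.max(np.abs(u_test - u_true)))       # dominated by the artifact
print(np.max(np.abs(u_corrected - u_true)))  # reduced to the noise level
```

The real method works on 3-D displacement fields, but the fit-and-subtract structure is the same.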

  20. Validity of the WISC-IV Spanish for a clinically referred sample of Hispanic children.

    Science.gov (United States)

    San Miguel Montes, Liza E; Allen, Daniel N; Puente, Antonio E; Neblina, Cris

    2010-06-01

The Wechsler Intelligence Scale for Children (WISC) is the most commonly used intelligence test for children. Five years ago, a Spanish version of the WISC-IV was published (WISC-IV Spanish; Wechsler, 2005), but limited published information is available regarding its utility when assessing clinical samples. The current study included 107 Spanish-speaking children of Puerto Rican descent who had been administered the WISC-IV Spanish. They were subdivided into a clinical sample of 35 children with diagnoses of various forms of brain dysfunction (primarily learning disability, attention-deficit/hyperactivity disorder, and epilepsy) and a comparison group of 72 normal children who were part of the WISC-IV Spanish standardization sample. Comparisons between these groups and the standardization sample were performed for the WISC-IV Spanish index and subtest scores. Results indicated that the clinical sample performed worse than the comparison samples on the Working Memory and Processing Speed Indexes, although findings varied to some extent depending on whether the clinical group was compared with the normal comparison group or the standardization sample. These findings provide support for the criterion validity of the WISC-IV Spanish when it is used to assess a clinically referred sample with brain dysfunction.

  1. An Adaptive Critic Approach to Reference Model Adaptation

    Science.gov (United States)

    Krishnakumar, K.; Limes, G.; Gundy-Burlet, K.; Bryant, D.

    2003-01-01

Neural networks have been successfully used for implementing control architectures for different applications. In this work, we examine a neural network augmented adaptive critic as a Level 2 intelligent controller for a C-17 aircraft. This intelligent control architecture utilizes an adaptive critic to tune the parameters of a reference model, which is then used to define the angular rate command for a Level 1 intelligent controller. The present architecture is implemented on a high-fidelity non-linear model of a C-17 aircraft. The goal of this research is to improve the performance of the C-17 under degraded conditions such as control failures and battle damage. Pilot ratings using a motion based simulation facility are included in this paper. The benefits of using an adaptive critic are documented using time response comparisons for severe damage situations.

  2. Instrumental neutron activation analysis of rib bone samples and of bone reference materials

    International Nuclear Information System (INIS)

    Saiki, M.; Takata, M.K.; Kramarski, S.; Borelli, A.

    2000-01-01

    The instrumental neutron activation analysis method was used for the determination of trace elements in rib bone samples taken from autopsies of accident victims. The elements Br, Ca, Cl, Cr, Fe, Mg, Mn, Na, P, Sr, Rb and Zn were determined in cortical tissues by using short and long irradiations with thermal neutron flux of the IEA-R1m nuclear reactor. The reference materials NIST SRM 1400 Bone Ash and NIST SRM 1486 Bone Meal were also analyzed in order to evaluate the precision and the accuracy of the results. It was verified that lyophilization is the most convenient process for drying bone samples since it does not cause any element losses. Comparisons were made between the results obtained for rib samples and the literature values as well as between the results obtained for different ribs from a single individual and for bones from different individuals. (author)

  3. Evaluation of botanical reference materials for the determination of vanadium in biological samples

    International Nuclear Information System (INIS)

    Heydorn, K.; Damsgaard, E.

    1982-01-01

    Three botanical reference materials prepared by the National Bureau of Standards have been studied by neutron activation analysis to evaluate their suitability with respect to the determination of vanadium in biological samples. Various decomposition methods were applied in connection with chemical or radiochemical separations, and results for vanadium were compared with those found by purely instrumental neutron activation analysis. Significantly lower results indicate losses or incomplete dissolution, which makes SRM 1575 Pine Needles and SRM 1573 Tomato Leaves less satisfactory than SRM 1570 Spinach. A reference value of 1.15 mg/kg of this material is recommended, based on results from 3 different methods. All three materials are preferable to SRM 1571 Orchard Leaves, while Bowen's Kale remains the material of choice because of its lower concentration. (author)

  4. Calculating Confidence, Uncertainty, and Numbers of Samples When Using Statistical Sampling Approaches to Characterize and Clear Contaminated Areas

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Matzke, Brett D.; Sego, Landon H.; Amidan, Brett G.

    2013-04-27

This report discusses the methodology, formulas, and inputs needed to make characterization and clearance decisions for Bacillus anthracis-contaminated and uncontaminated (or decontaminated) areas using a statistical sampling approach. Specifically, the report includes the methods and formulas for calculating the:
• number of samples required to achieve a specified confidence in characterization and clearance decisions
• confidence in making characterization and clearance decisions for a specified number of samples
for two common statistically based environmental sampling approaches. In particular, the report addresses an issue raised by the Government Accountability Office by providing methods and formulas to calculate the confidence that a decision area is uncontaminated (or successfully decontaminated) if all samples collected according to a statistical sampling approach have negative results. Key to addressing this topic is the probability that an individual sample result is a false negative, commonly referred to as the false negative rate (FNR). The two statistical sampling approaches discussed in this report are 1) hotspot sampling to detect small isolated contaminated locations during the characterization phase, and 2) combined judgment and random (CJR) sampling during the clearance phase. Typically, if contamination is widely distributed in a decision area, it will be detectable via judgment sampling during the characterization phase. Hotspot sampling is appropriate for characterization situations where contamination is not widely distributed and may not be detected by judgment sampling. CJR sampling is appropriate during the clearance phase when it is desired to augment judgment samples with statistical (random) samples. The hotspot and CJR statistical sampling approaches are discussed in the report for four situations: 1. qualitative data (detect and non-detect) when the FNR = 0 or when using statistical sampling methods that account
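The clearance-confidence calculation described in record 4 follows from the probability that every one of n randomly placed samples reads negative, with the false negative rate (FNR) discounting each sample's detection probability. A sketch of that arithmetic (the simple independence model here is an illustration, not the report's exact hotspot or CJR formulas):

```python
import math

def n_samples(conf, frac_contam, fnr=0.0):
    """Samples needed so that, if at least `frac_contam` of the decision
    area were contaminated, an all-negative result would occur with
    probability at most 1 - conf (independent random samples)."""
    p_neg = 1.0 - frac_contam * (1.0 - fnr)   # P(one sample reads negative)
    return math.ceil(math.log(1.0 - conf) / math.log(p_neg))

def confidence(n, frac_contam, fnr=0.0):
    """Confidence that the area is clean given n all-negative samples."""
    p_neg = 1.0 - frac_contam * (1.0 - fnr)
    return 1.0 - p_neg ** n

print(n_samples(0.95, 0.01))             # 299 samples with a perfect assay
print(n_samples(0.95, 0.01, fnr=0.1))    # more samples once FNR > 0
print(confidence(299, 0.01))             # just over 0.95
```

A nonzero FNR always raises the required sample count, which is the report's central point about all-negative clearance decisions.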

  5. Virtual reconstruction of modern and fossil hominoid crania: consequences of reference sample choice.

    Science.gov (United States)

    Senck, Sascha; Bookstein, Fred L; Benazzi, Stefano; Kastner, Johann; Weber, Gerhard W

    2015-05-01

    Most hominin cranial fossils are incomplete and require reconstruction prior to subsequent analyses. Missing data can be estimated by geometric morphometrics using information from complete specimens, for example, by using thin-plate splines. In this study, we estimate missing data in several virtually fragmented models of hominoid crania (Homo, Pan, Pongo) and fossil hominins (e.g., Australopithecus africanus, Homo heidelbergensis). The aim is to investigate in which way different references influence estimations of cranial shape and how this information can be employed in the reconstruction of fossils. We used a sample of 64 three-dimensional digital models of complete human, chimpanzee, and orangutan crania and a set of 758 landmarks and semilandmarks. The virtually knocked out neurocranial and facial areas that were reconstructed corresponded to those of a real case found in A.L. 444-2 (A. afarensis) cranium. Accuracy of multiple intraspecies and interspecies reconstructions was computed as the maximum square root of the mean squared difference between the original and the reconstruction (root mean square). The results show that the uncertainty in reconstructions is a function of both the geometry of the knockout area and the dissimilarity between the reference sample and the specimen(s) undergoing reconstruction. We suggest that it is possible to estimate large missing cranial areas if the shape of the reference is similar enough to the shape of the specimen reconstructed, though caution must be exercised when employing these reconstructions in subsequent analyses. We provide a potential guide for the choice of the reference by means of bending energy. © 2015 Wiley Periodicals, Inc.
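The thin-plate-spline estimation of missing landmarks described in record 5 can be sketched with SciPy's RBF interpolator, whose `thin_plate_spline` kernel implements that spline: fit a smooth map from the reference configuration to the target on the landmarks both share, then apply it to the reference's copies of the missing landmarks. The toy "crania" below are random point sets, purely illustrative.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(3)

# Toy "crania": 40 reference landmarks in 3-D (illustrative, not real data)
ref = rng.normal(size=(40, 3))
# Target specimen = affine rescaling of the reference plus a smooth deformation
target = 1.05 * ref + 0.05 * np.sin(ref)

missing = [37, 38, 39]                                 # knocked-out landmarks
present = [i for i in range(40) if i not in missing]

# Thin-plate-spline map from reference to target, fitted on shared landmarks
tps = RBFInterpolator(ref[present], target[present], kernel="thin_plate_spline")
estimated = tps(ref[missing])          # estimate the missing landmark positions

rms = float(np.sqrt(np.mean((estimated - target[missing]) ** 2)))
print(rms)   # small when the reference shape resembles the target
```

The paper's finding that reconstruction error grows with reference-target dissimilarity can be reproduced here by increasing the non-affine deformation amplitude.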

  6. Gender differences in a clinic-referred sample of Taiwanese attention-deficit/hyperactivity disorder children.

    Science.gov (United States)

    Yang, Pinchen; Jong, Yuh-Jyh; Chung, Li-Chen; Chen, Cheng-Sheng

    2004-12-01

    The purpose of this study was to examine gender differences within a clinic-referred sample of 6-11-year-old Taiwanese children with attention-deficit/hyperactivity disorder (ADHD)- combined subtype. The subjects were 21 girls with a diagnosis of ADHD from the 4th edition of the Diagnostic and Statistical Manual and 21 age-matched boys with ADHD. Comparisons were made of behavioral ratings, cognitive profiles, and vigilance/attention assessments between these two groups. The results found ADHD girls and ADHD boys to be statistically indistinguishable on nearly all measures except the subtests of block design (P = 0.016), the discrepancy between Performance Intelligence Quotient and Verbal Intelligence Quotient (P = 0.019), and the discrepancy between fluid and crystallized IQ (P = 0.041). In the study samples, ADHD girls and ADHD boys were strikingly similar on a wide range of measures. ADHD boys and girls in clinics may be expected to show more similarities than differences in treatment needs. However, these results should be interpreted with caution since data were only from clinic-referred samples.

  7. Multielemental analysis of Korean geological reference samples by INAA, ICP-AES and ICP-MS

    International Nuclear Information System (INIS)

    Naoki Shirai; Hiroki Takahashi; Yuta Yokozuka; Mitsuru Ebihara; Meiramkhan Toktaganov; Shun Sekimoto

    2015-01-01

    Six Korean geological reference samples (KB-1, KGB-1, KT-1, KD-1, KG-1 and KG-2) prepared by Korea Institutes of Geoscience and Mineral Resources were analyzed by using INAA, ICP-AES and ICP-MS. Some elements could be determined by both INAA and non-INAA methods (ICP-AES and ICP-MS), and these data are consistent with each other. This study confirms that a combination of ICP-AES and ICP-MS is comparable to INAA in determining a wide range of major, minor and trace elements in geological materials. (author)

  8. Resampling Approach for Determination of the Method for Reference Interval Calculation in Clinical Laboratory Practice

    Science.gov (United States)

    Pavlov, Igor Y.; Wilson, Andrew R.; Delgado, Julio C.

    2010-01-01

    Reference intervals (RI) play a key role in clinical interpretation of laboratory test results. Numerous articles are devoted to analyzing and discussing various methods of RI determination. The two most widely used approaches are the parametric method, which assumes data normality, and a nonparametric, rank-based procedure. The decision about which method to use is usually made arbitrarily. The goal of this study was to demonstrate that using a resampling approach for the comparison of RI determination techniques could help researchers select the right procedure. Three methods of RI calculation—parametric, transformed parametric, and quantile-based bootstrapping—were applied to multiple random samples drawn from 81 values of complement factor B observations and from a computer-simulated normally distributed population. It was shown that differences in RI between legitimate methods could be up to 20% and even more. The transformed parametric method was found to be the best method for the calculation of RI of non-normally distributed factor B estimations, producing an unbiased RI and the lowest confidence limits and interquartile ranges. For a simulated Gaussian population, parametric calculations, as expected, were the best; quantile-based bootstrapping produced biased results at low sample sizes, and the transformed parametric method generated heavily biased RI. The resampling approach could help compare different RI calculation methods. An algorithm showing a resampling procedure for choosing the appropriate method for RI calculations is included. PMID:20554803
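Record 8's comparison can be reproduced in miniature: compute a parametric RI (mean ± 1.96 SD) and a quantile-based bootstrap RI on the same skewed sample and observe how they disagree. The log-normal "analyte" below is a stand-in assumption, not the factor B data.

```python
import numpy as np

rng = np.random.default_rng(1)

def parametric_ri(x):
    """Central 95% reference interval assuming normality: mean ± 1.96 SD."""
    m, s = np.mean(x), np.std(x, ddof=1)
    return m - 1.96 * s, m + 1.96 * s

def bootstrap_ri(x, n_boot=2000):
    """Quantile-based bootstrap RI: resample with replacement, take the
    2.5th/97.5th percentiles of each resample, average over resamples."""
    lows, highs = [], []
    for _ in range(n_boot):
        b = rng.choice(x, size=x.size, replace=True)
        lows.append(np.percentile(b, 2.5))
        highs.append(np.percentile(b, 97.5))
    return float(np.mean(lows)), float(np.mean(highs))

# Right-skewed "analyte", 81 values like the factor B data set (stand-in only)
x = rng.lognormal(mean=3.0, sigma=0.4, size=81)
print(parametric_ri(x))   # lower limit lands implausibly low on skewed data
print(bootstrap_ri(x))    # respects the skew; lower limit stays positive
```

The gap between the two lower limits on this skewed sample is exactly the kind of method-dependent difference the study quantifies.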

  9. Geochemistry of outcrop samples from the Raven Canyon and Paintbrush Canyon reference sections, Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Peterman, Z.E.; Spengler, R.W.; Singer, F.R.; Dickerson, R.P.

    1996-01-01

    The Yucca Mountain area in southern Nevada is being evaluated for its suitability as a potential site for the construction of an underground, high-level nuclear waste repository. With support from the Department of Energy, the US Geological Survey is conducting detailed petrographic, geochemical, and isotopic analyses of samples collected from drill cores and from outcrops. The geochemical and isotopic compositions of the volcanic rocks of Yucca Mountain derive from those of their parental magmas, from changes resulting from the eruptive processes and from post-depositional alteration. In this study, geochemical and isotopic data were acquired on samples from reference sections selected in areas where the effects of the post-depositional alteration has been minimal. These data will be used as baseline information for delineating and correlating zonal features in the volcanic rock alteration that may occur in the thermal aureole of the potential repository after it has been loaded with nuclear waste

  10. A sampling-based approach to probabilistic pursuit evasion

    KAUST Repository

    Mahadevan, Aditya; Amato, Nancy M.

    2012-01-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented

  11. Comparison of analytical methods for the determination of histamine in reference canned fish samples

    Science.gov (United States)

    Jakšić, S.; Baloš, M. Ž.; Mihaljev, Ž.; Prodanov Radulović, J.; Nešić, K.

    2017-09-01

Two screening methods for histamine in canned fish, an enzymatic test and a competitive direct enzyme-linked immunosorbent assay (CD-ELISA), were compared with the reversed-phase liquid chromatography (RP-HPLC) standard method. For the enzymatic and CD-ELISA methods, determination was conducted according to the producers' manuals. For RP-HPLC, histamine was derivatized with dansyl chloride, followed by RP-HPLC with diode array detection. Results of analysis of canned fish, supplied as reference samples for proficiency testing, showed good agreement when histamine was present at higher concentrations (above 100 mg/kg). At a lower level (16.95 mg/kg), the enzymatic test produced somewhat higher results. Overall, analysis of four reference samples by CD-ELISA and RP-HPLC showed good agreement for histamine determination (r = 0.977 over the concentration range 16.95-216 mg/kg). The results show that the enzymatic test and CD-ELISA are suitable screening methods for the determination of histamine in canned fish.

  12. Analysis of cement solidified product and ash samples and preparation of a reference material

    International Nuclear Information System (INIS)

    Ishimori, Ken-ichiro; Haraga, Tomoko; Shimada, Asako; Kameo, Yutaka; Takahashi, Kuniaki

    2010-08-01

Simple and rapid analytical methods for radionuclides in low-level radioactive waste have been developed by the present authors. The methods were applied to simulated solidified products and actual metal wastes to confirm their usefulness, and the results were summarized as analytical guidelines. In the present work, a cement solidified product and ash waste were analyzed following the analytical guidelines, and issues in applying the guidelines to these wastes were identified and resolved. The pulverization and homogenization method for ash waste was improved to prevent contamination, since the radioactivity concentrations of the ash samples were relatively high. The pre-treatment method was altered for the cement solidified product and ash samples to account for their high concentration of Ca. An analytical method was also newly developed to measure 129I with a dynamic reaction cell inductively coupled plasma mass spectrometer. In the analytical tests based on the improved guidelines, the gamma-ray emitting nuclides 60Co and 137Cs were measured to estimate the radioactivity of the other alpha- and beta-ray emitting nuclides. The radionuclides assumed detectable (3H, 14C, 36Cl, 63Ni, 90Sr, and alpha-ray emitting nuclides) were analyzed with the improved analytical guidelines, and their applicability to cement solidified product and ash samples was confirmed. Additionally, a cement solidified product sample was evaluated in terms of homogeneity and radioactivity concentrations in order to prepare a reference material for radiochemical analysis. (author)

  13. Estimated ventricle size using Evans index: reference values from a population-based sample.

    Science.gov (United States)

    Jaraj, D; Rabiei, K; Marlow, T; Jensen, C; Skoog, I; Wikkelsø, C

    2017-03-01

    Evans index is an estimate of ventricular size used in the diagnosis of idiopathic normal-pressure hydrocephalus (iNPH). Values >0.3 are considered pathological and are required by guidelines for the diagnosis of iNPH. However, there are no previous epidemiological studies on Evans index, and normal values in adults are thus not precisely known. We examined a representative sample to obtain reference values and descriptive data on Evans index. A population-based sample (n = 1235) of men and women aged ≥70 years was examined. The sample comprised people living in private households and residential care, systematically selected from the Swedish population register. Neuropsychiatric examinations, including head computed tomography, were performed between 1986 and 2000. Evans index ranged from 0.11 to 0.46. The mean value in the total sample was 0.28 (SD, 0.04) and 20.6% (n = 255) had values >0.3. Among men aged ≥80 years, the mean value of Evans index was 0.3 (SD, 0.03). Individuals with dementia had a mean value of Evans index of 0.31 (SD, 0.05) and those with radiological signs of iNPH had a mean value of 0.36 (SD, 0.04). A substantial number of subjects had ventricular enlargement according to current criteria. Clinicians and researchers need to be aware of the range of values among older individuals. © 2017 EAN.
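Evans index itself is a simple ratio, so the cutoff logic referred to in record 13 fits in two lines (the measurements below are hypothetical, not study data):

```python
def evans_index(frontal_horn_width_mm, max_internal_diameter_mm):
    """Maximal frontal horn width divided by the maximal internal diameter
    of the skull, both measured on the same axial CT slice."""
    return frontal_horn_width_mm / max_internal_diameter_mm

ei = evans_index(42.0, 135.0)      # hypothetical measurements
print(round(ei, 2), ei > 0.3)      # 0.31 True: above the conventional cutoff
```

The study's point is that roughly a fifth of its elderly population sample crossed this 0.3 threshold, so the cutoff alone is a weak diagnostic signal.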

  14. Small sample approach, and statistical and epidemiological aspects

    NARCIS (Netherlands)

    Offringa, Martin; van der Lee, Hanneke

    2011-01-01

    In this chapter, the design of pharmacokinetic studies and phase III trials in children is discussed. Classical approaches and relatively novel approaches, which may be more useful in the context of drug research in children, are discussed. The burden of repeated blood sampling in pediatric

  15. THE SAMPLING PROCESS IN THE FINANCIAL AUDIT .TECHNICAL PRACTICE APPROACH

    Directory of Open Access Journals (Sweden)

    Cardos Vasile-Daniel

    2014-12-01

“Audit sampling” involves applying audit procedures to less than 100% of the items within an account balance or class of transactions, such that all sampling units have a chance of selection. This allows the auditor to obtain and evaluate audit evidence about some characteristic of the items selected, in order to form, or assist in forming, a conclusion concerning the population from which the sample is drawn. Audit sampling can use either a statistical or a non-statistical approach. (INTERNATIONAL STANDARD ON AUDITING 530 – AUDIT SAMPLING AND OTHER SELECTIVE TESTING PROCEDURES)

  16. THE SAMPLING PROCESS IN THE FINANCIAL AUDIT .TECHNICAL PRACTICE APPROACH

    Directory of Open Access Journals (Sweden)

    GRIGORE MARIAN

    2014-07-01

“Audit sampling” involves applying audit procedures to less than 100% of the items within an account balance or class of transactions, such that all sampling units have a chance of selection. This allows the auditor to obtain and evaluate audit evidence about some characteristic of the items selected, in order to form, or assist in forming, a conclusion concerning the population from which the sample is drawn. Audit sampling can use either a statistical or a non-statistical approach. (INTERNATIONAL STANDARD ON AUDITING 530 – AUDIT SAMPLING AND OTHER SELECTIVE TESTING PROCEDURES)

  17. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of 3 units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...

  18. Data Set for the manuscript entitled, "Sample Processing Approach for Detection of Ricin in Surface Samples."

    Data.gov (United States)

    U.S. Environmental Protection Agency — Figure. This dataset is associated with the following publication: Shah, S., S. Kane, A.M. Erler, and T. Alfaro. Sample Processing Approach for Detection of Ricin in...

  19. Sample vial inserts: A better approach for sampling heterogeneous slurry samples in the SRS Defense Waste Processing Facility

    International Nuclear Information System (INIS)

    Coleman, C.J.; Goode, S.R.

    1996-01-01

    A convenient and effective new approach for analyzing DWPF samples involves the use of inserts with volumes of 1.5--3 ml placed in the neck of 14 ml sample vials. The inserts have rims that conform to the rim of the vials so that they sit straight and stable in the vial. The DWPF tank sampling system fills the pre-weighed insert rather than the entire vial, so the vial functions only as the insert holder. The shielded cell operator then removes the vial cap and decants the insert containing the sample into a plastic bottle, crucible, etc., for analysis. Inert materials such as Teflon, plastic, and zirconium are used for the insert so it is unnecessary to separate the insert from the sample for most analyses. The key technique advantage of using inserts to take DWPF samples versus filling sample vials is that it provides a convenient and almost foolproof way of obtaining and handling small volumes of slurry samples in a shielded cell without corrupting the sample. Since the insert allows the entire sample to be analyzed, this approach eliminates the errors inherent with subsampling heterogeneous slurries that comprise DWPF samples. Slurry samples can then be analyzed with confidence. Analysis times are dramatically reduced by eliminating the drying and vitrification steps normally used to produce a homogeneous solid sample. Direct dissolution and elemental analysis of slurry samples are achieved in 8 hours or less compared with 40 hours for analysis of vitrified slurry samples. Comparison of samples taken in inserts versus full vials indicate that the insert does not significantly affect sample composition

  20. Prescribing Patterns in a Psychiatrically Referred Sample of Youth With Autism Spectrum Disorder.

    Science.gov (United States)

    Shekunov, Julia; Wozniak, Janet; Conroy, Kristina; Pinsky, Elizabeth; Fitzgerald, Maura; de Leon, Melissa F; Belser, Abigail; Biederman, Joseph; Joshi, Gagan

    The aim of this study was to examine the pattern of psychopharmacologic interventions in a psychiatrically referred sample of youth with autism spectrum disorder (ASD). This retrospective chart review collected demographic and clinical information, including data on DSM-IV-TR criteria-based psychiatric disorders and related current medication treatment and response. Data were collected in December 2011. Clinicians identified the target disorder for each medication and any adverse events. Level of psychopathology and therapeutic response was assessed by the clinician-rated Clinical Global Impressions scale (CGI). Psychiatrically referred youth with ASD (n = 54) suffered from multiple psychopathologies (mean = 2.3) and had a marked level of morbidity (range of baseline CGI-Severity of Illness mean scores, 4.3-5.6). The most prevalent psychopathologies were ADHD (83%), anxiety disorders (67%), bipolar spectrum disorder (43%), and mood disorder not otherwise specified (44%). The majority (80%) of the subjects received combination therapy (mean ± SD number of psychotropic medications = 3 ± 1.5). Forty percent of the participants responded on all treatment target symptoms (CGI-Improvement scale score ≤ 2), and an additional 10% responded on more target symptoms than not. Half of the subjects reported an adverse event, most commonly weight gain (28%) and sedation (12%), both from antipsychotic medication use. Psychiatrically referred youth with ASD suffer from multiple highly impairing psychiatric disorders that require combination pharmacotherapy. These findings highlight the need for further research to guide clinical decision-making and treatment. © Copyright 2017 Physicians Postgraduate Press, Inc.

  1. An examination of the MASC Social Anxiety Scale in a non-referred sample of adolescents.

    Science.gov (United States)

    Anderson, Emily R; Jordan, Judith A; Smith, Ashley J; Inderbitzen-Nolan, Heidi M

    2009-12-01

    Social phobia is prevalent during adolescence and is associated with negative outcomes. Two self-report instruments are empirically validated to specifically assess social phobia symptomatology in youth: the Social Phobia and Anxiety Inventory for Children and the Social Anxiety Scale for Adolescents. The Multidimensional Anxiety Scale for Children is a broad-band measure of anxiety containing a scale assessing the social phobia construct. The present study investigated the MASC Social Anxiety Scale in relation to other well-established measures of social phobia and depression in a non-referred sample of adolescents. Results support the convergent validity of the MASC Social Anxiety Scale and provide some support for its discriminant validity, suggesting its utility in the initial assessment of social phobia. Receiver operating characteristic (ROC) analyses were used to calculate the sensitivity and specificity of the MASC Social Anxiety Scale, and binary logistic regression analyses determined its predictive utility. Implications for assessment are discussed.

  2. Testing the homogeneity of candidate reference materials by solid sampling - AAS and INAA

    International Nuclear Information System (INIS)

    Rossbach, M.; Grobecker, K.-H.

    2002-01-01

    The necessity to quantify a natural material's homogeneity with respect to its elemental distribution prior to chemical analysis of a given aliquot is emphasised. Available instruments and methods to obtain the relevant information are described. Additionally, the calculation of element-specific relative homogeneity factors, H_E, and of a minimum sample mass, M_5%, needed to achieve 5% precision at a 95% confidence level is given. Especially in the production and certification of Certified Reference Materials (CRMs), this characteristic information should be determined in order to provide the user with additional inherent properties of the CRM, to enable more economical use of the expensive material, and to evaluate further systematic bias of the applied analytical technique. (author)
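    The arithmetic behind the homogeneity factor can be illustrated in a few lines. This is a hedged sketch assuming the commonly used Kurfürst-type relation H_E = s_r·√m (s_r = relative sampling standard deviation in %, m = test-portion mass in mg), from which the minimum mass for a target precision follows as M = (H_E/target)²; the function names are invented for illustration, not taken from the paper.

```python
import math

# Sketch of an element-specific homogeneity factor and minimum sample mass,
# assuming the Kurfuerst-type relation H_E = s_r * sqrt(m).

def homogeneity_factor(rsd_percent: float, mass_mg: float) -> float:
    """H_E in %*mg^0.5 from the RSD observed at a given test-portion mass."""
    return rsd_percent * math.sqrt(mass_mg)

def minimum_mass(h_e: float, target_rsd_percent: float = 5.0) -> float:
    """Smallest test-portion mass (mg) expected to reach the target RSD."""
    return (h_e / target_rsd_percent) ** 2

# Example: 10% RSD measured on 1 mg test portions gives H_E = 10,
# so about 4 mg should be needed for 5% precision.
h_e = homogeneity_factor(10.0, 1.0)
m_min = minimum_mass(h_e)
```

    Doubling the required precision (2.5% instead of 5%) quadruples the required mass, which is why the factor matters economically for expensive CRMs.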

  3. Approach-Induced Biases in Human Information Sampling.

    Directory of Open Access Journals (Sweden)

    Laurence T Hunt

    2016-11-01

    Information sampling is often biased towards seeking evidence that confirms one's prior beliefs. Despite such biases being a pervasive feature of human behavior, their underlying causes remain unclear. Many accounts of these biases appeal to limitations of human hypothesis testing and cognition, de facto evoking notions of bounded rationality, but neglect more basic aspects of behavioral control. Here, we investigated a potential role for Pavlovian approach in biasing which information humans will choose to sample. We collected a large novel dataset from 32,445 human subjects, making over 3 million decisions, who played a gambling task designed to measure the latent causes and extent of information-sampling biases. We identified three novel approach-related biases, formalized by comparing subject behavior to a dynamic programming model of optimal information gathering. These biases reflected the amount of information sampled ("positive evidence approach"), the selection of which information to sample ("sampling the favorite"), and the interaction between information sampling and subsequent choices ("rejecting unsampled options"). The prevalence of all three biases was related to a Pavlovian approach-avoid parameter quantified within an entirely independent economic decision task. Our large dataset also revealed that individual differences in the amount of information gathered are a stable trait across multiple gameplays and can be related to demographic measures, including age and educational attainment. As well as revealing limitations in cognitive processing, our findings suggest information sampling biases reflect the expression of primitive, yet potentially ecologically adaptive, behavioral repertoires. One such behavior is sampling from options that will eventually be chosen, even when other sources of information are more pertinent for guiding future action.

  4. Multi-edge X-ray absorption spectroscopy study of road dust samples from a traffic area of Venice using stoichiometric and environmental references

    Science.gov (United States)

    Valotto, Gabrio; Cattaruzza, Elti; Bardelli, Fabrizio

    2017-02-01

    The appropriate selection of representative pure compounds to be used as reference is a crucial step for successful analysis of X-ray absorption near edge spectroscopy (XANES) data, and it is often not a trivial task. This is particularly true when complex environmental matrices are investigated, since their elemental speciation is a priori unknown. In this paper, an investigation of the speciation of Cu, Zn, and Sb based on the use of conventional (stoichiometric compounds) and non-conventional (environmental samples or relevant certified materials) references is explored. This method can be useful when the effectiveness of XANES analysis is limited by the difficulty of obtaining a set of references sufficiently representative of the investigated samples. Road dust samples collected along the bridge connecting Venice to the mainland were used to show the potentialities and the limits of this approach.

  5. Quality assurance and reference material requirements and considerations for environmental sample analysis in nuclear forensics

    International Nuclear Information System (INIS)

    Swindle, D.W. Jr.; Perrin, R.E.; Goldberg, S.A.; Cappis, J.

    2002-01-01

    Full text: High-sensitivity nuclear environmental sampling and analysis techniques have been proven in their ability to verify declared nuclear activities, as well as to assist in the detection of undeclared nuclear activities and facilities. Following the Gulf War, the capability and revealing power of environmental sampling and analysis techniques to support international safeguards was demonstrated and subsequently adopted by the International Atomic Energy Agency (IAEA) as routine safeguards measures in safeguards inspections and verifications. In addition to having been proved useful in international safeguards, environmental sampling and analysis techniques have demonstrated their utility in identifying the origins of 'orphaned' nuclear material, as well as the origin of intercepted smuggled nuclear material. Today, environmental sampling and analysis techniques are now being applied in six broad areas to support nonproliferation, disarmament treaty verification, national and international nuclear security, and environmental stewardship of weapons production activities. Consequently, more and more laboratories around the world are establishing capabilities or expanding capabilities to meet these growing applications, and as such requirements for quality assurance and control are increasing. The six areas are: 1) Nuclear safeguards; 2) Nuclear forensics/illicit trafficking; 3) Ongoing monitoring and verification (OMV); 4) Comprehensive Test Ban Treaty (CTBT); 5) Weapons dismantlement/materials disposition; and 6) Research and development (R and D)/environmental stewardship/safety. Application of environmental sampling and analysis techniques and resources to illicit nuclear material trafficking, while embodying the same basic techniques and resources, does have unique requirements for sample management, handling, protocols, chain of custody, archiving, and data interpretation. These requirements are derived from needs of how data from nuclear forensics

  6. Proposing an Empirically Justified Reference Threshold for Blood Culture Sampling Rates in Intensive Care Units

    Science.gov (United States)

    Castell, Stefanie; Schwab, Frank; Geffers, Christine; Bongartz, Hannah; Brunkhorst, Frank M.; Gastmeier, Petra; Mikolajczyk, Rafael T.

    2014-01-01

    Early and appropriate blood culture sampling is recommended as a standard of care for patients with suspected bloodstream infections (BSI) but is rarely taken into account when quality indicators for BSI are evaluated. To date, sampling of about 100 to 200 blood culture sets per 1,000 patient-days is recommended as the target range for blood culture rates. However, the empirical basis of this recommendation is not clear. The aim of the current study was to analyze the association between blood culture rates and observed BSI rates and to derive a reference threshold for blood culture rates in intensive care units (ICUs). This study is based on data from 223 ICUs taking part in the German hospital infection surveillance system. We applied locally weighted regression and segmented Poisson regression to assess the association between blood culture rates and BSI rates. Below 80 to 90 blood culture sets per 1,000 patient-days, observed BSI rates increased with increasing blood culture rates, while there was no further increase above this threshold. Segmented Poisson regression located the threshold at 87 (95% confidence interval, 54 to 120) blood culture sets per 1,000 patient-days. Only one-third of the investigated ICUs displayed blood culture rates above this threshold. We provide empirical justification for a blood culture target threshold in ICUs. In the majority of the studied ICUs, blood culture sampling rates were below this threshold. This suggests that a substantial fraction of BSI cases might remain undetected; reporting observed BSI rates as a quality indicator without sufficiently high blood culture rates might be misleading. PMID:25520442
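    The threshold-finding idea can be sketched as a profile search over the breakpoint of a piecewise ("hockey-stick") Poisson model: the BSI rate rises with the blood-culture rate up to a breakpoint and is flat above it. The sketch below uses synthetic data and invented variable names; it is an illustration of the technique, not the authors' code.

```python
import numpy as np
from scipy.optimize import minimize

def segmented_poisson_breakpoint(x, y, offset, grid):
    """Profile search for the breakpoint c of log(rate) = b0 + b1*min(x, c):
    counts rise with the covariate x up to c, then flatten."""
    best_c, best_nll = None, np.inf
    for c in grid:
        z = np.minimum(x, c)
        def nll(b):
            log_mu = b[0] + b[1] * z + np.log(offset)
            return np.sum(np.exp(log_mu) - y * log_mu)  # Poisson NLL (up to const.)
        b0 = np.log(y.sum() / offset.sum())             # sensible intercept start
        res = minimize(nll, x0=[b0, 0.0], method="Nelder-Mead")
        if res.fun < best_nll:
            best_c, best_nll = c, res.fun
    return best_c

# Synthetic check: counts generated with a true breakpoint at 90
# blood-culture sets per 1,000 patient-days.
rng = np.random.default_rng(0)
x = rng.uniform(20, 160, 200)                  # blood-culture rates per ICU
offset = np.full(200, 10_000.0)                # patient-days per ICU
mu = offset * np.exp(-6.0 + 0.01 * np.minimum(x, 90.0))
y = rng.poisson(mu).astype(float)
c_hat = segmented_poisson_breakpoint(x, y, offset, np.arange(40, 151, 10))
```

    With clean synthetic data the profile likelihood recovers a breakpoint close to the true value; with real surveillance data the interval is naturally wider, as the 54-to-120 confidence interval above shows.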

  7. The Lyman alpha reference sample. II. Hubble space telescope imaging results, integrated properties, and trends

    Energy Technology Data Exchange (ETDEWEB)

    Hayes, Matthew; Östlin, Göran; Duval, Florent; Sandberg, Andreas; Guaita, Lucia; Melinder, Jens; Rivera-Thorsen, Thøger [Department of Astronomy, Oskar Klein Centre, Stockholm University, AlbaNova University Centre, SE-106 91 Stockholm (Sweden); Adamo, Angela [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Schaerer, Daniel [Université de Toulouse, UPS-OMP, IRAP, F-31000 Toulouse (France); Verhamme, Anne; Orlitová, Ivana [Geneva Observatory, University of Geneva, 51 Chemin des Maillettes, CH-1290 Versoix (Switzerland); Mas-Hesse, J. Miguel; Otí-Floranes, Héctor [Centro de Astrobiología (CSIC-INTA), Departamento de Astrofísica, P.O. Box 78, E-28691 Villanueva de la Cañada (Spain); Cannon, John M.; Pardy, Stephen [Department of Physics and Astronomy, Macalester College, 1600 Grand Avenue, Saint Paul, MN 55105 (United States); Atek, Hakim [Laboratoire d'Astrophysique, École Polytechnique Fédérale de Lausanne (EPFL), Observatoire, CH-1290 Sauverny (Switzerland); Kunth, Daniel [Institut d'Astrophysique de Paris, UMR 7095, CNRS and UPMC, 98 bis Bd Arago, F-75014 Paris (France); Laursen, Peter [Dark Cosmology Centre, Niels Bohr Institute, University of Copenhagen, Juliane Maries Vej 30, DK-2100 Copenhagen (Denmark); Herenz, E. Christian, E-mail: matthew@astro.su.se [Leibniz-Institut für Astrophysik (AIP), An der Sternwarte 16, D-14482 Potsdam (Germany)

    2014-02-10

    We report new results regarding the Lyα output of galaxies, derived from the Lyman Alpha Reference Sample, and focused on Hubble Space Telescope imaging. For 14 galaxies we present intensity images in Lyα, Hα, and UV, and maps of Hα/Hβ, Lyα equivalent width (EW), and Lyα/Hα. We present Lyα and UV radial light profiles and show they are well-fitted by Sérsic profiles, but Lyα profiles show indices systematically lower than those of the UV (n ≈ 1-2 instead of ≳ 4). This reveals a general lack of the central concentration in Lyα that is ubiquitous in the UV. Photometric growth curves increase more slowly for Lyα than the far ultraviolet, showing that small apertures may underestimate the EW. For most galaxies, however, flux and EW curves flatten by radii ≈10 kpc, suggesting that if placed at high-z only a few of our galaxies would suffer from large flux losses. We compute global properties of the sample in large apertures, and show total Lyα luminosities to be independent of all other quantities. Normalized Lyα throughput, however, shows significant correlations: escape is found to be higher in galaxies of lower star formation rate, dust content, mass, and nebular quantities that suggest harder ionizing continuum and lower metallicity. Six galaxies would be selected as high-z Lyα emitters, based upon their luminosity and EW. We discuss the results in the context of high-z Lyα and UV samples. A few galaxies have EWs above 50 Å, and one shows f_esc^Lyα of 80%; such objects have not previously been reported at low-z.

  8. Transcriptome sequencing of the Microarray Quality Control (MAQC) RNA reference samples using next generation sequencing

    Directory of Open Access Journals (Sweden)

    Thierry-Mieg Danielle

    2009-06-01

    Background: Transcriptome sequencing using next-generation sequencing platforms will soon be competing with DNA microarray technologies for global gene expression analysis. As a preliminary evaluation of these promising technologies, we performed deep sequencing of cDNA synthesized from the Microarray Quality Control (MAQC) reference RNA samples using Roche's 454 Genome Sequencer FLX. Results: We generated more than 3.6 million sequence reads of average length 250 bp for the MAQC A and B samples and introduced a data analysis pipeline for translating cDNA read counts into gene expression levels. Using BLAST, 90% of the reads mapped to the human genome and 64% of the reads mapped to the RefSeq database of well annotated genes with e-values ≤ 10^-20. We measured gene expression levels in the A and B samples by counting the numbers of reads that mapped to individual RefSeq genes in multiple sequencing runs to evaluate the MAQC quality metrics for reproducibility, sensitivity, specificity, and accuracy, and compared the results with DNA microarrays and quantitative RT-PCR (QRTPCR) from the MAQC studies. In addition, 88% of the reads were successfully aligned directly to the human genome using the AceView alignment programs with an average 90% sequence similarity, identifying 137,899 unique exon junctions, including 22,193 new exon junctions not yet contained in the RefSeq database. Conclusion: Using the MAQC metrics for evaluating the performance of gene expression platforms, the ExpressSeq results for gene expression levels showed excellent reproducibility, sensitivity, and specificity that improved systematically with increasing shotgun sequencing depth, and quantitative accuracy that was comparable to DNA microarrays and QRTPCR. In addition, a careful mapping of the reads to the genome using the AceView alignment programs shed new light on the complexity of the human transcriptome, including the discovery of thousands of new splice variants.
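    Translating read counts into comparable expression levels generally requires normalizing for gene length and sequencing depth. The following is a generic RPKM-style sketch of that normalization, included for illustration; it is an assumption about the style of calculation, not the specific pipeline introduced in the paper.

```python
def rpkm(counts, gene_lengths_bp, total_mapped_reads):
    """Reads per kilobase of transcript per million mapped reads:
    normalizes raw read counts for gene length and library depth."""
    return [1e9 * c / (length * total_mapped_reads)
            for c, length in zip(counts, gene_lengths_bp)]

# 100 reads on a 1 kb gene out of 1 million mapped reads -> RPKM of 100;
# 50 reads on a 2 kb gene -> RPKM of 25.
values = rpkm([100, 50], [1000, 2000], 1_000_000)
```

    Without such normalization, longer genes and deeper runs would trivially show more reads, confounding the cross-platform comparisons described above.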

  9. The Lyman alpha reference sample. II. Hubble space telescope imaging results, integrated properties, and trends

    International Nuclear Information System (INIS)

    Hayes, Matthew; Östlin, Göran; Duval, Florent; Sandberg, Andreas; Guaita, Lucia; Melinder, Jens; Rivera-Thorsen, Thøger; Adamo, Angela; Schaerer, Daniel; Verhamme, Anne; Orlitová, Ivana; Mas-Hesse, J. Miguel; Otí-Floranes, Héctor; Cannon, John M.; Pardy, Stephen; Atek, Hakim; Kunth, Daniel; Laursen, Peter; Herenz, E. Christian

    2014-01-01

    We report new results regarding the Lyα output of galaxies, derived from the Lyman Alpha Reference Sample, and focused on Hubble Space Telescope imaging. For 14 galaxies we present intensity images in Lyα, Hα, and UV, and maps of Hα/Hβ, Lyα equivalent width (EW), and Lyα/Hα. We present Lyα and UV radial light profiles and show they are well-fitted by Sérsic profiles, but Lyα profiles show indices systematically lower than those of the UV (n ≈ 1-2 instead of ≳ 4). This reveals a general lack of the central concentration in Lyα that is ubiquitous in the UV. Photometric growth curves increase more slowly for Lyα than the far ultraviolet, showing that small apertures may underestimate the EW. For most galaxies, however, flux and EW curves flatten by radii ≈10 kpc, suggesting that if placed at high-z only a few of our galaxies would suffer from large flux losses. We compute global properties of the sample in large apertures, and show total Lyα luminosities to be independent of all other quantities. Normalized Lyα throughput, however, shows significant correlations: escape is found to be higher in galaxies of lower star formation rate, dust content, mass, and nebular quantities that suggest harder ionizing continuum and lower metallicity. Six galaxies would be selected as high-z Lyα emitters, based upon their luminosity and EW. We discuss the results in the context of high-z Lyα and UV samples. A few galaxies have EWs above 50 Å, and one shows f_esc^Lyα of 80%; such objects have not previously been reported at low-z.

  10. Use of an excess variance approach for the certification of reference materials by interlaboratory comparison

    International Nuclear Information System (INIS)

    Crozet, M.; Rigaux, C.; Roudil, D.; Tuffery, B.; Ruas, A.; Desenfant, M.

    2014-01-01

    In the nuclear field, the accuracy and comparability of analytical results are crucial to ensure correct accountancy, good process control, and safe operational conditions. All of these require reliable measurements based on reference materials whose certified values must be obtained by robust metrological approaches according to the requirements of ISO Guides 34 and 35. The data processing of the characterization step is one of the key steps of a reference material production process. Among several methods, the use of interlaboratory comparison results for reference material certification is very common. The DerSimonian and Laird excess variance approach, described and implemented in this paper, is a simple and efficient method for the data processing of interlaboratory comparison results for reference material certification. By taking into account not only the laboratory uncertainties but also the spread of the individual results in the calculation of the weighted mean, this approach minimizes the risk of obtaining biased certified values in cases where one or several laboratories either underestimate their measurement uncertainties or fail to identify all measurement biases. This statistical method has been applied to a new CETAMA plutonium reference material certified by interlaboratory comparison and has been compared to the classical weighted mean approach described in ISO Guide 35. This paper shows the benefits of using an 'excess variance' approach for the certification of reference materials by interlaboratory comparison. (authors)
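    As a concrete illustration of the DerSimonian and Laird computation, here is a minimal sketch with made-up laboratory values (not CETAMA's actual data processing): the between-laboratory "excess" variance tau² is estimated from the spread of results and added to each laboratory's own variance before the weighted mean is formed.

```python
import numpy as np

def dersimonian_laird(x, u):
    """Weighted mean with DerSimonian-Laird between-laboratory variance tau^2.
    x: laboratory means; u: their reported standard uncertainties."""
    x, u = np.asarray(x, float), np.asarray(u, float)
    w = 1.0 / u**2
    xbar = np.sum(w * x) / np.sum(w)            # classical weighted mean
    q = np.sum(w * (x - xbar) ** 2)             # Cochran's heterogeneity statistic
    k = len(x)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (u**2 + tau2)                # weights inflated by the spread
    mean = np.sum(w_star * x) / np.sum(w_star)
    u_mean = np.sqrt(1.0 / np.sum(w_star))
    return mean, u_mean, tau2

# Consistent labs: tau^2 collapses to zero (classical weighted mean).
m0, s0, t0 = dersimonian_laird([10.00, 10.05, 9.95], [0.10, 0.10, 0.10])
# One discrepant lab with an optimistic uncertainty: tau^2 > 0 widens u_mean.
m1, s1, t1 = dersimonian_laird([10.0, 10.1, 9.9, 11.0], [0.1, 0.1, 0.1, 0.1])
```

    The second case shows the point made above: when one laboratory's result is inconsistent with its stated uncertainty, the excess variance both pulls its weight down and enlarges the uncertainty of the certified value, instead of letting the optimistic uncertainty dominate.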

  11. Successful Adrenal Venous Sampling by Non-experts with Reference to CT Images

    International Nuclear Information System (INIS)

    Morita, Satoru; Yamazaki, Hiroshi; Sonoyama, Yasuyuki; Nishina, Yu; Ichihara, Atsuhiro; Sakai, Shuji

    2016-01-01

    Purpose: To establish technical success rates and safety of adrenal venous sampling (AVS) performed by non-experts with reference to CT images. Materials and Methods: 104 AVS procedures with adrenocorticotropic hormone stimulation were performed for patients with suspected primary aldosteronism. One of three radiology residents with 2nd, 5th, and 5th grade experience undertook the procedure under the guidance of an experienced, board-certified interventional radiologist with reference to contrast-enhanced CT images obtained in 102 cases. Successful catheterization of the adrenal veins was assessed using three criteria: an adrenal venous cortisol concentration of more than 200 μg/dL (criterion A); an adrenal vein/inferior vena cava cortisol ratio of more than 5:1 (criterion B); and an adrenal vein/inferior vena cava cortisol ratio of more than 10:1 (criterion C). Results: The operators were aware of the anatomy of the left adrenal veins in 102 cases (98 %) and of the right adrenal veins in 99 cases (95 %) prior to the procedure. CT identified the correct position of the right adrenal vein orifice in 82 of 99 cases (83 %). The overall technical success rates for AVS from the right adrenal vein according to criteria A, B, and C, were 96, 96, and 94 %, respectively. Those for the left adrenal vein were 97, 98, and 94 %, respectively. No significant differences in success rates were observed between the operators (p = 0.922–0.984). No major complications, including adrenal vein rupture, were observed. Conclusions: When CT images are used to guide AVS, the procedure can be performed successfully and safely even by non-experts.
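    The three catheterization-success criteria above are simple thresholds and ratios, and can be encoded directly. This is a hypothetical helper written for illustration (cortisol values in μg/dL); it is not code from the study.

```python
def avs_success(adrenal_cortisol, ivc_cortisol):
    """Apply the three catheterization-success criteria from the abstract."""
    ratio = adrenal_cortisol / ivc_cortisol
    return {
        "A": adrenal_cortisol > 200.0,  # absolute adrenal cortisol, ug/dL
        "B": ratio > 5.0,               # adrenal vein / IVC cortisol ratio
        "C": ratio > 10.0,              # stricter ratio criterion
    }

good = avs_success(300.0, 20.0)   # meets all three criteria
poor = avs_success(150.0, 50.0)   # meets none of them
```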

  12. Successful Adrenal Venous Sampling by Non-experts with Reference to CT Images

    Energy Technology Data Exchange (ETDEWEB)

    Morita, Satoru, E-mail: i@imodey.com; Yamazaki, Hiroshi; Sonoyama, Yasuyuki; Nishina, Yu [Tokyo Women’s Medical University Hospital, Department of Diagnostic Imaging and Nuclear Medicine (Japan); Ichihara, Atsuhiro [Tokyo Women’s Medical University Hospital, Department of Medicine II, Endocrinology and Hypertension (Japan); Sakai, Shuji [Tokyo Women’s Medical University Hospital, Department of Diagnostic Imaging and Nuclear Medicine (Japan)

    2016-07-15

    Purpose: To establish technical success rates and safety of adrenal venous sampling (AVS) performed by non-experts with reference to CT images. Materials and Methods: 104 AVS procedures with adrenocorticotropic hormone stimulation were performed for patients with suspected primary aldosteronism. One of three radiology residents with 2nd, 5th, and 5th grade experience undertook the procedure under the guidance of an experienced, board-certified interventional radiologist with reference to contrast-enhanced CT images obtained in 102 cases. Successful catheterization of the adrenal veins was assessed using three criteria: an adrenal venous cortisol concentration of more than 200 μg/dL (criterion A); an adrenal vein/inferior vena cava cortisol ratio of more than 5:1 (criterion B); and an adrenal vein/inferior vena cava cortisol ratio of more than 10:1 (criterion C). Results: The operators were aware of the anatomy of the left adrenal veins in 102 cases (98 %) and of the right adrenal veins in 99 cases (95 %) prior to the procedure. CT identified the correct position of the right adrenal vein orifice in 82 of 99 cases (83 %). The overall technical success rates for AVS from the right adrenal vein according to criteria A, B, and C, were 96, 96, and 94 %, respectively. Those for the left adrenal vein were 97, 98, and 94 %, respectively. No significant differences in success rates were observed between the operators (p = 0.922–0.984). No major complications, including adrenal vein rupture, were observed. Conclusions: When CT images are used to guide AVS, the procedure can be performed successfully and safely even by non-experts.

  13. A New Approach on Sampling Microorganisms from the Lower Stratosphere

    Science.gov (United States)

    Gunawan, B.; Lehnen, J. N.; Prince, J.; Bering, E., III; Rodrigues, D.

    2017-12-01

    University of Houston's Undergraduate Student Instrumentation Project (USIP) astrobiology group will attempt to provide a cross-sectional analysis of microorganisms in the lower stratosphere by collecting living microbial samples using a sterile and lightweight balloon-borne payload. Refer to poster by Dr. Edgar Bering in session ED032. The purpose of this research is two-fold: first, to design a new system that is capable of greater mass air intake, unlike the previous iterations where heavy and power-intensive pumps are used; and second, to provide proof of concept that live samples can be accumulated in the upper atmosphere and are viable for extensive study and subsequent examination of their potential weather-altering characteristics. Multiple balloon deployments will be conducted to increase accuracy and to provide a larger set of data. This paper will also present the payload design along with analysis of the captured samples. Design details will be presented to NASA investigators for professional studies.

  14. Relevance of plastic limit loads to reference stress approach for surface cracked cylinder problems

    International Nuclear Information System (INIS)

    Kim, Yun-Jae; Shim, Do-Jun

    2005-01-01

    To investigate the relevance of the definition of the reference stress to estimate J and C* for surface crack problems, this paper compares finite element (FE) J and C* results for surface cracked pipes with those estimated according to the reference stress approach using various definitions of the reference stress. Pipes with part-circumferential inner surface cracks and finite internal axial cracks are considered, subject to internal pressure and global bending. The crack depth and aspect ratio are systematically varied. The reference stress is defined in four different ways using (i) a local limit load, (ii) a global limit load, (iii) a global limit load determined from the FE limit analysis, and (iv) the optimised reference load. It is found that the reference stress based on a local limit load gives overall excessively conservative estimates of J and C*. Use of a global limit load clearly reduces the conservatism, compared to that of a local limit load, although it can sometimes provide non-conservative estimates of J and C*. The use of the FE global limit load gives overall non-conservative estimates of J and C*. The reference stress based on the optimised reference load gives overall accurate estimates of J and C*, compared to other definitions of the reference stress. Based on the present findings, general guidance on the choice of the reference stress for surface crack problems is given.
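    For context, the estimation scheme being compared can be sketched numerically. The sketch assumes the standard R6-type reference stress estimate J/J_e = Eε_ref/σ_ref + L_r²σ_ref/(2Eε_ref), with σ_ref = (P/P_L)σ_y and a Ramberg-Osgood material; this shows directly why the choice of limit load P_L drives the estimate. The material constants and the formula choice are illustrative assumptions, not taken from the paper.

```python
def ramberg_osgood_strain(sigma, E, sigma_y, alpha=1.0, n=5.0):
    """Total strain = elastic part + Ramberg-Osgood power-law plastic part."""
    return sigma / E + alpha * (sigma / sigma_y) ** n * sigma_y / E

def j_over_je(P, P_L, E, sigma_y, alpha=1.0, n=5.0):
    """Reference stress estimate of J, normalized by the elastic J."""
    L_r = P / P_L                      # proximity to plastic collapse
    sigma_ref = L_r * sigma_y          # reference stress from the limit load
    eps_ref = ramberg_osgood_strain(sigma_ref, E, sigma_y, alpha, n)
    return E * eps_ref / sigma_ref + 0.5 * L_r**2 * sigma_ref / (E * eps_ref)

E, sigma_y = 200e3, 300.0              # MPa; illustrative steel-like values
low = j_over_je(0.1, 1.0, E, sigma_y)  # well below the limit load: J ~ J_e
high = j_over_je(0.9, 1.0, E, sigma_y) # near the limit load: strong plasticity
```

    Because σ_ref scales inversely with P_L, a conservatively low (local) limit load inflates σ_ref and hence J, which is the over-conservatism the abstract reports.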

  15. A sampling-based approach to probabilistic pursuit evasion

    KAUST Repository

    Mahadevan, Aditya

    2012-05-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented with useful information to model interesting scenarios related to multi-agent interaction and coordination. © 2012 IEEE.
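    As background, the core PRM construction the abstract builds on can be sketched in a few lines: sample collision-free configurations, connect nearby pairs whose joining segment is collision-free, then answer queries by graph search. This is an illustrative 2-D version with circular obstacles, not the paper's implementation.

```python
import numpy as np
from collections import deque

def segment_clear(p, q, obstacles):
    """True if the straight segment p-q avoids every circular obstacle (cx, cy, r)."""
    d = q - p
    for cx, cy, r in obstacles:
        c = np.array([cx, cy])
        t = 0.0 if d @ d == 0 else float(np.clip((c - p) @ d / (d @ d), 0.0, 1.0))
        if np.linalg.norm(p + t * d - c) <= r:   # closest point on segment to center
            return False
    return True

def build_prm(n, radius, obstacles, rng):
    """Sample n collision-free points in the unit square; connect nearby pairs."""
    nodes = []
    while len(nodes) < n:
        pt = rng.random(2)
        if all(np.linalg.norm(pt - np.array([cx, cy])) > r for cx, cy, r in obstacles):
            nodes.append(pt)
    edges = {i: [] for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if (np.linalg.norm(nodes[i] - nodes[j]) < radius
                    and segment_clear(nodes[i], nodes[j], obstacles)):
                edges[i].append(j)
                edges[j].append(i)
    return nodes, edges

def shortest_hop_path(edges, start, goal):
    """Breadth-first search over the roadmap graph."""
    prev = {start: None}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        if u == goal:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in edges[u]:
            if v not in prev:
                prev[v] = u
                queue.append(v)
    return None

rng = np.random.default_rng(0)
obstacles = [(0.5, 0.5, 0.15)]
nodes, edges = build_prm(150, 0.3, obstacles, rng)
path = shortest_hop_path(edges, 0, 149)
```

    The augmentation described in the abstract amounts to attaching extra information (e.g. visibility or pursuit-relevant data) to these nodes and edges rather than changing the construction itself.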

  16. THE LYMAN ALPHA REFERENCE SAMPLE: EXTENDED LYMAN ALPHA HALOS PRODUCED AT LOW DUST CONTENT

    International Nuclear Information System (INIS)

    Hayes, Matthew; Östlin, Göran; Duval, Florent; Guaita, Lucia; Melinder, Jens; Sandberg, Andreas; Schaerer, Daniel; Verhamme, Anne; Orlitová, Ivana; Mas-Hesse, J. Miguel; Otí-Floranes, Héctor; Adamo, Angela; Atek, Hakim; Cannon, John M.; Herenz, E. Christian; Kunth, Daniel; Laursen, Peter

    2013-01-01

    We report on new imaging observations of the Lyman alpha emission line (Lyα), performed with the Hubble Space Telescope, that comprise the backbone of the Lyman alpha Reference Sample. We present images of 14 starburst galaxies at redshifts 0.028 < z < 0.18 in continuum-subtracted Lyα, Hα, and the far-ultraviolet (FUV) continuum. Measured by the Petrosian 20% radius, R_P20, Lyα radii are larger than those of Hα by factors ranging from 1 to 3.6, with an average of 2.4. The average ratio of Lyα-to-FUV radii is 2.9. This suggests that much of the Lyα light is pushed to large radii by resonance scattering. Defining the Relative Petrosian Extension of Lyα compared to Hα, ξ_Lyα = R_Lyα^P20 / R_Hα^P20, we find ξ_Lyα to be uncorrelated with total Lyα luminosity. However, ξ_Lyα is strongly correlated with quantities that scale with dust content, in the sense that a low dust abundance is a necessary requirement (although not the only one) in order to spread Lyα photons throughout the interstellar medium and drive a large extended Lyα halo.
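    The Petrosian 20% radius used above is the radius at which the local surface brightness falls to 20% of the mean surface brightness interior to it, and it can be computed numerically from a radial profile. The following sketch applies the definition to an idealized circular exponential profile; the function and grid are illustrative, not the survey's photometry code.

```python
import numpy as np

def petrosian_radius(r, profile, eta=0.2):
    """Radius where local surface brightness / mean interior SB first drops to eta.
    r: increasing radii from 0; profile: azimuthally averaged surface brightness."""
    # Cumulative flux via trapezoidal integration of 2*pi*r*I(r).
    integrand = 2 * np.pi * r * profile
    cumflux = np.concatenate([[0.0], np.cumsum(
        0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r))])
    mean_sb = np.divide(cumflux, np.pi * r**2,
                        out=np.zeros_like(r), where=r > 0)
    ratio = np.divide(profile, mean_sb, out=np.ones_like(r), where=mean_sb > 0)
    below = np.nonzero((ratio <= eta) & (r > 0))[0]
    return r[below[0]] if below.size else np.nan

# Exponential disk I(r) = exp(-r/h): R_P20 comes out near 3.6 scale lengths.
h = 1.0
r = np.linspace(0.0, 10.0, 5001)
rp20 = petrosian_radius(r, np.exp(-r / h))
```

    Applying the same function to two bands of one galaxy and taking the ratio of the two radii gives exactly the kind of relative-extension statistic ξ defined above.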

  17. Milk and serum standard reference materials for monitoring organic contaminants in human samples.

    Science.gov (United States)

    Schantz, Michele M; Eppe, Gauthier; Focant, Jean-François; Hamilton, Coreen; Heckert, N Alan; Heltsley, Rebecca M; Hoover, Dale; Keller, Jennifer M; Leigh, Stefan D; Patterson, Donald G; Pintar, Adam L; Sharpless, Katherine E; Sjödin, Andreas; Turner, Wayman E; Vander Pol, Stacy S; Wise, Stephen A

    2013-02-01

    Four new Standard Reference Materials (SRMs) have been developed to assist in the quality assurance of chemical contaminant measurements required for human biomonitoring studies, SRM 1953 Organic Contaminants in Non-Fortified Human Milk, SRM 1954 Organic Contaminants in Fortified Human Milk, SRM 1957 Organic Contaminants in Non-Fortified Human Serum, and SRM 1958 Organic Contaminants in Fortified Human Serum. These materials were developed as part of a collaboration between the National Institute of Standards and Technology (NIST) and the Centers for Disease Control and Prevention (CDC), with both agencies contributing data used in the certification of mass fraction values for a wide range of organic contaminants including polychlorinated biphenyl (PCB) congeners, chlorinated pesticides, polybrominated diphenyl ether (PBDE) congeners, and polychlorinated dibenzo-p-dioxin (PCDD) and dibenzofuran (PCDF) congeners. The certified mass fractions of the organic contaminants in the unfortified samples, SRM 1953 and SRM 1957, ranged from 12 ng/kg to 2200 ng/kg, with the exception of 4,4'-DDE in SRM 1953 at 7400 ng/kg, with expanded uncertainties generally <14 %. The agreement among the results suggests that there were no significant biases among the multiple methods used for analysis.

  18. New approaches to nanoparticle sample fabrication for atom probe tomography

    International Nuclear Information System (INIS)

    Felfer, P.; Li, T.; Eder, K.; Galinski, H.; Magyar, A.P.; Bell, D.C.; Smith, G.D.W.; Kruse, N.; Ringer, S.P.; Cairney, J.M.

    2015-01-01

    Due to their unique properties, nano-sized materials such as nanoparticles and nanowires are receiving considerable attention. However, little data is available about their chemical makeup at the atomic scale, especially in three dimensions (3D). Atom probe tomography is able to answer many important questions about these materials if the challenge of producing a suitable sample can be overcome. In order to achieve this, the nanomaterial needs to be positioned within the end of a tip and fixed there so that the sample possesses sufficient structural integrity for analysis. Here we provide a detailed description of various techniques that have been used to position nanoparticles on substrates for atom probe analysis. In some of the approaches, this is combined with deposition techniques to incorporate the particles into a solid matrix, and focused ion beam processing is then used to fabricate atom probe samples from this composite. Using these approaches, data have been obtained from 10–20 nm core–shell nanoparticles that were extracted directly from suspension (i.e. with no chemical modification), with a resolution of better than ±1 nm. - Highlights: • Samples for APT of nanoparticles were fabricated from particle powders and dispersions. • Electrophoresis was suitable for producing samples from dispersions. • Powder lift-out successfully produced samples from particle agglomerates. • Dispersion application/coating delivered the highest-quality results.

  19. New approaches to nanoparticle sample fabrication for atom probe tomography

    Energy Technology Data Exchange (ETDEWEB)

    Felfer, P., E-mail: peter.felfer@sydney.edu.au [School for Aerospace, Mechanical and Mechatronic Engineering/Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); Li, T. [School for Aerospace, Mechanical and Mechatronic Engineering/Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); Materials Department, The University of Oxford, Oxford (United Kingdom); Eder, K. [School for Aerospace, Mechanical and Mechatronic Engineering/Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); Galinski, H. [School of Engineering and Applied Sciences, Harvard University, Cambridge, MA 02138 (United States); Magyar, A.P.; Bell, D.C. [School of Engineering and Applied Sciences, Harvard University, Cambridge, MA 02138 (United States); Center for Nanoscale Systems, Harvard University, Cambridge, MA 02138 (United States); Smith, G.D.W. [Materials Department, The University of Oxford, Oxford (United Kingdom); Kruse, N. [Chemical Physics of Materials (Catalysis-Tribology), Université Libre de Bruxelles, Campus Plaine, CP 243, 1050 Brussels (Belgium); Ringer, S.P.; Cairney, J.M. [School for Aerospace, Mechanical and Mechatronic Engineering/Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia)

    2015-12-15

    Due to their unique properties, nano-sized materials such as nanoparticles and nanowires are receiving considerable attention. However, little data is available about their chemical makeup at the atomic scale, especially in three dimensions (3D). Atom probe tomography is able to answer many important questions about these materials if the challenge of producing a suitable sample can be overcome. In order to achieve this, the nanomaterial needs to be positioned within the end of a tip and fixed there so that the sample possesses sufficient structural integrity for analysis. Here we provide a detailed description of various techniques that have been used to position nanoparticles on substrates for atom probe analysis. In some of the approaches, this is combined with deposition techniques to incorporate the particles into a solid matrix, and focused ion beam processing is then used to fabricate atom probe samples from this composite. Using these approaches, data have been obtained from 10–20 nm core–shell nanoparticles that were extracted directly from suspension (i.e. with no chemical modification), with a resolution of better than ±1 nm. - Highlights: • Samples for APT of nanoparticles were fabricated from particle powders and dispersions. • Electrophoresis was suitable for producing samples from dispersions. • Powder lift-out successfully produced samples from particle agglomerates. • Dispersion application/coating delivered the highest-quality results.

  20. The Lyα reference sample. I. Survey outline and first results for Markarian 259

    International Nuclear Information System (INIS)

    Östlin, Göran; Hayes, Matthew; Duval, Florent; Sandberg, Andreas; Rivera-Thorsen, Thøger; Marquart, Thomas; Adamo, Angela; Melinder, Jens; Guaita, Lucia; Micheva, Genoveva; Orlitová, Ivana; Atek, Hakim; Cannon, John M.; Pardy, Stephen A.; Gruyters, Pieter; Herenz, Edmund Christian; Kunth, Daniel; Laursen, Peter; Mas-Hesse, J. Miguel; Otí-Floranes, Héctor

    2014-01-01

    The Lyα Reference Sample (LARS) is a substantial program with the Hubble Space Telescope (HST) that provides a sample of local universe laboratory galaxies in which to study the detailed astrophysics of the visibility and strength of the Lyα line of neutral hydrogen. Lyα is the dominant spectral line in use for characterizing high-redshift (z) galaxies. This paper presents an overview of the survey, its selection function, and HST imaging observations. The sample was selected from the combined GALEX+Sloan Digital Sky Survey catalog at z = 0.028-0.19, in order to allow Lyα to be captured with combinations of long-pass filters in the Solar Blind Channel (SBC) of the Advanced Camera for Surveys (ACS) onboard HST. In addition, LARS utilizes Hα and Hβ narrowband and u, b, i broadband imaging with ACS and the Wide Field Camera 3 (WFC3). In order to study galaxies in which large numbers of Lyα photons are produced (whether or not they escape), we demanded an Hα equivalent width W(Hα) ≥100 Å. The final sample of 14 galaxies covers far-UV (FUV, λ ∼ 1500 Å) luminosities that overlap with those of high-z Lyα emitters (LAEs) and Lyman break galaxies (LBGs), making LARS a valid comparison sample. We present the reduction steps used to obtain the Lyα images, including our LARS eXtraction software (LaXs), which utilizes pixel-by-pixel spectral synthesis fitting of the energy distribution to determine and subtract the continuum at Lyα. We demonstrate that the use of SBC long-pass-filter combinations increases the signal-to-noise ratio by an order of magnitude compared to the nominal Lyα filter available in SBC. To exemplify the science potential of LARS, we also present some first results for a single galaxy, Mrk 259 (LARS #1). This irregular galaxy shows bright and extended (indicative of resonance scattering) but strongly asymmetric Lyα emission. Spectroscopy from the Cosmic Origins Spectrograph on board HST centered on the brightest UV knot shows a moderate
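
    Continuum subtraction is the key reduction step here: LaXs fits a spectral-synthesis model per pixel and subtracts its prediction at Lyα. As a much-simplified sketch of the same per-pixel idea, the continuum under the line can be linearly interpolated from two off-line bands; the band wavelengths and image values below are invented for illustration and do not reproduce the LaXs fitting.

```python
import numpy as np

def subtract_continuum(f_on, f_c1, f_c2, lam_on, lam_c1, lam_c2):
    """Per-pixel continuum subtraction (simplified stand-in for LaXs,
    which fits full spectral-synthesis models per pixel). The continuum
    under the line is linearly interpolated between two off-line bands.
    All f_* inputs are same-shape images of flux density."""
    slope = (f_c2 - f_c1) / (lam_c2 - lam_c1)
    continuum_at_line = f_c1 + slope * (lam_on - lam_c1)
    return f_on - continuum_at_line

# Synthetic pixel grid: flat continuum of 2.0 plus line flux 5.0 in one pixel.
cont = np.full((3, 3), 2.0)
on = cont.copy()
on[1, 1] += 5.0
line = subtract_continuum(on, cont, cont, 1216.0, 1350.0, 1500.0)
print(line[1, 1], line[0, 0])   # → 5.0 0.0
```

    With a flat continuum the interpolation is exact; real data would add noise, filter throughput and color terms, which is why LaXs fits a full model instead.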

  1. The Lyα reference sample. I. Survey outline and first results for Markarian 259

    Energy Technology Data Exchange (ETDEWEB)

    Östlin, Göran; Hayes, Matthew; Duval, Florent; Sandberg, Andreas; Rivera-Thorsen, Thøger; Marquart, Thomas; Adamo, Angela; Melinder, Jens; Guaita, Lucia; Micheva, Genoveva [Department of Astronomy, Stockholm University, Oscar Klein Centre, AlbaNova, Stockholm SE-106 91 (Sweden); Orlitová, Ivana [Observatoire de Genève, Université de Genève, Chemin des Maillettes 51, 1290 Versoix (Switzerland); Atek, Hakim [Laboratoire d'Astrophysique, Ecole Polytechnique Fédérale de Lausanne, Observatoire de Sauverny, CH-1290 Versoix (Switzerland); Cannon, John M.; Pardy, Stephen A. [Department of Physics and Astronomy, Macalester College, 1600 Grand Avenue, Saint Paul, MN 55105 (United States); Gruyters, Pieter [Department of Physics and Astronomy, Division of Astronomy and Space Physics, Uppsala University, Box 516, 75120 Uppsala (Sweden); Herenz, Edmund Christian [Leibniz-Institute for Astrophysics Potsdam (AIP), innoFSPEC, An der Sternwarte 16, D-14482 Potsdam (Germany); Kunth, Daniel [Institut d'Astrophysique Paris, 98bis Bd Arago, F-75014 Paris (France); Laursen, Peter [Dark Cosmology Centre, Niels Bohr Institute, University of Copenhagen, DK-2100 Copenhagen (Denmark); Mas-Hesse, J. Miguel [Centro de Astrobiología (CSIC-INTA), Departamento de Astrofísica, POB 78, E-28691 Villanueva de la Cañada (Spain); Otí-Floranes, Héctor [Instituto de Astronomía, Universidad Nacional Autónoma de México, Apdo. Postal 106, Ensenada B. C. 22800 (Mexico); and others

    2014-12-10

    The Lyα Reference Sample (LARS) is a substantial program with the Hubble Space Telescope (HST) that provides a sample of local universe laboratory galaxies in which to study the detailed astrophysics of the visibility and strength of the Lyα line of neutral hydrogen. Lyα is the dominant spectral line in use for characterizing high-redshift (z) galaxies. This paper presents an overview of the survey, its selection function, and HST imaging observations. The sample was selected from the combined GALEX+Sloan Digital Sky Survey catalog at z = 0.028-0.19, in order to allow Lyα to be captured with combinations of long-pass filters in the Solar Blind Channel (SBC) of the Advanced Camera for Surveys (ACS) onboard HST. In addition, LARS utilizes Hα and Hβ narrowband and u, b, i broadband imaging with ACS and the Wide Field Camera 3 (WFC3). In order to study galaxies in which large numbers of Lyα photons are produced (whether or not they escape), we demanded an Hα equivalent width W(Hα) ≥100 Å. The final sample of 14 galaxies covers far-UV (FUV, λ ∼ 1500 Å) luminosities that overlap with those of high-z Lyα emitters (LAEs) and Lyman break galaxies (LBGs), making LARS a valid comparison sample. We present the reduction steps used to obtain the Lyα images, including our LARS eXtraction software (LaXs), which utilizes pixel-by-pixel spectral synthesis fitting of the energy distribution to determine and subtract the continuum at Lyα. We demonstrate that the use of SBC long-pass-filter combinations increases the signal-to-noise ratio by an order of magnitude compared to the nominal Lyα filter available in SBC. To exemplify the science potential of LARS, we also present some first results for a single galaxy, Mrk 259 (LARS #1). This irregular galaxy shows bright and extended (indicative of resonance scattering) but strongly asymmetric Lyα emission. Spectroscopy from the Cosmic Origins Spectrograph on board HST centered on the brightest UV knot shows a moderate

  2. Selection of reference genes for tissue/organ samples on day 3 fifth-instar larvae in silkworm, Bombyx mori.

    Science.gov (United States)

    Wang, Genhong; Chen, Yanfei; Zhang, Xiaoying; Bai, Bingchuan; Yan, Hao; Qin, Daoyuan; Xia, Qingyou

    2018-06-01

    The silkworm, Bombyx mori, is one of the world's most economically important insects. Surveying variations in gene expression among multiple tissue/organ samples will provide clues for gene function assignments and will be helpful for identifying genes related to economic traits or specific cellular processes. To ensure their accuracy, commonly used gene expression quantification methods require a set of stable reference genes for data normalization. In this study, 24 candidate reference genes were assessed in 10 tissue/organ samples of day 3 fifth-instar B. mori larvae using geNorm and NormFinder. The results revealed that using BGIBMGA003186 and BGIBMGA008209 in combination was the optimum choice for normalizing the expression data of the B. mori tissue/organ samples. The most stable gene, BGIBMGA003186, is recommended if just one reference gene is used. Moreover, the commonly used reference gene encoding cytoplasmic actin was the least appropriate reference gene among the samples investigated. The reliability of the selected reference genes was further confirmed by evaluating the expression profiles of two cathepsin genes. Our results may be useful for future studies involving the quantification of relative gene expression levels in different tissue/organ samples of B. mori. © 2018 Wiley Periodicals, Inc.
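
    geNorm ranks candidate reference genes by a stability measure M: for each gene, the average standard deviation across samples of its log-ratio against every other candidate, with lower M meaning more stable expression. A minimal sketch of that computation (the gene panel and expression values below are synthetic, not the B. mori data):

```python
import numpy as np

def genorm_m_values(expr):
    """geNorm-style stability measure M for each candidate reference gene.

    expr: (n_samples, n_genes) array of expression quantities.
    M[j] = mean over k != j of stdev_samples(log2(expr_j / expr_k)).
    Lower M = more stably expressed relative to the rest of the panel."""
    log_expr = np.log2(expr)
    n_genes = log_expr.shape[1]
    m = np.zeros(n_genes)
    for j in range(n_genes):
        v = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
             for k in range(n_genes) if k != j]
        m[j] = np.mean(v)
    return m

# Toy data: genes 0 and 1 co-vary tightly across samples, gene 2 is erratic.
rng = np.random.default_rng(0)
base = rng.uniform(8, 12, size=20)
expr = np.column_stack([
    2.0 ** (base + rng.normal(0, 0.05, 20)),
    2.0 ** (base + rng.normal(0, 0.05, 20)),
    2.0 ** rng.uniform(6, 14, 20),
])
m = genorm_m_values(expr)
print(m)   # gene 2 gets the largest M, i.e. is the least stable
```

    geNorm proper also iteratively removes the worst gene and recomputes M; the pairwise-variation core is as above.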

  3. 3D dentofacial photogrammetry reference values: a novel approach to orthodontic diagnosis.

    Science.gov (United States)

    Masoud, Mohamed I; Bansal, Neetu; C Castillo, Jose; Manosudprasit, Amornrut; Allareddy, Veerasathpurush; Haghi, Arshan; Hawkins, Hannah C; Otárola-Castillo, Erik

    2017-04-01

    Orthodontic diagnostic standards generally use the cranial base as a reference and rely on samples selected by orthodontists. The purpose of this study was to provide male and female standards for a novel non-radiographic approach to orthodontic diagnosis that utilizes 3D dentofacial photogrammetry, using the eyes and natural head orientation as references instead of the cranial base. One hundred and eighty females and 200 males between the ages of 18 and 35 years from 2 modeling agencies were orthodontically screened for near-ideal occlusion. Subjects that met the inclusion criteria were rated for attractiveness by a sample of 40 lay people on a visual analogue scale. The final sample that had 3D facial and dental imaging included 49 subjects, 25 males and 24 females, with near-ideal occlusion and considered attractive by the public. Inter- and intra-examiner ICCs were greater than 0.8 for both landmarking and indexing. Relative to a coronal plane contacting the pupils (MC), the mean sagittal position of the alar curvature (representing the nasomaxillary complex) was 14.36 ± 3.08 mm in males and 12.4 ± 3.58 mm in females. The sagittal position of soft tissue pogonion relative to the pupils was 14.84 ± 3.63 mm in males and 12.78 ± 5.68 mm in females. The angle between the alar curvature and pogonion relative to the pupils was 9° in males and 10° in females. With the exception of the occlusal plane, which was steeper in females, no ratios or angular facial measurements showed a significant gender difference. Relative to MC, males had more proclined upper incisors (20° vs 16°) and more retroclined lower incisors (27° vs 31°; P > 0.05). A Procrustes ANOVA and permutation test showed that the shapes of males and females are different enough to be considered two distinct populations. 1. When using the proposed method for orthodontic diagnosis, male and female patients should be compared to their respective dentofacial standards. 2. Validation of the proposed method

  4. Kaolin Quality Prediction from Samples: A Bayesian Network Approach

    International Nuclear Information System (INIS)

    Rivas, T.; Taboada, J.; Ordonez, C.; Matias, J. M.

    2009-01-01

    We describe the results of an expert system applied to the evaluation of samples of kaolin for industrial use in paper or ceramic manufacture. Different machine learning techniques - classification trees, support vector machines and Bayesian networks - were applied with the aim of evaluating and comparing their interpretability and prediction capacities. The predictive capacity of these models for the samples analyzed was highly satisfactory, both for ceramic quality and for paper quality. However, Bayesian networks generally proved to be the most useful technique for our study, since this approach combines good predictive capacity with excellent interpretability of the kaolin quality structure: it graphically represents relationships between variables and facilitates what-if analyses.
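
    As a minimal sketch of probabilistic quality classification from sample measurements, the snippet below implements Gaussian naive Bayes, the simplest Bayesian-network classifier (a class node with conditionally independent feature nodes). The paper's networks additionally model dependencies between kaolin properties; the feature names and data here are invented for illustration.

```python
import numpy as np

class GaussianNaiveBayes:
    """Gaussian naive Bayes from scratch: per-class feature means/variances
    plus class priors, prediction by maximum posterior log-probability."""

    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(0) for c in self.classes])
        self.var = np.array([X[y == c].var(0) + 1e-9 for c in self.classes])
        self.logprior = np.log([np.mean(y == c) for c in self.classes])
        return self

    def predict(self, X):
        # Gaussian log-likelihood per (sample, class), summed over features.
        ll = -0.5 * (((X[:, None, :] - self.mu) ** 2) / self.var
                     + np.log(2 * np.pi * self.var)).sum(-1)
        return self.classes[np.argmax(ll + self.logprior, axis=1)]

rng = np.random.default_rng(3)
# Two hypothetical "quality" classes separated in (brightness, viscosity).
X = np.vstack([rng.normal([85, 300], [3, 40], (50, 2)),
               rng.normal([92, 180], [3, 40], (50, 2))])
y = np.array([0] * 50 + [1] * 50)
model = GaussianNaiveBayes().fit(X, y)
acc = np.mean(model.predict(X) == y)
print(acc)   # well above chance on this separable toy data
```

    A full Bayesian network would replace the independence assumption with an explicit dependency graph, which is what makes the what-if analyses mentioned in the abstract possible.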

  5. An integrate-over-temperature approach for enhanced sampling.

    Science.gov (United States)

    Gao, Yi Qin

    2008-02-14

    A simple method is introduced to achieve an efficient random walk in energy space in molecular dynamics simulations, thus enhancing sampling over a large energy range. The approach is closely related to multicanonical and replica exchange simulation methods in that it allows configurations of the system to be sampled over a wide energy range by making use of Boltzmann distribution functions at multiple temperatures. A biased potential is quickly generated with this method and is then used in accelerated molecular dynamics simulations.
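
    The idea can be sketched numerically: summing Boltzmann factors over a grid of temperatures defines an effective potential, exp(-β₀ U_eff) = Σₖ nₖ exp(-βₖ U), which is flatter than U and therefore easier to sample. The temperature grid and uniform weights below are assumptions of this sketch, not the paper's procedure for generating the bias.

```python
import numpy as np

def effective_potential(U, betas, weights, beta0):
    """Integrate-over-temperature effective potential (a sketch):
    exp(-beta0 * U_eff) = sum_k n_k * exp(-beta_k * U).
    Sampling U_eff at beta0 then covers the energy ranges of all
    the component temperatures at once."""
    U = np.asarray(U, dtype=float)
    # log-sum-exp over the temperature grid for numerical stability
    exponents = -np.outer(betas, U) + np.log(weights)[:, None]
    m = exponents.max(axis=0)
    return -(m + np.log(np.exp(exponents - m).sum(axis=0))) / beta0

U = np.linspace(0.0, 10.0, 101)        # original potential values
betas = np.linspace(0.2, 1.0, 9)       # 1/kT grid, high to low temperature
w = np.ones_like(betas) / betas.size   # uniform weights (illustrative)
U_eff = effective_potential(U, betas, w, beta0=1.0)

# The effective surface is flatter: a 10 kT barrier in U shrinks in U_eff.
print(U_eff[-1] - U_eff[0], U[-1] - U[0])
```

    In practice the weights nₖ are tuned iteratively so that all temperatures contribute comparably; the flattening effect shown here is what accelerates barrier crossing.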

  6. THE LYMAN ALPHA REFERENCE SAMPLE: EXTENDED LYMAN ALPHA HALOS PRODUCED AT LOW DUST CONTENT

    Energy Technology Data Exchange (ETDEWEB)

    Hayes, Matthew [Universite de Toulouse, UPS-OMP, IRAP, Toulouse (France); Oestlin, Goeran; Duval, Florent; Guaita, Lucia; Melinder, Jens; Sandberg, Andreas [Department of Astronomy, Oskar Klein Centre, Stockholm University, AlbaNova University Centre, SE-106 91 Stockholm (Sweden); Schaerer, Daniel [CNRS, IRAP, 14, avenue Edouard Belin, F-31400 Toulouse (France); Verhamme, Anne; Orlitova, Ivana [Geneva Observatory, University of Geneva, 51 Chemin des Maillettes, CH-1290 Versoix (Switzerland); Mas-Hesse, J. Miguel; Oti-Floranes, Hector [Centro de Astrobiologia (CSIC-INTA), Departamento de Astrofisica, POB 78, 28691 Villanueva de la Canada (Spain); Adamo, Angela [Max Planck Institute for Astronomy, Koenigstuhl 17, D-69117 Heidelberg (Germany); Atek, Hakim [Laboratoire d' Astrophysique, Ecole Polytechnique Federale de Lausanne (EPFL), Observatoire, CH-1290 Sauverny (Switzerland); Cannon, John M. [Department of Physics and Astronomy, Macalester College, 1600 Grand Avenue, Saint Paul, MN 55105 (United States); Herenz, E. Christian [Leibniz-Institut fuer Astrophysik (AIP), An der Sternwarte 16, D-14482 Potsdam (Germany); Kunth, Daniel [Institut d' Astrophysique de Paris, UMR 7095 CNRS and UPMC, 98 bis Bd Arago, F-75014 Paris (France); Laursen, Peter, E-mail: matthew@astro.su.se [Dark Cosmology Centre, Niels Bohr Institute, University of Copenhagen, Juliane Maries Vej 30, DK-2100 Copenhagen (Denmark)

    2013-03-10

    We report on new imaging observations of the Lyman alpha emission line (Lyα), performed with the Hubble Space Telescope, that comprise the backbone of the Lyman alpha Reference Sample. We present images of 14 starburst galaxies at redshifts 0.028 < z < 0.18 in continuum-subtracted Lyα, Hα, and the far ultraviolet continuum. We show that Lyα is emitted on scales that systematically exceed those of the massive stellar population and recombination nebulae: as measured by the Petrosian 20% radius, R_P20, Lyα radii are larger than those of Hα by factors ranging from 1 to 3.6, with an average of 2.4. The average ratio of Lyα-to-FUV radii is 2.9. This suggests that much of the Lyα light is pushed to large radii by resonance scattering. Defining the Relative Petrosian Extension of Lyα compared to Hα, ξ_Lyα = R_P20^Lyα / R_P20^Hα, we find ξ_Lyα to be uncorrelated with total Lyα luminosity. However, ξ_Lyα is strongly correlated with quantities that scale with dust content, in the sense that a low dust abundance is a necessary requirement (although not the only one) in order to spread Lyα photons throughout the interstellar medium and drive a large extended Lyα halo.
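
    The Petrosian 20% radius used in this abstract is the radius at which the local surface brightness falls to 20% of the mean surface brightness interior to it. A numerical sketch on azimuthally averaged profiles, with exponential toy profiles standing in for the real Hα and Lyα images:

```python
import numpy as np

def petrosian_radius(r, sb, eta=0.2):
    """Petrosian radius R_P(eta): smallest radius where the local surface
    brightness drops below eta times the mean surface brightness within it.
    r: increasing radii; sb: azimuthally averaged surface brightness."""
    # cumulative flux within r: trapezoidal integral of 2*pi*r*sb
    flux = np.concatenate([[0.0], np.cumsum(np.diff(r) * np.pi *
                          (r[1:] * sb[1:] + r[:-1] * sb[:-1]))])
    mean_sb = np.full_like(sb, sb[0])
    np.divide(flux, np.pi * r**2, out=mean_sb, where=r > 0)
    below = np.nonzero(sb / mean_sb < eta)[0]
    return r[below[0]] if below.size else r[-1]

# Exponential toy profiles: a compact "Halpha" disk and a 2.5x more
# extended "Lyalpha" halo. For exponentials, R_P scales with the scale length.
r = np.linspace(0.0, 20.0, 4001)
sb_halpha = np.exp(-r / 1.0)
sb_lya = np.exp(-r / 2.5)
xi = petrosian_radius(r, sb_lya) / petrosian_radius(r, sb_halpha)
print(xi)   # ~2.5: the scale-length ratio carries through to xi_Lya
```

    On real images the profiles come from curve-of-growth photometry rather than analytic forms, but the ratio ξ is defined exactly this way.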

  7. An 'intelligent' approach to radioimmunoassay sample counting employing a microprocessor-controlled sample counter

    International Nuclear Information System (INIS)

    Ekins, R.P.; Sufi, S.; Malan, P.G.

    1978-01-01

    The enormous impact on medical science in the last two decades of microanalytical techniques employing radioisotopic labels has, in turn, generated a large demand for automatic radioisotopic sample counters. Such instruments frequently comprise the most important item of capital equipment required in the use of radioimmunoassay and related techniques and often form a principal bottleneck in the flow of samples through a busy laboratory. It is therefore imperative that such instruments be used 'intelligently' and in an optimal fashion, to avoid both the very large capital expenditure involved in the unnecessary proliferation of instruments and the time delays arising from their sub-optimal use. Most of the current generation of radioactive sample counters nevertheless rely on primitive control mechanisms based on a simplistic statistical theory of radioactive sample counting, which precludes their efficient and rational use. The fundamental principle upon which this approach is based is that it is useless to continue counting a radioactive sample for a time longer than that required to yield a significant increase in the precision of the measurement. Thus, since substantial experimental errors occur during sample preparation, these errors should be assessed and must be related to the counting errors for that sample. The objective of the paper is to demonstrate that the combination of a realistic statistical assessment of radioactive sample measurement with the more sophisticated control mechanisms that modern microprocessor technology makes possible may often enable savings in counter usage of the order of 5- to 10-fold. (author)
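
    The stopping rule described above can be made concrete: Poisson counting statistics give a relative standard deviation of 1/√N after N accumulated counts, so counting should stop once this contributes negligibly on top of the sample-preparation error. A sketch with an illustrative threshold (the 5% inflation criterion below is an assumption, not the authors' exact rule):

```python
def counting_time(rate_cps, prep_cv, max_inflation=0.05):
    """Counting time beyond which further counting is pointless (a sketch).

    Poisson statistics give a relative counting SD of 1/sqrt(N) for N
    accumulated counts. Stop once the counting contribution would inflate
    the total assay CV by less than `max_inflation` over the preparation
    CV alone."""
    # total_cv^2 = prep_cv^2 + 1/N ; require total_cv <= (1+max_inflation)*prep_cv
    n_required = 1.0 / (prep_cv ** 2 * ((1.0 + max_inflation) ** 2 - 1.0))
    return n_required / rate_cps

# A sample counting at 100 counts/s with a 5% preparation CV:
t = counting_time(rate_cps=100.0, prep_cv=0.05)
print(round(t, 1))   # ~39 s; longer counting cannot meaningfully help
```

    The 5- to 10-fold savings claimed in the abstract come from exactly this effect: fixed-time or fixed-count protocols keep counting long after the preparation error dominates.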

  8. 'Intelligent' approach to radioimmunoassay sample counting employing a microprocessor controlled sample counter

    International Nuclear Information System (INIS)

    Ekins, R.P.; Sufi, S.; Malan, P.G.

    1977-01-01

    The enormous impact on medical science in the last two decades of microanalytical techniques employing radioisotopic labels has, in turn, generated a large demand for automatic radioisotopic sample counters. Such instruments frequently comprise the most important item of capital equipment required in the use of radioimmunoassay and related techniques and often form a principal bottleneck in the flow of samples through a busy laboratory. It is therefore particularly imperative that such instruments be used 'intelligently' and in an optimal fashion, to avoid both the very large capital expenditure involved in the unnecessary proliferation of instruments and the time delays arising from their sub-optimal use. The majority of the current generation of radioactive sample counters nevertheless rely on primitive control mechanisms based on a simplistic statistical theory of radioactive sample counting, which preclude their efficient and rational use. The fundamental principle upon which this approach is based is that it is useless to continue counting a radioactive sample for a time longer than that required to yield a significant increase in the precision of the measurement. Thus, since substantial experimental errors occur during sample preparation, these errors should be assessed and must be related to the counting errors for that sample. It is the objective of this presentation to demonstrate that the combination of a realistic statistical assessment of radioactive sample measurement, together with the more sophisticated control mechanisms that modern microprocessor technology makes possible, may often enable savings in counter usage of the order of 5- to 10-fold to be made. (orig.)

  9. A Fault Sample Simulation Approach for Virtual Testability Demonstration Test

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yong; QIU Jing; LIU Guanjun; YANG Peng

    2012-01-01

    Virtual testability demonstration testing has many advantages, such as low cost, high efficiency, low risk and few restrictions. It brings new requirements to fault sample generation. A fault sample simulation approach for virtual testability demonstration tests, based on stochastic process theory, is proposed. First, the similarities and differences in fault sample generation between physical and virtual testability demonstration tests are discussed. Second, it is pointed out that the fault occurrence process subject to perfect repair is a renewal process. Third, the interarrival time distribution function of the next fault event is given, and the steps and flowcharts of fault sample generation are introduced. The number of faults and their occurrence times are obtained by statistical simulation. Finally, experiments are carried out on a stable tracking platform. Because a variety of life distributions and maintenance modes are considered and some assumptions are removed, the size and structure of the simulated fault samples are closer to actual results and more reasonable. The proposed method can effectively guide fault injection in virtual testability demonstration tests.
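
    The renewal-process model above can be simulated directly: draw i.i.d. interarrival times from a chosen life distribution and accumulate them until the test duration is reached. A sketch using a Weibull life distribution, which is one of the life distributions the abstract mentions; the parameters are illustrative.

```python
import numpy as np

def simulate_fault_times(t_end, shape, scale, rng):
    """One realization of a renewal fault process under perfect repair:
    interarrival times are i.i.d. Weibull(shape, scale) draws, and faults
    accumulate until the demonstration-test duration t_end."""
    times, t = [], 0.0
    while True:
        t += scale * rng.weibull(shape)   # numpy draws unit-scale Weibull
        if t > t_end:
            return np.array(times)
        times.append(t)

rng = np.random.default_rng(42)
# 500 simulated test runs give an empirical distribution of the fault count.
counts = [len(simulate_fault_times(1000.0, shape=1.5, scale=100.0, rng=rng))
          for _ in range(500)]
print(np.mean(counts))   # close to t_end / mean interarrival time (~11)
```

    Swapping in other life distributions (exponential, lognormal) or imperfect-repair rules changes only the interarrival draw, which is what makes the renewal framing convenient for virtual test planning.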

  10. Per-Sample Multiple Kernel Approach for Visual Concept Learning

    Directory of Open Access Journals (Sweden)

    Ling-Yu Duan

    2010-01-01

    Learning visual concepts from images is an important yet challenging problem in computer vision and multimedia research areas. Multiple kernel learning (MKL) methods have shown great advantages in visual concept learning. As a visual concept often exhibits great appearance variance, a canonical MKL approach may not generate satisfactory results when a uniform kernel combination is applied over the input space. In this paper, we propose a per-sample multiple kernel learning (PS-MKL) approach to take intraclass diversity into account for improving discrimination. PS-MKL determines sample-wise kernel weights according to kernel functions and training samples. Kernel weights as well as kernel-based classifiers are jointly learned. For efficient learning, PS-MKL employs a sample selection strategy. Extensive experiments are carried out over three benchmark datasets of different characteristics: Caltech101, WikipediaMM, and Pascal VOC'07. PS-MKL has achieved encouraging performance, comparable to the state of the art, and has outperformed a canonical MKL.

  11. Per-Sample Multiple Kernel Approach for Visual Concept Learning

    Directory of Open Access Journals (Sweden)

    Tian Yonghong

    2010-01-01

    Learning visual concepts from images is an important yet challenging problem in computer vision and multimedia research areas. Multiple kernel learning (MKL) methods have shown great advantages in visual concept learning. As a visual concept often exhibits great appearance variance, a canonical MKL approach may not generate satisfactory results when a uniform kernel combination is applied over the input space. In this paper, we propose a per-sample multiple kernel learning (PS-MKL) approach to take intraclass diversity into account for improving discrimination. PS-MKL determines sample-wise kernel weights according to kernel functions and training samples. Kernel weights as well as kernel-based classifiers are jointly learned. For efficient learning, PS-MKL employs a sample selection strategy. Extensive experiments are carried out over three benchmark datasets of different characteristics: Caltech101, WikipediaMM, and Pascal VOC'07. PS-MKL has achieved encouraging performance, comparable to the state of the art, and has outperformed a canonical MKL.
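
    The core construction can be sketched as a Gram matrix with sample-wise kernel weights. In the snippet below the weights are fixed rather than learned (the paper learns them jointly with the classifier), and a kernel ridge fit stands in for the classifier; the sqrt coupling of the weights is one simple way to keep the combined kernel positive semidefinite.

```python
import numpy as np

def rbf(X, Y, gamma):
    """Gaussian RBF kernel matrix between rows of X and Y."""
    d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def combined_gram(kernels, W):
    """Per-sample multiple-kernel Gram matrix (sketch of the PS-MKL idea):
    K[i, j] = sum_m sqrt(W[i, m] * W[j, m]) * K_m[i, j].
    Each training sample i carries its own weight vector W[i, :] over the
    base kernels; the sqrt coupling preserves positive semidefiniteness."""
    return sum(np.sqrt(np.outer(W[:, m], W[:, m])) * K
               for m, K in enumerate(kernels))

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] * X[:, 1])             # XOR-like labels, +/-1
K_list = [rbf(X, X, 0.5), X @ X.T]         # an RBF and a linear base kernel
W = np.column_stack([np.full(40, 0.9), np.full(40, 0.1)])  # fixed weights
K = combined_gram(K_list, W)
alpha = np.linalg.solve(K + 1e-3 * np.eye(40), y)   # kernel ridge fit
train_acc = np.mean(np.sign(K @ alpha) == y)
print(train_acc)
```

    In PS-MKL proper, W and the classifier are optimized together so that, e.g., texture-like samples of a concept lean on a texture kernel while shape-like samples lean on a shape kernel.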

  12. A Variational Approach to Enhanced Sampling and Free Energy Calculations

    Science.gov (United States)

    Parrinello, Michele

    2015-03-01

    The presence of kinetic bottlenecks severely hampers the ability of widely used sampling methods like molecular dynamics or Monte Carlo to explore complex free energy landscapes. One of the most popular methods for addressing this problem is umbrella sampling, which is based on the addition of an external bias that helps overcome the kinetic barriers. The bias potential is usually taken to be a function of a restricted number of collective variables. However, constructing the bias is not simple, especially when the number of collective variables increases. Here we introduce a functional of the bias which, when minimized, allows us to recover the free energy. We demonstrate the usefulness and flexibility of this approach on a number of examples, which include the determination of a six-dimensional free energy surface. Besides the practical advantages, the existence of such a variational principle allows us to look at the enhanced sampling problem from a rather convenient vantage point.

  13. Variational Approach to Enhanced Sampling and Free Energy Calculations

    Science.gov (United States)

    Valsson, Omar; Parrinello, Michele

    2014-08-01

    The ability of widely used sampling methods, such as molecular dynamics or Monte Carlo simulations, to explore complex free energy landscapes is severely hampered by the presence of kinetic bottlenecks. A large number of solutions have been proposed to alleviate this problem. Many are based on the introduction of a bias potential which is a function of a small number of collective variables. However, constructing such a bias is not simple. Here we introduce a functional of the bias potential and an associated variational principle. The bias that minimizes the functional relates in a simple way to the free energy surface. This variational principle can be turned into a practical, efficient, and flexible sampling method. A number of numerical examples are presented, which include the determination of a three-dimensional free energy surface. We argue that, besides being numerically advantageous, our variational approach provides a convenient and novel standpoint for looking at the sampling problem.
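
    The functional referred to in both abstracts can be written explicitly (as published by Valsson and Parrinello); here s denotes the collective variables, F(s) the free energy surface, β the inverse temperature, and p(s) a chosen target distribution:

```latex
\Omega[V] \;=\; \frac{1}{\beta}\,
  \ln \frac{\int \mathrm{d}s \, e^{-\beta\,[\,F(s) + V(s)\,]}}
           {\int \mathrm{d}s \, e^{-\beta F(s)}}
  \;+\; \int \mathrm{d}s \, p(s)\, V(s),
```

    which is convex in V and minimized, up to an additive constant, by

```latex
V(s) \;=\; -F(s) \;-\; \frac{1}{\beta}\,\ln p(s),
```

    so once the minimizing bias is found (in practice by expanding V in basis functions and optimizing the coefficients), the free energy surface is read off directly from it.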

  14. The redshift distribution of cosmological samples: a forward modeling approach

    Energy Technology Data Exchange (ETDEWEB)

    Herbel, Jörg; Kacprzak, Tomasz; Amara, Adam; Refregier, Alexandre; Bruderer, Claudio; Nicola, Andrina, E-mail: joerg.herbel@phys.ethz.ch, E-mail: tomasz.kacprzak@phys.ethz.ch, E-mail: adam.amara@phys.ethz.ch, E-mail: alexandre.refregier@phys.ethz.ch, E-mail: claudio.bruderer@phys.ethz.ch, E-mail: andrina.nicola@phys.ethz.ch [Institute for Astronomy, Department of Physics, ETH Zürich, Wolfgang-Pauli-Strasse 27, 8093 Zürich (Switzerland)

    2017-08-01

    Determining the redshift distribution n(z) of galaxy samples is essential for several cosmological probes including weak lensing. For imaging surveys, this is usually done using photometric redshifts estimated on an object-by-object basis. We present a new approach for directly measuring the global n(z) of cosmological galaxy samples, including uncertainties, using forward modeling. Our method relies on image simulations produced using UFig (Ultra Fast Image Generator) and on ABC (Approximate Bayesian Computation) within the MCCL (Monte-Carlo Control Loops) framework. The galaxy population is modeled using parametric forms for the luminosity functions, spectral energy distributions, sizes and radial profiles of both blue and red galaxies. We apply exactly the same analysis to the real data and to the simulated images, which also include instrumental and observational effects. By adjusting the parameters of the simulations, we derive a set of acceptable models that are statistically consistent with the data. We then apply the same cuts to the simulations that were used to construct the target galaxy sample in the real data. The redshifts of the galaxies in the resulting simulated samples yield a set of n(z) distributions for the acceptable models. We demonstrate the method by determining n(z) for a cosmic shear-like galaxy sample from the 4-band Subaru Suprime-Cam data in the COSMOS field. We also complement this imaging data with a spectroscopic calibration sample from the VVDS survey. We compare our resulting posterior n(z) distributions to the one derived from photometric redshifts estimated using 36 photometric bands in COSMOS and find good agreement. This offers good prospects for applying our approach to current and future large imaging surveys.

  15. The redshift distribution of cosmological samples: a forward modeling approach

    Science.gov (United States)

    Herbel, Jörg; Kacprzak, Tomasz; Amara, Adam; Refregier, Alexandre; Bruderer, Claudio; Nicola, Andrina

    2017-08-01

    Determining the redshift distribution n(z) of galaxy samples is essential for several cosmological probes including weak lensing. For imaging surveys, this is usually done using photometric redshifts estimated on an object-by-object basis. We present a new approach for directly measuring the global n(z) of cosmological galaxy samples, including uncertainties, using forward modeling. Our method relies on image simulations produced using UFig (Ultra Fast Image Generator) and on ABC (Approximate Bayesian Computation) within the MCCL (Monte-Carlo Control Loops) framework. The galaxy population is modeled using parametric forms for the luminosity functions, spectral energy distributions, sizes and radial profiles of both blue and red galaxies. We apply exactly the same analysis to the real data and to the simulated images, which also include instrumental and observational effects. By adjusting the parameters of the simulations, we derive a set of acceptable models that are statistically consistent with the data. We then apply the same cuts to the simulations that were used to construct the target galaxy sample in the real data. The redshifts of the galaxies in the resulting simulated samples yield a set of n(z) distributions for the acceptable models. We demonstrate the method by determining n(z) for a cosmic shear-like galaxy sample from the 4-band Subaru Suprime-Cam data in the COSMOS field. We also complement this imaging data with a spectroscopic calibration sample from the VVDS survey. We compare our resulting posterior n(z) distributions to the one derived from photometric redshifts estimated using 36 photometric bands in COSMOS and find good agreement. This offers good prospects for applying our approach to current and future large imaging surveys.
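
    The ABC step described above can be reduced to its simplest, rejection-sampling form: keep the parameter draws whose simulated survey reproduces the observed summary statistics within a tolerance. Everything below (the mean-redshift summary statistic, the Gaussian toy n(z), the tolerance) is a placeholder for the paper's image-level simulations inside MCCL.

```python
import numpy as np

def abc_rejection(observed_summary, prior_draws, simulate, distance, tol):
    """Rejection ABC (a sketch of the scheme in the abstract): accept the
    parameters whose synthetic data lie within `tol` of the observed
    summary statistic under the chosen distance."""
    return [theta for theta in prior_draws
            if distance(simulate(theta), observed_summary) < tol]

# Toy model: the "survey" summary is the sample mean redshift, and the
# unknown parameter is the true mean of a Gaussian n(z).
rng = np.random.default_rng(7)
true_mean = 0.6
obs = rng.normal(true_mean, 0.2, size=2000).mean()
prior = rng.uniform(0.0, 1.5, size=5000)          # flat prior on the mean
sim = lambda th: rng.normal(th, 0.2, size=2000).mean()
post = abc_rejection(obs, prior, sim, lambda a, b: abs(a - b), tol=0.01)
print(np.mean(post))   # the accepted draws concentrate near the true value
```

    The accepted parameter sets play the role of the paper's "acceptable models": each one implies an n(z), and together they give the posterior distribution over n(z).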

  16. The redshift distribution of cosmological samples: a forward modeling approach

    International Nuclear Information System (INIS)

    Herbel, Jörg; Kacprzak, Tomasz; Amara, Adam; Refregier, Alexandre; Bruderer, Claudio; Nicola, Andrina

    2017-01-01

Determining the redshift distribution n(z) of galaxy samples is essential for several cosmological probes including weak lensing. For imaging surveys, this is usually done using photometric redshifts estimated on an object-by-object basis. We present a new approach for directly measuring the global n(z) of cosmological galaxy samples, including uncertainties, using forward modeling. Our method relies on image simulations produced using UFig (Ultra Fast Image Generator) and on ABC (Approximate Bayesian Computation) within the MCCL (Monte-Carlo Control Loops) framework. The galaxy population is modeled using parametric forms for the luminosity functions, spectral energy distributions, sizes and radial profiles of both blue and red galaxies. We apply exactly the same analysis to the real data and to the simulated images, which also include instrumental and observational effects. By adjusting the parameters of the simulations, we derive a set of acceptable models that are statistically consistent with the data. We then apply the same cuts to the simulations that were used to construct the target galaxy sample in the real data. The redshifts of the galaxies in the resulting simulated samples yield a set of n(z) distributions for the acceptable models. We demonstrate the method by determining n(z) for a cosmic shear-like galaxy sample from the 4-band Subaru Suprime-Cam data in the COSMOS field. We also complement this imaging data with a spectroscopic calibration sample from the VVDS survey. We compare our resulting posterior n(z) distributions to the one derived from photometric redshifts estimated using 36 photometric bands in COSMOS and find good agreement. This offers good prospects for applying our approach to current and future large imaging surveys.

  17. Reference Data Layers for Earth and Environmental Science: History, Frameworks, Science Needs, Approaches, and New Technologies

    Science.gov (United States)

    Lenhardt, W. C.

    2015-12-01

The Global Mapping Project, Web-enabled Landsat Data (WELD), the International Satellite Land Surface Climatology Project (ISLSCP), hydrology, solid earth dynamics, sedimentary geology, climate modeling, integrated assessments and so on all have needs for, or have worked to develop, consistently integrated data layers for Earth and environmental science. This paper will present an overview of an abstract notion of data layers of this type, what we are referring to as reference data layers for Earth and environmental science, highlight some historical examples, and delve into new approaches. The concept of reference data layers in this context combines data availability, cyberinfrastructure and data science, as well as domain science drivers. We argue that current advances in cyberinfrastructure, such as iPython notebooks and integrated science processing environments such as iPlant's Discovery Environment, coupled with vast arrays of new data sources, warrant another look at how to create, maintain, and provide reference data layers. The goal is to provide a context for understanding scientists' needs for reference data layers to conduct their research. In addition to the topics described above, this presentation will also outline some of the challenges and present some ideas for new approaches to addressing these needs. Promoting the idea of reference data layers is relevant to a number of existing related activities such as EarthCube, RDA, ESIP, the nascent NSF Regional Big Data Innovation Hubs and others.

  18. Sign languages and the Common European Framework of Reference for Languages : Descriptors and approaches to assessment

    NARCIS (Netherlands)

    L. Leeson; Dr. Beppie van den Bogaerde; Tobias Haug; C. Rathmann

    2015-01-01

This resource establishes European standards for sign languages for professional purposes in line with the Common European Framework of Reference for Languages (CEFR) and provides an overview of assessment descriptors and approaches. Drawing on preliminary work undertaken in adapting the CEFR to…

  19. Utilizing a Rapid Prototyping Approach in the Building of a Hypermedia-Based Reference Station.

    Science.gov (United States)

    Sell, Dan

This paper discusses the building of a hypermedia-based reference station at the Wright Laboratory Technical Library, Wright-Patterson Air Force Base, Ohio. The paper then focuses on an electronic user survey from which data are collected and analyzed. The survey data are used in a rapid prototyping approach, which is defined as…

  20. Statistical Sampling Handbook for Student Aid Programs: A Reference for Non-Statisticians. Winter 1984.

    Science.gov (United States)

    Office of Student Financial Assistance (ED), Washington, DC.

    A manual on sampling is presented to assist audit and program reviewers, project officers, managers, and program specialists of the U.S. Office of Student Financial Assistance (OSFA). For each of the following types of samples, definitions and examples are provided, along with information on advantages and disadvantages: simple random sampling,…

  1. Quantum Ensemble Classification: A Sampling-Based Learning Control Approach.

    Science.gov (United States)

    Chen, Chunlin; Dong, Daoyi; Qi, Bo; Petersen, Ian R; Rabitz, Herschel

    2017-06-01

    Quantum ensemble classification (QEC) has significant applications in discrimination of atoms (or molecules), separation of isotopes, and quantum information extraction. However, quantum mechanics forbids deterministic discrimination among nonorthogonal states. The classification of inhomogeneous quantum ensembles is very challenging, since there exist variations in the parameters characterizing the members within different classes. In this paper, we recast QEC as a supervised quantum learning problem. A systematic classification methodology is presented by using a sampling-based learning control (SLC) approach for quantum discrimination. The classification task is accomplished via simultaneously steering members belonging to different classes to their corresponding target states (e.g., mutually orthogonal states). First, a new discrimination method is proposed for two similar quantum systems. Then, an SLC method is presented for QEC. Numerical results demonstrate the effectiveness of the proposed approach for the binary classification of two-level quantum ensembles and the multiclass classification of multilevel quantum ensembles.

  2. Application of the Sampling Selection Technique in Approaching Financial Audit

    Directory of Open Access Journals (Sweden)

    Victor Munteanu

    2018-03-01

In his professional practice, the financial auditor has a wide range of working techniques at his disposal, including selection techniques. They are applied depending on the nature of the information available to the financial auditor, the format in which it is presented (paper or electronic) and, last but not least, the time available. Several techniques are applied, successively or in parallel, to increase the confidence in the expressed opinion and to provide the audit report with a solid basis of information. Sampling is used in the phase of control or clarification of the identified error; its main purpose is to corroborate findings or to measure the degree of risk detected following a pertinent analysis. Since the auditor has neither the time nor the means to rebuild the information thoroughly, the sampling technique can provide an effective response to the need for substantiation.
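The core of the sampling technique described above can be illustrated with a minimal sketch: the auditor draws a simple random sample from the invoice population, estimates the misstatement rate, and attaches an upper confidence bound to compare against the tolerable error. The population size, true error rate and sample size below are hypothetical.

```python
import math
import random

random.seed(42)

# Hypothetical population: 10,000 invoices, 300 of which (3%) contain an error.
population = [1] * 300 + [0] * 9700
random.shuffle(population)

# The auditor vouches only a simple random sample instead of every item.
sample = random.sample(population, 400)
error_rate = sum(sample) / len(sample)

# One-sided 95% upper bound (normal approximation) on the population error
# rate, to be judged against the tolerable misstatement level.
upper_95 = error_rate + 1.645 * math.sqrt(
    error_rate * (1 - error_rate) / len(sample))
```

If `upper_95` stays below the tolerable rate, the sample supports the opinion without rebuilding the whole population of records.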

  3. On adequacy of reference sample composition to metal oxide composition in spectral analysis by the method of sublimating additions

    International Nuclear Information System (INIS)

    Zakhariya, N.F.; Turulina, O.P.

    1983-01-01

The adequacy of zirconium dioxide reference samples to the analysed samples is considered. The mechanism of the impurity evaporation process in the presence of the sublimating addition is studied for elements of different nature, and its limiting stages have been identified. It is shown that the kinetic probability of impurity transfer into the form of more volatile compounds, depending on the conditions under which the zirconium dioxide samples were prepared, may not coincide with the thermodynamic one. Systematic deviations of the analytical signal for samples with different technological histories are connected with this.

  4. Identification of a set of endogenous reference genes for miRNA expression studies in Parkinson's disease blood samples.

    Science.gov (United States)

    Serafin, Alice; Foco, Luisa; Blankenburg, Hagen; Picard, Anne; Zanigni, Stefano; Zanon, Alessandra; Pramstaller, Peter P; Hicks, Andrew A; Schwienbacher, Christine

    2014-10-10

    Research on microRNAs (miRNAs) is becoming an increasingly attractive field, as these small RNA molecules are involved in several physiological functions and diseases. To date, only few studies have assessed the expression of blood miRNAs related to Parkinson's disease (PD) using microarray and quantitative real-time PCR (qRT-PCR). Measuring miRNA expression involves normalization of qRT-PCR data using endogenous reference genes for calibration, but their choice remains a delicate problem with serious impact on the resulting expression levels. The aim of the present study was to evaluate the suitability of a set of commonly used small RNAs as normalizers and to identify which of these miRNAs might be considered reliable reference genes in qRT-PCR expression analyses on PD blood samples. Commonly used reference genes snoRNA RNU24, snRNA RNU6B, snoRNA Z30 and miR-103a-3p were selected from the literature. We then analyzed the effect of using these genes as reference, alone or in any possible combination, on the measured expression levels of the target genes miR-30b-5p and miR-29a-3p, which have been previously reported to be deregulated in PD blood samples. We identified RNU24 and Z30 as a reliable and stable pair of reference genes in PD blood samples.
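The normalization step these studies depend on is the comparative delta-Cq calculation. The sketch below uses hypothetical Cq values, normalizes the target miRNA against the mean Cq of the two references the authors identify as stable (RNU24 and Z30), and derives a 2^-ddCq fold change; it illustrates the method, not the study's data.

```python
# Hypothetical qRT-PCR Cq values for one patient and one control sample.
cq = {
    "patient": {"miR-30b-5p": 28.0, "RNU24": 24.0, "Z30": 24.4},
    "control": {"miR-30b-5p": 26.5, "RNU24": 23.8, "Z30": 24.2},
}

def delta_cq(sample, target):
    """Normalize the target miRNA against the mean Cq of the two
    reference genes identified as stable (RNU24 and Z30)."""
    ref = (cq[sample]["RNU24"] + cq[sample]["Z30"]) / 2.0
    return cq[sample][target] - ref

# 2^-ddCq fold change of the target in patient relative to control.
ddcq = delta_cq("patient", "miR-30b-5p") - delta_cq("control", "miR-30b-5p")
fold_change = 2.0 ** (-ddcq)
```

Because the reference Cq is subtracted per sample, a poorly chosen (unstable) reference would shift `ddcq` and distort the reported fold change, which is why reference-gene selection matters.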

  5. User's and reference guide to the INEL RML/analytical radiochemistry sample tracking database version 1.00

    International Nuclear Information System (INIS)

    Femec, D.A.

    1995-09-01

This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents some of the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices

  6. Structure of the Wechsler Intelligence Scale for Children-Fourth Edition among a National Sample of Referred Students

    Science.gov (United States)

    Watkins, Marley W.

    2010-01-01

    The structure of the Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV; D. Wechsler, 2003a) was analyzed via confirmatory factor analysis among a national sample of 355 students referred for psychoeducational evaluation by 93 school psychologists from 35 states. The structure of the WISC-IV core battery was best represented by four…

  7. An integrated sampling and analysis approach for improved biodiversity monitoring

    Science.gov (United States)

    DeWan, Amielle A.; Zipkin, Elise F.

    2010-01-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species in conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.
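One benefit claimed for this framework, improved occurrence estimates under imperfect detection, can be illustrated with a toy single-species occupancy simulation: the naive fraction of sites with at least one detection understates true occupancy, and correcting by the probability of detecting an occupied site at least once recovers it. All parameter values below are assumptions, and the correction uses a known detection probability rather than the full hierarchical model.

```python
import numpy as np

rng = np.random.default_rng(1)

n_sites, n_visits, psi, p = 200, 4, 0.6, 0.3  # assumed true values

z = rng.random(n_sites) < psi               # true occupancy state per site
detections = rng.binomial(n_visits, p * z)  # detections over repeat visits

naive = float(np.mean(detections > 0))      # ignores imperfect detection
p_star = 1 - (1 - p) ** n_visits            # P(detected at least once | occupied)
psi_hat = naive / p_star                    # detection-corrected occupancy
```

In the community-level hierarchical version, `psi` and `p` are themselves estimated jointly across many species, which is what lets rare species borrow strength from the aggregate.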

  8. Reactivity Measurements On Burnt And Reference Fuel Samples In LWR-PROTEUS Phase II

    International Nuclear Information System (INIS)

    Murphy, M.; Jatuff, F.; Grimm, P.; Seiler, R.; Luethi, A.; Van Geemert, R.; Brogli, R.; Chawla, R.; Meier, G.; Berger, H.-D.

    2003-01-01

    During the year 2002, the PROTEUS research reactor was used to make a series of reactivity measurements on Pressurised Water Reactor (PWR) burnt fuel samples, and on a series of specially prepared standards. These investigations have been made in two different neutron spectra. In addition, the intrinsic neutron emissions of the burnt fuel samples have been determined. (author)

  9. A double-loop adaptive sampling approach for sensitivity-free dynamic reliability analysis

    International Nuclear Information System (INIS)

    Wang, Zequn; Wang, Pingfeng

    2015-01-01

Dynamic reliability measures the reliability of an engineered system considering time-variant operating conditions and component deterioration. Due to high computational costs, conducting dynamic reliability analysis at an early system design stage remains challenging. This paper presents a confidence-based meta-modeling approach, referred to as double-loop adaptive sampling (DLAS), for efficient sensitivity-free dynamic reliability analysis. The DLAS builds a Gaussian process (GP) model sequentially to approximate extreme system responses over time, so that Monte Carlo simulation (MCS) can be employed directly to estimate dynamic reliability. A generic confidence measure is developed to evaluate the accuracy of dynamic reliability estimation while using the MCS approach based on the developed GP models. A double-loop adaptive sampling scheme is developed to efficiently update the GP model in a sequential manner, by considering system input variables and time concurrently in two sampling loops. The model updating process using the developed sampling scheme can be terminated once the user-defined confidence target is satisfied. The developed DLAS approach eliminates the computationally expensive sensitivity analysis process, thus substantially improving the efficiency of dynamic reliability analysis. Three case studies are used to demonstrate the efficacy of DLAS for dynamic reliability analysis. - Highlights: • Developed a novel adaptive sampling approach for dynamic reliability analysis. • Developed a new metric to quantify the accuracy of dynamic reliability estimation. • Developed a new sequential sampling scheme to efficiently update surrogate models. • Three case studies were used to demonstrate the efficacy of the new approach. • Case study results showed substantially enhanced efficiency with high accuracy
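The surrogate-plus-MCS idea behind DLAS can be sketched compactly. In place of a Gaussian process, the toy below fits a simple Gaussian RBF interpolator to a handful of "expensive" limit-state evaluations and then runs Monte Carlo simulation on the cheap surrogate; the limit-state function, design points and kernel are assumptions for illustration, not the paper's case studies, and there is no adaptive updating loop here.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):                  # true (expensive) limit-state function
    return 4.0 - x ** 2    # failure when g(x) < 0

# Design of experiments: a small number of "expensive" model evaluations.
x_train = np.linspace(-4, 4, 15)
y_train = g(x_train)

# Simple Gaussian RBF interpolator standing in for the GP surrogate.
def kernel(a, b, ell=1.0):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

K = kernel(x_train, x_train) + 1e-10 * np.eye(x_train.size)
weights = np.linalg.solve(K, y_train)

def surrogate(x):
    return kernel(x, x_train) @ weights

# Monte Carlo simulation runs on the cheap surrogate only.
x_mc = rng.standard_normal(200_000)
pf_surrogate = float(np.mean(surrogate(x_mc) < 0.0))
pf_true = float(np.mean(g(x_mc) < 0.0))  # reference value, normally unaffordable
```

In DLAS the surrogate is refined adaptively (over inputs and time concurrently) until a confidence target on the failure-probability estimate is met, rather than being fit once as here.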

  10. Developing Dynamic Reference Models and a Decision Support Framework for Southeastern Ecosystems: An Integrated Approach

    Science.gov (United States)

    2015-06-01

…and 2010 reference conditions (Figure 23). Based on the PerMANOVA analysis of the ground cover vegetation matrix, dispersion of sample units in…

  11. Identification of Reliable Reference Genes for Quantification of MicroRNAs in Serum Samples of Sulfur Mustard-Exposed Veterans.

    Science.gov (United States)

    Gharbi, Sedigheh; Shamsara, Mehdi; Khateri, Shahriar; Soroush, Mohammad Reza; Ghorbanmehr, Nassim; Tavallaei, Mahmood; Nourani, Mohammad Reza; Mowla, Seyed Javad

    2015-01-01

In spite of accumulating information about pathological aspects of sulfur mustard (SM), the precise mechanism responsible for its effects is not well understood. Circulating microRNAs (miRNAs) are promising biomarkers for disease diagnosis and prognosis. Accurate normalization using appropriate reference genes is a critical step in miRNA expression studies. In this study, we aimed to identify appropriate reference genes for microRNA quantification in serum samples of SM victims. In this case-control experimental study, using quantitative real-time polymerase chain reaction (qRT-PCR), we evaluated the suitability of a panel of small RNAs including SNORD38B, SNORD49A, U6, 5S rRNA, miR-423-3p, miR-191, miR-16 and miR-103 in sera of 28 SM-exposed veterans of the Iran-Iraq war (1980-1988) and 15 matched control volunteers. Different statistical algorithms including geNorm, NormFinder, BestKeeper and the comparative delta-quantification cycle (Cq) method were employed to find the least variable reference gene. miR-423-3p was identified as the most stably expressed reference gene, and miR-103 and miR-16 ranked after that. We demonstrate that non-miRNA reference genes have the least stability in serum samples and that some housekeeping miRNAs may be used as more reliable reference genes for miRNAs in serum. In addition, using the geometric mean of two reference genes could increase the reliability of the normalizers.
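The two ingredients of this workflow, ranking candidate references by stability and normalizing against the best pair, can be sketched as follows. The Cq values are invented for illustration, and stability is scored here simply as the standard deviation of Cq across samples, a much cruder criterion than geNorm or NormFinder. Averaging the Cq of two references corresponds, on the linear expression scale, to using their geometric mean.

```python
import numpy as np

# Hypothetical Cq values across five serum samples for candidate references.
cq = {
    "miR-423-3p": [25.1, 25.0, 25.2, 25.1, 24.9],
    "miR-103":    [27.3, 27.1, 27.4, 27.2, 27.0],
    "U6":         [22.0, 23.5, 21.2, 24.0, 22.8],  # unstable non-miRNA reference
}

# Rank candidates by standard deviation of Cq (lower = more stable).
stability = {gene: float(np.std(vals)) for gene, vals in cq.items()}
ranked = sorted(stability, key=stability.get)

# Normalize a target miRNA against the two most stable references; the
# arithmetic mean of their Cq equals the geometric mean of their expression.
best_two = ranked[:2]
ref_cq = np.mean([cq[g] for g in best_two], axis=0)
target_cq = np.array([30.2, 29.8, 30.5, 30.0, 29.7])  # hypothetical target
delta_cq = target_cq - ref_cq  # normalized expression per sample
```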

  12. A certified reference material for radionuclides in the water sample from Irish Sea (IAEA-443)

    DEFF Research Database (Denmark)

    Pham, M.K.; Betti, M.; Povinec, P.P.

    2011-01-01

A new certified reference material (CRM) for radionuclides in sea water from the Irish Sea (IAEA-443) is described and the results of the certification process are presented. Ten radionuclides (3H, 40K, 90Sr, 137Cs, 234U, 235U, 238U, 238Pu, 239+240Pu and 241Am) have been certified, and information values on massic activities with 95% confidence intervals are given for four radionuclides (230Th, 232Th, 239Pu and 240Pu). Results for less frequently reported radionuclides (99Tc, 228Th, 237Np and 241Pu) are also reported. The CRM can be used for quality assurance/quality control of the analysis…

  13. The use of reference materials in the elemental analysis of biological samples

    International Nuclear Information System (INIS)

    Bowen, H.J.M.

    1975-01-01

Reference materials (RMs) are useful to compare the accuracy and precision of laboratories and techniques. The desirable properties of biological reference materials are listed, and the problems of production, homogenization and storage described. At present there are only 10 biological RMs available, compared with 213 geological and 520 metallurgical RMs. There is a need for more biological RMs, including special materials for microprobe analysis and for in vivo activation analysis. A study of 650 mean values for elements in RM Kale, analysed by many laboratories, leads to the following conclusions. 61% of the values lie within ±10% of the best mean, and 80% lie within ±20% of the best mean. Atomic absorption spectrometry gives results that are 5-30% high for seven elements, while instrumental neutron activation analysis gives low and imprecise results for K. Other techniques with poor interlaboratory precision include neutron activation for Mg, polarography for Zn and arc-spectrometry for many elements. More than half the values for elements in Kale were obtained by neutron activation, confirming the importance of this technique and the need for RMs. As a rough estimate, 6 × 10⁹ elemental analyses of biological materials are carried out each year, mostly by medical, agricultural and food scientists. It seems likely that a substantial percentage of these are inaccurate, a situation that might be improved by quality control using standard RMs. (author)

  14. Development of a Novel Reference Plasmid for Accurate Quantification of Genetically Modified Kefeng6 Rice DNA in Food and Feed Samples

    Directory of Open Access Journals (Sweden)

    Liang Li

    2013-01-01

Reference plasmids are an essential tool for the quantification of genetically modified (GM) events. Quantitative real-time PCR (qPCR) is the most commonly used method to characterize and quantify reference plasmids. However, the precision of this method is often limited by calibration curves, and qPCR data can be affected by matrix differences between the standards and samples. Here, we describe a digital PCR (dPCR) approach that can be used to accurately measure the novel reference plasmid pKefeng6 and quantify the unauthorized variety of GM rice Kefeng6, eliminating the issues associated with matrix effects in calibration curves. The pKefeng6 plasmid was used as a calibrant for the quantification of Kefeng6 rice by determining the copy numbers of event-specific (77 bp) and taxon-specific (68 bp) fragments, their ratios, and their concentrations. The plasmid was diluted to five different concentrations. The third sample (S3) was optimized for the quantification range of dPCR according to previous reports. The ratio between the two fragments was 1.005, which closely approximated the value certified by sequencing, and the concentration was found to be 792 copies/μL. This method was precise, with an RSD of ~3%. These findings demonstrate the advantages of using the dPCR method to characterize reference materials.
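Digital PCR quantification rests on Poisson statistics over partitions: from the fraction of negative partitions one recovers the mean copies per partition as lambda = -ln(negative fraction), and from the partition volume the concentration and the event/taxon copy ratio follow. The partition counts and volume below are hypothetical, not the paper's measurements.

```python
import math

# Hypothetical digital PCR partition counts for the two plasmid fragments.
total_partitions = 20000
negatives = {"event_77bp": 7400, "taxon_68bp": 7430}  # partitions with no signal
partition_volume_ul = 0.00085                          # assumed 0.85 nL/partition

def copies_per_ul(neg):
    """Poisson correction: lambda = -ln(negative fraction) copies/partition."""
    lam = -math.log(neg / total_partitions)
    return lam / partition_volume_ul

event_conc = copies_per_ul(negatives["event_77bp"])
taxon_conc = copies_per_ul(negatives["taxon_68bp"])
ratio = event_conc / taxon_conc  # expected near 1 for a 1:1 plasmid construct
```

Because the count of negative partitions is an absolute measurement, no calibration curve is needed, which is exactly the matrix-effect advantage the abstract describes.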

  15. A comparison of two sampling approaches for assessing the urban forest canopy cover from aerial photography.

    Science.gov (United States)

    Ucar Zennure; Pete Bettinger; Krista Merry; Jacek Siry; J.M. Bowker

    2016-01-01

    Two different sampling approaches for estimating urban tree canopy cover were applied to two medium-sized cities in the United States, in conjunction with two freely available remotely sensed imagery products. A random point-based sampling approach, which involved 1000 sample points, was compared against a plot/grid sampling (cluster sampling) approach that involved a...
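The two designs can be compared on a synthetic canopy map: independent random points give a low-variance estimate of cover, while a plot/grid (cluster) design with the same total number of points concentrated in fewer locations is typically noisier, because points within a plot are spatially correlated. The map, patch sizes and sample sizes below are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic 1000x1000 canopy map: square tree patches covering part of a city.
grid = np.zeros((1000, 1000), dtype=bool)
for _ in range(60):
    r, c = rng.integers(0, 1000, size=2)
    grid[max(0, r - 40):r + 40, max(0, c - 40):c + 40] = True
true_cover = float(grid.mean())

# Approach 1: 1000 independent random sample points.
rows = rng.integers(0, 1000, size=1000)
cols = rng.integers(0, 1000, size=1000)
point_estimate = float(grid[rows, cols].mean())

# Approach 2: cluster sampling -- 40 plots of 25 points (5x5 grid each),
# the same total effort of 1000 points.
cluster_means = []
for _ in range(40):
    r, c = rng.integers(0, 995, size=2)
    plot = grid[r:r + 5, c:c + 5]  # 5x5 block of adjacent points
    cluster_means.append(plot.mean())
cluster_estimate = float(np.mean(cluster_means))
```

Both estimators are unbiased; the practical trade-off is that clustered points are cheaper to photo-interpret but carry less independent information per point.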

  16. Benchmark reference data on post irradiation analysis of light water reactor fuel samples

    International Nuclear Information System (INIS)

    Guardini, S.; Guzzi, G.

    1983-01-01

The structure of the present report is as follows: in section I the benchmark activity (BM) is described in detail; characteristics of the reactors and fuel assemblies examined are given, and the technical aspects of the chemical and analytical processes are discussed. In section II all the techniques used to certify the analytical data are presented, together with a discussion of evaluated random and systematic uncertainties. A comparison with the calculated values and the interpretation with ICT (Isotopic Correlation Techniques) is also presented in this section. Section III presents the results. In practice the complete sets of results referring to all JRC measurements are given here for the sake of completeness and consistency of this final report

  17. A normative inference approach for optimal sample sizes in decisions from experience

    Science.gov (United States)

    Ostwald, Dirk; Starke, Ludger; Hertwig, Ralph

    2015-01-01

    “Decisions from experience” (DFE) refers to a body of work that emerged in research on behavioral decision making over the last decade. One of the major experimental paradigms employed to study experience-based choice is the “sampling paradigm,” which serves as a model of decision making under limited knowledge about the statistical structure of the world. In this paradigm respondents are presented with two payoff distributions, which, in contrast to standard approaches in behavioral economics, are specified not in terms of explicit outcome-probability information, but by the opportunity to sample outcomes from each distribution without economic consequences. Participants are encouraged to explore the distributions until they feel confident enough to decide from which they would prefer to draw from in a final trial involving real monetary payoffs. One commonly employed measure to characterize the behavior of participants in the sampling paradigm is the sample size, that is, the number of outcome draws which participants choose to obtain from each distribution prior to terminating sampling. A natural question that arises in this context concerns the “optimal” sample size, which could be used as a normative benchmark to evaluate human sampling behavior in DFE. In this theoretical study, we relate the DFE sampling paradigm to the classical statistical decision theoretic literature and, under a probabilistic inference assumption, evaluate optimal sample sizes for DFE. In our treatment we go beyond analytically established results by showing how the classical statistical decision theoretic framework can be used to derive optimal sample sizes under arbitrary, but numerically evaluable, constraints. Finally, we critically evaluate the value of deriving optimal sample sizes under this framework as testable predictions for the experimental study of sampling behavior in DFE. PMID:26441720
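The notion of an optimal sample size can be made concrete with a small numerical experiment: for each candidate n, estimate the expected payoff of choosing the option with the higher sample mean, subtract a per-draw sampling cost, and pick the n that maximizes the net value. The payoff distributions and cost below are assumptions, and sample means are approximated as normal for speed; this is a toy version of the decision-theoretic calculation, not the paper's derivation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two hypothetical payoff distributions; A is truly better on average.
mu_a, mu_b, sigma, cost_per_draw = 1.0, 0.8, 1.0, 0.001

def expected_net_payoff(n, reps=20000):
    """Estimate E[payoff of the chosen option] minus total sampling cost
    when n outcomes are drawn from each distribution."""
    mean_a = rng.normal(mu_a, sigma / np.sqrt(n), size=reps)
    mean_b = rng.normal(mu_b, sigma / np.sqrt(n), size=reps)
    chosen = np.where(mean_a > mean_b, mu_a, mu_b)  # value of the picked option
    return float(chosen.mean()) - cost_per_draw * 2 * n

sizes = [1, 2, 5, 10, 20, 50, 100]
payoffs = {n: expected_net_payoff(n) for n in sizes}
best_n = max(payoffs, key=payoffs.get)
```

Larger n raises the chance of picking the better option but at linearly growing cost, so the net value peaks at a modest n, the kind of normative benchmark against which observed DFE sampling behavior can be judged.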

  18. The SPAI-18, a brief version of the social phobia and anxiety inventory: reliability and validity in clinically referred and non-referred samples.

    Science.gov (United States)

    de Vente, Wieke; Majdandžić, Mirjana; Voncken, Marisol J; Beidel, Deborah C; Bögels, Susan M

    2014-03-01

    We developed a new version of the Social Phobia and Anxiety Inventory (SPAI) in order to have a brief instrument for measuring social anxiety and social anxiety disorder (SAD) with a strong conceptual foundation. In the construction phase, a set of items representing 5 core aspects of social anxiety was selected by a panel of social anxiety experts. The selected item pool was validated using factor analysis, reliability analysis, and diagnostic analysis in a sample of healthy participants (N = 188) and a sample of clinically referred participants diagnosed with SAD (N = 98). This procedure resulted in an abbreviated version of the Social Phobia Subscale of the SPAI consisting of 18 items (i.e. the SPAI-18), which correlated strongly with the Social Phobia Subscale of the original SPAI (both groups r = .98). Internal consistency and diagnostic characteristics using a clinical cut-off score > 48 were good to excellent (Cronbach's alpha healthy group = .93; patient group = .91; sensitivity: .94; specificity: .88). The SPAI-18 was further validated in a community sample of parents-to-be without SAD (N = 237) and with SAD (N = 65). Internal consistency was again excellent (both groups Cronbach's alpha = .93) and a screening cut-off of > 36 proved to result in good sensitivity and specificity. The SPAI-18 also correlated strongly with other social anxiety instruments, supporting convergent validity. In sum, the SPAI-18 is a psychometrically sound instrument with good screening capacity for social anxiety disorder in clinical as well as community samples. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. VERSE: a novel approach to detect virus integration in host genomes through reference genome customization.

    Science.gov (United States)

    Wang, Qingguo; Jia, Peilin; Zhao, Zhongming

    2015-01-01

    Fueled by widespread applications of high-throughput next generation sequencing (NGS) technologies and urgent need to counter threats of pathogenic viruses, large-scale studies were conducted recently to investigate virus integration in host genomes (for example, human tumor genomes) that may cause carcinogenesis or other diseases. A limiting factor in these studies, however, is rapid virus evolution and resulting polymorphisms, which prevent reads from aligning readily to commonly used virus reference genomes, and, accordingly, make virus integration sites difficult to detect. Another confounding factor is host genomic instability as a result of virus insertions. To tackle these challenges and improve our capability to identify cryptic virus-host fusions, we present a new approach that detects Virus intEgration sites through iterative Reference SEquence customization (VERSE). To the best of our knowledge, VERSE is the first approach to improve detection through customizing reference genomes. Using 19 human tumors and cancer cell lines as test data, we demonstrated that VERSE substantially enhanced the sensitivity of virus integration site detection. VERSE is implemented in the open source package VirusFinder 2 that is available at http://bioinfo.mc.vanderbilt.edu/VirusFinder/.

  20. Global Distribution of Human-Associated Fecal Genetic Markers in Reference Samples from Six Continents.

    Science.gov (United States)

    Mayer, René E; Reischer, Georg H; Ixenmaier, Simone K; Derx, Julia; Blaschke, Alfred Paul; Ebdon, James E; Linke, Rita; Egle, Lukas; Ahmed, Warish; Blanch, Anicet R; Byamukama, Denis; Savill, Marion; Mushi, Douglas; Cristóbal, Héctor A; Edge, Thomas A; Schade, Margit A; Aslan, Asli; Brooks, Yolanda M; Sommer, Regina; Masago, Yoshifumi; Sato, Maria I; Taylor, Huw D; Rose, Joan B; Wuertz, Stefan; Shanks, Orin C; Piringer, Harald; Mach, Robert L; Savio, Domenico; Zessner, Matthias; Farnleitner, Andreas H

    2018-05-01

Numerous bacterial genetic markers are available for the molecular detection of human sources of fecal pollution in environmental waters. However, widespread application is hindered by a lack of knowledge regarding geographical stability, limiting implementation to a small number of well-characterized regions. This study investigates the geographic distribution of five human-associated genetic markers (HF183/BFDrev, HF183/BacR287, BacHum-UCD, BacH, and Lachno2) in municipal wastewaters (raw and treated) from 29 urban and rural wastewater treatment plants (750-4 400 000 population equivalents) from 13 countries spanning six continents. In addition, genetic markers were tested against 280 human and nonhuman fecal samples from domesticated, agricultural and wild animal sources. Findings revealed that all genetic markers are present in consistently high concentrations in raw (median log10 7.2-8.0 marker equivalents (ME) per 100 mL) and biologically treated wastewater samples (median log10 4.6-6.0 ME per 100 mL) regardless of location and population. The false positive rates of the various markers in nonhuman fecal samples ranged from 5% to 47%. Results suggest that several genetic markers have considerable potential for measuring human-associated contamination in polluted environmental waters. This will be helpful in water quality monitoring, pollution modeling and health risk assessment (as demonstrated by QMRAcatch) to guide target-oriented water safety management across the globe.

  1. Manufacturing and testing of reference samples for the definition of acceptance criteria for the ITER divertor

    International Nuclear Information System (INIS)

    Visca, Eliseo; Cacciotti, E.; Libera, S.; Mancini, A.; Pizzuto, A.; Roccella, S.; Riccardi, B.; Escourbiac, F.; Sanguinetti, G.P.

    2010-01-01

    The most critical part of a high heat flux (HHF) plasma facing component (PFC) is the armour-to-heat-sink joint. An experimental study was launched by EFDA in order to define the acceptance criteria to be used for the procurement of the ITER Divertor PFCs. ENEA is involved in the European International Thermonuclear Experimental Reactor (ITER) R and D activities and, together with Ansaldo Ricerche S.p.A., has manufactured several PFC mock-ups using the Hot Radial Pressing and Pre-Brazed Casting technologies. According to the technical specifications issued by EFDA, ENEA and Ansaldo collaborated to manufacture half of the samples with calibrated artificial defects required for this experimental study. After manufacturing, the samples were examined by ultrasonic and SATIR non-destructive examination (NDE) methods in order to confirm the size and position of the artificial defects. In particular, it was concluded that the defects are detectable with these NDE techniques, which also gave an indication of the defect propagation threshold during high heat flux experiments at heat fluxes relevant to those expected in the ITER Divertor. This paper reports the manufacturing procedure used to obtain the required calibrated artificial defects in the CFC and W armoured samples, as well as the NDE results and the high heat flux test results.

  2. Manufacturing and testing of reference samples for the definition of acceptance criteria for the ITER divertor

    Energy Technology Data Exchange (ETDEWEB)

    Visca, Eliseo, E-mail: visca@frascati.enea.i [Associazione EURATOM-ENEA sulla Fusione, Frascati (Italy); Cacciotti, E.; Libera, S.; Mancini, A.; Pizzuto, A.; Roccella, S. [Associazione EURATOM-ENEA sulla Fusione, Frascati (Italy); Riccardi, B., E-mail: Bruno.Riccardi@f4e.europa.e [Fusion For Energy, Barcelona (Spain); Escourbiac, F., E-mail: frederic.escourbiac@iter.or [ITER Organization, Cadarache (France); Sanguinetti, G.P., E-mail: gianpaolo.sanguinetti@aen.ansaldo.i [Ansaldo Energia S.p.A., Genova (Italy)

    2010-12-15

    The most critical part of a high heat flux (HHF) plasma facing component (PFC) is the armour-to-heat-sink joint. An experimental study was launched by EFDA in order to define the acceptance criteria to be used for the procurement of the ITER Divertor PFCs. ENEA is involved in the European International Thermonuclear Experimental Reactor (ITER) R and D activities and, together with Ansaldo Ricerche S.p.A., has manufactured several PFC mock-ups using the Hot Radial Pressing and Pre-Brazed Casting technologies. According to the technical specifications issued by EFDA, ENEA and Ansaldo collaborated to manufacture half of the samples with calibrated artificial defects required for this experimental study. After manufacturing, the samples were examined by ultrasonic and SATIR non-destructive examination (NDE) methods in order to confirm the size and position of the artificial defects. In particular, it was concluded that the defects are detectable with these NDE techniques, which also gave an indication of the defect propagation threshold during high heat flux experiments at heat fluxes relevant to those expected in the ITER Divertor. This paper reports the manufacturing procedure used to obtain the required calibrated artificial defects in the CFC and W armoured samples, as well as the NDE results and the high heat flux test results.

  3. Defining reference conditions for acidified waters using a modern analogue approach

    International Nuclear Information System (INIS)

    Simpson, Gavin L.; Shilland, Ewan M.; Winterbottom, Julie M.; Keay, Janey

    2005-01-01

    Analogue matching is a palaeolimnological technique that aims to find matches for fossil sediment samples from a set of modern surface sediment samples. Modern analogues were identified that closely matched the pre-disturbance conditions of eight of the UK Acid Waters Monitoring Network (AWMN) lakes using diatom- and cladoceran-based analogue matching. These analogue sites were assessed in terms of hydrochemistry, aquatic macrophytes and macro-invertebrates as to their suitability for defining wider hydrochemical and biological reference conditions for acidified sites within the AWMN. The analogues identified for individual AWMN sites show a close degree of similarity in terms of their hydrochemical characteristics, aquatic macrophytes and, to a lesser extent, macro-invertebrate fauna. The reference conditions of acidified AWMN sites are inferred to be less acidic than today and to support a wider range of acid-sensitive aquatic macrophyte and macro-invertebrate taxa than that recorded in the AWMN lakes over the period of monitoring since 1988. - The use of a palaeolimnological technique to identify modern ecological reference analogues for acidified lakes is demonstrated.

  4. Re-evaluation and extension of the scope of elements in US Geological Survey Standard Reference Water Samples

    Science.gov (United States)

    Peart, D.B.; Antweiler, Ronald C.; Taylor, Howard E.; Roth, D.A.; Brinton, T.I.

    1998-01-01

    More than 100 US Geological Survey (USGS) Standard Reference Water Samples (SRWSs) were analyzed for numerous trace constituents, including Al, As, B, Ba, Be, Bi, Br, Cd, Cr, Co, Cu, I, Fe, Pb, Li, Mn, Mo, Ni, Rb, Sb, Se, Sr, Te, Tl, U, V, Zn and major elements (Ca, Mg, Na, SiO2, SO4, Cl) by inductively coupled plasma mass spectrometry and inductively coupled plasma atomic emission spectrometry. In addition, 15 USGS SRWSs and National Institute of Standards and Technology (NIST) standard reference material (SRM) 1641b were analyzed for mercury using cold vapor atomic fluorescence spectrometry. USGS SRWS Hg-7 was also analyzed using isotope dilution inductively coupled plasma mass spectrometry. The results were compared with the reported certified values of the following standard reference materials: NIST SRM 1643a, 1643b, 1643c and 1643d and National Research Council of Canada Riverine Water Reference Materials for Trace Metals SLRS-1, SLRS-2 and SLRS-3. New concentration values for trace and major elements in the SRWSs, traceable to the certified standards, are reported. Additional concentration values are reported for elements that were neither previously published for the SRWSs nor traceable to the certified reference materials. Robust statistical procedures were used that were insensitive to outliers. These data can be used for quality assurance/quality control purposes in analytical laboratories.
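    The outlier-insensitive statistics mentioned above can be illustrated with a minimal sketch: a median/MAD screen applied to hypothetical inter-laboratory replicate results. The function name, the cutoff k and the data are illustrative assumptions, not the paper's actual procedure.

```python
import statistics

def robust_reference_value(results, k=3.0):
    """Estimate a reference concentration from replicate lab results,
    down-weighting outliers via the median absolute deviation (MAD)."""
    med = statistics.median(results)
    mad = statistics.median(abs(x - med) for x in results)
    if mad == 0:
        return med
    # Keep only results within k scaled MADs of the median
    # (1.4826 makes the MAD consistent with sigma for normal data).
    kept = [x for x in results if abs(x - med) <= k * 1.4826 * mad]
    return statistics.median(kept)

# Hypothetical replicate results for one element (ug/L),
# including one gross outlier that a plain mean would absorb:
values = [10.1, 9.8, 10.3, 10.0, 9.9, 25.0]
print(robust_reference_value(values))  # the outlier 25.0 is rejected
```

    A plain mean of these values would be pulled above 12 by the single outlier; the screened median stays near the consensus of the remaining laboratories.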

  5. Certified reference material for radionuclides in fish flesh sample IAEA-414 (mixed fish from the Irish Sea and North Sea)

    DEFF Research Database (Denmark)

    Pham, M.K.; Sanchez-Cabeza, J.A.; Povinec, P.P.

    2006-01-01

    A certified reference material (CRM) for radionuclides in fish sample IAEA-414 (mixed fish from the Irish Sea and North Sea) is described and the results of the certification process are presented. Nine radionuclides (K-40, Cs-137, Th-232, U-234, U-235, U-238, Pu-238, Pu-239+240 and Am-241) were...... ratios are also included. The CRM can be used for quality assurance/quality control of the analysis of radionuclides in fish samples, for the development and validation of analytical methods and for training purposes. The material is available from the IAEA, Vienna, in 100 g units. (c) 2006 Elsevier Ltd. All...

  6. Assessing fundamental motor skills in Belgian children aged 3-8 years highlights differences to US reference sample.

    Science.gov (United States)

    Bardid, Farid; Huyben, Floris; Lenoir, Matthieu; Seghers, Jan; De Martelaer, Kristine; Goodway, Jacqueline D; Deconinck, Frederik J A

    2016-06-01

    This study aimed to understand the fundamental motor skills (FMS) of Belgian children using the process-oriented Test of Gross Motor Development, Second Edition (TGMD-2) and to investigate the suitability of using the United States (USA) test norms in Belgium. FMS were assessed using the TGMD-2. Gender, age and motor performance were examined in 1614 Belgian children aged 3-8 years (52.1% boys) and compared with the US reference sample. More proficient FMS performance was found with increasing age, from 3 to 6 years for locomotor skills and 3 to 7 years for object control skills. Gender differences were observed in object control skills, with boys performing better than girls. In general, Belgian children had lower levels of motor competence than the US reference sample, specifically for object control skills. The score distribution of the Belgian sample was skewed, with 37.4% scoring below average and only 6.9% scoring above average. This study supported the usefulness of the TGMD-2 as a process-oriented instrument to measure gross motor development in early childhood in Belgium. However, it also demonstrated that caution is warranted when using the US reference norms. ©2016 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.

  7. Solid phase speciation of arsenic by sequential extraction in standard reference materials and industrially contaminated soil samples

    International Nuclear Information System (INIS)

    Herreweghe, Samuel van; Swennen, Rudy; Vandecasteele, Carlo; Cappuyns, Valerie

    2003-01-01

    Leaching experiments, a mineralogical survey and larger samples are preferred when arsenic is present as discrete mineral phases. - Availability, mobility, (phyto)toxicity and potential risk of contaminants are strongly affected by the manner of appearance of elements, the so-called speciation. Operational fractionation methods like sequential extractions have long been applied to determine the solid phase speciation of heavy metals, since direct determination of specific chemical compounds cannot always be easily achieved. The three-step sequential extraction scheme recommended by the BCR and two extraction schemes based on the phosphorus-like protocol proposed by Manful (1992, Occurrence and Ecochemical Behaviours of Arsenic in a Goldsmelter Impacted Area in Ghana, PhD dissertation, RUG) were applied to four standard reference materials (SRM) and to a batch of samples from industrially contaminated sites, heavily contaminated with arsenic and heavy metals. SRM 2710 (Montana soil) was found to be the most useful reference material for metal (Mn, Cu, Zn, As, Cd and Pb) fractionation using the BCR sequential extraction procedure. Two sequential extraction schemes were developed and compared for arsenic with the aim of establishing better fractionation and recovery rates than the BCR scheme for this element in the SRM samples. The major part of the arsenic was released from the heavily contaminated samples after NaOH extraction. The poorer extraction variability and recovery in the heavily contaminated samples compared with the SRMs could mainly be attributed to subsample heterogeneity.

  8. Liquid Water from First Principles: Validation of Different Sampling Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Mundy, C J; Kuo, W; Siepmann, J; McGrath, M J; Vondevondele, J; Sprik, M; Hutter, J; Parrinello, M; Mohamed, F; Krack, M; Chen, B; Klein, M

    2004-05-20

    A series of first principles molecular dynamics and Monte Carlo simulations were carried out for liquid water to assess the validity and reproducibility of different sampling approaches. These simulations include Car-Parrinello molecular dynamics simulations using the program CPMD with different values of the fictitious electron mass in the microcanonical and canonical ensembles, Born-Oppenheimer molecular dynamics using the programs CPMD and CP2K in the microcanonical ensemble, and Metropolis Monte Carlo using CP2K in the canonical ensemble. With the exception of one simulation for 128 water molecules, all other simulations were carried out for systems consisting of 64 molecules. It is found that the structural and thermodynamic properties of these simulations are in excellent agreement with each other as long as adiabatic sampling is maintained in the Car-Parrinello molecular dynamics simulations, either by choosing a sufficiently small fictitious mass in the microcanonical ensemble or by Nosé-Hoover thermostats in the canonical ensemble. Using the Becke-Lee-Yang-Parr exchange and correlation energy functionals and norm-conserving Troullier-Martins or Goedecker-Teter-Hutter pseudopotentials, simulations at a fixed density of 1.0 g/cm³ and a temperature close to 315 K yield a height of the first peak in the oxygen-oxygen radial distribution function of about 3.0, a classical constant-volume heat capacity of about 70 J K⁻¹ mol⁻¹, and a self-diffusion constant of about 0.1 Å²/ps.

  9. Gender differences on the Five to Fifteen questionnaire in a non-referred sample with inattention and hyperactivity-impulsivity and a clinic-referred sample with hyperkinetic disorder

    DEFF Research Database (Denmark)

    Lambek, Rikke; Trillingsgaard, Anegen; Kadesjö, Björn

    2010-01-01

    The aim of the present study was to examine gender differences in children with inattention, hyperactivity, and impulsivity on the Five to Fifteen (FTF) parent questionnaire. First, non-referred girls (n = 43) and boys (n = 51) with problems of attention and hyperactivity-impulsivity and then cli... questionnaire. Secondly, it was examined whether the application of gender mixed norms versus gender specific norms would result in varying proportions of clinic-referred children with HKD being identified as impaired on the subdomains of the FTF questionnaire. Based on the results it was concluded that the use of a gender mixed normative sample may lead to overestimation of impairment in boys with HKD, but the type of sample applied to define impairment on the FTF should depend on the purpose for applying the questionnaire.

  10. Nuclear forensics and nuclear analytical chemistry - iridium determination in a referred forensic sample

    International Nuclear Information System (INIS)

    Basu, A.K.; Bhadkambekar, C.A.; Tripathi, A.B.R.; Chattopadhyay, N.; Ghosh, P.

    2010-01-01

    Nuclear approaches to compositional characterization have promising applications in forensics for assessing the nature and origin of seized material. The macro and micro physical properties of nuclear materials can be specifically associated with a process or type of nuclear activity. Within nuclear analytical chemistry and nuclear forensics, thrust areas of scientific endeavour such as the determination of radioisotopes, isotopic and mass ratios, analysis of impurity contents, and establishing chemical forms/species and physical parameters provide supporting evidence in forensic investigations. The analytical methods developed for these purposes can be used in international safeguards as well as in nuclear forensics. Nuclear material seized in nuclear trafficking can be identified and a profile of the nuclear material can be created.

  11. Referência: qual a referência e como evocá-la? Reference: what is it and how to approach it?

    Directory of Open Access Journals (Sweden)

    Leonor Scliar-Cabral

    2002-01-01

    Full Text Available Problems faced by theories which try to explain reference, namely indeterminacy, are discussed. Ontological categories together with common psycholinguistic processes, such as the use of contextual and extra-linguistic cues or shared knowledge, allow the recovery of the specific and by nature endless senses of a word and its referents. The effect of literacy and schooling on structuring and retrieving meaning is empirically demonstrated by Monteiro's data, obtained from 43 subjects with different levels of literacy and schooling. Two main preferred, although not mutually exclusive, strategies were observed: retrieval of linguistic conceptual meanings and of event representations.

  12. A reference sample for investigating the stability of the imaging system of x-ray computed tomography

    International Nuclear Information System (INIS)

    Sun, Wenjuan; Brown, Stephen; Flay, Nadia; McCarthy, Michael; McBride, John

    2016-01-01

    The use of x-ray computed tomography for dimensional measurements associated with engineering applications has flourished in recent years. However, the error sources associated with the technology are not well understood. In this paper, a novel two-sphere reference sample has been developed and used to investigate the stability of the imaging system that consists of an x-ray tube and a detector. In contrast with other reported research work, this work considered relative positional variation along the x-, y- and z-axes. This sample is a significant improvement over the one-sphere sample proposed previously, which can only be used to observe the stability of the imaging system along the x- and y-axes. Temperature variations of different parts of the system have been monitored and the relationship between temperature variations and x-ray image stability has been studied. Other effects that may also influence the stability of the imaging system are discussed. The proposed reference sample and testing method are transferable to other types of x-ray computed tomography systems, for example, systems with transmission targets and systems with sub-micrometre focal spots. (paper)

  13. k0-NAA applied to certified reference materials and hair samples. Evaluation of exposure level in a galvanising industry

    International Nuclear Information System (INIS)

    Menezes, M.A. de B.C.; Pereira Maia, E.C.

    2000-01-01

    The k0 parametric neutron activation analysis has been applied since 1995 in the Radiochemical Sector/CDTN, Belo Horizonte, Brazil. Several certified reference materials were studied with the aim of analysing biological samples. This work is related to an IAEA co-ordinated research project whose goal is to make a survey of the exposures to metals related to occupational diseases. It has been conducted by CDTN and government departments of health. The hair samples used as bioindicators were donated by galvanising factory workers in Belo Horizonte. This city and its surrounding area are important industrial centres, and that industry is responsible for the majority of patients who seek medical assistance because of metal contamination. The Al, Co, Cu, Cr, La, Mn, Sb and V concentrations determined in the workers' samples suggest endogenous contamination. (author)

  14. A summarization approach for Affymetrix GeneChip data using a reference training set from a large, biologically diverse database

    Directory of Open Access Journals (Sweden)

    Tripputi Mark

    2006-10-01

    Full Text Available Abstract Background Many of the most popular pre-processing methods for Affymetrix expression arrays, such as RMA, gcRMA, and PLIER, simultaneously analyze data across a set of predetermined arrays to improve the precision of the final measures of expression. One problem associated with these algorithms is that expression measurements for a particular sample are highly dependent on the set of samples used for normalization, and results obtained by normalization with a different set may not be comparable. A related problem is that an organization producing and/or storing large amounts of data in a sequential fashion will need to either re-run the pre-processing algorithm every time an array is added or store the arrays in batches that are pre-processed together. Furthermore, pre-processing of large numbers of arrays requires loading all the feature-level data into memory, which is a difficult task even with modern computers. We utilize a scheme that produces all the information necessary for pre-processing from a very large training set, which can then be used for summarization of samples outside of the training set. All subsequent pre-processing tasks can be done on an individual array basis. We demonstrate the utility of this approach by defining a new version of the Robust Multi-chip Averaging (RMA) algorithm, which we refer to as refRMA. Results We assess performance based on multiple sets of samples processed over HG U133A Affymetrix GeneChip® arrays. We show that the refRMA workflow, when used in conjunction with a large, biologically diverse training set, results in the same general characteristics as that of RMA in its classic form when comparing overall data structure, sample-to-sample correlation, and variation. Further, we demonstrate that the refRMA workflow and reference set can be robustly applied to naïve organ types and to benchmark data, where its performance indicates respectable results.
Conclusion Our results indicate that a biologically diverse
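    One ingredient of such a fixed-training-set scheme can be sketched as quantile normalization of a single new array against a frozen reference distribution computed once from the training set. This is a simplified illustration with hypothetical data; refRMA itself additionally freezes probe-effect estimates for the summarization step, which is not shown here.

```python
import numpy as np

def freeze_reference_quantiles(training_arrays):
    """Average the sorted intensities across training arrays once;
    this frozen distribution is all that later arrays need."""
    return np.mean(np.sort(np.asarray(training_arrays, float), axis=1), axis=0)

def normalize_single_array(new_array, ref_quantiles):
    """Map each probe's rank in the new array onto the frozen
    reference distribution -- no other arrays are required."""
    order = np.argsort(new_array)
    out = np.empty_like(ref_quantiles)
    out[order] = ref_quantiles
    return out

# Tiny hypothetical training set of two 4-probe arrays:
training = [[2.0, 4.0, 6.0, 8.0], [1.0, 3.0, 5.0, 7.0]]
ref = freeze_reference_quantiles(training)  # frozen once, stored
print(normalize_single_array(np.array([10.0, 0.5, 3.0, 2.0]), ref))
```

    Because `ref` is computed once and stored, each subsequent array can be normalized in isolation, which is exactly the property that removes the batch dependence described in the abstract.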

  15. Thermal radiation transfer calculations in combustion fields using the SLW model coupled with a modified reference approach

    Science.gov (United States)

    Darbandi, Masoud; Abrar, Bagher

    2018-01-01

    The spectral-line weighted-sum-of-gray-gases (SLW) model is a modern global model that can be used to predict thermal radiation heat transfer within combustion fields. Past SLW model users have mostly employed the reference approach to calculate the local values of the gray gases' absorption coefficients. This classical reference approach assumes that the absorption spectra of gases at different thermodynamic conditions are scalable with the absorption spectrum of the gas at a reference thermodynamic state in the domain. However, this assumption is not reasonable in combustion fields where the gas temperature is very different from the reference temperature. Consequently, the results of the SLW model incorporating the classical reference approach, hereafter the classical SLW method, are highly sensitive to the reference temperature magnitude in non-isothermal combustion fields. To lessen this sensitivity, the current work combines the SLW model with a modified reference approach, which is one particular form among the eight possible reference approach forms reported recently by Solovjov et al. [DOI: 10.1016/j.jqsrt.2017.01.034, 2017]. The combination is called the "modified SLW method". This work shows that the modified reference approach can provide more accurate total emissivity calculations than the classical reference approach when coupled with the SLW method. This would be particularly helpful for more accurate calculation of radiation transfer in highly non-isothermal combustion fields. To demonstrate this, we use both the classical and modified SLW methods to calculate the radiation transfer in such fields. It is shown that the modified SLW method can almost eliminate the sensitivity of the results to the chosen reference temperature in highly non-isothermal combustion fields.
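    The weighted-sum-of-gray-gases form underlying the SLW family can be sketched in a few lines. The weights and absorption coefficients below are hypothetical placeholders: a real SLW calculation derives them from the absorption-line blackbody distribution function evaluated at the chosen reference state, which is exactly where the reference approach discussed above enters.

```python
import math

def slw_total_emissivity(weights, kappas, path_length):
    """Weighted-sum-of-gray-gases total emissivity:
    eps = sum_i a_i * (1 - exp(-kappa_i * L)).
    A transparent 'gas' (kappa = 0) contributes nothing."""
    assert abs(sum(weights) - 1.0) < 1e-9, "gray-gas weights must sum to 1"
    return sum(a * (1.0 - math.exp(-k * path_length))
               for a, k in zip(weights, kappas))

# Hypothetical 3-gray-gas fit (first gas is the transparent window):
a = [0.3, 0.5, 0.2]      # weights, sum to 1
kappa = [0.0, 0.5, 5.0]  # absorption coefficients, 1/m
for L in (0.1, 1.0, 10.0):  # path lengths, m
    print(L, slw_total_emissivity(a, kappa, L))
```

    Emissivity grows monotonically with path length and saturates below 1 minus the transparent-window weight, which is the qualitative behaviour any gray-gas fit must reproduce.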

  16. Periodic reference tracking control approach for smart material actuators with complex hysteretic characteristics

    Science.gov (United States)

    Sun, Zhiyong; Hao, Lina; Song, Bo; Yang, Ruiguo; Cao, Ruimin; Cheng, Yu

    2016-10-01

    Micro/nano positioning technologies have been attractive for decades for their various applications in both industrial and scientific fields. The actuators employed in these technologies are typically smart material actuators, which possess inherent hysteresis that may cause systems to behave unexpectedly. Periodic reference tracking capability is fundamental for apparatuses such as the scanning probe microscope, which employs smart material actuators to generate periodic scanning motion. However, a traditional controller such as the PID method cannot guarantee accurate fast periodic scanning motion. To tackle this problem and to allow practical implementation in digital devices, this paper proposes a novel control method named the discrete extended unparallel Prandtl-Ishlinskii model based internal model (d-EUPI-IM) control approach. To tackle modeling uncertainties, the robust d-EUPI-IM control approach is investigated, and the associated sufficient stabilizing conditions are derived. The advantages of the proposed controller are: it is designed and represented in discrete form, and is thus practical for implementation on digital devices; the extended unparallel Prandtl-Ishlinskii model can precisely represent forward/inverse complex hysteretic characteristics, which reduces modeling uncertainties and benefits controller design; and the internal model principle based control module can be utilized as a natural oscillator for tackling the periodic reference tracking problem. The proposed controller was verified through comparative experiments on a piezoelectric actuator platform, and convincing results have been achieved.
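    The classical Prandtl-Ishlinskii construction that the d-EUPI model extends can be sketched as a weighted superposition of discrete play operators. The thresholds, weights, and input sweep below are illustrative assumptions; the paper's extended unparallel model and its inverse are considerably more general.

```python
def pi_hysteresis(inputs, radii, weights):
    """Classical discrete Prandtl-Ishlinskii model: a weighted
    superposition of play (backlash) operators. Each play operator
    with radius r updates as
        y_k = max(x_k - r, min(x_k + r, y_{k-1}))."""
    states = [0.0] * len(radii)
    outputs = []
    for x in inputs:
        for i, r in enumerate(radii):
            states[i] = max(x - r, min(x + r, states[i]))
        outputs.append(sum(w * s for w, s in zip(weights, states)))
    return outputs

# A triangular input sweep exposes the hysteresis loop: the output at
# a given input value differs between the rising and falling branches.
x_up = [i / 10 for i in range(11)]          # 0.0 -> 1.0
x_dn = [i / 10 for i in range(9, -1, -1)]   # 0.9 -> 0.0
y = pi_hysteresis(x_up + x_dn, radii=[0.0, 0.1, 0.3], weights=[0.5, 0.3, 0.2])
```

    Comparing `y` at the same input on the up- and down-sweeps shows the branch-dependent output that makes plain PID control of such actuators inaccurate.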

  17. Composite Sampling Approaches for Bacillus anthracis Surrogate Extracted from Soil.

    Directory of Open Access Journals (Sweden)

    Brian France

    Full Text Available Any release of anthrax spores in the U.S. would require action to decontaminate the site and restore its use and operations as rapidly as possible. The remediation activity would require environmental sampling, both initially to determine the extent of contamination (hazard mapping) and post-decontamination to determine that the site is free of contamination (clearance sampling). Whether the spore contamination is within a building or outdoors, collecting and analyzing what could be thousands of samples can become the factor that limits the pace of restoring operations. To address this sampling and analysis bottleneck and decrease the time needed to recover from an anthrax contamination event, this study investigates the use of composite sampling. Pooling or compositing of samples is an established technique to reduce the number of analyses required, and its use for anthrax spore sampling has recently been investigated. However, use of composite sampling in an anthrax spore remediation event will require well-documented and accepted methods. In particular, previous composite sampling studies have focused on sampling from hard surfaces; data on soil sampling are required to extend the procedure to outdoor use. Further, we must consider whether combining liquid samples, thus increasing the volume, lowers the sensitivity of detection and produces false negatives. In this study, methods to composite bacterial spore samples from soil are demonstrated. B. subtilis spore suspensions were used as a surrogate for anthrax spores. Two soils (Arizona Test Dust and sterilized potting soil) were contaminated, and spore recovery with composites was shown to match individual sample performance. Results show that dilution can be overcome by concentrating bacterial spores using standard filtration methods.
    This study shows that composite sampling can be a viable method of pooling samples to reduce the number of analyses that must be performed during anthrax spore remediation.
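    The workload saving from compositing can be illustrated with the classical two-stage (Dorfman-style) pooled testing calculation, assuming independent contamination at prevalence p. This back-of-envelope sketch is not from the study itself, and it ignores the dilution/sensitivity question the authors address by filtration.

```python
def expected_tests_per_sample(prevalence, pool_size):
    """Two-stage pooled testing: analyze each pool of n samples once;
    if a pool is positive, retest its n members individually.
    Expected analyses per sample = 1/n + (1 - (1 - p)**n)."""
    n = pool_size
    p_pool_positive = 1.0 - (1.0 - prevalence) ** n
    return 1.0 / n + p_pool_positive

# With 1% of samples contaminated, pools of ~10 cut the expected
# analysis workload roughly five-fold versus one test per sample:
for n in (5, 10, 20):
    print(n, expected_tests_per_sample(0.01, n))
```

    The optimum pool size balances the fixed cost of one pool test against the retesting triggered by positive pools; at 1% prevalence it sits near n = 10, and compositing stops paying off as prevalence rises.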

  18. k0-INAA for APM samples collected in period of June 2004 - March 2005 and some marine certified reference materials

    International Nuclear Information System (INIS)

    Dung, Ho Manh; Vu, Cao Dong; Y, Truong; Sy, Nguyen Thi

    2006-01-01

    Airborne particulate matter (APM) samples were collected in 2004 using two types of polycarbonate membrane filter, PM 2.5 and PM 2.5-10, at two sites in industrial (Ho Chi Minh City) and rural (Dateh) regions in the south of Vietnam. Three marine certified reference materials were selected to establish a k0-NAA procedure for marine samples. The concentrations of trace elements in the samples were determined by the k0-INAA procedure using the k0-DALAT software developed at the Dalat NRI. About 28 elements in 224 APM samples collected in the Dateh and HCMC areas of Vietnam in the period from June 2004 to March 2005 are presented in this report. Statistical analysis was applied to the data set to investigate the pollution sources at the sampling sites. The results proved that k0-NAA on the Dalat research reactor is a reliable and effective analytical technique for the characterization of trace elements in APM and marine samples for air and marine environmental pollution studies in Vietnam. (author)

  19. Is avoidant disorder part of the social phobia spectrum in a referred sample of Brazilian children and adolescents?

    Directory of Open Access Journals (Sweden)

    Denardin D.

    2004-01-01

    Full Text Available The diagnosis of avoidant disorder was deleted from the Diagnostic and Statistical Manual of Mental Disorders - fourth edition (DSM-IV) based on a 'committee decision' suggesting that avoidant disorder is part of the social phobia spectrum. The objective of the present study was to examine the nature of this clinical association in a referred sample of Brazilian children and adolescents. We assessed a referred sample of 375 youths using semi-structured diagnostic interview methodology. Demographic (age at admission to the study and sex) and clinical (level of impairment, age at onset of symptoms and pattern of comorbidity) data were assessed in subsamples of children with avoidant disorder (N = 7), social phobia (N = 26), and comorbidity between both disorders (N = 24). Although a significant difference in the male/female ratio was detected among groups (P = 0.03), none of the other clinical variables differed significantly among subjects that presented each condition separately or in combination. Most of the children with avoidant disorder fulfilled criteria for social phobia. Thus, our findings support the validity of the conceptualization of avoidant disorder as part of the social phobia spectrum in a clinical sample.

  20. Is avoidant disorder part of the social phobia spectrum in a referred sample of Brazilian children and adolescents?

    Directory of Open Access Journals (Sweden)

    D. Denardin

    2004-06-01

    Full Text Available The diagnosis of avoidant disorder was deleted from the Diagnostic and Statistical Manual of Mental Disorders - fourth edition (DSM-IV) based on a 'committee decision' suggesting that avoidant disorder is part of the social phobia spectrum. The objective of the present study was to examine the nature of this clinical association in a referred sample of Brazilian children and adolescents. We assessed a referred sample of 375 youths using semi-structured diagnostic interview methodology. Demographic (age at admission to the study and sex) and clinical (level of impairment, age at onset of symptoms and pattern of comorbidity) data were assessed in subsamples of children with avoidant disorder (N = 7), social phobia (N = 26), and comorbidity between both disorders (N = 24). Although a significant difference in the male/female ratio was detected among groups (P = 0.03), none of the other clinical variables differed significantly among subjects that presented each condition separately or in combination. Most of the children with avoidant disorder fulfilled criteria for social phobia. Thus, our findings support the validity of the conceptualization of avoidant disorder as part of the social phobia spectrum in a clinical sample.

  1. A novel approach to process carbonate samples for radiocarbon measurements with helium carrier gas

    Energy Technology Data Exchange (ETDEWEB)

    Wacker, L., E-mail: wacker@phys.ethz.ch [Laboratory of Ion Beam Physics, ETH Zurich, 8093 Zurich (Switzerland); Fueloep, R.-H. [Institute of Geology and Mineralogy, University of Cologne, 50674 Cologne (Germany); Hajdas, I. [Laboratory of Ion Beam Physics, ETH Zurich, 8093 Zurich (Switzerland); Molnar, M. [Laboratory of Ion Beam Physics, ETH Zurich, 8093 Zurich (Switzerland); Institute of Nuclear Research, Hungarian Academy of Sciences, 4026 Debrecen (Hungary); Rethemeyer, J. [Institute of Geology and Mineralogy, University of Cologne, 50674 Cologne (Germany)

    2013-01-15

    Most laboratories prepare carbonate samples for radiocarbon analysis by acid decomposition in evacuated glass tubes and subsequent reduction of the evolved CO{sub 2} to graphite in self-made reduction manifolds. This process is time consuming and labor intensive. In this work, we have tested a new approach for the preparation of carbonate samples, where any high-vacuum system is avoided and helium is used as a carrier gas. The liberation of CO{sub 2} from carbonates with phosphoric acid is performed in a similar way as is often done in stable isotope ratio mass spectrometry, where CO{sub 2} is released with acid in a septum-sealed tube under a helium atmosphere. The formed CO{sub 2} is later flushed in a helium flow by means of a double-walled needle from the tubes to the zeolite trap of the automated graphitization equipment (AGE). It essentially replaces the elemental analyzer normally used for the combustion of organic samples. The process can be fully automated, from sampling the released CO{sub 2} in the septum-sealed tubes with a commercially available auto-sampler to the graphitization in the AGE. The new method yields low sample blanks of about 50,000 years. Results of processed reference materials (IAEA-C2, FIRI-C) are in agreement with their consensus values.
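A blank of about 50,000 years can be related to the measured carbon-14 content through the conventional radiocarbon age relation, which uses the Libby mean life of 8033 yr. The sketch below is an illustration of that conversion, not code from the paper:

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years, conventional radiocarbon mean life (5568 / ln 2)

def apparent_age(f14c):
    """Conventional radiocarbon age of a sample with fraction modern f14c."""
    return -LIBBY_MEAN_LIFE * math.log(f14c)

def fraction_modern(age_years):
    """Fraction modern corresponding to a conventional radiocarbon age."""
    return math.exp(-age_years / LIBBY_MEAN_LIFE)

# A processing blank near 50,000 yr corresponds to roughly 0.2% modern carbon
print(fraction_modern(50_000))
```

In other words, the reported blank level means the procedure itself contributes only on the order of two parts per thousand of modern carbon.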

  2. New sampling strategy using a Bayesian approach to assess iohexol clearance in kidney transplant recipients.

    Science.gov (United States)

    Benz-de Bretagne, I; Le Guellec, C; Halimi, J M; Gatault, P; Barbet, C; Alnajjar, A; Büchler, M; Lebranchu, Y; Andres, Christian Robert; Vourcʼh, P; Blasco, H

    2012-06-01

    Glomerular filtration rate (GFR) measurement is a major issue for clinicians managing kidney transplant recipients. GFR can be determined by estimating the plasma clearance of iohexol, a nonradiolabeled compound. For practical and convenient application for patients and caregivers, it is important that a minimal number of samples are drawn. The aim of this study was to develop and validate a Bayesian model with fewer samples for reliable prediction of GFR in kidney transplant recipients. Iohexol plasma concentration-time curves from 95 patients were divided into an index (n = 63) and a validation set (n = 32). Samples (n = 4-6 per patient) were obtained during the elimination phase, that is, between 120 and 270 minutes. Individual reference values of iohexol clearance (CL(iohexol)) were calculated from k (elimination slope) and V (volume of distribution from intercept). Individual CL(iohexol) values were then introduced into the Bröchner-Mortensen equation to obtain the GFR (reference value). A population pharmacokinetic model was developed from the index set and validated using standard methods. For the validation set, we tested various combinations of 1, 2, or 3 sampling times to estimate CL(iohexol). According to the different combinations tested, a maximum a posteriori Bayesian estimate of CL(iohexol) was obtained from population parameters. Individual estimates of GFR were compared with individual reference values through analysis of bias and precision. A capability analysis allowed us to determine the best sampling strategy for Bayesian estimation. A 1-compartment model best described our data. Covariate analysis showed that uremia, serum creatinine, and age were significantly associated with k(e), and weight with V. The strategy including samples drawn at 120 and 270 minutes allowed accurate prediction of GFR (mean bias: -3.71%, mean imprecision: 7.77%). With this strategy, about 20% of individual predictions were outside the bounds of acceptance set at ± 10
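The reference calculation the abstract describes (clearance from the elimination slope k and intercept-derived V, then the Bröchner-Mortensen correction) can be sketched as below. The dose, sampling times, and concentrations are hypothetical, and the correction coefficients used are the commonly cited adult Bröchner-Mortensen values; this is an illustration, not the authors' code:

```python
import numpy as np

def slope_intercept_clearance(t_min, conc, dose):
    """One-pool clearance from elimination-phase samples: fit
    ln C(t) = ln C0 - k*t, then CL = dose * k / C0 (slope-intercept method)."""
    slope, intercept = np.polyfit(t_min, np.log(conc), 1)
    k = -slope                    # elimination slope
    c0 = np.exp(intercept)        # extrapolated intercept; V = dose / c0
    return dose * k / c0

def brochner_mortensen(cl_ml_min):
    """Correct the one-pool clearance (mL/min) for the missed early
    distribution phase; commonly cited adult coefficients."""
    return 0.990778 * cl_ml_min - 0.001218 * cl_ml_min ** 2

# Hypothetical patient, simulated mono-exponential elimination phase
dose = 3235.0                                 # mg iohexol injected
t = np.array([120.0, 180.0, 240.0, 270.0])    # min, within the sampling window
c = 416.0 * np.exp(-0.009 * t)                # mg/L, simulated concentrations

cl_ml_min = 1000.0 * slope_intercept_clearance(t, c, dose)  # L/min -> mL/min
gfr = brochner_mortensen(cl_ml_min)
```

The correction always lowers the one-pool clearance, since samples drawn only after 120 minutes miss the early distribution phase.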

  3. Sensitivity/uncertainty analysis of a borehole scenario comparing Latin Hypercube Sampling and deterministic sensitivity approaches

    International Nuclear Information System (INIS)

    Harper, W.V.; Gupta, S.K.

    1983-10-01

    A computer code was used to study steady-state flow for a hypothetical borehole scenario. The model consists of three coupled equations with only eight parameters and three dependent variables. This study focused on steady-state flow as the performance measure of interest. Two different approaches to sensitivity/uncertainty analysis were used on this code. One approach, based on Latin Hypercube Sampling (LHS), is a statistical sampling method, whereas the second approach is based on the deterministic evaluation of sensitivities. The LHS technique is easy to apply and should work well for codes with a moderate number of parameters. Of deterministic techniques, the direct method is preferred when there are many performance measures of interest and a moderate number of parameters. The adjoint method is recommended when there are a limited number of performance measures and an unlimited number of parameters. This unlimited number of parameters capability can be extremely useful for finite element or finite difference codes with a large number of grid blocks. The Office of Nuclear Waste Isolation will use the technique most appropriate for an individual situation. For example, the adjoint method may be used to reduce the scope to a size that can be readily handled by a technique such as LHS. Other techniques for sensitivity/uncertainty analysis, e.g., kriging followed by conditional simulation, will be used also. 15 references, 4 figures, 9 tables
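Latin Hypercube Sampling of the kind used here can be illustrated with a minimal numpy sketch (not the code from this study): each parameter axis is divided into as many equal-probability strata as there are runs, and each stratum is sampled exactly once per dimension:

```python
import numpy as np

def latin_hypercube(n_samples, n_params, rng):
    """Latin Hypercube Sample on the unit cube: in every dimension, each of
    the n_samples equal-probability strata is sampled exactly once."""
    within = rng.random((n_samples, n_params))            # position inside each stratum
    strata = np.column_stack([rng.permutation(n_samples)  # stratum order per dimension
                              for _ in range(n_params)])
    return (strata + within) / n_samples

# e.g. 100 runs of a hypothetical 8-parameter flow code
design = latin_hypercube(100, 8, np.random.default_rng(42))
# scale column j to a parameter range [lo, hi] as lo + design[:, j] * (hi - lo)
```

Compared with simple random sampling, the stratification guarantees that each parameter's full range is covered even with a modest number of runs, which is why LHS suits codes with a moderate number of parameters.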

  4. An Efficient Approach for Mars Sample Return Using Emerging Commercial Capabilities.

    Science.gov (United States)

    Gonzales, Andrew A; Stoker, Carol R

    2016-06-01

    Mars Sample Return is the highest priority science mission for the next decade as recommended by the 2011 Decadal Survey of Planetary Science [1]. This article presents the results of a feasibility study for a Mars Sample Return mission that efficiently uses emerging commercial capabilities expected to be available in the near future. The motivation of our study was the recognition that emerging commercial capabilities might be used to perform Mars Sample Return with an Earth-direct architecture, and that this may offer a simpler and lower-cost approach. The objective of the study was to determine whether these capabilities can be used to optimize the number of mission systems and launches required to return the samples, with the goal of achieving the desired simplicity. All of the major elements required for the Mars Sample Return mission are described. Mission system elements were analyzed with either direct techniques or by using parametric mass estimating relationships. The analysis shows the feasibility of a complete and closed Mars Sample Return mission design based on the following scenario: A SpaceX Falcon Heavy launch vehicle places a modified version of a SpaceX Dragon capsule, referred to as "Red Dragon", onto a Trans Mars Injection trajectory. The capsule carries all the hardware needed to return to Earth Orbit samples collected by a prior mission, such as the planned NASA Mars 2020 sample collection rover. The payload includes a fully fueled Mars Ascent Vehicle; a fueled Earth Return Vehicle, support equipment, and a mechanism to transfer samples from the sample cache system onboard the rover to the Earth Return Vehicle. The Red Dragon descends to land on the surface of Mars using Supersonic Retropropulsion. After collected samples are transferred to the Earth Return Vehicle, the single-stage Mars Ascent Vehicle launches the Earth Return Vehicle from the surface of Mars to a Mars phasing orbit. After a brief phasing period, the Earth Return

  5. Novel Sample-handling Approach for XRD Analysis with Minimal Sample Preparation

    Science.gov (United States)

    Sarrazin, P.; Chipera, S.; Bish, D.; Blake, D.; Feldman, S.; Vaniman, D.; Bryson, C.

    2004-01-01

    Sample preparation and sample handling are among the most critical operations associated with X-ray diffraction (XRD) analysis. These operations require attention in a laboratory environment, but they become a major constraint in the deployment of XRD instruments for robotic planetary exploration. We are developing a novel sample handling system that dramatically relaxes the constraints on sample preparation by allowing characterization of coarse-grained material that would normally be impossible to analyze with conventional powder-XRD techniques.

  6. An approach to local diagnostic reference levels (DRL's) in the context of national and international DRL's

    International Nuclear Information System (INIS)

    Rogers, A.T.

    2001-01-01

    In recent years there has been a greater focus on the management of patient doses. This effort has been driven by the realisation of both the increasing magnitude of patient doses and their variation both intra- and inter-nationally. Legislators and guidance-issuing bodies have developed the idea of 'Diagnostic Reference Levels' (DRL's). In particular, the European Union, in their Council Directive 97/43/Euratom, required Member States to develop DRL's. The UK Government, when consolidating this EU Directive into UK legislation, extended the concept of DRL's from a national to an employer level. However, the methodologies used for development of national and international DRL's do not translate to a local level and hence a new approach is required. This paper describes one particular approach made by a UK hospital to introduce 'Local DRL's' in such a manner as to aid the optimisation process. This approach utilises a dose index, based on the local patient population, which is monitored for trends. Any trend in patient dose triggers an investigation linked to the clinical audit system within the Clinical Radiology Department. It is the audit cycle that ensures a continuing move towards an optimised situation. Additional triggers may be employed such as large patient dose variations. (author)

  7. Applicability of the "Frame of Reference" approach for environmental monitoring of offshore renewable energy projects.

    Science.gov (United States)

    Garel, Erwan; Rey, Cibran Camba; Ferreira, Oscar; van Koningsveld, Mark

    2014-08-01

    This paper assesses the applicability of the Frame of Reference (FoR) approach for the environmental monitoring of large-scale offshore Marine Renewable Energy (MRE) projects. The focus is on projects harvesting energy from winds, waves and currents. Environmental concerns induced by MRE projects are reported based on a classification scheme identifying stressors, receptors, effects and impacts. Although the potential effects of stressors on most receptors are identified, there are large knowledge gaps regarding the corresponding (positive and negative) impacts. In that context, the development of offshore MRE requires the implementation of fit-for-purpose monitoring activities aimed at environmental protection and knowledge development. Taking European legislation as an example, it is suggested to adopt standardized monitoring protocols for the enhanced usage and utility of environmental indicators. Towards this objective, the use of the FoR approach is advocated since it provides guidance for the definition and use of a coherent set of environmental state indicators. After a description of this framework, various examples of applications are provided considering a virtual MRE project located in European waters. Finally, some conclusions and recommendations are provided for the successful implementation of the FoR approach and for future studies. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. A study on identification of bacteria in environmental samples using single-cell Raman spectroscopy: feasibility and reference libraries.

    Science.gov (United States)

    Baritaux, Jean-Charles; Simon, Anne-Catherine; Schultz, Emmanuelle; Emain, C; Laurent, P; Dinten, Jean-Marc

    2016-05-01

    We report on our recent efforts towards identifying bacteria in environmental samples by means of Raman spectroscopy. We established a database of Raman spectra from bacteria submitted to various environmental conditions. This dataset was used to verify that Raman typing is possible from measurements performed in non-ideal conditions. Starting from the same dataset, we then varied the phenotype and matrix diversity content included in the reference library used to train the statistical model. The results show that it is possible to obtain models with an extended coverage of spectral variabilities, compared to environment-specific models trained on spectra from a restricted set of conditions. Broad coverage models are desirable for environmental samples since the exact conditions of the bacteria cannot be controlled.

  9. Certified reference material for radionuclides in fish flesh sample IAEA-414 (mixed fish from the Irish Sea and North Sea)

    International Nuclear Information System (INIS)

    Pham, M.K.; Sanchez-Cabeza, J.A.; Povinec, P.P.; Arnold, D.; Benmansour, M.; Bojanowski, R.; Carvalho, F.P.; Kim, C.K.; Esposito, M.; Gastaud, J.; Gasco, C.L.; Ham, G.J.; Hegde, A.G.; Holm, E.; Jaskierowicz, D.; Kanisch, G.; Llaurado, M.; La Rosa, J.; Lee, S.-H.; Liong Wee Kwong, L.; Le Petit, G.; Maruo, Y.; Nielsen, S.P.; Oh, J.-S.; Oregioni, B.; Palomares, J.; Pettersson, H.B.L.; Rulik, P.; Ryan, T.P.; Sato, K.; Schikowski, J.; Skwarzec, B.; Smedley, P.A.; Tarjan, S.; Vajda, N.; Wyse, E.

    2006-01-01

    A certified reference material (CRM) for radionuclides in fish sample IAEA-414 (mixed fish from the Irish Sea and North Sea) is described and the results of the certification process are presented. Nine radionuclides (40K, 137Cs, 232Th, 234U, 235U, 238U, 238Pu, 239+240Pu and 241Am) were certified for this material. Information on massic activities with 95% confidence intervals is given for six other radionuclides (90Sr, 210Pb(210Po), 226Ra, 239Pu, 240Pu and 241Pu). Less frequently reported radionuclides (99Tc, 129I, 228Th, 230Th and 237Np) and information on some activity and mass ratios are also included. The CRM can be used for quality assurance/quality control of the analysis of radionuclides in fish samples, for the development and validation of analytical methods and for training purposes. The material is available from IAEA, Vienna, in 100 g units.

  10. Short Note An integrated remote sampling approach for aquatic ...

    African Journals Online (AJOL)

    A sampling method and apparatus for collecting meaningful and quantifiable samples of aquatic macroinvertebrates, and the macrophytes they are associated with, are presented. Where physical danger from wildlife is a significant factor, especially in Africa, this apparatus offers some safety in that it can be operated from a ...

  11. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples

    Directory of Open Access Journals (Sweden)

    Rígel Licier

    2016-10-01

    Full Text Available The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronco-alveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to recognize the most recent clinical research work conducted with atypical samples and analyzed by quantitative proteomics. Having as reference the most recent and different approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine.

  12. A Quantitative Proteomics Approach to Clinical Research with Non-Traditional Samples.

    Science.gov (United States)

    Licier, Rígel; Miranda, Eric; Serrano, Horacio

    2016-10-17

    The proper handling of samples to be analyzed by mass spectrometry (MS) can guarantee excellent results and a greater depth of analysis when working in quantitative proteomics. This is critical when trying to assess non-traditional sources such as ear wax, saliva, vitreous humor, aqueous humor, tears, nipple aspirate fluid, breast milk/colostrum, cervical-vaginal fluid, nasal secretions, bronco-alveolar lavage fluid, and stools. We intend to provide the investigator with relevant aspects of quantitative proteomics and to recognize the most recent clinical research work conducted with atypical samples and analyzed by quantitative proteomics. Having as reference the most recent and different approaches used with non-traditional sources allows us to compare new strategies in the development of novel experimental models. On the other hand, these references help us to contribute significantly to the understanding of the proportions of proteins in different proteomes of clinical interest and may lead to potential advances in the emerging field of precision medicine.

  13. Computing the Free Energy Barriers for Less by Sampling with a Coarse Reference Potential while Retaining Accuracy of the Target Fine Model.

    Science.gov (United States)

    Plotnikov, Nikolay V

    2014-08-12

    Proposed in this contribution is a protocol for calculating fine-physics (e.g., ab initio QM/MM) free-energy surfaces at a high level of accuracy locally (e.g., only at reactants and at the transition state for computing the activation barrier) from targeted fine-physics sampling and extensive exploratory coarse-physics sampling. The full free-energy surface is still computed but at a lower level of accuracy from coarse-physics sampling. The method is analytically derived in terms of the umbrella sampling and the free-energy perturbation methods which are combined with the thermodynamic cycle and the targeted sampling strategy of the paradynamics approach. The algorithm starts by computing low-accuracy fine-physics free-energy surfaces from the coarse-physics sampling in order to identify the reaction path and to select regions for targeted sampling. Thus, the algorithm does not rely on the coarse-physics minimum free-energy reaction path. Next, segments of high-accuracy free-energy surface are computed locally at selected regions from the targeted fine-physics sampling and are positioned relative to the coarse-physics free-energy shifts. The positioning is done by averaging the free-energy perturbations computed with multistep linear response approximation method. This method is analytically shown to provide results of the thermodynamic integration and the free-energy interpolation methods, while being extremely simple in implementation. Incorporating the metadynamics sampling to the algorithm is also briefly outlined. The application is demonstrated by calculating the B3LYP//6-31G*/MM free-energy barrier for an enzymatic reaction using a semiempirical PM6/MM reference potential. These modifications allow computing the activation free energies at a significantly reduced computational cost but at the same level of accuracy compared to computing full potential of mean force.
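The free-energy perturbation step that positions the fine-physics segments relative to the coarse-physics surface rests on the one-step Zwanzig average, dA = -kT ln⟨exp(-(E_fine - E_coarse)/kT)⟩_coarse. A toy illustration with harmonic "coarse" and "fine" potentials, where the shift is known analytically (all numbers illustrative, not from the paper):

```python
import numpy as np

kT = 1.0
k_coarse, k_fine = 1.0, 1.2      # toy harmonic force constants

rng = np.random.default_rng(7)
# sample the coarse (reference) Boltzmann distribution: N(0, kT / k_coarse)
x = rng.normal(0.0, np.sqrt(kT / k_coarse), size=200_000)

# one-step FEP (Zwanzig) estimate of the coarse -> fine free-energy shift:
#   dA = -kT * ln < exp(-(E_fine - E_coarse) / kT) >_coarse
dE = 0.5 * (k_fine - k_coarse) * x**2
dA_fep = -kT * np.log(np.mean(np.exp(-dE / kT)))

dA_exact = 0.5 * kT * np.log(k_fine / k_coarse)  # analytic result for harmonic wells
```

The estimator converges well only where the coarse ensemble overlaps the fine one, which is exactly why the protocol restricts fine-physics sampling to targeted regions such as the reactant and transition states.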

  14. A new approach to motion control of torque-constrained manipulators by using time-scaling of reference trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Moreno-Valenzuela, Javier; Orozco-Manriquez, Ernesto [Digital del IPN, CITEDI-IPN, Tijuana, (Mexico)

    2009-12-15

    We introduce a control scheme based on using a trajectory tracking controller and an algorithm for on-line time scaling of the reference trajectories. The reference trajectories are time-scaled according to the measured tracking errors and the detected torque/acceleration saturation. Experiments are presented to illustrate the advantages of the proposed approach.

  15. A new approach to motion control of torque-constrained manipulators by using time-scaling of reference trajectories

    International Nuclear Information System (INIS)

    Moreno-Valenzuela, Javier; Orozco-Manriquez, Ernesto

    2009-01-01

    We introduce a control scheme based on using a trajectory tracking controller and an algorithm for on-line time scaling of the reference trajectories. The reference trajectories are time-scaled according to the measured tracking errors and the detected torque/acceleration saturation. Experiments are presented to illustrate the advantages of the proposed approach.

  16. French approach on the definition of reference defects to be considered for fracture mechanics analyses at design state

    Energy Technology Data Exchange (ETDEWEB)

    Grandemange, J M; Pellissier-Tanon, A [Societe Franco-Americaine de Constructions Atomiques (FRAMATOME), 92 - Paris-La-Defense (France)

    1988-12-31

    This document describes the French approach for verifying the fracture resistance of PWR primary components. Three reference defects have been defined, namely the envelope defect, the exceptional defect and the conventional defect. It appears that a precise estimation of the available margins may be obtained by analyzing a set of reference defects representative of the flaws likely to exist in the components. (TEC). 5 refs.

  17. Critical assessment of the performance of electronic moisture analyzers for small amounts of environmental samples and biological reference materials.

    Science.gov (United States)

    Krachler, M

    2001-12-01

    Two electronic moisture analyzers were critically evaluated with regard to their suitability for determining moisture in small amounts of environmental matrices such as leaves, needles, soil, peat, sediments, and sewage sludge, as well as in various biological reference materials. To this end, several homogeneous bulk materials were prepared which were subsequently employed for the development and optimization of all analytical procedures. The key features of the moisture analyzers included a halogen or ceramic heater and an integrated balance with a resolution of 0.1 mg, which is an essential prerequisite for obtaining precise results. Oven drying of the bulk materials in a conventional oven at 105 degrees C until constant mass served as the reference method. A heating temperature of 65 degrees C was found to provide accurate and precise results for almost all matrices investigated. To further improve the accuracy and precision, other critical parameters such as handling of sample pans, standby temperature, and measurement delay were optimized. Because of its ponderous heating behavior, the performance of the ceramic radiator was inferior to that of the halogen heater, which produced moisture results comparable to those obtained by oven drying. The developed drying procedures were successfully applied to the fast moisture analysis (1.4-6.3 min) of certified biological reference materials of similar provenance to the investigated bulk materials. Moisture results for 200 mg aliquots ranged from 1.4 to 7.8%, and good agreement was obtained between the recommended drying procedure for the reference materials and the electronic moisture analyzers, with absolute uncertainties amounting to 0.1% and 0.2-0.3%, respectively.

  18. Development of an evidence-based approach to external quality assurance for breast cancer hormone receptor immunohistochemistry: comparison of reference values.

    Science.gov (United States)

    Makretsov, Nikita; Gilks, C Blake; Alaghehbandan, Reza; Garratt, John; Quenneville, Louise; Mercer, Joel; Palavdzic, Dragana; Torlakovic, Emina E

    2011-07-01

    External quality assurance and proficiency testing programs for breast cancer predictive biomarkers are based largely on traditional ad hoc design; at present there is no universal consensus on definition of a standard reference value for samples used in external quality assurance programs. To explore reference values for estrogen receptor and progesterone receptor immunohistochemistry in order to develop an evidence-based analytic platform for external quality assurance. There were 31 participating laboratories, 4 of which were previously designated as "expert" laboratories. Each participant tested a tissue microarray slide with 44 breast carcinomas for estrogen receptor and progesterone receptor and submitted it to the Canadian Immunohistochemistry Quality Control Program for analysis. Nuclear staining in 1% or more of the tumor cells was a positive score. Five methods for determining reference values were compared. All reference values showed 100% agreement for estrogen receptor and progesterone receptor scores, when indeterminate results were excluded. Individual laboratory performance (agreement rates, test sensitivity, test specificity, positive predictive value, negative predictive value, and κ value) was very similar for all reference values. Identification of suboptimal performance by all methods was identical for 30 of 31 laboratories. Estrogen receptor assessment of 1 laboratory was discordant: agreement was less than 90% for 3 of 5 reference values and greater than 90% with the use of 2 other reference values. Various reference values provide equivalent laboratory rating. In addition to descriptive feedback, our approach allows calculation of technical test sensitivity and specificity, positive and negative predictive values, agreement rates, and κ values to guide corrective actions.
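The per-laboratory statistics this approach enables (agreement rate, test sensitivity, test specificity, predictive values, and κ) can all be computed from binary scores against a chosen reference value. A minimal sketch, with hypothetical score vectors standing in for the 44-core tissue microarray:

```python
def performance_vs_reference(reference, lab):
    """Agreement, sensitivity, specificity, PPV, NPV and Cohen's kappa of a
    laboratory's binary scores against a reference value (1 = positive)."""
    pairs = list(zip(reference, lab))
    tp = sum(r == 1 and l == 1 for r, l in pairs)
    tn = sum(r == 0 and l == 0 for r, l in pairs)
    fp = sum(r == 0 and l == 1 for r, l in pairs)
    fn = sum(r == 1 and l == 0 for r, l in pairs)
    n = len(pairs)
    po = (tp + tn) / n                                            # observed agreement
    pe = ((tp + fn) * (tp + fp) + (tn + fp) * (tn + fn)) / n**2   # chance agreement
    return {
        "agreement": po,
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "kappa": (po - pe) / (1 - pe),
    }

# Hypothetical lab scoring 44 cores against the reference value
ref_scores = [1] * 30 + [0] * 14
lab_scores = [1] * 28 + [0] * 2 + [0] * 13 + [1] * 1
print(performance_vs_reference(ref_scores, lab_scores))
```

Because κ discounts chance agreement, it separates genuinely concordant laboratories from those that merely benefit from an unbalanced positive/negative mix on the array.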

  19. A Kalman filter approach for the determination of celestial reference frames

    Science.gov (United States)

    Soja, Benedikt; Gross, Richard; Jacobs, Christopher; Chin, Toshio; Karbon, Maria; Nilsson, Tobias; Heinkelmann, Robert; Schuh, Harald

    2017-04-01

    The coordinate model of radio sources in International Celestial Reference Frames (ICRF), such as the ICRF2, has traditionally been a constant offset. While sufficient for a large part of radio sources considering current accuracy requirements, several sources exhibit significant temporal coordinate variations. In particular, the group of the so-called special handling sources is characterized by large fluctuations in the source positions. For these sources and for several from the "others" category of radio sources, a coordinate model that goes beyond a constant offset would be beneficial. However, due to the sheer amount of radio sources in catalogs like the ICRF2, and even more so with the upcoming ICRF3, it is difficult to find the most appropriate coordinate model for every single radio source. For this reason, we have developed a time series approach to the determination of celestial reference frames (CRF). We feed the radio source coordinates derived from single very long baseline interferometry (VLBI) sessions sequentially into a Kalman filter and smoother, retaining their full covariances. The estimation of the source coordinates is carried out with a temporal resolution identical to the input data, i.e. usually 1-4 days. The coordinates are assumed to behave like random walk processes, an assumption which has already successfully been made for the determination of terrestrial reference frames such as the JTRF2014. To be able to apply the most suitable process noise value for every single radio source, their statistical properties are analyzed by computing their Allan standard deviations (ADEV). In addition to the determination of process noise values, the ADEV allows drawing conclusions about whether the variations in certain radio source positions significantly deviate from random walk processes. Our investigations also deal with other means of source characterization, such as the structure index, in order to derive a suitable process noise model. The Kalman
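The random-walk coordinate model described here can be illustrated with a minimal scalar Kalman filter applied to one source coordinate. The measurement noise, process noise q (which the abstract says is calibrated per source via the Allan standard deviation), and coordinate values below are all hypothetical:

```python
import numpy as np

def random_walk_kalman(obs, obs_var, q):
    """Scalar Kalman filter for a coordinate modeled as a random walk:
    x_t = x_{t-1} + w_t with Var[w] = q, observed directly with variance obs_var."""
    x, p = obs[0], obs_var            # initialize from the first session
    estimates = [x]
    for z in obs[1:]:
        p = p + q                     # predict: add random-walk process noise
        gain = p / (p + obs_var)      # Kalman gain
        x = x + gain * (z - x)        # update with the new session position
        p = (1.0 - gain) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(1)
truth = 5.0                                      # a stable source coordinate (mas)
obs = truth + rng.normal(0.0, 0.4, size=500)     # noisy session-wise positions
est = random_walk_kalman(obs, obs_var=0.16, q=1e-6)
```

With q = 0 the filter degenerates to a running mean (the constant-offset model); a larger q shortens the filter's memory so the estimate can track genuine position fluctuations, which is why an appropriate process noise value is needed for every single source.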

  20. Non-response weighting adjustment approach in survey sampling ...

    African Journals Online (AJOL)

    Hence the discussion is illustrated with real examples from surveys (in particular 2003 KDHS) conducted by Central Bureau of Statistics (CBS) - Kenya. Some suggestions are made for improving the quality of non-response weighting. Keywords: Survey non-response; non-response adjustment factors; weighting; sampling ...

  1. Constrained optimisation of spatial sampling : a geostatistical approach

    NARCIS (Netherlands)

    Groenigen, van J.W.

    1999-01-01

    Aims

    This thesis aims at the development of optimal sampling strategies for geostatistical studies. Special emphasis is on the optimal use of ancillary data, such as co-related imagery, preliminary observations and historic knowledge. Although the object of all studies

  2. Asynchronous sampled-data approach for event-triggered systems

    Science.gov (United States)

    Mahmoud, Magdi S.; Memon, Azhar M.

    2017-11-01

    While aperiodically triggered network control systems save a considerable amount of communication bandwidth, they also pose challenges such as coupling between control and event-condition design, optimisation of the available resources such as control, communication and computation power, and time-delays due to computation and communication network. With this motivation, the paper presents separate designs of control and event-triggering mechanism, thus simplifying the overall analysis, asynchronous linear quadratic Gaussian controller which tackles delays and aperiodic nature of transmissions, and a novel event mechanism which compares the cost of the aperiodic system against a reference periodic implementation. The proposed scheme is simulated on a linearised wind turbine model for pitch angle control and the results show significant improvement against the periodic counterpart.

  3. Green approaches in sample preparation of bioanalytical samples prior to chromatographic analysis.

    Science.gov (United States)

    Filippou, Olga; Bitas, Dimitrios; Samanidou, Victoria

    2017-02-01

    Sample preparation is considered the most challenging step of the analytical procedure, since it affects the whole analytical methodology and therefore contributes significantly to the greenness, or lack of it, of the entire process. Eliminating sample treatment steps, reducing the amount of sample required, strongly reducing the consumption of hazardous reagents and energy, maximizing safety for operators and the environment, and avoiding the use of large amounts of organic solvents form the basis for greening sample preparation and analytical methods. In the last decade, the development and utilization of greener and sustainable microextraction techniques has offered an alternative to classical sample preparation procedures. In this review, the main green microextraction techniques (solid phase microextraction, stir bar sorptive extraction, hollow-fiber liquid phase microextraction, dispersive liquid - liquid microextraction, etc.) will be presented, with special attention to bioanalytical applications of these environment-friendly sample preparation techniques which comply with the green analytical chemistry principles. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Electrophysiological Responses to Expectancy Violations in Semantic and Gambling Tasks: A Comparison of Different EEG Reference Approaches

    Science.gov (United States)

    Li, Ya; Wang, Yongchun; Zhang, Baoqiang; Wang, Yonghui; Zhou, Xiaolin

    2018-01-01

    Dynamically evaluating the outcomes of our actions and thoughts is a fundamental cognitive ability. Given its excellent temporal resolution, the event-related potential (ERP) technique has been used to address this issue. The feedback-related negativity (FRN) component of ERPs has been studied intensively with the averaged linked mastoid reference method (LM). However, it is unknown whether FRN can be induced by an expectancy violation in an antonym relations context and whether LM is the most suitable reference approach. To address these issues, the current research directly compared the ERP components induced by expectancy violations in antonym expectation and gambling tasks with a within-subjects design and investigated the effect of the reference approach on the experimental effects. Specifically, we systematically compared the influence of the LM, reference electrode standardization technique (REST) and average reference (AVE) approaches on the amplitude, scalp distribution and magnitude of ERP effects as a function of expectancy violation type. The expectancy deviation in the antonym expectation task elicited an N400 effect that differed from the FRN effect induced in the gambling task; this difference was confirmed by all three reference methods. Both the amplitudes of the ERP effects (N400 and FRN) and their growth with increasing expectancy violation were greatest under the LM approach, followed by the REST approach and then the AVE approach. The statistical results showed that the electrode sites exhibiting the N400 and FRN effects depended critically on the reference method, and the results of the REST analysis were consistent with previous ERP studies. Combined with evidence from simulation studies, we suggest REST as an appropriate reference method for future ERP data analysis. PMID:29615858
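
    The LM and AVE re-referencing schemes compared in this record are simple linear transformations of the channel data. A minimal numpy sketch follows; the 32-channel montage is hypothetical and the last two rows are assumed to be the mastoid electrodes. REST additionally requires a head-model lead field and is not sketched here.

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_samples = 32, 1000
eeg = rng.normal(size=(n_channels, n_samples))  # channels x time, arbitrary units

# Average reference (AVE): subtract the instantaneous mean over all channels.
eeg_ave = eeg - eeg.mean(axis=0, keepdims=True)

# Linked-mastoid reference (LM): subtract the mean of the two mastoid channels
# (assumed here to be rows 30 and 31 of the montage).
mastoids = eeg[[30, 31], :].mean(axis=0, keepdims=True)
eeg_lm = eeg - mastoids
```

    After AVE re-referencing the channel mean is zero at every time point, and after LM re-referencing the two mastoid channels are mirror images of each other, which is why the choice of reference reshapes scalp distributions of ERP effects.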

  5. Diversification Strategies and Firm Performance: A Sample Selection Approach

    OpenAIRE

    Santarelli, Enrico; Tran, Hien Thu

    2013-01-01

    This paper is based upon the assumption that firm profitability is determined by its degree of diversification which in turn is strongly related to the antecedent decision to carry out diversification activities. This calls for an empirical approach that permits the joint analysis of the three interrelated and consecutive stages of the overall diversification process: diversification decision, degree of diversification, and outcome of diversification. We apply parametric and semiparametric ap...

  6. Structure of the Wechsler Intelligence Scale for Children--Fourth Edition among a national sample of referred students.

    Science.gov (United States)

    Watkins, Marley W

    2010-12-01

    The structure of the Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV; D. Wechsler, 2003a) was analyzed via confirmatory factor analysis among a national sample of 355 students referred for psychoeducational evaluation by 93 school psychologists from 35 states. The structure of the WISC-IV core battery was best represented by four first-order factors as per D. Wechsler (2003b), plus a general intelligence factor in a direct hierarchical model. The general factor was the predominant source of variation among WISC-IV subtests, accounting for 48% of the total variance and 75% of the common variance. The largest first-order factor, Processing Speed, accounted for only 6.1% of the total and 9.5% of the common variance. Given these explanatory contributions, recommendations favoring interpretation of the first-order factor scores over the general intelligence score appear to be misguided.
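
    The variance shares reported above (percentage of total and of common variance attributable to the general factor) follow directly from standardized factor loadings in a direct hierarchical model. A sketch with made-up loadings, not the WISC-IV values:

```python
import numpy as np

# Hypothetical standardized loadings for 10 subtests (illustrative numbers):
g = np.array([.7, .6, .7, .6, .5, .6, .7, .5, .4, .5])  # general-factor loadings
s = np.array([.3, .4, .2, .3, .5, .2, .3, .4, .3, .2])  # group-factor loadings

total_var = len(g)                       # each standardized subtest has variance 1
common_var = np.sum(g**2) + np.sum(s**2) # variance explained by all factors
g_share_total = np.sum(g**2) / total_var     # general factor / total variance
g_share_common = np.sum(g**2) / common_var   # general factor / common variance
```

    With these illustrative loadings the general factor explains about 35% of total and 77% of common variance, mirroring the kind of dominance the abstract reports for the WISC-IV.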

  7. A Geostatistical Approach to Indoor Surface Sampling Strategies

    DEFF Research Database (Denmark)

    Schneider, Thomas; Petersen, Ole Holm; Nielsen, Allan Aasbjerg

    1990-01-01

    Particulate surface contamination is of concern in production industries such as food processing, aerospace, electronics and semiconductor manufacturing. There is also an increased awareness that surface contamination should be monitored in industrial hygiene surveys. A conceptual and theoretical framework for designing sampling strategies is thus developed. The distribution and spatial correlation of surface contamination can be characterized using concepts from geostatistical science, where spatial applications of statistics are most developed. The theory is summarized, and particulate surface contamination sampled from small areas on a table has been used to illustrate the method. First, the spatial correlation is modelled and the parameters estimated from the data. Next, it is shown how the contamination at positions not measured can be estimated with kriging, a minimum mean square error method.
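
    The kriging step mentioned above amounts to solving a small linear system built from a covariance model. A minimal ordinary-kriging sketch; the exponential covariance, sample locations and loads below are made up for illustration:

```python
import numpy as np

def exp_cov(h, sill=1.0, rng_par=10.0):
    """Exponential covariance model C(h) = sill * exp(-h / range)."""
    return sill * np.exp(-np.asarray(h) / rng_par)

# Hypothetical sample positions (cm) and measured surface loads.
pts = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
z = np.array([12.0, 9.0, 11.0, 8.0])
target = np.array([2.5, 2.5])            # unmeasured position to estimate

n = len(pts)
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
# Ordinary kriging system: [C 1; 1' 0] [w; mu] = [c0; 1]
A = np.zeros((n + 1, n + 1))
A[:n, :n] = exp_cov(d)
A[:n, n] = 1.0
A[n, :n] = 1.0
b = np.append(exp_cov(np.linalg.norm(pts - target, axis=1)), 1.0)
sol = np.linalg.solve(A, b)
w = sol[:n]                              # kriging weights (sum to 1)
estimate = w @ z                         # unbiased minimum-variance estimate
```

    The unit-sum constraint on the weights is what makes the estimator unbiased; here the symmetric layout gives equal weights, so the estimate is the plain mean.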

  8. Sampling Approaches for Multi-Domain Internet Performance Measurement Infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Calyam, Prasad

    2014-09-15

    The next-generation of high-performance networks being developed in DOE communities are critical for supporting current and emerging data-intensive science applications. The goal of this project is to investigate multi-domain network status sampling techniques and tools to measure/analyze performance, and thereby provide “network awareness” to end-users and network operators in DOE communities. We leverage the infrastructure and datasets available through perfSONAR, which is a multi-domain measurement framework that has been widely deployed in high-performance computing and networking communities; the DOE community is a core developer and the largest adopter of perfSONAR. Our investigations include development of semantic scheduling algorithms, measurement federation policies, and tools to sample multi-domain and multi-layer network status within perfSONAR deployments. We validate our algorithms and policies with end-to-end measurement analysis tools for various monitoring objectives such as network weather forecasting, anomaly detection, and fault-diagnosis. In addition, we develop a multi-domain architecture for an enterprise-specific perfSONAR deployment that can implement monitoring-objective based sampling and that adheres to any domain-specific measurement policies.

  9. Evaluation of gene expression data generated from expired Affymetrix GeneChip® microarrays using MAQC reference RNA samples

    Directory of Open Access Journals (Sweden)

    Tong Weida

    2010-10-01

    Abstract Background The Affymetrix GeneChip® system is a commonly used platform for microarray analysis, but the technology is inherently expensive. Unfortunately, changes in experimental planning and execution, such as the unavailability of previously anticipated samples or a shift in research focus, may render significant numbers of pre-purchased GeneChip® microarrays unprocessed before their manufacturer’s expiration dates. Researchers and microarray core facilities wonder whether expired microarrays are still useful for gene expression analysis. In addition, it was not clear whether the two human reference RNA samples established by the MAQC project in 2005 still maintained their transcriptome integrity over a period of four years. Experiments were conducted to answer these questions. Results Microarray data were generated in 2009 in three replicates for each of the two MAQC samples with either expired Affymetrix U133A or unexpired U133Plus2 microarrays. These results were compared with data obtained in 2005 on the U133Plus2 microarray. The percentage of overlap between the lists of differentially expressed genes (DEGs) from U133Plus2 microarray data generated in 2009 and in 2005 was 97.44%. While there was some degree of fold change compression in the expired U133A microarrays, the percentage of overlap between the lists of DEGs from the expired and unexpired microarrays was as high as 96.99%. Moreover, the microarray data generated using the expired U133A microarrays in 2009 were highly concordant with microarray and TaqMan® data generated by the MAQC project in 2005. Conclusions Our results demonstrated that microarray data generated using U133A microarrays, which were more than four years past the manufacturer’s expiration date, were highly specific and consistent with those from unexpired microarrays in identifying DEGs despite some appreciable fold change compression and decrease in sensitivity. Our data also suggested that the
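
    The headline comparison in this record, the percentage overlap between two DEG lists, reduces to a set intersection. A toy sketch with illustrative gene symbols, not the MAQC data:

```python
# Overlap between two differentially-expressed-gene (DEG) lists, expressed as a
# percentage of the smaller list (gene IDs below are illustrative only).
degs_2005 = {"GAPDH", "TP53", "BRCA1", "EGFR", "MYC", "KRAS"}
degs_2009 = {"GAPDH", "TP53", "BRCA1", "EGFR", "MYC", "CDK4"}

overlap = len(degs_2005 & degs_2009)
pct_overlap = 100.0 * overlap / min(len(degs_2005), len(degs_2009))
```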

  10. Equilibrium Passive Sampling of POP in Lipid-Rich and Lean Fish Tissue: Quality Control Using Performance Reference Compounds.

    Science.gov (United States)

    Rusina, Tatsiana P; Carlsson, Pernilla; Vrana, Branislav; Smedes, Foppe

    2017-10-03

    Passive sampling is widely used to measure levels of contaminants in various environmental matrices, including fish tissue. Equilibrium passive sampling (EPS) of persistent organic pollutants (POP) in fish tissue has been hitherto limited to application in lipid-rich tissue. We tested several exposure methods to extend EPS applicability to lean tissue. Thin-film polydimethylsiloxane (PDMS) passive samplers were exposed statically to intact fillet and fish homogenate and dynamically by rolling with cut fillet cubes. The release of performance reference compounds (PRC) dosed to passive samplers prior to exposure was used to monitor the exchange process. The sampler-tissue exchange was isotropic, and PRC were shown to be good indicators of sampler-tissue equilibration status. The dynamic exposures demonstrated equilibrium attainment in less than 2 days for all three tested fish species, including lean fish containing 1% lipid. Lipid-based concentrations derived from EPS were in good agreement with lipid-normalized concentrations obtained using conventional solvent extraction. The developed in-tissue EPS method is robust and has potential for application in chemical monitoring of biota and bioaccumulation studies.
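
    Because sampler-tissue exchange is isotropic, the fraction of PRC released mirrors the analyte's approach to equilibrium under a first-order exchange model. A sketch with a hypothetical exchange rate constant; the numbers are not from the study:

```python
import math

# First-order sampler-tissue exchange: analyte uptake follows 1 - exp(-k t),
# and (by isotropy) PRC release follows exp(-k t).
k = 1.6   # d^-1, hypothetical exchange rate constant
t = 2.0   # exposure time in days

prc_remaining_frac = math.exp(-k * t)               # PRC left in the sampler
degree_of_equilibrium = 1.0 - prc_remaining_frac    # analyte equilibration status
```

    Measuring how much PRC remains after exposure thus tells you, without knowing k in advance, how close the sampler is to equilibrium with the tissue.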

  11. Continuous age- and sex-adjusted reference intervals of urinary markers for cerebral creatine deficiency syndromes: a novel approach to the definition of reference intervals.

    Science.gov (United States)

    Mørkrid, Lars; Rowe, Alexander D; Elgstoen, Katja B P; Olesen, Jess H; Ruijter, George; Hall, Patricia L; Tortorelli, Silvia; Schulze, Andreas; Kyriakopoulou, Lianna; Wamelink, Mirjam M C; van de Kamp, Jiddeke M; Salomons, Gajja S; Rinaldo, Piero

    2015-05-01

    Urinary concentrations of creatine and guanidinoacetic acid divided by creatinine are informative markers for cerebral creatine deficiency syndromes (CDSs). The renal excretion of these substances varies substantially with age and sex, challenging the sensitivity and specificity of postanalytical interpretation. Results from 155 patients with CDS and 12 507 reference individuals were contributed by 5 diagnostic laboratories. They were binned into 104 adjacent age intervals and renormalized with Box-Cox transforms (Ξ). Estimates for central tendency (μ) and dispersion (σ) of Ξ were obtained for each bin. Polynomial regression analysis was used to establish the age dependence of both μ[log(age)] and σ[log(age)]. The regression residuals were then calculated as z-scores = {Ξ - μ[log(age)]}/σ[log(age)]. The process was iterated until all z-scores outside Tukey fences ±3.372 were identified and removed. Continuous percentile charts were then calculated and plotted by retransformation. Statistically significant and biologically relevant subgroups of z-scores were identified. Significantly higher marker values were seen in females than males, necessitating separate reference intervals in both adolescents and adults. Comparison between our reconstructed reference percentiles and current standard age-matched reference intervals highlights an underlying risk of false-positive and false-negative events at certain ages. Disease markers depending strongly on covariates such as age and sex require large numbers of reference individuals to establish peripheral percentiles with sufficient precision. This is feasible only through collaborative data sharing and the use of appropriate statistical methods. Broad application of this approach can be implemented through freely available Web-based software. © 2015 American Association for Clinical Chemistry.
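
    The iterative renormalization described above can be sketched as follows. For simplicity the Box-Cox λ is fixed (the study estimates the transform per age bin, jointly with age-dependent μ and σ); the fence value ±3.372 is taken from the text.

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox transform for positive data (natural log when lambda == 0)."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

def robust_zscores(values, lam=0.0, fence=3.372):
    """Compute z-scores of Box-Cox-transformed values, iteratively removing
    points outside the Tukey fences until none remain."""
    vals = np.asarray(values, dtype=float)
    keep = np.ones(len(vals), dtype=bool)
    while True:
        xi = boxcox(vals[keep], lam)
        z = (boxcox(vals, lam) - xi.mean()) / xi.std(ddof=1)
        out = keep & (np.abs(z) > fence)
        if not out.any():
            return z, keep
        keep &= ~out
```

    On roughly log-normal data with one gross outlier, the outlier is flagged on the first pass and the remaining points define the reference distribution.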

  12. No-Reference Stereoscopic IQA Approach: From Nonlinear Effect to Parallax Compensation

    Directory of Open Access Journals (Sweden)

    Ke Gu

    2012-01-01

    The last decade has seen a boom in applications of stereoscopic images/videos and the corresponding technologies, such as 3D modeling, reconstruction, and disparity estimation. However, only a very limited number of stereoscopic image quality assessment metrics have been proposed over the years. In this paper, we propose a new no-reference stereoscopic image quality assessment algorithm based on the nonlinear additive model, the ocular dominance model, and saliency-based parallax compensation. Our studies using the Toyama database yield three valuable findings. First, the quality of a stereoscopic image has a nonlinear relationship with a direct summation of the two monoscopic image qualities. Second, it is a rational assumption that the right-eye response has the higher impact on stereoscopic image quality, based on a sampling survey in ocular dominance research. Third, saliency-based parallax compensation, which accounts for differing stereoscopic image contents, considerably improves the prediction performance of image quality metrics. Experimental results confirm that our proposed stereoscopic image quality assessment paradigm has superior prediction accuracy compared to state-of-the-art competitors.
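
    A nonlinear, ocular-dominance-weighted combination of the two monocular quality scores can be sketched generically; the weighting and exponent below are hypothetical placeholders, not the paper's fitted model:

```python
def stereo_quality(q_left, q_right, w_right=0.6, p=2.0):
    """Weighted power-mean of monocular quality scores; the right eye is
    weighted more heavily, reflecting ocular dominance (weights hypothetical)."""
    w_left = 1.0 - w_right
    return (w_left * q_left**p + w_right * q_right**p) ** (1.0 / p)

q = stereo_quality(0.8, 0.6)
```

    The power mean reduces to the score itself when both eyes agree, and the asymmetric weights make right-eye quality matter more, as the abstract's second finding suggests.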

  13. THE LYMAN ALPHA REFERENCE SAMPLE. V. THE IMPACT OF NEUTRAL ISM KINEMATICS AND GEOMETRY ON Lyα ESCAPE

    International Nuclear Information System (INIS)

    Rivera-Thorsen, Thøger E.; Hayes, Matthew; Östlin, Göran; Duval, Florent; Sandberg, Andreas; Guaita, Lucia; Adamo, Angela; Orlitová, Ivana; Verhamme, Anne; Schaerer, Daniel; Mas-Hesse, J. Miguel; Cannon, John M.; Otí-Floranes, Héctor; Atek, Hakim; Herenz, E. Christian; Kunth, Daniel

    2015-01-01

    We present high-resolution far-UV spectroscopy of the 14 galaxies of the Lyα Reference Sample; a sample of strongly star-forming galaxies at low redshifts (0.028 < z < 0.18). We compare the derived properties to global properties derived from multi-band imaging and 21 cm H i interferometry and single-dish observations, as well as archival optical SDSS spectra. Besides the Lyα line, the spectra contain a number of metal absorption features allowing us to probe the kinematics of the neutral ISM and evaluate the optical depth and covering fraction of the neutral medium as a function of line of sight velocity. Furthermore, we show how this, in combination with the precise determination of systemic velocity and good Lyα spectra, can be used to distinguish a model in which separate clumps together fully cover the background source, from the “picket fence” model named by Heckman et al. We find that no one single effect dominates in governing Lyα radiative transfer and escape. Lyα escape in our sample coincides with a maximum velocity-binned covering fraction of ≲0.9 and bulk outflow velocities of ≳50 km s⁻¹, although a number of galaxies show these characteristics and yet little or no Lyα escape. We find that Lyα peak velocities, where available, are not consistent with a strong backscattered component, but rather with a simpler model of an intrinsic emission line overlaid by a blueshifted absorption profile from the outflowing wind. Finally, we find a strong anticorrelation between Hα equivalent width and maximum velocity-binned covering factor, and propose a heuristic explanatory model.

  14. THE LYMAN ALPHA REFERENCE SAMPLE. V. THE IMPACT OF NEUTRAL ISM KINEMATICS AND GEOMETRY ON Lyα ESCAPE

    Energy Technology Data Exchange (ETDEWEB)

    Rivera-Thorsen, Thøger E.; Hayes, Matthew; Östlin, Göran; Duval, Florent; Sandberg, Andreas; Guaita, Lucia; Adamo, Angela [Department of Astronomy, Oskar Klein Centre, Stockholm University, AlbaNova University Centre, SE-106 91 Stockholm (Sweden); Orlitová, Ivana [Astronomical Institute, Academy of Sciences of the Czech Republic, Boční II, CZ-14131 Prague (Czech Republic); Verhamme, Anne; Schaerer, Daniel [Geneva Observatory, University of Geneva, 51 Chemin des Maillettes, CH-1290 Versoix (Switzerland); Mas-Hesse, J. Miguel [Centro de Astrobiología (CSIC–INTA), Departamento de Astrofísica, P.O. Box 78, E-28691 Villanueva de la Cañada (Spain); Cannon, John M. [Department of Physics and Astronomy, Macalester College, 1600 Grand Avenue, Saint Paul, MN 55105 (United States); Otí-Floranes, Héctor [Instituto de Astronomía, Universidad Nacional Autónoma de México, Apdo. Postal 106, B. C. 22800 Ensenada (Mexico); Atek, Hakim [Laboratoire d’Astrophysique, École Polytechnique Fédérale de Lausanne (EPFL), Observatoire, CH-1290 Sauverny (Switzerland); Herenz, E. Christian [Leibniz-Institut für Astrophysik (AIP), An der Sternwarte 16, D-14482 Potsdam (Germany); Kunth, Daniel, E-mail: trive@astro.su.se [Institut d’Astrophysique de Paris, UMR 7095 CNRS and UPMC, 98 bis Bd Arago, F-75014 Paris (France); and others

    2015-05-20

    We present high-resolution far-UV spectroscopy of the 14 galaxies of the Lyα Reference Sample; a sample of strongly star-forming galaxies at low redshifts (0.028 < z < 0.18). We compare the derived properties to global properties derived from multi-band imaging and 21 cm H i interferometry and single-dish observations, as well as archival optical SDSS spectra. Besides the Lyα line, the spectra contain a number of metal absorption features allowing us to probe the kinematics of the neutral ISM and evaluate the optical depth and covering fraction of the neutral medium as a function of line of sight velocity. Furthermore, we show how this, in combination with the precise determination of systemic velocity and good Lyα spectra, can be used to distinguish a model in which separate clumps together fully cover the background source, from the “picket fence” model named by Heckman et al. We find that no one single effect dominates in governing Lyα radiative transfer and escape. Lyα escape in our sample coincides with a maximum velocity-binned covering fraction of ≲0.9 and bulk outflow velocities of ≳50 km s{sup −1}, although a number of galaxies show these characteristics and yet little or no Lyα escape. We find that Lyα peak velocities, where available, are not consistent with a strong backscattered component, but rather with a simpler model of an intrinsic emission line overlaid by a blueshifted absorption profile from the outflowing wind. Finally, we find a strong anticorrelation between Hα equivalent width and maximum velocity-binned covering factor, and propose a heuristic explanatory model.
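
    The velocity-binned covering-fraction analysis rests on the standard partial-covering relation for residual line intensity, I(v)/I0 = 1 − f_c(v)·(1 − e^(−τ(v))). A sketch with made-up velocity bins (the values below are illustrative, not from the sample):

```python
import numpy as np

# Velocity-binned covering fraction and optical depth (hypothetical bins).
f_c = np.array([0.2, 0.5, 0.8, 0.9, 0.7, 0.3])   # covering fraction per bin
tau = np.array([0.1, 0.5, 2.0, 5.0, 1.0, 0.2])   # optical depth per bin

# Partial-covering residual intensity per velocity bin.
residual = 1.0 - f_c * (1.0 - np.exp(-tau))
# In the optically thick limit (tau >> 1) the residual saturates at 1 - f_c,
# which is why absorption-line floors measure the covering fraction.
```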

  15. Complete super-sample lensing covariance in the response approach

    Science.gov (United States)

    Barreira, Alexandre; Krause, Elisabeth; Schmidt, Fabian

    2018-06-01

    We derive the complete super-sample covariance (SSC) of the matter and weak lensing convergence power spectra using the power spectrum response formalism to accurately describe the coupling of super- to sub-survey modes. The SSC term is completely characterized by the survey window function, the nonlinear matter power spectrum and the full first-order nonlinear power spectrum response function, which describes the response to super-survey density and tidal field perturbations. Generalized separate universe simulations can efficiently measure these responses in the nonlinear regime of structure formation, which is necessary for lensing applications. We derive the lensing SSC formulae for two cases: one under the Limber and flat-sky approximations, and a more general one that goes beyond the Limber approximation in the super-survey mode and is valid for curved sky applications. Quantitatively, we find that for sky fractions f_sky ≈ 0.3 and a single source redshift at z_S = 1, the use of the flat-sky and Limber approximation underestimates the total SSC contribution by ≈ 10%. The contribution from super-survey tidal fields to the lensing SSC, which has not been included in cosmological analyses so far, is shown to represent about 5% of the total lensing covariance on multipoles ℓ_1, ℓ_2 ≳ 300. The SSC is the dominant off-diagonal contribution to the total lensing covariance, making it appropriate to include these tidal terms and beyond flat-sky/Limber corrections in cosmic shear analyses.

  16. Association of nail biting and psychiatric disorders in children and their parents in a psychiatrically referred sample of children

    Directory of Open Access Journals (Sweden)

    Ghanizadeh Ahmad

    2008-06-01

    Abstract Background Nail biting (NB) is a very common unwanted behavior. The majority of children are motivated to stop NB and have already tried to stop it, but are generally unsuccessful in doing so. It is a difficult behavior to modify or treat. The objective of this study was to investigate the prevalence of co-morbid psychiatric disorders in a clinical sample of children with NB who present at a child and adolescent mental healthcare outpatient clinic and the prevalence of psychiatric disorders in their parents. Method A consecutive sample of 450 referred children was examined for NB, and 63 (14%) were found to have NB. The children and adolescents with nail biting and their parents were interviewed according to DSM-IV diagnostic criteria. They were also asked about lip biting, head banging, skin biting, and hair pulling behaviors. Results Nail biting is common amongst children and adolescents referred to a child and adolescent mental health clinic. The most common co-morbid psychiatric disorders in these children were attention deficit hyperactivity disorder (74.6%), oppositional defiant disorder (36%), separation anxiety disorder (20.6%), enuresis (15.6%), tic disorder (12.7%) and obsessive compulsive disorder (11.1%). The rates of major depressive disorder, mental retardation, and pervasive developmental disorder were 6.7%, 9.5%, and 3.2%, respectively. There was no association between the age of onset of nail biting and the co-morbid psychiatric disorder. Severity and frequency of NB were not associated with any co-morbid psychiatric disorder. About 56.8% of the mothers and 45.9% of the fathers were suffering from at least one psychiatric disorder. The most common psychiatric disorder found in these parents was major depression. Conclusion Nail biting presents in a significant proportion of referrals to a mental healthcare clinic setting. Nail biting should be routinely looked for and asked about in the child and adolescent mental healthcare setting.

  17. Production of artifact methylmercury during the analysis of certified reference sediments: Use of ionic exchange in the sample treatment step to minimise the problem

    International Nuclear Information System (INIS)

    Delgado, Alejandra; Prieto, Ailette; Zuloaga, Olatz; Diego, Alberto de; Madariaga, Juan Manuel

    2007-01-01

    Production of artifact methylmercury (MeHg⁺) during the analysis of two certified reference sediments, CRM-580 and IAEA-405, was investigated. Leaching of the analyte from the solid sample was achieved by ultrasound-assisted acidic extraction. The aqueous leachate was either ethylated (NaBEt₄) or phenylated (NaBPh₄) using acetic/acetate or citric/citrate to buffer the solution. Preconcentration of the volatile compounds was carried out by extraction with an organic solvent (n-hexane) or solid phase microextraction (SPME). MeHg⁺ was finally separated and detected by gas chromatography with atomic emission or mass spectrometry detection (GC-MIP-AED or GC-MS). In all the cases the concentrations obtained for MeHg⁺ in the CRM-580 were significantly higher than the certified value. For the IAEA-405, however, the MeHg⁺ concentration found was always statistically indistinguishable from the certified value. Experiments were also conducted with synthetic samples, such as aqueous mixtures of MeHg⁺ and inorganic mercury (Hg²⁺) or silica-gel spiked with both compounds. The methylation rates found (defined as the percentage of Hg²⁺ present in the sample which methylates to give artifact MeHg⁺) ranged from not observable (in certain synthetic aqueous mixtures) to 0.57% (analysis of CRM-580 under certain conditions). As the amount of Hg²⁺ available in the sample seems to be the main factor controlling the magnitude of the artifact, several experiments were conducted using an ionic exchange resin (Dowex M-41) in order to minimise the concentration of this chemical in the reaction medium. First, a hydrochloric leachate of the sample was passed through a microcolumn packed with the exchanger. Second, the resin was mixed with the sample prior to extraction with HCl. In both cases, the predominant Hg²⁺ species, HgCl₄²⁻, was adsorbed on the resin, whereas MeHg⁺, mainly as MeHgCl, remained in solution.
    Following the second option, a new method to analyse MeHg⁺ in

  18. Production of artifact methylmercury during the analysis of certified reference sediments: Use of ionic exchange in the sample treatment step to minimise the problem

    Energy Technology Data Exchange (ETDEWEB)

    Delgado, Alejandra [Kimika Analitikoa Saila, Euskal Herriko Unibertsitatea, 644 P.K., E-48080 Bilbao (Spain); Prieto, Ailette [Kimika Analitikoa Saila, Euskal Herriko Unibertsitatea, 644 P.K., E-48080 Bilbao (Spain); Zuloaga, Olatz [Kimika Analitikoa Saila, Euskal Herriko Unibertsitatea, 644 P.K., E-48080 Bilbao (Spain); Diego, Alberto de [Kimika Analitikoa Saila, Euskal Herriko Unibertsitatea, 644 P.K., E-48080 Bilbao (Spain)]. E-mail: alberto.dediego@ehu.es; Madariaga, Juan Manuel [Kimika Analitikoa Saila, Euskal Herriko Unibertsitatea, 644 P.K., E-48080 Bilbao (Spain)

    2007-01-16

    Production of artifact methylmercury (MeHg{sup +}) during the analysis of two certified reference sediments, CRM-580 and IAEA-405, was investigated. Leaching of the analyte from the solid sample was achieved by ultrasound assisted acidic extraction. The aqueous leachate was either ethylated (NaBEt{sub 4}) or phenylated (NaBPh{sub 4}) using acetic/acetate or citric/citrate to buffer the solution. Preconcentration of the volatile compounds was carried out by extraction with an organic solvent (n-hexane) or solid phase microextraction (SPME). MeHg{sup +} was finally separated and detected by gas chromatography with atomic emission or mass spectrometry detection (GC-MIP-AED or GC-MS). In all the cases the concentrations obtained for MeHg{sup +} in the CRM-580 were significantly higher than the certified value. For the IAEA-405, however, the MeHg{sup +} concentration found was always statistically indistinguishable from the certified value. Experiments were also conducted with synthetic samples, such as aqueous mixtures of MeHg{sup +} and inorganic mercury (Hg{sup 2+}) or silica-gel spiked with both compounds. The methylation rates found (defined as the percentage of Hg{sup 2+} present in the sample which methylates to give artifact MeHg{sup +}) ranged from not observable (in certain synthetic aqueous mixtures) to 0.57% (analysis of CRM-580 under certain conditions). As the amount of Hg{sup 2+} available in the sample seems to be the main factor controlling the magnitude of the artifact, several experiments were conducted using an ionic exchange resin (Dowex M-41) in order to minimise the concentration of this chemical in the reaction medium. First, a hydrochloric leachate of the sample was passed through a microcolumn packed with the exchanger. Second, the resin was mixed with the sample prior to extraction with HCl. In both cases, the predominant Hg{sup 2+} species, HgCl{sub 4} {sup 2-}, was adsorbed on the resin, whereas MeHg{sup +}, mainly as MeHgCl, remained in

  19. Sensitivity analysis of monthly reference crop evapotranspiration trends in Iran: a qualitative approach

    Science.gov (United States)

    Mosaedi, Abolfazl; Ghabaei Sough, Mohammad; Sadeghi, Sayed-Hossein; Mooshakhian, Yousof; Bannayan, Mohammad

    2017-05-01

    The main objective of this study was to analyze the sensitivity of the monthly reference crop evapotranspiration (ETo) trends to key climatic factors (minimum and maximum air temperature (Tmin and Tmax), relative humidity (RH), sunshine hours (tsun), and wind speed (U2)) in Iran by applying a qualitative detrended method, rather than the historical mathematical approach. Meteorological data for the period 1963-2007 from five synoptic stations with different climatic characteristics, including Mashhad (mountains), Tabriz (mountains), Tehran (semi-desert), Anzali (coastal wet), and Shiraz (semi-mountains), were used to address this objective. The Mann-Kendall test was employed to assess the trends of ETo and the climatic variables. The results indicated a significant increasing trend of the monthly ETo for Mashhad and Tabriz for most of the year, while the opposite conclusion was drawn for Tehran, Anzali, and Shiraz. Based on the detrended method, RH and U2 were the two main variables enhancing the negative ETo trends at the Tehran and Anzali stations, whereas U2 and temperature were responsible for this observation in Shiraz. On the other hand, the main meteorological variables affecting the significant positive trend of ETo were RH and tsun in Tabriz and Tmin, RH, and U2 in Mashhad. Although a relative agreement was observed in terms of identifying one of the first two key climatic variables affecting the ETo trend, the qualitative and quantitative sensitivity analysis solutions never coincided. Further research is needed to evaluate this interesting finding for other geographic locations, and also to search for the major causes of this discrepancy.
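
    The Mann-Kendall test used for the trend assessment can be implemented in a few lines; this sketch uses the normal approximation without the tie correction:

```python
import numpy as np

def mann_kendall(x):
    """Mann-Kendall trend test: returns the S statistic and its z-score
    (normal approximation, no correction for tied values)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)    # continuity correction
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

    A monotonically increasing series gives the maximum S = n(n−1)/2 and a strongly positive z; |z| > 1.96 corresponds to a significant trend at the 5% level.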

  20. An empirical test of reference price theories using a semiparametric approach

    DEFF Research Database (Denmark)

    Boztug, Yasemin; Hildebrandt, Lutz

      In this paper we estimate and empirically test different behavioral theories of consumer reference price formation. Two major theories are proposed to model the reference price reaction: assimilation contrast theory and prospect theory. We assume that different consumer segments will use...

  1. Inter-laboratory variation in the chemical analysis of acidic forest soil reference samples from eastern North America

    Science.gov (United States)

    Ross, Donald S.; Bailey, Scott W; Briggs, Russell D; Curry, Johanna; Fernandez, Ivan J.; Fredriksen, Guinevere; Goodale, Christine L.; Hazlett, Paul W.; Heine, Paul R; Johnson, Chris E.; Larson, John T; Lawrence, Gregory B.; Kolka, Randy K; Ouimet, Rock; Pare, D; Richter, Daniel D.; Shirmer, Charles D; Warby, Richard A.F.

    2015-01-01

    Long-term forest soil monitoring and research often requires a comparison of laboratory data generated at different times and in different laboratories. Quantifying the uncertainty associated with these analyses is necessary to assess temporal changes in soil properties. Forest soil chemical properties, and the methods used to measure them, often differ from those of agronomic and horticultural soils. Soil proficiency programs do not generally include forest soil samples that are highly acidic, high in extractable Al, low in extractable Ca and often high in carbon. To determine the uncertainty associated with specific analytical methods for forest soils, we collected and distributed samples from two soil horizons (Oa and Bs) to 15 laboratories in the eastern United States and Canada. Soil properties measured included total organic carbon and nitrogen, pH and exchangeable cations. Overall, results were consistent despite some differences in methodology. We calculated the median absolute deviation (MAD) for each measurement and considered the acceptable range to be the median ± 2.5 × MAD. Variability among laboratories was usually as low as the typical variability within a laboratory. A few areas of concern include a lack of consistency in the measurement and expression of results on a dry-weight basis, relatively high variability in the C/N ratio in the Bs horizon, challenges associated with determining exchangeable cations at concentrations near the lower reporting range of some laboratories, and the operationally defined nature of aluminum extractability. Recommendations include a continuation of reference forest soil exchange programs to quantify the uncertainty associated with these analyses, in conjunction with ongoing efforts to review and standardize laboratory methods.
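
    The acceptance criterion above, median ± 2.5 × MAD, is straightforward to compute. A sketch with made-up inter-laboratory results (the values and the analyte are illustrative, not from the study):

```python
import numpy as np

def acceptance_range(results, k=2.5):
    """Median +/- k * MAD acceptance interval for inter-laboratory results."""
    r = np.asarray(results, dtype=float)
    med = np.median(r)
    mad = np.median(np.abs(r - med))  # median absolute deviation
    return med - k * mad, med + k * mad

# Hypothetical exchangeable-Ca results from several laboratories (cmolc/kg):
labs = [0.42, 0.45, 0.44, 0.40, 0.47, 0.43, 0.60]
lo, hi = acceptance_range(labs)
flagged = [x for x in labs if not (lo <= x <= hi)]
```

    MAD-based fences are robust: a single discrepant laboratory widens them far less than a standard-deviation criterion would, so the outlying result is flagged rather than absorbed.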

  2. New Approach Based on Compressive Sampling for Sample Rate Enhancement in DASs for Low-Cost Sensing Nodes

    Directory of Open Access Journals (Sweden)

    Francesco Bonavolontà

    2014-10-01

    The paper deals with the problem of improving the maximum sample rate of analog-to-digital converters (ADCs) included in low-cost wireless sensing nodes. To this aim, the authors propose an efficient acquisition strategy based on the combined use of a high-resolution time-basis and compressive sampling. In particular, the high-resolution time-basis is adopted to provide a proper sequence of random sampling instants, and a suitable software procedure, based on a compressive sampling approach, is exploited to reconstruct the signal of interest from the acquired samples. Thanks to the proposed strategy, the effective sample rate of the reconstructed signal can be as high as the frequency of the considered time-basis, thus significantly improving the inherent ADC sample rate. Several tests are carried out in simulated and real conditions to assess the performance of the proposed acquisition strategy in terms of reconstruction error. In particular, the results obtained in experimental tests with ADCs included in actual 8-bit and 32-bit microcontrollers highlight the possibility of achieving an effective sample rate up to 50 times higher than the original ADC sample rate.
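    The reconstruction stage can be illustrated with a generic sparse-recovery routine. The sketch below is not the authors' software: it uses orthogonal matching pursuit, a standard compressive-sampling solver, to rebuild a two-tone signal from 48 randomly timed samples on a 256-point high-resolution time base (all sizes, frequencies and the seed are illustrative):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse x with y ~ A @ x."""
    residual, support = y.copy(), []
    for _ in range(k):
        # pick the dictionary atom most correlated with the current residual
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(seed=1)
N, M = 256, 48                       # high-resolution grid vs. samples acquired
t = np.arange(N) / N
signal = np.sin(2*np.pi*5*t) + 0.5*np.sin(2*np.pi*12*t)   # 2-sparse in sines

idx = np.sort(rng.choice(N, size=M, replace=False))       # random sampling instants
y = signal[idx]

# dictionary of cosine/sine atoms on the full high-resolution time base
freqs = np.arange(1, 33)
atoms = [np.cos(2*np.pi*f*t) for f in freqs] + [np.sin(2*np.pi*f*t) for f in freqs]
D = np.column_stack(atoms)
x_hat = omp(D[idx], y, k=2)          # solve using only the sampled rows
reconstruction = D @ x_hat           # signal recovered on the full grid
print("max reconstruction error:", np.max(np.abs(reconstruction - signal)))
```

Because the test signal is exactly sparse in the dictionary and noiseless, recovery here is essentially exact; real acquisitions would add noise and basis mismatch, which is what the reconstruction-error tests in the paper quantify.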

  3. Quality of determinations obtained from laboratory reference samples used in the calibration of X-ray electron probe microanalysis of silicate minerals

    International Nuclear Information System (INIS)

    Pavlova, Ludmila A.; Suvorova, Ludmila F.; Belozerova, Olga Yu.; Pavlov, Sergey M.

    2003-01-01

    Nine simple minerals and oxides, traditionally used as laboratory reference samples in the electron probe microanalysis (EPMA) of silicate minerals, have been quantitatively evaluated. Three separate series of data, comprising the average concentration, standard deviation, relative standard deviation, confidence interval and the z-score of data quality, were calculated for 21 control samples derived from calibrations obtained from three sets of reference samples: (1) simple minerals; (2) oxides; and (3) certified glass reference materials. No systematic difference was observed between the concentrations obtained from these three calibration sets when analyzed results were compared to certified compositions. The relative standard deviations obtained for each element were smaller than target values for all determinations. The z-score values for all elements determined fell within acceptable limits (-2 < z < 2) for concentrations ranging from 0.1 to 100%. These experiments show that the quality of data obtained from laboratory reference calibration samples is not inferior to that from certified reference glasses. The quality of results obtained corresponds to the 'applied geochemistry' type of analysis (category 2) as defined in the GeoPT proficiency testing program. Therefore, the laboratory reference samples can be used for calibrating EPMA techniques in the analysis of silicate minerals and for controlling the quality of results.
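    The z-score acceptance test quoted above is straightforward to apply. In this minimal sketch the measured value, certified value and target standard deviation are invented placeholders:

```python
def z_score(measured, certified, target_sd):
    """Proficiency-style z-score; -2 < z < 2 is the acceptance window."""
    return (measured - certified) / target_sd

# Invented SiO2 determination vs. an invented certified composition
z = z_score(measured=52.8, certified=52.3, target_sd=0.4)
print(f"z = {z:.2f}, acceptable: {-2 < z < 2}")
```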

  4. Validity of the reduced-sample insulin modified frequently-sampled intravenous glucose tolerance test using the nonlinear regression approach.

    Science.gov (United States)

    Sumner, Anne E; Luercio, Marcella F; Frempong, Barbara A; Ricks, Madia; Sen, Sabyasachi; Kushner, Harvey; Tulloch-Reid, Marshall K

    2009-02-01

    The disposition index, the product of the insulin sensitivity index (S(I)) and the acute insulin response to glucose, is linked in African Americans to chromosome 11q. This link was determined with S(I) calculated with the nonlinear regression approach to the minimal model and data from the reduced-sample insulin-modified frequently-sampled intravenous glucose tolerance test (Reduced-Sample-IM-FSIGT). However, the application of the nonlinear regression approach to calculate S(I) using data from the Reduced-Sample-IM-FSIGT has been challenged as being not only inaccurate but also having a high failure rate in insulin-resistant subjects. Our goal was to determine the accuracy and failure rate of the Reduced-Sample-IM-FSIGT using the nonlinear regression approach to the minimal model. With S(I) from the Full-Sample-IM-FSIGT considered the standard and using the nonlinear regression approach to the minimal model, we compared the agreement between S(I) from the Full- and Reduced-Sample-IM-FSIGT protocols. One hundred African Americans (body mass index, 31.3 +/- 7.6 kg/m(2) [mean +/- SD]; range, 19.0-56.9 kg/m(2)) had FSIGTs. Glucose (0.3 g/kg) was given at baseline. Insulin was infused from 20 to 25 minutes (total insulin dose, 0.02 U/kg). For the Full-Sample-IM-FSIGT, S(I) was calculated based on the glucose and insulin samples taken at -1, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 14, 16, 19, 22, 23, 24, 25, 27, 30, 40, 50, 60, 70, 80, 90, 100, 120, 150, and 180 minutes. For the Reduced-Sample-IM-FSIGT, S(I) was calculated based on a subset of these time points. Agreement was determined by Spearman correlation, concordance, and the Bland-Altman method. In addition, for both protocols, the population was divided into tertiles of S(I). Insulin resistance was defined by the lowest tertile of S(I) from the Full-Sample-IM-FSIGT. The distribution of subjects across tertiles was compared by rank order and kappa statistic. 
We found that the rate of failure of resolution of S(I) by

  5. CLSI-based transference of the CALIPER database of pediatric reference intervals from Abbott to Beckman, Ortho, Roche and Siemens Clinical Chemistry Assays: direct validation using reference samples from the CALIPER cohort.

    Science.gov (United States)

    Estey, Mathew P; Cohen, Ashley H; Colantonio, David A; Chan, Man Khun; Marvasti, Tina Binesh; Randell, Edward; Delvin, Edgard; Cousineau, Jocelyne; Grey, Vijaylaxmi; Greenway, Donald; Meng, Qing H; Jung, Benjamin; Bhuiyan, Jalaluddin; Seccombe, David; Adeli, Khosrow

    2013-09-01

    The CALIPER program recently established a comprehensive database of age- and sex-stratified pediatric reference intervals for 40 biochemical markers. However, this database was only directly applicable to Abbott ARCHITECT assays. We therefore sought to expand the scope of this database to biochemical assays from other major manufacturers, allowing for a much wider application of the CALIPER database. Based on CLSI C28-A3 and EP9-A2 guidelines, CALIPER reference intervals were transferred (using specific statistical criteria) to assays performed on four other commonly used clinical chemistry platforms: Beckman Coulter DxC800, Ortho Vitros 5600, Roche Cobas 6000, and Siemens Vista 1500. The resulting reference intervals were subjected to a thorough validation using 100 reference specimens (healthy community children and adolescents) from the CALIPER bio-bank, and all testing centers participated in an external quality assessment (EQA) evaluation. In general, the transferred pediatric reference intervals were similar to those established in our previous study. However, assay-specific differences in reference limits were observed for many analytes, and in some instances were considerable. The results of the EQA evaluation generally mimicked the similarities and differences in reference limits among the five manufacturers' assays. In addition, the majority of transferred reference intervals were validated through the analysis of CALIPER reference samples. This study greatly extends the utility of the CALIPER reference interval database, which is now directly applicable to assays performed on five major analytical platforms in clinical use, and should permit the worldwide application of CALIPER pediatric reference intervals. Copyright © 2013 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
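    The CLSI-style validation mentioned above is commonly operationalized by analyzing roughly 20 healthy reference individuals and accepting the transferred interval if no more than about 10% of them fall outside it. A minimal sketch of that check (the ALT values and interval limits are invented):

```python
def validate_transference(values, low, high, max_outside_frac=0.10):
    """Accept a transferred reference interval if at most ~10% of
    healthy reference samples fall outside it (CLSI-style check)."""
    outside = sum(1 for v in values if not low <= v <= high)
    return outside / len(values) <= max_outside_frac, outside

# Invented ALT results (U/L) for 20 healthy children, checked against an
# invented transferred interval of 10-35 U/L
alt = [12, 18, 22, 15, 30, 25, 19, 27, 16, 21,
       24, 14, 28, 20, 17, 23, 26, 13, 45, 22]
ok, n_outside = validate_transference(alt, low=10, high=35)
print(f"validated: {ok} ({n_outside} of {len(alt)} outside)")
```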

  6. Defining Leadership as Process Reference Model: Translating Organizational Goals into Practice Using a Structured Leadership Approach

    OpenAIRE

    Tuffley , David

    2010-01-01

    Effective leadership in organisations is important to the achievement of organizational objectives. Yet leadership is widely seen as a quality that individuals innately possess, and which cannot be learned. This paper makes two assertions: (a) that leadership is a skill that not only can be learned, but which can be formalized into a Process Reference Model that is intelligible from an Enterprise Architecture perspective, and (b) that Process Reference Models in the st...

  7. Certified Reference Materials for Radioactivity Measurements in Environmental Samples of Soil and Water: IAEA-444 and IAEA-445

    International Nuclear Information System (INIS)

    2011-01-01

    Reference materials are an important requirement for any sort of quantitative chemical and radiochemical analysis. Laboratories need them for calibration and quality control throughout their analytical work. The IAEA started to produce reference materials in the early 1960s to meet the needs of the analytical laboratories in its Member States that required reference materials for quality control of their measurements. The initial efforts were focused on the preparation of environmental reference materials containing anthropogenic radionuclides for use by those laboratories employing nuclear analytical techniques. These reference materials were characterized for their radionuclide content through interlaboratory comparison involving a core group of some 10 to 20 specialist laboratories. The success of these early exercises led the IAEA to extend its activities to encompass both terrestrial and marine reference materials containing primordial radionuclides and trace elements. Within the frame of IAEA activities in the production and certification of reference materials, this report describes the certification of IAEA-444 and IAEA-445: soil and water spiked with gamma-emitting radionuclides, respectively. Details are given on methodologies and data evaluation.

  8. Effects of achievement differences for internal/external frame of reference model investigations: A test of robustness of findings over diverse student samples.

    Science.gov (United States)

    Schmidt, Isabelle; Brunner, Martin; Preckel, Franzis

    2017-11-12

    Achievement in math and achievement in verbal school subjects are more strongly correlated than the respective academic self-concepts. The internal/external frame of reference model (I/E model; Marsh, 1986, Am. Educ. Res. J., 23, 129) explains this finding by social and dimensional comparison processes. We investigated a key assumption of the model that dimensional comparisons mainly depend on the difference in achievement between subjects. We compared correlations between subject-specific self-concepts of groups of elementary and secondary school students with or without achievement differences in the respective subjects. The main goals were (1) to show that effects of dimensional comparisons depend to a large degree on the existence of achievement differences between subjects, (2) to demonstrate the generalizability of findings over different grade levels and self-concept scales, and (3) to test a rarely used correlation comparison approach (CCA) for the investigation of I/E model assumptions. We analysed eight German elementary and secondary school student samples (grades 3-8) from three independent studies (Ns 326-878). Correlations between math and German self-concepts of students with identical grades in the respective subjects were compared with the correlation of self-concepts of students having different grades using Fisher's Z test for independent samples. In all samples, correlations between math self-concept and German self-concept were higher for students having identical grades than for students having different grades. Differences in median correlations had small effect sizes for elementary school students and moderate effect sizes for secondary school students. Findings generalized over grades and indicated a developmental aspect in self-concept formation. The CCA complements investigations within I/E-research. © 2017 The British Psychological Society.

  9. A semi-empirical approach to calculate gamma activities in environmental samples

    International Nuclear Information System (INIS)

    Palacios, D.; Barros, H.; Alfonso, J.; Perez, K.; Trujillo, M.; Losada, M.

    2006-01-01

    We propose a semi-empirical method to calculate radionuclide concentrations in environmental samples without the use of reference materials and avoiding the typical complexity of Monte-Carlo codes. The calculation of total efficiencies was carried out from a relative efficiency curve (obtained from the gamma spectra data) and the geometric (simulated by Monte-Carlo), absorption, sample and intrinsic efficiencies at energies between 130 and 3000 keV. The absorption and sample efficiencies were determined from the mass absorption coefficients, obtained with the web program XCOM. Deviations between computed results and measured efficiencies for the RGTh-1 reference material are mostly within 10%. Radionuclide activities in marine sediment samples calculated by the proposed method and by the experimental relative method were in satisfactory agreement. The developed method can be used for routine environmental monitoring when efficiency uncertainties of 10% are sufficient. (Author)
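    The abstract enumerates the partial efficiencies whose product gives the total efficiency; with the total efficiency in hand, the activity follows from the standard gamma-spectrometry relation A = N / (efficiency × gamma emission probability × time × mass). The sketch below uses invented placeholder values throughout:

```python
def total_efficiency(geometric, absorption, sample_eff, intrinsic):
    """Total efficiency as the product of the partial efficiencies
    listed in the abstract (all values used below are invented)."""
    return geometric * absorption * sample_eff * intrinsic

def activity_bq_per_kg(net_counts, eff_total, gamma_yield, live_time_s, mass_kg):
    """A = N / (efficiency * gamma emission probability * time * mass)."""
    return net_counts / (eff_total * gamma_yield * live_time_s * mass_kg)

eff = total_efficiency(geometric=0.05, absorption=0.92, sample_eff=0.85, intrinsic=0.60)
a = activity_bq_per_kg(net_counts=1500, eff_total=eff, gamma_yield=0.85,
                       live_time_s=80000, mass_kg=0.100)
print(f"total efficiency: {eff:.4f}; activity: {a:.1f} Bq/kg")
```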

  10. Polymers as reference partitioning phase: polymer calibration for an analytically operational approach to quantify multimedia phase partitioning

    DEFF Research Database (Denmark)

    Gilbert, Dorothea; Witt, Gesine; Smedes, Foppe

    2016-01-01

    Polymers are increasingly applied for the enrichment of hydrophobic organic chemicals (HOCs) from various types of samples and media in many analytical partitioning-based measuring techniques. We propose using polymers as a reference partitioning phase and introduce polymer-polymer partitioning......-air) and multimedia partition coefficients (lipid-water, air-water) were calculated by applying the new concept of a polymer as reference partitioning phase and by using polymer-polymer partition coefficients as conversion factors. The present study encourages the use of polymer-polymer partition coefficients...

  11. Reference gene validation for gene expression normalization in canine osteosarcoma : a geNorm algorithm approach

    NARCIS (Netherlands)

    Selvarajah, G.T.; Bonestroo, F.A.S.; Timmermans Sprang, E.P.M.; Kirpensteijn, J.|info:eu-repo/dai/nl/189846992; Mol, J.A.|info:eu-repo/dai/nl/070918775

    2017-01-01

    Background Quantitative PCR (qPCR) is a common method for quantifying mRNA expression. Given the heterogeneity present in tumor tissues, it is crucial to normalize target mRNA expression data using appropriate reference genes that are stably expressed under a variety of pathological and experimental

  12. Development of human protein reference database as an initial platform for approaching systems biology in humans

    DEFF Research Database (Denmark)

    Peri, Suraj; Navarro, J Daniel; Amanchy, Ramars

    2003-01-01

    Human Protein Reference Database (HPRD) is an object database that integrates a wealth of information relevant to the function of human proteins in health and disease. Data pertaining to thousands of protein-protein interactions, posttranslational modifications, enzyme/substrate relationships...

  13. Concepts and approaches for marine ecosystem research with reference to the tropics

    OpenAIRE

    Matthias Wolff

    2002-01-01

    The present article gives an overview of the leading concepts and modelling approaches for marine ecosystem research, including (1) the trophodynamic theory of pelagic ecosystems, (2) compartment/network models, (3) mesocosm experiments and (4) individual-based modelling approaches and virtual ecosystems (VE). The main research questions addressed, as well as the potential and limits of each approach, are summarized and discussed, and it is shown how the concept of ecosystem has changed over ...

  14. A comparative proteomics method for multiple samples based on an 18O-reference strategy and a quantitation and identification-decoupled strategy.

    Science.gov (United States)

    Wang, Hongbin; Zhang, Yongqian; Gui, Shuqi; Zhang, Yong; Lu, Fuping; Deng, Yulin

    2017-08-15

    Comparisons across large numbers of samples are frequently necessary in quantitative proteomics. Many quantitative methods used in proteomics are based on stable isotope labeling, but most of these are only useful for comparing two samples. For up to eight samples, the iTRAQ labeling technique can be used. For greater numbers of samples, the label-free method has been used, but this method has been criticized for low reproducibility and accuracy. An ingenious strategy has been introduced, comparing each sample against an 18O-labeled reference sample that was created by pooling equal amounts of all samples. However, it is necessary to use proportion-known protein mixtures to investigate and evaluate this new strategy. Another problem for comparative proteomics of multiple samples is the poor coincidence and reproducibility in protein identification results across samples. In the present study, a method combining the 18O-reference strategy and a quantitation and identification-decoupled strategy was investigated with proportion-known protein mixtures. The results clearly demonstrated that the 18O-reference strategy had greater accuracy and reliability than other previously used comparison methods based on transferring comparison or label-free strategies. By the decoupling strategy, the quantification data acquired by LC-MS and the identification data acquired by LC-MS/MS are matched and correlated to identify differentially expressed proteins, according to retention time and accurate mass. This strategy made protein identification possible for all samples using a single pooled sample, and therefore gave good reproducibility in protein identification across multiple samples, and allowed for optimizing peptide identification separately so as to identify more proteins. Copyright © 2017 Elsevier B.V. All rights reserved.
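    The arithmetic behind the pooled-reference design can be sketched in a few lines: every sample is measured against the same 18O-labeled pooled reference, so any two samples can be compared as a ratio of their sample/reference ratios, with the reference cancelling out. The intensities below are invented:

```python
# Hypothetical peptide intensities: (sample intensity, 18O-reference intensity)
measurements = {
    "sample_A": (2.0e6, 1.0e6),
    "sample_B": (1.0e6, 1.0e6),
    "sample_C": (0.5e6, 1.0e6),
}

# Each sample is quantified relative to the same pooled reference ...
ratios = {name: light / heavy for name, (light, heavy) in measurements.items()}

# ... so any pairwise fold change is a ratio of ratios, reference cancelled
fold_a_vs_b = ratios["sample_A"] / ratios["sample_B"]
print(fold_a_vs_b)  # → 2.0
```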

  15. Descriptions of Sampling Practices Within Five Approaches to Qualitative Research in Education and the Health Sciences

    Directory of Open Access Journals (Sweden)

    Timothy C. Guetterman

    2015-05-01

    Although recommendations exist for determining qualitative sample sizes, the literature appears to contain few instances of research on the topic. Practical guidance is needed for determining sample sizes to conduct rigorous qualitative research, to develop proposals, and to budget resources. The purpose of this article is to describe qualitative sample size and sampling practices within published studies in education and the health sciences by research design: case study, ethnography, grounded theory methodology, narrative inquiry, and phenomenology. I analyzed the 51 most highly cited studies using predetermined content categories and noteworthy sampling characteristics that emerged. In brief, the findings revealed a mean sample size of 87. Less than half of the studies identified a sampling strategy. I include a description of findings by approach and recommendations for sampling to assist methodologists, reviewers, program officers, graduate students, and other qualitative researchers in understanding qualitative sampling practices in recent studies. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1502256

  16. Missing citations due to exact reference matching: Analysis of a random sample from WoS. Are publications from peripheral countries disadvantaged?

    Energy Technology Data Exchange (ETDEWEB)

    Donner, P.

    2016-07-01

    Citation counts of scientific research contributions are fundamental data in scientometrics. Accuracy and completeness of citation links are therefore crucial data quality issues (Moed, 2005, Ch. 13). However, despite the known flaws of reference matching algorithms, usually no attempts are made to incorporate uncertainty about citation counts into indicators. This study is a step towards that goal. Particular attention is paid to the question whether publications from countries not using basic Latin script are differently affected by missed citations. The proprietary reference matching procedure of Web of Science (WoS) is based on (near) exact agreement of cited reference data (normalized during processing) to the target paper's bibliographical data. Consequently, the procedure has near-optimal precision but incomplete recall - it is known to miss some slightly inaccurate reference links (Olensky, 2015). However, there has been no attempt so far to estimate the rate of missed citations by a principled method for a random sample. For this study a simple random sample of WoS source papers was drawn and it was attempted to find all reference strings of WoS indexed documents that refer to them, in particular inexact matches. The objective is to give a statistical estimate of the proportion of missed citations and to describe the relationship of the number of found citations to the number of missed citations, i.e. the conditional error distribution. The empirical error distribution is statistically analyzed and modelled. (Author)
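    Estimating the proportion of missed citations from a simple random sample is a standard binomial-proportion problem. The sketch below computes a Wilson score interval; the counts are invented and only illustrate the kind of estimate described, not the study's results:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Invented example: 37 missed citations found among 1000 sampled reference links
lo, hi = wilson_interval(37, 1000)
print(f"estimated missed-citation rate: 3.7% (95% CI {lo:.1%} to {hi:.1%})")
```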

  17. Dynamic flow-through approaches for metal fractionation in environmentally relevant solid samples

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald; Chomchoei, Roongrat

    2005-01-01

    generations of flow-injection analysis. Special attention is also paid to a novel, robust, non-invasive approach for on-site continuous sampling of soil solutions, capitalizing on flow-through microdialysis, which presents itself as an appealing complementary approach to the conventional lysimeter experiments...

  18. Improving the efficiency of quantitative (1)H NMR: an innovative external standard-internal reference approach.

    Science.gov (United States)

    Huang, Yande; Su, Bao-Ning; Ye, Qingmei; Palaniswamy, Venkatapuram A; Bolgar, Mark S; Raglione, Thomas V

    2014-01-01

    The classical internal standard quantitative NMR (qNMR) method determines the purity of an analyte from a single measurement of a solution containing both the analyte and a standard. Therefore, the standard must meet the requirements of chemical compatibility and lack of resonance interference with the analyte, as well as having a known purity. The identification of such a standard can be time consuming and must be repeated for each analyte. In contrast, the external standard qNMR method utilizes a standard with a known purity to calibrate the NMR instrument. The external standard and the analyte are measured separately, thereby eliminating the matter of chemical compatibility and resonance interference between the standard and the analyte. However, the instrumental factors, including the quality of the NMR tubes, must be kept the same. Any deviations will compromise the accuracy of the results. The innovative qNMR method reported herein utilizes an internal reference substance along with an external standard to assume the role of the standard used in the traditional internal standard qNMR method. In this new method, the internal reference substance must only be chemically compatible and free of resonance interference with the analyte or external standard, whereas the external standard must only be of a known purity. The exact purity or concentration of the internal reference substance is not required as long as the same quantity is added to the external standard and the analyte. The new method significantly reduces the burden of searching for an appropriate standard for each analyte. Therefore, the efficiency of the qNMR purity assay increases while the precision of the internal standard method is retained. Copyright © 2013 Elsevier B.V. All rights reserved.
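    The arithmetic of this scheme can be sketched as follows: because the same quantity of the internal reference is added to both the external-standard tube and the analyte tube, its integral cancels as a ratio of ratios and its own purity never enters. The function and all numbers below are illustrative, not the published procedure:

```python
def purity_ext_std_int_ref(I_a, I_ref_a, I_std, I_ref_std,
                           N_a, N_std, M_a, M_std, m_a, m_std, P_std):
    """Analyte purity from an external standard plus a shared internal reference.

    I_*  : peak integrals (analyte/standard vs. the internal reference
           in their respective tubes)
    N_*  : protons giving rise to each integrated peak
    M_*  : molar masses; m_* : weighed masses; P_std : standard purity
    The internal reference amount is identical in both tubes, so its
    integral cancels as a ratio of ratios.
    """
    ratio = (I_a / I_ref_a) / (I_std / I_ref_std)
    return ratio * (N_std / N_a) * (M_a / M_std) * (m_std / m_a) * P_std

# Consistency check with invented numbers: if both tubes behave identically
# and the analyte is weighed like the standard, the standard purity is returned.
p = purity_ext_std_int_ref(I_a=1.0, I_ref_a=1.0, I_std=1.0, I_ref_std=1.0,
                           N_a=3, N_std=3, M_a=180.2, M_std=180.2,
                           m_a=10.0, m_std=10.0, P_std=0.998)
print(p)  # → 0.998
```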

  19. Role of Sample Processing Strategies at the European Union National Reference Laboratories (NRLs) Concerning the Analysis of Pesticide Residues

    DEFF Research Database (Denmark)

    Hajeb, Parvaneh; Herrmann, Susan Strange; Poulsen, Mette Erecius

    2017-01-01

    to the European Union Reference Laboratory on Cereals and Feedingstuff (EURL) for the determination of the particle size distribution and pesticide residue recovery. The results showed that the NRLs used several different brands and types of mills. Large variations in the particle size distributions and pesticide...

  20. A tuning approach for offset-free MPC with conditional reference adaptation

    DEFF Research Database (Denmark)

    Waschl, Harald; Jørgensen, John Bagterp; Huusom, Jakob Kjøbsted

    2014-01-01

    Model predictive control has become a widely accepted strategy in industrial applications in recent years. Often-mentioned reasons for its success are the optimization based on a system model, consideration of constraints and an intuitive tuning process. However, as soon as unknown disturbances...... properties these controllers can be tuned separately and by known guidelines. To address conditions with active input constraints, a conditional reference adaptation scheme is additionally introduced. The tuning strategy is evaluated on a simulated linear Wood-Berry binary distillation column example.

  1. A sampling approach to constructing Lyapunov functions for nonlinear continuous–time systems

    NARCIS (Netherlands)

    Bobiti, R.V.; Lazar, M.

    2016-01-01

    The problem of constructing a Lyapunov function for continuous-time nonlinear dynamical systems is tackled in this paper via a sampling-based approach. The main idea of the sampling-based method is to verify a Lyapunov-type inequality for a finite number of points (known state vectors) in the

  2. A model based approach to reference-free straightness measurement at the Nanometer Comparator

    Science.gov (United States)

    Weichert, C.; Stavridis, M.; Walzel, M.; Elster, C.; Wiegmann, A.; Schulz, M.; Köning, R.; Flügge, J.; Tutsch, R.

    2009-06-01

    The Nanometer Comparator is the PTB reference length measuring machine for high-precision calibrations of line scales and encoder systems. Up to now, the Nanometer Comparator has allowed the position of line structures to be measured in one dimension only. For high-precision characterisations of masks, scales and incremental encoders, the measurement of the straightness of graduations is a requirement of emerging lithography techniques. Therefore the Nanometer Comparator will be equipped with an additional short-range measurement system in the Y-direction, realized as a single-path plane mirror interferometer and expected to achieve sub-nm uncertainties. To compensate for the topography of the Y-mirror, the Traceable Multi Sensor (TMS) method will be implemented to achieve a reference-free straightness measurement. Virtual experiments are used to estimate the lower accuracy limit and to determine the sensitive parameters. The virtual experiments include the influence of the positioning devices, interferometer errors as well as non-perfect adjustment and fabrication of the machine geometry. The whole dynamic measurement process of the Nanometer Comparator, including its influence on the TMS analysis, e.g. non-equally spaced measurement points, is simulated. We will present the results of these virtual experiments as well as the most relevant error sources for straightness measurement, incorporating the low uncertainties of the existing and planned measurement systems.

  3. Ecosystem Approach to Fisheries Management in Indonesia: Review on Indicators and Reference Values

    NARCIS (Netherlands)

    Hutubessy, B.G.; Mosse, J.W.

    2015-01-01

    Although many definitions present the concept of the ecosystem approach to fisheries (EAF), there is a lack of consensus on its definition and scope for management. Design and implementation of this specific management approach are still ambiguous because the formulation criteria are not specified.

  4. On Virtual Face-Work: An Ethnography of Communication Approach to a Live Chat Reference Interaction

    Science.gov (United States)

    Radford, Marie L.; Radford, Gary P.; Connaway, Lynn Silipigni; DeAngelis, Jocelyn A.

    2011-01-01

    Erving Goffman's theoretical framework and concept of face-work has the potential to greatly increase the understanding of interpersonal dynamics in computer-mediated communication realms. This research used an ethnography of communication approach and the concept of face-work to analyze the transcript of an interaction between a librarian and a…

  5. The Framework of Reference for Pluralistic Approaches to Languages and Cultures

    DEFF Research Database (Denmark)

    Candelier, Michel; Daryai-Hansen, Petra Gilliyard; Schröder-Sura, Anna

    2012-01-01

    enterprise conforming to this concept. In the third section, the focus lies on the FREPA framework itself, presenting the framework in more detail while stressing its role as an innovative complement to the CEFR. The fourth section aims to show how the FREPA can support the use of pluralistic approaches...

  6. Anthology of the renin-angiotensin system: a one hundred reference approach to angiotensin II antagonists.

    Science.gov (United States)

    Ménard, J

    1993-04-01

    To provide a historical overview of the renin-angiotensin system as a guide to the introduction of a new therapeutic pathway, non-peptide inhibition of angiotensin II. One hundred references were selected as a personal preference, for their originality or for their potential impact on medicine. This review raises the following questions for future research. (1) Will the long-term cardiovascular effects of angiotensin converting enzyme (ACE) inhibition, angiotensin II antagonism and renin inhibition be similar or not, and dependent or independent of blood pressure levels? (2) What are the local-regional interactions between vasoconstrictor and vasodilator systems, and does the renin-angiotensin system synchronize these regional hemodynamic regulatory mechanisms? (3) If hypertension is the result of an interaction between genetic and environmental factors, do proteins secreted through constitutive pathways contribute to the genetic abnormality (prorenin, angiotensinogen, ACE) while regulated secretion (renin) and other regulatory mechanisms (angiotensin II receptors) provide biological support for the environmental effects?

  7. A reference stress approach for the characterisation of the creep failure of dissimilar welds under isothermal conditions

    International Nuclear Information System (INIS)

    Nicholson, R.D.; Williams, J.A.

    1988-11-01

    In high-temperature power plant, welds between austenitic and ferritic steels are required to operate under plant conditions for up to 250,000 h. The experience and failure modes for such joints are briefly surveyed in this report. A semi-empirical reference stress approach is used to define the failure life of joints under isothermal conditions. The reference stress is based on a previously published form for multiaxial creep fracture of homogeneous materials but modified to include an additional factor to reflect the complex strains present close to the interface in a dissimilar weld. This reference stress can be modified to give approximate bounds characterised by the equivalent stress or the axial stress on the weld. The reference stress, when applied to the 2¼Cr1Mo:Type 316 welded component data base, gives conservative results for the test data available, although conservatism is low for the 9Cr1Mo:Alloy 600 combination. The existing data base for welded components is limited. More data are needed covering a wider range of stress ratios and incorporating bending loads. (author)

  8. Pi sampling: a methodical and flexible approach to initial macromolecular crystallization screening

    International Nuclear Information System (INIS)

    Gorrec, Fabrice; Palmer, Colin M.; Lebon, Guillaume; Warne, Tony

    2011-01-01

    Pi sampling, derived from the incomplete factorial approach to macromolecular crystallization screen design, is an effort to maximize the diversity of macromolecular crystallization conditions and to facilitate the preparation of 96-condition initial screens. The resulting ‘Pi screens’ have a modular distribution of a given set of up to 36 stock solutions. Maximally diverse conditions can be produced by taking into account the properties of the chemicals used in the formulation and the concentrations of the corresponding solutions. The Pi sampling method has been implemented in a web-based application that generates screen formulations and recipes. It is particularly adapted to screens consisting of 96 different conditions. The flexibility and efficiency of Pi sampling is demonstrated by the crystallization of soluble proteins and of an integral membrane-protein sample.
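As an illustration of the incomplete-factorial idea behind such screens (a sketch, not the published Pi algorithm; the factor names and levels below are hypothetical), each stock-solution factor can contribute every level equally often while the combinations across the 96 rows vary:

```python
import random

def incomplete_factorial_screen(factors, n_conditions=96, seed=1):
    """Sample n_conditions level combinations so that every level of every
    factor appears equally often, while row-wise combinations stay diverse."""
    rng = random.Random(seed)
    columns = {}
    for name, levels in factors.items():
        # Repeat the levels up to n_conditions entries, then shuffle each
        # column independently: balanced marginals, varied combinations.
        col = (levels * (n_conditions // len(levels) + 1))[:n_conditions]
        rng.shuffle(col)
        columns[name] = col
    return [{name: columns[name][i] for name in factors}
            for i in range(n_conditions)]
```

With 4 precipitants, 6 pH levels and 4 salts, each pH value appears in exactly 16 of the 96 conditions and each salt in exactly 24.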

  9. Importance of participation rate in sampling of data in population based studies, with special reference to bone mass in Sweden.

    OpenAIRE

    Düppe, H; Gärdsell, P; Hanson, B S; Johnell, O; Nilsson, B E

    1996-01-01

    OBJECTIVE: To study the effects of participation rate in sampling on "normative" bone mass data. DESIGN: This was a comparison between two randomly selected samples from the same population. The participation rates in the two samples were 61.9% and 83.6%. Measurements were made of bone mass at different skeletal sites and of muscle strength, as well as an assessment of physical activity. SETTING: Malmö, Sweden. SUBJECTS: There were 230 subjects (117 men, 113 women), aged 21 to 42 years. RESUL...

  10. Non-destructive alpha-particle activation analysis of P, Cl, K and Ca in marine macro-alga samples using synthetic multielement reference material as comparative standard

    International Nuclear Information System (INIS)

    Iwata, Y.; Naitoh, H.; Suzuki, N.

    1992-01-01

    A Synthetic Reference Material (SyRM) containing accurately known amounts of 12 elements has been prepared. The elemental composition of the SyRM closely resembles that of marine macro-algae. The composition of the SyRM was controlled through the starting materials used for the synthesis. The SyRM was used as a comparative standard for non-destructive alpha-particle activation analysis of marine macro-alga samples. P, Cl, K and Ca were determined simultaneously without correction for alpha range arising from differences in elemental composition between the analytical samples and the comparative standard. (author) 19 refs.; 4 tabs

  11. Microscopic diagnosis of sodium acetate-acetic acid-formalin-fixed stool samples for helminths and intestinal protozoa: a comparison among European reference laboratories.

    Science.gov (United States)

    Utzinger, J; Botero-Kleiven, S; Castelli, F; Chiodini, P L; Edwards, H; Köhler, N; Gulletta, M; Lebbad, M; Manser, M; Matthys, B; N'Goran, E K; Tannich, E; Vounatsou, P; Marti, H

    2010-03-01

    The present study aimed to compare the diagnostic performance of different European reference laboratories in diagnosing helminths and intestinal protozoa, using an ether-concentration method applied to sodium acetate-acetic acid-formalin (SAF)-preserved faecal samples. In total, 102 stool specimens were analysed during a cross-sectional parasitological survey in urban farming communities in Côte d'Ivoire. Five SAF-preserved faecal samples were prepared from each specimen and forwarded to the participating reference laboratories, processed and examined under a microscope adhering to a standard operating procedure (SOP). Schistosoma mansoni (cumulative prevalence: 51.0%) and hookworm (cumulative prevalence: 39.2%) were the predominant helminths, for which there was excellent agreement among laboratories (kappa > 0.8). The predominant intestinal protozoa were Entamoeba coli (median prevalence: 67.6%), Blastocystis hominis (median prevalence: 55.9%) and Entamoeba histolytica/Entamoeba dispar (median prevalence: 47.1%). Substantial agreement among reference laboratories was found for E. coli (kappa = 0.69), but only fair or moderate agreement for other Entamoeba species, Giardia intestinalis and Chilomastix mesnili. There was only poor agreement for B. hominis, Isospora belli and Trichomonas intestinalis. In conclusion, although common helminths were reliably diagnosed by European reference laboratories, there was only moderate agreement between centres for pathogenic intestinal protozoa. Continued external quality assessment and the establishment of a formal network of reference laboratories are necessary to further enhance both accuracy and uniformity in parasite diagnosis.
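The agreement statistic quoted above is Cohen's kappa, which corrects the raw percentage agreement between two laboratories for the agreement expected by chance alone. A minimal sketch for binary presence/absence calls on the same set of specimens:

```python
def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' binary (0/1) presence/absence calls."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of specimens with identical calls.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal positive rate.
    pa = sum(rater_a) / n
    pb = sum(rater_b) / n
    expected = pa * pb + (1 - pa) * (1 - pb)
    return (observed - expected) / (1 - expected)
```

Identical call sequences give kappa = 1.0, while kappa near 0 means the laboratories agree no more often than chance would predict.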

  12. Sample preparation with solid phase microextraction and exhaustive extraction approaches: Comparison for challenging cases.

    Science.gov (United States)

    Boyacı, Ezel; Rodríguez-Lafuente, Ángel; Gorynski, Krzysztof; Mirnaghi, Fatemeh; Souza-Silva, Érica A; Hein, Dietmar; Pawliszyn, Janusz

    2015-05-11

    In chemical analysis, sample preparation is frequently considered the bottleneck of the entire analytical method. The success of the final method strongly depends on understanding the entire process of analysis of a particular type of analyte in a sample, namely: the physicochemical properties of the analytes (solubility, volatility, polarity, etc.), the environmental conditions, and the matrix components of the sample. Various sample preparation strategies have been developed based on exhaustive or non-exhaustive extraction of analytes from matrices. Undoubtedly, amongst all sample preparation approaches, liquid-liquid extraction (LLE) and solid phase extraction (SPE) are the most well-known, widely used, and commonly accepted methods by many international organizations and accredited laboratories. Both methods are well documented and there are many well-defined procedures, which make them, at first sight, the methods of choice. However, many challenging tasks, such as complex matrix applications, on-site and in vivo applications, and determination of matrix-bound and free concentrations of analytes, are not easily attainable with these classical approaches to sample preparation. In the last two decades, the introduction of solid phase microextraction (SPME) has brought significant progress in the sample preparation area by facilitating on-site and in vivo applications, as well as time weighted average (TWA) and instantaneous concentration determinations. Recently introduced matrix-compatible coatings for SPME facilitate direct extraction from complex matrices and fill the gap in direct sampling from challenging matrices. Following the introduction of SPME, numerous other microextraction approaches evolved to address limitations of the above-mentioned techniques. There is not a single method that can be considered a universal solution for sample preparation. This review aims to show the main advantages and limitations of the above-mentioned sample preparation approaches.
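For context on why SPME is non-exhaustive: the amount of analyte extracted by a fiber coating at equilibrium is commonly described by n = Kfs·Vf·Vs·C0 / (Kfs·Vf + Vs), where Kfs is the coating/sample distribution constant, Vf and Vs the coating and sample volumes, and C0 the initial concentration. When Vs is large, n becomes independent of the sample volume, which is what makes on-site and in vivo sampling practical. A sketch (units are the caller's responsibility; the numbers in the note below are illustrative):

```python
def spme_equilibrium_amount(k_fs, v_f, v_s, c0):
    """Equilibrium amount extracted by an SPME coating:
    n = K_fs * V_f * V_s * C0 / (K_fs * V_f + V_s).
    Use consistent units, e.g. volumes in mL and C0 in ng/mL -> n in ng."""
    return k_fs * v_f * v_s * c0 / (k_fs * v_f + v_s)
```

For K_fs = 1000, a 5e-4 mL coating and a 1 L sample at 1 ng/mL, the extracted amount is within 0.1% of the large-volume limit K_fs·V_f·C0 = 0.5 ng; conversely, when K_fs·V_f greatly exceeds V_s the extraction becomes essentially exhaustive (n approaches V_s·C0).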

  13. Gamma-hydroxybutyric acid endogenous production and post-mortem behaviour - the importance of different biological matrices, cut-off reference values, sample collection and storage conditions.

    Science.gov (United States)

    Castro, André L; Dias, Mário; Reis, Flávio; Teixeira, Helena M

    2014-10-01

    Gamma-Hydroxybutyric Acid (GHB) is an endogenous compound with a history of clinical use since the 1960s. However, due to its secondary effects, it has become a controlled substance, entering the illicit market for recreational and "dance club scene" use, muscle enhancement purposes and drug-facilitated sexual assaults. Its endogenous nature can create difficulties when interpreting, in a forensic context, the analytical values obtained in biological samples. This manuscript reviewed several crucial aspects of GHB forensic toxicology evaluation, such as its post-mortem behaviour in biological samples; endogenous production values, both in vivo and in post-mortem samples; sampling and storage conditions (including stability tests); and cut-off reference values for different biological samples, such as whole blood, plasma, serum, urine, saliva, bile, vitreous humour and hair. This review highlights the need for specific sampling care, storage conditions, and interpretation of cut-off reference values in different biological samples, essential for proper practical application in forensic toxicology. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  14. [Reference citation].

    Science.gov (United States)

    Brkić, Silvija

    2013-01-01

    Scientific and professional papers represent the information basis for scientific research and professional work. References important for the paper should be cited within the text, and listed at the end of the paper. This paper deals with different styles of reference citation. Special emphasis was placed on the Vancouver Style for reference citation in biomedical journals established by the International Committee of Medical Journal Editors. It includes original samples for citing various types of articles, both printed and electronic, as well as recommendations related to reference citation in accordance with the methodology and ethics of scientific research and guidelines for preparing manuscripts for publication.

  15. Morphometric X-ray Absorptiometry: Reference Data for Vertebral Dimensions in a Population-based Sample of Young Danish Men

    International Nuclear Information System (INIS)

    Wulff, R.; Koch Holst, A.; Nielsen, T.L.; Andersen, M.; Hagen, C.; Brixen, K.

    2004-01-01

    Purpose: To determine reference values for vertebral heights in healthy young Danish males using morphometric X-ray absorptiometry (MXA). Material and Methods: A population-based study group of 487 males aged between 20 and 30 years (mean 25 years) from the county of Funen, Denmark, was recruited. Using a Hologic QDR 4500 dual energy X-ray absorptiometry (DXA) scanner, MXA scans covering the vertebrae from T4 to L4 were acquired for each subject. Anterior (Ha), middle (Hm), and posterior (Hp) heights of the thoracic (T4-T12) and lumbar (L1-L4) vertebral bodies were measured. Moreover, wedge, mid-wedge, crush I, and crush II ratios were calculated. Results: No correlation between vertebral dimensions and crush indices on the one hand and age or weight on the other was found. Body height, however, correlated significantly with the cumulated vertebral heights. Reference data for vertebral dimensions, wedge, mid-wedge, crush I, and crush II are tabulated. Conclusion: The anterior, middle, and posterior heights of the vertebral bodies of T4 to L4 can be measured reproducibly with MXA. In young men, the cumulative vertebral heights correlated with body height but not with age. Moreover, the wedge and crush indices were unrelated to both age and height.

  16. Ecosystem approach to fisheries: Exploring environmental and trophic effects on Maximum Sustainable Yield (MSY) reference point estimates.

    Directory of Open Access Journals (Sweden)

    Rajeev Kumar

    We present a comprehensive analysis of the estimation of fisheries Maximum Sustainable Yield (MSY) reference points using an ecosystem model built for Mille Lacs Lake, the second largest lake within Minnesota, USA. Data from single-species modelling output, extensive annual sampling for species abundances, annual catch-survey, stomach-content analysis for predator-prey interactions, and expert opinions were brought together within the framework of an Ecopath with Ecosim (EwE) ecosystem model. An increase in the lake water temperature has been observed in recent decades; therefore, we also incorporated a temperature forcing function in the EwE model to capture the influence of changing temperature on the species composition and food web. The EwE model was fitted to abundance and catch time-series for the period 1985 to 2006. Using the ecosystem model, we estimated reference points for most of the fished species in the lake at single-species as well as ecosystem levels, with and without considering the influence of temperature change; our analysis therefore investigated both the trophic and the temperature effects on the reference points. The paper concludes that reference points such as MSY are not stationary, but change when (1) environmental conditions alter species productivity and (2) fishing on predators alters the compensatory response of their prey. Thus, it is necessary for management to re-estimate or re-evaluate the reference points when changes in environmental conditions and/or major shifts in species abundance or community structure are observed.

  17. Sensitivity and specificity of normality tests and consequences on reference interval accuracy at small sample size: a computer-simulation study.

    Science.gov (United States)

    Le Boedec, Kevin

    2016-12-01

    According to international guidelines, parametric methods must be chosen for RI construction when the sample size is small and the distribution is Gaussian. However, normality tests may not be accurate at small sample size. The purpose of the study was to evaluate normality test performance in properly identifying samples extracted from a Gaussian population at small sample sizes, and to assess the consequences for RI accuracy of applying parametric methods to samples that falsely identified the parent population as Gaussian. Samples of n = 60 and n = 30 values were randomly selected 100 times from simulated Gaussian, lognormal, and asymmetric populations of 10,000 values. The sensitivity and specificity of 4 normality tests were compared. Reference intervals were calculated using 6 different statistical methods from samples that falsely identified the parent population as Gaussian, and their accuracy was compared. Shapiro-Wilk and D'Agostino-Pearson tests were the best performing normality tests; however, their specificity was poor at sample size n = 30. Applying robust methods (after Box-Cox transformation) to all samples regardless of their distribution, or adjusting the significance level of normality tests depending on sample size, would limit the risk of constructing inaccurate RI. © 2016 American Society for Veterinary Clinical Pathology.
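A small simulation in the spirit of the study (a sketch, not the author's code; assumes NumPy and SciPy are available) estimates how often a Shapiro-Wilk test correctly accepts Gaussian samples and correctly rejects clearly skewed lognormal ones at n = 30:

```python
import numpy as np
from scipy import stats

def normality_pass_rate(sampler, n=30, reps=200, alpha=0.05, seed=0):
    """Fraction of simulated samples that a Shapiro-Wilk test calls Gaussian
    (p > alpha). `sampler(rng, n)` draws one sample of size n."""
    rng = np.random.default_rng(seed)
    passes = 0
    for _ in range(reps):
        sample = sampler(rng, n)
        _, p = stats.shapiro(sample)
        passes += p > alpha
    return passes / reps

# Specificity: samples from a true Gaussian parent should rarely be rejected.
spec = normality_pass_rate(lambda rng, n: rng.normal(0.0, 1.0, n))
# Sensitivity to skew: lognormal samples should usually fail the test.
lognorm_pass = normality_pass_rate(lambda rng, n: rng.lognormal(0.0, 1.0, n))
```

With alpha = 0.05 the Gaussian pass rate sits near 0.95 by construction, while the lognormal pass rate is far lower; repeating with subtler departures from normality (e.g. mildly skewed parents) reproduces the paper's point that small-sample tests miss them.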

  18. A Holistic Approach with Special Reference to Heat Transfer in Multi-Component Porous Media Systems

    Directory of Open Access Journals (Sweden)

    A. K. Borah

    2010-06-01

    Problems involving multiphase flow, heat transfer and multi-component mass transport in porous media arise in a number of scientific and engineering disciplines. Important technological applications include thermally enhanced oil recovery, subsurface contamination and remediation, capillary assisted thermal technologies, drying processes, thermal insulation materials, multiphase trickle bed reactors, nuclear reactor safety analysis, high level radioactive waste repositories and geothermal energy exploitation. In this paper we demonstrate that multiphase flows in porous media are driven by gravitational, capillary and viscous forces, with gravity causing phase migration in the direction of the gravitational field. Microscopic modelling efforts were made to accurately incorporate microscopic interfacial phenomena. Multi-scale modelling approaches were attempted in order to transmit information over various length scales, ranging from micro-scale, meso-scale and macro-scale up to the field scale.

  19. Raman spectroscopy for forensic examination of β-ketophenethylamine "legal highs": reference and seized samples of cathinone derivatives.

    Science.gov (United States)

    Stewart, Samantha P; Bell, Steven E J; Fletcher, Nicholas C; Bouazzaoui, Samira; Ho, Yen Cheng; Speers, S James; Peters, K Laota

    2012-01-20

    Raman spectra of a representative range of β-ketophenethylamines (β-KPs), the rapidly growing family of cathinone-related "legal high" recreational drugs, have been recorded. These spectra showed characteristic changes associated with the pattern of substitution on the aromatic rings; for example, compounds carrying substituents at the 4-position could be distinguished from 3,4-methylenedioxy "ecstasy" derivatives. They also showed small but detectable changes with differences in substitution on the ethylamine substituent. These features allowed the β-KPs present in seized casework samples to be identified. The seized samples typically contained only small amounts of bulking agents, which meant that the band intensities of these components within averaged data were very small. In contrast, grid sampling normally gave at least some spectra with a higher than average proportion of the bulking agent(s), which allowed them to also be identified. This study therefore demonstrates that Raman spectroscopy can be used both to provide a rapid, non-destructive technique for identification of this class of drugs in seized samples and to detect minor constituents, giving a composition profile which can be used for drugs intelligence work. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  20. Another look at economic approaches to environmental management and policy with reference to developments in South Africa

    Directory of Open Access Journals (Sweden)

    Martin De Wit

    2016-12-01

    The wide acceptance of economic approaches to environmental management and policy masks increasing heterogeneity in the field. This editorial addresses the question of whether the economic approach is still warranted and under which conditions. A broad outline of trends in both orthodox and heterodox economic approaches is also presented. The traditional split between environmental and ecological economics does not do justice to recent developments in the field. Instead, it is proposed to refer to Environmental, Resource and Ecological Economics (EREE), Ecological-Economic Systems (EES) and Socio-Ecological Systems (SES) approaches, as well as Heterodox approaches to Environment and Sustainability (HEES). The contributions made to this special issue are placed within their respective subfields of influence. It is concluded that a deeper, self-critical exposition of moral philosophies and values, as well as of models of reality, is needed. A strategy of engagement in an attitude of self-criticism, humility and participation with others is proposed as a viable way forward. For such a process to be successful, two conditions are required: valuing the human person and accepting the reality of a nondeterminate world full of meaning.

  1. Laser ablation aerosol particle time-of-flight mass spectrometer (LAAPTOF): performance, reference spectra and classification of atmospheric samples

    Science.gov (United States)

    Shen, Xiaoli; Ramisetty, Ramakrishna; Mohr, Claudia; Huang, Wei; Leisner, Thomas; Saathoff, Harald

    2018-04-01

    The laser ablation aerosol particle time-of-flight mass spectrometer (LAAPTOF, AeroMegt GmbH) is able to identify the chemical composition and mixing state of individual aerosol particles, and thus is a tool for elucidating their impacts on human health, visibility, ecosystem, and climate. The overall detection efficiency (ODE) of the instrument we use was determined to range from ˜ (0.01 ± 0.01) to ˜ (4.23 ± 2.36) % for polystyrene latex (PSL) in the size range of 200 to 2000 nm, ˜ (0.44 ± 0.19) to ˜ (6.57 ± 2.38) % for ammonium nitrate (NH4NO3), and ˜ (0.14 ± 0.02) to ˜ (1.46 ± 0.08) % for sodium chloride (NaCl) particles in the size range of 300 to 1000 nm. Reference mass spectra of 32 different particle types relevant for atmospheric aerosol (e.g. pure compounds NH4NO3, K2SO4, NaCl, oxalic acid, pinic acid, and pinonic acid; internal mixtures of e.g. salts, secondary organic aerosol, and metallic core-organic shell particles; more complex particles such as soot and dust particles) were determined. Our results show that internally mixed aerosol particles can result in spectra with new clusters of ions, rather than simply a combination of the spectra from the single components. An exemplary 1-day ambient data set was analysed by both classical fuzzy clustering and a reference-spectra-based classification method. Resulting identified particle types were generally well correlated. We show how a combination of both methods can greatly improve the interpretation of single-particle data in field measurements.

  2. A dimensional approach to personality disorders in a sample of juvenile offenders

    Directory of Open Access Journals (Sweden)

    Daniela Cantone

    2012-03-01

    In a sample of 60 male Italian subjects imprisoned at a juvenile detention institute (JDI), psychopathological aspects of Axis II were described and the validity of a psychopathological dimensional approach for describing criminological issues was examined. The data show that the sample has psychopathological characteristics which revolve around ego weakness and poor management of relations and aggression. Statistically, these psychopathological characteristics explain 85% of criminal behavior.

  3. An approach and a tool for setting sustainable energy retrofitting strategies referring to the 2010 EPBD

    Directory of Open Access Journals (Sweden)

    Charlot-Valdieu, C.

    2011-10-01

    The 2010 EPBD asks for an economic and social analysis in order to preserve social equity and to promote innovation and building productivity. This is possible with a life cycle energy cost (LCEC) analysis, such as with the SEC (Sustainable Energy Cost) model, whose bottom-up approach begins with a building typology that includes inhabitants. The analysis of representative buildings then covers the identification of a technico-economic optimum and energy retrofitting scenarios for each retrofitting programme, followed by extrapolation to the whole building stock. Extrapolation to the whole building stock makes it possible to set up the strategy and to identify the means needed to reach the objectives. SEC is a decision-aid tool for optimising sustainable energy retrofitting strategies for buildings at territorial and patrimonial scales within a sustainable development approach towards factor 4. Various versions of the SEC model are now available for housing and for tertiary buildings.

    The 2010 European directive on the energy performance of buildings requires an economic and social analysis with the objective of preserving social equity, promoting innovation and strengthening productivity in construction. This is possible with an extended whole-cost analysis, and especially with the SEC model. The bottom-up analysis carried out with SEC is based on a building/user typology and on the analysis of representative buildings: identification of the technico-economic optimum and elaboration of scenarios before extrapolating to the whole stock. SEC is a decision-aid tool for developing territorial or patrimonial energy retrofitting strategies. Several versions of the model exist: for residential buildings (single-family and multi-family, public and private) and for tertiary buildings.

  4. A Data-Driven Control Design Approach for Freeway Traffic Ramp Metering with Virtual Reference Feedback Tuning

    Directory of Open Access Journals (Sweden)

    Shangtai Jin

    2014-01-01

    ALINEA is a simple, efficient, and easily implemented ramp metering strategy. Virtual reference feedback tuning (VRFT) is well suited to many practical systems since it is a "one-shot" data-driven control design methodology. This paper presents an application of VRFT to a ramp metering problem in a freeway traffic system. When there is not enough prior knowledge of the controlled system to select a proper parameter for ALINEA, the VRFT approach is used to optimize ALINEA's parameter using only a batch of input and output data collected from the freeway traffic system. Extensive simulations are built on both the macroscopic MATLAB platform and the microscopic PARAMICS platform to show the effectiveness and applicability of the proposed data-driven controller tuning approach.

  5. Multi-reference approach to the calculation of photoelectron spectra including spin-orbit coupling

    Energy Technology Data Exchange (ETDEWEB)

    Grell, Gilbert; Bokarev, Sergey I., E-mail: sergey.bokarev@uni-rostock.de; Kühn, Oliver [Institut für Physik, Universität Rostock, D-18051 Rostock (Germany); Winter, Bernd; Seidel, Robert [Helmholtz-Zentrum Berlin für Materialien und Energie, Methods for Material Development, Albert-Einstein-Strasse 15, D-12489 Berlin (Germany); Aziz, Emad F. [Helmholtz-Zentrum Berlin für Materialien und Energie, Methods for Material Development, Albert-Einstein-Strasse 15, D-12489 Berlin (Germany); Department of Physics, Freie Universität Berlin, Arnimalle 14, D-14159 Berlin (Germany); Aziz, Saadullah G. [Chemistry Department, Faculty of Science, King Abdulaziz University, 21589 Jeddah (Saudi Arabia)

    2015-08-21

    X-ray photoelectron spectra provide a wealth of information on the electronic structure. The extraction of molecular details requires adequate theoretical methods, which in case of transition metal complexes has to account for effects due to the multi-configurational and spin-mixed nature of the many-electron wave function. Here, the restricted active space self-consistent field method including spin-orbit coupling is used to cope with this challenge and to calculate valence- and core-level photoelectron spectra. The intensities are estimated within the frameworks of the Dyson orbital formalism and the sudden approximation. Thereby, we utilize an efficient computational algorithm that is based on a biorthonormal basis transformation. The approach is applied to the valence photoionization of the gas phase water molecule and to the core ionization spectrum of the [Fe(H{sub 2}O){sub 6}]{sup 2+} complex. The results show good agreement with the experimental data obtained in this work, whereas the sudden approximation demonstrates distinct deviations from experiments.

  6. Development of a reference method and sampling system for continuous monitoring of environmental HT and HTO concentration in the air

    International Nuclear Information System (INIS)

    Uchrin, G.

    1992-06-01

    A differential sampling system to monitor environmental tritiated hydrogen gas (HT) and tritiated water vapour (HTO) concentrations in the atmosphere was developed and tested. The sampler consists of an aerosol filter, diaphragm pump, absorption trap for HTO (molecular sieve), supply of H2 carrier (electrolysis unit), conversion trap for HT (Pd-impregnated molecular sieve), flow meter and gas meter. The sampler operates at a flow rate between 30 and 80 l/h, with a typical sampling period of one week. Vacuum desorption at high temperature is used to extract the HTO collected in the absorption and conversion traps. Tritium analysis is carried out using liquid scintillation spectrometry or gas proportional counting. The sampler is equipped with built-in safety systems and can operate in remote places. Refs, figs and tabs

  7. Application of Isotope Dilution Mass Spectrometry for Reference Measurements of Cadmium. Copper, Mercury, Lead, Zinc and Methyl Mercury in Marine Sediment Sample

    Directory of Open Access Journals (Sweden)

    Vasileva E.

    2013-04-01

    Marine sediment was selected as a test sample for the laboratory inter-comparison studies organized by the Environment Laboratories of the International Atomic Energy Agency. The analytical procedure to establish the reference values for the Cd, Cu, Hg, methyl Hg, Pb and Zn amount contents was based on Isotope Dilution Inductively Coupled Plasma-Mass Spectrometry (ID ICP-MS), applied as a primary method of measurement. The Hg and methyl Hg determination is detailed more specifically because of the problems encountered with this element, including sample homogeneity issues, memory effects and possible matrix effects during the ICP-MS measurement stage. Reference values, traceable to the SI, with relative expanded uncertainties (k = 2) of less than 2% were obtained for Cd, Cu, Zn and Pb, and of around 5% for Hg and CH3Hg.
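The core of isotope dilution is a ratio measurement: for a two-isotope element, the amount content of analyte in the sample follows from the spike content, the two weighed masses, and the isotope ratios of sample, spike, and blend. A sketch of the standard single-dilution equation (the symbols and the synthetic numbers in the check below are illustrative, not values from the IAEA study):

```python
def idms_amount_content(c_spike, m_spike, m_sample, r_sample, r_spike, r_blend):
    """Two-isotope IDMS: amount content of analyte in the sample.
    r_* are ratios of the spike-enriched isotope to the reference isotope;
    c_spike is the spike's amount content, m_* are the weighed masses."""
    return (c_spike * (m_spike / m_sample)
            * (r_spike - r_blend) / (r_blend - r_sample)
            * (1.0 + r_sample) / (1.0 + r_spike))
```

A quick mass-balance check: blending 2 units of analyte (natural ratio 0.5) with 1 unit of spike (ratio 10) gives a blend ratio of 52/47, from which the equation recovers exactly 2 units, confirming the algebra.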

  8. Cytological preparations for molecular analysis: A review of technical procedures, advantages and limitations for referring samples for testing.

    Science.gov (United States)

    da Cunha Santos, G; Saieg, M A; Troncone, G; Zeppa, P

    2018-04-01

    Minimally invasive procedures such as endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) must yield not only good quality and quantity of material for morphological assessment, but also an adequate sample for analysis of molecular markers to guide patients to appropriate targeted therapies. In this context, cytopathologists worldwide should be familiar with the minimum requirements for referring cytological samples for testing. The present manuscript is a review with a comprehensive description of the content of the workshop entitled Cytological preparations for molecular analysis: pre-analytical issues for EBUS TBNA, presented at the 40th European Congress of Cytopathology in Liverpool, UK. The present review emphasises the advantages and limitations of different types of cytology substrates used for molecular analysis, such as archival smears, liquid-based preparations, archival cytospin preparations and FTA (Flinders Technology Associates) cards, as well as their technical requirements/features. These various types of cytological specimens can be successfully used for an extensive array of molecular studies, but the quality and quantity of extracted nucleic acids rely directly on adequate pre-analytical assessment of those samples. In this setting, cytopathologists must not only be familiar with the different types of specimens and associated technical procedures, but also correctly handle the material provided by minimally invasive procedures, ensuring that there is a sufficient amount of material for a precise diagnosis and correct management of the patient through personalised care. © 2018 John Wiley & Sons Ltd.

  9. Laser ablation aerosol particle time-of-flight mass spectrometer (LAAPTOF): performance, reference spectra and classification of atmospheric samples

    Directory of Open Access Journals (Sweden)

    X. Shen

    2018-04-01

    The laser ablation aerosol particle time-of-flight mass spectrometer (LAAPTOF, AeroMegt GmbH) is able to identify the chemical composition and mixing state of individual aerosol particles, and thus is a tool for elucidating their impacts on human health, visibility, ecosystems, and climate. The overall detection efficiency (ODE) of the instrument we use was determined to range from ∼ (0.01 ± 0.01) to ∼ (4.23 ± 2.36) % for polystyrene latex (PSL) in the size range of 200 to 2000 nm, ∼ (0.44 ± 0.19) to ∼ (6.57 ± 2.38) % for ammonium nitrate (NH4NO3), and ∼ (0.14 ± 0.02) to ∼ (1.46 ± 0.08) % for sodium chloride (NaCl) particles in the size range of 300 to 1000 nm. Reference mass spectra of 32 different particle types relevant for atmospheric aerosol (e.g. pure compounds NH4NO3, K2SO4, NaCl, oxalic acid, pinic acid, and pinonic acid; internal mixtures of e.g. salts, secondary organic aerosol, and metallic core-organic shell particles; more complex particles such as soot and dust particles) were determined. Our results show that internally mixed aerosol particles can result in spectra with new clusters of ions, rather than simply a combination of the spectra from the single components. An exemplary 1-day ambient data set was analysed by both classical fuzzy clustering and a reference-spectra-based classification method. Resulting identified particle types were generally well correlated. We show how a combination of both methods can greatly improve the interpretation of single-particle data in field measurements.

  10. Concepts and approaches for marine ecosystem research with reference to the tropics

    Directory of Open Access Journals (Sweden)

    Matthias Wolff

    2002-06-01

    Full Text Available The present article gives an overview of the leading concepts and modelling approaches for marine ecosystem research, including (1) the trophodynamic theory of pelagic ecosystems, (2) compartment/network models, (3) mesocosm experiments and (4) individual-based modelling approaches and virtual ecosystems (VE). The main research questions addressed, as well as the potential and limits of each approach, are summarized and discussed, and it is shown how the concept of ecosystem has changed over time. Aquatic biomass spectra (derived from the theory of pelagic ecosystems) can give insight into the trophic structure of different systems, and can show how organism sizes are distributed within the system and how different size groups participate in the system’s metabolism and production. Compartment/network models allow for a more detailed description of the trophic structure of ecosystems and of the energy/biomass fluxes through the explicit modelling of P/B and food consumption rates and biomasses for each system compartment. Moreover, system indices for a characterization and comparison with other systems can be obtained, such as average trophic efficiency, energy throughput, degree of connectivity, degree of maturity, and others. Recent dynamic extensions of trophic network models allow for exploring past and future impacts of fishing and environmental disturbances, as well as policies such as marine protected areas. Mesocosm experiments address a multitude of questions related to aquatic processes (i.e. primary production, grazing, predation, energy transfer between trophic levels, etc.) and the behaviour of organisms (i.e. growth, migration, response to contaminants, etc.) under semi-natural conditions. As processes within mesocosms often differ in rate and magnitude from those occurring in nature, mesocosms should be viewed as large in vitro experiments designed to test selected components of the ecosystem and not as an attempt to enclose

  11. Fusion Approaches for Land Cover Map Production Using High Resolution Image Time Series without Reference Data of the Corresponding Period

    Directory of Open Access Journals (Sweden)

    Benjamin Tardy

    2017-11-01

    Full Text Available Optical sensor time series images allow one to produce land cover maps at a large scale. Supervised classification algorithms have been shown to be the best at producing maps automatically with good accuracy. The main drawback of these methods is the need for reference data, the collection of which can introduce significant production delays. Therefore, the maps are often available too late for some applications. Domain adaptation methods seem to be efficient for using past data for land cover map production. Following this idea, the main goal of this study is to propose several simple past-data fusion schemes to overcome the current land cover map production delays. A single-classifier approach and three voting rules are considered to produce maps without reference data of the corresponding period. These four approaches reach an overall accuracy of around 80% with a 17-class nomenclature using Formosat-2 image time series. The impact of the number of past periods used is also studied, showing that the overall accuracy increases with the number of periods used. The proposed methods require at least two or three previous years to be used.
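A voting rule of the kind described above can be sketched in a few lines: each classifier trained on a past period predicts a label per pixel, and the fused map keeps the majority label. The labels and maps below are illustrative, not the study's 17-class nomenclature.

```python
from collections import Counter

def majority_vote(predictions):
    # predictions: the labels that the past-period classifiers assigned to one pixel
    return Counter(predictions).most_common(1)[0][0]

def fuse_maps(prediction_maps):
    # prediction_maps: one flat list of per-pixel labels per past-period classifier
    return [majority_vote(pixel) for pixel in zip(*prediction_maps)]
```

Ties fall to the label seen first, so a real system would add a tie-breaking rule (e.g. prefer the most recent period).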

  12. Calibration-free quantitative elemental analysis of meteor plasma using reference laser-induced breakdown spectroscopy of meteorite samples

    Science.gov (United States)

    Ferus, Martin; Koukal, Jakub; Lenža, Libor; Srba, Jiří; Kubelík, Petr; Laitl, Vojtěch; Zanozina, Ekaterina M.; Váňa, Pavel; Kaiserová, Tereza; Knížek, Antonín; Rimmer, Paul; Chatzitheodoridis, Elias; Civiš, Svatopluk

    2018-03-01

    Aims: We aim to analyse real-time Perseid and Leonid meteor spectra using a novel calibration-free (CF) method, which is usually applied in the laboratory for laser-induced breakdown spectroscopic (LIBS) chemical analysis. Methods: Reference laser ablation spectra of specimens of chondritic meteorites were measured in situ simultaneously with a high-resolution laboratory echelle spectrograph and a spectral camera for meteor observation. Laboratory data were subsequently evaluated via the CF method and compared with real meteor emission spectra. Additionally, spectral features related to airglow plasma were compared with the spectra of laser-induced breakdown and electric discharge in the air. Results: We show that this method can be applied in the evaluation of meteor spectral data observed in real time. Specifically, CF analysis can be used to determine the chemical composition of meteor plasma, which, in the case of the Perseid and Leonid meteors analysed in this study, corresponds to that of the C-group of chondrites.
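One central step of calibration-free LIBS analysis is the Boltzmann plot: line intensities, expressed as ln(Iλ/(gA)) against upper-level energy E, fall on a line of slope −1/(k_B·T), from which the plasma excitation temperature is read. The sketch below uses synthetic points generated for a 10 000 K plasma, not measured meteor or meteorite data.

```python
import numpy as np

k_B = 8.617e-5                               # Boltzmann constant, eV/K
E_upper = np.array([3.0, 4.0, 5.0, 6.0])     # upper-level energies, eV (illustrative)
T_true = 10000.0
y = -E_upper / (k_B * T_true) + 12.0         # synthetic Boltzmann coordinates ln(I*lam/(g*A))

slope, intercept = np.polyfit(E_upper, y, 1)
T_est = -1.0 / (k_B * slope)                 # recovered excitation temperature, K
```

With the temperature fixed, the intercepts of the per-species Boltzmann plots give relative elemental abundances without any calibration standard, which is the idea the paper transfers from the laboratory to meteor spectra.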

  13. Revision of the SNPforID 34-plex forensic ancestry test: Assay enhancements, standard reference sample genotypes and extended population studies.

    Science.gov (United States)

    Fondevila, M; Phillips, C; Santos, C; Freire Aradas, A; Vallone, P M; Butler, J M; Lareu, M V; Carracedo, A

    2013-01-01

    A revision of an established 34 SNP forensic ancestry test has been made by swapping the under-performing rs727811 component SNP with the highly informative rs3827760 that shows a near-fixed East Asian specific allele. We collated SNP variability data for the revised SNP set in 66 reference populations from 1000 Genomes and HGDP-CEPH panels and used this as reference data to analyse four U.S. populations showing a range of admixture patterns. The U.S. Hispanics sample in particular displayed heterogeneous values of co-ancestry between European, Native American and African contributors, likely to reflect in part, the way this disparate group is defined using cultural as well as population genetic parameters. The genotyping of over 700 U.S. population samples also provided the opportunity to thoroughly gauge peak mobility variation and peak height ratios observed from routine use of the single base extension chemistry of the 34-plex test. Finally, the genotyping of the widely used DNA profiling Standard Reference Material samples plus other control DNAs completes the audit of the 34-plex assay to allow forensic practitioners to apply this test more readily in their own laboratories. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  14. Constraining Unsaturated Hydraulic Parameters Using the Latin Hypercube Sampling Method and Coupled Hydrogeophysical Approach

    Science.gov (United States)

    Farzamian, Mohammad; Monteiro Santos, Fernando A.; Khalil, Mohamed A.

    2017-12-01

    The coupled hydrogeophysical approach has proved to be a valuable tool for improving the use of geoelectrical data for hydrological model parameterization. In the coupled approach, hydrological parameters are directly inferred from geoelectrical measurements in a forward manner to eliminate the uncertainty connected to the independent inversion of electrical resistivity data. Several numerical studies have been conducted to demonstrate the advantages of a coupled approach; however, only a few attempts have been made to apply the coupled approach to actual field data. In this study, we developed a 1D coupled hydrogeophysical code to estimate the van Genuchten-Mualem model parameters, Ks, n, θr and α, from time-lapse vertical electrical sounding data collected during a constant inflow infiltration experiment. van Genuchten-Mualem parameters were sampled using the Latin hypercube sampling method to provide full coverage of the range of each parameter from their distributions. By applying the coupled approach, vertical electrical sounding data were coupled to hydrological models inferred from van Genuchten-Mualem parameter samples to investigate the feasibility of constraining the hydrological model. The key approaches taken in the study are to (1) integrate electrical resistivity and hydrological data while avoiding data inversion, (2) estimate the total water mass recovery of electrical resistivity data and consider it in the evaluation of the van Genuchten-Mualem parameters and (3) correct the influence of subsurface temperature fluctuations during the infiltration experiment on the electrical resistivity data. The results of the study revealed that the coupled hydrogeophysical approach can improve the value of geophysical measurements in hydrological model parameterization. However, the approach cannot overcome the technical limitations of the geoelectrical method associated with resolution and with water mass recovery.
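The Latin hypercube step above can be sketched with `scipy.stats.qmc`: draw parameter sets (Ks, n, θr, α) so that each one-dimensional range is covered by exactly one sample per stratum. The parameter bounds below are illustrative assumptions, not the study's actual ranges or distributions.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical bounds for Ks (m/s), n (-), theta_r (-), alpha (1/m)
bounds_low = [1e-6, 1.1, 0.01, 0.5]
bounds_high = [1e-4, 3.0, 0.10, 5.0]

sampler = qmc.LatinHypercube(d=4, seed=0)
unit = sampler.random(n=100)                 # 100 points in [0, 1)^4, one per stratum
params = qmc.scale(unit, bounds_low, bounds_high)
```

Each row of `params` would then drive one forward hydrological simulation whose synthetic sounding response is compared against the measured time-lapse data.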

  15. Species richness in soil bacterial communities: a proposed approach to overcome sample size bias.

    Science.gov (United States)

    Youssef, Noha H; Elshahed, Mostafa S

    2008-09-01

    Estimates of species richness based on 16S rRNA gene clone libraries are increasingly utilized to gauge the level of bacterial diversity within various ecosystems. However, previous studies have indicated that regardless of the approach utilized, the species richness estimates obtained depend on the size of the analyzed clone libraries. Here we propose an approach to overcome sample size bias in species richness estimates in complex microbial communities. Parametric (maximum likelihood-based and rarefaction curve-based) and non-parametric approaches were used to estimate species richness in a library of 13,001 near full-length 16S rRNA clones derived from soil, as well as in multiple subsets of the original library. The species richness estimates obtained increased with increasing library size. To obtain a sample size-unbiased estimate of species richness, we calculated the theoretical clone library sizes required to encounter the estimated species richness at various clone library sizes, used curve fitting to determine the theoretical clone library size required to encounter the "true" species richness, and subsequently determined the corresponding sample size-unbiased species richness value. Using this approach, sample size-unbiased estimates of 17,230, 15,571, and 33,912 were obtained for the ML-based, rarefaction curve-based, and ACE-1 estimators, respectively, compared to bias-uncorrected values of 15,009, 11,913, and 20,909.
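The curve-fitting idea can be illustrated in simplified form: model the richness estimate as a saturating function of clone-library size and read the asymptote off as the sample size-unbiased richness. This is a sketch of the concept, not the authors' exact procedure; the data below are synthetic, generated from the model itself (asymptote 17 000, half-saturation 4 000), so the fit should recover those values.

```python
import numpy as np
from scipy.optimize import curve_fit

def saturating(n, s_max, k):
    # Michaelis-Menten-like richness vs. library size; s_max is the asymptote
    return s_max * n / (k + n)

library_sizes = np.array([500.0, 1000.0, 2000.0, 4000.0, 8000.0, 13001.0])
richness = saturating(library_sizes, 17000.0, 4000.0)  # synthetic, noise-free

(s_max, k), _ = curve_fit(saturating, library_sizes, richness, p0=[20000.0, 5000.0])
```

With real estimates the asymptote `s_max` would exceed the largest observed value, which is exactly the direction of the bias correction the record reports (e.g. 17,230 vs. 15,009 for the ML-based estimator).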

  16. Reference Intervals for Urinary Cotinine Levels and the Influence of Sampling Time and Other Predictors on Its Excretion Among Italian Schoolchildren

    Directory of Open Access Journals (Sweden)

    Carmela Protano

    2018-04-01

    Full Text Available (1) Background: Environmental Tobacco Smoke (ETS) exposure remains a public health problem worldwide. The aims are to establish urinary (u-) cotinine reference values for healthy Italian children and to evaluate the role of the sampling time and of other factors on children’s u-cotinine excretion. (2) Methods: A cross-sectional study was performed on 330 children. Information on participants was gathered by a questionnaire, and u-cotinine was determined in two samples for each child, collected during the evening and the next morning. (3) Results: Reference intervals (as the 2.5th and 97.5th percentiles of the distribution) in evening and morning samples were respectively equal to 0.98–4.29 and 0.91–4.50 µg L−1 (ETS unexposed) and 1.39–16.34 and 1.49–20.95 µg L−1 (ETS exposed). No statistical differences were recovered between median values found in evening and morning samples, in either the ETS-unexposed or the ETS-exposed group. Significant predictors of u-cotinine excretion were ponderal status according to the body mass index of the children (β = 0.202, p-value = 0.041 for evening samples; β = 0.169, p-value = 0.039 for morning samples) and paternal educational level (β = −0.258, p-value = 0.010 for evening samples; β = −0.013, p-value = 0.003 for morning samples). (4) Conclusions: The results evidenced the need for further studies assessing the role of confounding factors in ETS exposure, and the necessity of educational interventions on smokers to raise their awareness of ETS.
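A nonparametric reference interval of the kind reported above is simply the central 95% of the measured distribution, i.e. the 2.5th and 97.5th percentiles. A minimal sketch (with illustrative data, not the study's cotinine values):

```python
import numpy as np

def reference_interval(values, low=2.5, high=97.5):
    """Nonparametric reference interval: central 95% of the distribution."""
    return np.percentile(values, [low, high])
```

Guidelines typically ask for at least ~120 reference subjects per partition so that the extreme percentiles are estimated with acceptable uncertainty, which the 330-child sample here satisfies.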

  17. A simple approach to power and sample size calculations in logistic regression and Cox regression models.

    Science.gov (United States)

    Vaeth, Michael; Skovlund, Eva

    2004-06-15

    For a given regression problem it is possible to identify a suitably defined equivalent two-sample problem such that the power or sample size obtained for the two-sample problem also applies to the regression problem. For a standard linear regression model the equivalent two-sample problem is easily identified, but for generalized linear models and for Cox regression models the situation is more complicated. An approximately equivalent two-sample problem may, however, also be identified here. In particular, we show that for logistic regression and Cox regression models the equivalent two-sample problem is obtained by selecting two equally sized samples for which the parameters differ by a value equal to the slope times twice the standard deviation of the independent variable and further requiring that the overall expected number of events is unchanged. In a simulation study we examine the validity of this approach to power calculations in logistic regression and Cox regression models. Several different covariate distributions are considered for selected values of the overall response probability and a range of alternatives. For the Cox regression model we consider both constant and non-constant hazard rates. The results show that in general the approach is remarkably accurate even in relatively small samples. Some discrepancies are, however, found in small samples with few events and a highly skewed covariate distribution. Comparison with results based on alternative methods for logistic regression models with a single continuous covariate indicates that the proposed method is at least as good as its competitors. The method is easy to implement and therefore provides a simple way to extend the range of problems that can be covered by the usual formulas for power and sample size determination. Copyright 2004 John Wiley & Sons, Ltd.
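The reduction described above can be sketched as follows: form two equally sized groups whose log-odds differ by slope × 2 × SD(x), centred on the overall prevalence, and feed them into a standard two-proportion sample-size formula. This is one reading of the recipe for illustration, not the authors' exact derivation (in particular, it ignores their adjustment for the overall expected number of events).

```python
import math
from statistics import NormalDist

def logit(p):
    return math.log(p / (1.0 - p))

def expit(x):
    return 1.0 / (1.0 + math.exp(-x))

def sample_size_logistic(p_mean, beta, sd_x, alpha=0.05, power=0.8):
    """Total n for simple logistic regression via an equivalent two-sample problem."""
    p1 = expit(logit(p_mean) - beta * sd_x)   # group log-odds differ by 2*beta*sd_x
    p2 = expit(logit(p_mean) + beta * sd_x)
    z_a = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    z_b = NormalDist().inv_cdf(power)
    pbar = (p1 + p2) / 2.0
    n_per_group = ((z_a * math.sqrt(2.0 * pbar * (1.0 - pbar))
                    + z_b * math.sqrt(p1 * (1.0 - p1) + p2 * (1.0 - p2))) ** 2
                   / (p1 - p2) ** 2)
    return 2 * math.ceil(n_per_group)
```

As expected, a larger slope (stronger covariate effect) yields a smaller required sample size.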

  18. Comprehensive profiling of retroviral integration sites using target enrichment methods from historical koala samples without an assembled reference genome

    Directory of Open Access Journals (Sweden)

    Pin Cui

    2016-03-01

    Full Text Available Background. Retroviral integration into the host germline results in permanent viral colonization of vertebrate genomes. The koala retrovirus (KoRV) is currently invading the germline of the koala (Phascolarctos cinereus) and provides a unique opportunity for studying retroviral endogenization. Previous analysis of KoRV integration patterns in modern koalas demonstrates that they share integration sites primarily if they are related, indicating that the process is currently driven by vertical transmission rather than infection. However, due to methodological challenges, KoRV integrations have not been comprehensively characterized. Results. To overcome these challenges, we applied and compared three target enrichment techniques coupled with next generation sequencing (NGS) and a newly customized sequence-clustering based computational pipeline to determine the integration sites for 10 museum Queensland and New South Wales (NSW) koala samples collected between the 1870s and the late 1980s. A secondary aim of this study sought to identify common integration sites across modern and historical specimens by comparing our dataset to previously published studies. Several million sequences were processed, and the KoRV integration sites in each koala were characterized. Conclusions. Although the three enrichment methods each exhibited bias in integration site retrieval, a combination of two methods, Primer Extension Capture and hybridization capture, is recommended for future studies on historical samples. Moreover, identification of integration sites shows that the proportion of integration sites shared between any two koalas is quite small.

  19. Pulsed photothermal profiling of water-based samples using a spectrally composite reconstruction approach

    International Nuclear Information System (INIS)

    Majaron, B; Milanic, M

    2010-01-01

    Pulsed photothermal profiling involves reconstruction of the temperature depth profile induced in a layered sample by a single-pulse laser exposure, based on the transient change in mid-infrared (IR) emission from its surface. Earlier studies have indicated that in watery tissues, featuring a pronounced spectral variation of the mid-IR absorption coefficient, analysis of broadband radiometric signals within the customary monochromatic approximation adversely affects profiling accuracy. We present here an experimental comparison of pulsed photothermal profiling in layered agar gel samples utilizing a spectrally composite kernel matrix vs. the customary approach. By utilizing a custom reconstruction code, the augmented approach reduces broadening of individual temperature peaks to 14% of the absorber depth, in contrast to 21% obtained with the customary approach.

  20. Audit sampling: A qualitative study on the role of statistical and non-statistical sampling approaches on audit practices in Sweden

    OpenAIRE

    Ayam, Rufus Tekoh

    2011-01-01

    PURPOSE: The two approaches to audit sampling, statistical and non-statistical, have been examined in this study. The overall purpose of the study is to explore the extent to which statistical and non-statistical sampling approaches are utilized by independent auditors during auditing practices. Moreover, the study also seeks to achieve two additional purposes; the first is to find out whether auditors utilize different sampling techniques when auditing SMEs (Small and Medium-Sized Ente...

  1. The use of laser microdissection in the identification of suitable reference genes for normalization of quantitative real-time PCR in human FFPE epithelial ovarian tissue samples.

    Directory of Open Access Journals (Sweden)

    Jing Cai

    Full Text Available Quantitative real-time PCR (qPCR) is a powerful and reproducible method of gene expression analysis in which expression levels are quantified by normalization against reference genes. Therefore, to investigate the potential biomarkers and therapeutic targets for epithelial ovarian cancer by qPCR, it is critical to identify stable reference genes. In this study, twelve housekeeping genes (ACTB, GAPDH, 18S rRNA, GUSB, PPIA, PBGD, PUM1, TBP, HRPT1, RPLP0, RPL13A, and B2M) were analyzed in 50 ovarian samples from normal, benign, borderline, and malignant tissues. For reliable results, laser microdissection (LMD), an effective technique used to prepare homogeneous starting material, was utilized to precisely excise target tissues or cells. One-way analysis of variance (ANOVA) and nonparametric (Kruskal-Wallis) tests were used to compare the expression differences. NormFinder and geNorm software were employed to further validate the suitability and stability of the candidate genes. Results showed that epithelial cells occupied a small percentage of the normal ovary indeed. The expression of ACTB, PPIA, RPL13A, RPLP0, and TBP were stable independent of the disease progression. In addition, NormFinder and geNorm identified the most stable combination (ACTB, PPIA, RPLP0, and TBP) and the relatively unstable reference gene GAPDH from the twelve commonly used housekeeping genes. Our results highlight the use of homogeneous ovarian tissues and multiple-reference normalization strategy, e.g. the combination of ACTB, PPIA, RPLP0, and TBP, for qPCR in epithelial ovarian tissues, whereas GAPDH, the most commonly used reference gene, is not recommended, especially as a single reference gene.

  2. The use of laser microdissection in the identification of suitable reference genes for normalization of quantitative real-time PCR in human FFPE epithelial ovarian tissue samples.

    Science.gov (United States)

    Cai, Jing; Li, Tao; Huang, Bangxing; Cheng, Henghui; Ding, Hui; Dong, Weihong; Xiao, Man; Liu, Ling; Wang, Zehua

    2014-01-01

    Quantitative real-time PCR (qPCR) is a powerful and reproducible method of gene expression analysis in which expression levels are quantified by normalization against reference genes. Therefore, to investigate the potential biomarkers and therapeutic targets for epithelial ovarian cancer by qPCR, it is critical to identify stable reference genes. In this study, twelve housekeeping genes (ACTB, GAPDH, 18S rRNA, GUSB, PPIA, PBGD, PUM1, TBP, HRPT1, RPLP0, RPL13A, and B2M) were analyzed in 50 ovarian samples from normal, benign, borderline, and malignant tissues. For reliable results, laser microdissection (LMD), an effective technique used to prepare homogeneous starting material, was utilized to precisely excise target tissues or cells. One-way analysis of variance (ANOVA) and nonparametric (Kruskal-Wallis) tests were used to compare the expression differences. NormFinder and geNorm software were employed to further validate the suitability and stability of the candidate genes. Results showed that epithelial cells occupied a small percentage of the normal ovary indeed. The expression of ACTB, PPIA, RPL13A, RPLP0, and TBP were stable independent of the disease progression. In addition, NormFinder and geNorm identified the most stable combination (ACTB, PPIA, RPLP0, and TBP) and the relatively unstable reference gene GAPDH from the twelve commonly used housekeeping genes. Our results highlight the use of homogeneous ovarian tissues and multiple-reference normalization strategy, e.g. the combination of ACTB, PPIA, RPLP0, and TBP, for qPCR in epithelial ovarian tissues, whereas GAPDH, the most commonly used reference gene, is not recommended, especially as a single reference gene.
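The geNorm stability ranking used above can be sketched as follows: for each candidate gene, the stability measure M is the mean standard deviation of its log2 expression ratios against every other candidate across samples, and lower M means more stable. The expression matrix below is illustrative, not the study's qPCR data.

```python
import numpy as np

def genorm_m(expr):
    """geNorm-style stability M per gene (columns = genes, rows = samples)."""
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        m[j] = np.mean(sds)
    return m
```

geNorm then iteratively drops the gene with the highest M and recomputes, which is how combinations such as ACTB/PPIA/RPLP0/TBP emerge as the most stable set while GAPDH ranks poorly.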

  3. Cross validation of two partitioning-based sampling approaches in mesocosms containing PCB contaminated field sediment, biota, and activated carbon amendment

    DEFF Research Database (Denmark)

    Nørgaard Schmidt, Stine; Wang, Alice P.; Gidley, Philip T

    2017-01-01

    Equilibrium sampling with multiple thicknesses of silicone and in situ pre-equilibrium sampling with low density polyethylene (LDPE) loaded with performance reference compounds were applied independently to measure polychlorinated biphenyls (PCBs) in mesocosms with (1) New Bedford Harbor sediment (MA, USA), (2) sediment and biota, and (3) activated carbon amended sediment and biota. The aim was to cross validate the two different sampling approaches. Around 100 PCB congeners were quantified in the two sampling polymers, and the results confirmed the good precision of both methods and were in overall good agreement with recently published silicone to LDPE partition ratios. Further, the methods yielded Cfree in good agreement for all three experiments. The average ratio between Cfree determined by the two methods was a factor of 1.4 ± 0.3 (range: 0.6-2.0), and the results thus cross-validated the two sampling approaches. For future...

  4. A modified approach to estimating sample size for simple logistic regression with one continuous covariate.

    Science.gov (United States)

    Novikov, I; Fund, N; Freedman, L S

    2010-01-15

    Different methods for the calculation of sample size for simple logistic regression (LR) with one normally distributed continuous covariate give different results. Sometimes the difference can be large. Furthermore, some methods require the user to specify the prevalence of cases when the covariate equals its population mean, rather than the more natural population prevalence. We focus on two commonly used methods and show through simulations that the power for a given sample size may differ substantially from the nominal value for one method, especially when the covariate effect is large, while the other method performs poorly if the user provides the population prevalence instead of the required parameter. We propose a modification of the method of Hsieh et al. that requires specification of the population prevalence and that employs Schouten's sample size formula for a t-test with unequal variances and group sizes. This approach appears to increase the accuracy of the sample size estimates for LR with one continuous covariate.

  5. Flexible automated approach for quantitative liquid handling of complex biological samples.

    Science.gov (United States)

    Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H

    2007-11-01

    A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.

  6. A critical review of microextraction by packed sorbent as a sample preparation approach in drug bioanalysis.

    Science.gov (United States)

    Alves, Gilberto; Rodrigues, Márcio; Fortuna, Ana; Falcão, Amílcar; Queiroz, João

    2013-06-01

    Sample preparation is widely accepted as the most labor-intensive and error-prone part of the bioanalytical process. The recent advances in this field have been focused on the miniaturization and integration of sample preparation online with analytical instrumentation, in order to reduce laboratory workload and increase analytical performance. From this perspective, microextraction by packed sorbent (MEPS) has emerged in the last few years as a powerful sample preparation approach suitable to be easily automated with liquid and gas chromatographic systems applied in a variety of bioanalytical areas (pharmaceutical, clinical, toxicological, environmental and food research). This paper aims to provide an overview and a critical discussion of recent bioanalytical methods reported in literature based on MEPS, with special emphasis on those developed for the quantification of therapeutic drugs and/or metabolites in biological samples. The advantages and some limitations of MEPS, as well as its comparison with other extraction techniques, are also addressed herein.

  7. Free Energy Calculations using a Swarm-Enhanced Sampling Molecular Dynamics Approach.

    Science.gov (United States)

    Burusco, Kepa K; Bruce, Neil J; Alibay, Irfan; Bryce, Richard A

    2015-10-26

    Free energy simulations are an established computational tool in modelling chemical change in the condensed phase. However, sampling of kinetically distinct substates remains a challenge to these approaches. As a route to addressing this, we link the methods of thermodynamic integration (TI) and swarm-enhanced sampling molecular dynamics (sesMD), where simulation replicas interact cooperatively to aid transitions over energy barriers. We illustrate the approach by using alchemical alkane transformations in solution, comparing them with the multiple independent trajectory TI (IT-TI) method. Free energy changes for transitions computed by using IT-TI grew increasingly inaccurate as the intramolecular barrier was heightened. By contrast, swarm-enhanced sampling TI (sesTI) calculations showed clear improvements in sampling efficiency, leading to more accurate computed free energy differences, even in the case of the highest barrier height. The sesTI approach, therefore, has potential in addressing chemical change in systems where conformations exist in slow exchange. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
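The thermodynamic integration estimate underlying both TI variants above is the integral of ⟨dU/dλ⟩ over the coupling parameter, commonly taken by the trapezoidal rule across the λ windows. The window averages below are illustrative numbers, not simulation output.

```python
import numpy as np

lambdas = np.array([0.0, 0.25, 0.5, 0.75, 1.0])   # coupling-parameter windows
dU_dlam = np.array([12.0, 7.5, 4.0, 1.5, 0.0])    # <dU/dlambda> per window, kcal/mol

# Trapezoidal-rule estimate of the free-energy change along the alchemical path
delta_F = float(np.sum((dU_dlam[1:] + dU_dlam[:-1]) / 2.0 * np.diff(lambdas)))
```

The sesTI idea is orthogonal to this quadrature: it changes how the per-window averages are sampled (cooperating replicas crossing barriers), not how they are integrated.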

  8. Investigating the Prevalence of Pervasive Developmental Disorders According to Sex in a Sample of Iranian Children Referred to Medical-Rehabilitation Centers and Psychiatric Clinics

    Directory of Open Access Journals (Sweden)

    K. Khushabi

    2006-04-01

    Full Text Available Introduction & Objective: Given the significance of pervasive developmental disorders (PDD) in children, the increasing rate of their prevalence among patients referred to clinics in recent years, and the absence of any report on the rate of PDD in our country, this study was carried out. The aim of this study was to determine the prevalence of PDD in a sample of Iranian children who were referred to medical and rehabilitation centers. Materials & Methods: 248 children who were referred to three medical-rehabilitation centers participated in the research. Accessible sampling with diagnosis of PDD based on DSM-IV criteria was used. The obtained data were analyzed using descriptive statistics methods such as percentages and frequency distributions. Results: Autistic disorder was the most prevalent among the pervasive developmental disorders. In this research, autistic disorder (ratio 4.1 to 1), Asperger disorder (ratio 3 to 1) and childhood disintegrative disorder were more prevalent in boys than girls. Rett disorder was observed only in girls, and pervasive developmental disorder (NOS) was seen in both sexes. Conclusion: The results showed that pervasive developmental disorders are 4 times more prevalent in boys than girls, and the findings of this research were consistent with those of previous studies.

  9. Determination of Selected Polycyclic Aromatic Compounds in Particulate Matter Samples with Low Mass Loading: An Approach to Test Method Accuracy

    Directory of Open Access Journals (Sweden)

    Susana García-Alonso

    2017-01-01

    Full Text Available A miniaturized analytical procedure to determine selected polycyclic aromatic compounds (PACs) in low mass loadings (<10 mg) of particulate matter (PM) is evaluated. The proposed method is based on a simple sonication/agitation method using small amounts of solvent for extraction. A reduced particulate matter sample size often limits the quantification of analytes. This also leads to the need to change analytical procedures and evaluate their performance. The trueness and precision of the proposed method were tested using ambient air samples. Analytical results from the proposed method were compared with those of pressurized liquid and microwave extractions. Selected PACs (polycyclic aromatic hydrocarbons (PAHs) and nitro polycyclic aromatic hydrocarbons (NPAHs)) were determined by liquid chromatography with fluorescence detection (HPLC/FD). Taking results from pressurized liquid extractions as reference values, recovery rates of the sonication/agitation method were over 80% for the most abundant PAHs. Recovery rates of selected NPAHs were lower. Enhanced rates were obtained when methanol was used as a modifier. Intermediate precision was estimated by data comparison from two mathematical approaches: normalized difference data and pooled relative deviations. Intermediate precision was in the range of 10–20%. The effectiveness of the proposed method was evaluated in PM aerosol samples collected with very low mass loadings (<0.2 mg) during characterization studies of turbofan engine exhausts.
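A pooled-relative-deviations estimate of intermediate precision can be sketched from duplicate determinations: each pair contributes RSD_i = |x1 − x2| / (√2 · mean), pooled as a root mean square. This is one common recipe, not necessarily the authors' exact formula; the concentrations below are illustrative.

```python
import math

def pooled_rsd_percent(pairs):
    """Pooled relative SD (%) from duplicate measurements (x1, x2)."""
    rsds = [abs(x1 - x2) / (math.sqrt(2.0) * ((x1 + x2) / 2.0))
            for x1, x2 in pairs]
    return 100.0 * math.sqrt(sum(r * r for r in rsds) / len(rsds))
```

Values in the 10–20% range, as reported above, are typical for trace-level PAH determinations near the quantification limit.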

  10. Multivariate approaches for stability control of the olive oil reference materials for sensory analysis - part II: applications.

    Science.gov (United States)

    Valverde-Som, Lucia; Ruiz-Samblás, Cristina; Rodríguez-García, Francisco P; Cuadros-Rodríguez, Luis

    2018-02-09

    The organoleptic quality of virgin olive oil depends on positive and negative sensory attributes. These attributes are related to volatile organic compounds and phenolic compounds that represent the aroma and taste (flavour) of the virgin olive oil. The flavour is the characteristic that can be measured by a taster panel. However, as for any analytical measuring device, the tasters, individually, and the panel, as a whole, should be harmonized and validated, and proper olive oil standards are needed. In the present study, multivariate approaches are put into practice, together with rules to build a multivariate control chart, using chromatographic volatile fingerprinting and chemometrics. Fingerprinting techniques provide analytical information without identifying and quantifying the analytes. This methodology is used to monitor the stability of sensory reference materials. Similarity indices were calculated to build multivariate control charts for two certified reference olive oils, which were used as examples to monitor their stability. This methodology based on chromatographic data could be applied in parallel with the 'panel test' sensory method to reduce the workload of sensory analysis. © 2018 Society of Chemical Industry.
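
    As an illustration of the similarity-index idea, the following sketch scores each new chromatographic fingerprint against a reference fingerprint by Pearson correlation and flags values outside 3-sigma limits learned from stable training measurements. The data and helper names are hypothetical; this is not the authors' procedure or their choice of similarity index.

    ```python
    import math
    import random

    def similarity(a, b):
        """Pearson correlation between two chromatographic fingerprints."""
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        var_a = sum((x - ma) ** 2 for x in a)
        var_b = sum((y - mb) ** 2 for y in b)
        return cov / math.sqrt(var_a * var_b)

    def control_limits(training, reference, k=3.0):
        """Mean +/- k*SD limits on the similarity index of stable measurements."""
        sims = [similarity(f, reference) for f in training]
        mean = sum(sims) / len(sims)
        sd = (sum((s - mean) ** 2 for s in sims) / (len(sims) - 1)) ** 0.5
        return mean - k * sd, mean + k * sd

    # hypothetical reference fingerprint and ten stable replicate measurements
    rng = random.Random(3)
    reference = [1.0, 3.0, 7.0, 4.0, 2.0, 6.0, 5.0, 8.0]
    training = [[p + rng.gauss(0, 0.05) for p in reference] for _ in range(10)]
    lower, upper = control_limits(training, reference)

    degraded = list(reversed(reference))  # a drifted/degraded material
    in_control = lower <= similarity(degraded, reference) <= upper  # False: flag it
    ```

    The lower control limit is the one that matters in practice: as the reference material ages, its fingerprint drifts away from the original and the similarity index falls below the limit.
    
    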

  11. Magnesium, Iron and Aluminum in LLNL Air Particulate and Rain Samples with Reference to Magnesium in Industrial Storm Water

    Energy Technology Data Exchange (ETDEWEB)

    Esser, Bradley K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bibby, Richard K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Fish, Craig [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-08-25

    Storm water runoff from the Lawrence Livermore National Laboratory’s (LLNL’s) main site and Site 300 periodically exceeds the Discharge Permit Numeric Action Level (NAL) for magnesium (Mg) under the Industrial General Permit (IGP) Order No. 2014-0057-DWQ. Of particular interest is the source of magnesium in storm water runoff from the site. This special study compares new metals data from air particulate and precipitation samples from the LLNL main site and Site 300 with previous metals data for storm water from the main site and Site 300 and alluvial sediment from the main site, to investigate the potential source of elevated Mg in storm water runoff. Data for three metals (Mg, iron (Fe), and aluminum (Al)) were available from all media; data for additional metals, such as europium (Eu), were available from rain, air particulates, and alluvial sediment. To attribute source, this study compared metals concentration data (for Mg, Al, and Fe) in storm water and rain; metal-metal correlations (Mg with Fe, Mg with Al, Al with Fe, Mg with Eu, Eu with Fe, and Eu with Al) in storm water, rain, air particulates, and sediments; and metal-metal ratios (Mg/Fe, Mg/Al, Al/Fe, Mg/Eu, Eu/Fe, and Eu/Al) in storm water, rain, air particulates, and sediments. The results presented in this study are consistent with a simple conceptual model in which the source of Mg in storm water runoff is air particulate matter that has dry-deposited on impervious surfaces and is subsequently entrained in runoff during precipitation events. Such a conceptual model is consistent with 1) higher concentrations of metals in storm water runoff than in precipitation, 2) the strong correlation of Mg with Al and Fe in both storm water and air particulates, and 3) the similarity in metal mass ratios between storm water and air particulates, in contrast to the dissimilarity of metal mass ratios between storm water and precipitation or alluvial sediment. The strong correlation of Mg with Fe and Al

  12. New Approaches and Technologies to Sequence de novo Plant reference Genomes (2013 DOE JGI Genomics of Energy and Environment 8th Annual User Meeting)

    Energy Technology Data Exchange (ETDEWEB)

    Schmutz, Jeremy

    2013-03-01

    Jeremy Schmutz of the HudsonAlpha Institute for Biotechnology on new approaches and technologies to sequence de novo plant reference genomes, presented at the 8th Annual Genomics of Energy & Environment Meeting on March 27, 2013 in Walnut Creek, CA.

  13. Approaches to sampling and case selection in qualitative research: examples in the geography of health.

    Science.gov (United States)

    Curtis, S; Gesler, W; Smith, G; Washburn, S

    2000-04-01

    This paper focuses on the question of sampling (or selection of cases) in qualitative research. Although the literature includes some very useful discussions of qualitative sampling strategies, the question of sampling often seems to receive less attention in methodological discussion than questions of how data are collected or analysed. Decisions about sampling are likely to be important in many qualitative studies (although they may not be an issue in some research). There are varying accounts of the principles applicable to sampling or case selection. Those who espouse 'theoretical sampling', based on a 'grounded theory' approach, are in some ways opposed to those who promote forms of 'purposive sampling' suitable for research informed by an existing body of social theory. Diversity also results from the many different methods for drawing purposive samples which are applicable to qualitative research. We explore the value of a framework suggested by Miles and Huberman [Miles, M., Huberman, A., 1994. Qualitative Data Analysis. Sage, London] to evaluate the sampling strategies employed in three examples of research by the authors. Our examples comprise three studies which respectively involve selection of: 'healing places'; rural places which incorporated national anti-malarial policies; and young male interviewees, identified as either chronically ill or disabled. The examples are used to show how in these three studies the (sometimes conflicting) requirements of the different criteria were resolved, as well as the potential of, and constraints placed on, the research by the selection decisions which were made. We also consider how far the criteria Miles and Huberman suggest seem helpful for planning 'sample' selection in qualitative research.

  14. An inversion-relaxation approach for sampling stationary points of spin model Hamiltonians

    International Nuclear Information System (INIS)

    Hughes, Ciaran; Mehta, Dhagash; Wales, David J.

    2014-01-01

    Sampling the stationary points of a complicated potential energy landscape is a challenging problem. Here, we introduce a sampling method based on relaxation from stationary points of the highest index of the Hessian matrix. We illustrate how this approach can find all the stationary points for potentials or Hamiltonians bounded from above, which includes a large class of important spin models, and we show that it is far more efficient than previous methods. For potentials unbounded from above, the relaxation part of the method is still efficient in finding minima and transition states, which are usually the primary focus of attention for atomistic systems.

  15. Sampled-Data Control of Spacecraft Rendezvous with Discontinuous Lyapunov Approach

    Directory of Open Access Journals (Sweden)

    Zhuoshi Li

    2013-01-01

    Full Text Available This paper investigates the sampled-data stabilization problem of spacecraft relative positional holding with an improved Lyapunov function approach. The classical Clohessy-Wiltshire equation is adopted to describe the relative dynamic model. The relative position holding problem is converted into an output tracking control problem using sampled signals. A time-dependent discontinuous Lyapunov functional approach is developed, which leads to essentially less conservative results for the stability analysis and controller design of the corresponding closed-loop system. Sufficient conditions for the exponential stability analysis and the existence of the proposed controller are provided, respectively. Finally, a simulation is presented to illustrate the effectiveness of the proposed control scheme.

  16. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    OpenAIRE

    Xiao, Ning-Cong; Li, Yan-Feng; Wang, Zhonglai; Peng, Weiwen; Huang, Hong-Zhong

    2013-01-01

    In this paper, a combination of the maximum entropy method and Bayesian inference for reliability assessment of deteriorating systems is proposed. Due to various uncertainties, scarce data, and incomplete information, system parameters usually cannot be determined precisely. These uncertain parameters can be modeled by fuzzy set theory and Bayesian inference, which have been proved useful for deteriorating systems under small sample sizes. The maximum entropy approach can be used to cal...

  17. Comparison of multiplex RT-PCR and real-time HybProbe assay for serotyping of dengue virus using reference strains and clinical samples from India

    Directory of Open Access Journals (Sweden)

    Anita Chakravarti

    2016-01-01

    Full Text Available Background: Dengue virus serotyping is crucial from a clinical management and epidemiological point of view. Aims: To compare the efficacy of two molecular detection and typing methods, namely multiplex reverse transcription polymerase chain reaction (RT-PCR) and a real-time HybProbe assay, using a panel of known dilutions of four reference dengue virus strains and a panel of sera collected from clinically suspected dengue patients. Settings: This study was conducted at a tertiary-care teaching hospital in Delhi, India. Materials and Methods: Dengue serotype-specific virus strains were used as prototypes for the serotyping assays. Viral load was quantified by quantitative real-time reverse transcription polymerase chain reaction (qRT-PCR). Acute phase serum samples were collected from 79 patients with clinically suspected dengue fever on their first day of presentation during September-October 2012. Viral RNA from serum and cell culture supernatant was extracted, and reverse transcription was carried out. Quantitative detection of DENV RNA from reference strain culture supernatants and each of the 79 patient samples by real-time PCR was performed using the LightCycler TaqMan master mix kit. Serotyping was done by the multiplex RT-PCR assay and the HybProbe assay. Results: The multiplex RT-PCR assay, though found to be 100% specific, could not serotype either patient or reference strains with viral loads less than 1000 RNA copies/ml. The HybProbe assay was found to have 100% specificity and had a lower limit of serotype detection of merely 3.54 RNA copies/ml. Conclusions: The HybProbe assay has an important role, especially in situations where serotyping is to be performed on clinical samples with low viral load.

  18. A novel approach for small sample size family-based association studies: sequential tests.

    Science.gov (United States)

    Ilk, Ozlem; Rajabli, Farid; Dungul, Dilay Ciglidag; Ozdag, Hilal; Ilk, Hakki Gokhan

    2011-08-01

    In this paper, we propose a sequential probability ratio test (SPRT) to overcome the problem of limited samples in studies related to complex genetic diseases. The results of this novel approach are compared with the ones obtained from the traditional transmission disequilibrium test (TDT) on simulated data. Although TDT classifies single-nucleotide polymorphisms (SNPs) to only two groups (SNPs associated with the disease and the others), SPRT has the flexibility of assigning SNPs to a third group, that is, those for which we do not have enough evidence and should keep sampling. It is shown that SPRT results in smaller ratios of false positives and negatives, as well as better accuracy and sensitivity values for classifying SNPs when compared with TDT. By using SPRT, data with small sample size become usable for an accurate association analysis.
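
    The sequential decision rule described here is Wald's sequential probability ratio test. A minimal sketch of its three-way outcome — accept association, reject it, or keep sampling — is below; the Bernoulli transmission model, probabilities, and error rates are illustrative assumptions, not the authors' implementation.

    ```python
    import math

    def sprt(observations, p0=0.5, p1=0.6, alpha=0.05, beta=0.05):
        """Classify a stream of binary transmission outcomes with Wald's SPRT.

        Returns 'H1' (evidence for association), 'H0' (evidence against),
        or 'continue' -- the third group highlighted in the abstract, for
        which sampling should carry on.
        """
        # Wald's decision boundaries on the cumulative log-likelihood ratio
        upper = math.log((1 - beta) / alpha)
        lower = math.log(beta / (1 - alpha))
        llr = 0.0
        for x in observations:  # x = 1: risk allele transmitted, 0: not
            llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
            if llr >= upper:
                return "H1"
            if llr <= lower:
                return "H0"
        return "continue"
    ```

    With p0 = 0.5 (no transmission distortion) and p1 = 0.6, a long run of transmissions crosses the upper boundary, a long run of non-transmissions crosses the lower one, and mixed evidence stays between the boundaries, so the SNP remains in the "keep sampling" group.
    
    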

  19. Racing Sampling Based Microimmune Optimization Approach Solving Constrained Expected Value Programming

    Directory of Open Access Journals (Sweden)

    Kai Yang

    2016-01-01

    Full Text Available This work investigates a bioinspired microimmune optimization algorithm to solve a general class of single-objective nonlinear constrained expected value programming problems without any prior distribution. In the study of the algorithm, two lower-bound sample estimates of random variables are theoretically developed to estimate the empirical values of individuals. Two adaptive racing sampling schemes are designed to identify competitive individuals in a given population, by which high-quality individuals can obtain a large sampling size. An immune evolutionary mechanism, along with a local search approach, is constructed to evolve the current population. Comparative experiments showed that the proposed algorithm can effectively solve higher-dimensional benchmark problems and has potential for further applications.

  20. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    Science.gov (United States)

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of robust tissue-to-plasma ratio from extremely sparsely sampled paired data (ie, one sample each from plasma and tissue per subject). Tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naïve data averaging approach, and AUC values calculated using sampling-based approaches (eg, the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to 2 concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naïve data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
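
    A sampling-based ratio estimate in the spirit of the pseudoprofile-based bootstrap described above can be sketched as follows: at each time point one subject's single observation is drawn to build a pseudoprofile per matrix, the trapezoidal AUC ratio is computed, and the resampling distribution supplies the uncertainty. The data and function names are illustrative, and the authors' 2-phase random sampling algorithm is not reproduced here.

    ```python
    import random

    def trapezoid_auc(times, conc):
        """Linear trapezoidal AUC over the sampled time points."""
        return sum((times[i + 1] - times[i]) * (conc[i] + conc[i + 1]) / 2
                   for i in range(len(times) - 1))

    def bootstrap_ratio(times, tissue, plasma, n_boot=2000, seed=0):
        """tissue[i]/plasma[i]: single-sample concentrations from the
        subjects sampled at times[i] (one observation per subject)."""
        rng = random.Random(seed)
        ratios = []
        for _ in range(n_boot):
            # build one pseudoprofile per matrix by drawing a subject per time
            t_prof = [rng.choice(tissue[i]) for i in range(len(times))]
            p_prof = [rng.choice(plasma[i]) for i in range(len(times))]
            ratios.append(trapezoid_auc(times, t_prof) / trapezoid_auc(times, p_prof))
        ratios.sort()
        median = ratios[n_boot // 2]
        ci = (ratios[int(0.025 * n_boot)], ratios[int(0.975 * n_boot)])
        return median, ci

    # illustrative data: tissue concentrations roughly twice the plasma ones
    times = [1.0, 2.0, 4.0, 8.0]
    plasma = [[1.0, 1.1, 0.9], [0.8, 0.7, 0.9], [0.5, 0.4, 0.6], [0.2, 0.3, 0.25]]
    tissue = [[2 * c for c in obs] for obs in plasma]
    median, ci = bootstrap_ratio(times, tissue, plasma)
    ```

    Unlike the naïve data averaging approach, the resampling distribution yields both a central estimate and an interval, which is the property the abstract emphasizes.
    
    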

  1. Determination of the platinum - group elements (PGE) and gold (Au) in the manganese nodule reference samples by nickel sulfide fire-assay and Te coprecipitation with ICP-MS

    Digital Repository Service at National Institute of Oceanography (India)

    Balaram, V.; Mathur, R.; Banakar, V.K.; Hein, J.R.; Rao, C.R.M.; Rao, T.G.; Dasaram, B.

    Platinum group elements (PGE) and Au data in polymetallic oceanic ferromanganese nodule reference samples and crust samples obtained by inductively coupled plasma mass spectrometry (ICP-MS), after separation and pre-concentration by nickel sulfide...

  2. A New Approach To Soil Sampling For Risk Assessment Of Nutrient Mobilisation.

    Science.gov (United States)

    Jonczyk, J. C.; Owen, G. J.; Snell, M. A.; Barber, N.; Benskin, C.; Reaney, S. M.; Haygarth, P.; Quinn, P. F.; Barker, P. A.; Aftab, A.; Burke, S.; Cleasby, W.; Surridge, B.; Perks, M. T.

    2016-12-01

    Traditionally, risks of nutrient and sediment losses from soils are assessed through a combination of field soil nutrient values, from soil samples taken over the whole field, and the proximity of the field to water courses. The field-average nutrient concentration of the soil is used by farmers to determine fertiliser needs. These data are often used by scientists to assess the risk of nutrient losses to water courses, though they are not really 'fit' for this purpose. The Eden Demonstration Test Catchment (http://www.edendtc.org.uk/) is a research project based in the River Eden catchment, NW UK, with the aim of cost-effectively mitigating diffuse pollution from agriculture whilst maintaining agricultural productivity. Three instrumented focus catchments have been monitored since 2011, providing high-resolution in-stream chemistry and ecological data, alongside some spatial data on soils, land use and nutrient inputs. An approach to mitigation was demonstrated in a small sub-catchment, where surface runoff was identified as the key driver of nutrient losses, using a suite of runoff attenuation features. Other issues identified were the management of hard-standings and soil compaction. A new approach for evaluating nutrient losses from soils is assessed in the Eden DTC project. The Sensitive Catchment Integrated Modelling and Prediction (SCIMAP) model is a risk-mapping framework designed to identify where in the landscape diffuse pollution is most likely to originate (http://www.scimap.org.uk) and was used to examine the spatial pattern of erosion potential. The aim of this work was to assess whether erosion potential identified through the model could be used to inform a new soil sampling strategy, to better assess the risk of erosion and of transport of sediment-bound phosphorus. Soil samples were taken from areas with different erosion potential. The chemical analyses of these targeted samples are compared to those obtained using more traditional sampling approaches.

  3. A Novel Method of Adrenal Venous Sampling via an Antecubital Approach

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Xiongjing, E-mail: jxj103@hotmail.com; Dong, Hui; Peng, Meng; Che, Wuqiang; Zou, Yubao; Song, Lei; Zhang, Huimin; Wu, Haiying [Chinese Academy of Medical Sciences and Peking Union Medical College, Department of Cardiology, Fuwai Hospital, National Center for Cardiovascular Disease (China)

    2017-03-15

    Purpose: Currently, almost all adrenal venous sampling (AVS) procedures are performed via femoral vein access. The purpose of this study was to establish the technique of AVS via an antecubital approach and evaluate its safety and feasibility. Materials and Methods: From January 2012 to June 2015, 194 consecutive patients diagnosed with primary aldosteronism underwent AVS via an antecubital approach without ACTH stimulation. Catheters used for bilateral adrenal cannulations were recorded. The success rate of bilateral adrenal sampling, operation time, fluoroscopy time, dosage of contrast, and incidence of complications were calculated. Results: A 5F MPA1 catheter was first used to attempt right adrenal cannulation in all patients. Cannulation of the right adrenal vein was successfully performed in 164 (84.5%) patients. The 5F JR5, Cobra2, and TIG catheters were the ultimate catheters for right adrenal cannulation in 16 (8.2%), 5 (2.6%), and 9 (4.6%) patients, respectively. For left adrenal cannulation, JR5 and Cobra2 catheters were used in 19 (9.8%) and 10 (5.2%) patients, respectively, while TIG catheters were used in the remaining 165 (85.1%) patients. The rate of successful adrenal sampling on the right, left, and bilateral sides was 91.8%, 93.3%, and 87.6%, respectively. The mean operation time was 16.3 ± 4.3 min, the mean fluoroscopy time was 4.7 ± 1.3 min, and the mean use of contrast was 14.3 ± 4.7 ml. The incidence of adrenal hematoma was 1.0%. Conclusions: This study showed that AVS via an antecubital approach was safe and feasible, with a high rate of successful sampling.

  4. Biomarker discovery in heterogeneous tissue samples -taking the in-silico deconfounding approach

    Directory of Open Access Journals (Sweden)

    Parida Shreemanta K

    2010-01-01

    Full Text Available Abstract Background: For heterogeneous tissues, such as blood, measurements of gene expression are confounded by the relative proportions of the cell types involved. Conclusions have to rely on estimation of gene expression signals for homogeneous cell populations, e.g. by applying micro-dissection, fluorescence-activated cell sorting, or in-silico deconfounding. We studied the feasibility and validity of a non-negative matrix decomposition algorithm using experimental gene expression data for blood and sorted cells from the same donor samples. Our objective was to optimize the algorithm regarding detection of differentially expressed genes and to enable its use for classification in the difficult scenario of reversely regulated genes. This would be of importance for the identification of candidate biomarkers in heterogeneous tissues. Results: Experimental data and simulation studies involving noise parameters estimated from these data revealed that for valid detection of differential gene expression, quantile normalization and use of non-log data are optimal. We demonstrate the feasibility of predicting proportions of constituting cell types from gene expression data of single samples, as a prerequisite for a deconfounding-based classification approach. Classification cross-validation errors with and without using deconfounding results are reported, as well as sample-size dependencies. Implementation of the algorithm, simulation and analysis scripts are available. Conclusions: The deconfounding algorithm without decorrelation using quantile normalization on non-log data is proposed for biomarkers that are difficult to detect, and for cases where confounding by varying proportions of cell types is the suspected reason. In this case, a deconfounding ranking approach can be used as a powerful alternative to, or complement of, other statistical learning approaches to define candidate biomarkers for molecular diagnosis and prediction in biomedicine, in
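
    The "prerequisite" step mentioned in the Results — predicting the proportions of constituting cell types from a single sample's expression profile — can be sketched as a non-negative regression against known cell-type signatures. The signature matrix and gene counts below are toy numbers, and the paper's actual matrix-decomposition algorithm and normalization steps are not reproduced.

    ```python
    import numpy as np

    def estimate_proportions(x, S, iters=5000, lr=None):
        """Estimate non-negative mixing proportions p with x ~= S @ p,
        by projected gradient descent on the least-squares objective
        (illustrative, not the paper's decomposition code)."""
        genes, k = S.shape
        p = np.full(k, 1.0 / k)
        if lr is None:
            lr = 1.0 / np.linalg.norm(S.T @ S, 2)  # 1/Lipschitz step size
        for _ in range(iters):
            grad = S.T @ (S @ p - x)
            p = np.clip(p - lr * grad, 0.0, None)  # project onto p >= 0
        return p / p.sum()  # normalise to proportions

    # two hypothetical cell-type signatures over six genes
    S = np.array([[10, 1], [8, 2], [9, 1], [1, 12], [2, 9], [1, 11]], float)
    true_p = np.array([0.7, 0.3])
    x = S @ true_p          # noiseless mixed-tissue expression vector
    p_hat = estimate_proportions(x, S)
    ```

    On noiseless data the recovered proportions match the mixing weights; with real expression data, noise and signature misspecification make the estimates approximate, which is why the paper studies normalization choices.
    
    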

  5. Wavelength dispersive X-ray fluorescence analysis using fundamental parameter approach of Catha edulis and other related plant samples

    Energy Technology Data Exchange (ETDEWEB)

    Shaltout, Abdallah A., E-mail: shaltout_a@hotmail.com [Spectroscopy Department, Physics Division, National Research Center, El Behooth Str., 12622 Dokki, Cairo (Egypt); Faculty of science, Taif University, 21974 Taif, P.O. Box 888 (Saudi Arabia); Moharram, Mohammed A. [Spectroscopy Department, Physics Division, National Research Center, El Behooth Str., 12622 Dokki, Cairo (Egypt); Mostafa, Nasser Y. [Faculty of science, Taif University, 21974 Taif, P.O. Box 888 (Saudi Arabia); Chemistry Department, Faculty of Science, Suez Canal University, Ismailia (Egypt)

    2012-01-15

    This work is the first attempt to quantify trace elements in the Catha edulis plant (Khat) with a fundamental parameter approach. C. edulis is a famous drug plant in east Africa and Arabian Peninsula. We have previously confirmed that hydroxyapatite represents one of the main inorganic compounds in the leaves and stalks of C. edulis. Comparable plant leaves from basil, mint and green tea were included in the present investigation as well as trifolium leaves were included as a non-related plant. The elemental analyses of the plants were done by Wavelength Dispersive X-Ray Fluorescence (WDXRF) spectroscopy. Standard-less quantitative WDXRF analysis was carried out based on the fundamental parameter approaches. According to the standard-less analysis algorithms, there is an essential need for an accurate determination of the amount of organic material in the sample. A new approach, based on the differential thermal analysis, was successfully used for the organic material determination. The obtained results based on this approach were in a good agreement with the commonly used methods. Depending on the developed method, quantitative analysis results of eighteen elements including; Al, Br, Ca, Cl, Cu, Fe, K, Na, Ni, Mg, Mn, P, Rb, S, Si, Sr, Ti and Zn were obtained for each plant. The results of the certified reference materials of green tea (NCSZC73014, China National Analysis Center for Iron and Steel, Beijing, China) confirmed the validity of the proposed method. - Highlights: ► Quantitative analysis of Catha edulis was carried out using standardless WDXRF. ► Differential thermal analysis was used for determination of the loss of ignition. ► The existence of hydroxyapatite in Catha edulis plant has been confirmed. ► The CRM results confirmed the validity of the developed method.

  6. Wavelength dispersive X-ray fluorescence analysis using fundamental parameter approach of Catha edulis and other related plant samples

    International Nuclear Information System (INIS)

    Shaltout, Abdallah A.; Moharram, Mohammed A.; Mostafa, Nasser Y.

    2012-01-01

    This work is the first attempt to quantify trace elements in the Catha edulis plant (Khat) with a fundamental parameter approach. C. edulis is a famous drug plant in east Africa and Arabian Peninsula. We have previously confirmed that hydroxyapatite represents one of the main inorganic compounds in the leaves and stalks of C. edulis. Comparable plant leaves from basil, mint and green tea were included in the present investigation as well as trifolium leaves were included as a non-related plant. The elemental analyses of the plants were done by Wavelength Dispersive X-Ray Fluorescence (WDXRF) spectroscopy. Standard-less quantitative WDXRF analysis was carried out based on the fundamental parameter approaches. According to the standard-less analysis algorithms, there is an essential need for an accurate determination of the amount of organic material in the sample. A new approach, based on the differential thermal analysis, was successfully used for the organic material determination. The obtained results based on this approach were in a good agreement with the commonly used methods. Depending on the developed method, quantitative analysis results of eighteen elements including; Al, Br, Ca, Cl, Cu, Fe, K, Na, Ni, Mg, Mn, P, Rb, S, Si, Sr, Ti and Zn were obtained for each plant. The results of the certified reference materials of green tea (NCSZC73014, China National Analysis Center for Iron and Steel, Beijing, China) confirmed the validity of the proposed method. - Highlights: ► Quantitative analysis of Catha edulis was carried out using standardless WDXRF. ► Differential thermal analysis was used for determination of the loss of ignition. ► The existence of hydroxyapatite in Catha edulis plant has been confirmed. ► The CRM results confirmed the validity of the developed method.

  7. Novel approach in k0-NAA for highly concentrated REE Samples.

    Science.gov (United States)

    Abdollahi Neisiani, M; Latifi, M; Chaouki, J; Chilian, C

    2018-04-01

    The present paper presents a new approach to k0-NAA for accurate quantification with short turnaround analysis times for rare earth elements (REEs) in high-content mineral matrices. REE k0 and Q0 values, spectral interferences and nuclear interferences were experimentally evaluated and improved with Alfa Aesar Specpure Plasma Standard 1000 mg kg⁻¹ mono-rare-earth solutions. The new iterative gamma-ray self-attenuation and neutron self-shielding methods were investigated with powder standards prepared from 100 mg of 99.9% Alfa Aesar mono rare earth oxide diluted with silica oxide. The overall performance of the new k0-NAA method for REEs was validated using a certified reference material (CRM) from the Canadian Certified Reference Materials Project (REE-2) with REE content ranging from 7.2 mg kg⁻¹ for Yb to 9610 mg kg⁻¹ for Ce. The REE concentrations were determined with uncertainty below 7% (at 95% confidence level) and showed good consistency with the CRM certified concentrations. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
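
    The first study's setup — cluster recovery from binary code profiles with a sample of 50 — can be imitated with a small simulation: plain k-means on 0/1 vectors, scored against the simulated latent profiles. This is illustrative only (neither the authors' code nor their latent class analysis), and the profile probabilities are assumptions.

    ```python
    import random

    def kmeans_binary(data, k=2, iters=25, seed=1):
        """Plain k-means on 0/1 vectors; centers are code-endorsement rates."""
        rng = random.Random(seed)
        centers = [list(map(float, v)) for v in rng.sample(data, k)]
        assign = [0] * len(data)
        for _ in range(iters):
            # assignment step: nearest center by squared Euclidean distance
            for i, v in enumerate(data):
                assign[i] = min(range(k),
                                key=lambda c: sum((a - b) ** 2
                                                  for a, b in zip(v, centers[c])))
            # update step: center = per-code endorsement rate of its members
            for c in range(k):
                members = [data[i] for i in range(len(data)) if assign[i] == c]
                if members:
                    centers[c] = [sum(col) / len(members) for col in zip(*members)]
        return assign

    # 50 simulated participants from two latent profiles over 8 binary codes
    rng = random.Random(0)
    profiles = [[0.9] * 4 + [0.1] * 4, [0.1] * 4 + [0.9] * 4]
    truth = [i % 2 for i in range(50)]
    data = [[1 if rng.random() < profiles[t][j] else 0 for j in range(8)]
            for t in truth]
    labels = kmeans_binary(data)

    # cluster labels are arbitrary, so score against the better of both mappings
    agree = sum(l == t for l, t in zip(labels, truth))
    accuracy = max(agree, 50 - agree) / 50
    ```

    With well-separated profiles, assignment accuracy stays high even at n = 50, mirroring the simulation finding that accuracy does not collapse for small samples of binary data.
    
    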

  9. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  10. Single-sample 99mTc-diethylenetriamine penta-acetate plasma clearance in advanced renal failure by the mean sojourn time approach.

    Science.gov (United States)

    Gref, Margareta C; Karp, Kjell H

    2009-03-01

    The single-sample Tc-diethylenetriamine penta-acetate (DTPA) clearance method by Christensen and Groth is recommended by the Radionuclides in Nephrourology Committee on Renal Clearance for use in adults with an estimated glomerular filtration rate (GFR) > or = 30 ml/min. The purpose of this study was to test a new Tc-DTPA single-sample low clearance formula for GFR less than 30 ml/min. Twenty-one adult patients (29 investigations) were included. Reference clearance was calculated with both Cr-EDTA and Tc-DTPA according to Brøchner-Mortensen, with samples drawn between 3 and 24 h. Single-sample clearance was calculated from a 24 h sample using the low clearance formula (the equation is included in the full-text article), where C(t) is the activity of the tracer in the plasma sample t minutes after the injection and Q0 is the injected amount. ECV is the extracellular volume in ml, defined as the distribution volume of the tracer and estimated from the body surface area as ECV = 8116.6 × body surface area − 28.2. The mean difference between reference and Tc-DTPA single-sample clearance was -0.5 ml/min (SD 1.0 ml/min) with Tc-DTPA and -0.8 ml/min (SD 1.2 ml/min) with Cr-EDTA as the reference clearance. In adult patients it is possible, even with GFR less than 30 ml/min, to obtain an accurate determination of Tc-DTPA plasma clearance from a single sample using the mean sojourn time approach. The blood sample should be obtained about 24 h after injection of the GFR tracer.
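    The ECV estimate quoted in the abstract is simple to reproduce. The sketch below pairs it with the Du Bois body-surface-area formula as an assumption (the abstract does not state which BSA formula was used); the single-sample clearance equation itself remains in the full-text article and is not reconstructed here.

```python
def body_surface_area(weight_kg, height_cm):
    """Du Bois formula (an assumption; the paper does not state its BSA formula)."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def extracellular_volume(bsa_m2):
    """ECV in ml from body surface area, as given in the abstract."""
    return 8116.6 * bsa_m2 - 28.2

bsa = body_surface_area(70, 175)  # a 70 kg, 175 cm adult
print(round(bsa, 2), round(extracellular_volume(bsa)))
```

    The ECV then enters the published low-clearance formula together with the 24 h plasma activity C(t) and the injected amount Q0.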

  11. Evaluation of Approaches to Analyzing Continuous Correlated Eye Data When Sample Size Is Small.

    Science.gov (United States)

    Huang, Jing; Huang, Jiayan; Chen, Yong; Ying, Gui-Shuang

    2018-02-01

    To evaluate the performance of commonly used statistical methods for analyzing continuous correlated eye data when sample size is small. We simulated correlated continuous data from two designs: (1) two eyes of a subject in two comparison groups; (2) two eyes of a subject in the same comparison group, under various sample sizes (5-50), inter-eye correlations (0-0.75) and effect sizes (0-0.8). Simulated data were analyzed using the paired t-test, two-sample t-test, Wald test and score test using generalized estimating equations (GEE), and F-test using the linear mixed effects model (LMM). We compared type I error rates and statistical powers, and demonstrated analysis approaches through analyzing two real datasets. In design 1, the paired t-test and LMM perform better than GEE, with nominal type I error rate and higher statistical power. In design 2, no test performs uniformly well: the two-sample t-test (average of two eyes or a random eye) achieves better control of type I error but yields lower statistical power. In both designs, the GEE Wald test inflates the type I error rate and the GEE score test has lower power. When sample size is small, some commonly used statistical methods do not perform well. The paired t-test and LMM perform best when the two eyes of a subject are in two different comparison groups, and the t-test using the average of two eyes performs best when the two eyes are in the same comparison group. The study design should be considered when selecting the appropriate analysis approach.
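    The two recommended tests are straightforward to compute by hand. A minimal sketch with hypothetical eye measurements (not the simulation code used in the study): the paired t statistic for design 1, and the two-sample t on per-subject eye averages for design 2.

```python
from math import sqrt

def paired_t(x, y):
    """Paired t statistic: fellow eyes of the same subjects (design 1)."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)
    return mean / sqrt(var / n)

def two_sample_t_on_averages(group1, group2):
    """Two-sample t on per-subject averages of the two eyes (design 2)."""
    a = [(e1 + e2) / 2 for e1, e2 in group1]
    b = [(e1 + e2) / 2 for e1, e2 in group2]
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((v - ma) ** 2 for v in a) / (na - 1)
    vb = sum((v - mb) ** 2 for v in b) / (nb - 1)
    sp = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / sqrt(sp * (1 / na + 1 / nb))

# Hypothetical intraocular pressures: treated eye vs fellow eye of each subject
treated = [16, 15, 14, 15]
fellow = [17, 17, 17, 19]
print(round(paired_t(treated, fellow), 3))  # → -3.873
```

    Using the eye average collapses each subject to one independent observation, which is why it controls the type I error in design 2 at the cost of power.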

  12. A Sampling Based Approach to Spacecraft Autonomous Maneuvering with Safety Specifications

    Science.gov (United States)

    Starek, Joseph A.; Barbee, Brent W.; Pavone, Marco

    2015-01-01

    This paper presents a method for safe spacecraft autonomous maneuvering that applies robotic motion-planning techniques to spacecraft control. Specifically, the scenario we consider is an in-plane rendezvous of a chaser spacecraft in proximity to a target spacecraft at the origin of the Clohessy-Wiltshire-Hill frame. The trajectory for the chaser spacecraft is generated in a receding horizon fashion by executing a sampling-based robotic motion planning algorithm named Fast Marching Trees (FMT), which efficiently grows a tree of trajectories over a set of probabilistically drawn samples in the state space. To enforce safety, the tree is only grown over actively safe samples for which there exists a one-burn collision avoidance maneuver that circularizes the spacecraft orbit along a collision-free coasting arc and that can be executed under potential thruster failures. The overall approach establishes a provably correct framework for the systematic encoding of safety specifications into the spacecraft trajectory generation process and appears amenable to real-time implementation on orbit. Simulation results are presented for a two-fault tolerant spacecraft during autonomous approach to a single client in Low Earth Orbit.
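    The relative dynamics underlying such rendezvous scenarios are the Clohessy-Wiltshire-Hill equations. The sketch below is illustrative only, not the FMT planner itself: it propagates the in-plane dynamics with RK4, using made-up values for the mean motion and the initial offset, and shows the bounded one-period motion produced by the drift-free initial condition vy0 = -2·n·x0.

```python
import math

def cw_accel(state, n):
    """In-plane Clohessy-Wiltshire-Hill dynamics: x radial, y along-track."""
    x, y, vx, vy = state
    ax = 3 * n * n * x + 2 * n * vy
    ay = -2 * n * vx
    return (vx, vy, ax, ay)

def rk4_step(state, n, dt):
    def add(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = cw_accel(state, n)
    k2 = cw_accel(add(state, k1, dt / 2), n)
    k3 = cw_accel(add(state, k2, dt / 2), n)
    k4 = cw_accel(add(state, k3, dt), n)
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

n = 0.00113   # mean motion for a ~400 km LEO orbit, rad/s (illustrative)
x0 = 100.0    # 100 m radial offset
state = (x0, 0.0, 0.0, -2 * n * x0)  # vy0 = -2*n*x0 cancels the secular drift
period = 2 * math.pi / n
t, dt = 0.0, 1.0
while t < period:
    state = rk4_step(state, n, dt)
    t += dt
print(abs(state[0] - x0) < 1.0)  # returns near the initial radial offset
```

    A sampling-based planner such as FMT would grow its tree through states propagated with exactly this kind of model, keeping only samples from which a safe circularizing burn exists.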

  13. A Toolbox for Quantitative Gene Expression in Varroa destructor: RNA Degradation in Field Samples and Systematic Analysis of Reference Gene Stability.

    Directory of Open Access Journals (Sweden)

    Ewan M Campbell

    Full Text Available Varroa destructor is the major pest of Apis mellifera and contributes to the global honey bee health crisis threatening food security. Developing new control strategies to combat Varroa will require the application of molecular biology, including gene expression studies by quantitative real-time reverse transcription-polymerase chain reaction (qRT-PCR). Both high quality RNA samples and suitable stable internal reference genes are required for accurate gene expression studies. In this study, ten candidate genes (succinate dehydrogenase (SDHA), NADH dehydrogenase (NADH), large ribosomal subunit, TATA-binding protein, glyceraldehyde-3-phosphate dehydrogenase, 18S rRNA (18S), heat-shock protein 90 (HSP90), cyclophilin, α-tubulin and actin) were evaluated for their suitability as normalization genes using the geNorm, NormFinder, BestKeeper and comparative ΔCq algorithms. Our study proposes the use of no more than two of the four most stable reference genes (NADH, 18S, SDHA and HSP90) in Varroa gene expression studies. These four genes remain stable in phoretic and reproductive stage Varroa and are unaffected by Deformed wing virus load. When used for determining changes in vitellogenin gene expression, the signal-to-noise ratio (SNR) for the relatively unstable genes actin and α-tubulin was much lower than for the stable gene combinations (NADH + HSP90 + 18S; NADH + HSP90; or NADH). Using both electropherograms and RT-qPCR for short and long amplicons as quality controls, we demonstrate that high quality RNA can be recovered from Varroa after up to 10 days of storage at ambient temperature, provided the mites are collected into RNAlater and the body is pierced. This protocol allows the exchange of Varroa samples between international collaborators and field sample collectors without requiring frozen collection or shipping. Our results make important contributions to gene expression studies in Varroa by proposing a validated sampling protocol to obtain high quality Varroa
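    Once stable reference genes are chosen, relative expression is commonly computed by normalizing the target Cq against the mean Cq of the references and applying the 2^-ΔΔCq method. The Cq numbers below are invented for illustration; the paper validates the reference genes but this particular calculation is a generic sketch.

```python
def fold_change(target_cq_ctrl, ref_cqs_ctrl, target_cq_trt, ref_cqs_trt):
    """Relative expression by the 2^-ddCq method, normalizing the target Cq
    against the mean Cq of stable reference genes (e.g. NADH + HSP90)."""
    d_ctrl = target_cq_ctrl - sum(ref_cqs_ctrl) / len(ref_cqs_ctrl)
    d_trt = target_cq_trt - sum(ref_cqs_trt) / len(ref_cqs_trt)
    return 2 ** -(d_trt - d_ctrl)

# Hypothetical Cq values for vitellogenin against NADH and HSP90 references:
# the treated mites amplify two cycles earlier, i.e. four-fold more transcript
print(fold_change(25.0, [20.0, 22.0], 23.0, [20.0, 22.0]))  # → 4.0
```

    Normalizing against an unstable gene such as actin would shift both ΔCq terms unpredictably, which is exactly the signal-to-noise degradation the study reports.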

  14. Usefulness of quantitative determination of cerebral blood flow by 123I-IMP SPECT reference sample method in various cerebrovascular disorders

    International Nuclear Information System (INIS)

    Fukuda, Tadaharu; Hasegawa, Kouichi; Yamanaka, Shigehito; Hasue, Masamichi; Ohtubo, Yutaka; Wada, Atsushi; Nakanishi, Hisashi; Nakamura, Tatuya; Itou, Hiroshi.

    1992-01-01

    Cerebral blood flow (CBF) was quantitatively determined by N-isopropyl-p-[123I]iodoamphetamine (IMP) single photon emission computed tomography (SPECT) with a rotating gamma camera. A ZLC 7500 unit (SIEMENS Inc.) was used for emission CT, and a SCINTIPAK-2400 (Shimadzu Corp. Ltd.) for data processing. For the quantitative determination of CBF, arterial blood samples were collected for 5 minutes during the intravenous injection of 111 MBq of IMP, and a reference sample method corrected by the time-activity curve was used. The determination was carried out in 90 patients with various cerebrovascular diseases and 5 normal volunteers. Mean cerebral blood flow (m-CBF) in the normal cases as determined by the above method was 42.4±6.0 ml/100 g/min. In patients with acute phase subarachnoid hemorrhage (SAH), severity on CT was marked in patients with intracerebral hematomas greater than 45 mm in diameter. Patients with non-hemorrhagic arteriovenous malformation (AVM) whose nidi were 30 mm or more in diameter showed a decrease in CBF on the afferent side. This decrease was caused by a steal phenomenon affecting CBF around the AVM. The size of cerebral infarction on CT was closely correlated with the decrease in CBF, and CBF in patients with stenosis and obstruction of the main trunks was less than that in patients without them. CBF was increased by 10-20% in patients who underwent carotid endarterectomy or superficial temporal artery-middle cerebral artery anastomosis for obstruction or stenosis of the internal carotid artery or the middle cerebral artery. The quantitative determination of CBF by the IMP SPECT reference sample method was useful for evaluating the morbid condition, estimating the prognosis of cerebrovascular diseases, and evaluating the effects of therapy. (J.P.N.)

  15. MULTI-LEVEL SAMPLING APPROACH FOR CONTINUOUS LOSS DETECTION USING ITERATIVE WINDOW AND STATISTICAL MODEL

    OpenAIRE

    Mohd Fo'ad Rohani; Mohd Aizaini Maarof; Ali Selamat; Houssain Kettani

    2010-01-01

    This paper proposes a Multi-Level Sampling (MLS) approach for continuous Loss of Self-Similarity (LoSS) detection using iterative window. The method defines LoSS based on Second Order Self-Similarity (SOSS) statistical model. The Optimization Method (OM) is used to estimate self-similarity parameter since it is fast and more accurate in comparison with other estimation methods known in the literature. Probability of LoSS detection is introduced to measure continuous LoSS detection performance...
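    The paper's Optimization Method estimator is not reproduced here. As a stand-in, the classical aggregated-variance method illustrates how a self-similarity (Hurst) parameter can be estimated from the scaling of block-mean variances: for an exactly second-order self-similar process, Var(X^(m)) scales as m^(2H-2).

```python
import random
from math import log

def hurst_aggregated_variance(series, block_sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate H from the least-squares slope of log Var(X^(m)) vs log m."""
    xs, ys = [], []
    for m in block_sizes:
        k = len(series) // m
        means = [sum(series[i * m:(i + 1) * m]) / m for i in range(k)]
        mu = sum(means) / k
        var = sum((v - mu) ** 2 for v in means) / (k - 1)
        xs.append(log(m))
        ys.append(log(var))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
             / sum((a - mx) ** 2 for a in xs))
    return 1 + slope / 2  # slope = 2H - 2

random.seed(42)
iid = [random.gauss(0, 1) for _ in range(20000)]
h = hurst_aggregated_variance(iid)
print(0.4 < h < 0.6)  # iid noise has H ≈ 0.5; self-similar traffic gives H > 0.5
```

    A loss-of-self-similarity detector of the kind described would flag windows where the estimated H drops back toward 0.5, indicating the traffic no longer exhibits long-range dependence.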

  16. Perfluoroalkyl substances in aquatic environment-comparison of fish and passive sampling approaches.

    Science.gov (United States)

    Cerveny, Daniel; Grabic, Roman; Fedorova, Ganna; Grabicova, Katerina; Turek, Jan; Kodes, Vit; Golovko, Oksana; Zlabek, Vladimir; Randak, Tomas

    2016-01-01

    The concentrations of seven perfluoroalkyl substances (PFASs) were investigated in 36 European chub (Squalius cephalus) individuals from six localities in the Czech Republic. Chub muscle and liver tissue were analysed at all sampling sites. In addition, analyses of 16 target PFASs were performed in Polar Organic Chemical Integrative Samplers (POCISs) deployed in the water at the same sampling sites. We evaluated the possibility of using passive samplers as a standardized method for monitoring PFAS contamination in aquatic environments and the mutual relationships between determined concentrations. Only perfluorooctane sulphonate was above the LOQ in fish muscle samples and 52% of the analysed fish individuals exceeded the Environmental Quality Standard for water biota. Fish muscle concentration is also particularly important for risk assessment of fish consumers. The comparison of fish tissue results with published data showed the similarity of the Czech results with those found in Germany and France. However, fish liver analysis and the passive sampling approach resulted in different fish exposure scenarios. The total concentration of PFASs in fish liver tissue was strongly correlated with POCIS data, but pollutant patterns differed between these two matrices. The differences could be attributed to the metabolic activity of the living organism. In addition to providing a different view regarding the real PFAS cocktail to which the fish are exposed, POCISs fulfil the Three Rs strategy (replacement, reduction, and refinement) in animal testing. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Green infrastructure planning for cooling urban communities: Overview of the contemporary approaches with special reference to Serbian experiences

    Directory of Open Access Journals (Sweden)

    Marić Igor

    2015-01-01

    Full Text Available This paper investigates contemporary approaches defined by the policies, programs or standards that favor green infrastructure in urban planning for cooling urban environments, with special reference to Serbian experiences. The research results reveal an increasing emphasis on the multifunctionality of green infrastructure, as well as a commitment to developing policies, guidelines and standards with the support of the overall community. Further, special importance is given to policies that promote ‘cool communities’ strategies that increase vegetation-covered areas, which has contributed to adapting urban environments to the impacts of climate change. In addition, this research indicates the important role of local authorities and planners in Serbia in promoting planning policies and programs that take into consideration the role of green infrastructure in terms of improving climatic conditions, quality of life and reducing energy needed for cooling and heating. [Projekat Ministarstva nauke Republike Srbije, br. TR 36035: Spatial, ecological, energy, and social aspects of developing settlements and climate change - mutual impacts i br. 43007: The investigation of climate change and its impacts, climate change adaptation and mitigation

  18. Mission Planning and Decision Support for Underwater Glider Networks: A Sampling on-Demand Approach

    Science.gov (United States)

    Ferri, Gabriele; Cococcioni, Marco; Alvarez, Alberto

    2015-01-01

    This paper describes an optimal sampling approach to support glider fleet operators and marine scientists during the complex task of planning the missions of fleets of underwater gliders. Optimal sampling, which has gained considerable attention in the last decade, consists in planning the paths of gliders to minimize a specific criterion pertinent to the phenomenon under investigation. Different criteria (e.g., A, G, or E optimality), used in geosciences to obtain an optimum design, lead to different sampling strategies. In particular, the A criterion produces paths for the gliders that minimize the overall level of uncertainty over the area of interest. However, there are commonly operative situations in which the marine scientists may prefer not to minimize the overall uncertainty of a certain area, but instead they may be interested in achieving an acceptable uncertainty sufficient for the scientific or operational needs of the mission. We propose and discuss here an approach named sampling on-demand that explicitly addresses this need. In our approach the user provides an objective map, setting both the amount and the geographic distribution of the uncertainty to be achieved after assimilating the information gathered by the fleet. A novel optimality criterion, called Aη, is proposed and the resulting minimization problem is solved by using a Simulated Annealing based optimizer that takes into account the constraints imposed by the glider navigation features, the desired geometry of the paths and the problems of reachability caused by ocean currents. This planning strategy has been implemented in a Matlab toolbox called SoDDS (Sampling on-Demand and Decision Support). The tool is able to automatically download the ocean fields data from MyOcean repository and also provides graphical user interfaces to ease the input process of mission parameters and targets. The results obtained by running SoDDS on three different scenarios are provided and show that So
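    The Simulated Annealing optimizer at the heart of the path planner can be illustrated generically. The sketch below minimizes a toy one-dimensional stand-in for the Aη criterion; the cost function, step size and cooling schedule are invented, and the real SoDDS optimizer additionally handles glider navigation constraints, path geometry and ocean currents.

```python
import math
import random

def simulated_annealing(cost, x0, step=0.5, t0=1.0, cooling=0.995,
                        iters=5000, seed=1):
    """Generic SA: always accept improvements, accept worse moves with
    probability exp(-dcost / T), and cool T geometrically."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best_x, best_c = x, c
    temp = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        cc = cost(cand)
        if cc < c or rng.random() < math.exp(-(cc - c) / temp):
            x, c = cand, cc
            if c < best_c:
                best_x, best_c = x, c
        temp *= cooling
    return best_x, best_c

# Toy multimodal "uncertainty" cost standing in for the A-eta criterion
cost = lambda x: (x - 2) ** 2 + 2 * math.sin(5 * x)
x, c = simulated_annealing(cost, x0=-5.0)
print(c < cost(-5.0))  # escapes local minima and improves on the start
```

    The probabilistic acceptance of uphill moves is what lets the planner escape the local minima that a multimodal objective map over glider paths inevitably produces.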

  19. Mission Planning and Decision Support for Underwater Glider Networks: A Sampling on-Demand Approach.

    Science.gov (United States)

    Ferri, Gabriele; Cococcioni, Marco; Alvarez, Alberto

    2015-12-26

    This paper describes an optimal sampling approach to support glider fleet operators and marine scientists during the complex task of planning the missions of fleets of underwater gliders. Optimal sampling, which has gained considerable attention in the last decade, consists in planning the paths of gliders to minimize a specific criterion pertinent to the phenomenon under investigation. Different criteria (e.g., A, G, or E optimality), used in geosciences to obtain an optimum design, lead to different sampling strategies. In particular, the A criterion produces paths for the gliders that minimize the overall level of uncertainty over the area of interest. However, there are commonly operative situations in which the marine scientists may prefer not to minimize the overall uncertainty of a certain area, but instead they may be interested in achieving an acceptable uncertainty sufficient for the scientific or operational needs of the mission. We propose and discuss here an approach named sampling on-demand that explicitly addresses this need. In our approach the user provides an objective map, setting both the amount and the geographic distribution of the uncertainty to be achieved after assimilating the information gathered by the fleet. A novel optimality criterion, called Aη, is proposed and the resulting minimization problem is solved by using a Simulated Annealing based optimizer that takes into account the constraints imposed by the glider navigation features, the desired geometry of the paths and the problems of reachability caused by ocean currents. This planning strategy has been implemented in a Matlab toolbox called SoDDS (Sampling on-Demand and Decision Support). The tool is able to automatically download the ocean fields data from MyOcean repository and also provides graphical user interfaces to ease the input process of mission parameters and targets. The results obtained by running SoDDS on three different scenarios are provided and show that So

  20. Mission Planning and Decision Support for Underwater Glider Networks: A Sampling on-Demand Approach

    Directory of Open Access Journals (Sweden)

    Gabriele Ferri

    2015-12-01

    Full Text Available This paper describes an optimal sampling approach to support glider fleet operators and marine scientists during the complex task of planning the missions of fleets of underwater gliders. Optimal sampling, which has gained considerable attention in the last decade, consists in planning the paths of gliders to minimize a specific criterion pertinent to the phenomenon under investigation. Different criteria (e.g., A, G, or E optimality), used in geosciences to obtain an optimum design, lead to different sampling strategies. In particular, the A criterion produces paths for the gliders that minimize the overall level of uncertainty over the area of interest. However, there are commonly operative situations in which the marine scientists may prefer not to minimize the overall uncertainty of a certain area, but instead they may be interested in achieving an acceptable uncertainty sufficient for the scientific or operational needs of the mission. We propose and discuss here an approach named sampling on-demand that explicitly addresses this need. In our approach the user provides an objective map, setting both the amount and the geographic distribution of the uncertainty to be achieved after assimilating the information gathered by the fleet. A novel optimality criterion, called Aη, is proposed and the resulting minimization problem is solved by using a Simulated Annealing based optimizer that takes into account the constraints imposed by the glider navigation features, the desired geometry of the paths and the problems of reachability caused by ocean currents. This planning strategy has been implemented in a Matlab toolbox called SoDDS (Sampling on-Demand and Decision Support). The tool is able to automatically download the ocean fields data from MyOcean repository and also provides graphical user interfaces to ease the input process of mission parameters and targets. The results obtained by running SoDDS on three different scenarios are provided

  1. DEVELOPING AN EXCELLENT SEDIMENT RATING CURVE FROM ONE HYDROLOGICAL YEAR SAMPLING PROGRAMME DATA: APPROACH

    Directory of Open Access Journals (Sweden)

    Preksedis M. Ndomba

    2008-01-01

    Full Text Available This paper presents preliminary findings on the adequacy of one hydrological year of sampling programme data for developing an excellent sediment rating curve. The study case is the 1DD1 subcatchment in the upstream part of the Pangani River Basin (PRB), located in the north-eastern part of Tanzania. 1DD1 is the major runoff-sediment contributing tributary to the downstream hydropower reservoir, the Nyumba Ya Mungu (NYM). In the literature, the sediment rating curve method is known to underestimate the actual sediment load. In the case of developing countries, long-term sediment sampling monitoring or conservation campaigns have been reported as unworkable options. Besides, to the best knowledge of the authors, to date there is no consensus on how to develop an excellent rating curve. Daily-midway and intermittent-cross section sediment samples from a Depth Integrating sampler (D-74) were used to calibrate the subdaily automatic sediment pumping sampler (ISCO 6712) near-bank point samples for developing the rating curve. Sediment load correction factors were derived from both statistical bias estimators and actual sediment load approaches. It should be noted that the ongoing study is guided by findings of other studies in the same catchment. For instance, the long-term sediment yield rate estimated from a reservoir survey validated the performance of the developed rating curve. The result suggests that an excellent rating curve can be developed from one hydrological year of sediment sampling programme data. This study has also found that an uncorrected rating curve underestimates sediment load. The degree of underestimation depends on the type of rating curve developed and the data used.
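    A rating curve is a power law Qs = a·Q^b fitted by least squares in log space, and one widely used statistical bias estimator is the smearing-type correction CF = exp(s²/2) applied on back-transformation. The sketch below is illustrative: the paper compares several correction factors and does not necessarily use this one, and the data are synthetic.

```python
from math import exp, log

def fit_rating_curve(discharge, sediment_load):
    """Fit Qs = a * Q**b by least squares in natural-log space, returning
    (a, b, CF) where CF = exp(s**2 / 2) corrects the back-transform bias."""
    lx = [log(q) for q in discharge]
    ly = [log(s) for s in sediment_load]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
    la = my - b * mx
    resid_var = sum((y - (la + b * x)) ** 2 for x, y in zip(lx, ly)) / max(n - 2, 1)
    cf = exp(resid_var / 2)
    return exp(la), b, cf

# Synthetic check: an exact power law Qs = 2 * Q**1.5 is recovered with CF = 1
q = [1.0, 2.0, 5.0, 10.0, 20.0, 50.0]
a, b, cf = fit_rating_curve(q, [2 * v ** 1.5 for v in q])
print(round(a, 6), round(b, 6), round(cf, 6))  # → 2.0 1.5 1.0
```

    With scattered field data the residual variance is non-zero and CF exceeds 1, which is precisely the correction for the underestimation the abstract describes.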

  2. A retrospective cross-sectional quantitative molecular approach in biological samples from patients with syphilis.

    Science.gov (United States)

    Pinto, Miguel; Antelo, Minia; Ferreira, Rita; Azevedo, Jacinta; Santo, Irene; Borrego, Maria José; Gomes, João Paulo

    2017-03-01

    Syphilis is the sexually transmitted disease caused by Treponema pallidum, a pathogen highly adapted to the human host. As a multistage disease, syphilis presents distinct clinical manifestations that pose different implications for diagnosis. Nevertheless, the inherent factors leading to diverse disease progressions are still unknown. We aimed to assess the association between treponemal loads and dissimilar disease outcomes, to better understand syphilis. We retrospectively analyzed 309 DNA samples from distinct anatomic sites associated with particular syphilis manifestations. All samples had previously tested positive by a PCR-based diagnostic kit. An absolute quantitative real-time PCR procedure was used to precisely quantify the number of treponemal and human cells to determine T. pallidum loads in each sample. In general, lesion exudates presented the highest T. pallidum loads, in contrast with blood-derived samples. Within the latter, a higher dispersion of T. pallidum quantities was observed for secondary syphilis. T. pallidum was detected in substantial amounts in 37 samples from seronegative individuals and in 13 cases considered as syphilis-treated. No association was found between treponemal loads and serological results or HIV status. This study suggests a scenario where syphilis may be characterized by: i) heterogeneous and high treponemal loads in primary syphilis, regardless of the anatomic site, reflecting dissimilar duration of chancre development and resolution; ii) high dispersion of bacterial concentrations in secondary syphilis, potentially suggesting replication capability of T. pallidum while in the bloodstream; and iii) bacterial evasiveness, either to the host immune system or antibiotic treatment, while remaining hidden in privileged niches. This work highlights the importance of using molecular approaches to study uncultivable human pathogens, such as T. pallidum, in the infection process. Copyright © 2017 Elsevier Ltd. All rights reserved.
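    Absolute quantification of this kind can be sketched as a pair of standard-curve interpolations, with the treponemal load expressed per human cell equivalent. The slopes and intercepts below are hypothetical, not the study's calibration values.

```python
def copies_from_cq(cq, slope, intercept):
    """Absolute quantification: invert the standard curve
    Cq = slope * log10(copies) + intercept."""
    return 10 ** ((cq - intercept) / slope)

def treponemal_load(tp_cq, human_cq, tp_curve, human_curve):
    """T. pallidum copies per human cell equivalent in the same sample."""
    tp = copies_from_cq(tp_cq, *tp_curve)
    human = copies_from_cq(human_cq, *human_curve)
    return tp / human

# Hypothetical curves: slope -3.32 (~100% PCR efficiency), invented intercepts
load = treponemal_load(28.0, 25.0, (-3.32, 38.0), (-3.32, 38.0))
print(round(load, 2))  # → 0.12
```

    Normalizing to human cell equivalents makes loads comparable across sample types with very different amounts of input material, such as lesion exudates and blood.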

  3. A simple Bayesian approach to quantifying confidence level of adverse event incidence proportion in small samples.

    Science.gov (United States)

    Liu, Fang

    2016-01-01

    In both clinical development and post-marketing of a new therapy or a new treatment, incidence of an adverse event (AE) is always a concern. When sample sizes are small, large sample-based inferential approaches on an AE incidence proportion in a certain time period no longer apply. In this brief discussion, we introduce a simple Bayesian framework to quantify, in small sample studies and the rare AE case, (1) the confidence level that the incidence proportion of a particular AE p is over or below a threshold, (2) the lower or upper bounds on p with a certain level of confidence, and (3) the minimum required number of patients with an AE before we can be certain that p surpasses a specific threshold, or the maximum allowable number of patients with an AE after which we can no longer be certain that p is below a certain threshold, given a certain confidence level. The method is easy to understand and implement; the interpretation of the results is intuitive. This article also demonstrates the usefulness of simple Bayesian concepts when it comes to answering practical questions.
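    With a conjugate Beta prior the posterior for the incidence proportion is again Beta, and for integer parameters its CDF has an exact closed form via the binomial-tail identity for the regularized incomplete beta function. The sketch below answers item (1) from the abstract; the uniform Beta(1, 1) prior is our choice for illustration, not necessarily the paper's.

```python
from math import comb

def beta_cdf_int(p, a, b):
    """CDF of Beta(a, b) at p for integer a, b, using the identity
    I_p(a, b) = sum_{j=a}^{a+b-1} C(a+b-1, j) p^j (1-p)^(a+b-1-j)."""
    n = a + b - 1
    return sum(comb(n, j) * p ** j * (1 - p) ** (n - j) for j in range(a, n + 1))

def confidence_below_threshold(x, n, threshold, prior=(1, 1)):
    """Posterior probability that the AE incidence proportion p is below
    `threshold`, given x events in n patients and a Beta prior (uniform
    by default)."""
    a, b = prior[0] + x, prior[1] + n - x
    return beta_cdf_int(threshold, a, b)

# 1 adverse event observed in 20 patients: confidence that p < 20%?
print(round(confidence_below_threshold(1, 20, 0.20), 4))  # → 0.9424
```

    The same posterior can be scanned over x to find the maximum allowable number of events that keeps the confidence above a chosen level, which is item (3) in the abstract.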

  4. SU-F-T-64: An Alternative Approach to Determining the Reference Air-Kerma Rate from Extrapolation Chamber Measurements

    International Nuclear Information System (INIS)

    Schneider, T

    2016-01-01

    Purpose: Since 2008 the Physikalisch-Technische Bundesanstalt (PTB) has been offering the calibration of 125I-brachytherapy sources in terms of the reference air-kerma rate (RAKR). The primary standard is a large air-filled parallel-plate extrapolation chamber. The measurement principle is based on the fact that the air-kerma rate is proportional to the increment of ionization per increment of chamber volume at chamber depths greater than x_0, the range of secondary electrons originating from the electrode. Methods: Two methods for deriving the RAKR from the measured ionization charges are: (1) to determine the RAKR from the slope of the linear fit to the so-called ’extrapolation curve’, the measured ionization charges Q vs. plate separations x; or (2) to differentiate Q(x) and to derive the RAKR by a linear extrapolation towards zero plate separation. For both methods, correcting the measured data for all known influencing effects before the evaluation method is applied is a precondition. However, the discrepancy between their results is larger than the uncertainty given for the determination of the RAKR with either method. Results: A new approach to derive the RAKR from the measurements is investigated as an alternative. The method was developed from the ground up, based on radiation transport theory. A conversion factor C(x_1, x_2) is applied to the difference of charges measured at the two plate separations x_1 and x_2. This factor is composed of quotients of three air-kerma values calculated for different plate separations in the chamber: the air kerma Ka(0) for plate separation zero, and the mean air kermas at the plate separations x_1 and x_2, respectively. The RAKR determined with method (1) yields 4.877 µGy/h, and with method (2) 4.596 µGy/h. The application of the alternative approach results in 4.810 µGy/h. Conclusion: The alternative method shall be established in the future.
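    The two evaluation methods can be mimicked on synthetic data. This sketch uses invented plate separations and charges (and none of the required influence corrections); it shows that the two methods coincide exactly when Q(x) is perfectly linear, so the reported discrepancy must come from real, non-linear chamber effects.

```python
def slope_fit(xs, qs):
    """Method (1): slope of the linear fit to the extrapolation curve Q(x)."""
    n = len(xs)
    mx, mq = sum(xs) / n, sum(qs) / n
    return (sum((x - mx) * (q - mq) for x, q in zip(xs, qs))
            / sum((x - mx) ** 2 for x in xs))

def derivative_extrapolated_to_zero(xs, qs):
    """Method (2): finite-difference dQ/dx at interval midpoints, then a
    linear extrapolation of those derivatives back to plate separation zero."""
    pts = list(zip(xs, qs))
    mids = [(x1 + x2) / 2 for (x1, _), (x2, _) in zip(pts, pts[1:])]
    ders = [(q2 - q1) / (x2 - x1) for (x1, q1), (x2, q2) in zip(pts, pts[1:])]
    n = len(mids)
    mm, md = sum(mids) / n, sum(ders) / n
    b = (sum((m - mm) * (d - md) for m, d in zip(mids, ders))
         / sum((m - mm) ** 2 for m in mids))
    return md - b * mm  # intercept of the derivative at x = 0

# Invented, perfectly linear chamber response beyond x0
xs = [2.0, 4.0, 6.0, 8.0, 10.0]   # plate separations (mm), illustrative
qs = [3.0 + 1.5 * x for x in xs]  # Q grows linearly with x
print(round(slope_fit(xs, qs), 6), round(derivative_extrapolated_to_zero(xs, qs), 6))  # → 1.5 1.5
```

    The RAKR is then obtained from this slope (charge per unit plate separation) via the chamber's measuring-volume geometry and the applicable correction factors.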

  5. Simultaneous determination of macro and trace elements in biological reference materials by microwave induced plasma optical emission spectrometry with slurry sample introduction

    International Nuclear Information System (INIS)

    Matusiewicz, Henryk; Golik, Bartosz

    2004-01-01

    A slurry sampling technique (SST) has been utilized for simultaneous multi-element analysis by microwave-induced plasma optical emission spectrometry (MIP-OES). Slurry samples from a spray chamber are fed directly into the microwave cavity-torch assembly (power 300 W) with no desolvation apparatus. The performance of SST-MIP-OES was demonstrated by the determination of macro (Na, K, Ca, Mg, P) and trace (Cd, Cu, Mn, Sr, Zn) elements in three biological certified reference materials using a V-groove, clog-free Babington-type nebulizer. Slurry concentrations up to 1% m/v (particles <20 μm), prepared in 10% HNO3 (pH 1.2) containing 0.01% of Triton X-100, were used with calibration by the standard additions method. The method offers relatively good precision (R.S.D. ranged from 7 to 11%), with measured concentrations being in satisfactory agreement with certified values for NRCC TORT-1 (Lobster hepatopancreas), NRCC LUTS-1 (Lobster hepatopancreas) and IAEA-153 (Milk powder). The concentrations of Na, K, Ca, Mg, P and Cd, Cu, Mn, Sr, Zn were determined in the ranges 90-22 000 μg/g and 1-420 μg/g, respectively. The method could be useful as a routine procedure.

  6. Simultaneous determination of macro and trace elements in biological reference materials by microwave induced plasma optical emission spectrometry with slurry sample introduction

    Energy Technology Data Exchange (ETDEWEB)

    Matusiewicz, Henryk E-mail: Henryk.Matusiewicz@put.poznan.pl; Golik, Bartosz

    2004-05-21

    A slurry sampling technique (SST) has been utilized for simultaneous multi-element analysis by microwave-induced plasma optical emission spectrometry (MIP-OES). Slurry samples from a spray chamber are fed directly into the microwave cavity-torch assembly (power 300 W) with no desolvation apparatus. The performance of SST-MIP-OES was demonstrated by the determination of macro (Na, K, Ca, Mg, P) and trace (Cd, Cu, Mn, Sr, Zn) elements in three biological certified reference materials using a V-groove, clog-free Babington-type nebulizer. Slurry concentrations up to 1% m/v (particles <20 μm), prepared in 10% HNO3 (pH 1.2) containing 0.01% of Triton X-100, were used with calibration by the standard additions method. The method offers relatively good precision (R.S.D. ranged from 7 to 11%) with measured concentrations being in satisfactory agreement with certified values for NRCC TORT-1 (Lobster hepatopancreas), NRCC LUTS-1 (Lobster hepatopancreas) and IAEA-153 (Milk powder). The concentrations of Na, K, Ca, Mg, P and Cd, Cu, Mn, Sr, Zn were determined in the range 90-22 000 μg/g and 1-420 μg/g, respectively. The method could be useful as a routine procedure.

  7. The role of grammar teaching: from communicative approaches to the Common European Framework of Reference for Languages

    Directory of Open Access Journals (Sweden)

    José López Rama

    2012-07-01

    Full Text Available In the history of language teaching, the role of grammar has been addressed by a number of linguistic theories and pedagogies and, currently, within the Common European Framework of Reference for Languages (CEF). The way grammar is considered has a decisive influence on pedagogical practices, learning processes and many other areas involved in language teaching. This paper constitutes a revision of how grammar has evolved in the last fifty years, paying special attention to its evolving role in both communicative (CLT) and post-communicative approaches and in the CEF. From this revision, some controversial issues concerning the pedagogic value of teaching grammar will arise as well, such as whether grammar is worth teaching in the classroom or not and how it should be taught. Even though there exists a parallel linguistic framework between CLT and the CEF, some issues still need revision concerning the notion of grammatical competence and its role in language teaching. Historically, the role of grammar in language teaching has been justified and questioned both by linguistic theories and, currently, within the Common European Framework of Reference. The way grammar is viewed has a fundamental influence on teaching methodology, on the production of textbooks and on learning processes, among other areas. This article reviews the role of grammar over the last fifty years, paying special attention to the communicative method, post-communicative methods and the Common European Framework of Reference. In response, it revisits the possible controversy over the very definition of grammar and its value in foreign language teaching.

  8. A novel quantitative approach for eliminating sample-to-sample variation using a hue saturation value analysis program.

    Science.gov (United States)

    Yabusaki, Katsumi; Faits, Tyler; McMullen, Eri; Figueiredo, Jose Luiz; Aikawa, Masanori; Aikawa, Elena

    2014-01-01

    As computing technology and image analysis techniques have advanced, the practice of histology has grown from a purely qualitative method into a highly quantitative one. Current image analysis software is imprecise and prone to wide variation due to common artifacts and histological limitations. To minimize the impact of these artifacts, a more robust method for quantitative image analysis is required. Here we present novel image analysis software, based on the hue-saturation-value (HSV) color space, applicable to a wide variety of histological stains and tissue types. By using hue, saturation, and value variables instead of the more common red, green, and blue variables, our software offers distinct advantages over other commercially available programs. We tested the program by analyzing several common histological stains, performed on tissue sections ranging from 4 µm to 10 µm in thickness, using both the red-green-blue (RGB) and HSV color spaces. We demonstrate that the new software provides a simple method for quantitative analysis of histological sections that is highly robust to variations in section thickness, sectioning artifacts, and stain quality, eliminating sample-to-sample variation.
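The core of such an HSV-based measurement can be sketched in a few lines: convert each pixel to HSV, reject unsaturated or dark pixels (background and artifacts), and count the fraction whose hue falls in a stain-specific band. The hue band and thresholds below are illustrative placeholders, not the parameters of the published software.

```python
import colorsys

def stained_fraction(pixels, hue_band=(0.85, 1.0), min_sat=0.2, min_val=0.15):
    """Fraction of tissue pixels whose hue lies in a stain-specific band.

    pixels: iterable of (r, g, b) tuples, channels in [0, 1].
    Pixels that are too unsaturated or too dark are excluded from the
    denominator, which is what makes a hue-based measure robust to
    section thickness and overall stain intensity.
    """
    lo, hi = hue_band
    stained = total = 0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        if s < min_sat or v < min_val:
            continue  # skip background / faded pixels
        total += 1
        if lo <= h <= hi:
            stained += 1
    return stained / total if total else 0.0
```

For a real image the pixel list would come from an imaging library; only the color-space logic is shown here.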

  9. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    Directory of Open Access Journals (Sweden)

    Ning-Cong Xiao

    2013-12-01

    Full Text Available In this paper, a combination of the maximum entropy method and Bayesian inference is proposed for reliability assessment of deteriorating systems. Owing to various uncertainties, scarce data and incomplete information, system parameters usually cannot be determined precisely. These uncertain parameters can be modeled by fuzzy set theory and Bayesian inference, which have proved useful for deteriorating systems under small sample sizes. The maximum entropy approach can calculate the maximum entropy density function of the uncertain parameters more accurately because it requires no additional information or assumptions. Finally, two optimization models are presented that determine the lower and upper bounds of the system's probability of failure under vague environmental conditions. Two numerical examples demonstrate the proposed method.
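The interplay of a maximum-entropy prior with Bayesian updating can be illustrated with a minimal numerical sketch (not the paper's fuzzy-set formulation): when only the bounds of a failure probability are known, the maximum-entropy prior is uniform on that interval, and limited test data update it through the binomial likelihood.

```python
def posterior_failure_prob(n_tests, n_failures, lo=0.0, hi=1.0, grid=20001):
    """Posterior mean of a failure probability p from limited test data.

    With only bounds [lo, hi] known a priori, the maximum-entropy prior
    is uniform on that interval; the binomial likelihood then updates it.
    Grid-based numerical Bayes, standard library only; an illustrative
    sketch of the idea, not the paper's method.
    """
    step = (hi - lo) / (grid - 1)
    ps = [lo + i * step for i in range(grid)]
    # unnormalised posterior: uniform prior times binomial likelihood
    post = [p ** n_failures * (1 - p) ** (n_tests - n_failures) for p in ps]
    z = sum(post)
    return sum(p * w for p, w in zip(ps, post)) / z
```

With a full [0, 1] interval this reduces to the classical Beta posterior; a tighter prior interval encodes the extra engineering knowledge the paper exploits.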

  10. Are Flow Injection-based Approaches Suitable for Automated Handling of Solid Samples?

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald; Cerdà, Victor

    Flow-based approaches were originally conceived for liquid-phase analysis, implying that constituents in solid samples generally had to be transferred into the liquid state, via appropriate batch pretreatment procedures, prior to analysis. Yet, in recent years, much effort has been focused...... electrolytic or aqueous leaching, on-line dialysis/microdialysis, in-line filtration, and pervaporation-based procedures have been successfully implemented in continuous flow/flow injection systems. In this communication, the new generation of flow analysis, including sequential injection, multicommutated flow.......g., soils, sediments, sludges), and thus, ascertaining the potential mobility, bioavailability and eventual impact of anthropogenic elements on biota [2]. In this context, the principles of sequential injection-microcolumn extraction (SI-MCE) for dynamic fractionation are explained in detail along...

  11. Ethics and law in research with human biological samples: a new approach.

    Science.gov (United States)

    Petrini, Carlo

    2014-01-01

    During the last century a large number of documents (regulations, ethical codes, treatises, declarations, conventions) were published on the subject of ethics and clinical trials, many of them focusing on the protection of research participants. More recently various proposals have been put forward to relax some of the constraints imposed on research by these documents and regulations. It is important to distinguish between risks deriving from direct interventions on human subjects and other types of risk. In Italy the Data Protection Authority has acted in the question of research using previously collected health data and biological samples to simplify the procedures regarding informed consent. The new approach may be of help to other researchers working outside Italy.

  12. Rapid assessment of antimicrobial resistance prevalence using a Lot Quality Assurance sampling approach.

    Science.gov (United States)

    van Leth, Frank; den Heijer, Casper; Beerepoot, Mariëlle; Stobberingh, Ellen; Geerlings, Suzanne; Schultsz, Constance

    2017-04-01

    Increasing antimicrobial resistance (AMR) requires rapid surveillance tools, such as Lot Quality Assurance Sampling (LQAS). LQAS classifies AMR as high or low based on set parameters. We compared classifications with the underlying true AMR prevalence using data on 1335 Escherichia coli isolates from surveys of community-acquired urinary tract infection in women, by assessing operating curves, sensitivity and specificity. Sensitivity and specificity of any set of LQAS parameters were above 99% and between 79% and 90%, respectively. Operating curves showed high concordance of the LQAS classification with true AMR prevalence estimates. LQAS-based AMR surveillance is a feasible approach that provides timely and locally relevant estimates, as well as the information necessary to formulate and evaluate guidelines for empirical treatment.
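An LQAS rule of this kind reduces to a binomial calculation: with sample size n and decision value d, a lot is classified as high-resistance when at least d resistant isolates are found. The sketch below computes sensitivity and specificity for an illustrative parameter set, not the study's actual thresholds.

```python
from math import comb

def prob_classify_high(n, d, prev):
    """P(at least d resistant isolates in a sample of n | true prevalence prev)."""
    return sum(comb(n, k) * prev ** k * (1 - prev) ** (n - k)
               for k in range(d, n + 1))

def operating_point(n, d, low_prev, high_prev):
    """Sensitivity/specificity of an LQAS rule (n, d) for the hypothesis pair
    'prevalence <= low_prev' vs 'prevalence >= high_prev'.
    The parameter values used in examples are assumptions for illustration.
    """
    sensitivity = prob_classify_high(n, d, high_prev)      # detect high AMR
    specificity = 1 - prob_classify_high(n, d, low_prev)   # accept low AMR
    return sensitivity, specificity
```

Sweeping `prev` over [0, 1] in `prob_classify_high` traces the operating curve the abstract refers to.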

  13. A Bayesian approach to assess data from radionuclide activity analyses in environmental samples

    International Nuclear Information System (INIS)

    Barrera, Manuel; Lourdes Romero, M.; Nunez-Lagos, Rafael; Bernardo, Jose M.

    2007-01-01

    A Bayesian statistical approach is introduced to assess experimental data from the analyses of radionuclide activity concentration in environmental samples (low activities). A theoretical model has been developed that allows the use of known prior information about the value of the measurand (activity), together with the experimental value determined through the measurement. The model has been applied to data from the Inter-laboratory Proficiency Test organised periodically among the Spanish environmental radioactivity laboratories that produce the radiochemical results for the Spanish radioactive monitoring network. A global improvement in the laboratories' performance is obtained when this prior information is taken into account. The prior information used in this methodology is an interval within which the activity is known to be contained, but the approach could be extended to any other experimental quantity with a different type of prior information available.

  14. A new approach for the determination of sulphur in food samples by high-resolution continuum source flame atomic absorption spectrometer.

    Science.gov (United States)

    Ozbek, N; Baysal, A

    2015-02-01

    A new approach for the determination of sulphur in foods was developed, and the sulphur concentrations of various fresh and dried food samples were determined using a high-resolution continuum source flame atomic absorption spectrometer with an air/acetylene flame. The proposed method was optimised and then validated using standard reference materials, and certified values were found to lie within the 95% confidence interval. The sulphur content of the foods ranged from below the LOD to 1.5 mg g(-1). The method is accurate, fast, simple and sensitive. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. A novel four-dimensional analytical approach for analysis of complex samples.

    Science.gov (United States)

    Stephan, Susanne; Jakob, Cornelia; Hippler, Jörg; Schmitz, Oliver J

    2016-05-01

    A two-dimensional LC (2D-LC) method, based on the work of Erni and Frei in 1978, was developed and coupled to an ion mobility-high-resolution mass spectrometer (IM-MS), which enabled the separation of complex samples in four dimensions (2D-LC, ion mobility spectrometry (IMS), and mass spectrometry (MS)). This approach works as a continuous multi-heart-cutting LC system: using a long modulation time of 4 min allows the complete transfer of most of the first-dimension peaks to the second-dimension column without fractionation, in contrast to comprehensive two-dimensional liquid chromatography. Hence, each compound delivers only one peak in the second dimension, which simplifies the data handling even when ion mobility spectrometry as a third and mass spectrometry as a fourth dimension are introduced. The analysis of a plant extract from Ginkgo biloba shows the separation power of this four-dimensional separation method, with a calculated total peak capacity of more than 8700. Furthermore, the advantage of ion mobility for characterizing unknown compounds by their collision cross section (CCS) and accurate mass in a non-target approach is shown for different matrices like plant extracts and coffee. Graphical abstract: Principle of the four-dimensional separation.

  16. A non-iterative sampling approach using noise subspace projection for EIT

    International Nuclear Information System (INIS)

    Bellis, Cédric; Constantinescu, Andrei; Coquet, Thomas; Jaravel, Thomas; Lechleiter, Armin

    2012-01-01

    This study concerns the problem of the reconstruction of inclusions embedded in a conductive medium in the context of electrical impedance tomography (EIT), which is investigated within the framework of a non-iterative sampling approach. This type of identification strategy relies on the construction of a special indicator function that takes, roughly speaking, small values outside the inclusion and large values inside. Such a function is constructed in this paper from the projection of a fundamental singular solution onto the space spanned by the singular vectors associated with some of the smallest singular values of the data-to-measurement operator. The behavior of the novel indicator function is analyzed. For a subsequent implementation in a discrete setting, the quality of classical finite-dimensional approximations of the measurement operator is discussed. The robustness of this approach is also analyzed when only noisy spectral information is available. Finally, this identification method is implemented numerically and experimentally, and its efficiency is discussed on a set of, partly experimental, examples. (paper)

  17. Assessment of Sampling Approaches for Remote Sensing Image Classification in the Iranian Playa Margins

    Science.gov (United States)

    Kazem Alavipanah, Seyed

    There are several problems in soil salinity studies based upon remotely sensed data: 1) the spectral world is full of ambiguity, and therefore soil reflectance cannot be attributed to a single soil property such as salinity; 2) soil surface condition as a function of time and space is a complex phenomenon; 3) vegetation, with its dynamic biological nature, may create problems in the study of soil salinity. Given these problems, the first question that arises is how to overcome or minimise them. In this study we hypothesised that different sources of data, a well-established sampling plan and an optimum approach could be useful. In order to choose representative training sites in the Iranian playa margins, to define the spectral and informational classes, and to overcome some problems encountered in the variation within the field, the following attempts were made: 1) principal component analysis (PCA), in order a) to determine the most important variables and b) to understand the Landsat satellite images and the most informative components; 2) photomorphic unit (PMU) consideration and interpretation; 3) study of salt accumulation and salt distribution in the soil profile; 4) use of several forms of field data, such as geologic, geomorphologic and soil information; 5) confirmation of field data and land cover types with farmers and the members of the team. The results led us to suitable approaches with high and acceptable image classification accuracy and image interpretation. KEY WORDS: Photomorphic Unit, Principal Component Analysis, Soil Salinity, Field Work, Remote Sensing

  18. Normative reference values for the 20 m shuttle‐run test in a population‐based sample of school‐aged youth in Bogota, Colombia: the FUPRECOL study

    Science.gov (United States)

    Palacios‐López, Adalberto; Humberto Prieto‐Benavides, Daniel; Enrique Correa‐Bautista, Jorge; Izquierdo, Mikel; Alonso‐Martínez, Alicia; Lobelo, Felipe

    2016-01-01

    Abstract Objectives Our aim was to determine the normative reference values of cardiorespiratory fitness (CRF) and to establish the proportion of subjects with low CRF suggestive of future cardio‐metabolic risk. Methods A total of 7244 children and adolescents attending public schools in Bogota, Colombia (55.7% girls; age range of 9–17.9 years) participated in this study. We expressed CRF performance as the nearest stage (minute) completed and the estimated peak oxygen consumption (V˙O2peak). Smoothed percentile curves were calculated. In addition, we present the prevalence of low CRF after applying a correction factor to account for the impact of Bogota's altitude (2625 m over sea level) on CRF assessment, and we calculated the number of participants who fell below health‐related FITNESSGRAM cut‐points for low CRF. Results Shuttles and V˙O2peak were higher in boys than in girls in all age groups. In boys, there were higher levels of performance with increasing age, with most gains between the ages of 13 and 17. The proportion of subjects with a low CRF, suggestive of future cardio‐metabolic risk (health risk FITNESSGRAM category) was 31.5% (28.2% for boys and 34.1% for girls; X2 P = .001). After applying a 1.11 altitude correction factor, the overall prevalence of low CRF was 11.5% (9.6% for boys and 13.1% for girls; X2 P = .001). Conclusions Our results provide sex‐ and age‐specific normative reference standards for the 20 m shuttle‐run test and estimated V˙O2peak values in a large, population‐based sample of schoolchildren from a large Latin‐American city at high altitude. PMID:27500986
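The altitude adjustment described above amounts to scaling the estimated V̇O2peak before applying a cut-point. A minimal sketch follows; the cut-point values are hypothetical placeholders (actual FITNESSGRAM standards vary by age as well as sex), while the 1.11 factor is the one reported for Bogota's altitude.

```python
def low_crf(vo2peak_ml_kg_min, sex, altitude_factor=1.11, cutpoints=None):
    """Flag low cardiorespiratory fitness after an altitude correction.

    Multiplies estimated VO2peak by a correction factor (1.11 in the study,
    for tests run at 2625 m) before comparing against a sex-specific
    cut-point.  The cut-points below are assumed for illustration only.
    """
    if cutpoints is None:
        cutpoints = {"boy": 42.0, "girl": 38.0}  # hypothetical values
    corrected = vo2peak_ml_kg_min * altitude_factor
    return corrected < cutpoints[sex]
```

Scaling V̇O2peak upward before classification is what drives the drop in low-CRF prevalence from 31.5% to 11.5% reported in the abstract.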

  19. A novel variable selection approach that iteratively optimizes variable space using weighted binary matrix sampling.

    Science.gov (United States)

    Deng, Bai-chuan; Yun, Yong-huan; Liang, Yi-zeng; Yi, Lun-zhao

    2014-10-07

    In this study, a new optimization algorithm called the Variable Iterative Space Shrinkage Approach (VISSA) that is based on the idea of model population analysis (MPA) is proposed for variable selection. Unlike most of the existing optimization methods for variable selection, VISSA statistically evaluates the performance of variable space in each step of optimization. Weighted binary matrix sampling (WBMS) is proposed to generate sub-models that span the variable subspace. Two rules are highlighted during the optimization procedure. First, the variable space shrinks in each step. Second, the new variable space outperforms the previous one. The second rule, which is rarely satisfied in most of the existing methods, is the core of the VISSA strategy. Compared with some promising variable selection methods such as competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MCUVE) and iteratively retaining informative variables (IRIV), VISSA showed better prediction ability for the calibration of NIR data. In addition, VISSA is user-friendly; only a few insensitive parameters are needed, and the program terminates automatically without any additional conditions. The Matlab codes for implementing VISSA are freely available on the website: https://sourceforge.net/projects/multivariateanalysis/files/VISSA/.
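The weighted binary matrix sampling step that generates the sub-models can be sketched as follows. Only the sampling step is shown; the model-evaluation and weight-update steps that make the variable space shrink in VISSA are omitted, and the interface is an assumption, not taken from the released Matlab code.

```python
import random

def wbms(weights, n_submodels, rng=None):
    """Weighted binary matrix sampling: draw sub-models (rows) in which
    variable j is included with probability weights[j].

    In VISSA the weights are re-estimated from the best-performing
    sub-models each iteration, so well-performing variables approach
    weight 1 and poor ones approach 0, shrinking the variable space.
    """
    rng = rng or random.Random(0)
    return [[1 if rng.random() < w else 0 for w in weights]
            for _ in range(n_submodels)]
```

Each row of the returned matrix selects a variable subset on which a candidate model would be calibrated and scored.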

  20. A new insert sample approach to paper spray mass spectrometry: a paper substrate with paraffin barriers.

    Science.gov (United States)

    Colletes, T C; Garcia, P T; Campanha, R B; Abdelnur, P V; Romão, W; Coltro, W K T; Vaz, B G

    2016-03-07

    The analytical performance of paper spray (PS) using a new insert sample approach based on paper with paraffin barriers (PS-PB) is presented. The paraffin barrier is made using a simple, fast and cheap method based on the stamping of paraffin onto a paper surface. Typical operating conditions of paper spray, such as the solvent volume applied on the paper surface and the paper substrate type, are evaluated. A paper substrate with paraffin barriers shows better performance in the analysis of a range of typical analytes when compared to conventional PS-MS using normal paper (PS-NP) and PS-MS using paper with two rounded corners (PS-RC). PS-PB was applied to detect sugars and their inhibitors in sugarcane bagasse liquors from a second-generation ethanol process. Moreover, PS-PB proved excellent for the quantification of glucose in hydrolysis liquors, with excellent linearity (R(2) = 0.99) and limits of detection (2.77 mmol L(-1)) and quantification (9.27 mmol L(-1)). These results are better than those for PS-NP and PS-RC. PS-PB also performed well compared with an HPLC-UV method for glucose quantification in hydrolysis liquor samples.
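The calibration-curve arithmetic behind such a quantification is ordinary least squares. A self-contained sketch; the paper's exact LOD/LOQ procedure is not reproduced here.

```python
def calibrate(conc, signal):
    """Ordinary least-squares calibration line for quantification.

    Returns slope, intercept and R^2 for a signal-vs-concentration fit.
    Given the residual standard deviation s, LOD and LOQ are commonly
    estimated as 3.3*s/slope and 10*s/slope (an IUPAC-style
    approximation; the paper's procedure may differ).
    """
    n = len(conc)
    mx = sum(conc) / n
    my = sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(conc, signal))
    ss_tot = sum((y - my) ** 2 for y in signal)
    return slope, intercept, 1 - ss_res / ss_tot
```

An unknown's concentration is then recovered as `(signal - intercept) / slope`.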

  1. Methodologies for the Extraction of Phenolic Compounds from Environmental Samples: New Approaches

    Directory of Open Access Journals (Sweden)

    Cristina Mahugo Santana

    2009-01-01

    Full Text Available Phenolic derivatives are among the most important contaminants present in the environment. These compounds are used in several industrial processes to manufacture chemicals such as pesticides, explosives, drugs and dyes. They are also used in the bleaching process of paper manufacturing. Apart from these sources, phenolic compounds have substantial applications in agriculture as herbicides, insecticides and fungicides. However, phenolic compounds are not only generated by human activity; they are also formed naturally, e.g., during the decomposition of leaves or wood. As a result of these applications, they are found in soils and sediments, and this often leads to wastewater and ground water contamination. Owing to their high toxicity and persistence in the environment, both the US Environmental Protection Agency (EPA) and the European Union have included some of them in their lists of priority pollutants. Current standard methods for the analysis of phenolic compounds in water samples are based on liquid-liquid extraction (LLE), while Soxhlet extraction is the most used technique for isolating phenols from solid matrices. However, these techniques require extensive cleanup procedures that are time-intensive and involve expensive and hazardous organic solvents, which are undesirable for health and disposal reasons. In recent years, the use of new methodologies such as solid-phase extraction (SPE) and solid-phase microextraction (SPME) has increased for the extraction of phenolic compounds from liquid samples. In the case of solid samples, microwave-assisted extraction (MAE) has been demonstrated to be an efficient technique for the extraction of these compounds. In this work we review the methods developed for the extraction and determination of phenolic derivatives in different types of environmental matrices such as water, sediments and soils. Moreover, we present the new approach of using micellar media coupled with the SPME process for the

  2. THE ROLE OF GRAMMAR TEACHING: FROM COMMUNICATIVE APPROACHES TO THE COMMON EUROPEAN FRAMEWORK OF REFERENCE FOR LANGUAGES

    Directory of Open Access Journals (Sweden)

    Gloria Luque Agulló

    2012-07-01

    Full Text Available

    In the history of language teaching, the role of grammar has been addressed by a number of linguistic theories, pedagogies and, currently, within the Common European Framework of Reference for Languages (CEF). The way grammar is considered has a decisive influence on pedagogical practices, learning processes and many other areas involved in language teaching. This paper reviews how grammar has evolved over the last fifty years, paying special attention to its changing role in communicative (CLT) and post-communicative approaches and in the CEF. From this revision, some controversial issues concerning the pedagogic value of teaching grammar arise as well, such as whether grammar is worth teaching in the classroom at all and how it should be taught. Even though CLT and the CEF share a parallel linguistic framework, some issues still need revision concerning the notion of grammatical competence and its role in language teaching.

    [Spanish abstract, translated:] Historically, the role of grammar in language teaching has been both justified and questioned by linguistic theories and, currently, within the Common European Framework of Reference. The way grammar is viewed has a fundamental influence on teaching methodology, on the design of textbooks and on learning processes, among other areas. This article reviews the role of grammar over the last fifty years, paying special attention to the communicative method, post-communicative methods and the Common European Framework of Reference.

  3. A method of camera calibration in the measurement process with reference mark for approaching observation space target

    Science.gov (United States)

    Zhang, Hua; Zeng, Luan

    2017-11-01

    Binocular stereoscopic vision can be used for space-based near observation of space targets. To solve the problem that a traditional binocular vision system cannot work normally after interference, an online calibration method for a binocular stereo measuring camera with self-reference is proposed. The method uses an auxiliary optical imaging device to insert the image of a standard reference object into the edge of the main optical path, imaging it on the same focal plane as the target, which is equivalent to placing a standard reference inside the binocular imaging optical system. When the position of the system and the imaging device parameters are disturbed, the image of the standard reference changes accordingly in the imaging plane, while the position of the standard reference object itself does not change. The camera's external parameters can then be re-calibrated from the visual relationship to the standard reference object. The experimental results show that the maximum mean square error for the same object can be reduced from 72.88 mm to 1.65 mm when the right camera is deflected by 0.4° and the left camera is rotated by 0.2° in elevation. This method realises online calibration of a binocular stereoscopic vision measurement system and can effectively improve the anti-jamming ability of the system.

  4. Study of the Prevalence of Causative Bacterial & Protozoal Agents in Stool Samples of 470 Gastroenteritis Patients Referring to the Nikoopour Clinic in Yazd, Iran

    Directory of Open Access Journals (Sweden)

    MR Sharifi

    2004-04-01

    Full Text Available Introduction: Gastroenteritis is a problem worth consideration all over the world. It is an important cause of mortality, especially in children under 5 years of age, in developing countries including Iran. The aim of this descriptive study was to determine the demographic conditions influencing the presence of causative bacteria and protozoa, followed by antibiograms of the bacteria isolated from stool samples of patients with gastroenteritis referring to the Nikoopour Clinic in the city of Yazd, Iran, from 1998 to 2001. Materials and methods: A total of 470 samples were examined microbiologically by direct method and culture, and then antibiogrammed. Differential and selective media were used to isolate the bacteria, and the wet-mount technique was applied for the detection of protozoa. Results: 272 samples (57.9% were infected by pathogenic bacteria or protozoa; 138 (50.8% of the pathogenic specimens were from male patients and the remaining 134 (49.3% were from female patients. Isolated species were: enteropathogenic E. coli, 117 (43%; Shigella, 51 (18.8%; Salmonella enteritidis, 25 (9.2%; C. jejuni, 16 (5.9%; Giardia lamblia, 51 (18.8%; and Amoeba spp., 12 (4.4%. The most commonly detected Shigella species was dysenteriae (74.5%, while boydii, at 2%, was the least common type observed in the specimens. Except for Shigella, all the other pathogens were more common in males than in females, though the difference was not statistically significant. To determine the sensitivity and/or resistance of the pathogenic bacteria, antibiogram testing was performed using selected antibiotic disks such as ampicillin, nalidixic acid, ciprofloxacin, gentamycin and sulfamethoxazole. Conclusion: The results revealed that some patients were probably infected by pathogenic factors other than bacteria or protozoa. Since viruses and parasites are almost all resistant to antibiotics, administration of antibiotics may lead to resistance of bacterial agents

  5. Geologic Mapping and Paired Geochemical-Paleomagnetic Sampling of Reference Sections in the Grande Ronde Basalt: An Example from the Bingen Section, Columbia River Gorge, Washington

    Science.gov (United States)

    Sawlan, M.; Hagstrum, J. T.; Wells, R. E.

    2011-12-01

    We have completed comprehensive geochemical (GC) and paleomagnetic (PM) sampling of individual lava flows from eight reference stratigraphic sections in the Grande Ronde Basalt (GRB), Columbia River Basalt Group [Hagstrum et al., 2009, GSA Ann. Mtg, Portland (abst); Hagstrum et al., 2010, AGU Fall Mtg, San Francisco (abst)]. These sections, distributed across the Columbia Plateau and eastern Columbia River Gorge, contain as many as 30 flows, are up to 670 m thick, span upper magneto-stratigraphic zones R2 and N2, and, in some locations, also contain one or more N1 flows. In concert with GC and PM sampling, we have carried out detailed geologic mapping of these sections, typically at a scale of 1:3,000 to 1:5,000, using GPS, digital imagery from the National Aerial Imagery Program (NAIP), and compilation in GIS. GRB member and informal unit names of Reidel et al. [1989, GSA Sp. Paper 239] generally have been adopted, although two new units are identified and named within the N2 zone. Notably, a distinctive PM direction for intercalated lavas of several lower N2 units indicates coeval eruption of compositionally distinct units; this result contrasts with the scenario of serial stratigraphic succession of GRB units proposed by Reidel et al. [1989]. Our objectives in the mapping include: confirming the integrity of the stratigraphic sequences by documenting flow contacts and intraflow horizons (changes in joint patterns or vesicularity); assessing fault displacements; and establishing precisely located samples in geologic context such that selected sites can be unambiguously reoccupied. A geologic map and GC-PM data for the Bingen section, along the north side of the Columbia River, are presented as an example of our GRB reference section mapping and sampling. One of our thicker sections (670 m), along which 30 flows are mapped, the Bingen section spans 7 km along WA State Hwy 14, from near the Hood River Bridge ESE to Locke Lake. This section cuts obliquely through a

  6. State of the art of environmentally friendly sample preparation approaches for determination of PBDEs and metabolites in environmental and biological samples: A critical review.

    Science.gov (United States)

    Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C

    2016-01-28

    Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for determination of organic persistent pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower the solvent consumption and accelerate the extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also commented in this work, even though these are scarcely used for determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent assisted extraction techniques are preferred for leaching of PBDEs, and liquid phase microextraction techniques are mostly used for liquid samples. Likewise, green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Conclusions about children’s reporting accuracy for energy and macronutrients over multiple interviews depend on the analytic approach for comparing reported information to reference information

    Science.gov (United States)

    Baxter, Suzanne Domel; Smith, Albert F.; Hardin, James W.; Nichols, Michele D.

    2008-01-01

    Objective Validation-study data are used to illustrate that conclusions about children’s reporting accuracy for energy and macronutrients over multiple interviews (ie, time) depend on the analytic approach for comparing reported and reference information: conventional, which disregards accuracy of reported items and amounts, or reporting-error-sensitive, which classifies reported items as matches (eaten) or intrusions (not eaten), and amounts as corresponding or overreported. Subjects and design Children were observed eating school meals on one day (n = 12), or two (n = 13) or three (n = 79) nonconsecutive days separated by ≥25 days, and interviewed in the morning after each observation day about intake the previous day. Reference (observed) and reported information were transformed to energy and macronutrients (protein, carbohydrate, fat), and compared. Main outcome measures For energy and each macronutrient: report rates (reported/reference), correspondence rates (genuine accuracy measures), inflation ratios (error measures). Statistical analyses Mixed-model analyses. Results Using the conventional approach for analyzing energy and macronutrients, report rates did not vary systematically over interviews (Ps > .61). Using the reporting-error-sensitive approach, correspondence rates increased over interviews. Conclusions Reporting accuracy for energy and macronutrients improved over time, but the conventional approach masked improvements and overestimated accuracy. Applications The reporting-error-sensitive approach is recommended when analyzing data from validation studies of dietary reporting accuracy for energy and macronutrients. PMID:17383265

  8. Conclusions about children's reporting accuracy for energy and macronutrients over multiple interviews depend on the analytic approach for comparing reported information to reference information.

    Science.gov (United States)

    Baxter, Suzanne Domel; Smith, Albert F; Hardin, James W; Nichols, Michele D

    2007-04-01

    Validation study data are used to illustrate that conclusions about children's reporting accuracy for energy and macronutrients over multiple interviews (ie, time) depend on the analytic approach for comparing reported and reference information: conventional, which disregards accuracy of reported items and amounts, or reporting-error-sensitive, which classifies reported items as matches (eaten) or intrusions (not eaten), and amounts as corresponding or overreported. Children were observed eating school meals on 1 day (n=12), or 2 (n=13) or 3 (n=79) nonconsecutive days separated by >or=25 days, and interviewed in the morning after each observation day about intake the previous day. Reference (observed) and reported information were transformed to energy and macronutrients (ie, protein, carbohydrate, and fat), and compared. For energy and each macronutrient: report rates (reported/reference), correspondence rates (genuine accuracy measures), and inflation ratios (error measures). Mixed-model analyses. Using the conventional approach for analyzing energy and macronutrients, report rates did not vary systematically over interviews (all four P values >0.61). Using the reporting-error-sensitive approach, correspondence rates increased over interviews. Reporting accuracy for energy and macronutrients improved over time, but the conventional approach masked improvements and overestimated accuracy. The reporting-error-sensitive approach is recommended when analyzing data from validation studies of dietary reporting accuracy for energy and macronutrients.
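    The contrast between the two analytic approaches can be sketched numerically. The functions and meal data below are a hypothetical illustration of the idea (a conventional ratio of totals versus an accuracy-sensitive rate that credits only matched, non-overreported energy), not the study's actual scoring rules:

```python
def report_rate(reported_total, reference_total):
    # Conventional metric: ratio of totals, ignoring which items match.
    return reported_total / reference_total

def correspondence_rate(items, reference_total):
    # Accuracy-sensitive metric: only energy from matched (actually eaten)
    # items counts, capped at the observed amount; overreported excess,
    # omissions, and intrusions earn no credit.
    matched = sum(min(rep, ref) for rep, ref, eaten in items if eaten and rep > 0)
    return matched / reference_total

# Each tuple: (reported kcal, observed kcal, item actually eaten?)
meal = [(250, 200, True),   # match, overreported by 50 kcal
        (0,   150, True),   # omission: eaten but not reported
        (120,   0, False)]  # intrusion: reported but not eaten

reference_total = sum(ref for _, ref, _ in meal)   # 350 kcal observed
reported_total = sum(rep for rep, _, _ in meal)    # 370 kcal reported
print(round(report_rate(reported_total, reference_total), 2))   # 1.06
print(round(correspondence_rate(meal, reference_total), 2))     # 0.57
```

    The conventional rate near 1.0 suggests accurate reporting even though this hypothetical report contains an omission and an intrusion, which is exactly the masking effect the abstract describes.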

  9. A coarse-graining approach for molecular simulation that retains the dynamics of the all-atom reference system by implementing hydrodynamic interactions

    Energy Technology Data Exchange (ETDEWEB)

    Markutsya, Sergiy [Ames Laboratory, Iowa State University, Ames, Iowa 50011 (United States); Lamm, Monica H., E-mail: mhlamm@iastate.edu [Ames Laboratory, Iowa State University, Ames, Iowa 50011 (United States); Department of Chemical and Biological Engineering, Iowa State University, Ames, Iowa 50011 (United States)

    2014-11-07

    We report on a new approach for deriving coarse-grained intermolecular forces that retains the frictional contribution that is often discarded by conventional coarse-graining methods. The approach is tested for water and an aqueous glucose solution, and the results from the new implementation for coarse-grained molecular dynamics simulation show remarkable agreement with the dynamics obtained from reference all-atom simulations. The agreement between the structural properties observed in the coarse-grained and all-atom simulations is also preserved. We discuss how this approach may be applied broadly to any existing coarse-graining method where the coarse-grained models are rigorously derived from all-atom reference systems.

  10. A coarse-graining approach for molecular simulation that retains the dynamics of the all-atom reference system by implementing hydrodynamic interactions

    International Nuclear Information System (INIS)

    Markutsya, Sergiy; Lamm, Monica H.

    2014-01-01

    We report on a new approach for deriving coarse-grained intermolecular forces that retains the frictional contribution that is often discarded by conventional coarse-graining methods. The approach is tested for water and an aqueous glucose solution, and the results from the new implementation for coarse-grained molecular dynamics simulation show remarkable agreement with the dynamics obtained from reference all-atom simulations. The agreement between the structural properties observed in the coarse-grained and all-atom simulations is also preserved. We discuss how this approach may be applied broadly to any existing coarse-graining method where the coarse-grained models are rigorously derived from all-atom reference systems.

  11. Development of a new certified reference material of diosgenin using mass balance approach and Coulometric titration method.

    Science.gov (United States)

    Gong, Ningbo; Zhang, Baoxi; Hu, Fan; Du, Hui; Du, Guanhua; Gao, Zhaolin; Lu, Yang

    2014-12-01

    Certified reference materials (CRMs) can be used as a valuable tool to validate the trueness of measurement methods and to establish metrological traceability of analytical results. Diosgenin has been selected as a candidate reference material. Characterization of the material relied on two different methods: the mass balance method and the Coulometric titration (CT) method. The certified value of the diosgenin CRM is 99.80% with an expanded uncertainty of 0.37% (k=2). The new CRM of diosgenin can be used to validate analytical methods, improve the accuracy of measurement data, and control the quality of diosgenin in relevant pharmaceutical formulations. Copyright © 2014 Elsevier Inc. All rights reserved.
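    As a generic illustration of how an expanded uncertainty with a coverage factor k = 2 is reported for a CRM, the sketch below combines hypothetical uncertainty components in quadrature; the component names and values are illustrative, not those of the diosgenin CRM:

```python
import math

# Illustrative relative standard uncertainty components (%), combined in
# quadrature per the usual GUM approach; values are NOT from the study.
components = {
    "characterization": 0.12,  # between-method (e.g. mass balance vs. CT)
    "homogeneity": 0.08,       # between-bottle variation
    "stability": 0.10,         # long-term and transport stability
}
u_c = math.sqrt(sum(u ** 2 for u in components.values()))  # combined uncertainty
U = 2 * u_c  # expanded uncertainty, coverage factor k = 2 (~95 % coverage)
print(f"u_c = {u_c:.3f} %, U = {U:.2f} %")
```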

  12. Technical note: A simple approach for efficient collection of field reference data for calibrating remote sensing mapping of northern wetlands

    Directory of Open Access Journals (Sweden)

    M. Gålfalk

    2018-03-01

    The calibration and validation of remote sensing land cover products are highly dependent on accurate field reference data, which are costly and practically challenging to collect. We describe an optical method for collection of field reference data that is a fast, cost-efficient, and robust alternative to field surveys and UAV imaging. A lightweight, waterproof, remote-controlled RGB camera (GoPro HERO4 Silver, GoPro Inc.) was used to take wide-angle images from 3.1 to 4.5 m in altitude using an extendable monopod, as well as representative near-ground (< 1 m) images to identify spectral and structural features that correspond to various land covers in present lighting conditions. A semi-automatic classification was made based on six surface types (graminoids, water, shrubs, dry moss, wet moss, and rock). The method enables collection of detailed field reference data, which is critical in many remote sensing applications, such as satellite-based wetland mapping. The method uses common non-expensive equipment, does not require special skills or training, and is facilitated by a step-by-step manual that is included in the Supplement. Over time a global ground cover database can be built that can be used as reference data for studies of non-forested wetlands from satellites such as Sentinel 1 and 2 (10 m pixel size).

  13. A NEW APPROACH FOR CULTURING LEMNA MINOR (DUCKWEED) AND STANDARDIZED METHOD FOR USING ATRAZINE AS A REFERENCE TOXICANT

    Science.gov (United States)

    Lemna minor (Duckweed) is commonly used in aquatic toxicity investigations. Methods for culturing and testing with reference toxicants, such as atrazine, are somewhat variable among researchers. Our goal was to develop standardized methods of culturing and testing for use with L....

  14. The Council of Europe's "Common European Framework of Reference for Languages" (CEFR): Approach, Status, Function and Use

    Science.gov (United States)

    Martyniuk, Waldemar

    2012-01-01

    The Council of Europe's "Common European Framework of Reference for Languages" is rapidly becoming a powerful instrument for shaping language education policies in Europe and beyond. The task of relating language policies, language curricula, teacher education and training, textbook and course design and content, examinations and…

  15. A simulation approach to assessing sampling strategies for insect pests: an example with the balsam gall midge.

    Directory of Open Access Journals (Sweden)

    R Drew Carleton

    Estimation of pest density is a basic requirement for integrated pest management in agriculture and forestry, and efficiency in density estimation is a common goal. Sequential sampling techniques promise efficient sampling, but their application can involve cumbersome mathematics and/or intensive warm-up sampling when pests have complex within- or between-site distributions. We provide tools for assessing the efficiency of sequential sampling and of alternative, simpler sampling plans, using computer simulation with "pre-sampling" data. We illustrate our approach using data for balsam gall midge (Paradiplosis tumifex) attack in Christmas tree farms. Paradiplosis tumifex proved recalcitrant to sequential sampling techniques. Midge distributions could not be fit by a common negative binomial distribution across sites. Local parameterization, using warm-up samples to estimate the clumping parameter k for each site, performed poorly: k estimates were unreliable even for samples of n ∼ 100 trees. These methods were further confounded by significant within-site spatial autocorrelation. Much simpler sampling schemes, involving random or belt-transect sampling to preset sample sizes, were effective and efficient for P. tumifex. Sampling via belt transects (through the longest dimension of a stand) was the most efficient, with sample means converging on true mean density for sample sizes of n ∼ 25-40 trees. Pre-sampling and simulation techniques provide a simple method for assessing sampling strategies for estimating insect infestation. We suspect that many pests will resemble P. tumifex in challenging the assumptions of sequential sampling methods. Our software will allow practitioners to optimize sampling strategies before they are brought to real-world applications, while potentially avoiding the need for the cumbersome calculations required for sequential sampling methods.
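    The pre-sampling simulation idea can be illustrated with a minimal sketch: generate a clumped (negative binomial, via a gamma-Poisson mixture) stand of per-tree gall counts, then measure how the error of the sample mean shrinks as the preset sample size grows. All parameter values below are illustrative assumptions, not those of the study:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_stand(n_trees=500, mean=3.0, k=0.8):
    # Hypothetical stand: per-tree gall counts drawn from a negative
    # binomial (gamma-Poisson mixture) with clumping parameter k.
    lam = rng.gamma(shape=k, scale=mean / k, size=n_trees)
    return rng.poisson(lam)

def mean_error_at_n(counts, n, n_draws=2000):
    # Average absolute error of the sample mean over repeated random
    # samples of n trees, relative to the stand's true mean density.
    true_mean = counts.mean()
    errs = [abs(rng.choice(counts, size=n, replace=False).mean() - true_mean)
            for _ in range(n_draws)]
    return float(np.mean(errs))

stand = simulate_stand()
for n in (10, 25, 40):
    print(n, round(mean_error_at_n(stand, n), 3))  # error shrinks as n grows
```

    Running this kind of simulation on real pre-sampling data is what lets a practitioner pick the smallest preset sample size whose error is acceptable, without the warm-up sampling that sequential plans require.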

  16. Analysis of positive control STR experiments reveals that results obtained for FGA, D3S1358, and D13S317 condition the success rate of the analysis of routine reference samples.

    Science.gov (United States)

    Murigneux, Valentine; Dufour, Anne-Béatrice; Lobry, Jean R; Pène, Laurent

    2014-07-01

    About 120,000 reference samples are analyzed each year in the Forensic Laboratory of Lyon. A total of 1640 positive control experiments used to validate and optimize the analytical method in the routine process were submitted to a multivariate exploratory data analysis approach with the aim of better understanding the underlying sources of variability. The peak heights of the 16 genetic markers targeted by the AmpFℓSTR® Identifiler® STR kit were used as variables of interest. Six different 3130xl genetic analyzers located in the same controlled environment were involved. Two major sources of variability were found: (i) the DNA load of the sample modulates all peak heights in a similar way, so that the 16 markers are highly correlated; (ii) the genetic analyzer used, with a locus-specific response for peak height and better sensitivity for the most recently acquired instruments. Three markers (FGA, D3S1358, and D13S317) were found to be of special interest to predict the success rate observed in the routine process. © 2014 American Academy of Forensic Sciences.

  17. Apollo Lunar Sample Integration into Google Moon: A New Approach to Digitization

    Science.gov (United States)

    Dawson, Melissa D.; Todd, Nancy S.; Lofgren, Gary E.

    2011-01-01

    The Google Moon Apollo Lunar Sample Data Integration project is part of a larger, LASER-funded 4-year lunar rock photo restoration project by NASA's Acquisition and Curation Office [1]. The objective of this project is to enhance the Apollo mission data already available on Google Moon with information about the lunar samples collected during the Apollo missions. To this end, we have combined rock sample data from various sources, including Curation databases, mission documentation, and lunar sample catalogs, with newly available digital photography of rock samples to create a user-friendly, interactive tool for learning about the Apollo Moon samples.

  18. Technical note: A simple approach for efficient collection of field reference data for calibrating remote sensing mapping of northern wetlands

    Science.gov (United States)

    Gålfalk, Magnus; Karlson, Martin; Crill, Patrick; Bousquet, Philippe; Bastviken, David

    2018-03-01

    The calibration and validation of remote sensing land cover products are highly dependent on accurate field reference data, which are costly and practically challenging to collect. We describe an optical method for collection of field reference data that is a fast, cost-efficient, and robust alternative to field surveys and UAV imaging. A lightweight, waterproof, remote-controlled RGB camera (GoPro HERO4 Silver, GoPro Inc.) was used to take wide-angle images from 3.1 to 4.5 m in altitude using an extendable monopod, as well as representative near-ground (< 1 m) images to identify spectral and structural features that correspond to various land covers in present lighting conditions. A semi-automatic classification was made based on six surface types (graminoids, water, shrubs, dry moss, wet moss, and rock). The method enables collection of detailed field reference data, which is critical in many remote sensing applications, such as satellite-based wetland mapping. The method uses common non-expensive equipment, does not require special skills or training, and is facilitated by a step-by-step manual that is included in the Supplement. Over time a global ground cover database can be built that can be used as reference data for studies of non-forested wetlands from satellites such as Sentinel 1 and 2 (10 m pixel size).

  19. Comparing The Effects Of Reference Pricing And Centers-Of-Excellence Approaches To Value-Based Benefit Design.

    Science.gov (United States)

    Zhang, Hui; Cowling, David W; Facer, Matthew

    2017-12-01

    Various health insurance benefit designs based on value-based purchasing have been promoted to steer patients to high-value providers, but little is known about the designs' relative effectiveness and underlying mechanisms. We compared the impact of two designs implemented by the California Public Employees' Retirement System on inpatient hospital total hip or knee replacement: a reference-based pricing design for preferred provider organizations (PPOs) and a centers-of-excellence design for health maintenance organizations (HMOs). Payment and utilization data for the procedures in the period 2008-13 were evaluated using pre-post and quasi-experimental designs at the system and health plan levels, adjusting for demographic characteristics, case-mix, and other confounders. We found that both designs prompted higher use of designated low-price high-quality facilities and reduced average replacement expenses per member at the plan and system levels. However, the designs used different routes: The reference-based pricing design reduced average replacement payments per case in PPOs by 26.7 percent in the first year, compared to HMOs, but did not lower PPO members' utilization rates. In contrast, the centers-of-excellence design lowered HMO members' utilization rates by 29.2 percent in the first year, compared to PPOs, but did not reduce HMO average replacement payments per case. The reference-based pricing design appears more suitable for reducing price variation, and the centers-of-excellence design for addressing variation in use.

  20. Evaluation of the uncertainty around the mean level of 137Cs fallout at undisturbed reference site: A simple statistical approach

    International Nuclear Information System (INIS)

    Mabit, L.; Gonsalves, B.C.; Chen, X.; Toloza, A.; Weltin, G.; Darby, I.G.; Padilla-Alvarez, R.

    2015-01-01

    One of the major issues related to the use of 137Cs as a soil erosion/sedimentation tracer is the selection of the reference site which is used to estimate the initial 137Cs fallout input (also termed reference inventory). The initial 137Cs fallout input is a key component of the conversion models used to estimate erosion and sedimentation rates from the 137Cs data set. The selection and evaluation of the validity of reference sites have been explained in detail in the recent IAEA TECDOC 1741 "Guidelines for using Fallout radionuclides to assess erosion and effectiveness of soil conservation strategies". An investigation was carried out at the experimental research station of the Austrian Agency for Health and Food Safety (AGES) in Grabenegg, Austria (48°07′40″ N, 15°13′16″ E). Located at an altitude of 260 m a.s.l., with an annual average temperature of 8.4 °C and annual precipitation of 686 mm, the soil of this area has been classified as a Gleyic Cambisol with a silt-loamy texture.
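    The simple statistics such an evaluation typically rests on (mean inventory, standard deviation, coefficient of variation, and an approximate confidence interval around the mean) can be sketched as follows; the inventory values are hypothetical placeholders, not the study's measurements:

```python
import math

# Hypothetical 137Cs inventories (Bq m^-2) from replicate cores at a
# candidate reference site; real values come from the site survey.
inv = [1850, 1920, 1780, 2010, 1890, 1950, 1830, 1970]

n = len(inv)
mean = sum(inv) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in inv) / (n - 1))  # sample SD
cv = 100 * sd / mean                      # coefficient of variation, %
se = sd / math.sqrt(n)                    # standard error of the mean
ci95 = (mean - 1.96 * se, mean + 1.96 * se)  # approximate 95% CI

print(round(mean), round(cv, 1), [round(v) for v in ci95])
```

    A low coefficient of variation and a tight confidence interval around the mean are what make a candidate site usable as a reference inventory for the conversion models.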

  1. Theoretical Antecedents of Standing at Work: An Experience Sampling Approach Using the Theory of Planned Behavior

    Science.gov (United States)

    Meyer, M. Renée Umstattd; Wu, Cindy; Walsh, Shana M.

    2016-01-01

    Time spent sitting has been associated with an increased risk of diabetes, cancer, obesity, and mental health impairments. However, 75% of Americans spend most of their days sitting, with work-sitting accounting for 63% of total daily sitting time. Little research examining theory-based antecedents of standing or sitting has been conducted. This lack of solid groundwork makes it difficult to design effective intervention strategies to decrease sitting behaviors. Using the Theory of Planned Behavior (TPB) as our theoretical lens to better understand factors related with beneficial standing behaviors already being practiced, we examined relationships between TPB constructs and time spent standing at work among “positive deviants” (those successful in behavior change). Experience sampling methodology (ESM), 4 times a day (midmorning, before lunch, afternoon, and before leaving work) for 5 consecutive workdays (Monday to Friday), was used to assess employees' standing time. TPB scales assessing attitude (α = 0.81–0.84), norms (α = 0.83), perceived behavioral control (α = 0.77), and intention (α = 0.78) were developed using recommended methods and collected once on the Friday before the ESM surveys started. ESM data are hierarchically nested, therefore we tested our hypotheses using multilevel structural equation modeling with Mplus. Hourly full-time university employees (n = 50; 70.6% female, 84.3% white, mean age = 44 (SD = 11), 88.2% in full-time staff positions) with sedentary occupation types (time at desk while working ≥6 hours/day) participated. A total of 871 daily surveys were completed. Only perceived behavioral control (β = 0.45, p < 0.05) was related with work-standing at the event-level (model fit: just fit); mediation through intention was not supported. This is the first study to examine theoretical antecedents of real-time work-standing in a naturalistic field setting among positive deviants. These relationships should be further examined, and behavioral intervention strategies should consider a positive deviance approach to enhance perceived behavioral control, in addition to implementing environmental changes like installing standing desks. PMID:29546189

  2. Descriptions of sampling practices within five approaches to qualitative research in education and the health sciences

    OpenAIRE

    Guetterman, Timothy C.

    2015-01-01

    Although recommendations exist for determining qualitative sample sizes, the literature appears to contain few instances of research on the topic. Practical guidance is needed for determining sample sizes to conduct rigorous qualitative research, to develop proposals, and to budget resources. The purpose of this article is to describe qualitative sample size and sampling practices within published studies in education and the health sciences by research design: case study, ethnography, ground...

  3. Equilibrium sampling of hydrophobic organic chemicals in sediments: challenges and new approaches

    DEFF Research Database (Denmark)

    Schaefer, S.; Mayer, Philipp; Becker, B.

    2015-01-01

    Freely dissolved concentrations of hydrophobic organic chemicals (HOCs) are considered to be the effective concentrations for diffusive uptake and partitioning, and they can be measured by equilibrium sampling. We have thus applied glass jars with multiple coating thicknesses for equilibrium sampling of HOCs in sediment samples from various sites in different German rivers...

  4. Performance and separation occurrence of binary probit regression estimator using maximum likelihood method and Firth's approach under different sample size

    Science.gov (United States)

    Lusiana, Evellin Dewi

    2017-12-01

    The parameters of a binary probit regression model are commonly estimated by the maximum likelihood estimation (MLE) method. However, the MLE method has a limitation if the binary data contain separation. Separation is the condition where one or several independent variables exactly group the categories of the binary response. It causes the MLE estimators to become non-convergent, so that they cannot be used in modeling. One way to resolve separation is to use Firth's approach instead. This research has two aims: first, to identify the chance of separation occurring in binary probit regression with the MLE method versus Firth's approach; second, to compare the performance of the binary probit regression estimators obtained by the MLE method and Firth's approach using the RMSE criterion. Both are assessed by simulation under different sample sizes. The results showed that the chance of separation occurring with the MLE method for small sample sizes is higher than with Firth's approach; for larger sample sizes, the probability decreased and was nearly identical between the two methods. Meanwhile, Firth's estimators have smaller RMSEs than the MLE estimators, especially for smaller sample sizes, while for larger sample sizes the RMSEs are not much different. This means that Firth's estimators outperformed the MLE estimators.
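    For intuition, complete separation on a single predictor can be checked directly: it occurs when some threshold on the predictor perfectly splits the two response classes. The helper below is a hypothetical illustration of that condition, not the simulation design of the study:

```python
def completely_separated(x, y):
    """True if some threshold on x perfectly splits the 0s and 1s of y
    (in either direction) -- the case where the probit/logit MLE diverges."""
    x0 = [xi for xi, yi in zip(x, y) if yi == 0]
    x1 = [xi for xi, yi in zip(x, y) if yi == 1]
    return max(x0) < min(x1) or max(x1) < min(x0)

print(completely_separated([1, 2, 3, 4], [0, 0, 1, 1]))  # True: MLE would diverge
print(completely_separated([1, 3, 2, 4], [0, 1, 1, 0]))  # False: classes overlap
```

    Firth's approach sidesteps the problem by penalizing the likelihood (a Jeffreys-prior penalty), which keeps the coefficient estimates finite even when data like the first example arise.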

  5. Theoretical Antecedents of Standing at Work: An Experience Sampling Approach Using the Theory of Planned Behavior.

    Science.gov (United States)

    Meyer, M Renée Umstattd; Wu, Cindy; Walsh, Shana M

    2016-01-01

    Time spent sitting has been associated with an increased risk of diabetes, cancer, obesity, and mental health impairments. However, 75% of Americans spend most of their days sitting, with work-sitting accounting for 63% of total daily sitting time. Little research examining theory-based antecedents of standing or sitting has been conducted. This lack of solid groundwork makes it difficult to design effective intervention strategies to decrease sitting behaviors. Using the Theory of Planned Behavior (TPB) as our theoretical lens to better understand factors related with beneficial standing behaviors already being practiced, we examined relationships between TPB constructs and time spent standing at work among "positive deviants" (those successful in behavior change). Experience sampling methodology (ESM), 4 times a day (midmorning, before lunch, afternoon, and before leaving work) for 5 consecutive workdays (Monday to Friday), was used to assess employees' standing time. TPB scales assessing attitude (α = 0.81-0.84), norms (α = 0.83), perceived behavioral control (α = 0.77), and intention (α = 0.78) were developed using recommended methods and collected once on the Friday before the ESM surveys started. ESM data are hierarchically nested, therefore we tested our hypotheses using multilevel structural equation modeling with Mplus. Hourly full-time university employees (n = 50; 70.6% female, 84.3% white, mean age = 44 (SD = 11), 88.2% in full-time staff positions) with sedentary occupation types (time at desk while working ≥6 hours/day) participated. A total of 871 daily surveys were completed. Only perceived behavioral control (β = 0.45, p < 0.05) was related with work-standing at the event-level (model fit: just fit); mediation through intention was not supported. This is the first study to examine theoretical antecedents of real-time work-standing in a naturalistic field setting among positive deviants. These relationships should be further examined, and behavioral intervention strategies should consider a positive deviance approach to enhance perceived behavioral control, in addition to implementing environmental changes like installing standing desks.

  6. Theoretical Antecedents of Standing at Work: An Experience Sampling Approach Using the Theory of Planned Behavior

    Directory of Open Access Journals (Sweden)

    M. Renée Umstattd Meyer

    2016-09-01

    Time spent sitting has been associated with an increased risk of diabetes, cancer, obesity, and mental health impairments. However, 75% of Americans spend most of their days sitting, with work-sitting accounting for 63% of total daily sitting time. Little research examining theory-based antecedents of standing or sitting has been conducted. This lack of solid groundwork makes it difficult to design effective intervention strategies to decrease sitting behaviors. Using the Theory of Planned Behavior (TPB) as our theoretical lens to better understand factors related with beneficial standing behaviors already being practiced, we examined relationships between TPB constructs and time spent standing at work among “positive deviants” (those successful in behavior change). Experience sampling methodology (ESM), 4 times a day (midmorning, before lunch, afternoon, and before leaving work) for 5 consecutive workdays (Monday to Friday), was used to assess employees’ standing time. TPB scales assessing attitude (α = 0.81–0.84), norms (α = 0.83), perceived behavioral control (α = 0.77), and intention (α = 0.78) were developed using recommended methods and collected once on the Friday before the ESM surveys started. ESM data are hierarchically nested, therefore we tested our hypotheses using multilevel structural equation modeling with Mplus. Hourly full-time university employees (n = 50; 70.6% female, 84.3% white, mean age = 44 (SD = 11), 88.2% in full-time staff positions) with sedentary occupation types (time at desk while working ≥6 hours/day) participated. A total of 871 daily surveys were completed. Only perceived behavioral control (β = 0.45, p < 0.05) was related with work-standing at the event-level (model fit: just fit); mediation through intention was not supported. This is the first study to examine theoretical antecedents of real-time work-standing in a naturalistic field setting among positive deviants. These relationships should be further examined, and behavioral intervention strategies should consider a positive deviance approach to enhance perceived behavioral control, in addition to implementing environmental changes like installing standing desks.

  7. New approaches to wipe sampling methods for antineoplastic and other hazardous drugs in healthcare settings.

    Science.gov (United States)

    Connor, Thomas H; Smith, Jerome P

    2016-09-01

    At the present time, the method of choice to determine surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings and to discuss recent advances in this area. In addition it will provide some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis in addition to newly evolving technology that provides rapid analysis of specific antineoplastic drugs. Methodology for the analysis of surface wipe samples for hazardous drugs is reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous drug-safe handling program. Surface wipe sampling may be used as a method to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine, less costly, and provide a shorter response time than classical analytical techniques now in use.

  8. A Rational Approach for Discovering and Validating Cancer Markers in Very Small Samples Using Mass Spectrometry and ELISA Microarrays

    Directory of Open Access Journals (Sweden)

    Richard C. Zangar

    2004-01-01

    Identifying useful markers of cancer can be problematic due to limited amounts of sample. Some samples such as nipple aspirate fluid (NAF) or early-stage tumors are inherently small. Other samples such as serum are collected in larger volumes but archives of these samples are very valuable and only small amounts of each sample may be available for a single study. Also, given the diverse nature of cancer and the inherent variability in individual protein levels, it seems likely that the best approach to screen for cancer will be to determine the profile of a battery of proteins. As a result, a major challenge in identifying protein markers of disease is the ability to screen many proteins using very small amounts of sample. In this review, we outline some technological advances in proteomics that greatly advance this capability. Specifically, we propose a strategy for identifying markers of breast cancer in NAF that utilizes mass spectrometry (MS) to simultaneously screen hundreds or thousands of proteins in each sample. The best potential markers identified by the MS analysis can then be extensively characterized using an ELISA microarray assay. Because the microarray analysis is quantitative and large numbers of samples can be efficiently analyzed, this approach offers the ability to rapidly assess a battery of selected proteins in a manner that is directly relevant to traditional clinical assays.

  9. A Strategic Spatial Planning Approach to Cross-Border Place Branding with References to Galicia and Northern Portugal

    NARCIS (Netherlands)

    da Silva Oliveira, Eduardo; Zenker, Sebastian; Jacobsen, Björn

    2015-01-01

    This chapter adopts a strategic spatial planning approach to think strategically about potential joint place-branding initiatives between cross-border regions. The case study focuses on the extended cross-border European region composed of the NUTS III Alto Minho, Cávado, Ave, Área Metropolitana do

  10. Competency-Based Approaches: Linking Theory and Practice in Professional Education with Particular Reference to Health Education

    Science.gov (United States)

    Gonczi, Andrew

    2013-01-01

    Paul Hager and I worked on a large number of research projects and publications throughout the 1990s. The focus of this work was on developing a competency-based approach to professional education and assessment. I review this work and its impact over the years. Notwithstanding the fact that most professional associations today have a competency…

  11. Local diagnostic reference levels, approaches and compare the values in the South Bohemia Region in view of radiation protection inspector

    International Nuclear Information System (INIS)

    Zemanova, E.

    2014-01-01

    This paper compares the values of local diagnostic reference levels (LDRL) in health facilities of the South Bohemia Region. The work is motivated by questions from licensees, who would like to know where they stand in terms of the LDRL compared to other workplaces, and by the activity of the inspector, who can identify problematic workplaces where increased attention to optimization, exposure, or justification is necessary. In connection with the ongoing internal audits at licensees' workplaces, information about the status of the LDRL is timely, motivating licensees toward changes, optimization, and verification of compliance with the recommendations of the National radiological standard of the Ministry of Health (author)

  12. Sample Entropy-Based Approach to Evaluate the Stability of Double-Wire Pulsed MIG Welding

    Directory of Open Access Journals (Sweden)

    Ping Yao

    2014-01-01

    Based on sample entropy, this paper presents a quantitative method to evaluate the current stability in double-wire pulsed MIG welding. Firstly, the sample entropy of current signals with different stability but the same parameters is calculated. The results show that the more stable the current, the smaller the value and the standard deviation of the sample entropy. Secondly, four parameters, namely pulse width, peak current, base current, and frequency, are selected for a four-level three-factor orthogonal experiment. The calculation and analysis of the desired signals indicate that sample entropy values are affected by the welding current parameters. Then, a quantitative method based on sample entropy is proposed. The experimental results show that the method can preferably quantify the welding current stability.
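    Sample entropy itself has a standard definition: for a series, B counts pairs of length-m templates lying within tolerance r of each other (Chebyshev distance), A is the analogous count for length m + 1, and SampEn = −ln(A/B), with lower values indicating a more regular (more stable) signal. A minimal sketch, with the welding-current signals replaced by synthetic stand-ins:

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn = -ln(A/B): B counts template pairs of length m within
    tolerance r (Chebyshev distance), A the same for length m + 1."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()  # tolerance as a fraction of the signal's SD

    def count_matches(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templ) - 1):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b)

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 600))  # stable, periodic signal
noisy = regular + rng.normal(0, 0.5, 600)          # less stable signal
print(sample_entropy(regular) < sample_entropy(noisy))  # lower SampEn = more stable
```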

  13. COORDINATE TRANSFORMATION USING FEATHERSTONE AND VANÍČEK PROPOSED APPROACH - A CASE STUDY OF GHANA GEODETIC REFERENCE NETWORK

    Directory of Open Access Journals (Sweden)

    Yao Yevenyo Ziggah

    2017-03-01

    Most developing countries, like Ghana, are yet to adopt a geocentric datum for their surveying and mapping purposes. It is well known and documented that non-geocentric datums, by the nature of their establishment, have more distortions in height compared with satellite datums. Most authors have argued that combining such heights with horizontal positions (latitude and longitude) in the transformation process could introduce unwanted distortions into the network, because the local geodetic height is in most cases assumed to be determined to a lower accuracy than the horizontal positions. In the light of this, a transformation model was proposed by Featherstone and Vaníček (1999) which avoids the use of height in both the global and local datums in coordinate transformation. It was confirmed that adopting such a method reduces the effect of distortions caused by geodetic height on the estimated transformation parameters. This paper therefore applies the Featherstone and Vaníček (FV) model for the first time to a set of common-point coordinates in the Ghana geodetic reference network. The FV model was used to transform coordinates from the global datum (WGS84) to the local datum (Accra datum). The results obtained, in terms of Root Mean Square Error (RMSE) and Mean Absolute Error (MAE) in both Eastings and Northings, were satisfactory: RMSE values of 0.66 m and 0.96 m were obtained for the Eastings and Northings, while the MAE values achieved were 0.76 m and 0.73 m. The FV model also attained a transformation accuracy of 0.49 m. Hence, this study serves as a preliminary investigation into avoiding the use of height in coordinate transformation within Ghana's geodetic reference network.
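    The RMSE and MAE figures quoted above are standard error measures. As a minimal illustration (with hypothetical residuals, since the abstract does not list the per-point values), they can be computed from transformation residuals as:

```python
import numpy as np

def rmse(residuals):
    """Root mean square error of coordinate residuals."""
    r = np.asarray(residuals, dtype=float)
    return float(np.sqrt(np.mean(r ** 2)))

def mae(residuals):
    """Mean absolute error of coordinate residuals."""
    r = np.asarray(residuals, dtype=float)
    return float(np.mean(np.abs(r)))

# Hypothetical Easting/Northing residuals (metres) at check points;
# the actual Ghana network residuals are not given in the abstract.
d_east = [0.5, -0.8, 0.3, 1.1, -0.4]
d_north = [-0.9, 1.2, 0.7, -1.0, 0.6]
print(f"Easting:  RMSE={rmse(d_east):.2f} m, MAE={mae(d_east):.2f} m")
print(f"Northing: RMSE={rmse(d_north):.2f} m, MAE={mae(d_north):.2f} m")
```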

  14. Impact of frequent cerebrospinal fluid sampling on Aβ levels: systematic approach to elucidate influencing factors.

    Science.gov (United States)

    Van Broeck, Bianca; Timmers, Maarten; Ramael, Steven; Bogert, Jennifer; Shaw, Leslie M; Mercken, Marc; Slemmon, John; Van Nueten, Luc; Engelborghs, Sebastiaan; Streffer, Johannes Rolf

    2016-05-19

    Cerebrospinal fluid (CSF) amyloid-beta (Aβ) peptides are predictive biomarkers for Alzheimer's disease and are proposed as pharmacodynamic markers for amyloid-lowering therapies. However, frequent sampling results in fluctuating CSF Aβ levels that have a tendency to increase compared with baseline. The impact of sampling frequency, volume, catheterization procedure, and ibuprofen pretreatment on CSF Aβ levels was assessed using continuous sampling over 36 h. In this open-label biomarker study, healthy participants (n = 18; either sex, age 55-85 years) were randomized into one of three cohorts (n = 6/cohort; high-frequency sampling). In all cohorts except cohort 2 (sampling started 6 h post catheterization), sampling through lumbar catheterization started immediately post catheterization. Cohort 3 received ibuprofen (800 mg) before catheterization. Following interim data review, an additional cohort 4 (n = 6) with an optimized sampling scheme (low-frequency and lower volume) was included. CSF Aβ(1-37), Aβ(1-38), Aβ(1-40), and Aβ(1-42) levels were analyzed. Increases and fluctuations in mean CSF Aβ levels occurred in cohorts 1-3 at times of high-frequency sampling. Some outliers in which this effect was extremely pronounced were observed (cohorts 2 and 3). Cohort 4 demonstrated minimal fluctuation of CSF Aβ on both a group and an individual level. Intersubject variability in CSF Aβ profiles over time was observed in all cohorts. CSF Aβ level fluctuation upon catheterization primarily depends on the sampling frequency and volume, but not on the catheterization procedure or inflammatory reaction. An optimized low-frequency sampling protocol minimizes or eliminates fluctuation of CSF Aβ levels, which will improve the capability of accurately measuring the pharmacodynamic read-out for amyloid-lowering therapies. ClinicalTrials.gov NCT01436188. Registered 15 September 2011.

  15. Seasonal rationalization of river water quality sampling locations: a comparative study of the modified Sanders and multivariate statistical approaches.

    Science.gov (United States)

    Varekar, Vikas; Karmakar, Subhankar; Jha, Ramakar

    2016-02-01

    The design of surface water quality sampling locations is a crucial decision-making process in the rationalization of a monitoring network. The quantity, quality, and types of available data (watershed characteristics and water quality data) may affect the selection of an appropriate design methodology. The modified Sanders approach and multivariate statistical techniques [particularly factor analysis (FA)/principal component analysis (PCA)] are well-accepted and widely used techniques for the design of sampling locations. However, their performance may vary significantly with the quantity, quality, and types of available data. In this paper, an attempt has been made to evaluate the performance of these techniques while accounting for the effect of seasonal variation, in a situation of limited water quality data but extensive watershed characteristics information, as continuous and consistent river water quality data are usually difficult to obtain, whereas watershed information may be made available through the application of geospatial techniques. A case study of the Kali River, Western Uttar Pradesh, India, is selected for the analysis. Monitoring was carried out at 16 sampling locations. The discrete and diffuse pollution loads at the different sampling sites were estimated and accounted for using the modified Sanders approach, whereas the monitored physical and chemical water quality parameters were utilized as inputs for FA/PCA. The designed optimum numbers of sampling locations for the monsoon and non-monsoon seasons are eight and seven by the modified Sanders approach, and eleven and nine by FA/PCA, respectively. Little variation in the number and locations of the designed sampling sites was obtained by the two techniques, which shows the stability of the results. A geospatial analysis has also been carried out to check the significance of the designed sampling locations with respect to river basin characteristics and land use of the study area. Both methods are equally efficient; however, modified Sanders
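    The FA/PCA side of the comparison can be sketched as a principal component analysis on a stations-by-parameters matrix. The data below are random stand-ins for the 16 monitored stations, and retaining components up to 80 % explained variance is one common rule of thumb, not necessarily the authors' criterion:

```python
import numpy as np

# Hypothetical stations-by-parameters water-quality matrix for the 16
# monitored locations (random stand-in; the Kali River data are not
# reproduced in the abstract).
rng = np.random.default_rng(1)
X = rng.standard_normal((16, 6))

# Standardise, then PCA via SVD.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)

# Retain enough components to capture 80 % of the variance.
k = int(np.searchsorted(np.cumsum(explained), 0.80)) + 1

# Rank stations by the magnitude of their scores on the retained
# components; low-ranked stations are candidates for removal.
scores = U[:, :k] * s[:k]
importance = np.linalg.norm(scores, axis=1)
ranked_stations = list(np.argsort(importance)[::-1])
print(k, ranked_stations)
```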

  16. Sampling maternal care behaviour in domestic dogs: What's the best approach?

    Science.gov (United States)

    Czerwinski, Veronika H; Smith, Bradley P; Hynd, Philip I; Hazel, Susan J

    2017-07-01

    Our understanding of the frequency and duration of maternal care behaviours in the domestic dog during the first two postnatal weeks is limited, largely due to inconsistencies in the sampling methodologies that have been employed. In order to develop a more concise picture of maternal care behaviour during this period, and to help establish the sampling method that best represents these behaviours, we compared a variety of time sampling methods. Six litters were continuously observed for a total of 96 h over postnatal days 3, 6, 9 and 12 (24 h per day). Frequent (dam presence, nursing duration, contact duration) and infrequent (anogenital licking duration and frequency) maternal behaviours were coded using five different time sampling methods: 12-h night (1800-0600 h), 12-h day (0600-1800 h), one hour during the night (1800-0600 h), one hour during the day (0600-1800 h), and one hour at any time. Each of the one-hour time sampling methods consisted of four randomly chosen 15-min periods. Two random sets of four 15-min periods were also analysed to ensure reliability. We then determined which of the time sampling methods, averaged over the three 24-h periods, best represented the frequency and duration of behaviours. As might be expected, frequently occurring behaviours were adequately represented by short (one h) sampling periods; however, this was not the case for the infrequent behaviours. Thus, we argue that the time sampling methodology employed must match the behaviour of interest. This caution applies to maternal behaviour in altricial species, such as canids, as well as to all systematic behavioural observations utilising time sampling methodology. Copyright © 2017. Published by Elsevier B.V.

  17. Simulated Job Samples: A Student-Centered Approach to Vocational Exploration and Evaluation.

    Science.gov (United States)

    Richter-Stein, Caryn; Stodden, Robert A.

    1981-01-01

    Incorporating simulated job samples into the junior high school curriculum can provide vocational exploration opportunities as well as assessment data on special needs students. Students can participate as active learners and decision makers. (CL)

  18. A sub-sampled approach to extremely low-dose STEM

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, A. [OptimalSensing, Southlake, Texas 76092, USA; Duke University, ECE, Durham, North Carolina 27708, USA; Luzi, L. [Rice University, ECE, Houston, Texas 77005, USA; Yang, H. [Lawrence Berkeley National Laboratory, Berkeley, California 94720, USA; Kovarik, L. [Pacific NW National Laboratory, Richland, Washington 99354, USA; Mehdi, B. L. [Pacific NW National Laboratory, Richland, Washington 99354, USA; University of Liverpool, Materials Engineering, Liverpool L69 3GH, United Kingdom; Liyu, A. [Pacific NW National Laboratory, Richland, Washington 99354, USA; Gehm, M. E. [Duke University, ECE, Durham, North Carolina 27708, USA; Browning, N. D. [Pacific NW National Laboratory, Richland, Washington 99354, USA; University of Liverpool, Materials Engineering, Liverpool L69 3GH, United Kingdom

    2018-01-22

    The inpainting of randomly sub-sampled images acquired by scanning transmission electron microscopy (STEM) is an attractive method for imaging under low-dose conditions (≤ 1 e⁻ Å⁻²) without changing either the operation of the microscope or the physics of the imaging process. We show that 1) adaptive sub-sampling increases acquisition speed, resolution, and sensitivity; and 2) random (non-adaptive) sub-sampling is equivalent to, but faster than, traditional low-dose techniques. Adaptive sub-sampling opens numerous possibilities for the analysis of beam-sensitive materials and in-situ dynamic processes at the resolution limit of the aberration-corrected microscope and is demonstrated here for the analysis of the node distribution in metal-organic frameworks (MOFs).
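    A minimal sketch of the random (non-adaptive) sub-sampling idea, using a synthetic image and a naive local-mean fill in place of the sparse dictionary-learning inpainting actually used for STEM data:

```python
import numpy as np

# Random sub-sampling of a synthetic "specimen" image: only a fraction
# of pixel positions is acquired, reducing dose proportionally.
rng = np.random.default_rng(0)
profile = np.sin(np.linspace(0, np.pi, 64))
image = np.outer(profile, profile)          # smooth stand-in specimen

fraction = 0.2                              # keep 20 % of pixels: ~5x less dose
mask = rng.random(image.shape) < fraction
acquired = np.where(mask, image, 0.0)

# Naive reconstruction: fill each missing pixel with the mean of the
# acquired pixels in a 5x5 neighbourhood.
recon = acquired.copy()
h, w = image.shape
for i in range(h):
    for j in range(w):
        if not mask[i, j]:
            m = mask[max(i - 2, 0):i + 3, max(j - 2, 0):j + 3]
            v = acquired[max(i - 2, 0):i + 3, max(j - 2, 0):j + 3]
            if m.any():
                recon[i, j] = v[m].mean()

err = float(np.abs(recon - image).mean())
print(round(float(mask.mean()), 3), round(err, 4))
```

    Even this crude fill recovers a smooth image well; the paper's point is that proper inpainting does so at the microscope's resolution limit.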

  19. Sample Size Bounding and Context Ranking as Approaches to the Human Error Quantification Problem

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B

    2004-03-01

    The paper describes a technique denoted Sub-Sample-Size Bounding (SSSB), which is usable for the statistical derivation of context-specific probabilities from data available in existing reports on operating experience. Applications to human reliability analysis (HRA) are emphasised in the presentation of this technique. Exemplified by a sample of 180 abnormal event sequences, the manner in which SSSB can provide viable input for the quantification of errors of commission (EOCs) is outlined. (author)

  20. Sample Size Bounding and Context Ranking as Approaches to the Human Error Quantification Problem

    International Nuclear Information System (INIS)

    Reer, B.

    2004-01-01

    The paper describes a technique denoted Sub-Sample-Size Bounding (SSSB), which is usable for the statistical derivation of context-specific probabilities from data available in existing reports on operating experience. Applications to human reliability analysis (HRA) are emphasised in the presentation of this technique. Exemplified by a sample of 180 abnormal event sequences, the manner in which SSSB can provide viable input for the quantification of errors of commission (EOCs) is outlined. (author)
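    The abstract does not spell out the SSSB mathematics, but the underlying task (deriving a context-specific probability from a small sub-sample of event sequences) can be illustrated with a standard one-sided binomial (Clopper-Pearson) upper bound; the counts below are illustrative, not taken from the 180-sequence sample:

```python
from math import comb

def binom_cdf(k, n, p):
    """P[X <= k] for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def upper_bound(k, n, alpha=0.05):
    """One-sided (1 - alpha) Clopper-Pearson upper bound on an error
    probability, given k errors in a sub-sample of n sequences.
    Found by bisection on the (monotone) binomial CDF."""
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if binom_cdf(k, n, mid) > alpha:
            lo = mid
        else:
            hi = mid
    return hi

# E.g. 2 errors of commission observed in a context covered by 40
# event sequences (illustrative numbers, not from the paper):
print(round(upper_bound(2, 40), 3))
```

    Such bounds stay meaningful even when a context-specific sub-sample contains very few (or zero) observed errors, which is the practical difficulty SSSB addresses.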

  1. 76 FR 65165 - Importation of Plants for Planting; Risk-Based Sampling and Inspection Approach and Propagative...

    Science.gov (United States)

    2011-10-20

    DEPARTMENT OF AGRICULTURE, Animal and Plant Health Inspection Service [Docket No. APHIS-2011-0092]: Importation of Plants for Planting; Risk-Based Sampling and Inspection Approach and Propagative Monitoring and... ...this 14th day of October 2011. Kevin Shea, Acting Administrator, Animal and Plant Health Inspection...

  2. Using Paraffin PCM, Cryogel and TEC to Maintain Comet Surface Sample Cold from Earth Approach Through Retrieval

    Science.gov (United States)

    Choi, Michael K.

    2017-01-01

    An innovative thermal design concept to maintain comet surface samples cold (for example, at 263 K, 243 K, or 223 K) from Earth approach through retrieval is presented. It uses paraffin phase change material (PCM), Cryogel insulation, and a thermoelectric cooler (TEC), all of which are commercially available.

  3. Large-volume injection of sample diluents not miscible with the mobile phase as an alternative approach in sample preparation for bioanalysis: an application for fenspiride bioequivalence.

    Science.gov (United States)

    Medvedovici, Andrei; Udrescu, Stefan; Albu, Florin; Tache, Florentin; David, Victor

    2011-09-01

    Liquid-liquid extraction of target compounds from biological matrices, followed by the injection of a large volume from the organic layer into a chromatographic column operated under reversed-phase (RP) conditions, successfully combines the selectivity and straightforward character of the procedure to enhance sensitivity, compared with the usual approach involving solvent evaporation and residue re-dissolution. Large-volume injection of samples in diluents that are not miscible with the mobile phase was recently introduced into chromatographic practice. The risk of random errors produced during the manipulation of samples is also substantially reduced. A bioanalytical method designed for the bioequivalence of fenspiride-containing pharmaceutical formulations was based on a sample preparation procedure involving extraction of the target analyte and the internal standard (trimetazidine) from alkalinized plasma samples into 1-octanol. A volume of 75 µl from the octanol layer was directly injected onto a Zorbax SB C18 Rapid Resolution column (50 mm length × 4.6 mm internal diameter × 1.8 µm particle size), with the RP separation carried out under gradient elution conditions. Detection was by positive ESI and MS/MS. Aspects related to method development and validation are discussed. The bioanalytical method was successfully applied to assess the bioequivalence of a modified-release pharmaceutical formulation containing 80 mg fenspiride hydrochloride in two different studies, carried out as single-dose administration under fasting and fed conditions (four arms) and as multiple-dose administration, respectively. The quality attributes assigned to the bioanalytical method, as resulting from its application to the bioequivalence studies, are highlighted and fully demonstrate that sample preparation based on large-volume injection of immiscible diluents has increased potential for application in bioanalysis.

  4. Satellite Image Classification of Building Damages Using Airborne and Satellite Image Samples in a Deep Learning Approach

    Science.gov (United States)

    Duarte, D.; Nex, F.; Kerle, N.; Vosselman, G.

    2018-05-01

    The localization and detailed assessment of damaged buildings after a disastrous event is of utmost importance to guide response operations, recovery tasks, or insurance purposes. Several remote sensing platforms and sensors are currently used for the manual detection of building damage. However, there is an overall interest in the use of automated methods to perform this task, regardless of the platform used. Owing to its synoptic coverage and predictable availability, satellite imagery is currently used as input for the identification of building damage by the International Charter, as well as by the Copernicus Emergency Management Service for the production of damage grading and reference maps. Recently proposed methods to perform image classification of building damage rely on convolutional neural networks (CNN). These are usually trained with only satellite image samples in a binary classification problem; however, the number of samples derived from these images is often limited, affecting the quality of the classification results. The use of up/down-sampled image samples during the training of a CNN has been demonstrated to improve several image recognition tasks in remote sensing. However, it is currently unclear whether this multi-resolution information can also be captured from images with different spatial resolutions, such as satellite and airborne imagery (from both manned and unmanned platforms). In this paper, a CNN framework using residual connections and dilated convolutions is used, considering both manned and unmanned aerial image samples, to perform the satellite image classification of building damage. Three network configurations trained with multi-resolution image samples are compared against two benchmark networks in which only satellite image samples are used. Combining feature maps generated from airborne and satellite image samples, and refining these using only the satellite image samples, improved the overall satellite image classification by nearly 4 %.
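    The dilated convolutions mentioned above enlarge a network's receptive field without adding parameters. A toy 1-D NumPy version (not the authors' CNN, which is 2-D and multi-channel) makes the mechanics visible:

```python
import numpy as np

def dilated_conv1d(x, w, dilation=1):
    """'Valid' 1-D convolution (cross-correlation) with a dilated kernel:
    kernel taps are spaced `dilation` samples apart."""
    k = len(w)
    span = (k - 1) * dilation + 1   # effective kernel extent
    n_out = len(x) - span + 1
    return np.array([
        sum(w[j] * x[i + j * dilation] for j in range(k))
        for i in range(n_out)
    ])

x = np.arange(10, dtype=float)
w = np.array([1.0, 1.0, 1.0])

# dilation=2 covers a span of 5 samples with only 3 weights, enlarging
# the receptive field at no extra parameter cost.
print(dilated_conv1d(x, w, dilation=1))
print(dilated_conv1d(x, w, dilation=2))
```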

  5. A holistic passive integrative sampling approach for assessing the presence and potential impacts of waterborne environmental contaminants

    Science.gov (United States)

    Petty, J.D.; Huckins, J.N.; Alvarez, D.A.; Brumbaugh, W. G.; Cranor, W.L.; Gale, R.W.; Rastall, A.C.; Jones-Lepp, T. L.; Leiker, T.J.; Rostad, C. E.; Furlong, E.T.

    2004-01-01

    As an integral part of our continuing research in environmental quality assessment approaches, we have developed a variety of passive integrative sampling devices widely applicable for use in defining the presence and potential impacts of a broad array of contaminants. The semipermeable membrane device has gained widespread use for sampling hydrophobic chemicals from water and air, the polar organic chemical integrative sampler is applicable for sequestering waterborne hydrophilic organic chemicals, the stabilized liquid membrane device is used to integratively sample waterborne ionic metals, and the passive integrative mercury sampler is applicable for sampling vapor phase or dissolved neutral mercury species. This suite of integrative samplers forms the basis for a new passive sampling approach for assessing the presence and potential toxicological significance of a broad spectrum of environmental contaminants. In a proof-of-concept study, three of our four passive integrative samplers were used to assess the presence of a wide variety of contaminants in the waters of a constructed wetland, and to determine the effectiveness of the constructed wetland in removing contaminants. The wetland is used for final polishing of secondary-treatment municipal wastewater and the effluent is used as a source of water for a state wildlife area. Numerous contaminants, including organochlorine pesticides, polycyclic aromatic hydrocarbons, organophosphate pesticides, and pharmaceutical chemicals (e.g., ibuprofen, oxindole, etc.) were detected in the wastewater. Herein we summarize the results of the analysis of the field-deployed samplers and demonstrate the utility of this holistic approach.

  6. Novel concepts for preparation of reference materials as whole water samples for priority substances at nanogram-per-liter level using model suspended particulate matter and humic acids

    NARCIS (Netherlands)

    Elordui-Zapatarietxe, S.; Fettig, I.; Philipp, R.; Gantois, F.; Lalère, B.; Swart, C.; Petrov, P.; Goenaga-Infante, H.; Vanermen, G.; Boom, G.; Emteborg, H.

    2015-01-01

    One of the unresolved issues of the European Water Framework Directive is the unavailability of realistic water reference materials for the organic priority pollutants at low nanogram-per-liter concentrations. In the present study, three different types of ready-to-use water test materials were

  7. Inflammatory potential in relation to the microbial content of settled dust samples collected from moisture damaged and reference schools: results of the HITEA study.

    NARCIS (Netherlands)

    Huttunen, K.; Tirkkonen, J.; Täubel, M.; Krop, E.; Mikkonen, S.; Pekkanen, J.; Heederik, D.; Zock, J.P.; Hyvärinen, A.; Hirvonen, M.R.

    2016-01-01

    Aiming to identify factors causing the adverse health effects associated with moisture-damaged indoor environments, we analyzed immunotoxicological potential of settled dust from moisture-damaged and reference schools in relation to their microbiological composition. Mouse RAW264.7 macrophages were

  8. A sampling approach for predicting the eating quality of apples using visible-near infrared spectroscopy.

    Science.gov (United States)

    Martínez Vega, Mabel V; Sharifzadeh, Sara; Wulfsohn, Dvoralai; Skov, Thomas; Clemmensen, Line Harder; Toldam-Andersen, Torben B

    2013-12-01

    Visible-near infrared spectroscopy remains a method of increasing interest as a fast alternative for the evaluation of fruit quality. The success of the method is assumed to be achieved by using large sets of samples to produce robust calibration models. In this study we used representative samples of an early and a late season apple cultivar to evaluate model robustness (in terms of prediction ability and error) for soluble solids content (SSC) and acidity prediction, in the wavelength range 400-1100 nm. A total of 196 middle-early season and 219 late season apple (Malus domestica Borkh.) samples, cvs 'Aroma' and 'Holsteiner Cox', were used to construct spectral models for SSC and acidity. Partial least squares (PLS), ridge regression (RR) and elastic net (EN) models were used to build prediction models. Furthermore, we compared three sub-sample arrangements for forming training and test sets ('smooth fractionator', by date of measurement after harvest, and random). Using the 'smooth fractionator' sampling method, fewer spectral bands (26) and elastic net resulted in improved performance for SSC models of 'Aroma' apples, with a coefficient of variation CVSSC = 13%. The model showed consistently low errors and bias (PLS/EN: R²cal = 0.60/0.60; SEC = 0.88/0.88 °Brix; Biascal = 0.00/0.00; R²val = 0.33/0.44; SEP = 1.14/1.03; Biasval = 0.04/0.03). However, the prediction of acidity and of SSC (CV = 5%) for the late cultivar 'Holsteiner Cox' produced inferior results compared with 'Aroma'. It was possible to construct local SSC and acidity calibration models for early season apple cultivars with CVs of SSC and acidity around 10%. The overall model performance of these data sets also depends on the proper selection of training and test sets. The 'smooth fractionator' protocol provided an objective method for obtaining training and test sets that capture the existing variability of the fruit samples for construction of visible-NIR prediction models. The implication
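    Of the regression families compared, ridge regression has a convenient closed form. The sketch below fits it to synthetic stand-in "spectra" (the real apple data are not reproduced here) and reports a calibration error analogous to the SEC values quoted above:

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam * I)^-1 X'y."""
    n_feat = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_feat), X.T @ y)

# Synthetic stand-in for apple spectra: rows = fruit, columns = the 26
# selected spectral bands mentioned in the abstract. The linear "SSC"
# response below is invented for illustration.
rng = np.random.default_rng(42)
X = rng.standard_normal((196, 26))
true_w = rng.standard_normal(26)
y = X @ true_w + 0.1 * rng.standard_normal(196)

w = ridge_fit(X, y, lam=0.1)
# Root-mean-square calibration error (a held-out split would give SEP).
sec = float(np.sqrt(np.mean((y - X @ w) ** 2)))
print(w.shape, round(sec, 3))
```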

  9. A comprehensive approach to identify reliable reference gene candidates to investigate the link between alcoholism and endocrinology in Sprague-Dawley rats.

    Directory of Open Access Journals (Sweden)

    Faten A Taki

    Gender and hormonal differences are often correlated with alcohol dependence and related complications like addiction and breast cancer. Estrogen (E2) is an important sex hormone because it is a key molecule involved in organism-level signaling pathways. Alcoholism has been reported to affect estrogen receptor signaling; however, identifying the players involved in such a multi-faceted syndrome is complex and requires an interdisciplinary approach. In many situations, preliminary investigations include straightforward yet informative biotechniques such as gene expression analysis using quantitative real-time PCR (qRT-PCR). The validity of qRT-PCR-based conclusions is affected by the choice of reliable internal controls. With this in mind, we compiled a list of 15 commonly used housekeeping genes (HKGs) as potential reference gene candidates in rat biological models. A comprehensive comparison among five statistical approaches (geNorm, the dCt method, NormFinder, BestKeeper, and RefFinder) was performed to identify the minimal number as well as the most stable reference genes required for reliable normalization in experimental rat groups comprising sham-operated (SO) rats and ovariectomized rats in the absence (OVX) or presence of E2 (OVXE2). These rat groups were subdivided into subgroups that received alcohol in a liquid diet or an isocaloric control liquid diet for 12 weeks. Our results showed that U87, 5S rRNA, GAPDH, and U5a were the most reliable reference gene candidates in heart and brain tissue. However, the gene stability ranking was specific to each tissue-input combination. The present preliminary findings highlight the variability in reference gene rankings across different experimental conditions and analytic methods and constitute a fundamental step for gene expression assays.
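    One of the cited stability measures, BestKeeper, ranks candidates essentially by the standard deviation of their Ct values across samples. A toy version with hypothetical Ct data (gene names from the abstract, values invented) looks like:

```python
import numpy as np

# Hypothetical Ct values: rows = samples, columns = candidate reference
# genes (names from the abstract; the numbers are invented).
rng = np.random.default_rng(7)
genes = ["U87", "5S_rRNA", "GAPDH", "U5a", "ActB"]
mean_ct = np.array([20.0, 15.0, 18.0, 22.0, 19.0])
noise_sd = np.array([0.20, 0.25, 0.30, 0.35, 1.20])  # ActB made unstable
ct = mean_ct + rng.standard_normal((24, 5)) * noise_sd

# BestKeeper-style ranking: a smaller Ct standard deviation across all
# experimental samples indicates a more stable reference gene.
sd = ct.std(axis=0)
ranking = [genes[i] for i in np.argsort(sd)]
print(ranking)
```

    In the study itself, five such methods are compared, and the ranking changes with tissue and experimental condition, which is exactly the variability the authors caution about.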

  10. Architectural approach to the energy performance of buildings in a hot-dry climate with special reference to Egypt

    Energy Technology Data Exchange (ETDEWEB)

    Hamdy, I F

    1986-01-01

    A thesis is presented on the changing approach to the architectural design of buildings in a hot, dry climate in view of the increased recognition of the importance of energy efficiency. The thermal performance of buildings in Egypt is used as an example, and the nature of the local climate and human requirements are also studied. Other influences on thermal performance considered include building form, orientation, and surrounding conditions. An evaluative computer model is constructed, and its application allows prediction of the effect of changing design parameters on energy performance.

  11. A two-hypothesis approach to establishing a life detection/biohazard protocol for planetary samples

    Science.gov (United States)

    Conley, Catharine; Steele, Andrew

    2016-07-01

    The COSPAR policy on performing a biohazard assessment on samples brought from Mars to Earth is framed in the context of a concern for false-positive results. However, as noted during the 2012 Workshop for Life Detection in Samples from Mars (ref. Kminek et al., 2014), a more significant concern for planetary samples brought to Earth is false-negative results, because an undetected biohazard could increase risk to the Earth. This is the reason that stringent contamination control must be a high priority for all Category V Restricted Earth Return missions. A useful conceptual framework for addressing these concerns involves two complementary 'null' hypotheses: testing both of them, together, would allow statistical and community confidence to be developed regarding one or the other conclusion. As noted above, false negatives are of primary concern for the safety of the Earth, so the 'Earth Safety null hypothesis', which must be disproved to assure low risk to the Earth from samples introduced by Category V Restricted Earth Return missions, is 'There is native life in these samples.' False positives are of primary concern for astrobiology, so the 'Astrobiology null hypothesis', which must be disproved in order to demonstrate the existence of extraterrestrial life, is 'There is no life in these samples.' The presence of Earth contamination would render both of these hypotheses more difficult to disprove. Both hypotheses can be tested following a strict science protocol: analyse, interpret, test the hypotheses, and repeat. The science measurements are then undertaken in an iterative fashion that responds to discovery, with both hypotheses testable from interpretation of the scientific data. This is a robust, community-involved activity that ensures maximum science return with minimal sample use.

  12. Low-sampling-rate ultra-wideband channel estimation using a bounded-data-uncertainty approach

    KAUST Repository

    Ballal, Tarig

    2014-01-01

    This paper proposes a low-sampling-rate scheme for ultra-wideband channel estimation. In the proposed scheme, P pulses are transmitted to produce P observations. These observations are exploited to produce channel impulse response estimates at a desired sampling rate, while the ADC operates at a rate that is P times less. To avoid loss of fidelity, the interpulse interval, given in units of sampling periods of the desired rate, is restricted to be co-prime with P. This condition is affected when clock drift is present and the transmitted pulse locations change. To handle this situation and to achieve good performance without using prior information, we derive an improved estimator based on the bounded data uncertainty (BDU) model. This estimator is shown to be related to the Bayesian linear minimum mean squared error (LMMSE) estimator. The performance of the proposed sub-sampling scheme was tested in conjunction with the new estimator. It is shown that high reduction in sampling rate can be achieved. The proposed estimator outperforms the least squares estimator in most cases; while in the high SNR regime, it also outperforms the LMMSE estimator. © 2014 IEEE.
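    The co-prime condition on the interpulse interval can be checked, and its effect seen, in a few lines: when the interval (in units of the desired sampling period) is co-prime with P, the P low-rate observation grids interleave to cover every sampling phase:

```python
from math import gcd

def valid_interval(interval, p):
    """Interpulse interval (in desired-rate sampling periods) must be
    co-prime with the number of pulses P, per the scheme described."""
    return gcd(interval, p) == 1

def offsets(interval, p):
    """Sampling-phase offsets (mod P) hit by the P successive pulses."""
    return sorted((i * interval) % p for i in range(p))

p = 4  # ADC runs P = 4 times slower than the desired sampling rate
for interval in (6, 7):
    print(interval, valid_interval(interval, p), offsets(interval, p))
```

    With interval 7 all four phases {0, 1, 2, 3} are covered; with interval 6 only two phases repeat, so the full-rate impulse response cannot be assembled. This is why clock drift, which shifts pulse locations, can break the condition.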

  13. Discovery and validation of plasma-protein biomarker panels for the detection of colorectal cancer and advanced adenoma in a Danish collection of samples from patients referred for diagnostic colonoscopy

    DEFF Research Database (Denmark)

    Blume, John E.; Wilhelmsen, Michael; Benz, Ryan W.

    2016-01-01

    and utilization of such a resource is an important step in the development of blood-based biomarker tests for colorectal cancer.Methods: We have created a subject data and biological sample resource, Endoscopy II, which is based on 4698 individuals referred for diagnostic colonoscopy in Denmark between May 2010...

  14. Reference Device-Assisted Adaptive Location Fingerprinting

    Directory of Open Access Journals (Sweden)

    Dongjin Wu

    2016-06-01

    Location fingerprinting suffers in dynamic environments and needs recalibration from time to time to maintain system performance. This paper proposes an adaptive approach to location fingerprinting. Based on real-time received signal strength indicator (RSSI) samples measured by a group of reference devices, the approach applies a modified Universal Kriging (UK) interpolant to estimate adaptive temporal and environmental radio maps. The modified UK takes the spatial distribution characteristics of RSSI into account. In addition, the issue of device heterogeneity caused by multiple reference devices is addressed: to compensate for the measuring differences of heterogeneous reference devices, a differential RSSI metric is employed. Extensive experiments were conducted in an indoor field, and the results demonstrate that the proposed approach not only adapts to dynamic environments and to changes in AP positions, but is also robust to the measuring differences of heterogeneous reference devices.
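    The differential RSSI idea is simple to demonstrate: subtracting the reading of a reference AP cancels an additive, device-specific offset, so heterogeneous devices produce the same fingerprint. A toy example with an assumed constant 7 dB bias (the actual device offsets are not given in the abstract):

```python
import numpy as np

# Heterogeneous devices report RSSI with device-specific offsets.
rssi_device_a = np.array([-50.0, -60.0, -70.0])  # AP1..AP3 seen by device A
rssi_device_b = rssi_device_a + 7.0              # device B at same spot, +7 dB bias

def differential(rssi, ref_index=0):
    """Differential RSSI: subtract the reading of a reference AP."""
    return rssi - rssi[ref_index]

print(differential(rssi_device_a))  # device-independent fingerprint
print(differential(rssi_device_b))  # identical despite the bias
```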

  15. Teaching Methods and Their Impact on Students' Emotions in Mathematics: An Experience-Sampling Approach

    Science.gov (United States)

    Bieg, Madeleine; Goetz, Thomas; Sticca, Fabio; Brunner, Esther; Becker, Eva; Morger, Vinzenz; Hubbard, Kyle

    2017-01-01

    Various theoretical approaches propose that emotions in the classroom are elicited by appraisal antecedents, with subjective experiences of control playing a crucial role in this context. Perceptions of control, in turn, are expected to be influenced by the classroom social environment, which can include the teaching methods being employed (e.g.,…

  16. An Improved Asymptotic Sampling Approach For Stochastic Finite Element Stiffness of a Laterally Loaded Monopile

    DEFF Research Database (Denmark)

    Vahdatirad, Mohammadjavad; Bayat, Mehdi; Andersen, Lars Vabbersgaard

    2012-01-01

    In this study a stochastic approach is conducted to obtain the horizontal and rotational stiffness of an offshore monopile foundation. A nonlinear stochastic p-y curve is integrated into a finite element scheme for calculation of the monopile response in over-consolidated clay having spatial...

  17. Sampling Practices and Social Spaces: Exploring a Hip-Hop Approach to Higher Education

    Science.gov (United States)

    Petchauer, Emery

    2010-01-01

    Much more than a musical genre, hip-hop culture exists as an animating force in the lives of many young adults. This article looks beyond the moral concerns often associated with rap music to explore how hip-hop as a larger set of expressions and practices implicates the educational experiences, activities, and approaches for students. The article…

  18. Functional approximations to posterior densities: a neural network approach to efficient sampling

    NARCIS (Netherlands)

    L.F. Hoogerheide (Lennart); J.F. Kaashoek (Johan); H.K. van Dijk (Herman)

    2002-01-01

    The performance of Monte Carlo integration methods like importance sampling or Markov Chain Monte Carlo procedures greatly depends on the choice of the importance or candidate density. Usually, such a density has to be "close" to the target density in order to yield numerically accurate

  19. A sampling approach for predicting the eating quality of apples using visible–near infrared spectroscopy

    DEFF Research Database (Denmark)

    Vega, Mabel V Martínez; Sharifzadeh, Sara; Wulfsohn, Dvoralai

    2013-01-01

    BACKGROUND: Visible–near infrared spectroscopy remains a method of increasing interest as a fast alternative for the evaluation of fruit quality. The success of the method is assumed to be achieved by using large sets of samples to produce robust calibration models. In this study we used representative samples of an early and a late season apple cultivar to evaluate model robustness (in terms of prediction ability and error) on the soluble solids content (SSC) and acidity prediction, in the wavelength range 400–1100 nm. RESULTS: A total of 196 middle–early season and 219 late season apples (Malus …) … training and test sets (‘smooth fractionator’, by date of measurement after harvest and random). Using the ‘smooth fractionator’ sampling method, fewer spectral bands (26) and elastic net resulted in improved performance for SSC models of ‘Aroma’ apples, with a coefficient of variation CVSSC = 13...

  20. The 4-vessel Sampling Approach to Integrative Studies of Human Placental Physiology In Vivo.

    Science.gov (United States)

    Holme, Ane M; Holm, Maia B; Roland, Marie C P; Horne, Hildegunn; Michelsen, Trond M; Haugen, Guttorm; Henriksen, Tore

    2017-08-02

    The human placenta is highly inaccessible for research while still in utero. The current understanding of human placental physiology in vivo is therefore largely based on animal studies, despite the high diversity among species in placental anatomy, hemodynamics and duration of the pregnancy. The vast majority of human placenta studies are ex vivo perfusion studies or in vitro trophoblast studies. Although in vitro studies and animal models are essential, extrapolation of the results from such studies to the human placenta in vivo is uncertain. We aimed to study human placenta physiology in vivo at term, and present a detailed protocol of the method. Exploiting the intraabdominal access to the uterine vein just before the uterine incision during planned cesarean section, we collect blood samples from the incoming and outgoing vessels on the maternal and fetal sides of the placenta. When combining concentration measurements from blood samples with volume blood flow measurements, we are able to quantify placental and fetal uptake and release of any compound. Furthermore, placental tissue samples from the same mother-fetus pairs can provide measurements of transporter density and activity and other aspects of placental functions in vivo. Through this integrative use of the 4-vessel sampling method we are able to test some of the current concepts of placental nutrient transfer and metabolism in vivo, both in normal and pathological pregnancies. Furthermore, this method enables the identification of substances secreted by the placenta to the maternal circulation, which could be an important contribution to the search for biomarkers of placenta dysfunction.

  1. An approach for measuring the {sup 129}I/{sup 127}I ratio in fish samples

    Energy Technology Data Exchange (ETDEWEB)

    Kusuno, Haruka, E-mail: kusuno@um.u-tokyo.ac.jp [The University Museum, The University of Tokyo, 3-7-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Matsuzaki, Hiroyuki [The University Museum, The University of Tokyo, 3-7-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Nagata, Toshi; Miyairi, Yosuke; Yokoyama, Yusuke [Atmosphere and Ocean Research Institute, The University of Tokyo, 5-1-5, Kashiwanoha, Kashiwa-shi, Chiba 277-8564 (Japan); Ohkouchi, Naohiko [Japan Agency for Marine-Earth Science and Technology, 2-15, Natsushima-cho, Yokosuka-city, Kanagawa 237-0061 (Japan)

    2015-10-15

    The {sup 129}I/{sup 127}I ratio in marine fish samples was measured employing accelerator mass spectrometry. The measurement was successful because of the low experimental background of {sup 129}I. Pyrohydrolysis was applied to extract iodine from fish samples. The experimental background of pyrohydrolysis was checked carefully and evaluated as 10{sup 4}–10{sup 5} atoms {sup 129}I/combustion. The methodology employed in the present study thus required only 0.05–0.2 g of dried fish samples. The methodology was then applied to obtain the {sup 129}I/{sup 127}I ratio of marine fish samples collected from the Western Pacific Ocean as (0.63–1.2) × 10{sup −10}. These values were similar to the ratio for the surface seawater collected at the same station, 0.4 × 10{sup −10}. The {sup 129}I/{sup 127}I ratio of IAEA-414, which was a mix of fish from the Irish Sea and the North Sea, was also measured and determined as 1.82 × 10{sup −7}. Consequently, fish from the Western Pacific Ocean and the North Sea were distinguished by their {sup 129}I/{sup 127}I ratios. The {sup 129}I/{sup 127}I ratio is thus a direct indicator of the area of habitat of fish.

  2. Gender Wage Gap : A Semi-Parametric Approach With Sample Selection Correction

    NARCIS (Netherlands)

    Picchio, M.; Mussida, C.

    2010-01-01

    Sizeable gender differences in employment rates are observed in many countries. Sample selection into the workforce might therefore be a relevant issue when estimating gender wage gaps. This paper proposes a new semi-parametric estimator of densities in the presence of covariates which incorporates

  3. Rapid assessment of antimicrobial resistance prevalence using a Lot Quality Assurance sampling approach

    NARCIS (Netherlands)

    van Leth, Frank; den Heijer, Casper; Beerepoot, Marielle; Stobberingh, Ellen; Geerlings, Suzanne; Schultsz, Constance

    2017-01-01

    Increasing antimicrobial resistance (AMR) requires rapid surveillance tools, such as Lot Quality Assurance Sampling (LQAS). LQAS classifies AMR as high or low based on set parameters. We compared classifications with the underlying true AMR prevalence using data on 1335 Escherichia coli isolates
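
    LQAS classifies prevalence against preset thresholds using a binomial decision rule. The paper's parameters are not given in the abstract; a generic sketch (the thresholds p_low/p_high and error limits below are hypothetical) illustrates how a design (n, d) is chosen:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * (p ** i) * ((1 - p) ** (n - i)) for i in range(k + 1))

def lqas_design(p_low, p_high, alpha=0.05, beta=0.05, n_max=200):
    """Smallest sample size n and decision value d such that classifying a lot
    as 'high AMR prevalence' when more than d resistant isolates are found
    keeps both misclassification risks below alpha and beta."""
    for n in range(1, n_max + 1):
        for d in range(n + 1):
            err_low = 1.0 - binom_cdf(d, n, p_low)   # truly-low lot called high
            err_high = binom_cdf(d, n, p_high)       # truly-high lot called low
            if err_low <= alpha and err_high <= beta:
                return n, d
    raise ValueError("no design found within n_max")

def classify(resistant, d):
    return "high" if resistant > d else "low"

n, d = lqas_design(0.10, 0.40)   # e.g. 10% vs 40% resistance thresholds
```

    The attraction of LQAS for rapid surveillance is that n stays small: the rule only has to separate the two preset prevalence levels, not estimate prevalence precisely.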

  4. Japanese Society for Laboratory Hematology flow cytometric reference method of determining the differential leukocyte count: external quality assurance using fresh blood samples.

    Science.gov (United States)

    Kawai, Y; Nagai, Y; Ogawa, E; Kondo, H

    2017-04-01

    To provide target values for the manufacturers' survey of the Japanese Society for Laboratory Hematology (JSLH), accurate standard data from healthy volunteers were needed for the five-part differential leukocyte count. To obtain such data, JSLH required an antibody panel that achieved high specificity (particularly for mononuclear cells) using simple gating procedures. We developed a flow cytometric method for determining the differential leukocyte count (JSLH-Diff) and validated it by comparison with the flow cytometric differential leukocyte count of the International Council for Standardization in Haematology (ICSH-Diff) and the manual differential count obtained by microscopy (Manual-Diff). First, the reference laboratory performed an imprecision study of JSLH-Diff and ICSH-Diff, as well as comparisons among JSLH-Diff, Manual-Diff, and ICSH-Diff. Then two reference laboratories and seven participating laboratories performed imprecision and accuracy studies of JSLH-Diff, Manual-Diff, and ICSH-Diff. Simultaneously, six manufacturers' laboratories provided their own representative values by using automated hematology analyzers. The precision of both the JSLH-Diff and ICSH-Diff methods was adequate. Comparison by the reference laboratory showed that all correlation coefficients, slopes and intercepts obtained by the JSLH-Diff, ICSH-Diff, and Manual-Diff methods conformed to the criteria. When the imprecision and accuracy of JSLH-Diff were assessed at seven laboratories, the CV% for lymphocytes, neutrophils, monocytes, eosinophils, and basophils was 0.5–0.9%, 0.3–0.7%, 1.7–2.6%, 3.0–7.9%, and 3.8–10.4%, respectively. More than 99% of CD45-positive leukocytes were identified as normal leukocytes by JSLH-Diff. When the JSLH-Diff method was validated by comparison with Manual-Diff and ICSH-Diff, it showed good performance as a reference method. © 2016 John Wiley & Sons Ltd.

  5. New approach of a transient ICP-MS measurement method for samples with high salinity.

    Science.gov (United States)

    Hein, Christina; Sander, Jonas Michael; Kautenburger, Ralf

    2017-03-01

    In the near future it will be necessary to establish a disposal facility for high-level nuclear waste (HLW) in deep and stable geological formations. In Germany the typical host rocks are salt or claystone. Suitable clay formations exist in the south and in the north of Germany, but their geochemical conditions differ strongly: in the northern formations, ionic strengths of the pore water of up to 5 M are observed. The determination of parameters such as Kd values during sorption experiments of metal ions like uranium, or europium as a homologue for trivalent actinides, onto claystones is very important for long-term safety analysis. The non-sorbed analytes, present at low concentrations, are commonly measured by inductively coupled plasma mass spectrometry (ICP-MS). A direct measurement of high-salinity samples like seawater, with more than 1% total dissolved salt content, is not possible. Alternatives like sample clean-up, preconcentration or strong dilution have more disadvantages than advantages, for example additional preparation steps or extra expensive components. With a small modification of the ICP-MS sample introduction system and a home-made reprogramming of the autosampler, a transient analysis method was developed that is suitable for measuring metal ions like europium and uranium in high-salinity sample matrices up to 5 M (NaCl). Comparisons at low ionic strength show that the transient measurement performs similarly well to the default measurement. Additionally, no time-consuming sample clean-up or expensive online dilution or matrix removal systems are necessary, and the analysis shows high sensitivity because data processing is based on the peak area. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Growth references

    NARCIS (Netherlands)

    Buuren, S. van

    2007-01-01

    A growth reference describes the variation of an anthropometric measurement within a group of individuals. A reference is a tool for grouping and analyzing data and provides a common basis for comparing populations.1 A well known type of reference is the age-conditional growth diagram. The

  7. Participatory Communication Referred to Meta-Design Approach through the FleXpeaker™ Application of Innovative Material in Exhibition Design

    Directory of Open Access Journals (Sweden)

    Pei-Hsuan Su

    2016-07-01

    Full Text Available Modelling a communication system in material culture today always involves objects, people, organizations, activities and the interrelationships among them. The researcher suggests bringing engaged stakeholders together to exchange ideas, where the interactions relate to multiple professions and disciplines in a participatory scope of the communication system. Owing to the invention of digital media, the status quo of images and sounds has been revolutionized, changing the mode of art exhibitions that produce activities and aesthetic concepts in terms of numerical representation, modularity, automation, visual variability and transcoding. Underlying a participatory-design approach, the research emphasizes a co-creative meta-interpretation by museum visitors. In addition, the research delves further into the use of a new medium, FleXpeaker™ [ITRI], as the carrier. Combining art and design with innovative technology, the research focuses on examining design objects and innovative material applied in new media art and exhibitions, in the hope of finding new angles of participatory interpretation of the "integrated innovation" in curating an exhibition.

  8. The role of digital sample information within the digital geoscience infrastructure: a pragmatic approach

    Science.gov (United States)

    Howe, Michael

    2014-05-01

    Much of the digital geological information on the composition, properties and dynamics of the subsurface is based ultimately on physical samples, many of which are archived to provide a basis for the information. Online metadata catalogues of these collections have now been available for many years. Many of these are institutional and tightly focussed, with UK examples including the British Geological Survey's (BGS) palaeontological samples database, PalaeoSaurus (http://www.bgs.ac.uk/palaeosaurus/), and mineralogical and petrological sample database, Britrocks (http://www.bgs.ac.uk/data/britrocks.html) . There are now a growing number of international sample metadata databases, including The Palaeobiology Database (http://paleobiodb.org/) and SESAR, the IGSN (International Geo Sample Number) database (http://www.geosamples.org/catalogsearch/ ). More recently the emphasis has moved beyond metadata (locality, identification, age, citations, etc) to digital imagery, with the intention of providing the user with at least enough information to determine whether viewing the sample would be worthwhile. Recent BGS examples include high resolution (e.g. 7216 x 5412 pixel) hydrocarbon well core images (http://www.bgs.ac.uk/data/offshoreWells/wells.cfc?method=searchWells) , high resolution rock thin section images (e.g. http://www.largeimages.bgs.ac.uk/iip/britrocks.html?id=290000/291739 ) and building stone images (http://geoscenic.bgs.ac.uk/asset-bank/action/browseItems?categoryId=1547&categoryTypeId=1) . This has been developed further with high resolution stereo images. The Jisc funded GB3D type fossils online project delivers these as red-cyan anaglyphs (http://www.3d-fossils.ac.uk/). More innovatively, the GB3D type fossils project has laser scanned several thousand type fossils and the resulting 3d-digital models are now being delivered through the online portal. 
Importantly, this project also represents collaboration between the BGS, Oxford and Cambridge Universities

  9. Interactive Fuzzy Goal Programming approach in multi-response stratified sample surveys

    Directory of Open Access Journals (Sweden)

    Gupta Neha

    2016-01-01

    Full Text Available In this paper, we applied an Interactive Fuzzy Goal Programming (IFGP) approach with linear, exponential and hyperbolic membership functions, which focuses on maximizing the minimum membership values, to determine the preferred compromise solution for the multi-response stratified surveys problem, formulated as a Multi-Objective Non-Linear Programming Problem (MONLPP). By linearizing the nonlinear objective functions at their individual optimum solutions, the problem is approximated to an Integer Linear Programming Problem (ILPP). A numerical example based on real data is given, and a comparison with some existing allocations, viz. Cochran's compromise allocation, Chatterjee's compromise allocation and Khowaja's compromise allocation, is made to demonstrate the utility of the approach.
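
    The abstract names linear, exponential and hyperbolic membership functions without giving their forms; the sketch below uses common textbook formulations for a minimization-type objective with aspiration level L and worst acceptable value U (the shape parameters s and a are assumptions). IFGP then maximizes the minimum of these memberships across the objectives.

```python
from math import exp, tanh

def mu_linear(z, L, U):
    """Linear membership: 1 at the aspiration level L, 0 at the worst value U."""
    if z <= L:
        return 1.0
    if z >= U:
        return 0.0
    return (U - z) / (U - L)

def mu_exponential(z, L, U, s=1.0):
    """Exponential membership; the shape parameter s is an assumption."""
    if z <= L:
        return 1.0
    if z >= U:
        return 0.0
    t = (z - L) / (U - L)
    return (exp(-s * t) - exp(-s)) / (1.0 - exp(-s))

def mu_hyperbolic(z, L, U, a=3.0):
    """Hyperbolic membership centred on the midpoint of [L, U]."""
    m = 0.5 * (L + U)
    return 0.5 + 0.5 * tanh(a * (m - z) / (U - L))

# IFGP's max-min step: the compromise allocation maximizes
# lambda = min_k mu_k(Z_k) subject to the survey cost constraints.
```

    All three functions map an objective value onto [0, 1]; the choice among them only changes how quickly satisfaction decays between the aspiration level and the worst value.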

  10. A review of single-sample-based models and other approaches for radiocarbon dating of dissolved inorganic carbon in groundwater

    Science.gov (United States)

    Han, L. F; Plummer, Niel

    2016-01-01

    Numerous methods have been proposed to estimate the pre-nuclear-detonation 14C content of dissolved inorganic carbon (DIC) recharged to groundwater that has been corrected/adjusted for geochemical processes in the absence of radioactive decay (14C0) - a quantity that is essential for estimation of the radiocarbon age of DIC in groundwater. The models/approaches most commonly used are grouped as follows: (1) single-sample-based models, (2) a statistical approach based on the observed (curved) relationship between 14C and δ13C data for the aquifer, and (3) the geochemical mass-balance approach that constructs adjustment models accounting for all the geochemical reactions known to occur along a groundwater flow path. This review discusses first the geochemical processes behind each of the single-sample-based models, followed by discussions of the statistical approach and the geochemical mass-balance approach. Finally, the applications, advantages and limitations of the three groups of models/approaches are discussed. The single-sample-based models constitute the prevailing use of 14C data in hydrogeology and hydrological studies. This is in part because the models are applied to an individual water sample to estimate the 14C age, so the measurement data are easily available. These models have been shown to provide realistic radiocarbon ages in many studies. However, they usually are limited to simple carbonate aquifers, and the selection of model may have significant effects on 14C0, often resulting in a wide range of estimates of 14C ages. Of the single-sample-based models, four are recommended for the estimation of 14C0 of DIC in groundwater: Pearson's model (Ingerson and Pearson, 1964; Pearson and White, 1967), Han & Plummer's model (Han and Plummer, 2013), the IAEA model (Gonfiantini, 1972; Salem et al., 1980), and Oeschger's model (Geyh, 2000). These four models include all processes considered in single-sample-based models, and can be used in different ranges of
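
    As an illustration of the single-sample idea, Pearson's model adjusts the initial 14C activity by a δ13C-based mixing factor between soil-gas CO2 and dissolving carbonate. A hedged numerical sketch (the end-member values −23‰ and 0‰ are illustrative defaults, not values from the review; 8033 y is the Libby mean life used for conventional ages):

```python
from math import log

def pearson_q(d13c_dic, d13c_gas=-23.0, d13c_carb=0.0):
    """Dilution factor q from a two-end-member delta-13C mass balance
    between soil-gas CO2 and dissolving carbonate (illustrative defaults)."""
    return (d13c_dic - d13c_carb) / (d13c_gas - d13c_carb)

def adjusted_age_years(a_measured_pmc, q, a_gas_pmc=100.0):
    """Conventional radiocarbon age using the Pearson-adjusted initial
    activity A0 = q * A_gas; 8033 y is the Libby mean life."""
    a0 = q * a_gas_pmc
    return 8033.0 * log(a0 / a_measured_pmc)

# A DIC sample with delta-13C halfway between the end members gets q = 0.5,
# halving the assumed initial activity before the decay equation is applied.
q = pearson_q(-11.5)
age = adjusted_age_years(25.0, q)
```

    This makes concrete why model choice matters: the entire correction enters through A0, so a different q can shift the apparent age by thousands of years for the same measured activity.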

  11. Inference for Local Distributions at High Sampling Frequencies: A Bootstrap Approach

    DEFF Research Database (Denmark)

    Hounyo, Ulrich; Varneskov, Rasmus T.

    of "large" jumps. Our locally dependent wild bootstrap (LDWB) accommodates issues related to the stochastic scale and jumps as well as accounting for a special block-wise dependence structure induced by sampling errors. We show that the LDWB replicates first- and second-order limit theory from the usual … empirical process and the stochastic scale estimate, respectively, as well as an asymptotic bias. Moreover, we design the LDWB sufficiently generally to establish asymptotic equivalence between it and a nonparametric local block bootstrap, also introduced here, up to second-order distribution theory. Finally, we introduce LDWB-aided Kolmogorov-Smirnov tests for local Gaussianity as well as local von Mises statistics, with and without bootstrap inference, and establish their asymptotic validity using the second-order distribution theory. The finite sample performance of CLT and LDWB-aided local...

  12. Surface plasmon resonance: advances of label-free approaches in the analysis of biological samples

    Czech Academy of Sciences Publication Activity Database

    Riedel, Tomáš; Majek, P.; Rodriguez-Emmenegger, Cesar; Brynda, Eduard

    2014-01-01

    Roč. 6, č. 24 (2014), s. 3325-3336 ISSN 1757-6180 R&D Projects: GA ČR(CZ) GBP205/12/G118; GA MŠk(CZ) EE2.3.30.0029; GA MŠk(CZ) ED1.1.00/02.0109 Institutional support: RVO:61389013 Keywords : surface plasmon resonance sensors * polymer brushes * human serum samples Subject RIV: CE - Biochemistry Impact factor: 3.003, year: 2014

  13. Next Generation Offline Approaches to Trace Gas-Phase Organic Compound Speciation: Sample Collection and Analysis

    Science.gov (United States)

    Sheu, R.; Marcotte, A.; Khare, P.; Ditto, J.; Charan, S.; Gentner, D. R.

    2017-12-01

    Intermediate-volatility and semi-volatile organic compounds (I/SVOCs) are major precursors to secondary organic aerosol, and contribute to tropospheric ozone formation. Their wide volatility range, chemical complexity, behavior in analytical systems, and trace concentrations present numerous hurdles to characterization. We present an integrated sampling-to-analysis system for the collection and offline analysis of trace gas-phase organic compounds with the goal of preserving and recovering analytes throughout sample collection, transport, storage, and thermal desorption for accurate analysis. Custom multi-bed adsorbent tubes are used to collect samples for offline analysis by advanced analytical detectors. The analytical instrumentation comprises an automated thermal desorption system that introduces analytes from the adsorbent tubes into a gas chromatograph, which is coupled with an electron ionization mass spectrometer (GC-EIMS) and other detectors. In order to optimize the collection and recovery for a wide range of analyte volatility and functionalization, we evaluated a variety of commercially-available materials, including Res-Sil beads, quartz wool, glass beads, Tenax TA, and silica gel. Key properties for optimization include inertness, versatile chemical capture, minimal affinity for water, and minimal artifacts or degradation byproducts; these properties were assessed with a diverse mix of traditionally-measured and functionalized analytes. Along with a focus on material selection, we provide recommendations spanning the entire sampling-and-analysis process to improve the accuracy of future comprehensive I/SVOC measurements, including oxygenated and other functionalized I/SVOCs. We demonstrate the performance of our system by providing results on speciated VOCs-SVOCs from indoor, outdoor, and chamber studies that establish the utility of our protocols and pave the way for precise laboratory characterization via a mix of detection methods.

  14. Understanding active sampling strategies: Empirical approaches and implications for attention and decision research.

    Science.gov (United States)

    Gottlieb, Jacqueline

    2018-05-01

    In natural behavior we actively gather information using attention and active sensing behaviors (such as shifts of gaze) to sample relevant cues. However, while attention and decision making are naturally coordinated, in the laboratory they have been dissociated. Attention is studied independently of the actions it serves. Conversely, decision theories make the simplifying assumption that the relevant information is given, and do not attempt to describe how the decision maker may learn and implement active sampling policies. In this paper I review recent studies that address questions of attentional learning, cue validity and information seeking in humans and non-human primates. These studies suggest that learning a sampling policy involves large scale interactions between networks of attention and valuation, which implement these policies based on reward maximization, uncertainty reduction and the intrinsic utility of cognitive states. I discuss the importance of using such paradigms for formalizing the role of attention, as well as devising more realistic theories of decision making that capture a broader range of empirical observations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Analysis of solid waste management strategies in Thimphu with reference to its detrimental effect and remission approach

    Science.gov (United States)

    Chakraborty, Bidisha; Murshed, Warefta E.; Chakraborty, Saikat

    2018-04-01

    Bhutan is a small landlocked country with an area of 38,394 km2 and a population of 797,765, and is considered the world's leading carbon-negative country. Since Bhutan is a developing nation striving to expand its social, cultural and economic boundaries, it is facing increasing rural-urban migration as well as a rapidly changing lifestyle, which are the major driving forces of increased waste generation in the cities, especially at urban centers like Thimphu. Irregular management and improper dumping lead to an unhealthy community, disturb the natural ecosystem and dismantle the Life Cycle Assessment (LCA) of a product. This study assessed the constraints of the present strategies used in waste management practices in Thimphu, associated with the increasing population pressure, and the importance of having modern landfill and incinerator facilities using innovative technologies, which would help to develop the concept of an eco-city. Bhutan's standing as a carbon-negative country, a leading credential for an eco-city, might be questionable in the near future because of its improper management techniques. South Asian countries such as Bhutan need to be concerned about the proper management of waste. The study also contends that the current trend of relying only on landfills cannot solve the waste management problem; rather, incinerators have the potential to be a better choice, if maintained properly, for the development of an eco-city. Solid waste management (SWM) and the adoption of greener policies are matters of concern for the eco-city, and the use of proper waste management approaches is essential for having a sustainable eco-city in the long run.

  16. The one-sample PARAFAC approach reveals molecular size distributions of fluorescent components in dissolved organic matter

    DEFF Research Database (Denmark)

    Wünsch, Urban; Murphy, Kathleen R.; Stedmon, Colin

    2017-01-01

    Molecular size plays an important role in dissolved organic matter (DOM) biogeochemistry, but its relationship with the fluorescent fraction of DOM (FDOM) remains poorly resolved. Here high-performance size exclusion chromatography (HPSEC) was coupled to fluorescence emission-excitation (EEM) … but not their spectral properties. Thus, in contrast to absorption measurements, bulk fluorescence is unlikely to reliably indicate the average molecular size of DOM. The one-sample approach enables robust and independent cross-site comparisons without large-scale sampling efforts and introduces new analytical opportunities for elucidating the origins and biogeochemical properties of FDOM...

  17. Multivariate approaches for stability control of the olive oil reference materials for sensory analysis - part I: framework and fundamentals.

    Science.gov (United States)

    Valverde-Som, Lucia; Ruiz-Samblás, Cristina; Rodríguez-García, Francisco P; Cuadros-Rodríguez, Luis

    2018-02-09

    Virgin olive oil is the only food product for which sensory analysis is regulated to classify it into different quality categories. To harmonize the results of the sensory method, the use of standards or reference materials is crucial. The stability of sensory reference materials is required to enable their suitable control, aiming to confirm that their specific target values are maintained on an ongoing basis. Currently, such stability is monitored by means of sensory analysis, and the sensory panels are in the paradoxical situation of controlling the standards that are devoted to controlling the panels. In the present study, several approaches based on similarity analysis are exploited. For each approach, the specific methodology to build a proper multivariate control chart to monitor the stability of the sensory properties is explained and discussed. The normalized Euclidean and Mahalanobis distances, the so-called nearness and hardiness indices respectively, have been defined as new similarity indices so that the values range from 0 to 1. Also, the squared mean from Hotelling's T2-statistic and the Q2-statistic has been proposed as another similarity index. © 2018 Society of Chemical Industry.
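
    The paper's exact normalization onto the 0-1 range is not given in the abstract; one plausible construction of such a similarity index from a Mahalanobis distance against a set of reference-material batches (the 1/(1+d) mapping below is an assumption, not the authors' formula) might look like:

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of a profile x from the reference-batch centre."""
    diff = np.asarray(x, dtype=float) - mean
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

def similarity_index(d):
    """Map a distance in [0, inf) onto a 0-1 similarity scale; 1 means
    identical to the reference centre (one plausible normalization)."""
    return 1.0 / (1.0 + d)

# Reference batches (rows = measured profiles of the reference material):
ref = np.array([[1.0, 2.0], [1.2, 1.8], [0.8, 2.2], [1.1, 2.1]])
mean, cov = ref.mean(axis=0), np.cov(ref, rowvar=False)
idx = similarity_index(mahalanobis(ref[0], mean, cov))
```

    A control chart then plots the index of each newly measured batch over time and flags drift once it falls below a limit derived from the reference batches.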

  18. Idiographic duo-trio tests using a constant-reference based on preference of each consumer: Sample presentation sequence in difference test can be customized for individual consumers to reduce error.

    Science.gov (United States)

    Kim, Min-A; Sim, Hye-Min; Lee, Hye-Seong

    2016-11-01

    As reformulations and processing changes are increasingly needed in the food industry to produce healthier, more sustainable, and cost-effective products while maintaining superior quality, reliable measurements of consumers' sensory perception and discrimination are becoming more critical. Consumer discrimination methods using a preferred-reference duo-trio test design have been shown to improve discrimination performance by customizing sample presentation sequences. However, this design can add complexity to the discrimination task for some consumers, resulting in more errors in sensory discrimination. The objective of the present study was to investigate the effects of different types of test instructions using the preferred-reference duo-trio test design, where a paired-preference test is followed by 6 repeated preferred-reference duo-trio tests, in comparison to the analytical method using the balanced-reference duo-trio. Analyses of d' estimates (product-related measure) and probabilistic sensory discriminators in momentary numbers of subjects showing statistical significance (subject-related measure) revealed that only the preferred-reference duo-trio test using affective reference-framing, either by providing no information about the reference or information on a previously preferred sample, improved sensory discrimination more than the analytical method. No decrease in discrimination performance was observed with any type of instruction, confirming that consumers could handle the test methods. These results suggest that when repeated tests are feasible, using the affective discrimination method would be operationally more efficient as well as ecologically more reliable for measuring consumers' sensory discrimination ability. Copyright © 2016 Elsevier Ltd. All rights reserved.
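
    The d' estimates mentioned above are conventionally obtained by inverting the Thurstonian psychometric function for the duo-trio protocol; the standard model below is an assumption on my part, since the abstract does not state the formula used:

```python
from math import erf, sqrt

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def duo_trio_pc(d):
    """Proportion correct in a duo-trio test under the Thurstonian model:
    Pc = 1 - Phi(d/sqrt(2)) - Phi(d/sqrt(6)) + 2*Phi(d/sqrt(2))*Phi(d/sqrt(6))."""
    a, b = Phi(d / sqrt(2.0)), Phi(d / sqrt(6.0))
    return 1.0 - a - b + 2.0 * a * b

def duo_trio_dprime(pc, hi=10.0, tol=1e-9):
    """Invert duo_trio_pc by bisection; pc must lie in (0.5, duo_trio_pc(hi))."""
    lo = 0.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if duo_trio_pc(mid) < pc:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# e.g. 42 correct answers out of 60 duo-trio trials:
dprime = duo_trio_dprime(42 / 60)
```

    Chance performance in the duo-trio is 0.5, so only proportions above one half map onto a positive d'; the function rises monotonically toward 1 as the products become easier to tell apart.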

  19. A novel approach for craniofacial symmetry evaluation: Using the midsagittal Reference line drawn from “Crista Gali” with NHP technique

    Directory of Open Access Journals (Sweden)

    Morteza Ordobazari

    2013-11-01

    Full Text Available Please cite this article as: Ordobazari M, Naqavi Al-Hosseini AA, Zafarmand H. A novel approach for craniofacial symmetry evaluation: Using the midsagittal reference line drawn from "Crista Gali" with NHP technique. Novel Biomed 2013;1(2):48-53. Background and objective: The purpose of this study was the determination of a midsagittal reference line (MSL) for craniofacial asymmetry assessment by drawing a line from the Crista gali parallel to the true vertical line in PA cephalometry, using the Natural Head Position (NHP) technique. Method and Materials: 60 Iranian subjects within the age range of 9-13 years were selected for this prospective study. Patients were referred for orthodontic treatment and had no supernumerary or missing teeth, no skeletal anomaly, and no history of orthodontic treatment or jaw surgery, with normal occlusion. Posteroanterior cephalometric radiographs (PA Ceph) were taken of all subjects with the NHP technique. The midsagittal line was traced parallel to the hanging chain from the Crista gali. The true horizontal line (THL) and true vertical line (TVL) were also traced from the Crista gali (Cg). Using a Cartesian system based upon the Cg point (0,0), craniofacial symmetry was assessed with linear, angular and proportional measurements in the PA cephalogram, related to the TVL and THL lines, for 10 bilateral (R&L) anatomical landmarks. The mean differences of the above measurements on the left and right sides were analyzed by t-test. Results: The proportional ratios for all left and right measurements were not statistically significant. This was true for both vertical and horizontal distances. The significance level for the MSL drawn from Cg as referred to the ANS (0±0.255) and Me (0.007±0.527) points was 0.002 and 0.004, respectively. Conclusion: In posteroanterior cephalometric radiographs taken with the NHP method, the MSL drawn from the Crista gali is reproducible and reliable up to 96% of the time for facial symmetry diagnosis.

  20. References on the Study and Research of Public External Operational Audit of Structural Non-reimbursable Funds: an Epistemological Approach

    Directory of Open Access Journals (Sweden)

    Constantin AFANASE

    2011-01-01

    Full Text Available The International Auditing and Assurance Standards Board (IAASB) of the International Federation of Accountants (IFAC) is the international standardization body in the auditing field [1]. In our opinion, the theories, methodologies, and standards issued by this body are still the paradigms with the most significant impact on audit rules and practices. Since some theorists define accounting as an applied social science, we can also affirm that audit activity has a social role [2]. We intend to treat this subject not only from a gnoseological point of view; in other words, we will not merely extend current theories and practices. In our research, besides theoretical analysis, we intend to take a critical attitude both toward previous research and toward defining and spreading innovative ideas on the proposed topic. Studying the work of theorists engaged with developments in the field, in order to formulate rules of good practice, is an epistemological matter. From the epistemological point of view, in auditing we operate with value judgments, namely evaluations or practical assessments of phenomena that our work can influence by adopting an attitude of approval or disapproval. Improving the audit of operations financed from external grants can and should be a leverage of the utmost importance for their strategic absorption and for implementation according to the agreements signed with the European Commission, under the full protection of EU financial interests. The present project focuses on optimizing audit procedures and techniques for grant audit operations so that their implementation is transparent, effective, efficient, and economical for the national economy, while complying with the financial interests of the European Union. The challenge of this approach is that the external public audit of externally funded grants should fully

  1. Different methodologies in neutron activation to approach the full analysis of environmental and nutritional samples

    International Nuclear Information System (INIS)

    Freitas, M.C.; Dionisio, I.; Dung, H.M.

    2008-01-01

    Different methodologies of neutron activation analysis (NAA) are now available at the Technological and Nuclear Institute (Sacavem, Portugal), namely Compton suppression, epithermal activation, replicate and cyclic activation, and low energy photon measurement. Prompt gamma activation analysis (PGAA) will be implemented soon. Results by instrumental NAA and PGAA on environmental and nutritional samples are discussed herein, showing that PGAA - carried out at the Institute of Isotope Research (Budapest, Hungary) - provides an effective input for assessing relevant elements. Sensitivity enhancement in NAA by Compton suppression is also illustrated. Through a judicious combination of methodologies, practically all elements of interest in pollution and nutrition terms can be determined. (author)

  2. Improved prediction of MHC class I and class II epitopes using a novel Gibbs sampling approach

    DEFF Research Database (Denmark)

    Nielsen, Morten; Lundegaard, Claus; Worning, Peder

    2004-01-01

    Prediction of which peptides will bind a specific major histocompatibility complex (MHC) constitutes an important step in identifying potential T-cell epitopes suitable as vaccine candidates. MHC class II binding peptides have a broad length distribution complicating such predictions. Thus, identifying the correct alignment is a crucial part of identifying the core of an MHC class II binding motif. In this context, we wish to describe a novel Gibbs motif sampler method ideally suited for recognizing such weak sequence motifs. The method is based on the Gibbs sampling method, and it incorporates...
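The core loop of a Gibbs motif sampler of this kind is: hold one sequence out, build position-specific residue counts from the current alignment of the rest, and resample the held-out sequence's motif offset in proportion to how well each candidate matches those counts. The toy sketch below (our own construction; function names, the scoring, and the pseudocount are illustrative and much simpler than the published method) recovers a planted 3-residue core in unaligned peptides of variable length:

```python
import random

def gibbs_motif(peptides, L=3, iters=500, seed=0):
    """Toy Gibbs sampler: find a shared length-L core in unaligned peptides."""
    rng = random.Random(seed)
    offsets = [rng.randrange(len(p) - L + 1) for p in peptides]
    for _ in range(iters):
        i = rng.randrange(len(peptides))  # hold one peptide out
        # position-specific residue counts from the remaining peptides
        counts = [{} for _ in range(L)]
        for j, p in enumerate(peptides):
            if j == i:
                continue
            for k in range(L):
                aa = p[offsets[j] + k]
                counts[k][aa] = counts[k].get(aa, 0) + 1
        # weight each candidate offset by agreement with the counts
        cands = list(range(len(peptides[i]) - L + 1))
        weights = []
        for o in cands:
            w = 1.0
            for k in range(L):
                w *= counts[k].get(peptides[i][o + k], 0) + 0.1  # pseudocount
            weights.append(w)
        offsets[i] = rng.choices(cands, weights=weights)[0]
    return offsets

# every peptide contains the planted core "WYF" at a different position
peps = ["AAWYFGT", "KWYFDE", "GGSWYFA", "WYFKLM", "TTWYFRR"]
offsets = gibbs_motif(peps, L=3, iters=500, seed=0)
cores = [p[o:o + 3] for p, o in zip(peps, offsets)]
```

Because the resampling step is stochastic, individual runs can differ; with a strong planted signal the sampler locks onto the shared core after a few sweeps.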

  3. A novel approach to non-biased systematic random sampling: a stereologic estimate of Purkinje cells in the human cerebellum.

    Science.gov (United States)

    Agashiwala, Rajiv M; Louis, Elan D; Hof, Patrick R; Perl, Daniel P

    2008-10-21

    Non-biased systematic sampling using the principles of stereology provides accurate quantitative estimates of objects within neuroanatomic structures. However, the basic principles of stereology are not optimally suited for counting objects that selectively exist within a limited but complex and convoluted portion of the sample, such as occurs when counting cerebellar Purkinje cells. In an effort to quantify Purkinje cells in association with certain neurodegenerative disorders, we developed a new method for stereologic sampling of the cerebellar cortex, involving calculating the volume of the cerebellar tissues, identifying and isolating the Purkinje cell layer and using this information to extrapolate non-biased systematic sampling data to estimate the total number of Purkinje cells in the tissues. Using this approach, we counted Purkinje cells in the right cerebella of four human male control specimens, aged 41, 67, 70 and 84 years, and estimated the total Purkinje cell number for the four entire cerebella to be 27.03, 19.74, 20.44 and 22.03 million cells, respectively. The precision of the method is seen when comparing the density of the cells within the tissue: 266,274, 173,166, 167,603 and 183,575 cells/cm3, respectively. Prior literature documents Purkinje cell counts ranging from 14.8 to 30.5 million cells. These data demonstrate the accuracy of our approach. Our novel approach, which offers an improvement over previous methodologies, is of value for quantitative work of this nature. This approach could be applied to morphometric studies of other similarly complex tissues as well.
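The extrapolation step described above reduces to simple arithmetic: a count made in a systematically sampled sub-volume yields a density, which is multiplied by the total volume of the structure. A minimal sketch (the count and volumes below are invented for illustration, though chosen to land near the first specimen's reported density of roughly 266,000 cells/cm3):

```python
def estimate_total(counted, sampled_volume_cm3, total_volume_cm3):
    """Extrapolate a count from a uniformly sampled sub-volume to the
    whole structure: density = count / sampled volume, total = density
    * total volume. Assumes unbiased, uniform systematic sampling."""
    density = counted / sampled_volume_cm3  # cells per cm^3
    return density, density * total_volume_cm3

# hypothetical numbers: 2663 cells counted in 0.01 cm^3 of Purkinje
# cell layer, cerebellar reference volume of 101.5 cm^3
density, total = estimate_total(counted=2663, sampled_volume_cm3=0.01,
                                total_volume_cm3=101.5)
```

The stereologic machinery in the paper (isolating the Purkinje cell layer, fractionator-style sampling) exists to make the uniform-sampling assumption in this arithmetic valid.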

  4. Inferring Population Size History from Large Samples of Genome-Wide Molecular Data - An Approximate Bayesian Computation Approach.

    Directory of Open Access Journals (Sweden)

    Simon Boitard

    2016-03-01

    Full Text Available Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have been recently developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. Besides, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can be easily computed from unphased and unpolarized SNP data. Our approach provides accurate estimations of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles.
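The logic of an ABC method of this kind is: draw parameters from a prior, simulate data under those parameters, summarize the simulation with the same statistics computed on the observed data, and keep the draws whose summaries land close to the observed ones. A bare-bones rejection-ABC sketch follows; it is our own toy (not PopSizeABC), with a one-parameter Gaussian model standing in for the demographic simulator and a single summary statistic standing in for the SFS/LD statistics:

```python
import random

def abc_rejection(observed_stats, prior_sampler, simulate, n_sims=2000,
                  tol=0.5, seed=1):
    """Toy rejection ABC: keep parameter draws whose simulated summary
    statistics fall within Euclidean distance `tol` of the observed ones."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_sims):
        theta = prior_sampler(rng)
        stats = simulate(theta, rng)
        dist = sum((a - b) ** 2 for a, b in zip(stats, observed_stats)) ** 0.5
        if dist < tol:
            accepted.append(theta)
    return accepted

# toy model: one parameter theta; the summary is the mean of 20 noisy draws
def simulate(theta, rng):
    return [sum(rng.gauss(theta, 1.0) for _ in range(20)) / 20]

observed = [5.0]  # pretend the "data" were generated near theta = 5
posterior = abc_rejection(observed, lambda rng: rng.uniform(0, 10), simulate)
```

The accepted draws approximate the posterior; PopSizeABC applies the same scheme with a piecewise-constant population size history as the parameter vector.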

  5. Multi-phase classification by a least-squares support vector machine approach in tomography images of geological samples

    Science.gov (United States)

    Khan, Faisal; Enzmann, Frieder; Kersten, Michael

    2016-03-01

    Image processing of X-ray-computed polychromatic cone-beam micro-tomography (μXCT) data of geological samples mainly involves artefact reduction and phase segmentation. For the former, the main beam-hardening (BH) artefact is removed by applying a best-fit quadratic surface algorithm to a given image data set (reconstructed slice), which minimizes the BH offsets of the attenuation data points from that surface. A Matlab code for this approach is provided in the Appendix. The final BH-corrected image is extracted from the residual data or from the difference between the surface elevation values and the original grey-scale values. For the segmentation, we propose a novel least-squares support vector machine (LS-SVM, an algorithm for pixel-based multi-phase classification) approach. A receiver operating characteristic (ROC) analysis was performed on BH-corrected and uncorrected samples to show that BH correction is in fact an important prerequisite for accurate multi-phase classification. The combination of the two approaches was thus used to classify successfully three different more or less complex multi-phase rock core samples.
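The beam-hardening step lends itself to a compact sketch: build a quadratic design matrix over pixel coordinates, solve the least-squares fit, and subtract the fitted surface from the slice. The paper provides its own Matlab code in its Appendix; the Python version below is our minimal reading of the idea, with invented variable names, demonstrated on a synthetic slice that is a pure quadratic "cupping" artefact (so the residual should vanish):

```python
import numpy as np

def beamhardening_correct(img):
    """Fit a quadratic surface z = a + bx + cy + dx^2 + exy + fy^2 to a
    reconstructed slice by least squares and subtract it, returning the
    residual (BH-corrected) image."""
    h, w = img.shape
    y, x = np.mgrid[0:h, 0:w].astype(float)
    A = np.column_stack([np.ones(img.size), x.ravel(), y.ravel(),
                         x.ravel() ** 2, (x * y).ravel(), y.ravel() ** 2])
    coef, *_ = np.linalg.lstsq(A, img.ravel(), rcond=None)
    surface = (A @ coef).reshape(h, w)
    return img - surface

# synthetic 16x16 slice containing only a quadratic BH-style offset
h, w = 16, 16
yy, xx = np.mgrid[0:h, 0:w].astype(float)
art = 5.0 + 0.1 * xx + 0.02 * xx ** 2 + 0.03 * yy ** 2
resid = beamhardening_correct(art)
```

On real data the residual retains the phase contrast while the smooth cupping offset is removed, which is what makes the subsequent pixel-based classification reliable.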

  6. Rational approach to solvent system selection for liquid-liquid extraction-assisted sample pretreatment in counter-current chromatography.

    Science.gov (United States)

    Wang, Jiajia; Gu, Dongyu; Wang, Miao; Guo, Xinfeng; Li, Haoquan; Dong, Yue; Guo, Hong; Wang, Yi; Fan, Mengqi; Yang, Yi

    2017-05-15

    A rational liquid-liquid extraction approach was established to pre-treat samples for high-speed counter-current chromatography (HSCCC). n-Hexane-ethyl acetate-methanol-water (4:5:4:5, v/v) and (1:5:1:5, v/v) were selected as solvent systems for liquid-liquid extraction by systematically screening the partition coefficients (K) of the target compounds, to remove low- and high-polarity impurities from the sample, respectively. After liquid-liquid extraction, 1.4 g of crude sample II was obtained from 18.5 g of crude sample I, which had been extracted from the flowers of Robinia pseudoacacia L., and was then separated with HSCCC using a solvent system composed of n-hexane-ethyl acetate-methanol-water (1:2:1:2, v/v). As a result, 31 mg of robinin and 37 mg of kaempferol 7-O-α-l-rhamnopyranoside were isolated from 200 mg of crude sample II in a single HSCCC run. A scale-up separation was also performed, and 160 mg of robinin with 95% purity and 188 mg of kaempferol 7-O-α-l-rhamnopyranoside with 97% purity were produced from 1.2 g of crude sample II. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Exploring hadronic tau identification with DC1 data samples: a track-based approach

    CERN Document Server

    Richter-Was, Elzbieta; Tarrade, F

    2004-01-01

    In this note we discuss the identification of hadronic $\tau$s. We propose an algorithm, tauID, which starts from a reconstructed, relatively high-pT track and then collects the calorimetric energy deposition in a fixed cone seeded by the track's eta and phi at the vertex. With the proposed algorithm we explore exclusive features of hadronic $\tau$ decays, and we also indicate the possibility of using an energy-flow based approach for defining the energy scale of the reconstructed tau candidates. The results presented here are limited to the barrel region (|eta| < 1.5) and are based on DC1 events simulated without pile-up and electronic noise. We compare the performance of the proposed algorithm with that of the baseline tauRec algorithm and draw some conclusions for further studies.

  8. Coupling pervaporation to AAS for inorganic and organic mercury determination. A new approach to speciation of Hg in environmental samples.

    Science.gov (United States)

    Fernandez-Rivas, C; Muñoz-Olivas, R; Camara, C

    2001-12-01

    The design and development of a new approach for Hg speciation in environmental samples is described in detail. This method, consisting of the coupling of pervaporation and atomic absorption spectrometry, is based on a membrane phenomenon that combines the evaporation of volatile analytes and their diffusion through a polymeric membrane. It is proposed here as an alternative to gas chromatography for speciation of inorganic and organic Hg compounds, as the latter compounds are volatile and can be separated by applying the principles mentioned above. The interest of this method lies in its easy handling, low cost, and rapidity for the analysis of liquid and solid samples. This method has been applied to Hg speciation in a compost sample provided by a waste water treatment plant.

  9. Interference and k-point sampling in the supercell approach to phase-coherent transport - art. no. 0333401

    DEFF Research Database (Denmark)

    Thygesen, Kristian Sommer; Jacobsen, Karsten Wedel

    2005-01-01

    We present a systematic study of interference and k-point sampling effects in the supercell approach to phase-coherent electron transport. We use a representative tight-binding model to show that interference between the repeated images is a small effect compared to the error introduced by using only the Gamma-point for a supercell containing (3,3) sites in the transverse plane. An insufficient k-point sampling can introduce strong but unphysical features in the transmission function which can be traced to the presence of van Hove singularities in the lead. We present a first-principles calculation of the transmission through a Pt contact which shows that the k-point sampling is also important for realistic systems.

  10. A Model Based Approach to Sample Size Estimation in Recent Onset Type 1 Diabetes

    Science.gov (United States)

    Bundy, Brian; Krischer, Jeffrey P.

    2016-01-01

    The area under the curve C-peptide following a 2-hour mixed meal tolerance test from 481 individuals enrolled on 5 prior TrialNet studies of recent onset type 1 diabetes from baseline to 12 months after enrollment were modelled to produce estimates of its rate of loss and variance. Age at diagnosis and baseline C-peptide were found to be significant predictors and adjusting for these in an ANCOVA resulted in estimates with lower variance. Using these results as planning parameters for new studies results in a nearly 50% reduction in the target sample size. The modelling also produces an expected C-peptide that can be used in Observed vs. Expected calculations to estimate the presumption of benefit in ongoing trials. PMID:26991448
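The roughly 50% sample-size saving follows directly from the standard two-arm formula, in which the required n per arm scales with the residual variance of the endpoint. A back-of-envelope sketch (the effect size, variance, and the assumption that covariate adjustment halves the variance are illustrative choices of ours, not figures from the paper):

```python
from math import ceil

def n_per_arm(sigma, delta, z_alpha=1.96, z_beta=0.8416):
    """Two-arm sample size for a continuous endpoint (normal
    approximation, two-sided alpha = 0.05, power = 80%):
    n = 2 * sigma^2 * (z_alpha + z_beta)^2 / delta^2 per arm."""
    return ceil(2 * sigma ** 2 * (z_alpha + z_beta) ** 2 / delta ** 2)

# if adjusting for age at diagnosis and baseline C-peptide halves the
# residual variance, the required n per arm drops by half as well
n_raw = n_per_arm(sigma=1.0, delta=0.4)        # unadjusted variance
n_adj = n_per_arm(sigma=0.5 ** 0.5, delta=0.4) # variance halved by ANCOVA
```

This is why lower-variance estimates from the ANCOVA model translate directly into smaller target sample sizes for new trials.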

  11. A model-based approach to sample size estimation in recent onset type 1 diabetes.

    Science.gov (United States)

    Bundy, Brian N; Krischer, Jeffrey P

    2016-11-01

    The area under the curve C-peptide following a 2-h mixed meal tolerance test from 498 individuals enrolled on five prior TrialNet studies of recent onset type 1 diabetes from baseline to 12 months after enrolment were modelled to produce estimates of its rate of loss and variance. Age at diagnosis and baseline C-peptide were found to be significant predictors, and adjusting for these in an ANCOVA resulted in estimates with lower variance. Using these results as planning parameters for new studies results in a nearly 50% reduction in the target sample size. The modelling also produces an expected C-peptide that can be used in observed versus expected calculations to estimate the presumption of benefit in ongoing trials. Copyright © 2016 John Wiley & Sons, Ltd.

  12. Simultaneous alignment and clustering of peptide data using a Gibbs sampling approach

    DEFF Research Database (Denmark)

    Andreatta, Massimo; Lund, Ole; Nielsen, Morten

    2013-01-01

    Motivation: Proteins recognizing short peptide fragments play a central role in cellular signaling. As a result of high-throughput technologies, peptide-binding protein specificities can be studied using large peptide libraries at dramatically lower cost and time. Interpretation of such large peptide datasets, however, is a complex task, especially when the data contain multiple receptor binding motifs, and/or the motifs are found at different locations within distinct peptides. Results: The algorithm presented in this article, based on Gibbs sampling, identifies multiple specificities of unaligned peptide datasets of variable length. Example applications described in this article include mixtures of binders to different MHC class I and class II alleles, distinct classes of ligands for SH3 domains and sub-specificities of the HLA-A*02:01 molecule. Availability: The Gibbs clustering method...

  13. Episodic work-family conflict, cardiovascular indicators, and social support: an experience sampling approach.

    Science.gov (United States)

    Shockley, Kristen M; Allen, Tammy D

    2013-07-01

    Work-family conflict, a prevalent stressor in today's workforce, has been linked to several detrimental consequences for the individual, including physical health. The present study extends this area of research by examining episodic work-family conflict in relation to objectively measured cardiovascular health indicators (systolic and diastolic blood pressure and heart rate) using an experience sampling methodology. The results suggested that the occurrence of an episode of work interference with family conflict is linked to a subsequent increase in heart rate but not blood pressure; however, the relationship between episodes of family interference with work conflict and both systolic and diastolic blood pressure is moderated by perceptions of family-supportive supervision. No evidence was found for the moderating role of work-supportive family. Further theoretical and practical implications are discussed. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  14. APPLICATION OF SPATIAL MODELLING APPROACHES, SAMPLING STRATEGIES AND 3S TECHNOLOGY WITHIN AN ECOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    H.-C. Chen

    2012-07-01

    Full Text Available How to effectively describe ecological patterns in nature over broader spatial scales and build a modeling ecological framework has become an important issue in ecological research. We test four modeling methods (MAXENT, DOMAIN, GLM and ANN) to predict the potential habitat of Schima superba (Chinese guger tree, CGT) at different spatial scales in the Huisun study area in Taiwan. We then created three sampling designs (from small to large scales) for model development and validation, using different combinations of CGT samples from three sites (Tong-Feng watershed, Yo-Shan Mountain, and Kuan-Dau watershed). These models combine points of known occurrence and topographic variables to infer the potential spatial distribution of CGT. Our assessment revealed that method performance from highest to lowest was: MAXENT, DOMAIN, GLM and ANN at the small spatial scale. The MAXENT and DOMAIN models were the most capable of predicting the tree's potential habitat. However, the outcome clearly indicated that models based merely on topographic variables performed poorly on large spatial extrapolation from Tong-Feng to Kuan-Dau, because the humidity and sun illumination of the two watersheds are affected by their microterrains and are quite different from each other. Thus, models developed from topographic variables can only be applied within a limited geographical extent without significant error. Future studies will attempt to use variables involving spectral information associated with species, extracted from remotely sensed data of high spatial and spectral resolution, especially hyperspectral image data, to build a model that can be applied at a large spatial scale.

  15. Biodiversity sampling using a global acoustic approach: contrasting sites with microendemics in New Caledonia.

    Directory of Open Access Journals (Sweden)

    Amandine Gasc

    Full Text Available New Caledonia is a Pacific island with a unique biodiversity showing extreme microendemism. Many species distributions observed on this island are extremely restricted, localized to mountains or rivers, making biodiversity evaluation and conservation a difficult task. A rapid biodiversity assessment method based on acoustics was recently proposed. This method could help to document the unique spatial structure observed in New Caledonia. Here, this method was applied in an attempt to reveal differences among three mountain sites (Mandjélia, Koghis and Aoupinié) with similar ecological features and species richness levels, but with high beta diversity according to different microendemic assemblages. In each site, several local acoustic communities were sampled with audio recorders. An automatic acoustic sampling was run on these three sites for a period of 82 successive days. Acoustic properties of animal communities were analysed without any species identification. A frequency spectral complexity index (NP) was used as an estimate of the level of acoustic activity, and a frequency spectral dissimilarity index (Df) assessed acoustic differences between pairs of recordings. As expected, the index NP did not reveal significant differences in the acoustic activity level between the three sites. However, the acoustic variability estimated by the index Df could first be explained by changes in the acoustic communities along the 24-hour cycle and second by acoustic dissimilarities between the three sites. The results support the hypothesis that global acoustic analyses can detect acoustic differences between sites with similar species richness and similar ecological context, but with different species assemblages. This study also demonstrates that global acoustic methods applied at broad spatial and temporal scales could help to assess local biodiversity in the challenging context of microendemism. The method could be deployed over large areas, and
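A spectral dissimilarity index of this kind compares the normalized frequency spectra of two recordings without identifying any species: identical spectral shapes score 0, disjoint ones score near 1. The toy version below only mirrors that idea (the published Df has its own exact definition; the signals, bin count, and normalization here are our illustrative choices):

```python
import numpy as np

def spectral_dissimilarity(x, y, n_bins=64):
    """Toy Df-style index: half the L1 distance between the normalized
    magnitude spectra of two recordings (0 = identical shape, ~1 = disjoint)."""
    def norm_spectrum(s):
        mag = np.abs(np.fft.rfft(s, n=2 * n_bins))[:n_bins]
        return mag / mag.sum()
    return 0.5 * np.abs(norm_spectrum(x) - norm_spectrum(y)).sum()

# two synthetic "recordings": same tone vs. tones an octave-plus apart
t = np.arange(0, 1, 1 / 1024.0)
same = spectral_dissimilarity(np.sin(2 * np.pi * 50 * t),
                              np.sin(2 * np.pi * 50 * t))
diff = spectral_dissimilarity(np.sin(2 * np.pi * 50 * t),
                              np.sin(2 * np.pi * 200 * t))
```

Averaged over many recording pairs, such an index can separate sites whose communities sound different even when their overall activity levels (the NP index) are similar.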

  16. Automated washing of FTA Card punches and PCR setup for reference samples using a LIMS-controlled Sias Xantus automated liquid handler

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Olsen, Addie Nina; Frøslev, Tobias G.

    2009-01-01

    We have implemented and validated automated methods for washing FTA Card punches containing buccal samples and subsequent PCR setup using a Sias Xantus automated liquid handler. The automated methods were controlled by worklists generated by our LabWare Laboratory Information Management System...

  17. The Structured Clinical Interview for DSM-IV Childhood Diagnoses (Kid-SCID): first psychometric evaluation in a Dutch sample of clinically referred youths

    NARCIS (Netherlands)

    Roelofs, J.; Muris, P.; Braet, C.; Arntz, A.; Beelen, I.

    2015-01-01

    The Structured Clinical Interview for DSM-IV Childhood Disorders (Kid-SCID) is a semi-structured interview for the classification of psychiatric disorders in children and adolescents. This study presents a first evaluation of the psychometric properties of the Kid-SCID in a Dutch sample of children

  18. Vitamins A and E in liver, kidney, and whole blood of East Greenland polar bears sampled 1994–2008: reference values and temporal trends

    DEFF Research Database (Denmark)

    Bechshoft, T.; Sonne, C.; Jakobsen, Jette

    2015-01-01

    ... on this health issue in polar bears (Ursus maritimus). The aim of this study was thus to provide reference values for concentrations of vitamin A in liver, kidney cortex, and whole blood and vitamin E in kidney cortex and whole blood from 166 East Greenland polar bears, as well as to assess the relationship between POPs and vitamin concentrations. In addition, vitamin concentrations were analyzed for temporal trends (1994–2008). Results showed vitamin A in liver to be higher in adult bears and the concentrations of vitamin E in kidney and blood to likewise be generally higher in adult bears. In addition, all... that POPs could be disrupting polar bear vitamin status. However, while the observed temporal increases in vitamin concentrations were likely POP related, the question remains as to whether they stem from the influence of contaminants only or also, e.g., changes in prey species. Further studies are needed...

  19. Optimized Field Sampling and Monitoring of Airborne Hazardous Transport Plumes; A Geostatistical Simulation Approach

    International Nuclear Information System (INIS)

    Chen, DI-WEN

    2001-01-01

    Airborne hazardous plumes inadvertently released during nuclear/chemical/biological incidents are mostly of unknown composition and concentration until measurements are taken of post-accident ground concentrations from plume-ground deposition of constituents. Unfortunately, measurements often come days post-incident and rely on hazardous manned air-vehicle measurements. Before this happens, computational plume migration models are the only source of information on the plume characteristics, constituents, concentrations, directions of travel, ground deposition, etc. A mobile "lighter than air" (LTA) system is being developed at Oak Ridge National Laboratory that will be part of the first response in emergency conditions. These interactive and remote unmanned air vehicles will carry light-weight detectors and weather instrumentation to measure the conditions during and after plume release. This requires a cooperative, computationally organized, GPS-controlled set of LTAs that self-coordinate around the objectives of an emergency situation in restricted time frames. A critical step before an optimum and cost-effective field sampling and monitoring program can proceed is the collection of statistically significant data in a reliable and expeditious manner. Efficient aerial arrangements of the detectors taking the data (for active airborne release conditions) are necessary for plume identification, computational 3-dimensional reconstruction, and source distribution functions. This report describes the application of stochastic or geostatistical simulations to delineate the plume for guiding subsequent sampling and monitoring designs. A case study is presented of building digital plume images, based on existing "hard" experimental data and "soft" preliminary transport modeling results of the Prairie Grass Trials Site. Markov Bayes Simulation, a coupled Bayesian/geostatistical methodology, quantitatively combines soft information

  20. Micro-Crater Laser Induced Breakdown Spectroscopy--an Analytical approach in metal samples

    Energy Technology Data Exchange (ETDEWEB)

    Piscitelli, Vincent [UCV- Laboratorio de Espectroscopia Laser, Caracas (Venezuela); Lawrence Berkeley National laboratory, Berkeley, US (United States); Gonzalez, Jhanis; Xianglei, Mao; Russo, Richard [Lawrence Berkeley National laboratory, Berkeley, US (United States); Fernandez, Alberto [UCV- Laboratorio de Espectroscopia Laser, Caracas (Venezuela)

    2008-04-15

    Laser ablation has been increasing in popularity as a technique for chemical analysis, owing to its great potential for the analysis of solid samples. To contribute to the development of the technique, in this work we studied laser-induced breakdown spectroscopy (LIBS) under micro-ablation conditions, with a view to future studies of coatings and micro-crater analysis. Craters between 2 and 7 micrometers in diameter were made using a nanosecond Nd-YAG laser at its fundamental emission wavelength of 1064 nm. To create these craters we used a long-working-distance objective lens with a numerical aperture of 0.45. The atomic emission versus laser energy, and its effect on crater size, was studied. We found that below 3 micrometers, although there was evidence of material removal through the formation of a crater, no atomic emission was detectable with our instruments. To try to understand this, plots of crater size versus plasma temperature were made, with the temperature derived from Boltzmann distribution graphs of the copper emission lines in the visible region. In addition, calibration curves for copper and aluminum were made in two different matrices: a Cu/Zn alloy and a zinc matrix. The atomic lines Cu I (521.78 nm) and Al I (396.15 nm) were used. From the calibration curves, the analytical limit of detection and other analytical parameters were obtained.
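The plasma-temperature step rests on the standard Boltzmann plot: for emission lines in local thermodynamic equilibrium, ln(I*lambda/(g*A)) falls linearly with the upper-level energy, with slope -1/(kT). A self-contained sketch with synthetic lines follows; the energies, statistical weights, and transition probabilities below are made-up illustration values (not real Cu data), generated at a known temperature so the estimator should recover it:

```python
import numpy as np

K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

def boltzmann_temperature(intensity, wavelength_nm, g, A, E_upper_eV):
    """Boltzmann-plot temperature estimate: ln(I*lambda/(g*A)) vs upper-level
    energy is linear with slope -1/(kT); fit the slope and invert it."""
    yvals = np.log(intensity * wavelength_nm / (g * A))
    slope, _ = np.polyfit(E_upper_eV, yvals, 1)
    return -1.0 / (K_B_EV * slope)

# synthetic emission lines generated at T = 10000 K
T_true = 10000.0
E = np.array([3.8, 5.1, 6.1, 7.7])    # upper-level energies, eV (illustrative)
g = np.array([4.0, 6.0, 2.0, 4.0])    # statistical weights (illustrative)
A = np.array([2e7, 1e8, 3e7, 5e7])    # transition probabilities, 1/s
lam = np.array([521.8, 515.3, 510.5, 470.0])  # wavelengths, nm
I = g * A / lam * np.exp(-E / (K_B_EV * T_true))
T_est = boltzmann_temperature(I, lam, g, A, E)
```

Plotting the fitted temperature against crater size, as the abstract describes, then shows how plasma conditions change as the ablated volume shrinks toward the detection limit.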

  1. Extended multi-configuration quasi-degenerate perturbation theory: the new approach to multi-state multi-reference perturbation theory.

    Science.gov (United States)

    Granovsky, Alexander A

    2011-06-07

    The distinctive desirable features, both mathematically and physically meaningful, for all partially contracted multi-state multi-reference perturbation theories (MS-MR-PT) are explicitly formulated. The original approach to MS-MR-PT theory, called extended multi-configuration quasi-degenerate perturbation theory (XMCQDPT), having most, if not all, of the desirable properties is introduced. The new method is applied at the second order of perturbation theory (XMCQDPT2) to the 1(1)A(')-2(1)A(') conical intersection in allene molecule, the avoided crossing in LiF molecule, and the 1(1)A(1) to 2(1)A(1) electronic transition in cis-1,3-butadiene. The new theory has several advantages compared to those of well-established approaches, such as second order multi-configuration quasi-degenerate perturbation theory and multi-state-second order complete active space perturbation theory. The analysis of the prevalent approaches to the MS-MR-PT theory performed within the framework of the XMCQDPT theory unveils the origin of their common inherent problems. We describe the efficient implementation strategy that makes XMCQDPT2 an especially useful general-purpose tool in the high-level modeling of small to large molecular systems. © 2011 American Institute of Physics

  2. Automated extraction of DNA from reference samples from various types of biological materials on the Qiagen BioRobot EZ1 Workstation

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Jørgensen, Mads; Hansen, Anders Johannes

    2009-01-01

    We have validated and implemented a protocol for DNA extraction from various types of biological materials using a Qiagen BioRobot EZ1 Workstation. The sample materials included whole blood, blood from deceased, buccal cells on Omni swabs and FTA Cards, blood on FTA Cards and cotton swabs, and muscle biopsies. The DNA extraction was validated according to EN/ISO 17025 for the STR kits AmpFlSTR® Identifiler® and AmpFlSTR® Yfiler® (Applied Biosystems). Of 298 samples extracted, 11 (4%) did not yield acceptable results. In conclusion, we have demonstrated that extraction of DNA from various types of biological material can be performed quickly and without the use of hazardous chemicals, and that the DNA may be successfully STR typed according to the requirements of forensic genetic investigations accredited according to EN/ISO 17025.

  3. Out-of-Pocket Expenditures on Complementary Health Approaches Associated with Painful Health Conditions in a Nationally Representative Adult Sample

    Science.gov (United States)

    Nahin, Richard L.; Stussman, Barbara J.; Herman, Patricia M.

    2015-01-01

    National surveys suggest that millions of adults in the United States use complementary health approaches such as acupuncture, chiropractic manipulation, and herbal medicines to manage painful conditions such as arthritis, back pain and fibromyalgia. Yet, national and per person out-of-pocket (OOP) costs attributable to this condition-specific use are unknown. In the 2007 National Health Interview Survey, use of complementary health approaches, reasons for this use, and associated OOP costs were captured in a nationally representative sample of 5,467 adults. Ordinary least square regression models that controlled for co-morbid conditions were used to estimate aggregate and per person OOP costs associated with 14 painful health conditions. Individuals using complementary approaches spent a total of $14.9 billion (S.E. $0.9 billion) OOP on these approaches to manage these painful conditions. Total OOP expenditures seen in those using complementary approaches for their back pain ($8.7 billion, S.E. $0.8 billion) far outstripped that of any other condition, with the majority of these costs ($4.7 billion, S.E. $0.4 billion) resulting from visits to complementary providers. Annual condition-specific per-person OOP costs varied from a low of $568 (SE $144) for regular headaches, to a high of $895 (SE $163) for fibromyalgia. PMID:26320946

  4. Reference Assessment

    Science.gov (United States)

    Bivens-Tatum, Wayne

    2006-01-01

    This article presents interesting articles that explore several different areas of reference assessment, including practical case studies and theoretical articles that address a range of issues such as librarian behavior, patron satisfaction, virtual reference, or evaluation design. They include: (1) "Evaluating the Quality of a Chat Service"…

  5. A Dictionary Learning Approach for Signal Sampling in Task-Based fMRI for Reduction of Big Data.

    Science.gov (United States)

    Ge, Bao; Li, Xiang; Jiang, Xi; Sun, Yifei; Liu, Tianming

    2018-01-01

    The exponential growth of fMRI big data offers researchers an unprecedented opportunity to explore functional brain networks. However, this opportunity has not been fully explored yet due to the lack of effective and efficient tools for handling such fMRI big data. One major challenge is that computing capabilities still lag behind the growth of large-scale fMRI databases, e.g., it takes many days to perform dictionary learning and sparse coding of whole-brain fMRI data for an fMRI database of average size. Therefore, how to reduce the data size without losing important information becomes a more and more pressing issue. To address this problem, we propose a signal sampling approach for significant fMRI data reduction before performing structurally guided dictionary learning and sparse coding of whole-brain fMRI data. We compared the proposed structurally guided sampling method with no sampling, random sampling and uniform sampling schemes, and experiments on the Human Connectome Project (HCP) task fMRI data demonstrated that the proposed method can achieve more than 15 times speed-up without sacrificing the accuracy in identifying task-evoked functional brain networks.
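
    The data-reduction step described above, keeping only a fraction of the voxel time series before dictionary learning, can be sketched for the two baseline schemes the study compares (random and uniform sampling). This is a minimal illustration: the matrix sizes and the sample_signals helper are hypothetical, and the structurally guided scheme itself would additionally require anatomical labels.

```python
import numpy as np

def sample_signals(X, fraction=1/16, scheme="random", seed=0):
    """Reduce a signal matrix X (n_voxels x n_timepoints) by keeping
    only a fraction of the voxel time series.
    Schemes: 'random' (uniform random subset) and 'uniform'
    (every k-th voxel), two of the baselines compared in the study."""
    n = X.shape[0]
    k = max(1, int(round(n * fraction)))
    if scheme == "random":
        rng = np.random.default_rng(seed)
        idx = rng.choice(n, size=k, replace=False)
    elif scheme == "uniform":
        idx = np.arange(0, n, n // k)[:k]
    else:
        raise ValueError(scheme)
    idx = np.sort(idx)
    return X[idx], idx

# A 16x reduction shrinks the dictionary-learning problem accordingly.
X = np.random.randn(6400, 284)            # hypothetical fMRI signal matrix
Xr, idx = sample_signals(X, fraction=1/16, scheme="random")
print(Xr.shape)                           # (400, 284)
```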

  6. A Dictionary Learning Approach for Signal Sampling in Task-Based fMRI for Reduction of Big Data

    Science.gov (United States)

    Ge, Bao; Li, Xiang; Jiang, Xi; Sun, Yifei; Liu, Tianming

    2018-01-01

    The exponential growth of fMRI big data offers researchers an unprecedented opportunity to explore functional brain networks. However, this opportunity has not been fully explored yet due to the lack of effective and efficient tools for handling such fMRI big data. One major challenge is that computing capabilities still lag behind the growth of large-scale fMRI databases, e.g., it takes many days to perform dictionary learning and sparse coding of whole-brain fMRI data for an fMRI database of average size. Therefore, how to reduce the data size without losing important information becomes a more and more pressing issue. To address this problem, we propose a signal sampling approach for significant fMRI data reduction before performing structurally guided dictionary learning and sparse coding of whole-brain fMRI data. We compared the proposed structurally guided sampling method with no sampling, random sampling and uniform sampling schemes, and experiments on the Human Connectome Project (HCP) task fMRI data demonstrated that the proposed method can achieve more than 15 times speed-up without sacrificing the accuracy in identifying task-evoked functional brain networks. PMID:29706880

  7. A certified urea reference material (NMIJ CRM 6006-a) as a reliable calibrant for the elemental analyses of amino acids and food samples.

    Science.gov (United States)

    Itoh, Nobuyasu; Yamazaki, Taichi; Sato, Ayako; Numata, Masahiko; Takatsu, Akiko

    2014-01-01

    We examined the reliability of a certified reference material (CRM) for urea (NMIJ CRM 6006-a) as a calibrant for N, C, and H in elemental analyzers. Only the N content for this CRM is provided as an indicative value. To estimate the C and H contents of the urea CRM, we took into account the purity of the urea and the presence of other identified impurities. When we examined the use of various masses of the calibrant (0.2 to 2 mg), we unexpectedly observed low signal intensities for small masses of H and N, but these plateaued at about 2 mg. We therefore analyzed four amino acid CRMs and four food CRMs on a 2-mg scale with the urea CRM as the calibrant. For the amino acid CRMs, the differences in the analytical and theoretical contents (≤0.0026 kg/kg) were acceptable with good repeatability (≤0.0013 kg/kg in standard deviation; n = 4). For food CRMs, comparable repeatabilities to those obtained with amino acid CRMs (≤0.0025 kg/kg in standard deviation; n = 4) were obtained. The urea CRM can therefore be used as a reliable calibrant for C, H, and N in an elemental analyzer.
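
    Since the certificate gives only an indicative N content, the theoretical C and H contents of pure urea follow directly from its stoichiometry (CH4N2O). A minimal sketch of that calculation using standard IUPAC atomic masses; the mass_fractions helper is our own illustration, and a real calibrant value must still be corrected for the CRM's purity and identified impurities.

```python
# Theoretical N, C and H mass fractions of urea, CH4N2O.
ATOMIC = {"C": 12.011, "H": 1.008, "N": 14.007, "O": 15.999}

def mass_fractions(formula):
    """formula: dict mapping element symbol -> atom count."""
    M = sum(ATOMIC[el] * n for el, n in formula.items())  # molar mass
    return {el: ATOMIC[el] * n / M for el, n in formula.items()}

urea = mass_fractions({"C": 1, "H": 4, "N": 2, "O": 1})
print({el: round(f, 4) for el, f in urea.items()})
# N ~ 0.4665, C ~ 0.2000, H ~ 0.0671 kg/kg for pure urea
```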

  8. Relationship between dietary approaches to stop hypertension score and presence or absence of coronary heart diseases in patients referring to Imam Hossein Hospital, Tehran, Iran

    Directory of Open Access Journals (Sweden)

    Zeinab Mokhtari

    2013-11-01

    Full Text Available BACKGROUND: The dietary approaches to stop hypertension (DASH) dietary pattern reduces blood pressure. However, there is little information about the relationship between DASH and coronary heart diseases. This study aimed to assess the relationship between a DASH-style diet adherence score and coronary heart diseases (CHD) in patients referring for coronary angiography. METHODS: In this study, 201 adults (102 males, 99 females) within the age range of 40-80 years who referred for coronary angiography were selected. Diet was evaluated using a validated food frequency questionnaire. DASH score was calculated based on 8 food components (fruits, vegetables, whole grains, nuts and legumes, low fat dairy, red/processed meats, soft drinks/sweets, and sodium). The relationship between DASH score and CHD was assessed using logistic regression analysis. RESULTS: Mean of DASH score was 23.99 ± 4.41. Individuals in the highest quartile of DASH score were less likely to have CHD [odds ratio (OR) = 0.38, 95% confidence interval (CI): 0.16-0.86]. However, after adjustment for gender or smoking, there was little evidence that coronary heart disease was associated with DASH diet score. There was a significant negative correlation between DASH score and diastolic blood pressure (P ≤ 0.05). CONCLUSION: In conclusion, having a diet similar to DASH plan was not independently related to CHD in this study. This might indicate that having a healthy dietary pattern, such as DASH pattern, is highly related to gender (dietary pattern is healthier in women than in men) or smoking habit (non-smokers have a healthier dietary pattern compared to smokers). Keywords: Coronary Heart Disease, Dietary Approach to Stop Hypertension, Blood Pressure
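
    The reported association, OR = 0.38 (95% CI: 0.16-0.86), comes from logistic regression. As a hedged illustration of the underlying quantity, an unadjusted odds ratio with a Wald confidence interval can be computed from a 2x2 table; the counts below are hypothetical, not the study's data.

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# hypothetical counts: CHD vs no CHD, top DASH quartile vs the rest
or_, (lo, hi) = odds_ratio(12, 38, 75, 76)
print(round(or_, 2), (round(lo, 3), round(hi, 3)))
```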

  9. Cross-validation of the Dot Counting Test in a large sample of credible and non-credible patients referred for neuropsychological testing.

    Science.gov (United States)

    McCaul, Courtney; Boone, Kyle B; Ermshar, Annette; Cottingham, Maria; Victor, Tara L; Ziegler, Elizabeth; Zeller, Michelle A; Wright, Matthew

    2018-01-18

    To cross-validate the Dot Counting Test in a large neuropsychological sample. Dot Counting Test scores were compared in credible (n = 142) and non-credible (n = 335) neuropsychology referrals. Non-credible patients scored significantly higher than credible patients on all Dot Counting Test scores. While the original E-score cut-off of ≥17 achieved excellent specificity (96.5%), it was associated with mediocre sensitivity (52.8%). However, the cut-off could be substantially lowered to ≥13.80, while still maintaining adequate specificity (≥90%) and raising sensitivity to 70.0%. Examination of non-credible subgroups revealed that Dot Counting Test sensitivity in feigned mild traumatic brain injury (mTBI) was 55.8%, whereas sensitivity was 90.6% in patients with non-credible cognitive dysfunction in the context of claimed psychosis, and 81.0% in patients with non-credible cognitive performance in depression or severe TBI. Thus, the Dot Counting Test may have a particular role in detection of non-credible cognitive symptoms in claimed psychiatric disorders. As an alternative to use of the E-score, failure on ≥1 cut-offs applied to individual Dot Counting Test scores (≥6.0″ for mean grouped dot counting time, ≥10.0″ for mean ungrouped dot counting time, and ≥4 errors) occurred in 11.3% of the credible sample, while nearly two-thirds (63.6%) of the non-credible sample failed one or more of these cut-offs. An E-score cut-off of 13.80, or failure on ≥1 individual score cut-offs, resulted in few false positive identifications in credible patients and achieved high sensitivity (64.0-70.0%), and therefore appears appropriate for use in identifying neurocognitive performance invalidity.
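
    The decision rule described above (E-score cut-off, or failure on any individual cut-off) can be written down directly. This is purely an illustration of the reported cut-offs, not a clinical tool.

```python
def dot_counting_flag(e_score=None, grouped_mean=None,
                      ungrouped_mean=None, errors=None):
    """Performance-invalidity flag based on the cut-offs reported in
    the study: E-score >= 13.80, or failure on any individual cut-off
    (mean grouped dot counting time >= 6.0 s, mean ungrouped time
    >= 10.0 s, >= 4 errors). Missing scores are simply skipped."""
    if e_score is not None and e_score >= 13.80:
        return True
    if grouped_mean is not None and grouped_mean >= 6.0:
        return True
    if ungrouped_mean is not None and ungrouped_mean >= 10.0:
        return True
    if errors is not None and errors >= 4:
        return True
    return False

print(dot_counting_flag(e_score=14.2))                 # True
print(dot_counting_flag(grouped_mean=4.1, errors=2))   # False
```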

  10. Android quick APIs reference

    CERN Document Server

    Cinar, Onur

    2015-01-01

    The Android Quick APIs Reference is a condensed code and APIs reference for the new Google Android 5.0 SDK. It presents the essential Android APIs in a well-organized format that can be used as a handy reference. You won't find any technical jargon, bloated samples, drawn out history lessons, or witty stories in this book. What you will find is a software development kit and APIs reference that is concise, to the point and highly accessible. The book is packed with useful information and is a must-have for any mobile or Android app developer or programmer. In the Android Quick APIs Refe

  11. Dental Erosion in Patients with Gastroesophageal Reflux Disease (GERD) in a Sample of Patients Referred to the Motahari Clinic, Shiraz, Iran.

    Science.gov (United States)

    Alavi, G; Alavi, Aa; Saberfiroozi, M; Sarbazi, Ah; Motamedi, M; Hamedani, Sh

    2014-03-01

    Systematic reviews of the literature show that dental erosion is associated with gastroesophageal reflux disease (GERD). The prevalence of the problem may not be the same in different countries. The purpose of this study was to investigate the association of gastro-esophageal reflux disease (GERD) with dental erosion in a sample of the Iranian population, given the differences in Iranian oral hygiene and diet. Materials and Methods: 140 patients aged 30 to 50 years comprised the study group. The participants were already eligible for endoscopic examination, as determined by their gastroenterologist. All patients completed a detailed questionnaire regarding their medical and dental status. After completing the questionnaire and before endoscopy, a dental examination was performed by two blinded dentists. The endoscopy was then performed by a gastroenterologist and the patients were divided into three groups: healthy, suspected GERD, and positive GERD. Data were collected and analyzed by the chi-square test. Cross tabulation was performed to compare the qualitative variables and discover correlations. Statistical significance was set at p < 0.05. The prevalence of dental erosion in GERD patients (22.6%) was found to be higher than in the suspected (5.3%) and healthy (7%) individuals. This study showed that GERD patients are at higher risk of developing dental erosion compared to healthy individuals in a sample of the Iranian population.

  12. Recent references

    International Nuclear Information System (INIS)

    Ramavataram, S.

    1991-01-01

    In support of a continuing program of systematic evaluation of nuclear structure data, the National Nuclear Data Center maintains a complete computer file of references to the nuclear physics literature. Each reference is tagged by a keyword string, which indicates the kinds of data contained in the article. This master file of Nuclear Structure References (NSR) contains complete keyword indexes to literature published since 1969, with partial indexing of older references. Any reader who finds errors in the keyword descriptions is urged to report them to the National Nuclear Data Center so that the master NSR file can be corrected. In 1966, the first collection of Recent References was published as a separate issue of Nuclear Data Sheets. Every four months since 1970, a similar indexed bibliography to new nuclear experiments has been prepared from additions to the NSR file and published. Beginning in 1978, Recent References was cumulated annually, with the third issue completely superseding the two issues previously published during a given year. Due to publication policy changes, cumulation of Recent References was discontinued in 1986. The volume and issue number of all the cumulative issues published to date are given. NNDC will continue to respond to individual requests for special bibliographies on nuclear physics topics, in addition to those easily obtained from Recent References. If the required information is available from the keyword string, a reference list can be prepared automatically from the computer files. This service can be provided on request, in exchange for the timely communication of new nuclear physics results (e.g., preprints). A current copy of the NSR file may also be obtained in a standard format on magnetic tape from NNDC. Requests for special searches of the NSR file may also be directed to the National Nuclear Data Center

  13. Evaluation of the highly sensitive Roche thyroglobulin II assay and establishment of a reference limit for thyroglobulin-negative patient samples

    Directory of Open Access Journals (Sweden)

    Dorien M. Rotteveel-de Groot

    2016-08-01

    Full Text Available Objectives: Thyroglobulin (Tg) measurements are used to monitor for residual thyroid tissue in patients with differentiated thyroid cancer (DTC) after thyroidectomy and radioiodine ablative therapy. In recent years highly sensitive Tg assays have been developed. In this study the analytical performance of the new Roche Elecsys Tg II assay was evaluated and compared with the well documented Access2 Tg assay (Beckman–Coulter). Design and methods: Analytical performance was examined using various Clinical and Laboratory Standards Institute (CLSI) evaluation protocols. Tg negative patient sera were used to establish an upper reference limit (URL) for the Elecsys Tg II assay. Results: Non-linearity, drift and carry-over according to CLSI EP10 and EP6 in a measuring range of 0.04–500 ng/mL were non-significant. Total precision according to CLSI EP5 was 10% at a Tg concentration of 0.08 ng/mL. A patient serum comparison performed according to a modified CLSI EP9 protocol showed a significant difference of a factor of approximately 1.4, despite using an identical CRM calibrator. The Elecsys Tg II assay measured Tg with a two-fold higher sensitivity than the Access2 assay. Finally, using human sera without Tg, an URL of 0.05 ng/mL was determined. Conclusions: In our hands the highly sensitive Elecsys Tg II assay shows a good analytical performance and a higher sensitivity compared to the Access2 Tg assay. An URL of 0.05 ng/mL for the Elecsys Tg II assay was determined which may improve the clinical utility of the assay for the detection of residual DTC or disease recurrence. Keywords: Thyroglobulin, Roche Elecsys Tg II assay, validation, reporting limit

  14. Evaluation of the highly sensitive Roche thyroglobulin II assay and establishment of a reference limit for thyroglobulin-negative patient samples.

    Science.gov (United States)

    Rotteveel-de Groot, Dorien M; Ross, H Alec; Janssen, Marcel J R; Netea-Maier, Romana T; Oosting, Janine D; Sweep, Fred C G J; van Herwaarden, Antonius E

    2016-08-01

    Thyroglobulin (Tg) measurements are used to monitor for residual thyroid tissue in patients with differentiated thyroid cancer (DTC) after thyroidectomy and radioiodine ablative therapy. In recent years highly sensitive Tg assays have been developed. In this study the analytical performance of the new Roche Elecsys Tg II assay was evaluated and compared with the well documented Access2 Tg assay (Beckman-Coulter). Analytical performance was examined using various Clinical and Laboratory Standards Institute (CLSI) evaluation protocols. Tg negative patient sera were used to establish an upper reference limit (URL) for the Elecsys Tg II assay. Non-linearity, drift and carry-over according to CLSI EP10 and EP6 in a measuring range of 0.04-500 ng/mL were non-significant. Total precision according to CLSI EP5 was 10% at a Tg concentration of 0.08 ng/mL. A patient serum comparison performed according to a modified CLSI EP9 protocol showed a significant difference of a factor of approximately 1.4, despite using an identical CRM calibrator. The Elecsys Tg II assay measured Tg with a two-fold higher sensitivity than the Access2 assay. Finally, using human sera without Tg, an URL of 0.05 ng/mL was determined. In our hands the highly sensitive Elecsys Tg II assay shows a good analytical performance and a higher sensitivity compared to the Access2 Tg assay. An URL of 0.05 ng/mL for the Elecsys Tg II assay was determined which may improve the clinical utility of the assay for the detection of residual DTC or disease recurrence.

  15. Laser sampling

    International Nuclear Information System (INIS)

    Gorbatenko, A A; Revina, E I

    2015-01-01

    The review is devoted to the major advances in laser sampling. The advantages and drawbacks of the technique are considered. Specific features of combinations of laser sampling with various instrumental analytical methods, primarily inductively coupled plasma mass spectrometry, are discussed. Examples of practical implementation of hybrid methods involving laser sampling as well as corresponding analytical characteristics are presented. The bibliography includes 78 references

  16. Neutron activation analysis of archaeological artifacts using the conventional relative method: a realistic approach for analysis of large samples

    International Nuclear Information System (INIS)

    Bedregal, P.S.; Mendoza, A.; Montoya, E.H.; Cohen, I.M.; Universidad Tecnologica Nacional, Buenos Aires; Oscar Baltuano

    2012-01-01

    A new approach for analysis of entire potsherds of archaeological interest by INAA, using the conventional relative method, is described. The analytical method proposed involves, primarily, the preparation of replicates of the original archaeological pottery with well-known chemical composition (the standard), to be irradiated simultaneously with the original object (the sample) in a well-thermalized external neutron beam of the RP-10 reactor. The basic advantage of this proposal is that it avoids the need for complicated corrections for neutron self-shielding, neutron self-thermalization and gamma-ray attenuation when dealing with large samples. In addition, and in contrast with other methods, the main advantages are the possibility of evaluating the uncertainty of the results and, fundamentally, validating the overall methodology. (author)
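
    The conventional relative method underlying this approach reduces, in its simplest form (neglecting decay, geometry and attenuation corrections, which the replicate scheme is designed to cancel), to comparing the specific count rates of sample and standard irradiated together. A minimal sketch with hypothetical peak areas:

```python
def relative_inaa(c_std, area_sample, m_sample, area_std, m_std):
    """Conventional relative method of INAA, simplified:
        c_x = c_std * (A_x / m_x) / (A_std / m_std)
    where A is the gamma peak area and m the mass; decay and
    geometry corrections are omitted for brevity."""
    return c_std * (area_sample / m_sample) / (area_std / m_std)

# hypothetical peak areas for a replicate potsherd standard and sample
c = relative_inaa(c_std=52.0, area_sample=10400, m_sample=2.0,
                  area_std=9800, m_std=2.0)
print(round(c, 2))   # concentration in the same units as c_std
```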

  17. Performances on Rey Auditory Verbal Learning Test and Rey Complex Figure Test in a healthy, elderly Danish sample--reference data and validity issues

    DEFF Research Database (Denmark)

    Vogel, Asmus; Stokholm, Jette; Jørgensen, Kasper

    2012-01-01

    The RCFT copy score was significantly related to age and the DART score. On RCFT recall a highly significant difference was found between persons who could make a faultless copy and persons with incomplete copy performance. Thus, this study presents separate data for RCFT recall scores according to the subjects' copying performance (in separate tables for age and education groups). For all measures on both RAVLT and RCFT wide distributions of scores were found and the impact of this broad score range on the tests' discriminative validity is discussed. RAVLT performances for elderly were similar to previously published meta-norms, but the included sample of elderly Danes performed better on RCFT (copy and recall) than elderly from the United States.

  18. A combinatory approach for analysis of protein sets in barley sieve-tube samples using EDTA-facilitated exudation and aphid stylectomy.

    Science.gov (United States)

    Gaupels, Frank; Knauer, Torsten; van Bel, Aart J E

    2008-01-01

    This study investigated advantages and drawbacks of two sieve-tube sap sampling methods for comparison of phloem proteins in powdery mildew-infested vs. non-infested Hordeum vulgare plants. In one approach, sieve tube sap was collected by stylectomy. Aphid stylets were cut and immediately covered with silicone oil to prevent any contamination or modification of exudates. In this way, a maximum of 1 μL of pure phloem sap could be obtained per hour. Interestingly, after pathogen infection exudation from microcauterized stylets was reduced to less than 40% of control plants, suggesting that powdery mildew induced sieve tube-occlusion mechanisms. In contrast to the laborious stylectomy, facilitated exudation using EDTA to prevent calcium-mediated callose formation is quick and easy with a large volume yield. After two-dimensional (2D) electrophoresis, a digital overlay of the protein sets extracted from EDTA solutions and stylet exudates showed that some major spots were the same with both sampling techniques. However, EDTA exudates also contained large amounts of contaminative proteins of unknown origin. A combinatory approach may be most favourable for studies in which the protein composition of phloem sap is compared between control and pathogen-infected plants. Facilitated exudation may be applied for subtractive identification of differentially expressed proteins by 2D/mass spectrometry, which requires large amounts of protein. A reference gel loaded with pure phloem sap from stylectomy may be useful for confirmation of phloem origin of candidate spots by digital overlay. The method provides a novel opportunity to study differential expression of phloem proteins in monocotyledonous plant species.

  19. Metrological approach to quantitative analysis of clinical samples by LA-ICP-MS: A critical review of recent studies.

    Science.gov (United States)

    Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta

    2018-05-15

    Analysis of clinical specimens by imaging techniques allows the content and distribution of trace elements on the surface of the examined sample to be determined. In order to obtain reliable results, the developed procedure should rely not only on properly prepared samples and properly performed calibration. It is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology, whose main pillars are the use of validated analytical methods, establishing the traceability of the measurement results and the estimation of the uncertainty. This review paper discusses aspects related to sampling, preparation and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result and the uncertainty of the result. This work promotes the introduction of metrology principles for chemical measurement, with emphasis on LA-ICP-MS, which is a comparative method that requires a rigorous approach to the development of the analytical procedure in order to acquire reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.
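
    The uncertainty-estimation pillar mentioned here is commonly handled with the GUM root-sum-of-squares combination of independent standard uncertainties, expanded by a coverage factor. A minimal sketch with hypothetical relative components:

```python
import math

def combined_uncertainty(components, k=2):
    """Combine independent standard uncertainty components by root
    sum of squares (the GUM approach) and expand with coverage
    factor k (k = 2 for approximately 95% coverage)."""
    u_c = math.sqrt(sum(u ** 2 for u in components))
    return u_c, k * u_c

# hypothetical relative standard uncertainties for an LA-ICP-MS result:
# calibration, repeatability, reference-material value
u_c, U = combined_uncertainty([0.03, 0.02, 0.015])
print(round(u_c, 4), round(U, 4))
```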

  20. Graph sampling

    OpenAIRE

    Zhang, L.-C.; Patone, M.

    2017-01-01

    We synthesise the existing theory of graph sampling. We propose a formal definition of sampling in finite graphs, and provide a classification of potential graph parameters. We develop a general approach of Horvitz–Thompson estimation to T-stage snowball sampling, and present various reformulations of some common network sampling methods in the literature in terms of the outlined graph sampling theory.
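
    The Horvitz–Thompson estimation mentioned above weights each sampled unit by the inverse of its inclusion probability. A minimal sketch for a simple random sample, with hypothetical values:

```python
def horvitz_thompson_total(y, pi):
    """Horvitz-Thompson estimator of a population total: each sampled
    value y_i is weighted by 1/pi_i, its inclusion probability."""
    return sum(yi / p for yi, p in zip(y, pi))

# simple random sample of n = 4 from N = 100: pi_i = n/N for every unit
y = [3.0, 5.0, 2.0, 6.0]
pi = [4 / 100] * 4
print(horvitz_thompson_total(y, pi))
```

    In graph sampling (e.g. T-stage snowball sampling) the same estimator applies, but the inclusion probabilities must account for the multiple paths by which a node can enter the sample.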

  1. A two-level approach to VLBI terrestrial and celestial reference frames using both least-squares adjustment and Kalman filter algorithms

    Science.gov (United States)

    Soja, B.; Krasna, H.; Boehm, J.; Gross, R. S.; Abbondanza, C.; Chin, T. M.; Heflin, M. B.; Parker, J. W.; Wu, X.

    2017-12-01

    The most recent realizations of the ITRS include several innovations, two of which are especially relevant to this study. On the one hand, the IERS ITRS combination center at DGFI-TUM introduced a two-level approach with DTRF2014, consisting of a classical deterministic frame based on normal equations and an optional coordinate time series of non-tidal displacements calculated from geophysical loading models. On the other hand, the JTRF2014 by the combination center at JPL is a time series representation of the ITRF determined by Kalman filtering. Both the JTRF2014 and the second level of the DTRF2014 are thus able to take into account short-term variations in the station coordinates. In this study, based on VLBI data, we combine these two approaches, applying them to the determination of both terrestrial and celestial reference frames. Our product has two levels like DTRF2014, with the second level being a Kalman filter solution like JTRF2014. First, we compute a classical TRF and CRF in a global least-squares adjustment by stacking normal equations from 5446 VLBI sessions between 1979 and 2016 using the Vienna VLBI and Satellite Software VieVS (solution level 1). Next, we obtain coordinate residuals from the global adjustment by applying the level-1 TRF and CRF in the single-session analysis and estimating coordinate offsets. These residuals are fed into a Kalman filter and smoother, taking into account the stochastic properties of the individual stations and radio sources. The resulting coordinate time series (solution level 2) serve as an additional layer representing irregular variations not considered in the first level of our approach. Both levels of our solution are implemented in VieVS in order to test their individual and combined performance regarding the repeatabilities of estimated baseline lengths, EOP, and radio source coordinates.
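
    The level-2 step, feeding session-wise coordinate residuals into a Kalman filter, can be illustrated with a one-dimensional random-walk filter. The noise values and series below are hypothetical stand-ins, not the per-station stochastic models actually estimated in the study:

```python
import numpy as np

def kalman_random_walk(residuals, q=1e-4, r=1e-2):
    """1-D Kalman filter treating a station-coordinate residual
    series as a random walk (process noise variance q) observed
    with measurement noise variance r."""
    x, p = 0.0, 1.0          # initial state and its variance
    out = []
    for z in residuals:
        p += q               # predict: random-walk variance growth
        k = p / (p + r)      # Kalman gain
        x += k * (z - x)     # update with the new residual
        p *= (1 - k)
        out.append(x)
    return np.array(out)

z = np.array([0.01, 0.012, 0.009, 0.011, 0.010])  # residuals in metres
print(kalman_random_walk(z))
```

    A smoother (backward pass) over the same model would yield the time series actually published as the second solution level.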

  2. Pollen reference collection digitization

    NARCIS (Netherlands)

    Ercan, F.E.Z.; Donders, T.H.; Bijl, P.K.; Wagner, F.

    2016-01-01

    The extensive Utrecht University pollen reference collection holds thousands of pollen samples of many species and genera from all over the world and has been a basis for the widely-used North West European Pollen Flora. These samples are fixed on glass slides for microscopy use, but the aging

  3. Towards a system level understanding of non-model organisms sampled from the environment: a network biology approach.

    Directory of Open Access Journals (Sweden)

    Tim D Williams

    2011-08-01

    Full Text Available The acquisition and analysis of datasets including multi-level omics and physiology from non-model species, sampled from field populations, is a formidable challenge, which so far has prevented the application of systems biology approaches. If successful, these could contribute enormously to improving our understanding of how populations of living organisms adapt to environmental stressors relating to, for example, pollution and climate. Here we describe the first application of a network inference approach integrating transcriptional, metabolic and phenotypic information representative of wild populations of the European flounder fish, sampled at seven estuarine locations in northern Europe with different degrees and profiles of chemical contaminants. We identified network modules, whose activity was predictive of environmental exposure and represented a link between molecular and morphometric indices. These sub-networks represented both known and candidate novel adverse outcome pathways representative of several aspects of human liver pathophysiology such as liver hyperplasia, fibrosis, and hepatocellular carcinoma. At the molecular level these pathways were linked to TNF alpha, TGF beta, PDGF, AGT and VEGF signalling. More generally, this pioneering study has important implications as it can be applied to model molecular mechanisms of compensatory adaptation to a wide range of scenarios in wild populations.

  4. Polypyrrole solid phase microextraction: A new approach to rapid sample preparation for the monitoring of antibiotic drugs

    International Nuclear Information System (INIS)

    Szultka, Malgorzata; Kegler, Ricarda; Fuchs, Patricia; Olszowy, Pawel; Miekisch, Wolfram; Schubert, Jochen K.; Buszewski, Boguslaw; Mundkowski, Ralf G.

    2010-01-01

    Simple or even rapid bioanalytical methods are rare, since they generally involve complicated, time-consuming sample preparation from the biological matrices like LLE or SPE. SPME provides a promising approach to overcome these limitations. The full potential of this innovative technique for medical diagnostics, pharmacotherapy or biochemistry has not been tapped yet. In-house manufactured SPME probes with polypyrrole (PPy) coating were evaluated using three antibiotics of high clinical relevance - linezolid, daptomycin, and moxifloxacin - from PBS, plasma, and whole blood. The PPy coating was characterised by scanning electron microscopy. Influences of pH, inorganic salt, and blood anticoagulants were studied for optimum performance. Extraction yields were determined from stagnant media as well as re-circulating human blood using the heart-and-lung machine model system. The PPy-SPME fibres showed high extraction yields, particularly regarding linezolid. The reproducibility of the method was optimised to achieve RSDs of 9% or 17% and 7% for SPME from stagnant or re-circulating blood using fresh and re-used fibres, respectively. The PPy-SPME approach was demonstrated to meet the requirements of therapeutic monitoring of the drugs tested, even from re-circulating blood at physiological flow rates. SPME represents a rapid and simple dual-step procedure with the potential to significantly reduce the effort and expenditure of complicated sample preparations in biomedical analysis.

  5. Polypyrrole solid phase microextraction: A new approach to rapid sample preparation for the monitoring of antibiotic drugs

    Energy Technology Data Exchange (ETDEWEB)

    Szultka, Malgorzata [Department of Environmental Chemistry and Bioanalytics, Faculty of Chemistry, Nicolaus Copernicus University, Gagarin 7, 87-100 Torun (Poland); Kegler, Ricarda [Institute of Clinical Pharmacology, University of Rostock, Schillingallee 70, D-18057 Rostock (Germany); Fuchs, Patricia [Department of Anaesthesia and Intensive Care, University of Rostock, Schillingallee 35, D-18057 Rostock (Germany); Olszowy, Pawel [Department of Environmental Chemistry and Bioanalytics, Faculty of Chemistry, Nicolaus Copernicus University, Gagarin 7, 87-100 Torun (Poland); Miekisch, Wolfram; Schubert, Jochen K. [Department of Anaesthesia and Intensive Care, University of Rostock, Schillingallee 35, D-18057 Rostock (Germany); Buszewski, Boguslaw [Department of Environmental Chemistry and Bioanalytics, Faculty of Chemistry, Nicolaus Copernicus University, Gagarin 7, 87-100 Torun (Poland); Mundkowski, Ralf G., E-mail: ralf.mundkowski@med.uni-rostock.de [Institute of Clinical Pharmacology, University of Rostock, Schillingallee 70, D-18057 Rostock (Germany)

    2010-05-14

    Simple or even rapid bioanalytical methods are rare, since they generally involve complicated, time-consuming sample preparation from the biological matrices like LLE or SPE. SPME provides a promising approach to overcome these limitations. The full potential of this innovative technique for medical diagnostics, pharmacotherapy or biochemistry has not been tapped yet. In-house manufactured SPME probes with polypyrrole (PPy) coating were evaluated using three antibiotics of high clinical relevance - linezolid, daptomycin, and moxifloxacin - from PBS, plasma, and whole blood. The PPy coating was characterised by scanning electron microscopy. Influences of pH, inorganic salt, and blood anticoagulants were studied for optimum performance. Extraction yields were determined from stagnant media as well as re-circulating human blood using the heart-and-lung machine model system. The PPy-SPME fibres showed high extraction yields, particularly regarding linezolid. The reproducibility of the method was optimised to achieve RSDs of 9% or 17% and 7% for SPME from stagnant or re-circulating blood using fresh and re-used fibres, respectively. The PPy-SPME approach was demonstrated to meet the requirements of therapeutic monitoring of the drugs tested, even from re-circulating blood at physiological flow rates. SPME represents a rapid and simple dual-step procedure with the potential to significantly reduce the effort and expenditure of complicated sample preparations in biomedical analysis.

  6. Towards a system level understanding of non-model organisms sampled from the environment: a network biology approach.

    Science.gov (United States)

    Williams, Tim D; Turan, Nil; Diab, Amer M; Wu, Huifeng; Mackenzie, Carolynn; Bartie, Katie L; Hrydziuszko, Olga; Lyons, Brett P; Stentiford, Grant D; Herbert, John M; Abraham, Joseph K; Katsiadaki, Ioanna; Leaver, Michael J; Taggart, John B; George, Stephen G; Viant, Mark R; Chipman, Kevin J; Falciani, Francesco

    2011-08-01

    The acquisition and analysis of datasets including multi-level omics and physiology from non-model species, sampled from field populations, is a formidable challenge, which so far has prevented the application of systems biology approaches. If successful, these could contribute enormously to improving our understanding of how populations of living organisms adapt to environmental stressors relating to, for example, pollution and climate. Here we describe the first application of a network inference approach integrating transcriptional, metabolic and phenotypic information representative of wild populations of the European flounder fish, sampled at seven estuarine locations in northern Europe with different degrees and profiles of chemical contaminants. We identified network modules, whose activity was predictive of environmental exposure and represented a link between molecular and morphometric indices. These sub-networks represented both known and candidate novel adverse outcome pathways representative of several aspects of human liver pathophysiology such as liver hyperplasia, fibrosis, and hepatocellular carcinoma. At the molecular level these pathways were linked to TNF alpha, TGF beta, PDGF, AGT and VEGF signalling. More generally, this pioneering study has important implications as it can be applied to model molecular mechanisms of compensatory adaptation to a wide range of scenarios in wild populations.

  7. Prognostic evaluation of DNA index in HIV-HPV co-infected women cervical samples attending in reference centers for HIV-AIDS in Recife.

    Directory of Open Access Journals (Sweden)

    Albert Eduardo Silva Martins

    Full Text Available INTRODUCTION: Persistence of cervical infection caused by human papillomavirus (HPV) types with high oncogenic risk may lead to cervical intraepithelial neoplasia (CIN). The aim of the present study was to evaluate whether, in HIV-positive women, the presence of aneuploidy in cervical cell samples is associated with the presence and evolution of CIN. METHODS: The present study had two stages. In the first stage, comprising a cross-sectional study, the association between the presence of aneuploidy seen via flow cytometry and sociodemographic characteristics, habits and characteristics relating to HPV and HIV infection was analyzed. In the second stage, comprising a cohort study, it was investigated whether aneuploidy was predictive of CIN evolution. RESULTS: No association was observed between the presence of aneuploidy and HPV infection, or between its presence and alterations seen in oncotic cytological analysis. On the other hand, aneuploidy was associated with the presence of CIN (p = 0.030) in histological analysis and with nonuse of antiretroviral therapy (p = 0.001). Most of the HIV-positive women (234/272) presented normal CD4+ T lymphocyte counts (greater than 350 cells/mm³) and showed a greater aneuploidy regression rate (77.5%) than progression rate (23.9%) over a follow-up of up to two years. CONCLUSION: Although there was an association between the presence of cervical tissue lesions and the DNA index, the latter was not predictive of progression of the cervical lesion. This suggests that progression of the cervical lesion to cancer in HIV-positive women may also be changed through improvement of the immunological state enabled by using antiretroviral therapy.

  8. Population Pharmacokinetics of Gemcitabine and dFdU in Pancreatic Cancer Patients Using an Optimal Design, Sparse Sampling Approach.

    Science.gov (United States)

    Serdjebi, Cindy; Gattacceca, Florence; Seitz, Jean-François; Fein, Francine; Gagnière, Johan; François, Eric; Abakar-Mahamat, Abakar; Deplanque, Gael; Rachid, Madani; Lacarelle, Bruno; Ciccolini, Joseph; Dahan, Laetitia

    2017-06-01

    Gemcitabine remains a pillar in pancreatic cancer treatment. However, toxicities are frequently observed. Dose adjustment based on therapeutic drug monitoring might help decrease the occurrence of toxicities. In this context, this work aims at describing the pharmacokinetics (PK) of gemcitabine and its metabolite dFdU in pancreatic cancer patients and at identifying the main sources of their PK variability using a population PK approach, despite a sparsely sampled population and heterogeneous administration and sampling protocols. Data from 38 patients were included in the analysis. The 3 optimal sampling times were determined using KineticPro, and the population PK analysis was performed in Monolix. Available patient characteristics, including cytidine deaminase (CDA) status, were tested as covariates. The correlation between PK parameters and the occurrence of severe hematological toxicities was also investigated. A two-compartment model best fitted the gemcitabine and dFdU PK data (volume of distribution and clearance for gemcitabine: V1 = 45 L and CL1 = 4.03 L/min; for dFdU: V2 = 36 L and CL2 = 0.226 L/min). Renal function was found to influence gemcitabine clearance, and body surface area to impact the volume of distribution of dFdU. However, neither CDA status nor the occurrence of toxicities was correlated with PK parameters. Despite sparse sampling and heterogeneous administration and sampling protocols, population and individual PK parameters of gemcitabine and dFdU were successfully estimated using the Monolix population PK software. The estimated parameters were consistent with previously published results. Surprisingly, CDA activity did not influence gemcitabine PK, which was explained by the absence of CDA-deficient patients enrolled in the study. This work suggests that even sparse data are valuable for estimating population and individual PK parameters in patients, which can then be used to individualize the dose for an optimized benefit-to-risk ratio.
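    The reported typical values allow a rough simulation of the parent-metabolite kinetics. The sketch below assumes, for illustration only, an IV bolus dose and complete conversion of eliminated gemcitabine to dFdU; neither assumption comes from the abstract, and the 1000 mg dose is hypothetical.

    ```python
    import numpy as np

    # Typical values reported in the abstract
    V1, CL1 = 45.0, 4.03     # gemcitabine: volume (L), clearance (L/min)
    V2, CL2 = 36.0, 0.226    # dFdU:        volume (L), clearance (L/min)
    k10, k20 = CL1 / V1, CL2 / V2   # first-order elimination rate constants (1/min)

    # Simple Euler integration of a parent -> metabolite chain:
    #   dA1/dt = -k10*A1 ;  dA2/dt = k10*A1 - k20*A2
    dt, t_end = 0.01, 120.0          # minutes
    A1, A2 = 1000.0, 0.0             # mg; hypothetical bolus into compartment 1
    conc1, conc2 = [], []
    for _ in range(int(t_end / dt)):
        dA1 = -k10 * A1
        dA2 = k10 * A1 - k20 * A2
        A1 += dA1 * dt
        A2 += dA2 * dt
        conc1.append(A1 / V1)        # gemcitabine concentration (mg/L)
        conc2.append(A2 / V2)        # dFdU concentration (mg/L)
    ```

    With these constants the parent drug decays quickly (half-life of a few minutes) while the metabolite rises to a peak around half an hour and declines much more slowly, consistent with the far smaller dFdU clearance.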

  9. Micelle assisted thin-film solid phase microextraction: a new approach for determination of quaternary ammonium compounds in environmental samples.

    Science.gov (United States)

    Boyacı, Ezel; Pawliszyn, Janusz

    2014-09-16

    Determination of quaternary ammonium compounds (QACs) is often considered a challenging undertaking owing to secondary interactions of the analytes' permanently charged quaternary ammonium head or hydrophobic tail with the labware used. Here, for the first time, micelle assisted thin-film solid phase microextraction (TF-SPME) using the zwitterionic detergent 3-[(3-cholamidopropyl)dimethylammonio]-1-propanesulfonate (CHAPS) as a matrix modifier is introduced as a novel approach for in-laboratory sample preparation of these challenging compounds. The proposed micelle assisted TF-SPME method offers suppression/enhancement-free electrospray ionization of analytes in mass spectrometric detection, minimal interaction of the micelles with the TF-SPME coating and chromatographic stationary phase, and analysis free of secondary interactions. Moreover, the matrix modifier was found to have multiple functions: below the critical micelle concentration (CMC), it primarily acts as a surface deactivator; above its CMC, it acts as a stabilizer for QACs. Additionally, shorter equilibrium extraction times in the presence of the modifier demonstrated that micelles also assist in the transfer of analytes from the bulk of the sample to the surface of the coating. The developed micelle assisted TF-SPME protocol using the 96-blade system requires only 30 min of extraction and 15 min of desorption. Together with a conditioning step (15 min), the entire method takes 60 min; given the advantage of the 96-blade system, if all the blades in the brush are used, the sample preparation time per sample is 0.63 min. Moreover, the recoveries for all analytes with the developed method were found to range within 80.2-97.3%; as such, this method can be considered an open-bed solid phase extraction. The proposed method was successfully validated using real samples.

  10. An efficient sampling approach for variance-based sensitivity analysis based on the law of total variance in the successive intervals without overlapping

    Science.gov (United States)

    Yun, Wanying; Lu, Zhenzhou; Jiang, Xian

    2018-06-01

    To efficiently execute variance-based global sensitivity analysis, the law of total variance in successive non-overlapping intervals is first proved, and an efficient space-partition sampling-based approach is then proposed on this basis. By partitioning the sample points of the output into different subsets according to the different inputs, the proposed approach can efficiently evaluate all the main effects concurrently from a single group of sample points. In addition, there is no need to optimize the partition scheme in the proposed approach. The maximum length of the subintervals decreases as the number of sample points of the model input variables increases, which ensures that the convergence condition of the space-partition approach is well satisfied. Furthermore, a new interpretation of the partitioning idea is given from the perspective of the variance ratio function. Finally, three test examples and one engineering application are employed to demonstrate the accuracy, efficiency and robustness of the proposed approach.
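    The idea of estimating all main effects by partitioning output samples over successive, non-overlapping intervals of each input can be sketched as follows. This is a generic binned estimator of first-order Sobol indices built on the law of total variance, not the authors' exact algorithm, and the two-input test model is made up for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def first_order_indices(X, y, n_bins=20):
        """Estimate first-order Sobol indices by sorting the output samples
        along each input, splitting them into successive non-overlapping
        bins, and applying the law of total variance:
        S_i ~ Var_bins(E[Y | X_i in bin]) / Var(Y)."""
        n, d = X.shape
        var_y, mean_y = y.var(), y.mean()
        S = np.empty(d)
        for i in range(d):
            order = np.argsort(X[:, i])              # sort samples along input i
            bins = np.array_split(y[order], n_bins)  # successive, non-overlapping intervals
            means = np.array([b.mean() for b in bins])
            weights = np.array([len(b) for b in bins]) / n
            S[i] = np.sum(weights * (means - mean_y) ** 2) / var_y
        return S

    # Toy model: the output depends strongly on X1 and weakly on X2,
    # so S1 should be close to 1 and S2 close to 0.
    X = rng.uniform(size=(20000, 2))
    y = X[:, 0] + 0.1 * X[:, 1]
    S = first_order_indices(X, y)
    ```

    Note that one set of model evaluations serves all inputs at once: only the sort order changes per input, which is what makes the partition-based estimator cheap compared with classical pick-and-freeze designs.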

  11. A systematic approach towards the objective evaluation of low-contrast performance in MDCT: Combination of a full-reference image fidelity metric and a software phantom

    International Nuclear Information System (INIS)

    Falck, Christian von; Rodt, Thomas; Waldeck, Stephan; Hartung, Dagmar; Meyer, Bernhard; Wacker, Frank; Shin, Hoen-oh

    2012-01-01

    Objectives: To assess the feasibility of an objective approach for the evaluation of low-contrast detectability in multidetector computed tomography (MDCT) by combining a virtual phantom containing simulated lesions with an image quality metric. Materials and methods: A low-contrast phantom containing hypodense spherical lesions (−20 HU) was scanned on a 64-slice MDCT scanner at 4 different dose levels (25, 50, 100, 200 mAs). In addition, virtual round hypodense low-contrast lesions (20 HU object contrast) based on real CT data were inserted into the lesion-free section of the datasets. The sliding-thin-slab algorithm was applied to the image data with an increasing slice thickness from 1 to 15 slices. For each dataset containing simulated lesions, a lesion-free counterpart was reconstructed and post-processed in the same manner. The low-contrast performance of all datasets containing virtual lesions was determined using a full-reference image quality metric (modified multiscale structural similarity index, MS-SSIM*). The results were validated against a reader study of the real lesions. Results: For all dose levels and lesion sizes, there was no statistically significant difference between the low-contrast performance as determined by the image quality metric and that determined by the reader study (p < 0.05). The intraclass correlation coefficient was 0.72, 0.82, 0.90 and 0.84 for lesion diameters of 4 mm, 5 mm, 8 mm and 10 mm, respectively. The use of the sliding-thin-slab algorithm improved lesion detectability by a factor ranging from 1.15 to 2.69 when compared with the original axial slice (0.625 mm). Conclusion: The combination of a virtual phantom and a full-reference image quality metric enables a systematic, automated and objective evaluation of low-contrast detectability in MDCT datasets and correlates well with the judgment of human readers.
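    As an illustration of the full-reference metric family used here, a minimal single-scale, global SSIM can be computed as below. The study's modified multiscale variant (MS-SSIM*) adds local sliding windows and multiple image scales, which this sketch deliberately omits; the test images are synthetic.

    ```python
    import numpy as np

    def ssim_global(x, y, data_range=255.0):
        """Single-scale, whole-image SSIM between a reference image x and a
        distorted image y, using the standard stabilizing constants
        c1 = (0.01*L)^2 and c2 = (0.03*L)^2 for dynamic range L."""
        c1 = (0.01 * data_range) ** 2
        c2 = (0.03 * data_range) ** 2
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()
        cov = ((x - mx) * (y - my)).mean()
        return ((2 * mx * my + c1) * (2 * cov + c2)) / \
               ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

    rng = np.random.default_rng(0)
    ref = rng.uniform(0, 255, size=(64, 64))                       # "lesion-free" reference
    noisy = np.clip(ref + rng.normal(0, 10, size=ref.shape), 0, 255)  # distorted counterpart
    ```

    An identical image pair scores 1.0 and any distortion pulls the score below 1, which is the property the study exploits: the drop in (MS-)SSIM between the lesion-containing dataset and its lesion-free counterpart serves as the objective detectability score.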

  12. Genotyping-by-sequencing for Populus population genomics: an assessment of genome sampling patterns and filtering approaches.

    Directory of Open Access Journals (Sweden)

    Martin P Schilling

    Full Text Available Continuing advances in nucleotide sequencing technology are inspiring a suite of genomic approaches in studies of natural populations. Researchers are faced with data management and analytical scales that are increasing by orders of magnitude. With such dramatic advances comes a need to understand biases and error rates, which can be propagated and magnified in large-scale data acquisition and processing. Here we assess genomic sampling biases and the effects of various population-level data filtering strategies in a genotyping-by-sequencing (GBS) protocol. We focus on data from two species of Populus, because this genus has a relatively small genome and is emerging as a target for population genomic studies. We estimate the proportions and patterns of genomic sampling by examining the Populus trichocarpa genome (Nisqually-1), and demonstrate a pronounced bias towards coding regions when using the methylation-sensitive ApeKI restriction enzyme in this species. Using population-level data from a closely related species (P. tremuloides), we also investigate various approaches for filtering GBS data to retain high-depth, informative SNPs that can be used for population genetic analyses. We find that a data filter including the designation of ambiguous alleles resulted in metrics of population structure and Hardy-Weinberg equilibrium that were most consistent with previous studies of the same populations based on other genetic markers. Analyses of the filtered data (27,910 SNPs) also resulted in patterns of heterozygosity and population structure similar to a previous study using microsatellites. Our application demonstrates that technically and analytically simple approaches can readily be developed for population genomics of natural populations.
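    A depth-and-call-rate filter of the kind evaluated for GBS data can be sketched as follows. The thresholds (8 reads, 80% call rate) and the binary keep/drop rule are illustrative assumptions, not the specific filters compared in the study.

    ```python
    import numpy as np

    def filter_snps(depth, min_depth=8, min_call_rate=0.8):
        """Keep SNPs (rows) whose per-sample read depth supports confident
        genotype calls: a sample is 'called' at a SNP if its depth is at
        least min_depth, and a SNP is retained when the fraction of called
        samples reaches min_call_rate. Returns a boolean keep-mask."""
        called = depth >= min_depth             # (snps, samples) boolean matrix
        call_rate = called.mean(axis=1)         # fraction of callable samples per SNP
        return call_rate >= min_call_rate

    # Toy depth matrix: SNP 0 is well covered in all 5 samples,
    # SNP 1 is covered in only 1 of 5 and should be dropped.
    keep = filter_snps(np.array([[10, 12, 9, 11, 10],
                                 [10,  0, 0,  0,  0]]))
    ```

    Real GBS pipelines layer further criteria on top of this (minor-allele frequency, excess heterozygosity, handling of ambiguous alleles), but the depth/call-rate mask is the usual first pass.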

  13. Use of Complementary Approaches to Imaging Biomolecules and Endogenous and Exogenous Trace Elements and Nanoparticles in Biological Samples

    Science.gov (United States)

    Brown, Koshonna Dinettia

    X-ray Fluorescence Microscopy (XFM) is a useful technique for the study of biological samples. XFM was used to map and quantify endogenous biological elements as well as exogenous materials in biological samples, such as the distribution of titanium dioxide (TiO2) nanoparticles. TiO2 nanoparticles are produced for many different purposes, including the development of therapeutic and diagnostic particles for cancer detection and treatment, drug delivery, and induction of DNA breaks. Delivery of such nanoparticles can be targeted to specific cells and subcellular structures. In this work, we develop two novel approaches to stain TiO2 nanoparticles for optical microscopy and to confirm the staining by XFM. The first approach utilizes fluorescent biotin and fluorescent streptavidin to label the nanoparticles before and after cellular uptake; the second approach is based on the copper-catalyzed azide-alkyne cycloaddition, the so-called CLICK chemistry, for labeling of azide-conjugated TiO2 nanoparticles with "clickable" dyes such as alkyne Alexa Fluor dyes with a high fluorescence yield. To confirm that the optical fluorescence signals of nanoparticles stained in situ match the distribution of the Ti element, we used high-resolution synchrotron XFM with the Bionanoprobe instrument at the Advanced Photon Source at Argonne National Laboratory. Titanium-specific X-ray fluorescence showed excellent overlap with the location of Alexa Fluor optical fluorescence detected by confocal microscopy. In this work, XFM was also used to investigate native elemental differences between two different types of head and neck cancer, one associated with human papilloma virus infection, the other virus-free. Future work may see a cross between these themes, for example, exploration of TiO2 nanoparticles as an anticancer treatment for these two different types of head and neck cancer.

  14. An Integrated Approach Using Chaotic Map & Sample Value Difference Method for Electrocardiogram Steganography and OFDM Based Secured Patient Information Transmission.

    Science.gov (United States)

    Pandey, Anukul; Saini, Barjinder Singh; Singh, Butta; Sood, Neetu

    2017-10-18

    This paper presents a patient's confidential data hiding scheme in electrocardiogram (ECG) signals and its subsequent wireless transmission. The patient's confidential data is embedded in the ECG (called the stego-ECG) using a chaotic map and the sample value difference approach. The sample value difference approach effectively hides the patient's confidential data in ECG sample pairs at predefined locations. The chaotic map generates these predefined locations through the use of selective control parameters. Subsequently, the wireless transmission of the stego-ECG is analyzed using an Orthogonal Frequency Division Multiplexing (OFDM) system in a Rayleigh fading scenario for telemedicine applications. Evaluation of the proposed method on all 48 records of the MIT-BIH arrhythmia ECG database demonstrates that the embedding does not alter the diagnostic features of the cover ECG. The imperceptibility of the secret data in the stego-ECG is evident through statistical and clinical performance measures. The statistical measures comprise the Percentage Root-mean-square Difference (PRD), Peak Signal-to-Noise Ratio (PSNR), and Kullback-Leibler Divergence (KL-Div), while the clinical metrics include wavelet Energy Based Diagnostic Distortion (WEDD) and Wavelet-based Weighted PRD (WWPRD). Various channel signal-to-noise ratio scenarios were simulated for the wireless communication of the stego-ECG in the OFDM system. Over all 48 records of the MIT-BIH arrhythmia database, the proposed method resulted, on average, in PRD = 0.26, PSNR = 55.49, KL-Div = 3.34 × 10⁻⁶, WEDD = 0.02, and WWPRD = 0.10 with a secret data size of 21 Kb. Further, a comparative analysis of the proposed method and recent existing works was also performed. The results clearly demonstrated the superiority of the proposed method.
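    The two headline statistical measures are straightforward to compute from the cover and stego signals. A minimal sketch, using a synthetic sine wave in place of a real MIT-BIH record and a small additive perturbation standing in for the embedding distortion, is:

    ```python
    import numpy as np

    def prd(cover, stego):
        """Percentage Root-mean-square Difference between cover and stego
        signals: 100 * sqrt( sum((x-y)^2) / sum(x^2) )."""
        return 100.0 * np.sqrt(np.sum((cover - stego) ** 2) / np.sum(cover ** 2))

    def psnr(cover, stego):
        """Peak Signal-to-Noise Ratio in dB, taking the cover's peak
        absolute amplitude as the reference peak."""
        mse = np.mean((cover - stego) ** 2)
        return 10.0 * np.log10(np.max(np.abs(cover)) ** 2 / mse)

    # Toy ECG-like cover and a slightly perturbed "stego" version
    t = np.linspace(0, 1, 1000)
    cover = np.sin(2 * np.pi * 5 * t)
    stego = cover + 1e-3 * np.random.default_rng(1).standard_normal(t.size)
    ```

    Low PRD and high PSNR together indicate that the embedding distortion is small relative to the signal, which is the sense in which the reported averages (PRD = 0.26, PSNR = 55.49 dB) support imperceptibility.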

  15. Reference distributions for complement proteins C3 and C4: a practical, simple and clinically relevant approach in a large cohort.

    Science.gov (United States)

    Ritchie, Robert F; Palomaki, Glenn E; Neveux, Louis M; Navolotskaia, Olga; Ledue, Thomas B; Craig, Wendy Y

    2004-01-01

    The two serum proteins of the complement cascade in the highest concentrations, C3 and C4, respond to various conditions in much the same manner as do other positive acute-phase proteins. A major difference is that they are relatively sluggish in response to cytokine drive, requiring several days rather than hours to be detectably elevated by serial measurements. As with other acute-phase proteins, there are many processes that up- or down-regulate synthesis, including infection or inflammation, hepatic failure, and immune-complex formation. Clinicians may find it difficult to distinguish among these processes, because they often occur simultaneously. The situation is further complicated by genetic polymorphism, with rare instances of markedly reduced synthesis and circulating levels, and consequent vulnerability to infection. C3 and C4 are measured for clinical purposes to help define certain rheumatic and immunologically mediated renal diseases. Interpreting the measured blood levels of these two components requires one to consider the intensity of the inflammatory drive, the timing of the suspected clinical process, the production of complement-consuming immune complexes, and the possible existence of benign circumstances. In this fifth article in a series, reference ranges for serum levels of two complement proteins (C3 and C4) are examined. The study is based on a cohort of over 55,000 Caucasian individuals from northern New England, who were tested in our laboratory in 1994-1999. Measurements were standardized against certified reference material (CRM) 470/reference preparation for proteins in human serum (RPPHS), and analyzed using a previously described statistical approach. Individuals with unequivocal laboratory evidence of inflammation (C-reactive protein of 10 mg/L or higher) were excluded. 
Our results show that the levels of C3 and C4 change little during life and between the sexes, except that they increase slightly and then fall after age 20 in males.
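The nonparametric core of such a reference-range study can be sketched as taking the central 95% of values after excluding individuals with laboratory evidence of inflammation (CRP of 10 mg/L or higher). The data below are simulated for illustration only; the published study used a larger cohort and a more elaborate statistical approach.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical paired measurements for a cohort: C3 in g/L, CRP in mg/L.
c3 = rng.normal(1.2, 0.2, size=5000)
crp = rng.gamma(2.0, 2.0, size=5000)

# Exclude inflamed individuals, then take the central 95% of what remains
# as the reference interval (2.5th to 97.5th percentile).
eligible = c3[crp < 10.0]
low, high = np.percentile(eligible, [2.5, 97.5])
```

The exclusion step matters because C3 and C4 are positive acute-phase proteins: leaving inflamed individuals in the cohort would drag the upper reference limit upward.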

  16. Development of a surface plasmon resonance biosensing approach for the rapid detection of porcine circovirus type2 in sample solutions.

    Directory of Open Access Journals (Sweden)

    Jiandong Hu

    Full Text Available A sensitive and label-free analytical approach for the detection of porcine circovirus type 2 (PCV2), rather than the PCV2 antibody, in serum samples was systematically investigated in this research, based on surface plasmon resonance (SPR) with the establishment of a special molecular recognition membrane. The experimental device for constructing the biosensing analyzer is composed of an integrated biosensor, a home-made microfluidic module, and an electrical control circuit incorporating a photoelectric converter. To detect PCV2 using the surface plasmon resonance immunoassay, mercaptopropionic acid was first bound to the Au film through the strong S-Au covalent bonds formed between its thiol group and the Au film. PCV2 antibodies were then bonded to the mercaptopropionic acid by covalent -CO-NH- amide bonds. To evaluate the performance of this approach, known PCV2 Cap protein concentrations of 10 µg/mL, 7.5 µg/mL, 5 µg/mL, 2.5 µg/mL, 1 µg/mL, and 0.5 µg/mL were prepared by successive dilution with PBS, and the delta response units (ΔRUs) were measured individually. Using the data collected from the linear CCD array, the ΔRUs gave a linear response over the wide range of known standard PCV2 Cap protein concentrations, with an R-squared value of 0.99625. The theoretical limit of detection was calculated to be 0.04 µg/mL for the surface plasmon resonance biosensing approach. Correspondingly, recovery rates ranging from 81.0% to 89.3% were obtained. In contrast to PCV2 detection kits, this surface plasmon resonance biosensing system was validated for linearity, precision and recovery, which demonstrated that the surface plasmon resonance immunoassay is reliable and robust. It was concluded that the detection method, which is associated with biomembrane properties, is expected to contribute much to determine the PCV2
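    The calibration arithmetic behind such a biosensor follows a standard pattern: fit ΔRU against concentration by least squares, then derive a theoretical detection limit from the residual scatter. One common convention is LOD = 3.3·σ(residuals)/slope; the abstract does not state which convention the authors used. Only the six concentration levels below come from the abstract; the ΔRU responses are hypothetical.

    ```python
    import numpy as np

    # Concentration levels from the abstract (µg/mL); ΔRU values invented
    # purely to illustrate the calibration arithmetic.
    conc = np.array([0.5, 1.0, 2.5, 5.0, 7.5, 10.0])
    d_ru = np.array([55., 102., 260., 498., 760., 1005.])

    # Ordinary least-squares calibration line: ΔRU = slope*conc + intercept
    slope, intercept = np.polyfit(conc, d_ru, 1)
    pred = slope * conc + intercept

    # Residual standard deviation (n-2 degrees of freedom for a 2-parameter fit)
    resid_sd = np.sqrt(np.sum((d_ru - pred) ** 2) / (len(conc) - 2))

    # ICH-style theoretical limit of detection and goodness of fit
    lod = 3.3 * resid_sd / slope
    r_squared = 1 - np.sum((d_ru - pred) ** 2) / np.sum((d_ru - d_ru.mean()) ** 2)
    ```

    A steep, tight calibration line (high slope, small residual scatter) is what drives the LOD down, which is why the reported R-squared of 0.99625 and an LOD of 0.04 µg/mL are mutually consistent claims.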

  17. imFASP: An integrated approach combining in-situ filter-aided sample pretreatment with microwave-assisted protein digestion for fast and efficient proteome sample preparation.

    Science.gov (United States)

    Zhao, Qun; Fang, Fei; Wu, Ci; Wu, Qi; Liang, Yu; Liang, Zhen; Zhang, Lihua; Zhang, Yukui

    2016-03-17

    An integrated sample preparation method, termed "imFASP", which combines in-situ filter-aided sample pretreatment and microwave-assisted trypsin digestion, was developed for the preparation of microgram and even nanogram amounts of complex protein samples with high efficiency in 1 h. In the imFASP method, proteins dissolved in 8 M urea were loaded onto a filter device with a molecular weight cut-off (MWCO) of 10 kDa, followed by in-situ protein preconcentration, denaturation, reduction, alkylation, and microwave-assisted tryptic digestion. Compared with the traditional in-solution sample preparation method, the imFASP method generated more protein and peptide identifications (IDs) from the preparation of 45 μg of Escherichia coli protein sample due to its higher efficiency, and the sample preparation throughput was improved by a factor of 14 (1 h vs. 15 h). More importantly, when the starting amount of E. coli cell lysate decreased to the nanogram level (50-500 ng), the numbers of proteins and peptides identified by the imFASP method improved by at least 30% and 44%, respectively, compared with the traditional in-solution preparation method, suggesting dramatically higher peptide recovery of the imFASP method for trace amounts of complex proteome samples. These results demonstrate that the imFASP method developed here has high potential for the efficient, high-throughput preparation of trace amounts of complex proteome samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Reference values and evaluation of the results of intercomparisons

    International Nuclear Information System (INIS)

    Aigner, H.; Deron, S.; Kuhn, E.

    1981-01-01

    The need for a reference value for the composition of materials distributed in intercomparisons is generally recognized. A single reference laboratory or a group of reference laboratories may be used to establish this reference value. The respective advantages and limitations of the two approaches are discussed. The reference measurements must be evaluated to provide not only the confidence limits of the reference value but also an estimate of the possible heterogeneity of the materials and their samples. The results of the intercomparison measurements should themselves be evaluated to test and discuss the significance of the biases of individual and selected groups of laboratories or techniques. The approach taken by the Analytical Quality Control Services of the International Atomic Energy Agency is illustrated by the SR-1 intercomparison on uranium assay in UO2 powder.

  19. Laser-Assisted Sampling Techniques in Combination with ICP-MS: A Novel Approach for Particle Analysis at the IAEA Environmental Samples Laboratory

    International Nuclear Information System (INIS)

    Dzigal, N.; Chinea-Cano, E.

    2015-01-01

    Researchers have found many applications for lasers. About two decades ago, scientists started using lasers as sample introduction instruments for mass spectrometry measurements. Similarly, lasers have been increasingly in demand as micro-dissection tools in the fields of life sciences, materials science, forensics, etc. This presentation deals with the intersection of these laser-assisted techniques with the field of particle analysis. Historically, nanosecond lasers have been used to ablate material in materials science. Recently, it has been shown that in the analysis of particulate materials, the disadvantages associated with nanosecond lasers, such as overheating and melting of the sample, are suppressed when femtosecond lasers are used. Further, due to the short duration of a single laser shot, fs-LA allows a more controlled ablation, so the sample plasma is more homogeneous and fewer mass-fractionation events are detected. The use of laser micro-dissection devices enables the physical segmentation of micro-sized artefacts, previously performed by a laborious manual procedure. By combining the precision of laser cutting inherent to the LMD technique with a particle identification methodology, one can increase the efficiency of single particle isolation. Further, besides increasing the throughput of analyses, this combination enhances the signal-to-noise ratio by effectively removing matrix particles. Specifically, this contribution describes the use of an Olympus+MMI laser microdissection device to improve the sample preparation of environmental swipe samples, and the installation of an Applied Spectra J200 fs-LA/LIBS (laser ablation/laser-induced breakdown spectroscopy) system as a sample introduction device for a quadrupole mass spectrometer, the iCap Q from Thermo Fisher Scientific, at the IAEA Environmental Samples Laboratory. Preliminary results of the ongoing efforts for the

  20. A One System Integrated Approach to Simulant Selection for Hanford High Level Waste Mixing and Sampling Tests - 13342

    International Nuclear Information System (INIS)

    Thien, Mike G.; Barnes, Steve M.

    2013-01-01

    The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capabilities using simulated Hanford High-Level Waste (HLW) formulations. This represents one of the largest remaining technical issues with the high-level waste treatment mission at Hanford. Previous testing has focused on very specific TOC or WTP test objectives and consequently the simulants were narrowly focused on those test needs. A key attribute in the Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2010-2 is to ensure testing is performed with a simulant that represents the broad spectrum of Hanford waste. The One System Integrated Project Team is a new joint TOC and WTP organization intended to ensure technical integration of specific TOC and WTP systems and testing. A new approach to simulant definition has been mutually developed that will meet both TOC and WTP test objectives for the delivery and receipt of HLW. The process used to identify critical simulant characteristics, incorporate lessons learned from previous testing, and identify specific simulant targets that ensure TOC and WTP testing addresses the broad spectrum of Hanford waste characteristics that are important to mixing, sampling, and transfer performance are described. (authors)

  1. Inferring the demographic history from DNA sequences: An importance sampling approach based on non-homogeneous processes.

    Science.gov (United States)

    Ait Kaci Azzou, S; Larribe, F; Froda, S

    2016-10-01

    In Ait Kaci Azzou et al. (2015) we introduced an Importance Sampling (IS) approach, the skywis plot, for estimating the demographic history of a sample of DNA sequences. More precisely, we proposed a new nonparametric estimate of a population size that changes over time. We showed on simulated data that the skywis plot can work well in typical situations where the effective population size does not undergo very steep changes. In this paper, we introduce an iterative procedure which extends the previous method and gives good estimates under such rapid variations. In the iterative calibrated skywis plot we approximate the effective population size by a piecewise constant function, whose values are re-estimated at each step. These piecewise constant functions are used to generate the waiting times of non-homogeneous Poisson processes related to a coalescent process with mutation under a variable population size model. Moreover, the present IS procedure is based on a modified version of the Stephens and Donnelly (2000) proposal distribution. Finally, we apply the iterative calibrated skywis plot method to a simulated data set from a rapidly expanding exponential model, and we show that the method based on this new IS strategy correctly reconstructs the demographic history. Copyright © 2016. Published by Elsevier Inc.
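    Waiting times of a non-homogeneous Poisson process with a piecewise-constant rate, like those used to drive the coalescent simulations, can be generated by Lewis-Shedler thinning. The two-level rate function below is a toy stand-in, not a rate estimated from sequence data.

    ```python
    import numpy as np

    def nhpp_event_times(rate_fn, t_max, rate_max, rng):
        """Sample event times of a non-homogeneous Poisson process on
        [0, t_max] by Lewis-Shedler thinning: draw candidates from a
        homogeneous process at the dominating rate rate_max, then accept
        each candidate at time t with probability rate_fn(t)/rate_max."""
        times, t = [], 0.0
        while True:
            t += rng.exponential(1.0 / rate_max)       # candidate inter-arrival time
            if t > t_max:
                break
            if rng.uniform() < rate_fn(t) / rate_max:  # thinning step
                times.append(t)
        return np.array(times)

    # Piecewise-constant rate, echoing the piecewise-constant population
    # size approximation: low intensity early, high intensity late.
    def rate(t):
        return 2.0 if t < 5.0 else 8.0

    events = nhpp_event_times(rate, 10.0, 8.0, np.random.default_rng(0))
    ```

    For a strictly piecewise-constant rate one could also sample each constant segment exactly, but thinning has the advantage of working unchanged for any bounded rate function, including the re-estimated step functions produced at each iteration.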

  2. A One System Integrated Approach to Simulant Selection for Hanford High Level Waste Mixing and Sampling Tests - 13342

    Energy Technology Data Exchange (ETDEWEB)

    Thien, Mike G. [Washington River Protection Solutions, LLC, P.O Box 850, Richland WA, 99352 (United States); Barnes, Steve M. [Waste Treatment Plant, 2435 Stevens Center Place, Richland WA 99354 (United States)

    2013-07-01

    The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capabilities using simulated Hanford High-Level Waste (HLW) formulations. This represents one of the largest remaining technical issues with the high-level waste treatment mission at Hanford. Previous testing has focused on very specific TOC or WTP test objectives and consequently the simulants were narrowly focused on those test needs. A key attribute in the Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2010-2 is to ensure testing is performed with a simulant that represents the broad spectrum of Hanford waste. The One System Integrated Project Team is a new joint TOC and WTP organization intended to ensure technical integration of specific TOC and WTP systems and testing. A new approach to simulant definition has been mutually developed that will meet both TOC and WTP test objectives for the delivery and receipt of HLW. The process used to identify critical simulant characteristics, incorporate lessons learned from previous testing, and identify specific simulant targets that ensure TOC and WTP testing addresses the broad spectrum of Hanford waste characteristics that are important to mixing, sampling, and transfer performance are described. (authors)

  3. A One System Integrated Approach to Simulant Selection for Hanford High Level Waste Mixing and Sampling Tests

    International Nuclear Information System (INIS)

    Thien, Mike G.; Barnes, Steve M.

    2013-01-01

    The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capabilities using simulated Hanford High-Level Waste (HLW) formulations. This represents one of the largest remaining technical issues with the high-level waste treatment mission at Hanford. Previous testing has focused on very specific TOC or WTP test objectives and consequently the simulants were narrowly focused on those test needs. A key attribute in the Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2010-2 is to ensure testing is performed with a simulant that represents the broad spectrum of Hanford waste. The One System Integrated Project Team is a new joint TOC and WTP organization intended to ensure technical integration of specific TOC and WTP systems and testing. A new approach to simulant definition has been mutually developed that will meet both TOC and WTP test objectives for the delivery and receipt of HLW. The process used to identify critical simulant characteristics, incorporate lessons learned from previous testing, and identify specific simulant targets that ensure TOC and WTP testing addresses the broad spectrum of Hanford waste characteristics important to mixing, sampling, and transfer performance is described.

  4. Monitoring and identification of spatiotemporal landscape changes in multiple remote sensing images by using a stratified conditional Latin hypercube sampling approach and geostatistical simulation.

    Science.gov (United States)

    Lin, Yu-Pin; Chu, Hone-Jay; Huang, Yu-Long; Tang, Chia-Hsi; Rouhani, Shahrokh

    2011-06-01

    This study develops a stratified conditional Latin hypercube sampling (scLHS) approach for multiple, remotely sensed, normalized difference vegetation index (NDVI) images. The objective is to sample, monitor, and delineate spatiotemporal landscape changes, including spatial heterogeneity and variability, in a given area. The scLHS approach, which is based on the variance quadtree technique (VQT) and the conditional Latin hypercube sampling (cLHS) method, selects samples in order to delineate landscape changes from multiple NDVI images. The images are then mapped for calibration and validation by using sequential Gaussian simulation (SGS) with the scLHS selected samples. Spatial statistical results indicate that in terms of their statistical distribution, spatial distribution, and spatial variation, the statistics and variograms of the scLHS samples resemble those of multiple NDVI images more closely than those of cLHS and VQT samples. Moreover, the accuracy of simulated NDVI images based on SGS with scLHS samples is significantly better than that of simulated NDVI images based on SGS with cLHS or VQT samples. The proposed approach thus efficiently monitors the spatial characteristics of landscape changes, including the statistics, spatial variability, and heterogeneity of NDVI images. In addition, SGS with the scLHS samples effectively reproduces spatial patterns and landscape changes in multiple NDVI images.
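
    The cLHS component of the approach can be illustrated with a toy implementation: choose n rows of a data table so that, for each variable, the selected values spread across that variable's n marginal quantile strata. This is a random-swap sketch of the idea only, not the full Minasny and McBratney annealing algorithm and not the authors' scLHS.

    ```python
    import random

    def clhs(data, n, iters=2000, rng=random):
        """Toy conditional Latin hypercube sampling over `data`, a list of
        equal-length feature rows. Returns n row indices."""
        m, nvar = len(data), len(data[0])
        # Precompute, per variable, the n-1 quantile boundaries of the data.
        bounds = []
        for var in range(nvar):
            col = sorted(row[var] for row in data)
            bounds.append([col[q * m // n] for q in range(1, n)])

        def stratum(value, var):
            return sum(1 for b in bounds[var] if value >= b)  # 0 .. n-1

        def cost(idx):
            total = 0
            for var in range(nvar):
                counts = [0] * n
                for i in idx:
                    counts[stratum(data[i][var], var)] += 1
                total += sum(abs(c - 1) for c in counts)  # ideal: 1 per stratum
            return total

        idx = rng.sample(range(m), n)
        best = cost(idx)
        for _ in range(iters):
            cand = idx[:]
            cand[rng.randrange(n)] = rng.randrange(m)
            if len(set(cand)) < n:        # keep the selection distinct
                continue
            c = cost(cand)
            if c <= best:                 # greedy acceptance of swaps
                idx, best = cand, c
        return sorted(idx)
    ```

    The published method additionally stratifies the candidate pool with the variance quadtree before running the hypercube search.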

  5. Metrology for stable isotope reference materials: 13C/12C and 18O/16O isotope ratio value assignment of pure carbon dioxide gas samples on the Vienna PeeDee Belemnite-CO2 scale using dual-inlet mass spectrometry.

    Science.gov (United States)

    Srivastava, Abneesh; Michael Verkouteren, R

    2018-05-25

    Isotope ratio measurements have been conducted on a series of isotopically distinct pure CO2 gas samples using the technique of dual-inlet isotope ratio mass spectrometry (DI-IRMS). The influence of instrumental parameters and data normalization schemes on the metrological traceability and uncertainty of the sample isotope composition has been characterized. Traceability to the Vienna PeeDee Belemnite (VPDB)-CO2 scale was realized using the pure CO2 isotope reference materials (IRMs) 8562, 8563, and 8564. The uncertainty analyses include contributions associated with the values of the IRMs and the repeatability and reproducibility of our measurements. Our DI-IRMS measurement system is demonstrated to have high long-term stability, approaching a precision of 0.001 parts-per-thousand for the 45/44 and 46/44 ion signal ratios. The single- and two-point normalization biases for the IRMs were found to be within their published standard uncertainty values. The values of 13C/12C and 18O/16O isotope ratios are expressed relative to VPDB-CO2 using the δ13C and δ18O notation, respectively, in parts-per-thousand (‰ or per mil). For the samples, value assignments between (-25 to +2) ‰ and (-33 to -1) ‰, with nominal combined standard uncertainties of (0.05, 0.3) ‰ for δ13C and δ18O, respectively, were obtained. These samples are used as laboratory references to provide anchor points for value assignment of isotope ratios (with VPDB traceability) to pure CO2 samples. Additionally, they serve as potential parent isotopic source material required for the development of gravimetrically based IRMs of CO2 in CO2-free dry air in high-pressure gas cylinder packages at desired abundance levels and isotopic composition values.
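
    The delta notation and two-point normalization mentioned in the abstract can be sketched with generic formulas (not the authors' implementation; the reference pairs below are placeholders): a delta value is the relative deviation of a sample ratio from a reference ratio in per mil, and two-point normalization linearly stretches and shifts measured deltas so that two reference materials land on their accepted scale values.

    ```python
    def delta(r_sample, r_ref):
        """Delta value in per mil: (R_sample / R_reference - 1) * 1000."""
        return (r_sample / r_ref - 1.0) * 1000.0

    def two_point_normalize(measured, ref1, ref2):
        """Map measured delta values onto the reference scale.
        ref1, ref2: (measured_delta, accepted_delta) pairs for two IRMs."""
        (m1, a1), (m2, a2) = ref1, ref2
        slope = (a2 - a1) / (m2 - m1)        # scale stretch
        return [a1 + slope * (d - m1) for d in measured]
    ```

    Single-point normalization is the special case of a pure shift (slope fixed at 1), which is why the abstract compares the biases of the two schemes.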

  6. A semi-empirical approach to analyze the activities of cylindrical radioactive samples using gamma energies from 185 to 1764 keV.

    Science.gov (United States)

    Huy, Ngo Quang; Binh, Do Quang

    2014-12-01

    This work suggests a method for determining the activities of cylindrical radioactive samples. The self-attenuation factor was applied to provide the self-absorption correction of gamma rays in the sample material. The experimental measurement of a (238)U reference sample and calculations using the MCNP5 code allowed the semi-empirical formulae of detection efficiencies to be obtained for gamma energies ranging from 185 to 1764 keV. These formulae were used to determine the activities of the (238)U, (226)Ra, (232)Th, (137)Cs and (40)K nuclides in the IAEA RGU-1, IAEA-434, IAEA RGTh-1, IAEA-152 and IAEA RGK-1 radioactive standards. The coincidence summing corrections for gamma rays in the (238)U and (232)Th series were applied. The activities obtained in this work were in good agreement with the reference values. Copyright © 2014 Elsevier Ltd. All rights reserved.
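
    The general relation behind such activity determinations, net counts divided by efficiency, emission probability, and counting time, with self-absorption and coincidence-summing correction factors applied, can be sketched as follows. This is the generic gamma-spectrometry formula, not the paper's specific semi-empirical fit, which was derived from MCNP5 calculations calibrated against a (238)U reference sample.

    ```python
    def activity_bq(net_counts, live_time_s, efficiency, gamma_yield,
                    self_absorption=1.0, coincidence=1.0):
        """Activity (Bq) inferred from one gamma line.

        net_counts      : background-subtracted counts in the full-energy peak
        live_time_s     : counting live time in seconds
        efficiency      : full-energy peak detection efficiency at this energy
        gamma_yield     : emission probability of the gamma line
        self_absorption : correction factor for attenuation in the sample
        coincidence     : coincidence-summing correction factor
        """
        return net_counts / (efficiency * gamma_yield * live_time_s
                             * self_absorption * coincidence)
    ```

    In the paper, the energy-dependent efficiency (and its self-absorption dependence on the sample matrix) is what the semi-empirical formulae supply.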

  7. Novel approach to systematic random sampling in population surveys: Lessons from the United Arab Emirates National Diabetes Study (UAEDIAB).

    Science.gov (United States)

    Sulaiman, Nabil; Albadawi, Salah; Abusnana, Salah; Fikri, Mahmoud; Madani, Abdulrazzag; Mairghani, Maisoon; Alawadi, Fatheya; Zimmet, Paul; Shaw, Jonathan

    2015-09-01

    The prevalence of diabetes has risen rapidly in the Middle East, particularly in the Gulf Region. However, some prevalence estimates have not fully accounted for large migrant worker populations and have focused on minority indigenous populations. The objectives of the UAE National Diabetes and Lifestyle Study are to: (i) define the prevalence of, and risk factors for, type 2 diabetes mellitus (T2DM); (ii) describe the distribution and determinants of T2DM risk factors; (iii) study health knowledge and attitudes; (iv) identify gene-environment interactions; and (v) develop baseline data for evaluation of future intervention programs. Given the high burden of diabetes in the region and the absence of accurate data on non-UAE nationals in the UAE, a representative sample of the non-UAE nationals was essential. We used an innovative methodology in which non-UAE nationals were sampled when attending the mandatory biannual health check that is required for visa renewal. Such an approach could also be used in other countries in the region. Complete data were available for 2719 eligible non-UAE nationals (25.9% Arabs, 70.7% Asian non-Arabs, 1.1% African non-Arabs, and 2.3% Westerners). Most were men < 65 years of age. The response rate was 68%, and non-response was greater among women than men; 26.9% earned less than UAE Dirham (AED) 24 000 (US$6500), and the most common areas of employment were as managers or professionals, in service and sales, and in unskilled occupations. The largest group (37.4%) had completed high school, and 4.1% had a postgraduate degree. This novel methodology could provide insights for epidemiological studies in the UAE and other Gulf States, particularly for expatriates. © 2015 Ruijin Hospital, Shanghai Jiaotong University School of Medicine and Wiley Publishing Asia Pty Ltd.

  8. An integrative pharmacological approach to radio telemetry and blood sampling in pharmaceutical drug discovery and safety assessment.

    Science.gov (United States)

    Litwin, Dennis C; Lengel, David J; Kamendi, Harriet W; Bialecki, Russell A

    2011-01-18

    A successful integration of the automated blood sampling (ABS) and telemetry (ABST) system is described. The new ABST system facilitates concomitant collection of physiological variables with blood and urine samples for determination of drug concentrations and other biochemical measures in the same rat without handling artifact. Integration was achieved by designing a 13-inch circular receiving antenna that operates as a plug-in replacement for the existing pair of DSI's orthogonal antennas and is compatible with the rotating cage and open floor design of the BASi Culex® ABS system. The circular receiving antenna's electrical configuration consists of a pair of electrically orthogonal half-toroids that reinforce reception of a dipole transmitter operating within the coil's interior while reducing both external noise pickup and interference from other adjacent dipole transmitters. For validation, measured baclofen concentration (ABST vs. satellite (μM): 69.6 ± 23.8 vs. 76.6 ± 19.5, p = NS) and mean arterial pressure (ABST vs. traditional DSI telemetry (mm Hg): 150 ± 5 vs. 147 ± 4, p = NS) variables were quantitatively and qualitatively similar between rats housed in the ABST system and traditional home cage approaches. The ABST system offers unique advantages over traditional between-group study paradigms that include improved data quality and significantly reduced animal use. The superior within-group model facilitates assessment of multiple physiological and biochemical responses to test compounds in the same animal. The ABST also provides opportunities to evaluate temporal relations between parameters and to investigate anomalous outlier events, because drug concentrations and physiological and biochemical measures for each animal are available for comparison.

  9. Roaming Reference: Reinvigorating Reference through Point of Need Service

    Directory of Open Access Journals (Sweden)

    Kealin M. McCabe

    2011-11-01

    Roaming reference service was pursued as a way to address declining reference statistics. The service was staffed by librarians armed with iPads over a period of six months during the 2010-2011 academic year. Transactional statistics were collected in relation to query type (Research, Facilitative, or Technology), location, and approach (librarian to patron, patron to librarian, or via chat widget). Overall, roaming reference resulted in an additional 228 reference questions, 67% (n=153) of which were research related. Two iterations of the service were implemented: roaming reference as a standalone service (Fall 2010) and roaming reference integrated with traditional reference desk duties (Winter 2011). The results demonstrate that although the Weller Library's reference transactions are declining annually, they are not disappearing. For a roaming reference service to succeed, it must be a standalone service provided in addition to traditional reference services. The integration of the two reference models (roaming reference and reference desk) resulted in a 56% decline in the total number of roaming reference questions from the previous term. The simple act of roaming has the potential to reinvigorate reference services as a whole, forcing librarians outside their comfort zones, allowing them to reach patrons at their point of need.

  10. A multi-assay screening approach for assessment of endocrine-active contaminants in wastewater effluent samples

    Energy Technology Data Exchange (ETDEWEB)

    Metcalfe, Chris D., E-mail: cmetcalfe@trentu.ca [Environmental and Resource Studies, Trent University, Peterborough, ON, K9J 7B8 (Canada); Kleywegt, Sonya [Standards Development Branch, Ontario Ministry of the Environment, 40 St. Clair Ave. West, Toronto, ON, M4V 1M2 (Canada); Letcher, Robert J. [Ecotoxicology and Wildlife Health Division, Science and Technology Branch, Environment Canada, National Wildlife Research Centre, Carleton University, Ottawa, ON, K1A 0H3 (Canada); Topp, Edward [Agriculture and Agri-Food Canada, Southern Crop Protection and Food Research Centre, London, ON, N5V 7T3 (Canada); Wagh, Purva; Trudeau, Vance L.; Moon, Thomas W. [Department of Biology and Centre for Advanced Research in Environmental Genomics, University of Ottawa, Ottawa, ON, K1N 6N5 (Canada)

    2013-06-01

    Environmental agencies must monitor an ever increasing range of contaminants of emerging concern, including endocrine disrupting compounds (EDCs). An alternative to using ultra-trace chemical analysis of samples for EDCs is to test for biological activity using in vitro screening assays, then use these assay results to direct analytical chemistry approaches. In this study, we used both analytical approaches and in vitro bioassays to characterize the EDCs present in treated wastewater from four wastewater treatment plants (WWTPs) in Ontario, Canada. Estrogen-mediated activity was assessed using a yeast estrogenicity screening (YES) assay. An in vitro competitive binding assay was used to assess capacity to interfere with binding of the thyroid hormone, thyroxine (T4) to the recombinant human thyroid hormone transport protein, transthyretin (i.e. hTTR). An in vitro binding assay with a rat peroxisome proliferator responsive element transfected into a rainbow trout gill cell line was used to evaluate binding and subsequent gene expression via the peroxisome proliferator activated receptor (PPAR). Analyses of a suite of contaminants known to be EDCs in extracts from treated wastewater were conducted using either gas chromatography with mass spectrometry (GC-MS) or liquid chromatography with tandem mass spectrometry (LC-MS/MS). Estrogenic activity was detected in the YES assay only in those extracts that contained detectable amounts of estradiol (E2). There was a positive relationship between the degree of response in the T4-hTTR assay and the amounts of polybrominated diphenyl ether (PBDE) congeners 47 and 99, triclosan and the PBDE metabolite, 4-OH-BDE17. Several wastewater extracts gave a positive response in the PPAR assay, but these responses were not correlated with the amounts of any of the EDCs analyzed by LC-MS/MS. 
Overall, these data indicate that a step-wise approach is feasible using a combination of in vitro testing and instrumental analysis to monitor for

  11. A multi-assay screening approach for assessment of endocrine-active contaminants in wastewater effluent samples

    International Nuclear Information System (INIS)

    Metcalfe, Chris D.; Kleywegt, Sonya; Letcher, Robert J.; Topp, Edward; Wagh, Purva; Trudeau, Vance L.; Moon, Thomas W.

    2013-01-01

    Environmental agencies must monitor an ever increasing range of contaminants of emerging concern, including endocrine disrupting compounds (EDCs). An alternative to using ultra-trace chemical analysis of samples for EDCs is to test for biological activity using in vitro screening assays, then use these assay results to direct analytical chemistry approaches. In this study, we used both analytical approaches and in vitro bioassays to characterize the EDCs present in treated wastewater from four wastewater treatment plants (WWTPs) in Ontario, Canada. Estrogen-mediated activity was assessed using a yeast estrogenicity screening (YES) assay. An in vitro competitive binding assay was used to assess capacity to interfere with binding of the thyroid hormone, thyroxine (T4) to the recombinant human thyroid hormone transport protein, transthyretin (i.e. hTTR). An in vitro binding assay with a rat peroxisome proliferator responsive element transfected into a rainbow trout gill cell line was used to evaluate binding and subsequent gene expression via the peroxisome proliferator activated receptor (PPAR). Analyses of a suite of contaminants known to be EDCs in extracts from treated wastewater were conducted using either gas chromatography with mass spectrometry (GC-MS) or liquid chromatography with tandem mass spectrometry (LC-MS/MS). Estrogenic activity was detected in the YES assay only in those extracts that contained detectable amounts of estradiol (E2). There was a positive relationship between the degree of response in the T4-hTTR assay and the amounts of polybrominated diphenyl ether (PBDE) congeners 47 and 99, triclosan and the PBDE metabolite, 4-OH-BDE17. Several wastewater extracts gave a positive response in the PPAR assay, but these responses were not correlated with the amounts of any of the EDCs analyzed by LC-MS/MS. 
Overall, these data indicate that a step-wise approach is feasible using a combination of in vitro testing and instrumental analysis to monitor for

  12. Identifying Risk Factors for Drug Use in an Iranian Treatment Sample: A Prediction Approach Using Decision Trees.

    Science.gov (United States)

    Amirabadizadeh, Alireza; Nezami, Hossein; Vaughn, Michael G; Nakhaee, Samaneh; Mehrpour, Omid

    2018-05-12

    Substance abuse exacts considerable social and health care burdens throughout the world. The aim of this study was to create a prediction model to better identify risk factors for drug use. A prospective cross-sectional study was conducted in South Khorasan Province, Iran. Of the total of 678 eligible subjects, 70% (n: 474) were randomly selected to provide a training set for constructing decision tree and multiple logistic regression (MLR) models. The remaining 30% (n: 204) were employed in a holdout sample to test the performance of the decision tree and MLR models. Predictive performance of the different models was analyzed with the receiver operating characteristic (ROC) curve using the testing set. Independent variables were selected from demographic characteristics and history of drug use. For the decision tree model, the sensitivity and specificity for identifying people at risk for drug abuse were 66% and 75%, respectively, while the MLR model was somewhat less effective at 60% and 73%. Key independent variables in the analyses included first substance experience, age at first drug use, age, place of residence, history of cigarette use, and occupational and marital status. While study findings are exploratory and lack generalizability, they do suggest that the decision tree model holds promise as an effective classification approach for identifying risk factors for drug use. Consistent with prior research in Western contexts, age of drug use initiation was a critical factor in predicting a substance use disorder.
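
    The performance measures quoted in the abstract (sensitivity, specificity, and ROC analysis on a holdout set) can be computed from model predictions with a few lines of code. This is a generic sketch of the metrics themselves, not the study's analysis pipeline.

    ```python
    def sensitivity_specificity(y_true, y_pred):
        """Sensitivity and specificity from binary labels and predictions."""
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
        tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
        return tp / (tp + fn), tn / (tn + fp)

    def roc_auc(y_true, scores):
        """Area under the ROC curve: the probability that a randomly chosen
        positive case outranks a randomly chosen negative one (ties = 1/2)."""
        pos = [s for t, s in zip(y_true, scores) if t == 1]
        neg = [s for t, s in zip(y_true, scores) if t == 0]
        wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
                   for p in pos for q in neg)
        return wins / (len(pos) * len(neg))
    ```

    Comparing the two models then amounts to comparing these statistics computed on the same 30% holdout sample.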

  13. Elaboration and validation of longitudinal reference intervals of fetal weight with a sample of the Brazilian population

    Directory of Open Access Journals (Sweden)

    Érica Luciana de Paula Furlan

    2012-10-01

    OBJECTIVES: To elaborate models for the prediction of fetal weight and longitudinal percentiles of estimated fetal weight (EFW) with a sample of the Brazilian population. METHODS: Prospective observational study. Two groups of pregnant women were recruited: Group EFW (estimation of fetal weight): patients for the elaboration (EFW-El) and validation (EFW-Val) of a fetal weight prediction model; Group LRI (longitudinal reference intervals): pregnant women for the elaboration (LRI-El) and validation (LRI-Val) of longitudinal reference intervals of EFW. Polynomial regression was applied to the EFW-El subgroup data to generate the fetal weight prediction model. The performance of this model was compared with that of others available in the literature. Linear mixed models were used to elaborate longitudinal intervals of EFW with the LRI-El subgroup data. The LRI-Val subgroup data were used to validate these intervals. RESULTS: Four hundred and fifty-eight patients composed Group EFW (EFW-El: 367; EFW-Val: 91) and 315 composed Group LRI (LRI-El: 265; LRI-Val: 50). The formula for calculating EFW was: EFW = -8.277 + 2.146 x BPD x AC x FL - 2.449 x FL x BPD² (BPD: biparietal diameter; AC: abdominal circumference; FL: femur length). The performance of other fetal weight estimation formulas in our sample was significantly worse than that of the model generated in this study. Equations for the prediction of conditional percentiles of EFW were derived from the longitudinal evaluations of the LRI-El subgroup and validated with the LRI-Val subgroup data. CONCLUSIONS: We describe a method for fitting longitudinal reference intervals of EFW, obtained from formulas generated with a sample of the Brazilian population.
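
    The abstract's regression (PFE/EFW = -8.277 + 2.146·BPD·AC·FL - 2.449·FL·BPD², where BPD/DBP is the biparietal diameter, AC/CA the abdominal circumference, and FL/CF the femur length) can be written as a small helper. Note the abstract does not restate the measurement units or the weight units, so they are left as in the original paper.

    ```python
    def estimated_fetal_weight(bpd, ac, fl):
        """Estimated fetal weight from the study's polynomial regression:
        EFW = -8.277 + 2.146*BPD*AC*FL - 2.449*FL*BPD**2
        (units as defined in the original paper)."""
        return -8.277 + 2.146 * bpd * ac * fl - 2.449 * fl * bpd ** 2
    ```

    The longitudinal reference intervals then describe how percentiles of this quantity evolve over gestation for an individual fetus.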

  14. Estimating prevalence and diagnostic test characteristics of bovine cysticercosis in Belgium in the absence of a 'gold standard' reference test using a Bayesian approach.

    Science.gov (United States)

    Jansen, Famke; Dorny, Pierre; Gabriël, Sarah; Eichenberger, Ramon Marc; Berkvens, Dirk

    2018-04-30

    A Bayesian model was developed to estimate values for the prevalence and diagnostic test characteristics of bovine cysticercosis (Taenia saginata) by combining results of four imperfect tests. Samples from 612 bovine carcases that were found negative for cysticercosis during routine meat inspection at three Belgian slaughterhouses underwent enhanced meat inspection (additional incisions in the heart), dissection of the predilection sites, the B158/B60 Ag-ELISA, and the ES Ab-ELISA. This Bayesian approach allows the combination of prior expert opinion with experimental data to estimate the true prevalence of bovine cysticercosis in the absence of a gold standard test. A first model (based on a multinomial distribution and including all possible interactions between the individual tests) required estimation of 31 parameters, while only allowing for 15 parameters to be estimated. Including prior expert information about specificity and sensitivity resulted in an optimal model that reduced the number of parameters to be estimated to 8. The estimated bovine cysticercosis prevalence was 33.9% (95% credibility interval: 27.7-44.4%), while the apparent prevalence based on meat inspection is only 0.23%. The test performances were estimated as follows (sensitivity (Se) - specificity (Sp)): enhanced meat inspection (Se 2.87% - Sp 100%), dissection of predilection sites (Se 69.8% - Sp 100%), Ag-ELISA (Se 26.9% - Sp 99.4%), Ab-ELISA (Se 13.8% - Sp 92.9%). Copyright © 2018 Elsevier B.V. All rights reserved.
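
    For a single imperfect test, the relation between apparent and true prevalence that the Bayesian multi-test model generalizes can be sketched with the Rogan-Gladen estimator. This is a simple frequentist counterpart shown for intuition, not the authors' model, which jointly estimates prevalence and the characteristics of four tests with informative priors.

    ```python
    def apparent_prevalence(true_prev, se, sp):
        """Expected test-positive fraction: true positives plus false positives."""
        return true_prev * se + (1.0 - true_prev) * (1.0 - sp)

    def rogan_gladen(apparent, se, sp):
        """Invert the relation above to recover true prevalence from the
        apparent prevalence, clamped to [0, 1] (Rogan-Gladen estimator)."""
        return max(0.0, min(1.0, (apparent + sp - 1.0) / (se + sp - 1.0)))
    ```

    The inversion illustrates why a test with Se of only a few percent (like routine meat inspection here) yields an apparent prevalence far below the true one.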

  15. On-site sampling and sample-preparation approach with a portable sampler based on hollow-fiber/graphene bars for the microextraction of nitrobenzene compounds in lake water.

    Science.gov (United States)

    Xing, Rongrong; Hu, Shuang; Chen, Xuan; Bai, Xiaohong; Feng, Meiqin

    2015-02-01

    A novel on-site sampling and sample-preparation approach was developed and evaluated in the present work. In this procedure, hollow-fiber/graphene bars (HF/GBs) were used for sampling and sample preparation. A handheld battery-operated electric egg beater was utilized to support the HF/GBs and stir the sample solution to facilitate extraction at the sampling site. Four nitrobenzene compounds (nitrobenzene, o-nitrophenol, m-nitrophenol, and p-nitrophenol) were used as model compounds. Several factors affecting performance, including the type and amount of graphene used and the extraction and desorption times, were investigated and optimized in the laboratory. Under optimized conditions, the enrichment factors of the four nitrobenzene compounds ranged from 46 to 69. Good linearity over the range 0.01-10 μg/mL, with regression coefficients between 0.9917 and 0.9973, was obtained for all analytes. The LOD of the method was 0.3 ng/mL. Satisfactory recoveries (98-102%) and precision (1.0-5.8%) were also achieved. The ultrastructures and extraction mechanism of the HF/GBs were characterized and analyzed. The proposed approach coupled with high-performance liquid chromatography was successfully applied in the extraction and determination of trace nitrobenzene compounds in lake water. Experimental results showed that the approach is simple, convenient, rapid, and practical for routine environmental monitoring. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Enterprise Reference Library

    Science.gov (United States)

    Bickham, Grandin; Saile, Lynn; Havelka, Jacque; Fitts, Mary

    2011-01-01

    Introduction: Johnson Space Center (JSC) offers two extensive libraries that contain journals, research literature and electronic resources. Searching capabilities are available to those individuals residing onsite or through a librarian's search. Many individuals have rich collections of references, but no mechanisms exist to share reference libraries across researchers, projects, or directorates. Likewise, information regarding which references are provided to which individuals is not available, resulting in duplicate requests, redundant labor costs and associated copying fees. In addition, this tends to limit collaboration between colleagues and promotes the establishment of individual, unshared silos of information. The Integrated Medical Model (IMM) team has utilized a centralized reference