WorldWideScience

Sample records for hapmap samples generating

  1. Generating samples for association studies based on HapMap data

    Directory of Open Access Journals (Sweden)

    Chen Yixuan

    2008-01-01

    Full Text Available Abstract Background With the completion of the HapMap project, a variety of computational algorithms and tools have been proposed for haplotype inference, tag SNP selection and genome-wide association studies. Simulated data are commonly used in evaluating these newly developed approaches. In addition to simulations based on population models, empirical data generated by perturbing real data have also been used because they may inherit specific properties from real data. However, there is no tool that is publicly available to generate large-scale simulated variation data by taking into account knowledge from the HapMap project. Results A computer program (gs) was developed to quickly generate a large number of samples based on real data that are useful for a variety of purposes, including evaluating methods for haplotype inference, tag SNP selection and association studies. Two approaches have been implemented to generate dense SNP haplotype/genotype data that share similar local linkage disequilibrium (LD) patterns as those in human populations. The first approach takes haplotype pairs from samples as inputs, and the second approach takes patterns of haplotype block structures as inputs. Both quantitative and qualitative traits have been incorporated in the program. Phenotypes are generated based on a disease model, or based on the effect of a quantitative trait nucleotide, both of which can be specified by users. In addition to single-locus disease models, two-locus disease models have also been implemented that can incorporate any degree of epistasis. Users are allowed to specify all nine parameters in a 3 × 3 penetrance table, as sketched below. For several commonly used two-locus disease models, the program can automatically calculate penetrances based on the population prevalence and marginal effects of a disease that users can conveniently specify. Conclusion The program gs can effectively generate large-scale genetic and phenotypic variation data that can be…
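    The following minimal sketch (not part of the gs program; all parameter values are made up) shows how a user-specified 3 × 3 penetrance table can drive two-locus phenotype generation under Hardy-Weinberg genotype frequencies.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 3 x 3 penetrance table for a two-locus disease model:
# rows index genotypes aa/Aa/AA at locus 1, columns bb/Bb/BB at locus 2.
# gs lets users choose all nine entries; these values are illustrative.
penetrance = np.array([
    [0.01, 0.01, 0.01],
    [0.01, 0.05, 0.10],
    [0.01, 0.10, 0.40],
])

def simulate_phenotypes(freq1, freq2, n):
    """Draw genotypes under Hardy-Weinberg equilibrium and assign
    case/control status from the penetrance table."""
    g1 = rng.binomial(2, freq1, size=n)  # copies of the risk allele at locus 1
    g2 = rng.binomial(2, freq2, size=n)  # copies of the risk allele at locus 2
    affected = rng.random(n) < penetrance[g1, g2]
    return g1, g2, affected

g1, g2, affected = simulate_phenotypes(0.3, 0.2, 10_000)
print(f"simulated prevalence: {affected.mean():.4f}")
```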

  2. Geographical affinities of the HapMap samples.

    Directory of Open Access Journals (Sweden)

    Miao He

    Full Text Available The HapMap samples were collected for medical-genetic studies, but are also widely used in population-genetic and evolutionary investigations. Yet the ascertainment of the samples differs from most population-genetic studies, which collect individuals who live in the same local region as their ancestors. What effects could this non-standard ascertainment have on the interpretation of HapMap results? We compared the HapMap samples with more conventionally-ascertained samples used in population- and forensic-genetic studies, including the HGDP-CEPH panel, making use of published genome-wide autosomal SNP data and Y-STR haplotypes, as well as producing new Y-STR data. We found that the HapMap samples were representative of their broad geographical regions of ancestry according to all tests applied. The YRI and JPT were indistinguishable from independent samples of Yoruba and Japanese in all ways investigated. However, both the CHB and the CEU were distinguishable from all other HGDP-CEPH populations with autosomal markers, and both showed Y-STR similarities to unusually large numbers of populations, perhaps reflecting their admixed origins. The CHB and JPT are readily distinguished from one another with both autosomal and Y-chromosomal markers, and results obtained after combining them into a single sample should be interpreted with caution. The CEU are better described as being of Western European ancestry than of Northern European ancestry as often reported. Both the CHB and CEU show subtle but detectable signs of admixture. Thus the YRI and JPT samples are well-suited to standard population-genetic studies, but the CHB and CEU less so.

  3. An evaluation of the performance of tag SNPs derived from HapMap in a Caucasian population.

    Directory of Open Access Journals (Sweden)

    Alexandre Montpetit

    2006-03-01

    Full Text Available The Haplotype Map (HapMap) project recently generated genotype data for more than 1 million single-nucleotide polymorphisms (SNPs) in four population samples. The main application of the data is in the selection of tag single-nucleotide polymorphisms (tSNPs) to use in association studies. The usefulness of this selection process needs to be verified in populations outside those used for the HapMap project. In addition, it is not known how well the data represent the general population, as only 90-120 chromosomes were used for each population and since the genotyped SNPs were selected so as to have high frequencies. In this study, we analyzed more than 1,000 individuals from Estonia. The population of this northern European country has been influenced by many different waves of migrations from Europe and Russia. We genotyped 1,536 randomly selected SNPs from two 500-kbp ENCODE regions on Chromosome 2. We observed that the tSNPs selected from the CEPH (Centre d'Etude du Polymorphisme Humain) from Utah (CEU) HapMap samples (derived from US residents with northern and western European ancestry) captured most of the variation in the Estonia sample. (Between 90% and 95% of the SNPs with a minor allele frequency of more than 5% have an r2 of at least 0.8 with one of the CEU tSNPs.) Using the reverse approach, tags selected from the Estonia sample could almost equally well describe the CEU sample. Finally, we observed that the sample size, the allelic frequency, and the SNP density in the dataset used to select the tags each have important effects on the tagging performance. Overall, our study supports the use of HapMap data in other Caucasian populations, but the SNP density and the bias towards high-frequency SNPs have to be taken into account when designing association studies.
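    A back-of-the-envelope version of the tagging evaluation described above can be computed directly from genotype matrices. This sketch (illustrative, not the study's pipeline) reports the fraction of common SNPs captured at r2 ≥ 0.8 by a set of tag SNPs.

```python
import numpy as np

def r2(g1, g2):
    """Squared Pearson correlation between two genotype vectors
    (0/1/2 allele counts), the usual proxy for LD r2."""
    c = np.corrcoef(g1, g2)[0, 1]
    return c * c

def tagging_coverage(genotypes, tag_idx, maf_min=0.05, r2_min=0.8):
    """Fraction of SNPs with MAF > maf_min captured at r2 >= r2_min by at
    least one tag. `genotypes` is an (n_snps, n_samples) array of counts."""
    p = genotypes.mean(axis=1) / 2          # allele frequency per SNP
    maf = np.minimum(p, 1 - p)
    common = np.flatnonzero(maf > maf_min)  # keep only common SNPs
    captured = sum(
        any(r2(genotypes[i], genotypes[t]) >= r2_min for t in tag_idx)
        for i in common
    )
    return captured / len(common)
```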

  4. Semantic Modeling for SNPs Associated with Ethnic Disparities in HapMap Samples

    Directory of Open Access Journals (Sweden)

    HyoYoung Kim

    2014-03-01

    Full Text Available Single-nucleotide polymorphisms (SNPs) have been emerging out of efforts to research human diseases and ethnic disparities. A semantic network is needed for in-depth understanding of the impacts of SNPs, because phenotypes are modulated by complex networks, including biochemical and physiological pathways. We identified ethnicity-specific SNPs by eliminating overlapping SNPs from HapMap samples, and the ethnicity-specific SNPs were mapped to the UCSC RefGene lists. Ethnicity-specific genes were identified as follows: 22 genes in the USA (CEU) individuals, 25 genes in the Japanese (JPT) individuals, and 332 genes in the African (YRI) individuals. To analyze the biologically functional implications of ethnicity-specific SNPs, we focused on constructing a semantic network model. Entities for the network, represented by "Gene," "Pathway," "Disease," "Chemical," "Drug," "ClinicalTrials," and "SNP," and the relationships between entities were obtained through curation. Our semantic modeling for ethnicity-specific SNPs showed interesting results in three categories: three diseases ("AIDS-associated nephropathy," "Hypertension," and "Pelvic infection"), one drug ("Methylphenidate"), and five pathways ("Hemostasis," "Systemic lupus erythematosus," "Prostate cancer," "Hepatitis C virus," and "Rheumatoid arthritis"). We found ethnicity-specific genes using the semantic modeling, and the majority of our findings were consistent with previous studies - that an understanding of genetic variability explains ethnicity-specific disparities.

  5. Genotype Imputation for Latinos Using the HapMap and 1000 Genomes Project Reference Panels

    Directory of Open Access Journals (Sweden)

    Xiaoyi eGao

    2012-06-01

    Full Text Available Genotype imputation is a vital tool in genome-wide association studies (GWAS) and meta-analyses of multiple GWAS results. Imputation enables researchers to increase genomic coverage and to pool data generated using different genotyping platforms. HapMap samples are often employed as the reference panel. More recently, the 1000 Genomes Project resource is becoming the primary source for reference panels. Multiple GWAS and meta-analyses are targeting Latinos, the most populous and fastest growing minority group in the US. However, genotype imputation resources for Latinos are at present rather limited compared to those for individuals of European ancestry, largely because of the lack of good reference data. One choice of reference panel for Latinos is the panel derived from the population of Mexican individuals in Los Angeles contained in the HapMap Phase 3 project and the 1000 Genomes Project. However, a detailed evaluation of the quality of the imputed genotypes derived from the public reference panels has not yet been reported. Using simulation studies, the Illumina OmniExpress GWAS data from the Los Angeles Latino Eye Study and the MACH software package, we evaluated the accuracy of genotype imputation in Latinos. Our results show that the 1000 Genomes Project AMR+CEU+YRI reference panel provides the highest imputation accuracy for Latinos, and that also including Asian samples in the panel can reduce imputation accuracy. We also provide the imputation accuracy for each autosomal chromosome using the 1000 Genomes Project panel for Latinos. Our results serve as a guide to future imputation-based analysis in Latinos.
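    Imputation accuracy in masked-genotype experiments like this one is commonly summarized by best-guess concordance and the squared correlation between imputed dosages and true genotypes. A minimal sketch of both metrics (generic, not the MACH-specific procedure):

```python
import numpy as np

def imputation_accuracy(true_genotypes, imputed_dosages):
    """Concordance of rounded best-guess genotypes with the masked truth,
    plus squared dosage correlation (the usual 'dosage r2')."""
    true_genotypes = np.asarray(true_genotypes, float)
    imputed_dosages = np.asarray(imputed_dosages, float)
    concordance = np.mean(np.rint(imputed_dosages) == true_genotypes)
    r = np.corrcoef(true_genotypes, imputed_dosages)[0, 1]
    return concordance, r * r

# Toy example: true 0/1/2 genotypes vs. fractional imputed dosages.
conc, dosage_r2 = imputation_accuracy([0, 1, 2, 1, 0], [0.1, 1.2, 1.8, 0.9, 0.2])
print(f"concordance = {conc:.2f}, dosage r2 = {dosage_r2:.2f}")
```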

  6. Comparison of HapMap and 1000 Genomes Reference Panels in a Large-Scale Genome-Wide Association Study.

    Directory of Open Access Journals (Sweden)

    Paul S de Vries

    Full Text Available An increasing number of genome-wide association (GWA) studies are now using the higher resolution 1000 Genomes Project reference panel (1000G) for imputation, with the expectation that 1000G imputation will lead to the discovery of additional associated loci when compared to HapMap imputation. In order to assess the improvement of 1000G over HapMap imputation in identifying associated loci, we compared the results of GWA studies of circulating fibrinogen based on the two reference panels. Using both HapMap and 1000G imputation we performed a meta-analysis of 22 studies comprising the same 91,953 individuals. We identified six additional signals using 1000G imputation, while 29 loci were associated using both HapMap and 1000G imputation. One locus identified using HapMap imputation was not significant using 1000G imputation. The genome-wide significance threshold of 5×10⁻⁸ is based on the number of independent statistical tests using HapMap imputation, and 1000G imputation may lead to further independent tests that should be corrected for. When using a stricter Bonferroni correction for the 1000G GWA study (P-value < 2.5×10⁻⁸), the number of loci significant only using HapMap imputation increased to 4 while the number of loci significant only using 1000G decreased to 5. In conclusion, 1000G imputation enabled the identification of 20% more loci than HapMap imputation, although the advantage of 1000G imputation became less clear when a stricter Bonferroni correction was used. More generally, our results provide insights that are applicable to the implementation of other dense reference panels that are under development.
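    The thresholds quoted above follow directly from the Bonferroni rule alpha/m: 5×10⁻⁸ corresponds to roughly one million independent HapMap-era tests, and halving the threshold to 2.5×10⁻⁸ corresponds to doubling the assumed number of independent tests (the two-million figure is implied by the threshold, not stated in the abstract):

```python
alpha = 0.05
hapmap_tests = 1_000_000    # approximate independent tests behind 5e-8
g1000_tests = 2_000_000     # assumed count implied by the stricter 2.5e-8

print(alpha / hapmap_tests)  # 5e-08
print(alpha / g1000_tests)   # 2.5e-08
```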

  7. SNPexp - A web tool for calculating and visualizing correlation between HapMap genotypes and gene expression levels

    Directory of Open Access Journals (Sweden)

    Franke Andre

    2010-12-01

    Full Text Available Abstract Background Expression levels for 47,294 transcripts in lymphoblastoid cell lines from all 270 HapMap phase II individuals, and genotypes (both HapMap phase II and III) of 3.96 million single nucleotide polymorphisms (SNPs) in the same individuals are publicly available. We aimed to generate a user-friendly web based tool for visualization of the correlation between SNP genotypes within a specified genomic region and a gene of interest, also known as an expression quantitative trait locus (eQTL) analysis. Results SNPexp is implemented as a server-side script, and publicly available at http://tinyurl.com/snpexp. Correlation between genotype and transcript expression levels is calculated by performing linear regression and the Wald test as implemented in PLINK and visualized using the UCSC Genome Browser. Validation of SNPexp using previously published eQTLs yielded comparable results. Conclusions SNPexp provides a convenient and platform-independent way to calculate and visualize the correlation between HapMap genotypes within a specified genetic region anywhere in the genome and gene expression levels. This allows for investigation of both cis and trans effects. The web interface and utilization of publicly available and widely used software resources makes it an attractive supplement to more advanced bioinformatic tools. For the advanced user the program can be used on a local computer on custom datasets.
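    The core computation SNPexp delegates to PLINK is an additive-model linear regression of expression on allele count with a Wald test on the slope. A self-contained sketch of that test (standard OLS algebra, not SNPexp's actual code):

```python
import numpy as np
from scipy import stats

def eqtl_wald(genotypes, expression):
    """Regress expression on allele count (0/1/2, additive model) and
    return the slope and its Wald-test p-value."""
    g = np.asarray(genotypes, float)
    y = np.asarray(expression, float)
    n = len(g)
    X = np.column_stack([np.ones(n), g])          # intercept + genotype
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - 2)              # residual variance
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    wald = beta[1] / se
    p = 2 * stats.t.sf(abs(wald), df=n - 2)
    return beta[1], p
```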

  8. GLIDERS - A web-based search engine for genome-wide linkage disequilibrium between HapMap SNPs

    Directory of Open Access Journals (Sweden)

    Broxholme John

    2009-10-01

    Full Text Available Abstract Background A number of tools for the examination of linkage disequilibrium (LD) patterns between nearby alleles exist, but none are available for quickly and easily investigating LD at longer ranges (>500 kb). We have developed a web-based query tool (GLIDERS: Genome-wide LInkage DisEquilibrium Repository and Search engine) that enables the retrieval of pairwise associations with r2 ≥ 0.3 across the human genome for any SNP genotyped within HapMap phase 2 and 3, regardless of distance between the markers. Description GLIDERS is an easy to use web tool that only requires the user to enter rs numbers of SNPs they want to retrieve genome-wide LD for (both nearby and long-range). The intuitive web interface handles both manual entry of SNP IDs as well as allowing users to upload files of SNP IDs. The user can limit the resulting inter-SNP associations with easy-to-use menu options. These include MAF limit (5-45%), distance limits between SNPs (minimum and maximum), r2 (0.3 to 1), HapMap population sample (CEU, YRI and JPT+CHB combined), and HapMap build/release. All resulting genome-wide inter-SNP associations are displayed on a single output page, which has a link to a downloadable tab-delimited text file. Conclusion GLIDERS is a quick and easy way to retrieve genome-wide inter-SNP associations and to explore LD patterns for any number of SNPs of interest. GLIDERS can be useful in identifying SNPs with long-range LD. This can highlight mis-mapping or other potential association signal localisation problems.
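    The menu options above amount to simple predicate filters over a precomputed table of pairwise LD records. A sketch with a hypothetical record layout (GLIDERS' real schema is not described in the abstract):

```python
# Each record: (query_snp, partner_snp, distance_bp, r2, partner_maf, population)
ld_records = [
    ("rs123", "rs456", 1_250_000, 0.42, 0.21, "CEU"),
    ("rs123", "rs789", 2_400_000, 0.35, 0.08, "CEU"),
]

def query(snp, population, r2_min=0.3, maf_min=0.05, max_dist=None):
    """Apply GLIDERS-style menu filters: r2 floor, MAF floor, distance cap."""
    for a, b, dist, r2, maf, pop in ld_records:
        if a == snp and pop == population and r2 >= r2_min and maf >= maf_min \
                and (max_dist is None or dist <= max_dist):
            yield b, dist, r2

print(list(query("rs123", "CEU")))   # both toy records pass the default filters
```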

  9. Unexpected Relationships and Inbreeding in HapMap Phase III Populations

    Science.gov (United States)

    Stevens, Eric L.; Baugher, Joseph D.; Shirley, Matthew D.; Frelin, Laurence P.; Pevsner, Jonathan

    2012-01-01

    Correct annotation of the genetic relationships between samples is essential for population genomic studies, which could be biased by errors or omissions. To this end, we used identity-by-state (IBS) and identity-by-descent (IBD) methods to assess genetic relatedness of individuals within HapMap phase III data. We analyzed data from 1,397 individuals across 11 ethnic populations. Our results support previous studies (Pemberton et al., 2010; Kyriazopoulou-Panagiotopoulou et al., 2011) assessing unknown relatedness present within these populations. Additionally, we present evidence for 1,657 novel pairwise relationships across 9 populations. Surprisingly, significant Cotterman coefficients of relatedness K1 (IBD1) were detected between pairs of known parents. Furthermore, significant K2 (IBD2) values were detected in 32 previously annotated parent-child relationships. Consistent with a hypothesis of inbreeding, regions of homozygosity (ROH) were identified in the offspring of related parents, of which a subset overlapped those reported in previous studies (Gibson et al. 2010; Johnson et al. 2011). In total, we inferred 28 inbred individuals with ROH that overlapped areas of relatedness between the parents and/or IBD2 sharing at a different genomic locus between a child and a parent. Finally, 8 previously annotated parent-child relationships had unexpected K0 (IBD0) values (resulting from a chromosomal abnormality or genotype error), and 10 previously annotated second-degree relationships along with 38 other novel pairwise relationships had unexpected IBD2 (indicating two separate paths of recent ancestry). These newly described types of relatedness may impact the outcome of previous studies and should inform the design of future studies relying on the HapMap Phase III resource. PMID:23185369
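    Pairwise relationships are typically inferred by comparing estimated Cotterman coefficients (K0, K1, K2) against their expected values for canonical relationship classes. A nearest-expected-value sketch (illustrative; the study's actual inference is more involved):

```python
def classify_relationship(k0, k1, k2):
    """Assign the relationship class whose expected (K0, K1, K2) is closest
    in squared distance to the observed coefficients."""
    expected = {
        "parent-child":  (0.00, 1.00, 0.00),
        "full siblings": (0.25, 0.50, 0.25),
        "second degree": (0.50, 0.50, 0.00),
        "unrelated":     (1.00, 0.00, 0.00),
    }
    return min(expected, key=lambda rel: sum(
        (obs - exp) ** 2 for obs, exp in zip((k0, k1, k2), expected[rel])))

print(classify_relationship(0.02, 0.96, 0.02))  # -> parent-child
```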

  10. Single-molecule optical genome mapping of a human HapMap and a colorectal cancer cell line.

    Science.gov (United States)

    Teo, Audrey S M; Verzotto, Davide; Yao, Fei; Nagarajan, Niranjan; Hillmer, Axel M

    2015-01-01

    Next-generation sequencing (NGS) technologies have changed our understanding of the variability of the human genome. However, the identification of genome structural variations based on NGS approaches with read lengths of 35-300 bases remains a challenge. Single-molecule optical mapping technologies allow the analysis of DNA molecules of up to 2 Mb and as such are suitable for the identification of large-scale genome structural variations, and for de novo genome assemblies when combined with short-read NGS data. Here we present optical mapping data for two human genomes: the HapMap cell line GM12878 and the colorectal cancer cell line HCT116. High molecular weight DNA was obtained by embedding GM12878 and HCT116 cells, respectively, in agarose plugs, followed by DNA extraction under mild conditions. Genomic DNA was digested with KpnI and 310,000 and 296,000 DNA molecules (≥ 150 kb and ≥ 10 restriction fragments), respectively, were analyzed per cell line using the Argus optical mapping system. Maps were aligned to the human reference by OPTIMA, a new glocal alignment method. Genome coverage of 6.8× and 5.7× was obtained, respectively; 2.9× and 1.7× more than the coverage obtained with previously available software. Optical mapping allows the resolution of large-scale structural variations of the genome, and the scaffold extension of NGS-based de novo assemblies. OPTIMA is an efficient new alignment method; our optical mapping data provide a resource for genome structure analyses of the human HapMap reference cell line GM12878, and the colorectal cancer cell line HCT116.

  11. Comparison of HapMap and 1000 Genomes Reference Panels in a Large-Scale Genome-Wide Association Study

    DEFF Research Database (Denmark)

    de Vries, Paul S; Sabater-Lleal, Maria; Chasman, Daniel I

    2017-01-01

    An increasing number of genome-wide association (GWA) studies are now using the higher resolution 1000 Genomes Project reference panel (1000G) for imputation, with the expectation that 1000G imputation will lead to the discovery of additional associated loci when compared to HapMap imputation. In...

  12. Imputation of variants from the 1000 Genomes Project modestly improves known associations and can identify low-frequency variant-phenotype associations undetected by HapMap based imputation.

    Science.gov (United States)

    Wood, Andrew R; Perry, John R B; Tanaka, Toshiko; Hernandez, Dena G; Zheng, Hou-Feng; Melzer, David; Gibbs, J Raphael; Nalls, Michael A; Weedon, Michael N; Spector, Tim D; Richards, J Brent; Bandinelli, Stefania; Ferrucci, Luigi; Singleton, Andrew B; Frayling, Timothy M

    2013-01-01

    Genome-wide association (GWA) studies have been limited by the reliance on common variants present on microarrays or imputable from the HapMap Project data. More recently, the completion of the 1000 Genomes Project has provided variant and haplotype information for several million variants derived from sequencing over 1,000 individuals. To help understand the extent to which more variants (including low frequency (1% ≤ MAF < 5%) and rare variants) can be used to identify novel associations, we compared association results based on HapMap and 1000 Genomes imputation, respectively; 9 and 11 signals reached a stricter, likely conservative, significance threshold. Imputation of 1000 Genomes genotype data modestly improved the strength of known associations. Of 20 associations detected at genome-wide significance, 19 were more strongly associated in 1000 Genomes imputed data and one was nominally more strongly associated in HapMap imputed data. We also detected an association between a low frequency variant and phenotype that was previously missed by HapMap based imputation approaches. An association between rs112635299 and alpha-1 globulin near the SERPINA gene represented the known association between rs28929474 (MAF = 0.007) and alpha1-antitrypsin that predisposes to emphysema (P = 2.5×10⁻¹²). Our data provide important proof of principle that 1000 Genomes imputation will detect novel, low frequency-large effect associations.

  13. Determining fertility in a bovine subject comprises detecting in a sample from the bovine subject the presence or absence of genetic marker alleles associated with a trait indicative of fertility of the bovine subject and/or offspring

    DEFF Research Database (Denmark)

    2009-01-01

    NOVELTY - Determining fertility in a bovine subject comprises detecting in a sample from the bovine subject the presence or absence of two or more genetic marker alleles that are associated with a trait indicative of fertility of the bovine subject and/or offspring. USE - The methods are useful... for determining fertility in a bovine subject; and selecting bovine subjects for breeding purposes (all claimed). DETAILED DESCRIPTION - Determining fertility in a bovine subject comprises detecting in a sample from the bovine subject the presence or absence of two or more genetic marker alleles... that are associated with a trait indicative of fertility of the bovine subject and/or offspring, where the two or more genetic marker alleles are single nucleotide polymorphisms selected from Hapmap60827-rs29019866, ARS-BFGL-NGS-40979, Hapmap47854-BTA-119090, ARS-BFGL-NGS-114679, Hapmap43841-BTA-34601, Hapmap43407...

  14. Genome-wide screen for universal individual identification SNPs based on the HapMap and 1000 Genomes databases.

    Science.gov (United States)

    Huang, Erwen; Liu, Changhui; Zheng, Jingjing; Han, Xiaolong; Du, Weian; Huang, Yuanjian; Li, Chengshi; Wang, Xiaoguang; Tong, Dayue; Ou, Xueling; Sun, Hongyu; Zeng, Zhaoshu; Liu, Chao

    2018-04-03

    Differences among SNP panels for individual identification, in both SNP selection and the populations sampled, have led to few common SNPs, compromising their universal applicability. To screen for universal SNPs, we performed genome-wide SNP mining in multiple populations based on the HapMap and 1000 Genomes databases. SNPs with high minor allele frequencies (MAF) in 37 populations were selected. As the MAF threshold rose from ≥0.35 to ≥0.43, the number of selected SNPs decreased from 2,769 to 0. A total of 117 SNPs with MAF ≥0.39 showed no linkage disequilibrium with each other in any population. For 116 of the 117 SNPs, the cumulative match probability (CMP) ranged from 2.01 × 10⁻⁴⁸ to 1.93 × 10⁻⁵⁰ and the cumulative exclusion probability (CEP) ranged from 0.9999999996653 to 0.9999999999945. In 134 tested Han samples, 110 of the 117 SNPs retained high MAF and conformed to Hardy-Weinberg equilibrium, with CMP = 4.70 × 10⁻⁴⁷ and CEP = 0.999999999862. Analyzing the same number of autosomal SNPs as in the HID-Ion AmpliSeq Identity Panel, i.e., 90 randomly chosen from the 110 SNPs, our panel yielded better CMP and CEP values. Taken together, the 110-SNP panel is advantageous for forensic testing, and this study provides plenty of highly informative SNPs for compiling final universal panels.
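    Under Hardy-Weinberg equilibrium the per-SNP random match probability is the sum of squared genotype frequencies, and the cumulative match probability (CMP) is the product across independent SNPs. A sketch showing why ~110 high-MAF SNPs land in the quoted 10⁻⁴⁷ range (the allele frequencies here are illustrative, not the study's):

```python
import numpy as np

def match_probability(maf):
    """Random match probability at a biallelic SNP under Hardy-Weinberg
    equilibrium: the chance two random individuals share a genotype."""
    p, q = maf, 1 - maf
    genotype_freqs = np.array([p * p, 2 * p * q, q * q])
    return float(np.sum(genotype_freqs ** 2))

mafs = np.full(110, 0.45)   # illustrative panel of 110 high-MAF SNPs
cmp_value = np.prod([match_probability(m) for m in mafs])
print(f"cumulative match probability: {cmp_value:.2e}")   # ~1e-47
```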

  15. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which indicates signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose a new sampling pattern generator; the proposed algorithm generates random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed.

  16. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

    1 - Description of program or function: Random samples for different distribution types are generated. Distribution types as required for performance assessment modeling of geologic nuclear waste disposal are provided: uniform; log-uniform (base 10 or natural); normal; lognormal (base 10 or natural); exponential; Bernoulli; user-defined continuous distribution. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than random, sampling can be chosen. Truncated limits can be specified on many distributions whose usual definition has infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included
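    A minimal sketch of the ingredients named above: a linear congruential generator for uniforms, an inverse-CDF transform to another distribution, and rejection for truncation limits. Constants and the chosen distribution are illustrative, not BWIP's.

```python
import math

class LCG:
    """Linear congruential generator (Park-Miller constants) yielding
    uniforms on (0, 1)."""
    def __init__(self, seed=12345):
        self.state = seed

    def uniform(self):
        self.state = (16807 * self.state) % 2147483647
        return self.state / 2147483647

def exponential(u, rate):
    """Inverse-CDF transform of a uniform draw to an exponential variate."""
    return -math.log(1.0 - u) / rate

def truncated(draw, lo, hi):
    """Resample until the draw falls inside the truncation limits."""
    x = draw()
    while not (lo <= x <= hi):
        x = draw()
    return x

rng = LCG(seed=2024)
samples = [exponential(rng.uniform(), rate=0.5) for _ in range(5)]
bounded = truncated(lambda: exponential(rng.uniform(), 0.5), 0.1, 3.0)
print(samples, bounded)
```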

  17. Sample Scripts for Generating PaGE-OM XML

    Lifescience Database Archive (English)

    Full Text Available Sample Scripts for Generating PaGE-OM XML. This page offers sample scripts... based on MySQL. Outline of procedure: 6. Creating RDB tables for generating PaGE-OM XML (these scripts help you...; download: create_tables_sql2.zip). 7. Generating PaGE-OM XML from phenotype data (this sample Perl script helps y...

  18. Validation of a next-generation sequencing assay for clinical molecular oncology.

    Science.gov (United States)

    Cottrell, Catherine E; Al-Kateb, Hussam; Bredemeyer, Andrew J; Duncavage, Eric J; Spencer, David H; Abel, Haley J; Lockwood, Christina M; Hagemann, Ian S; O'Guin, Stephanie M; Burcea, Lauren C; Sawyer, Christopher S; Oschwald, Dayna M; Stratman, Jennifer L; Sher, Dorie A; Johnson, Mark R; Brown, Justin T; Cliften, Paul F; George, Bijoy; McIntosh, Leslie D; Shrivastava, Savita; Nguyen, Tudung T; Payton, Jacqueline E; Watson, Mark A; Crosby, Seth D; Head, Richard D; Mitra, Robi D; Nagarajan, Rakesh; Kulkarni, Shashikant; Seibert, Karen; Virgin, Herbert W; Milbrandt, Jeffrey; Pfeifer, John D

    2014-01-01

    Currently, oncology testing includes molecular studies and cytogenetic analysis to detect genetic aberrations of clinical significance. Next-generation sequencing (NGS) allows rapid analysis of multiple genes for clinically actionable somatic variants. The WUCaMP assay uses targeted capture for NGS analysis of 25 cancer-associated genes to detect mutations at actionable loci. We present clinical validation of the assay and a detailed framework for design and validation of similar clinical assays. Deep sequencing of 78 tumor specimens (≥ 1000× average unique coverage across the capture region) achieved high sensitivity for detecting somatic variants at low allele fraction (AF). Validation revealed sensitivities and specificities of 100% for detection of single-nucleotide variants (SNVs) within coding regions, compared with SNP array sequence data (95% CI = 83.4-100.0 for sensitivity and 94.2-100.0 for specificity) or whole-genome sequencing (95% CI = 89.1-100.0 for sensitivity and 99.9-100.0 for specificity) of HapMap samples. Sensitivity for detecting variants at an observed 10% AF was 100% (95% CI = 93.2-100.0) in HapMap mixes. Analysis of 15 masked specimens harboring clinically reported variants yielded concordant calls for 13/13 variants at AF of ≥ 15%. The WUCaMP assay is a robust and sensitive method to detect somatic variants of clinical significance in molecular oncology laboratories, with reduced time and cost of genetic analysis allowing for strategic patient management. Copyright © 2014 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  19. PWR steam generator tubing sample library

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    In order to compile the tubing sample library, two approaches were employed: (a) tubing sample replication by either chemical or mechanical means, based on field tube data and metallography reports for tubes already destructively examined; and (b) acquisition of field tubes removed from operating or retired steam generators. In addition, a unique mercury modeling concept is in use to guide the selection of replica samples. A compendium was compiled that summarizes field observations and morphologies of steam generator tube degradation types based on available NDE, destructive examinations, and field reports. This compendium was used in selecting candidate degradation types that were manufactured for inclusion in the tube library

  20. Treatment of Nuclear Data Covariance Information in Sample Generation

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wieselquist, William [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division

    2017-10-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on developing a sampling capability that can handle the challenges of generating samples from nuclear cross-section data. The covariance information between energy groups tends to be very ill-conditioned and thus poses a problem using traditional methods for generated correlated samples. This report outlines a method that addresses the sample generation from cross-section matrices.
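    One standard way to draw correlated samples from an ill-conditioned covariance matrix is to repair its spectrum before factorization: clip tiny or negative eigenvalues, then use the resulting matrix square root. This sketch illustrates that generic repair; the report's actual method may differ.

```python
import numpy as np

def sample_correlated(mean, cov, n_samples, eig_floor=1e-10, seed=0):
    """Multivariate-normal sampling that tolerates an ill-conditioned
    covariance by flooring its eigenvalues (an eigendecomposition-based
    alternative to Cholesky, which fails on indefinite matrices)."""
    mean = np.asarray(mean, float)
    vals, vecs = np.linalg.eigh(cov)
    vals = np.clip(vals, eig_floor, None)   # enforce positive definiteness
    root = vecs @ np.diag(np.sqrt(vals))    # square root of the repaired matrix
    z = np.random.default_rng(seed).standard_normal((n_samples, len(mean)))
    return mean + z @ root.T
```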

  1. Treatment of Nuclear Data Covariance Information in Sample Generation

    International Nuclear Information System (INIS)

    Swiler, Laura Painton; Adams, Brian M.; Wieselquist, William

    2017-01-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on developing a sampling capability that can handle the challenges of generating samples from nuclear cross-section data. The covariance information between energy groups tends to be very ill-conditioned and thus poses a problem using traditional methods for generated correlated samples. This report outlines a method that addresses the sample generation from cross-section matrices.

  2. Getting DNA copy numbers without control samples.

    Science.gov (United States)

    Ortiz-Estevez, Maria; Aramburu, Ander; Rubio, Angel

    2012-08-16

    The selection of the reference to scale the data in a copy number analysis has paramount importance to achieve accurate estimates. Usually this reference is generated using control samples included in the study. However, these control samples are not always available and in these cases, an artificial reference must be created. A proper generation of this signal is crucial in terms of both noise and bias. We propose NSA (Normality Search Algorithm), a scaling method that works with and without control samples. It is based on the assumption that genomic regions enriched in SNPs with identical copy numbers in both alleles are likely to be normal. These normal regions are predicted for each sample individually and used to calculate the final reference signal. NSA can be applied to any CN data regardless of the microarray technology and preprocessing method. It also finds an optimal weighting of the samples minimizing possible batch effects. Five human datasets (a subset of HapMap samples, Glioblastoma Multiforme (GBM), Ovarian, Prostate and Lung Cancer experiments) have been analyzed. It is shown that using only tumoral samples, NSA is able to remove the bias in the copy number estimation, to reduce the noise and therefore, to increase the ability to detect copy number aberrations (CNAs). These improvements allow NSA to also detect recurrent aberrations more accurately than other state of the art methods. NSA provides a robust and accurate reference for scaling probe signals data to CN values without the need of control samples. It minimizes the problems of bias, noise and batch effects in the estimation of CNs. Therefore, NSA scaling approach helps to better detect recurrent CNAs than current methods. The automatic selection of references makes it useful to perform bulk analysis of many GEO or ArrayExpress experiments without the need of developing a parser to find the normal samples or possible batches within the data. The method is available in the open-source R package…
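    The scaling step NSA performs can be pictured with a simple stand-in: once some probes are predicted to lie in normal (copy number 2) regions, their median signal defines the reference and every probe is scaled against it. A toy sketch of that idea (not the NSA algorithm itself):

```python
import numpy as np

def scale_to_copy_number(probe_signal, normal_mask):
    """Scale raw probe intensities to copy-number estimates, using only
    probes predicted to be normal (CN = 2) as the reference. The mask
    stands in for NSA's per-sample normality prediction."""
    reference = np.median(probe_signal[normal_mask])
    return 2.0 * probe_signal / reference

signal = np.array([1.1, 1.0, 0.9, 2.1, 0.5])   # toy intensities
normal = np.array([True, True, True, False, False])
print(scale_to_copy_number(signal, normal))    # -> [2.2, 2.0, 1.8, 4.2, 1.0]
```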

  3. Comparison of Antidepressant Efficacy-related SNPs Among Taiwanese and Four Populations in the HapMap Database

    Directory of Open Access Journals (Sweden)

    Mei-Hung Chi

    2011-07-01

    Full Text Available The genetic influence of single nucleotide polymorphisms (SNPs) on antidepressant efficacy has been previously demonstrated. To evaluate whether there are ethnic differences, we compared the allele frequencies of antidepressant efficacy-related SNPs between the Taiwanese population and four other populations in the HapMap database. We recruited 198 Taiwanese major depression patients and 106 Taiwanese controls. A panel of possibly relevant SNPs (in brain-derived neurotrophic factor, 5-hydroxytryptamine receptor 2A, interleukin 1 beta, and G-protein beta 3 subunit genes) was selected for comparisons of allele frequencies using the χ2 test. Our results suggested no difference between Taiwanese patients and controls, but there were significant differences among Taiwanese controls and the other four ethnic groups in brain-derived neurotrophic factor, 5-hydroxytryptamine receptor 2A, interleukin 1 beta and G-protein beta 3 subunit genes. We conclude that there are ethnic differences in the allele frequencies of antidepressant efficacy-related SNPs, and that the degree of variation is consistent with geographic distances. Further investigation is required to verify the attribution of genetic differences to ethnic-specific antidepressant responses.
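    Allele-frequency comparisons of this kind reduce to a χ2 test on a 2×2 table of allele counts per population pair. A sketch with made-up counts (not the study's data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: two populations; columns: counts of the two alleles at one SNP.
table = np.array([
    [150, 62],   # e.g. Taiwanese controls (illustrative counts)
    [ 90, 70],   # e.g. one HapMap population (illustrative counts)
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3g}")
```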

  4. Gas generation from Hanford grout samples

    International Nuclear Information System (INIS)

    Jonah, C.D.; Kapoor, S.; Matheson, M.S.; Mulac, W.A.; Meisel, D.

    1996-01-01

    In an extension of our work on the radiolytic processes that occur in the waste tanks at the Hanford site, we studied the gas generation from grout samples that contained nuclear waste simulants. Grout is one option for the long-term storage of low-level nuclear waste solutions but the radiolytic effects on grout have not been thoroughly defined. In particular, the generation of potentially flammable and hazardous gases required quantification. A research team at Argonne examined this issue and found that the total amount of gases generated radiolytically from the WHC samples was an order of magnitude higher than predicted. This implies that novel pathways for charge migration from the solid grout to the associated water are responsible for gas evolution. The grout samples produced hydrogen, nitrous oxide, and carbon monoxide as well as nitrogen and oxygen. Yields of each of these substances were determined for doses that are equivalent to about 80 years storage of the grout. Carbon monoxide, which was produced in 2% yield, is of particular importance because even small amounts may adversely affect catalytic conversion instrumentation that has been planned for installation in the storage vaults.

  5. Cr(VI) generation during sample preparation of solid samples – A ...

    African Journals Online (AJOL)

    Cr(VI) generation during sample preparation of solid samples – A chromite ore case study. R.I Glastonbury, W van der Merwe, J.P Beukes, P.G van Zyl, G Lachmann, C.J.H Steenkamp, N.F Dawson, M.H Stewart ...

  6. Getting DNA copy numbers without control samples

    Directory of Open Access Journals (Sweden)

    Ortiz-Estevez Maria

    2012-08-01

    Full Text Available Abstract Background The selection of the reference to scale the data in a copy number analysis has paramount importance to achieve accurate estimates. Usually this reference is generated using control samples included in the study. However, these control samples are not always available and in these cases, an artificial reference must be created. A proper generation of this signal is crucial in terms of both noise and bias. We propose NSA (Normality Search Algorithm), a scaling method that works with and without control samples. It is based on the assumption that genomic regions enriched in SNPs with identical copy numbers in both alleles are likely to be normal. These normal regions are predicted for each sample individually and used to calculate the final reference signal. NSA can be applied to any CN data regardless of the microarray technology and preprocessing method. It also finds an optimal weighting of the samples minimizing possible batch effects. Results Five human datasets (a subset of HapMap samples, Glioblastoma Multiforme (GBM), Ovarian, Prostate and Lung Cancer experiments) have been analyzed. It is shown that using only tumoral samples, NSA is able to remove the bias in the copy number estimation, to reduce the noise and therefore, to increase the ability to detect copy number aberrations (CNAs). These improvements allow NSA to also detect recurrent aberrations more accurately than other state of the art methods. Conclusions NSA provides a robust and accurate reference for scaling probe signals data to CN values without the need of control samples. It minimizes the problems of bias, noise and batch effects in the estimation of CNs. Therefore, NSA scaling approach helps to better detect recurrent CNAs than current methods. The automatic selection of references makes it useful to perform bulk analysis of many GEO or ArrayExpress experiments without the need of developing a parser to find the normal samples or possible batches within the…

  7. Effective selection of informative SNPs and classification on the HapMap genotype data

    Directory of Open Access Journals (Sweden)

    Wang Lipo

    2007-12-01

    Full Text Available Abstract Background Since the single nucleotide polymorphisms (SNPs) are genetic variations which determine the difference between any two unrelated individuals, the SNPs can be used to identify the correct source population of an individual. For efficient population identification with the HapMap genotype data, as few informative SNPs as possible are required from the original 4 million SNPs. Recently, Park et al. (2006) adopted the nearest shrunken centroid method to classify the three populations, i.e., Utah residents with ancestry from Northern and Western Europe (CEU), Yoruba in Ibadan, Nigeria in West Africa (YRI), and Han Chinese in Beijing together with Japanese in Tokyo (CHB+JPT), from which 100,736 SNPs were obtained and the top 82 SNPs could completely classify the three populations. Results In this paper, we propose to first rank each feature (SNP) using a ranking measure, i.e., a modified t-test or F-statistics. Then from the ranking list, we form different feature subsets by sequentially choosing different numbers of features (e.g., 1, 2, 3, ..., 100) with top ranking values, train and test them by a classifier, e.g., the support vector machine (SVM), thereby finding one subset which has the highest classification accuracy. Compared to the classification method of Park et al., we obtain a better result, i.e., good classification of the 3 populations using on average 64 SNPs. Conclusion Experimental results show that both the modified t-test and the F-statistics method are very effective in ranking SNPs by their classification capabilities. Combined with the SVM classifier, a desirable feature subset (with the minimum size and most informativeness) can be quickly found in a greedy manner after ranking all SNPs. Our method is able to identify a very small number of important SNPs that can determine the populations of individuals.
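    The rank-then-grow procedure described above is easy to reproduce with standard tools: score every SNP with a one-way F statistic, then enlarge the top-ranked subset until a cross-validated SVM reaches the target accuracy. A sketch using scikit-learn (illustrative, not the authors' code):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def rank_by_f_statistic(X, y):
    """Rank SNP features (0/1/2 allele counts) by a one-way F statistic
    across population labels, largest first."""
    groups = [X[y == g] for g in np.unique(y)]
    grand = X.mean(axis=0)
    k, n = len(groups), len(X)
    between = sum(len(g) * (g.mean(axis=0) - grand) ** 2 for g in groups) / (k - 1)
    within = sum(((g - g.mean(axis=0)) ** 2).sum(axis=0) for g in groups) / (n - k)
    return np.argsort(-(between / (within + 1e-12)))

def smallest_accurate_subset(X, y, max_k=100, target=1.0):
    """Grow the top-ranked subset until 5-fold cross-validated SVM
    accuracy reaches the target, echoing the paper's greedy search."""
    order = rank_by_f_statistic(X, y)
    for k in range(1, max_k + 1):
        acc = cross_val_score(SVC(kernel="linear"), X[:, order[:k]], y, cv=5).mean()
        if acc >= target:
            return order[:k], acc
    return order[:max_k], acc
```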

  8. Data analysis for steam generator tubing samples

    International Nuclear Information System (INIS)

    Dodd, C.V.

    1996-07-01

    The objective of the Improved Eddy-Current ISI for Steam Generators program is to upgrade and validate eddy-current inspections, including probes, instrumentation, and data processing techniques for inservice inspection of new, used, and repaired steam generator tubes; to improve defect detection, classification and characterization as affected by diameter and thickness variations, denting, probe wobble, tube sheet, tube supports, copper and sludge deposits, even when defect types and other variables occur in combination; to transfer this advanced technology to NRC's mobile NDE laboratory and staff. This report provides a description of the application of advanced eddy-current neural network analysis methods for the detection and evaluation of common steam generator tubing flaws including axial and circumferential outer-diameter stress-corrosion cracking and intergranular attack. The report describes the training of the neural networks on tubing samples with known defects and the subsequent evaluation results for unknown samples. Evaluations were done in the presence of artifacts. Computer programs are given in the appendix

  9. cn.MOPS: mixture of Poissons for discovering copy number variations in next-generation sequencing data with a low false discovery rate.

    Science.gov (United States)

    Klambauer, Günter; Schwarzbauer, Karin; Mayr, Andreas; Clevert, Djork-Arné; Mitterecker, Andreas; Bodenhofer, Ulrich; Hochreiter, Sepp

    2012-05-01

    Quantitative analyses of next-generation sequencing (NGS) data, such as the detection of copy number variations (CNVs), remain challenging. Current methods detect CNVs as changes in the depth of coverage along chromosomes. Technological or genomic variations in the depth of coverage thus lead to a high false discovery rate (FDR), even upon correction for GC content. In the context of association studies between CNVs and disease, a high FDR means many false CNVs, thereby decreasing the discovery power of the study after correction for multiple testing. We propose 'Copy Number estimation by a Mixture Of PoissonS' (cn.MOPS), a data processing pipeline for CNV detection in NGS data. In contrast to previous approaches, cn.MOPS incorporates modeling of depths of coverage across samples at each genomic position. Therefore, cn.MOPS is not affected by read count variations along chromosomes. Using a Bayesian approach, cn.MOPS decomposes variations in the depth of coverage across samples into integer copy numbers and noise by means of its mixture components and Poisson distributions, respectively. The noise estimate allows for reducing the FDR by filtering out detections having high noise that are likely to be false detections. We compared cn.MOPS with the five most popular methods for CNV detection in NGS data using four benchmark datasets: (i) simulated data, (ii) NGS data from a male HapMap individual with implanted CNVs from the X chromosome, (iii) data from HapMap individuals with known CNVs, (iv) high coverage data from the 1000 Genomes Project. cn.MOPS outperformed its five competitors in terms of precision (1-FDR) and recall for both gains and losses in all benchmark data sets. The software cn.MOPS is publicly available as an R package at http://www.bioinf.jku.at/software/cnmops/ and at Bioconductor.
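    The key idea of cn.MOPS, modeling read counts across samples at each genomic position, can be caricatured in a few lines: treat the across-sample median of a window as the copy-number-2 level and round each sample's scaled count to an integer. The real method replaces this with a Bayesian mixture of Poissons and a noise-based FDR filter.

```python
import numpy as np

def naive_cn_calls(read_counts):
    """Crude across-sample copy-number estimate per genomic window.
    `read_counts` has shape (n_windows, n_samples); the across-sample
    median is taken as the CN = 2 coverage level of each window."""
    counts = np.asarray(read_counts, float)
    cn2_level = np.maximum(np.median(counts, axis=1, keepdims=True), 1.0)
    return np.rint(2.0 * counts / cn2_level).astype(int)

windows = np.array([[100, 98, 205, 102],    # one sample shows a gain
                    [ 50, 49,  52,  24]])   # one sample shows a loss
print(naive_cn_calls(windows))              # [[2 2 4 2], [2 2 2 1]]
```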

  10. New Generation Flask Sampling Technology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Smith, James R. [AOS, Inc., Colorado Springs, CO (United States)

    2017-11-09

    Scientists are turning their focus to the Arctic, site of one of the strongest climate change signals. A new generation of technologies is required to function within that harsh environment, chart evolution of its trace gases and provide new kinds of information for models of the atmosphere. Our response to the solicitation tracks how global atmospheric monitoring was launched more than a half century ago; namely, acquisition of discrete samples of air by flask and subsequent analysis in the laboratory. AOS is proposing to develop a new generation of flask sampling technology. It will enable the new Arctic programs to begin with objective high density sampling of the atmosphere by UAS. The Phase I program will build the prototype flask technology and show that it can acquire and store mol fractions of CH4 and CO2 and value of δ13C with good fidelity. A CAD model will be produced for the entire platform including a package with 100 flasks and the airframe with auto-pilot, electronic propulsion and ground-to-air communications. A mobile flask analysis station will be prototyped in Phase I and designed to final form in Phase II. It expends very small sample per analysis and will interface directly to the flask package integrated permanently into the UAS fuselage. Commercial Applications and Other Benefits: • The New Generation Flask Sampling Technology able to provide a hundred or more samples of air per UAS mission. • A mobile analysis station expending far less sample than the existing ones and small enough to be stationed at the remote sites of Arctic operations. • A new form of validation for continuous trace gas observations from all platforms including the small UAS. • Further demonstration to potential customers of the AOS capabilities to invent, build, deploy and exploit entire platforms for observations of Earth’s atmosphere and ocean. Key Words: Flask Sampler, Mobile Analysis Station, Trace Gas, CO2, CH4, δC13, UAS, Baseline Airborne Observatory

  11. Generation of complementary sampled phase-only holograms.

    Science.gov (United States)

    Tsang, P W M; Chow, Y T; Poon, T-C

    2016-10-03

    If an image is uniformly down-sampled into a sparse form and converted into a hologram, the phase component alone will be adequate to reconstruct the image. However, the appearance of the reconstructed image is degraded with numerous empty holes. In this paper, we present a low complexity and non-iterative solution to this problem. Briefly, two phase-only holograms are generated for an image, each based on a different down-sampling lattice. Subsequently, the holograms are displayed alternately at high frame rate. The reconstructed images of the 2 holograms will appear to be a single, densely sampled image with enhance visual quality.

  12. Classifier Directed Data Hybridization for Geographic Sample Supervised Segment Generation

    Directory of Open Access Journals (Sweden)

    Christoff Fourie

    2014-11-01

    Full Text Available Quality segment generation is a well-known challenge and research objective within Geographic Object-based Image Analysis (GEOBIA). Although methodological avenues within GEOBIA are diverse, segmentation commonly plays a central role in most approaches, influencing and being influenced by surrounding processes. A general approach using supervised quality measures, specifically user-provided reference segments, suggests casting the parameters of a given segmentation algorithm as a multidimensional search problem. In such a sample supervised segment generation approach, spatial metrics observing the user-provided reference segments may drive the search process. The search is commonly performed by metaheuristics. A novel sample supervised segment generation approach is presented in this work, where the spectral content of provided reference segments is queried. A one-class classification process using spectral information from inside the provided reference segments is used to generate a probability image, which in turn is employed to direct a hybridization of the original input imagery. Segmentation is performed on such a hybrid image. These processes are adjustable, interdependent and form a part of the search problem. Results are presented detailing the performance of four method variants compared to the generic sample supervised segment generation approach, under various conditions in terms of resultant segment quality, required computing time and search process characteristics. Multiple metrics, metaheuristics and segmentation algorithms are tested with this approach. Using the spectral data contained within user-provided reference segments to tailor the output generally improves the results in the investigated problem contexts, but at the expense of additional required computing time.
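    The spectral querying step can be pictured as one-class scoring of every pixel against the statistics of the reference segments, followed by blending the resulting probability band into the input image. A Gaussian-model sketch (illustrative; the paper evaluates several one-class classifiers):

```python
import numpy as np

def hybrid_image(image, reference_mask, alpha=0.5):
    """Score pixels by a Gaussian model of the spectra inside the
    reference segments (Mahalanobis distance), then alpha-blend the
    probability band into the image to steer later segmentation.
    Assumes `image` is (H, W, bands) scaled to [0, 1] and the mask
    covers at least a few pixels."""
    pixels = image.reshape(-1, image.shape[-1])
    ref = pixels[reference_mask.ravel()]
    mean = ref.mean(axis=0)
    inv_cov = np.linalg.pinv(np.cov(ref, rowvar=False))
    diff = pixels - mean
    d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)   # squared Mahalanobis
    prob = np.exp(-0.5 * d2).reshape(image.shape[:2])
    return (1 - alpha) * image + alpha * prob[..., None]
```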

  13. Evaluation of sampling plans for in-service inspection of steam generator tubes

    International Nuclear Information System (INIS)

    Kurtz, R.J.; Heasler, P.G.; Baird, D.B.

    1994-02-01

    This report summarizes the results of three previous studies to evaluate and compare the effectiveness of sampling plans for steam generator tube inspections. An analytical evaluation and Monte Carlo simulation techniques were the methods used to evaluate sampling plan performance. To test the performance of candidate sampling plans under a variety of conditions, ranges of inspection system reliability were considered along with different distributions of tube degradation. Results from the eddy current reliability studies performed with the retired-from-service Surry 2A steam generator were utilized to guide the selection of appropriate probability of detection and flaw sizing models for use in the analysis. Different distributions of tube degradation were selected to span the range of conditions that might exist in operating steam generators. The principal means of evaluating sampling performance was to determine the effectiveness of the sampling plan for detecting and plugging defective tubes. A summary of key results from the eddy current reliability studies is presented. The analytical and Monte Carlo simulation analyses are discussed along with a synopsis of key results and conclusions
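    The Monte Carlo side of such an evaluation is straightforward to sketch: scatter defective tubes, draw an inspection sample, apply a probability of detection, and tally how many defective tubes are caught and plugged. All numbers below are illustrative, not the report's:

```python
import numpy as np

rng = np.random.default_rng(7)

def expected_plug_fraction(n_defective=30, sample_frac=0.20,
                           pod=0.80, n_trials=20_000):
    """Monte Carlo estimate of the fraction of defective tubes detected
    by a random-sample inspection with a flat probability of detection."""
    caught = 0
    for _ in range(n_trials):
        inspected = rng.random(n_defective) < sample_frac
        detected = inspected & (rng.random(n_defective) < pod)
        caught += detected.sum()
    return caught / (n_trials * n_defective)

print(f"{expected_plug_fraction():.3f}")   # ~ sample_frac * pod = 0.16
```

    Real plans typically expand the sample when a defect is found; modeling that expansion is where the compared plans differ.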

  14. Efficient Sampling of the Structure of Crypto Generators' State Transition Graphs

    Science.gov (United States)

    Keller, Jörg

    Cryptographic generators, e.g. stream cipher generators like the A5/1 used in GSM networks or pseudo-random number generators, are widely used in cryptographic network protocols. Basically, they are finite state machines with deterministic transition functions. Their state transition graphs typically cannot be analyzed analytically, nor can they be explored completely because of their size, which typically is at least n = 2⁶⁴. Yet, their structure, i.e. number and sizes of weakly connected components, is of interest because a structure deviating significantly from expected values for random graphs may form a distinguishing attack that indicates a weakness or backdoor. By sampling, one randomly chooses k nodes, derives their distribution onto connected components by graph exploration, and extrapolates these results to the complete graph. In known algorithms, the computational cost to determine the component for one randomly chosen node is up to O(√n), which severely restricts the sample size k. We present an algorithm where the computational cost to find the connected component for one randomly chosen node is O(1), so that a much larger sample size k can be analyzed in a given time. We report on the performance of a prototype implementation, and about preliminary analysis for several generators.

  15. An automated synthesis-purification-sample-management platform for the accelerated generation of pharmaceutical candidates.

    Science.gov (United States)

    Sutherland, J David; Tu, Noah P; Nemcek, Thomas A; Searle, Philip A; Hochlowski, Jill E; Djuric, Stevan W; Pan, Jeffrey Y

    2014-04-01

    A flexible and integrated flow-chemistry-synthesis-purification compound-generation and sample-management platform has been developed to accelerate the production of small-molecule organic-compound drug candidates in pharmaceutical research. Central to the integrated system is a Mitsubishi robot, which hands off samples throughout the process to the next station, including synthesis and purification, sample dispensing for purity and quantification analysis, dry-down, and aliquot generation.

  16. Experimental technique to measure thoron generation rate of building material samples using RAD7 detector

    International Nuclear Information System (INIS)

    Csige, I.; Szabó, Zs.; Szabó, Cs.

    2013-01-01

    Thoron (²²⁰Rn) is the second most abundant radon isotope in our living environment. In some dwellings it is present in significant amount which calls for its identification and remediation. Indoor thoron originates mainly from building materials. In this work we have developed and tested an experimental technique to measure thoron generation rate in building material samples using the RAD7 radon-thoron detector. The mathematical model of the measurement technique provides the thoron concentration response of RAD7 as a function of the sample thickness. For experimental validation of the technique an adobe building material sample was selected for measuring the thoron concentration at nineteen different sample thicknesses. Fitting the parameters of the model to the measurement results, both the generation rate and the diffusion length of thoron were estimated. We have also determined the optimal sample thickness for estimating the thoron generation rate from a single measurement. -- Highlights: • RAD7 is used for the determination of thoron generation rate (emanation). • The described model takes into account the thoron decay and attenuation. • The model describes well the experimental results. • A single point measurement method is offered at a determined sample thickness
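    Fitting a saturating diffusion model to the thickness series recovers both parameters. The sketch below uses a generic tanh-type slab solution and made-up data; the paper's full model also includes decay and chamber-volume terms, so both the functional form and the numbers are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def thoron_response(d, g, L):
    """Steady-state signal vs. sample thickness d for a slab source with
    generation-rate parameter g and diffusion length L (saturates for d >> L)."""
    return g * L * np.tanh(d / (2 * L))

# Illustrative thicknesses (cm) and detector responses (arbitrary units).
d_obs = np.array([0.5, 1.0, 2.0, 3.0, 5.0, 8.0])
c_obs = np.array([4.8, 8.9, 14.2, 16.5, 18.1, 18.6])

(g_fit, L_fit), _ = curve_fit(thoron_response, d_obs, c_obs, p0=(10.0, 1.0))
print(f"generation parameter ~ {g_fit:.1f}, diffusion length ~ {L_fit:.2f} cm")
```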

  17. Development of graph self-generating program of radiation sampling for geophysical prospecting with AutoLISP

    International Nuclear Information System (INIS)

    Zhou Hongsheng

    2009-01-01

    A program that automatically generates radiation-sampling graphs for geophysical prospecting has been developed with AutoLISP. The program, written entirely by the author, can generate and annotate sampling graphs by itself. It has greatly increased drawing efficiency and avoids the graph errors that arise from manual drawing. (authors)

  18. Sampling and analysis plan for sampling of liquid waste streams generated by 222-S Laboratory Complex operations

    International Nuclear Information System (INIS)

    Benally, A.B.

    1997-01-01

    This Sampling and Analysis Plan (SAP) establishes the requirements and guidelines to be used by the Waste Management Federal Services of Hanford, Inc. personnel in characterizing liquid waste generated at the 222-S Laboratory Complex. The characterization process to verify the accuracy of process knowledge used for designation and subsequent management of wastes consists of three steps: to prepare the technical rationale and the appendix in accordance with the steps outlined in this SAP; to implement the SAP by sampling and analyzing the requested waste streams; and to compile the report and evaluate the findings against the objectives of this SAP. This SAP applies to portions of the 222-S Laboratory Complex defined as Generator under the Resource Conservation and Recovery Act (RCRA). Any portion of the 222-S Laboratory Complex that is defined or permitted under RCRA as a treatment, storage, or disposal (TSD) facility is excluded from this document. This SAP applies to the liquid waste generated in the 222-S Laboratory Complex. Because the analytical data obtained will be used to manage waste properly, including waste compatibility and waste designation, this SAP will provide directions for obtaining and maintaining the information as required by WAC 173-303

  19. DNA Qualification Workflow for Next Generation Sequencing of Histopathological Samples

    Science.gov (United States)

    Simbolo, Michele; Gottardi, Marisa; Corbo, Vincenzo; Fassan, Matteo; Mafficini, Andrea; Malpeli, Giorgio; Lawlor, Rita T.; Scarpa, Aldo

    2013-01-01

    Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instrumentations to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. NanoDrop UV-spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard workflow for…

  20. DNA qualification workflow for next generation sequencing of histopathological samples.

    Directory of Open Access Journals (Sweden)

    Michele Simbolo

    Full Text Available Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus, a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double-strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instruments to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit, and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. The NanoDrop UV spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard ...

  1. Stuttering Attitudes among Turkish Family Generations and Neighbors from Representative Samples

    Science.gov (United States)

    Ozdemir, R. Sertan; St. Louis, Kenneth O.; Topbas, Seyhun

    2011-01-01

    Purpose: Attitudes toward stuttering, measured by the "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S"), are compared among (a) two different representative samples; (b) family generations (children, parents, and either grandparents or uncles and aunts) and neighbors; (c) children, parents, grandparents/adult…

  2. Nitrogen Detection in Bulk Samples Using a D-D Reaction-Based Portable Neutron Generator

    Directory of Open Access Journals (Sweden)

    A. A. Naqvi

    2013-01-01

    Full Text Available Nitrogen concentration was measured via the 2.52 MeV nitrogen gamma ray from melamine, caffeine, urea, and disperse orange bulk samples using a newly designed D-D portable neutron generator-based prompt gamma ray setup. In spite of the low flux of thermal neutrons produced by the D-D reaction-based portable neutron generator, and the interference of the 2.52 MeV gamma ray from nitrogen in bulk samples with the 2.50 MeV gamma ray from bismuth in the BGO detector material, an excellent agreement between the experimental and calculated yields of nitrogen gamma rays indicates satisfactory performance of the setup for the detection of nitrogen in bulk samples.

  3. Gel-aided sample preparation (GASP)--a simplified method for gel-assisted proteomic sample generation from protein extracts and intact cells.

    Science.gov (United States)

    Fischer, Roman; Kessler, Benedikt M

    2015-04-01

    We describe a "gel-assisted" proteomic sample preparation method for MS analysis. Solubilized protein extracts or intact cells are copolymerized with acrylamide, facilitating denaturation, reduction, quantitative cysteine alkylation, and matrix formation. Gel-aided sample preparation has been optimized to be highly flexible, scalable, and to allow reproducible sample generation from 50 cells to milligrams of protein extracts. This methodology is fast, sensitive, easy to use on a wide range of sample types, and accessible to nonspecialists. © 2014 The Authors. PROTEOMICS published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. A generational perspective on work values in a South African sample

    Directory of Open Access Journals (Sweden)

    Petronella Jonck

    2017-01-01

    Full Text Available Orientation: In order to ensure harmonious relationships in the workplace, work values of different generational cohorts need to be investigated and understood. Research purpose: The purpose of this study was to investigate the work values of a South African sample from a generational perspective, in order to foster an understanding of the similarities and differences of different generational cohorts in terms of work values. Motivation of the study: Understanding the work values of different generational cohorts could assist organisations to manage and retain human capital in an increasingly competitive environment. Furthermore, it could assist organisations to develop an advanced understanding of employee behaviour, which should inform conflict-resolution strategies to deal with reported conflict between different generational cohorts. Research design, approach and method: The study was conducted within the positivist paradigm and was quantitative in nature. Data were gathered from 301 employees representing three different generational cohorts, namely the Baby Boomers, Generation X and Generation Y. A cross-sectional study was conducted, and data were collected once off by means of the Values Scale. The psychometric properties of the Values Scale have a reliability coefficient of 0.95, and the scale has been applied successfully in various iterations. Main findings: The findings indicate statistically significant differences and similarities between the various generational cohorts in terms of work values. More specifically, similarities and differences between the various generational cohorts were observed with regard to the values of authority, creativity, risk and social interaction in the work context. Practical/managerial implications: Organisations can use the findings of the study to strengthen employee interaction within the work environment. In addition, the findings can be used to inform retention and management strategies, in order

  5. Fault Sample Generation for Virtual Testability Demonstration Test Subject to Minimal Maintenance and Scheduled Replacement

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2015-01-01

    Full Text Available Virtual testability demonstration testing brings new requirements to fault sample generation. First, the fault occurrence process is described by stochastic process theory, and it is shown that a fault occurrence process subject to minimal repair is a nonhomogeneous Poisson process (NHPP). Second, the interarrival time distribution function of the next fault event is proposed, and three typical kinds of parameterized NHPP are discussed. Third, the procedure of fault sample generation is put forward under the assumptions of minimal maintenance and scheduled replacement. The fault modes and their occurrence times subject to specified conditions and time period can be obtained. Finally, an antenna driving subsystem in an automatic pointing and tracking platform is taken as a case to illustrate the proposed method. Results indicate that both the size and structure of the fault samples generated by the proposed method are reasonable and effective. The proposed method can be applied well to virtual testability demonstration tests.
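
As an editorial illustration of the sampling step described above, the sketch below generates NHPP event times by Lewis-Shedler thinning, assuming a power-law (Crow/AMSAA) intensity as one typical parameterized NHPP family; the parameter values, horizon, and function names are illustrative and not taken from the paper.

```python
import random

def power_law_intensity(t, beta=2.0, eta=500.0):
    """Crow/AMSAA NHPP intensity; beta > 1 models wear-out (illustrative values)."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

def nhpp_fault_times(horizon, intensity, lam_max):
    """Lewis-Shedler thinning: sample fault times of an NHPP on [0, horizon]."""
    times, t = [], 0.0
    while True:
        t += random.expovariate(lam_max)          # candidate from a homogeneous process
        if t > horizon:
            return times
        if random.random() < intensity(t) / lam_max:
            times.append(t)                       # accept with prob lambda(t)/lam_max

# the intensity is increasing, so its value at the horizon bounds it from above
faults = nhpp_fault_times(1000.0, power_law_intensity,
                          lam_max=power_law_intensity(1000.0))
print(len(faults), faults[:5])
```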

  6. Data Transformation Functions for Expanded Search Spaces in Geographic Sample Supervised Segment Generation

    OpenAIRE

    Christoff Fourie; Elisabeth Schoepfer

    2014-01-01

    Sample supervised image analysis, in particular sample supervised segment generation, shows promise as a methodological avenue applicable within Geographic Object-Based Image Analysis (GEOBIA). Segmentation is acknowledged as a constituent component within typically expansive image analysis processes. A general extension to the basic formulation of an empirical discrepancy measure directed segmentation algorithm parameter tuning approach is proposed. An expanded search landscape is defined, c...

  7. FRAGSION: ultra-fast protein fragment library generation by IOHMM sampling.

    Science.gov (United States)

    Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin

    2016-07-01

    Speed, accuracy, and robustness in building a protein fragment library have important implications for de novo protein structure prediction, since fragment-based methods are among the most successful approaches in template-free modeling (FM). The majority of existing fragment detection methods rely on database-driven search strategies to identify candidate fragments, which are inherently time-consuming and often hinder the possibility of locating longer fragments due to the limited sizes of databases. It is also difficult to alleviate the effect of noisy sequence-based predicted features, such as secondary structures, on fragment quality. Here, we present FRAGSION, a database-free method to efficiently generate a protein fragment library by sampling from an Input-Output Hidden Markov Model. FRAGSION offers some unique features compared to existing approaches in that it (i) is lightning-fast, consuming only a few seconds of CPU time to generate a fragment library for a protein of typical length (300 residues); (ii) can generate dynamic-size fragments of any length (even for the whole protein sequence); and (iii) offers ways to handle noise in predicted secondary structure during fragment sampling. On an FM dataset from the most recent Critical Assessment of Structure Prediction, we demonstrate that FRAGSION provides advantages over the state-of-the-art fragment-picking protocol of the ROSETTA suite by speeding up computation by several orders of magnitude while achieving comparable fragment quality. Source code and executable versions of FRAGSION for Linux and MacOS are freely available to non-commercial users at http://sysbio.rnet.missouri.edu/FRAGSION/. It is bundled with a manual and example data. Contact: chengji@missouri.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  8. The determination of arsenic, selenium, antimony, and tin in complex environmental samples by hydride generation AAS

    International Nuclear Information System (INIS)

    Johnson, D.; Beach, C.

    1990-01-01

    Hydride generation techniques are used routinely for the determination of As, Se, Sb, and Sn in water samples. Advantages include high sensitivity, simplicity, and relative freedom from interferences. Continuous-flow designs greatly reduce analysis time as well as improve precision and allow for automation. However, the accurate analysis of more complex environmental samples such as industrial sludges, soil samples, river sediments, and fly ash remains difficult. Numerous contributing factors influence the accuracy of the hydride technique. Sample digestion methods and sample preparation procedures are of critical importance. The digestion must adequately solubilize the elements of interest without loss by volatilization. Sample preparation procedures that guarantee the proper analyte oxidation state and eliminate nitric acid and inter-element interferences are needed. In this study, difficult environmental samples were analyzed for As, Se, Sb, and Sn by continuous-flow hydride generation. Sample preparation methods were optimized to eliminate interferences. The results of spike recovery studies will be presented. Data from the analysis of the same samples by graphite furnace AAS will be presented for comparison of accuracy, precision, and analysis time.

  9. Optimal sampling period of the digital control system for the nuclear power plant steam generator water level control

    International Nuclear Information System (INIS)

    Hur, Woo Sung; Seong, Poong Hyun

    1995-01-01

    A great effort has been made to improve nuclear plant control systems through digital technologies, and a long-term schedule for control system upgrades has been prepared with an aim to implementation in next-generation nuclear plants. For a digital control system, it is important to choose the sampling period carefully for analysis and design, because both the performance and the stability of the system depend on its value. There is, however, currently no systematic method used universally for determining the sampling period of a digital control system. A traditional rule of thumb is to select a sampling frequency of 20 to 30 times the bandwidth of the analog control system that has the same configuration and parameters as the digital one. In this paper, a new method to select the sampling period is suggested which takes into account the performance as well as the stability of the digital control system. Using Irving's steam generator model, the optimal sampling period of an assumed digital control system for steam generator level control is estimated and then verified in the digital control simulation system for Kori-2 nuclear power plant steam generator level control. Consequently, we conclude that the optimal sampling period of the digital control system for Kori-2 steam generator level control is 1 second for all power ranges. 7 figs., 3 tabs., 8 refs. (Author)
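
The 20-30x-bandwidth rule of thumb quoted above reduces to simple arithmetic; the sketch below applies it, with the 0.04 Hz bandwidth a purely hypothetical value (chosen only so the resulting range happens to bracket the paper's 1-second conclusion).

```python
def sampling_period_range(bandwidth_hz, low=20.0, high=30.0):
    """Rule of thumb: sampling frequency = 20-30x the analog loop bandwidth."""
    return 1.0 / (high * bandwidth_hz), 1.0 / (low * bandwidth_hz)

# hypothetical steam-generator level-loop bandwidth of 0.04 Hz
t_min, t_max = sampling_period_range(0.04)
print(f"choose T between {t_min:.2f} s and {t_max:.2f} s")  # 0.83 s .. 1.25 s
```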

  10. Towards a Mobile Ecogenomic sensor: the Third Generation Environmental Sample Processor (3G-ESP).

    Science.gov (United States)

    Birch, J. M.; Pargett, D.; Jensen, S.; Roman, B.; Preston, C. M.; Ussler, W.; Yamahara, K.; Marin, R., III; Hobson, B.; Zhang, Y.; Ryan, J. P.; Scholin, C. A.

    2016-02-01

    Researchers are increasingly using one or more autonomous platforms to characterize ocean processes that change in both space and time. Conceptually, studying processes that change quickly both spatially and temporally seems relatively straightforward: one needs to sample in many locations synoptically over time, or follow a coherent water mass and sample it repeatedly. However, implementing either approach presents many challenges. For example, acquiring samples over days to weeks far from shore, without human intervention, requires multiple systems to work together seamlessly, and the level of autonomy, navigation, and communications needed to conduct the work exposes the complexity of these requirements. We are addressing these challenges by developing a new generation of robotic systems that are primarily aimed at studies of microbially mediated processes. As a step towards realizing this new capability, we have taken lessons learned from our second-generation Environmental Sample Processor (2G-ESP), a robotic microbiology "lab-in-a-can", and have re-engineered the system for use on a Tethys-class Long Range AUV (LRAUV). The new instrument is called the third-generation ESP (3G-ESP), and its integration with the LRAUV provides mobility and a persistent presence not seen before in microbial oceanography. The 3G-ESP autonomously filters a water sample and then either preserves that material for eventual return to a laboratory or processes the sample in real time for further downstream molecular analyses. The 3G-ESP modularizes the hardware needed for collecting and preparing a sample, separating it from subsequent molecular analyses through the use of self-contained "cartridges". Cartridges currently come in two forms: one for the preservation of a sample, and the other for onboard homogenization and handoff for downstream processing via one or more analytical devices. The 3G-ESP is designed as a stand-alone instrument, and thus could be deployed on a variety of ...

  11. ANALYSIS OF MONTE CARLO SIMULATION SAMPLING TECHNIQUES ON SMALL SIGNAL STABILITY OF WIND GENERATOR- CONNECTED POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    TEMITOPE RAPHAEL AYODELE

    2016-04-01

    Full Text Available Monte Carlo simulation using the Simple Random Sampling (SRS) technique is popularly known for its ability to handle complex uncertainty problems. However, to produce a reasonable result it requires a huge sample size, which makes it computationally expensive, time-consuming, and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness, and speed for small signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (the Single Machine Infinite Bus and the IEEE 16-machine 68-bus test system). The accuracy of the two sampling techniques is determined by comparing results at different sample sizes with the IDEAL (conventional) values. The robustness is determined from the variance reduction observed when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. The results show that sample sizes generated with LHS produce the same result as the IDEAL values starting from a sample size of 100; about 100 samples of a random variable generated using the LHS method are therefore good enough to produce reasonable results for practical purposes in small signal stability application. LHS also has the smallest variance when the experiment is repeated 100 times, signifying its robustness over SRS. An LHS sample of size 100 produces the same result as the conventional method with a sample size of 50,000; the reduced sample size required by LHS gives it a computational speed advantage (about six times) over the conventional method.
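
For flavor, the sketch below contrasts SRS with LHS on a one-dimensional toy expectation, where LHS reduces to stratified sampling; the response function and sample sizes are illustrative, and this is not the eigenvalue analysis performed in the article.

```python
import random
import statistics

def srs_mean(g, n):
    """Simple random sampling estimate of E[g(U)], U ~ Uniform(0, 1)."""
    return statistics.fmean(g(random.random()) for _ in range(n))

def lhs_mean(g, n):
    """Latin hypercube (1-D): one uniform draw in each of n equal strata."""
    return statistics.fmean(g((k + random.random()) / n) for k in range(n))

g = lambda u: u ** 2                                   # toy response, true mean 1/3
srs = [srs_mean(g, 100) for _ in range(200)]
lhs = [lhs_mean(g, 100) for _ in range(200)]
print(statistics.variance(srs), statistics.variance(lhs))  # LHS variance is much smaller
```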

  12. Comparisons of methods for generating conditional Poisson samples and Sampford samples

    OpenAIRE

    Grafström, Anton

    2005-01-01

    Methods for conditional Poisson sampling (CP-sampling) and Sampford sampling are compared and the focus is on the efficiency of the methods. The efficiency is investigated by simulation in different sampling situations. It was of interest to compare methods since new methods for both CP-sampling and Sampford sampling were introduced by Bondesson, Traat & Lundqvist in 2004. The new methods are acceptance rejection methods that use the efficient Pareto sampling method. They are found to be ...
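
A minimal sketch of CP-sampling in its naive rejective form, assuming given inclusion probabilities; the Pareto-based acceptance-rejection methods compared in the thesis are not reproduced here.

```python
import random

def conditional_poisson_sample(p, n):
    """CP-sampling by naive rejection: repeat independent Bernoulli (Poisson
    sampling) draws with inclusion probabilities p until the size equals n."""
    while True:
        s = [i for i, pi in enumerate(p) if random.random() < pi]
        if len(s) == n:
            return s

print(conditional_poisson_sample([0.2, 0.5, 0.7, 0.3, 0.3], n=2))
```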

  13. Propionibacterium acnes: disease-causing agent or common contaminant? Detection in diverse patient samples by next generation sequencing

    DEFF Research Database (Denmark)

    Mollerup, Sarah; Friis-Nielsen, Jens; Vinner, Lasse

    2016-01-01

    Propionibacterium acnes is the most abundant bacterium on human skin, particularly in sebaceous areas. P. acnes is suggested to be an opportunistic pathogen involved in the development of diverse medical conditions, but is also a proven contaminant of human samples and surgical wounds. Its significance as a pathogen is consequently a matter of debate. In the present study we investigated the presence of P. acnes DNA in 250 next generation sequencing datasets generated from 180 samples of 20 different sample types, mostly of cancerous origin. The samples were either subjected to microbial enrichment, involving nuclease treatment to reduce the amount of host nucleic acids, or shotgun-sequenced. We detected high proportions of P. acnes in enriched samples, particularly skin derived and other tissue samples, with levels being higher in enriched compared to shotgun-sequenced samples. P. acnes ...

  14. Efficient generation of volatile species for cadmium analysis in seafood and rice samples by a modified chemical vapor generation system coupled with atomic fluorescence spectrometry

    International Nuclear Information System (INIS)

    Yang, Xin-an; Chi, Miao-bin; Wang, Qing-qing; Zhang, Wang-bing

    2015-01-01

    Highlights: • We develop a modified chemical vapor generation method coupled with AFS for the determination of cadmium. • The response of Cd could be increased at least four-fold compared to the conventional thiourea and Co(II) system. • A simple mixing-sequence experiment is designed to study the reaction mechanism. • The interference of transition metal ions can be easily eliminated by adding DDTC. • The method is successfully applied to seafood and rice samples. - Abstract: A vapor generation procedure to determine Cd by atomic fluorescence spectrometry (AFS) has been established. Volatile species of Cd are generated by reaction of the acidified sample containing Fe(II) and L-cysteine (Cys) with sodium tetrahydroborate (NaBH₄). The presence of 5 mg L⁻¹ Fe(II) and 0.05% m/v Cys improves the efficiency of Cd vapor generation substantially, about four-fold compared with the conventional thiourea and Co(II) system. Three experiments with different mixing sequences and reaction times were designed to study the reaction mechanism. The results document that the stability of Cd(II)–Cys complexes is higher than that of Cys–THB complexes (THB denotes NaBH₄), while the Cys–THB complexes contribute more to improving the Cd vapor generation efficiency than the Cd(II)–Cys complexes. Meanwhile, the addition of Fe(II) catalyzes the Cd vapor generation. Under the optimized conditions, the detection limit of Cd is 0.012 μg L⁻¹; relative standard deviations vary between 0.8% and 5.5% for replicate measurements of the standard solution. In the presence of 0.01% DDTC, Cu(II), Pb(II), and Zn(II) have no significant influence up to 5 mg L⁻¹, 10 mg L⁻¹, and 10 mg L⁻¹, respectively. The accuracy of the method was verified through analysis of certified reference materials, and the proposed method has been applied to the determination of Cd in seafood and rice samples.

  15. Efficient generation of volatile species for cadmium analysis in seafood and rice samples by a modified chemical vapor generation system coupled with atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Xin-an, E-mail: 13087641@qq.com; Chi, Miao-bin, E-mail: 1161306667@qq.com; Wang, Qing-qing, E-mail: wangqq8812@163.com; Zhang, Wang-bing, E-mail: ahutwbzh@163.com

    2015-04-15

    Highlights: • We develop a modified chemical vapor generation method coupled with AFS for the determination of cadmium. • The response of Cd could be increased at least four-fold compared to the conventional thiourea and Co(II) system. • A simple mixing-sequence experiment is designed to study the reaction mechanism. • The interference of transition metal ions can be easily eliminated by adding DDTC. • The method is successfully applied to seafood and rice samples. - Abstract: A vapor generation procedure to determine Cd by atomic fluorescence spectrometry (AFS) has been established. Volatile species of Cd are generated by reaction of the acidified sample containing Fe(II) and L-cysteine (Cys) with sodium tetrahydroborate (NaBH₄). The presence of 5 mg L⁻¹ Fe(II) and 0.05% m/v Cys improves the efficiency of Cd vapor generation substantially, about four-fold compared with the conventional thiourea and Co(II) system. Three experiments with different mixing sequences and reaction times were designed to study the reaction mechanism. The results document that the stability of Cd(II)–Cys complexes is higher than that of Cys–THB complexes (THB denotes NaBH₄), while the Cys–THB complexes contribute more to improving the Cd vapor generation efficiency than the Cd(II)–Cys complexes. Meanwhile, the addition of Fe(II) catalyzes the Cd vapor generation. Under the optimized conditions, the detection limit of Cd is 0.012 μg L⁻¹; relative standard deviations vary between 0.8% and 5.5% for replicate measurements of the standard solution. In the presence of 0.01% DDTC, Cu(II), Pb(II), and Zn(II) have no significant influence up to 5 mg L⁻¹, 10 mg L⁻¹, and 10 mg L⁻¹, respectively. The accuracy of the method was verified through analysis of certified reference materials, and the proposed method has been applied to the determination of Cd in seafood and rice samples.

  16. Inference for multivariate regression model based on multiply imputed synthetic data generated via posterior predictive sampling

    Science.gov (United States)

    Moura, Ricardo; Sinha, Bimal; Coelho, Carlos A.

    2017-06-01

    The recent popularity of synthetic data as a Statistical Disclosure Control technique has enabled the development of several methods for generating and analyzing such data, but these almost always rely on asymptotic distributions and are consequently not adequate for small-sample datasets. Thus, a likelihood-based exact inference procedure is derived for the matrix of regression coefficients of the multivariate regression model, for multiply imputed synthetic data generated via Posterior Predictive Sampling. Because it is based on exact distributions, this procedure may be used even with small-sample datasets. Simulation studies compare the results obtained from the proposed exact inferential procedure with the results obtained from an adaptation of Reiter's combination rule to multiply imputed synthetic datasets, and an application to the 2000 Current Population Survey is discussed.
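
As a hedged illustration of Posterior Predictive Sampling, the sketch below draws synthetic responses for a univariate normal linear regression under the standard noninformative prior; the paper's multivariate regression setting and exact inference procedure are not reproduced, and all data are simulated placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_via_pps(X, y, m):
    """Draw m synthetic copies of y from the posterior predictive distribution
    of a normal linear regression under the standard noninformative prior."""
    n, p = X.shape
    xtx_inv = np.linalg.inv(X.T @ X)
    beta_hat = xtx_inv @ X.T @ y
    sse = float((y - X @ beta_hat) @ (y - X @ beta_hat))
    copies = []
    for _ in range(m):
        sigma2 = sse / rng.chisquare(n - p)                         # sigma^2 | y
        beta = rng.multivariate_normal(beta_hat, sigma2 * xtx_inv)  # beta | sigma^2, y
        copies.append(X @ beta + rng.normal(0.0, np.sqrt(sigma2), n))  # y* draw
    return copies

X = np.column_stack([np.ones(50), rng.normal(size=50)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=50)
synthetic = synthetic_via_pps(X, y, m=5)    # five multiply imputed synthetic datasets
```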

  17. Radiolytic and thermal generation of gases from Hanford grout samples

    Energy Technology Data Exchange (ETDEWEB)

    Meisel, D.; Jonah, C.D.; Kapoor, S.; Matheson, M.S.; Mulac, W.A.

    1993-10-01

    Gamma irradiation of WHC-supplied samples of grouted Tank 102-AP simulated nonradioactive waste has been carried out at three dose rates: 0.25, 0.63, and 130 krad/hr. The low dose rate corresponds to that in the actual grout vaults; with the high dose rate, doses equivalent to more than 40 years in the grout vault were achieved. An average G(H₂) = 0.047 molecules/100 eV was found, independent of dose rate. The rate of H₂ production decreases above 80 Mrad. For other gases, G(N₂) = 0.12, G(O₂) = 0.026, G(N₂O) = 0.011, and G(CO) = 0.0042 at 130 krad/hr were determined. At lower dose rates, N₂ and O₂ could not be measured because of interference by trapped air. The value of G(H₂) is higher than expected, suggesting segregation of water from nitrate and nitrite salts in the grout. The total pressure generated by the radiolysis at 130 krad/hr has been independently measured, and total amounts of gases generated were calculated from this measurement. Good agreement between this measurement and the sum of all the gases that were independently determined was obtained. Therefore, the individual gas measurements account for most of the major components that are generated by the radiolysis. At 90 °C, H₂, N₂, and N₂O were generated at a rate that could be described by exponential formation of each of the gases. Gases measured at the lower temperatures were probably residual trapped gases. An as yet unknown product interfered with oxygen determinations at temperatures above ambient. The thermal results do not affect the radiolytic findings.
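
The G-value reported above converts to an absolute gas yield with simple unit arithmetic; a sketch, assuming 1 Mrad = 1.0e4 J/kg and treating the dose and G-value as the only inputs:

```python
EV_PER_JOULE = 1.0 / 1.602e-19   # electron-volts per joule
AVOGADRO = 6.022e23

def h2_yield_mol_per_kg(dose_mrad, g_value=0.047):
    """Radiolytic H2 yield from a G-value given in molecules per 100 eV absorbed."""
    joules_per_kg = dose_mrad * 1.0e4          # assumes 1 Mrad = 1.0e4 J/kg
    molecules = (g_value / 100.0) * joules_per_kg * EV_PER_JOULE
    return molecules / AVOGADRO

print(h2_yield_mol_per_kg(80.0))   # ~3.9e-3 mol H2 per kg of grout at 80 Mrad
```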

  18. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    40 CFR Protection of Environment (2010-07-01): § 761.308, Sample selection by random number generation on any two-dimensional square grid (cross-referenced from § 761.79(b)(3)). ... For the area created in accordance with paragraph (a) of this section, select two random numbers: one each for ...
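
A loose, hypothetical illustration of the idea of drawing one random number per grid axis; the actual regulatory procedure in § 761.308 specifies details not visible in the truncated record above, and the grid size and point count below are made up.

```python
import random

def select_grid_points(nx, ny, k, seed=None):
    """Pick k sampling locations on an nx-by-ny square grid by drawing one
    random index per axis for each location (illustrative helper, not the rule)."""
    rng = random.Random(seed)
    return [(rng.randint(1, nx), rng.randint(1, ny)) for _ in range(k)]

print(select_grid_points(20, 20, k=3, seed=42))
```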

  19. Measurements of tritium (HTO, TFWT, OBT) in environmental samples at varying distances from a nuclear generating station

    Energy Technology Data Exchange (ETDEWEB)

    Kotzer, T.G.; Workman, W.J.G

    1999-12-01

    Concentrations of tritium have been measured in environmental samples (vegetation, water, soil, air) from sites distal and proximal to a CANDU nuclear generating station in Southern Ontario (OPG-Pickering). Levels of tissue-free water tritium (TFWT) and organically bound tritium (OBT) in vegetation are as high as 24,000 TU immediately adjacent to the nuclear generating station and rapidly decrease to levels comparable to natural ambient concentrations of tritium in the environment (approximately ≤ 60 TU). Tritium concentrations (OBT, TFWT) have also been measured in samples of vegetation and tree rings growing at substantial distances from nuclear generating stations, and are within a factor of 1 to 2 of the ambient levels of tritium measured in precipitation in several parts of Canada (approximately ≤ 30 TU). (author)

  20. Measurements of tritium (HTO, TFWT, OBT) in environmental samples at varying distances from a nuclear generating station

    International Nuclear Information System (INIS)

    Kotzer, T.G.; Workman, W.J.G.

    1999-12-01

    Concentrations of tritium have been measured in environmental samples (vegetation, water, soil, air) from sites distal and proximal to a CANDU nuclear generating station in Southern Ontario (OPG-Pickering). Levels of tissue-free water tritium (TFWT) and organically bound tritium (OBT) in vegetation are as high as 24,000 TU immediately adjacent to the nuclear generating station and rapidly decrease to levels comparable to natural ambient concentrations of tritium in the environment (approximately ≤ 60 TU). Tritium concentrations (OBT, TFWT) have also been measured in samples of vegetation and tree rings growing at substantial distances from nuclear generating stations, and are within a factor of 1 to 2 of the ambient levels of tritium measured in precipitation in several parts of Canada (approximately ≤ 30 TU). (author)

  1. Generator and Setup for Emulating Exposures of Biological Samples to Lightning Strokes.

    Science.gov (United States)

    Rebersek, Matej; Marjanovic, Igor; Begus, Samo; Pillet, Flavien; Rols, Marie-Pierre; Miklavcic, Damijan; Kotnik, Tadej

    2015-10-01

    We aimed to develop a system for controlled exposure of biological samples to the conditions they experience when lightning strikes their habitats. We based the generator on a capacitor charged via a bridge rectifier and a dc-dc converter, and discharged via a relay, delivering arcs similar to natural lightning strokes in electric current waveform and similarly accompanied by acoustic shock waves. We coupled the generator to our exposure chamber described previously, measured the electrical and acoustic properties of the arc discharges delivered, and assessed their ability to inactivate bacterial spores. Submicrosecond discharges descended vertically from the conical emitting electrode across the air gap, entering the sample centrally and dissipating radially toward the ring-shaped receiving electrode. In contrast, longer discharges tended to short-circuit the electrodes. Recording at 341,000 FPS with a Vision Research Phantom v2010 camera revealed that the initial arc descent was still vertical, but became accompanied by arcs leaning increasingly sideways; after 8-12 μs, as the first of these arcs formed direct contact with the receiving electrode, it evolved into a channel of plasmified air and short-circuited the electrodes. We eliminated this artefact by incorporating an insulating cylinder concentrically between the electrodes, precluding short-circuiting between them. While bacterial spores are highly resistant to electric pulses delivered through direct contact, we showed that with arc discharges accompanied by an acoustic shock wave, spore inactivation is readily obtained. The presented system allows scientific investigation of the effects of arc discharges on biological samples. This system will allow realistic experimental studies of lightning-triggered horizontal gene transfer and assessment of its role in evolution.

  2. Improving accuracy of rare variant imputation with a two-step imputation approach

    DEFF Research Database (Denmark)

    Kreiner-Møller, Eskil; Medina-Gomez, Carolina; Uitterlinden, André G

    2015-01-01

    Genotype imputation has been the pillar of the success of genome-wide association studies (GWAS) for identifying common variants associated with common diseases. However, most GWAS have been run using only 60 HapMap samples as reference for imputation, meaning less frequent and rare variants have not been comprehensively scrutinized. Next-generation arrays ensuring sufficient coverage, together with new reference panels such as the 1000 Genomes panel, are emerging to facilitate imputation of low-frequency single-nucleotide polymorphisms (minor allele frequency (MAF) ...). In the two-step approach studied here, genotypes are imputed first to a reference sample genotyped on a dense array and thereafter to the 1000 Genomes reference panel. We show that mean imputation quality, measured by the r(2) using this approach, increases by 28% for variants with a MAF between 1 and 5% as compared with direct imputation to the 1000 Genomes reference. Similarly ...

  3. Evaluating multiplexed next-generation sequencing as a method in palynology for mixed pollen samples.

    Science.gov (United States)

    Keller, A; Danner, N; Grimmer, G; Ankenbrand, M; von der Ohe, K; von der Ohe, W; Rost, S; Härtel, S; Steffan-Dewenter, I

    2015-03-01

    The identification of pollen plays an important role in ecology, palaeo-climatology, honey quality control, and other areas. Currently, expert knowledge and reference collections are essential to identify pollen origin through light microscopy. Pollen identification through molecular sequencing and DNA barcoding has been proposed as an alternative approach, but the assessment of mixed pollen samples originating from multiple plant species is still a tedious and error-prone task. Next-generation sequencing has been proposed to avoid this hindrance. In this study we assessed mixed pollen samples through next-generation sequencing of amplicons from the highly variable, species-specific internal transcribed spacer 2 region of nuclear ribosomal DNA. Further, we developed a bioinformatic workflow to analyse these high-throughput data with a newly created reference database. To evaluate the feasibility, we compared results from classical identification based on light microscopy from the same samples with our sequencing results. We assessed in total 16 mixed pollen samples, 14 originating from honeybee colonies and two from solitary bee nests. The sequencing technique resulted in higher taxon richness (deeper assignments and more identified taxa) compared to light microscopy. Abundance estimations from sequencing data were significantly correlated with counted abundances through light microscopy. Simulation analyses of taxon specificity and sensitivity indicate that 96% of taxa present in the database are correctly identifiable at the genus level and 70% at the species level. Next-generation sequencing thus presents a useful and efficient workflow to identify pollen at the genus and species level without requiring specialised palynological expert knowledge. © 2014 German Botanical Society and The Royal Botanical Society of the Netherlands.

  4. A Frequency Matching Method for Generation of a Priori Sample Models from Training Images

    DEFF Research Database (Denmark)

    Lange, Katrine; Cordua, Knud Skou; Frydendall, Jan

    2011-01-01

    This paper presents a Frequency Matching Method (FMM) for generation of a priori sample models based on training images and illustrates its use by an example. In geostatistics, training images are used to represent a priori knowledge or expectations of models, and the FMM can be used to generate new images that share the same multi-point statistics as a given training image. The FMM proceeds by iteratively updating voxel values of an image until the frequency of patterns in the image matches the frequency of patterns in the training image, making the resulting image statistically indistinguishable from the training image.

  5. Concept and design of a genome-wide association genotyping array tailored for transplantation-specific studies

    DEFF Research Database (Denmark)

    Li, Yun R.; van Setten, Jessica; Verma, Shefali S.

    2015-01-01

    We designed a genome-wide genotyping array, the 'TxArray', comprising approximately 782,000 markers with tailored content for deeper capture of variants across HLA, KIR, pharmacogenomic, and metabolic loci important in transplantation. To test concordance and genotyping quality, we genotyped 85 HapMap samples ...

  6. Real-time colour hologram generation based on ray-sampling plane with multi-GPU acceleration.

    Science.gov (United States)

    Sato, Hirochika; Kakue, Takashi; Ichihashi, Yasuyuki; Endo, Yutaka; Wakunami, Koki; Oi, Ryutaro; Yamamoto, Kenji; Nakayama, Hirotaka; Shimobaba, Tomoyoshi; Ito, Tomoyoshi

    2018-01-24

    Although electro-holography can reconstruct three-dimensional (3D) motion pictures, its computational cost is too heavy to allow for real-time reconstruction of 3D motion pictures. This study explores accelerating colour hologram generation using light-ray information on a ray-sampling (RS) plane with a graphics processing unit (GPU) to realise a real-time holographic display system. We refer to an image corresponding to light-ray information as an RS image. Colour holograms were generated from three RS images with resolutions of 2,048 × 2,048; 3,072 × 3,072; and 4,096 × 4,096 pixels. The computational results indicate that colour hologram generation using multiple GPUs (NVIDIA GeForce GTX 1080) was approximately 300-500 times faster than generation using a central processing unit. In addition, the results demonstrate that 3D motion pictures were successfully reconstructed from RS images of 3,072 × 3,072 pixels at approximately 15 frames per second using an electro-holographic reconstruction system in which colour holograms were generated from RS images in real time.

  7. Study of five novel non-synonymous polymorphisms in human brain-expressed genes in a Colombian sample.

    Science.gov (United States)

    Ojeda, Diego A; Forero, Diego A

    2014-10-01

    Non-synonymous single nucleotide polymorphisms (nsSNPs) in brain-expressed genes represent interesting candidates for genetic research in neuropsychiatric disorders. Our aim was to study novel nsSNPs in brain-expressed genes in a sample of Colombian subjects. We applied an approach based on in silico mining of available genomic data to identify and select novel nsSNPs in brain-expressed genes. We developed novel genotyping assays for these nsSNPs, based on allele-specific PCR methods, and genotyped them in 171 Colombian subjects. Five common nsSNPs (rs6855837, p.Leu395Ile; rs2305160, p.Thr394Ala; rs10503929, p.Met289Thr; rs2270641, p.Thr4Pro; and rs3822659, p.Ser735Ala) were studied, located in the CLOCK, NPAS2, NRG1, SLC18A1, and WWC1 genes. We report allele and genotype frequencies in a sample of South American healthy subjects. There is previous experimental evidence, arising from genome-wide expression and association studies, for the involvement of these genes in several neuropsychiatric disorders and endophenotypes, such as schizophrenia, mood disorders, and memory performance. Frequencies for these nsSNPs in the Colombian sample varied in comparison to different HapMap populations. Future study of these nsSNPs in brain-expressed genes, a synaptogenomics approach, will be important for a better understanding of neuropsychiatric diseases and endophenotypes in different populations.

  8. Quantitative second-harmonic generation imaging to detect osteogenesis imperfecta in human skin samples

    Science.gov (United States)

    Adur, J.; Ferreira, A. E.; D'Souza-Li, L.; Pelegati, V. B.; de Thomaz, A. A.; Almeida, D. B.; Baratti, M. O.; Carvalho, H. F.; Cesar, C. L.

    2012-03-01

    Osteogenesis Imperfecta (OI) is a genetic disorder that leads to bone fractures due to mutations in the Col1A1 or Col1A2 genes that affect the primary structure of the collagen I chain, with the ultimate outcome that collagen I fibrils are either reduced in quantity or abnormally organized throughout the body. A quick screening test would greatly reduce the number of samples to be studied by time-consuming molecular genetics techniques. For this reason, an assessment of human skin collagen structure by Second Harmonic Generation (SHG) can be used as a screening technique to speed up the understanding of the correlations between genetics, phenotype, and OI type. In the present work we used quantitative second harmonic generation (SHG) imaging microscopy to investigate the collagen matrix organization of OI human skin samples in comparison with normal controls. By comparing fibril collagen distribution and spatial organization, we calculated the anisotropy and texture patterns of this structural protein. The analysis of the anisotropy was performed by means of the two-dimensional Discrete Fourier Transform, and image pattern analysis was performed with the Gray-Level Co-occurrence Matrix (GLCM). From these results, we show that statistically different values are obtained for the normal and disease states of OI.
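
To make the texture analysis concrete, here is a sketch using scikit-image's GLCM utilities plus a crude FFT-based anisotropy score; the image below is a random placeholder for an SHG micrograph, and the anisotropy score is one simple choice, not necessarily the metric used by the authors.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_features(img8):
    """GLCM texture descriptors of an 8-bit grayscale image over four angles."""
    glcm = graycomatrix(img8, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    return {p: float(graycoprops(glcm, p).mean())
            for p in ("contrast", "correlation", "energy", "homogeneity")}

def fft_anisotropy(img):
    """Crude anisotropy score: variation of 2-D spectral power across directions."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    ny, nx = power.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    angle = np.arctan2(yy - ny // 2, xx - nx // 2)
    directional = np.histogram(angle, bins=36, weights=power)[0]
    return float(directional.std() / directional.mean())

img = (np.random.rand(128, 128) * 255).astype(np.uint8)  # placeholder for an SHG image
print(texture_features(img), fft_anisotropy(img))
```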

  9. An unusual haplotype structure on human chromosome 8p23 derived from the inversion polymorphism.

    Science.gov (United States)

    Deng, Libin; Zhang, Yuezheng; Kang, Jian; Liu, Tao; Zhao, Hongbin; Gao, Yang; Li, Chaohua; Pan, Hao; Tang, Xiaoli; Wang, Dunmei; Niu, Tianhua; Yang, Huanming; Zeng, Changqing

    2008-10-01

    Chromosomal inversion is an important type of genomic variation involved in both evolution and disease pathogenesis. Here, we describe the refined genetic structure of a 3.8-Mb inversion polymorphism at chromosome 8p23. Using HapMap data of 1,073 SNPs generated from 209 unrelated samples (CEPH-Utah residents with ancestry from northern and western Europe (CEU); Yoruba in Ibadan, Nigeria (YRI); and Asian (ASN) samples comprising Han Chinese from Beijing, China (CHB) and Japanese from Tokyo, Japan (JPT)), we successfully deduced the inversion orientations of all their 418 haplotypes. In particular, distinct haplotype subgroups were identified based on principal component analysis (PCA). Such genetic substructures were consistent with clustering patterns based on neighbor-joining tree reconstruction, which revealed a total of four haplotype clades across all samples. Metaphase fluorescence in situ hybridization (FISH) in a subset of 10 HapMap samples verified the inversion orientations predicted by PCA or phylogenetic tree reconstruction. Positioning of the outgroup haplotype within one of the YRI clades suggested that the Human NCBI Build 36-inverted order is most likely the ancestral orientation. Furthermore, the population differentiation test and the relative extended haplotype homozygosity (REHH) analysis in this region discovered multiple selection signals, also in a population-specific manner. A positive selection signal was detected at XKR6 in the ASN population. These results revealed the correlation of inversion polymorphisms to population-specific genetic structures, and various selection patterns as possible mechanisms for the maintenance of a large chromosomal rearrangement at the 8p23 region during evolution. In addition, our study also showed that haplotype-based clustering methods, such as PCA, can be applied in scanning for cryptic inversion polymorphisms at a genome-wide scale.
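
A minimal sketch of the PCA step on a samples-by-SNPs genotype matrix coded 0/1/2; in real data, inversion orientations show up as discrete clusters along the leading components, though the matrix below is a random placeholder.

```python
import numpy as np

def genotype_pca(G, n_pc=2):
    """PC scores of a (samples x SNPs) genotype matrix coded 0/1/2; inversion
    haplotype classes separate into clusters along the leading components."""
    Gc = G - G.mean(axis=0)                       # center each SNP column
    u, s, _ = np.linalg.svd(Gc, full_matrices=False)
    return u[:, :n_pc] * s[:n_pc]

G = np.random.default_rng(7).integers(0, 3, size=(20, 100)).astype(float)  # placeholder
print(genotype_pca(G)[:3])
```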

  10. Heritability, SNP- and gene-based analyses of cannabis use initiation and age at onset

    NARCIS (Netherlands)

    Minica, C.C.; Dolan, C.V.; Hottenga, J.J.; Pool, R.; Fedko, I.O.; Mbarek, H.; Huppertz, C.; Bartels, M.; Boomsma, D.I.; Vink, J.M.

    2015-01-01

    Prior searches for genetic variants (GVs) implicated in initiation of cannabis use have been limited to common single nucleotide polymorphisms (SNPs) typed in HapMap samples. Denser SNPs are now available with the completion of the 1000 Genomes and the Genome of the Netherlands projects. More

  11. Heritability, SNP- and Gene-Based Analyses of Cannabis Use Initiation and Age at Onset

    NARCIS (Netherlands)

    Minica, C.C.; Dolan, C.V.; Hottenga, J.J.; Pool, R.; Fedko, I.O.; Mbarek, H.; Huppertz, C.; Bartels, M.; Boomsma, D.I.; Vink, J.M.

    2015-01-01

    Prior searches for genetic variants (GVs) implicated in initiation of cannabis use have been limited to common single nucleotide polymorphisms (SNPs) typed in HapMap samples. Denser SNPs are now available with the completion of the 1000 Genomes and the Genome of the Netherlands projects. More

  12. Foam generation and sample composition optimization for the FOAM-C experiment of the ISS

    International Nuclear Information System (INIS)

    Carpy, R; Picker, G; Amann, B; Ranebo, H; Vincent-Bonnieu, S; Minster, O; Winter, J; Dettmann, J; Castiglione, L; Höhler, R; Langevin, D

    2011-01-01

    End of 2009 and early 2010, a sealed cell for foam generation and observation was designed and manufactured at Astrium Friedrichshafen facilities. With the use of this cell, different sample compositions of 'wet foams' have been optimized for mixtures of chemicals such as water, dodecanol, pluronic, aethoxisclerol, glycerol, CTAB, SDS, as well as glass beads. This development is performed in the frame of the breadboarding development activities of the Experiment Container FOAM-C for operation in the ISS Fluid Science Laboratory. The sample cell supports multiple observation methods, such as Diffusing-Wave and Diffuse Transmission Spectrometry, Time Resolved Correlation Spectroscopy, and microscope observation; all of these methods are applied in the cell with a relatively small experiment volume. These units will be on-orbit replaceable sets that will allow processing of multiple sample compositions (in the range of >40).

  13. Evaluation of 99Mo contamination in eluate samples from 99mTc generators in a clinic in Recife, Brazil

    International Nuclear Information System (INIS)

    Andrade, W.G.; Lima, F.F.

    2008-01-01

    This study evaluates the 99Mo content in eluates of 99Mo/99mTc generators used in a nuclear medicine service in Recife. Eluate samples were collected from 5 elutions of each of 10 different generators and measured by the attenuation method in the service's routine activimeter (model CRC-127R, manufactured by Capintec). The samples were measured, the activities of 99mTc and 99Mo were determined, and the MBT (molybdenum breakthrough) was calculated for the 1st, 3rd, 5th, 7th, and 9th elution of each generator. One sample presented molybdenum in an amount near the limit set by the United States Pharmacopeia (USP, 0.15 μCi/mCi). A second sample presented a considerably higher value, more than double the USP limit. The results demonstrate the possibility of finding 99Mo in the eluted solution, which reinforces the need to include a molybdenum-content control test for every elution in the quality control programs of nuclear medicine services.

  14. Evaluation of gene expression data generated from expired Affymetrix GeneChip® microarrays using MAQC reference RNA samples

    Directory of Open Access Journals (Sweden)

    Tong Weida

    2010-10-01

    Full Text Available Abstract Background The Affymetrix GeneChip® system is a commonly used platform for microarray analysis but the technology is inherently expensive. Unfortunately, changes in experimental planning and execution, such as the unavailability of previously anticipated samples or a shift in research focus, may render significant numbers of pre-purchased GeneChip® microarrays unprocessed before their manufacturer’s expiration dates. Researchers and microarray core facilities wonder whether expired microarrays are still useful for gene expression analysis. In addition, it was not clear whether the two human reference RNA samples established by the MAQC project in 2005 still maintained their transcriptome integrity over a period of four years. Experiments were conducted to answer these questions. Results Microarray data were generated in 2009 in three replicates for each of the two MAQC samples with either expired Affymetrix U133A or unexpired U133Plus2 microarrays. These results were compared with data obtained in 2005 on the U133Plus2 microarray. The percentage of overlap between the lists of differentially expressed genes (DEGs) from U133Plus2 microarray data generated in 2009 and in 2005 was 97.44%. While there was some degree of fold change compression in the expired U133A microarrays, the percentage of overlap between the lists of DEGs from the expired and unexpired microarrays was as high as 96.99%. Moreover, the microarray data generated using the expired U133A microarrays in 2009 were highly concordant with microarray and TaqMan® data generated by the MAQC project in 2005. Conclusions Our results demonstrated that microarray data generated using U133A microarrays, which were more than four years past the manufacturer’s expiration date, were highly specific and consistent with those from unexpired microarrays in identifying DEGs despite some appreciable fold change compression and decrease in sensitivity. Our data also suggested that the ...

  15. Genetic diversity in India and the inference of Eurasian population expansion.

    Science.gov (United States)

    Xing, Jinchuan; Watkins, W Scott; Hu, Ya; Huff, Chad D; Sabo, Aniko; Muzny, Donna M; Bamshad, Michael J; Gibbs, Richard A; Jorde, Lynn B; Yu, Fuli

    2010-01-01

    Genetic studies of populations from the Indian subcontinent are of great interest because of India's large population size, complex demographic history, and unique social structure. Despite recent large-scale efforts in discovering human genetic variation, India's vast reservoir of genetic diversity remains largely unexplored. To analyze an unbiased sample of genetic diversity in India and to investigate human migration history in Eurasia, we resequenced one 100-kb ENCODE region in 92 samples collected from three castes and one tribal group from the state of Andhra Pradesh in south India. Analyses of the four Indian populations, along with eight HapMap populations (692 samples), showed that 30% of all SNPs in the south Indian populations are not seen in HapMap populations. Several Indian populations, such as the Yadava, Mala/Madiga, and Irula, have nucleotide diversity levels as high as those of HapMap African populations. Using unbiased allele-frequency spectra, we investigated the expansion of human populations into Eurasia. The divergence time estimates among the major population groups suggest that Eurasian populations in this study diverged from Africans during the same time frame (approximately 90 to 110 thousand years ago). The divergence among different Eurasian populations occurred more than 40,000 years after their divergence with Africans. Our results show that Indian populations harbor large amounts of genetic variation that have not been surveyed adequately by public SNP discovery efforts. Our data also support a delayed expansion hypothesis in which an ancestral Eurasian founding population remained isolated long after the out-of-Africa diaspora, before expanding throughout Eurasia. © 2010 Xing et al.; licensee BioMed Central Ltd.

  16. Results Of Analytical Sample Crosschecks For Next Generation Solvent Extraction Samples Isopar L Concentration And pH

    International Nuclear Information System (INIS)

    Peters, T.; Fink, S.

    2011-01-01

    As part of the implementation process for the Next Generation Cesium Extraction Solvent (NGCS), SRNL and F/H Lab performed a series of analytical cross-checks to ensure that the components in the NGCS solvent system do not constitute an undue analytical challenge. For measurement of entrained Isopar® L in aqueous solutions, both labs performed similarly, with results more reliable at higher concentrations (near 50 mg/L). Low bias occurred in both labs, as seen previously in comparable blind studies for the baseline solvent system. SRNL recommends consideration of the use of Teflon™ caps on all sample containers used for this purpose. For pH measurements, the labs showed reasonable agreement but considerable positive bias for dilute boric acid solutions. SRNL recommends consideration of an alternate analytical method for qualification of boric acid concentrations.

  17. 1-Hydroxypyrene Levels in Blood Samples of Rats After Exposure to Generator Fumes

    Science.gov (United States)

    Ifegwu, Clinton; Igwo-Ezikpe, Miriam N.; Anyakora, Chimezie; Osuntoki, Akinniyi; Oseni, Kafayat A.; Alao, Eragbae O.

    2013-01-01

    Polynuclear aromatic hydrocarbons (PAHs) are a major component of fuel generator fumes, and the carcinogenicity of these compounds has long been established. In this study, 37 Swiss albino rats were exposed to generator fumes at varied distances for 8 hours per day for a period of 42 days, and the level of 1-hydroxypyrene in their blood was evaluated. This study also tried to correlate the level of blood 1-hydroxypyrene with the distance from the source of pollution. Plasma was collected by centrifuging the whole blood sample, followed by complete hydrolysis of the conjugated 1-hydroxypyrene glucuronide to yield the analyte of interest, 1-hydroxypyrene, which was achieved using beta-glucuronidase. High performance liquid chromatography (HPLC) with a UV detector was used to determine the 1-hydroxypyrene concentrations in the blood samples. The mobile phase was water:methanol (12:88 v/v) run isocratically at a flow rate of 1.2 mL/min on a C18 stationary phase with detection at 250 nm. After 42 days of exposure, the blood concentration of 1-hydroxypyrene ranged from 34 μg/mL to 26.29 μg/mL depending on the distance from the source of exposure. The control group had no 1-hydroxypyrene in their blood. After the period of exposure, the percentage of deaths correlated with the distance from the source of exposure, ranging from 56% to zero depending on proximity to the source of pollution. PMID:24179393

  18. Foam generation and sample composition optimization for the FOAM-C experiment of the ISS

    Science.gov (United States)

    Carpy, R.; Picker, G.; Amann, B.; Ranebo, H.; Vincent-Bonnieu, S.; Minster, O.; Winter, J.; Dettmann, J.; Castiglione, L.; Höhler, R.; Langevin, D.

    2011-12-01

    End of 2009 and early 2010, a sealed cell for foam generation and observation was designed and manufactured at Astrium Friedrichshafen facilities. With the use of this cell, different sample compositions of "wet foams" have been optimized for mixtures of chemicals such as water, dodecanol, pluronic, aethoxisclerol, glycerol, CTAB, SDS, as well as glass beads. This development is performed in the frame of the breadboarding development activities of the Experiment Container FOAM-C for operation in the ISS Fluid Science Laboratory. The sample cell supports multiple observation methods, such as Diffusing-Wave and Diffuse Transmission Spectrometry, Time Resolved Correlation Spectroscopy [1], and microscope observation; all of these methods are applied in the cell with a relatively small experiment volume, allowing multiple sample compositions to be processed (in the range of >40).

  19. SNP calling, genotype calling, and sample allele frequency estimation from new-generation sequencing data

    DEFF Research Database (Denmark)

    Nielsen, Rasmus; Korneliussen, Thorfinn Sand; Albrechtsen, Anders

    2012-01-01

    We present a statistical framework for estimation and application of sample allele frequency spectra from New-Generation Sequencing (NGS) data. In this method, we first estimate the allele frequency spectrum using maximum likelihood. In contrast to previous methods, the likelihood function is calculated ... and the method can be extended to various other cases, including cases with deviations from Hardy-Weinberg equilibrium. We evaluate the statistical properties of the methods using simulations and by application to a real data set.
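
As a toy illustration of likelihood-based allele frequency estimation from NGS data, the sketch below maximizes a per-site likelihood over a frequency grid, assuming Hardy-Weinberg genotype proportions and given per-individual genotype likelihoods; the paper estimates the full frequency spectrum, which is not reproduced here, and the numbers are made up.

```python
import math

def allele_freq_mle(geno_liks, grid=101):
    """ML sample allele frequency at one site from per-individual genotype
    likelihoods P(reads | g), g in {0, 1, 2}, assuming HWE proportions."""
    best_f, best_ll = 0.0, -math.inf
    for k in range(grid):
        f = k / (grid - 1)
        prior = ((1 - f) ** 2, 2 * f * (1 - f), f ** 2)   # P(g | f) under HWE
        ll = sum(math.log(sum(p * l for p, l in zip(prior, liks)) + 1e-300)
                 for liks in geno_liks)
        if ll > best_ll:
            best_f, best_ll = f, ll
    return best_f

# three individuals; rows are likelihoods for genotypes 0, 1, 2 (made-up numbers)
print(allele_freq_mle([(0.9, 0.1, 0.0), (0.2, 0.7, 0.1), (0.8, 0.2, 0.0)]))
```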

  20. A second generation human haplotype map of over 3.1 million SNPs.

    Science.gov (United States)

    Frazer, Kelly A; Ballinger, Dennis G; Cox, David R; Hinds, David A; Stuve, Laura L; Gibbs, Richard A; Belmont, John W; Boudreau, Andrew; Hardenbol, Paul; Leal, Suzanne M; Pasternak, Shiran; Wheeler, David A; Willis, Thomas D; Yu, Fuli; Yang, Huanming; Zeng, Changqing; Gao, Yang; Hu, Haoran; Hu, Weitao; Li, Chaohua; Lin, Wei; Liu, Siqi; Pan, Hao; Tang, Xiaoli; Wang, Jian; Wang, Wei; Yu, Jun; Zhang, Bo; Zhang, Qingrun; Zhao, Hongbin; Zhao, Hui; Zhou, Jun; Gabriel, Stacey B; Barry, Rachel; Blumenstiel, Brendan; Camargo, Amy; Defelice, Matthew; Faggart, Maura; Goyette, Mary; Gupta, Supriya; Moore, Jamie; Nguyen, Huy; Onofrio, Robert C; Parkin, Melissa; Roy, Jessica; Stahl, Erich; Winchester, Ellen; Ziaugra, Liuda; Altshuler, David; Shen, Yan; Yao, Zhijian; Huang, Wei; Chu, Xun; He, Yungang; Jin, Li; Liu, Yangfan; Shen, Yayun; Sun, Weiwei; Wang, Haifeng; Wang, Yi; Wang, Ying; Xiong, Xiaoyan; Xu, Liang; Waye, Mary M Y; Tsui, Stephen K W; Xue, Hong; Wong, J Tze-Fei; Galver, Luana M; Fan, Jian-Bing; Gunderson, Kevin; Murray, Sarah S; Oliphant, Arnold R; Chee, Mark S; Montpetit, Alexandre; Chagnon, Fanny; Ferretti, Vincent; Leboeuf, Martin; Olivier, Jean-François; Phillips, Michael S; Roumy, Stéphanie; Sallée, Clémentine; Verner, Andrei; Hudson, Thomas J; Kwok, Pui-Yan; Cai, Dongmei; Koboldt, Daniel C; Miller, Raymond D; Pawlikowska, Ludmila; Taillon-Miller, Patricia; Xiao, Ming; Tsui, Lap-Chee; Mak, William; Song, You Qiang; Tam, Paul K H; Nakamura, Yusuke; Kawaguchi, Takahisa; Kitamoto, Takuya; Morizono, Takashi; Nagashima, Atsushi; Ohnishi, Yozo; Sekine, Akihiro; Tanaka, Toshihiro; Tsunoda, Tatsuhiko; Deloukas, Panos; Bird, Christine P; Delgado, Marcos; Dermitzakis, Emmanouil T; Gwilliam, Rhian; Hunt, Sarah; Morrison, Jonathan; Powell, Don; Stranger, Barbara E; Whittaker, Pamela; Bentley, David R; Daly, Mark J; de Bakker, Paul I W; Barrett, Jeff; Chretien, Yves R; Maller, Julian; McCarroll, Steve; Patterson, Nick; Pe'er, Itsik; Price, Alkes; Purcell, Shaun; Richter, Daniel J; Sabeti, Pardis; Saxena, Richa; Schaffner, Stephen F; Sham, Pak C; Varilly, Patrick; Altshuler, David; Stein, Lincoln D; Krishnan, Lalitha; Smith, Albert Vernon; Tello-Ruiz, Marcela K; Thorisson, Gudmundur A; Chakravarti, Aravinda; Chen, Peter E; Cutler, David J; Kashuk, Carl S; Lin, Shin; Abecasis, Gonçalo R; Guan, Weihua; Li, Yun; Munro, Heather M; Qin, Zhaohui Steve; Thomas, Daryl J; McVean, Gilean; Auton, Adam; Bottolo, Leonardo; Cardin, Niall; Eyheramendy, Susana; Freeman, Colin; Marchini, Jonathan; Myers, Simon; Spencer, Chris; Stephens, Matthew; Donnelly, Peter; Cardon, Lon R; Clarke, Geraldine; Evans, David M; Morris, Andrew P; Weir, Bruce S; Tsunoda, Tatsuhiko; Mullikin, James C; Sherry, Stephen T; Feolo, Michael; Skol, Andrew; Zhang, Houcan; Zeng, Changqing; Zhao, Hui; Matsuda, Ichiro; Fukushima, Yoshimitsu; Macer, Darryl R; Suda, Eiko; Rotimi, Charles N; Adebamowo, Clement A; Ajayi, Ike; Aniagwu, Toyin; Marshall, Patricia A; Nkwodimmah, Chibuzor; Royal, Charmaine D M; Leppert, Mark F; Dixon, Missy; Peiffer, Andy; Qiu, Renzong; Kent, Alastair; Kato, Kazuto; Niikawa, Norio; Adewole, Isaac F; Knoppers, Bartha M; Foster, Morris W; Clayton, Ellen Wright; Watkin, Jessica; Gibbs, Richard A; Belmont, John W; Muzny, Donna; Nazareth, Lynne; Sodergren, Erica; Weinstock, George M; Wheeler, David A; Yakub, Imtaz; Gabriel, Stacey B; Onofrio, Robert C; Richter, Daniel J; Ziaugra, Liuda; Birren, Bruce W; Daly, Mark J; Altshuler, David; Wilson, Richard K; Fulton, Lucinda L; Rogers, Jane; Burton, John; Carter, Nigel P; Clee, 
Christopher M; Griffiths, Mark; Jones, Matthew C; McLay, Kirsten; Plumb, Robert W; Ross, Mark T; Sims, Sarah K; Willey, David L; Chen, Zhu; Han, Hua; Kang, Le; Godbout, Martin; Wallenburg, John C; L'Archevêque, Paul; Bellemare, Guy; Saeki, Koji; Wang, Hongguang; An, Daochang; Fu, Hongbo; Li, Qing; Wang, Zhen; Wang, Renwu; Holden, Arthur L; Brooks, Lisa D; McEwen, Jean E; Guyer, Mark S; Wang, Vivian Ota; Peterson, Jane L; Shi, Michael; Spiegel, Jack; Sung, Lawrence M; Zacharia, Lynn F; Collins, Francis S; Kennedy, Karen; Jamieson, Ruth; Stewart, John

    2007-10-18

    We describe the Phase II HapMap, which characterizes over 3.1 million human single nucleotide polymorphisms (SNPs) genotyped in 270 individuals from four geographically diverse populations and includes 25-35% of common SNP variation in the populations surveyed. The map is estimated to capture untyped common variation with an average maximum r2 of between 0.9 and 0.96 depending on population. We demonstrate that the current generation of commercial genome-wide genotyping products captures common Phase II SNPs with an average maximum r2 of up to 0.8 in African and up to 0.95 in non-African populations, and that potential gains in power in association studies can be obtained through imputation. These data also reveal novel aspects of the structure of linkage disequilibrium. We show that 10-30% of pairs of individuals within a population share at least one region of extended genetic identity arising from recent ancestry and that up to 1% of all common variants are untaggable, primarily because they lie within recombination hotspots. We show that recombination rates vary systematically around genes and between genes of different function. Finally, we demonstrate increased differentiation at non-synonymous, compared to synonymous, SNPs, resulting from systematic differences in the strength or efficacy of natural selection between populations.
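
    The notion of "maximum r2" above can be made concrete with a small sketch. Below is a hedged Python example (toy data, illustrative names only) that computes pairwise LD r2 from phased 0/1 haplotypes and reports how well the best typed tag captures an untyped SNP; it assumes both SNPs are polymorphic in the sample.

        import numpy as np

        def r_squared(a, b):
            # LD r^2 between two biallelic SNPs, given phased 0/1 haplotype vectors.
            # Assumes both SNPs are polymorphic (non-zero allele-frequency variance).
            pa, pb = a.mean(), b.mean()            # allele frequencies
            pab = np.mean(a * b)                   # frequency of the 1-1 haplotype
            d = pab - pa * pb                      # disequilibrium coefficient D
            return d * d / (pa * (1 - pa) * pb * (1 - pb))

        # toy data: rows are haplotypes, columns are SNPs; column 2 plays the untyped SNP
        haps = np.array([[1, 1, 0], [1, 1, 1], [0, 0, 0], [0, 1, 0], [1, 1, 1], [0, 0, 0]])
        untyped = haps[:, 2]
        best_tag_r2 = max(r_squared(haps[:, j], untyped) for j in (0, 1))
        print(f"maximum r^2 for the untyped SNP: {best_tag_r2:.2f}")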

  1. Simultaneous analysis of arsenic, antimony, selenium and tellurium in environmental samples using hydride generation ICPMS

    International Nuclear Information System (INIS)

    Jankowski, L.M.; Breidenbach, R.; Bakker, I.J.I.; Epema, O.J.

    2009-01-01

    Full text: A quantitative method for the simultaneous analysis of arsenic, antimony, selenium and tellurium in environmental samples is being developed using hydride generation ICPMS. These elements must first be transformed into hydride-forming oxidation states. This is particularly challenging for selenium and antimony, because selenium is susceptible to reduction to the non-hydride-forming elemental state and antimony requires strong reducing conditions. The effectiveness of three reducing agents (KI, thiourea, cysteine) is studied. A comparison is made between addition of the reducing agent to the sample and addition of KI to the NaBH4 solution. Best results were obtained with the latter approach. (author)

  2. Near-optimal alternative generation using modified hit-and-run sampling for non-linear, non-convex problems

    Science.gov (United States)

    Rosenberg, D. E.; Alafifi, A.

    2016-12-01

    Water resources systems analysis often focuses on finding optimal solutions. Yet an optimal solution is optimal only for the modelled issues, and managers often seek near-optimal alternatives that address un-modelled objectives, preferences, limits, uncertainties, and other issues. Early on, Modelling to Generate Alternatives (MGA) formalized near-optimal as the region comprising the original problem constraints plus a new constraint that allowed performance within a specified tolerance of the optimal objective function value. MGA identified a few maximally-different alternatives from the near-optimal region. Subsequent work applied Markov Chain Monte Carlo (MCMC) sampling to generate a larger number of alternatives that span the near-optimal region of linear problems, or select portions of it for non-linear problems. We extend the MCMC Hit-And-Run method to generate alternatives that span the full extent of the near-optimal region for non-linear, non-convex problems. First, start at a feasible hit point within the near-optimal region, then run a random distance in a random direction to a new hit point. Next, repeat until the desired number of alternatives has been generated. The key step at each iteration is to run a random distance along the line in the specified direction to a new hit point. If linear equality constraints exist, we construct an orthogonal basis and use a null-space transformation to confine hits and runs to a lower-dimensional space. Linear inequality constraints define the convex bounds on the line that runs through the current hit point in the specified direction. We then use slice sampling to identify a new hit point along the line within bounds defined by the non-linear inequality constraints. This technique is computationally efficient compared to prior near-optimal alternative generation techniques such as MGA, MCMC Metropolis-Hastings, evolutionary, or firefly algorithms, because the search at each iteration is confined to the hit line…
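
    As a rough illustration of the hit-and-run idea described above, here is a minimal Python sketch for a toy non-convex problem; it stands in for the paper's method by replacing the slice-sampling step with simple accept/reject along the random direction, and all names, bounds, and tolerances are assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        def f(x):
            # toy non-convex objective to be minimized
            return (x[0] ** 2 - 1) ** 2 + x[1] ** 2

        F_OPT, TOL = 0.0, 0.25        # known optimum and near-optimal tolerance (assumed)

        def in_near_optimal_region(x):
            # original box constraints plus the near-optimal objective constraint
            return np.all(np.abs(x) <= 2.0) and f(x) <= F_OPT + TOL

        def hit_and_run(x0, n_samples, max_run=2.0):
            x, samples = np.array(x0, float), []
            while len(samples) < n_samples:
                direction = rng.normal(size=x.size)
                direction /= np.linalg.norm(direction)      # random unit direction
                candidate = x + rng.uniform(-max_run, max_run) * direction
                if in_near_optimal_region(candidate):       # keep only feasible hits
                    x = candidate
                    samples.append(x.copy())
            return np.asarray(samples)

        alternatives = hit_and_run([1.0, 0.0], n_samples=500)
        print(alternatives.mean(axis=0))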

  3. Detecting representative data and generating synthetic samples to improve learning accuracy with imbalanced data sets.

    Directory of Open Access Journals (Sweden)

    Der-Chiang Li

    It is difficult for learning models to achieve high classification performance with imbalanced data sets, because when one of the classes is much larger than the others, most machine learning and data mining classifiers are overly influenced by the larger classes and ignore the smaller ones. As a result, classification algorithms often have poor learning performance due to slow convergence in the smaller classes. To balance such data sets, this paper presents a strategy that involves reducing the size of the majority data and generating synthetic samples for the minority data. In the reducing operation, we use the box-and-whisker plot approach to exclude outliers and the Mega-Trend-Diffusion method to find representative data from the majority data. To generate the synthetic samples, we propose a counterintuitive hypothesis to find the distributed shape of the minority data, and then produce samples according to this distribution. Four real datasets were used to examine the performance of the proposed approach. We used paired t-tests to compare the Accuracy, G-mean, and F-measure scores of the proposed data pre-processing (PPDP) method merged into the D3C method (PPDP+D3C) with those of one-sided selection (OSS), the well-known SMOTEBoost (SB) study, and the normal distribution-based oversampling (NDO) approach. The results indicate that the classification performance of the proposed approach is better than that of the above-mentioned methods.
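
    The reduce-then-synthesize strategy can be sketched in Python as follows. This is not the paper's Mega-Trend-Diffusion procedure; it substitutes a simple box-and-whisker (1.5×IQR) filter and pairwise interpolation (SMOTE-style) for the synthesis step, purely to illustrate the data flow, with invented toy data.

        import numpy as np

        rng = np.random.default_rng(0)

        def iqr_filter(X):
            # box-and-whisker rule: drop rows outside 1.5*IQR on any feature
            q1, q3 = np.percentile(X, [25, 75], axis=0)
            lo, hi = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
            return X[np.all((X >= lo) & (X <= hi), axis=1)]

        def synthesize(X_minority, n_new):
            # new minority rows: random convex combinations of random sample pairs
            i = rng.integers(0, len(X_minority), size=(n_new, 2))
            lam = rng.random((n_new, 1))
            return X_minority[i[:, 0]] + lam * (X_minority[i[:, 1]] - X_minority[i[:, 0]])

        X_major = rng.normal(0.0, 1.0, size=(500, 3))
        X_minor = rng.normal(2.0, 0.5, size=(25, 3))
        X_major_reduced = iqr_filter(X_major)            # shrink the majority class
        X_minor_balanced = np.vstack([X_minor, synthesize(X_minor, 475)])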

  4. Assessing accuracy of genotype imputation in American Indians.

    Directory of Open Access Journals (Sweden)

    Alka Malhotra

    Genotype imputation is commonly used in genetic association studies to test untyped variants using information on linkage disequilibrium (LD) with typed markers. Imputing genotypes requires a suitable reference population in which the LD pattern is known, most often one selected from HapMap. However, some populations, such as American Indians, are not represented in HapMap. In the present study, we assessed the accuracy of imputation using HapMap reference populations in a genome-wide association study in Pima Indians. Data from six randomly selected chromosomes were used. Genotypes in the study population were masked (either 1% or 20% of the SNPs available for a given chromosome). The masked genotypes were then imputed using the Markov Chain Haplotyping (MACH) software. Using four HapMap reference populations, average genotype error rates ranged from 7.86% for Mexican Americans to 22.30% for Yoruba. In contrast, use of the original Pima Indian data as a reference resulted in an average error rate of 1.73%. Our results suggest that the use of HapMap reference populations results in substantial inaccuracy in the imputation of genotypes in American Indians. A possible solution would be to densely genotype or sequence a reference American Indian population.
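
    The masking experiment described above is straightforward to emulate. A hedged Python sketch follows, with a deliberately naive modal-genotype imputer standing in for MACH; all names and the toy data are illustrative.

        import numpy as np

        rng = np.random.default_rng(42)

        def modal_impute(g):
            # fill missing entries with each SNP's most common genotype (0/1/2)
            out = g.copy()
            for j in range(g.shape[1]):
                observed = g[~np.isnan(g[:, j]), j].astype(int)
                mode = np.bincount(observed, minlength=3).argmax() if observed.size else 0
                out[np.isnan(g[:, j]), j] = mode
            return out

        def masked_error_rate(genotypes, impute, frac=0.20):
            # hide a fraction of genotypes, re-impute, and score the hidden entries
            g = genotypes.astype(float)
            mask = rng.random(g.shape) < frac
            hidden = g.copy()
            hidden[mask] = np.nan
            return float(np.mean(impute(hidden)[mask] != g[mask]))

        toy = rng.integers(0, 3, size=(60, 200))          # 60 subjects, 200 SNPs
        print(f"error rate: {masked_error_rate(toy, modal_impute):.2%}")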

  5. Data Transformation Functions for Expanded Search Spaces in Geographic Sample Supervised Segment Generation

    Directory of Open Access Journals (Sweden)

    Christoff Fourie

    2014-04-01

    Sample supervised image analysis, in particular sample supervised segment generation, shows promise as a methodological avenue applicable within Geographic Object-Based Image Analysis (GEOBIA). Segmentation is acknowledged as a constituent component within typically expansive image analysis processes. A general extension to the basic formulation of an empirical discrepancy measure directed segmentation algorithm parameter tuning approach is proposed. An expanded search landscape is defined, consisting not only of the segmentation algorithm parameters, but also of low-level, parameterized image processing functions. Such higher dimensional search landscapes potentially allow for achieving better segmentation accuracies. The proposed method is tested with a range of low-level image transformation functions and two segmentation algorithms. The general effectiveness of such an approach is demonstrated compared to a variant only optimising segmentation algorithm parameters. Further, it is shown that the resultant search landscapes obtained from combining mid- and low-level image processing parameter domains, in our problem contexts, are sufficiently complex to warrant the use of population based stochastic search methods. Interdependencies of these two parameter domains are also demonstrated, necessitating simultaneous optimization.

  6. Comparing genetic variants detected in the 1000 genomes project ...

    Indian Academy of Sciences (India)

    Single-nucleotide polymorphisms (SNPs) determined based on SNP arrays from the international HapMap consortium (HapMap) and the genetic variants detected in the 1000 genomes project (1KGP) can serve as two references for genomewide association studies (GWAS). We conducted comparative analyses to provide ...

  7. Molecular typing of lung adenocarcinoma on cytological samples using a multigene next generation sequencing panel.

    Directory of Open Access Journals (Sweden)

    Aldo Scarpa

    Identification of driver mutations in lung adenocarcinoma has led to the development of targeted agents that are already approved for clinical use or are in clinical trials. Therefore, the number of biomarkers that will need to be assessed is expected to rapidly increase. This calls for the implementation of methods probing the mutational status of multiple genes for inoperable cases, for which limited cytological or bioptic material is available. Cytology specimens from 38 lung adenocarcinomas were subjected to the simultaneous assessment of 504 mutational hotspots in 22 lung cancer-associated genes using 10 nanograms of DNA and Ion Torrent PGM next-generation sequencing. Thirty-six cases were successfully sequenced (95%). In 24/36 cases (67%) at least one mutated gene was observed, including EGFR, KRAS, PIK3CA, BRAF, TP53, PTEN, MET, SMAD4, FGFR3, STK11, and MAP2K1. EGFR and KRAS mutations, found in 6/36 (16%) and 10/36 (28%) of cases respectively, were mutually exclusive. Nine samples (25%) showed concurrent alterations in different genes. The next-generation sequencing test used is superior to current standard methodologies, as it interrogates multiple genes and requires limited amounts of DNA. Its applicability to routine cytology samples might allow a significant increase in the fraction of lung cancer patients eligible for personalized therapy.

  8. Haplotype mapping of a diploid non-meiotic organism using existing and induced aneuploidies.

    Directory of Open Access Journals (Sweden)

    Melanie Legrand

    2008-01-01

    Haplotype maps (HapMaps) reveal underlying sequence variation and facilitate the study of recombination and genetic diversity. In general, HapMaps are produced by analysis of Single-Nucleotide Polymorphism (SNP) segregation in large numbers of meiotic progeny. Candida albicans, the most common human fungal pathogen, is an obligate diploid that does not appear to undergo meiosis. Thus, standard methods for haplotype mapping cannot be used. We exploited naturally occurring aneuploid strains to determine the haplotypes of the eight chromosome pairs in the C. albicans laboratory strain SC5314 and in a clinical isolate. Comparison of the maps revealed that the clinical strain had undergone a significant amount of genome rearrangement, consisting primarily of crossover or gene conversion recombination events. SNP map haplotyping revealed that insertion and activation of the UAU1 cassette in essential and non-essential genes can result in whole chromosome aneuploidy. UAU1 is often used to construct homozygous deletions of targeted genes in C. albicans; the exact mechanism (trisomy followed by chromosome loss versus gene conversion) has not been determined. UAU1 insertion into the essential ORC1 gene resulted in a large proportion of trisomic strains, while gene conversion events predominated when UAU1 was inserted into the non-essential LRO1 gene. Therefore, induced aneuploidies can be used to generate HapMaps, which are essential for analyzing genome alterations and mitotic recombination events in this clonal organism.

  9. Transcriptome sequencing of the Microarray Quality Control (MAQC) RNA reference samples using next generation sequencing

    Directory of Open Access Journals (Sweden)

    Thierry-Mieg Danielle

    2009-06-01

    Abstract Background Transcriptome sequencing using next-generation sequencing platforms will soon be competing with DNA microarray technologies for global gene expression analysis. As a preliminary evaluation of these promising technologies, we performed deep sequencing of cDNA synthesized from the Microarray Quality Control (MAQC) reference RNA samples using Roche's 454 Genome Sequencer FLX. Results We generated more than 3.6 million sequence reads of average length 250 bp for the MAQC A and B samples and introduced a data analysis pipeline for translating cDNA read counts into gene expression levels. Using BLAST, 90% of the reads mapped to the human genome and 64% of the reads mapped to the RefSeq database of well-annotated genes with e-values ≤ 10^-20. We measured gene expression levels in the A and B samples by counting the numbers of reads that mapped to individual RefSeq genes in multiple sequencing runs, to evaluate the MAQC quality metrics for reproducibility, sensitivity, specificity, and accuracy, and compared the results with DNA microarrays and quantitative RT-PCR (QRT-PCR) from the MAQC studies. In addition, 88% of the reads were successfully aligned directly to the human genome using the AceView alignment programs with an average 90% sequence similarity, identifying 137,899 unique exon junctions, including 22,193 new exon junctions not yet contained in the RefSeq database. Conclusion Using the MAQC metrics for evaluating the performance of gene expression platforms, the ExpressSeq results for gene expression levels showed excellent reproducibility, sensitivity, and specificity that improved systematically with increasing shotgun sequencing depth, and quantitative accuracy that was comparable to DNA microarrays and QRT-PCR. In addition, a careful mapping of the reads to the genome using the AceView alignment programs shed new light on the complexity of the human transcriptome, including the discovery of thousands of new splice variants.
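
    The core of the count-based pipeline described above, translating per-gene read counts into comparable expression levels, can be illustrated with a few lines of Python. This is a hedged sketch with toy reads; reads-per-million is used as a generic normalization, not necessarily the paper's exact transform.

        import math
        from collections import Counter

        def expression_levels(gene_per_read):
            # gene_per_read: one RefSeq gene symbol per aligned read
            counts = Counter(gene_per_read)
            total = sum(counts.values())
            return {g: 1e6 * c / total for g, c in counts.items()}   # reads per million

        sample_a = expression_levels(["TP53", "GAPDH", "GAPDH", "ACTB", "TP53"])
        sample_b = expression_levels(["TP53", "GAPDH", "ACTB", "ACTB", "ACTB"])
        for gene in sorted(sample_a.keys() & sample_b.keys()):
            print(gene, f"log2(A/B) = {math.log2(sample_a[gene] / sample_b[gene]):+.2f}")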

  10. A multi-sample based method for identifying common CNVs in normal human genomic structure using high-resolution aCGH data.

    Directory of Open Access Journals (Sweden)

    Chihyun Park

    BACKGROUND: It is difficult to identify copy number variations (CNVs) in normal human genomic data due to noise and non-linear relationships between different genomic regions and signal intensity. A high-resolution array comparative genomic hybridization (aCGH) containing 42 million probes, which is very large compared to previous arrays, was recently published. Most existing CNV detection algorithms do not work well because of noise associated with the large amount of input data and because most of the current methods were not designed to analyze normal human samples. Normal human genome analysis often requires a joint approach across multiple samples. However, the majority of existing methods can only identify CNVs from a single sample. METHODOLOGY AND PRINCIPAL FINDINGS: We developed a multi-sample-based genomic variations detector (MGVD) that uses segmentation to identify common breakpoints across multiple samples and a k-means-based clustering strategy. Unlike previous methods, MGVD simultaneously considers multiple samples with different genomic intensities and identifies CNVs and CNV zones (CNVZs); a CNVZ is a more precise measure of the location of a genomic variant than the CNV region (CNVR). CONCLUSIONS AND SIGNIFICANCE: We designed a specialized algorithm to detect common CNVs from extremely high-resolution multi-sample aCGH data. MGVD showed high sensitivity and a low false discovery rate for a simulated data set, and outperformed most current methods when real, high-resolution HapMap datasets were analyzed. MGVD also had the fastest runtime compared to the other algorithms evaluated when actual, high-resolution aCGH data were analyzed. The CNVZs identified by MGVD can be used in association studies for revealing relationships between phenotypes and genomic aberrations. Our algorithm was developed in standard C++ using the STL and is available for Linux and MS Windows. It is freely available at: http://embio.yonsei.ac.kr/~Park/mgvd.php.
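
    A toy version of the multi-sample clustering step, in Python: per-window log2 ratios from several samples are pooled and grouped into loss/normal/gain states with a one-dimensional k-means. This is only a sketch of the idea; MGVD's actual segmentation and CNVZ logic is more involved, and all data here are simulated.

        import numpy as np

        def cnv_states(log2_ratios, iters=50):
            # log2_ratios: (n_samples, n_windows) array of aCGH log2 ratios
            x = log2_ratios.mean(axis=0)                   # pool evidence across samples
            centers = np.percentile(x, [5, 50, 95])        # init: loss, normal, gain
            for _ in range(iters):
                labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
                for k in range(3):
                    if np.any(labels == k):
                        centers[k] = x[labels == k].mean()
            return labels, centers                         # 0 = loss, 1 = normal, 2 = gain

        rng = np.random.default_rng(3)
        data = rng.normal(0, 0.1, size=(4, 300))
        data[:, 100:120] += 0.8                            # shared gain across samples
        labels, centers = cnv_states(data)
        print("gain windows:", np.flatnonzero(labels == 2))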

  11. Progress on using deuteron-deuteron fusion generated neutrons for 40Ar/39Ar sample irradiation

    Science.gov (United States)

    Rutte, Daniel; Renne, Paul R.; Becker, Tim; Waltz, Cory; Ayllon Unzueta, Mauricio; Zimmerman, Susan; Hidy, Alan; Finkel, Robert; Bauer, Joseph D.; Bernstein, Lee; van Bibber, Karl

    2017-04-01

    We present progress on the development and proof of concept of a deuteron-deuteron fusion based neutron generator for 40Ar/39Ar sample irradiation. Irradiation with deuteron-deuteron fusion neutrons is anticipated to reduce Ar recoil and Ar production from interfering reactions. This will allow dating of smaller grains and increase the accuracy and precision of the method. The instrument currently achieves neutron fluxes of ˜9×10^7 cm^-2 s^-1, as determined by irradiation of indium foils and use of the activation reaction 115In(n,n')115mIn. Multiple foils and simulations were used to determine flux gradients in the sample chamber. A first experiment quantifying the loss of 39Ar is underway and will likely be available at the time of the presentation of this abstract. In ancillary experiments, via irradiation of K salts and subsequent mass spectrometric analysis, we determined the cross-section of the 39K(n,p)39Ar reaction at ˜2.8 MeV to be 160 ± 35 mb (1σ). This result is in good agreement with bracketing cross-section data of ˜96 mb at ˜2.45 MeV and ˜270 mb at ˜4 MeV [Johnson et al., 1967; Dixon and Aitken, 1961; Bass et al., 1964]. Our data disfavor a much lower value of ˜45 mb at 2.59 MeV [Lindström & Neuer, 1958]. In another ancillary experiment, the cross-section for 39K(n,α)36Cl at ˜2.8 MeV was determined as 11.7 ± 0.5 mb (1σ), which is significant for 40Ar/39Ar geochronology due to subsequent decay to 36Ar, as well as for the determination of production rates of cosmogenic 36Cl. Additional experiments resolving the cross-section functions on 39K between 1.5 and 3.6 MeV are under way using the LICORNE neutron source of the IPN Orsay tandem accelerator; results will likely be available at the time of the presentation of this abstract. While the neutron generator is designed for fluxes of ˜10^9 cm^-2 s^-1, arcing in the sample chamber currently limits the power, and hence the neutron flux, at which the generator can safely be run…
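
    To put the quoted flux and cross-section into context, a back-of-the-envelope Python estimate of 39Ar production is given below; the potassium mass and irradiation time are assumed purely for illustration.

        AVOGADRO = 6.022e23
        flux = 9e7                 # neutrons cm^-2 s^-1 (reported above)
        sigma = 160e-27            # 39K(n,p)39Ar cross-section: 160 mb, in cm^2
        mass_K = 0.010             # 10 mg of potassium in the sample (assumed)
        frac_39K = 0.932           # natural isotopic abundance of 39K
        t_irr = 24 * 3600.0        # one-day irradiation (assumed)

        n_39K = mass_K / 39.0 * AVOGADRO * frac_39K        # target atoms
        atoms_39Ar = n_39K * sigma * flux * t_irr          # thin-target approximation
        print(f"~{atoms_39Ar:.1e} atoms of 39Ar produced")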

  12. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical uniform random (VUR) sections.
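
    As one concrete example of the volume-weighted sampling arithmetic referenced above, here is a short Python sketch of a Cavalieri point-counting volume estimate; all numbers are invented for illustration.

        def cavalieri_volume(points_per_section, section_spacing_mm, area_per_point_mm2):
            # V = t * (a/p) * sum(P_i): section spacing t, area per grid point a/p,
            # and P_i grid points hitting the organ on each systematic section
            return section_spacing_mm * area_per_point_mm2 * sum(points_per_section)

        counts = [12, 18, 22, 19, 9]                      # grid hits on 5 sections
        volume = cavalieri_volume(counts, section_spacing_mm=2.0, area_per_point_mm2=4.0)
        print(f"estimated organ volume: {volume:.0f} mm^3")   # 2 * 4 * 80 = 640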

  13. Second generation laser-heated microfurnace for the preparation of microgram-sized graphite samples

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Bin; Smith, A.M.; Long, S.

    2015-10-15

    We present construction details and test results for two second-generation laser-heated microfurnaces (LHF-II) used to prepare graphite samples for Accelerator Mass Spectrometry (AMS) at ANSTO. Based on systematic studies aimed at optimising the performance of our prototype laser-heated microfurnace (LHF-I) (Smith et al., 2007 [1]; Smith et al., 2010 [2,3]; Yang et al., 2014 [4]), we have designed the LHF-II to have the following features: (i) it has a small reactor volume of 0.25 mL allowing us to completely graphitise carbon dioxide samples containing as little as 2 μg of C, (ii) it can operate over a large pressure range (0–3 bar) and so has the capacity to graphitise CO2 samples containing up to 100 μg of C; (iii) it is compact, with three valves integrated into the microfurnace body, (iv) it is compatible with our new miniaturised conventional graphitisation furnaces (MCF), also designed for small samples, and shares a common vacuum system. Early tests have shown that the extraneous carbon added during graphitisation in each LHF-II is of the order of 0.05 μg, assuming 100 pMC activity, similar to that of the prototype unit. We use a 'budget' fibre packaged array for the diode laser with custom built focusing optics. The use of a new infrared (IR) thermometer with a short focal length has allowed us to decrease the height of the light-proof safety enclosure. These innovations have produced a cheaper and more compact device. As with the LHF-I, feedback control of the catalyst temperature and logging of the reaction parameters is managed by a LabVIEW interface.

  14. The use of 99Mo/99mTc generators in the analysis of low levels of 99Tc in environmental samples by radiochemical methods

    International Nuclear Information System (INIS)

    Dowdall, M.; Selnaes, Oe.G.; Lind, B.; Gwynn, J.P.

    2010-01-01

    The analysis of low levels of 99Tc in environmental samples presents special challenges, particularly with respect to the selection of an appropriate and practicable chemical yield tracer. Of all the tracers available, 99mTc eluted from 99Mo/99mTc generators appears to be the most practicable in terms of availability, ease of use and cost. These factors have led to an increase in the use of such generators for the provision of 99mTc as yield tracer for 99Tc. For the analysis of low levels (per m3 or kg) of 99Tc in environmental samples, consideration must be given to the radiochemical purity of the tracer solution with respect to contamination with both 99Tc and other radionuclides. Due to the variable nature of the extent of the interference from tracer solution to tracer solution, it is unwise to try to establish a correction factor for any single generator. The only practical solution to the problem is therefore to run a 'blank' sample with each batch of samples drawn from a single tracer solution. (LN)

  15. Multiplex target enrichment using DNA indexing for ultra-high throughput SNP detection.

    LENUS (Irish Health Repository)

    Kenny, Elaine M

    2011-02-01

    Screening large numbers of target regions in multiple DNA samples for sequence variation is an important application of next-generation sequencing but an efficient method to enrich the samples in parallel has yet to be reported. We describe an advanced method that combines DNA samples using indexes or barcodes prior to target enrichment to facilitate this type of experiment. Sequencing libraries for multiple individual DNA samples, each incorporating a unique 6-bp index, are combined in equal quantities, enriched using a single in-solution target enrichment assay and sequenced in a single reaction. Sequence reads are parsed based on the index, allowing sequence analysis of individual samples. We show that the use of indexed samples does not impact on the efficiency of the enrichment reaction. For three- and nine-indexed HapMap DNA samples, the method was found to be highly accurate for SNP identification. Even with sequence coverage as low as 8x, 99% of sequence SNP calls were concordant with known genotypes. Within a single experiment, this method can sequence the exonic regions of hundreds of genes in tens of samples for sequence and structural variation using as little as 1 μg of input DNA per sample.
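
    The index-parsing step is simple to illustrate. Below is a hedged Python sketch of demultiplexing by a leading 6-bp index with at most one mismatch; the index-to-sample table and the reads are invented for the example.

        from collections import defaultdict

        INDEX_TO_SAMPLE = {"ACGTAC": "s1", "TTGCAA": "s2", "GGATCC": "s3"}  # hypothetical

        def hamming(a, b):
            return sum(x != y for x, y in zip(a, b))

        def demultiplex(reads, max_mismatch=1):
            bins = defaultdict(list)
            for read in reads:
                tag, insert = read[:6], read[6:]
                hits = [s for idx, s in INDEX_TO_SAMPLE.items()
                        if hamming(tag, idx) <= max_mismatch]
                # require a unique index match; otherwise leave the read unassigned
                bins[hits[0] if len(hits) == 1 else "undetermined"].append(insert)
            return bins

        assigned = demultiplex(["ACGTACGGGTTTACCA", "TTGCATTTACGACGGA"])
        print({k: len(v) for k, v in assigned.items()})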

  16. Synthesis of high generation thermo-sensitive dendrimers for extraction of rivaroxaban from human fluid and pharmaceutic samples.

    Science.gov (United States)

    Parham, Negin; Panahi, Homayon Ahmad; Feizbakhsh, Alireza; Moniri, Elham

    2018-04-13

    In the present study, poly(N-isopropylacrylamide) as a thermo-sensitive agent was grafted onto magnetic nanoparticles; ethylenediamine and methyl methacrylate were then used to synthesize the first generation of poly(amidoamine) (PAMAM) dendrimers, and the process was continued alternately up to the tenth dendrimer generation. The synthesized nanocomposite was investigated using Fourier transform infrared spectrometry, thermogravimetric analysis, X-ray diffractometry, elemental analysis and vibrating-sample magnetometry. The particle size and morphology were characterized using dynamic light scattering, field emission scanning electron microscopy and transmission electron microscopy. Batch experiments were conducted to investigate the parameters affecting adsorption and desorption of rivaroxaban by the synthesized nanocomposite. The maximum sorption of rivaroxaban by the synthesized nanocomposite was obtained at pH 8. The resulting grafted magnetic nanoparticle dendrimers were applied for the extraction of rivaroxaban from human biological fluids and pharmaceutical samples. The sorption behaviour of rivaroxaban on the magnetic nanoparticle dendrimers indicated good accessibility and high capacity of the active sites within the dendrimers. Extraction recoveries of more than 92.5% from urine and 99.8% from the drug matrix were obtained. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Automatic Motion Generation for Robotic Milling Optimizing Stiffness with Sample-Based Planning

    Directory of Open Access Journals (Sweden)

    Julian Ricardo Diaz Posada

    2017-01-01

    Optimal and intuitive robotic machining is still a challenge. One of the main reasons for this is the lack of robot stiffness, which also depends on the robot's position in Cartesian space. To make up for this deficiency, and with the aim of increasing robot machining accuracy, this contribution describes a solution approach for optimizing stiffness over a desired milling path using the free degree of freedom of the machining process. The optimal motion is computed based on the semantic and mathematical interpretation of the manufacturing process, modeled in terms of its components: product, process and resource; and by automatically configuring a sample-based motion problem and the transition-based rapidly-exploring random tree algorithm for computing an optimal motion. The approach is simulated in CAM software for a machining path, demonstrating its functionality and outlining future potential for optimal motion generation in robotic machining processes.

  18. Project and construction of a pneumatic system for the transference of samples to a neutron generator

    International Nuclear Information System (INIS)

    Carvalho, A.N. de

    1983-01-01

    A prototype of a system for the transport of irradiated samples to and from a neutron generator was constructed, using compressed air as the propellant. Compressed air was injected through electrically driven valves. The sample, transported by the pressure wave, was inserted into a PVC support 50 mm long and weighing 23.0 g. The first tests were carried out to determine the times needed to transport the above-mentioned PVC support along a PVC tube of 3 m length and 3/4 in. diameter for different applied air pressures; it was verified that for pressures between 3.0 and 8.0 kgf/cm2, transport times were always smaller than 2 seconds. These results showed the viability of constructing the definitive system, already designed. (C.L.B.)

  19. iHAP – integrated haplotype analysis pipeline for characterizing the haplotype structure of genes

    Directory of Open Access Journals (Sweden)

    Lim Yun Ping

    2006-12-01

    Abstract Background The advent of genotype data from large-scale efforts that catalog the genetic variants of different populations has given rise to new avenues for multifactorial disease association studies. Recent work shows that genotype data from the International HapMap Project have a high degree of transferability to the wider population. This implies that the design of genotyping studies on local populations may be facilitated through inferences drawn from information contained in HapMap populations. Results To facilitate analysis of HapMap data for characterizing the haplotype structure of genes or any chromosomal regions, we have developed an integrated web-based resource, iHAP. In addition to incorporating genotype and haplotype data from the International HapMap Project and gene information from the UCSC Genome Browser Database, iHAP also provides capabilities for inferring haplotype blocks and selecting tag SNPs that are representative of haplotype patterns. These include block partitioning algorithms, block definitions, tag SNP definitions, as well as SNPs to be "force included" as tags. Based on the parameters defined at the input stage, iHAP performs on-the-fly analysis and displays the result graphically as a webpage. To facilitate analysis, intermediate and final result files can be downloaded. Conclusion The iHAP resource, available at http://ihap.bii.a-star.edu.sg, provides a convenient yet flexible approach for the user community to analyze HapMap data and identify candidate targets for genotyping studies.

  20. Use of respondent driven sampling (RDS) generates a very diverse sample of men who have sex with men (MSM) in Buenos Aires, Argentina.

    Directory of Open Access Journals (Sweden)

    Alex Carballo-Diéguez

    Prior research focusing on men who have sex with men (MSM) conducted in Buenos Aires, Argentina, used convenience samples that included mainly gay-identified men. To increase MSM sample representativeness, we used Respondent Driven Sampling (RDS) for the first time in Argentina. Using RDS, under certain specified conditions, the observed estimates for the percentage of the population with a specific trait are asymptotically unbiased. We describe the diversity of the recruited sample from the point of view of sexual orientation, and contrast the different subgroups in terms of their HIV sexual risk behavior. 500 MSM were recruited using RDS. Behavioral data were collected through face-to-face interviews and Web-based CASI. In contrast with prior studies, RDS generated a very diverse sample of MSM from a sexual identity perspective. Only 24.5% of participants identified as gay; 36.2% identified as bisexual, 21.9% as heterosexual, and 17.4% were grouped as "other." Gay and non-gay identified MSM differed significantly in their sexual behavior, the former having higher numbers of partners, more frequent sexual contacts and less frequent condom use. One third of the men (gay, 3%; bisexual, 34%; heterosexual, 51%; other, 49%) reported having had sex with men, women and transvestites in the two months prior to the interview. This population requires further study and, potentially, HIV prevention strategies tailored to such diversity of partnerships. Our results highlight the potential effectiveness of using RDS to reach non-gay identified MSM. They also present lessons learned in the implementation of RDS to recruit MSM concerning both the importance and limitations of formative work, the need to tailor incentives to the circumstances of less affluent potential participants, the need to prevent masking, and the challenge of assessing network size.

  1. MAFsnp: A Multi-Sample Accurate and Flexible SNP Caller Using Next-Generation Sequencing Data

    Science.gov (United States)

    Hu, Jiyuan; Li, Tengfei; Xiu, Zidi; Zhang, Hong

    2015-01-01

    Most existing statistical methods developed for calling single nucleotide polymorphisms (SNPs) using next-generation sequencing (NGS) data are based on Bayesian frameworks, and there does not exist any SNP caller that produces p-values for calling SNPs in a frequentist framework. To fill this gap, we develop a new method, MAFsnp, a Multiple-sample based Accurate and Flexible algorithm for calling SNPs with NGS data. MAFsnp is based on an estimated likelihood ratio test (eLRT) statistic. In practical situations, the involved parameter is very close to the boundary of the parametric space, so standard large-sample theory is not suitable for evaluating the finite-sample distribution of the eLRT statistic. Observing that the distribution of the test statistic is a mixture of zero and a continuous part, we propose to model the test statistic with a novel two-parameter mixture distribution. Once the parameters in the mixture distribution are estimated, p-values can be easily calculated for detecting SNPs, and the multiple-testing corrected p-values can be used to control the false discovery rate (FDR) at any pre-specified level. With simulated data, MAFsnp is shown to have much better control of FDR than the existing SNP callers. Through application to two real datasets, MAFsnp is also shown to outperform the existing SNP callers in terms of calling accuracy. An R package “MAFsnp” implementing the new SNP caller is freely available at http://homepage.fudan.edu.cn/zhangh/softwares/. PMID:26309201
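
    The zero-inflated mixture idea can be sketched in a few lines of Python. The continuous component here is a plain chi-square fitted by moments, a stand-in assumption since the abstract does not spell out the paper's two-parameter family; all data are simulated.

        import numpy as np
        from scipy import stats

        def fit_mixture(elrt):
            # weight of the point mass at zero, and a moment-matched df for the rest
            elrt = np.asarray(elrt, float)
            pi0 = float(np.mean(elrt <= 1e-12))
            positive = elrt[elrt > 1e-12]
            df = float(positive.mean()) if positive.size else 1.0  # E[chi2_df] = df
            return pi0, df

        def mixture_pvalue(t, pi0, df):
            # P(T >= t) under pi0 * delta_0 + (1 - pi0) * chi2_df
            return 1.0 if t <= 0 else (1.0 - pi0) * stats.chi2.sf(t, df)

        rng = np.random.default_rng(7)
        null_stats = np.where(rng.random(10000) < 0.6, 0.0, rng.chisquare(1, 10000))
        pi0, df = fit_mixture(null_stats)
        print(pi0, df, mixture_pvalue(3.84, pi0, df))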

  2. Sample preparation for arsenic speciation analysis in baby food by generation of substituted arsines with atomic absorption spectrometry detection

    Czech Academy of Sciences Publication Activity Database

    Huber, C. S.; Vale, M. G. R.; Dessuy, M. B.; Svoboda, Milan; Musil, Stanislav; Dědina, Jiří

    2017-01-01

    Roč. 175, DEC (2017), s. 406-412 ISSN 0039-9140 R&D Projects: GA MŠk(CZ) LH15174 Institutional support: RVO:68081715 Keywords: slurry sampling * methyl-substituted arsenic species * hydride generation-cryotrapping-atomic absorption spectrometry Subject RIV: CB - Analytical Chemistry, Separation OBOR OECD: Analytical chemistry Impact factor: 4.162, year: 2016

  4. OpenADAM: an open source genome-wide association data management system for Affymetrix SNP arrays

    Directory of Open Access Journals (Sweden)

    Sham P C

    2008-12-01

    Abstract Background Large scale genome-wide association studies have become popular since the introduction of high throughput genotyping platforms. Efficient management of the vast array of data generated poses many challenges. Description We have developed an open source web-based data management system for the large amount of genotype data generated from the Affymetrix GeneChip® Mapping Array and Affymetrix Genome-Wide Human SNP Array platforms. The database supports genotype calling using DM, BRLMM, BRLMM-P or Birdseed algorithms provided by the Affymetrix Power Tools. The genotype and corresponding pedigree data are stored in a relational database for efficient downstream data manipulation and analysis, such as calculation of allele and genotype frequencies, sample identity checking, and export of genotype data in various file formats for analysis using commonly-available software. A novel method for genotyping error estimation is implemented using linkage disequilibrium information from the HapMap project. All functionalities are accessible via a web-based user interface. Conclusion OpenADAM provides an open source database system for management of Affymetrix genome-wide association SNP data.

  5. A genome-wide association study of serum uric acid in African Americans

    Directory of Open Access Journals (Sweden)

    Gerry Norman P

    2011-02-01

    Abstract Background Uric acid is the primary byproduct of purine metabolism. Hyperuricemia is associated with body mass index (BMI), sex, and multiple complex diseases including gout, hypertension (HTN), renal disease, and type 2 diabetes (T2D). Multiple genome-wide association studies (GWAS) in individuals of European ancestry (EA) have reported associations between serum uric acid levels (SUAL) and specific genomic loci. The purposes of this study were: 1) to replicate major signals reported in EA populations; 2) to use the weak LD pattern in African ancestry populations to better localize (fine-map) reported loci; and 3) to explore the identification of novel findings, cognizant of the moderate sample size. Methods African American (AA) participants (n = 1,017) from the Howard University Family Study were included in this study. Genotyping was performed using the Affymetrix® Genome-wide Human SNP Array 6.0. Imputation was performed using MACH and the HapMap reference panels for CEU and YRI. A total of 2,400,542 single nucleotide polymorphisms (SNPs) were assessed for association with serum uric acid under the additive genetic model, with adjustment for age, sex, BMI, glomerular filtration rate, HTN, T2D, and the top two principal components identified in the assessment of admixture and population stratification. Results Four variants in the gene SLC2A9 achieved genome-wide significance for association with SUAL (p-values ranging from 8.88 × 10^-9 to 1.38 × 10^-9). Fine-mapping of the SLC2A9 signals identified a 263 kb interval of linkage disequilibrium in the HapMap CEU sample. This interval was reduced to 37 kb in our AA and the HapMap YRI samples. Conclusions The most strongly associated locus for SUAL in EA populations was also the most strongly associated locus in this AA sample. This finding provides evidence for the role of SLC2A9 in uric acid metabolism across human populations. Additionally, our findings demonstrate the utility of following up EA signals in African-ancestry populations…

  6. The Paternal Landscape along the Bight of Benin - Testing Regional Representativeness of West-African Population Samples Using Y-Chromosomal Markers.

    Directory of Open Access Journals (Sweden)

    Maarten H D Larmuseau

    Patterns of genetic variation in human populations across the African continent are still not well studied in comparison with Eurasia and America, despite the high genetic and cultural diversity among African populations. In population and forensic genetic studies a single sample is often used to represent a complete African region. In such a scenario, inappropriate sampling strategies and/or the use of local, isolated populations may bias interpretations and pose questions of representativeness at a macrogeographic scale. The non-recombining region of the Y-chromosome (NRY) has great potential to reveal the regional representation of a sample due to its powerful phylogeographic information content. An area poorly characterized for Y-chromosomal data is the West-African region along the Bight of Benin, despite its important history in the trans-Atlantic slave trade and its large number of ethnic groups, languages and lifestyles. In this study, Y-chromosomal haplotypes from four Beninese populations were determined and a global meta-analysis with available Y-SNP and Y-STR data from populations along the Bight of Benin and surrounding areas was performed. A thorough methodology was developed allowing comparison of population samples using Y-chromosomal lineage data based on different Y-SNP panels and phylogenies. Geographic proximity turned out to be the best predictor of genetic affinity between populations along the Bight of Benin. Nevertheless, based on Y-chromosomal data from the literature, two population samples differed strongly from others from the same or neighbouring areas and are not regionally representative within large-scale studies. Furthermore, the analysis of the HapMap sample YRI of a Yoruba population from South-western Nigeria based on Y-SNP and Y-STR data showed for the first time its regional representativeness, a result which is important for standard population and forensic genetic applications using the YRI sample.

  7. A 12 kV, 1 kHz, Pulse Generator for Breakdown Studies of Samples for CLIC RF Accelerating Structures

    CERN Document Server

    Soares, R H; Kovermann, J; Calatroni, S; Wuensch, W

    2012-01-01

    Compact Linear Collider (CLIC) RF structures must be capable of sustaining high surface electric fields, in excess of 200 MV/m, with a breakdown (BD) rate below 3×10^-7 breakdowns/pulse/m. Achieving such a low rate requires a detailed understanding of all the steps involved in the mechanism of breakdown. One of the fundamental studies is to investigate the statistical characteristics of the BD rate phenomenon at very low values, to understand the origin of an observed dependency on the surface electric field raised to the power of 30. To acquire sufficient BD data in a reasonable period of time, a high repetition rate pulse generator is required for an existing d.c. spark system at CERN. Following BD of the material sample, the pulse generator must deliver a current pulse of several tens of amperes for ~2 μs. A high repetition rate pulse generator has been designed, built and tested; it utilizes pulse-forming line technology and employs MOSFET switches. This paper describes the design of the pulse generator…

  8. How iSamples (Internet of Samples in the Earth Sciences) Improves Sample and Data Stewardship in the Next Generation of Geoscientists

    Science.gov (United States)

    Hallett, B. W.; Dere, A. L. D.; Lehnert, K.; Carter, M.

    2016-12-01

    Vast numbers of physical samples are routinely collected by geoscientists to probe key scientific questions related to global climate change, biogeochemical cycles, magmatic processes, mantle dynamics, etc. Despite their value as irreplaceable records of nature the majority of these samples remain undiscoverable by the broader scientific community because they lack a digital presence or are not well-documented enough to facilitate their discovery and reuse for future scientific and educational use. The NSF EarthCube iSamples Research Coordination Network seeks to develop a unified approach across all Earth Science disciplines for the registration, description, identification, and citation of physical specimens in order to take advantage of the new opportunities that cyberinfrastructure offers. Even as consensus around best practices begins to emerge, such as the use of the International Geo Sample Number (IGSN), more work is needed to communicate these practices to investigators to encourage widespread adoption. Recognizing the importance of students and early career scientists in particular to transforming data and sample management practices, the iSamples Education and Training Working Group is developing training modules for sample collection, documentation, and management workflows. These training materials are made available to educators/research supervisors online at http://earthcube.org/group/isamples and can be modularized for supervisors to create a customized research workflow. This study details the design and development of several sample management tutorials, created by early career scientists and documented in collaboration with undergraduate research students in field and lab settings. Modules under development focus on rock outcrops, rock cores, soil cores, and coral samples, with an emphasis on sample management throughout the collection, analysis and archiving process. We invite others to share their sample management/registration workflows and to…

  9. Constrained approximation of effective generators for multiscale stochastic reaction networks and application to conditioned path sampling

    Energy Technology Data Exchange (ETDEWEB)

    Cotter, Simon L., E-mail: simon.cotter@manchester.ac.uk

    2016-10-15

    Efficient analysis and simulation of multiscale stochastic systems of chemical kinetics is an ongoing area for research, and is the source of many theoretical and computational challenges. In this paper, we present a significant improvement to the constrained approach, which is a method for computing effective dynamics of slowly changing quantities in these systems, but which does not rely on the quasi-steady-state assumption (QSSA). The QSSA can cause errors in the estimation of effective dynamics for systems where the difference in timescales between the “fast” and “slow” variables is not so pronounced. This new application of the constrained approach allows us to compute the effective generator of the slow variables, without the need for expensive stochastic simulations. This is achieved by finding the null space of the generator of the constrained system. For complex systems where this is not possible, or where the constrained subsystem is itself multiscale, the constrained approach can then be applied iteratively. This results in breaking the problem down into finding the solutions to many small eigenvalue problems, which can be efficiently solved using standard methods. Since this methodology does not rely on the quasi steady-state assumption, the effective dynamics that are approximated are highly accurate, and in the case of systems with only monomolecular reactions, are exact. We will demonstrate this with some numerics, and also use the effective generators to sample paths of the slow variables which are conditioned on their endpoints, a task which would be computationally intractable for the generator of the full system.
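
    The null-space computation at the heart of the constrained approach can be illustrated with a toy three-state fast subsystem in Python; the rate values and the state-dependent slow propensity are invented for the example, and this sketch omits the iterative treatment described above.

        import numpy as np
        from scipy.linalg import null_space

        # generator (rate matrix) of a toy 3-state fast subsystem, rows summing to zero
        Q_fast = np.array([[-2.0,  2.0,  0.0],
                           [ 1.0, -3.0,  2.0],
                           [ 0.0,  1.0, -1.0]])

        # quasi-stationary distribution of the fast process: null vector of Q^T
        pi = null_space(Q_fast.T)[:, 0]
        pi = pi / pi.sum()

        # effective rate of a slow reaction whose propensity depends on the fast state
        slow_propensity = np.array([0.0, 0.5, 1.5])   # per fast state (assumed)
        effective_rate = float(pi @ slow_propensity)
        print(pi, effective_rate)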

  10. Comparing Generative and Inter-generative Subjectivity in Post-Revolutionary Academic Generations in Iran

    Directory of Open Access Journals (Sweden)

    mehran Sohrabzadeh

    2010-01-01

    Comparative study of different post-revolutionary generations has been widely undertaken by social scientists; among them, some believe there is a gulf between generations, while others, acknowledging small differences among generations, emphasize that this variety is natural. Without committing to either of these two views, the present study attempts to compare three post-revolutionary academic generations using the theory of “generative objects”, which explores generations' views about their behaviors, beliefs, and historical monuments. Sampling was carried out across three generations: first, those who were students in the 60s and are now experienced faculty members; second, those who have recently been employed as faculty members; and third, those who are currently university students. Results show that within each of the three generations there are essential similarities, while the inter-generational comparison reveals some differentiation.

  11. Generativity Does Not Necessarily Satisfy All Your Needs: Associations among Cultural Demand for Generativity, Generative Concern, Generative Action, and Need Satisfaction in the Elderly in Four Cultures

    Science.gov (United States)

    Hofer, Jan; Busch, Holger; Au, Alma; Polácková Šolcová, Iva; Tavel, Peter; Tsien Wong, Teresa

    2016-01-01

    The present study examines the association between various facets of generativity, that is, cultural demand for generativity, generative concern, and generative action, with the satisfaction of the needs for relatedness, competence, and autonomy in samples of elderly from Cameroon, China (Hong Kong), the Czech Republic, and Germany. Participants…

  12. Mixed effects modeling of proliferation rates in cell-based models: consequence for pharmacogenomics and cancer.

    Directory of Open Access Journals (Sweden)

    Hae Kyung Im

    2012-02-01

    The International HapMap project has made publicly available extensive genotypic data on a number of lymphoblastoid cell lines (LCLs). Building on this resource, many research groups have generated a large amount of phenotypic data on these cell lines to facilitate genetic studies of disease risk or drug response. However, one problem that may reduce the usefulness of these resources is the biological noise inherent to cellular phenotypes. We developed a novel method, termed Mixed Effects Model Averaging (MEM), which pools data from multiple sources and generates an intrinsic cellular growth rate phenotype. This intrinsic growth rate was estimated for each of over 500 HapMap cell lines. We then examined the association of this intrinsic growth rate with gene expression levels and found that almost 30% (2,967 out of 10,748) of the genes tested were significant with FDR less than 10%. We probed further to demonstrate evidence of a genetic effect on intrinsic growth rate by determining a significant enrichment of growth-associated genes among genes targeted by top growth-associated SNPs (as eQTLs). The estimated intrinsic growth rate, as well as the strength of the association with genetic variants and gene expression traits, are made publicly available through a cell-based pharmacogenomics database, PACdb. This resource should enable researchers to explore the mediating effects of proliferation rate on other phenotypes.

  13. On Matrix Sampling and Imputation of Context Questionnaires with Implications for the Generation of Plausible Values in Large-Scale Assessments

    Science.gov (United States)

    Kaplan, David; Su, Dan

    2016-01-01

    This article presents findings on the consequences of matrix sampling of context questionnaires for the generation of plausible values in large-scale assessments. Three studies are conducted. Study 1 uses data from PISA 2012 to examine several different forms of missing data imputation within the chained equations framework: predictive mean…

  14. EXONSAMPLER: a computer program for genome-wide and candidate gene exon sampling for targeted next-generation sequencing.

    Science.gov (United States)

    Cosart, Ted; Beja-Pereira, Albano; Luikart, Gordon

    2014-11-01

    The computer program EXONSAMPLER automates the sampling of thousands of exon sequences from publicly available reference genome sequences and gene annotation databases. It was designed to provide exon sequences for the efficient, next-generation gene sequencing method called exon capture. The exon sequences can be sampled by a list of gene name abbreviations (e.g. IFNG, TLR1), or by sampling exons from genes spaced evenly across chromosomes. It provides a list of genomic coordinates (a bed file), as well as a set of sequences in fasta format. User-adjustable parameters for collecting exon sequences include a minimum and maximum acceptable exon length, maximum number of exonic base pairs (bp) to sample per gene, and maximum total bp for the entire collection. It allows for partial sampling of very large exons. It can preferentially sample upstream (5 prime) exons, downstream (3 prime) exons, both external exons, or all internal exons. It is written in the Python programming language using its free libraries. We describe the use of EXONSAMPLER to collect exon sequences from the domestic cow (Bos taurus) genome for the design of an exon-capture microarray to sequence exons from related species, including the zebu cow and wild bison. We collected ~10% of the exome (~3 million bp), including 155 candidate genes, and ~16,000 exons evenly spaced genomewide. We prioritized the collection of 5 prime exons to facilitate discovery and genotyping of SNPs near upstream gene regulatory DNA sequences, which control gene expression and are often under natural selection. © 2014 John Wiley & Sons Ltd.
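
    The per-gene and total base-pair caps described above amount to a simple greedy filter. Here is a hedged Python sketch; EXONSAMPLER itself is a published program, so this re-implements only the selection logic, with invented parameter values and toy coordinates.

        def sample_exons(exons, min_len=100, max_len=10_000,
                         max_bp_per_gene=1_500, max_total_bp=3_000_000):
            # exons: iterable of (gene, chrom, start, end) in genomic coordinates;
            # returns bed-like tuples honouring the length and bp caps
            picked, bp_per_gene, total_bp = [], {}, 0
            for gene, chrom, start, end in sorted(exons):
                length = end - start
                if not (min_len <= length <= max_len):
                    continue                              # reject too-short/long exons
                if bp_per_gene.get(gene, 0) + length > max_bp_per_gene:
                    continue                              # per-gene cap reached
                if total_bp + length > max_total_bp:
                    break                                 # overall collection cap
                picked.append((chrom, start, end, gene))
                bp_per_gene[gene] = bp_per_gene.get(gene, 0) + length
                total_bp += length
            return picked

        bed = sample_exons([("IFNG", "chr12", 68154768, 68155082),
                            ("TLR1", "chr4", 38798089, 38799003)])
        print(bed)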

  15. Selective reduction of arsenic species by hydride generation - atomic absorption spectrometry. Part 2 - sample storage and arsenic determination in natural waters

    Directory of Open Access Journals (Sweden)

    Quináia Sueli P.

    2001-01-01

    Total arsenic, arsenite, arsenate and dimethylarsinic acid (DMA) were selectively determined in natural waters by hydride generation - atomic absorption spectrometry, using sodium tetrahydroborate(III) as reductant but in different reduction media. River water samples from the north region of Paraná State, Brazil, were analysed and showed arsenate as the principal arsenical form. Detection limits found for As(III) (citrate buffer), As(III) + DMA (acetic acid) and As(III) + As(V) (hydrochloric acid) were 0.6, 1.1 and 0.5 μg As L-1, respectively. Sample storage in the proper reaction media proved to be a useful way to preserve the water samples.

  16. Use of oxidative and reducing vapor generation for reducing the detection limits of iodine in biological samples by inductively coupled plasma atomic emission spectrometry

    International Nuclear Information System (INIS)

    Vtorushina, Eh.A.; Saprykin, A.I.; Knapp, G.

    2009-01-01

    Procedures of microwave combustion in an oxygen flow and microwave acid decomposition of biological samples were optimized for the subsequent determination of iodine. A new method was proposed for the generation of molecular iodine from periodate ions using hydrogen peroxide as a reductant. Procedures were developed for determining iodine in biological samples by inductively coupled plasma atomic emission spectrometry (ICP-AES) using oxidative and reducing vapor generation; these allowed the detection limit for iodine to be lowered by 3-4 orders of magnitude. The developed procedures were used to analyze certified reference materials of milk (Skim Milk Powder BCR 150) and seaweed (Sea Lettuce BCR 279) and a Supradyn vitamin complex

  17. Direct impact aerosol sampling by electrostatic precipitation

    Science.gov (United States)

    Braden, Jason D.; Harter, Andrew G.; Stinson, Brad J.; Sullivan, Nicholas M.

    2016-02-02

    The present disclosure provides apparatuses for collecting aerosol samples by ionizing an air sample to different degrees. An air flow is generated through a cavity in which at least one corona wire is disposed and electrically charged to form a corona around it. At least one grounded sample collection plate is provided downstream of the corona wire so that aerosol ions generated within the corona are deposited on the plate. A plurality of aerosol samples ionized to different degrees can thereby be generated. The corona wire may be perpendicular or parallel to the direction of the flow. The apparatus can comprise a serial connection of stages, each capable of generating at least one aerosol sample, with the air flow passing through the stages serially.

  18. Exome sequencing generates high quality data in non-target regions

    Directory of Open Access Journals (Sweden)

    Guo Yan

    2012-05-01

    Abstract Background Exome sequencing using next-generation sequencing technologies is a cost-efficient approach to selectively sequencing coding regions of the human genome for the detection of disease variants. A significant fraction of DNA fragments from the capture process fall outside target regions, and sequence data for positions outside target regions have mostly been ignored after alignment. Results We performed whole exome sequencing on 22 subjects using the Agilent SureSelect capture reagent and on 6 subjects using the Illumina TruSeq capture reagent. We also downloaded sequencing data for 6 subjects from the 1000 Genomes Project Pilot 3 study. Using these data, we examined the quality of SNPs detected outside target regions by computing the consistency rate with genotypes obtained from SNP chips or the HapMap database, the transition-transversion (Ti/Tv) ratio, and the percentage of SNPs inside dbSNP. For all three platforms, we obtained high-quality SNPs outside target regions, some far from target regions. In our Agilent SureSelect data, we obtained 84,049 high-quality SNPs outside target regions compared to 65,231 SNPs inside target regions (1.29 times as many). For our Illumina TruSeq data, we obtained 222,171 high-quality SNPs outside target regions compared to 95,818 SNPs inside target regions (2.32 times as many). For the data from the 1000 Genomes Project, we obtained 7,139 high-quality SNPs outside target regions compared to 1,548 SNPs inside target regions (4.61 times as many). Conclusions These results demonstrate that a significant number of high-quality genotypes outside target regions can be obtained from exome sequencing data. These data should not be ignored in genetic epidemiology studies.
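
One of the quality metrics named above, the transition-transversion ratio, is simple to compute. A minimal sketch, assuming SNPs are given as (ref, alt) base pairs; values near 2.0-2.1 genome-wide (higher in coding regions) indicate genuine variants, while random errors push the ratio toward 0.5:

```python
# Transitions are purine<->purine or pyrimidine<->pyrimidine substitutions.
TRANSITIONS = {("A", "G"), ("G", "A"), ("C", "T"), ("T", "C")}

def ti_tv_ratio(snps):
    """snps: iterable of (ref, alt) single-base tuples."""
    snps = list(snps)
    ti = sum(1 for ref, alt in snps if (ref.upper(), alt.upper()) in TRANSITIONS)
    tv = len(snps) - ti
    return ti / tv if tv else float("inf")

print(ti_tv_ratio([("A", "G"), ("C", "T"), ("A", "C")]))  # 2.0
```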

  19. Sampling the Mouse Hippocampal Dentate Gyrus

    OpenAIRE

    Lisa Basler; Stephan Gerdes; David P. Wolfer; Lutz Slomianka

    2017-01-01

    Sampling is a critical step in procedures that generate quantitative morphological data in the neurosciences. Samples need to be representative to allow statistical evaluations, and they need to deliver a precision that makes statistical evaluations not only possible but also meaningful. Sampling-generated variability should, for example, not be able to hide significant group differences from statistical detection if they are present. Estimators of the coefficient of error (CE) have been develope...

  20. Direct determination of arsenic in soil samples by fast pyrolysis–chemical vapor generation using sodium formate as a reductant followed by nondispersive atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Xuchuan; Zhang, Jingya; Bu, Fanlong

    2015-09-01

    This study shows for the first time that sodium formate can react with trace arsenic to form volatile species via fast pyrolysis - chemical vapor generation. We found that the presence of thiourea greatly enhanced the generation efficiency and eliminated the interference of copper. We studied the reaction temperature, the volume of sodium formate, the reaction acidity, and the carrier argon flow rate using nondispersive atomic fluorescence spectrometry. The optimal reaction temperature was T = 500 °C, and the optimal volumes of 30% sodium formate and 10% thiourea were 0.2 mL and 0.05 mL, respectively. The carrier argon flow rate was 300 mL min⁻¹, and the detection limit and precision for arsenic were 0.39 ng and 3.25%, respectively. Arsenic in soil can be determined directly by adding a trace amount of hydrochloric acid as a decomposition reagent, without any sample pretreatment. The method was successfully applied to determine trace amounts of arsenic in two soil certified reference materials (GBW07453 and GBW07450), and the results were in agreement with the certified reference values. - Highlights: • Sodium formate can react with trace arsenic to form volatile species via pyrolysis-chemical vapor generation. • Thiourea can enhance the generation efficiency and eliminate the interference of copper. • Arsenic in soil samples can be determined directly without sample pretreatment.

  1. Sample design effects in landscape genetics

    Science.gov (United States)

    Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.

    2012-01-01

    An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), numbers of alleles per locus (5 and 10), numbers of individuals sampled (10-300), and generational times after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that random, linear, and systematic sampling regimes performed well with high sample sizes (>200), levels of polymorphism (10 alleles per locus), and numbers of molecular markers (20). The cluster and single study site sampling regimes were not able to correctly identify the generating process under any conditions and thus are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales, and suggests careful consideration of sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.
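
The regimes compared in this study are easy to picture in code. A toy sketch of four of them on a unit-square landscape follows; the cluster count and spread are arbitrary illustrative choices, not the study's simulation settings:

```python
import random

def design(regime, n, rng=None):
    """Return n (x, y) sampling locations on the unit square."""
    rng = rng or random.Random(0)
    if regime == "random":
        return [(rng.random(), rng.random()) for _ in range(n)]
    if regime == "linear":        # a single transect across the landscape
        return [(i / (n - 1), 0.5) for i in range(n)]
    if regime == "systematic":    # a regular grid (k*k points, k = floor(sqrt(n)))
        k = int(n ** 0.5)
        return [((i + 0.5) / k, (j + 0.5) / k) for i in range(k) for j in range(k)]
    if regime == "cluster":       # a few tight clumps around random centres
        centres = [(rng.random(), rng.random()) for _ in range(5)]
        return [(min(max(cx + rng.gauss(0, 0.02), 0.0), 1.0),
                 min(max(cy + rng.gauss(0, 0.02), 0.0), 1.0))
                for cx, cy in centres for _ in range(n // 5)]
    raise ValueError(regime)

print(len(design("cluster", 200)))  # 200 points in 5 clumps
```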

  2. Accuracy of CNV Detection from GWAS Data.

    Directory of Open Access Journals (Sweden)

    Dandan Zhang

    2011-01-01

    Several computer programs are available for detecting copy number variants (CNVs) using genome-wide SNP arrays. We evaluated the performance of four CNV detection software suites - Birdsuite, Partek, HelixTree, and PennCNV-Affy - in the identification of both rare and common CNVs. Each program's performance was assessed in two ways. The first was its recovery rate, i.e., its ability to call 893 CNVs previously identified in eight HapMap samples by paired-end sequencing of whole-genome fosmid clones, and 51,440 CNVs identified by array Comparative Genome Hybridization (aCGH) followed by validation procedures, in 90 HapMap CEU samples. The second was program performance in calling rare and common CNVs in the Bipolar Genome Study (BiGS) data set (1001 bipolar cases and 1033 controls, all of European ancestry), as measured by the Affymetrix SNP 6.0 array. Accuracy in calling rare CNVs was assessed by positive predictive value, based on the proportion of rare CNVs validated by quantitative real-time PCR (qPCR), while accuracy in calling common CNVs was assessed by false positive/false negative rates, based on qPCR validation results from a subset of common CNVs. Birdsuite recovered the highest percentages of known HapMap CNVs containing >20 markers in the two reference CNV datasets. The recovery rate increased with decreasing CNV frequency. In the tested rare CNV data, Birdsuite and Partek had higher positive predictive values than the other software suites. In a test of three common CNVs in the BiGS dataset, Birdsuite's calls were 98.8% consistent with qPCR quantification in one CNV region, but the other two regions showed an unacceptably poor degree of accuracy. We found relatively poor consistency between the two "gold standards": the sequence data of Kidd et al. and the aCGH data of Conrad et al. Algorithms for calling CNVs, especially common ones, need substantial improvement, and a "gold standard" for the detection of CNVs remains to be established.

  3. The effect of therapeutic x-radiation on a sample of pacemaker generators

    International Nuclear Information System (INIS)

    Maxted, K.J.

    1984-01-01

    Tests were made on nineteen pacemaker generators, comprising seventeen types from five manufacturers and including four programmable units, at x-ray energies of about 4 MeV. The bipolar generators suffered no measurable damage, and radiotherapy patients using these would appear not to be exposed to any hazard. Nor were any of the generators using entirely CMOS circuitry, including the programmable types, affected. Generators using hybrid bipolar and CMOS circuitry were damaged by radiation exposure, the majority of these being Medtronic pacemakers. Transient recovery of function followed by total failure suggested that even a transient loss of function must be regarded as a precursor to permanent damage. This transient effect indicates that it is most likely the CMOS circuit element that is affected. (U.K.)

  4. Selection signatures in worldwide sheep populations.

    OpenAIRE

    Fariello, Maria-Ines; Servin, Bertrand; Tosser-Klopp, Gwenola; Rupp, Rachel; Moreno, Carole; San Cristobal, Magali; Boitard, Simon; Drögemüller, Cord; The International Sheep Genomics Consortium, ISGC

    2014-01-01

    The diversity of populations in domestic species offers great opportunities to study genome response to selection. The recently published Sheep HapMap dataset is a great example of characterization of the worldwide genetic diversity in sheep. In this study, we re-analyzed the Sheep HapMap dataset to identify selection signatures in worldwide sheep populations. Compared to previous analyses, we made use of statistical methods that (i) take account of the hierarchical structure of sheep popula...

  6. Comparison of sampling techniques for use in SYVAC

    International Nuclear Information System (INIS)

    Dalrymple, G.J.

    1984-01-01

    The Stephen Howe review (reference TR-STH-1) recommended the use of a deterministic generator (DG) sampling technique for sampling the input values to the SYVAC (SYstems Variability Analysis Code) program. This technique was compared with Monte Carlo simple random sampling (MC) by taking a 1000-run case of SYVAC using MC as the reference case. The results show that DG appears relatively inaccurate for most values of consequence when used with 11 sample intervals. If 22 sample intervals are used, then DG generates cumulative distribution functions that are statistically similar to the reference distribution. 400 runs of DG or MC are adequate to generate a representative cumulative distribution function. The MC technique appears to perform better than DG for the same number of runs. However, DG predicts higher doses, and in view of the importance of generating data in the high dose region, this sampling technique with 22 sample intervals is recommended for use in SYVAC. (author)
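
The comparison described here, testing whether a small stratified/deterministic sample reproduces a large Monte Carlo reference distribution, can be illustrated with a two-sample Kolmogorov-Smirnov test. The sketch below uses a lognormal stand-in for the SYVAC consequence values and stratum midpoints as a stand-in for the deterministic generator; it is not the SYVAC procedure itself:

```python
import random
from scipy.stats import ks_2samp

rng = random.Random(1)
reference = [rng.lognormvariate(0, 1) for _ in range(1000)]  # 1000-run MC stand-in

def stratified(sample, intervals):
    """One value per equal-probability stratum (the stratum midpoint)."""
    s = sorted(sample)
    step = len(s) // intervals
    return [s[i * step + step // 2] for i in range(intervals)]

for k in (11, 22):
    stat, p = ks_2samp(reference, stratified(reference, k))
    print(f"{k} intervals: KS statistic = {stat:.3f}, p = {p:.2f}")
```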

  7. Determination of total mercury and methylmercury in biological samples by photochemical vapor generation

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Mariana A.; Ribeiro, Anderson S.; Curtius, Adilson J. [Universidade Federal de Santa Catarina, Departamento de Quimica, Florianopolis, SC (Brazil); Sturgeon, Ralph E. [National Research Council Canada, Institute for National Measurement Standards, Ottawa, ON (Canada)

    2007-06-15

    Cold vapor atomic absorption spectrometry (CV-AAS) based on photochemical reduction by exposure to UV radiation is described for the determination of methylmercury and total mercury in biological samples. Two approaches were investigated: (a) tissues were digested in either formic acid or tetramethylammonium hydroxide (TMAH), and total mercury was determined following reduction of both species by exposure of the solution to UV irradiation; (b) tissues were solubilized in TMAH, diluted to a final concentration of 0.125% m/v TMAH by addition of 10% v/v acetic acid, and CH₃Hg⁺ was selectively quantitated, or the initial digests were diluted to 0.125% m/v TMAH by addition of deionized water, adjusted to pH 0.3 by addition of HCl, and CH₃Hg⁺ was selectively quantitated. For each case, the optimum conditions for photochemical vapor generation (photo-CVG) were investigated. The photochemical reduction efficiency was estimated to be ~95% by comparing the response with traditional SnCl₂ chemical reduction. The method was validated by analysis of several biological Certified Reference Materials, DORM-1, DORM-2, DOLT-2 and DOLT-3, using calibration against aqueous solutions of Hg²⁺; results showed good agreement with the certified values for total and methylmercury in all cases. Limits of detection of 6 ng/g for total mercury using formic acid, and 8 ng/g for total mercury and 10 ng/g for methylmercury using TMAH, were obtained. The proposed methodology is sensitive, simple and inexpensive, and promotes "green" chemistry. The potential for application to other sample types and analytes is evident. (orig.)

  8. NGSCheckMate: software for validating sample identity in next-generation sequencing studies within and across data types.

    Science.gov (United States)

    Lee, Sejoon; Lee, Soohyun; Ouellette, Scott; Park, Woong-Yang; Lee, Eunjung A; Park, Peter J

    2017-06-20

    In many next-generation sequencing (NGS) studies, multiple samples or data types are profiled for each individual. An important quality control (QC) step in these studies is to ensure that datasets from the same subject are properly paired. Given the heterogeneity of data types, file types and sequencing depths in a multi-dimensional study, a robust program that provides a standardized metric for genotype comparisons would be useful. Here, we describe NGSCheckMate, a user-friendly software package for verifying sample identities from FASTQ, BAM or VCF files. This tool uses a model-based method to compare allele read fractions at known single-nucleotide polymorphisms, considering depth-dependent behavior of similarity metrics for identical and unrelated samples. Our evaluation shows that NGSCheckMate is effective for a variety of data types, including exome sequencing, whole-genome sequencing, RNA-seq, ChIP-seq, targeted sequencing and single-cell whole-genome sequencing, with a minimal requirement for sequencing depth (>0.5X). An alignment-free module can be run directly on FASTQ files for a quick initial check. We recommend using this software as a QC step in NGS studies. https://github.com/parklab/NGSCheckMate. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
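
The core idea, that two datasets from the same individual show strongly correlated allele fractions at shared SNP sites, can be reduced to a few lines. This toy sketch computes a plain Pearson correlation of variant-allele fractions; NGSCheckMate's actual model additionally accounts for sequencing depth:

```python
import math

def vaf_correlation(vaf_a, vaf_b):
    """Pearson correlation of variant-allele fractions at matched SNPs."""
    n = len(vaf_a)
    ma, mb = sum(vaf_a) / n, sum(vaf_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(vaf_a, vaf_b))
    sa = math.sqrt(sum((a - ma) ** 2 for a in vaf_a))
    sb = math.sqrt(sum((b - mb) ** 2 for b in vaf_b))
    return cov / (sa * sb)

# Genotypes AA/AB/BB give allele fractions near 0, 0.5 and 1 in both
# datasets when they come from the same subject.
print(vaf_correlation([0.0, 0.5, 1.0, 0.5], [0.05, 0.48, 0.97, 0.52]))
```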

  9. ExSample. A library for sampling Sudakov-type distributions

    Energy Technology Data Exchange (ETDEWEB)

    Plaetzer, Simon

    2011-08-15

    Sudakov-type distributions are at the heart of generating radiation in parton showers as well as contemporary NLO matching algorithms along the lines of the POWHEG algorithm. In this paper, the C++ library ExSample is introduced, which implements adaptive sampling of Sudakov-type distributions for splitting kernels which are in general only known numerically. Besides the evolution variable, the splitting kernels can depend on an arbitrary number of other degrees of freedom to be sampled, and any number of further parameters which are fixed on an event-by-event basis. (orig.)
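
Sudakov-type distributions are usually sampled with the veto algorithm, which needs only an overestimate g >= f of the splitting kernel with an invertible integral. The sketch below shows the bare algorithm with simple stand-in kernels; ExSample's contribution is adapting the overestimate automatically when f is known only numerically:

```python
import math
import random

rng = random.Random(2)

def veto_sample(t_max, t_min, f, g, g_inv_integral):
    """First emission scale below t_max, or None if evolution reaches t_min.

    The no-emission probability is the Sudakov factor exp(-int f dt').
    """
    t = t_max
    while True:
        # Propose t' from the overestimate: solve int_{t'}^{t} g = -log(r).
        t = g_inv_integral(t, -math.log(rng.random()))
        if t <= t_min:
            return None                  # no emission above the cutoff
        if rng.random() < f(t) / g(t):   # accept with probability f/g
            return t

# Example: f(t) = a/t overestimated by g(t) = b/t, so
# int_{t'}^{t} g dt'' = b*log(t/t') and the inverse is t' = t*exp(-R/b).
a, b = 0.3, 0.5
print(veto_sample(100.0, 1.0,
                  f=lambda t: a / t,
                  g=lambda t: b / t,
                  g_inv_integral=lambda t, R: t * math.exp(-R / b)))
```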

  11. High speed network sampling

    OpenAIRE

    Rindalsholt, Ole Arild

    2005-01-01

    Master's thesis in network and system administration. Classical sampling methods play an important role in the current practice of Internet measurement. With today's high-speed networks, routers cannot manage to generate complete NetFlow data for every packet; they have to perform restricted sampling. This thesis summarizes some of the most important sampling schemes and their applications before diving into an analysis of the effect of sampling NetFlow records.

  12. Determination of antimony by electrochemical hydride generation atomic absorption spectrometry in samples with high iron content using chelating resins as on-line removal system

    International Nuclear Information System (INIS)

    Bolea, E.; Arroyo, D.; Laborda, F.; Castillo, J.R.

    2006-01-01

    A method for the removal of the interference caused by iron on the electrochemical generation of stibine is proposed. It consists of a Chelex 100 chelating resin column integrated into a flow injection system and coupled to an electrochemical hydride generator quartz tube atomic absorption spectrometer (EcHG-QT-AAS). Iron, as Fe(II), is retained in the column with high efficiency, close to 99.9% under optimal conditions. No significant retention was observed for Sb(III) under the same conditions, and a 97 ± 5% signal recovery was achieved. An electrochemical hydride generator with a concentric configuration and a reticulated vitreous carbon cathode was employed. The system is able to determine antimony concentrations in the range of ng mL⁻¹ in the presence of iron concentrations up to 400 mg L⁻¹. The procedure was validated by analyzing the PACS-2 marine sediment reference material, with a 4% (w/w) iron content and an [Fe]:[Sb] ratio of 4000:1, which caused total antimony signal suppression in the electrochemical hydride generation system. A compost sample with high iron content (0.7% w/w) was also analyzed. Good agreement was found for both samples, with the certified value and with the antimony concentration determined by ICP-MS, respectively.

  13. Calibrated acoustic emission system records M -3.5 to M -8 events generated on a saw-cut granite sample

    Science.gov (United States)

    McLaskey, Gregory C.; Lockner, David A.

    2016-01-01

    Acoustic emission (AE) analyses have been used for decades in rock mechanics testing, but because AE systems are not typically calibrated, the absolute sizes of dynamic microcrack growth and other physical processes responsible for the generation of AEs are poorly constrained. We describe a calibration technique for the AE recording system as a whole (transducers + amplifiers + digitizers + sample + loading frame) that uses the impact of a 4.76-mm free-falling steel ball bearing as a reference source. We demonstrate the technique on a 76-mm diameter cylinder of Westerly granite loaded in a triaxial deformation apparatus at 40 MPa confining pressure. The ball bearing is dropped inside a cavity within the sample while inside the pressure vessel. We compare this reference source to conventional AEs generated during loading of a saw-cut fault in a second granite sample. All located AEs occur on the saw-cut surface and have moment magnitudes ranging from M −5.7 down to at least M −8. Dynamic events rupturing the entire simulated fault surface (stick-slip events) have measurable stress drop and macroscopic slip and radiate seismic waves similar to those from a M −3.5 earthquake. The largest AE events that do not rupture the entire fault are M −5.7. For these events, we also estimate the corner frequency (200-300 kHz), and, assuming the Brune model, we estimate source dimensions of 4-6 mm. These AE sources are larger than the 0.2 mm grain size and smaller than the 76 × 152 mm fault surface.
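
The quoted source dimensions follow from the corner frequencies through the Brune relation r = 2.34*beta/(2*pi*fc). A quick numerical check, assuming a shear-wave speed of about 3000 m/s for granite (our assumption, not a value given in the abstract):

```python
import math

beta = 3000.0                      # assumed shear-wave speed in granite, m/s
for fc in (200e3, 300e3):          # corner frequencies from the abstract, Hz
    r = 2.34 * beta / (2 * math.pi * fc)
    print(f"fc = {fc / 1e3:.0f} kHz -> Brune source radius ~ {r * 1e3:.1f} mm")
# 200 kHz -> ~5.6 mm, 300 kHz -> ~3.7 mm, consistent with the 4-6 mm quoted.
```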

  14. Arsenic speciation in environmental samples by hydride generation and electrothermal atomic absorption spectrometry.

    Science.gov (United States)

    Anawar, Hossain Md

    2012-01-15

    For the past few years, many studies have been performed to determine arsenic (As) speciation in drinking water, the food chain and other environmental samples, owing to its well-recognized carcinogenic and toxic effects, which depend on its chemical form and oxidation state. This review provides an overview of the analytical methods and the preconcentration and separation techniques developed to date using HGAAS and ETAAS for the determination of inorganic As and organoarsenic species in environmental samples. Specific advantages, disadvantages, selectivity, sensitivity, efficiency, rapidity, detection limit (DL), and some aspects of recent improvements and modifications of the different analytical and separation techniques, which can define their application to a particular sample analysis, are highlighted. HG-AAS has high sensitivity, selectivity and a low DL when used with suitable separation techniques, and it is a more suitable, affordable and much less expensive technique than other detectors. The concentrations of HCl and NaBH₄ have a critical effect on the HG response of As species. Use of L-cysteine as a pre-reductant is advantageous over KI for obtaining the same signal response for different As species under the same optimum, mild acid concentration, and for reducing the interference of transition metals in arsine generation. Use of different pretreatment, digestion and separation techniques and surfactants allows the determination of As species with DLs from ng L⁻¹ to μg L⁻¹. Of all the chromatographic techniques coupled with HGAAS/ETAAS, ion-pair reversed-phase chromatography (IP-RP) is the most popular, owing to its higher separation efficiency, resolution, selectivity, simplicity, and ability to separate up to seven As species, for both non-ionic and ionic compounds, in a single run using the same column and a short time. However, a combination of anion- and cation-exchange chromatography seems the most promising for complete resolution of up to eight As species. The ETAAS method using different

  15. Influence of physical properties and chemical composition of sample on formation of aerosol particles generated by nanosecond laser ablation at 213 nm

    Energy Technology Data Exchange (ETDEWEB)

    Hola, Marketa, E-mail: mhola@sci.muni.c [Department of Chemistry, Faculty of Science, Masaryk University, Kotlarska 2, 611 37 Brno (Czech Republic); Konecna, Veronika [Department of Chemistry, Faculty of Science, Masaryk University, Kotlarska 2, 611 37 Brno (Czech Republic); Mikuska, Pavel [Institute of Analytical Chemistry, Academy of Sciences of the Czech Republic v.v.i., Veveri 97, 602 00 Brno (Czech Republic); Kaiser, Jozef [Institute of Physical Engineering, Faculty of Mechanical Engineering, Brno University of Technology, Technicka 2896/2, 616 69 Brno (Czech Republic); Kanicky, Viktor [Department of Chemistry, Faculty of Science, Masaryk University, Kotlarska 2, 611 37 Brno (Czech Republic)

    2010-01-15

    The influence of sample properties and composition on the size and concentration of aerosol particles generated by nanosecond Nd:YAG laser ablation at 213 nm was investigated for three sets of different materials, each containing five specimens with a similar matrix (Co-cemented carbides with a variable content of W and Co, steel samples with minor differences in elemental content, and silica glasses of various colors). The concentration of ablated particles (particle number concentration, PNC) was measured in two size ranges (10-250 nm and 0.25-17 μm) using an optical aerosol spectrometer. The shapes and volumes of the ablation craters were obtained by Scanning Electron Microscopy (SEM) and by an optical profilometer, respectively. Additionally, the structure of the laser-generated particles was studied after their collection on a filter using SEM. The particle concentration measurements showed a significant dominance of particles smaller than 250 nm over larger particles, irrespective of the kind of material. Even though the number of particles larger than 0.25 μm is negligible (up to 0.1%), the volume of large particles that left the ablation cell can reach 50% of the whole particle volume, depending on the material. Study of the ablation craters and the laser-generated particles showed varying numbers of particles produced by different ablation mechanisms (particle splashing or condensation), but a similar character of the released particles was observed for all materials by SEM after particle collection on the membrane filter. The created aerosol always consisted of two main structures: spherical particles with diameters from tenths to units of micrometers, originally ejected from the molten surface layer, and μm-sized 'fibres' composed of primary agglomerates with diameters in the range between tens and hundreds of nanometers. The shape and structure of the ablation craters were in good agreement with particle concentration

  16. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    Science.gov (United States)

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and no correlation between the mean values and maximum sampling errors of the two methods was observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass, with precise confidence intervals, is of particular interest in energetic biomass applications. PMID:20717532

  17. Reliable single chip genotyping with semi-parametric log-concave mixtures.

    Directory of Open Access Journals (Sweden)

    Ralph C A Rippe

    The common approach to SNP genotyping is to use (model-based) clustering per individual SNP, on a set of arrays. Genotyping all SNPs on a single array is much more attractive, in terms of flexibility, stability and applicability, when developing new chips. A new semi-parametric method, named SCALA, is proposed. It is based on a mixture model using semi-parametric log-concave densities. Instead of using the raw data, the mixture is fitted on a two-dimensional histogram, thereby making computation time almost independent of the number of SNPs. Furthermore, the algorithm is effective in low-MAF situations. Comparisons between SCALA and CRLMM on HapMap genotypes show very reliable calling of single arrays. Some heterozygous genotypes from HapMap are called homozygous by SCALA, and to a lesser extent by CRLMM too. Furthermore, HapMap's NoCalls (NN) could be genotyped by SCALA, mostly with high probability. The software is available as R scripts from the website www.math.leidenuniv.nl/~rrippe.
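
The flavor of mixture-based genotype calling is easy to demonstrate. As a simplified stand-in for SCALA's semi-parametric log-concave mixture fitted on a 2D histogram, the sketch below fits a three-component Gaussian mixture to a one-dimensional allelic-contrast signal and reports calls with posterior probabilities:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated contrast values for the AA, AB and BB clusters of one array.
contrast = np.concatenate([rng.normal(-1, 0.1, 500),
                           rng.normal(0, 0.1, 300),
                           rng.normal(1, 0.1, 200)])
X = contrast.reshape(-1, 1)

gm = GaussianMixture(n_components=3, random_state=0).fit(X)
rank = np.argsort(np.argsort(gm.means_.ravel()))   # rank components by mean
label_by_component = np.array(["AA", "AB", "BB"])[rank]
calls = label_by_component[gm.predict(X)]
posterior = gm.predict_proba(X).max(axis=1)        # calling confidence
print(calls[:5], posterior[:5].round(3))
```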

  18. Evaluation of pump pulsation in respirable size-selective sampling: part II. Changes in sampling efficiency.

    Science.gov (United States)

    Lee, Eun Gyung; Lee, Taekhee; Kim, Seung Won; Lee, Larry; Flemmer, Michael M; Harper, Martin

    2014-01-01

    This second, and concluding, part of this study evaluated changes in sampling efficiency of respirable size-selective samplers due to air pulsations generated by the selected personal sampling pumps characterized in Part I (Lee E, Lee L, Möhlmann C et al. Evaluation of pump pulsation in respirable size-selective sampling: Part I. Pulsation measurements. Ann Occup Hyg 2013). Nine particle sizes of monodisperse ammonium fluorescein (from 1 to 9 μm mass median aerodynamic diameter) were generated individually by a vibrating orifice aerosol generator from dilute solutions of fluorescein in aqueous ammonia and then injected into an environmental chamber. To collect these particles, 10-mm nylon cyclones, also known as Dorr-Oliver (DO) cyclones, were used with five medium volumetric flow rate pumps. Those were the Apex IS, HFS513, GilAir5, Elite5, and Basic5 pumps, which were found in Part I to generate pulsations of 5% (the lowest), 25%, 30%, 56%, and 70% (the highest), respectively. GK2.69 cyclones were used with the Legacy [pump pulsation (PP) = 15%] and Elite12 (PP = 41%) pumps for collection at high flows. The DO cyclone was also used to evaluate changes in sampling efficiency due to pulse shape. The HFS513 pump, which generates a more complex pulse shape, was compared to a single sine wave fluctuation generated by a piston. The luminescent intensity of the fluorescein extracted from each sample was measured with a luminescence spectrometer. Sampling efficiencies were obtained by dividing the intensity of the fluorescein extracted from the filter placed in a cyclone with the intensity obtained from the filter used with a sharp-edged reference sampler. Then, sampling efficiency curves were generated using a sigmoid function with three parameters and each sampling efficiency curve was compared to that of the reference cyclone by constructing bias maps. In general, no change in sampling efficiency (bias under ±10%) was observed until pulsations exceeded 25% for the

  19. Experimental and numerical examination of eddy (Foucault) currents in rotating micro-coils: Generation of heat and its impact on sample temperature

    Science.gov (United States)

    Aguiar, Pedro M.; Jacquinot, Jacques-François; Sakellariou, Dimitris

    2009-09-01

    The application of nuclear magnetic resonance (NMR) to systems of limited quantity has stimulated the use of micro-coils (diameter < 1 mm). Spinning such coils in the static magnetic field induces Foucault (eddy) currents, which generate heat. We report the first data acquired with a 4 mm MACS system spinning at up to 10 kHz. The need to spin faster necessitates improved methods to control heating. We propose an approximate solution for calculating the power losses (heat) from the eddy currents for a solenoidal coil, in order to provide insight into the functional dependencies of Foucault currents. Experimental tests of these dependencies reveal conditions which result in reduced sample heating and negligible temperature distributions over the sample volume.

  1. Post-Flight Microbial Analysis of Samples from the International Space Station Water Recovery System and Oxygen Generation System

    Science.gov (United States)

    Birmele, Michele N.

    2011-01-01

    The Regenerative Environmental Control and Life Support System (ECLSS) on the International Space Station (ISS) includes the Water Recovery System (WRS) and the Oxygen Generation System (OGS). The WRS consists of a Urine Processor Assembly (UPA) and a Water Processor Assembly (WPA). This report describes the microbial characterization of wastewater and surface samples collected from the WRS and OGS subsystems and returned to KSC, JSC, and MSFC on consecutive shuttle flights (STS-129 and STS-130) in 2009-10. STS-129 returned two filters that contained fluid samples from the WPA Waste Tank Orbital Recovery Unit (ORU), one from the waste tank and the other from the ISS humidity condensate. Direct count by microscopic enumeration revealed 8.38 × 10⁴ cells per mL in the humidity condensate sample, but none of those cells were recoverable on solid agar media. In contrast, 3.32 × 10⁵ cells per mL were measured from a surface swab of the WRS waste tank, including viable bacteria and fungi recovered after S12 days of incubation on solid agar media. Based on rDNA sequencing and phenotypic characterization, a fungus recovered from the filter was determined to be Lecythophora mutabilis. The bacterial isolate was identified by rDNA sequence data to be Methylobacterium radiotolerans. Additional UPA subsystem samples were returned on STS-130 for analysis. Both liquid and solid samples were collected from the Russian urine container (EDV), Distillation Assembly (DA) and Recycle Filter Tank Assembly (RFTA) for post-flight analysis. The bacterium Pseudomonas aeruginosa and the fungus Chaetomium brasiliense were isolated from the EDV samples. No viable bacteria or fungi were recovered from RFTA brine samples (N = 6), but multiple samples (N = 11) from the DA and RFTA were found to contain fungal and bacterial cells. Many recovered cells have been identified to genus by rDNA sequencing and carbon source utilization profiling (BiOLOG Gen III). The presence of viable bacteria and fungi from WRS

  2. Generating Seismograms with Deep Neural Networks

    Science.gov (United States)

    Krischer, L.; Fichtner, A.

    2017-12-01

    The recent surge of successful uses of deep neural networks in computer vision, speech recognition, and natural language processing, mainly enabled by the availability of fast GPUs and extremely large data sets, is starting to see many applications across all natural sciences. In seismology these are largely confined to classification and discrimination tasks. In this contribution we explore the use of deep neural networks for another class of problems: so-called generative models. Generative modelling is a branch of statistics concerned with generating new observed data samples, usually by drawing from some underlying probability distribution. Samples with specific attributes can be generated by conditioning on input variables. In this work we condition on seismic source (mechanism and location) and receiver (location) parameters to generate multi-component seismograms. The deep neural networks are trained on synthetic data calculated with Instaseis (http://instaseis.net, van Driel et al. (2015)) and waveforms from the global ShakeMovie project (http://global.shakemovie.princeton.edu, Tromp et al. (2010)). The underlying radially symmetric or smoothly three-dimensional Earth structures result in comparatively small waveform differences between similar events or at close receivers, and the networks learn to interpolate between training data samples. Of particular importance is the chosen misfit functional. Generative adversarial networks (Goodfellow et al. (2014)) implement a system in which two networks compete: the generator network creates samples and the discriminator network distinguishes these from the true training examples. Both are trained in an adversarial fashion until the discriminator can no longer distinguish between generated and real samples. We show how this can be applied to seismograms and in particular how it compares to networks trained with more conventional misfit metrics. Last but not least we attempt to shed some light on the black-box nature of
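
The adversarial game described above fits in a few dozen lines. Below is a minimal unconditional GAN sketch in PyTorch on toy damped-sinusoid "waveforms"; the architectures and hyperparameters are arbitrary illustrative choices, not those of the cited work:

```python
import torch
import torch.nn as nn

def real_batch(n, length=64):
    """Toy 'seismograms': damped sinusoids with random frequency."""
    t = torch.linspace(0, 1, length)
    freq = 5 + 10 * torch.rand(n, 1)
    return torch.sin(2 * torch.pi * freq * t) * torch.exp(-3 * t)

G = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 64))
D = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    x, z = real_batch(32), torch.randn(32, 16)
    # Discriminator: push real toward 1, generated toward 0.
    d_loss = (bce(D(x), torch.ones(32, 1)) +
              bce(D(G(z).detach()), torch.zeros(32, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator: fool the discriminator.
    g_loss = bce(D(G(z)), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

fake = G(torch.randn(1, 16)).detach()   # one generated waveform
```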

  3. Genetic admixture and population substructure in Guanacaste Costa Rica.

    Directory of Open Access Journals (Sweden)

    Zhaoming Wang

    2010-10-01

    The population of Costa Rica (CR) represents an admixture of major continental populations. An investigation of the CR population structure would provide an important foundation for mapping genetic variants underlying common diseases and traits. We conducted an analysis of 1,301 women from the Guanacaste region of CR using 27,904 single nucleotide polymorphisms (SNPs) genotyped on a custom Illumina InfiniumII iSelect chip. The program STRUCTURE was used to compare the CR Guanacaste sample with four continental reference samples, including HapMap Europeans (CEU), East Asians (JPT+CHB), West African Yoruba (YRI), and Native Americans (NA) from the Illumina iControl database. Our results show that the CR Guanacaste sample comprises a three-way admixture estimated to be 43% European, 38% Native American and 15% West African. An estimated 4% residual Asian ancestry may be within the error range. Results from principal components analysis reveal a correlation between genetic and geographic distance. The magnitude of linkage disequilibrium (LD), measured by the number of tagging SNPs required to cover the same region of the genome, appeared to be weaker in the CR Guanacaste sample than in the CEU, JPT+CHB and NA reference samples, but stronger than in the HapMap YRI sample. Based on the clustering pattern observed in both STRUCTURE and principal components analysis, two subpopulations were identified that differ by approximately 20% in LD block size averaged over all LD blocks identified by Haploview. We also show, in a simulated association study conducted within the two subpopulations, that failure to account for population stratification (PS) could lead to a noticeable inflation of the false positive rate. However, we further demonstrate that existing PS adjustment approaches can reduce the inflation to an acceptable level for gene discovery.

  4. Determination of ultra trace arsenic species in water samples by hydride generation atomic absorption spectrometry after cloud point extraction

    Energy Technology Data Exchange (ETDEWEB)

    Ulusoy, Halil Ibrahim, E-mail: hiulusoy@yahoo.com [University of Cumhuriyet, Faculty of Science, Department of Chemistry, TR-58140, Sivas (Turkey); Akcay, Mehmet; Ulusoy, Songuel; Guerkan, Ramazan [University of Cumhuriyet, Faculty of Science, Department of Chemistry, TR-58140, Sivas (Turkey)

    2011-10-10

    Graphical abstract: The possible complex formation mechanism for ultra-trace As determination. Highlights: • A CPE/HGAAS system for arsenic determination and speciation has been applied to real samples for the first time. • The proposed method has the lowest detection limit among comparable CPE studies in the literature. • The linear range of the method is wide and suitable for application to real samples. - Abstract: A cloud point extraction (CPE) methodology has been successfully employed for the preconcentration of ultra-trace arsenic species in aqueous samples prior to hydride generation atomic absorption spectrometry (HGAAS). As(III) formed an ion-pairing complex with Pyronine B in the presence of sodium dodecyl sulfate (SDS) at pH 10.0 and was extracted into the non-ionic surfactant polyethylene glycol tert-octylphenyl ether (Triton X-114). After phase separation, the surfactant-rich phase was diluted with 2 mL of 1 M HCl and 0.5 mL of 3.0% (w/v) Antifoam A. Under the optimized conditions, a preconcentration factor of 60 and a detection limit of 0.008 μg L⁻¹, with a correlation coefficient of 0.9918, were obtained with a calibration curve in the range of 0.03-4.00 μg L⁻¹. The proposed preconcentration procedure was successfully applied to the determination of As(III) ions in certified standard water samples (TMDA-53.3 and NIST 1643e, a low-level fortified standard for trace elements) and some real samples, including natural drinking water and tap water samples.
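
For context, a detection limit like the 0.008 μg L⁻¹ reported above is typically derived from the blank noise, the calibration slope and the enrichment factor. A sketch with made-up blank and slope values, chosen only to reproduce a number of that order:

```python
blank_sd = 0.008         # assumed SD of replicate blank absorbance readings
slope = 0.050            # assumed calibration slope, absorbance per (ug/L)
preconcentration = 60    # CPE enrichment factor reported above

lod_instrument = 3 * blank_sd / slope           # 3-sigma LOD without enrichment
lod_method = lod_instrument / preconcentration  # LOD after cloud point extraction
print(f"instrumental LOD {lod_instrument:.3f} ug/L "
      f"-> method LOD {lod_method:.3f} ug/L")   # 0.480 -> 0.008 ug/L
```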

  5. Assessment of Epstein-Barr virus nucleic acids in gastric but not in breast cancer by next-generation sequencing of pooled Mexican samples

    Science.gov (United States)

    Fuentes-Pananá, Ezequiel M; Larios-Serrato, Violeta; Méndez-Tenorio, Alfonso; Morales-Sánchez, Abigail; Arias, Carlos F; Torres, Javier

    2016-01-01

    Gastric (GC) and breast (BrC) cancer are two of the most common and deadly tumours. Different lines of evidence suggest a possible causative role of viral infections in both GC and BrC. Whole-genome sequencing (WGS) technologies allow searching for viral agents in the tissues of patients with cancer. These technologies have already contributed to establishing virus-cancer associations and to discovering new tumour viruses. The objective of this study was to document possible associations of viral infection with GC and BrC in Mexican patients. To gain an idea of cost-effective conditions for experimental sequencing, we first carried out an in silico simulation of WGS. The next-generation platform Illumina GAIIx was then used to sequence GC and BrC tumour samples. While we did not find viral sequences in tissues from BrC patients, multiple reads matching Epstein-Barr virus (EBV) sequences were found in GC tissues. An end-point polymerase chain reaction confirmed an enrichment of EBV sequences in one of the GC samples sequenced, validating the next-generation sequencing-bioinformatics pipeline. PMID:26910355

  6. Evaluation of Oconee steam-generator debris. Final report

    International Nuclear Information System (INIS)

    Rigdon, M.A.; Rubright, M.M.; Sarver, L.W.

    1981-10-01

    Pieces of debris were observed near damaged tubes at the 14th support plate elevation in the Oconee 1-B steam generator. A project was initiated to evaluate the physical and chemical nature of the debris, to identify its source, and to determine its role in the tube damage at this elevation. Various laboratory techniques were used to characterize several debris and mill scale samples. Data from these samples were then compared with each other and with literature data. It was concluded that seven of the eight debris samples were probably formed in the steam generator. Six of these samples were probably formed by high-temperature aqueous corrosion early in the life of the steam generator. The seventh sample was probably formed by the deposition and spalling of magnetite on the Inconel steam generator tubes. None of the debris samples resembled any of the mill scale samples.

  7. Assessing Generative Braille Responding Following Training in a Matching-to-Sample Format

    Science.gov (United States)

    Putnam, Brittany C.; Tiger, Jeffrey H.

    2016-01-01

    We evaluated the effects of teaching sighted college students to select printed text letters given a braille sample stimulus in a matching-to-sample (MTS) format on the emergence of untrained (a) construction of print characters given braille samples, (b) construction of braille characters given print samples, (c) transcription of print characters…

  8. Integrative analysis of single nucleotide polymorphisms and gene expression efficiently distinguishes samples from closely related ethnic populations

    Directory of Open Access Journals (Sweden)

    Yang Hsin-Chou

    2012-07-01

    Abstract Background Ancestry informative markers (AIMs) are a type of genetic marker that is informative for tracing the ancestral ethnicity of individuals. The application of AIMs has gained substantial attention in population genetics, forensic sciences, and medical genetics. Single nucleotide polymorphisms (SNPs), the material of AIMs, are useful for classifying individuals from distinct continental origins but cannot discriminate individuals with subtle genetic differences from closely related ancestral lineages. Proof-of-principle studies have shown that gene expression (GE) is also a heritable human variation that exhibits differential intensity distributions among ethnic groups. GE supplies ethnic information supplemental to SNPs; this motivated us to integrate SNP and GE markers to construct AIM panels that require fewer markers and provide high accuracy in ancestry inference. Few studies in the literature have considered GE in this respect, and none have integrated SNP and GE markers to aid classification of samples from closely related ethnic populations. Results We integrated a forward variable selection procedure into flexible discriminant analysis to identify the key SNP and/or GE markers with the highest cross-validation prediction accuracy. By analyzing genome-wide SNP and/or GE markers in 210 independent samples from four ethnic groups in the HapMap II Project, we found that average testing accuracies for the majority of classification analyses were quite high, except for SNP-only analyses performed to discern study samples containing individuals from two close Asian populations. The average testing accuracies ranged from 0.53 to 0.79 for SNP-only analyses and increased to around 0.90 when GE markers were integrated together with SNP markers for the classification of samples from closely related Asian populations. Compared to GE-only analyses, integrative analyses of SNP and GE markers showed comparable testing
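
The forward selection loop described in the Results is straightforward to emulate. The sketch below uses scikit-learn's linear discriminant analysis (the study used flexible discriminant analysis) with cross-validated accuracy on simulated markers; the data shapes mirror the 210-sample, four-group setting, but the values are synthetic:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(210, 50))                    # toy SNP + expression features
y = np.repeat([0, 1, 2, 3], [60, 45, 45, 60])     # four "ethnic groups"
X[y == 1, 0] += 2.0; X[y == 2, 1] += 2.0; X[y == 3, 2] += 2.0  # informative markers

selected, remaining = [], list(range(X.shape[1]))
for _ in range(3):                                # greedily add three markers
    scores = {j: cross_val_score(LinearDiscriminantAnalysis(),
                                 X[:, selected + [j]], y, cv=5).mean()
              for j in remaining}
    best = max(scores, key=scores.get)
    selected.append(best); remaining.remove(best)
    print(f"added feature {best}, CV accuracy {scores[best]:.2f}")
```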

  9. Generation and analysis of chemical compound libraries

    Science.gov (United States)

    Gregoire, John M.; Jin, Jian; Kan, Kevin S.; Marcin, Martin R.; Mitrovic, Slobodan; Newhouse, Paul F.; Suram, Santosh K.; Xiang, Chengxiang; Zhou, Lan

    2017-10-03

    Various samples are generated on a substrate. The samples each includes or consists of one or more analytes. In some instances, the samples are generated through the use of gels or through vapor deposition techniques. The samples are used in an instrument for screening large numbers of analytes by locating the samples between a working electrode and a counter electrode assembly. The instrument also includes one or more light sources for illuminating each of the samples. The instrument is configured to measure the photocurrent formed through a sample as a result of the illumination of the sample.

  10. The HLA-net GENE[RATE] pipeline for effective HLA data analysis and its application to 145 population samples from Europe and neighbouring areas.

    Science.gov (United States)

    Nunes, J M; Buhler, S; Roessli, D; Sanchez-Mazas, A

    2014-05-01

    In this review, we present for the first time an integrated version of the Gene[rate] computer tools, which have been developed during the last 5 years to analyse human leukocyte antigen (HLA) data in human populations, as well as the results of their application to a large dataset of 145 HLA-typed population samples from Europe and its two neighbouring areas, North Africa and West Asia, now forming part of the Gene[va] database. All these computer tools and genetic data are now publicly available through a newly designed bioinformatics platform, HLA-net, presented here as a main achievement of the HLA-NET scientific programme. The Gene[rate] pipeline offers user-friendly computer tools to estimate allele and haplotype frequencies, to test Hardy-Weinberg equilibrium (HWE), selective neutrality and linkage disequilibrium, to recode HLA data, to convert file formats, to display population frequencies of chosen alleles and haplotypes in selected geographic regions, and to perform genetic comparisons among chosen sets of population samples, including new data provided by the user. Both numerical and graphical outputs are generated, the latter being highly explicit and of publication quality. All these analyses can be performed on the pipeline after scrupulous validation of the population sample's characterisation and HLA typing reporting according to HLA-NET recommendations. The Gene[va] database offers direct access to the HLA-A, -B, -C, -DQA1, -DQB1, -DRB1 and -DPB1 frequencies and summary statistics of the 145 population samples that have successfully passed these HLA-NET 'filters', representing three European subregions (South-East, North-East and Central-West Europe) and two neighbouring areas (North Africa, as far as Sudan, and West Asia, as far as South India). The analysis of these data, summarized in this review, shows substantial genetic variation at the regional level in this continental area. These results have main implications for population genetics
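
Of the analyses listed, the Hardy-Weinberg test is the simplest to show in code. A textbook chi-square version for a biallelic marker follows; HLA loci are highly multi-allelic, so the Gene[rate] tools use more general procedures, and this sketch is only the two-allele special case:

```python
from scipy.stats import chi2

def hwe_test(n_aa, n_ab, n_bb):
    """Chi-square test of Hardy-Weinberg equilibrium from genotype counts."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)                # frequency of allele A
    expected = (p * p * n, 2 * p * (1 - p) * n, (1 - p) ** 2 * n)
    stat = sum((o - e) ** 2 / e
               for o, e in zip((n_aa, n_ab, n_bb), expected))
    return stat, chi2.sf(stat, df=1)               # 3 classes - 1 - 1 estimated

stat, pval = hwe_test(298, 489, 213)
print(f"chi2 = {stat:.2f}, p = {pval:.3f}")        # no evidence against HWE
```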

  11. Preliminary report on the development of some indices of relative nutritive value (RNV) of cereal and legume samples, applicable in the early generations of selection

    International Nuclear Information System (INIS)

    Kaul, A.K.; Niemann, E.G.

    1975-01-01

    Rapid screening methods for biuret nitrogen determination and fluorometric lysine estimation are described. While the biuret method was found to be suitable for early-generation screening for peptide nitrogen determination, fluorescence estimation of dansylated grain meal could be taken as a good index of available lysine in cereal and legume samples. The necessity of rapid and inexpensive tests for the determination of Relative Nutritive Value (RNV) in the advanced generations of screening is discussed. Preliminary data available on two such tests, utilizing the protozoan Tetrahymena pyriformis W. and the flour beetle Tribolium confusum Duval, indicated promise. Both techniques were tried on different cereal and legume samples. The relative lethality of beetle larvae and their nitrogen retention were taken as indices of RNV in legumes. The Larval Nitrogen Retention Index (LNRI) of cereal samples was found to depend both on nitrogen content and on protein quality. It was concluded that both these organisms need to be further investigated for their potential as test animals for RNV determination in advanced segregating populations. (author)

  12. Generating Random Samples of a Given Size Using Social Security Numbers.

    Science.gov (United States)

    Erickson, Richard C.; Brauchle, Paul E.

    1984-01-01

    The purposes of this article are (1) to present a method by which social security numbers may be used to draw cluster samples of a predetermined size and (2) to describe procedures used to validate this method of drawing random samples. (JOW)
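
The usual form of this idea treats all records whose identification number ends in a given two-digit suffix as one cluster: each suffix captures roughly 1% of a population, so drawing k random suffixes yields an approximately k% sample. An illustrative sketch (the article's exact procedure may differ):

```python
import random

def terminal_digit_sample(ids, k, rng=None):
    """Keep records whose ID ends in one of k randomly chosen 2-digit suffixes."""
    rng = rng or random.Random(0)
    suffixes = set(rng.sample([f"{i:02d}" for i in range(100)], k))
    return [i for i in ids if i[-2:] in suffixes]

population = [f"{n:09d}" for n in range(100_000)]   # fake 9-digit numbers
sample = terminal_digit_sample(population, k=5)
print(len(sample))                                  # ~5% of 100,000
```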

  13. Evaluation of sampling schemes for in-service inspection of steam generator tubing

    International Nuclear Information System (INIS)

    Hanlen, R.C.

    1990-03-01

    This report is a follow-on to work initially sponsored by the US Nuclear Regulatory Commission (Bowen et al. 1989). The work presented here is funded by EPRI and is jointly sponsored by the Electric Power Research Institute (EPRI) and the US Nuclear Regulatory Commission (NRC). The goal of this research was to evaluate fourteen sampling schemes or plans. The main criterion used for evaluating plan performance was the effectiveness of sampling, detecting and plugging defective tubes. This performance criterion was evaluated across several choices of distributions of degraded/defective tubes, probability of detection (POD) curves, and eddy-current sizing models. Conclusions from this study are dependent upon the tube defect distributions, sample size, and expansion rules considered. Where degraded/defective tubes form "clusters" (i.e., maps 6A, 8A and 13A), the smaller sample sizes provide a capability of detecting and sizing defective tubes that approaches 100% inspection. When there is little or no clustering (i.e., maps 1A, 20 and 21), sample efficiency is approximately equal to the initial sample size taken. There is an indication (though not statistically significant) that the systematic sampling plans are better than the random sampling plans for an equivalent initial sample size. There was no indication of an effect due to modifying the threshold value for the second-stage expansion. The lack of an indication is likely due to the specific tube flaw sizes considered for the six tube maps. 1 ref., 11 figs., 19 tabs

  14. Multielemental Determination of As, Bi, Ge, Sb, and Sn in Agricultural Samples Using Hydride Generation Coupled to Microwave-Induced Plasma Optical Emission Spectrometry.

    Science.gov (United States)

    Machado, Raquel C; Amaral, Clarice D B; Nóbrega, Joaquim A; Araujo Nogueira, Ana Rita

    2017-06-14

    A microwave-induced plasma optical emission spectrometer with an N₂-based plasma was combined with a multimode sample introduction system (MSIS) for hydride generation (HG) and the multielemental determination of As, Bi, Ge, Sb, and Sn in samples of forage, bovine liver, powdered milk, agricultural gypsum, rice, and mineral fertilizer, using a single condition of prereduction and reduction. The accuracy of the developed analytical method was evaluated using certified reference materials of water and mineral fertilizer, and recoveries ranged from 95 to 106%. Addition and recovery experiments were carried out, and the recoveries varied from 85 to 117% for all samples evaluated. The limits of detection for As, Bi, Ge, Sb, and Sn were 0.46, 0.09, 0.19, 0.46, and 5.2 μg/L, respectively, for liquid samples, and 0.18, 0.04, 0.08, 0.19, and 2.1 mg/kg, respectively, for solid samples. The proposed method offers a simple, fast, multielemental, and robust alternative for the successful determination of all five analytes in agricultural samples at low operational cost, without compromising analytical performance.

  15. In-line electrochemical reagent generation coupled to a flow injection biamperometric system for the determination of sulfite in beverage samples.

    Science.gov (United States)

    de Paula, Nattany T G; Barbosa, Elaine M O; da Silva, Paulo A B; de Souza, Gustavo C S; Nascimento, Valberes B; Lavorante, André F

    2016-07-15

    This work reports in-line electrochemical reagent generation coupled to a flow injection biamperometric procedure for the determination of SO₃²⁻. The method is based on a redox reaction between the I₃⁻ and SO₃²⁻ ions after the diffusion of SO₂ through a gas diffusion chamber. Under optimum experimental conditions, a linear response ranging from 1.0 to 12.0 mg L⁻¹ (R = 0.9999 and n = 7), detection and quantification limits estimated at 0.26 and 0.86 mg L⁻¹, respectively, a relative standard deviation of 0.4% (n = 10) for a reference solution of 4.0 mg L⁻¹ SO₃²⁻, and a sampling throughput of 40 determinations per hour were achieved. Addition and recovery tests with juice and wine samples were performed, with results ranging between 92% and 110%. There were no significant differences at a 95% confidence level in the analysis of eight samples when comparing the new method with a reference procedure. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Determination of arsenic species in seafood samples from the Aegean Sea by liquid chromatography-(photo-oxidation)-hydride generation-atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Schaeffer, Richard [Department of Applied Chemistry, Corvinus University, Villanyi ut 29-35, 1118 Budapest (Hungary); Soeroes, Csilla [Department of Applied Chemistry, Corvinus University, Villanyi ut 29-35, 1118 Budapest (Hungary); Ipolyi, Ildiko [Department of Applied Chemistry, Corvinus University, Villanyi ut 29-35, 1118 Budapest (Hungary); Fodor, Peter [Department of Applied Chemistry, Corvinus University, Villanyi ut 29-35, 1118 Budapest (Hungary); Thomaidis, Nikolaos S. [Laboratory of Analytical Chemistry, Department of Chemistry, University of Athens, Panepistiomopolis Zografou, 15776 Athens (Greece)]. E-mail: ntho@chem.uoa.gr

    2005-08-15

    In this study, arsenic compounds were determined in mussels (Mytilus galloprovincialis), anchovies (Engraulis encrasicolus), sea-breams (Sparus aurata), sea bass (Dicentrarchus labrax) and sardines (Sardina pilchardus) collected from the Aegean Sea using a liquid chromatography-(photo-oxidation)-hydride generation-atomic fluorescence spectrometry [LC-(PO)-HG-AFS] system. Twelve arsenicals were separated and determined on the basis of their differences in two properties: (i) the pK{sub a} values and (ii) the hydride generation capacity. The separation was carried out both with an anion- and a cation-exchange column, with and without photo-oxidation. In all the samples arsenobetaine, AB, was detected as the major compound (concentrations ranging between 2.7 and 23.1 {mu}g g{sup -1} dry weight), with trace amounts of arsenite, As(III), dimethylarsinic acid, DMA, and arsenocholine, AC, also present. Arsenosugars were detected only in the mussel samples (in concentrations of 0.9-3.6 {mu}g g{sup -1} dry weight), along with the presence of an unknown compound, which, based on its retention time on the anion-exchange column Hamilton PRP-X100 and a recent communication [E. Schmeisser, R. Raml, K.A. Francesconi, D. Kuehnelt, A. Lindberg, Cs. Soeroes, W. Goessler, Chem. Commun. 16 (2004) 1824], is supposed to be a thio-arsenic analogue.

  17. 'Intelligent' approach to radioimmunoassay sample counting employing a microprocessor controlled sample counter

    International Nuclear Information System (INIS)

    Ekins, R.P.; Sufi, S.; Malan, P.G.

    1977-01-01

    The enormous impact on medical science in the last two decades of microanalytical techniques employing radioisotopic labels has, in turn, generated a large demand for automatic radioisotopic sample counters. Such instruments frequently comprise the most important item of capital equipment required in the use of radioimmunoassay and related techniques and often form a principal bottleneck in the flow of samples through a busy laboratory. It is therefore particularly imperative that such instruments should be used 'intelligently' and in an optimal fashion to avoid both the very large capital expenditure involved in the unnecessary proliferation of instruments and the time delays arising from their sub-optimal use. The majority of the current generation of radioactive sample counters nevertheless rely on primitive control mechanisms based on a simplistic statistical theory of radioactive sample counting which preclude their efficient and rational use. The fundamental principle upon which this approach is based is that it is useless to continue counting a radioactive sample for a time longer than that required to yield a significant increase in precision of the measurement. Thus, since substantial experimental errors occur during sample preparation, these errors should be assessed and must be related to the counting errors for that sample. It is the objective of this presentation to demonstrate that the combination of a realistic statistical assessment of radioactive sample measurement, together with the more sophisticated control mechanisms that modern microprocessor technology makes possible, may often enable savings in counter usage of the order of 5-10 fold to be made. (orig.) [de
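
    The stopping rule argued for above can be made concrete with a small calculation: the total relative error is the quadrature sum of the sample-preparation error and the Poisson counting error, so counting beyond the point where the counting term is small buys almost no precision. The preparation CV and count rate below are illustrative assumptions, not values from the paper.

      import math

      prep_cv = 0.05          # assumed 5% relative error from sample preparation
      count_rate = 100.0      # assumed counts per second for this sample

      def total_cv(t):
          counts = count_rate * t
          counting_cv = 1.0 / math.sqrt(counts)   # Poisson relative error
          return math.sqrt(prep_cv**2 + counting_cv**2)

      for t in (1, 5, 10, 30, 60, 300):
          print(f"t = {t:4d} s  total CV = {total_cv(t):.4f}")
      # Beyond roughly 30 s the total CV is dominated by the 5% preparation
      # error, so further counting time on this sample is essentially wasted.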

  18. Synergetic enhancement effect of ionic liquid and diethyldithiocarbamate on the chemical vapor generation of nickel for its atomic fluorescence spectrometric determination in biological samples

    International Nuclear Information System (INIS)

    Zhang Chuan; Li Yan; Wu Peng; Yan Xiuping

    2009-01-01

    Room-temperature ionic liquid in combination with sodium diethyldithiocarbamate (DDTC) was used to synergetically improve the chemical vapor generation (CVG) of nickel. Volatile species of nickel were effectively generated through reduction of acidified analyte solution with KBH4 in the presence of 0.02% DDTC and 25 mmol L-1 1-butyl-3-methylimidazolium bromide ([C4mim]Br) at room temperature. Thus, a new flow injection (FI)-CVG-atomic fluorescence spectrometric (FI-CVG-AFS) method was developed for determination of nickel with a detection limit of 0.65 μg L-1 (3 s) and a sampling frequency of 180 h-1. With consumption of 0.5 mL sample solution, an enhancement factor of 2400 was obtained. The precision (RSD) for eleven replicate determinations of 20 μg L-1 Ni was 3.4%. The developed FI-CVG-AFS method was successfully applied to determination of trace Ni in several certified biological reference materials.

  19. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...
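
    As a flavour of the general-purpose methods such a monograph covers, here is a minimal rejection sampler for an arbitrary bounded density on an interval; the target density and the bound m are arbitrary illustrative choices, not an example from the book.

      import math
      import random

      def target(x):
          # Unnormalised bimodal density on [-3, 3] (illustrative choice).
          return math.exp(-0.5 * (x - 1.5)**2) + 0.6 * math.exp(-0.5 * (x + 1.5)**2)

      def rejection_sample(n, a=-3.0, b=3.0, m=1.7):
          # m must bound target(x) on [a, b]; here 1.7 > max of about 1.01.
          samples = []
          while len(samples) < n:
              x = random.uniform(a, b)
              if random.uniform(0, m) < target(x):
                  samples.append(x)      # accepted draws are i.i.d. from target
          return samples

      xs = rejection_sample(10000)
      print(sum(xs) / len(xs))           # sample mean of the accepted draws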

  20. UMAPRM: Uniformly sampling the medial axis

    KAUST Repository

    Yeh, Hsin-Yi Cindy

    2014-05-01

    © 2014 IEEE. Maintaining clearance, or distance from obstacles, is a vital component of successful motion planning algorithms. Maintaining high clearance often creates safer paths for robots. Contemporary sampling-based planning algorithms that utilize the medial axis, or the set of all points equidistant to two or more obstacles, produce higher clearance paths. However, they are biased heavily toward certain portions of the medial axis, sometimes ignoring parts critical to planning, e.g., specific types of narrow passages. We introduce Uniform Medial Axis Probabilistic RoadMap (UMAPRM), a novel planning variant that generates samples uniformly on the medial axis of the free portion of C-space. We theoretically analyze the distribution generated by UMAPRM and show its uniformity. Our results show that UMAPRM's distribution of samples along the medial axis is not only uniform but also preferable to other medial axis samplers in certain planning problems. We demonstrate that UMAPRM has negligible computational overhead over other sampling techniques and can solve problems the others could not, e.g., a bug trap. Finally, we demonstrate UMAPRM successfully generates higher clearance paths in the examples.

  1. Sample preparation for arsenic speciation analysis in baby food by generation of substituted arsines with atomic absorption spectrometry detection.

    Science.gov (United States)

    Huber, Charles S; Vale, Maria Goreti R; Dessuy, Morgana B; Svoboda, Milan; Musil, Stanislav; Dědina, Jiři

    2017-12-01

    A slurry sampling procedure for arsenic speciation analysis in baby food by arsane generation, cryogenic trapping and detection with atomic absorption spectrometry is presented. Several procedures were tested for slurry preparation, including different reagents (HNO3, HCl and tetramethylammonium hydroxide - TMAH) and their concentrations, water bath heating and ultrasound-assisted agitation. The best results for inorganic arsenic (iAs) and dimethylarsinate (DMA) were reached when using 3 mol L-1 HCl under heating and ultrasound-assisted agitation. The developed method was applied for the analysis of five porridge powder and six baby meal samples. The trueness of the method was checked with a certified reference material (CRM) of total arsenic (tAs), iAs and DMA in rice (ERM-BC211). Arsenic recoveries (mass balance) for all samples and CRM were performed by the determination of the tAs by inductively coupled plasma mass spectrometry (ICP-MS) after microwave-assisted digestion and its comparison against the sum of the results from the speciation analysis. The relative limits of detection were 0.44, 0.24 and 0.16 µg kg-1 for iAs, methylarsonate and DMA, respectively. The concentrations of the most toxic arsenic species (iAs) in the analyzed baby food samples ranged between 4.2 and 99 µg kg-1, which were below the limits of 300, 200 and 100 µg kg-1 set by the Brazilian, Chinese and European legislation, respectively. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Genome-wide analysis in Brazilian Xavante Indians reveals low degree of admixture.

    Science.gov (United States)

    Kuhn, Patricia C; Horimoto, Andréa R V Russo; Sanches, José Maurício; Vieira Filho, João Paulo B; Franco, Luciana; Fabbro, Amaury Dal; Franco, Laercio Joel; Pereira, Alexandre C; Moises, Regina S

    2012-01-01

    Characterization of population genetic variation and structure can be used as a tool for research in human genetics, and population isolates are of great interest. The aim of the present study was to characterize the genetic structure of Xavante Indians and compare it with other populations. The Xavante, an indigenous population living in the Brazilian Central Plateau, is one of the largest native groups in Brazil. A subset of 53 unrelated subjects was selected from the initial sample of 300 Xavante Indians. Using 86,197 markers, the Xavante were compared with all populations of the HapMap Phase III and HGDP-CEPH projects and with a Southeast Brazilian population sample to establish its population structure. Principal Components Analysis showed that the Xavante Indians are concentrated in the Amerindian axis near other populations of known Amerindian ancestry such as the Karitiana, Pima, Surui and Maya, and a low degree of genetic admixture was observed. This is consistent with historical records of bottleneck events and cultural isolation. By calculating pair-wise F(st) statistics, we characterized the genetic differentiation between Xavante Indians and representative populations from the HapMap and HGDP-CEPH projects. We found that the genetic differentiation between Xavante Indians and populations of Amerindian, Asian, European, and African ancestry increased progressively. Our results indicate that the Xavante is a population that has remained genetically isolated over the past decades and can offer advantages for genome-wide mapping studies of inherited disorders.
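
    The Principal Components Analysis step mentioned above amounts to a singular value decomposition of a centred and scaled genotype matrix. The following is a hedged sketch on synthetic genotypes (two populations, 500 SNPs; the real study used 86,197 markers), not the authors' pipeline.

      import numpy as np

      rng = np.random.default_rng(0)
      # Two synthetic populations with shifted allele frequencies at 500 SNPs.
      freq_a = rng.uniform(0.1, 0.9, 500)
      freq_b = np.clip(freq_a + rng.normal(0, 0.2, 500), 0.05, 0.95)
      geno = np.vstack([rng.binomial(2, freq_a, (30, 500)),
                        rng.binomial(2, freq_b, (30, 500))]).astype(float)

      # Centre and scale each SNP, then take the SVD of the genotype matrix.
      p = geno.mean(axis=0) / 2.0
      std = np.sqrt(2 * p * (1 - p))
      z = (geno - 2 * p) / np.where(std > 0, std, 1.0)
      u, s, vt = np.linalg.svd(z, full_matrices=False)
      pcs = u[:, :2] * s[:2]    # each individual's coordinates on PC1/PC2
      print(pcs[:5])            # the two groups separate along PC1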

  3. An 'intelligent' approach to radioimmunoassay sample counting employing a microprocessor-controlled sample counter

    International Nuclear Information System (INIS)

    Ekins, R.P.; Sufi, S.; Malan, P.G.

    1978-01-01

    The enormous impact on medical science in the last two decades of microanalytical techniques employing radioisotopic labels has, in turn, generated a large demand for automatic radioisotopic sample counters. Such instruments frequently comprise the most important item of capital equipment required in the use of radioimmunoassay and related techniques and often form a principal bottleneck in the flow of samples through a busy laboratory. It is therefore imperative that such instruments should be used 'intelligently' and in an optimal fashion to avoid both the very large capital expenditure involved in the unnecessary proliferation of instruments and the time delays arising from their sub-optimal use. Most of the current generation of radioactive sample counters nevertheless rely on primitive control mechanisms based on a simplistic statistical theory of radioactive sample counting which preclude their efficient and rational use. The fundamental principle upon which this approach is based is that it is useless to continue counting a radioactive sample for a time longer than that required to yield a significant increase in precision of the measurement. Thus, since substantial experimental errors occur during sample preparation, these errors should be assessed and must be related to the counting errors for that sample. The objective of the paper is to demonstrate that the combination of a realistic statistical assessment of radioactive sample measurement, together with the more sophisticated control mechanisms that modern microprocessor technology makes possible, may often enable savings in counter usage of the order of 5- to 10-fold to be made. (author)

  4. Monolith Chromatography as Sample Preparation Step in Virome Studies of Water Samples.

    Science.gov (United States)

    Gutiérrez-Aguirre, Ion; Kutnjak, Denis; Rački, Nejc; Rupar, Matevž; Ravnikar, Maja

    2018-01-01

    Viruses exist in aquatic media and many of them use this medium as a transmission route. Next-generation sequencing (NGS) technologies have opened new doors in virus research, also allowing a hidden diversity of viral species in aquatic environments to be revealed. Not surprisingly, many of the newly discovered viruses are found in environmental fresh and marine waters. One of the problems in virome research can be the low amount of viral nucleic acids present in the sample in contrast to the background nucleic acids (host, eukaryotic, prokaryotic, environmental). Therefore, virus enrichment prior to NGS is necessary in many cases. In water samples, an added problem resides in the low concentration of viruses typically present in aquatic media. Different concentration strategies have been used to overcome such limitations. CIM monoliths are a new generation of chromatographic supports that, due to their particular structural characteristics, are very efficient in the concentration and purification of viruses. In this chapter, we describe the use of CIM monolithic chromatography as a sample preparation step in NGS studies targeting viruses in fresh or marine water. The step-by-step protocol includes a case study where CIM concentration was used to study the virome of a wastewater sample using NGS.

  5. Device for sampling HTGR recycle fuel particles

    International Nuclear Information System (INIS)

    Suchomel, R.R.; Lackey, W.J.

    1977-03-01

    Devices for sampling High-Temperature Gas-Cooled Reactor fuel microspheres were evaluated. Analyses of samples obtained with each of two specially designed passive samplers were compared with data generated by more common techniques. A ten-stage two-way sampler was found to produce a representative sample with a constant batch-to-sample ratio

  6. System for Earth Sample Registration SESAR: Services for IGSN Registration and Sample Metadata Management

    Science.gov (United States)

    Chan, S.; Lehnert, K. A.; Coleman, R. J.

    2011-12-01

    SESAR, the System for Earth Sample Registration, is an online registry for physical samples collected for Earth and environmental studies. SESAR generates and administers the International Geo Sample Number (IGSN), a unique identifier for samples that is dramatically advancing interoperability amongst information systems for sample-based data. SESAR was developed to provide the complete range of registry services, including definition of IGSN syntax and metadata profiles, registration and validation of name spaces requested by users, tools for users to submit and manage sample metadata, validation of submitted metadata, generation and validation of the unique identifiers, archiving of sample metadata, and public or private access to the sample metadata catalog. With the development of SESAR v3, we placed particular emphasis on creating enhanced tools that make metadata submission easier and more efficient for users, and that provide superior functionality for users to manage metadata of their samples in their private workspace, MySESAR. For example, SESAR v3 includes a module where users can generate custom spreadsheet templates to enter metadata for their samples, then upload these templates online for sample registration. Once the content of the template is uploaded, it is displayed online in an editable grid format. Validation rules are executed in real time on the grid data to ensure data integrity. Other new features of SESAR v3 include the capability to transfer ownership of samples to other SESAR users, the ability to upload and store images and other files in a sample metadata profile, and the tracking of changes to sample metadata profiles. In the next version of SESAR (v3.5), we will further improve the discovery, sharing, and registration of samples. For example, we are developing a more comprehensive suite of web services that will allow discovery and registration access to SESAR from external systems. Both batch and individual registrations will be possible.

  7. Revisiting sample size: are big trials the answer?

    Science.gov (United States)

    Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J

    2012-07-18

    The superiority of the evidence generated in randomized controlled trials over observational data is not conditional on randomization alone. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, the probability that a trial will detect a difference when a real difference between treatments exists, depends strongly on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by a limited sample size. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.
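
    As a concrete illustration of how strongly power-driven sample size depends on effect size, the standard normal-approximation formula n = 2*((z_alpha/2 + z_power)/d)^2 per arm can be evaluated directly. The effect sizes below are illustrative, not taken from the paper.

      from statistics import NormalDist

      def n_per_arm(d, alpha=0.05, power=0.80):
          # Per-arm sample size for a two-sided two-sample comparison of means
          # with standardized effect size d (normal approximation).
          z_a = NormalDist().inv_cdf(1 - alpha / 2)
          z_b = NormalDist().inv_cdf(power)
          return 2 * ((z_a + z_b) / d) ** 2

      for d in (0.2, 0.5, 0.8):
          print(f"effect size {d}: n per arm = {n_per_arm(d):.0f}")
      # A small effect (d = 0.2) needs roughly 400 patients per arm, which is
      # one reason underpowered trials are so common.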

  8. Genes with stable DNA methylation levels show higher evolutionary conservation than genes with fluctuant DNA methylation levels.

    Science.gov (United States)

    Zhang, Ruijie; Lv, Wenhua; Luan, Meiwei; Zheng, Jiajia; Shi, Miao; Zhu, Hongjie; Li, Jin; Lv, Hongchao; Zhang, Mingming; Shang, Zhenwei; Duan, Lian; Jiang, Yongshuai

    2015-11-24

    Different human genes often exhibit different degrees of stability in their DNA methylation levels between tissues, samples or cell types. This may be related to the evolution of the human genome. Thus, we compared the evolutionary conservation between two types of genes: genes with stable DNA methylation levels (SM genes) and genes with fluctuant DNA methylation levels (FM genes). For long-term evolutionary characteristics between species, we compared the percentage of orthologous genes, the evolutionary rate dN/dS and protein sequence identity. We found that the SM genes had greater percentages of orthologous genes, lower dN/dS, and higher protein sequence identities in all 21 species. These results indicated that the SM genes were more evolutionarily conserved than the FM genes. For short-term evolutionary characteristics among human populations, we compared the single nucleotide polymorphism (SNP) density and the linkage disequilibrium (LD) degree in HapMap populations and 1000 Genomes Project populations. We observed that the SM genes had lower SNP densities and higher degrees of LD in all 11 HapMap populations and 13 1000 Genomes Project populations. These results mean that the SM genes had more stable genetic structures and were more conserved than the FM genes.

  9. TRU Waste Sampling Program: Volume I. Waste characterization

    International Nuclear Information System (INIS)

    Clements, T.L. Jr.; Kudera, D.E.

    1985-09-01

    Volume I of the TRU Waste Sampling Program report presents the waste characterization information obtained from sampling and characterizing various aged transuranic wastes retrieved from storage at the Idaho National Engineering Laboratory and the Los Alamos National Laboratory. The data contained in this report include the results of gas sampling and gas generation studies, radiographic examinations, visual examinations of the waste, and assessments of waste compliance with the Waste Isolation Pilot Plant Waste Acceptance Criteria (WIPP-WAC). A separate report, Volume II, contains data from the gas generation studies

  10. Automatic remote sampling and delivery system incorporating decontamination and disposal of sample bottles

    International Nuclear Information System (INIS)

    Savarkar, V.K.; Mishra, A.K.; Bajpai, D.D.; Nair, M.K.T.

    1990-01-01

    The present generation of reprocessing plants has sampling and delivery systems that have to be operated manually, with the associated problems. Complete automation and remotisation of the sampling system has therefore been considered to reduce manual intervention and personnel exposure. As part of this scheme, an attempt has been made to automate and remotise various steps in the sampling system. This paper discusses in detail the development work carried out in this area as well as the tests conducted to incorporate the same in the existing plants. (author). 3 figs

  11. Application of hydrocyanic acid vapor generation via focused microwave radiation to the preparation of industrial effluent samples prior to free and total cyanide determinations by spectrophotometric flow injection analysis.

    Science.gov (United States)

    Quaresma, Maria Cristina Baptista; de Carvalho, Maria de Fátima Batista; Meirelles, Francis Assis; Santiago, Vânia Maria Junqueira; Santelli, Ricardo Erthal

    2007-02-01

    A sample preparation procedure for the quantitative determination of free and total cyanides in industrial effluents has been developed that involves hydrocyanic acid vapor generation via focused microwave radiation. Hydrocyanic acid vapor was generated from free cyanides using only 5 min of irradiation time (90 W power) and a purge time of 5 min. The HCN generated was absorbed into an accepting NaOH solution using a very simple glassware apparatus that was appropriate for the microwave oven cavity. After that, the cyanide concentration was determined within 90 s using a well-known spectrophotometric flow injection analysis system. Total cyanide analysis required 15 min irradiation time (90 W power), as well as chemical conditions such as the presence of EDTA-acetate buffer solution or ascorbic acid, depending on the effluent to be analyzed (petroleum refinery or electroplating effluents, respectively). The detection limit was 0.018 mg CN l(-1) (quantification limit of 0.05 mg CN l(-1)), and the measured RSD was better than 8% for ten independent analyses of effluent samples (1.4 mg l(-1) cyanide). The accuracy of the procedure was assessed via analyte spiking (with free and complex cyanides) and by performing an independent sample analysis based on the standard methodology recommended by the APHA for comparison. The sample preparation procedure takes only 10 min for free cyanide and 20 min for total cyanide, making it much faster than traditional methodologies (conventional heating and distillation), which are time-consuming (they require at least 1 h). Samples from oil (sour and stripping tower bottom waters) and electroplating effluents were analyzed successfully.

  12. SAAS-CNV: A Joint Segmentation Approach on Aggregated and Allele Specific Signals for the Identification of Somatic Copy Number Alterations with Next-Generation Sequencing Data.

    Science.gov (United States)

    Zhang, Zhongyang; Hao, Ke

    2015-11-01

    Cancer genomes exhibit profound somatic copy number alterations (SCNAs). Studying tumor SCNAs using massively parallel sequencing provides unprecedented resolution and, meanwhile, gives rise to new challenges in data analysis, complicated by tumor aneuploidy and heterogeneity as well as normal cell contamination. While the majority of read-depth-based methods utilize total sequencing depth alone for SCNA inference, the allele-specific signals are undervalued. We proposed a joint segmentation and inference approach using both signals to meet some of the challenges. Our method consists of four major steps: 1) extracting the read depth supporting reference and alternative alleles at each SNP/Indel locus and comparing the total read depth and alternative allele proportion between the tumor and matched normal sample; 2) performing joint segmentation on the two signal dimensions; 3) correcting the copy number baseline from which the SCNA state is determined; 4) calling the SCNA state for each segment based on both signal dimensions. The method is applicable to whole exome/genome sequencing (WES/WGS) as well as SNP array data in a tumor-control study. We applied the method to a dataset containing no SCNAs to test the specificity, created by pairing sequencing replicates of a single HapMap sample as normal/tumor pairs, as well as to a large-scale WGS dataset consisting of 88 liver tumors along with adjacent normal tissues. Compared with representative methods, our method demonstrated improved accuracy, scalability to large cancer studies, capability in handling both sequencing and SNP array data, and the potential to improve the estimation of tumor ploidy and purity.
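
    Step 1 can be illustrated with a toy computation of the two signal dimensions: the log ratio of total read depth and the shift in alternative-allele proportion between tumor and normal. This is a sketch on synthetic depths with an invented one-copy gain, not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200
      normal_ref = rng.poisson(30, n); normal_alt = rng.poisson(30, n)
      tumor_ref = rng.poisson(30, n);  tumor_alt = rng.poisson(30, n)
      # Inject a gain of the alternative-allele haplotype over loci 100..150.
      tumor_alt[100:150] = rng.poisson(60, 50)

      t_total = tumor_ref + tumor_alt
      n_total = normal_ref + normal_alt
      log_ratio = np.log2(t_total / n_total)                  # aggregated signal
      baf_shift = tumor_alt / t_total - normal_alt / n_total  # allele-specific signal

      # A real implementation would now run joint segmentation on the pair
      # (log_ratio, baf_shift); here we simply inspect the altered region.
      print(log_ratio[100:150].mean(), baf_shift[100:150].mean())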

  13. In situ sampling cart development engineering task plan

    International Nuclear Information System (INIS)

    DeFord, D.K.

    1995-01-01

    This Engineering Task Plan (ETP) supports the development, for facility use, of the next-generation in situ sampling system for characterization of tank vapors. In situ sampling refers to placing sample collection devices (primarily sorbent tubes) directly into the tank headspace, then drawing tank gases through the collection devices to obtain samples. The current in situ sampling system is functional but was not designed to provide the accurate flow measurement required by today's data quality objectives (DQOs) for vapor characterization. The new system will incorporate modern instrumentation to achieve much tighter control. The next-generation system will be referred to in this ETP as the New In Situ System (NISS) or New System. The report describes the current sampling system and the modifications that are required for improved accuracy

  14. Sample Preprocessing For Atomic Spectrometry

    International Nuclear Information System (INIS)

    Kim, Sun Tae

    2004-08-01

    This book describes sample preprocessing for atomic spectrometry. It covers atomic absorption spectrometry, including the Maxwell-Boltzmann equation and the Beer-Lambert law, solvent extraction, HGAAS, ETAAS, and CVAAS; inductively coupled plasma optical emission spectrometry, including basic principles, plasma generation, instrumentation, and interferences; and inductively coupled plasma mass spectrometry, including instrumentation, the pros and cons of ICP-MS, and sample analysis. It also discusses reagents (water, acids, and fluxes), experimental materials, sampling and sample decomposition, and contamination and losses in open and closed systems.

  15. Cloud point extraction for trace inorganic arsenic speciation analysis in water samples by hydride generation atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Li, Shan, E-mail: ls_tuzi@163.com; Wang, Mei, E-mail: wmei02@163.com; Zhong, Yizhou, E-mail: yizhz@21cn.com; Zhang, Zehua, E-mail: kazuki.0101@aliyun.com; Yang, Bingyi, E-mail: e_yby@163.com

    2015-09-01

    A new cloud point extraction technique was established and used for the determination of trace inorganic arsenic species in water samples combined with hydride generation atomic fluorescence spectrometry (HGAFS). As(III) and As(V) were complexed with ammonium pyrrolidinedithiocarbamate and molybdate, respectively. The complexes were quantitatively extracted with the non-ionic surfactant (Triton X-114) by centrifugation. After addition of antifoam, the surfactant-rich phase containing As(III) was diluted with 5% HCl for HGAFS determination. For As(V) determination, 50% HCl was added to the surfactant-rich phase, and the mixture was placed in an ultrasonic bath at 70 °C for 30 min. As(V) was reduced to As(III) with thiourea–ascorbic acid solution, followed by HGAFS. Under the optimum conditions, limits of detection of 0.009 and 0.012 μg/L were obtained for As(III) and As(V), respectively. Concentration factors of 9.3 and 7.9, respectively, were obtained for a 50 mL sample. The precisions were 2.1% for As(III) and 2.3% for As(V). The proposed method was successfully used for the determination of trace As(III) and As(V) in water samples, with satisfactory recoveries. - Highlights: • Cloud point extraction was established for the first time to determine trace inorganic arsenic (As) species in combination with HGAFS. • Separate As(III) and As(V) determinations improve the accuracy. • Ultrasonic release of complexed As(V) enables complete As(V) reduction to As(III). • Direct HGAFS analysis can be performed.

  16. The CFTR Met 470 allele is associated with lower birth rates in fertile men from a population isolate.

    Directory of Open Access Journals (Sweden)

    Gülüm Kosova

    2010-06-01

    Although little is known about the role of the cystic fibrosis transmembrane regulator (CFTR) gene in reproductive physiology, numerous variants in this gene have been implicated in the etiology of male infertility due to congenital bilateral absence of the vas deferens (CBAVD). Here, we studied the fertility effects of three CBAVD-associated CFTR polymorphisms, the (TG)m and polyT repeat polymorphisms in intron 8 and Met470Val in exon 10, in healthy men of European descent. Homozygosity for the Met470 allele was associated with lower birth rates, defined as the number of births per year of marriage (P = 0.0029). The Met470Val locus explained 4.36% of the phenotypic variance in birth rate, and men homozygous for the Met470 allele had 0.56 fewer children on average compared to Val470 carrier men. The derived Val470 allele occurs at high frequencies in non-African populations (allele frequency = 0.51 in HapMap CEU), whereas it is very rare in African populations (Fst = 0.43 between HapMap CEU and YRI). In addition, haplotypes bearing Val470 show a lack of genetic diversity and are thus longer than haplotypes bearing Met470 (measured by an integrated haplotype score [iHS] of -1.93 in HapMap CEU). The fraction of SNPs in the HapMap Phase 2 data set with more extreme Fst and iHS measures is 0.003, consistent with a selective sweep outside of Africa. The fertility advantage conferred by Val470 relative to Met470 may provide a selective mechanism for these population genetic observations.
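
    For readers unfamiliar with the Fst values quoted above, a per-SNP Fst in Wright's form (Ht - Hs)/Ht can be computed directly from two populations' allele frequencies. The frequencies in this sketch are invented for illustration, not the actual Val470 values.

      def fst(p1, p2):
          # Wright's Fst for one biallelic SNP from two sample allele frequencies.
          p_bar = (p1 + p2) / 2.0
          ht = 2 * p_bar * (1 - p_bar)                          # total heterozygosity
          hs = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2.0    # mean within-population
          return (ht - hs) / ht

      # A strongly differentiated SNP (common in one population, rare in the other).
      print(fst(0.51, 0.02))    # about 0.31, i.e. of the same order as 0.43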

  17. Cerium Oxide Nanoparticle Nose-Only Inhalation Exposures Using a Low-Sample-Consumption String Generator

    Science.gov (United States)

    There is a critical need to assess the health effects associated with exposure to commercially produced NPs across the size ranges reflective of those detected in the industrial sectors that are generating, as well as incorporating, NPs into products. Generation of stable and low ...

  18. Operational air sampling report

    International Nuclear Information System (INIS)

    Lyons, C.L.

    1994-03-01

    Nevada Test Site vertical shaft and tunnel events generate beta/gamma fission products. The REECo air sampling program is designed to measure these radionuclides at various facilities supporting these events. The current testing moratorium and closure of the Decontamination Facility have decreased the scope of the program significantly. Of the 118 air samples collected in the only active tunnel complex, only one showed any airborne fission products. Tritiated water vapor concentrations were very similar to previously reported levels. The 206 air samples collected at the Area-6 decontamination bays and laundry were again well below any Derived Air Concentration (DAC) standard. Laboratory analyses of these samples were negative for any airborne fission products

  19. Next-Generation Sequencing Workflow for NSCLC Critical Samples Using a Targeted Sequencing Approach by Ion Torrent PGM™ Platform.

    Science.gov (United States)

    Vanni, Irene; Coco, Simona; Truini, Anna; Rusmini, Marta; Dal Bello, Maria Giovanna; Alama, Angela; Banelli, Barbara; Mora, Marco; Rijavec, Erika; Barletta, Giulia; Genova, Carlo; Biello, Federica; Maggioni, Claudia; Grossi, Francesco

    2015-12-03

    Next-generation sequencing (NGS) is a cost-effective technology capable of screening several genes simultaneously; however, its application in a clinical context requires an established workflow to acquire reliable sequencing results. Here, we report an optimized NGS workflow analyzing 22 lung cancer-related genes to sequence critical samples such as DNA from formalin-fixed paraffin-embedded (FFPE) blocks and circulating free DNA (cfDNA). Snap-frozen and matched FFPE gDNA from 12 non-small cell lung cancer (NSCLC) patients, whose gDNA fragmentation status was previously evaluated using a multiplex PCR-based quality control, were successfully sequenced with Ion Torrent PGM™. The robust bioinformatic pipeline allowed us to correctly call both Single Nucleotide Variants (SNVs) and indels with a detection limit of 5%, achieving 100% specificity and 96% sensitivity. This workflow was also validated in 13 FFPE NSCLC biopsies. Furthermore, a specific protocol for low-input gDNA, capable of producing good sequencing data with high coverage, high uniformity, and a low error rate, was also optimized. In conclusion, we demonstrate the feasibility of obtaining gDNA from FFPE samples suitable for NGS by performing appropriate quality controls. The optimized workflow, capable of screening low-input gDNA, highlights NGS as a potential tool in the detection, disease monitoring, and treatment of NSCLC.

  20. Creating Turbulent Flow Realizations with Generative Adversarial Networks

    Science.gov (United States)

    King, Ryan; Graf, Peter; Chertkov, Michael

    2017-11-01

    Generating valid inflow conditions is a crucial, yet computationally expensive, step in unsteady turbulent flow simulations. We demonstrate a new technique for rapid generation of turbulent inflow realizations that leverages recent advances in machine learning for image generation using a deep convolutional generative adversarial network (DCGAN). The DCGAN is an unsupervised machine learning technique consisting of two competing neural networks that are trained against each other using backpropagation. One network, the generator, tries to produce samples from the true distribution of states, while the discriminator tries to distinguish between true and synthetic samples. We present results from a fully-trained DCGAN that is able to rapidly draw random samples from the full distribution of possible inflow states without needing to solve the Navier-Stokes equations, eliminating the costly process of spinning up inflow turbulence. This suggests a new paradigm in physics informed machine learning where the turbulence physics can be encoded in either the discriminator or generator. Finally, we also propose additional applications such as feature identification and subgrid scale modeling.
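
    The adversarial setup described above can be sketched in a few dozen lines. The toy below uses PyTorch with fully connected networks and synthetic 1-D "flow" samples; the convolutional architecture and real turbulence data of the study are assumed away for brevity.

      import torch
      import torch.nn as nn

      dim, z_dim = 64, 16
      G = nn.Sequential(nn.Linear(z_dim, 128), nn.ReLU(), nn.Linear(128, dim))
      D = nn.Sequential(nn.Linear(dim, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))
      opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
      opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
      bce = nn.BCEWithLogitsLoss()

      def real_batch(n):
          # Stand-in for true inflow realizations: noisy sinusoids with random phase.
          x = torch.linspace(0.0, 6.28, dim)
          phase = torch.rand(n, 1) * 6.28
          return torch.sin(x + phase) + 0.1 * torch.randn(n, dim)

      for step in range(2000):
          real = real_batch(64)
          fake = G(torch.randn(64, z_dim))
          # Discriminator: distinguish true samples from synthetic ones.
          loss_d = (bce(D(real), torch.ones(64, 1))
                    + bce(D(fake.detach()), torch.zeros(64, 1)))
          opt_d.zero_grad(); loss_d.backward(); opt_d.step()
          # Generator: try to fool the discriminator.
          loss_g = bce(D(fake), torch.ones(64, 1))
          opt_g.zero_grad(); loss_g.backward(); opt_g.step()

      # After training, new "inflow" realizations are drawn without any solver:
      samples = G(torch.randn(10, z_dim)).detach()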

  1. Wideband 4-diode sampling circuit

    Science.gov (United States)

    Wojtulewicz, Andrzej; Radtke, Maciej

    2016-09-01

    The objective of this work was to develop a wide-band sampling circuit. The device should have the ability to collect samples of a very fast signal applied to its input, amplify them, and prepare them for further processing. The study emphasizes the method of sampling pulse shaping. The use of an ultrafast pulse generator allows sampling of signals with a wide frequency spectrum, reaching several gigahertz. The device uses a pulse transformer to prepare symmetrical pulses. Their final shape is formed with the help of the step recovery diode, two coplanar strips and a Schottky diode. The completed device can be used in a sampling oscilloscope, as well as in other measurement systems.

  2. Surry steam generator - examination and evaluation

    International Nuclear Information System (INIS)

    Clark, R.A.; Doctor, P.G.; Ferris, R.H.

    1985-10-01

    This report summarizes research conducted during the fourth year of the five year Steam Generator Group Project. During this period the project conducted numerous nondestructive examination (NDE) round robin inspections of the original Surry 2A steam generator. They included data acquisition/analysis and analysis-only round robins using multifrequency bobbin coil eddy current tests. In addition, the generator was nondestructively examined by alternate or advanced techniques including ultrasonics, optical fiber, profilometry and special eddy current instrumentation. The round robin interpretation data were compared. To validate the NDE results and for tube integrity testing, a selection of tubing samples, determined to be representative of the generator, was designated for removal. Initial sample removals from the generator included three sections of tube sheet, two sections of support plate and encompassed tubes, and a number of straight and U-bend tubing sections. Metallographic examination of these sections was initiated. Details of significant results are presented in the following paper. 13 figs

  3. Surry steam generator - examination and evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Clark, R A; Doctor, P G; Ferris, R H

    1987-01-01

    This report summarizes research conducted during the fourth year of the five year Steam Generator Group Project. During this period the project conducted numerous nondestructive examination (NDE) round robin inspections of the original Surry 2A steam generator. They included data acquisition/analysis and analysis-only round robins using multifrequency bobbin coil eddy current tests. In addition, the generator was nondestructively examined by alternate or advanced techniques including ultrasonics, optical fiber, profilometry and special eddy current instrumentation. The round robin interpretation data were compared. To validate the NDE results and for tube integrity testing, a selection of tubing samples, determined to be representative of the generator, was designated for removal. Initial sample removals from the generator included three sections of tube sheet, two sections of support plate and encompassed tubes, and a number of straight and U-bend tubing sections. Metallographic examination of these sections was initiated. Details of significant results are presented in the following paper.

  4. HAMMER: Reweighting tool for simulated data samples

    CERN Document Server

    Duell, Stephan; Ligeti, Zoltan; Papucci, Michele; Robinson, Dean

    2016-01-01

    Modern flavour physics experiments, such as Belle II or LHCb, require large samples of generated Monte Carlo events. Monte Carlo events are often processed in a sophisticated chain that includes a simulation of the detector response. The generation and reconstruction of large samples is resource-intensive and in principle would need to be repeated if, e.g., parameters of the underlying models change due to new measurements or new insights. To avoid having to regenerate large samples, we work on a tool, the Helicity Amplitude Module for Matrix Element Reweighting (HAMMER), which allows one to easily reweight existing events in the context of semileptonic b → q ℓ ν̄ analyses to new model parameters or new physics scenarios.
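
    The underlying idea, reweighting stored events by the ratio of the new model density to the generation-time density, can be shown on a toy example. The exponential "decay model" below is purely illustrative and says nothing about HAMMER's actual amplitude-level implementation or API.

      import math
      import random

      gen_tau = 1.5                        # lifetime used when the sample was made
      events = [random.expovariate(1 / gen_tau) for _ in range(100000)]

      def weight(t, new_tau):
          # Per-event weight = (density under new parameters) / (generation density).
          p_new = math.exp(-t / new_tau) / new_tau
          p_gen = math.exp(-t / gen_tau) / gen_tau
          return p_new / p_gen

      new_tau = 1.2
      w = [weight(t, new_tau) for t in events]
      mean_t = sum(wi * t for wi, t in zip(w, events)) / sum(w)
      print(mean_t)   # about 1.2: the reweighted sample behaves like the new model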

  5. Improvements to robotics-inspired conformational sampling in rosetta.

    Directory of Open Access Journals (Sweden)

    Amelie Stein

    To accurately predict protein conformations in atomic detail, a computational method must be capable of sampling models sufficiently close to the native structure. All-atom sampling is difficult because of the vast number of possible conformations and extremely rugged energy landscapes. Here, we test three sampling strategies to address these difficulties: conformational diversification, intensification of torsion- and omega-angle sampling, and parameter annealing. We evaluate these strategies in the context of the robotics-based kinematic closure (KIC) method for local conformational sampling in Rosetta on an established benchmark set of 45 12-residue protein segments without regular secondary structure. We quantify performance as the fraction of sub-Angstrom models generated. While improvements with individual strategies are only modest, the combination of intensification and annealing strategies into a new "next-generation KIC" method yields a four-fold increase over standard KIC in the median percentage of sub-Angstrom models across the dataset. Such improvements enable progress on more difficult problems, as demonstrated on longer segments, several of which could not be accurately remodeled with previous methods. Given its improved sampling capability, next-generation KIC should allow advances in other applications such as local conformational remodeling of multiple segments simultaneously, flexible backbone sequence design, and development of more accurate energy functions.

  6. Determination of As(III) and total inorganic As in water samples using an on-line solid phase extraction and flow injection hydride generation atomic absorption spectrometry

    International Nuclear Information System (INIS)

    Sigrist, Mirna; Albertengo, Antonela; Beldomenico, Horacio; Tudino, Mabel

    2011-01-01

    A simple and robust on-line sequential injection system based on solid phase extraction (SPE) coupled to a flow injection hydride generation atomic absorption spectrometer (FI-HGAAS) with a heated quartz tube atomizer (QTA) was developed and optimized for the determination of As(III) in groundwater without any kind of sample pretreatment. The method was based on the selective retention of inorganic As(V), which was carried out by passing the filtered original sample through a cartridge containing a chloride-form strong anion exchanger. Thus the most toxic form, inorganic As(III), was determined fast and directly by AsH3 generation using 3.5 mol L-1 HCl as carrier solution and 0.35% (m/v) NaBH4 in 0.025% NaOH as the reductant. Since the uptake of As(V) may suffer interference from several anions of natural occurrence in waters, the effect of Cl-, SO42-, NO3-, HPO42- and HCO3- on retention was evaluated and discussed. The total soluble inorganic arsenic concentration was determined on aliquots of filtered samples acidified with concentrated HCl and pre-reduced with 5% KI-5% C6H8O6 solution. The concentration of As(V) was calculated as the difference between the total soluble inorganic arsenic and As(III) concentrations. Detection limits (LODs) of 0.5 μg L-1 and 0.6 μg L-1 for As(III) and total inorganic As, respectively, were obtained for a 500 μL sample volume. The obtained limits of detection allowed testing water quality according to national and international regulations. The analytical recovery for water samples spiked with As(III) ranged between 98% and 106%. The sampling throughput for As(III) determination was 60 samples h-1. The device for groundwater sampling was specially designed by the authors. Metallic components were avoided and contact between the sample and atmospheric oxygen was kept to a minimum. On-field arsenic species separation was performed through the employ of a serial connection of membrane filters and

  7. Entropy generation as an environmental impact indicator and a sample application to freshwater ecosystems eutrophication

    International Nuclear Information System (INIS)

    Diaz-Mendez, S.E.; Sierra-Grajeda, J.M.T.; Hernandez-Guerrero, A.; Rodriguez-Lelis, J.M.

    2013-01-01

    Generally speaking, an ecosystem is seen as a complex whole composed of different biotic and abiotic parts. Each part has specific functions related to mass and energy, those functions influence the other parts both directly and indirectly, and all of them are subject to the basic laws of thermodynamics. If each part of the ecosystem is taken as a thermodynamic system, its entropy generation can be evaluated; the total entropy generation of the ecosystem must then be the sum of the entropy generation in each part, in accordance with the Gouy-Stodola theorem. With this in mind, in this work an environmental indicator for any kind of ecosystem is determined as a function of the ratio of the total entropy generation of a reference state (for instance, a healthy forest) to the entropy generation of a new, different state of the same ecosystem (for instance, after deforestation). Thermodynamic concepts are thus applied to study the eutrophication of freshwater ecosystems; the strategy is based on techniques that integrate the assumptions of the entropy generation methodology inside ecosystems. The results show that the smaller the entropy generation with respect to the reference state, the greater the sustainability of the ecosystem. - Highlights: • We estimate an environmental impact indicator using the concept of entropy generation. • It can be a useful tool for assessing the environmental impacts of society on the environment. • It can be a useful tool to compare new technologies and further improve their efficiencies. • It can help provide a better understanding of the concept of entropy and its role among various classes of processes. • It can help to reduce environmental concerns and increase the sustainability of the planet
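
    A minimal numerical sketch of the indicator, assuming invented part-wise entropy generation rates, follows; the Gouy-Stodola additivity over subsystems is reflected in the simple sum over parts.

      def total_entropy_generation(parts):
          # parts: entropy generation rates (W/K) of each subsystem of the ecosystem
          return sum(parts)

      s_gen_reference = total_entropy_generation([0.8, 1.1, 0.6])   # healthy state
      s_gen_perturbed = total_entropy_generation([1.9, 2.4, 1.2])   # e.g. eutrophic

      indicator = s_gen_reference / s_gen_perturbed
      print(f"indicator = {indicator:.2f}")
      # Closer to 1 means the perturbed state generates little extra entropy
      # relative to the reference, i.e. higher sustainability on this measure.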

  8. Robotic system for process sampling

    International Nuclear Information System (INIS)

    Dyches, G.M.

    1985-01-01

    A three-axis Cartesian-geometry robot for process sampling was developed at the Savannah River Laboratory (SRL) and implemented in one of the site radioisotope separations facilities. Use of the robot reduces personnel radiation exposure and contamination potential by routinely handling sample containers under operator control in a low-level radiation area. This robot represents the initial phase of a longer-term development program to use robotics for further sample automation. Preliminary design of a second-generation robot with additional capabilities is also described. 8 figs

  9. Input energy measurement toward warm dense matter generation using intense pulsed power generator

    Science.gov (United States)

    Hayashi, R.; Ito, T.; Ishitani, T.; Tamura, F.; Kudo, T.; Takakura, N.; Kashine, K.; Takahashi, K.; Sasaki, T.; Kikuchi, T.; Harada, Nob.; Jiang, W.; Tokuchi, A.

    2016-05-01

    In order to investigate properties of warm dense matter (WDM) in inertial confinement fusion (ICF), an evaluation method for WDM with isochoric heating on the implosion time-scale using the intense pulsed power generator ETIGO-II (∼1 TW, ∼50 ns) has been considered. In this study, the history of the input energy into the sample is measured from the voltage and current waveforms. To achieve isochoric heating, foamed aluminum with a pore size of 600 μm and 90% porosity was packed into a hollow glass capillary (ø 5 mm × 10 mm). The temperature of the sample is calculated numerically using the measured input power. According to the above measurements, the input energy into a sample and the achievable temperature are estimated to be 300 J and 6000 K, respectively. This indicates that a WDM state can be generated using the proposed method on the ICF implosion time-scale.

  10. Generation and storage of quantum states using cold atoms

    DEFF Research Database (Denmark)

    Dantan, Aurelien Romain; Josse, Vincent; Cviklinski, Jean

    2006-01-01

    Cold cesium or rubidium atomic samples have a good potential both for generation and storage of nonclassical states of light. Generation of nonclassical states of light is possible through the high non-linearity of cold atomic samples excited close to a resonance line. Quadrature squeezing, polar...

  11. Slurry sampling flow injection chemical vapor generation inductively coupled plasma mass spectrometry for the determination of trace Ge, As, Cd, Sb, Hg and Bi in cosmetic lotions

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Wei-Ni [Department of Chemistry, National Sun Yat-sen University, Kaohsiung 80424, Taiwan (China); Jiang, Shiuh-Jen, E-mail: sjjiang@faculty.nsysu.edu.tw [Department of Chemistry, National Sun Yat-sen University, Kaohsiung 80424, Taiwan (China); Department of Medical Laboratory Science and Biotechnology, Kaohsiung Medical University, Kaohsiung 80708, Taiwan (China); Chen, Yen-Ling [Department of Fragrance and Cosmetic Science, Kaohsiung Medical University, Kaohsiung 80708, Taiwan (China); Sahayam, A.C. [National Centre for Compositional Characterisation of Materials (CCCM), Hyderabad (India)

    2015-02-20

    Highlights: • Determination of Ge, As, Cd, Sb, Hg and Bi in cosmetic lotions in a single run. • Accurate analysis using isotope dilution and standard addition methods. • Vapor generation ICP-MS yielded superior detection limits compared to ETV-ICP-MS. • No sample dissolution, increasing sample throughput. • Analysis of GBW09305 Cosmetic (Cream) reference material for accuracy. - Abstract: A slurry sampling inductively coupled plasma mass spectrometry (ICP-MS) method has been developed for the determination of Ge, As, Cd, Sb, Hg and Bi in cosmetic lotions using flow injection (FI) vapor generation (VG) as the sample introduction system. A slurry containing 2% m/v lotion, 2% m/v thiourea, 0.05% m/v L-cysteine, 0.5 μg mL{sup −1} Co(II), 0.1% m/v Triton X-100 and 1.2% v/v HCl was injected into a VG-ICP-MS system for the determination of Ge, As, Cd, Sb, Hg and Bi without dissolution and mineralization. Because the sensitivities of the analytes in the slurry and those in aqueous solution were quite different, an isotope dilution method and a standard addition method were used for the determination. This method has been validated by the determination of Ge, As, Cd, Sb, Hg and Bi in GBW09305 Cosmetic (Cream) reference material. The method was also applied for the determination of Ge, As, Cd, Sb, Hg and Bi in three cosmetic lotion samples obtained locally. The analysis results of the reference material agreed with the certified value and/or ETV-ICP-MS results. The detection limit estimated from the standard addition curve was 0.025, 0.1, 0.2, 0.1, 0.15, and 0.03 ng g{sup −1} for Ge, As, Cd, Sb, Hg and Bi, respectively, in the original cosmetic lotion sample.

  12. Slurry sampling flow injection chemical vapor generation inductively coupled plasma mass spectrometry for the determination of trace Ge, As, Cd, Sb, Hg and Bi in cosmetic lotions

    International Nuclear Information System (INIS)

    Chen, Wei-Ni; Jiang, Shiuh-Jen; Chen, Yen-Ling; Sahayam, A.C.

    2015-01-01

    Highlights: • Determination of Ge, As, Cd, Sb, Hg and Bi in cosmetic lotions in a single run. • Accurate analysis using isotope dilution and standard addition methods. • Vapor generation ICP-MS yielded superior detection limits compared to ETV-ICP-MS. • No sample dissolution, increasing sample throughput. • Analysis of GBW09305 Cosmetic (Cream) reference material for accuracy. - Abstract: A slurry sampling inductively coupled plasma mass spectrometry (ICP-MS) method has been developed for the determination of Ge, As, Cd, Sb, Hg and Bi in cosmetic lotions using flow injection (FI) vapor generation (VG) as the sample introduction system. A slurry containing 2% m/v lotion, 2% m/v thiourea, 0.05% m/v L-cysteine, 0.5 μg mL-1 Co(II), 0.1% m/v Triton X-100 and 1.2% v/v HCl was injected into a VG-ICP-MS system for the determination of Ge, As, Cd, Sb, Hg and Bi without dissolution and mineralization. Because the sensitivities of the analytes in the slurry and those in aqueous solution were quite different, an isotope dilution method and a standard addition method were used for the determination. This method has been validated by the determination of Ge, As, Cd, Sb, Hg and Bi in GBW09305 Cosmetic (Cream) reference material. The method was also applied for the determination of Ge, As, Cd, Sb, Hg and Bi in three cosmetic lotion samples obtained locally. The analysis results of the reference material agreed with the certified value and/or ETV-ICP-MS results. The detection limit estimated from the standard addition curve was 0.025, 0.1, 0.2, 0.1, 0.15, and 0.03 ng g-1 for Ge, As, Cd, Sb, Hg and Bi, respectively, in the original cosmetic lotion sample

  13. On the impact of relatedness on SNP association analysis.

    Science.gov (United States)

    Gross, Arnd; Tönjes, Anke; Scholz, Markus

    2017-12-06

    When testing for SNP (single nucleotide polymorphism) associations in related individuals, observations are not independent. Simple linear regression assuming independent normally distributed residuals results in an increased type I error, and the power of the test is also affected in a more complicated manner. Inflation of type I error is often successfully corrected by genomic control. However, this reduces the power of the test when relatedness is of concern. In the present paper, we derive explicit formulae to investigate how heritability and strength of relatedness contribute to variance inflation of the effect estimate of the linear model. Further, we study the consequences of variance inflation on hypothesis testing and compare the results with those of genomic control correction. We apply the developed theory to the publicly available HapMap trio data (N=129), the Sorbs (a self-contained population with N=977 characterised by a cryptic relatedness structure) and synthetic family studies with different sample sizes (ranging from N=129 to N=999) and different degrees of relatedness. We derive explicit and easy-to-apply approximation formulae to estimate the impact of relatedness on the variance of the effect estimate of the linear regression model. Variance inflation increases with increasing heritability. The relatedness structure also affects the degree of variance inflation, as shown for the example family structures. Variance inflation is smallest for the HapMap trios, followed by a synthetic family study corresponding to the trio data but with a larger sample size than HapMap. The next strongest inflation is observed for the Sorbs, and finally for a synthetic family study with a more extreme relatedness structure but a similar sample size to the Sorbs. Type I error increases rapidly with increasing inflation. However, for smaller significance levels, power increases with increasing inflation while the opposite holds for larger significance levels. When genomic control
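
    The variance inflation discussed above can be reproduced in a small simulation with sib pairs, where both the genotypes and the polygenic residuals are correlated within families. The heritability, family structure and sample size below are illustrative assumptions, not the paper's settings.

      import numpy as np

      rng = np.random.default_rng(2)
      n_pairs, h2, maf = 500, 0.8, 0.3
      n = 2 * n_pairs

      def sib_genotypes():
          # Each parent carries two alleles; each sib inherits one allele from
          # each parent, giving a genotype correlation of 0.5 between sibs.
          mom = rng.binomial(1, maf, (n_pairs, 2))
          dad = rng.binomial(1, maf, (n_pairs, 2))
          pick = lambda par: par[np.arange(n_pairs), rng.integers(0, 2, n_pairs)]
          s1 = pick(mom) + pick(dad)
          s2 = pick(mom) + pick(dad)
          return np.column_stack([s1, s2]).ravel().astype(float)

      betas = []
      for _ in range(4000):
          g = sib_genotypes()                                 # SNP under the null
          shared = np.repeat(rng.normal(size=n_pairs), 2)     # family component
          y = (np.sqrt(0.5 * h2) * shared                     # shared polygenic part
               + np.sqrt(0.5 * h2) * rng.normal(size=n)       # sib-specific part
               + np.sqrt(1 - h2) * rng.normal(size=n))        # environment
          x = g - g.mean()
          betas.append((x @ y) / (x @ x))                     # naive OLS slope
      # Ratio of the empirical variance to the variance expected under
      # independence, 1 / (n * 2 * maf * (1 - maf)); roughly 1 + h2/4 here.
      print(np.var(betas) * n * 2 * maf * (1 - maf))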

  14. Tetrahedral meshing via maximal Poisson-disk sampling

    KAUST Repository

    Guo, Jianwei

    2016-02-15

    In this paper, we propose a simple yet effective method to generate 3D-conforming tetrahedral meshes from closed 2-manifold surfaces. Our approach is inspired by recent work on maximal Poisson-disk sampling (MPS), which can generate well-distributed point sets in arbitrary domains. We first perform MPS on the boundary of the input domain, we then sample the interior of the domain, and we finally extract the tetrahedral mesh from the samples by using 3D Delaunay or regular triangulation for uniform or adaptive sampling, respectively. We also propose an efficient optimization strategy to protect the domain boundaries and to remove slivers to improve the meshing quality. We present various experimental results to illustrate the efficiency and the robustness of our proposed approach. We demonstrate that the performance and quality (e.g., minimal dihedral angle) of our approach are superior to current state-of-the-art optimization-based approaches.
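
    The dart-throwing core of Poisson-disk sampling is compact; the sketch below accepts a candidate point only if it keeps distance r from all accepted points. True maximal Poisson-disk sampling additionally tracks the remaining uncovered area to guarantee maximality, and that bookkeeping is omitted here.

      import random

      def poisson_disk(r, n_attempts=20000, w=1.0, h=1.0):
          # Naive dart throwing in a w-by-h rectangle: keep a candidate only if
          # it is at least r away from every previously accepted point.
          pts = []
          r2 = r * r
          for _ in range(n_attempts):
              x, y = random.uniform(0, w), random.uniform(0, h)
              if all((x - px)**2 + (y - py)**2 >= r2 for px, py in pts):
                  pts.append((x, y))
          return pts

      pts = poisson_disk(0.05)
      print(len(pts), "well-separated samples")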

  15. Optimization of chemical and instrumental parameters in hydride generation laser-induced breakdown spectrometry for the determination of arsenic, antimony, lead and germanium in aqueous samples.

    Science.gov (United States)

    Yeşiller, Semira Unal; Yalçın, Serife

    2013-04-03

    Laser-induced breakdown spectrometry hyphenated with an on-line continuous-flow hydride generation sample introduction system (HG-LIBS) has been used for the determination of arsenic, antimony, lead and germanium in aqueous environments. Optimum chemical and instrumental parameters governing chemical hydride generation, laser plasma formation and detection were investigated for each element under argon and nitrogen atmospheres. Arsenic, antimony and germanium showed strong signal enhancement under an argon atmosphere, while lead showed no sensitivity to the ambient gas type. Detection limits of 1.1 mg L(-1), 1.0 mg L(-1), 1.3 mg L(-1) and 0.2 mg L(-1) were obtained for As, Sb, Pb and Ge, respectively. An up to 77-fold improvement in the detection limit of Pb was obtained compared to direct analysis of liquids by LIBS. Applicability of the technique to real water samples was tested through spiking experiments, and recoveries higher than 80% were obtained. Results demonstrate that the HG-LIBS approach is suitable for quantitative analysis of toxic elements and sufficiently fast for real-time continuous monitoring in aqueous environments. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. Selecting SNPs informative for African, American Indian and European Ancestry: application to the Family Investigation of Nephropathy and Diabetes (FIND).

    Science.gov (United States)

    Williams, Robert C; Elston, Robert C; Kumar, Pankaj; Knowler, William C; Abboud, Hanna E; Adler, Sharon; Bowden, Donald W; Divers, Jasmin; Freedman, Barry I; Igo, Robert P; Ipp, Eli; Iyengar, Sudha K; Kimmel, Paul L; Klag, Michael J; Kohn, Orly; Langefeld, Carl D; Leehey, David J; Nelson, Robert G; Nicholas, Susanne B; Pahl, Madeleine V; Parekh, Rulan S; Rotter, Jerome I; Schelling, Jeffrey R; Sedor, John R; Shah, Vallabh O; Smith, Michael W; Taylor, Kent D; Thameem, Farook; Thornley-Brown, Denyse; Winkler, Cheryl A; Guo, Xiuqing; Zager, Phillip; Hanson, Robert L

    2016-05-04

    The presence of population structure in a sample may confound the search for important genetic loci associated with disease. Our four samples in the Family Investigation of Nephropathy and Diabetes (FIND), European Americans, Mexican Americans, African Americans, and American Indians, are part of a genome-wide association study in which population structure might be particularly important. We therefore decided to study in detail one component of this, individual genetic ancestry (IGA). From SNPs present on the Affymetrix 6.0 Human SNP array, we identified 3 sets of ancestry informative markers (AIMs), each maximized for the information in one of the three contrasts among ancestral populations: Europeans (HAPMAP, CEU), Africans (HAPMAP, YRI and LWK), and Native Americans (full-heritage Pima Indians). We estimate IGA, present an algorithm for their standard errors, compare IGA to principal components, emphasize the importance of balancing information in the AIMs, and test the association of IGA with diabetic nephropathy in the combined sample. A fixed parental allele maximum likelihood algorithm was applied to the FIND to estimate IGA in four samples: 869 American Indians; 1385 African Americans; 1451 Mexican Americans; and 826 European Americans. When the information in the AIMs is unbalanced, the estimates are incorrect and have large errors. Individual genetic admixture is highly correlated with principal components for capturing population structure. It takes ~700 SNPs to reduce the average standard error of individual admixture below 0.01. When the samples are combined, the resulting population structure creates associations between IGA and diabetic nephropathy. The identified set of AIMs, which include American Indian parental allele frequencies, may be particularly useful for estimating genetic admixture in populations from the Americas. Failure to balance information in maximum likelihood, poly-ancestry models creates biased
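
    The flavor of maximum likelihood admixture estimation from AIMs can be conveyed with a toy sketch: given reference allele frequencies for three ancestral populations and an individual's genotypes, maximize the binomial (Hardy-Weinberg) likelihood over the admixture simplex. Everything below (simulated frequencies, the crude grid search, 700 SNPs) is a hypothetical illustration, not the FIND fixed-parental-allele algorithm:

        import numpy as np

        def admixture_loglik(q, geno, freqs):
            """Log-likelihood of genotypes (0/1/2 alternate-allele counts) under
            admixture proportions q, assuming HWE with mixed frequency p = freqs @ q."""
            p = np.clip(freqs @ q, 1e-6, 1 - 1e-6)
            return np.sum(geno * np.log(p) + (2 - geno) * np.log(1 - p))

        def estimate_iga(geno, freqs, step=0.02):
            """Brute-force grid search over the 3-way admixture simplex."""
            best_q, best_ll = None, -np.inf
            for q1 in np.arange(0.0, 1.0 + step, step):
                for q2 in np.arange(0.0, 1.0 - q1 + step, step):
                    q = np.array([q1, q2, max(1.0 - q1 - q2, 0.0)])
                    ll = admixture_loglik(q, geno, freqs)
                    if ll > best_ll:
                        best_q, best_ll = q, ll
            return best_q

        # Hypothetical AIM frequencies (rows: SNPs; columns: three ancestries)
        rng = np.random.default_rng(1)
        freqs = rng.beta(0.5, 0.5, size=(700, 3))
        true_q = np.array([0.1, 0.6, 0.3])
        geno = rng.binomial(2, freqs @ true_q)  # one simulated individual
        print(estimate_iga(geno, freqs))        # approximately true_q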

  17. Characterization plan for the Hanford Generating Plant (HGP)

    International Nuclear Information System (INIS)

    Marske, S.G.

    1996-09-01

    This characterization plan describes the sample collection and sample analysis activities to characterize the Hanford Generating Plant and associated solid waste management units (SWMUs). The analytical data will be used to identify the radiological contamination in the Hanford Generating Plant, as well as the presence of radiological and hazardous materials in the SWMUs, to support further decontamination evaluations for demolition.

  18. Direct inference of SNP heterozygosity rates and resolution of LOH detection.

    Directory of Open Access Journals (Sweden)

    Xiaohong Li

    2007-11-01

    Full Text Available Single nucleotide polymorphisms (SNPs) have been increasingly utilized to investigate somatic genetic abnormalities in premalignancy and cancer. Loss of heterozygosity (LOH) is a common alteration observed during cancer development, and SNP assays have been used to identify LOH at specific chromosomal regions. The design of such studies requires consideration of the resolution for detecting LOH throughout the genome and identification of the number and location of SNPs required to detect genetic alterations in specific genomic regions. Our study evaluated SNP distribution patterns and used probability models, Monte Carlo simulation, and real human subject genotype data to investigate the relationships between the number of SNPs, SNP heterozygosity (HET) rates, and the sensitivity (resolution) for detecting LOH. We report that the variances of SNP heterozygosity rates in dbSNP are high for a large proportion of SNPs. Two statistical methods proposed for directly inferring SNP heterozygosity rates require much smaller sample sizes (intermediate sizes) and are feasible for practical use in SNP selection or verification. Using HapMap data, we showed that a region of LOH greater than 200 kb can be reliably detected, with losses smaller than 50 kb having a substantially lower detection probability when using all SNPs currently in the HapMap database. Higher densities of SNPs may exist in certain local chromosomal regions, providing some opportunities for reliably detecting LOH of segment sizes smaller than 50 kb. These results suggest that the interpretation of results from genome-wide scans for LOH using commercial arrays needs to consider the relationships among inter-SNP distance, detection probability, and sample size for a specific study. New experimental designs for LOH studies would also benefit from considering the power of detection and the sample sizes required to accomplish the proposed aims.
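
    The kind of relationship the study quantifies, between segment size, SNP density, heterozygosity and detection probability, can be explored with a small Monte Carlo sketch of the same flavor. The densities and rates below are arbitrary placeholders, not the HapMap values analyzed in the paper:

        import numpy as np

        def loh_detection_prob(seg_kb, snps_per_mb, het_rate, k=3,
                               trials=20_000, seed=0):
            """Monte Carlo estimate of the probability that a LOH segment of
            seg_kb contains at least k heterozygous (informative) SNPs, with
            SNP positions modeled as a Poisson process along the chromosome."""
            rng = np.random.default_rng(seed)
            lam = snps_per_mb * seg_kb / 1000.0        # expected SNPs in segment
            n_inside = rng.poisson(lam, size=trials)   # SNPs falling in segment
            n_het = rng.binomial(n_inside, het_rate)   # informative ones
            return float(np.mean(n_het >= k))

        for size_kb in (50, 200, 500):
            p = loh_detection_prob(size_kb, snps_per_mb=300, het_rate=0.3)
            print(f"{size_kb} kb: detection probability ~ {p:.3f}")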

  19. A Fault Sample Simulation Approach for Virtual Testability Demonstration Test

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yong; QIU Jing; LIU Guanjun; YANG Peng

    2012-01-01

    Virtual testability demonstration testing has many advantages, such as low cost, high efficiency, low risk and few restrictions, and it brings new requirements to fault sample generation. A fault sample simulation approach for virtual testability demonstration tests, based on stochastic process theory, is proposed. First, the similarities and differences in fault sample generation between physical and virtual testability demonstration tests are discussed. Second, it is pointed out that a fault occurrence process subject to perfect repair is a renewal process. Third, the interarrival time distribution function of the next fault event is given, and the steps and flowcharts of fault sample generation are introduced. The number of faults and their occurrence times are obtained by statistical simulation. Finally, experiments are carried out on a stable tracking platform. Because a variety of life distributions and maintenance modes are considered and some assumptions are removed, the size and structure of the simulated fault samples are closer to actual results and more reasonable. The proposed method can effectively guide fault injection in virtual testability demonstration tests.
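
    A minimal sketch of the renewal-process idea reads as follows: under perfect repair, interarrival times are independent and identically distributed, so fault occurrence times are generated by accumulating draws until the test duration is exhausted. The Weibull distribution and its parameters are illustrative assumptions, not values from the paper:

        import numpy as np

        def simulate_fault_times(scale, shape, t_max, seed=0):
            """Renewal process under perfect repair: i.i.d. Weibull(shape, scale)
            interarrival times; returns fault occurrence times up to t_max."""
            rng = np.random.default_rng(seed)
            times, t = [], 0.0
            while True:
                t += scale * rng.weibull(shape)  # draw the next interarrival time
                if t > t_max:
                    break
                times.append(t)
            return np.array(times)

        faults = simulate_fault_times(scale=150.0, shape=1.4, t_max=1000.0)
        print(len(faults), "faults at times", np.round(faults, 1))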

  20. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, C. M.; Buchhave, P.; K. George, W.

    algorithms: sample-and-hold and the direct spectral estimator without residence time weighting. The computer-generated signal is a Poisson process with a sample rate proportional to velocity magnitude that consists of well-defined frequency content, which makes bias easy to spot. The idea...

  1. Self-Esteem, Locus of Control, College Adjustment, and GPA among First- and Continuing-Generation Students: A Moderator Model of Generational Status

    Science.gov (United States)

    Aspelmeier, Jeffery E.; Love, Michael M.; McGill, Lauren A.; Elliott, Ann N.; Pierce, Thomas W.

    2012-01-01

    The role of generational status (first-generation vs. continuing-generation college students) as a moderator of the relationship between psychological factors and college outcomes was tested to determine whether generational status acts as a risk factor or as a sensitizing factor. The sample consisted of 322 undergraduate students who completed…

  2. Determination of As(III) and total inorganic As in water samples using an on-line solid phase extraction and flow injection hydride generation atomic absorption spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Sigrist, Mirna, E-mail: msigrist@fiq.unl.edu.ar [Laboratorio Central, Facultad de Ingenieria Quimica, Universidad Nacional del Litoral, Santiago del Estero 2654-Piso 6, (3000) Santa Fe (Argentina); Albertengo, Antonela; Beldomenico, Horacio [Laboratorio Central, Facultad de Ingenieria Quimica, Universidad Nacional del Litoral, Santiago del Estero 2654-Piso 6, (3000) Santa Fe (Argentina); Tudino, Mabel [Laboratorio de Analisis de Trazas, Departamento de Quimica Inorganica, Analitica y Quimica Fisica/INQUIMAE, Facultad de Ciencias Exactas y Naturales, Pabellon II, Ciudad Universitaria (1428), Buenos Aires (Argentina)

    2011-04-15

    A simple and robust on-line sequential injection system based on solid phase extraction (SPE) coupled to a flow injection hydride generation atomic absorption spectrometer (FI-HGAAS) with a heated quartz tube atomizer (QTA) was developed and optimized for the determination of As(III) in groundwater without any kind of sample pretreatment. The method was based on the selective retention of inorganic As(V), which was carried out by passing the filtered original sample through a cartridge containing a chloride-form strong anion exchanger. Thus the most toxic form, inorganic As(III), was determined quickly and directly by AsH₃ generation using 3.5 mol L⁻¹ HCl as the carrier solution and 0.35% (m/v) NaBH₄ in 0.025% NaOH as the reductant. Since the uptake of As(V) may be subject to interference from several anions naturally occurring in waters, the effect of Cl⁻, SO₄²⁻, NO₃⁻, HPO₄²⁻ and HCO₃⁻ on retention was evaluated and discussed. The total soluble inorganic arsenic concentration was determined on aliquots of filtered samples acidified with concentrated HCl and pre-reduced with 5% KI-5% C₆H₈O₆ solution. The concentration of As(V) was calculated by difference between the total soluble inorganic arsenic and As(III) concentrations. Detection limits (LODs) of 0.5 µg L⁻¹ and 0.6 µg L⁻¹ for As(III) and total inorganic As, respectively, were obtained for a 500 µL sample volume. The obtained limits of detection allowed testing the water quality according to national and international regulations. The analytical recovery for water samples spiked with As(III) ranged between 98% and 106%. The sampling throughput for As(III) determination was 60 samples h⁻¹. The device for groundwater sampling was specially designed by the authors. Metallic components were avoided and the contact between the sample and atmospheric oxygen was kept to a minimum. On-field arsenic species

  3. Analysis of arsenical metabolites in biological samples.

    Science.gov (United States)

    Hernandez-Zavala, Araceli; Drobna, Zuzana; Styblo, Miroslav; Thomas, David J

    2009-11-01

    Quantitation of iAs and its methylated metabolites in biological samples provides dosimetric information needed to understand dose-response relations. Here, methods are described for the separation of inorganic and mono-, di-, and trimethylated arsenicals by thin layer chromatography. This method has been extensively used to track the metabolism of the radionuclide [⁷³As] in a variety of in vitro assay systems. In addition, a hydride generation-cryotrapping-gas chromatography-atomic absorption spectrometric method is described for the quantitation of arsenicals in biological samples. This method uses pH-selective hydride generation to differentiate among arsenicals containing trivalent or pentavalent arsenic.

  4. An in vitro tag-and-modify protein sample generation method for single-molecule fluorescence resonance energy transfer.

    Science.gov (United States)

    Hamadani, Kambiz M; Howe, Jesse; Jensen, Madeleine K; Wu, Peng; Cate, Jamie H D; Marqusee, Susan

    2017-09-22

    Biomolecular systems exhibit many dynamic and biologically relevant properties, such as conformational fluctuations, multistep catalysis, transient interactions, folding, and allosteric structural transitions. These properties are challenging to detect and engineer using standard ensemble-based techniques. To address this drawback, single-molecule methods offer a way to access conformational distributions, transient states, and asynchronous dynamics inaccessible to these standard techniques. Fluorescence-based single-molecule approaches are parallelizable and compatible with multiplexed detection; to date, however, they have remained limited to serial screens of small protein libraries. This stems from the current absence of methods for generating either individual dual-labeled protein samples at high throughputs or protein libraries compatible with multiplexed screening platforms. Here, we demonstrate that by combining purified and reconstituted in vitro translation, quantitative unnatural amino acid incorporation via AUG codon reassignment, and copper-catalyzed azide-alkyne cycloaddition, we can overcome these challenges for target proteins that are, or can be, methionine-depleted. We present an in vitro parallelizable approach that does not require laborious target-specific purification to generate dual-labeled proteins and ribosome-nascent chain libraries suitable for single-molecule FRET-based conformational phenotyping. We demonstrate the power of this approach by tracking the effects of mutations, C-terminal extensions, and ribosomal tethering on the structure and stability of three protein model systems: barnase, spectrin, and T4 lysozyme. Importantly, dual-labeled ribosome-nascent chain libraries enable single-molecule co-localization of genotypes with phenotypes, are well suited for multiplexed single-molecule screening of protein libraries, and should enable the in vitro directed evolution of proteins with designer single-molecule conformational

  5. On Angular Sampling Methods for 3-D Spatial Channel Models

    DEFF Research Database (Denmark)

    Fan, Wei; Jämsä, Tommi; Nielsen, Jesper Ødum

    2015-01-01

    This paper discusses generating three dimensional (3D) spatial channel models with emphasis on the angular sampling methods. Three angular sampling methods, i.e. modified uniform power sampling, modified uniform angular sampling, and random pairing methods, are proposed and investigated in detail. The random pairing method, which uses only twenty sinusoids in the ray-based model for generating the channels, presents good results if the spatial channel cluster has a small elevation angle spread. For spatial clusters with large elevation angle spreads, however, the random pairing method would fail, and the other two methods should be considered.

  6. Design and performance of chromium mist generator

    Directory of Open Access Journals (Sweden)

    Tirgar Aram

    2006-01-01

    Full Text Available A chromium mist generator is an essential tool for conducting research and making science-based recommendations to evaluate air pollution and its control systems. The purpose of this research was to design and construct a homogeneous chromium mist generator and to study some effective factors, including sampling height and the distance between samplers in side-by-side sampling, on the chromium mist sampling method. A mist generator was constructed using a chromium electroplating bath at pilot scale. The concentrations of CrO₃ and sulfuric acid in the plating solution were 125 g L⁻¹ and 1.25 g L⁻¹, respectively. In order to create permanent air sampling locations, a Plexiglas cylindrical chamber (75 cm height, 55 cm i.d.) was installed above the bath. Sixty holes were produced in the chamber in 3 rows (20 each). The distance between rows and holes was 15 and 7.5 cm, respectively. Homogeneity and effective factors were studied via the side-by-side air sampling method. In total, 48 clusters of samples were collected on polyvinyl chloride (PVC) filters housed in sampling cassettes. Cassettes were located 35, 50, and 65 cm above the solution surface with less than 7.5 and/or 7.5-15 cm distance between heads. All samples were analyzed according to NIOSH method 7600. According to the ANOVA test, no significant differences were observed between different sampling locations in side-by-side sampling (P=0.82) or between sampling heights and different sampler distances (P=0.86 and 0.86, respectively). However, there were notable differences between the mean coefficients of variation (CV) at various heights and distances. It is concluded that the greatest chromium mist homogeneity could be obtained at a height of 50 cm above the bath solution surface and a sampler distance of < 7.5 cm.

  7. Bessel beam CARS of axially structured samples

    Science.gov (United States)

    Heuke, Sandro; Zheng, Juanjuan; Akimov, Denis; Heintzmann, Rainer; Schmitt, Michael; Popp, Jürgen

    2015-06-01

    We report on a Bessel beam CARS approach for axial profiling of multi-layer structures. This study presents an experimental implementation for the generation of CARS by Bessel beam excitation using only passive optical elements. Furthermore, an analytical expression is provided describing the anti-Stokes field generated by a homogeneous sample. Based on the concept of coherent transfer functions, the underlying resolving power for axially structured geometries is investigated. It is found that, through the non-linearity of the CARS process in combination with the folded illumination geometry, continuous phase-matching is achieved from homogeneous samples up to spatial sample frequencies at twice that of the pumping electric field wave. The experimental and analytical findings are modeled by the implementation of the Debye integral and a scalar Green function approach. Finally, the goal of reconstructing an axially layered sample is demonstrated on the basis of the numerically simulated modulus and phase of the anti-Stokes far-field radiation pattern.

  8. Constructing and sampling directed graphs with given degree sequences

    International Nuclear Information System (INIS)

    Kim, H; Del Genio, C I; Bassler, K E; Toroczkai, Z

    2012-01-01

    The interactions between the components of complex networks are often directed. Proper modeling of such systems frequently requires the construction of ensembles of digraphs with a given sequence of in- and out-degrees. As the number of simple labeled graphs with a given degree sequence is typically very large even for short sequences, sampling methods are needed for statistical studies. Currently, there are two main classes of methods that generate samples. One of the existing methods first generates a restricted class of graphs and then uses a Markov chain Monte-Carlo algorithm based on edge swaps to generate other realizations. As the mixing time of this process is still unknown, the independence of the samples is not well controlled. The other class of methods is based on the configuration model that may lead to unacceptably many sample rejections due to self-loops and multiple edges. Here we present an algorithm that can directly construct all possible realizations of a given bi-degree sequence by simple digraphs. Our method is rejection-free, guarantees the independence of the constructed samples and provides their weight. The weights can then be used to compute statistical averages of network observables as if they were obtained from uniformly distributed sampling or from any other chosen distribution. (paper)
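
    For contrast with the paper's rejection-free construction, the configuration-model baseline it criticizes can be sketched in a few lines; note how the whole matching is rejected whenever a self-loop or multi-edge appears, which is exactly the inefficiency the abstract points out. The degree sequences below are arbitrary examples:

        import random

        def sample_simple_digraph(out_deg, in_deg, max_tries=10_000):
            """Configuration-model sampling: randomly match out-stubs to
            in-stubs; reject the matching if it creates a self-loop or a
            multi-edge (the rejection-prone baseline, not the paper's method)."""
            assert sum(out_deg) == sum(in_deg)
            out_stubs = [v for v, d in enumerate(out_deg) for _ in range(d)]
            in_stubs = [v for v, d in enumerate(in_deg) for _ in range(d)]
            for _ in range(max_tries):
                random.shuffle(in_stubs)
                edges, ok = set(), True
                for u, v in zip(out_stubs, in_stubs):
                    if u == v or (u, v) in edges:
                        ok = False
                        break
                    edges.add((u, v))
                if ok:
                    return edges
            raise RuntimeError("too many rejections")

        print(sorted(sample_simple_digraph([2, 1, 1], [1, 2, 1])))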

  9. Speciation of arsenic in water samples by high-performance liquid chromatography-hydride generation-atomic absorption spectrometry at trace levels using a post-column reaction system

    Energy Technology Data Exchange (ETDEWEB)

    Stummeyer, J. [Bundesanstalt fuer Geowissenschaften und Rohstoffe, Hannover (Germany); Harazim, B. [Bundesanstalt fuer Geowissenschaften und Rohstoffe, Hannover (Germany); Wippermann, T. [Bundesanstalt fuer Geowissenschaften und Rohstoffe, Hannover (Germany)

    1996-02-01

    Anion-exchange HPLC has been combined with hydride generation - atomic absorption spectrometry (HG-AAS) for the routine speciation of arsenite, arsenate, monomethylarsonic acid and dimethylarsinic acid. The sensitivity of the AAS detection was increased by a post-column reaction system to achieve complete formation of volatile arsines from the methylated species and arsenate. The system allows the quantitative determination of 0.5 µg/L of each arsenic compound in water samples. The stability of synthetic and natural water containing arsenic at trace levels was investigated. To preserve stored water samples, a method for the quantitative separation of arsenate at high pH values with the basic anion-exchange resin Dowex 1×8 was developed. (orig.)

  10. The Internet of Samples in the Earth Sciences (iSamples)

    Science.gov (United States)

    Carter, M. R.; Lehnert, K. A.

    2015-12-01

    Across most Earth Science disciplines, research depends on the availability of samples collected above, at, and beneath Earth's surface, on the moon and in space, or generated in experiments. Many domains in the Earth Sciences have recently expressed the need for better discovery, access, and sharing of scientific samples and collections (EarthCube End-User Domain workshops, 2012 and 2013, http://earthcube.org/info/about/end-user-workshops), as has the US government (OSTP Memo, March 2014). The Internet of Samples in the Earth Sciences (iSamples) is an initiative funded as a Research Coordination Network (RCN) within the EarthCube program to address this need. iSamples aims to advance the use of innovative cyberinfrastructure to connect physical samples and sample collections across the Earth Sciences with digital data infrastructures to revolutionize their utility for science. iSamples strives to build, grow, and foster a new community of practice, in which domain scientists, curators of sample repositories and collections, computer and information scientists, software developers and technology innovators engage in and collaborate on defining, articulating, and addressing the needs and challenges of physical samples as a critical component of digital data infrastructure. A primary goal of iSamples is to deliver a community-endorsed set of best practices and standards for the registration, description, identification, and citation of physical specimens and define an actionable plan for implementation. iSamples conducted a broad community survey about sample sharing and has created 5 different working groups to address the different challenges of developing the internet of samples - from metadata schemas and unique identifiers to an architecture of a shared cyberinfrastructure for collections, to digitization of existing collections, to education, and ultimately to establishing the physical infrastructure that will ensure preservation and access of the physical

  11. Development of SYVAC sampling techniques

    International Nuclear Information System (INIS)

    Prust, J.O.; Dalrymple, G.J.

    1985-04-01

    This report describes the requirements of a sampling scheme for use with the SYVAC radiological assessment model. The constraints on the number of samples that may be taken are considered. The conclusions from earlier studies using the deterministic generator sampling scheme are summarised. The method of Importance Sampling and a High Dose algorithm, which are designed to preferentially sample the high dose region of the parameter space, are reviewed in the light of experience gained from earlier studies and the requirements of a site assessment and sensitivity analyses. In addition, the use of an alternative numerical integration method for estimating risk is discussed. It is recommended that the method of Importance Sampling is developed and tested for use with SYVAC. An alternative numerical integration method is not recommended for investigation at this stage but should be the subject of future work. (author)
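
    The principle behind Importance Sampling, drawing preferentially from the high-dose region and reweighting each sample by the ratio of the true density to the sampling density, can be shown with a toy one-dimensional example. The dose function and both distributions below are hypothetical and have nothing to do with the SYVAC parameter space:

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000

        def dose(x):
            """Toy consequence model: nonzero dose only in the rare tail x > 4."""
            return np.where(x > 4.0, x - 4.0, 0.0)

        # Crude Monte Carlo: x ~ N(0,1) rarely hits the high-dose region
        x = rng.normal(0.0, 1.0, N)
        crude = dose(x).mean()

        # Importance sampling: draw from N(4,1), biased toward high dose,
        # and reweight by the density ratio f(y)/g(y) of the two normals
        y = rng.normal(4.0, 1.0, N)
        w = np.exp(-0.5 * y**2) / np.exp(-0.5 * (y - 4.0)**2)
        weighted = (dose(y) * w).mean()

        print(crude, weighted)  # same expectation; far lower variance for IS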

  12. Research on Primary Shielding Calculation Source Generation Codes

    Science.gov (United States)

    Zheng, Zheng; Mei, Qiliang; Li, Hui; Shangguan, Danhua; Zhang, Guangchun

    2017-09-01

    Primary Shielding Calculation (PSC) plays an important role in reactor shielding design and analysis. In order to facilitate PSC, a source generation code is developed to generate cumulative distribution functions (CDFs) for the source particle sample code of the J Monte Carlo Transport (JMCT) code, and a source particle sample code is developed to sample source particle directions, types, coordinates, energies and weights from the CDFs. A source generation code is developed to transform three-dimensional (3D) power distributions in x-y-z geometry to source distributions in r-θ-z geometry for the J Discrete Ordinate Transport (JSNT) code. Validations on the PSC models of the Qinshan No. 1 nuclear power plant (NPP) and the CAP1400 and CAP1700 reactors are performed. Numerical results show that the theoretical model and the codes are both correct.
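
    The CDF-driven sampling step that such a source particle sample code performs can be illustrated generically by inverse-transform sampling from a tabulated spectrum; the energy grid and exponential shape below are hypothetical placeholders, not the JMCT interface:

        import numpy as np

        def build_cdf(pdf_values):
            """Cumulative distribution from a tabulated (unnormalized) density."""
            cdf = np.cumsum(pdf_values)
            return cdf / cdf[-1]

        def sample_from_cdf(cdf, grid, n, seed=0):
            """Inverse-transform sampling: locate the first bin whose
            cumulative probability exceeds the uniform variate u."""
            rng = np.random.default_rng(seed)
            u = rng.random(n)
            return grid[np.searchsorted(cdf, u)]

        energy_grid = np.linspace(0.1, 10.0, 200)  # MeV, hypothetical grid
        pdf = np.exp(-energy_grid / 1.3)           # hypothetical spectrum shape
        cdf = build_cdf(pdf)
        print(sample_from_cdf(cdf, energy_grid, 5))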

  13. Computer generated holography with intensity-graded patterns

    Directory of Open Access Journals (Sweden)

    Rossella Conti

    2016-10-01

    Full Text Available Computer Generated Holography achieves patterned illumination at the sample plane through phase modulation of the laser beam at the objective back aperture. This is obtained by using liquid crystal-based spatial light modulators (LC-SLMs), which modulate the spatial phase of the incident laser beam. A variety of algorithms are employed to calculate the phase modulation masks addressed to the LC-SLM. These algorithms range from simple gratings-and-lenses approaches that generate multiple diffraction-limited spots, to iterative Fourier-transform algorithms capable of generating arbitrary illumination shapes perfectly tailored to the target contour. Applications for holographic light patterning include multi-trap optical tweezers, patterned voltage imaging and optical control of neuronal excitation using uncaging or optogenetics. Past implementations of computer generated holography used binary input profiles to generate binary light distributions at the sample plane. Here we demonstrate that using graded input sources enables the generation of intensity-graded light patterns and extends the range of application of holographic light illumination. First, we use intensity-graded holograms to compensate for the position-dependent diffraction efficiency of the LC-SLM or for sample fluorescence inhomogeneity. Finally, we show that intensity-graded holography can be used to equalize photo-evoked currents from cells expressing different levels of channelrhodopsin-2 (ChR2), one of the most commonly used optogenetic light-gated channels, taking into account the non-linear dependence of channel opening on incident light.
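
    A minimal instance of the iterative Fourier-transform algorithms mentioned above is the classic Gerchberg-Saxton loop, sketched below in plain numpy. It is only illustrative (uniform illumination, square far-field target), not the implementation used with the LC-SLM setups described here; an intensity-graded target simply uses amplitudes between 0 and 1 instead of a binary mask:

        import numpy as np

        def gerchberg_saxton(target_amplitude, n_iter=50, seed=0):
            """Iterative Fourier-transform algorithm: compute a phase-only
            mask whose far field approximates the target amplitude."""
            rng = np.random.default_rng(seed)
            phase = rng.uniform(0.0, 2.0 * np.pi, target_amplitude.shape)
            source = np.ones_like(target_amplitude)  # uniform illumination
            for _ in range(n_iter):
                far = np.fft.fft2(source * np.exp(1j * phase))
                far = target_amplitude * np.exp(1j * np.angle(far))  # impose target
                near = np.fft.ifft2(far)
                phase = np.angle(near)  # keep the phase, discard the amplitude
            return phase

        target = np.zeros((128, 128))
        target[40:60, 40:60] = 1.0   # graded targets would use values in (0, 1]
        mask = gerchberg_saxton(np.sqrt(target))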

  14. A new vapor generation system for mercury species based on the UV irradiation of mercaptoethanol used in the determination of total and methyl mercury in environmental and biological samples by atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Yanmin; Qiu, Jianhua; Yang, Limin [College of Chemistry and Chemical Engineering, Xiamen University, Department of Chemistry and the MOE Key Laboratory of Analytical Sciences, Xiamen (China); Wang, Qiuquan [College of Chemistry and Chemical Engineering, Xiamen University, Department of Chemistry and the MOE Key Laboratory of Analytical Sciences, Xiamen (China); Xiamen University, State Key Laboratory of Marine Environmental Science, Xiamen (China)

    2007-06-15

    A new vapor generation system for mercury (Hg) species based on the irradiation of mercaptoethanol (ME) with UV was developed to provide an effective sample introduction unit for atomic fluorescence spectrometry (AFS). Preliminary investigations of the mechanism of this novel vapor generation system were based on GC-MS and FT-IR studies. Under optimum conditions, the limits of determination for inorganic divalent mercury and methyl mercury were 60 and 50 pg mL⁻¹, respectively. Certified reference materials (BCR 463 tuna fish and BCR 580 estuarine sediment) were used to validate this new method, and the results agreed well with certified values. This new system provides an attractive alternative method of chemical vapor generation (CVG) of mercury species compared to other developed CVG systems (for example, the traditional KBH₄/NaOH-acid system). To our knowledge, this is the first systematic report on UV/ME-based Hg species vapor generation and the determination of total and methyl Hg in environmental and biological samples using UV/ME-AFS. (orig.)

  15. The Employees of Baby Boomers Generation, Generation X, Generation Y and Generation Z in Selected Czech Corporations as Conceivers of Development and Competitiveness in their Corporation

    Directory of Open Access Journals (Sweden)

    Bejtkovský Jiří

    2016-12-01

    Full Text Available Corporations using a varied workforce can supply a greater variety of solutions to problems in service, sourcing, and allocation of their resources. The current labor market comprises four generations that are living and working today: the Baby Boomers generation, Generation X, Generation Y and Generation Z. The differences between generations can affect the way corporations recruit and develop teams, deal with change, motivate, stimulate and manage employees, and boost productivity, competitiveness and service effectiveness. A corporation's success and competitiveness depend on its ability to embrace diversity and realize the competitive advantages and benefits. The aim of this paper is to present the current generations of employees (the employees of the Baby Boomers generation, Generation X, Generation Y and Generation Z in the labor market by secondary research and then to introduce the results of primary research that was implemented in selected corporations in the Czech Republic. The contribution presents a view of some of the results of quantitative and qualitative research conducted in selected corporations in the Czech Republic. The research was conducted in 2015 on a sample of 3,364 respondents, and the results were analyzed. Two research hypotheses and one research question were formulated. The verification or rejection of the null research hypothesis was done through the statistical method of Pearson's chi-square test. It was found that the perception of the choice of a superior from a particular generation does depend on the age of employees in selected corporations. It was also determined that there are statistically significant dependences between the preference for heterogeneous or homogeneous cooperation and the age of employees in selected corporations.

  16. Comparison of sampling methods for hard-to-reach francophone populations: yield and adequacy of advertisement and respondent-driven sampling.

    Science.gov (United States)

    Ngwakongnwi, Emmanuel; King-Shier, Kathryn M; Hemmelgarn, Brenda R; Musto, Richard; Quan, Hude

    2014-01-01

    Francophones who live outside the primarily French-speaking province of Quebec, Canada, risk being excluded from research by lack of a sampling frame. We examined the adequacy of random sampling, advertising, and respondent-driven sampling for recruitment of francophones for survey research. We recruited francophones residing in the city of Calgary, Alberta, through advertising and respondent-driven sampling. These 2 samples were then compared with a random subsample of Calgary francophones derived from the 2006 Canadian Community Health Survey (CCHS). We assessed the effectiveness of advertising and respondent-driven sampling in relation to the CCHS sample by comparing demographic characteristics and selected items from the CCHS (specifically self-reported general health status, perceived weight, and having a family doctor). We recruited 120 francophones through advertising and 145 through respondent-driven sampling; the random sample from the CCHS consisted of 259 records. The samples derived from advertising and respondent-driven sampling differed from the CCHS in terms of age (mean ages 41.0, 37.6, and 42.5 years, respectively), sex (proportion of males 26.1%, 40.6%, and 56.6%, respectively), education (college or higher 86.7%, 77.9%, and 59.1%, respectively), place of birth (immigrants accounting for 45.8%, 55.2%, and 3.7%, respectively), and not having a regular medical doctor (16.7%, 34.5%, and 16.6%, respectively). Differences were not tested statistically because of limitations on the analysis of CCHS data imposed by Statistics Canada. The samples generated exclusively through advertising and respondent-driven sampling were not representative of the gold standard sample from the CCHS. Use of such biased samples for research studies could generate misleading results.

  17. Genome-wide association study identifies major loci for carcass weight on BTA14 in Hanwoo (Korean cattle).

    Directory of Open Access Journals (Sweden)

    Seung Hwan Lee

    Full Text Available This genome-wide association study (GWAS) was conducted to identify major loci that are significantly associated with carcass weight, and their effects, in order to provide increased understanding of the genetic architecture of carcass weight in Hanwoo. This genome-wide association study identified one major chromosome region, ranging from 23 Mb to 25 Mb on chromosome 14, as being associated with carcass weight in Hanwoo. Significant Bonferroni-corrected genome-wide associations (P < 1.52×10⁻⁶) were detected for 6 single nucleotide polymorphism (SNP) loci for carcass weight on chromosome 14. The most significant SNP was BTB-01280026 (P = 4.02×10⁻¹¹), located in the 25 Mb region on Bos taurus autosome 14 (BTA14). The other 5 significant SNPs were Hapmap27934-BTC-065223 (P = 4.04×10⁻¹¹) at 25.2 Mb, BTB-01143580 (P = 6.35×10⁻¹¹) at 24.3 Mb, Hapmap30932-BTC-011225 (P = 5.92×10⁻¹⁰) at 24.8 Mb, Hapmap27112-BTC-063342 (P = 5.18×10⁻⁹) at 25.4 Mb, and Hapmap24414-BTC-073009 (P = 7.38×10⁻⁸) at 25.4 Mb, all on BTA14. One SNP (BTB-01143580; P = 6.35×10⁻¹¹) lies independently from the other 5 SNPs. The 5 SNPs that lie together form a large linkage disequilibrium (LD) block (block size of 553 kb), with LD coefficients ranging from 0.53 to 0.89 within the block. The most significant SNPs accounted for 6.73% to 10.55% of the additive genetic variance, which is quite a large proportion of the total additive genetic variance. The most significant SNP (BTB-01280026; P = 4.02×10⁻¹¹) had an allele substitution effect of 16.96 kg, and the second most significant SNP (Hapmap27934-BTC-065223; P = 4.04×10⁻¹¹) had an effect of 18.06 kg on carcass weight; these correspond to 44% and 47%, respectively, of the phenotypic standard deviation for carcass weight in Hanwoo cattle. Our results demonstrated that carcass weight was affected by a major Quantitative Trait Locus (QTL) with a large effect and by many SNPs with small effects that are normally

  18. Unbiased Sampling and Meshing of Isosurfaces

    KAUST Repository

    Yan, Dongming; Wallner, Johannes; Wonka, Peter

    2014-05-07

    In this paper, we present a new technique to generate unbiased samples on isosurfaces. An isosurface, F(x,y,z) = c, of a function, F, is implicitly defined by trilinear interpolation of background grid points. The key idea of our approach is that of treating the isosurface within a grid cell as a graph (height) function in one of the three coordinate axis directions, restricted to where the slope is not too high, and integrating / sampling from each of these three. We use this unbiased sampling algorithm for applications in Monte Carlo integration, Poisson-disk sampling, and isosurface meshing.

  20. Imaging of Caenorhabditis elegans samples and sub-cellular localization of new generation photosensitizers for photodynamic therapy, using non-linear microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Filippidis, G [Institute of Electronic Structure and Laser, Foundation of Research and Technology-Hellas, PO Box 1527, 71110 Heraklion (Greece); Kouloumentas, C [Institute of Electronic Structure and Laser, Foundation of Research and Technology-Hellas, PO Box 1527, 71110 Heraklion (Greece); Kapsokalyvas, D [Institute of Electronic Structure and Laser, Foundation of Research and Technology-Hellas, PO Box 1527, 71110 Heraklion (Greece); Voglis, G [Institute of Molecular Biology and Biotechnology, Foundation of Research and Technology, Heraklion 71110, Crete (Greece); Tavernarakis, N [Institute of Molecular Biology and Biotechnology, Foundation of Research and Technology, Heraklion 71110, Crete (Greece); Papazoglou, T G [Institute of Electronic Structure and Laser, Foundation of Research and Technology-Hellas, PO Box 1527, 71110 Heraklion (Greece)

    2005-08-07

    Two-photon excitation fluorescence (TPEF) and second-harmonic generation (SHG) are relatively new and promising tools for the imaging and mapping of biological structures and processes at the microscopic level. The combination of the two image-contrast modes in a single instrument can provide unique and complementary information concerning the structure and the function of tissues and individual cells. Wider application of this novel, innovative technique by the biological community is limited by the high price of commercial multiphoton microscopes. In this study, a compact, inexpensive and reliable setup utilizing femtosecond pulses for excitation was developed for the TPEF and SHG imaging of biological samples. Specific cell types of the nematode Caenorhabditis elegans were imaged. Detection of the endogenous structural proteins of the worm, which are responsible for the observed SHG signals, was achieved. Additionally, the binding of different photosensitizers in the HL-60 cell line was investigated using non-linear microscopy. The sub-cellular localization of new-generation photosensitizers that are very promising for photodynamic therapy (PDT) (Hypericum perforatum L. extracts) was achieved. The sub-cellular localization of these novel photosensitizers was linked with their photodynamic action during PDT, and possible mechanisms for cell killing have been elucidated.

  1. Analysis of IFR samples at ANL-E

    International Nuclear Information System (INIS)

    Bowers, D.L.; Sabau, C.S.

    1993-01-01

    The Analytical Chemistry Laboratory analyzes a variety of samples submitted by the different research groups within IFR. This talk describes the analytical work on samples generated by the Plutonium Electrorefiner, the Large Scale Electrorefiner and Waste Treatment Studies. The majority of these samples contain transuranics and necessitate facilities that safely contain these radioisotopes. Details such as sample receiving, dissolution techniques, chemical separations, instrumentation used and reporting of results are discussed. The importance of interactions between customers and analytical personnel is also demonstrated.

  2. An Intrinsic Algorithm for Parallel Poisson Disk Sampling on Arbitrary Surfaces.

    Science.gov (United States)

    Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying

    2013-03-08

    Poisson disk sampling plays an important role in a variety of visual computing applications, due to its useful statistical distribution properties and the absence of aliasing artifacts. While many effective techniques have been proposed to generate Poisson disk distributions in Euclidean space, relatively little work has been reported on the surface counterpart. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. We propose a new technique for parallelizing the dart throwing. Rather than the conventional approaches that explicitly partition the spatial domain to generate the samples in parallel, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. It is worth noting that our algorithm is accurate, as the generated Poisson disks are uniformly and randomly distributed without bias. Our method is intrinsic in that all the computations are based on the intrinsic metric and are independent of the embedding space. This intrinsic feature allows us to generate Poisson disk distributions on arbitrary surfaces. Furthermore, by manipulating the spatially varying density function, we can obtain adaptive sampling easily.
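
    The priority rule can be emulated serially in a few lines: give every candidate a random unique priority and accept a candidate only if no higher-priority candidate within the disk radius survived. The sketch below uses the plain 2D Euclidean metric for brevity, whereas the paper's contribution is the intrinsic, surface-based version:

        import numpy as np

        def priority_poisson_disk(n_candidates, r, seed=0):
            """Serial emulation of priority-based dart throwing: process the
            candidates in priority order and keep one only if it is at least
            r away from every already-kept (higher-priority) candidate."""
            rng = np.random.default_rng(seed)
            pts = rng.random((n_candidates, 2))
            priority_order = rng.permutation(n_candidates)  # random unique priorities
            accepted = []
            for i in priority_order:
                if all(np.linalg.norm(pts[i] - q) >= r for q in accepted):
                    accepted.append(pts[i])
            return np.array(accepted)

        print(len(priority_poisson_disk(5000, r=0.05)), "accepted samples")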

  3. Speeding Up Non-Parametric Bootstrap Computations for Statistics Based on Sample Moments in Small/Moderate Sample Size Applications.

    Directory of Open Access Journals (Sweden)

    Elias Chaibub Neto

    Full Text Available In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation in real and simulated data sets, when bootstrapping Pearson's sample correlation coefficient, and compared its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and numbers of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was remarkably faster for small and moderate sample sizes. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower than the straightforward implementation due to increased time expenditures in the generation of weight matrices via multinomial sampling.
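
    A numpy transcription of the multinomial-weighting formulation conveys the idea (the paper's implementations are in R; this fragment only illustrates how weighted sample moments turn the whole bootstrap into a few matrix products):

        import numpy as np

        def bootstrap_pearson(x, y, n_boot=10_000, seed=0):
            """Vectorized non-parametric bootstrap of Pearson's r: draw
            multinomial weights instead of resampling, then compute weighted
            moments for all replications at once via matrix products."""
            rng = np.random.default_rng(seed)
            n = len(x)
            W = rng.multinomial(n, np.full(n, 1.0 / n), size=n_boot) / n
            mx, my = W @ x, W @ y                   # weighted means, shape (B,)
            sxy = W @ (x * y) - mx * my             # weighted covariances
            sx = np.sqrt(W @ (x * x) - mx * mx)     # weighted std deviations
            sy = np.sqrt(W @ (y * y) - my * my)
            return sxy / (sx * sy)

        rng = np.random.default_rng(1)
        x = rng.normal(size=50)
        y = 0.6 * x + rng.normal(size=50)
        reps = bootstrap_pearson(x, y)
        print(np.percentile(reps, [2.5, 97.5]))  # percentile bootstrap CI for r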

  4. Groundwater sampling: Chapter 5

    Science.gov (United States)

    Wang, Qingren; Munoz-Carpena, Rafael; Foster, Adam; Migliaccio, Kati W.; Li, Yuncong; Migliaccio, Kati

    2011-01-01

    About the book: As water quality becomes a leading concern for people and ecosystems worldwide, it must be properly assessed in order to protect water resources for current and future generations. Water Quality Concepts, Sampling, and Analyses supplies practical information for planning, conducting, or evaluating water quality monitoring programs. It presents the latest information and methodologies for water quality policy, regulation, monitoring, field measurement, laboratory analysis, and data analysis. The book addresses water quality issues, water quality regulatory development, monitoring and sampling techniques, best management practices, and laboratory methods related to the water quality of surface and ground waters. It also discusses basic concepts of water chemistry and hydrology related to water sampling and analysis; instrumentation; water quality data analysis; and evaluation and reporting results.

  5. A single-tube 27-plex SNP assay for estimating individual ancestry and admixture from three continents.

    Science.gov (United States)

    Wei, Yi-Liang; Wei, Li; Zhao, Lei; Sun, Qi-Fan; Jiang, Li; Zhang, Tao; Liu, Hai-Bo; Chen, Jian-Gang; Ye, Jian; Hu, Lan; Li, Cai-Xia

    2016-01-01

    A single-tube multiplex assay of a small set of ancestry-informative markers (AIMs) for effectively estimating individual ancestry and admixture is an ideal forensic tool to trace the population origin of an unknown DNA sample. We present a newly developed 27-plex single nucleotide polymorphism (SNP) panel with highly robust and balanced differential power to perfectly assign individuals to African, European, and East Asian ancestries. Evaluating 968 previously described intercontinental AIMs from three HapMap population genotyping datasets (Yoruban in Ibadan, Nigeria (YRI); Utah residents with Northern and Western European ancestry from the Centre d'Etude du Polymorphisme Humain (CEPH) collection (CEU); and Han Chinese in Beijing, China (CHB)), the best set of markers was selected on the basis of Hardy-Weinberg equilibrium (p > 0.00001), population-specific allele frequency (two of three δ values > 0.5), linkage disequilibrium (r²), and the ancestry of the 11 populations in the HapMap project. Then, we tested the 27-plex SNP assay with 1164 individuals from 17 additional populations. The results demonstrated that the SNP panel was successful for ancestry inference of individuals with African, European, and East Asian ancestry. Furthermore, the system performed well when inferring the admixture of Eurasians (EUR/EAS) after analyzing admixed populations from Xinjiang (Central Asia) as follows: Tajik (68:27), Uyghur (49:46), Kirgiz (40:57), and Kazak (36:60). For individual analyses, we interpreted each sample with a three-ancestry component percentage and a population match probability sequence. This multiplex assay is a convenient and cost-effective tool to assist in criminal investigations, as well as to correct for the effects of population stratification in case-control studies.

  6. Next-generation phylogenomics

    Directory of Open Access Journals (Sweden)

    Chan Cheong Xin

    2013-01-01

    Full Text Available Thanks to advances in next-generation technologies, genome sequences are now being generated at breadth (e.g. across environments) and depth (thousands of closely related strains, individuals or samples) unimaginable only a few years ago. Phylogenomics – the study of evolutionary relationships based on comparative analysis of genome-scale data – has so far been developed as industrial-scale molecular phylogenetics, proceeding in the two classical steps: multiple alignment of homologous sequences, followed by inference of a tree (or multiple trees). However, the algorithms typically employed for these steps scale poorly with the number of sequences, such that for an increasing number of problems, high-quality phylogenomic analysis is (or soon will be) computationally infeasible. Moreover, next-generation data are often incomplete and error-prone, and analysis may be further complicated by genome rearrangement, gene fusion and deletion, lateral genetic transfer, and transcript variation. Here we argue that next-generation data require next-generation phylogenomics, including so-called alignment-free approaches. Reviewers: Reviewed by Mr Alexander Panchin (nominated by Dr Mikhail Gelfand), Dr Eugene Koonin and Prof Peter Gogarten. For the full reviews, please go to the Reviewers' comments section.

  7. 0-6760 : improved trip generation data for Texas using workplace and special generator surveys.

    Science.gov (United States)

    2014-08-01

    Trip generation rates play an important role in transportation planning, which can help in making informed decisions about future transportation investment and design. However, sometimes the rates are derived from small sample sizes or may ...

  8. Enhanced Sampling and Analysis, Selection of Technology for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Svoboda, John; Meikrantz, David

    2010-02-01

    The focus of this study includes the investigation of sampling technologies used in industry and their potential application to nuclear fuel processing. The goal is to identify innovative sampling methods using state of the art techniques that could evolve into the next generation sampling and analysis system for metallic elements. This report details the progress made in the first half of FY 2010 and includes a further consideration of the research focus and goals for this year. Our sampling options and focus for the next generation sampling method are presented along with the criteria used for choosing our path forward. We have decided to pursue the option of evaluating the feasibility of microcapillary based chips to remotely collect, transfer, track and supply microliters of sample solutions to analytical equipment in support of aqueous processes for used nuclear fuel cycles. Microchip vendors have been screened and a choice made for the development of a suitable microchip design followed by production of samples for evaluation by ANL, LANL, and INL on an independent basis.

  9. Vision of new generation CRMs for QC of microanalysis

    International Nuclear Information System (INIS)

    Tian Weizhi

    2005-01-01

    Direct analysis of ever smaller solid samples has become one of the trends in modern analytical science, in response to the increasing requirements from life science, materials, environmental, and other frontier scientific fields. Due to the lack of natural matrix CRMs certified at matched sample size levels, however, quantitative calibration and quality control have long been a bottleneck of microanalysis. CRMs of a new generation are therefore called for to make solid sampling microanalysis an accurately quantitative and quality-controllable technique. In this paper, an approach is proposed to use a combination of several nuclear analytical techniques in the certification of RMs suitable for QC of analyses at sub-ng sample size levels. The technical procedures, the major problems, the possible format of certificates of the new generation CRMs, and the outlook for the establishment of a QC system for microanalysis are described. The CRMs of the current generation have played an important role in the quality of analysis, especially trace analysis, and in turn in the development of related scientific fields in the 20th century. It may be reasonably predicted that the new generation CRMs will play a similar role in the quality of microanalysis, and in turn in relevant frontier scientific fields, in the 21st century. Nuclear analytical techniques have made, and will continue to make, unique contributions to both generations of CRMs.

  10. Stochastic sampling of the RNA structural alignment space.

    Science.gov (United States)

    Harmanci, Arif Ozgun; Sharma, Gaurav; Mathews, David H

    2009-07-01

    A novel method is presented for predicting the common secondary structures and alignment of two homologous RNA sequences by sampling the 'structural alignment' space, i.e. the joint space of their alignments and common secondary structures. The structural alignment space is sampled according to a pseudo-Boltzmann distribution based on a pseudo-free energy change that combines base pairing probabilities from a thermodynamic model and alignment probabilities from a hidden Markov model. By virtue of the implicit comparative analysis between the two sequences, the method offers an improvement over single sequence sampling of the Boltzmann ensemble. A cluster analysis shows that the samples obtained from joint sampling of the structural alignment space cluster more closely than samples generated by the single sequence method. On average, the representative (centroid) structure and alignment of the most populated cluster in the sample of structures and alignments generated by joint sampling are more accurate than single sequence sampling and alignment based on sequence alone, respectively. The 'best' centroid structure that is closest to the known structure among all the centroids is, on average, more accurate than structure predictions of other methods. Additionally, cluster analysis identifies, on average, a few clusters, whose centroids can be presented as alternative candidates. The source code for the proposed method can be downloaded at http://rna.urmc.rochester.edu.

  11. Laser-generated acoustic wave studies on tattoo pigment

    Science.gov (United States)

    Paterson, Lorna M.; Dickinson, Mark R.; King, Terence A.

    1996-01-01

    A Q-switched alexandrite laser (180 ns at 755 nm) was used to irradiate samples of agar embedded with red, black and green tattoo dyes. The acoustic waves generated in the samples were detected using a PVDF membrane hydrophone and compared to theoretical expectations. The laser pulses were found to generate acoustic waves in the black and green samples but not in the red pigment. Pressures of up to 1.4 MPa were produced with irradiances of up to 96 MW cm⁻², which is comparable to the irradiances used to clear pigment embedded in skin. The pressure gradient generated across pigment particles was approximately 1.09 × 10¹⁰ Pa m⁻¹, giving a pressure difference of 1.09 ± 0.17 MPa over a particle with mean diameter 100 μm. This is not sufficient to permanently damage skin, which has a tensile strength of 7.4 MPa.

  12. An Investigation of the Sampling Distribution of the Congruence Coefficient.

    Science.gov (United States)

    Broadbooks, Wendy J.; Elmore, Patricia B.

    This study developed and investigated an empirical sampling distribution of the congruence coefficient. The effects of sample size, number of variables, and population value of the congruence coefficient on the sampling distribution of the congruence coefficient were examined. Sample data were generated on the basis of the common factor model and…

  13. An integrated and accessible sample data library for Mars sample return science

    Science.gov (United States)

    Tuite, M. L., Jr.; Williford, K. H.

    2015-12-01

    Over the course of the next decade or more, many thousands of geological samples will be collected and analyzed in a variety of ways by researchers at the Jet Propulsion Laboratory (California Institute of Technology) in order to facilitate discovery and contextualize observations made of Mars rocks both in situ and here on Earth if samples are eventually returned. Integration of data from multiple analyses of samples including petrography, thin section and SEM imaging, isotope and organic geochemistry, XRF, XRD, and Raman spectrometry is a challenge and a potential obstacle to discoveries that require supporting lines of evidence. We report the development of a web-accessible repository, the Sample Data Library (SDL) for the sample-based data that are generated by the laboratories and instruments that comprise JPL's Center for Analysis of Returned Samples (CARS) in order to facilitate collaborative interpretation of potential biosignatures in Mars-analog geological samples. The SDL is constructed using low-cost, open-standards-based Amazon Web Services (AWS), including web-accessible storage, relational data base services, and a virtual web server. The data structure is sample-centered with a shared registry for assigning unique identifiers to all samples including International Geo-Sample Numbers. Both raw and derived data produced by instruments and post-processing workflows are automatically uploaded to online storage and linked via the unique identifiers. Through the web interface, users are able to find all the analyses associated with a single sample or search across features shared by multiple samples, sample localities, and analysis types. Planned features include more sophisticated search and analytical interfaces as well as data discoverability through NSF's EarthCube program.

  14. On-line sample processing methods in flow analysis

    DEFF Research Database (Denmark)

    Miró, Manuel; Hansen, Elo Harald

    2008-01-01

    In this chapter, the state of the art of flow injection and related approaches for the automation and miniaturization of sample processing, regardless of the aggregate state of the sample medium, is overviewed. The potential of the various generations of flow injection for implementation of in...

  15. Neonatal blood gas sampling methods | Goenka | South African ...

    African Journals Online (AJOL)

    There is little published guidance that systematically evaluates the different methods of neonatal blood gas sampling, where each method has its individual benefits and risks. This review critically surveys the available evidence to generate a comparison between arterial and capillary blood gas sampling, focusing on their ...

  16. Low-sampling-rate ultra-wideband channel estimation using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig

    2014-09-01

    In this paper, a low-sampling-rate scheme for ultra-wideband channel estimation is proposed. The scheme exploits multiple observations generated by transmitting multiple pulses. In the proposed scheme, P pulses are transmitted to produce channel impulse response estimates at a desired sampling rate, while the ADC samples at a rate that is P times slower. To avoid loss of fidelity, the number of sampling periods (based on the desired rate) in the inter-pulse interval is restricted to be co-prime with P. This condition can be violated when clock drift is present and the transmitted pulse locations change. To handle this case, and to achieve good overall channel estimation performance without using prior information, we derive an improved estimator based on the bounded data uncertainty (BDU) model. It is shown that this estimator is related to the Bayesian linear minimum mean squared error (LMMSE) estimator. Channel estimation performance of the proposed sub-sampling scheme combined with the new estimator is assessed in simulation. The results show that a large reduction in sampling rate can be achieved. The proposed estimator outperforms the least squares estimator in almost all cases, while in the high SNR regime it also outperforms the LMMSE estimator. In addition to channel estimation, a synchronization method is also proposed that utilizes the same pulse sequence used for channel estimation. © 2014 IEEE.
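
    The co-prime restriction is easy to see in a short sketch: with the ADC running P times slower than the desired rate and pulses spaced N desired-rate periods apart, the sample offsets relative to each pulse cover all N phases exactly when gcd(P, N) = 1. The toy demonstration below is illustrative only and is not the estimator from the paper:

      def offsets(P, N):
          """Offset of each ADC sample relative to the most recent pulse,
          in desired-rate periods."""
          return sorted({(k * P) % N for k in range(N)})

      P = 4
      for N in (7, 8):  # gcd(4, 7) = 1, gcd(4, 8) = 4
          o = offsets(P, N)
          print(f"N={N}: {len(o)} of {N} phases covered:", o)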

  17. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    Directory of Open Access Journals (Sweden)

    Peter Feist

    2015-02-01

    Full Text Available Proteins regulate many cellular functions and analyzing the presence and abundance of proteins in biological samples are central focuses in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed.

  18. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    Science.gov (United States)

    Feist, Peter; Hummon, Amanda B.

    2015-01-01

    Proteins regulate many cellular functions and analyzing the presence and abundance of proteins in biological samples are central focuses in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed. PMID:25664860

  19. Proteomic challenges: sample preparation techniques for microgram-quantity protein analysis from biological samples.

    Science.gov (United States)

    Feist, Peter; Hummon, Amanda B

    2015-02-05

    Proteins regulate many cellular functions and analyzing the presence and abundance of proteins in biological samples are central focuses in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed.

  20. Assessing neutron generator output using neutron activation of silicon

    International Nuclear Information System (INIS)

    Kehayias, Pauli M.; Kehayias, Joseph J.

    2007-01-01

    D-T neutron generators are used for elemental composition analysis and medical applications. Often composition is determined by examining elemental ratios in which the knowledge of the neutron flux is unnecessary. However, the absolute value of the neutron flux is required when the generator is used for neutron activation analysis, to study radiation damage to materials, to monitor the operation of the generator, and to measure radiation exposure. We describe a method for absolute neutron output and flux measurements of low output D-T neutron generators using delayed activation of silicon. We irradiated a series of silicon oxide samples with 14.1 MeV neutrons and counted the resulting gamma rays of the ²⁸Al nucleus with an efficiency-calibrated detector. To minimize the photon self-absorption effects within the samples, we used a zero-thickness extrapolation technique by repeating the measurement with samples of different thicknesses. The neutron flux measured 26 cm away from the tritium target of a Thermo Electron A-325 D-T generator (Thermo Electron Corporation, Colorado Springs, CO) was 6.2 × 10³ n/s/cm² ± 5%, which is consistent with the manufacturer's specifications.

  1. Assessing neutron generator output using neutron activation of silicon

    Energy Technology Data Exchange (ETDEWEB)

    Kehayias, Pauli M. [Body Composition Laboratory, Jean Mayer United States Department of Agriculture Human Nutrition Research Center on Aging, Tufts University, Boston, MA 02111 (United States); Kehayias, Joseph J. [Body Composition Laboratory, Jean Mayer United States Department of Agriculture Human Nutrition Research Center on Aging, Tufts University, Boston, MA 02111 (United States)]. E-mail: joseph.kehayias@tufts.edu

    2007-08-15

    D-T neutron generators are used for elemental composition analysis and medical applications. Often composition is determined by examining elemental ratios in which the knowledge of the neutron flux is unnecessary. However, the absolute value of the neutron flux is required when the generator is used for neutron activation analysis, to study radiation damage to materials, to monitor the operation of the generator, and to measure radiation exposure. We describe a method for absolute neutron output and flux measurements of low output D-T neutron generators using delayed activation of silicon. We irradiated a series of silicon oxide samples with 14.1 MeV neutrons and counted the resulting gamma rays of the {sup 28}Al nucleus with an efficiency-calibrated detector. To minimize the photon self-absorption effects within the samples, we used a zero-thickness extrapolation technique by repeating the measurement with samples of different thicknesses. The neutron flux measured 26 cm away from the tritium target of a Thermo Electron A-325 D-T generator (Thermo Electron Corporation, Colorado Springs, CO) was 6.2 x 10{sup 3} n/s/cm{sup 2} {+-} 5%, which is consistent with the manufacturer's specifications.

  2. Extraction of Total DNA and RNA from Marine Filter Samples and Generation of a cDNA as Universal Template for Marker Gene Studies.

    Science.gov (United States)

    Schneider, Dominik; Wemheuer, Franziska; Pfeiffer, Birgit; Wemheuer, Bernd

    2017-01-01

    Microbial communities play an important role in marine ecosystem processes. Although the number of studies targeting marker genes such as the 16S rRNA gene has increased in the last few years, the vast majority of marine diversity remains unexplored. Moreover, most studies have focused on the entire bacterial community and thus disregarded active microbial community players. Here, we describe a detailed protocol for the simultaneous extraction of DNA and RNA from marine water samples and for the generation of cDNA from the isolated RNA, which can be used as a universal template in various marker gene studies.

  3. Downsampling Non-Uniformly Sampled Data

    Directory of Open Access Journals (Sweden)

    Fredrik Gustafsson

    2007-10-01

    Full Text Available Decimating a uniformly sampled signal by a factor D involves low-pass antialias filtering with normalized cutoff frequency 1/D, followed by picking out every Dth sample. Alternatively, decimation can be done in the frequency domain using the fast Fourier transform (FFT) algorithm, after zero-padding the signal and truncating the FFT. We outline three approaches to decimating non-uniformly sampled signals, which are all based on interpolation. The interpolation is done in different domains, and the inter-sample behavior does not need to be known. The first approach interpolates the signal onto a uniform sampling grid, after which standard decimation can be applied. The second interpolates a continuous-time convolution integral that implements the antialias filter, after which every Dth sample can be picked out. The third, frequency-domain approach computes an approximate Fourier transform, after which truncation and IFFT give the desired result. Simulations indicate that the second approach is particularly useful. A thorough analysis is therefore performed for this case, using the assumption that the non-uniformly distributed sampling instants are generated by a stochastic process.
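
    The first approach (interpolate onto a uniform grid, then decimate conventionally) can be sketched as follows; this is an illustration, not the authors' code, and the paper's analysis in fact favors the second, convolution-based approach:

      import numpy as np
      from scipy.interpolate import interp1d
      from scipy.signal import decimate

      rng = np.random.default_rng(0)
      t = np.sort(rng.uniform(0.0, 10.0, 2000))   # non-uniform sampling instants
      x = np.sin(2 * np.pi * 0.5 * t)             # samples of the underlying signal

      D = 4                                       # decimation factor
      tu = np.linspace(t[0], t[-1], t.size)       # uniform grid of comparable density
      xu = interp1d(t, x)(tu)                     # step 1: interpolate to uniform grid
      xd = decimate(xu, D)                        # step 2: antialias filter + every Dth sample
      print(xd.size)                              # 500 samples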

  4. Waste classification sampling plan

    International Nuclear Information System (INIS)

    Landsman, S.D.

    1998-01-01

    The purpose of this sampling plan is to explain the method used to collect and analyze data necessary to verify and/or determine the radionuclide content of the B-Cell decontamination and decommissioning waste stream, so that the correct waste classification for the waste stream can be made, and to collect samples for studies of decontamination methods that could be used to remove fixed contamination present on the waste. The scope of this plan is to establish the technical basis for collecting samples and compiling quantitative data on the radioactive constituents present in waste generated during deactivation activities in B-Cell. Sampling and radioisotopic analysis will be performed on the fixed layers of contamination present on structural material and internal surfaces of process piping and tanks. In addition, dose rate measurements on existing waste material will be performed to determine the fraction of dose rate attributable to both removable and fixed contamination. Samples will also be collected to support studies of decontamination methods that are effective in removing the fixed contamination present on the waste. Sampling performed under this plan will meet criteria established in BNF-2596, Data Quality Objectives for the B-Cell Waste Stream Classification Sampling, J. M. Barnett, May 1998

  5. Sampling the Mouse Hippocampal Dentate Gyrus

    Directory of Open Access Journals (Sweden)

    Lisa Basler

    2017-12-01

    Full Text Available Sampling is a critical step in procedures that generate quantitative morphological data in the neurosciences. Samples need to be representative to allow statistical evaluations, and samples need to deliver a precision that makes statistical evaluations not only possible but also meaningful. Sampling-generated variability should, e.g., not be able to hide significant group differences from statistical detection if they are present. Estimators of the coefficient of error (CE) have been developed to provide tentative answers to the question of whether sampling has been “good enough” to provide meaningful statistical outcomes. We tested the performance of the commonly used Gundersen-Jensen CE estimator, using the layers of the mouse hippocampal dentate gyrus as an example (molecular layer, granule cell layer and hilus). We found that this estimator provided useful estimates of the precision that can be expected from samples of different sizes. For all layers, we found that a smoothness factor (m) of 0 generally provided better estimates than an m of 1. Only for the combined layers, i.e., the entire dentate gyrus, could better CE estimates be obtained using an m of 1. The orientation of the sections impacted CE sizes. Frontal (coronal) sections are typically most efficient, providing the smallest CEs for a given amount of work. Applying the estimator to 3D-reconstructed layers and using very intense sampling, we observed CE size plots with m = 0 to m = 1 transitions that should also be expected but are not often observed in real section series. The data we present also allow the reader to approximate the sampling intervals in frontal, horizontal or sagittal sections that provide CEs of specified sizes for the layers of the mouse dentate gyrus.
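
    For readers who want to reproduce such estimates, a minimal sketch of the Gundersen-Jensen CE estimator for a systematic series of section counts follows. The 1/12 (m = 0) and 1/240 (m = 1) coefficients are the commonly cited ones, and the noise (nugget) term is omitted for brevity; treat the exact constants as assumptions to be checked against the original stereology papers:

      def gundersen_jensen_ce(counts, m=0):
          """CE of a systematic sample of section counts, smoothness factor m in {0, 1}."""
          P = list(counts)
          n, total = len(P), sum(P)
          A = sum(p * p for p in P)
          B = sum(P[i] * P[i + 1] for i in range(n - 1))
          C = sum(P[i] * P[i + 2] for i in range(n - 2))
          var_srs = (3 * A - 4 * B + C) / (12 if m == 0 else 240)
          return var_srs ** 0.5 / total

      print(gundersen_jensen_ce([120, 150, 180, 160, 140, 100], m=0))  # ~0.047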

  6. A Consistent System for Coding Laboratory Samples

    Science.gov (United States)

    Sih, John C.

    1996-07-01

    A formal laboratory coding system is presented to keep track of laboratory samples. Preliminary useful information regarding the sample (origin and history) is gained without consulting a research notebook. Since this system uses and retains the same research notebook page number for each new experiment (reaction), finding and distinguishing products (samples) of the same or different reactions becomes an easy task. Using this system multiple products generated from a single reaction can be identified and classified in a uniform fashion. Samples can be stored and filed according to stage and degree of purification, e.g. crude reaction mixtures, recrystallized samples, chromatographed or distilled products.
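
    A scheme like the one described, retaining the notebook page number and appending product and purification-stage codes, is trivial to mechanize. The exact format below (initials-page-product-stage) is a hypothetical illustration, not the article's precise convention:

      STAGES = {"cr": "crude mixture", "rx": "recrystallized",
                "ch": "chromatographed", "di": "distilled"}

      def sample_code(initials, page, product=1, stage="cr"):
          """Build a sample label such as JCS-1023-2-rx."""
          assert stage in STAGES
          return f"{initials}-{page}-{product}-{stage}"

      print(sample_code("JCS", 1023, product=2, stage="rx"))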

  7. Gamma: A C++ Sound Synthesis Library Further Abstracting the Unit Generator

    DEFF Research Database (Denmark)

    Putnam, Lance Jonathan

    2014-01-01

    Gamma is a C++ library for sound synthesis that was created to address some of the limitations of existing sound synthesis libraries. The first limitation is that unit generators cannot easily be organized into separate sampling domains. This makes it difficult to use unit generators with different sample rates and in other domains, namely the frequency domain. The second limitation is that certain internal unit generator algorithms, such as interpolation, cannot be customized. This tends to lead to closed architectures consisting of multiple unit generators with only slight algorithmic differences...

  8. Efficient pseudo-random number generation for monte-carlo simulations using graphic processors

    Science.gov (United States)

    Mohanty, Siddhant; Mohanty, A. K.; Carminati, F.

    2012-06-01

    A hybrid approach based on the combination of three Tausworthe generators and one linear congruential generator for pseudo-random number generation for GPU programming, as suggested in the NVIDIA CUDA library, has been used for Monte Carlo sampling. On each GPU thread, a random seed is generated on the fly in a simple way using the quick-and-dirty algorithm, where the mod operation is not performed explicitly due to unsigned integer overflow. Using this hybrid generator, multivariate correlated sampling based on the alias technique has been carried out using both CUDA and OpenCL.

  9. Efficient pseudo-random number generation for Monte-Carlo simulations using graphic processors

    International Nuclear Information System (INIS)

    Mohanty, Siddhant; Mohanty, A K; Carminati, F

    2012-01-01

    A hybrid approach based on the combination of three Tausworthe generators and one linear congruential generator for pseudo-random number generation for GPU programming, as suggested in the NVIDIA CUDA library, has been used for Monte Carlo sampling. On each GPU thread, a random seed is generated on the fly in a simple way using the quick-and-dirty algorithm, where the mod operation is not performed explicitly due to unsigned integer overflow. Using this hybrid generator, multivariate correlated sampling based on the alias technique has been carried out using both CUDA and OpenCL.
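
    A pure-Python rendering of the hybrid generator (three Tausworthe steps combined with one LCG step) is shown below; the shift and mask parameters are the widely circulated ones from the GPU literature the abstract refers to and should be treated as an assumption here. The explicit 32-bit masking reproduces the unsigned-integer overflow that GPU code gets for free:

      M32 = 0xFFFFFFFF

      def taus_step(z, s1, s2, s3, m):
          b = (((z << s1) & M32) ^ z) >> s2
          return (((z & m) << s3) & M32) ^ b

      def lcg_step(z):
          return (1664525 * z + 1013904223) & M32   # masking stands in for overflow

      def hybrid_taus(state):
          """Advance the 4-word state and return a uniform variate in [0, 1)."""
          state[0] = taus_step(state[0], 13, 19, 12, 4294967294)
          state[1] = taus_step(state[1], 2, 25, 4, 4294967288)
          state[2] = taus_step(state[2], 3, 11, 17, 4294967280)
          state[3] = lcg_step(state[3])
          return (state[0] ^ state[1] ^ state[2] ^ state[3]) * 2.3283064365387e-10

      state = [129, 130, 131, 132]   # Tausworthe seeds must exceed small thresholds
      print([round(hybrid_taus(state), 6) for _ in range(4)])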

  10. Bayesian posterior sampling via stochastic gradient Fisher scoring

    NARCIS (Netherlands)

    Ahn, S.; Korattikara, A.; Welling, M.; Langford, J.; Pineau, J.

    2012-01-01

    In this paper we address the following question: "Can we approximately sample from a Bayesian posterior distribution if we are only allowed to touch a small mini-batch of data-items for every sample we generate?". An algorithm based on the Langevin equation with stochastic gradients (SGLD) was
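
    For orientation, the SGLD baseline named in the abstract updates the parameter with a scaled minibatch gradient plus injected Gaussian noise. A minimal sketch for a Gaussian-mean posterior follows; it is illustrative only, since the paper's contribution preconditions this update using Fisher scoring:

      import numpy as np

      rng = np.random.default_rng(1)
      data = rng.normal(3.0, 1.0, 10_000)   # unknown mean, known unit variance
      N, batch, eps = data.size, 100, 1e-4
      theta = 0.0                           # flat prior, so the prior gradient is 0

      for _ in range(2000):
          mb = rng.choice(data, batch)
          grad = (N / batch) * np.sum(mb - theta)       # scaled minibatch log-lik gradient
          theta += 0.5 * eps * grad + rng.normal(0.0, np.sqrt(eps))  # Langevin step

      print(theta)   # iterates fluctuate around the posterior mean (~3.0)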

  11. Sampling Assumptions in Inductive Generalization

    Science.gov (United States)

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  12. A new nebulization device with exchangeable aerosol generation mode as a useful tool to investigate sample introduction processes in inductively coupled plasma atomic emission spectrometry

    International Nuclear Information System (INIS)

    Grotti, Marco; Lagomarsino, Cristina; Frache, Roberto

    2004-01-01

    A new sample introduction device has been designed in order to differentiate between the effects of aerosol production and its subsequent desolvation on the analytical performance of an inductively coupled plasma optical spectrometer. This research tool allows the user to switch easily between the pneumatic and ultrasonic aerosol generation modes and to use a joint desolvation chamber. In this way, a real comparison between aerosol production systems may be attained and the influence of the aerosol generation process on analytical figures clearly distinguished from that of the desolvation process. In this work, the separate effects of the aerosol generation and desolvation processes on analytical sensitivity and tolerance towards matrix effects have been investigated. Concerning sensitivity, it was found that both processes play an important role in determining emission intensities, with the increase in sensitivity due to desolvation being higher than that due to the improved aerosol generation efficiency. Concerning the matrix effects, a predominant role of the desolvation system was found, while the influence of the aerosol generation mode was much less important. For nitric acid, the decreasing effect was mitigated by the presence of a desolvation system, due to partial removal of the acid. On the contrary, the depressive effect of sulfuric acid was enhanced by the presence of a desolvation system, due to degradation of the solvent removal efficiency and to a further decrease in the analyte transport rate caused by clustering phenomena. Concerning the interferences due to sodium and calcium, a depressive effect was observed, which is enhanced by desolvation

  13. Convolutional neural network using generated data for SAR ATR with limited samples

    Science.gov (United States)

    Cong, Longjian; Gao, Lei; Zhang, Hui; Sun, Peng

    2018-03-01

    Able to operate in all weather conditions at all times, Synthetic Aperture Radar (SAR) has been a hot research topic in remote sensing. Despite all the well-known advantages of SAR, it is hard to extract features because of its unique imaging methodology, and this challenge has attracted the research interest of traditional Automatic Target Recognition (ATR) methods. With the development of deep learning technologies, convolutional neural networks (CNNs) give us another way to detect and recognize targets when a huge number of samples are available, but this premise often does not hold when it comes to monitoring a specific type of ship. In this paper, we propose a method to enhance the performance of Faster R-CNN with limited samples to detect and recognize ships in SAR images.

  14. Nonlinear Dynamics of Cantilever-Sample Interactions in Atomic Force Microscopy

    Science.gov (United States)

    Cantrell, John H.; Cantrell, Sean A.

    2010-01-01

    The interaction of the cantilever tip of an atomic force microscope (AFM) with the sample surface is obtained by treating the cantilever and sample as independent systems coupled by a nonlinear force acting between the cantilever tip and a volume element of the sample surface. The volume element is subjected to a restoring force from the remainder of the sample that provides dynamical equilibrium for the combined systems. The model accounts for the positions on the cantilever of the cantilever tip, laser probe, and excitation force (if any) via a basis set of orthogonal functions that may be generalized to account for arbitrary cantilever shapes. The basis set is extended to include nonlinear cantilever modes. The model leads to a pair of coupled nonlinear differential equations that are solved analytically using a matrix iteration procedure. The effects of oscillatory excitation forces applied either to the cantilever or to the sample surface (or to both) are obtained from the solution set and applied to the assessment of phase and amplitude signals generated by various acoustic-atomic force microscope (A-AFM) modalities. The influence of bistable cantilever modes on AFM signal generation is discussed. The effects on the cantilever-sample surface dynamics of subsurface features embedded in the sample that are perturbed by surface-generated oscillatory excitation forces and carried to the cantilever via wave propagation are accounted for by the Bolef-Miller propagating wave model. Expressions pertaining to signal generation and image contrast in A-AFM are obtained and applied to amplitude modulation (intermittent contact) atomic force microscopy and resonant difference-frequency atomic force ultrasonic microscopy (RDF-AFUM). The influence of phase accumulation in A-AFM on image contrast is discussed, as is the effect of hard contact and maximum nonlinearity regimes of A-AFM operation.

  15. Subrandom methods for multidimensional nonuniform sampling.

    Science.gov (United States)

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics. Copyright © 2016 Elsevier Inc. All rights reserved.
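
    The seed-free idea can be illustrated with the simplest subrandom source, the golden-ratio additive recurrence, mapped through an exponentially weighted sampling density; this generic sketch is not one of the specific methods benchmarked in the paper:

      import numpy as np

      def subrandom_schedule(n_points, grid_size, tau=0.3):
          """Pick n_points unique indices from a weighted Nyquist grid, no seed needed."""
          w = np.exp(-np.arange(grid_size) / (tau * grid_size))   # decaying density
          cdf = np.cumsum(w) / np.sum(w)
          phi = (np.sqrt(5.0) - 1.0) / 2.0
          picked, k = set(), 1
          while len(picked) < n_points:
              u = (k * phi) % 1.0                 # subrandom uniform variate
              picked.add(int(np.searchsorted(cdf, u)))
              k += 1
          return sorted(picked)

      print(subrandom_schedule(16, 64))   # identical schedule on every run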

  16. The intensive DT neutron generator of TU Dresden

    Directory of Open Access Journals (Sweden)

    Klix Axel

    2018-01-01

    Full Text Available TU Dresden operates an accelerator-based intensive DT neutron generator. Experimental activities comprise investigations into material activation and decay, neutron and photon transport in matter, and R&D work on radiation detectors for harsh environments. The intense DT neutron generator is capable of producing a maximum of 10¹² n/s. The neutron source is a solid-type water-cooled tritium target based on a titanium matrix on a copper carrier. The neutron yield at a typical deuteron beam current of 1 mA is of the order of 10¹¹ n/s in 4π. A pneumatic sample transport system is available for short-time irradiations and is connected to two high-purity germanium detector spectrometers for the measurement of induced activities. The overall design of the experimental hall with the neutron generator allows a flexible setup of experiments, including the possibility of investigating larger structures and cooled samples or samples at high temperatures.

  17. The intensive DT neutron generator of TU Dresden

    Science.gov (United States)

    Klix, Axel; Döring, Toralf; Domula, Alexander; Zuber, Kai

    2018-01-01

    TU Dresden operates an accelerator-based intensive DT neutron generator. Experimental activities comprise investigations into material activation and decay, neutron and photon transport in matter, and R&D work on radiation detectors for harsh environments. The intense DT neutron generator is capable of producing a maximum of 10¹² n/s. The neutron source is a solid-type water-cooled tritium target based on a titanium matrix on a copper carrier. The neutron yield at a typical deuteron beam current of 1 mA is of the order of 10¹¹ n/s in 4π. A pneumatic sample transport system is available for short-time irradiations and is connected to two high-purity germanium detector spectrometers for the measurement of induced activities. The overall design of the experimental hall with the neutron generator allows a flexible setup of experiments, including the possibility of investigating larger structures and cooled samples or samples at high temperatures.

  18. Flow injection electrochemical hydride generation inductively coupled plasma time-of-flight mass spectrometry for the simultaneous determination of hydride forming elements and its application to the analysis of fresh water samples

    International Nuclear Information System (INIS)

    Bings, Nicolas H.; Stefanka, Zsolt; Mallada, Sergio Rodriguez

    2003-01-01

    A flow injection (FI) method was developed using electrochemical hydride generation (EcHG) as a sample introduction system, coupled to an inductively coupled plasma time-of-flight mass spectrometer (ICP-TOFMS) for rapid and simultaneous determination of six hydride-forming elements (As, Bi, Ge, Hg, Sb and Se). A novel low-volume electrolysis cell, especially suited for FI experiments, was designed and the conditions for simultaneous electrochemical hydride generation (EcHG; electrolyte concentrations and flow rates, electrolysis voltage and current) as well as the ICP-TOFMS operational parameters (carrier gas flow rate, modulation pulse width (MPW)) for the simultaneous determination of 12 isotopes were optimized. The compromise operation parameters of the electrolysis were found to be 1.4 and 3 ml min⁻¹ for the anolyte and catholyte flow rates, respectively, using 2 M sulphuric acid. An optimum electrolysis current of 0.7 A (16 V) and an argon carrier gas flow rate of 0.91 l min⁻¹ were chosen. A modulation pulse width of 5 μs, which influences the sensitivity through the amount of ions being collected by the MS per single analytical cycle, provided optimum results for the detection of transient signals. The achieved detection limits were compared with those obtained by using FI in combination with conventional nebulization (FI-ICP-TOFMS); values for chemical hydride generation (FI-CHG-ICP-TOFMS) were taken from the literature. By using a 200 μl sample loop, absolute detection limits (3σ) in the range of 10-160 pg for As, Bi, Ge, Hg, Sb and 1.1 ng for Se and a precision of 4-8% for seven replicate injections of 20-100 ng ml⁻¹ multielemental sample solutions were achieved. The analysis of a standard reference material (SRM) 1643d (NIST, 'Trace Elements in Water') showed good agreement with the certified values for As and Sb. Se showed a drastic difference, which is probably due to the presence of hydride-inactive Se species in the sample. Recoveries better than

  19. The ocean sampling day consortium

    DEFF Research Database (Denmark)

    Kopf, Anna; Bicak, Mesude; Kottmann, Renzo

    2015-01-01

    Ocean Sampling Day was initiated by the EU-funded Micro B3 (Marine Microbial Biodiversity, Bioinformatics, Biotechnology) project to obtain a snapshot of the marine microbial biodiversity and function of the world’s oceans. It is a simultaneous global mega-sequencing campaign aiming to generate the largest standardized microbial data set in a single day. This will be achievable only through the coordinated efforts of an Ocean Sampling Day Consortium, supportive partnerships and networks between sites. This commentary outlines the establishment, function and aims of the Consortium and describes our...

  20. Systems and methods for self-synchronized digital sampling

    Science.gov (United States)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.
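
    In software, the synchronizer's role of deriving sample instants from the rotational frequency, so that every revolution contributes the same number of samples, can be mimicked by angle-domain resampling. A hedged sketch, not the patented hardware:

      import numpy as np

      def synchronous_resample(t, x, t_rev, samples_per_rev=64):
          """Resample x(t) at instants locked to the revolution start times t_rev."""
          sync_t = np.concatenate([
              t0 + (t1 - t0) * np.arange(samples_per_rev) / samples_per_rev
              for t0, t1 in zip(t_rev[:-1], t_rev[1:])
          ])
          return sync_t, np.interp(sync_t, t, x)

      t = np.linspace(0.0, 1.0, 5000)
      x = np.sin(2 * np.pi * 10 * t**2)        # vibration of a machine spinning up
      t_rev = np.sqrt(np.arange(10) / 10.0)    # revolution boundaries of that chirp
      st, sx = synchronous_resample(t, x, t_rev)
      print(st.size)                           # 9 revolutions x 64 samples = 576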

  1. Generational Differences In Organizational Justice Perceptions: An Exploratory Investigation Across Three Generational Cohorts

    Directory of Open Access Journals (Sweden)

    Ledimo Ophillia

    2015-06-01

    Full Text Available Despite several reviews of generational differences across cohorts regarding their career stages in organizations, relatively few empirical investigations have been conducted to understand cohorts’ perceptions. Hence, there is paucity of studies that explored differences on the construct organizational justice across generational cohorts. The objective of this study was to explore the differences across three generational cohorts (Millennials, Generation X, and Baby Boomers on dimensions of the organizational justice measurement instrument (OJMI. Data was collected through the administration of OJMI to a random sample size of organizational employees (n = 289. Descriptive statistics and analysis of variance were conducted to interpret the data. These findings provide evidence that differences do exist across cohorts on dimensions of organizational justice. In terms of contributions and practical implications, insight gained from the findings may be used in proposing organizational development interventions to manage multigenerational employees as well as to conduct future research.

  2. Elemental analysis using temporal gating of a pulsed neutron generator

    Energy Technology Data Exchange (ETDEWEB)

    Mitra, Sudeep

    2018-02-20

    Technologies related to determining elemental composition of a sample that comprises fissile material are described herein. In a general embodiment, a pulsed neutron generator periodically emits bursts of neutrons, and is synchronized with an analyzer circuit. The bursts of neutrons are used to interrogate the sample, and the sample outputs gamma rays based upon the neutrons impacting the sample. A detector outputs pulses based upon the gamma rays impinging upon the material of the detector, and the analyzer circuit assigns the pulses to temporally-based bins based upon the analyzer circuit being synchronized with the pulsed neutron generator. A computing device outputs data that is indicative of elemental composition of the sample based upon the binned pulses.
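
    The temporal gating amounts to folding each detector pulse time onto the phase of the synchronized burst cycle and histogramming; the burst period and bin count below are illustrative assumptions, not the patent's parameters:

      import numpy as np

      def gate_pulses(pulse_times, period, n_bins):
          """Count detector pulses in time bins relative to the burst trigger."""
          phase = np.asarray(pulse_times) % period     # time since the last burst
          edges = np.linspace(0.0, period, n_bins + 1)
          counts, _ = np.histogram(phase, bins=edges)
          return counts

      rng = np.random.default_rng(7)
      period = 100e-6                                  # assumed 10 kHz burst rate
      prompt = rng.exponential(5e-6, 5000) % period    # prompt gammas, early in cycle
      delayed = rng.uniform(0.0, period, 1000)         # delayed gammas, spread out
      print(gate_pulses(np.concatenate([prompt, delayed]), period, 10))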

  3. Heritability in the efficiency of nonsense-mediated mRNA decay in humans

    KAUST Repository

    Seoighe, Cathal; Gehring, Christoph A

    2010-01-01

    across tissues and between individuals, with important clinical consequences. Principal Findings: Using previously published Affymetrix exon microarray data from cell lines genotyped as part of the International HapMap project, we investigated whether

  4. Ascertainment bias in studies of human genome-wide polymorphism

    DEFF Research Database (Denmark)

    Clark, Andrew G.; Hubisz, Melissa J.; Bustamente, Carlos D.

    2005-01-01

    of the SNPs that are found are influenced by the discovery sampling effort. The International HapMap project relied on nearly any piece of information available to identify SNPs-including BAC end sequences, shotgun reads, and differences between public and private sequences-and even made use of chimpanzee...... was a resequencing-by-hybridization effort using the 24 people of diverse origin in the Polymorphism Discovery Resource. Here we take these two data sets and contrast two basic summary statistics, heterozygosity and FST, as well as the site frequency spectra, for 500-kb windows spanning the genome. The magnitude...... of disparity between these samples in these measures of variability indicates that population genetic analysis on the raw genotype data is ill advised. Given the knowledge of the discovery samples, we perform an ascertainment correction and show how the post-correction data are more consistent across...

  5. Human population structure detection via multilocus genotype clustering

    Directory of Open Access Journals (Sweden)

    Starmer Joshua

    2007-06-01

    Full Text Available Abstract Background We describe a hierarchical clustering algorithm for using Single Nucleotide Polymorphism (SNP) genetic data to assign individuals to populations. The method does not assume Hardy-Weinberg equilibrium and linkage equilibrium among loci in sample population individuals. Results We show that the algorithm can assign sample individuals highly accurately to their corresponding ethnic groups in our tests using HapMap SNP data and it is also robust to admixed populations when tested with Perlegen SNP data. Moreover, it can detect fine-scale population structure as subtle as that between Chinese and Japanese by using genome-wide high-diversity SNP loci. Conclusion The algorithm provides an alternative approach to the popular STRUCTURE program, especially for fine-scale population structure detection in genome-wide association studies. This is the first successful separation of Chinese and Japanese samples using random SNP loci with high statistical support.

  6. Wyoming CV Pilot Traveler Information Message Sample

    Data.gov (United States)

    Department of Transportation — This dataset contains a sample of the sanitized Traveler Information Messages (TIM) being generated by the Wyoming Connected Vehicle (CV) Pilot. The full set of TIMs...

  7. MadGraph/MadEvent. The new web generation

    International Nuclear Information System (INIS)

    Alwall, J.

    2007-01-01

    The new web-based version of the automated process and event generator MadGraph/MadEvent is now available. Recent developments include new models (notably the MSSM and 2HDM), a framework for the addition of user-defined models, inclusive sample generation, and on-line hadronization and detector simulation. Event generation can be done on-line on any of our clusters. (author)

  8. A simulative comparison of respondent driven sampling with incentivized snowball sampling--the "strudel effect".

    Science.gov (United States)

    Gyarmathy, V Anna; Johnston, Lisa G; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A

    2014-02-01

    Respondent driven sampling (RDS) and incentivized snowball sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania ("original sample") to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1-12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called "strudel effect" is discussed in the paper. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  9. Adaptive maximal poisson-disk sampling on surfaces

    KAUST Repository

    Yan, Dongming

    2012-01-01

    In this paper, we study the generation of maximal Poisson-disk sets with varying radii on surfaces. Based on the concepts of power diagram and regular triangulation, we present a geometric analysis of gaps in such disk sets on surfaces, which is the key ingredient of the adaptive maximal Poisson-disk sampling framework. Moreover, we adapt the presented sampling framework for remeshing applications. Several novel and efficient operators are developed for improving the sampling/meshing quality over the state of the art. © 2012 ACM.
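
    For background intuition, here is a minimal planar, constant-radius Poisson-disk sampler (Bridson-style dart throwing around an active list). The paper's actual contribution, gap analysis via power diagrams for varying radii on surfaces, is substantially more involved, so treat this only as a baseline sketch:

      import math, random

      def poisson_disk(width, height, r, k=30, seed=0):
          """Approximately maximal Poisson-disk sample with minimum spacing r."""
          rnd = random.Random(seed)
          cell = r / math.sqrt(2)                      # grid cell holds at most one point
          cols, rows = int(width / cell) + 1, int(height / cell) + 1
          grid = [[None] * cols for _ in range(rows)]

          def fits(p):
              gx, gy = int(p[0] / cell), int(p[1] / cell)
              for j in range(max(0, gy - 2), min(rows, gy + 3)):
                  for i in range(max(0, gx - 2), min(cols, gx + 3)):
                      q = grid[j][i]
                      if q and (q[0] - p[0])**2 + (q[1] - p[1])**2 < r * r:
                          return False
              return True

          p0 = (rnd.uniform(0, width), rnd.uniform(0, height))
          samples, active = [p0], [p0]
          grid[int(p0[1] / cell)][int(p0[0] / cell)] = p0
          while active:
              p = active[rnd.randrange(len(active))]
              for _ in range(k):                       # k darts in the annulus [r, 2r)
                  a, d = rnd.uniform(0, 2 * math.pi), rnd.uniform(r, 2 * r)
                  q = (p[0] + d * math.cos(a), p[1] + d * math.sin(a))
                  if 0 <= q[0] < width and 0 <= q[1] < height and fits(q):
                      samples.append(q)
                      active.append(q)
                      grid[int(q[1] / cell)][int(q[0] / cell)] = q
                      break
              else:                                    # no dart fit: p's neighborhood is full
                  active.remove(p)
          return samples

      print(len(poisson_disk(1.0, 1.0, 0.05)))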

  10. Optimal sampling strategy for data mining

    International Nuclear Information System (INIS)

    Ghaffar, A.; Shahbaz, M.; Mahmood, W.

    2013-01-01

    Technologies like the Internet, corporate intranets, data warehouses, ERPs, satellites, digital sensors, embedded systems and mobile networks are all generating such massive amounts of data that it is getting very difficult to analyze and understand it all, even using data mining tools. Huge datasets are becoming a difficult challenge for classification algorithms. With increasing amounts of data, data mining algorithms are getting slower and analysis is getting less interactive. Sampling can be a solution: using a fraction of the computing resources, sampling can often provide the same level of accuracy. The process of sampling requires much care because many factors are involved in the determination of the correct sample size. The approach proposed in this paper tries to find a solution to this problem. Based on a statistical formula, after setting some parameters, it returns a sample size called the sufficient sample size, which is then selected through probability sampling. Results indicate the usefulness of this technique in coping with the problem of huge datasets. (author)
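
    The paper does not reproduce its formula; a standard stand-in for this kind of pre-sampling computation is Cochran's sample-size formula with a finite-population correction, sketched here under that assumption:

      import math

      def cochran_sample_size(N, e=0.05, z=1.96, p=0.5):
          """Records needed to estimate a proportion within margin e at confidence z."""
          n0 = (z * z * p * (1 - p)) / (e * e)          # infinite-population size
          return math.ceil(n0 / (1 + (n0 - 1) / N))     # finite-population correction

      print(cochran_sample_size(N=1_000_000))   # ~385 records, almost independent of N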

  11. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    Science.gov (United States)

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small-volume samples. Pulsed-dc-ESI utilizes a constant high voltage to induce the generation of single-polarity pulsed electrospray remotely. This method significantly boosts sample economy, so as to obtain several minutes of MS signal duration from a merely picoliter-volume sample. The elongated MS signal duration enables us to collect abundant MS(2) information on components of interest in a small-volume sample for systematic analysis. This method has been successfully applied to single-cell metabolomics analysis. We obtained 2-D profiles of metabolites (including exact mass and MS(2) data) from single plant and mammalian cells, concerning 1034 components and 656 components for Allium cepa and HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small-volume samples.

  12. Flow-driven voltage generation in carbon nanotubes

    Indian Academy of Sciences (India)

    The flow of various liquids and gases over single-walled carbon nanotube bundles induces an electrical signal (voltage/current) in the sample along the direction of the flow. The electrical response generated by the flow of liquids is found to be logarithmic in the flow speed over a wide range. In contrast, voltage generated ...

  13. Population Analysis of Pharmacogenetic Polymorphisms Related to Acute Lymphoblastic Leukemia Drug Treatment

    Directory of Open Access Journals (Sweden)

    Marcela A. Chiabai

    2012-01-01

    Full Text Available This study aimed to evaluate, in the Brazilian population, the genotypes and population frequencies of pharmacogenetic polymorphisms involved in the response to drugs used in the treatment of acute lymphoblastic leukemia (ALL), and to compare the data with data from the HapMap populations. There was significant differentiation between most population pairs, but few associations between genetic ancestry and SNPs in the Brazilian population were observed. AMOVA analysis comparing the Brazilian population to all other populations retrieved from HapMap pointed to a genetic proximity with the European population. These findings argue against the use of genetic ancestry as a proxy for predicting drug response. Accordingly, any study aiming to correlate genotype with drug response in the Brazilian population should be based on pharmacogenetic SNP genotypes.

  14. Testing generative thinking among Swazi children | Mushoriwa ...

    African Journals Online (AJOL)

    The survey research design was used, with interviews employed to collect the data. Crosstabs and a two-sample t-test were used to analyse the data. The study found no significant differences in generative thinking between second and fifth graders in the Swazi sample. In the comparative analyses, while significant ...

  15. Psychological empowerment and job satisfaction between Baby Boomer and Generation X nurses.

    Science.gov (United States)

    Sparks, Amy M

    2012-05-01

    This paper is a report of a study of differences in nurses' generational psychological empowerment and job satisfaction. Generations differ in work styles such as autonomy, work ethics, involvement, views on leadership, and primary views on what constitutes innovation, quality, and service. A secondary analysis was conducted from two data sets resulting in a sample of 451 registered nurses employed at five hospitals in West Virginia. One data set was gathered from a convenience sample and one from a randomly selected sample. Data were collected from 2000 to 2004. Baby Boomer nurses reported higher mean total psychological empowerment scores than Generation X nurses. There were no differences in total job satisfaction scores between the generations. There were significant differences among the generations' psychological empowerment scores. Generational differences related to psychological empowerment could provide insight into inconsistent findings related to nurse job satisfaction. Nurse administrators may consider this evidence when working on strategic plans to motivate and entice Generation X nurses and retain Baby Boomers. Although implications based on this study are tentative, the results indicate the need for administrators to consider the differences between Baby Boomer and Generation X nurses. © 2011 Blackwell Publishing Ltd.

  16. An intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces.

    Science.gov (United States)

    Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying

    2013-09-01

    Poisson disk sampling has excellent spatial and spectral properties, and plays an important role in a variety of visual computing applications. Although many promising algorithms have been proposed for multidimensional sampling in Euclidean space, very few studies have been reported with regard to the problem of generating Poisson disks on surfaces due to the complicated nature of the surface. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. In sharp contrast to the conventional parallel approaches, our method neither partitions the given surface into small patches nor uses any spatial data structure to maintain the voids in the sampling domain. Instead, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. Our algorithm guarantees that the generated Poisson disks are uniformly and randomly distributed without bias. It is worth noting that our method is intrinsic and independent of the embedding space. This intrinsic feature allows us to generate Poisson disk patterns on arbitrary surfaces in ℝⁿ. To our knowledge, this is the first intrinsic, parallel, and accurate algorithm for surface Poisson disk sampling. Furthermore, by manipulating the spatially varying density function, we can obtain adaptive sampling easily.

  17. Distribution Coeficients (Kd) Generated From A Core Sample Collected From The Saltstone Disposal Facility

    International Nuclear Information System (INIS)

    Almond, P.; Kaplan, D.

    2011-01-01

    Core samples originating from Vault 4, Cell E of the Saltstone Disposal Facility (SDF) were collected in September of 2008 (Hansen and Crawford 2009, Smith 2008) and sent to SRNL to measure chemical and physical properties of the material including visual uniformity, mineralogy, microstructure, density, porosity, distribution coefficients (Kd), and chemical composition. Some data from these experiments have been reported (Cozzi and Duncan 2010). In this study, leaching experiments were conducted with a single core sample under conditions that are representative of saltstone performance. In separate experiments, reducing and oxidizing environments were targeted to obtain solubility and Kd values from the measurable species identified in the solid and aqueous leachate. This study was designed to provide insight into how readily species immobilized in saltstone will leach from the saltstone under oxidizing conditions simulating the edge of a saltstone monolith and under reducing conditions, targeting conditions within the saltstone monolith. Core samples were taken from saltstone poured in December of 2007 giving a cure time of nine months in the cell and a total of thirty months before leaching experiments began in June 2010. The saltstone from Vault 4, Cell E is comprised of blast furnace slag, class F fly ash, portland cement, and Deliquification, Dissolution, and Adjustment (DDA) Batch 2 salt solution. The salt solution was previously analyzed from a sample of Tank 50 salt solution and characterized in the 4QCY07 Waste Acceptance Criteria (WAC) report (Zeigler and Bibler 2009). Subsequent to Tank 50 analysis, additional solution was added to the tank solution from the Effluent Treatment Project as well as from inleakage from Tank 50 pump bearings (Cozzi and Duncan 2010). Core samples were taken from three locations and at three depths at each location using a two-inch diameter concrete coring bit (1-1, 1-2, 1-3; 2-1, 2-2, 2-3; 3-1, 3-2, 3-3) (Hansen and Crawford

  18. DISTRIBUTION COEFICIENTS (KD) GENERATED FROM A CORE SAMPLE COLLECTED FROM THE SALTSTONE DISPOSAL FACILITY

    Energy Technology Data Exchange (ETDEWEB)

    Almond, P.; Kaplan, D.

    2011-04-25

    Core samples originating from Vault 4, Cell E of the Saltstone Disposal Facility (SDF) were collected in September of 2008 (Hansen and Crawford 2009, Smith 2008) and sent to SRNL to measure chemical and physical properties of the material including visual uniformity, mineralogy, microstructure, density, porosity, distribution coefficients (K{sub d}), and chemical composition. Some data from these experiments have been reported (Cozzi and Duncan 2010). In this study, leaching experiments were conducted with a single core sample under conditions that are representative of saltstone performance. In separate experiments, reducing and oxidizing environments were targeted to obtain solubility and Kd values from the measurable species identified in the solid and aqueous leachate. This study was designed to provide insight into how readily species immobilized in saltstone will leach from the saltstone under oxidizing conditions simulating the edge of a saltstone monolith and under reducing conditions, targeting conditions within the saltstone monolith. Core samples were taken from saltstone poured in December of 2007 giving a cure time of nine months in the cell and a total of thirty months before leaching experiments began in June 2010. The saltstone from Vault 4, Cell E is comprised of blast furnace slag, class F fly ash, portland cement, and Deliquification, Dissolution, and Adjustment (DDA) Batch 2 salt solution. The salt solution was previously analyzed from a sample of Tank 50 salt solution and characterized in the 4QCY07 Waste Acceptance Criteria (WAC) report (Zeigler and Bibler 2009). Subsequent to Tank 50 analysis, additional solution was added to the tank solution from the Effluent Treatment Project as well as from inleakage from Tank 50 pump bearings (Cozzi and Duncan 2010). Core samples were taken from three locations and at three depths at each location using a two-inch diameter concrete coring bit (1-1, 1-2, 1-3; 2-1, 2-2, 2-3; 3-1, 3-2, 3-3) (Hansen and

  19. LIG1 polymorphisms: the Indian scenario

    Indian Academy of Sciences (India)

    2014-08-14

    … occurs across this population on social parameters such as income and … HapMap database belonging to Chinese (CHB), Japanese (JPT) … was supported by CSIR network projects CMM0016, CMM0018, NWP0034.

  20. A Next-Generation Sequencing Data Analysis Pipeline for Detecting Unknown Pathogens from Mixed Clinical Samples and Revealing Their Genetic Diversity.

    Directory of Open Access Journals (Sweden)

    Yu-Nong Gong

    Full Text Available Forty-two cytopathic effect (CPE)-positive isolates were collected from 2008 to 2012. None of the isolates could be identified as known viral pathogens by routine diagnostic assays. They were pooled into 8 groups of 5-6 isolates to reduce the sequencing cost. Next-generation sequencing (NGS) was conducted for each group of mixed samples, and the proposed data analysis pipeline was used to identify viral pathogens in these mixed samples. Polymerase chain reaction (PCR) or enzyme-linked immunosorbent assay (ELISA) was individually conducted for each of these 42 isolates depending on the predicted viral types in each group. Two isolates remained unknown after these tests. Moreover, iteration mapping was implemented for each of these 2 isolates, and predicted human parechovirus (HPeV) in both. In summary, our NGS pipeline detected the following viruses among the 42 isolates: 29 human rhinoviruses (HRVs), 10 HPeVs, 1 human adenovirus (HAdV), 1 echovirus and 1 rotavirus. We then focused on the 10 identified Taiwanese HPeVs because of their reported clinical significance over HRVs. Their genomes were assembled and their genetic diversity was explored. One novel 6-bp deletion was found in one HPeV-1 virus. In terms of nucleotide heterogeneity, 64 genetic variants were detected from these HPeVs using the mapped NGS reads. Most importantly, a recombination event was found between our HPeV-3 and a known HPeV-4 strain in the database. A similar event was detected in the other HPeV-3 strains in the same clade of the phylogenetic tree. These findings demonstrate that the proposed NGS data analysis pipeline identified unknown viruses from mixed clinical samples, revealed their genetic identity and variants, and characterized their genetic features in terms of viral evolution.

  1. Adaptive Metropolis Sampling with Product Distributions

    Science.gov (United States)

    Wolpert, David H.; Lee, Chiu Fan

    2005-01-01

    The Metropolis-Hastings (MH) algorithm is a way to sample a provided target distribution π(x). It works by repeatedly sampling a separate proposal distribution T(x, x′) to generate a random walk {x(t)}. We consider a modification of the MH algorithm in which T is dynamically updated during the walk. The update at time t uses the samples {x(t′) : t′ < t} to estimate the product distribution that has the least Kullback-Leibler distance to π. That estimate is the information-theoretically optimal mean-field approximation to π. We demonstrate through computer experiments that our algorithm produces samples that are superior to those of the conventional MH algorithm.
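
    A simplified stand-in for the idea, refitting an independent product-form proposal to the walk history at intervals and accepting with the independence-sampler ratio, is sketched below on a toy 2-D Gaussian target; the paper's min-KL estimate of the product distribution is more refined than this moment-matching refit:

      import numpy as np

      rng = np.random.default_rng(3)
      target_mean = np.array([2.0, -1.0])
      log_pi = lambda x: -0.5 * np.sum((x - target_mean) ** 2)   # toy target

      mu, sig = np.zeros(2), np.ones(2)     # parameters of the product proposal

      def log_q(y):                         # density of the current product proposal
          return -0.5 * np.sum(((y - mu) / sig) ** 2) - np.sum(np.log(sig))

      x, hist = np.zeros(2), []
      for t in range(1, 5001):
          xp = rng.normal(mu, sig)                              # independence proposal
          if np.log(rng.random()) < (log_pi(xp) - log_pi(x)
                                     + log_q(x) - log_q(xp)):   # MH acceptance
              x = xp
          hist.append(x.copy())
          if t % 500 == 0:                  # refit: product of marginal Gaussian fits
              h = np.array(hist)
              mu, sig = h.mean(axis=0), h.std(axis=0) + 0.05
      print(mu)                             # drifts toward the target mean (2, -1)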

  2. Generation Y Online Buying Patterns

    Directory of Open Access Journals (Sweden)

    Katija Vojvodić

    2015-12-01

    Full Text Available The advantages of electronic retailing can, among other things, result in uncontrolled buying by online consumers, i.e. in extreme buying behavior. The main purpose of this paper is to analyze and determine the buying patterns of Generation Y online consumers in order to explore the existence of different types of behavior based on the characteristics of online buying. The paper also aims at exploring the relationship between extracted factors and Generation Y consumers’ buying intentions. Empirical research was conducted on a sample of 515 consumers in the Dubrovnik-Neretva County. Based on the factor analysis, research results indicate that Generation Y online consumers are influenced by three factors: compulsivity, impulsivity, and functionality. The analysis of variance reveals that significant differences exist between the extracted factors and Generation Y’s online buying characteristics. In addition, correlation analysis shows a statistically significant correlation between the extracted factors and Generation Y’s buying intentions.

  3. Sampling designs matching species biology produce accurate and affordable abundance indices

    Directory of Open Access Journals (Sweden)

    Grant Harris

    2013-12-01

    Full Text Available Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps sampling only where resources attract animals (i.e., targeted sampling) would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km2 cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture

  4. Sampling designs matching species biology produce accurate and affordable abundance indices.

    Science.gov (United States)

    Harris, Grant; Farley, Sean; Russell, Gareth J; Butler, Matthew J; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps sampling only where resources attract animals (i.e., targeted sampling) would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km2 cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture sessions

  5. Sampling designs matching species biology produce accurate and affordable abundance indices

    Science.gov (United States)

    Farley, Sean; Russell, Gareth J.; Butler, Matthew J.; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps sampling only where resources attract animals (i.e., targeted sampling) would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km2 cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture sessions, which

  6. Adaptive sampling program support for expedited site characterization

    International Nuclear Information System (INIS)

    Johnson, R.

    1993-01-01

    Expedited site characterizations offer substantial savings in time and money when assessing hazardous waste sites. Key to some of these savings is the ability to adapt a sampling program to the ''real-time'' data generated by an expedited site characterization. This paper presents a two-pronged approach to supporting adaptive sampling programs: a specialized object-oriented database/geographical information system for data fusion, management and display; and combined Bayesian/geostatistical methods for contamination extent estimation and sample location selection.

  7. Estimation of Uncertainty in Aerosol Concentration Measured by Aerosol Sampling System

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Chan; Song, Yong Jae; Jung, Woo Young; Lee, Hyun Chul; Kim, Gyu Tae; Lee, Doo Yong [FNC Technology Co., Yongin (Korea, Republic of)

    2016-10-15

    FNC Technology Co., Ltd. has developed test facilities for aerosol generation, mixing, sampling and measurement under high-pressure and high-temperature conditions. The aerosol generation system is connected to the aerosol mixing system, which injects a SiO{sub 2}/ethanol mixture. In the sampling system, a glass fiber membrane filter has been used to measure the average mass concentration. Based on the experimental results using a main carrier gas of a steam and air mixture, the uncertainty estimation of the sampled aerosol concentration was performed by applying the Gaussian error propagation law. The purpose of the tests is to develop a commercial test module for an aerosol generation, mixing and sampling system applicable to the environmental industry and safety-related systems in nuclear power plants. For the uncertainty calculation, the sampled aerosol concentration is not measured directly, but must be calculated from other quantities. The uncertainty of the sampled aerosol concentration is a function of the flow rates of air and steam, the sampled mass, the sampling time, the condensed steam mass, and their absolute errors; the errors in these variables propagate through the function. Using operating parameters and their individual errors from the aerosol test cases performed at FNC, the uncertainty of the aerosol concentration evaluated by the Gaussian error propagation law is less than 1%. The results of the uncertainty estimation in the aerosol sampling system will be utilized as system performance data.
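    A small sketch of the Gaussian error propagation step, combining absolute errors via sqrt(sum((df/dxi * dxi)^2)) with numerically estimated partial derivatives; all variable names, values and the concentration formula are hypothetical stand-ins, not FNC's measurements:

    ```python
    import numpy as np

    # Hypothetical measured quantities as (value, absolute error) pairs.
    m_sampled = (12.0e-3, 0.1e-3)   # filter mass gain [g]
    m_cond    = (1.5,     0.05)     # condensed steam mass [g]
    q_air     = (10.0,    0.2)      # air flow rate [L/min]
    t_sample  = (30.0,    0.1)      # sampling time [min]

    def concentration(m, mc, q, t):
        # Aerosol concentration = sampled mass / total gas volume, with the
        # steam contribution inferred from the condensed mass (illustrative).
        v_steam = mc * 1.24          # rough g -> L conversion at assumed conditions
        return m / (q * t + v_steam)

    def gaussian_propagation(f, inputs, eps=1e-6):
        """Combine absolute errors via sqrt(sum((df/dxi * dxi)^2))."""
        vals = [v for v, _ in inputs]
        errs = [e for _, e in inputs]
        f0, var = f(*vals), 0.0
        for i, (v, e) in enumerate(zip(vals, errs)):
            shifted = list(vals)
            step = eps * max(abs(v), 1.0)
            shifted[i] = v + step
            dfdx = (f(*shifted) - f0) / step   # numerical partial derivative
            var += (dfdx * e) ** 2
        return f0, np.sqrt(var)

    c, dc = gaussian_propagation(concentration, [m_sampled, m_cond, q_air, t_sample])
    print(f"C = {c:.3e} g/L +/- {dc:.1e} ({100 * dc / c:.2f} %)")
    ```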

  8. (SNP) assay for population stratification test between eastern Asians

    African Journals Online (AJOL)

    Yomi

    2012-01-03

    Jan 3, 2012 ... program STRUCTURE 2.0, which uses a Markov chain Monte Carlo (MCMC) algorithm to cluster individuals into different cryptic ... HapMap project. ... Evaluation of the 124-plex SNP typing microarray for forensic testing.

  9. The use of genome-wide eQTL associations in lymphoblastoid cell lines to identify novel genetic pathways involved in complex traits.

    Directory of Open Access Journals (Sweden)

    Josine L Min

    Full Text Available The integrated analysis of genotypic and expression data for association with complex traits could identify novel genetic pathways involved in complex traits. We profiled 19,573 expression probes in Epstein-Barr virus-transformed lymphoblastoid cell lines (LCLs) from 299 twins and correlated these with 44 quantitative traits (QTs). For 939 expressed probes correlating with more than one QT, we investigated the presence of eQTL associations in three datasets of 57 CEU HapMap founders and 86 unrelated twins. Genome-wide association analysis of these probes with 2.2 million SNPs revealed 131 potential eQTLs (1,989 eQTL SNPs) overlapping between the HapMap datasets, five of which were in cis (58 eQTL SNPs). We then tested 535 SNPs tagging the eQTL SNPs, for association with the relevant QT in 2,905 twins. We identified nine potential SNP-QT associations (P<0.01), but none significantly replicated in five large consortia of 1,097-16,129 subjects. We also failed to replicate previously reported eQTL associations with body mass index, plasma low-density lipoprotein cholesterol, high-density lipoprotein cholesterol and triglyceride levels derived from lymphocytes, adipose and liver tissue. Our results and additional power calculations suggest that proponents may have been overoptimistic in the power of LCLs in eQTL approaches to elucidate regulatory genetic effects on complex traits using the small datasets generated to date. Nevertheless, larger tissue-specific expression data sets relevant to specific traits are becoming available, and should enable the adoption of similar integrated analyses in the near future.

  10. exTAS - next-generation TAS for small samples and extreme conditions

    International Nuclear Information System (INIS)

    Kulda, J.; Hiess, A.

    2011-01-01

    The currently used implementation of horizontally and vertically focusing optics in three-axis spectrometers (TAS) permits efficient studies of excitations in sub-cm3-sized single crystals. With the present proposal we wish to stimulate a further paradigm shift into the domain of mm3-sized samples. exTAS combines highly focused mm-sized focal spots, boosting the sensitivity limits, with a spectrometer layout down-scaled to a table-top size to provide high flexibility in optimizing acceptance angles and to achieve sub-millimeter positioning accuracy. (authors)

  11. Random-Number Generator Validity in Simulation Studies: An Investigation of Normality.

    Science.gov (United States)

    Bang, Jung W.; Schumacker, Randall E.; Schlieve, Paul L.

    1998-01-01

    The normality of number distributions generated by various random-number generators was studied, focusing on when the random-number generator reached a normal distribution and at what sample size. Findings suggest the steps that should be followed when using a random-number generator in a Monte Carlo simulation. (SLD)

  12. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan; Keyser, John

    2013-01-01

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation

  13. Recent results of the investigation of a micro-fluidic sampling chip and sampling system for hot cell aqueous processing streams

    International Nuclear Information System (INIS)

    Tripp, J.; Smith, T.; Law, J.

    2013-01-01

    A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next-generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially, sampling technologies were evaluated, and micro-fluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. Different sampling volumes were tested; the 10 μl volume produced data with much smaller relative standard deviations than the 2 μl volume. In addition, the production of a less expensive, mass-produced sampling chip was investigated to avoid chip reuse, thus increasing sampling reproducibility/accuracy. The micro-fluidic-based robotic sampling system's mechanical elements were tested to ensure analytical reproducibility and the optimum robotic handling of micro-fluidic sampling chips. (authors)

  14. Multi-Locus Next-Generation Sequence Typing of DNA Extracted From Pooled Colonies Detects Multiple Unrelated Candida albicans Strains in a Significant Proportion of Patient Samples

    Directory of Open Access Journals (Sweden)

    Ningxin Zhang

    2018-06-01

    Full Text Available The yeast Candida albicans is an important opportunistic human pathogen. For C. albicans strain typing or drug susceptibility testing, a single colony recovered from a patient sample is normally used. This is insufficient when multiple strains are present at the site sampled. How often this is the case is unclear. Previous studies, confined to oral, vaginal and vulvar samples, have yielded conflicting results and have assessed too small a number of colonies per sample to reliably detect the presence of multiple strains. We developed a next-generation sequencing (NGS) modification of the highly discriminatory C. albicans MLST (multilocus sequence typing) method, 100+1 NGS-MLST, for detection and typing of multiple strains in clinical samples. In 100+1 NGS-MLST, DNA is extracted from a pool of colonies from a patient sample and also from one of the colonies. MLST amplicons from both DNA preparations are analyzed by high-throughput sequencing. Using base call frequencies, our bespoke DALMATIONS software determines the MLST type of the single colony. If base call frequency differences between pool and single colony indicate the presence of an additional strain, the differences are used to computationally infer the second MLST type without the need for MLST of additional individual colonies. In mixes of previously typed pairs of strains, 100+1 NGS-MLST reliably detected a second strain. Inferred MLST types of second strains were always more similar to their real MLST types than to those of any of 59 other isolates (22 of 31 inferred types were identical to the real type). Using 100+1 NGS-MLST we found that 7/60 human samples, including three superficial candidiasis samples, contained two unrelated strains. In addition, at least one sample contained two highly similar variants of the same strain. The probability of samples containing unrelated strains appears to differ considerably between body sites. Our findings indicate the need for wider surveys to
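    A toy illustration of the core 100+1 inference idea (not the DALMATIONS implementation): at each position where the pooled base-call frequencies disagree with the single-colony sequence, the discrepant allele is assigned to the inferred second strain. The fragment, frequencies and noise threshold below are invented for the example:

    ```python
    single_colony = "ACGTACGT"            # hypothetical MLST fragment
    # Pooled base-call frequencies per position (hypothetical values).
    pool_freqs = [
        {"A": 1.0}, {"C": 1.0}, {"G": 0.55, "T": 0.45}, {"T": 1.0},
        {"A": 1.0}, {"C": 0.6, "G": 0.4}, {"G": 1.0}, {"T": 1.0},
    ]

    MIN_MINOR_FRACTION = 0.2              # noise threshold (assumed)

    inferred = []
    for base, freqs in zip(single_colony, pool_freqs):
        # Alleles present in the pool but absent from the single colony.
        minors = {b: f for b, f in freqs.items()
                  if b != base and f >= MIN_MINOR_FRACTION}
        inferred.append(max(minors, key=minors.get) if minors else base)

    print("single colony :", single_colony)
    print("inferred 2nd  :", "".join(inferred))  # -> ACTTAGGT
    ```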

  15. Research results: preserving newborn blood samples.

    Science.gov (United States)

    Lewis, Michelle Huckaby; Scheurer, Michael E; Green, Robert C; McGuire, Amy L

    2012-11-07

    Retention and use, without explicit parental permission, of residual dried blood samples from newborn screening has generated public controversy over concerns about violations of family privacy rights and loss of parental autonomy. The public debate about this issue has included little discussion about the destruction of a potentially valuable public resource that can be used for research that may yield improvements in public health. The research community must advocate for policies and infrastructure that promote retention of residual dried blood samples and their use in biomedical research.

  16. Predicting sample size required for classification performance

    Directory of Open Access Journals (Sweden)

    Figueroa Rosa L

    2012-02-01

    Full Text Available Abstract Background Supervised learning methods need annotated data in order to generate efficient models. Annotated data, however, is a relatively scarce resource and can be expensive to obtain. For both passive and active learning methods, there is a need to estimate the size of the annotated sample required to reach a performance target. Methods We designed and implemented a method that fits an inverse power law model to points of a given learning curve created using a small annotated training set. Fitting is carried out using nonlinear weighted least squares optimization. The fitted model is then used to predict the classifier's performance and confidence interval for larger sample sizes. For evaluation, the nonlinear weighted curve fitting method was applied to a set of learning curves generated using clinical text and waveform classification tasks with active and passive sampling methods, and predictions were validated using standard goodness-of-fit measures. As control we used an un-weighted fitting method. Results A total of 568 models were fitted and the model predictions were compared with the observed performances. Depending on the data set and sampling method, it took between 80 and 560 annotated samples to achieve mean average and root mean squared error below 0.01. Results also show that our weighted fitting method outperformed the baseline un-weighted method (p …). Conclusions This paper describes a simple and effective sample size prediction algorithm that conducts weighted fitting of learning curves. The algorithm outperformed an un-weighted algorithm described in previous literature. It can help researchers determine annotation sample size for supervised machine learning.
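    A minimal sketch of such a weighted inverse-power-law fit, assuming the common parameterization error(n) = a + b·n^c and using scipy's curve_fit with a weight vector; the learning-curve points and the weighting scheme are hypothetical, not the paper's data:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def inv_power_law(n, a, b, c):
        # Expected error as a function of training-set size n; with c < 0
        # the curve saturates at the asymptote a as n grows.
        return a + b * np.power(n, c)

    # Hypothetical learning-curve points: (sample size, observed error).
    n = np.array([50, 100, 200, 400, 800])
    err = np.array([0.31, 0.26, 0.22, 0.20, 0.185])

    # Weight later points more heavily (smaller sigma): they are usually
    # less noisy and closer to the asymptote.
    sigma = 1.0 / np.sqrt(n)

    params, _ = curve_fit(inv_power_law, n, err,
                          p0=(0.1, 1.0, -0.5), sigma=sigma, maxfev=10000)
    a, b, c = params
    print(f"asymptotic error ~ {a:.3f}")
    print(f"predicted error at n=5000: {inv_power_law(5000, a, b, c):.3f}")
    ```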

  17. Comparison of a quantum random number generator with pseudorandom number generators for their use in molecular Monte Carlo simulations.

    Science.gov (United States)

    Ghersi, Dario; Parakh, Abhishek; Mezei, Mihaly

    2017-12-05

    Four pseudorandom number generators were compared with a physical, quantum-based random number generator using the NIST suite of statistical tests, which only the quantum-based random number generator could successfully pass. We then measured the effect of the five random number generators on various calculated properties in different Markov-chain Monte Carlo simulations. Two types of systems were tested: conformational sampling of a small molecule in aqueous solution and liquid methanol under constant temperature and pressure. The results show that poor quality pseudorandom number generators produce results that deviate significantly from those obtained with the quantum-based random number generator, particularly in the case of the small molecule in aqueous solution setup. In contrast, the widely used Mersenne Twister pseudorandom generator and a 64-bit Linear Congruential Generator with a scrambler produce results that are statistically indistinguishable from those obtained with the quantum-based random number generator. © 2017 Wiley Periodicals, Inc.
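    A toy version of such a comparison, pitting the Mersenne Twister against a plain 64-bit LCG (without a scrambler) on a simple Monte Carlo estimate; the paper's simulations were molecular systems, so this only mirrors the spirit of the test:

    ```python
    import numpy as np

    def lcg(seed, n, a=6364136223846793005, c=1442695040888963407, m=2**64):
        # Plain 64-bit LCG, top bits taken, no scrambler -- illustrative only.
        out, x = np.empty(n), seed
        for i in range(n):
            x = (a * x + c) % m
            out[i] = (x >> 11) / float(2**53)
        return out

    def mc_pi(u):
        # Estimate pi from pairs of uniforms via the quarter-circle test.
        x, y = u[0::2], u[1::2]
        return 4.0 * np.mean(x * x + y * y < 1.0)

    n = 200_000
    mt = np.random.Generator(np.random.MT19937(1)).random(n)  # Mersenne Twister
    print("MT19937 :", mc_pi(mt))
    print("LCG     :", mc_pi(lcg(1, n)))
    ```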

  18. Random sampling of elementary flux modes in large-scale metabolic networks.

    Science.gov (United States)

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.

  19. WRAP Module 1 sampling and analysis plan

    Energy Technology Data Exchange (ETDEWEB)

    Mayancsik, B.A.

    1995-03-24

    This document provides the methodology to sample, screen, and analyze waste generated, processed, or otherwise the responsibility of the Waste Receiving and Processing Module 1 facility. This includes Low-Level Waste, Transuranic Waste, Mixed Waste, and Dangerous Waste.

  20. WRAP Module 1 sampling and analysis plan

    International Nuclear Information System (INIS)

    Mayancsik, B.A.

    1995-01-01

    This document provides the methodology to sample, screen, and analyze waste generated, processed, or otherwise the responsibility of the Waste Receiving and Processing Module 1 facility. This includes Low-Level Waste, Transuranic Waste, Mixed Waste, and Dangerous Waste

  1. Hybrid layer difference between sixth and seventh generation bonding agent

    Directory of Open Access Journals (Sweden)

    Grace Syavira Suryabrata

    2006-03-01

    Full Text Available Since etching is completed at the same stage as priming and bonding, when applying the sixth and seventh generation bonding, the exposed smear layers are constantly surrounded by primer and bonding and cannot collapse. The smear layer and the depth of penetration of resin bonding in dentinal tubules are completely integrated into hybrid layer. The purpose of this laboratory research was to study the penetration depth of two self etching adhesive. Fourteen samples of human extracted teeth were divided into two groups. Each groups consisted of seven samples, each of them was treated with sixth generation bonding agent and the other was treated with seventh generation bonding agent. The results disclosed that the penetration into dentinal tubules of seventh generation bonding agent was deeper than sixth generation bonding agent. Conclusion: bond strength will improve due to the increasing of penetration depth of resin bonding in dentinal tubules.

  2. Rational learning and information sampling: on the "naivety" assumption in sampling explanations of judgment biases.

    Science.gov (United States)

    Le Mens, Gaël; Denrell, Jerker

    2011-04-01

    Recent research has argued that several well-known judgment biases may be due to biases in the available information sample rather than to biased information processing. Most of these sample-based explanations assume that decision makers are "naive": They are not aware of the biases in the available information sample and do not correct for them. Here, we show that this "naivety" assumption is not necessary. Systematically biased judgments can emerge even when decision makers process available information perfectly and are also aware of how the information sample has been generated. Specifically, we develop a rational analysis of Denrell's (2005) experience sampling model, and we prove that when information search is interested rather than disinterested, even rational information sampling and processing can give rise to systematic patterns of errors in judgments. Our results illustrate that a tendency to favor alternatives for which outcome information is more accessible can be consistent with rational behavior. The model offers a rational explanation for behaviors that had previously been attributed to cognitive and motivational biases, such as the in-group bias or the tendency to prefer popular alternatives. 2011 APA, all rights reserved
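    A small simulation in the spirit of Denrell's experience sampling model: each agent updates a running-mean estimate rationally, but resamples an alternative only while its current estimate is favorable. The parameters are arbitrary; the frozen negative estimates produce a systematic downward bias even though each individual update is unbiased:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    TRUE_MEAN, SIGMA, AGENTS, PERIODS = 0.0, 1.0, 5000, 50

    final_estimates = []
    for _ in range(AGENTS):
        est, n = rng.normal(TRUE_MEAN, SIGMA), 1   # one forced first sample
        for _ in range(PERIODS):
            # Interested sampling: resample only when the estimate is favorable.
            if est > 0:
                obs = rng.normal(TRUE_MEAN, SIGMA)
                est = (n * est + obs) / (n + 1)    # running mean (rational update)
                n += 1
        final_estimates.append(est)

    print("mean final estimate:", np.mean(final_estimates))  # < 0: systematic bias
    ```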

  3. Genetic variations in the Dravidian population of South West coast of India: Implications in designing case-control studies.

    Science.gov (United States)

    D'Cunha, Anitha; Pandit, Lekha; Malli, Chaithra

    2017-06-01

    Indian data have been largely missing from genome-wide databases that provide information on genetic variations in different populations. This hinders association studies for complex disorders in India. This study aimed to determine whether the complex genetic structure and endogamy among Indians could potentially influence the design of case-control studies for autoimmune disorders in the south Indian population. A total of 12 single nucleotide variations (SNVs) related to genes associated with autoimmune disorders were genotyped in 370 healthy individuals belonging to six different caste groups in southern India. Allele frequencies were estimated; genetic divergence and phylogenetic relationships within the various caste groups and other HapMap populations were ascertained. Allele frequencies for all genotyped SNVs did not vary significantly among the different groups studied. Wright's FST was 0.001 per cent among the study population and 0.38 per cent when compared with the Gujarati in Houston (GIH) population in the HapMap data. The analysis of molecular variance showed that 97 per cent of the variation was attributable to differences within the study population, with the remainder due to differences between castes. Phylogenetic analysis showed a separation of the Dravidian population from other HapMap populations and particularly from the GIH population. Despite the complex genetic origins of the Indian population, our study indicated a low level of genetic differentiation among the Dravidian language-speaking people of south India. Case-control studies of association among Dravidians of south India may not require stratification based on language and caste.
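    For illustration, a basic single-locus Wright's FST computation of the kind underlying such comparisons, FST = (HT - HS)/HT; the allele frequencies and group sizes below are hypothetical, not the study's data:

    ```python
    import numpy as np

    def wright_fst(freqs, sizes):
        """Basic FST = (HT - HS) / HT for one biallelic locus.

        freqs: allele-A frequency in each subpopulation
        sizes: subpopulation sample sizes (used as weights)
        """
        freqs, sizes = np.asarray(freqs, float), np.asarray(sizes, float)
        w = sizes / sizes.sum()
        p_bar = np.sum(w * freqs)                  # overall allele frequency
        h_s = np.sum(w * 2 * freqs * (1 - freqs))  # mean within-group heterozygosity
        h_t = 2 * p_bar * (1 - p_bar)              # total expected heterozygosity
        return (h_t - h_s) / h_t

    # Hypothetical frequencies for one SNV in six caste groups (not study data).
    print(wright_fst([0.42, 0.44, 0.41, 0.43, 0.45, 0.42],
                     [60, 65, 55, 70, 60, 60]))
    ```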

  4. One in Four Individuals of African-American Ancestry Harbors a 5.5kb Deletion at chromosome 11q13.1

    Science.gov (United States)

    Zainabadi, Kayvan; Jain, Anuja V.; Donovan, Frank X.; Elashoff, David; Rao, Nagesh P.; Murty, Vundavalli V.; Chandrasekharappa, Settara C.; Srivatsan, Eri S.

    2014-01-01

    Cloning and sequencing of a 5.5kb deletion at chromosome 11q13.1 from HeLa cells, tumorigenic hybrids and two fibroblast cell lines has revealed homologous recombination between AluSx and AluY resulting in the deletion of intervening sequences. Long-range PCR of the 5.5kb sequence in 494 normal lymphocyte samples showed heterozygous deletion in 28.3% of African-American ancestry samples but only in 4.8% of Caucasian samples (p …). The deletion occurs in 27% of the YRI (Yoruba, West African) population but in none of the non-African populations. The HapMap analysis further identified strong linkage disequilibrium between 5 single nucleotide polymorphisms and the 5.5kb deletion in people of African ancestry. Computational analysis of the 175kb sequence surrounding the deletion site revealed enhanced flexibility, low thermodynamic stability, high repetitiveness, and stable stem-loop/hairpin secondary structures that are hallmarks of common fragile sites. PMID:24412158

  5. GET electronics samples data analysis

    International Nuclear Information System (INIS)

    Giovinazzo, J.; Goigoux, T.; Anvar, S.; Baron, P.; Blank, B.; Delagnes, E.; Grinyer, G.F.; Pancin, J.; Pedroza, J.L.; Pibernat, J.; Pollacco, E.; Rebii, A.

    2016-01-01

    The General Electronics for TPCs (GET) has been developed to equip a generation of time projection chamber detectors for nuclear physics, and may also be used for a wider range of detector types. The goal of this paper is to propose initial analysis procedures to be applied to raw data samples from the GET system, in order to correct for systematic effects observed in test measurements. We also present a method to estimate the response function of the GET system channels. The response function is required in analyses where the input signal needs to be reconstructed, in terms of time distribution, from the registered output samples.

  6. Generation and detection technique of laser-ultrasonic

    International Nuclear Information System (INIS)

    Dho, Sang Whoe; Lee, Seung Seok

    1999-01-01

    A number of physical processes may take place when a solid surface is illuminated by a pulsed laser. At lower powers these include heating and the generation of thermal waves and elastic waves (ultrasound). At higher powers, material may be ablated from the surface and a plasma formed, while in the sample there may be melting, plastic deformation and even the formation of cracks. In this letter we consider laser-ultrasonic generation techniques in all of these regimes. We also consider the measurement of laser-generated ultrasound based on optical methods.

  7. Gas Generation from K East Basin Sludges - Series II Testing

    International Nuclear Information System (INIS)

    Bryan, Samuel A.; Delegard, Calvin H.; Schmidt, Andrew J.; Sell, Rachel L.; Silvers, Kurt L.; Gano, Susan R.; Thornton, Brenda M.

    2001-01-01

    This report describes work to examine the gas generation behavior of actual K East (KE) Basin floor, pit and canister sludge. Mixed, unmixed, and fractionated KE canister sludge were tested, along with floor and pit sludges from areas in the KE Basin not previously sampled. The first report in this series focused on gas generation from KE floor and canister sludge collected using a consolidated sampling technique. The third report will present results of gas generation testing of irradiated uranium fuel fragments with and without sludge addition. The path forward for management of the K Basin sludge is to retrieve, ship, and store the sludge at T Plant until final processing at some future date. Gas generation will impact the designs and costs of systems associated with retrieval, transportation and storage of sludge.

  8. Experimental study of power generation utilizing human excreta

    International Nuclear Information System (INIS)

    Mudasar, Roshaan; Kim, Man-Hoe

    2017-01-01

    Highlights: • Power generation from human excreta has been studied under ambient conditions. • Biogas increases with solid wastes and continuous feeding at mesophilic conditions. • Understand the potential of human excreta for domestic power generating systems. • 26.8 kWh of power is generated using 0.35 m3/kg of biogas from 35 kg of waste. • Continuous feeding produces 0.7 m3/kg of biogas and generates 60 kWh of power. - Abstract: This study presents the energetic performance of the biomass to produce power for micro-scale domestic usage. Human excreta were chosen as the subject of the study to investigate their potential to produce biogas under ambient conditions. Furthermore, the research examines the approaches by which biogas production can be enhanced and purified, leading to a high-power generation system. The experimental work focuses on the design and fabrication of a biogas digester with a reverse solar reflector, water scrubbing tower, and a dryer. Anaerobic digestion has been considered as the decomposition method, using solar energy as the heat source. Specifically, two types of experiments have been performed, namely, feces-to-water weight proportion and continuous feeding experiments, each involving a set of six samples. The effect of parameters such as pH, ambient temperature, and biogas upgradation reveals that the volume of biogas and the power generation are best obtained when an 8:2 feces-to-water weight sample is employed and when feeding is applied every fifth day. In addition, this study discusses the environmental prospects of the biogas technology, which is achieved by using the water purification method to improve the methane percentage to 85% and remove undesired gases. The motivation behind this work is to understand the potential of human excreta for the development of domestic power generating systems. The results obtained reveal that 0.35 m3/kg of biogas is produced with the 8:2 weight proportion sample, which

  9. Monte Carlo and Quasi-Monte Carlo Sampling

    CERN Document Server

    Lemieux, Christiane

    2009-01-01

    Presents essential tools for using quasi-Monte Carlo sampling in practice. This book focuses on issues related to Monte Carlo methods - uniform and non-uniform random number generation, variance reduction techniques. It covers several aspects of quasi-Monte Carlo methods.

  10. Improved Sampling Algorithms in the Risk-Informed Safety Margin Characterization Toolkit

    International Nuclear Information System (INIS)

    Mandelli, Diego; Smith, Curtis Lee; Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua Joseph

    2015-01-01

    The RISMC approach is developing an advanced set of methodologies and algorithms in order to perform Probabilistic Risk Analyses (PRAs). In contrast to classical PRA methods, which are based on Event-Tree and Fault-Tree methods, the RISMC approach largely employs system simulator codes coupled with stochastic analysis tools. The basic idea is to randomly perturb (by employing sampling algorithms) the timing and sequencing of events and the internal parameters of the system codes (i.e., uncertain parameters) in order to estimate stochastic parameters such as core damage probability. Applied to complex systems such as nuclear power plants, this approach requires performing a series of computationally expensive simulation runs given a large set of uncertain parameters. These types of analysis are affected by two issues. Firstly, the space of the possible solutions (a.k.a., the issue space or the response surface) can be sampled only very sparsely, and this precludes the ability to fully analyze the impact of uncertainties on the system dynamics. Secondly, large amounts of data are generated and tools to generate knowledge from such data sets are not yet available. This report focuses on the first issue and in particular employs novel methods that optimize the information generated by the sampling process by sampling unexplored and risk-significant regions of the issue space: adaptive (smart) sampling algorithms. They infer the system response from surrogate models constructed from existing samples and predict the most relevant location of the next sample. It is therefore possible to understand features of the issue space with a small number of carefully selected samples. In this report, we present how it is possible to perform adaptive sampling using the RISMC toolkit and highlight the advantages compared to more classical sampling approaches such as Monte Carlo. We employ RAVEN to perform such statistical analyses using both analytical cases and another RISMC code: RELAP-7.

  11. Improved Sampling Algorithms in the Risk-Informed Safety Margin Characterization Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cogliati, Joshua Joseph [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The RISMC approach is developing an advanced set of methodologies and algorithms in order to perform Probabilistic Risk Analyses (PRAs). In contrast to classical PRA methods, which are based on Event-Tree and Fault-Tree methods, the RISMC approach largely employs system simulator codes coupled with stochastic analysis tools. The basic idea is to randomly perturb (by employing sampling algorithms) the timing and sequencing of events and the internal parameters of the system codes (i.e., uncertain parameters) in order to estimate stochastic parameters such as core damage probability. Applied to complex systems such as nuclear power plants, this approach requires performing a series of computationally expensive simulation runs given a large set of uncertain parameters. These types of analysis are affected by two issues. Firstly, the space of the possible solutions (a.k.a., the issue space or the response surface) can be sampled only very sparsely, and this precludes the ability to fully analyze the impact of uncertainties on the system dynamics. Secondly, large amounts of data are generated and tools to generate knowledge from such data sets are not yet available. This report focuses on the first issue and in particular employs novel methods that optimize the information generated by the sampling process by sampling unexplored and risk-significant regions of the issue space: adaptive (smart) sampling algorithms. They infer the system response from surrogate models constructed from existing samples and predict the most relevant location of the next sample. It is therefore possible to understand features of the issue space with a small number of carefully selected samples. In this report, we present how it is possible to perform adaptive sampling using the RISMC toolkit and highlight the advantages compared to more classical sampling approaches such as Monte Carlo. We employ RAVEN to perform such statistical analyses using both analytical cases and another RISMC code: RELAP-7.
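    A generic sketch of the adaptive-sampling loop described here (not RAVEN's algorithm): a cheap surrogate is fit to the samples collected so far, and the next expensive run is placed where the predicted response is closest to a failure threshold, i.e., in the risk-significant region. The toy response function, surrogate choice and threshold are assumptions:

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    rng = np.random.default_rng(0)

    def system_response(x):
        # Stand-in for an expensive simulator run (e.g., peak clad temperature).
        return 1.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

    THRESHOLD = 0.0                      # "failure" when the response crosses 0

    # Start from a small space-filling batch of (normalized) parameter points.
    X = rng.random((10, 2))
    y = system_response(X)

    for _ in range(30):
        surrogate = KNeighborsRegressor(n_neighbors=3).fit(X, y)
        # Score a cheap pool of candidates; prefer points whose predicted
        # response is closest to the threshold (the limit surface).
        cand = rng.random((500, 2))
        scores = np.abs(surrogate.predict(cand) - THRESHOLD)
        x_next = cand[np.argmin(scores)][None, :]
        X = np.vstack([X, x_next])
        y = np.append(y, system_response(x_next))

    print("fraction of samples near the limit surface:",
          np.mean(np.abs(y) < 0.05))
    ```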

  12. RenderGAN: Generating Realistic Labeled Data

    Directory of Open Access Journals (Sweden)

    Leon Sixt

    2018-06-01

    Full Text Available Deep Convolutional Neural Networks (DCNNs) are showing remarkable performance on many computer vision tasks. Due to their large parameter space, they require many labeled samples when trained in a supervised setting. The costs of annotating data manually can render the use of DCNNs infeasible. We present a novel framework called RenderGAN that can generate large amounts of realistic, labeled images by combining a 3D model and the Generative Adversarial Network framework. In our approach, image augmentations (e.g., lighting, background, and detail) are learned from unlabeled data such that the generated images are strikingly realistic while preserving the labels known from the 3D model. We apply the RenderGAN framework to generate images of barcode-like markers that are attached to honeybees. Training a DCNN on data generated by the RenderGAN yields considerably better performance than training it on various baselines.

  13. Generation of Rayleigh waves into mortar and concrete samples.

    Science.gov (United States)

    Piwakowski, B; Fnine, Abdelilah; Goueygou, M; Buyle-Bodin, F

    2004-04-01

    The paper deals with a non-destructive method for characterizing the degraded cover of concrete structures using high-frequency ultrasound. In a preliminary study, the authors emphasized the interest of using higher-frequency Rayleigh waves (within the 0.2-1 MHz frequency band) for on-site inspection of concrete structures with subsurface damage. The present study is a continuation of that work and aims at optimizing the generation and reception of Rayleigh waves in mortar and concrete by means of wedge transducers. This is performed experimentally by checking the influence of the wedge material and coupling agent on the surface wave parameters. The selection of the best wedge/coupling combination is performed by searching separately for the best wedge material and the best coupling material. Three wedge materials and five coupling agents were tested. For each setup, the five parameters obtained from the surface wave measurement, i.e. the frequency band, the maximal available central frequency, the group velocity error and its standard deviation, and the error in the velocity dispersion characteristic, were investigated and ranked as a function of the wedge material and the coupling agent. The selection criteria were chosen so as to minimize the absorption of both materials, the randomness of measurements and the systematic error of the group velocity and of the dispersion characteristic. Among the three tested wedge materials, Teflon was found to be the best. The investigation of the coupling agents shows that gel-type materials are the best solution. The "thick" materials displaying higher viscosity were found to be the worst. The results also show that the use of a thin plastic film combined with the coupling agent even increases the bandwidth and decreases the uncertainty of measurements.

  14. Sample Selection for Training Cascade Detectors.

    Science.gov (United States)

    Vállez, Noelia; Deniz, Oscar; Bueno, Gloria

    2015-01-01

    Automatic detection systems usually require large and representative training datasets in order to obtain good detection and false positive rates. In practice, the positive set has few samples and/or the negative set must represent anything except the object of interest. In this respect, the negative set typically contains orders of magnitude more images than the positive set. However, imbalanced training databases lead to biased classifiers. In this paper, we focus our attention on a negative sample selection method to properly balance the training data for cascade detectors. The method is based on the selection of the most informative false positive samples generated in one stage to feed the next stage. The results show that the proposed cascade detector with sample selection obtains on average better partial AUC and smaller standard deviation than the other compared cascade detectors.
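    A minimal sketch of this hard-negative mining idea, using a linear SVM as a stand-in stage classifier: after each stage, the negatives the current stage scores as most object-like (its most informative false positives) become the negative set for the next stage. The synthetic data and classifier choice are illustrative, not the paper's detector:

    ```python
    import numpy as np
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)

    # Synthetic stand-ins: positives clustered, negatives diffuse.
    pos = rng.normal(1.0, 0.5, size=(200, 5))
    neg = rng.normal(0.0, 1.0, size=(20000, 5))

    def train_stage(p, n):
        X = np.vstack([p, n])
        y = np.r_[np.ones(len(p)), np.zeros(len(n))]
        return LinearSVC(dual=False).fit(X, y)

    n_per_stage = 600
    neg_pool = neg[rng.choice(len(neg), n_per_stage, replace=False)]
    for stage in range(3):
        clf = train_stage(pos, neg_pool)
        scores = clf.decision_function(neg)   # score the full negative set
        hardest = np.argsort(-scores)         # most "positive-looking" first
        # Feed the most informative false positives to the next stage.
        neg_pool = neg[hardest[:n_per_stage]]
        print(f"stage {stage}: false-positive rate ="
              f" {np.mean(clf.predict(neg) == 1):.4f}")
    ```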

  15. LVQ-SMOTE - Learning Vector Quantization based Synthetic Minority Over-sampling Technique for biomedical data.

    Science.gov (United States)

    Nakamura, Munehiro; Kajiwara, Yusuke; Otsuka, Atsushi; Kimura, Haruhiko

    2013-10-02

    Over-sampling methods based on the Synthetic Minority Over-sampling Technique (SMOTE) have been proposed for classification problems on imbalanced biomedical data. However, the existing over-sampling methods achieve slightly better, or sometimes worse, results than the simplest SMOTE. In order to improve the effectiveness of SMOTE, this paper presents a novel over-sampling method using codebooks obtained by learning vector quantization. In general, even when an existing SMOTE is applied to a biomedical dataset, its empty feature space is still so huge that most classification algorithms would not perform well on estimating borderlines between classes. To tackle this problem, our over-sampling method generates synthetic samples that occupy more feature space than the other SMOTE algorithms. Briefly, our over-sampling method generates useful synthetic samples by referring to actual samples taken from real-world datasets. Experiments on eight real-world imbalanced datasets demonstrate that our proposed over-sampling method performs better than the simplest SMOTE on four of five standard classification algorithms. Moreover, the performance of our method increases if the latest SMOTE variant, called MWMOTE, is used in our algorithm. Experiments on datasets for β-turn type prediction show some important patterns that have not been seen in previous analyses. The proposed over-sampling method generates useful synthetic samples for the classification of imbalanced biomedical data. Besides, the proposed over-sampling method is basically compatible with basic classification algorithms and the existing over-sampling methods.
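    A simplified sketch of codebook-seeded SMOTE-style synthesis. Here k-means centroids stand in for the LVQ codebooks (a deliberate simplification, not the paper's algorithm), and synthetic points are drawn on segments between minority samples and codebook vectors so they spread into the space the codebooks cover:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def codebook_smote(X_min, n_synthetic, n_codebooks=5, rng=None):
        """SMOTE-style over-sampling seeded by codebook vectors."""
        rng = rng or np.random.default_rng(0)
        codebooks = KMeans(n_clusters=n_codebooks, n_init=10,
                           random_state=0).fit(X_min).cluster_centers_
        synthetic = []
        for _ in range(n_synthetic):
            x = X_min[rng.integers(len(X_min))]       # a real minority sample
            c = codebooks[rng.integers(n_codebooks)]  # a codebook vector
            synthetic.append(x + rng.random() * (c - x))  # point on the segment
        return np.array(synthetic)

    X_min = np.random.default_rng(1).normal(size=(40, 8))  # toy minority class
    print(codebook_smote(X_min, 100).shape)                # (100, 8)
    ```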

  16. Sampling frequency affects ActiGraph activity counts

    DEFF Research Database (Denmark)

    Brønd, Jan Christian; Arvidsson, Daniel

    that is normally performed at frequencies higher than 2.5 Hz. With the ActiGraph model GT3X one has the option to select a sample frequency from 30 to 100 Hz. This study investigated the effect of the sampling frequency on the output of the bandpass filter. Methods: A synthetic frequency sweep of 0-15 Hz was generated in Matlab and sampled at frequencies of 30-100 Hz. Also, acceleration signals during indoor walking and running were sampled at 30 Hz using the ActiGraph GT3X and resampled in Matlab to frequencies of 40-100 Hz. All data were processed with the ActiLife software. Results: Acceleration frequencies between 5 and 15 Hz escaped the bandpass filter when sampled at 40, 50, 70, 80 and 100 Hz, while this was not the case when sampled at 30, 60 and 90 Hz. During the ambulatory activities this artifact resulted in different activity count output from the ActiLife software with different sampling frequency...
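    A rough reconstruction of the experiment's shape, assuming a 0.25-2.5 Hz band-pass similar to the ActiGraph body-movement band: a 0-15 Hz sweep is decimated to each candidate rate and the filtered signal energy is compared. This is illustrative only; it is not the ActiLife processing chain, and the exact leakage pattern depends on the decimation filter:

    ```python
    import numpy as np
    from scipy.signal import butter, sosfiltfilt, resample_poly

    def bandpass_energy(fs, duration=60.0):
        # Linear 0-15 Hz sweep generated at a high rate, then decimated to fs.
        fs_hi = 300
        t = np.arange(0, duration, 1 / fs_hi)
        sweep = np.sin(2 * np.pi * (15.0 * t / (2 * duration)) * t)
        x = resample_poly(sweep, fs, fs_hi)
        # 0.25-2.5 Hz band-pass (assumed stand-in for the ActiGraph band).
        sos = butter(4, [0.25, 2.5], btype="band", fs=fs, output="sos")
        return np.sum(sosfiltfilt(sos, x) ** 2)

    for fs in (30, 40, 50, 60, 70, 80, 90, 100):
        print(fs, f"{bandpass_energy(fs):.1f}")
    ```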

  17. Non-Destructive Method by Gamma Sampling Measurements for Radiological Characterization of a Steam Generator: Physical and Numerical Modeling for ANIMMA (23-27 June 2013)

    International Nuclear Information System (INIS)

    Auge, G.; Rottner, B.; Dubois, C.

    2013-06-01

    The radiological characterization of a steam generator consists of evaluating the global radiological activity in the tube bundle. In this paper, we present a non-destructive method and the analysis of results from gamma sampling measurements of a sample of U-tubes in the bundle. On site, the implementation of the methodology is fairly easy, but the analysis of the results is more complicated due to the long path of the gamma rays (from 60Co, which are quite penetrating) and the heterogeneous activity of the U-tube bundle, whose tubes do not share the same life cycle. We explain why the periodic spatial arrangement also complicates the analysis. Furthermore, we have taken into account the environment of every tube measured, because of the external influence of the activity of other U-tubes (the nearest, the most distant, and potential hot spots). A great number of independent influence coefficients had to be considered (roughly 18 million). The analysis is based on physical and numerical modeling and uses a Cholesky algorithm, solving the problem while saving machine time. (authors)

  18. Assess and Predict Automatic Generation Control Performances for Thermal Power Generation Units Based on Modeling Techniques

    Science.gov (United States)

    Zhao, Yan; Yang, Zijiang; Gao, Song; Liu, Jinbiao

    2018-02-01

    Automatic generation control (AGC) is a key technology to maintain the real-time balance between power generation and load, and to ensure the quality of power supply. Power grids require each power generation unit to have a satisfactory AGC performance, as specified in two detailed rules. The two rules provide a set of indices to measure the AGC performance of a power generation unit. However, the commonly used method to calculate these indices is based on particular data samples from AGC responses and will lead to incorrect results in practice. This paper proposes a new method to estimate the AGC performance indices via system identification techniques. In addition, a nonlinear regression model between the performance indices and the load command is built in order to predict the AGC performance indices. The effectiveness of the proposed method is validated through industrial case studies.

  19. Optimum sample size allocation to minimize cost or maximize power for the two-sample trimmed mean test.

    Science.gov (United States)

    Guo, Jiin-Huarng; Luh, Wei-Ming

    2009-05-01

    When planning a study, sample size determination is one of the most important tasks facing the researcher. The size will depend on the purpose of the study, the cost limitations, and the nature of the data. By specifying the standard deviation ratio and/or the sample size ratio, the present study considers the problem of heterogeneous variances and non-normality for Yuen's two-group test and develops sample size formulas to minimize the total cost or maximize the power of the test. For a given power, the sample size allocation ratio can be manipulated so that the proposed formulas can minimize the total cost, the total sample size, or the sum of total sample size and total cost. On the other hand, for a given total cost, the optimum sample size allocation ratio can maximize the statistical power of the test. After the sample size is determined, the present simulation applies Yuen's test to the sample generated, and then the procedure is validated in terms of Type I errors and power. Simulation results show that the proposed formulas can control Type I errors and achieve the desired power under the various conditions specified. Finally, the implications for determining sample sizes in experimental studies and future research are discussed.
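    A sketch of the cost-minimizing allocation idea in the simpler untrimmed normal-theory setting (an assumption; the paper works with Yuen's trimmed means), where the classical result n1/n2 = (sigma1/sigma2) * sqrt(cost2/cost1) applies:

    ```python
    import math
    from scipy.stats import norm

    def optimal_sizes(sigma1, sigma2, cost1, cost2, delta,
                      alpha=0.05, power=0.80):
        """Cost-minimizing two-group sizes for a normal-theory mean comparison.

        Classic result: n1/n2 = (sigma1/sigma2) * sqrt(cost2/cost1).
        This is the untrimmed analogue of the paper's setting (simplification).
        """
        r = (sigma1 / sigma2) * math.sqrt(cost2 / cost1)   # r = n1 / n2
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        # Solve delta / sqrt(sigma1^2/n1 + sigma2^2/n2) = z with n1 = r * n2.
        n2 = (z / delta) ** 2 * (sigma1 ** 2 / r + sigma2 ** 2)
        n1 = r * n2
        return math.ceil(n1), math.ceil(n2)

    # Example: group 1 twice as variable, group 2 twice as costly to sample.
    print(optimal_sizes(sigma1=2.0, sigma2=1.0, cost1=1.0, cost2=2.0, delta=0.5))
    ```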

  20. Self generation, small generation, and embedded generation issues

    International Nuclear Information System (INIS)

    2001-01-01

    The New Brunswick Market Design Committee for electric power restructuring has been directed to examine issues regarding cogeneration and small-scale, on-site generation and how they will fit within the framework of the bilateral contract market. The Committee will also have to deal with issues of generation embedded in a distribution system. The Committee has defined cogeneration as the simultaneous production of electricity and useful thermal energy. Self-generation has been defined as small-scale power generation by an end-user, while embedded generation has been defined as a generation facility that is located within a distribution utility but is not directly connected to the transmission system. The Committee has postponed its decision on whether embedded generation will be eligible to participate under the bilateral contract market for electricity. This report discusses general issues such as the physical support of generation, market support of generation, transition issues and policy issues. It also discusses generation support issues such as operating reserves, transmission tariff issues, and distribution tariffs. Market support issues such as transmission access for generation sales were also considered, along with market access for generation sales, and net metering for behind the meter generation. 7 refs., 1 tab

  1. Finding metastabilities in reversible Markov chains based on incomplete sampling

    Directory of Open Access Journals (Sweden)

    Fackeldey Konstantin

    2017-01-01

    Full Text Available In order to fully characterize the state-transition behaviour of finite Markov chains one needs to provide the corresponding transition matrix P. In many applications such as molecular simulation and drug design, the entries of the transition matrix P are estimated by generating realizations of the Markov chain and determining the one-step conditional probability Pij for a transition from state i to state j. This sampling can be computationally very demanding. Therefore, it is a good idea to reduce the sampling effort. The main purpose of this paper is to design a sampling strategy which provides a partial sampling of only a subset of the rows of such a matrix P. Our proposed approach fits very well to stochastic processes stemming from simulation of molecular systems or random walks on graphs, and it is different from matrix completion approaches, which try to approximate the transition matrix by using a low-rank assumption. It will be shown how Markov chains can be analyzed on the basis of a partial sampling. More precisely: first, we estimate the stationary distribution from a partially given matrix P. Second, we estimate the infinitesimal generator Q of P on the basis of this stationary distribution. Third, from the generator we compute the leading invariant subspace, which should be identical to the leading invariant subspace of P. Fourth, we apply Robust Perron Cluster Analysis (PCCA+) in order to identify metastabilities using this subspace.
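    A compact sketch of the first three steps on a fully known toy matrix (the paper's contribution, recovering this information from only a subset of rows, is not reproduced here): stationary distribution from the leading left eigenvector, generator via the matrix logarithm, and the leading invariant subspace that would be handed to PCCA+:

    ```python
    import numpy as np
    from scipy.linalg import logm, eig

    # Toy metastable 3-state transition matrix (fully known in this sketch).
    P = np.array([[0.97, 0.02, 0.01],
                  [0.02, 0.95, 0.03],
                  [0.01, 0.03, 0.96]])

    # Step 1: stationary distribution = left eigenvector of P for eigenvalue 1.
    vals, vecs = eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    pi /= pi.sum()

    # Step 2: infinitesimal generator via the matrix logarithm (lag time 1).
    Q = np.real(logm(P))

    # Step 3: leading invariant subspace: eigenvectors of Q with eigenvalues
    # near 0 (they match the dominant eigenvectors of P).
    qvals, qvecs = eig(Q)
    order = np.argsort(-np.real(qvals))          # 0 > lambda_2 > lambda_3 ...
    subspace = np.real(qvecs[:, order[:2]])      # input for PCCA+ clustering

    print("stationary:", np.round(pi, 3))
    print("Q eigenvalues:", np.round(np.real(qvals[order]), 4))
    ```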

  2. Strategic management of steam generators

    International Nuclear Information System (INIS)

    Hernalsteen, P.; Berthe, J.

    1991-01-01

    This paper addresses the general approach followed in Belgium for managing any kind of generic defect affecting a steam generator tube bundle. This involves the successive steps of: problem detection, dedicated sample monitoring, implementation of preventive methods, development of specific plugging criteria, dedicated 100% inspection, implementation of repair methods, adjusted sample monitoring and repair versus replacement strategy. These steps are illustrated by the particular case of Primary Water Stress Corrosion Cracking in tube roll transitions, which is presently the main problem for two Belgian units, Doel-3 and Tihange-2. (author)

  3. Site-specific waste management instruction for the field sampling organization

    International Nuclear Information System (INIS)

    Bryant, D.L.

    1997-01-01

    The Site-Specific Waste Management Instruction (SSWMI) provides guidance for the management of waste generated from field-sampling activities performed by the Environmental Restoration Contractor (ERC) Sampling Organization that are not managed as part of a project SSWMI. Generally, the waste is unused preserved groundwater trip blanks, used and expired calibration solutions, and other similar waste that cannot be returned to an ERC project for disposal. The specific waste streams addressed by this SSWMI are identified in Section 2.0. This SSWMI was prepared in accordance with BHI-EE-02, Environmental Requirements. Waste generated from field sample collection activities should be returned to the project and managed in accordance with the applicable project-specific SSWMI whenever possible. However, returning all field sample collection and associated waste to a project for disposal may not always be practical or cost effective. Therefore, the ERC field sampling organization must manage and arrange to dispose of the waste using the Bechtel Hanford, Inc. (BHI) Field Support Waste Management (FSWM) services. This SSWMI addresses those waste streams that are the responsibility of the field sampling organization to manage and make arrangements for disposal.

  4. Semantic attributes based texture generation

    Science.gov (United States)

    Chi, Huifang; Gan, Yanhai; Qi, Lin; Dong, Junyu; Madessa, Amanuel Hirpa

    2018-04-01

    Semantic attributes are commonly used for texture description. They can be used to describe the information of a texture, such as patterns, textons, distributions, brightness, and so on. Generally speaking, semantic attributes are more concrete descriptors than perceptual features. Therefore, it is practical to generate texture images from semantic attributes. In this paper, we propose to generate high-quality texture images from semantic attributes. Over the last two decades, several works have been done on texture synthesis and generation, most of them focusing on example-based texture synthesis and procedural texture generation. Semantic-attribute-based texture generation still deserves more attention. Gan et al. proposed a useful joint model for perception-driven texture generation. However, perceptual features are non-objective spatial statistics used by humans to distinguish different textures in pre-attentive situations. To convey more descriptive information about texture appearance, semantic attributes, which are more in line with how humans describe textures, are desired. In this paper, we use a sigmoid cross-entropy loss in an auxiliary model to provide enough information for the generator. Consequently, the discriminator is released from the relatively intractable task of figuring out the joint distribution of condition vectors and samples. To demonstrate the validity of our method, we compare our method to Gan et al.'s method on generating textures, through experiments on PTD and DTD. All experimental results show that our model can generate textures from semantic attributes.

  5. Galaxy LIMS for next-generation sequencing

    NARCIS (Netherlands)

    Scholtalbers, J.; Rossler, J.; Sorn, P.; Graaf, J. de; Boisguerin, V.; Castle, J.; Sahin, U.

    2013-01-01

    SUMMARY: We have developed a laboratory information management system (LIMS) for a next-generation sequencing (NGS) laboratory within the existing Galaxy platform. The system provides lab technicians standard and customizable sample information forms, barcoded submission forms, tracking of input

  6. Analytical and between-subject variation of thrombin generation measured by calibrated automated thrombography on plasma samples.

    Science.gov (United States)

    Kristensen, Anne F; Kristensen, Søren R; Falkmer, Ursula; Münster, Anna-Marie B; Pedersen, Shona

    2018-05-01

    The Calibrated Automated Thrombography (CAT) is an in vitro thrombin generation (TG) assay that holds promise as a valuable tool within clinical diagnostics. However, the technique has a considerable analytical variation, and we therefore investigated the analytical and between-subject variation of CAT systematically. Moreover, we assessed the application of an internal standard for normalization to diminish variation. 20 healthy volunteers each donated one blood sample, which was subsequently centrifuged, aliquoted and stored at -80 °C prior to analysis. The analytical variation was determined over eight runs, in which plasma from the same seven volunteers was processed in triplicate, and for the between-subject variation, TG analysis was performed on plasma from all 20 volunteers. The trigger reagents used for the TG assays included both PPP reagent containing 5 pM tissue factor (TF) and PPPlow with 1 pM TF. Plasma drawn from a single donor was applied to all plates as an internal standard for each TG analysis and was subsequently used for normalization. The total analytical variation for TG analysis is 3-14% with the PPPlow reagent and 9-13% with the PPP reagent. This variation can be reduced only slightly by using an internal standard, and mainly for the ETP (endogenous thrombin potential). The between-subject variation is higher with PPPlow than with PPP, and it is considerably higher than the analytical variation. TG thus has a rather high inherent analytical variation, which is nevertheless considerably lower than the between-subject variation when PPPlow is used as reagent.
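
    A minimal sketch of how internal-standard normalization of this kind could look; the function name, the simple ratio correction and the numbers are illustrative assumptions, not the study's actual protocol.

    ```python
    def normalize_etp(sample_etp, plate_is_etp, reference_is_etp):
        """Normalize an ETP measurement by the internal standard (IS) run on
        the same plate: scale by the ratio of the IS's long-run reference
        value to its value on this plate. The plain ratio correction is an
        assumption for illustration."""
        return sample_etp * (reference_is_etp / plate_is_etp)

    # Example: the plate's IS read 5% high, so sample values are scaled down.
    print(normalize_etp(sample_etp=1500.0, plate_is_etp=1260.0,
                        reference_is_etp=1200.0))  # -> ~1428.6
    ```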

  7. Generation Y preferences towards wine

    DEFF Research Database (Denmark)

    Chrysochou, Polymeros; Krystallis Krontalis, Athanasios; Mocanu, Ana

    2012-01-01

    Purpose – The purpose of this paper is to explore differences in wine preferences between Generation Y and older cohorts in the USA. Design/methodology/approach – A total of 260 US consumers participated in a web-based survey that took place in April 2010. The best-worst scaling method was applied to measure the level of importance given by participants to a list of the most common attributes used in choice of wine. Independent sample t-tests were applied to compare the best-worst scores between Generation Y and older cohorts. Findings – Differences were found in the level of importance that Generation Y gives to wine attributes in comparison to older cohorts. Generation Y was found to attach more importance to attributes such as “Someone recommended it”, “Attractive front label” and “Promotional display in-store”, whereas older cohorts gave more importance to attributes such as “I read about it…”

  8. The results of experimental studies of VLF-ULF electromagnetic emission by rock samples due to mechanical action

    Science.gov (United States)

    Panfilov, A. A.

    2014-06-01

    The paper presents the results of laboratory experiments on the excitation of electromagnetic emissions (the electric component of electromagnetic fields) in rock samples under different forms of mechanical stress. It was shown that samples generate electric impulses with different spectra when impact action, gradual loading or dynamic friction is applied. It was ascertained that the level and spectral composition of the signals generated by rock samples change with an increasing number of hits. It was found that strong electromagnetic signals, generated while rock samples were fracturing, were accompanied by repetitive weak but perceptible variations in the electric field intensity in narrow frequency ranges.

  9. Identification of copy number variants defining genomic differences among major human groups.

    Directory of Open Access Journals (Sweden)

    Lluís Armengol

    Full Text Available BACKGROUND: Understanding the genetic contribution to phenotype variation of human groups is necessary to elucidate differences in disease predisposition and response to pharmaceutical treatments in different human populations. METHODOLOGY/PRINCIPAL FINDINGS: We have investigated the genome-wide profile of structural variation on pooled samples from the three populations studied in the HapMap project by comparative genome hybridization (CGH) on different array platforms. We have identified and experimentally validated 33 genomic loci that show significant copy number differences from one population to another. Interestingly, we found an enrichment of genes related to environment adaptation (immune response, lipid metabolism and extracellular space) within these regions, and the study of expression data revealed that more than half of the copy number variants (CNVs) translate into gene-expression differences among populations, suggesting that they could have functional consequences. In addition, the identification of single nucleotide polymorphisms (SNPs) that are in linkage disequilibrium with the copy number alleles allowed us to detect evidence of population differentiation and recent selection at the nucleotide variation level. CONCLUSIONS: Overall, our results provide a comprehensive view of relevant copy number changes that might play a role in phenotypic differences among major human populations, and generate a list of interesting candidates for future studies.

  10. A simulative comparison of respondent driven sampling with incentivized snowball sampling – the “strudel effect”

    Science.gov (United States)

    Gyarmathy, V. Anna; Johnston, Lisa G.; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A.

    2014-01-01

    Background Respondent driven sampling (RDS) and Incentivized Snowball Sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). Methods We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania (“original sample”) to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. Results The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1 to 12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. Conclusions When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called “strudel effect” is discussed in the paper. PMID:24360650

  11. A meta-analysis of 120 246 individuals identifies 18 new loci for fibrinogen concentration

    DEFF Research Database (Denmark)

    de Vries, Paul S; Chasman, Daniel I; Sabater-Lleal, Maria

    2016-01-01

    Genome-wide association studies have previously identified 23 genetic loci associated with circulating fibrinogen concentration. These studies used HapMap imputation and did not examine the X chromosome. 1000 Genomes imputation provides better coverage of uncommon variants, and includes indels. …

  12. A meta-analysis of 120 246 individuals identifies 18 new loci for fibrinogen concentration

    NARCIS (Netherlands)

    P.S. de Vries (Paul); D.I. Chasman (Daniel); M. Sabater-Lleal (Maria); M.-H. Chen (Ming-Huei); J.E. Huffman (Jennifer E.); M. Steri (Maristella); W. Tang (Weihong); A. Teumer (Alexander); R.E. Marioni (Riccardo); V. Grossmann (Vera); J.J. Hottenga (Jouke Jan); S. Trompet (Stella); M. Müller-Nurasyid (Martina); J.H. Zhao (Jing Hua); J. Brody (Jennifer); M.E. Kleber (Marcus); X. Guo (Xiuqing); J.J. Wang (Jie Jin); P. Auer (Paul); J. Attia (John); L.R. Yanek (Lisa); T.S. Ahluwalia (Tarunveer Singh); J. Lahti (Jari); C. Venturini (Cristina); T. Tanaka (Toshiko); L.F. Bielak (Lawrence F.); P.K. Joshi (Peter); A. Rocanin-Arjo (Ares); I. Kolcic (Ivana); P. Navarro (Pau); L.M. Rose (Lynda); C. Oldmeadow (Christopher); H. Riess (Helene); J. Mazur (Johanna); S. Basu (Saonli); A. Goel (Anuj); Q. Yang (Qiong); M. Ghanbari (Mohsen); G. Willemsen (Gonneke); A. Rumley (Ann); E. Fiorillo (Edoardo); A.J. de Craen (Anton); A. Grotevendt (Anne); R.A. Scott (Robert); K.D. Taylor (Kent D.); G.E. Delgado (Graciela E.); J. Yao (Jie); A. Kifley (Annette); C. Kooperberg (Charles); R. Qayyum (Rehan); L. Lopez (Lorna M.); T.L. Berentzen (Tina L.); K. Räikkönen (Katri); M. Mangino (Massimo); S. Bandinelli (Stefania); P.A. Peyser (Patricia A.); S. Wild (Sarah); D.-A. Tregouet (David-Alexandre); A.F. Wright (Alan); J. Marten (Jonathan); T. Zemunik (Tatijana); A.C. Morrison (Alanna); B. Sennblad (Bengt); G.H. Tofler (Geoffrey); M.P.M. de Maat (Moniek); E.J.C. de Geus (Eco); G.D. Lowe (Gordon D.); M. Zoledziewska (Magdalena); N. Sattar (Naveed); H. Binder (Harald); U. Völker (Uwe); M. Waldenberger (Melanie); K.-T. Khaw (Kay-Tee); B. McKnight (Barbara); J. Huang (Jian); N.S. Jenny (Nancy); E.G. Holliday (Elizabeth); L. Qi (Lihong); M.G. McEvoy (Mark G.); D.M. Becker (Diane); J.M. Starr (John); A.-P. Sarin; P.G. Hysi (Pirro); D.G. Hernandez (Dena); M.A. Jhun (Min A.); H. Campbell (Harry); A. Hamsten (Anders); F. Sarin (Fernando); W.L. McArdle (Wendy); P. Eline Slagboom; T. Zeller (Tanja); W. Koenig (Wolfgang); B. Psaty (Bruce M.); T. Haritunians (Talin); J. Liu (Jingmin); A. Palotie (Aarno); A.G. Uitterlinden (André); D.J. Stott (David J.); A. Hofman (Albert); O.H. Franco (Oscar); O. Polasek (Ozren); I. Rudan (Igor); P.-E. Morange (P.); J.F. Wilson (James F.); S.L. Kardia (Sharon L.R.); L. Ferrucci (Luigi); T.D. Spector (Timothy); J.G. Eriksson (Johan G.); T. Hansen (Torben); I.J. Deary (Ian); L.C. Becker (Lewis); R.J. Scott (Rodney); P. Mitchell (Paul); W. März (Winfried); N.J. Wareham (Nick J.); A. Peters (Annette); A. Greinacher (Andreas); P.S. Wild (Philipp S.); J.W. Jukema (Jan Wouter); D.I. Boomsma (Dorret I.); C. Hayward (Caroline); F. Cucca (Francesco); R.P. Tracy (Russell); H. Watkins (Hugh); A.P. Reiner (Alex P.); A.R. Folsom (Aaron); P.M. Ridker (Paul); C.J. O'Donnell (Christopher J.); N.L. Smith (Nicholas L.); D.P. Strachan (David P.); A. Dehghan (Abbas)

    2016-01-01

    Genome-wide association studies have previously identified 23 genetic loci associated with circulating fibrinogen concentration. These studies used HapMap imputation and did not examine the X-chromosome. 1000 Genomes imputation provides better coverage of uncommon variants, and includes …

  13. Cascading Generative Adversarial Networks for Targeted

    KAUST Repository

    Hamdi, Abdullah

    2018-01-01

    The abundance of labelled data has played a crucial role in the recent developments in computer vision, but it faces problems of scalability and transferability to the wild. One alternative approach is to utilize data without labels, i.e. unsupervised learning, to learn valuable information and put it to use in tackling vision problems. Generative Adversarial Networks (GANs) have gained momentum for their ability to model image distributions in an unsupervised manner. They learn to emulate the training set, which enables sampling from that domain and using the learned knowledge for useful applications. Several methods have been proposed to enhance GANs, including regularizing the loss with some form of feature matching. We seek to push GANs beyond the training data and explore unseen territory in the image manifold. We first propose a new regularizer for GANs based on K-Nearest Neighbor (K-NN) selective feature matching to a target set Y in high-level feature space during the adversarial training of a GAN on the base set X, and we call this novel model K-GAN. We show that minimizing the added term follows from cross-entropy minimization between the distributions of the GAN and the set Y. Then, we introduce a cascaded framework for GANs that addresses the task of imagining a new distribution combining the base set X and the target set Y, by cascading sampling GANs with translation GANs; we dub such a cascade of GANs the Imaginative Adversarial Network (IAN). Several cascades are trained on a collected dataset, Zoo-Faces, and generated innovative samples are shown, including from the K-GAN cascade. We conduct an objective and subjective evaluation of different IAN setups on the addressed task of generating innovative samples, and we show the effect of regularizing the GAN on the different scores. We conclude with some useful applications of these IANs, such as multi-domain manifold traversing.
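
    A minimal NumPy sketch of a K-NN selective feature-matching penalty of the kind described; the feature dimensions and data are invented, and the actual K-GAN regularizer may differ in detail.

    ```python
    import numpy as np

    def knn_feature_matching(gen_feats, target_feats, k=3):
        """Mean squared distance from each generated feature vector to its
        k nearest neighbours in the target set Y's feature space. A penalty
        of this form (details assumed) pulls GAN samples toward Y."""
        # Pairwise squared Euclidean distances, shape (n_gen, n_target).
        d2 = ((gen_feats[:, None, :] - target_feats[None, :, :]) ** 2).sum(-1)
        knn = np.sort(d2, axis=1)[:, :k]   # k smallest per generated sample
        return knn.mean()

    rng = np.random.default_rng(1)
    print(knn_feature_matching(rng.normal(size=(8, 16)),
                               rng.normal(size=(64, 16))))
    ```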

  14. Cascading Generative Adversarial Networks for Targeted

    KAUST Repository

    Hamdi, Abdullah

    2018-04-09

    The abundance of labelled data has played a crucial role in the recent developments in computer vision, but it faces problems of scalability and transferability to the wild. One alternative approach is to utilize data without labels, i.e. unsupervised learning, to learn valuable information and put it to use in tackling vision problems. Generative Adversarial Networks (GANs) have gained momentum for their ability to model image distributions in an unsupervised manner. They learn to emulate the training set, which enables sampling from that domain and using the learned knowledge for useful applications. Several methods have been proposed to enhance GANs, including regularizing the loss with some form of feature matching. We seek to push GANs beyond the training data and explore unseen territory in the image manifold. We first propose a new regularizer for GANs based on K-Nearest Neighbor (K-NN) selective feature matching to a target set Y in high-level feature space during the adversarial training of a GAN on the base set X, and we call this novel model K-GAN. We show that minimizing the added term follows from cross-entropy minimization between the distributions of the GAN and the set Y. Then, we introduce a cascaded framework for GANs that addresses the task of imagining a new distribution combining the base set X and the target set Y, by cascading sampling GANs with translation GANs; we dub such a cascade of GANs the Imaginative Adversarial Network (IAN). Several cascades are trained on a collected dataset, Zoo-Faces, and generated innovative samples are shown, including from the K-GAN cascade. We conduct an objective and subjective evaluation of different IAN setups on the addressed task of generating innovative samples, and we show the effect of regularizing the GAN on the different scores. We conclude with some useful applications of these IANs, such as multi-domain manifold traversing.

  15. MTGAN: Speaker Verification through Multitasking Triplet Generative Adversarial Networks

    OpenAIRE

    Ding, Wenhao; He, Liang

    2018-01-01

    In this paper, we propose an enhanced triplet method that improves the encoding process of embeddings by jointly utilizing a generative adversarial mechanism and multitasking optimization. We extend our triplet encoder with Generative Adversarial Networks (GANs) and a softmax loss function. The GAN is introduced to increase the generality and diversity of samples, while the softmax reinforces speaker-related features. For simplicity, we term our method Multitasking Triplet Generative Advers...

  16. Performance of next-generation sequencing on small tumor specimens and/or low tumor content samples using a commercially available platform.

    Directory of Open Access Journals (Sweden)

    Scott Morris

    Full Text Available Next generation sequencing (NGS) tests are usually performed on relatively small core biopsy or fine needle aspiration (FNA) samples. Data are limited on what amount of tumor by volume or what minimum number of FNA passes is needed to yield sufficient material for running NGS. We sought to identify the amount of tumor needed for running the PCDx NGS platform. 2,723 consecutive tumor tissues of all cancer types were queried and reviewed for inclusion. Information on tumor volume, success of performing NGS, and results of NGS were compiled. Assessment of sequence analysis, mutation calling and sensitivity, quality control, drug associations, and data aggregation and analysis were performed. 6.4% of samples were rejected from all testing due to insufficient tumor quantity. The number of genes with insufficient sensitivity to make definitive mutation calls increased as the percentage of tumor decreased, reaching statistical significance below 5% tumor content. The number of drug associations also decreased with a lower percentage of tumor, but this difference only became significant between 1-3%. The number of drug associations did decrease with smaller tissue size, as expected. Neither specimen size nor percentage of tumor affected the ability to pass mRNA quality control. A tumor area of 10 mm2 provides a good margin of error for specimens to yield adequate drug association results. Specimen suitability remains a major obstacle to clinical NGS testing. We determined that PCR-based library creation methods allow the use of smaller specimens, and those with a lower percentage of tumor cells, to be run on the PCDx NGS platform.

  17. Sampling Realistic Protein Conformations Using Local Structural Bias

    DEFF Research Database (Denmark)

    Hamelryck, Thomas Wim; Kent, John T.; Krogh, A.

    2006-01-01

    The prediction of protein structure from sequence remains a major unsolved problem in biology. The most successful protein structure prediction methods make use of a divide-and-conquer strategy to attack the problem: a conformational sampling method generates plausible candidate structures, which are subsequently accepted or rejected using an energy function. Conceptually, this often corresponds to separating local structural bias from the long-range interactions that stabilize the compact, native state. However, sampling protein conformations that are compatible with the local structural bias encoded in a given protein sequence is a long-standing open problem, especially in continuous space. We describe an elegant and mathematically rigorous method to do this, and show that it readily generates native-like protein conformations simply by enforcing compactness. Our results have far-reaching implications…

  18. Sampling from the normal and exponential distributions

    International Nuclear Information System (INIS)

    Chaplin, K.R.; Wills, C.A.

    1982-01-01

    Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions, and for each of these developing a method of sampling usually based on an acceptance rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms
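
    As an illustration of the acceptance-rejection idea described in the abstract, the sketch below samples a standard normal using an Exp(1) envelope for |Z|; this particular envelope is a textbook choice, not necessarily the report's exact subregion scheme.

    ```python
    import math, random

    def standard_normal():
        """Acceptance-rejection sampling of a standard normal: propose
        y ~ Exp(1) as a candidate for |Z|, accept with probability
        exp(-(y-1)^2/2), then attach a random sign."""
        while True:
            y = random.expovariate(1.0)              # candidate for |Z|
            if random.random() <= math.exp(-0.5 * (y - 1.0) ** 2):
                return y if random.random() < 0.5 else -y

    sample = [standard_normal() for _ in range(10000)]
    print(sum(sample) / len(sample))  # close to 0 for a large sample
    ```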

  19. Theory of sampling: four critical success factors before analysis.

    Science.gov (United States)

    Wagner, Claas; Esbensen, Kim H

    2015-01-01

    Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.

  20. Welcome to America, welcome to college: Comparing the effects of immigrant generation and college generation on physical science and engineering career

    Science.gov (United States)

    Lung, Florin; Potvin, Geoff; Sonnert, Gerhard; Sadler, Philip M.

    2013-01-01

    Students enter college with social, cultural, and economic resources (well described by Bourdieu's concepts of habitus and capital) that significantly impact their goals, actions, and successes. Two important determinants of the amount and type of resources available to students are their immigrant generation and college generation status. Drawing on a national sample of 6860 freshmen enrolled in college English, we compare and contrast the effects of immigrant generation with those of college generation status on physical science and engineering career intentions, to explore some of the challenges faced by those who are the first in the family to become an American and/or go to college.

  1. Opportunities and Challenges of Linking Scientific Core Samples to the Geoscience Data Ecosystem

    Science.gov (United States)

    Noren, A. J.

    2016-12-01

    Core samples generated in scientific drilling and coring are critical for the advancement of the Earth Sciences. The scientific themes enabled by analysis of these samples are diverse, and include plate tectonics, ocean circulation, Earth-life system interactions (paleoclimate, paleobiology, paleoanthropology), Critical Zone processes, geothermal systems, deep biosphere, and many others, and substantial resources are invested in their collection and analysis. Linking core samples to researchers, datasets, publications, and funding agencies through registration of globally unique identifiers such as International Geo Sample Numbers (IGSNs) offers great potential for advancing several frontiers. These include maximizing sample discoverability, access, reuse, and return on investment; a means for credit to researchers; and documentation of project outputs to funding agencies. Thousands of kilometers of core samples and billions of derivative subsamples have been generated through thousands of investigators' projects, yet the vast majority of these samples are curated at only a small number of facilities. These numbers, combined with the substantial similarity in sample types, make core samples a compelling target for IGSN implementation. However, differences between core sample communities and other geoscience disciplines continue to create barriers to implementation. Core samples involve parent-child relationships spanning 8 or more generations, an exponential increase in sample numbers between levels in the hierarchy, concepts related to depth/position in the sample, requirements for associating data derived from core scanning and lithologic description with data derived from subsample analysis, and publications based on tens of thousands of co-registered scan data points and thousands of analyses of subsamples. These characteristics require specialized resources for accurate and consistent assignment of IGSNs, and a community of practice to establish norms

  2. Study of tritium permeation through Peach Bottom Steam Generator tubes

    International Nuclear Information System (INIS)

    Yang, L.; Baugh, W.A.; Baldwin, N.L.

    1977-06-01

    The report describes the equipment developed, samples tested, procedures used, and results obtained in the tritium permeation tests conducted on steam generator tubing samples which were removed from the Peach Bottom Unit No. 1 reactor

  3. Sample Selection for Training Cascade Detectors.

    Directory of Open Access Journals (Sweden)

    Noelia Vállez

    Full Text Available Automatic detection systems usually require large and representative training datasets in order to obtain good detection and false positive rates. Training datasets are typically such that the positive set has few samples and/or the negative set must represent anything except the object of interest. In this respect, the negative set typically contains orders of magnitude more images than the positive set. However, imbalanced training databases lead to biased classifiers. In this paper, we focus our attention on a negative sample selection method to properly balance the training data for cascade detectors. The method is based on the selection of the most informative false positive samples generated in one stage to feed the next stage, as sketched below. The results show that the proposed cascade detector with sample selection obtains on average a better partial AUC and a smaller standard deviation than the other cascade detectors compared.
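
    A minimal sketch of the hard-negative selection step described above; the score array stands in for the outputs of a partially trained cascade stage, which is assumed rather than implemented here.

    ```python
    import numpy as np

    def select_hard_negatives(scores, n_keep):
        """Pick the most informative false positives: the negatives the
        current stage scores highest (i.e., most confidently mistakes for
        the object) are kept to train the next stage."""
        order = np.argsort(scores)[::-1]   # highest score = hardest negative
        return order[:n_keep]

    rng = np.random.default_rng(2)
    neg_scores = rng.uniform(size=1000)    # stand-in for detector outputs
    hard_idx = select_hard_negatives(neg_scores, n_keep=100)
    print(neg_scores[hard_idx].min())      # all kept negatives score high
    ```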

  4. Use of Sequenom sample ID Plus® SNP genotyping in identification of FFPE tumor samples.

    Directory of Open Access Journals (Sweden)

    Jessica K Miller

    Full Text Available Short tandem repeat (STR analysis, such as the AmpFlSTR® Identifiler® Plus kit, is a standard, PCR-based human genotyping method used in the field of forensics. Misidentification of cell line and tissue DNA can be costly if not detected early; therefore it is necessary to have quality control measures such as STR profiling in place. A major issue in large-scale research studies involving archival formalin-fixed paraffin embedded (FFPE tissues is that varying levels of DNA degradation can result in failure to correctly identify samples using STR genotyping. PCR amplification of STRs of several hundred base pairs is not always possible when DNA is degraded. The Sample ID Plus® panel from Sequenom allows for human DNA identification and authentication using SNP genotyping. In comparison to lengthy STR amplicons, this multiplexing PCR assay requires amplification of only 76-139 base pairs, and utilizes 47 SNPs to discriminate between individual samples. In this study, we evaluated both STR and SNP genotyping methods of sample identification, with a focus on paired FFPE tumor/normal DNA samples intended for next-generation sequencing (NGS. The ability to successfully validate the identity of FFPE samples can enable cost savings by reducing rework.

  5. Ultrasonic assisted extraction - an alternative for sample preparation (M4)

    International Nuclear Information System (INIS)

    Santos Junior, P.; Barbosa Junior, F.; Krug, F.J.; Trevizan, L.C.; Nobrega, J.A.

    2002-01-01

    Full text: In recent years, ultrasound-assisted metal extraction has frequently been proposed as a simple and inexpensive alternative for the sample preparation of biological and inorganic samples. The extraction effect is considered to be caused by acoustic cavitation, that is, bubble formation and subsequent disruptive action. The collapse of bubbles created by sonication of solutions results in the generation of extremely high local temperature and pressure gradients, which may be regarded as localized 'hot spots'. On a timescale of about 10^-10 s, effective local pressures of about 10^5 atm and temperatures of about 5000 K are generated under sonochemical conditions. Usually, this method uses a diluted acid medium, decreasing blank values and reducing both reagent and time consumption compared to traditional wet digestion systems using conductive or microwave-assisted heating. Furthermore, sonication can also allow the preparation of samples directly within the sample container, thereby preventing sample losses and minimizing sample contamination. Although some controversial results concerning metal extraction behavior have been reported, they can be explained by analyte-matrix interactions and by the ability of the ultrasonic processor to generate ultrasound (i.e. the use of an ultrasonic bath or an ultrasonic probe at different power, frequency, and amplitude). This contribution presents a review of ultrasound-assisted metal extraction and recent performance data obtained in our laboratory for the determination of elements in biological materials, soils and sediments by ICP-OES and ETAAS. The effect of extraction parameters, such as the type and concentration of the leaching solution, the sonication time and the performance of the ultrasonic processor (bath or probe), will be presented. (author)

  6. Women's Relationship to Feminism: Effects of Generation and Feminist Self-Labeling

    Science.gov (United States)

    Duncan, Lauren E.

    2010-01-01

    The relative importance to feminism of generation and feminist self-labeling was explored in a sample of 667 women riding buses to a 1992 March on Washington for Reproductive Rights. Specifically, generational (Generation X vs. Baby Boomers) and feminist self-labeling (strong feminists vs. weak feminists vs. nonfeminists) similarities and…

  7. Transport Powder and Liquid Samples by Surface Acoustic Waves

    Science.gov (United States)

    Bao, Xiaoqi; Bar-Cohen, Yoseph; Sherrit, Stewart; Badescu, Mircea; Louyeh, Sahar

    2009-01-01

    Sample transport is an important requirement for in-situ analysis of samples in NASA planetary exploration missions. Tests have shown that powders or liquid drops on a surface can be transported by surface acoustic waves (SAW) that are generated on the surface using interdigital transducers. The phenomena were investigated experimentally; to generate SAWs, interdigital electrodes were deposited on wafers of 128 deg rotated Y-cut LiNbO3. The transporting capability of the SAW device was tested using particles of various sizes and drops of liquids of various viscosities. Because of different interaction mechanisms with the SAWs, the powders and the liquid drops were observed to move in opposite directions. In the preliminary tests, a speed of 180 mm/s was achieved for powder transportation. The detailed experimental setup and results are presented in this paper. The transporting mechanism can potentially be applied to miniaturized sample analysis systems or "lab-on-chip" devices.

  8. Optimal Excitation Controller Design for Wind Turbine Generator

    Directory of Open Access Journals (Sweden)

    A. K. Boglou

    2011-01-01

    Full Text Available An optimal excitation controller design based on multirate-output controllers (MROCs) having a multirate sampling mechanism, with a different sampling period for each measured output of the system, is presented. The proposed H∞-control technique is applied to the discrete linear open-loop system model which represents a wind turbine generator supplying an infinite bus through a transmission line.

  9. Container for gaseous samples for irradiation at accelerators

    International Nuclear Information System (INIS)

    Kupsch, H.; Riemenschneider, J.; Leonhardt, J.

    1985-01-01

    The invention concerns a container for gaseous samples for irradiation at accelerators, especially to generate short-lived radioisotopes. The container is also suitable for the storage and transport of the target gas and can be reused multiple times.

  10. Surface plasmon resonance sensor based on gold nanoparticles and the cold vapour generation technique for the detection of mercury in aqueous samples

    Science.gov (United States)

    Castillo, Jimmy; Chirinos, José; Gutiérrez, Héctor; La Cruz, Marie

    2017-09-01

    In this work, a surface plasmon resonance sensor for the determination of Hg based on gold nanoparticles was developed. The sensor follows the change in the signal from solutions in contact with atomic mercury previously generated by reaction with sodium borohydride. Mie theory predicts that an Hg film as thin as 5 nm induces a significant reduction of the surface plasmon resonance signal of 40 nm gold nanoparticles. This property was used for quantification purposes in the sensor. The device provides a limit of detection of 172 ng/L, which can be compared with the 91 ng/L obtained with atomic fluorescence, a common technique used for Hg quantification in drinking water. This result is relevant considering that it was not necessary to functionalize the nanoparticles or to use nanoparticles deposited on a substrate. Also, because Hg is released from the matrix, the surface plasmon resonance signal was not affected by concomitant elements in the sample.

  11. Generalized atmospheric sampling of self-avoiding walks

    International Nuclear Information System (INIS)

    Van Rensburg, E J Janse; Rechnitzer, A

    2009-01-01

    In this paper, we introduce a new Monte Carlo method for sampling lattice self-avoiding walks. The method, which we call 'GAS' (generalized atmospheric sampling), samples walks along weighted sequences by implementing elementary moves generated by the positive, negative and neutral atmospheric statistics of the walks. A realized sequence is weighted such that the average weight of states of length n is proportional to the number of self-avoiding walks from the origin, c_n. In addition, the method also self-tunes to sample from uniform distributions over walks of lengths in an interval [0, n_max]. We show how to implement GAS using both generalized and endpoint atmospheres of walks and analyse our data to obtain estimates of the growth constant and entropic exponent of self-avoiding walks in the square and cubic lattices.

  12. Sampling Methods for Wallenius' and Fisher's Noncentral Hypergeometric Distributions

    DEFF Research Database (Denmark)

    Fog, Agner

    2008-01-01

    Several methods for generating variates with univariate and multivariate Wallenius' and Fisher's noncentral hypergeometric distributions are developed. Methods for the univariate distributions include: simulation of urn experiments, inversion by binary search, inversion by chop-down search from the mode, ratio-of-uniforms rejection method, and rejection by sampling in the tau domain. Methods for the multivariate distributions include: simulation of urn experiments, conditional method, Gibbs sampling, and Metropolis-Hastings sampling. These methods are useful for Monte Carlo simulation of models of biased sampling and models of evolution and for calculating moments and quantiles of the distributions.
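
    For illustration, a minimal simulation of the urn experiment that defines Wallenius' distribution (sequential weighted draws without replacement); the counts and weights are arbitrary example values.

    ```python
    import random

    def wallenius_sample(counts, weights, n_draws):
        """Simulate Wallenius' noncentral hypergeometric distribution as an
        urn experiment: balls are drawn one at a time without replacement,
        each colour's odds proportional to (remaining count) * weight."""
        counts = list(counts)
        drawn = [0] * len(counts)
        for _ in range(n_draws):
            probs = [c * w for c, w in zip(counts, weights)]
            r = random.uniform(0, sum(probs))
            for i, p in enumerate(probs):
                r -= p
                if r <= 0:
                    counts[i] -= 1
                    drawn[i] += 1
                    break
        return drawn

    print(wallenius_sample(counts=[50, 50], weights=[1.0, 2.0], n_draws=30))
    ```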

  13. Developing a cosmic ray muon sampling capability for muon tomography and monitoring applications

    Science.gov (United States)

    Chatzidakis, S.; Chrysikopoulou, S.; Tsoukalas, L. H.

    2015-12-01

    In this study, a cosmic ray muon sampling capability using a phenomenological model that captures the main characteristics of the experimentally measured spectrum coupled with a set of statistical algorithms is developed. The "muon generator" produces muons with zenith angles in the range 0-90° and energies in the range 1-100 GeV and is suitable for Monte Carlo simulations with emphasis on muon tomographic and monitoring applications. The muon energy distribution is described by the Smith and Duller (1959) [35] phenomenological model. Statistical algorithms are then employed for generating random samples. The inverse transform provides a means to generate samples from the muon angular distribution, whereas the Acceptance-Rejection and Metropolis-Hastings algorithms are employed to provide the energy component. The predictions for muon energies 1-60 GeV and zenith angles 0-90° are validated with a series of actual spectrum measurements and with estimates from the software library CRY. The results confirm the validity of the phenomenological model and the applicability of the statistical algorithms to generate polyenergetic-polydirectional muons. The response of the algorithms and the impact of critical parameters on computation time and computed results were investigated. Final output from the proposed "muon generator" is a look-up table that contains the sampled muon angles and energies and can be easily integrated into Monte Carlo particle simulation codes such as Geant4 and MCNP.
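
    A toy sketch of the two sampling steps described: inverse-transform sampling for the energy and acceptance-rejection for the zenith angle. The pure power-law spectrum and the cos² angular form are simplifications standing in for the Smith and Duller model, and all parameter values are illustrative.

    ```python
    import math, random

    GAMMA, E_MIN, E_MAX = 2.7, 1.0, 100.0    # toy spectral index, GeV range

    def sample_energy():
        """Inverse-transform sample from a truncated power law E^-gamma,
        a stand-in for the Smith-Duller spectrum used in the paper."""
        u = random.random()
        a, b = E_MIN ** (1 - GAMMA), E_MAX ** (1 - GAMMA)
        return (a + u * (b - a)) ** (1 / (1 - GAMMA))

    def sample_zenith():
        """Acceptance-rejection from an approximate cos^2(theta) zenith
        distribution for sea-level muons (0-90 degrees)."""
        while True:
            theta = random.uniform(0, math.pi / 2)
            if random.random() <= math.cos(theta) ** 2:
                return theta

    # Look-up-table-style output of (zenith angle, energy) pairs.
    print([(sample_zenith(), sample_energy()) for _ in range(5)])
    ```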

  14. Vapor generation methods for explosives detection research

    Energy Technology Data Exchange (ETDEWEB)

    Grate, Jay W.; Ewing, Robert G.; Atkinson, David A.

    2012-12-01

    The generation of calibrated vapor samples of explosives compounds remains a challenge due to the low vapor pressures of the explosives, adsorption of explosives on container and tubing walls, and the requirement to manage (typically) multiple temperature zones as the vapor is generated, diluted, and delivered. Methods that have been described to generate vapors can be classified as continuous or pulsed flow vapor generators. Vapor sources for continuous flow generators are typically explosives compounds supported on a solid support, or compounds contained in a permeation or diffusion device. Sources are held at elevated isothermal temperatures. Similar sources can be used for pulsed vapor generators; however, pulsed systems may also use injection of solutions onto heated surfaces with generation of both solvent and explosives vapors, transient peaks from a gas chromatograph, or vapors generated by a programmed thermal desorption. This article reviews vapor generator approaches with emphasis on the method of generating the vapors and on practical aspects of vapor dilution and handling. In addition, a gas chromatographic system with two ovens that is configurable with up to four heating ropes is proposed that could serve as a single integrated platform for explosives vapor generation and device testing. Issues related to standards, calibration, and safety are also discussed.

  15. Human Genetic Variation and Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Sun Ju Chung

    2010-05-01

    Full Text Available Parkinson’s disease (PD) is a chronic neurodegenerative disorder with multifactorial etiology. In the past decade, the genetic causes of monogenic forms of familial PD have been defined. However, the etiology and pathogenesis of the majority of sporadic PD cases that occur in outbred populations have yet to be clarified. The recent development of resources such as the International HapMap Project and technological advances in high-throughput genotyping have provided a new basis for genetic association studies of common complex diseases, including PD. A new generation of genome-wide association studies will soon offer a potentially powerful approach for mapping causal genes and will likely change treatment and alter our perception of the genetic determinants of PD. However, the execution and analysis of such studies will require great care.

  16. Selection signatures in worldwide sheep populations.

    Science.gov (United States)

    Fariello, Maria-Ines; Servin, Bertrand; Tosser-Klopp, Gwenola; Rupp, Rachel; Moreno, Carole; San Cristobal, Magali; Boitard, Simon

    2014-01-01

    The diversity of populations in domestic species offers great opportunities to study genome response to selection. The recently published Sheep HapMap dataset is a great example of characterization of the world wide genetic diversity in sheep. In this study, we re-analyzed the Sheep HapMap dataset to identify selection signatures in worldwide sheep populations. Compared to previous analyses, we made use of statistical methods that (i) take account of the hierarchical structure of sheep populations, (ii) make use of linkage disequilibrium information and (iii) focus specifically on either recent or older selection signatures. We show that this allows pinpointing several new selection signatures in the sheep genome and distinguishing those related to modern breeding objectives and to earlier post-domestication constraints. The newly identified regions, together with the ones previously identified, reveal the extensive genome response to selection on morphology, color and adaptation to new environments.

  17. Analog automatic test pattern generation for quasi-static structural test.

    NARCIS (Netherlands)

    Zjajo, A.; Pineda de Gyvez, J.

    2009-01-01

    A new structural, fault-oriented analog test generation methodology to test for the presence of manufacturing-related defects is proposed. The output of the test generator consists of optimized test stimuli, fault coverage and sampling instants that are sufficient to detect the failure…

  18. Developmental validation of a Nextera XT mitogenome Illumina MiSeq sequencing method for high-quality samples.

    Science.gov (United States)

    Peck, Michelle A; Sturk-Andreaggi, Kimberly; Thomas, Jacqueline T; Oliver, Robert S; Barritt-Ross, Suzanne; Marshall, Charla

    2018-05-01

    Generating mitochondrial genome (mitogenome) data from reference samples in a rapid and efficient manner is critical to harnessing the greater power of discrimination of the entire mitochondrial DNA (mtDNA) marker. The method of long-range target enrichment, Nextera XT library preparation, and Illumina sequencing on the MiSeq is a well-established technique for generating mitogenome data from high-quality samples. To this end, a validation was conducted for this mitogenome method processing up to 24 samples simultaneously along with analysis in the CLC Genomics Workbench and utilizing the AQME (AFDIL-QIAGEN mtDNA Expert) tool to generate forensic profiles. This validation followed the Federal Bureau of Investigation's Quality Assurance Standards (QAS) for forensic DNA testing laboratories and the Scientific Working Group on DNA Analysis Methods (SWGDAM) validation guidelines. The evaluation of control DNA, non-probative samples, blank controls, mixtures, and nonhuman samples demonstrated the validity of this method. Specifically, the sensitivity was established at ≥25 pg of nuclear DNA input for accurate mitogenome profile generation. Unreproducible low-level variants were observed in samples with low amplicon yields. Further, variant quality was shown to be a useful metric for identifying sequencing error and crosstalk. Success of this method was demonstrated with a variety of reference sample substrates and extract types. These studies further demonstrate the advantages of using NGS techniques by highlighting the quantitative nature of heteroplasmy detection. The results presented herein from more than 175 samples processed in ten sequencing runs, show this mitogenome sequencing method and analysis strategy to be valid for the generation of reference data. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Verification of Representative Sampling in RI waste

    International Nuclear Information System (INIS)

    Ahn, Hong Joo; Song, Byung Cheul; Sohn, Se Cheul; Song, Kyu Seok; Jee, Kwang Yong; Choi, Kwang Seop

    2009-01-01

    For evaluating the radionuclide inventories of RI wastes, representative sampling is one of the most important parts of the radiochemical assay process. Sampling to characterize RI waste conditions has typically been based on judgment or convenience sampling of individuals or groups. However, it is difficult to obtain a representative sample from among the numerous drums. In addition, RI waste drums may be classified as heterogeneous wastes because they contain cotton, glass, vinyl, gloves, etc. In order to obtain representative samples, the sample to be analyzed must be collected from every selected drum. Considering the expense and time of analysis, however, the number of samples has to be minimized. In this study, RI waste drums were classified by various conditions: half-life, surface dose, acceptance date, waste form, generator, etc. A sample for radiochemical assay was obtained by mixing samples from each drum. The sample has to be prepared for radiochemical assay, and although the sample should be reasonably uniform, it is rare that a completely homogeneous material is received. Every sample is shredded to pieces of about 1∼2 cm2, and a representative aliquot is taken for the required analysis. For verification of representative sampling, every classified group was tested to evaluate 'selection of a representative drum in a group' and 'representative sampling in a drum'.
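
    A minimal sketch of stratified drum selection consistent with the grouping idea above; the group names and drum identifiers are invented for the example.

    ```python
    import random

    def select_drums(groups, n_per_group, seed=42):
        """Stratified selection: classify drums into groups (e.g., by
        half-life, surface dose, or waste form) and draw a fixed number of
        drums at random from each group, so the composite assay sample
        represents every stratum."""
        random.seed(seed)
        return {g: random.sample(drums, min(n_per_group, len(drums)))
                for g, drums in groups.items()}

    groups = {"cotton": [f"D{i:03d}" for i in range(40)],
              "glass":  [f"D{i:03d}" for i in range(40, 60)],
              "vinyl":  [f"D{i:03d}" for i in range(60, 90)]}
    print(select_drums(groups, n_per_group=3))
    ```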

  20. Hanford Sampling Quality Management Plan (HSQMP)

    International Nuclear Information System (INIS)

    Hyatt, J.E.

    1995-01-01

    This document provides a management tool for evaluating and designing the appropriate elements of a field sampling program. It provides a discussion of the elements of such a program and is to be used as a guidance document during the preparation of project- and/or function-specific documentation. This document does not specify how a sampling program shall be organized. The HSQMP is to be used as a companion document to the Hanford Analytical Services Quality Assurance Plan (HASQAP) DOE/RL-94-55. The generation of this document was enhanced by conducting baseline evaluations of current sampling organizations. Valuable input was received from members of field and Quality Assurance organizations. The HSQMP is expected to be a living document. Revisions will be made as regulations and/or Hanford Site conditions warrant changes in the best management practices. Appendices include: a summary of the sampling and analysis work flow process, a user's guide to the Data Quality Objective process, and a self-assessment checklist.

  1. Development of novel and sensitive methods for the determination of sulfide in aqueous samples by hydrogen sulfide generation-inductively coupled plasma-atomic emission spectroscopy.

    Science.gov (United States)

    Colon, M; Todolí, J L; Hidalgo, M; Iglesias, M

    2008-02-25

    Two new, simple and accurate methods for the determination of sulfide (S2-) at low levels (µg L-1) in aqueous samples were developed. The generation of hydrogen sulfide (H2S) took place in a coil where sulfide reacted with hydrochloric acid. The resulting H2S was then introduced as a vapor into an inductively coupled plasma-atomic emission spectrometer (ICP-AES) and sulfur emission intensity was measured at 180.669 nm. In comparison to when aqueous sulfide was introduced, the introduction of sulfur as H2S enhanced the sulfur signal emission. By setting a gas separator at the end of the reaction coil, reduced sulfur species in the form of H2S were removed from the water matrix; thus, interferences could be avoided. Alternatively, the gas separator was replaced by a nebulizer/spray chamber combination to introduce the sample matrix and reagents into the plasma. This methodology allowed the determination of both sulfide and sulfate in aqueous samples. For both methods the linear response was found to range from 5 µg L-1 to 25 mg L-1 of sulfide. Detection limits of 5 µg L-1 and 6 µg L-1 were obtained with and without the gas separator, respectively. These new methods were evaluated by comparison to the standard potentiometric method and were successfully applied to the analysis of reduced sulfur species in environmental waters.
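
    For orientation, a hedged sketch of how a detection limit of this order can be derived from a calibration line (LOD = 3 × blank standard deviation / slope); all the numbers below are made up, and only the formula is standard practice.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical calibration data: sulfide concentration (mg/L) vs.
    # sulfur emission intensity at 180.669 nm.
    conc = np.array([5, 50, 500, 5000, 25000]) / 1000.0   # mg/L
    signal = np.array([0.8, 7.9, 81.0, 795.0, 3990.0])    # intensity (a.u.)

    slope, intercept, r, p, se = stats.linregress(conc, signal)
    sd_blank = 0.27          # assumed SD of replicate blank measurements
    print("LOD (mg/L):", 3 * sd_blank / slope)   # on the order of 0.005
    ```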

  2. Development of novel and sensitive methods for the determination of sulfide in aqueous samples by hydrogen sulfide generation-inductively coupled plasma-atomic emission spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Colon, M. [Department of Chemistry, University of Girona, Campus Montilivi, 17071 Girona (Spain); Departamento de Quimica Analitica, Nutricion y Bromatologia, University of Alicante, 03080 Alicante (Spain); Todoli, J.L. [Departamento de Quimica Analitica, Nutricion y Bromatologia, University of Alicante, 03080 Alicante (Spain); Hidalgo, M. [Department of Chemistry, University of Girona, Campus Montilivi, 17071 Girona (Spain); Iglesias, M. [Department of Chemistry, University of Girona, Campus Montilivi, 17071 Girona (Spain)], E-mail: monica.iglesias@udg.es

    2008-02-25

    Two new, simple and accurate methods for the determination of sulfide (S{sup 2-}) at low levels ({mu}g L{sup -1}) in aqueous samples were developed. The generation of hydrogen sulfide (H{sub 2}S) took place in a coil where sulfide reacted with hydrochloric acid. The resulting H{sub 2}S was then introduced as a vapor into an inductively coupled plasma-atomic emission spectrometer (ICP-AES) and sulfur emission intensity was measured at 180.669 nm. In comparison to when aqueous sulfide was introduced, the introduction of sulfur as H{sub 2}S enhanced the sulfur signal emission. By setting a gas separator at the end of the reaction coil, reduced sulfur species in the form of H{sub 2}S were removed from the water matrix, thus, interferences could be avoided. Alternatively, the gas separator was replaced by a nebulizer/spray chamber combination to introduce the sample matrix and reagents into the plasma. This methodology allowed the determination of both sulfide and sulfate in aqueous samples. For both methods the linear response was found to range from 5 {mu}g L{sup -1} to 25 mg L{sup -1} of sulfide. Detection limits of 5 {mu}g L{sup -1} and 6 {mu}g L{sup -1} were obtained with and without the gas separator, respectively. These new methods were evaluated by comparison to the standard potentiometric method and were successfully applied to the analysis of reduced sulfur species in environmental waters.

  3. Toward a Principled Sampling Theory for Quasi-Orders.

    Science.gov (United States)

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets.
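
    A naive sketch of random quasi-order generation by transitive-closure repair; as the abstract notes, a correction step like this biases the sample, which is exactly the problem the proposed bias-correction algorithms address.

    ```python
    import numpy as np

    def random_quasi_order(n, p=0.3, rng=None):
        """Draw a random reflexive relation on n items and repair it into a
        quasi-order by taking its transitive closure (Warshall's algorithm).
        Note: this naive closure step biases samples toward larger
        relations, unlike the paper's corrected algorithms."""
        if rng is None:
            rng = np.random.default_rng()
        R = rng.random((n, n)) < p
        np.fill_diagonal(R, True)             # reflexivity
        for k in range(n):                    # Warshall transitive closure
            R |= np.outer(R[:, k], R[k, :])
        return R

    print(random_quasi_order(5, rng=np.random.default_rng(3)).astype(int))
    ```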

  4. Toward a Principled Sampling Theory for Quasi-Orders

    Science.gov (United States)

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

    Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets. PMID:27965601

  5. An open-population hierarchical distance sampling model

    Science.gov (United States)

    Sollmann, Rahel; Gardner, Beth; Chandler, Richard B.; Royle, J. Andrew; Sillett, T. Scott

    2015-01-01

    Modeling population dynamics while accounting for imperfect detection is essential to monitoring programs. Distance sampling allows estimating population size while accounting for imperfect detection, but existing methods do not allow for direct estimation of demographic parameters. We develop a model that uses temporal correlation in abundance arising from underlying population dynamics to estimate demographic parameters from repeated distance sampling surveys. Using a simulation study motivated by designing a monitoring program for island scrub-jays (Aphelocoma insularis), we investigated the power of this model to detect population trends. We generated temporally autocorrelated abundance and distance sampling data over six surveys, using population rates of change of 0.95 and 0.90. We fit the data generating Markovian model and a mis-specified model with a log-linear time effect on abundance, and derived post hoc trend estimates from a model estimating abundance for each survey separately. We performed these analyses for varying number of survey points. Power to detect population changes was consistently greater under the Markov model than under the alternatives, particularly for reduced numbers of survey points. The model can readily be extended to more complex demographic processes than considered in our simulations. This novel framework can be widely adopted for wildlife population monitoring.
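
    A minimal simulation in the spirit of the study design above (declining abundance across six surveys, half-normal detection by distance); all parameter values are assumptions for illustration, not the paper's settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_points, n_surveys, lam0, rate = 50, 6, 20.0, 0.95  # assumed design
    sigma, w = 25.0, 100.0   # half-normal detection scale, strip width (m)

    # Markovian abundance: expected count declines by 'rate' each survey,
    # with dynamics folded into a fresh Poisson draw (a simplification).
    counts = np.empty((n_points, n_surveys), dtype=int)
    counts[:, 0] = rng.poisson(lam0, n_points)
    for t in range(1, n_surveys):
        counts[:, t] = rng.poisson(counts[:, t - 1] * rate)

    # Distance sampling: each individual gets a uniform distance and is
    # detected with half-normal probability exp(-d^2 / (2 sigma^2)).
    detections = np.zeros_like(counts)
    for i in range(n_points):
        for t in range(n_surveys):
            d = rng.uniform(0, w, counts[i, t])
            detections[i, t] = (rng.random(counts[i, t])
                                < np.exp(-d**2 / (2 * sigma**2))).sum()
    print(detections.sum(axis=0))  # detected counts decline across surveys
    ```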

  6. An open-population hierarchical distance sampling model.

    Science.gov (United States)

    Sollmann, Rahel; Gardner, Beth; Chandler, Richard B; Royle, J Andrew; Sillett, T Scott

    2015-02-01

    Modeling population dynamics while accounting for imperfect detection is essential to monitoring programs. Distance sampling allows estimating population size while accounting for imperfect detection, but existing methods do not allow for estimation of demographic parameters. We develop a model that uses temporal correlation in abundance arising from underlying population dynamics to estimate demographic parameters from repeated distance sampling surveys. Using a simulation study motivated by designing a monitoring program for Island Scrub-Jays (Aphelocoma insularis), we investigated the power of this model to detect population trends. We generated temporally autocorrelated abundance and distance sampling data over six surveys, using population rates of change of 0.95 and 0.90. We fit the data generating Markovian model and a mis-specified model with a log-linear time effect on abundance, and derived post hoc trend estimates from a model estimating abundance for each survey separately. We performed these analyses for varying numbers of survey points. Power to detect population changes was consistently greater under the Markov model than under the alternatives, particularly for reduced numbers of survey points. The model can readily be extended to more complex demographic processes than considered in our simulations. This novel framework can be widely adopted for wildlife population monitoring.

  7. Genome-wide associations of gene expression variation in humans.

    Directory of Open Access Journals (Sweden)

    Barbara E Stranger

    2005-12-01

    Full Text Available The exploration of quantitative variation in human populations has become one of the major priorities for medical genetics. The successful identification of variants that contribute to complex traits is highly dependent on reliable assays and genetic maps. We have performed a genome-wide quantitative trait analysis of 630 genes in 60 unrelated Utah residents with ancestry from Northern and Western Europe using the publicly available phase I data of the International HapMap project. The genes are located in regions of the human genome with elevated functional annotation and disease interest, including the ENCODE regions spanning 1% of the genome, Chromosome 21 and Chromosome 20q12-13.2. We apply three different methods of multiple test correction, including Bonferroni, false discovery rate, and permutations. For the 374 expressed genes, we find many regions with statistically significant association of single nucleotide polymorphisms (SNPs) with expression variation in lymphoblastoid cell lines after correcting for multiple tests. Based on our analyses, the signal proximal (cis-) to the genes of interest is more abundant and more stable than distal and trans across statistical methodologies. Our results suggest that regulatory polymorphism is widespread in the human genome and show that the 5-kb (phase I) HapMap has sufficient density to enable linkage disequilibrium mapping in humans. Such studies will significantly enhance our ability to annotate the non-coding part of the genome and interpret functional variation. In addition, we demonstrate that the HapMap cell lines themselves may serve as a useful resource for quantitative measurements at the cellular level.

  8. Genome-Wide Associations of Gene Expression Variation in Humans.

    Directory of Open Access Journals (Sweden)

    2005-12-01

    Full Text Available The exploration of quantitative variation in human populations has become one of the major priorities for medical genetics. The successful identification of variants that contribute to complex traits is highly dependent on reliable assays and genetic maps. We have performed a genome-wide quantitative trait analysis of 630 genes in 60 unrelated Utah residents with ancestry from Northern and Western Europe using the publicly available phase I data of the International HapMap project. The genes are located in regions of the human genome with elevated functional annotation and disease interest, including the ENCODE regions spanning 1% of the genome, Chromosome 21 and Chromosome 20q12-13.2. We apply three different methods of multiple test correction, including Bonferroni, false discovery rate, and permutations. For the 374 expressed genes, we find many regions with statistically significant association of single nucleotide polymorphisms (SNPs) with expression variation in lymphoblastoid cell lines after correcting for multiple tests. Based on our analyses, the signal proximal (cis-) to the genes of interest is more abundant and more stable than distal and trans across statistical methodologies. Our results suggest that regulatory polymorphism is widespread in the human genome and show that the 5-kb (phase I) HapMap has sufficient density to enable linkage disequilibrium mapping in humans. Such studies will significantly enhance our ability to annotate the non-coding part of the genome and interpret functional variation. In addition, we demonstrate that the HapMap cell lines themselves may serve as a useful resource for quantitative measurements at the cellular level.
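
    The core computation behind such an analysis can be sketched as a per-gene association test with an empirical permutation p-value. The code below is a minimal illustration on simulated data (panel size, effect size, and function names are our assumptions), not the study's pipeline, which additionally applied Bonferroni and FDR corrections across many genes.

```python
import numpy as np

rng = np.random.default_rng(0)

def assoc_pvalue(genotype, expression, n_perm=10_000):
    """Permutation p-value for association between a SNP (0/1/2 dosages)
    and an expression phenotype, using squared Pearson r as the statistic."""
    def r2(g, e):
        c = np.corrcoef(g, e)[0, 1]
        return c * c
    observed = r2(genotype, expression)
    perms = np.array([r2(rng.permutation(genotype), expression)
                      for _ in range(n_perm)])
    return (1 + (perms >= observed).sum()) / (1 + n_perm)

# 60 unrelated individuals, as in a HapMap-sized panel; one cis-SNP with a
# genuine additive effect on expression, plus one null SNP for comparison.
n = 60
maf = 0.3
cis_snp = rng.binomial(2, maf, size=n)
null_snp = rng.binomial(2, maf, size=n)
expr = 0.5 * cis_snp + rng.normal(size=n)

print("cis SNP p  =", assoc_pvalue(cis_snp, expr))
print("null SNP p =", assoc_pvalue(null_snp, expr))
```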

  9. Towards next-generation biodiversity assessment using DNA metabarcoding

    DEFF Research Database (Denmark)

    Taberlet, Pierre; Coissac, Eric; Pompanon, Francois

    2012-01-01

    Virtually all empirical ecological studies require species identification during data collection. DNA metabarcoding refers to the automated identification of multiple species from a single bulk sample containing entire organisms or from a single environmental sample containing degraded DNA (soil, water, faeces, etc.). It can be implemented for both modern and ancient environmental samples. The availability of next-generation sequencing platforms and the ecologists' need for high-throughput taxon identification have facilitated the emergence of DNA metabarcoding. The potential power of DNA...

  10. Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF) scores generated from the MMPI-2 and MMPI-2-RF test booklets: internal structure comparability in a sample of criminal defendants.

    Science.gov (United States)

    Tarescavage, Anthony M; Alosco, Michael L; Ben-Porath, Yossef S; Wood, Arcangela; Luna-Jones, Lynn

    2015-04-01

    We investigated the internal structure comparability of Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF) scores derived from the MMPI-2 and MMPI-2-RF booklets in a sample of 320 criminal defendants (229 males and 54 females). After exclusion of invalid protocols, the final sample consisted of 96 defendants who were administered the MMPI-2-RF booklet and 83 who completed the MMPI-2. No statistically significant differences in MMPI-2-RF invalidity rates were observed between the two forms. Individuals in the final sample who completed the MMPI-2-RF did not statistically differ on demographics or referral question from those who were administered the MMPI-2 booklet. Independent t tests showed no statistically significant differences between MMPI-2-RF scores generated with the MMPI-2 and MMPI-2-RF booklets on the test's substantive scales. Statistically significant small differences were observed on the revised Variable Response Inconsistency (VRIN-r) and True Response Inconsistency (TRIN-r) scales. Cronbach's alpha and standard errors of measurement were approximately equal between the booklets for all MMPI-2-RF scales. Finally, MMPI-2-RF intercorrelations produced from the two forms yielded mostly small and a few medium differences, indicating that discriminant validity and test structure are maintained. Overall, our findings reflect the internal structure comparability of MMPI-2-RF scale scores generated from MMPI-2 and MMPI-2-RF booklets. Implications of these results and limitations of these findings are discussed. © The Author(s) 2014.

  11. Effects of sample size on robustness and prediction accuracy of a prognostic gene signature

    Directory of Open Access Journals (Sweden)

    Kim Seon-Young

    2009-05-01

    Full Text Available Abstract Background Little overlap between independently developed gene signatures and poor inter-study applicability of gene signatures are two of the major concerns raised in the development of microarray-based prognostic gene signatures. One recent study suggested that thousands of samples are needed to generate a robust prognostic gene signature. Results A data set of 1,372 samples was generated by combining eight breast cancer gene expression data sets produced using the same microarray platform, and, using this data set, the effects of varying sample sizes on several aspects of the performance of a prognostic gene signature were investigated. The overlap between independently developed gene signatures increased linearly with more samples, attaining an average overlap of 16.56% with 600 samples. The concordance between outcomes predicted by different gene signatures also increased with more samples, up to 94.61% with 300 samples. The accuracy of outcome prediction also increased with more samples. Finally, analysis using only Estrogen Receptor-positive (ER+) patients attained higher prediction accuracy than using all patients, suggesting that subtype-specific analysis can lead to the development of better prognostic gene signatures. Conclusion Increasing sample sizes generated a gene signature with better stability, better concordance in outcome prediction, and better prediction accuracy. However, the degree of performance improvement from increased sample size differed between the degree of overlap and the degree of concordance in outcome prediction, suggesting that the sample size required for a study should be determined according to the specific aims of the study.
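
    The overlap experiment lends itself to a compact simulation. The sketch below (gene count, signature size, and effect sizes are our assumptions, not the study's data) develops a top-70 gene signature in two independent cohorts of the same size and reports how the overlap between the two signatures grows with the number of samples, mirroring the trend reported above.

```python
import numpy as np

rng = np.random.default_rng(7)

N_GENES, K = 2000, 70          # genes on the array; signature size
effect = np.zeros(N_GENES)
effect[:100] = 0.4             # 100 genes carry a true outcome association

def top_k_signature(n):
    """Develop a K-gene signature from a fresh cohort of n samples:
    rank genes by |correlation| with a continuous outcome."""
    X = rng.normal(size=(n, N_GENES))
    y = X @ effect + rng.normal(size=n)
    r = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(N_GENES)])
    return set(np.argsort(r)[-K:])

for n in (50, 100, 200, 400):
    a, b = top_k_signature(n), top_k_signature(n)
    print(f"n={n:4d}  overlap of two independent signatures: "
          f"{100 * len(a & b) / K:.1f}%")
```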

  12. In situ sampling for pressure tube deuterium concentration

    International Nuclear Information System (INIS)

    Harrington, A.J.; Kittmer, C.A.

    1988-01-01

    The present method of assessing the useful life of pressure tubes in CANDU (CANada Deuterium Uranium) reactors requires the periodic removal and examination of a tube. Special tooling was developed at Atomic Energy of Canada Limited (AECL) to obtain a sample of material from a pressure tube without removing the tube from the reactor. The sampling tool concept has been successfully used by Ontario Hydro during scheduled outages at the Pickering Nuclear Generating Station (PNGS). (author)

  13. Simple and rapid determination methods for low-level radioactive wastes generated from nuclear research facilities. Guidelines for determination of radioactive waste samples

    International Nuclear Information System (INIS)

    Kameo, Yutaka; Shimada, Asako; Ishimori, Ken-ichiro; Haraga, Tomoko; Katayama, Atsushi; Nakashima, Mikio; Hoshi, Akiko

    2009-10-01

    Analytical methods were developed for simple and rapid determination of U, Th, and several nuclides, which are selected as important nuclides for safety assessment of disposal of wastes generated from research facilities at Nuclear Science Research Institute and Oarai Research and Development Center. The present analytical methods were assumed to apply to solidified products made from miscellaneous wastes by plasma melting in the Advanced Volume Reduction Facilities. In order to establish a system to analyze the important nuclides in the solidified products at low cost and routinely, we have advanced the development of a high-efficiency non-destructive measurement technique for γ-ray emitting nuclides, simple and rapid methods for pretreatment of solidified product samples and subsequent radiochemical separations, and rapid determination methods for long-lived nuclides. In the present paper, we summarized the methods developed as guidelines for determination of radionuclides in the low-level solidified products. (author)

  14. Boat sampling technique for assessment of ageing of components

    International Nuclear Information System (INIS)

    Kumar, Kundan; Shyam, T.V.; Kayal, J.N.; Rupani, B.B.

    2006-01-01

    Boat sampling technique (BST) is a surface sampling technique developed for obtaining, in situ, metal samples from the surface of an operating component without affecting its operating service life. The BST is non-destructive in nature, and the sample is obtained without plastic deformation or thermal degradation of the parent material. The shape and size of the sample depend upon the shape of the cutter and the surface geometry of the parent material. Miniature test specimens are generated from the sample, and the specimens are subjected to various tests, viz. metallurgical evaluation, metallographic evaluation, micro-hardness evaluation, sensitisation testing, small punch testing, etc., to confirm the integrity and to assess the safe operating life of the component. This paper highlights the design objectives of the boat sampling technique, the description of the sampling module, the sampling cutter and its performance evaluation, the cutting process, boat samples, the operational sequence of the sampling module, qualification of the sampling module, qualification of the sampling technique, qualification of the scooped region of the parent material, the sample retrieval system, and the inspection, testing and examination to be carried out on the boat samples and the scooped region. (author)

  15. New adaptive sampling method in particle image velocimetry

    International Nuclear Information System (INIS)

    Yu, Kaikai; Xu, Jinglei; Tang, Lan; Mo, Jianwei

    2015-01-01

    This study proposes a new adaptive method that enables the number of interrogation windows and their positions in a particle image velocimetry (PIV) image interrogation algorithm to adapt automatically to the seeding density. The proposed method relaxes the constraints of uniform sampling rate and uniform window size commonly adopted in traditional PIV algorithms. In addition, the positions of the sampling points are redistributed on the basis of the spring forces generated between the sampling points. The advantages include control of the number of interrogation windows according to the local seeding density and a smoother distribution of sampling points. The reliability of the adaptive sampling method is illustrated by processing synthetic and experimental images. The synthetic example attests to the advantages of the sampling method. Compared with the uniform interrogation technique in the experimental application, the spatial resolution is locally enhanced when using the proposed sampling method. (technical design note)
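
    A minimal sketch of the spring-force idea follows (the density field, spring law, and all constants are our assumptions, not the published algorithm): each sampling point repels its close neighbours with a rest length that shrinks where a synthetic seeding density is high, so repeated relaxation concentrates sampling points in densely seeded regions.

```python
import numpy as np

rng = np.random.default_rng(3)

def density(p):
    """Synthetic seeding density on [0,1]^2: high on the left, low on the right."""
    return 1.0 + 4.0 * np.exp(-8.0 * p[..., 0])

def relax_points(n_pts=300, n_iter=80, step=0.02):
    """Redistribute sampling points by spring forces so the local point
    spacing tracks the seeding density (closer spacing where dense)."""
    pts = rng.random((n_pts, 2))
    for _ in range(n_iter):
        force = np.zeros_like(pts)
        # desired spacing ~ 1 / sqrt(local density), up to a constant
        rest = 1.0 / np.sqrt(n_pts * density(pts))
        for i in range(n_pts):
            d = pts - pts[i]                     # vectors to all other points
            dist = np.linalg.norm(d, axis=1)
            near = (dist > 0) & (dist < 2 * rest[i])
            if near.any():
                # linear repulsive spring, active only inside the rest length;
                # note f * d has magnitude (rest - dist), so steps stay small
                f = np.maximum((rest[i] - dist[near]) / dist[near], 0.0)
                force[i] -= step * (f[:, None] * d[near]).sum(axis=0)
        pts = np.clip(pts + force, 0.0, 1.0)
    return pts

pts = relax_points()
left = (pts[:, 0] < 0.5).mean()
print(f"fraction of sampling points in the dense left half: {left:.2f}")
```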

  16. An empirical assessment of generational differences in basic human values.

    Science.gov (United States)

    Lyons, Sean T; Duxbury, Linda; Higgins, Christopher

    2007-10-01

    This study assessed generational differences in human values as measured by the Schwartz Value Survey. It was proposed that the two most recent generations, Millennials and Generation Xers, would value Self-enhancement and Openness to Change more than the two older generations, Baby Boomers and Matures, while the two older generations would value Self-transcendence and Conservation more. The hypotheses were tested with a combined sample of Canadian knowledge workers and undergraduate business students (N = 1,194). Two hypotheses were largely supported, although an unexpectedly large difference was observed between Millennials and Generation Xers with respect to Openness to Change and Self-enhancement. The findings suggest that generation is a useful variable in examining differences in social values.

  17. Geographical structure and differential natural selection among North European populations

    DEFF Research Database (Denmark)

    McEvoy, Brian P; Montgomery, Grant W; McRae, Allan F

    2009-01-01

    polymorphism, in 2099 individuals from populations of Northern European origin (Ireland, United Kingdom, Netherlands, Denmark, Sweden, Finland, Australia, and HapMap European-American). The major trends (PC1 and PC2) demonstrate an ability to detect geographic substructure, even over a small area like...

  18. Survey of generational aspects of nurse faculty organizational commitment.

    Science.gov (United States)

    Carver, Lara; Candela, Lori; Gutierrez, Antonio P

    2011-01-01

    To describe organizational commitment and generational differences in nursing faculty. The study provides new knowledge on generational differences in organizational commitment among nursing faculty with regard to work values, perceived organizational support, perceived person-organization fit, developmental experiences, and global job satisfaction. A cross-sectional, descriptive design was used with random stratified sampling procedures. Surveys measuring organizational commitment and related constructs were sent electronically to 4886 faculty, yielding a 30% response rate. Significant differences were noted between generations of faculty regarding organizational commitment and related measures. Recommendations include specific strategies for fostering commitment from each generation. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Fuzzy generation scheduling for a generation company (GenCo) with large scale wind farms

    International Nuclear Information System (INIS)

    Siahkali, H.; Vakilian, M.

    2010-01-01

    Wind power is a promising alternative in power generation because of its tremendous environmental and social benefits. Generation scheduling (GS) is more important in a power system integrating wind farms. Unlike conventional power generation sources, wind power generators supply intermittent power because of uncertainty in the resource. This paper presents a fuzzy approach to the generation scheduling problem of a GenCo considering uncertainties in parameters or constraints such as load, reserve, and available wind power generation. The modeling of constraints is an important issue in power system scheduling. A fuzzy optimization approach can be used to obtain the generation schedule under an uncertain environment. In this paper, a fuzzy optimization-based method is developed to solve the power system GS problem with fuzzy objective and constraints. The crisp formulation of the GS problem is first defined and then rearranged by introducing membership functions for some of the constraints and for the objective function. This fuzzy optimization problem is then converted to a crisp optimization problem and solved with GAMS software by mixed integer nonlinear programming. Employing the fuzzy optimization GS, it is expected that in practice a higher profit would be achieved in the operation and cost management of a real power system with large scale wind farms at different levels of constraint satisfaction. The proposed approach is applied to a sample system (including six conventional units and two wind farms) and the results are compared with the results of the crisp solution. The approach is also applied to a larger test case to demonstrate the robustness of the fuzzy optimization method.
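
    The conversion of fuzzy constraints into a crisp problem can be illustrated with a toy two-unit dispatch (all numbers and the max-min, Zimmermann-style reformulation are our assumptions; the paper solves a much richer mixed integer nonlinear program in GAMS). Each fuzzy requirement gets a linear membership function, and a single variable lambda, the overall degree of constraint satisfaction, is maximized.

```python
from scipy.optimize import linprog

# Two units with linear costs (10 and 15 $/MWh) and 60 MW capacity each.
# Fuzzy demand: membership rises linearly from 0 at 70 MW to 1 at 80 MW.
# Fuzzy cost goal: membership falls linearly from 1 at $850 to 0 at $950.
# Max-min reformulation: maximize lambda subject to
#   (x1 + x2 - 70) / 10          >= lambda   (demand satisfaction)
#   (950 - 10 x1 - 15 x2) / 100  >= lambda   (cost-goal satisfaction)
# Variables: x1, x2, lambda. linprog minimizes, so the objective is -lambda.
c = [0.0, 0.0, -1.0]
A_ub = [[-1.0, -1.0, 10.0],     # -x1 - x2 + 10*lam <= -70
        [10.0, 15.0, 100.0]]    # 10*x1 + 15*x2 + 100*lam <= 950
b_ub = [-70.0, 950.0]
bounds = [(0, 60), (0, 60), (0, 1)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x1, x2, lam = res.x
print(f"dispatch x1={x1:.1f} MW, x2={x2:.1f} MW, "
      f"overall satisfaction lambda={lam:.2f}")   # expect lambda = 0.80
```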

  20. Design of analytical instrumentation with D-T sealed neutron generators

    International Nuclear Information System (INIS)

    Qiao Yahua; Wu Jizong; Zheng Weiming; Liu Quanwei; Zhang Min

    2008-01-01

    An analytical instrumentation system based on a D-T sealed-tube neutron generator as the activation source is described. A 14 MeV D-T sealed neutron tube with a neutron yield of 10⁹ n·s⁻¹ is used as the source. The optimal structure of the moderator and shield was obtained by Monte Carlo computation, and the instrument's configuration is shown. The instrument is built around the sealed neutron generator and comprises the SMY-DT50.8-2.1 sealed neutron tube and its high-voltage power supply system. A combination of 6 cm Pb and 20 cm polythene was chosen as the moderator, and Pb, polythene and 10 cm boron-PE were chosen as the shield. The sample box is 9 cm from the source, and the measurement system consists of an HPGe detector and the sample-transfer system. After moderation and shielding, the thermal neutron fluence rate at the sample position is 0.93 × 10⁶ n·s⁻¹·cm⁻², which meets the design requirement, and the laboratory and surroundings meet the dose-level safety standards. (authors)

  1. P. 2234 – Intergenerational transmission of perceived parental rearing styles: a three generation families study

    OpenAIRE

    Lopes, Fábio; Espirito-Santo, Helena; Vicente, Henrique

    2013-01-01

    Introduction The transmission of perceived parental rearing styles through generations has been demonstrated in several studies, mostly studies with two-generation samples. Objectives/aims The main aim of this study is to investigate the intergenerational transmission of the perception of parental rearing styles in families composed of three generations. Methodology A convenience sample of 143 participants was collected, belonging to a female lineage subsystem, divided in three...

  2. Multiscale study on stochastic reconstructions of shale samples

    Science.gov (United States)

    Lili, J.; Lin, M.; Jiang, W. B.

    2016-12-01

    Shales are known to have multiscale pore systems, composed of macroscale fractures, micropores, and nanoscale pores within the gas- or oil-producing organic material. Shales are also fissile and laminated, and the heterogeneity in the horizontal direction is quite different from that in the vertical. Stochastic reconstructions are extremely useful in situations where three-dimensional information is costly and time-consuming to obtain. The purpose of our paper is therefore to stochastically reconstruct equiprobable 3D models containing information from several scales. In this paper, macroscale and microscale images of shale structure in the Lower Silurian Longmaxi are obtained by X-ray microtomography, and nanoscale images are obtained by scanning electron microscopy. Each image is representative for all given scales and phases. In particular, the macroscale is four times coarser than the microscale, which in turn is four times lower in resolution than the nanoscale image. Secondly, the cross correlation-based simulation method (CCSIM) and the three-step sampling method are combined to generate stochastic reconstructions for each scale. It is important to point out that the boundary points of pore and matrix are selected based on a multiple-point connectivity function in the sampling process, and thus the characteristics of the reconstructed image can be controlled indirectly. Thirdly, all images are brought to the same resolution through downscaling and upscaling by interpolation, and the multiscale categorical spatial data are then merged into a single 3D image with predefined resolution (that of the microscale image). 30 realizations using the given images and the proposed method are generated. The result reveals that the proposed method is capable of preserving the multiscale pore structure, both vertically and horizontally, which is necessary for accurate permeability prediction. The variogram curves and pore-size distributions for both the original 3D sample and the generated 3D realizations are compared.

  3. The results of experimental studies of VLF–ULF electromagnetic emission by rock samples due to mechanical action

    OpenAIRE

    A. A. Panfilov

    2013-01-01

    The paper presents the results of laboratory experiments on electromagnetic emission excitation (electric component of electromagnetic field) by rock samples due to different forms of mechanical stress applications. It was shown that samples generate electric impulses with different spectra when the impact action, gradual loading or dynamic friction is applied. It was ascertained that level and spectral compositions of signals, generated by rock samples, cha...

  4. Demystifying Theoretical Sampling in Grounded Theory Research

    Directory of Open Access Journals (Sweden)

    Jenna Breckenridge, BSc(Hons), PhD Candidate

    2009-06-01

    Full Text Available Theoretical sampling is a central tenet of classic grounded theory and is essential to the development and refinement of a theory that is ‘grounded’ in data. While many authors appear to share concurrent definitions of theoretical sampling, the ways in which the process is actually executed remain largely elusive and inconsistent. As such, employing and describing the theoretical sampling process can present a particular challenge to novice researchers embarking upon their first grounded theory study. This article has been written in response to the challenges faced by the first author whilst writing a grounded theory proposal. It is intended to clarify theoretical sampling for new grounded theory researchers, offering some insight into the practicalities of selecting and employing a theoretical sampling strategy. It demonstrates that the credibility of a theory cannot be dissociated from the process by which it has been generated and seeks to encourage and challenge researchers to approach theoretical sampling in a way that is apposite to the core principles of the classic grounded theory methodology.

  5. Reachable Distance Space: Efficient Sampling-Based Planning for Spatially Constrained Systems

    KAUST Repository

    Xinyu Tang,

    2010-01-25

    Motion planning for spatially constrained robots is difficult due to additional constraints placed on the robot, such as closure constraints for closed chains or requirements on end-effector placement for articulated linkages. It is usually computationally too expensive to apply sampling-based planners to these problems since it is difficult to generate valid configurations. We overcome this challenge by redefining the robot's degrees of freedom and constraints into a new set of parameters, called reachable distance space (RD-space), in which all configurations lie in the set of constraint-satisfying subspaces. This enables us to directly sample the constrained subspaces with complexity linear in the number of the robot's degrees of freedom. In addition to supporting efficient sampling of configurations, we show that the RD-space formulation naturally supports planning and, in particular, we design a local planner suitable for use by sampling-based planners. We demonstrate the effectiveness and efficiency of our approach for several systems including closed chain planning with multiple loops, restricted end-effector sampling, and on-line planning for drawing/sculpting. We can sample single-loop closed chain systems with 1,000 links in time comparable to open chain sampling, and we can generate samples for 1,000-link multi-loop systems of varying topologies in less than a second. © 2010 The Author(s).

  6. Sampling populations of humans across the world: ELSI issues.

    Science.gov (United States)

    Knoppers, Bartha Maria; Zawati, Ma'n H; Kirby, Emily S

    2012-01-01

    There are an increasing number of population studies collecting data and samples to illuminate gene-environment contributions to disease risk and health. The rising affordability of innovative technologies capable of generating large amounts of data helps achieve statistical power and has paved the way for new international research collaborations. Most data and sample collections can be grouped into longitudinal, disease-specific, or residual tissue biobanks, with accompanying ethical, legal, and social issues (ELSI). Issues pertaining to consent, confidentiality, and oversight cannot be examined using a one-size-fits-all approach: the particularities of each biobank must be taken into account. It remains to be seen whether current governance approaches will be adequate to handle the impact of next-generation sequencing technologies on communication with participants in population biobanking studies.

  7. Probabilistic generation of quantum contextual sets

    International Nuclear Information System (INIS)

    Megill, Norman D.; Fresl, Kresimir; Waegell, Mordecai; Aravind, P.K.; Pavicic, Mladen

    2011-01-01

    We give a method for exhaustive generation of a huge number of Kochen-Specker contextual sets, based on the 600-cell, for possible experiments and quantum gates. The method is complementary to our previous parity proof generation of these sets, and it gives all sets while the parity proof method gives only sets with an odd number of edges in their hypergraph representation. Thus we obtain 35 new kinds of critical KS sets with an even number of edges. We also give a statistical estimate of the number of sets that might be obtained in an eventual exhaustive enumeration. -- Highlights: → We generate millions of new Kochen-Specker noncontextual sets. → We find thousands of novel critical Kochen-Specker (KS) sets. → We give algorithms for generating KS sets from a new 4-dim class. → We represent KS sets by means of hypergraphs and their figures. → We give a new exact estimation method for random sampling of sets.

  8. Tissue Sampling Guides for Porcine Biomedical Models.

    Science.gov (United States)

    Albl, Barbara; Haesner, Serena; Braun-Reichhart, Christina; Streckel, Elisabeth; Renner, Simone; Seeliger, Frank; Wolf, Eckhard; Wanke, Rüdiger; Blutke, Andreas

    2016-04-01

    This article provides guidelines for organ and tissue sampling adapted to porcine animal models in translational medical research. Detailed protocols for the determination of sampling locations and numbers as well as recommendations on the orientation, size, and trimming direction of samples from ∼50 different porcine organs and tissues are provided in the Supplementary Material. The proposed sampling protocols include the generation of samples suitable for subsequent qualitative and quantitative analyses, including cryohistology, paraffin, and plastic histology; immunohistochemistry; in situ hybridization; electron microscopy; and quantitative stereology as well as molecular analyses of DNA, RNA, proteins, metabolites, and electrolytes. With regard to the planned extent of sampling efforts, time, and personnel expenses, and dependent upon the scheduled analyses, different protocols are provided. These protocols are adjusted for (I) routine screenings, as used in general toxicity studies or in analyses of gene expression patterns or histopathological organ alterations, (II) advanced analyses of single organs/tissues, and (III) large-scale sampling procedures to be applied in biobank projects. Providing a robust reference for studies of porcine models, the described protocols will ensure the efficiency of sampling, the systematic recovery of high-quality samples representing the entire organ or tissue as well as the intra-/interstudy comparability and reproducibility of results. © The Author(s) 2016.

  9. Quantitative miRNA expression analysis: comparing microarrays with next-generation sequencing

    DEFF Research Database (Denmark)

    Willenbrock, Hanni; Salomon, Jesper; Søkilde, Rolf

    2009-01-01

    Recently, next-generation sequencing has been introduced as a promising, new platform for assessing the copy number of transcripts, while the existing microarray technology is considered less reliable for absolute, quantitative expression measurements. Nonetheless, so far, results from the two technologies have only been compared based on biological data, leading to the conclusion that, although they are somewhat correlated, expression values differ significantly. Here, we use synthetic RNA samples, resembling human microRNA samples, to find that microarray expression measures actually correlate better with sample RNA content than expression measures obtained from sequencing data. In addition, microarrays appear highly sensitive and perform equivalently to next-generation sequencing in terms of reproducibility and relative ratio quantification.

  10. Aggregated wind power generation probabilistic forecasting based on particle filter

    International Nuclear Information System (INIS)

    Li, Pai; Guan, Xiaohong; Wu, Jiang

    2015-01-01

    Highlights: • A new method for probabilistic forecasting of aggregated wind power generation. • A dynamic system is established based on a numerical weather prediction model. • The new method handles the non-Gaussian and time-varying wind power uncertainties. • Particle filter is applied to forecast predictive densities of wind generation. - Abstract: The probability distribution of aggregated wind power generation in a region is an important issue for power system daily operation. This paper presents a novel method to forecast the predictive densities of the aggregated wind power generation from several geographically distributed wind farms, considering the non-Gaussian and non-stationary characteristics of wind power uncertainties. Based on a mesoscale numerical weather prediction model, a dynamic system is established to formulate the relationship between the atmospheric and near-surface wind fields of geographically distributed wind farms. A recursively backtracking framework based on the particle filter is applied to estimate the atmospheric state from the near-surface wind power generation measurements, and to forecast possible samples of the aggregated wind power generation. The predictive densities of the aggregated wind power generation are then estimated from these predicted samples by a kernel density estimator. In case studies, the new method is tested on a system of nine wind farms in the Midwestern United States. The test results show that the new method provides interval forecasts for the aggregated wind power generation that are competitive with conventional statistics-based models, which validates the effectiveness of the new method.
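
    A stripped-down version of the filtering step can be sketched with a bootstrap particle filter on a toy problem (the AR(1) wind model, power curve, and noise levels are our assumptions, standing in for the mesoscale NWP-based dynamic system): particles are propagated through the dynamics, weighted by the likelihood of the measured power, resampled, and finally pushed one step ahead, where a kernel density estimator turns them into a predictive density.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)

def power_curve(v):
    """Toy wind-farm power curve: cubic ramp between cut-in and rated speed."""
    return np.clip((np.clip(v, 3.0, 12.0) - 3.0) ** 3 / 9.0**3, 0.0, 1.0)

# Latent wind speed follows AR(1); we observe aggregated power with noise.
T, n_part = 50, 2000
v_true = np.empty(T); v_true[0] = 8.0
for t in range(1, T):
    v_true[t] = 8.0 + 0.9 * (v_true[t-1] - 8.0) + rng.normal(0, 0.5)
obs = power_curve(v_true) + rng.normal(0, 0.02, size=T)

particles = rng.normal(8.0, 2.0, size=n_part)
for t in range(T):
    # propagate through the dynamic model
    particles = 8.0 + 0.9 * (particles - 8.0) + rng.normal(0, 0.5, n_part)
    # weight by the likelihood of the measured power
    w = np.exp(-0.5 * ((obs[t] - power_curve(particles)) / 0.02) ** 2)
    w += 1e-300                       # guard against total weight underflow
    w /= w.sum()
    # multinomial resampling
    particles = particles[rng.choice(n_part, size=n_part, p=w)]

# One-step-ahead predictive density of aggregated power via KDE.
pred = power_curve(8.0 + 0.9 * (particles - 8.0) + rng.normal(0, 0.5, n_part))
kde = gaussian_kde(pred)
grid = np.linspace(0, 1, 5)
print("predictive density on a coarse power grid:", np.round(kde(grid), 2))
```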

  11. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    Science.gov (United States)

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
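
    The underlying placement rule is classical systematic random sampling: one uniformly random offset, then an equidistant grid. A minimal sketch for a rectangular region of interest follows (function name, units, and grid spacing are our choices; RandomSpot itself works on virtual-slide regions and viewer annotations).

```python
import numpy as np

def srs_points(x0, y0, width, height, dx, dy, rng=None):
    """Systematic random sampling: lay an equidistant dx-by-dy grid over a
    rectangular region of interest, with one uniformly random offset shared
    by all points, the classic unbiased stereological design."""
    rng = np.random.default_rng(rng)
    ox, oy = rng.uniform(0, dx), rng.uniform(0, dy)
    xs = np.arange(x0 + ox, x0 + width, dx)
    ys = np.arange(y0 + oy, y0 + height, dy)
    return [(x, y) for y in ys for x in xs]

# A 200-unit grid over a 2000 x 1000 region of interest (units arbitrary).
pts = srs_points(0, 0, 2000, 1000, 200, 200, rng=1)
print(len(pts), "sample points; first three:", pts[:3])
```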

  12. Water and steam sampling systems; Provtagningssystem för vatten och ånga

    Energy Technology Data Exchange (ETDEWEB)

    Hellman, Mats

    2009-10-15

    The supervision of cycle chemistry can be divided into two parts, the sampling system and the chemical analysis. In modern steam generating plants most of the chemical analyses are carried out on-line. The detection limits of these analyzers are pushed down to the ppt range (parts per trillion); however, the analyses are no more correct than the accuracy of the sampling system. Much attention has been paid to the analyzers and to the statistics used to interpret the results, but the sampling procedures have received far less attention. This report aims to give guidance on the considerations to be made regarding sampling systems. Sampling is necessary since most analyses of the parameters of interest cannot be carried out in situ on-line in the steam cycle. Today's on-line instruments for pH, conductivity, silica, etc. are designed to receive a water sample at a temperature of 10-30 deg C. This means that the sampling system has to extract a representative sample from the process, then transport and cool it down to room temperature without changing the characteristics of the fluid. In the literature, research work, standards and other reports can be found; although they give similar recommendations in most respects, there are some discrepancies that may be confusing. This report covers all parts of the sampling system: sample points and nozzles; sample lines; valves, regulating and on-off; sample coolers; temperature, pressure and flow rate control; cooling water; and water recovery. On-line analyzers connected to the sampling system are not covered. This report aims to clarify which guidelines are most appropriate amongst the existing ones. The report should also give guidance on the design of the sampling system in order to achieve representative samples. In addition, the report gives an overview of the fluid mechanics involved in sampling. The target group of this report is owners and operators of steam generators, vendors of power plant equipment, consultants working in

  13. Will generation experience a culture clash in hotels: An exploratory study

    NARCIS (Netherlands)

    Groen, B.; Lub, X.D.; Nije Bijvank, M.

    2009-01-01

    This research explores how Baby Boomers (1945-1964), Generation X (1965-1980), and Generation Y (1981-1995) perceive organisational culture in an international hotel chain, using Quinn's competing values model. The sample consisted of 181 employees (response rate 72%). The four orientations that

  14. Generating heavy particles with energy and momentum conservation

    Science.gov (United States)

    Mereš, Michal; Melo, Ivan; Tomášik, Boris; Balek, Vladimír; Černý, Vladimír

    2011-12-01

    We propose a novel algorithm, called REGGAE, for the generation of momenta of a given sample of particle masses, evenly distributed in Lorentz-invariant phase space and obeying energy and momentum conservation. In comparison to other existing algorithms, REGGAE is designed for use in multiparticle production in hadronic and nuclear collisions, where many hadrons are produced and a large part of the available energy is stored in the form of their masses. The algorithm uses a loop simulating multiple collisions which lead to production of configurations with reasonably large weights. Program summary. Program title: REGGAE (REscattering-after-Genbod GenerAtor of Events) Catalogue identifier: AEJR_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJR_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 1523 No. of bytes in distributed program, including test data, etc.: 9608 Distribution format: tar.gz Programming language: C++ Computer: PC Pentium 4, though no particular tuning for this machine was performed. Operating system: Originally designed on Linux PC with g++, but it has been compiled and run successfully on OS X with g++ and MS Windows with Microsoft Visual C++ 2008 Express Edition, as well. RAM: This depends on the number of particles which are generated. For 10 particles like in the attached example it requires about 120 kB. Classification: 11.2 Nature of problem: The task is to generate momenta of a sample of particles with given masses which obey energy and momentum conservation. Generated samples should be evenly distributed in the available Lorentz-invariant phase space. Solution method: In general, the algorithm works in two steps. First, all momenta are generated with the GENBOD algorithm. There, particle production is modeled as a sequence of two
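
    For intuition, the sketch below enforces the two conservation laws directly (this is our simplification, not the GENBOD/REGGAE procedure, and it does not sample evenly in Lorentz-invariant phase space): random three-momenta are shifted so they sum to zero, then rescaled by a root-found factor so the total relativistic energy matches the requested value.

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(9)

def sample_momenta(masses, e_total):
    """Random three-momenta for the given masses in their centre-of-mass frame,
    with total momentum zero and total relativistic energy e_total.

    NOT the GENBOD/REGGAE algorithm and not uniform in Lorentz-invariant
    phase space; it only makes the two conservation constraints explicit
    (momentum shift, then energy rescaling)."""
    masses = np.asarray(masses, dtype=float)
    assert e_total > masses.sum(), "total energy below mass threshold"
    p = rng.normal(size=(len(masses), 3))
    p -= p.mean(axis=0)                        # total three-momentum -> 0
    def total_energy(scale):
        return np.sqrt(masses**2 + scale**2 * (p**2).sum(axis=1)).sum()
    scale = brentq(lambda s: total_energy(s) - e_total, 0.0, 1e6)
    return scale * p

masses = [0.938, 0.938, 0.139, 0.139, 0.139]   # two protons, three pions (GeV)
mom = sample_momenta(masses, e_total=5.0)
print("sum p =", np.round(mom.sum(axis=0), 12))
print("sum E =", np.sqrt(np.array(masses)**2 + (mom**2).sum(axis=1)).sum())
```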

  15. Magnet Free Generators - 3rd Generation Wind Turbine Generators

    DEFF Research Database (Denmark)

    Jensen, Bogi Bech; Mijatovic, Nenad; Henriksen, Matthew Lee

    2013-01-01

    This paper presents an introduction to superconducting wind turbine generators, which are often referred to as 3rd generation wind turbine generators. Advantages and challenges of superconducting generators are presented with particular focus on possible weight and efficiency improvements. A comp...

  16. Does intergenerational transmission of trauma skip a generation? No meta-analytic evidence for tertiary traumatization with third generation of Holocaust survivors.

    Science.gov (United States)

    Sagi-Schwartz, Abraham; van IJzendoorn, Marinus H; Bakermans-Kranenburg, Marian J

    2008-06-01

    In a series of meta-analyses with the second generation of Holocaust survivors, no evidence for secondary traumatization was found (Van IJzendoorn, Bakermans-Kranenburg, & Sagi-Schwartz, 2003). With regard to third generation traumatization, various reports suggest the presence of intergenerational transmission of trauma. Some scholars argue that intergenerational transmission of trauma might skip a generation. Therefore, we focus in this study on the transmission of trauma to the third generation offspring (the grandchildren) of the first generation's traumatic Holocaust experiences (referred to as "tertiary traumatization"), and we present a narrative review of the pertinent studies. Meta-analytic results of 13 non-clinical samples involving 1012 participants showed no evidence for tertiary traumatization in Holocaust survivor families. Our previous meta-analytic study on secondary traumatization and the present one on third generation's psychological consequences of the Holocaust indicate a remarkable resilience of profoundly traumatized survivors in their (grand-)parental roles.

  17. Indian Institute of Science, Bangalore

    Indian Academy of Sciences (India)

    user

    2015-07-04

    Fragmentary slide text; the recoverable content: about 18% of the Indian population speaks a Dravidian language; historical influences include military conquests by Arabs and Turks, and British colonization; the dataset comprises 132 individuals from 25 populations across 15 states, covering all the language families, typed at 560,123 autosomal SNPs (Affymetrix 6.0); HGDP and HapMap reference data were used, with PCA performed in EIGENSOFT.

  18. A user's guide to LHS: Sandia's Latin Hypercube Sampling Software

    Energy Technology Data Exchange (ETDEWEB)

    Wyss, G.D.; Jorgensen, K.H. [Sandia National Labs., Albuquerque, NM (United States). Risk Assessment and Systems Modeling Dept.]

    1998-02-01

    This document is a reference guide for LHS, Sandia's Latin Hypercube Sampling Software. This software has been developed to generate either Latin hypercube or random multivariate samples. The Latin hypercube technique employs a constrained sampling scheme, whereas random sampling corresponds to a simple Monte Carlo technique. The present program replaces the previous Latin hypercube sampling program developed at Sandia National Laboratories (SAND83-2365). This manual covers the theory behind stratified sampling as well as use of the LHS code both with the Windows graphical user interface and in the stand-alone mode.
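
    The stratification idea is compact enough to sketch (the function name is ours; the LHS package itself is a full Fortran/Windows program): each of the d dimensions is cut into n equal-probability strata, each stratum is hit exactly once, and the strata are paired across dimensions by independent random permutations.

```python
import numpy as np

def latin_hypercube(n, d, rng=None):
    """Draw n points in [0,1)^d by Latin hypercube sampling: each dimension
    is cut into n equal strata and every stratum is hit exactly once."""
    rng = np.random.default_rng(rng)
    strata = np.tile(np.arange(n), (d, 1))       # (d, n): strata 0..n-1 per dim
    strata = rng.permuted(strata, axis=1).T      # independent shuffle per dim
    return (strata + rng.random((n, d))) / n     # jitter within each stratum

sample = latin_hypercube(10, 3, rng=0)
# Each column visits all 10 strata exactly once:
print(np.sort((sample * 10).astype(int), axis=0)[:, 0])   # -> [0 1 2 ... 9]
```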

  19. Nonassociation of homocysteine gene polymorphisms with treatment outcome in South Indian Tamil Rheumatoid Arthritis patients.

    Science.gov (United States)

    Muralidharan, Niveditha; Gulati, Reena; Misra, Durga Prasanna; Negi, Vir S

    2018-02-01

    The aim of the study was to look for any association of the MTR 2756A>G and MTRR 66A>G gene polymorphisms with clinical phenotype, methotrexate (MTX) treatment response, and MTX-induced adverse events in South Indian Tamil patients with rheumatoid arthritis (RA). A total of 335 patients with RA were investigated. The MTR 2756A>G polymorphism was analyzed by PCR-RFLP, and the MTRR 66A>G SNP was analyzed by a TaqMan 5' nuclease assay. The allele frequencies were compared with those of the HapMap groups. The MTR 2756G allele was found to be associated with risk of developing RA. The allele frequencies of the MTR 2756A>G and MTRR 66A>G SNPs in controls differed significantly from those of the HapMap groups. Neither SNP influenced the MTX treatment outcome or adverse effects. Thus, neither SNP appears to be associated with MTX treatment outcome or adverse events in South Indian Tamil patients with RA.

  20. Generative Adversarial Networks-Based Semi-Supervised Learning for Hyperspectral Image Classification

    Directory of Open Access Journals (Sweden)

    Zhi He

    2017-10-01

    Full Text Available Classification of hyperspectral image (HSI) is an important research topic in the remote sensing community. Significant efforts (e.g., deep learning) have been concentrated on this task. However, it is still an open issue to classify the high-dimensional HSI with a limited number of training samples. In this paper, we propose a semi-supervised HSI classification method inspired by the generative adversarial networks (GANs). Unlike the supervised methods, the proposed HSI classification method is semi-supervised, which can make full use of the limited labeled samples as well as the sufficient unlabeled samples. Core ideas of the proposed method are twofold. First, the three-dimensional bilateral filter (3DBF) is adopted to extract the spectral-spatial features by naturally treating the HSI as a volumetric dataset. The spatial information is integrated into the extracted features by 3DBF, which is propitious to the subsequent classification step. Second, GANs are trained on the spectral-spatial features for semi-supervised learning. A GAN contains two neural networks (i.e., generator and discriminator) trained in opposition to one another. The semi-supervised learning is achieved by adding samples from the generator to the features and increasing the dimension of the classifier output. Experimental results obtained on three benchmark HSI datasets have confirmed the effectiveness of the proposed method, especially with a limited number of labeled samples.

  1. Sampling methods and data generation

    Science.gov (United States)

    The study of forensic microbiology is an inherent blend of forensic science and microbiology, and both disciplines have recently been undergoing rapid advancements in technology that are allowing for exciting new research avenues. The integration of two different disciplines poses challenges becaus...

  2. A method for generating permutation distribution of ranks in a k ...

    African Journals Online (AJOL)

    ... in a combinatorial sense the distribution of the ranks is obtained via its generating function. The formulas are defined recursively to speed up computations using the computer algebra system Mathematica. Key words: Partitions, generating functions, combinatorics, permutation test, exact tests, computer algebra, k-sample, ...

  3. Effects of data sampling rate on image quality in fan-beam-CT system

    International Nuclear Information System (INIS)

    Iwata, Akira; Yamagishi, Nobutoshi; Suzumura, Nobuo; Horiba, Isao.

    1984-01-01

    The relationship between spatial resolution or artifacts and the data sampling rate was investigated by computer simulation, in order to pursue the causes of the degradation of CT image quality. First, the generation of projection data and the reconstruction calculation process are described; then the results are shown for the relation between the angular sampling interval and spatial resolution or artifacts, and for the relation between the projection data sampling interval and spatial resolution or artifacts. It was clarified that the formulation of the relationship between spatial resolution and data sampling rate performed so far for parallel X-ray beams can also be applied to fan beams. In conclusion, when other reconstruction parameters are the same in fan-beam CT systems, spatial resolution is determined by the projection data sampling rate rather than by the angular sampling rate. The mechanism of artifact generation due to an insufficient number of angular samples was made clear. It was also made clear that there is a definite relationship among the measuring region, the angular sampling rate and the projection data sampling rate, and that the amount of artifacts depending upon the projection data sampling rate is proportional to the amount of spatial frequency components (aliasing components) of a test object above the Nyquist frequency of the projection data. (Wakatsuki, Y.)

  4. Hanford site transuranic waste sampling plan

    International Nuclear Information System (INIS)

    GREAGER, T.M.

    1999-01-01

    This sampling plan (SP) describes the selection of containers for sampling of homogeneous solids and soil/gravel and for visual examination of transuranic and mixed transuranic (collectively referred to as TRU) waste generated at the U.S. Department of Energy (DOE) Hanford Site. The activities described in this SP will be conducted under the Hanford Site TRU Waste Certification Program. This SP is designed to meet the requirements of the Transuranic Waste Characterization Quality Assurance Program Plan (CAO-94-1010) (DOE 1996a) (QAPP), site-specific implementation of which is described in the Hanford Site Transuranic Waste Characterization Program Quality Assurance Project Plan (HNF-2599) (Hanford 1998b) (QAPP). The QAPP defines the quality assurance (QA) requirements and protocols for TRU waste characterization activities at the Hanford Site. In addition, the QAPP identifies responsible organizations, describes required program activities, outlines sampling and analysis strategies, and identifies procedures for characterization activities. The QAPP identifies specific requirements for TRU waste sampling plans. Table 1-1 presents these requirements and indicates sections in this SP where these requirements are addressed

  5. Sampling of temporal networks: Methods and biases

    Science.gov (United States)

    Rocha, Luis E. C.; Masuda, Naoki; Holme, Petter

    2017-11-01

    Temporal networks have been increasingly used to model a diversity of systems that evolve in time; for example, human contact structures over which dynamic processes such as epidemics take place. A fundamental aspect of real-life networks is that they are sampled within temporal and spatial frames. Furthermore, one might wish to subsample networks to reduce their size for better visualization or to perform computationally intensive simulations. The sampling method may affect the network structure and thus caution is necessary to generalize results based on samples. In this paper, we study four sampling strategies applied to a variety of real-life temporal networks. We quantify the biases generated by each sampling strategy on a number of relevant statistics such as link activity, temporal paths and epidemic spread. We find that some biases are common in a variety of networks and statistics, but one strategy, uniform sampling of nodes, shows improved performance in most scenarios. Given the particularities of temporal network data and the variety of network structures, we recommend that the choice of sampling methods be problem oriented to minimize the potential biases for the specific research questions on hand. Our results help researchers to better design network data collection protocols and to understand the limitations of sampled temporal network data.
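
    A minimal harness for this kind of bias measurement follows (the synthetic contact process, sample fraction, and statistic are our choices): generate a temporal edge list, apply uniform node sampling, and compare a statistic such as mean link activity before and after. On a homogeneous random network the bias is small; on real, heterogeneous temporal networks the same harness exposes much larger distortions.

```python
import random
from collections import defaultdict

random.seed(11)

# Synthetic temporal network: (node u, node v, timestamp) contact events,
# five contacts between distinct random nodes at each of 1000 time steps.
N_NODES = 200
events = [(*random.sample(range(N_NODES), 2), t)
          for t in range(1000) for _ in range(5)]

def mean_link_activity(evts):
    """Average number of events per observed node pair."""
    acts = defaultdict(int)
    for u, v, _ in evts:
        acts[frozenset((u, v))] += 1
    return sum(acts.values()) / len(acts)

# Uniform node sampling: keep only events between sampled nodes.
kept = set(random.sample(range(N_NODES), N_NODES // 2))
sub = [(u, v, t) for u, v, t in events if u in kept and v in kept]

print(f"full network:    {mean_link_activity(events):.2f} events/pair")
print(f"50% node sample: {mean_link_activity(sub):.2f} events/pair "
      f"({len(sub)} of {len(events)} events retained)")
```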

  6. Retained Gas Sampling Results for the Flammable Gas Program

    International Nuclear Information System (INIS)

    Bates, J.M.; Mahoney, L.A.; Dahl, M.E.; Antoniak, Z.I.

    1999-01-01

    The key phenomena of the Flammable Gas Safety Issue are generation of the gas mixture, the modes of gas retention, and the mechanisms causing release of the gas. An understanding of the mechanisms of these processes is required for final resolution of the safety issue. Central to understanding is gathering information from such sources as historical records, tank sampling data, tank process data (temperatures, ventilation rates, etc.), and laboratory evaluations conducted on tank waste samples

  7. Retained Gas Sampling Results for the Flammable Gas Program

    Energy Technology Data Exchange (ETDEWEB)

    J.M. Bates; L.A. Mahoney; M.E. Dahl; Z.I. Antoniak

    1999-11-18

    The key phenomena of the Flammable Gas Safety Issue are generation of the gas mixture, the modes of gas retention, and the mechanisms causing release of the gas. An understanding of the mechanisms of these processes is required for final resolution of the safety issue. Central to understanding is gathering information from such sources as historical records, tank sampling data, tank process data (temperatures, ventilation rates, etc.), and laboratory evaluations conducted on tank waste samples.

  8. Generational differences in acute care nurses.

    Science.gov (United States)

    Widger, Kimberley; Pye, Christine; Cranley, Lisa; Wilson-Keates, Barbara; Squires, Mae; Tourangeau, Ann

    2007-01-01

    Generational differences in values, expectations and perceptions of work have been proposed as one basis for problems and solutions in recruitment and retention of nurses. This study used a descriptive design. A sample of 8207 registered nurses and registered practical nurses working in Ontario, Canada, acute care hospitals who responded to the Ontario Nurse Survey in 2003 were included in this study. Respondents were categorized as Baby Boomers, Generation X or Generation Y based on their birth year. Differences in responses among these three generations to questions about their own characteristics, employment circumstances, work environment and responses to the work environment were explored. There were statistically significant differences among the generations. Baby Boomers primarily worked full-time day shifts. Gen Y tended to be employed in teaching hospitals; Boomers worked more commonly in community hospitals. Baby Boomers were generally more satisfied with their jobs than Gen X or Gen Y nurses. Gen Y had the largest proportion of nurses with high levels of burnout in the areas of emotional exhaustion and depersonalization. Baby Boomers had the largest proportion of nurses with low levels of burnout. Nurse managers may be able to capitalize on differences in generational values and needs in designing appropriate interventions to enhance recruitment and retention of nurses.

  9. An integrate-over-temperature approach for enhanced sampling.

    Science.gov (United States)

    Gao, Yi Qin

    2008-02-14

    A simple method is introduced to achieve an efficient random walk in energy space in molecular dynamics simulations, which thus enhances sampling over a large energy range. The approach is closely related to multicanonical and replica exchange simulation methods in that it allows configurations of the system to be sampled over a wide energy range by making use of Boltzmann distribution functions at multiple temperatures. A biased potential is quickly generated using this method and is then used in accelerated molecular dynamics simulations.
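
    As we read it, the bias rests on summing Boltzmann factors over a ladder of temperatures. The sketch below (temperature range, weights, and function names are our assumptions; in practice the ladder weights n_k are tuned iteratively) evaluates the resulting effective potential and the factor by which it rescales the physical force, showing how barriers shrink at high energy.

```python
import numpy as np
from scipy.special import logsumexp

# Temperature ladder spanning the energy range to be enhanced; beta0 is the
# target (lowest) temperature at which averages are ultimately wanted.
kB = 1.0
temps = np.linspace(1.0, 5.0, 20)
betas = 1.0 / (kB * temps)
beta0 = betas[0]
n_k = np.ones_like(betas)   # ladder weights; tuned iteratively in practice

def biased_energy(U):
    """Effective potential -(1/beta0) * ln sum_k n_k exp(-beta_k U):
    a sum of Boltzmann factors over many temperatures, which flattens the
    landscape so the system random-walks over a wide energy range."""
    return -logsumexp(-np.outer(betas, U), b=n_k[:, None], axis=0) / beta0

def force_scale(U):
    """Derivative of the biased potential w.r.t. U, i.e. the factor that
    rescales the physical force:
    sum_k n_k beta_k exp(-beta_k U) / (beta0 * sum_k n_k exp(-beta_k U))."""
    log_terms = -np.outer(betas, U) + np.log(n_k)[:, None]
    num = logsumexp(log_terms, b=betas[:, None], axis=0)
    den = logsumexp(log_terms, axis=0)
    return np.exp(num - den) / beta0

U = np.linspace(0.0, 50.0, 6)
print("U        :", U)
print("U_biased :", np.round(biased_energy(U), 2))
print("dU'/dU   :", np.round(force_scale(U), 3))  # < 1: barriers are scaled down
```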

  10. Glyphosate-rich air samples induce IL-33, TSLP and generate IL-13-dependent airway inflammation

    Science.gov (United States)

    Kumar, Sudhir; Khodoun, Marat; Kettleson, Eric M.; McKnight, Christopher; Reponen, Tiina; Grinshpun, Sergey A.; Adhikari, Atin

    2014-01-01

    Several low-molecular-weight molecules have often been implicated in the induction of occupational asthma. Glyphosate, a small-molecule herbicide, is widely used around the world. There is controversy regarding the role of glyphosate in developing asthma and rhinitis among farmers, and the mechanism is unexplored. The aim of this study was to explore the mechanisms of glyphosate-induced pulmonary pathology by utilizing murine models and real environmental samples. C57BL/6, TLR4−/−, and IL-13−/− mice inhaled extracts of glyphosate-rich air samples collected on farms during spraying of herbicides, or inhaled different doses of glyphosate and ovalbumin. The cellular response, humoral response, and lung function of exposed mice were evaluated. Exposure of the lungs to glyphosate-rich air samples, as well as to glyphosate alone, increased eosinophil and neutrophil counts, mast cell degranulation, and production of IL-33, TSLP, IL-13, and IL-5. In contrast, in vivo systemic IL-4 production was not increased. Co-administration of ovalbumin with glyphosate did not substantially change the inflammatory immune response. However, IL-13 deficiency resulted in a diminished inflammatory response but did not have a significant effect on airway resistance upon methacholine challenge after 7 or 21 days of glyphosate exposure. Glyphosate-rich farm air samples as well as glyphosate alone were found to induce pulmonary IL-13-dependent inflammation and to promote Th2-type cytokines, but not IL-4 for glyphosate alone. This study, for the first time, provides evidence for the mechanism of glyphosate-induced occupational lung disease. PMID:25172162

  11. WTP Waste Feed Qualification: Hydrogen Generation Rate Measurement Apparatus Testing Report

    Energy Technology Data Exchange (ETDEWEB)

    Stone, M. E. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Newell, J. D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Smith, T. E. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Pareizs, J. M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-06-01

    The generation rate of hydrogen gas in the Hanford tank waste will be measured during the qualification of the staged tank waste for processing in the Hanford Tank Waste Treatment and Immobilization Plant. Based on a review of past practices in measurement of hydrogen generation, an apparatus to perform this measurement has been designed and tested for use during waste feed qualification. The hydrogen generation rate measurement apparatus (HGRMA) described in this document utilized a 100 milliliter sample in a continuously-purged, continuously-stirred vessel, with measurement of hydrogen concentration in the vent gas. The vessel and lid had a combined 220 milliliters of headspace. The vent gas system included a small condenser to prevent excessive evaporative losses from the sample during the test, as well as a demister and filter to prevent particle migration from the sample to the gas chromatography system. The gas chromatograph was an on-line automated instrument with a large-volume sample-injection system to allow measurement of very low hydrogen concentrations. This instrument automatically sampled the vent gas from the hydrogen generation rate measurement apparatus every five minutes and performed data regression in real time. The fabrication of the hydrogen generation rate measurement apparatus was in accordance with twenty-three (23) design requirements documented in the conceptual design package, as well as seven (7) required developmental activities documented in the task plan associated with this work scope. The HGRMA was initially tested for proof of concept with physical simulants, and a remote demonstration of the system was performed in the Savannah River National Laboratory Shielded Cells Mockup Facility. Final verification testing was performed using non-radioactive simulants of the Hanford tank waste. Three different simulants were tested to bound the expected rheological properties expected during waste feed qualification testing. These

  12. A New Generation of Thermal Desorption Technology Incorporating Multi-Mode Sampling (NRT/DAAMS/Liquid Agent) for Both On- and Off-Line Analysis of Trace Level Airborne Chemical Warfare Agents

    International Nuclear Information System (INIS)

    Roberts, G. M.

    2007-01-01

    A multi-functional, twin-trap, electrically-cooled thermal desorption (TD) system (TT24-7) will be discussed for the analysis of airborne trace-level chemical warfare agents. This technology can operate in both military environments (CW stockpile or destruction facilities) and civilian locations, where it is used to monitor for accidental or terrorist release of acutely toxic substances. The TD system interfaces to GC, GC-MS or direct MS analytical platforms and provides for on-line continuous air monitoring with no sampling-time blind spots and within a near-real-time (NRT) context. Using this technology enables on-line sub-ppt levels of agent detection from a vapour sample. In addition to continuous sampling, the system has the capacity for off-line single (DAAMS) tube analysis and the ability to receive an external liquid agent injection. The multi-mode sampling functionality provides considerable flexibility to the TD system, allowing continuous monitoring of an environment for toxic substances plus the ability to analyse calibration standards. A calibration solution can be introduced via a conventional sampling tube onto either cold trap, or as a direct liquid injection using a conventional capillary split/splitless injection port within a gas chromatograph. Low-level (linearity) data will be supplied showing the TT24-7 analyzing a variety of CW compounds, including free (underivatised) VX, using the three sampling modes described above. Stepwise changes in vapour-generated agent concentrations will be shown, cross-referenced against direct liquid agent introduction and the tube sampling modes. This technology is in use today in several geographies around the world, in both static and mobile analytical laboratories. (author)

  13. Efficient and exact sampling of simple graphs with given arbitrary degree sequence.

    Directory of Open Access Journals (Sweden)

    Charo I Del Genio

    Full Text Available Uniform sampling from graphical realizations of a given degree sequence is a fundamental component in simulation-based measurements of network observables, with applications ranging from epidemics, through social networks, to Internet modeling. Existing graph sampling methods are either link-swap based (Markov-Chain Monte Carlo algorithms) or stub-matching based (the Configuration Model). Both types are ill-controlled, with typically unknown mixing times for link-swap methods and uncontrolled rejections for the Configuration Model. Here we propose an efficient, polynomial-time algorithm that generates statistically independent graph samples with a given, arbitrary, degree sequence. The algorithm provides a weight associated with each sample, allowing the observable to be measured either uniformly over the graph ensemble, or, alternatively, with a desired distribution. Unlike other algorithms, this method always produces a sample, without back-tracking or rejections. Using a central limit theorem-based reasoning, we argue that for large n, and for degree sequences admitting many realizations, the sample weights are expected to have a lognormal distribution. As examples, we apply our algorithm to generate networks with degree sequences drawn from power-law distributions and from binomial distributions.
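    The essence of the construction can be sketched in a few lines of Python: connect the highest-degree node first, allow only partners that keep the residual sequence graphical, and accumulate the sample weight from the number of allowed choices at each step. This is a simplified sketch: the published algorithm uses a star-constrained graphicality test that provably never dead-ends, whereas the plain Erdős–Gallai test below does not carry that guarantee.

```python
import math
import random

def is_graphical(residual):
    """Erdős–Gallai test: can this residual degree sequence be realized
    as a simple graph?"""
    seq = sorted((d for d in residual if d > 0), reverse=True)
    if sum(seq) % 2:
        return False
    for k in range(1, len(seq) + 1):
        if sum(seq[:k]) > k * (k - 1) + sum(min(d, k) for d in seq[k:]):
            return False
    return True

def sample_graph(degrees, rng=None):
    """Return (edge set, log importance weight) for one realization."""
    rng = rng or random.Random()
    residual = list(degrees)
    if not is_graphical(residual):
        raise ValueError("degree sequence is not graphical")
    edges, log_w = set(), 0.0
    while any(residual):
        hub = max(range(len(residual)), key=residual.__getitem__)
        while residual[hub] > 0:
            allowed = []
            for v in range(len(residual)):
                if v == hub or residual[v] == 0 or (min(hub, v), max(hub, v)) in edges:
                    continue
                residual[hub] -= 1
                residual[v] -= 1
                if is_graphical(residual):       # plain test; the paper's
                    allowed.append(v)            # star-constrained test is stronger
                residual[hub] += 1
                residual[v] += 1
            if not allowed:                      # cannot occur with the full algorithm
                raise RuntimeError("dead end under the simplified test; restart")
            v = rng.choice(allowed)
            log_w += math.log(len(allowed))      # corrects for non-uniform sampling
            edges.add((min(hub, v), max(hub, v)))
            residual[hub] -= 1
            residual[v] -= 1
    return edges, log_w

edges, log_w = sample_graph([3, 3, 2, 2, 2], random.Random(1))
print(len(edges), log_w)                         # 6 edges plus this sample's weight
```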

  14. Effects of climate change on income generating activities of farmers ...

    African Journals Online (AJOL)

    The need to examine the changes that climate change brings about in the income-generating activities of farmers necessitated this study. Two local government areas (LGAs) were randomly selected, and simple random sampling was used to sample 160 farmers from the 2 LGAs. Chi-square and Pearson ...

  15. Connecting Research to Teaching: Using Data to Motivate the Use of Empirical Sampling Distributions

    Science.gov (United States)

    Lee, Hollylynne S.; Starling, Tina T.; Gonzalez, Marggie D.

    2014-01-01

    Research shows that students often struggle with understanding empirical sampling distributions. Using hands-on and technology-based models and simulations of problems generated by real data helps students begin to make connections between repeated sampling, sample size, distribution, variation, and center. A task to assist teachers in implementing…
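    The repeated-sampling idea the authors build on is straightforward to simulate outside any particular classroom tool; a minimal illustration (all names ours):

```python
import random
import statistics

def empirical_sampling_distribution(population, sample_size, n_draws=1000):
    """Draw many samples and record each sample mean; the collection of
    means is the empirical sampling distribution of the mean."""
    return [statistics.mean(random.choices(population, k=sample_size))
            for _ in range(n_draws)]

population = [random.gauss(50, 10) for _ in range(10_000)]
means = empirical_sampling_distribution(population, sample_size=25)
# center stays near 50; spread shrinks toward 10/sqrt(25) = 2
print(statistics.mean(means), statistics.stdev(means))
```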

  16. Stochastic generation of explicit pore structures by thresholding Gaussian random fields

    Energy Technology Data Exchange (ETDEWEB)

    Hyman, Jeffrey D., E-mail: jhyman@lanl.gov [Program in Applied Mathematics, University of Arizona, Tucson, AZ 85721-0089 (United States); Computational Earth Science, Earth and Environmental Sciences (EES-16), and Center for Nonlinear Studies, Los Alamos National Laboratory, Los Alamos, NM 87544 (United States); Winter, C. Larrabee, E-mail: winter@email.arizona.edu [Program in Applied Mathematics, University of Arizona, Tucson, AZ 85721-0089 (United States); Department of Hydrology and Water Resources, University of Arizona, Tucson, AZ 85721-0011 (United States)

    2014-11-15

    We provide a description and computational investigation of an efficient method to stochastically generate realistic pore structures. Smolarkiewicz and Winter introduced this specific method in pores resolving simulation of Darcy flows (Smolarkiewicz and Winter, 2010 [1]) without giving a complete formal description or analysis of the method, or indicating how to control the parameterization of the ensemble. We address both issues in this paper. The method consists of two steps. First, a realization of a correlated Gaussian field, or topography, is produced by convolving a prescribed kernel with an initial field of independent, identically distributed random variables. The intrinsic length scales of the kernel determine the correlation structure of the topography. Next, a sample pore space is generated by applying a level threshold to the Gaussian field realization: points are assigned to the void phase or the solid phase depending on whether the topography over them is above or below the threshold. Hence, the topology and geometry of the pore space depend on the form of the kernel and the level threshold. Manipulating these two user prescribed quantities allows good control of pore space observables, in particular the Minkowski functionals. Extensions of the method to generate media with multiple pore structures and preferential flow directions are also discussed. To demonstrate its usefulness, the method is used to generate a pore space with physical and hydrological properties similar to a sample of Berea sandstone. Highlights: • An efficient method to stochastically generate realistic pore structures is provided. • Samples are generated by applying a level threshold to a Gaussian field realization. • Two user prescribed quantities determine the topology and geometry of the pore space. • Multiple pore structures and preferential flow directions can be produced. • A pore space based on Berea sandstone is generated.
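    The two-step recipe translates almost directly into code. A minimal sketch, assuming a Gaussian smoothing kernel as the prescribed convolution kernel (the method admits any kernel; names and defaults here are illustrative):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def generate_pore_space(shape=(128, 128), corr_length=4.0, threshold=0.0, seed=0):
    """Threshold a correlated Gaussian field to obtain a binary pore
    structure. corr_length sets the kernel's intrinsic length scale
    (the correlation structure); threshold sets the void fraction.
    These are the two user-prescribed quantities discussed in the abstract."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(shape)                       # iid initial field
    topography = gaussian_filter(white, sigma=corr_length)   # convolution step
    topography /= topography.std()                           # stable threshold scale
    return topography > threshold                            # True = void, False = solid

pores = generate_pore_space(threshold=0.3)
print(pores.mean())    # porosity; raise or lower the threshold to tune it
```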

  17. Rapid analysis of mixed waste samples via the optical emission from laser initiated microplasmas

    International Nuclear Information System (INIS)

    Barefield, J.E. II; Ferran, M.D.; Cremers, D.A.

    1993-01-01

    Wavelength-resolved optical emission from laser-initiated microplasmas in samples containing Pu, Am, Pb, Cr, and Be was used to determine elemental compositions. Traditionally, samples of this type are analyzed by neutron activation, X-ray fluorescence, atomic absorption (AA), inductively coupled plasma - atomic emission spectroscopy (ICP-AES), and inductively coupled plasma mass spectroscopy (ICP-MS). Analysis via the traditional analytical spectroscopic techniques involves extensive sample separation and preparation, which results in the generation of significant quantities of additional waste. In the laser-based method, little to no sample preparation is required. The method is essentially waste-free, since only a few micrograms of material are removed from the sample in the generation of the microplasma. Detection limits of the laser-based method typically range from sub-ppm to tens of ppm. In this report, the optical emission from samples containing Pu, Am, Pb, Cr, and Be will be discussed. We will also discuss the essential elements of the analysis method

  18. GRD: An SPSS extension command for generating random data

    Directory of Open Access Journals (Sweden)

    Bradley Harding

    2014-09-01

    Full Text Available To master statistics and data analysis tools, it is necessary to understand a number of concepts, many of which are quite abstract. For example, sampling from a theoretical distribution can help individuals explore and understand randomness. Sampling can also be used to build exercises aimed at helping students master statistics. Here, we present GRD (Generator of Random Data), an extension command for SPSS (version 17 and above). With GRD, it is possible to get random data from a given distribution. In its simplest use, GRD will return a set of simulated data from a normal distribution. With subcommands to GRD, it is possible to get data from multiple groups, over multiple repeated measures, and with desired effect sizes. Group sizes can be equal or unequal. With further subcommands, it is possible to sample from any theoretical population (not simply the normal distribution), introduce non-homogeneous variances, fix or randomize subject effects, etc. Finally, GRD's generated data are in a format ready to be analyzed.
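    GRD itself is an SPSS extension command, but its simplest behavior (grouped normal data with chosen means, spreads, and group sizes) can be sketched in Python for readers without SPSS; the function and argument names below are ours, not GRD syntax:

```python
import numpy as np

def grd_like(n_per_group=(20, 25), means=(100.0, 105.0), sds=(15.0, 15.0), seed=1):
    """Simulate data for several groups with user-chosen effects (mean
    differences), optionally unequal sizes and non-homogeneous variances.
    Returns (group, value) rows ready for analysis."""
    rng = np.random.default_rng(seed)
    rows = []
    for g, (n, mu, sd) in enumerate(zip(n_per_group, means, sds), start=1):
        rows += [(g, x) for x in rng.normal(mu, sd, size=n)]
    return rows

print(grd_like()[:3])    # first few (group, value) pairs from group 1
```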

  19. Next-Generation Pathology.

    Science.gov (United States)

    Caie, Peter D; Harrison, David J

    2016-01-01

    The field of pathology is rapidly transforming from a semiquantitative and empirical science toward a big data discipline. Large data sets from across multiple omics fields may now be extracted from a patient's tissue sample. Tissue is, however, complex, heterogeneous, and prone to artifact. A reductionist view of tissue and disease progression, which does not take this complexity into account, may lead to single biomarkers failing in clinical trials. The integration of standardized multi-omics big data and the retention of valuable information on spatial heterogeneity are imperative to model complex disease mechanisms. Mathematical modeling through systems pathology approaches is the ideal medium to distill the significant information from these large, multi-parametric, and hierarchical data sets. Systems pathology may also predict the dynamical response of disease progression or response to therapy regimens from a static tissue sample. Next-generation pathology will incorporate big data with systems medicine in order to personalize clinical practice for both prognostic and predictive patient care.

  20. Correlation between k-space sampling pattern and MTF in compressed sensing MRSI.

    Science.gov (United States)

    Heikal, A A; Wachowicz, K; Fallone, B G

    2016-10-01

    To investigate the relationship between the k-space sampling patterns used for compressed sensing MR spectroscopic imaging (CS-MRSI) and the modulation transfer function (MTF) of the metabolite maps. This relationship may allow the desired frequency content of the metabolite maps to be quantitatively tailored when designing an undersampling pattern. Simulations of a phantom were used to calculate the MTF of Nyquist-sampled (NS) 32 × 32 MRSI and four-times-undersampled CS-MRSI reconstructions. The dependence of the CS-MTF on the k-space sampling pattern was evaluated for three sets of k-space sampling patterns generated using different probability distribution functions (PDFs). CS-MTFs were also evaluated for three more sets of patterns generated using a modified algorithm in which the sampling ratios are constrained to adhere to the PDFs. Strong visual correlation as well as high R² was found between the MTF of CS-MRSI and the product of the frequency-dependent sampling ratio and the NS 32 × 32 MTF. Also, PDF-constrained sampling patterns led to higher reproducibility of the CS-MTF and stronger correlations to the above-mentioned product. The relationship established in this work provides the user with a theoretical solution for the MTF of CS-MRSI that is both predictable and customizable to the user's needs.
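    To make the pattern-design step concrete, the sketch below generates a variable-density undersampling mask in which the constrained variant fixes the sample count within each radial k-space band to match the PDF, which is the property the study found gives more reproducible MTFs. The polynomial PDF and the band count are illustrative assumptions, not the authors' exact patterns.

```python
import numpy as np

def cs_sampling_mask(n=32, accel=4, power=2.0, n_bands=8, seed=0):
    """Boolean k-space mask: inclusion probability decays with radius as
    (1 - r)**power; within each radial band, exactly the PDF-implied
    number of points is sampled (the 'constrained' variant)."""
    rng = np.random.default_rng(seed)
    ky, kx = np.meshgrid(np.arange(n) - n // 2, np.arange(n) - n // 2, indexing="ij")
    r = np.sqrt(kx ** 2 + ky ** 2) / (n / 2)
    pdf = np.clip(1 - r, 0, 1) ** power
    pdf *= (n * n / accel) / pdf.sum()               # mean sampling ratio = 1/accel
    pdf = np.clip(pdf, 0, 1)
    mask = np.zeros((n, n), dtype=bool)
    bands = np.linspace(0, 1, n_bands + 1)
    for lo, hi in zip(bands[:-1], bands[1:]):
        idx = np.flatnonzero((r >= lo) & (r < hi))
        k = min(int(round(pdf.flat[idx].sum())), idx.size)  # constrained count
        mask.flat[rng.choice(idx, size=k, replace=False)] = True
    return mask

print(cs_sampling_mask().mean())    # close to 1/accel = 0.25
```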

  1. Monte Carlo generated spectra for QA/QC of automated NAA routine

    International Nuclear Information System (INIS)

    Jackman, K.R.; Biegalski, S.R.

    2007-01-01

    A quality check for an automated system for analyzing large sets of neutron-activated samples has been developed. Activated samples are counted with an HPGe detector, in conjunction with an automated sample changer and spectral analysis tools, controlled by the Canberra GENIE 2K and REXX software. After each sample is acquired and analyzed, a Microsoft Visual Basic program imports the results into a template Microsoft Excel file where the final concentrations, uncertainties, and detection limits are determined. Standard reference materials are included in each set of 40 samples as a quality assurance/quality control (QA/QC) check. A select group of sample spectra is also visually reviewed to check the peak-fitting routines. A reference spectrum was generated in MCNP 4c2 using an F8 (pulse-height) tally with a model of the actual detector used in counting. The detector model matches the detector resolution, energy calibration, and counting geometry. The generated spectrum also contained a radioisotope matrix similar to what was expected in the samples. This spectrum can then be put through the automated system and analyzed along with the other samples. The automated results are then compared to the expected results for QA/QC purposes. (author)

  2. Effects of water treatment and sample granularity on radiation sensitivity and stability of EPR signals in X-ray irradiated bone samples

    International Nuclear Information System (INIS)

    Ciesielski, Bartlomiej; Krefft, Karolina; Penkowski, Michal; Kaminska, Joanna; Drogoszewska, Barbara

    2014-01-01

    The article describes the effects of sample conditions during irradiation and electron paramagnetic resonance (EPR) measurements on the background (BG) and dosimetric EPR signals in bone. Intensity of the BG signal increased two to three times after crushing of bone to sub-millimetre grains. Immersion of samples in water caused about a 50% drop in intensity of the BG component, followed by its regrowth over 1-2 months. Irradiation of bone samples produced an axial dosimetric EPR signal (radiation-induced signal) attributed to the hydroxyapatite component of bone. This signal was stable and was not affected by water. In samples irradiated in dry conditions, an EPR signal similar to the native BG was also generated by radiation. In samples irradiated in wet conditions, this BG-like component was initially much smaller than in bone irradiated dry, but increased over time, reaching similar levels as in dry-irradiated samples. It is concluded that the accuracy of EPR dosimetry in bones can be improved if calibration of the samples is done by irradiating them in wet conditions. (authors)

  3. Planning the human variome project: the Spain report.

    NARCIS (Netherlands)

    Kaput, J.; Cotton, R.G.; Hardman, L.; Watson, M.; Aqeel, A.I. Al; Al-Aama, J.Y.; Al-Mulla, F.; Alonso, S.; Aretz, S.; Auerbach, A.D.; Bapat, B.; Bernstein, I.T.; Bhak, J.; Bleoo, S.L.; Blocker, H.; Brenner, S.E.; Burn, J.; Bustamante, M.; Calzone, R.; Cambon-Thomsen, A.; Cargill, M.; Carrera, P.; Cavedon, L.; Cho, Y.S.; Chung, Y.J.; Claustres, M.; Cutting, G.; Dalgleish, R.; Dunnen, J.T. den; Diaz, C.; Dobrowolski, S.; Santos, M.R. dos; Ekong, R.; Flanagan, S.B.; Flicek, P.; Furukawa, Y.; Genuardi, M.; Ghang, H.; Golubenko, M.V.; Greenblatt, M.S.; Hamosh, A.; Hancock, J.M.; Hardison, R.; Harrison, T.M.; Hoffmann, R.; Horaitis, R.; Howard, H.J.; Barash, C.I.; Izagirre, N.; Jung, J.; Kojima, T.; Laradi, S.; Lee, Y.S.; Lee, J.Y.; Gil-da-Silva-Lopes, V.L.; Macrae, F.A.; Maglott, D.; Marafie, M.J.; Marsh, S.G.; Matsubara, Y.; Messiaen, L.M.; Moslein, G.; Netea, M.G.; Norton, M.L.; Oefner, P.J.; Oetting, W.S.; O'Leary, J.C.; Ramirez, A.M. de; Paalman, M.H.; Parboosingh, J.; Patrinos, G.P.; Perozzi, G.; Phillips, I.R.; Povey, S.; Prasad, S.; Qi, M.; Quin, D.J.; Ramesar, R.S.; Richards, C.S.; Savige, J.; Scheible, D.G.; Scott, R.J.; Seminara, D.; Shephard, E.A.; Sijmons, R.H.; Smith, T.D.; Sobrido, M.J.; Tanaka, T.; Tavtigian, S.V.; Taylor, G.R.; Teague, J.; Topel, T.; Ullman-Cullere, M.; Utsunomiya, J.; Kranen, H.J. van; Vihinen, M.; Webb, E.; Weber, T.K.; Yeager, M.

    2009-01-01

    The remarkable progress in characterizing the human genome sequence, exemplified by the Human Genome Project and the HapMap Consortium, has led to the perception that knowledge and the tools (e.g., microarrays) are sufficient for many if not most biomedical research efforts. A large amount of data

  4. Variation in the Kozak sequence of WNT16 results in an increased translation and is associated with osteoporosis related parameters

    DEFF Research Database (Denmark)

    Hendrickx, Gretl; Boudin, Eveline; Fijałkowski, Igor

    2014-01-01

    on osteoporosis related parameters. Hereto, we performed a WNT16 candidate gene association study in a population of healthy Caucasian men from the Odense Androgen Study (OAS). Using HapMap, five tagSNPs and one multimarker test were selected for genotyping to cover most of the common genetic variation...

  5. Planning the Human Variome Project : The Spain Report

    NARCIS (Netherlands)

    Kaput, Jim; Cotton, Richard G. H.; Hardman, Lauren; Watson, Michael; Al Aqeel, Aida I.; Al-Aama, Jumana Y.; Al-Mulla, Fahd; Alonso, Santos; Aretz, Stefan; Auerbach, Arleen D.; Bapat, Bharati; Bernstein, Inge T.; Bhak, Jong; Bleoo, Stacey L.; Bloecker, Helmut; Brenner, Steven E.; Burn, John; Bustamante, Mariona; Calone, Rita; Cambon-Thomsen, Anne; Cargill, Michele; Carrera, Paola; Cavedon, Lawrence; Cho, Yoon Shin; Chung, Yeun-Jun; Claustres, Mireille; Cutting, Garry; Dalgleish, Raymond; den Dunnen, Johan T.; Diaz, Carlos; Dobrowolski, Steven; dos Santos, M. Rosario N.; Ekong, Rosemary; Flanagan, Simon B.; Flicek, Paul; Furukawa, Yoichi; Genuardi, Maurizio; Ghang, Ho; Golubenko, Maria V.; Greenblatt, Marc S.; Hamosh, Ada; Hancock, John M.; Hardison, Ross; Harrison, Terence M.; Hoffmann, Robert; Horaitis, Rania; Howard, Heather J.; Barash, Carol Isaacson; Izagirre, Neskuts; Sijmons, Rolf H.

    The remarkable progress in characterizing the human genome sequence, exemplified by the Human Genome Project and the HapMap Consortium, has led to the perception that knowledge and the tools (e.g., microarrays) are sufficient for many if not most biomedical research efforts. A large amount of data

  6. Planning the human variome project: the Spain report

    DEFF Research Database (Denmark)

    Kaput, Jim; Cotton, Richard G H; Hardman, Lauren

    2009-01-01

    The remarkable progress in characterizing the human genome sequence, exemplified by the Human Genome Project and the HapMap Consortium, has led to the perception that knowledge and the tools (e.g., microarrays) are sufficient for many if not most biomedical research efforts. A large amount of dat...

  7. Detecting chaos in irregularly sampled time series.

    Science.gov (United States)

    Kulp, C W

    2013-09-01

    Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum, computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time-series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series; the LSP, however, is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
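    The LSP step is available directly in SciPy; a minimal demonstration on synthetic, irregularly sampled data (the characterization algorithm itself involves more than this spectrum computation):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, 400))            # irregular sample times
y = np.sin(2 * np.pi * 0.5 * t) + 0.3 * rng.standard_normal(t.size)

freqs = np.linspace(0.01, 2.0, 2000)             # trial frequencies, cycles/unit time
power = lombscargle(t, y - y.mean(), 2 * np.pi * freqs)   # LSP takes angular freqs
print(freqs[np.argmax(power)])                   # ~0.5, despite irregular sampling
```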

  8. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications
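    The generation step of the uniform random polygon model is simple to state in code: n vertices drawn independently and uniformly from the confining box, joined in order and closed into a loop. This is a sketch of the sampling step only; computing determinants, colorings, or crossing numbers is a separate matter.

```python
import random

def uniform_random_polygon(n, box=1.0, rng=random.Random(42)):
    """Sample a closed polygon: n iid uniform vertices in a cube of side
    `box`, connected in the order generated, last vertex joined to the first."""
    vertices = [tuple(rng.uniform(0, box) for _ in range(3)) for _ in range(n)]
    edges = [(vertices[i], vertices[(i + 1) % n]) for i in range(n)]
    return vertices, edges

vertices, edges = uniform_random_polygon(100)
print(len(edges))    # 100 edges closing a 100-vertex loop in the unit cube
```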

  9. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  10. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  11. Next Generation Offline Approaches to Trace Gas-Phase Organic Compound Speciation: Sample Collection and Analysis

    Science.gov (United States)

    Sheu, R.; Marcotte, A.; Khare, P.; Ditto, J.; Charan, S.; Gentner, D. R.

    2017-12-01

    Intermediate-volatility and semi-volatile organic compounds (I/SVOCs) are major precursors to secondary organic aerosol, and contribute to tropospheric ozone formation. Their wide volatility range, chemical complexity, behavior in analytical systems, and trace concentrations present numerous hurdles to characterization. We present an integrated sampling-to-analysis system for the collection and offline analysis of trace gas-phase organic compounds with the goal of preserving and recovering analytes throughout sample collection, transport, storage, and thermal desorption for accurate analysis. Custom multi-bed adsorbent tubes are used to collect samples for offline analysis by advanced analytical detectors. The analytical instrumentation comprises an automated thermal desorption system that introduces analytes from the adsorbent tubes into a gas chromatograph, which is coupled with an electron ionization mass spectrometer (GC-EIMS) and other detectors. In order to optimize the collection and recovery for a wide range of analyte volatility and functionalization, we evaluated a variety of commercially-available materials, including Res-Sil beads, quartz wool, glass beads, Tenax TA, and silica gel. Key properties for optimization include inertness, versatile chemical capture, minimal affinity for water, and minimal artifacts or degradation byproducts; these properties were assessed with a diverse mix of traditionally-measured and functionalized analytes. Along with a focus on material selection, we provide recommendations spanning the entire sampling-and-analysis process to improve the accuracy of future comprehensive I/SVOC measurements, including oxygenated and other functionalized I/SVOCs. We demonstrate the performance of our system by providing results on speciated VOCs-SVOCs from indoor, outdoor, and chamber studies that establish the utility of our protocols and pave the way for precise laboratory characterization via a mix of detection methods.

  12. Multipumping flow system for improving hydride generation atomic fluorescence spectrometric determinations

    International Nuclear Information System (INIS)

    Lopez-Garcia, Ignacio; Ruiz-Alcaraz, Irene; Hernandez-Cordoba, Manuel

    2006-01-01

    The advantages of using membrane micropumps rather than peristaltic pumps to introduce both sample and reagent solutions for hydride generation atomic fluorescence spectrometry are discussed. Arsenic was used as a test analyte to check the performance of the proposed manifold. Sample and reagent consumption was reduced 8-9 fold compared with continuous-mode measurements made with peristaltic pumps, with no deterioration in sensitivity. The calibration graph was linear in the 0.05 to 2.5 μg l⁻¹ As range using peak area as the analytical signal and maximum gain in the detector setting. A limit of detection (3σ) of 0.02 μg l⁻¹ and relative standard deviation values close to 2% for 10 independent measurements of a 1 μg l⁻¹ As solution were obtained. The sampling frequency increased from 45 to 102 h⁻¹, with a consequent saving in carrier gas used and a reduction in wastes generated. The instrumental modification, which could be used for other elements currently determined by atomic fluorescence spectrometry, will permit hydride generators of smaller dimensions to be constructed

  13. Fast Ordered Sampling of DNA Sequence Variants

    Directory of Open Access Journals (Sweden)

    Anthony J. Greenberg

    2018-05-01

    Full Text Available Explosive growth in the amount of genomic data is matched by increasing power of consumer-grade computers. Even applications that require powerful servers can be quickly tested on desktop or laptop machines if we can generate representative samples from large data sets. I describe a fast and memory-efficient implementation of an on-line sampling method developed for tape drives 30 years ago. Focusing on genotype files, I test the performance of this technique on modern solid-state and spinning hard drives, and show that it performs well compared to a simple sampling scheme. I illustrate its utility by developing a method to quickly estimate genome-wide patterns of linkage disequilibrium (LD decay with distance. I provide open-source software that samples loci from several variant format files, a separate program that performs LD decay estimates, and a C++ library that lets developers incorporate these methods into their own projects.
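    The textbook member of that tape-drive-era family is selection sampling: one pass over the records, each kept with probability (still needed)/(still remaining), which yields an ordered sample without replacement and with no post-hoc sort. A minimal sketch of that classic technique, not the author's optimized, file-format-aware implementation:

```python
import random

def ordered_sample(stream_length, sample_size, rng=random.Random(7)):
    """One-pass selection sampling over record indices 0..stream_length-1;
    the returned indices are already in increasing order."""
    chosen, needed = [], sample_size
    for i in range(stream_length):
        if rng.random() < needed / (stream_length - i):
            chosen.append(i)
            needed -= 1
            if needed == 0:
                break
    return chosen

print(ordered_sample(1_000_000, 10))    # ten ordered locus indices from a million
```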

  14. Fast Ordered Sampling of DNA Sequence Variants.

    Science.gov (United States)

    Greenberg, Anthony J

    2018-05-04

    Explosive growth in the amount of genomic data is matched by increasing power of consumer-grade computers. Even applications that require powerful servers can be quickly tested on desktop or laptop machines if we can generate representative samples from large data sets. I describe a fast and memory-efficient implementation of an on-line sampling method developed for tape drives 30 years ago. Focusing on genotype files, I test the performance of this technique on modern solid-state and spinning hard drives, and show that it performs well compared to a simple sampling scheme. I illustrate its utility by developing a method to quickly estimate genome-wide patterns of linkage disequilibrium (LD) decay with distance. I provide open-source software that samples loci from several variant format files, a separate program that performs LD decay estimates, and a C++ library that lets developers incorporate these methods into their own projects. Copyright © 2018 Greenberg.

  15. Material sampling for rotor evaluation

    International Nuclear Information System (INIS)

    Mercaldi, D.; Parker, J.

    1990-01-01

    Decisions regarding continued operation of aging rotating machinery must often be made without adequate knowledge of rotor material conditions. Physical specimens of the material are not generally available due to lack of an appropriate sampling technique or the high cost and inconvenience of obtaining such samples. This is despite the fact that examination of such samples may be critical to effectively assess the degradation of mechanical properties of the components in service or to permit detailed examination of microstructure and surface flaws. Such information permits a reduction in the uncertainty of remaining life estimates for turbine rotors to avoid unnecessarily premature and costly rotor retirement decisions. This paper describes the operation and use of a recently developed material sampling device which machines and recovers an undeformed specimen from the surface of rotor bores or other components for metallurgical analysis. The removal of the thin, wafer-like sample has a negligible effect on the structural integrity of these components, due to the geometry and smooth surface finish of the resulting shallow depression. Samples measuring approximately 0.03 to 0.1 inches (0.76 to 2.5 mm) thick by 0.5 to 1.0 inch (1.3 to 2.5 cm) in diameter can be removed without mechanical deformation or thermal degradation of the sample or the remaining component material. The device is operated remotely from a control console and can be used externally or internally on any surface for which there is at least a three-inch (7.6 cm) working clearance. Application of the device in two case studies of turbine-generator evaluations is presented

  16. A user-friendly robotic sample preparation program for fully automated biological sample pipetting and dilution to benefit the regulated bioanalysis.

    Science.gov (United States)

    Jiang, Hao; Ouyang, Zheng; Zeng, Jianing; Yuan, Long; Zheng, Naiyu; Jemal, Mohammed; Arnold, Mark E

    2012-06-01

    Biological sample dilution is a rate-limiting step in bioanalytical sample preparation when the concentrations of samples are beyond standard curve ranges, especially when multiple dilution factors are needed in an analytical run. We have developed and validated a Microsoft Excel-based robotic sample preparation program (RSPP) that automatically transforms Watson worklist sample information (identification, sequence and dilution factor) to comma-separated value (CSV) files. The Freedom EVO liquid handler software imports and transforms the CSV files to executable worklists (.gwl files), allowing the robot to perform sample dilutions at variable dilution factors. The dynamic dilution range is 1- to 1000-fold and divided into three dilution steps: 1- to 10-, 11- to 100-, and 101- to 1000-fold. The whole process, including pipetting samples, diluting samples, and adding internal standard(s), is accomplished within 1 h for two racks of samples (96 samples/rack). This platform also supports online sample extraction (liquid-liquid extraction, solid-phase extraction, protein precipitation, etc.) using 96 multichannel arms. This fully automated and validated sample dilution and preparation process has been applied to several drug development programs. The results demonstrate that application of the RSPP for fully automated sample processing is efficient and rugged. The RSPP not only saved more than 50% of the time in sample pipetting and dilution but also reduced human errors. The generated bioanalytical data are accurate and precise; therefore, this application can be used in regulated bioanalysis.
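    The three-range scheme implies at most one, two, or three serial pipetting steps for overall factors of 1-10, 11-100, and 101-1000, respectively. Below is a toy calculation of equal serial steps under that reading; this is our own interpretation for illustration, and the validated logic lives in the Excel/worklist layer and is not reproduced here.

```python
import math

def dilution_steps(factor):
    """Split an overall dilution factor (1-1000x) into equal serial steps
    of at most 10x each: 1-10x -> one step, 11-100x -> two, 101-1000x -> three."""
    if not 1 <= factor <= 1000:
        raise ValueError("factor outside the validated 1-1000x range")
    n_steps = max(1, math.ceil(math.log10(factor)))
    step = factor ** (1 / n_steps)       # serial steps multiply back to factor
    return [step] * n_steps

print(dilution_steps(250))    # three steps of ~6.3x each (6.3**3 = 250)
```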

  17. Development of a One-Handed, Environmental Surface-Sampling Device

    Science.gov (United States)

    2016-05-01

    individual packaging, an operator can generate a large amount of waste that needs to be managed during a sampling mission. The U.S. Army Edgewood...prepared and spore spotting was performed in a biological safety cabinet. For the spore-spotting procedures, the surfaces were spotted with 1 mL of...260 nm (A260) and 280 nm (A280). To determine the DNA concentration for each sample, the NanoDrop software used a modified Beer–Lambert equation and

  18. A Study on the Representative Sampling Survey for Radionuclide Analysis of RI Waste

    Energy Technology Data Exchange (ETDEWEB)

    Jee, K. Y. [KAERI, Daejeon (Korea, Republic of); Kim, Juyoul; Jung, Gunhyo [FNC Tech. Co., Daejeon (Korea, Republic of)

    2007-07-15

    We developed a quantitative method for obtaining a representative sample during a sampling survey of RI waste. Considering the source, process, and type of RI waste, the method computes the number of samples, confidence interval, variance, and coefficient of variation. We also systematize the method of sampling survey logically and quantitatively. The results of this study can be applied to sampling surveys of low- and intermediate-level waste generated from nuclear power plants during the transfer process to a disposal facility.
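    For a sense of the kind of quantity such a method computes: under the standard normal-approximation result, the number of samples needed to estimate a mean to within relative error e at the confidence implied by z, for a waste stream with coefficient of variation cv, is n = (z·cv/e)². The formula and names below are the generic statistical result, not the study's exact formulation:

```python
import math

def required_samples(cv, rel_error, z=1.96):
    """Samples needed so the estimated mean is within +/- rel_error
    (relative) at the confidence implied by z (1.96 -> 95%), for a
    stream with coefficient of variation cv."""
    return math.ceil((z * cv / rel_error) ** 2)

print(required_samples(cv=0.3, rel_error=0.1))    # 35 samples for +/-10% at 95%
```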

  19. Factors affecting the rural domestic waste generation

    Directory of Open Access Journals (Sweden)

    A.R. Darban Astane

    2017-12-01

    Full Text Available The current study was carried out to evaluate the quantity and quality of rural domestic waste generation and to identify the factors affecting it in rural areas of Khodabandeh county in Zanjan Province, Iran. Waste samplings consisted of 318 rural households in 11 villages. In order to evaluate the quality and quantity of the rural domestic waste, waste production was classified into 12 groups and 2 main groups of organic waste and solid waste. Moreover, the kriging interpolation technique in ARC-GIS software was used to evaluate the spatial distribution of the generated domestic waste, and ultimately multiple regression analysis was used to evaluate the factors affecting the generation of domestic waste. The results of this study showed that the average waste generated by each person was 0.588 kilograms per day, with the share of organic waste generated by each person being 0.409 kilograms per day and the share of solid waste generated by each person being 0.179 kilograms per day. The results from the spatial distribution of waste generation showed a certain pattern in three groups and a higher rate of waste generation in the northern and northwestern parts, especially in the subdistrict. The results of multiple regression analysis showed that the households' income, assets, age, and personal attitude are respectively the most important variables affecting waste generation. The households' attitude and indigenous knowledge on efficient use of materials are also key factors which can help reduce waste generation.

  20. A 1000 Arab genome project to study the Emirati population.

    Science.gov (United States)

    Al-Ali, Mariam; Osman, Wael; Tay, Guan K; AlSafar, Habiba S

    2018-04-01

    Discoveries from the human genome, HapMap, and 1000 genome projects have collectively contributed toward the creation of a catalog of human genetic variations that has improved our understanding of human diversity. Despite the collegial nature of many of these genome study consortiums, which has led to the cataloging of genetic variations of different ethnic groups from around the world, genome data on the Arab population remains overwhelmingly underrepresented. The National Arab Genome project in the United Arab Emirates (UAE) aims to address this deficiency by using Next Generation Sequencing (NGS) technology to provide data to improve our understanding of the Arab genome and catalog variants that are unique to the Arab population of the UAE. The project was conceived to shed light on the similarities and differences between the Arab genome and those of the other ethnic groups.

  1. Efficient sampling algorithms for Monte Carlo based treatment planning

    International Nuclear Information System (INIS)

    DeMarco, J.J.; Solberg, T.D.; Chetty, I.; Smathers, J.B.

    1998-01-01

    Efficient sampling algorithms are necessary for producing a fast Monte Carlo based treatment planning code. This study evaluates several aspects of a photon-based tracking scheme and the effect of optimal sampling algorithms on the efficiency of the code. Four areas were tested: pseudo-random number generation, generalized sampling of a discrete distribution, sampling from the exponential distribution, and delta scattering as applied to photon transport through a heterogeneous simulation geometry. Generalized sampling of a discrete distribution using the cutpoint method can produce speedup gains of one order of magnitude versus conventional sequential sampling. Photon transport modifications based upon the delta scattering method were implemented and compared with a conventional boundary and collision checking algorithm. The delta scattering algorithm is faster by a factor of six versus the conventional algorithm for a boundary size of 5 mm within a heterogeneous geometry. A comparison of portable pseudo-random number algorithms and exponential sampling techniques is also discussed
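    Of the techniques named, the cutpoint method is easy to sketch: a small index table jump-starts the inverse-CDF search so each draw from a discrete distribution costs nearly O(1) rather than an O(n) sequential scan. A generic illustration, not the treatment-planning code itself:

```python
import bisect
import random

class CutpointSampler:
    """Cutpoint method (Chen and Asau) for a discrete distribution:
    cut[j] records where the CDF search should start for u in [j/m, (j+1)/m)."""
    def __init__(self, probs, m=None):
        self.cdf, total = [], 0.0
        for p in probs:
            total += p
            self.cdf.append(total)
        self.m = m or len(probs)
        self.cut = [bisect.bisect_right(self.cdf, j / self.m) for j in range(self.m)]

    def sample(self, rng=random):
        u = rng.random()
        i = self.cut[int(u * self.m)]            # jump near the answer...
        while i < len(self.cdf) - 1 and self.cdf[i] < u:
            i += 1                               # ...then scan a step or two
        return i

s = CutpointSampler([0.5, 0.25, 0.125, 0.125])
print([s.sample() for _ in range(10)])           # indices drawn with those weights
```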

  2. Sampling for radionuclides and other trace substances

    International Nuclear Information System (INIS)

    Eberhardt, L.L.

    1976-01-01

    Various problems with the environment and an energy crisis have resulted in considerable emphasis on the analysis and understanding of natural systems. The present generation of ecological models suffers greatly from a lack of attention to the use of accurate and efficient sampling methods in obtaining the data on which these models are based. Improving ecological sampling requires first of all that the objectives be clearly defined, since different schemes are required for sampling for totals, for changes over time and space, to determine hazards, or for estimating parameters in models. The frequency distributions of most ecological contaminants are not normal, but seem instead to follow a skewed distribution. Coefficients of variation appear to be relatively constant, and typical values may range from 0.1 to 1.0 depending on the substance and circumstances. These typical values may be very useful in designing a sampling plan, either for fixed relative variance or in terms of the sensitivity of a comparison. Several classes of sampling methods are available for particular kinds of objectives. The notion of optimal sampling for parameter estimates is new to ecology, but may possibly be adapted from work done in industrial experimentation to provide a rationale for sampling in time

  3. FINANCIAL LITERACY AMONGST AFRICAN GENERATION Y STUDENTS: AN EMPIRICAL ANALYSIS OF SELECTED DEMOGRAPHIC FACTORS

    Directory of Open Access Journals (Sweden)

    Marko van Deventer

    2017-01-01

    Full Text Available The entire spectrum of society, including Generation Y, faces the challenge of managing their personal finances in uncertain economic, financial and political times. This challenge highlights the importance of being equipped with the necessary financial literacy to make informed financial decisions. Financial illiteracy is a global phenomenon that has become a topical issue. As a result, there has been a steady increase in the body of knowledge that pertains to the importance and benefits of financial literacy and the consequences of financial illiteracy. This study investigates differences in the significantly sized black Generation Y (hereafter referred to as African Generation Y) student cohort's financial literacy in terms of selected demographic factors, namely gender, year and field of study respectively, within the South African context. Following a descriptive research design and a quantitative research approach, data were collected from a convenience sample of 385 African students registered at two Gauteng-based public South African university campuses. Multiple-choice questions relating to general financial knowledge, saving, spending and debt were used to test the students' financial literacy. Data analysis included descriptive statistics, an independent-samples t-test and one-way analysis of variance (ANOVA). The findings suggest that African Generation Y students may be categorised as having a relatively low level of financial literacy and that the sample's financial literacy did not differ much in terms of gender. The findings of this study are likely to inform policymakers, educators, universities and financial institutions on the most effective strategies to employ for implementation with regard to differing financial literacy levels.

  4. Fast physical random bit generation with chaotic semiconductor lasers

    Science.gov (United States)

    Uchida, Atsushi; Amano, Kazuya; Inoue, Masaki; Hirano, Kunihito; Naito, Sunao; Someya, Hiroyuki; Oowada, Isao; Kurashige, Takayuki; Shiki, Masaru; Yoshimori, Shigeru; Yoshimura, Kazuyuki; Davis, Peter

    2008-12-01

    Random number generators in digital information systems make use of physical entropy sources such as electronic and photonic noise to add unpredictability to deterministically generated pseudo-random sequences. However, there is a large gap between the generation rates achieved with existing physical sources and the high data rates of many computation and communication systems; this is a fundamental weakness of these systems. Here we show that good quality random bit sequences can be generated at very fast bit rates using physical chaos in semiconductor lasers. Streams of bits that pass standard statistical tests for randomness have been generated at rates of up to 1.7 Gbps by sampling the fluctuating optical output of two chaotic lasers. This rate is an order of magnitude faster than that of previously reported devices for physical random bit generators with verified randomness. This means that the performance of random number generators can be greatly improved by using chaotic laser devices as physical entropy sources.

  5. The effect of inversion at 8p23 on BLK association with lupus in Caucasian population.

    Directory of Open Access Journals (Sweden)

    Bahram Namjou

    Full Text Available To explore the potential influence of the polymorphic 8p23.1 inversion on known autoimmune susceptibility risk at or near the BLK locus, we validated a new bioinformatics method that utilizes SNP data to enable accurate, high-throughput genotyping of the 8p23.1 inversion in a Caucasian population. Principal components analysis (PCA) was performed using markers inside the inversion territory, followed by k-means cluster analyses on 7416 European-derived and 267 HapMap CEU and TSI samples. A logistic regression conditional analysis was performed. Three subgroups were identified: inversion homozygous, heterozygous and non-inversion homozygous. The status of the inversion was further validated using HapMap samples that had previously undergone fluorescence in situ hybridization (FISH) assays, with a concordance rate above 98%. Conditional analyses based on the status of the inversion were performed. We found that overall association signals in the BLK region remain significant after controlling for inversion status. The proportion of lupus cases and controls (cases/controls) in each subgroup was determined to be 0.97 for the inverted homozygous group (1067 cases and 1095 controls), 1.12 for the inverted heterozygous group (1935 cases and 1717 controls) and 1.36 for the non-inverted subgroup (924 cases and 678 controls). After calculating the linkage disequilibrium between inversion status and the lupus risk haplotype, we found that the lupus risk haplotype tends to reside on the non-inversion background. As a result, a new association effect between non-inversion status and the lupus phenotype was identified (p = 8.18×10⁻⁷, OR = 1.18, 95% CI = 1.10-1.26). Our results demonstrate that both the known lupus risk haplotype and inversion status act additively in the pathogenesis of lupus. Since the inversion regulates expression of many genes in its territory, altered expression of other genes might also be involved in the development of lupus.
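    The genotyping step described above (PCA restricted to SNPs inside the inversion territory, then k-means with k = 3) can be sketched with standard tools; this is an illustration of the approach, not the authors' validated pipeline, and the array layout is our assumption:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def call_inversion_genotypes(genotypes):
    """`genotypes`: (individuals x SNPs) array of 0/1/2 allele counts for
    markers inside the inversion territory. Returns the two leading
    principal components and a 3-cluster assignment separating inverted
    homozygotes, heterozygotes, and non-inverted homozygotes."""
    pcs = PCA(n_components=2).fit_transform(np.asarray(genotypes, dtype=float))
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pcs)
    return pcs, labels
```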

  6. Employer attractiveness from a generational perspective: Implications for employer branding

    Directory of Open Access Journals (Sweden)

    Germano Glufke Reis

    2016-03-01

    Full Text Available This study aimed to identify the employer attractiveness factors prioritized by different generations: Baby Boomers, Generation X, and Generation Y. The survey was conducted with a sample of 937 professionals working in various areas and companies; most were managers with a high education level. The Employer Attractiveness Scale proposed by Berthon et al. (2005) was adopted, and the results indicate that, when choosing a company, the generations under study have specific features regarding the attractiveness attributes they prioritize. It was also observed that Generation Y discriminates and ranks such attributes more clearly than the others. Possible implications for employer branding and research limitations are discussed at the end of the article.

  7. Sample Results From Tank 48H Samples HTF-48-14-158, -159, -169, and -170

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hang, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-04-28

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 48H in support of determining the cause for the unusually high dose rates at the sampling points for this tank. A set of two samples was taken from the quiescent tank, and two additional samples were taken after the contents of the tank were mixed. The results of the analyses of all the samples show that the contents of the tank have changed very little since the analysis of the previous sample in 2012. The solids are almost exclusively composed of tetraphenylborate (TPB) salts, and there is no indication of acceleration in the TPB decomposition. The filtrate composition shows a moderate increase in salt concentration and density, which is attributable to the addition of NaOH for the purposes of corrosion control. An older modeling simulation of the TPB degradation was updated, and the supernate results from a 2012 sample were run in the model. This result was compared to the recent 2014 sample results reported in this document. The model indicates there is no change in the TPB degradation from 2012 to 2014. SRNL measured the buoyancy of the TPB solids in Tank 48H simulant solutions. It was determined that a solution of density 1.279 g/mL (~6.5 M sodium) was capable of indefinitely suspending the TPB solids evenly throughout the solution. A solution of density 1.296 g/mL (~7 M sodium) caused a significant fraction of the solids to float on the solution surface. As the experiments could not include the effect of additional buoyancy elements such as benzene or hydrogen generation, the buoyancy measurements provide an upper-bound estimate of the density in Tank 48H required to float the solids.

  8. Extent and distribution of linkage disequilibrium in the Old Order Amish.

    Science.gov (United States)

    Van Hout, Cristopher V; Levin, Albert M; Rampersaud, Evadnie; Shen, Haiqing; O'Connell, Jeffrey R; Mitchell, Braxton D; Shuldiner, Alan R; Douglas, Julie A

    2010-02-01

    Knowledge of the extent and distribution of linkage disequilibrium (LD) is critical to the design and interpretation of gene mapping studies. Because the demographic history of each population varies and is often not accurately known, it is necessary to empirically evaluate LD on a population-specific basis. Here we present the first genome-wide survey of LD in the Old Order Amish (OOA) of Lancaster County, Pennsylvania, a closed population derived from a modest number of founders. Specifically, we present a comparison of LD between OOA individuals and US Utah participants in the International HapMap project (abbreviated CEU) using a high-density single nucleotide polymorphism (SNP) map. Overall, the allele (and haplotype) frequency distributions and LD profiles were remarkably similar between these two populations. For example, the median absolute allele frequency difference for autosomal SNPs was 0.05, with an inter-quartile range of 0.02-0.09, and for autosomal SNPs 10-20 kb apart with common alleles (minor allele frequency ≥0.05), the LD measure r² was at least 0.8 for 15 and 14% of SNP pairs in the OOA and CEU, respectively. Moreover, tag SNPs selected from the HapMap CEU sample captured a substantial portion of the common variation in the OOA (approximately 88%) at r² ≥0.8. These results suggest that the OOA and CEU may share similar LD profiles for other common but untyped SNPs. Thus, in the context of the common variant-common disease hypothesis, genetic variants discovered in gene mapping studies in the OOA may generalize to other populations. 2009 Wiley-Liss, Inc.
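    For reference, the r² statistic quoted throughout is the standard LD measure computed from haplotype and allele frequencies (the definition below is textbook, not code from the study):

```python
def ld_r2(pAB, pA, pB):
    """r^2 between two biallelic loci: D = pAB - pA*pB and
    r^2 = D**2 / (pA*(1-pA) * pB*(1-pB))."""
    D = pAB - pA * pB
    return D * D / (pA * (1 - pA) * pB * (1 - pB))

print(ld_r2(pAB=0.45, pA=0.5, pB=0.5))    # 0.64 for these frequencies
```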

  9. An extensive analysis of the hereditary hemochromatosis gene HFE and neighboring histone genes: associations with childhood leukemia.

    Science.gov (United States)

    Davis, Charronne F; Dorak, M Tevfik

    2010-04-01

    The most common mutation of the HFE gene, C282Y, has shown a risk association with childhood acute lymphoblastic leukemia (ALL) in Welsh and Scottish case-control studies. This finding has not been replicated outside Britain. Here, we present a thorough analysis of the HFE gene in a panel of HLA-homozygous reference cell lines and in the original population sample from South Wales (117 childhood ALL cases and 414 newborn controls). Twenty-one of the 24 variants analyzed were from the HFE gene region, which extends 52 kb from the histone gene HIST1H1C to HIST1H1T. We identified the single-nucleotide polymorphism (SNP) rs807212 as a tagging SNP for the most common HFE region haplotype, which contains wild-type alleles of all HFE variants examined. This intergenic SNP rs807212 yielded a strong male-specific protective association (per-allele OR = 0.38, 95% CI = 0.22-0.64, P(trend) = 0.0002; P = 0.48 in females), which accounted for the original C282Y risk association. In the HapMap project data, rs807212 was in strong linkage disequilibrium with 25 other SNPs spanning 151 kb around HFE. Minor alleles of these 26 SNPs characterized the most common haplotype for the HFE region, which lacked all disease-associated HFE variants. The HapMap data suggested positive selection in this region, even in populations where the HFE C282Y mutation is absent. These results have implications for the sex-specific associations observed in this region and suggest the inclusion of rs807212 in future studies of the HFE gene and the extended HLA class I region.

  10. Using Dried Blood Spot Sampling to Improve Data Quality and Reduce Animal Use in Mouse Pharmacokinetic Studies

    Science.gov (United States)

    Wickremsinhe, Enaksha R; Perkins, Everett J

    2015-01-01

    Traditional pharmacokinetic analysis in nonclinical studies is based on the concentration of a test compound in plasma and requires approximately 100 to 200 µL blood collected per time point. However, the total blood volume of mice limits the number of samples that can be collected from an individual animal—often to a single collection per mouse—thus necessitating dosing multiple mice to generate a pharmacokinetic profile in a sparse-sampling design. Compared with traditional methods, dried blood spot (DBS) analysis requires smaller volumes of blood (15 to 20 µL), thus supporting serial blood sampling and the generation of a complete pharmacokinetic profile from a single mouse. Here we compare plasma-derived data with DBS-derived data, explain how to adopt DBS sampling to support discovery mouse studies, and describe how to generate pharmacokinetic and pharmacodynamic data from a single mouse. Executing novel study designs that use DBS enhances the ability to identify and streamline better drug candidates during drug discovery. Implementing DBS sampling can reduce the number of mice needed in a drug discovery program. In addition, the simplicity of DBS sampling and the smaller numbers of mice needed translate to decreased study costs. Overall, DBS sampling is consistent with 3Rs principles by achieving reductions in the number of animals used, decreased restraint-associated stress, improved data quality, direct comparison of interanimal variability, and the generation of multiple endpoints from a single study. PMID:25836959
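    A serial profile from a single mouse supports standard non-compartmental analysis directly; for instance, AUC(0 to t_last) by the linear trapezoidal rule (the numbers below are illustrative, not data from the study):

```python
def auc_linear_trapezoid(times, conc):
    """AUC(0 - t_last) from one animal's serial samples; times in h and
    concentrations in ng/mL give AUC in ng*h/mL."""
    return sum((times[i + 1] - times[i]) * (conc[i] + conc[i + 1]) / 2
               for i in range(len(times) - 1))

print(auc_linear_trapezoid([0, 0.5, 1, 2, 4, 8], [0, 12, 10, 6, 3, 1]))  # 33.5
```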

  11. On grey levels in random CAPTCHA generation

    Science.gov (United States)

    Newton, Fraser; Kouritzin, Michael A.

    2011-06-01

    A CAPTCHA is an automatically generated test designed to distinguish between humans and computer programs; specifically, they are designed to be easy for humans but difficult for computer programs to pass in order to prevent the abuse of resources by automated bots. They are commonly seen guarding webmail registration forms, online auction sites, and preventing brute force attacks on passwords. In the following, we address the question: How does adding a grey level to random CAPTCHA generation affect the utility of the CAPTCHA? We treat the problem of generating the random CAPTCHA as one of random field simulation: An initial state of background noise is evolved over time using Gibbs sampling and an efficient algorithm for generating correlated random variables. This approach has already been found to yield highly-readable yet difficult-to-crack CAPTCHAs. We detail how the requisite parameters for introducing grey levels are estimated and how we generate the random CAPTCHA. The resulting CAPTCHA will be evaluated in terms of human readability as well as its resistance to automated attacks in the forms of character segmentation and optical character recognition.

  12. Laser ablation of microparticles for nanostructure generation

    International Nuclear Information System (INIS)

    Waraich, Palneet Singh; Tan, Bo; Venkatakrishnan, Krishnan

    2011-01-01

    The process of laser ablation of microparticles has been shown to generate nanoparticles from microparticles, but the generation of nanoparticle networks from microparticles has never been reported before. We report a unique approach for the generation of nanoparticle networks through ablation of microparticles. Using this approach, two samples containing microparticles of lead oxide (Pb3O4) and nickel oxide (NiO), respectively, were ablated under ambient conditions using a femtosecond laser operating in the MHz repetition rate regime. Nanoparticle networks with particle diameters ranging from 60 to 90 nm were obtained by ablation of microparticles without the use of any specialized equipment, catalysts or external stimulants. The formation of finer nanoparticle networks has been explained by considering the low-pressure region created by the shockwave, which causes rapid condensation of microparticles into finer nanoparticles. A comparison between the nanostructures generated by ablating microparticles and those generated by ablating a bulk substrate was carried out, and a considerable reduction in size and a narrower size distribution were observed. Our nanostructure fabrication technique will be a unique process for nanoparticle network generation from a vast array of materials.

  13. Employing online quantum random number generators for generating truly random quantum states in Mathematica

    Science.gov (United States)

    Miszczak, Jarosław Adam

    2013-01-01

    numbers generated by a quantum real-number generator. Reasons for new version: Added support for the high-speed on-line quantum random number generator and improved methods for retrieving lists of random numbers. Summary of revisions: The presented version provides two significant improvements. The first is the ability to use the on-line Quantum Random Number Generation service developed by PicoQuant GmbH and the Nano-Optics groups at the Department of Physics of Humboldt University. The on-line service supported in version 2.0 of the TRQS package provides faster access to true randomness sources constructed using the laws of quantum physics. The service is freely available at https://qrng.physik.hu-berlin.de/. Using this service allows the package to be used without the need for a physical quantum random number generator. The second improvement introduced in this version is the ability to retrieve arrays of random data directly from the used source. This increases the speed of random number generation, especially in the case of an on-line service, where it reduces the time necessary to establish the connection. Thanks to the speed improvement of the presented version, the package can now be used in simulations requiring larger amounts of random data. Moreover, the functions for generating random numbers provided by the current version of the package more closely follow the pattern of the functions for generating pseudo-random numbers provided in Mathematica. Additional comments: Speed comparison: The implementation of support for the QRNG on-line service provides a noticeable improvement in the speed of random number generation. For samples of real numbers of size 10^1, 10^2, ..., 10^7, the times required to generate these samples using the Quantis USB device and the QRNG service are compared in Fig. 1. The presented results show that the use of the on-line service provides faster access to random numbers. One should note, however, that the speed gain can increase or
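
    To make the batching argument concrete, the snippet below fetches a whole array of numbers in one round trip, so the connection cost is paid once per block rather than once per number. The endpoint path, parameter name, and JSON shape are assumptions for illustration only; the real service requires registration and documents its own protocol.

      import requests

      QRNG_URL = "https://qrng.physik.hu-berlin.de/api/randoms"  # hypothetical path

      def fetch_block(n):
          # One request returns an array of n numbers (assumed response shape),
          # which is the batched-retrieval idea behind the v2.0 speed-up.
          r = requests.get(QRNG_URL, params={"count": n}, timeout=10)
          r.raise_for_status()
          return r.json()["numbers"]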

  14. Sampling in forests for radionuclide analysis. General and practical guidance

    Energy Technology Data Exchange (ETDEWEB)

    Aro, Lasse (Finnish Forest Research Inst. (METLA) (Finland)); Plamboeck, Agneta H. (Swedish Defence Research Agency (FOI) (Sweden)); Rantavaara, Aino; Vetikko, Virve (Radiation and Nuclear Safety Authority (STUK) (Finland)); Straalberg, Elisabeth (Inst. Energy Technology (IFE) (Norway))

    2009-01-15

    The NKS project FOREST was established to prepare a guide for sampling in forest ecosystems for radionuclide analysis. The aim of this guide is to improve the reliability of datasets generated in future studies by promoting the use of consistent, recommended practices, thorough documentation of field sampling regimes and robust preparation of samples from the forest ecosystem. The guide covers general aims of sampling, the description of major compartments of the forest ecosystem and outlines key factors to consider when planning sampling campaigns for radioecological field studies in forests. Recommended and known sampling methods for various sample types are also compiled and presented. The guide focuses on sampling practices that are applicable in various types of boreal forests, robust descriptions of sampling sites, and documentation of the origin and details of individual samples. The guide is intended for scientists, students, forestry experts and technicians who appreciate the need to use sound sampling procedures in forest radioecological projects. The guide will hopefully encourage readers to participate in field studies and sampling campaigns, using robust techniques, thereby fostering competence in sampling. (au)

  15. Sampling in forests for radionuclide analysis. General and practical guidance

    International Nuclear Information System (INIS)

    Aro, Lasse; Plamboeck, Agneta H.; Rantavaara, Aino; Vetikko, Virve; Straelberg, Elisabeth

    2009-01-01

    The NKS project FOREST was established to prepare a guide for sampling in forest ecosystems for radionuclide analysis. The aim of this guide is to improve the reliability of datasets generated in future studies by promoting the use of consistent, recommended practices, thorough documentation of field sampling regimes and robust preparation of samples from the forest ecosystem. The guide covers general aims of sampling, the description of major compartments of the forest ecosystem and outlines key factors to consider when planning sampling campaigns for radioecological field studies in forests. Recommended and known sampling methods for various sample types are also compiled and presented. The guide focuses on sampling practices that are applicable in various types of boreal forests, robust descriptions of sampling sites, and documentation of the origin and details of individual samples. The guide is intended for scientists, students, forestry experts and technicians who appreciate the need to use sound sampling procedures in forest radioecological projects. The guide will hopefully encourage readers to participate in field studies and sampling campaigns, using robust techniques, thereby fostering competence in sampling. (au)

  16. A support vector density-based importance sampling for reliability assessment

    International Nuclear Information System (INIS)

    Dai, Hongzhe; Zhang, Hao; Wang, Wei

    2012-01-01

    An importance sampling method based on the adaptive Markov chain simulation and support vector density estimation is developed in this paper for efficient structural reliability assessment. The methodology involves the generation of samples that can adaptively populate the important region by the adaptive Metropolis algorithm, and the construction of importance sampling density by support vector density. The use of the adaptive Metropolis algorithm may effectively improve the convergence and stability of the classical Markov chain simulation. The support vector density can approximate the sampling density with fewer samples in comparison to the conventional kernel density estimation. The proposed importance sampling method can effectively reduce the number of structural analysis required for achieving a given accuracy. Examples involving both numerical and practical structural problems are given to illustrate the application and efficiency of the proposed methodology.
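
    A minimal numerical sketch of the underlying idea follows. It is not the authors' method: the adaptive Metropolis and support-vector-density stages are replaced by a hand-placed Gaussian importance density, but the weighting and the estimator are the standard ones.

      import numpy as np
      from scipy import stats

      def is_failure_prob(g, d=2, shift=None, n=20000, seed=0):
          # Importance-sampling estimate of P(g(X) <= 0) for X ~ N(0, I).
          # q is a normal density recentred near the failure region; the
          # weight f/q corrects for sampling from q instead of f.
          rng = np.random.default_rng(seed)
          mu = np.zeros(d) if shift is None else np.asarray(shift, float)
          x = rng.normal(mu, 1.0, size=(n, d))
          w = (stats.multivariate_normal(np.zeros(d)).pdf(x)
               / stats.multivariate_normal(mu).pdf(x))
          return np.mean((g(x) <= 0) * w)

      # Example limit state g(x) = 3 - x1 - x2; exact P = Phi(-3/sqrt(2)) ~ 1.69e-2
      p = is_failure_prob(lambda x: 3 - x[:, 0] - x[:, 1], shift=[1.5, 1.5])

    Concentrating samples near the limit state is what lets such estimators reach a given accuracy with far fewer structural analyses than crude Monte Carlo.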

  17. Conformation Generation: The State of the Art.

    Science.gov (United States)

    Hawkins, Paul C D

    2017-08-28

    The generation of conformations for small molecules is a problem of continuing interest in cheminformatics and computational drug discovery. This review will present an overview of methods used to sample conformational space, focusing on those methods designed for organic molecules commonly of interest in drug discovery. Different approaches to both the sampling of conformational space and the scoring of conformational stability will be compared and contrasted, with an emphasis on those methods suitable for conformer sampling of large numbers of drug-like molecules. Particular attention will be devoted to the appropriate utilization of information from experimental solid-state structures in validating and evaluating the performance of these tools. The review will conclude with some areas worthy of further investigation.

  18. On-chip polarimetry for high-throughput screening of nanoliter and smaller sample volumes

    Science.gov (United States)

    Bachmann, Brian O. (Inventor); Bornhop, Darryl J. (Inventor); Dotson, Stephen (Inventor)

    2012-01-01

    A polarimetry technique for measuring optical activity that is particularly suited for high throughput screening employs a chip or substrate (22) having one or more microfluidic channels (26) formed therein. A polarized laser beam (14) is directed onto optically active samples that are disposed in the channels. The incident laser beam interacts with the optically active molecules in the sample, which slightly alter the polarization of the laser beam as it passes multiple times through the sample. Interference fringe patterns (28) are generated by the interaction of the laser beam with the sample and the channel walls. A photodetector (34) is positioned to receive the interference fringe patterns and generate an output signal that is input to a computer or other analyzer (38) for analyzing the signal and determining the rotation of plane polarized light by optically active material in the channel from polarization rotation calculations.

  19. Conscience of Japanese on nuclear power generation

    International Nuclear Information System (INIS)

    Hayashi, Chikio

    1995-01-01

    There have been many investigations of the general public's attitude toward nuclear power generation, but few have analyzed the content of those attitudes or examined what approach the generating side should take to win public understanding of nuclear power. This research was therefore begun to identify the underlying causes. Because the attitude toward nuclear power generation is related to attitudes toward the many things that surround it, in addition to the attitude toward nuclear power generation itself, the problem must be elucidated synthetically. A social survey was carried out among members of the public aged 18 to 79 living in the supply area of Kansai Electric Power Co., Inc. Data were obtained from respondents selected by probabilistic sampling: 1,000 in urban areas (76% response rate) and 440 in rural areas (77% response rate). The rationale behind the questionnaire design is presented, and the survey and the analysis of the obtained data were carried out. Findings on what people recall as dangerous, attitudes toward nuclear power generation, the structure of public consciousness of nuclear power generation and its significance, and a typological classification of people and its features are reported and discussed. (K.I.)

  20. Development of a magnetic measurement device for thin ribbon samples

    International Nuclear Information System (INIS)

    Sato, Yuta; Todaka, Takashi; Enokizono, Masato

    2008-01-01

    This paper presents a magnetic measurement device for thin ribbon samples, which are produced by a rapid-cooling technique. The device enables magnetic properties to be measured easily by simply inserting a ribbon sample into a sample holder. The sample holder was made of Bakelite so that samples of any width can be fixed in place. A long solenoid coil was used to generate a uniform magnetic field, and the sample holder was placed at the mid part of the solenoid. The magnetic field strength was measured using a shunt resistor, and the magnetic flux density and magnetization in the sample ribbons were evaluated using search coils. The accuracy of measurement was verified with an amorphous metal ribbon sample. Next, we measured the magnetic properties of several magnetic shape memory alloys of different compositions. The measured results are compared, and the effect of Sm content on the magnetic properties is clarified.

  1. Data quality objectives summary report in support of Hanford Generating Plant

    International Nuclear Information System (INIS)

    Miller, M.S.; Marske, S.G.

    1996-09-01

    The US Department of Energy, Richland Operations Office requested that the Environmental Restoration Contractor generate a sampling and analysis plan to assist in the assessment of decontamination and decommissioning (D&D) and remediation of the Hanford Generating Plant (HGP) and 11 associated Solid Waste Management Units. This report summarizes the results of the Data Quality Objectives planning process as applied to HGP. The characterization data will be used to better estimate the cost of D&D and remediation. The sampling and analysis design is presented, summarizing the number and location of samples, the analytes, and the analytical methods to be used. The purpose of the sampling is to determine the nature and depth of contamination. In locations where it was difficult to make assumptions about extent, the design allows for limited characterization of the extent of contamination.

  2. Generation, language, body mass index, and activity patterns in Hispanic children.

    Science.gov (United States)

    Taverno, Sharon E; Rollins, Brandi Y; Francis, Lori A

    2010-02-01

    The acculturation hypothesis proposes an overall disadvantage in health outcomes for Hispanic immigrants with more time spent living in the U.S., but little is known about how generational status and language may influence Hispanic children's relative weight and activity patterns. To investigate associations of generation and language with relative weight (BMI z-scores), physical activity, screen time, and participation in extracurricular activities (i.e., sports, clubs) in a U.S.-based, nationally representative sample of Hispanic children. Participants included 2012 Hispanic children aged 6-11 years from the cross-sectional 2003 National Survey of Children's Health. Children were grouped according to generational status (first, second, or third), and the primary language spoken in the home (English versus non-English). Primary analyses included adjusted logistic and multinomial logistic regression to examine the relationships among variables; all analyses were conducted between 2008 and 2009. Compared to third-generation, English speakers, first- and second-generation, non-English speakers were more than twice as likely to be obese. Moreover, first-generation, non-English speakers were half as likely to engage in regular physical activity and sports. Both first- and second-generation, non-English speakers were less likely to participate in clubs compared to second- and third-generation, English speakers. Overall, non-English-speaking groups reported less screen time compared to third-generation, English speakers. The hypothesis that Hispanics lose their health protection with more time spent in the U.S. was not supported in this sample of Hispanic children. Copyright 2010 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  3. White light photothermal lens spectrophotometer for the determination of absorption in scattering samples.

    Science.gov (United States)

    Marcano, Aristides; Alvarado, Salvador; Meng, Junwei; Caballero, Daniel; Moares, Ernesto Marín; Edziah, Raymond

    2014-01-01

    We developed a pump-probe photothermal lens spectrophotometer that uses a broadband arc-lamp and a set of interference filters to provide tunable, nearly monochromatic radiation between 370 and 730 nm as the pump light source. This light is focused onto an absorbing sample, generating a photothermal lens of millimeter dimensions. A highly collimated monochromatic probe light from a low-power He-Ne laser interrogates the generated lens, yielding a photothermal signal proportional to the absorption of light. We measure the absorption spectra of scattering dye solutions using the device. We show that the spectra are not affected by the presence of scattering, confirming that the method only measures the absorption of light that results in generation of heat. By comparing the photothermal spectra with the usual absorption spectra determined using commercial transmission spectrophotometers, we estimate the quantum yield of scattering of the sample. We discuss applications of the device for spectroscopic characterization of samples such as blood and gold nanoparticles that exhibit a complex behavior upon interaction with light.
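
    Schematically, a transmission spectrophotometer measures extinction (absorption plus scattering), while the photothermal signal tracks only the absorption that ends up as heat, so a scattering fraction can be estimated from the two spectra. The sketch below is illustrative only — it assumes a reference wavelength where scattering is negligible, which is not necessarily the authors' exact procedure.

      import numpy as np

      def scattering_fraction(a_ext, a_pt, ref_idx):
          # a_ext: extinction spectrum from a transmission spectrophotometer
          #        (absorption + scattering, per wavelength)
          # a_pt:  photothermal-lens spectrum, proportional to absorbed power
          #        converted to heat (arbitrary units)
          # Scale the photothermal spectrum to match extinction at a reference
          # wavelength assumed to be scattering-free, then read off the
          # fraction of extinction that produces no heat.
          a_ext, a_pt = np.asarray(a_ext, float), np.asarray(a_pt, float)
          a_pt = a_pt * (a_ext[ref_idx] / a_pt[ref_idx])
          return 1.0 - a_pt / a_ext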

  4. Stochastic Optimal Dispatch of Virtual Power Plant considering Correlation of Distributed Generations

    Directory of Open Access Journals (Sweden)

    Jie Yu

    2015-01-01

    Full Text Available A virtual power plant (VPP) is an aggregation of multiple distributed generations, energy storage, and controllable loads. Affected by natural conditions, the uncontrollable distributed generations within a VPP, such as wind and photovoltaic generation, are highly random and correlated. Considering this randomness and its correlation, this paper constructs a chance-constrained stochastic optimal dispatch model of the VPP that incorporates stochastic variables and their correlation. The probability distributions of the individual wind and photovoltaic generations are described by empirical distribution functions, and their joint probability density model is established with a Frank-copula function. Sample average approximation (SAA) is then applied to convert the chance-constrained stochastic optimization model into a deterministic optimization model. Simulation cases are calculated in AIMMS. Simulation results of the proposed model are compared with those of a deterministic optimization model without stochastic variables and of a stochastic optimization model that considers the stochastic variables but not their correlation. Furthermore, this paper analyzes how the SAA sampling frequency and the confidence level influence the results of the stochastic optimization. The numerical example results show the effectiveness of the stochastic optimal dispatch of the VPP considering the randomness and correlations of distributed generations.
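
    For concreteness, here is a minimal sketch of the scenario-generation step described above: sample correlated uniforms from a Frank copula by conditional inversion, then push them through empirical quantile functions of historical wind and PV output. The stand-in histories and the θ value are placeholders, not data from the paper.

      import numpy as np

      def frank_pairs(theta, n, seed=0):
          # Conditional-inversion sampling of the Frank copula: draw u ~ U(0,1),
          # then invert the conditional CDF C(v|u) at an independent w ~ U(0,1).
          rng = np.random.default_rng(seed)
          u, w = rng.uniform(size=(2, n))
          v = -np.log1p(w * np.expm1(-theta)
                        / (np.exp(-theta * u) - w * np.expm1(-theta * u))) / theta
          return u, v

      # Map the correlated uniforms through the empirical inverse CDFs of
      # historical output (stand-in histories below) to get power scenarios.
      hist_wind = np.random.default_rng(1).weibull(2.0, 1000)
      hist_pv = np.random.default_rng(2).beta(2.0, 5.0, 1000)
      u, v = frank_pairs(theta=5.0, n=500)
      wind, pv = np.quantile(hist_wind, u), np.quantile(hist_pv, v)

    Each (wind, pv) pair is one correlated scenario; SAA then replaces the chance constraint with its empirical counterpart over these scenarios.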

  5. Sampling and characterization of radioactive liquid wastes

    International Nuclear Information System (INIS)

    Zepeda R, C.; Monroy G, F.; Reyes A, T.; Lizcano, D.; Cruz C, A. C.

    2017-09-01

    To define the management of radioactive liquid wastes stored in 200 L drums, their isotopic and physicochemical characterization is essential. Adequate sampling, that is, representative and homogeneous sampling, is fundamental for obtaining reliable analytical results; therefore, in this work we propose the use of a sampling mechanism that collects homogeneous aliquots safely while minimizing the generation of secondary waste. With this mechanism, 56 drums of radioactive liquid wastes were sampled and characterized by gamma spectrometry and liquid scintillation, and the following physicochemical properties were determined: pH, conductivity, viscosity, density and chemical composition by gas chromatography. 67.86% of the radioactive liquid wastes contain H-3 and, of these, 47.36% can be released unconditionally, since they present activities lower than 100 Bq/g. 94% of the wastes are acidic and 48% have viscosities <50 mPa·s. (Author)

  6. Genotyping and annotation of Affymetrix SNP arrays

    DEFF Research Database (Denmark)

    Lamy, Philippe; Andersen, Claus Lindbjerg; Wikman, Friedrik

    2006-01-01

    it is indicated that our method is likely to be correct in the majority of these cases. In addition, we demonstrate that our method produces more SNPs that are in concordance with Hardy-Weinberg equilibrium than the method by Affymetrix. Finally, we have validated our method on HapMap data and shown

  7. Genetic landscape of the people of India: a canvas for disease gene ...

    Indian Academy of Sciences (India)

    2008-04-09

    Apr 9, 2008 ... (>10 million individuals) and 23 isolated populations, representing a large fraction of the people of India. We observe ... populations not only overlap with the diversity of HapMap populations, but also contain population groups that are genetically ... China has resulted in a rich tapestry of socio-cultural, lin-.

  8. Biocompatibility evaluations and biomedical sensing applications of nitric oxide-releasing/generating polymeric materials

    Science.gov (United States)

    Wu, Yiduo

    Nitric oxide (NO) is a potent signaling molecule secreted by healthy vascular endothelial cells (EC) that is capable of inhibiting the activation and adhesion of platelets, preventing inflammation and inducing vasodilation. Polymeric materials that mimic the EC through the continuous release or generation of NO are expected to exhibit enhanced biocompatibility in vivo. In this dissertation research, the biocompatibility of novel NO-releasing/generating materials has been evaluated via both in vitro and in vivo studies. A new in vitro platelet adhesion assay has been designed to quantify platelet adhesion on NO-releasing/generating polymer surfaces via their innate lactate dehydrogenase (LDH) content. Using this assay, it was discovered that continuous NO fluxes of up to 7.05 × 10^-10 mol cm^-2 min^-1 emitted from the polymer surfaces could reduce platelet adhesion by almost 80%. Such an in vitro biocompatibility assay can be employed as a preliminary screening method in the development of new NO-releasing/generating materials. In addition, the first in vivo biocompatibility evaluation of NO-generating polymers was conducted in a porcine artery model for intravascular oxygen sensing catheters. The Cu(I)-catalyzed decomposition of endogenous S-nitrosothiols (RSNOs) generated NO in situ at the polymer/blood interface and offered enhanced biocompatibility to the NO-generating catheters along with more accurate analytical results for intra-arterial measurements of PO2 levels. NO-generating polymers can also be utilized to fabricate electrochemical RSNO sensors based on the amperometric detection of NO generated by the reaction of RSNOs with immobilized catalysts. Unlike conventional methodologies employed to measure labile RSNO, the advantage of the RSNO sensor method is that measurement in whole blood samples is possible and this minimizes sample processing artifacts in RSNO measurements. An electrochemical RSNO sensor with organoselenium crosslinked polyethylenimine (RSe

  9. Droplet Size-Aware and Error-Correcting Sample Preparation Using Micro-Electrode-Dot-Array Digital Microfluidic Biochips.

    Science.gov (United States)

    Li, Zipeng; Lai, Kelvin Yi-Tse; Chakrabarty, Krishnendu; Ho, Tsung-Yi; Lee, Chen-Yi

    2017-12-01

    Sample preparation in digital microfluidics refers to the generation of droplets with target concentrations for on-chip biochemical applications. In recent years, digital microfluidic biochips (DMFBs) have been adopted as a platform for sample preparation. However, there remain two major problems associated with sample preparation on a conventional DMFB. First, only a (1:1) mixing/splitting model can be used, leading to an increase in the number of fluidic operations required for sample preparation. Second, only a limited number of sensors can be integrated on a conventional DMFB; as a result, the latency for error detection during sample preparation is significant. To overcome these drawbacks, we adopt a next generation DMFB platform, referred to as micro-electrode-dot-array (MEDA), for sample preparation. We propose the first sample-preparation method that exploits the MEDA-specific advantages of fine-grained control of droplet sizes and real-time droplet sensing. Experimental demonstration using a fabricated MEDA biochip and simulation results highlight the effectiveness of the proposed sample-preparation method.
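
    To see why the (1:1) restriction inflates operation counts, the sketch below implements the textbook bisection-style dilution under that model (a generic illustration, not the paper's MEDA-specific algorithm): reaching a target concentration to k-bit precision costs a chain of k sequential (1:1) mixes, whereas MEDA's fine-grained droplet sizing can mix unequal volumes and shorten the chain.

      def one_to_one_dilution(target, bits=8):
          # Approximate target in [0, 1] as k / 2**bits; each binary digit
          # (LSB first) costs one (1:1) mix of the current droplet with
          # pure sample (digit 1) or pure buffer (digit 0).
          k = round(target * 2 ** bits)
          conc = 0.0
          for i in range(bits):
              conc = (conc + ((k >> i) & 1)) / 2.0
          return conc  # == k / 2**bits after `bits` sequential mixes

      print(one_to_one_dilution(0.3))  # 0.30078125, i.e. 77/256 after 8 mixes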

  10. BioSAXS Sample Changer: a robotic sample changer for rapid and reliable high-throughput X-ray solution scattering experiments.

    Science.gov (United States)

    Round, Adam; Felisaz, Franck; Fodinger, Lukas; Gobbo, Alexandre; Huet, Julien; Villard, Cyril; Blanchet, Clement E; Pernot, Petra; McSweeney, Sean; Roessle, Manfred; Svergun, Dmitri I; Cipriani, Florent

    2015-01-01

    Small-angle X-ray scattering (SAXS) of macromolecules in solution is in increasing demand by an ever more diverse research community, both academic and industrial. To better serve user needs, and to allow automated and high-throughput operation, a sample changer (BioSAXS Sample Changer) that is able to perform unattended measurements of up to several hundred samples per day has been developed. The Sample Changer is able to handle and expose sample volumes of down to 5 µl with a measurement/cleaning cycle of under 1 min. The samples are stored in standard 96-well plates and the data are collected in a vacuum-mounted capillary with automated positioning of the solution in the X-ray beam. Fast and efficient capillary cleaning avoids cross-contamination and ensures reproducibility of the measurements. Independent temperature control for the well storage and for the measurement capillary allows the samples to be kept cool while still collecting data at physiological temperatures. The Sample Changer has been installed at three major third-generation synchrotrons: on the BM29 beamline at the European Synchrotron Radiation Facility (ESRF), the P12 beamline at the PETRA-III synchrotron (EMBL@PETRA-III) and the I22/B21 beamlines at Diamond Light Source, with the latter being the first commercial unit supplied by Bruker ASC.

  11. LDSplitDB: a database for studies of meiotic recombination hotspots in MHC using human genomic data.

    Science.gov (United States)

    Guo, Jing; Chen, Hao; Yang, Peng; Lee, Yew Ti; Wu, Min; Przytycka, Teresa M; Kwoh, Chee Keong; Zheng, Jie

    2018-04-20

    Meiotic recombination happens during the process of meiosis when chromosomes inherited from two parents exchange genetic materials to generate chromosomes in the gamete cells. The recombination events tend to occur in narrow genomic regions called recombination hotspots. Its dysregulation could lead to serious human diseases such as birth defects. Although the regulatory mechanism of recombination events is still unclear, DNA sequence polymorphisms have been found to play crucial roles in the regulation of recombination hotspots. To facilitate the studies of the underlying mechanism, we developed a database named LDSplitDB which provides an integrative and interactive data mining and visualization platform for the genome-wide association studies of recombination hotspots. It contains the pre-computed association maps of the major histocompatibility complex (MHC) region in the 1000 Genomes Project and the HapMap Phase III datasets, and a genome-scale study of the European population from the HapMap Phase II dataset. Besides the recombination profiles, related data of genes, SNPs and different types of epigenetic modifications, which could be associated with meiotic recombination, are provided for comprehensive analysis. To meet the computational requirement of the rapidly increasing population genomics data, we prepared a lookup table of 400 haplotypes for recombination rate estimation using the well-known LDhat algorithm which includes all possible two-locus haplotype configurations. To the best of our knowledge, LDSplitDB is the first large-scale database for the association analysis of human recombination hotspots with DNA sequence polymorphisms. It provides valuable resources for the discovery of the mechanism of meiotic recombination hotspots. The information about MHC in this database could help understand the roles of recombination in human immune system. DATABASE URL: http://histone.scse.ntu.edu.sg/LDSplitDB.

  12. RFMix: A Discriminative Modeling Approach for Rapid and Robust Local-Ancestry Inference

    Science.gov (United States)

    Maples, Brian K.; Gravel, Simon; Kenny, Eimear E.; Bustamante, Carlos D.

    2013-01-01

    Local-ancestry inference is an important step in the genetic analysis of fully sequenced human genomes. Current methods can only detect continental-level ancestry (i.e., European versus African versus Asian) accurately even when using millions of markers. Here, we present RFMix, a powerful discriminative modeling approach that is faster (∼30×) and more accurate than existing methods. We accomplish this by using a conditional random field parameterized by random forests trained on reference panels. RFMix is capable of learning from the admixed samples themselves to boost performance and autocorrect phasing errors. RFMix shows high sensitivity and specificity in simulated Hispanics/Latinos and African Americans and admixed Europeans, Africans, and Asians. Finally, we demonstrate that African Americans in HapMap contain modest (but nonzero) levels of Native American ancestry (∼0.4%). PMID:23910464
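
    A toy stand-in for the core idea (Python with scikit-learn; RFMix's conditional random field, phasing autocorrection, and EM over the admixed samples are all omitted): train one random forest per genomic window on labelled reference haplotypes, then classify each window of a query haplotype.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def window_ancestry(ref_haps, ref_pop, query_haps, win=200):
          # ref_haps:   (n_ref, n_snps) 0/1 haplotype matrix of the references
          # ref_pop:    (n_ref,) ancestry labels for the reference haplotypes
          # query_haps: (n_query, n_snps) haplotypes of admixed individuals
          n_snps = ref_haps.shape[1]
          calls = []
          for s in range(0, n_snps - win + 1, win):
              rf = RandomForestClassifier(n_estimators=100, random_state=0)
              rf.fit(ref_haps[:, s:s + win], ref_pop)
              calls.append(rf.predict(query_haps[:, s:s + win]))
          return np.column_stack(calls)  # (n_query, n_windows) ancestry calls

    RFMix's gain over this naive version comes from smoothing the per-window calls with a CRF and from re-training on confidently called admixed segments.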

  13. Reconstruction of three-dimensional porous media using generative adversarial neural networks

    Science.gov (United States)

    Mosser, Lukas; Dubrule, Olivier; Blunt, Martin J.

    2017-10-01

    To evaluate the variability of multiphase flow properties of porous media at the pore scale, it is necessary to acquire a number of representative samples of the void-solid structure. While modern X-ray computed tomography has made it possible to extract three-dimensional images of the pore space, assessment of the variability in the inherent material properties is often not experimentally feasible. We present a method to reconstruct the solid-void structure of porous media by applying a generative neural network that allows an implicit description of the probability distribution represented by three-dimensional image data sets. We show, by using an adversarial learning approach for neural networks, that this method of unsupervised learning is able to generate representative samples of porous media that honor their statistics. We successfully compare measures of pore morphology, such as the Euler characteristic, two-point statistics, and directional single-phase permeability of synthetic realizations with the calculated properties of a bead pack, Berea sandstone, and Ketton limestone. Results show that generative adversarial networks can be used to reconstruct high-resolution three-dimensional images of porous media at different scales that are representative of the morphology of the images used to train the neural network. The fully convolutional nature of the trained neural network allows the generation of large samples while maintaining computational efficiency. Compared to classical stochastic methods of image reconstruction, the implicit representation of the learned data distribution can be stored and reused to generate multiple realizations of the pore structure very rapidly.
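
    A minimal 3-D DCGAN-style pair in PyTorch conveys the flavour of the approach (an illustrative sketch; the authors' architecture, loss, and training schedule are not reproduced here):

      import torch
      import torch.nn as nn

      # Generator: latent vector -> 32^3 volume; strided 3-D (de)convolutions
      # replace pooling, keeping both networks fully convolutional.
      G = nn.Sequential(
          nn.ConvTranspose3d(100, 256, 4, 1, 0), nn.BatchNorm3d(256), nn.ReLU(),
          nn.ConvTranspose3d(256, 128, 4, 2, 1), nn.BatchNorm3d(128), nn.ReLU(),
          nn.ConvTranspose3d(128, 64, 4, 2, 1), nn.BatchNorm3d(64), nn.ReLU(),
          nn.ConvTranspose3d(64, 1, 4, 2, 1), nn.Tanh(),
      )
      # Discriminator: volume -> single real/fake score.
      D = nn.Sequential(
          nn.Conv3d(1, 64, 4, 2, 1), nn.LeakyReLU(0.2),
          nn.Conv3d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2),
          nn.Conv3d(128, 256, 4, 2, 1), nn.LeakyReLU(0.2),
          nn.Conv3d(256, 1, 4, 1, 0),
      )
      z = torch.randn(2, 100, 1, 1, 1)
      fake = G(z)  # (2, 1, 32, 32, 32); threshold to obtain a binary pore image

    Because both networks are fully convolutional, feeding the trained generator a latent tensor with a larger spatial extent yields correspondingly larger volumes, which is the property the abstract highlights for generating large samples cheaply.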

  14. Component-Based Cartoon Face Generation

    Directory of Open Access Journals (Sweden)

    Saman Sepehri Nejad

    2016-11-01

    Full Text Available In this paper, we present a cartoon face generation method that builds on a component-based facial feature extraction approach. Given a frontal face image as an input, our proposed system has the following stages. First, face features are extracted using an extended Active Shape Model. Outlines of the components are locally modified using edge detection, template matching and Hermite interpolation. This modification enhances the diversity of output and the accuracy of the component matching required for cartoon generation. Second, to bring in cartoon-specific features such as shadows, highlights and, especially, stylish drawing, an array of various face photographs and corresponding hand-drawn cartoon faces is collected. These cartoon templates are automatically decomposed into cartoon components using our proposed method for parameterizing cartoon samples, which is fast and simple. Then, using shape matching methods, the appropriate cartoon component is selected and deformed to fit the input face. Finally, a cartoon face is rendered in a vector format using the rendering rules of the selected template. Experimental results demonstrate the effectiveness of our approach in generating life-like cartoon faces.

  15. Speciation without chromatography using selective hydride generation: Inorganic arsenic in rice and samples of marine origin

    Czech Academy of Sciences Publication Activity Database

    Musil, Stanislav; Pétursdóttir, A. H.; Raab, A.; Gunnlaugsdóttir, H.; Krupp, E.; Feldmann, J.

    2014-01-01

    Roč. 86, č. 2 (2014), s. 993-999 ISSN 0003-2700 Grant - others:GA AV ČR(CZ) M200311271 Institutional support: RVO:68081715 Keywords : inorganic arsenic * hydride generation * inductively coupled plasma mass spectrometry Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 5.636, year: 2014

  16. Comparison of pure and 'Latinized' centroidal Voronoi tessellation against various other statistical sampling methods

    International Nuclear Information System (INIS)

    Romero, Vicente J.; Burkardt, John V.; Gunzburger, Max D.; Peterson, Janet S.

    2006-01-01

    A recently developed centroidal Voronoi tessellation (CVT) sampling method is investigated here to assess its suitability for use in statistical sampling applications. CVT efficiently generates a highly uniform distribution of sample points over arbitrarily shaped M-dimensional parameter spaces. On several 2-D test problems CVT has recently been found to provide exceedingly effective and efficient point distributions for response surface generation. Additionally, for statistical function integration and estimation of response statistics associated with uniformly distributed random-variable inputs (uncorrelated), CVT has been found in initial investigations to provide superior point sets when compared against latin-hypercube and simple-random Monte Carlo methods and Halton and Hammersley quasi-random sequence methods. In this paper, the performance of all these sampling methods and a new variant ('Latinized' CVT) are further compared for non-uniform input distributions. Specifically, given uncorrelated normal inputs in a 2-D test problem, statistical sampling efficiencies are compared for resolving various statistics of response: mean, variance, and exceedance probabilities
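
    For reference, CVT sample sets of the kind compared here can be produced with a probabilistic Lloyd iteration (a common construction; the batch size and iteration count below are arbitrary choices):

      import numpy as np

      def cvt_sample(n_points, dim=2, iters=50, batch=20000, seed=0):
          # Probabilistic Lloyd iteration on the unit hypercube: assign a large
          # batch of random points to their nearest generator, then move each
          # generator to the centroid of its region.  At the fixed point the
          # generators form a centroidal Voronoi tessellation, i.e. a highly
          # uniform sample set.
          rng = np.random.default_rng(seed)
          gen = rng.uniform(size=(n_points, dim))
          for _ in range(iters):
              pts = rng.uniform(size=(batch, dim))
              d2 = ((pts[:, None, :] - gen[None, :, :]) ** 2).sum(-1)
              owner = d2.argmin(axis=1)
              for k in range(n_points):
                  mine = pts[owner == k]
                  if len(mine):
                      gen[k] = mine.mean(axis=0)
          return gen

    For non-uniform inputs, as studied in this paper, the batch points would be drawn from the input distribution instead of the uniform one.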

  17. SESAR: Addressing the need for unique sample identification in the Solid Earth Sciences

    Science.gov (United States)

    Lehnert, K. A.; Goldstein, S. L.; Lenhardt, C.; Vinayagamoorthy, S.

    2004-12-01

    The study of solid earth samples is key to our knowledge of Earth's dynamical systems and evolution. The data generated provide the basis for models and hypotheses in all disciplines of the Geosciences from tectonics to magmatic processes to mantle dynamics to paleoclimate research. Sample-based data are diverse ranging from major and trace element abundances, radiogenic and stable isotope ratios of rocks, minerals, fluid or melt inclusions, to age determinations and descriptions of lithology, texture, mineral or fossil content, stratigraphic context, physical properties. The usefulness of these data is critically dependent on their integration as a coherent data set for each sample. If different data sets for the same sample cannot be combined because the sample cannot be unambiguously recognized, valuable information is lost. The ambiguous naming of samples has been a major problem in the geosciences. Different samples are often given identical names, and there is a tendency for different people analyzing the same sample to rename it in their publications according to local conventions. This situation has generated significant confusion, with samples often losing their "history", making it difficult or impossible to link available data. This has become most evident through the compilation of geochemical data in relational databases such as PetDB, NAVDAT, and GEOROC. While the relational data structure allows linking of disparate data for samples published in different references, linkages cannot be established due to ambiguous sample names. SESAR is a response to this problem of ambiguous naming of samples. SESAR will create a common clearinghouse that provides a centralized registry of sample identifiers, to avoid ambiguity, to systematize sample designation, and ensure that all information associated with a sample would in fact be unique. The project will build a web-based digital registry for solid earth samples that will provide for the first time a way to

  18. WRAP Module 1 sampling strategy and waste characterization alternatives study

    Energy Technology Data Exchange (ETDEWEB)

    Bergeson, C.L.

    1994-09-30

    The Waste Receiving and Processing Module 1 Facility is designed to examine, process, certify, and ship drums and boxes of solid wastes that have a surface dose equivalent of less than 200 mrem/h. These wastes will include low-level and transuranic wastes that are retrievably stored in the 200 Area burial grounds and facilities in addition to newly generated wastes. Certification of retrievably stored wastes processed in WRAP 1 is required to meet the waste acceptance criteria for onsite treatment and disposal of low-level waste and mixed low-level waste and the Waste Isolation Pilot Plant Waste Acceptance Criteria for the disposal of TRU waste. In addition, these wastes will need to be certified for packaging in TRUPACT-II shipping containers. Characterization of the retrievably stored waste is needed to support the certification process. Characterization data will be obtained from historical records, process knowledge, nondestructive examination, nondestructive assay, visual inspection of the waste, head-gas sampling, and analysis of samples taken from the waste containers. Sample characterization refers to the method or methods that are used to test waste samples for specific analytes. The focus of this study is the sample characterization needed to accurately identify the hazardous and radioactive constituents present in the retrieved wastes that will be processed in WRAP 1. In addition, some sampling and characterization will be required to support NDA calculations and to provide an over-check for the characterization of newly generated wastes. This study results in the baseline definition of WRAP 1 sampling and analysis requirements and identifies alternative methods to meet these requirements in an efficient and economical manner.

  19. WRAP Module 1 sampling strategy and waste characterization alternatives study

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    The Waste Receiving and Processing Module 1 Facility is designed to examine, process, certify, and ship drums and boxes of solid wastes that have a surface dose equivalent of less than 200 mrem/h. These wastes will include low-level and transuranic wastes that are retrievably stored in the 200 Area burial grounds and facilities in addition to newly generated wastes. Certification of retrievably stored wastes processed in WRAP 1 is required to meet the waste acceptance criteria for onsite treatment and disposal of low-level waste and mixed low-level waste and the Waste Isolation Pilot Plant Waste Acceptance Criteria for the disposal of TRU waste. In addition, these wastes will need to be certified for packaging in TRUPACT-II shipping containers. Characterization of the retrievably stored waste is needed to support the certification process. Characterization data will be obtained from historical records, process knowledge, nondestructive examination, nondestructive assay, visual inspection of the waste, head-gas sampling, and analysis of samples taken from the waste containers. Sample characterization refers to the method or methods that are used to test waste samples for specific analytes. The focus of this study is the sample characterization needed to accurately identify the hazardous and radioactive constituents present in the retrieved wastes that will be processed in WRAP 1. In addition, some sampling and characterization will be required to support NDA calculations and to provide an over-check for the characterization of newly generated wastes. This study results in the baseline definition of WRAP 1 sampling and analysis requirements and identifies alternative methods to meet these requirements in an efficient and economical manner

  20. Recent advances in applications of nanomaterials for sample preparation.

    Science.gov (United States)

    Xu, Linnan; Qi, Xiaoyue; Li, Xianjiang; Bai, Yu; Liu, Huwei

    2016-01-01

    Sample preparation is a key step for qualitative and quantitative analysis of trace analytes in complicated matrices. Along with the rapid development of nanotechnology in material science, numerous nanomaterials have been developed with particularly useful applications in analytical chemistry. Benefiting from their large specific surface areas, increased surface activities, and unprecedented physical/chemical properties, the potential of nanomaterials for rapid and efficient sample preparation has been exploited extensively. In this review, recent progress of novel nanomaterials applied in sample preparation has been summarized and discussed. Both nanoparticles and nanoporous materials are evaluated for their unusual performance in sample preparation. Various compositions and functionalizations have extended the applications of nanomaterials in sample preparation, and distinct size and shape selectivity is generated from the diversified pore structures of nanoporous materials. Such great variety makes nanomaterials versatile tools in sample preparation for almost all categories of analytes. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Estimation of population mean under systematic sampling

    Science.gov (United States)

    Noor-ul-amin, Muhammad; Javaid, Amjad

    2017-11-01

    In this study we propose a generalized ratio estimator under non-response for systematic random sampling. We also generate a class of estimators through special cases of the generalized estimator using different combinations of the coefficients of correlation, kurtosis and variation. The mean square errors and mathematical conditions are also derived to prove the efficiency of the proposed estimators. A numerical illustration using three populations is included to support the results.
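
    The abstract does not reproduce the estimator itself; for orientation, generalized ratio-type families in this literature typically take the schematic form below (illustrative only, not the authors' exact estimator), where different constant choices generate the members of the class:

      \hat{\bar{Y}}_{g} \;=\; \bar{y}^{*}\,\frac{a\,\bar{X} + b}{a\,\bar{x}^{*} + b},
      \qquad
      (a, b) \in \bigl\{(1, C_x),\; (1, \beta_2(x)),\; (\beta_2(x), C_x),\; \dots\bigr\},

    where \bar{y}^{*} and \bar{x}^{*} are systematic-sample means adjusted for non-response (e.g., by Hansen-Hurwitz subsampling), \bar{X} is the known population mean of the auxiliary variable, C_x its coefficient of variation, \beta_2(x) its kurtosis, and the correlation coefficient \rho enters analogous members of the class.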

  2. Electricity Self-Generation Costs for Industrial Companies in Cameroon

    Directory of Open Access Journals (Sweden)

    Diboma Benjamin Salomon

    2010-07-01

    Full Text Available Industrial production in developing countries (DC) is frequently disrupted by difficulties in the electricity supply. To overcome this problem, generators are used for self-generation of electricity, but this increases electricity-related expenses. This article assesses the impact of electricity self-generation on Cameroonian industrial companies. The model described in this article is based on data collected through a survey of a representative sample of industrial companies and on numerous previous thematic and statistical studies. The results of our analyses show that the electricity-related expenses of industrial companies in Cameroon have increased fivefold due to electricity rationing and untimely power cuts. The article also suggests some solutions for improving the electricity self-generation capacity of industrial companies.

  3. Operational air sampling report, January--June 30, 1992

    International Nuclear Information System (INIS)

    Lyons, C.L.

    1992-09-01

    Nevada Test Site postshot and tunnel events generate beta/gamma fission products. The Reynolds Electrical & Engineering Co., Inc. air sampling program is designed for measurement of these radionuclides at various facilities supporting these events. Monthly radon sampling is done to document working levels in the tunnel complexes, which would be expected to have the highest radon levels among on-site facilities. Out of a total of 281 air samples taken in the tunnel complexes, 25 showed airborne fission products, with concentrations well below their respective Derived Air Concentrations (DAC). All of these were related to event reentry or mineback operations. Tritiated water vapor and radon levels were very similar to previously reported levels. The 975 air samples taken at the Area-6 decontamination bays and laundry were again well below any DAC calculation standard, and laboratory analyses showed no airborne fission products

  4. Efficient Unbiased Rendering using Enlightened Local Path Sampling

    DEFF Research Database (Denmark)

    Kristensen, Anders Wang

    measurements, which are the solution to the adjoint light transport problem. The second is a representation of the distribution of radiance and importance in the scene. We also derive a new method of particle sampling, which is advantageous compared to existing methods. Together we call the resulting algorithm ... The downside to using these algorithms is that they can be slow to converge. Due to the nature of Monte Carlo methods, the results are random variables subject to variance. This manifests itself as noise in the images, which can only be reduced by generating more samples. The reason these methods are slow is a lack of effective importance sampling methods. Most global illumination algorithms are based on local path sampling, which is essentially a recipe for constructing random walks. Using this procedure, paths are built based on information given explicitly as part of the scene description...

  5. Gender roles in social network sites from generation Y

    Directory of Open Access Journals (Sweden)

    F. Javier Rondan-Cataluña

    2017-12-01

    Full Text Available Online social networks are among the most fundamental and commonly used communication tools of Generation Y, the Millennials. The first objective of this study is to model the effects of social participation, community integration and trust on community satisfaction, as an antecedent of routinization. The second objective is to test whether gender roles underlie the different behaviors that social network users develop. An empirical study was carried out on a sample of 1,448 undergraduate students from Generation Y who are SNS users. First, we applied a structural equation modeling approach to test the proposed model. Second, we used a masculinity-femininity scale to categorize the sample into three groups: feminine, masculine, and androgynous.

  6. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2013-05-04

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation. However, often in applications the overall flow in the low-resolution simulation that an animator observes and intends to preserve is composed of even lower frequencies than the low resolution itself. In such cases, attempting to match the low-resolution simulation precisely is unnecessarily restrictive. We propose a new sampling technique to efficiently capture the overall flow of a fluid simulation, at the scale of the user's choice, in such a way that the sampled information is sufficient to represent what is virtually perceived and no more. Thus, by applying control based on the sampled data, we ensure that in the resulting high-resolution simulation, the overall flow is matched to the low-resolution simulation and the fine details on the high resolution are preserved. The samples we obtain have both spatial and temporal continuity that allows smooth keyframe matching and direct manipulation of visible elements such as smoke density through temporal blending of samples. We demonstrate that a user can easily configure a simulation with our system to achieve desired results. © 2013 Springer-Verlag Berlin Heidelberg.

  7. Commanding Generation Y: How Generation X Military Leaders Can Better Utilize Generational Tendencies

    Science.gov (United States)

    2013-03-21

    generation (Baby Boomers). Although the profession of arms is a time-honored tradition steeped in discipline... senior leadership generational tendencies. Subject terms: Command; Leadership; Generation; Baby Boomer; Generation X; Generation Y. ...enable commanders to better lead Generation Y within the U.S. military. Discussion: Baby Boomers, Generation X, and Generation Y are

  8. Computer simulation of RBS spectra from samples with surface roughness

    Energy Technology Data Exchange (ETDEWEB)

    Malinský, P., E-mail: malinsky@ujf.cas.cz [Nuclear Physics Institute of the Academy of Sciences of the Czech Republic, v. v. i., 250 68 Rez (Czech Republic); Department of Physics, Faculty of Science, J. E. Purkinje University, Ceske mladeze 8, 400 96 Usti nad Labem (Czech Republic); Hnatowicz, V., E-mail: hnatowicz@ujf.cas.cz [Nuclear Physics Institute of the Academy of Sciences of the Czech Republic, v. v. i., 250 68 Rez (Czech Republic); Macková, A., E-mail: mackova@ujf.cas.cz [Nuclear Physics Institute of the Academy of Sciences of the Czech Republic, v. v. i., 250 68 Rez (Czech Republic); Department of Physics, Faculty of Science, J. E. Purkinje University, Ceske mladeze 8, 400 96 Usti nad Labem (Czech Republic)

    2016-03-15

    A fast code for the simulation of common RBS spectra including surface roughness effects has been written and tested on virtual samples comprising either a rough layer deposited on a smooth substrate or a smooth layer deposited on a rough substrate, simulated at different geometries. The sample surface or interface relief is described by a polyline, and the simulated RBS spectrum is obtained as the sum of many particular spectra from randomly chosen particle trajectories. The code includes several procedures for generating virtual samples with random and regular (periodic) roughness. The shape of the RBS spectra was found to change strongly with increasing sample roughness and increasing angle of the incoming ion beam.
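
    The Monte Carlo summation described here can be caricatured in a few lines. The sketch below is illustrative only: the kinematic factor, stopping power, and normal height distribution are placeholders, and the real code samples trajectories over a polyline relief rather than a simple random thickness.

      import numpy as np

      def rough_layer_spectrum(n_traj=100_000, mean_t=200.0, sigma=40.0,
                               k=0.57, s=0.3, e0=2000.0, seed=0):
          # Each trajectory hits the rough layer at a random lateral position
          # with local thickness t; scattering happens at a random depth, and
          # the detected energy is E = k*(E0 - S*d_in) - S*d_out (normal
          # incidence and exit assumed).  The spectrum is the histogram of
          # detected energies over all trajectories.
          rng = np.random.default_rng(seed)
          t = np.clip(rng.normal(mean_t, sigma, n_traj), 0, None)
          depth = rng.uniform(0, 1, n_traj) * t
          e_out = k * (e0 - s * depth) - s * depth
          spec, edges = np.histogram(e_out, bins=256, range=(0, e0))
          return spec, edges

    Increasing sigma broadens the low-energy edge of the layer signal, which is the roughness-induced spectrum change the abstract reports.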

  9. Fiber laser-microscope system for femtosecond photodisruption of biological samples.

    Science.gov (United States)

    Yavaş, Seydi; Erdogan, Mutlu; Gürel, Kutan; Ilday, F Ömer; Eldeniz, Y Burak; Tazebay, Uygar H

    2012-03-01

    We report on the development of an ultrafast fiber laser-microscope system for femtosecond photodisruption of biological targets. A mode-locked Yb-fiber laser oscillator generates few-nJ pulses at a 32.7 MHz repetition rate, amplified up to ∼125 nJ at 1030 nm. Following dechirping in a grating compressor, ∼240 fs-long pulses are delivered to the sample through a diffraction-limited microscope, which allows real-time imaging and control. The laser can generate arbitrary pulse patterns, formed by two acousto-optic modulators (AOM) controlled by a custom-developed field-programmable gate array (FPGA) controller. This capability opens the route to fine optimization of the ablation processes and management of thermal effects. Sample position, exposure time and imaging are all computerized. The capability of the system to perform femtosecond photodisruption is demonstrated through experiments on tissue and individual cells.

  10. Gas Generation from K East Basin Sludges - Series I Testing

    International Nuclear Information System (INIS)

    Delegard, Calvin H.; Bryan, Samuel A.; Schmidt, Andrew J.; Bredt, Paul R.; King, Christopher M.; Sell, Rachel L.; Burger, Leland L.; Silvers, Kurt L.

    2000-01-01

    This report describes work to examine the gas generation behavior of actual K East (KE) Basin floor and canister sludge. The path forward for management of the K Basin sludge is to retrieve, ship, and store the sludge at T Plant until final processing at some future date. Gas generation will impact the designs and costs of systems associated with retrieval, transportation and storage of sludge. The overall goals for this testing were to collect detailed gas generation rate and composition data to ascertain the quantity and reactivity of the metallic uranium (and other reactive species) present in the K Basin sludge. The gas generation evaluation included four large-scale vessels (850 ml) and eight small-scale vessels (30 ml) in an all-metal, leak-tight system. The tests were conducted for several thousand hours at ambient and elevated temperatures (32 C, 40 C, 60 C, 80 C, and 95 C) to accelerate the reactions and provide conclusive gas generation data within a reasonable testing period. The sludge used for these tests was collected from the KE Basin floor and canister barrels (containing damaged spent fuel elements) using a consolidated sampling technique (i.e., material from several locations was combined to form "consolidated samples"). Portions of these samples were sieved to separate particles greater than 250 µm (P250) from particles less than 250 µm (M250). This separation was performed to mimic the separation operations that are planned during the retrieval of certain K Basin sludge types and to gain a better understanding of how uranium metal is distributed in the sludge. The corrosion rate of the uranium metal particles in the sludge was found to agree reasonably well with corrosion rates reported in the literature

  11. Trends and perspectives of flow injection/sequential injection on-line sample-pretreatment schemes coupled to ETAAS

    DEFF Research Database (Denmark)

    Wang, Jianhua; Hansen, Elo Harald

    2005-01-01

    Flow injection (FI) analysis, the first generation of this technique, was supplemented in the 1990s by its second generation, sequential injection (SI), and most recently by the third generation (i.e., Lab-on-Valve). The dominant role played by FI in automatic, on-line sample pretreatments in

  12. Stability of volatile organics in environmental soil samples

    Energy Technology Data Exchange (ETDEWEB)

    Maskarinec, M.P.; Bayne, C.K.; Jenkins, R.A.; Johnson, L.H.; Holladay, S.K.

    1992-11-01

    This report focuses on data generated to establish the stability of 19 volatile organic compounds in environmental soil samples. The study was carried out over a 56-day (for two soils) and a 111-day (for one reference soil) time frame and took into account as many variables as possible within the constraints of budget and time. The objectives of the study were: 1) to provide a database which could be used to provide guidance on pre-analytical holding times for regulatory purposes; and 2) to provide a basis for the evaluation of data which are generated outside of the currently allowable holding times.

  13. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the required precision, the greater the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over the nonprobability sampling techniques, because the results of the study can then be generalized to the target population.
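
    For a categorical outcome, the factors listed above combine in the standard single-proportion formula (stated here for reference; the article lists the ingredients rather than the algebra), with a finite-population correction when the target population is small:

      n \;=\; \frac{z_{1-\alpha/2}^{2}\; p\,(1-p)}{d^{2}},
      \qquad
      n_{\mathrm{adj}} \;=\; \frac{n}{1 + (n-1)/N},

    where p is the expected proportion, d the required precision (margin of error), z_{1-\alpha/2} the normal quantile for the chosen confidence level, and N the population size. A higher confidence level or a smaller d drives n up, matching the text's remark that greater precision demands a larger sample.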

  14. Genomic scans for selective sweeps using SNP data

    DEFF Research Database (Denmark)

    Nielsen, Rasmus; Williamson, Scott; Kim, Yuseob

    2005-01-01

    ...of the selection coefficient. To illustrate the method, we apply our approach to data from the Seattle SNP project and to Chromosome 2 data from the HapMap project. In Chromosome 2, the most extreme signal is found in the lactase gene, which has previously been shown to be undergoing positive selection. Evidence...
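
    The record's composite likelihood ratio statistic is not reproduced here; instead, the sketch below illustrates the underlying signature such scans look for (a local collapse in nucleotide diversity around a sweep) using a plain sliding-window heterozygosity scan. The window size and the 0/1 haplotype matrix are hypothetical:

```python
import numpy as np

def window_diversity(haplotypes, starts, width):
    """Mean per-site heterozygosity in SNP windows; haplotypes is a
    (num_haplotypes x num_snps) 0/1 matrix. Low values flag sweep candidates."""
    n, _ = haplotypes.shape
    freqs = haplotypes.mean(axis=0)                  # derived-allele frequency per SNP
    het = 2.0 * freqs * (1.0 - freqs) * n / (n - 1)  # unbiased heterozygosity
    return [het[s:s + width].mean() for s in starts]

# Hypothetical data: 120 haplotypes x 500 SNPs, with a near-monomorphic
# "swept" core planted at SNPs 200-259.
rng = np.random.default_rng(0)
haps = rng.integers(0, 2, size=(120, 500))
haps[:, 200:260] = 0
haps[:5, 200:260] = 1            # a few unswept lineages remain

starts = list(range(0, 450, 25))
pi = window_diversity(haps, starts, width=50)
print("lowest-diversity window starts at SNP", starts[int(np.argmin(pi))])
```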

  15. The Idea to Synchronize Measuring Paths for Two Different Current Generators Operated as Position-Voltage Converters

    Directory of Open Access Journals (Sweden)

    Gębura Andrzej

    2016-08-01

    The study deals with the issue of how to apply two different types of current generator at the same time, namely a three-phase AC generator and a DC commutator-type generator. More interestingly, the two generators would be able to collaborate both during the electromechanical sampling phase and the electronic sampling phase. This will enable structural improvements in sensitivity and resolution and will open new opportunities to investigate types of mechanical phenomena that have not previously been tracked by means of the FAM-C and FDM-A methods, such as torsion of torque-transmission shafts.

  16. Selecting Sample Preparation Workflows for Mass Spectrometry-Based Proteomic and Phosphoproteomic Analysis of Patient Samples with Acute Myeloid Leukemia.

    Science.gov (United States)

    Hernandez-Valladares, Maria; Aasebø, Elise; Selheim, Frode; Berven, Frode S; Bruserud, Øystein

    2016-08-22

    Global mass spectrometry (MS)-based proteomic and phosphoproteomic studies of acute myeloid leukemia (AML) biomarkers represent a powerful strategy to identify and confirm proteins and their phosphorylated modifications that could be applied in diagnosis and prognosis, to support individual treatment regimens, and to select patients for bone marrow transplant. MS-based studies require optimal and reproducible workflows that allow satisfactory coverage of the proteome and its modifications. Preparation of samples for global MS analysis is a crucial step, and it usually requires method testing, tuning, and optimization. Different proteomic workflows that have been used to prepare AML patient samples for global MS analysis usually include a standard protein in-solution digestion procedure with a urea-based lysis buffer. The enrichment of phosphopeptides from AML patient samples has previously been carried out either with immobilized metal affinity chromatography (IMAC) or metal oxide affinity chromatography (MOAC). We have recently tested several methods of sample preparation for MS analysis of the AML proteome and phosphoproteome and introduced filter-aided sample preparation (FASP) as a superior methodology for the sensitive and reproducible generation of peptides from patient samples. FASP-prepared peptides can be further fractionated or IMAC-enriched for proteome or phosphoproteome analyses. Herein, we review both in-solution and FASP-based sample preparation workflows and encourage the use of the latter for the highest protein and phosphorylation coverage and reproducibility.

  17. Second-Generation Prisoners and the Transmission of Domestic Violence.

    Science.gov (United States)

    Will, Joanna L; Loper, Ann B; Jackson, Shelly L

    2016-01-01

    Adult inmates who experienced the incarceration of a parent, known as "second-generation prisoners," experience unique challenges and are at heightened risk for experiencing other adversities throughout the life span. Our study investigated one specific, and previously unexplored, type of adversity--domestic violence--within a sample of 293 incarcerated adults. We examined the relation between generation status (first- or second-generation prisoners), childhood exposure to domestic violence, and participation in adult relationship violence prior to incarceration. Results indicate that prisoners who had been exposed to domestic violence in childhood were more likely to engage in intimate partner violence resulting in inflicted and received injury. Relative to first-generation prisoners, second-generation prisoners reported more childhood domestic violence exposure and were more likely to have been injured by a relationship partner. However, this relation between second-generation status and injury victimization was mediated by domestic violence exposure. These results support an intergenerational pattern of domestic violence and suggest that second-generation prisoners are a unique population worthy of future investigation and mental health intervention.

  18. Wilsonville wastewater sampling program. Final report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1983-10-01

    As part of its contract to design, build, and operate the SRC-1 Demonstration Plant in cooperation with the US Department of Energy (DOE), International Coal Refining Company (ICRC) was required to collect and evaluate data related to wastewater streams and wastewater treatment procedures at the SRC-1 Pilot Plant facility. The pilot plant is located at Wilsonville, Alabama and is operated by Catalytic, Inc. under the direction of Southern Company Services. The plant is funded in part by the Electric Power Research Institute and the DOE. ICRC contracted with Catalytic, Inc. to conduct wastewater sampling. Tasks 1 through 5 included sampling and analysis of various wastewater sources and of points at different steps in the biological treatment facility at the plant. The sampling program ran from May 1 to July 31, 1982. Also included in the sampling program was the generation and analysis of leachate from SRC product using standard laboratory leaching procedures. For Task 6, available plant wastewater data covering the period from February 1978 to December 1981 were analyzed to gain information that might be useful for a demonstration plant design basis. This report contains a tabulation of the analytical data, a summary tabulation of the historical operating data that were evaluated, and comments concerning the data. The procedures used during the sampling program are also documented.

  19. THz generation from a nanocrystalline silicon-based photoconductive device

    International Nuclear Information System (INIS)

    Daghestani, N S; Persheyev, S; Cataluna, M A; Rose, M J; Ross, G

    2011-01-01

    Terahertz generation has been achieved from a photoconductive switch based on hydrogenated nanocrystalline silicon (nc-Si:H), gated by a femtosecond laser. The nc-Si:H samples were produced by hot-wire chemical vapour deposition, a process with low production costs owing to its high growth rate and manufacturing simplicity. Although promising ultrafast carrier dynamics in nc-Si have previously been demonstrated, this is the first report of THz generation from an nc-Si:H material.

  20. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

    Non-probability sampling designs can be used in ethnobotanical surveys of medicinal plants. However, this method does not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient of 95%; sampling error (d) of 0.05; and a proportion (p) of 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods must be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences are to be made from the data obtained. This can be achieved by applying different existing probability sampling methods or, better still, a combination of such methods.
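
    The record's figure of at least 290 families follows from the standard proportion formula combined with a finite population correction. A minimal check using the inputs stated above:

```python
import math

def sample_size_finite(N, p, d, z=1.96):
    """Sample size for estimating a proportion in a finite population:
    n0 = z^2 p(1-p)/d^2, then n = n0 / (1 + (n0 - 1)/N)."""
    n0 = z**2 * p * (1.0 - p) / d**2   # infinite-population size
    return math.ceil(n0 / (1.0 + (n0 - 1.0) / N))

# Inputs from the record: N = 1179 families, 95% confidence, d = 0.05, p = 0.5.
print(sample_size_finite(N=1179, p=0.5, d=0.05))  # -> 290, matching the study
```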