WorldWideScience

Sample records for hapmap samples generating

  1. Generating samples for association studies based on HapMap data

    Directory of Open Access Journals (Sweden)

    Chen Yixuan

    2008-01-01

Full Text Available Abstract Background With the completion of the HapMap project, a variety of computational algorithms and tools have been proposed for haplotype inference, tag SNP selection and genome-wide association studies. Simulated data are commonly used in evaluating these newly developed approaches. In addition to simulations based on population models, empirical data generated by perturbing real data have also been used, because such data may inherit specific properties from the real data. However, no publicly available tool can generate large-scale simulated variation data while taking into account knowledge from the HapMap project. Results A computer program (gs) was developed to quickly generate a large number of samples based on real data that are useful for a variety of purposes, including evaluating methods for haplotype inference, tag SNP selection and association studies. Two approaches have been implemented to generate dense SNP haplotype/genotype data that share local linkage disequilibrium (LD) patterns similar to those in human populations. The first approach takes haplotype pairs from samples as inputs, and the second takes patterns of haplotype block structures. Both quantitative and qualitative traits have been incorporated in the program. Phenotypes are generated based on a disease model or on the effect of a quantitative trait nucleotide, both of which can be specified by users. In addition to single-locus disease models, two-locus disease models have been implemented that can incorporate any degree of epistasis. Users may specify all nine parameters in a 3 × 3 penetrance table. For several commonly used two-locus disease models, the program can automatically calculate penetrances based on the population prevalence and marginal effects of the disease, which users can conveniently specify. Conclusion The program gs can effectively generate large scale genetic and phenotypic variation data that can be
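The two-locus model described above can be sketched in a few lines: the snippet below builds a 3 × 3 penetrance table under a hypothetical multiplicative model and draws case/control phenotypes from it. This illustrates the idea only, not the gs program's actual implementation; the parameter names f (baseline penetrance) and theta (per-allele risk factor) are invented for the example.

```python
import random

def penetrance_table(f, theta):
    """Hypothetical multiplicative two-locus model: penetrance grows by a
    factor (1 + theta) per risk allele at either locus; f is the baseline."""
    return [[f * (1 + theta) ** (i + j) for j in range(3)] for i in range(3)]

def simulate_phenotypes(genotypes, table, rng):
    """Assign case (1) / control (0) status from pairs of risk-allele
    counts (locus A count, locus B count) using the penetrance table."""
    return [1 if rng.random() < table[a][b] else 0 for a, b in genotypes]

rng = random.Random(42)
table = penetrance_table(f=0.05, theta=0.5)
# Random genotype pairs standing in for simulated HapMap-derived samples.
genos = [(rng.randint(0, 2), rng.randint(0, 2)) for _ in range(1000)]
status = simulate_phenotypes(genos, table, rng)
```

Any degree of epistasis can be expressed by filling the nine table cells directly instead of deriving them from a parametric model.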

  2. Geographical affinities of the HapMap samples.

    Directory of Open Access Journals (Sweden)

    Miao He

Full Text Available The HapMap samples were collected for medical-genetic studies, but are also widely used in population-genetic and evolutionary investigations. Yet the ascertainment of the samples differs from that of most population-genetic studies, which collect individuals who live in the same local region as their ancestors. What effects could this non-standard ascertainment have on the interpretation of HapMap results? We compared the HapMap samples with more conventionally ascertained samples used in population- and forensic-genetic studies, including the HGDP-CEPH panel, making use of published genome-wide autosomal SNP data and Y-STR haplotypes, as well as producing new Y-STR data. We found that the HapMap samples were representative of their broad geographical regions of ancestry according to all tests applied. The YRI and JPT were indistinguishable from independent samples of Yoruba and Japanese in all ways investigated. However, both the CHB and the CEU were distinguishable from all other HGDP-CEPH populations with autosomal markers, and both showed Y-STR similarities to unusually large numbers of populations, perhaps reflecting their admixed origins. The CHB and JPT are readily distinguished from one another with both autosomal and Y-chromosomal markers, and results obtained after combining them into a single sample should be interpreted with caution. The CEU are better described as being of Western European ancestry than of Northern European ancestry, as often reported. Both the CHB and CEU show subtle but detectable signs of admixture. Thus the YRI and JPT samples are well suited to standard population-genetic studies, but the CHB and CEU less so.

  3. Semantic Modeling for SNPs Associated with Ethnic Disparities in HapMap Samples

    Directory of Open Access Journals (Sweden)

    HyoYoung Kim

    2014-03-01

Full Text Available Single-nucleotide polymorphisms (SNPs) have emerged from efforts to research human diseases and ethnic disparities. A semantic network is needed for in-depth understanding of the impacts of SNPs, because phenotypes are modulated by complex networks, including biochemical and physiological pathways. We identified ethnicity-specific SNPs by eliminating overlapping SNPs from HapMap samples, and the ethnicity-specific SNPs were mapped to the UCSC RefGene lists. Ethnicity-specific genes were identified as follows: 22 genes in the USA (CEU) individuals, 25 genes in the Japanese (JPT) individuals, and 332 genes in the African (YRI) individuals. To analyze the biologically functional implications of ethnicity-specific SNPs, we focused on constructing a semantic network model. Entities for the network, represented by "Gene," "Pathway," "Disease," "Chemical," "Drug," "ClinicalTrials," and "SNP," and the relationships between entities were obtained through curation. Our semantic modeling for ethnicity-specific SNPs showed interesting results in three categories: three diseases ("AIDS-associated nephropathy," "Hypertension," and "Pelvic infection"), one drug ("Methylphenidate"), and five pathways ("Hemostasis," "Systemic lupus erythematosus," "Prostate cancer," "Hepatitis C virus," and "Rheumatoid arthritis"). We found ethnicity-specific genes using the semantic modeling, and the majority of our findings were consistent with previous studies, in that an understanding of genetic variability explained ethnicity-specific disparities.
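A semantic network of the kind the abstract describes is, at its core, a graph of typed nodes joined by labeled edges. The sketch below shows one minimal way such a structure could be represented and queried; the class design and the entity "GENE_X" are invented for illustration and are not taken from the study's curated data (the entity types and the names "Hypertension" and "Hemostasis" do appear in the abstract).

```python
from collections import defaultdict

class SemanticNetwork:
    """Typed nodes ("Gene", "Disease", "Pathway", ...) with labeled,
    undirected relationships between them."""
    def __init__(self):
        self.node_type = {}            # entity name -> entity type
        self.edges = defaultdict(set)  # entity name -> {(relation, other)}

    def add_node(self, name, ntype):
        self.node_type[name] = ntype

    def add_edge(self, src, relation, dst):
        self.edges[src].add((relation, dst))
        self.edges[dst].add((relation, src))  # store both directions

    def neighbors_of_type(self, name, ntype):
        """All entities of a given type directly linked to `name`."""
        return sorted(dst for _, dst in self.edges[name]
                      if self.node_type.get(dst) == ntype)

net = SemanticNetwork()
net.add_node("GENE_X", "Gene")           # hypothetical ethnicity-specific gene
net.add_node("Hypertension", "Disease")
net.add_node("Hemostasis", "Pathway")
net.add_edge("GENE_X", "associated_with", "Hypertension")
net.add_edge("GENE_X", "member_of", "Hemostasis")
```

Queries such as `net.neighbors_of_type("GENE_X", "Disease")` then surface the disease associations of an ethnicity-specific gene.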

  4. An evaluation of the performance of tag SNPs derived from HapMap in a Caucasian population.

    Directory of Open Access Journals (Sweden)

    Alexandre Montpetit

    2006-03-01

Full Text Available The Haplotype Map (HapMap) project recently generated genotype data for more than 1 million single-nucleotide polymorphisms (SNPs) in four population samples. The main application of the data is in the selection of tag single-nucleotide polymorphisms (tSNPs) to use in association studies. The usefulness of this selection process needs to be verified in populations outside those used for the HapMap project. In addition, it is not known how well the data represent the general population, as only 90-120 chromosomes were used for each population and the genotyped SNPs were selected so as to have high frequencies. In this study, we analyzed more than 1,000 individuals from Estonia. The population of this northern European country has been influenced by many different waves of migration from Europe and Russia. We genotyped 1,536 randomly selected SNPs from two 500-kbp ENCODE regions on Chromosome 2. We observed that the tSNPs selected from the CEPH (Centre d'Etude du Polymorphisme Humain) from Utah (CEU) HapMap samples (derived from US residents with northern and western European ancestry) captured most of the variation in the Estonian sample. (Between 90% and 95% of the SNPs with a minor allele frequency of more than 5% have an r2 of at least 0.8 with one of the CEU tSNPs.) Using the reverse approach, tags selected from the Estonian sample could almost equally well describe the CEU sample. Finally, we observed that the sample size, the allelic frequency, and the SNP density in the dataset used to select the tags each have important effects on tagging performance. Overall, our study supports the use of HapMap data in other Caucasian populations, but the SNP density and the bias towards high-frequency SNPs have to be taken into account when designing association studies.
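The "captured at r2 ≥ 0.8" criterion above can be made concrete with a small sketch: r2 between two phased SNPs is the squared Pearson correlation of their 0/1 allele vectors, and tagging coverage is the fraction of SNPs in strong LD with at least one tag. The toy vectors below are invented for illustration.

```python
def r_squared(x, y):
    """Squared Pearson correlation between two 0/1 allele vectors
    (one entry per haplotype), the usual r^2 LD measure for phased data."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    if vx == 0 or vy == 0:
        return 0.0  # monomorphic SNP: no LD defined, treat as untagged
    return cov * cov / (vx * vy)

def coverage(snps, tags, threshold=0.8):
    """Fraction of SNPs captured by at least one tag at r^2 >= threshold."""
    captured = sum(1 for s in snps
                   if any(r_squared(s, t) >= threshold for t in tags))
    return captured / len(snps)

# Two SNPs in perfect LD share one tag; a third, independent SNP is missed.
snp_a = [0, 0, 1, 1, 0, 1]
snp_b = [0, 0, 1, 1, 0, 1]   # identical to snp_a, so r^2 = 1
snp_c = [1, 0, 0, 1, 1, 0]
print(coverage([snp_a, snp_b, snp_c], tags=[snp_a]))  # → 0.6666666666666666
```

The study's 90-95% capture figures are this coverage statistic evaluated with CEU-derived tags against the Estonian sample.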

  5. Genotype Imputation for Latinos Using the HapMap and 1000 Genomes Project Reference Panels

    Directory of Open Access Journals (Sweden)

Xiaoyi Gao

    2012-06-01

Full Text Available Genotype imputation is a vital tool in genome-wide association studies (GWAS) and meta-analyses of multiple GWAS results. Imputation enables researchers to increase genomic coverage and to pool data generated using different genotyping platforms. HapMap samples are often employed as the reference panel; more recently, the 1000 Genomes Project resource has become the primary source for reference panels. Multiple GWAS and meta-analyses are targeting Latinos, the most populous and fastest-growing minority group in the US. However, genotype imputation resources for Latinos are at present rather limited compared to those for individuals of European ancestry, largely because of the lack of good reference data. One choice of reference panel for Latinos is one derived from the population of Mexican individuals in Los Angeles contained in the HapMap Phase 3 project and the 1000 Genomes Project. However, a detailed evaluation of the quality of the imputed genotypes derived from the public reference panels has not yet been reported. Using simulation studies, the Illumina OmniExpress GWAS data from the Los Angeles Latino Eye Study and the MACH software package, we evaluated the accuracy of genotype imputation in Latinos. Our results show that the 1000 Genomes Project AMR+CEU+YRI reference panel provides the highest imputation accuracy for Latinos, and that also including Asian samples in the panel can reduce imputation accuracy. We also provide the imputation accuracy for each autosomal chromosome using the 1000 Genomes Project panel for Latinos. Our results serve as a guide to future imputation-based analysis in Latinos.
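The evaluation logic behind such accuracy studies is simple to sketch: hide a fraction of known genotypes, impute them, and measure concordance with the truth. The snippet below shows only that skeleton; the `mode_imputer` is a deliberately naive stand-in, since real studies use reference-panel tools such as MACH, and all names and data here are invented for the example.

```python
import random

def mask_and_score(true_genos, imputer, mask_rate, rng):
    """Hide a random fraction of genotypes, impute them, and report the
    concordance (fraction of masked genotypes recovered correctly)."""
    masked_idx = [i for i in range(len(true_genos)) if rng.random() < mask_rate]
    observed = list(true_genos)
    for i in masked_idx:
        observed[i] = None
    imputed = imputer(observed)
    correct = sum(1 for i in masked_idx if imputed[i] == true_genos[i])
    return correct / len(masked_idx) if masked_idx else 1.0

def mode_imputer(observed):
    """Naive stand-in for a real imputer: fill every missing genotype
    with the most common observed genotype."""
    seen = [g for g in observed if g is not None]
    mode = max(set(seen), key=seen.count)
    return [mode if g is None else g for g in observed]

rng = random.Random(1)
truth = [0] * 80 + [1] * 15 + [2] * 5   # toy genotype vector for one SNP
acc = mask_and_score(truth, mode_imputer, mask_rate=0.1, rng=rng)
```

Swapping panels (e.g., AMR+CEU+YRI versus a panel that also includes Asian samples) amounts to swapping the `imputer` while keeping the masking and scoring fixed.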

  6. SNPexp - A web tool for calculating and visualizing correlation between HapMap genotypes and gene expression levels

    Directory of Open Access Journals (Sweden)

    Franke Andre

    2010-12-01

Full Text Available Abstract Background Expression levels for 47,294 transcripts in lymphoblastoid cell lines from all 270 HapMap phase II individuals, and genotypes (both HapMap phase II and III) of 3.96 million single-nucleotide polymorphisms (SNPs) in the same individuals, are publicly available. We aimed to generate a user-friendly web-based tool for visualization of the correlation between SNP genotypes within a specified genomic region and a gene of interest, also known as an expression quantitative trait locus (eQTL) analysis. Results SNPexp is implemented as a server-side script and is publicly available at http://tinyurl.com/snpexp. Correlation between genotype and transcript expression levels is calculated by performing linear regression and the Wald test, as implemented in PLINK, and visualized using the UCSC Genome Browser. Validation of SNPexp using previously published eQTLs yielded comparable results. Conclusions SNPexp provides a convenient and platform-independent way to calculate and visualize the correlation between HapMap genotypes within a specified genetic region anywhere in the genome and gene expression levels. This allows for investigation of both cis and trans effects. The web interface and the utilization of publicly available and widely used software resources make it an attractive supplement to more advanced bioinformatic tools. Advanced users can run the program on a local computer on custom datasets.
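The per-SNP test the abstract describes is an ordinary least-squares regression of expression on genotype dosage, with a Wald statistic (slope divided by its standard error) for significance. The sketch below is a minimal stand-in for what PLINK's linear association test computes, with a perfectly additive toy eQTL invented for the example.

```python
def eqtl_regression(dosages, expression):
    """OLS regression of expression on genotype dosage (0/1/2).
    Returns the slope and its Wald statistic (slope / standard error)."""
    n = len(dosages)
    mx = sum(dosages) / n
    my = sum(expression) / n
    sxx = sum((x - mx) ** 2 for x in dosages)
    sxy = sum((x - mx) * (y - my) for x, y in zip(dosages, expression))
    beta = sxy / sxx
    alpha = my - beta * mx
    resid = [y - (alpha + beta * x) for x, y in zip(dosages, expression)]
    sigma2 = sum(e * e for e in resid) / (n - 2)   # residual variance
    se = (sigma2 / sxx) ** 0.5
    return beta, beta / se if se > 0 else float("inf")

# Toy eQTL: expression rises by exactly 2 units per copy of the allele.
dos = [0, 0, 1, 1, 2, 2]
expr = [10, 10, 12, 12, 14, 14]
beta, wald = eqtl_regression(dos, expr)
```

In PLINK the same slope appears as BETA and the Wald statistic is converted to the reported p-value; SNPexp then renders those values along the queried region in the UCSC Genome Browser.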

  7. Unexpected Relationships and Inbreeding in HapMap Phase III Populations

    Science.gov (United States)

    Stevens, Eric L.; Baugher, Joseph D.; Shirley, Matthew D.; Frelin, Laurence P.; Pevsner, Jonathan

    2012-01-01

    Correct annotation of the genetic relationships between samples is essential for population genomic studies, which could be biased by errors or omissions. To this end, we used identity-by-state (IBS) and identity-by-descent (IBD) methods to assess genetic relatedness of individuals within HapMap phase III data. We analyzed data from 1,397 individuals across 11 ethnic populations. Our results support previous studies (Pemberton et al., 2010; Kyriazopoulou-Panagiotopoulou et al., 2011) assessing unknown relatedness present within this population. Additionally, we present evidence for 1,657 novel pairwise relationships across 9 populations. Surprisingly, significant Cotterman's coefficients of relatedness K1 (IBD1) values were detected between pairs of known parents. Furthermore, significant K2 (IBD2) values were detected in 32 previously annotated parent-child relationships. Consistent with a hypothesis of inbreeding, regions of homozygosity (ROH) were identified in the offspring of related parents, of which a subset overlapped those reported in previous studies (Gibson et al. 2010; Johnson et al. 2011). In total, we inferred 28 inbred individuals with ROH that overlapped areas of relatedness between the parents and/or IBD2 sharing at a different genomic locus between a child and a parent. Finally, 8 previously annotated parent-child relationships had unexpected K0 (IBD0) values (resulting from a chromosomal abnormality or genotype error), and 10 previously annotated second-degree relationships along with 38 other novel pairwise relationships had unexpected IBD2 (indicating two separate paths of recent ancestry). These newly described types of relatedness may impact the outcome of previous studies and should inform the design of future studies relying on the HapMap Phase III resource. PMID:23185369
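Identity-by-state, one of the two measures used above, has a particularly simple form for genotype data: at each SNP, two individuals share 0, 1, or 2 alleles, which is 2 minus the absolute difference of their allele counts. The snippet below sketches that calculation with invented toy genotypes; real IBD inference (the K0/K1/K2 coefficients discussed in the abstract) builds probabilistic models on top of statistics like this.

```python
def ibs_profile(g1, g2):
    """Mean identity-by-state across SNPs, normalized to [0, 1].
    Per SNP, IBS = 2 - |allele count difference| (0, 1, or 2 shared)."""
    shared = [2 - abs(a - b) for a, b in zip(g1, g2)]
    return sum(shared) / (2 * len(shared))

# Toy genotype vectors (allele counts per SNP), invented for illustration.
child    = [1, 1, 2, 0, 1, 2]
parent   = [1, 2, 1, 0, 0, 2]
stranger = [0, 0, 0, 2, 2, 0]
```

Related pairs show elevated mean IBS relative to unrelated pairs from the same population, which is what flags unannotated relationships in datasets such as HapMap phase III.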

  8. PWR steam generator tubing sample library

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

In order to compile the tubing sample library, two approaches were employed: (a) tubing sample replication by either chemical or mechanical means, based on field tube data and metallography reports for tubes already destructively examined; and (b) acquisition of field tubes removed from operating or retired steam generators. In addition, a unique mercury modeling concept is in use to guide the selection of replica samples. A compendium was compiled that summarizes field observations and morphologies of steam generator tube degradation types based on available NDE, destructive examinations, and field reports. This compendium was used in selecting candidate degradation types that were manufactured for inclusion in the tube library.

  9. GLIDERS - A web-based search engine for genome-wide linkage disequilibrium between HapMap SNPs

    Directory of Open Access Journals (Sweden)

    Broxholme John

    2009-10-01

Full Text Available Abstract Background A number of tools exist for the examination of linkage disequilibrium (LD) patterns between nearby alleles, but none are available for quickly and easily investigating LD at longer ranges (>500 kb). We have developed a web-based query tool (GLIDERS: Genome-wide LInkage DisEquilibrium Repository and Search engine) that enables the retrieval of pairwise associations with r2 ≥ 0.3 across the human genome for any SNP genotyped within HapMap phase 2 and 3, regardless of the distance between the markers. Description GLIDERS is an easy-to-use web tool that only requires the user to enter the rs numbers of the SNPs for which they want to retrieve genome-wide LD (both nearby and long-range). The intuitive web interface handles both manual entry of SNP IDs and uploads of files of SNP IDs. The user can limit the resulting inter-SNP associations with easy-to-use menu options, including a MAF limit (5-45%), distance limits between SNPs (minimum and maximum), r2 (0.3 to 1), HapMap population sample (CEU, YRI, and JPT+CHB combined), and HapMap build/release. All resulting genome-wide inter-SNP associations are displayed on a single output page, which links to a downloadable tab-delimited text file. Conclusion GLIDERS is a quick and easy way to retrieve genome-wide inter-SNP associations and to explore LD patterns for any number of SNPs of interest. GLIDERS can be useful in identifying SNPs with long-range LD, which can highlight mis-mapping or other potential association-signal localisation problems.

  10. Gas generation from Hanford grout samples

    International Nuclear Information System (INIS)

    Jonah, C.D.; Kapoor, S.; Matheson, M.S.; Mulac, W.A.; Meisel, D.

    1996-01-01

In an extension of our work on the radiolytic processes that occur in the waste tanks at the Hanford site, we studied the gas generation from grout samples that contained nuclear waste simulants. Grout is one option for the long-term storage of low-level nuclear waste solutions, but the radiolytic effects on grout have not been thoroughly defined. In particular, the generation of potentially flammable and hazardous gases required quantification. A research team at Argonne examined this issue and found that the total amount of gases generated radiolytically from the WHC samples was an order of magnitude higher than predicted. This implies that novel pathways for charge migration from the solid grout to the associated water are responsible for gas evolution. The grout samples produced hydrogen, nitrous oxide, and carbon monoxide as well as nitrogen and oxygen. Yields of each of these substances were determined for doses equivalent to about 80 years of storage of the grout. Carbon monoxide, which was produced in 2% yield, is of particular importance because even small amounts may adversely affect the catalytic conversion instrumentation planned for installation in the storage vaults.

  11. New Generation Flask Sampling Technology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Smith, James R. [AOS, Inc., Colorado Springs, CO (United States)

    2017-11-09

Scientists are turning their focus to the Arctic, site of one of the strongest climate change signals. A new generation of technologies is required to function within that harsh environment, chart the evolution of its trace gases and provide new kinds of information for models of the atmosphere. Our response to the solicitation tracks how global atmospheric monitoring was launched more than a half century ago; namely, acquisition of discrete samples of air by flask and subsequent analysis in the laboratory. AOS is proposing to develop a new generation of flask sampling technology. It will enable the new Arctic programs to begin with objective high-density sampling of the atmosphere by UAS. The Phase I program will build the prototype flask technology and show that it can acquire and store mole fractions of CH4 and CO2 and values of δ13C with good fidelity. A CAD model will be produced for the entire platform, including a package with 100 flasks and the airframe with autopilot, electric propulsion and ground-to-air communications. A mobile flask analysis station will be prototyped in Phase I and designed to final form in Phase II. It expends a very small sample per analysis and will interface directly to the flask package integrated permanently into the UAS fuselage. Commercial Applications and Other Benefits: • The New Generation Flask Sampling Technology, able to provide a hundred or more samples of air per UAS mission. • A mobile analysis station expending far less sample than existing ones and small enough to be stationed at the remote sites of Arctic operations. • A new form of validation for continuous trace gas observations from all platforms, including the small UAS. • Further demonstration to potential customers of the AOS capabilities to invent, build, deploy and exploit entire platforms for observations of Earth's atmosphere and ocean. Key Words: Flask Sampler, Mobile Analysis Station, Trace Gas, CO2, CH4, δ13C, UAS, Baseline Airborne Observatory

  12. Data analysis for steam generator tubing samples

    International Nuclear Information System (INIS)

    Dodd, C.V.

    1996-07-01

The objective of the Improved Eddy-Current ISI for Steam Generators program is to upgrade and validate eddy-current inspections, including probes, instrumentation, and data processing techniques, for inservice inspection of new, used, and repaired steam generator tubes; to improve defect detection, classification, and characterization as affected by diameter and thickness variations, denting, probe wobble, tube sheets, tube supports, and copper and sludge deposits, even when defect types and other variables occur in combination; and to transfer this advanced technology to NRC's mobile NDE laboratory and staff. This report describes the application of advanced eddy-current neural network analysis methods for the detection and evaluation of common steam generator tubing flaws, including axial and circumferential outer-diameter stress-corrosion cracking and intergranular attack. The report describes the training of the neural networks on tubing samples with known defects and the subsequent evaluation results for unknown samples. Evaluations were done in the presence of artifacts. Computer programs are given in the appendix.

  13. Single-molecule optical genome mapping of a human HapMap and a colorectal cancer cell line.

    Science.gov (United States)

    Teo, Audrey S M; Verzotto, Davide; Yao, Fei; Nagarajan, Niranjan; Hillmer, Axel M

    2015-01-01

    Next-generation sequencing (NGS) technologies have changed our understanding of the variability of the human genome. However, the identification of genome structural variations based on NGS approaches with read lengths of 35-300 bases remains a challenge. Single-molecule optical mapping technologies allow the analysis of DNA molecules of up to 2 Mb and as such are suitable for the identification of large-scale genome structural variations, and for de novo genome assemblies when combined with short-read NGS data. Here we present optical mapping data for two human genomes: the HapMap cell line GM12878 and the colorectal cancer cell line HCT116. High molecular weight DNA was obtained by embedding GM12878 and HCT116 cells, respectively, in agarose plugs, followed by DNA extraction under mild conditions. Genomic DNA was digested with KpnI and 310,000 and 296,000 DNA molecules (≥ 150 kb and 10 restriction fragments), respectively, were analyzed per cell line using the Argus optical mapping system. Maps were aligned to the human reference by OPTIMA, a new glocal alignment method. Genome coverage of 6.8× and 5.7× was obtained, respectively; 2.9× and 1.7× more than the coverage obtained with previously available software. Optical mapping allows the resolution of large-scale structural variations of the genome, and the scaffold extension of NGS-based de novo assemblies. OPTIMA is an efficient new alignment method; our optical mapping data provide a resource for genome structure analyses of the human HapMap reference cell line GM12878, and the colorectal cancer cell line HCT116.

  14. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which ... indicates signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose ... algorithm generates random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed...
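A constrained random sampling pattern of the kind discussed above can be sketched directly: draw sampling points at random on a time grid while enforcing a minimum spacing between points, a typical ADC-imposed constraint. The rejection scheme and parameter values below are invented for illustration and are not the paper's algorithm.

```python
import random

def constrained_pattern(n_points, grid_size, min_gap, rng):
    """Draw a random sampling pattern on a time grid of `grid_size` slots,
    keeping every pair of chosen points at least `min_gap` slots apart
    (a simple model of an event-driven ADC's timing constraint)."""
    pattern = []
    candidates = list(range(grid_size))
    while len(pattern) < n_points and candidates:
        p = rng.choice(candidates)
        pattern.append(p)
        # Remove every slot that would violate the minimum-gap constraint.
        candidates = [c for c in candidates if abs(c - p) >= min_gap]
    return sorted(pattern)

rng = random.Random(7)
pat = constrained_pattern(n_points=10, grid_size=100, min_gap=5, rng=rng)
```

Statistical evaluation of a generator, in the spirit of the paper, would then examine the distribution of gaps and point positions over many such patterns.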

  15. Comparison of HapMap and 1000 Genomes Reference Panels in a Large-Scale Genome-Wide Association Study.

    Directory of Open Access Journals (Sweden)

    Paul S de Vries

Full Text Available An increasing number of genome-wide association (GWA) studies are now using the higher-resolution 1000 Genomes Project reference panel (1000G) for imputation, with the expectation that 1000G imputation will lead to the discovery of additional associated loci when compared to HapMap imputation. In order to assess the improvement of 1000G over HapMap imputation in identifying associated loci, we compared the results of GWA studies of circulating fibrinogen based on the two reference panels. Using both HapMap and 1000G imputation we performed a meta-analysis of 22 studies comprising the same 91,953 individuals. We identified six additional signals using 1000G imputation, while 29 loci were associated using both HapMap and 1000G imputation. One locus identified using HapMap imputation was not significant using 1000G imputation. The genome-wide significance threshold of 5×10⁻⁸ is based on the number of independent statistical tests using HapMap imputation, and 1000G imputation may lead to further independent tests that should be corrected for. When using a stricter Bonferroni correction for the 1000G GWA study (P-value < 2.5×10⁻⁸), the number of loci significant only using HapMap imputation increased to 4, while the number of loci significant only using 1000G decreased to 5. In conclusion, 1000G imputation enabled the identification of 20% more loci than HapMap imputation, although the advantage of 1000G imputation became less clear when a stricter Bonferroni correction was used. More generally, our results provide insights that are applicable to the implementation of other dense reference panels that are under development.

  16. Treatment of Nuclear Data Covariance Information in Sample Generation

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wieselquist, William [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division

    2017-10-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on developing a sampling capability that can handle the challenges of generating samples from nuclear cross-section data. The covariance information between energy groups tends to be very ill-conditioned and thus poses a problem using traditional methods for generated correlated samples. This report outlines a method that addresses the sample generation from cross-section matrices.
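One simple way to generate correlated samples from an ill-conditioned covariance matrix, in the spirit of the problem this report addresses, is to regularize the matrix with diagonal jitter until a Cholesky factorization succeeds and then transform standard normal draws. The sketch below shows that generic technique with an invented near-singular 2×2 covariance; it is one common regularization, not necessarily the method the report develops.

```python
import math
import random

def cholesky(a):
    """Plain Cholesky factorization; raises ValueError if `a` is not
    positive definite (the failure mode for ill-conditioned matrices)."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                d = a[i][i] - s
                if d <= 0:
                    raise ValueError("not positive definite")
                L[i][j] = math.sqrt(d)
            else:
                L[i][j] = (a[i][j] - s) / L[j][j]
    return L

def sample_mvn(cov, rng, jitter0=1e-12):
    """Draw one sample from N(0, cov), adding progressively larger diagonal
    jitter until the factorization succeeds, then applying L to i.i.d.
    standard normal draws so the output has (approximately) covariance cov."""
    n = len(cov)
    jitter = jitter0
    while True:
        try:
            L = cholesky([[cov[i][j] + (jitter if i == j else 0.0)
                           for j in range(n)] for i in range(n)])
            break
        except ValueError:
            jitter *= 10
    z = [rng.gauss(0, 1) for _ in range(n)]
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]

# Nearly singular covariance (correlation ~ 1), e.g. two adjacent energy groups.
cov = [[1.0, 0.999999999], [0.999999999, 1.0]]
x = sample_mvn(cov, random.Random(0))
```

With correlation this close to 1, the two components of each sample track each other almost exactly, which is the behavior a correlated cross-section sampler must reproduce.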

  17. Treatment of Nuclear Data Covariance Information in Sample Generation

    International Nuclear Information System (INIS)

    Swiler, Laura Painton; Adams, Brian M.; Wieselquist, William

    2017-01-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on developing a sampling capability that can handle the challenges of generating samples from nuclear cross-section data. The covariance information between energy groups tends to be very ill-conditioned and thus poses a problem using traditional methods for generated correlated samples. This report outlines a method that addresses the sample generation from cross-section matrices.

  18. Sample Scripts for Generating PaGE-OM XML

    Lifescience Database Archive (English)

Full Text Available Sample Scripts for Generating PaGE-OM XML This page offers some sample scripts ... on MySQL. Outline chart of procedure 6. Creating RDB tables for generating PaGE-OM XML: these scripts help you ... Download: create_tables_sql2.zip 7. Generating PaGE-OM XML from phenotype data: this sample Perl script helps y

  19. Genome-wide screen for universal individual identification SNPs based on the HapMap and 1000 Genomes databases.

    Science.gov (United States)

    Huang, Erwen; Liu, Changhui; Zheng, Jingjing; Han, Xiaolong; Du, Weian; Huang, Yuanjian; Li, Chengshi; Wang, Xiaoguang; Tong, Dayue; Ou, Xueling; Sun, Hongyu; Zeng, Zhaoshu; Liu, Chao

    2018-04-03

Differences among SNP panels for individual identification, both in SNP selection and in the populations sampled, have left few SNPs in common, compromising their universal applicability. To screen for universally applicable SNPs, we performed genome-wide SNP mining in multiple populations based on the HapMap and 1000 Genomes databases. SNPs with high minor allele frequencies (MAF) in 37 populations were selected. As the MAF threshold rose from ≥0.35 to ≥0.43, the number of selected SNPs decreased from 2,769 to 0. A total of 117 SNPs with MAF ≥0.39 showed no linkage disequilibrium with each other in any population. For 116 of the 117 SNPs, the cumulative match probability (CMP) ranged from 2.01 × 10⁻⁴⁸ to 1.93 × 10⁻⁵⁰ and the cumulative exclusion probability (CEP) ranged from 0.9999999996653 to 0.9999999999945. In 134 tested Han samples, 110 of the 117 SNPs remained at high MAF and conformed to Hardy-Weinberg equilibrium, with CMP = 4.70 × 10⁻⁴⁷ and CEP = 0.999999999862. By analyzing the same number of autosomal SNPs as in the HID-Ion AmpliSeq Identity Panel, i.e., 90 randomized out of the 110 SNPs, our panel yielded preferable CMP and CEP. Taken together, the 110-SNP panel is advantageous for forensic testing, and this study provides plenty of highly informative SNPs for compiling final universal panels.
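The cumulative match probability used above multiplies per-SNP match probabilities across independent (LD-free) SNPs; under Hardy-Weinberg proportions, the per-SNP match probability is the sum of squared genotype frequencies. The sketch below shows the arithmetic and why high-MAF SNPs are preferred: at allele frequency 0.5 the per-SNP match probability reaches its minimum.

```python
def snp_match_probability(p):
    """Probability that two random individuals share a genotype at a
    biallelic SNP with allele frequency p, assuming Hardy-Weinberg
    genotype frequencies p^2, 2pq, q^2."""
    q = 1 - p
    genotype_freqs = [p * p, 2 * p * q, q * q]
    return sum(f * f for f in genotype_freqs)

def cumulative_match_probability(freqs):
    """Product of per-SNP match probabilities over independent SNPs."""
    cmp_ = 1.0
    for p in freqs:
        cmp_ *= snp_match_probability(p)
    return cmp_

print(snp_match_probability(0.5))  # → 0.375, the per-SNP minimum
# An idealized 110-SNP panel with every SNP at MAF 0.5.
panel_cmp = cumulative_match_probability([0.5] * 110)
```

For this idealized 110-SNP panel the CMP lands around 10⁻⁴⁷, the same order of magnitude as the panel reported above, which is why MAF ≥0.39 in every population is such a strong selection criterion.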

  20. Comparison of HapMap and 1000 Genomes Reference Panels in a Large-Scale Genome-Wide Association Study

    DEFF Research Database (Denmark)

    de Vries, Paul S; Sabater-Lleal, Maria; Chasman, Daniel I

    2017-01-01

    An increasing number of genome-wide association (GWA) studies are now using the higher resolution 1000 Genomes Project reference panel (1000G) for imputation, with the expectation that 1000G imputation will lead to the discovery of additional associated loci when compared to HapMap imputation. In...

  1. Generation of complementary sampled phase-only holograms.

    Science.gov (United States)

    Tsang, P W M; Chow, Y T; Poon, T-C

    2016-10-03

If an image is uniformly down-sampled into a sparse form and converted into a hologram, the phase component alone will be adequate to reconstruct the image. However, the appearance of the reconstructed image is degraded by numerous empty holes. In this paper, we present a low-complexity, non-iterative solution to this problem. Briefly, two phase-only holograms are generated for an image, each based on a different down-sampling lattice. Subsequently, the holograms are displayed alternately at a high frame rate. The reconstructed images of the two holograms will appear to be a single, densely sampled image with enhanced visual quality.
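The key ingredient above is a pair of down-sampling lattices whose sample positions do not overlap, so that alternating the two holograms fills in each other's empty holes. The sketch below builds two offset uniform lattices as sets of pixel coordinates; the specific lattice geometry (period-2, diagonal offset) is an invented illustration, not necessarily the paper's exact lattice.

```python
def offset_lattices(height, width, period=2):
    """Two uniform down-sampling lattices on a height x width pixel grid,
    offset by half the sampling period so their sample positions are
    disjoint; together they double the effective sampling density."""
    shift = period // 2
    lattice_a = {(r, c) for r in range(height) for c in range(width)
                 if r % period == 0 and c % period == 0}
    lattice_b = {(r, c) for r in range(height) for c in range(width)
                 if r % period == shift and c % period == shift}
    return lattice_a, lattice_b

a, b = offset_lattices(8, 8, period=2)
```

Each lattice would drive the generation of one phase-only hologram; because the lattices are complementary, time-multiplexing the two reconstructions approximates a single densely sampled image.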

  2. Cr(VI) generation during sample preparation of solid samples – A ...

    African Journals Online (AJOL)

    Cr(VI) generation during sample preparation of solid samples – A chromite ore case study. R.I Glastonbury, W van der Merwe, J.P Beukes, P.G van Zyl, G Lachmann, C.J.H Steenkamp, N.F Dawson, M.H Stewart ...

  3. Effective selection of informative SNPs and classification on the HapMap genotype data

    Directory of Open Access Journals (Sweden)

    Wang Lipo

    2007-12-01

Full Text Available Abstract Background Since single-nucleotide polymorphisms (SNPs) are genetic variations which determine the difference between any two unrelated individuals, SNPs can be used to identify the correct source population of an individual. For efficient population identification with the HapMap genotype data, as few informative SNPs as possible are required from the original 4 million SNPs. Recently, Park et al. (2006) adopted the nearest shrunken centroid method to classify the three populations, i.e., Utah residents with ancestry from Northern and Western Europe (CEU), Yoruba in Ibadan, Nigeria in West Africa (YRI), and Han Chinese in Beijing together with Japanese in Tokyo (CHB+JPT), from which 100,736 SNPs were obtained and the top 82 SNPs could completely classify the three populations. Results In this paper, we propose to first rank each feature (SNP) using a ranking measure, i.e., a modified t-test or F-statistics. Then, from the ranking list, we form different feature subsets by sequentially choosing different numbers of top-ranking features (e.g., 1, 2, 3, ..., 100), train and test them with a classifier, e.g., the support vector machine (SVM), thereby finding the subset which has the highest classification accuracy. Compared to the classification method of Park et al., we obtain a better result, i.e., good classification of the 3 populations using on average 64 SNPs. Conclusion Experimental results show that both the modified t-test and the F-statistics method are very effective in ranking SNPs by their classification capabilities. Combined with the SVM classifier, a desirable feature subset (with minimum size and maximum informativeness) can be quickly found in a greedy manner after ranking all SNPs. Our method is able to identify a very small number of important SNPs that can determine the populations of individuals.
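The ranking step above can be sketched with the standard one-way ANOVA F-statistic: for each SNP, split its genotype values by population and score how well the populations separate. The toy data below are invented; a real pipeline would then feed the top-ranked SNPs into an SVM, which this sketch omits.

```python
def f_statistic(groups):
    """One-way ANOVA F-statistic for one SNP's genotype values split by
    population; a higher F means better separation between populations."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    if ss_within == 0:
        return float("inf")  # perfect separation within every group
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def rank_snps(snp_data):
    """Rank SNP names by F-statistic, most informative first.
    snp_data maps each SNP name to its per-population genotype lists."""
    return sorted(snp_data, key=lambda s: f_statistic(snp_data[s]),
                  reverse=True)

# Toy data: snp1 separates the two populations perfectly, snp2 not at all.
data = {
    "snp1": [[0, 0, 0], [2, 2, 2]],
    "snp2": [[0, 1, 2], [0, 1, 2]],
}
print(rank_snps(data))  # → ['snp1', 'snp2']
```

Sequentially growing the candidate set (top 1, top 2, ...) and evaluating a classifier at each size then yields the smallest subset with the highest accuracy, as the abstract describes.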

  4. Classifier Directed Data Hybridization for Geographic Sample Supervised Segment Generation

    Directory of Open Access Journals (Sweden)

    Christoff Fourie

    2014-11-01

Full Text Available Quality segment generation is a well-known challenge and research objective within Geographic Object-based Image Analysis (GEOBIA). Although methodological avenues within GEOBIA are diverse, segmentation commonly plays a central role in most approaches, influencing and being influenced by surrounding processes. A general approach using supervised quality measures, specifically user-provided reference segments, suggests casting the parameters of a given segmentation algorithm as a multidimensional search problem. In such a sample supervised segment generation approach, spatial metrics observing the user-provided reference segments may drive the search process, which is commonly performed by metaheuristics. A novel sample supervised segment generation approach is presented in this work, in which the spectral content of the provided reference segments is queried. A one-class classification process using spectral information from inside the provided reference segments is used to generate a probability image, which in turn is employed to direct a hybridization of the original input imagery. Segmentation is performed on this hybrid image. These processes are adjustable, interdependent and form part of the search problem. Results are presented detailing the performance of four method variants compared to the generic sample supervised segment generation approach, under various conditions, in terms of resultant segment quality, required computing time and search process characteristics. Multiple metrics, metaheuristics and segmentation algorithms are tested with this approach. Using the spectral data contained within user-provided reference segments to tailor the output generally improves the results in the investigated problem contexts, but at the expense of additional required computing time.
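A minimal sketch of the hybridization idea, assuming a Gaussian one-class model of the reference-segment pixels and a simple linear blend; the mixing weight `alpha` stands in for the adjustable, search-driven parameters described in the abstract:

```python
import numpy as np

def hybridize(image, reference_mask, alpha=0.5):
    """One-class (Gaussian) probability image from reference-segment pixels,
    blended back into every band of the input image.

    image:          (H, W, B) float array of spectral bands.
    reference_mask: (H, W) boolean mask of user-provided reference segments.
    alpha:          assumed mixing weight (part of the search problem).
    """
    pixels = image[reference_mask]                      # (N, B) training pixels
    mean = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False) + 1e-6 * np.eye(image.shape[2])
    inv = np.linalg.inv(cov)
    diff = image - mean
    d2 = np.einsum('hwb,bc,hwc->hw', diff, inv, diff)   # squared Mahalanobis distance
    prob = np.exp(-0.5 * d2)                            # in (0, 1]; 1 = spectrally like the references
    return (1 - alpha) * image + alpha * prob[..., None]
```

Any segmentation algorithm can then be run on the returned hybrid image instead of the raw imagery.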

  5. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

1 - Description of program or function: Random samples for different distribution types are generated. Distribution types as required for performance assessment modeling of geologic nuclear waste disposal are provided. These are: - Uniform, - Log-uniform (base 10 or natural), - Normal, - Lognormal (base 10 or natural), - Exponential, - Bernoulli, - User defined continuous distribution. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than random, sampling can be chosen. Truncated limits can be specified on many distributions, whose usual definition has an infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included.
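The method of solution described above (uniform variates from a linear congruential generator, transformed to other distributions) can be sketched as follows; the LCG constants and the distribution subset are illustrative, not those of the actual BWIP program:

```python
import math

def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Minimal linear congruential generator yielding uniforms in [0, 1).
    Constants are the common Numerical-Recipes choice, assumed for illustration."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

def sample(dist, params, u):
    """Transform one uniform variate u to the requested distribution
    by inverse-transform sampling (a subset of the program's catalogue)."""
    if dist == 'uniform':          # on [lo, hi]
        lo, hi = params
        return lo + (hi - lo) * u
    if dist == 'loguniform':       # base 10, on [lo, hi]
        lo, hi = params
        return 10 ** (math.log10(lo) + (math.log10(hi) - math.log10(lo)) * u)
    if dist == 'exponential':      # rate lam; inverse CDF is -ln(1-u)/lam
        (lam,) = params
        return -math.log(1 - u) / lam
    raise ValueError(dist)
```

Stratified sampling would replace the raw uniforms with one uniform drawn from each of k equal-width strata of [0, 1).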

  6. DNA Qualification Workflow for Next Generation Sequencing of Histopathological Samples

    Science.gov (United States)

    Simbolo, Michele; Gottardi, Marisa; Corbo, Vincenzo; Fassan, Matteo; Mafficini, Andrea; Malpeli, Giorgio; Lawlor, Rita T.; Scarpa, Aldo

    2013-01-01

    Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instrumentations to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. NanoDrop UV-spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard workflow for

  8. Radiolytic and thermal generation of gases from Hanford grout samples

    Energy Technology Data Exchange (ETDEWEB)

    Meisel, D.; Jonah, C.D.; Kapoor, S.; Matheson, M.S.; Mulac, W.A.

    1993-10-01

Gamma irradiation of WHC-supplied samples of grouted Tank 102-AP simulated nonradioactive waste has been carried out at three dose rates: 0.25, 0.63, and 130 krad/hr. The low dose rate corresponds to that in the actual grout vaults; with the high dose rate, doses equivalent to more than 40 years in the grout vault were achieved. An average G(H₂) = 0.047 molecules/100 eV was found, independent of dose rate. The rate of H₂ production decreases above 80 Mrad. For other gases, G(N₂) = 0.12, G(O₂) = 0.026, G(N₂O) = 0.011 and G(CO) = 0.0042 at 130 krad/hr were determined. At lower dose rates, N₂ and O₂ could not be measured because of interference by trapped air. The value of G(H₂) is higher than expected, suggesting segregation of water from nitrate and nitrite salts in the grout. The total pressure generated by the radiolysis at 130 krad/hr has been independently measured, and total amounts of gases generated were calculated from this measurement. Good agreement between this measurement and the sum of all the gases that were independently determined was obtained. Therefore, the individual gas measurements account for most of the major components that are generated by the radiolysis. At 90 °C, H₂, N₂, and N₂O were generated at a rate that could be described by exponential formation of each of the gases. Gases measured at the lower temperatures were probably residual trapped gases. An as yet unknown product interfered with oxygen determinations at temperatures above ambient. The thermal results do not affect the radiolytic findings.

  9. Comparisons of methods for generating conditional Poisson samples and Sampford samples

    OpenAIRE

    Grafström, Anton

    2005-01-01

    Methods for conditional Poisson sampling (CP-sampling) and Sampford sampling are compared and the focus is on the efficiency of the methods. The efficiency is investigated by simulation in different sampling situations. It was of interest to compare methods since new methods for both CP-sampling and Sampford sampling were introduced by Bondesson, Traat & Lundqvist in 2004. The new methods are acceptance rejection methods that use the efficient Pareto sampling method. They are found to be ...

  10. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    Science.gov (United States)

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the two methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532

  11. Comparison of Antidepressant Efficacy-related SNPs Among Taiwanese and Four Populations in the HapMap Database

    Directory of Open Access Journals (Sweden)

    Mei-Hung Chi

    2011-07-01

Full Text Available The genetic influence of single nucleotide polymorphisms (SNPs) on antidepressant efficacy has been previously demonstrated. To evaluate whether there are ethnic differences, we compared the allele frequencies of antidepressant efficacy-related SNPs between the Taiwanese population and four other populations in the HapMap database. We recruited 198 Taiwanese major depression patients and 106 Taiwanese controls. A panel of possibly relevant SNPs (in the brain-derived neurotrophic factor, 5-hydroxytryptamine receptor 2A, interleukin 1 beta, and G-protein beta 3 subunit genes) was selected for comparisons of allele frequencies using the χ² test. Our results suggested no difference between Taiwanese patients and controls, but there were significant differences among Taiwanese controls and the other four ethnic groups in the brain-derived neurotrophic factor, 5-hydroxytryptamine receptor 2A, interleukin 1 beta and G-protein beta 3 subunit genes. We conclude that there are ethnic differences in the allele frequencies of antidepressant efficacy-related SNPs, and that the degree of variation is consistent with geographic distances. Further investigation is required to verify the attribution of genetic differences to ethnic-specific antidepressant responses.

  12. Generation of Rayleigh waves into mortar and concrete samples.

    Science.gov (United States)

    Piwakowski, B; Fnine, Abdelilah; Goueygou, M; Buyle-Bodin, F

    2004-04-01

The paper deals with a non-destructive method for characterizing the degraded cover of concrete structures using high-frequency ultrasound. In a preliminary study, the authors emphasized the value of using higher-frequency Rayleigh waves (within the 0.2-1 MHz band) for on-site inspection of concrete structures with subsurface damage. The present study is a continuation of that work and aims at optimizing the generation and reception of Rayleigh waves in mortar and concrete by means of wedge transducers. This is done experimentally by checking the influence of the wedge material and coupling agent on the surface wave parameters. The best wedge/coupling combination is selected by searching separately for the best wedge material and the best coupling agent. Three wedge materials and five coupling agents were tested. For each setup, five parameters obtained from the surface wave measurement, i.e., the frequency band, the maximum available central frequency, the group velocity error and its standard deviation, and the error in the velocity dispersion characteristic, were investigated and ranked as a function of the wedge material and the coupling agent. The selection criteria were chosen so as to minimize the absorption of both materials, the randomness of measurements, and the systematic errors of the group velocity and of the dispersion characteristic. Among the three tested wedge materials, Teflon was found to be the best. The investigation of coupling agents shows that gel-type materials are the best solution, while "thick" materials of higher viscosity were the worst. The results also show that the use of a thin plastic film combined with the coupling agent further increases the bandwidth and decreases the uncertainty of measurements.

  13. An automated synthesis-purification-sample-management platform for the accelerated generation of pharmaceutical candidates.

    Science.gov (United States)

    Sutherland, J David; Tu, Noah P; Nemcek, Thomas A; Searle, Philip A; Hochlowski, Jill E; Djuric, Stevan W; Pan, Jeffrey Y

    2014-04-01

    A flexible and integrated flow-chemistry-synthesis-purification compound-generation and sample-management platform has been developed to accelerate the production of small-molecule organic-compound drug candidates in pharmaceutical research. Central to the integrated system is a Mitsubishi robot, which hands off samples throughout the process to the next station, including synthesis and purification, sample dispensing for purity and quantification analysis, dry-down, and aliquot generation.

  14. Sampling and analysis plan for sampling of liquid waste streams generated by 222-S Laboratory Complex operations

    International Nuclear Information System (INIS)

    Benally, A.B.

    1997-01-01

    This Sampling and Analysis Plan (SAP) establishes the requirements and guidelines to be used by the Waste Management Federal Services of Hanford, Inc. personnel in characterizing liquid waste generated at the 222-S Laboratory Complex. The characterization process to verify the accuracy of process knowledge used for designation and subsequent management of wastes consists of three steps: to prepare the technical rationale and the appendix in accordance with the steps outlined in this SAP; to implement the SAP by sampling and analyzing the requested waste streams; and to compile the report and evaluate the findings to the objectives of this SAP. This SAP applies to portions of the 222-S Laboratory Complex defined as Generator under the Resource Conservation and Recovery Act (RCRA). Any portion of the 222-S Laboratory Complex that is defined or permitted under RCRA as a treatment, storage, or disposal (TSD) facility is excluded from this document. This SAP applies to the liquid waste generated in the 222-S Laboratory Complex. Because the analytical data obtained will be used to manage waste properly, including waste compatibility and waste designation, this SAP will provide directions for obtaining and maintaining the information as required by WAC173-303

  15. Experimental technique to measure thoron generation rate of building material samples using RAD7 detector

    International Nuclear Information System (INIS)

    Csige, I.; Szabó, Zs.; Szabó, Cs.

    2013-01-01

Thoron (²²⁰Rn) is the second most abundant radon isotope in our living environment. In some dwellings it is present in significant amounts, which calls for its identification and remediation. Indoor thoron originates mainly from building materials. In this work we have developed and tested an experimental technique to measure the thoron generation rate in building material samples using a RAD7 radon-thoron detector. The mathematical model of the measurement technique provides the thoron concentration response of the RAD7 as a function of the sample thickness. For experimental validation of the technique, an adobe building material sample was selected and the thoron concentration was measured at nineteen different sample thicknesses. By fitting the parameters of the model to the measurement results, both the generation rate and the diffusion length of thoron were estimated. We have also determined the optimal sample thickness for estimating the thoron generation rate from a single measurement. -- Highlights: • RAD7 is used for the determination of thoron generation rate (emanation). • The described model takes into account thoron decay and attenuation. • The model describes the experimental results well. • A single-point measurement method is offered at a determined sample thickness
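The parameter fit described above can be illustrated under an assumed slab-diffusion saturation response, C(d) = G·L·tanh(d/L), where G is proportional to the generation rate and L is the diffusion length. The functional form is an assumption for illustration, not necessarily the authors' exact model:

```python
import numpy as np

def fit_thoron_response(d, c, L_grid):
    """Fit C(d) = G * L * tanh(d / L) to concentrations c measured at
    sample thicknesses d. For each candidate diffusion length L, the
    generation-rate factor G has a closed-form least-squares solution;
    a grid search over L_grid picks the best pair."""
    best = None
    for L in L_grid:
        basis = L * np.tanh(d / L)
        G = (basis @ c) / (basis @ basis)        # linear least squares for fixed L
        sse = np.sum((c - G * basis) ** 2)
        if best is None or sse < best[2]:
            best = (G, L, sse)
    return best[:2]                               # (G, L)
```

With both parameters in hand, the saturation thickness (d several times L) indicates where a single-point measurement suffices.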

  16. Fault Sample Generation for Virtual Testability Demonstration Test Subject to Minimal Maintenance and Scheduled Replacement

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2015-01-01

Full Text Available Virtual testability demonstration testing brings new requirements to fault sample generation. First, the fault occurrence process is described by stochastic process theory; it is shown that a fault occurrence process subject to minimal repair is a nonhomogeneous Poisson process (NHPP). Second, the interarrival time distribution function of the next fault event is proposed and three typical kinds of parameterized NHPP are discussed. Third, the procedure of fault sample generation is put forward under the assumptions of minimal maintenance and scheduled replacement. The fault modes and their occurrence times subject to specified conditions and time periods can be obtained. Finally, an antenna driving subsystem in an automatic pointing and tracking platform is taken as a case to illustrate the proposed method. Results indicate that both the size and structure of the fault samples generated by the proposed method are reasonable and effective. The proposed method can be applied well to virtual testability demonstration testing.
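Fault times of an NHPP under minimal repair can be generated by the standard thinning (acceptance-rejection) algorithm. This is a generic sketch, not the paper's exact procedure; a power-law intensity is used in the usage note as one typical parameterized NHPP family:

```python
import math
import random

def nhpp_fault_times(intensity, lam_max, horizon, rng=None):
    """Generate fault occurrence times on [0, horizon] for a
    nonhomogeneous Poisson process by thinning: propose candidates from
    a homogeneous process of rate lam_max (an upper bound on intensity),
    then accept each with probability intensity(t) / lam_max. Minimal
    repair leaves the intensity unchanged after each fault."""
    rng = rng or random.Random(0)
    t, faults = 0.0, []
    while True:
        t += rng.expovariate(lam_max)          # next candidate event time
        if t > horizon:
            return faults
        if rng.random() < intensity(t) / lam_max:
            faults.append(t)                   # accepted fault time
```

For a power-law intensity λ(t) = a·b·t^(b−1), the expected number of faults on [0, T] is a·T^b, which gives a quick sanity check on generated sample sizes.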

  17. Determining fertility in a bovine subject comprises detecting in a sample from the bovine subject the presence or absence of genetic marker alleles associated with a trait indicative of fertility of the bovine subject and/or off-spring

    DEFF Research Database (Denmark)

    2009-01-01

NOVELTY - Determining fertility in a bovine subject comprises detecting in a sample from the bovine subject the presence or absence of two or more genetic marker alleles that are associated with a trait indicative of fertility of the bovine subject and/or offspring. USE - The methods are useful for determining fertility in a bovine subject and for selecting bovine subjects for breeding purposes (all claimed). DETAILED DESCRIPTION - Determining fertility in a bovine subject comprises detecting in a sample from the bovine subject the presence or absence of two or more genetic marker alleles that are associated with a trait indicative of fertility of the bovine subject and/or offspring, where the two or more genetic marker alleles are single nucleotide polymorphisms selected from Hapmap60827-rs29019866, ARS-BFGL-NGS-40979, Hapmap47854-BTA-119090, ARS-BFGL-NGS-114679, Hapmap43841-BTA-34601, Hapmap43407...

  18. The determination of arsenic, selenium, antimony, and tin in complex environmental samples by hydride generation AAS

    International Nuclear Information System (INIS)

    Johnson, D.; Beach, C.

    1990-01-01

    Hydride generation techniques are used routinely for the determination of As, Se, Sb and Sn in water samples. Advantages include high sensitivity, simplicity, and relative freedom from interferences. Continuous-flow designs greatly reduce analysis time as well as improve precision and allow for automation. However the accurate analysis of more complex environmental samples such as industrial sludges, soil samples, river sediments, and fly ash remains difficult. Numerous contributing factors influence the accuracy of the hydride technique. Sample digestion methods and sample preparation procedures are of critical importance. The digestion must adequately solubilize the elements of interest without loss by volatilization. Sample preparation procedures that guarantee the proper analyte oxidation state and eliminate the nitric acid and inter-element interferences are needed. In this study, difficult environmental samples were analyzed for As, Se, Sb, and Sn by continuous flow hydride generation. Sample preparation methods were optimized to eliminate interferences. The results of spike recovery studies will be presented. Data from the analysis of the same samples by graphite furnace AAS will be presented for comparison of accuracy, precision, and analysis time

  19. Data Transformation Functions for Expanded Search Spaces in Geographic Sample Supervised Segment Generation

    OpenAIRE

    Christoff Fourie; Elisabeth Schoepfer

    2014-01-01

    Sample supervised image analysis, in particular sample supervised segment generation, shows promise as a methodological avenue applicable within Geographic Object-Based Image Analysis (GEOBIA). Segmentation is acknowledged as a constituent component within typically expansive image analysis processes. A general extension to the basic formulation of an empirical discrepancy measure directed segmentation algorithm parameter tuning approach is proposed. An expanded search landscape is defined, c...

  20. Stuttering Attitudes among Turkish Family Generations and Neighbors from Representative Samples

    Science.gov (United States)

    Ozdemir, R. Sertan; St. Louis, Kenneth O.; Topbas, Seyhun

    2011-01-01

    Purpose: Attitudes toward stuttering, measured by the "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S"), are compared among (a) two different representative samples; (b) family generations (children, parents, and either grandparents or uncles and aunts) and neighbors; (c) children, parents, grandparents/adult…

  1. A generational perspective on work values in a South African sample

    Directory of Open Access Journals (Sweden)

    Petronella Jonck

    2017-01-01

    Full Text Available Orientation: In order to ensure harmonious relationships in the workplace, work values of different generational cohorts need to be investigated and understood. Research purpose: The purpose of this study was to investigate the work values of a South African sample from a generational perspective, in order to foster an understanding of the similarities and differences of different generational cohorts in terms of work values. Motivation of the study: Understanding the work values of different generational cohorts could assist organisations to manage and retain human capital in an increasingly competitive environment. Furthermore, it could assist organisations to develop an advanced understanding of employee behaviour, which should inform conflict-resolution strategies to deal with reported conflict between different generational cohorts. Research design, approach and method: The study was conducted within the positivist paradigm and was quantitative in nature. Data were gathered from 301 employees representing three different generational cohorts, namely the Baby Boomers, Generation X and Generation Y. A cross-sectional study was conducted, and data were collected once off by means of the Values Scale. The psychometric properties of the Values Scale have a reliability coefficient of 0.95, and the scale has been applied successfully in various iterations. Main findings: The findings indicate statistically significant differences and similarities between the various generational cohorts in terms of work values. More specifically, similarities and differences between the various generational cohorts were observed with regard to the values of authority, creativity, risk and social interaction in the work context. Practical/managerial implications: Organisations can use the findings of the study to strengthen employee interaction within the work environment. 
In addition, the findings can be used to inform retention and management strategies, in order

  2. Results Of Analytical Sample Crosschecks For Next Generation Solvent Extraction Samples Isopar L Concentration And pH

    International Nuclear Information System (INIS)

    Peters, T.; Fink, S.

    2011-01-01

As part of the implementation process for the Next Generation Cesium Extraction Solvent (NGCS), SRNL and F/H Lab performed a series of analytical cross-checks to ensure that the components in the NGCS solvent system do not constitute an undue analytical challenge. For measurement of entrained Isopar® L in aqueous solutions, both labs performed similarly, with results more reliable at higher concentrations (near 50 mg/L). Low bias occurred in both labs, as seen previously in comparable blind studies for the baseline solvent system. SRNL recommends considering the use of Teflon™ caps on all sample containers used for this purpose. For pH measurements, the labs showed reasonable agreement but considerable positive bias for dilute boric acid solutions. SRNL recommends considering an alternate analytical method for qualification of boric acid concentrations.

  3. Evaluation of sampling plans for in-service inspection of steam generator tubes

    International Nuclear Information System (INIS)

    Kurtz, R.J.; Heasler, P.G.; Baird, D.B.

    1994-02-01

    This report summarizes the results of three previous studies to evaluate and compare the effectiveness of sampling plans for steam generator tube inspections. An analytical evaluation and Monte Carlo simulation techniques were the methods used to evaluate sampling plan performance. To test the performance of candidate sampling plans under a variety of conditions, ranges of inspection system reliability were considered along with different distributions of tube degradation. Results from the eddy current reliability studies performed with the retired-from-service Surry 2A steam generator were utilized to guide the selection of appropriate probability of detection and flaw sizing models for use in the analysis. Different distributions of tube degradation were selected to span the range of conditions that might exist in operating steam generators. The principal means of evaluating sampling performance was to determine the effectiveness of the sampling plan for detecting and plugging defective tubes. A summary of key results from the eddy current reliability studies is presented. The analytical and Monte Carlo simulation analyses are discussed along with a synopsis of key results and conclusions
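The Monte Carlo evaluation of sampling plan effectiveness can be sketched as follows. The single-stage random plan and the parameter values are illustrative assumptions; real inspection plans typically expand the sample when defects are found, which this sketch omits:

```python
import random

def plugging_effectiveness(n_tubes, defect_rate, sample_frac, pod,
                           n_trials=2000, seed=7):
    """Monte Carlo estimate of the fraction of defective steam generator
    tubes found (and plugged) under a random sampling plan with an
    imperfect eddy current probability of detection (POD)."""
    rng = random.Random(seed)
    found, defective = 0, 0
    for _ in range(n_trials):
        # Degrade tubes at random, then inspect a random fraction of them.
        defects = [i for i in range(n_tubes) if rng.random() < defect_rate]
        inspected = set(rng.sample(range(n_tubes), int(sample_frac * n_tubes)))
        defective += len(defects)
        found += sum(1 for i in defects
                     if i in inspected and rng.random() < pod)
    return found / max(defective, 1)
```

With defects scattered independently of the sample, the effectiveness converges to sample_frac × POD; clustered degradation distributions or expansion rules would change that picture, which is exactly what the simulations in the report probe.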

  4. Nitrogen Detection in Bulk Samples Using a D-D Reaction-Based Portable Neutron Generator

    Directory of Open Access Journals (Sweden)

    A. A. Naqvi

    2013-01-01

Full Text Available Nitrogen concentration was measured via the 2.52 MeV nitrogen gamma ray from melamine, caffeine, urea, and disperse orange bulk samples using a newly designed D-D portable neutron generator-based prompt gamma ray setup. In spite of the low flux of thermal neutrons produced by the D-D reaction-based portable neutron generator, and the interference of the 2.52 MeV gamma rays from nitrogen in the bulk samples with the 2.50 MeV gamma ray from bismuth in the BGO detector material, an excellent agreement between the experimental and calculated yields of nitrogen gamma rays indicates satisfactory performance of the setup for detection of nitrogen in bulk samples.

  5. Inference for multivariate regression model based on multiply imputed synthetic data generated via posterior predictive sampling

    Science.gov (United States)

    Moura, Ricardo; Sinha, Bimal; Coelho, Carlos A.

    2017-06-01

The recent popularity of synthetic data as a Statistical Disclosure Control technique has enabled the development of several methods for generating and analyzing such data, but these almost always rely on asymptotic distributions and are consequently inadequate for small-sample datasets. Thus, a likelihood-based exact inference procedure is derived for the matrix of regression coefficients of the multivariate regression model, for multiply imputed synthetic data generated via Posterior Predictive Sampling. Since it is based on exact distributions, this procedure may be used even for small-sample datasets. Simulation studies compare the results obtained from the proposed exact inferential procedure with the results obtained from an adaptation of Reiter's combination rule to multiply imputed synthetic datasets, and an application to the 2000 Current Population Survey is discussed.
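Posterior Predictive Sampling of synthetic data can be sketched for the univariate special case of the regression model, under the standard noninformative prior p(β, σ²) ∝ 1/σ². This is an illustrative reduction of the paper's multivariate setting, not its inference procedure:

```python
import numpy as np

def synthetic_datasets(X, y, m=5, rng=None):
    """Generate m synthetic copies of y via Posterior Predictive Sampling
    for a normal linear regression: draw (sigma^2, beta) from the
    posterior, then draw synthetic responses from the sampled model."""
    if rng is None:
        rng = np.random.default_rng(0)
    n, p = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_hat = XtX_inv @ X.T @ y
    s2 = np.sum((y - X @ beta_hat) ** 2) / (n - p)       # residual variance
    out = []
    for _ in range(m):
        sigma2 = (n - p) * s2 / rng.chisquare(n - p)      # scaled inverse-chi2 draw
        beta = rng.multivariate_normal(beta_hat, sigma2 * XtX_inv)
        out.append(X @ beta + rng.normal(0, np.sqrt(sigma2), n))
    return out
```

The m synthetic copies play the role of the multiply imputed datasets to which a combination rule (or the paper's exact procedure) is then applied.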

  6. Efficient Sampling of the Structure of Crypto Generators' State Transition Graphs

    Science.gov (United States)

    Keller, Jörg

Cryptographic generators, e.g. stream cipher generators like the A5/1 used in GSM networks or pseudo-random number generators, are widely used in cryptographic network protocols. Basically, they are finite state machines with deterministic transition functions. Their state transition graphs typically cannot be analyzed analytically, nor can they be explored completely because of their size, which typically is at least n = 2⁶⁴. Yet, their structure, i.e. the number and sizes of weakly connected components, is of interest because a structure deviating significantly from expected values for random graphs may enable a distinguishing attack that indicates a weakness or backdoor. In sampling, one randomly chooses k nodes, derives their distribution onto connected components by graph exploration, and extrapolates these results to the complete graph. In known algorithms, the computational cost to determine the component for one randomly chosen node is up to O(√n), which severely restricts the sample size k. We present an algorithm where the computational cost to find the connected component for one randomly chosen node is O(1), so that a much larger sample size k can be analyzed in a given time. We report on the performance of a prototype implementation and on preliminary analyses of several generators.
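For these deterministic transition functions, exploring the component containing a sampled node amounts to finding the tail and cycle lengths of its trajectory; nodes reaching the same cycle belong to the same component. The classic O(1)-memory Brent cycle-finding step is sketched below (the paper's O(1)-time-per-node improvement itself is not reproduced):

```python
def rho_length(f, x0):
    """Tail length mu and cycle length lam of the trajectory
    x0, f(x0), f(f(x0)), ... in a deterministic state transition graph,
    via Brent's algorithm. The cycle can serve as a canonical component
    label when extrapolating component sizes from sampled nodes."""
    power = lam = 1
    tortoise, hare = x0, f(x0)
    while tortoise != hare:            # find the cycle length lam
        if power == lam:               # start a new power-of-two window
            tortoise, power, lam = hare, power * 2, 0
        hare = f(hare)
        lam += 1
    tortoise = hare = x0
    for _ in range(lam):               # advance hare lam steps ahead
        hare = f(hare)
    mu = 0
    while tortoise != hare:            # walk together to the cycle entry
        tortoise, hare = f(tortoise), f(hare)
        mu += 1
    return mu, lam
```

Sampling then reduces to calling this on k random states and aggregating which cycles (components) they fall into.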

  7. A Frequency Matching Method for Generation of a Priori Sample Models from Training Images

    DEFF Research Database (Denmark)

    Lange, Katrine; Cordua, Knud Skou; Frydendall, Jan

    2011-01-01

    This paper presents a Frequency Matching Method (FMM) for generation of a priori sample models based on training images and illustrates its use by an example. In geostatistics, training images are used to represent a priori knowledge or expectations of models, and the FMM can be used to generate new images that share the same multi-point statistics as a given training image. The FMM proceeds by iteratively updating voxel values of an image until the frequency of patterns in the image matches the frequency of patterns in the training image, making the resulting image statistically indistinguishable from the training image.
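    As a rough 1-D caricature of the FMM idea (not the authors' implementation; all names are invented for illustration), the sketch below greedily flips values of a random binary image, keeping a flip only if it moves the image's histogram of length-3 patterns closer to that of a training image:

```python
import random
from collections import Counter

def pattern_hist(img, w=3):
    """Histogram of all length-w patterns in a 1-D image."""
    return Counter(tuple(img[i:i + w]) for i in range(len(img) - w + 1))

def hist_dist(h1, h2):
    """Squared distance between two pattern histograms."""
    keys = set(h1) | set(h2)
    return sum((h1[k] - h2[k]) ** 2 for k in keys)

def frequency_match(training, n, w=3, iters=2000, seed=0):
    """Greedily update a random binary image until its pattern frequencies
    approach those of the training image."""
    rng = random.Random(seed)
    target = pattern_hist(training, w)
    img = [rng.randint(0, 1) for _ in range(n)]
    d = hist_dist(pattern_hist(img, w), target)
    for _ in range(iters):
        i = rng.randrange(n)
        img[i] ^= 1  # propose flipping one voxel
        d_new = hist_dist(pattern_hist(img, w), target)
        if d_new <= d:
            d = d_new       # accept: histograms got closer
        else:
            img[i] ^= 1     # reject: revert the flip
    return img, d
```

The actual FMM works on 2-D/3-D voxel grids and multi-point statistics, but the accept-if-closer update loop is the same basic mechanism.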

  8. Simultaneous analysis of arsenic, antimony, selenium and tellurium in environmental samples using hydride generation ICPMS

    International Nuclear Information System (INIS)

    Jankowski, L.M.; Breidenbach, R.; Bakker, I.J.I.; Epema, O.J.

    2009-01-01

    Full text: A quantitative method for simultaneous analysis of arsenic, antimony, selenium and tellurium in environmental samples is being developed using hydride generation ICPMS. These elements must first be transformed into hydride-forming oxidation states. This is particularly challenging for selenium and antimony, because selenium is susceptible to reduction to the non-hydride-forming elemental state and antimony requires strong reducing conditions. The effectiveness of three reducing agents (KI, thiourea, cysteine) is studied. A comparison is made between addition of the reducing agent to the sample and addition of KI to the NaBH4 solution. Best results were obtained with the latter approach. (author)

  9. SNP calling, genotype calling, and sample allele frequency estimation from new-generation sequencing data

    DEFF Research Database (Denmark)

    Nielsen, Rasmus; Korneliussen, Thorfinn Sand; Albrechtsen, Anders

    2012-01-01

    We present a statistical framework for estimation and application of sample allele frequency spectra from New-Generation Sequencing (NGS) data. In this method, we first estimate the allele frequency spectrum using maximum likelihood. In contrast to previous methods, the likelihood function is cal...... be extended to various other cases including cases with deviations from Hardy-Weinberg equilibrium. We evaluate the statistical properties of the methods using simulations and by application to a real data set....
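    The kind of likelihood-based estimation the framework describes can be illustrated with a minimal EM sketch: instead of calling genotypes, each individual contributes genotype likelihoods P(reads | genotype), and the site's allele frequency is estimated by maximum likelihood under Hardy-Weinberg proportions. This is a simplified single-site caricature with invented function names, not the authors' software:

```python
import numpy as np

def estimate_allele_freq(geno_liks, iters=50):
    """ML estimate of a site's alternative allele frequency from
    per-individual genotype likelihoods (columns: 0, 1, 2 copies),
    via EM under Hardy-Weinberg equilibrium."""
    L = np.asarray(geno_liks, dtype=float)  # shape (n_individuals, 3)
    f = 0.2                                 # starting frequency
    for _ in range(iters):
        # genotype priors under HWE at current frequency
        priors = np.array([(1 - f) ** 2, 2 * f * (1 - f), f ** 2])
        post = L * priors
        post /= post.sum(axis=1, keepdims=True)  # posterior over genotypes
        # update: expected allele count over 2n chromosomes
        f = (post @ np.array([0.0, 1.0, 2.0])).sum() / (2 * len(L))
    return f
```

Uncertain reads simply spread each individual's posterior across genotypes, which is exactly why working from likelihoods outperforms hard genotype calls at low depth.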

  10. Towards a Mobile Ecogenomic sensor: the Third Generation Environmental Sample Processor (3G-ESP).

    Science.gov (United States)

    Birch, J. M.; Pargett, D.; Jensen, S.; Roman, B.; Preston, C. M.; Ussler, W.; Yamahara, K.; Marin, R., III; Hobson, B.; Zhang, Y.; Ryan, J. P.; Scholin, C. A.

    2016-02-01

    Researchers are increasingly using one or more autonomous platforms to characterize ocean processes that change in both space and time. Conceptually, studying processes that change quickly both spatially and temporally seems relatively straightforward: one needs to sample in many locations synoptically over time, or follow a coherent water mass and sample it repeatedly. However, implementing either approach presents many challenges. For example, acquiring samples over days to weeks far from shore, without human intervention, requires multiple systems to work together seamlessly, and the level of autonomy, navigation and communications needed to conduct the work exposes the complexity of these requirements. We are addressing these challenges by developing a new generation of robotic systems primarily aimed at studies of microbially mediated processes. As a step towards realizing this new capability, we have taken lessons learned from our second-generation Environmental Sample Processor (2G-ESP), a robotic microbiology "lab-in-a-can", and have re-engineered the system for use on a Tethys-class Long Range AUV (LRAUV). The new instrument is called the third-generation ESP (3G-ESP), and its integration with the LRAUV provides mobility and a persistent presence not seen before in microbial oceanography. The 3G-ESP autonomously filters a water sample and then either preserves that material for eventual return to a laboratory or processes the sample in real time for downstream molecular analyses. The 3G-ESP modularizes the hardware needed for the collection and preparation of a sample, separating it from subsequent molecular analyses through the use of self-contained "cartridges". Cartridges currently come in two forms: one for the preservation of a sample, and the other for onboard homogenization and handoff for downstream processing via one or more analytical devices. The 3G-ESP is designed as a stand-alone instrument, and thus could be deployed on a variety of

  11. Foam generation and sample composition optimization for the FOAM-C experiment of the ISS

    International Nuclear Information System (INIS)

    Carpy, R; Picker, G; Amann, B; Ranebo, H; Vincent-Bonnieu, S; Minster, O; Winter, J; Dettmann, J; Castiglione, L; Höhler, R; Langevin, D

    2011-01-01

    End of 2009 and early 2010, a sealed cell for foam generation and observation was designed and manufactured at the Astrium Friedrichshafen facilities. With the use of this cell, different sample compositions of 'wet foams' have been optimized for mixtures of chemicals such as water, dodecanol, pluronic, aethoxisclerol, glycerol, CTAB, SDS, as well as glass beads. This development is performed in the frame of the breadboarding development activities of the Experiment Container FOAM-C for operation in the ISS Fluid Science Laboratory. The sample cell supports multiple observation methods: Diffusing-Wave and Diffuse Transmission Spectrometry, Time Resolved Correlation Spectroscopy and microscope observation; all of these methods are applied in the cell within a relatively small experiment volume. The cells will be on-orbit replaceable units, allowing multiple sample compositions to be processed (in the range of >40).

  12. Evaluating multiplexed next-generation sequencing as a method in palynology for mixed pollen samples.

    Science.gov (United States)

    Keller, A; Danner, N; Grimmer, G; Ankenbrand, M; von der Ohe, K; von der Ohe, W; Rost, S; Härtel, S; Steffan-Dewenter, I

    2015-03-01

    The identification of pollen plays an important role in ecology, palaeo-climatology, honey quality control and other areas. Currently, expert knowledge and reference collections are essential to identify pollen origin through light microscopy. Pollen identification through molecular sequencing and DNA barcoding has been proposed as an alternative approach, but the assessment of mixed pollen samples originating from multiple plant species is still a tedious and error-prone task. Next-generation sequencing has been proposed to avoid this hindrance. In this study we assessed mixed pollen samples through next-generation sequencing of amplicons from the highly variable, species-specific internal transcribed spacer 2 region of nuclear ribosomal DNA. Further, we developed a bioinformatic workflow to analyse these high-throughput data with a newly created reference database. To evaluate the feasibility, we compared results from classical identification based on light microscopy from the same samples with our sequencing results. We assessed in total 16 mixed pollen samples, of which 14 originated from honeybee colonies and two from solitary bee nests. The sequencing technique resulted in higher taxon richness (deeper assignments and more identified taxa) compared to light microscopy. Abundance estimations from sequencing data were significantly correlated with counted abundances through light microscopy. Simulation analyses of taxon specificity and sensitivity indicate that 96% of taxa present in the database are correctly identifiable at the genus level and 70% at the species level. Next-generation sequencing thus presents a useful and efficient workflow to identify pollen at the genus and species level without requiring specialised palynological expert knowledge. © 2014 German Botanical Society and The Royal Botanical Society of the Netherlands.

  13. Foam generation and sample composition optimization for the FOAM-C experiment of the ISS

    Science.gov (United States)

    Carpy, R.; Picker, G.; Amann, B.; Ranebo, H.; Vincent-Bonnieu, S.; Minster, O.; Winter, J.; Dettmann, J.; Castiglione, L.; Höhler, R.; Langevin, D.

    2011-12-01

    End of 2009 and early 2010, a sealed cell for foam generation and observation was designed and manufactured at the Astrium Friedrichshafen facilities. With the use of this cell, different sample compositions of "wet foams" have been optimized for mixtures of chemicals such as water, dodecanol, pluronic, aethoxisclerol, glycerol, CTAB, SDS, as well as glass beads. This development is performed in the frame of the breadboarding development activities of the Experiment Container FOAM-C for operation in the ISS Fluid Science Laboratory. The sample cell supports multiple observation methods: Diffusing-Wave and Diffuse Transmission Spectrometry, Time Resolved Correlation Spectroscopy [1] and microscope observation; all of these methods are applied in the cell within a relatively small experiment volume. The cells will be on-orbit replaceable units, allowing multiple sample compositions to be processed (in the range of >40).

  14. Project and construction of a pneumatic system for the transference of samples to a neutron generator

    International Nuclear Information System (INIS)

    Carvalho, A.N. de

    1983-01-01

    A prototype system for transporting irradiated samples to and from a neutron generator was constructed, using compressed air as the propelling agent. Compressed air was injected through electrically driven valves. The sample, transported by the pressure wave, was carried in a PVC support 50 mm long and weighing 23.0 g. The first tests were carried out to determine the times needed to transport this PVC support along a PVC tube of 3 m length and 3/4 in diameter at different applied air pressures; it was verified that for pressures between 3.0 and 8.0 kgf/cm², transport times were always smaller than 2 seconds. These results showed the viability of constructing the definitive system, already designed. (C.L.B.) [pt

  15. Molecular typing of lung adenocarcinoma on cytological samples using a multigene next generation sequencing panel.

    Directory of Open Access Journals (Sweden)

    Aldo Scarpa

    Full Text Available Identification of driver mutations in lung adenocarcinoma has led to development of targeted agents that are already approved for clinical use or are in clinical trials. Therefore, the number of biomarkers that will need to be assessed is expected to rapidly increase. This calls for the implementation of methods probing the mutational status of multiple genes for inoperable cases, for which limited cytological or bioptic material is available. Cytology specimens from 38 lung adenocarcinomas were subjected to the simultaneous assessment of 504 mutational hotspots of 22 lung cancer-associated genes using 10 nanograms of DNA and Ion Torrent PGM next-generation sequencing. Thirty-six cases were successfully sequenced (95%). In 24/36 cases (67%), at least one mutated gene was observed, including EGFR, KRAS, PIK3CA, BRAF, TP53, PTEN, MET, SMAD4, FGFR3, STK11, MAP2K1. EGFR and KRAS mutations, found respectively in 6/36 (16%) and 10/36 (28%) cases, were mutually exclusive. Nine samples (25%) showed concurrent alterations in different genes. The next-generation sequencing test used is superior to current standard methodologies, as it interrogates multiple genes and requires limited amounts of DNA. Its applicability to routine cytology samples might allow a significant increase in the fraction of lung cancer patients eligible for personalized therapy.

  16. Quantitative second-harmonic generation imaging to detect osteogenesis imperfecta in human skin samples

    Science.gov (United States)

    Adur, J.; Ferreira, A. E.; D'Souza-Li, L.; Pelegati, V. B.; de Thomaz, A. A.; Almeida, D. B.; Baratti, M. O.; Carvalho, H. F.; Cesar, C. L.

    2012-03-01

    Osteogenesis Imperfecta (OI) is a genetic disorder that leads to bone fractures due to mutations in the Col1A1 or Col1A2 genes. These mutations affect the primary structure of the collagen I chain, with the ultimate outcome that collagen I fibrils throughout the body are either reduced in quantity or abnormally organized. A quick screening test of patients would greatly reduce the number of samples to be studied by time-consuming molecular genetics techniques. For this reason, an assessment of human skin collagen structure by Second Harmonic Generation (SHG) can be used as a screening technique to accelerate understanding of the genetics/phenotype/OI-type correlation. In the present work we have used quantitative second harmonic generation (SHG) imaging microscopy to investigate the collagen matrix organization of OI human skin samples in comparison with normal control patients. By comparing fibrillar collagen distribution and spatial organization, we calculated the anisotropy and texture patterns of this structural protein. The anisotropy analysis was performed by means of the two-dimensional Discrete Fourier Transform, and image pattern analysis with the Gray-Level Co-occurrence Matrix (GLCM). From these results, we show that statistically different results are obtained for the normal and disease states of OI.

  17. 1-Hydroxypyrene Levels in Blood Samples of Rats After Exposure to Generator Fumes

    Science.gov (United States)

    Ifegwu, Clinton; Igwo-Ezikpe, Miriam N.; Anyakora, Chimezie; Osuntoki, Akinniyi; Oseni, Kafayat A.; Alao, Eragbae O.

    2013-01-01

    Polynuclear Aromatic Hydrocarbons (PAHs) are a major component of fuel generator fumes. The carcinogenicity of these compounds has long been established. In this study, 37 Swiss albino rats were exposed to generator fumes at varied distances for 8 hours per day for a period of 42 days, and the level of 1-hydroxypyrene in their blood was evaluated. This study also tried to correlate the blood 1-hydroxypyrene level with the distance from the source of pollution. Plasma was collected by centrifuging the whole blood sample, followed by complete hydrolysis of the conjugated 1-hydroxypyrene glucuronide to yield the analyte of interest, 1-hydroxypyrene, which was achieved using beta glucuronidase. High performance liquid chromatography (HPLC) with UV detection was used to determine the 1-hydroxypyrene concentrations in the blood samples. The mobile phase was water:methanol (12:88 v/v) run isocratically at a flow rate of 1.2 mL/min with a C18 stationary phase, with detection at 254 nm. After 42 days of exposure, the blood concentration of 1-hydroxypyrene ranged from 34 μg/mL to 26.29 μg/mL depending on the distance from the source of exposure. The control group had no 1-hydroxypyrene in their blood. After the period of exposure, the percentage of deaths correlated with the distance from the source of exposure, ranging from 56% to zero depending on the proximity to the source of pollution. PMID:24179393

  18. Development of graph self-generating program of radiation sampling for geophysical prospecting with AutoLISP

    International Nuclear Information System (INIS)

    Zhou Hongsheng

    2009-01-01

    A program for self-generating graphs of radiation sampling for geophysical prospecting was developed with AutoLISP. The program, developed entirely by the author, automatically generates and annotates sampling graphs. It has greatly increased drawing efficiency and avoids the graph errors that arise from manual drawing. (authors)

  19. FRAGSION: ultra-fast protein fragment library generation by IOHMM sampling.

    Science.gov (United States)

    Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin

    2016-07-01

    Speed, accuracy and robustness of building a protein fragment library have important implications in de novo protein structure prediction, since fragment-based methods are one of the most successful approaches in template-free modeling (FM). The majority of existing fragment detection methods rely on database-driven search strategies to identify candidate fragments, which are inherently time-consuming and often hinder the possibility of locating longer fragments due to the limited sizes of databases. Also, it is difficult to alleviate the effect of noisy sequence-based predicted features, such as secondary structures, on fragment quality. Here, we present FRAGSION, a database-free method to efficiently generate a protein fragment library by sampling from an Input-Output Hidden Markov Model. FRAGSION offers some unique features compared to existing approaches in that it (i) is lightning-fast, consuming only a few seconds of CPU time to generate a fragment library for a protein of typical length (300 residues); (ii) can generate dynamic-size fragments of any length (even for the whole protein sequence); and (iii) offers ways to handle noise in predicted secondary structure during fragment sampling. On a FM dataset from the most recent Critical Assessment of Structure Prediction, we demonstrate that FRAGSION provides advantages over the state-of-the-art fragment picking protocol of the ROSETTA suite by speeding up computation by several orders of magnitude while achieving comparable performance in fragment quality. Source code and executable versions of FRAGSION for Linux and MacOS are freely available to non-commercial users at http://sysbio.rnet.missouri.edu/FRAGSION/. It is bundled with a manual and example data. chengji@missouri.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  20. Detecting representative data and generating synthetic samples to improve learning accuracy with imbalanced data sets.

    Directory of Open Access Journals (Sweden)

    Der-Chiang Li

    Full Text Available It is difficult for learning models to achieve high classification performances with imbalanced data sets, because when one of the classes is much larger than the others, most machine learning and data mining classifiers are overly influenced by the larger classes and ignore the smaller ones. As a result, the classification algorithms often have poor learning performances due to slow convergence in the smaller classes. To balance such data sets, this paper presents a strategy that involves reducing the sizes of the majority data and generating synthetic samples for the minority data. In the reducing operation, we use the box-and-whisker plot approach to exclude outliers and the Mega-Trend-Diffusion method to find representative data from the majority data. To generate the synthetic samples, we propose a counterintuitive hypothesis to find the distributed shape of the minority data, and then produce samples according to this distribution. Four real datasets were used to examine the performance of the proposed approach. We used paired t-tests to compare the Accuracy, G-mean, and F-measure scores of the proposed data pre-processing (PPDP) method merged with the D3C method (PPDP+D3C) with those of one-sided selection (OSS), the well-known SMOTEBoost (SB) study, the normal distribution-based oversampling (NDO) approach, and the proposed data pre-processing (PPDP) method alone. The results indicate that the classification performance of the proposed approach is better than that of the above-mentioned methods.
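    The two balancing operations described above can be sketched in a few lines: box-and-whisker (IQR) fences to prune majority-class outliers, and sampling from a distribution fitted to the minority class to create synthetic points. The multivariate-normal fit here merely stands in for the paper's distribution-shape hypothesis, and the function names are invented for illustration:

```python
import numpy as np

def reduce_majority(X):
    """Drop rows falling outside the box-and-whisker fences
    (Q1 - 1.5*IQR, Q3 + 1.5*IQR) in any feature."""
    q1, q3 = np.percentile(X, [25, 75], axis=0)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    mask = np.all((X >= lo) & (X <= hi), axis=1)
    return X[mask]

def oversample_minority(X, n_new, rng=None):
    """Generate synthetic minority samples from a normal distribution
    fitted to the minority class (illustrative stand-in for the paper's
    distribution-shape hypothesis)."""
    rng = np.random.default_rng(rng)
    mean = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-9 * np.eye(X.shape[1])
    return rng.multivariate_normal(mean, cov, size=n_new)
```

Applying `reduce_majority` to the large class and `oversample_minority` to the small one yields a more balanced training set for a downstream classifier.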

  1. Generator and Setup for Emulating Exposures of Biological Samples to Lightning Strokes.

    Science.gov (United States)

    Rebersek, Matej; Marjanovic, Igor; Begus, Samo; Pillet, Flavien; Rols, Marie-Pierre; Miklavcic, Damijan; Kotnik, Tadej

    2015-10-01

    We aimed to develop a system for controlled exposure of biological samples to conditions they experience when lightning strikes their habitats. We based the generator on a capacitor charged via a bridge rectifier and a dc-dc converter, and discharged via a relay, delivering arcs similar to natural lightning strokes in electric current waveform and similarly accompanied by acoustic shock waves. We coupled the generator to our exposure chamber described previously, measured electrical and acoustic properties of arc discharges delivered, and assessed their ability to inactivate bacterial spores. Submicrosecond discharges descended vertically from the conical emitting electrode across the air gap, entering the sample centrally and dissipating radially toward the ring-shaped receiving electrode. In contrast, longer discharges tended to short-circuit the electrodes. Recording at 341 000 FPS with Vision Research Phantom v2010 camera revealed that initial arc descent was still vertical, but became accompanied by arcs leaning increasingly sideways; after 8-12 μs, as the first of these arcs formed direct contact with the receiving electrode, it evolved into a channel of plasmified air and short-circuited the electrodes. We eliminated this artefact by incorporating an insulating cylinder concentrically between the electrodes, precluding short-circuiting between them. While bacterial spores are highly resistant to electric pulses delivered through direct contact, we showed that with arc discharges accompanied by an acoustic shock wave, spore inactivation is readily obtained. The presented system allows scientific investigation of effects of arc discharges on biological samples. This system will allow realistic experimental studies of lightning-triggered horizontal gene transfer and assessment of its role in evolution.

  2. Transcriptome sequencing of the Microarray Quality Control (MAQC) RNA reference samples using next generation sequencing

    Directory of Open Access Journals (Sweden)

    Thierry-Mieg Danielle

    2009-06-01

    Full Text Available Abstract Background Transcriptome sequencing using next-generation sequencing platforms will soon be competing with DNA microarray technologies for global gene expression analysis. As a preliminary evaluation of these promising technologies, we performed deep sequencing of cDNA synthesized from the Microarray Quality Control (MAQC) reference RNA samples using Roche's 454 Genome Sequencer FLX. Results We generated more than 3.6 million sequence reads of average length 250 bp for the MAQC A and B samples and introduced a data analysis pipeline for translating cDNA read counts into gene expression levels. Using BLAST, 90% of the reads mapped to the human genome and 64% of the reads mapped to the RefSeq database of well annotated genes with E-values ≤ 10^-20. We measured gene expression levels in the A and B samples by counting the numbers of reads that mapped to individual RefSeq genes in multiple sequencing runs, to evaluate the MAQC quality metrics for reproducibility, sensitivity, specificity, and accuracy, and compared the results with DNA microarrays and Quantitative RT-PCR (QRT-PCR) from the MAQC studies. In addition, 88% of the reads were successfully aligned directly to the human genome using the AceView alignment programs with an average 90% sequence similarity, identifying 137,899 unique exon junctions, including 22,193 new exon junctions not yet contained in the RefSeq database. Conclusion Using the MAQC metrics for evaluating the performance of gene expression platforms, the ExpressSeq results for gene expression levels showed excellent reproducibility, sensitivity, and specificity that improved systematically with increasing shotgun sequencing depth, and quantitative accuracy that was comparable to DNA microarrays and QRT-PCR. In addition, a careful mapping of the reads to the genome using the AceView alignment programs shed new light on the complexity of the human transcriptome, including the discovery of thousands of new splice variants.
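    As a minimal illustration of translating mapped read counts into comparable expression levels (the paper's pipeline additionally handles alignment, annotation and cross-run comparison), a counts-per-million transform looks like the following; the function name is illustrative:

```python
def counts_to_cpm(counts):
    """Convert per-gene read counts to counts-per-million, the simplest
    depth normalization for comparing expression levels across
    sequencing runs of different total read counts."""
    total = sum(counts)
    return [1_000_000 * c / total for c in counts]
```

Deeper sequencing runs then become directly comparable, since each gene's value is expressed per million mapped reads rather than as a raw tally.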

  3. Constrained approximation of effective generators for multiscale stochastic reaction networks and application to conditioned path sampling

    Energy Technology Data Exchange (ETDEWEB)

    Cotter, Simon L., E-mail: simon.cotter@manchester.ac.uk

    2016-10-15

    Efficient analysis and simulation of multiscale stochastic systems of chemical kinetics is an ongoing area of research, and is the source of many theoretical and computational challenges. In this paper, we present a significant improvement to the constrained approach, which is a method for computing effective dynamics of slowly changing quantities in these systems, but which does not rely on the quasi-steady-state assumption (QSSA). The QSSA can cause errors in the estimation of effective dynamics for systems where the difference in timescales between the "fast" and "slow" variables is not so pronounced. This new application of the constrained approach allows us to compute the effective generator of the slow variables, without the need for expensive stochastic simulations. This is achieved by finding the null space of the generator of the constrained system. For complex systems where this is not possible, or where the constrained subsystem is itself multiscale, the constrained approach can be applied iteratively. This breaks the problem down into finding the solutions to many small eigenvalue problems, which can be efficiently solved using standard methods. Since this methodology does not rely on the quasi-steady-state assumption, the effective dynamics that are approximated are highly accurate, and in the case of systems with only monomolecular reactions, are exact. We demonstrate this with some numerics, and also use the effective generators to sample paths of the slow variables which are conditioned on their endpoints, a task which would be computationally intractable for the generator of the full system.
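    The core step, extracting the stationary distribution from the null space of a constrained (fast-subsystem) generator and averaging the slow propensities against it, can be sketched for a tiny continuous-time Markov chain. This toy omits the paper's iterative multiscale machinery, and the function names are invented:

```python
import numpy as np

def stationary_distribution(Q):
    """Stationary distribution of a CTMC with generator Q (rows sum to
    zero): the null space of Q^T, normalized to sum to one."""
    w, v = np.linalg.eig(Q.T)
    pi = np.real(v[:, np.argmin(np.abs(w))])  # eigenvector for eigenvalue ~0
    pi = np.abs(pi)
    return pi / pi.sum()

def effective_rate(Q_fast, slow_rates):
    """Average slow-reaction propensities over the fast subsystem's
    stationary distribution -- the constrained-approach idea of building
    an effective generator without stochastic simulation."""
    pi = stationary_distribution(Q_fast)
    return float(pi @ slow_rates)
```

For larger systems one would solve many such small eigenvalue problems, one per fixed value of the slow variables, assembling the results into the effective generator used for conditioned path sampling.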

  4. Second generation laser-heated microfurnace for the preparation of microgram-sized graphite samples

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Bin; Smith, A.M.; Long, S.

    2015-10-15

    We present construction details and test results for two second-generation laser-heated microfurnaces (LHF-II) used to prepare graphite samples for Accelerator Mass Spectrometry (AMS) at ANSTO. Based on systematic studies aimed at optimising the performance of our prototype laser-heated microfurnace (LHF-I) (Smith et al., 2007 [1]; Smith et al., 2010 [2,3]; Yang et al., 2014 [4]), we have designed the LHF-II to have the following features: (i) it has a small reactor volume of 0.25 mL, allowing us to completely graphitise carbon dioxide samples containing as little as 2 μg of C; (ii) it can operate over a large pressure range (0–3 bar) and so has the capacity to graphitise CO2 samples containing up to 100 μg of C; (iii) it is compact, with three valves integrated into the microfurnace body; (iv) it is compatible with our new miniaturised conventional graphitisation furnaces (MCF), also designed for small samples, and shares a common vacuum system. Early tests have shown that the extraneous carbon added during graphitisation in each LHF-II is of the order of 0.05 μg, assuming 100 pMC activity, similar to that of the prototype unit. We use a 'budget' fibre packaged array for the diode laser with custom built focusing optics. The use of a new infrared (IR) thermometer with a short focal length has allowed us to decrease the height of the light-proof safety enclosure. These innovations have produced a cheaper and more compact device. As with the LHF-I, feedback control of the catalyst temperature and logging of the reaction parameters is managed by a LabVIEW interface.

  5. Automatic Motion Generation for Robotic Milling Optimizing Stiffness with Sample-Based Planning

    Directory of Open Access Journals (Sweden)

    Julian Ricardo Diaz Posada

    2017-01-01

    Full Text Available Optimal and intuitive robotic machining is still a challenge. One of the main reasons for this is the lack of robot stiffness, which is also dependent on the robot positioning in the Cartesian space. To make up for this deficiency, and with the aim of increasing robot machining accuracy, this contribution describes a solution approach for optimizing the stiffness over a desired milling path using the free degree of freedom of the machining process. The optimal motion is computed based on the semantic and mathematical interpretation of the manufacturing process, modeled on its components: product, process and resource; and by automatically configuring a sample-based motion problem and the transition-based rapidly-exploring random tree algorithm for computing an optimal motion. The approach is simulated in CAM software for a machining path, revealing its functionality and outlining future potential for optimal motion generation in robotic machining processes.

  6. Imputation of variants from the 1000 Genomes Project modestly improves known associations and can identify low-frequency variant-phenotype associations undetected by HapMap based imputation.

    Science.gov (United States)

    Wood, Andrew R; Perry, John R B; Tanaka, Toshiko; Hernandez, Dena G; Zheng, Hou-Feng; Melzer, David; Gibbs, J Raphael; Nalls, Michael A; Weedon, Michael N; Spector, Tim D; Richards, J Brent; Bandinelli, Stefania; Ferrucci, Luigi; Singleton, Andrew B; Frayling, Timothy M

    2013-01-01

    Genome-wide association (GWA) studies have been limited by their reliance on common variants present on microarrays or imputable from the HapMap Project data. More recently, the completion of the 1000 Genomes Project has provided variant and haplotype information for several million variants derived from sequencing over 1,000 individuals. To help understand the extent to which more variants (including low frequency (1% ≤ MAF < 5%) and rare variants) can improve association signals, we compared results based on HapMap and 1000 Genomes imputation; 9 and 11 known associations, respectively, reached a stricter, likely conservative, significance threshold. Imputation of 1000 Genomes genotype data modestly improved the strength of known associations: of 20 associations detected at the stricter threshold in either dataset, the majority were more strongly associated in 1000 Genomes imputed data, and one was nominally more strongly associated in HapMap imputed data. We also detected an association between a low frequency variant and phenotype that was previously missed by HapMap based imputation approaches. An association between rs112635299 and alpha-1 globulin near the SERPINA gene represented the known association between rs28929474 (MAF = 0.007) and alpha1-antitrypsin that predisposes to emphysema (P = 2.5×10^-12). Our data provide important proof of principle that 1000 Genomes imputation will detect novel, low frequency-large effect associations.

  7. Data Transformation Functions for Expanded Search Spaces in Geographic Sample Supervised Segment Generation

    Directory of Open Access Journals (Sweden)

    Christoff Fourie

    2014-04-01

    Full Text Available Sample supervised image analysis, in particular sample supervised segment generation, shows promise as a methodological avenue applicable within Geographic Object-Based Image Analysis (GEOBIA. Segmentation is acknowledged as a constituent component within typically expansive image analysis processes. A general extension to the basic formulation of an empirical discrepancy measure directed segmentation algorithm parameter tuning approach is proposed. An expanded search landscape is defined, consisting not only of the segmentation algorithm parameters, but also of low-level, parameterized image processing functions. Such higher dimensional search landscapes potentially allow for achieving better segmentation accuracies. The proposed method is tested with a range of low-level image transformation functions and two segmentation algorithms. The general effectiveness of such an approach is demonstrated compared to a variant only optimising segmentation algorithm parameters. Further, it is shown that the resultant search landscapes obtained from combining mid- and low-level image processing parameter domains, in our problem contexts, are sufficiently complex to warrant the use of population based stochastic search methods. Interdependencies of these two parameter domains are also demonstrated, necessitating simultaneous optimization.

  8. Gel-aided sample preparation (GASP)--a simplified method for gel-assisted proteomic sample generation from protein extracts and intact cells.

    Science.gov (United States)

    Fischer, Roman; Kessler, Benedikt M

    2015-04-01

    We describe a "gel-assisted" proteomic sample preparation method for MS analysis. Solubilized protein extracts or intact cells are copolymerized with acrylamide, facilitating denaturation, reduction, quantitative cysteine alkylation, and matrix formation. Gel-aided sample preparation has been optimized to be highly flexible, scalable, and to allow reproducible sample generation from 50 cells to milligrams of protein extracts. This methodology is fast, sensitive, easy-to-use on a wide range of sample types, and accessible to nonspecialists. © 2014 The Authors. PROTEOMICS published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. MAFsnp: A Multi-Sample Accurate and Flexible SNP Caller Using Next-Generation Sequencing Data

    Science.gov (United States)

    Hu, Jiyuan; Li, Tengfei; Xiu, Zidi; Zhang, Hong

    2015-01-01

    Most existing statistical methods developed for calling single nucleotide polymorphisms (SNPs) using next-generation sequencing (NGS) data are based on Bayesian frameworks, and no existing SNP caller produces p-values for calling SNPs in a frequentist framework. To fill this gap, we develop a new method, MAFsnp, a Multiple-sample based Accurate and Flexible algorithm for calling SNPs with NGS data. MAFsnp is based on an estimated likelihood ratio test (eLRT) statistic. In practical situations the involved parameter is very close to the boundary of the parameter space, so standard large-sample theory is not suitable for evaluating the finite-sample distribution of the eLRT statistic. Observing that the distribution of the test statistic is a mixture of zero and a continuous part, we propose to model the test statistic with a novel two-parameter mixture distribution. Once the parameters in the mixture distribution are estimated, p-values can be easily calculated for detecting SNPs, and the multiple-testing corrected p-values can be used to control the false discovery rate (FDR) at any pre-specified level. With simulated data, MAFsnp is shown to have much better control of the FDR than existing SNP callers. Through application to two real datasets, MAFsnp is also shown to outperform existing SNP callers in terms of calling accuracy. An R package “MAFsnp” implementing the new SNP caller is freely available at http://homepage.fudan.edu.cn/zhangh/softwares/. PMID:26309201
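    The mixture idea can be illustrated with a hedged sketch: if the null distribution of the eLRT statistic is modelled as a point mass at zero with weight pi0 plus a continuous component (here a scaled chi-square with one degree of freedom, chosen for illustration; the paper's actual two-parameter family may differ), a p-value follows directly:

```python
import math

def mixture_pvalue(t, pi0=0.6, scale=1.0):
    """P-value for an observed eLRT statistic t under a fitted
    two-component null: point mass at zero (weight pi0) plus a
    continuous part, modelled here as a scaled chi-square(1).
    pi0 and scale are illustrative values, not estimates from data."""
    if t <= 0:
        return 1.0
    # Survival function of chi-square(1): P(X > x) = erfc(sqrt(x/2))
    sf = math.erfc(math.sqrt(t / (2.0 * scale)))
    return (1.0 - pi0) * sf

print(mixture_pvalue(10.0))
```

    In MAFsnp the two parameters would be estimated from the data; here they are fixed only to show how the zero-mass component shrinks the tail probability.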

  10. Progress on using deuteron-deuteron fusion generated neutrons for 40Ar/39Ar sample irradiation

    Science.gov (United States)

    Rutte, Daniel; Renne, Paul R.; Becker, Tim; Waltz, Cory; Ayllon Unzueta, Mauricio; Zimmerman, Susan; Hidy, Alan; Finkel, Robert; Bauer, Joseph D.; Bernstein, Lee; van Bibber, Karl

    2017-04-01

    We present progress on the development and proof of concept of a deuteron-deuteron fusion based neutron generator for 40Ar/39Ar sample irradiation. Irradiation with deuteron-deuteron fusion neutrons is anticipated to reduce Ar recoil and Ar production from interfering reactions. This will allow dating of smaller grains and increase the accuracy and precision of the method. The instrument currently achieves neutron fluxes of ~9×10⁷ cm⁻²s⁻¹ as determined by irradiation of indium foils and use of the activation reaction 115In(n,n')115mIn. Multiple foils and simulations were used to determine flux gradients in the sample chamber. A first experiment quantifying the loss of 39Ar is underway and will likely be available at the time of the presentation of this abstract. In ancillary experiments, via irradiation of K salts and subsequent mass spectrometric analysis, we determined the cross-section of the 39K(n,p)39Ar reaction at ~2.8 MeV to be 160 ± 35 mb (1σ). This result is in good agreement with bracketing cross-section data of ~96 mb at ~2.45 MeV and ~270 mb at ~4 MeV [Johnson et al., 1967; Dixon and Aitken, 1961; Bass et al., 1964]. Our data disfavor a much lower value of ~45 mb at 2.59 MeV [Lindström & Neuer, 1958]. In another ancillary experiment the cross-section for 39K(n,α)36Cl at ~2.8 MeV was determined as 11.7 ± 0.5 mb (1σ), which is significant for 40Ar/39Ar geochronology due to subsequent decay to 36Ar, as well as for the determination of production rates of cosmogenic 36Cl. Additional experiments resolving the cross-section functions on 39K between 1.5 and 3.6 MeV are under way using the LICORNE neutron source of the IPN Orsay tandem accelerator; results will likely be available at the time of the presentation of this abstract. While the neutron generator is designed for fluxes of ~10⁹ cm⁻²s⁻¹, arcing in the sample chamber currently limits the power, and hence the neutron flux, at which the generator can safely be run. Further
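    For context, foil-activation flux determinations of the kind mentioned above invert the standard activation equation A = φσN(1 − e^(−λt)). The sketch below uses the published 115mIn half-life but an assumed (n,n') cross-section and invented foil numbers; it is not a reconstruction of the authors' measurement:

```python
import math

# Illustrative flux estimate from foil activation via 115In(n,n')115mIn.
SIGMA = 330e-27           # assumed cross-section in cm^2 (~330 mb), example value
HALF_LIFE = 4.486 * 3600  # 115mIn half-life in seconds
LAMBDA = math.log(2) / HALF_LIFE

def neutron_flux(activity_bq, n_atoms, t_irr):
    """Invert A = phi * sigma * N * (1 - exp(-lambda * t_irr))
    to recover the flux phi in n cm^-2 s^-1."""
    saturation = 1.0 - math.exp(-LAMBDA * t_irr)
    return activity_bq / (n_atoms * SIGMA * saturation)

# Hypothetical foil: 1e20 target atoms, 2 h irradiation, 15 kBq at end.
print(f"{neutron_flux(1.5e4, 1e20, 2 * 3600):.2e} n cm^-2 s^-1")
```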

  11. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treatment

  12. How iSamples (Internet of Samples in the Earth Sciences) Improves Sample and Data Stewardship in the Next Generation of Geoscientists

    Science.gov (United States)

    Hallett, B. W.; Dere, A. L. D.; Lehnert, K.; Carter, M.

    2016-12-01

    Vast numbers of physical samples are routinely collected by geoscientists to probe key scientific questions related to global climate change, biogeochemical cycles, magmatic processes, mantle dynamics, etc. Despite their value as irreplaceable records of nature the majority of these samples remain undiscoverable by the broader scientific community because they lack a digital presence or are not well-documented enough to facilitate their discovery and reuse for future scientific and educational use. The NSF EarthCube iSamples Research Coordination Network seeks to develop a unified approach across all Earth Science disciplines for the registration, description, identification, and citation of physical specimens in order to take advantage of the new opportunities that cyberinfrastructure offers. Even as consensus around best practices begins to emerge, such as the use of the International Geo Sample Number (IGSN), more work is needed to communicate these practices to investigators to encourage widespread adoption. Recognizing the importance of students and early career scientists in particular to transforming data and sample management practices, the iSamples Education and Training Working Group is developing training modules for sample collection, documentation, and management workflows. These training materials are made available to educators/research supervisors online at http://earthcube.org/group/isamples and can be modularized for supervisors to create a customized research workflow. This study details the design and development of several sample management tutorials, created by early career scientists and documented in collaboration with undergraduate research students in field and lab settings. Modules under development focus on rock outcrops, rock cores, soil cores, and coral samples, with an emphasis on sample management throughout the collection, analysis and archiving process. We invite others to share their sample management/registration workflows and to

  13. Propionibacterium acnes: disease-causing agent or common contaminant? Detection in diverse patient samples by next generation sequencing

    DEFF Research Database (Denmark)

    Mollerup, Sarah; Friis-Nielsen, Jens; Vinner, Lasse

    2016-01-01

    Propionibacterium acnes is the most abundant bacterium on human skin, particularly in sebaceous areas. P. acnes is suggested to be an opportunistic pathogen involved in the development of diverse medical conditions, but is also a proven contaminant of human samples and surgical wounds. Its...... significance as a pathogen is consequently a matter of debate. In the present study we investigated the presence of P. acnes DNA in 250 next generation sequencing datasets generated from 180 samples of 20 different sample types, mostly of cancerous origin. The samples were either subjected to microbial...... enrichment, involving nuclease treatment to reduce the amount of host nucleic acids, or shotgun-sequenced. We detected high proportions of P. acnes in enriched samples, particularly skin derived and other tissue samples, with levels being higher in enriched compared to shotgun-sequenced samples. P. acnes...

  14. Determination of total mercury and methylmercury in biological samples by photochemical vapor generation

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Mariana A.; Ribeiro, Anderson S.; Curtius, Adilson J. [Universidade Federal de Santa Catarina, Departamento de Quimica, Florianopolis, SC (Brazil); Sturgeon, Ralph E. [National Research Council Canada, Institute for National Measurement Standards, Ottawa, ON (Canada)

    2007-06-15

    Cold vapor atomic absorption spectrometry (CV-AAS) based on photochemical reduction by exposure to UV radiation is described for the determination of methylmercury and total mercury in biological samples. Two approaches were investigated: (a) tissues were digested in either formic acid or tetramethylammonium hydroxide (TMAH), and total mercury was determined following reduction of both species by exposure of the solution to UV irradiation; (b) tissues were solubilized in TMAH, diluted to a final concentration of 0.125% m/v TMAH by addition of 10% v/v acetic acid and CH₃Hg⁺ was selectively quantitated, or the initial digests were diluted to 0.125% m/v TMAH by addition of deionized water, adjusted to pH 0.3 by addition of HCl and CH₃Hg⁺ was selectively quantitated. For each case, the optimum conditions for photochemical vapor generation (photo-CVG) were investigated. The photochemical reduction efficiency was estimated to be ~95% by comparing the response with traditional SnCl₂ chemical reduction. The method was validated by analysis of several biological Certified Reference Materials, DORM-1, DORM-2, DOLT-2 and DOLT-3, using calibration against aqueous solutions of Hg²⁺; results showed good agreement with the certified values for total and methylmercury in all cases. Limits of detection of 6 ng/g for total mercury using formic acid, and of 8 ng/g for total mercury and 10 ng/g for methylmercury using TMAH, were obtained. The proposed methodology is sensitive, simple and inexpensive, and promotes "green" chemistry. The potential for application to other sample types and analytes is evident. (orig.)

  15. Optimal sampling period of the digital control system for the nuclear power plant steam generator water level control

    International Nuclear Information System (INIS)

    Hur, Woo Sung; Seong, Poong Hyun

    1995-01-01

    A great effort has been made to improve nuclear plant control systems by use of digital technologies, and a long-term schedule for the control system upgrade has been prepared with an aim to implementation in the next generation of nuclear plants. For a digital control system it is important to choose the sampling period used in analysis and design, because the performance and stability of a digital control system depend on its value. There is, however, currently no systematic method used universally for determining the sampling period of a digital control system. A traditional way to select the sampling frequency is to use 20 to 30 times the bandwidth of the analog control system that has the same configuration and parameters as the digital one. In this paper, a new method to select the sampling period is suggested which takes into account the performance as well as the stability of the digital control system. Using Irving's model of the steam generator, the optimal sampling period of an assumed digital control system for steam generator level control is estimated, and it is verified in the digital control simulation system for the Kori-2 nuclear power plant steam generator level control. Consequently, we conclude that the optimal sampling period of the digital control system for Kori-2 steam generator level control is 1 second for all power ranges. 7 figs., 3 tabs., 8 refs. (Author)
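    The "20 to 30 times the bandwidth" rule of thumb, and the stability cost of sampling too slowly, can be illustrated on a toy loop: a first-order plant under proportional control, discretised with a zero-order hold. The plant constants below are invented; this is not the Irving steam generator model:

```python
import math

def sampling_rate_from_bandwidth(bw_hz, factor=25):
    """Traditional rule of thumb: sample at 20-30x the bandwidth of
    the equivalent analog control loop (factor=25 here)."""
    return factor * bw_hz

def discrete_pole(tau, kp, T):
    """Closed-loop pole of a first-order plant 1/(tau*s + 1) under
    proportional feedback kp, discretised with a zero-order hold at
    period T. |pole| < 1 is required for stability."""
    a = math.exp(-T / tau)     # open-loop discrete pole
    b = 1.0 - a                # ZOH input gain for a unit-DC-gain plant
    return a - kp * b          # pole of x[k+1] = (a - kp*b) x[k] + ...

tau, kp = 1.0, 3.0             # assumed plant time constant and loop gain
for T in (0.05, 0.5, 1.0):
    p = discrete_pole(tau, kp, T)
    print(T, round(p, 3), "stable" if abs(p) < 1 else "unstable")
```

    Lengthening the period pushes the closed-loop pole toward and past the unit circle, which is why the period cannot simply be made large to save computation.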

  16. Arsenic speciation in environmental samples by hydride generation and electrothermal atomic absorption spectrometry.

    Science.gov (United States)

    Anawar, Hossain Md

    2012-01-15

    For the past few years many studies have been performed to determine arsenic (As) speciation in drinking water, the food chain and other environmental samples, owing to its well-recognized carcinogenic and toxic effects, which depend on its chemical form and oxidation state. This review provides an overview of the analytical methods and the preconcentration and separation techniques developed to date using HGAAS and ETAAS for the determination of inorganic As and organoarsenic species in environmental samples. Specific advantages, disadvantages, selectivity, sensitivity, efficiency, rapidity, detection limit (DL), and some aspects of recent improvements and modifications of the different analytical and separation techniques that can define their application for a particular sample analysis are highlighted. HG-AAS has high sensitivity, selectivity and a low DL when used with suitable separation techniques, and it is a more affordable and much less expensive technique than other detectors. The concentrations of HCl and NaBH₄ have a critical effect on the HG response of As species. Use of L-cysteine as pre-reductant is advantageous over KI, both to obtain the same signal response for different As species under the same optimum, mild acid concentration and to reduce the interference of transition metals in arsine generation. Use of different pretreatment, digestion and separation techniques and surfactants allows As species to be determined with DLs from ng L⁻¹ to μg L⁻¹. Of all the chromatographic techniques coupled with HGAAS/ETAAS, ion-pair reversed-phase chromatography (IP-RP) is the most popular due to its higher separation efficiency, resolution, selectivity, simplicity, and ability to separate up to seven As species, both non-ionic and ionic, in a single run using the same column and a short time. However, a combination of anion- and cation-exchange chromatography seems the most promising for complete resolution of up to eight As species. The ETAAS method using different

  17. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample selection by random number... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...
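    In the spirit of this procedure (an illustration of the idea, not a verbatim implementation of the regulation), selecting sample positions on a two-dimensional square grid with one random number per axis might look like:

```python
import random

def select_sample_points(grid_size_x, grid_size_y, n_points, seed=None):
    """Pick sample positions on a two-dimensional square grid by
    drawing one random number per axis for each point, in the spirit
    of 40 CFR 761.308 (illustrative only, not the regulatory text)."""
    rng = random.Random(seed)
    return [(rng.randrange(grid_size_x), rng.randrange(grid_size_y))
            for _ in range(n_points)]

points = select_sample_points(10, 10, 3, seed=42)
print(points)
```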

  18. Assessing Generative Braille Responding Following Training in a Matching-to-Sample Format

    Science.gov (United States)

    Putnam, Brittany C.; Tiger, Jeffrey H.

    2016-01-01

    We evaluated the effects of teaching sighted college students to select printed text letters given a braille sample stimulus in a matching-to-sample (MTS) format on the emergence of untrained (a) construction of print characters given braille samples, (b) construction of braille characters given print samples, (c) transcription of print characters…

  19. ANALYSIS OF MONTE CARLO SIMULATION SAMPLING TECHNIQUES ON SMALL SIGNAL STABILITY OF WIND GENERATOR- CONNECTED POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    TEMITOPE RAPHAEL AYODELE

    2016-04-01

    Full Text Available Monte Carlo simulation using Simple Random Sampling (SRS) is popular for its ability to handle complex uncertainty problems. However, to produce reasonable results it requires a huge sample size, which makes it computationally expensive, time consuming and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness and speed for a small signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (the Single Machine Infinite Bus and the IEEE 16-machine 68-bus test system). The accuracy of the two sampling techniques is determined by comparing their different sample sizes with the IDEAL (conventional) result. The robustness is determined from the variance reduction observed when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. The results show that sample sizes generated with LHS for the small signal stability application produce the same result as the IDEAL values starting from a sample size of 100, indicating that about 100 samples generated using the LHS method are enough to produce reasonable results for practical purposes in small signal stability applications. LHS also shows the least variance when the experiment is repeated 100 times, signifying its robustness over the SRS technique: 100 LHS samples produce the same result as the conventional method with a sample size of 50,000. The reduced sample size required by LHS gives it a computational speed advantage (about six times) over the conventional method.
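    The SRS-versus-LHS comparison can be reproduced in miniature. The sketch below estimates a known expectation with both samplers and compares the spread of the estimate over repeated experiments; the integrand and sample sizes are invented for illustration and are not those of the power-system study:

```python
import random
import statistics as stats

def srs(n, rng):
    """Simple random sampling on [0, 1)."""
    return [rng.random() for _ in range(n)]

def lhs(n, rng):
    """Latin hypercube sampling on [0, 1): one draw per equal-width
    stratum, visited in shuffled order."""
    cells = list(range(n))
    rng.shuffle(cells)
    return [(c + rng.random()) / n for c in cells]

def estimate(sampler, n=100, repeats=100):
    """Monte Carlo estimate of E[f(U)] with f(u) = u^2 (true value 1/3);
    returns the spread of the estimate across repeated experiments."""
    rng = random.Random(1)
    means = [stats.fmean(u * u for u in sampler(n, rng)) for _ in range(repeats)]
    return stats.stdev(means)

print("SRS spread:", estimate(srs))
print("LHS spread:", estimate(lhs))
```

    The stratification in LHS removes most of the between-run variance for a smooth integrand, which is the variance-reduction effect the abstract reports.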

  20. Cerium Oxide Nanoparticle Nose-Only Inhalation Exposures Using a Low-Sample-Consumption String Generator

    Science.gov (United States)

    There is a critical need to assess the health effects associated with exposure of commercially produced NPs across the size ranges reflective of that detected in the industrial sectors that are generating, as well as incorporating, NPs into products. Generation of stable and low ...

  1. Validation of a next-generation sequencing assay for clinical molecular oncology.

    Science.gov (United States)

    Cottrell, Catherine E; Al-Kateb, Hussam; Bredemeyer, Andrew J; Duncavage, Eric J; Spencer, David H; Abel, Haley J; Lockwood, Christina M; Hagemann, Ian S; O'Guin, Stephanie M; Burcea, Lauren C; Sawyer, Christopher S; Oschwald, Dayna M; Stratman, Jennifer L; Sher, Dorie A; Johnson, Mark R; Brown, Justin T; Cliften, Paul F; George, Bijoy; McIntosh, Leslie D; Shrivastava, Savita; Nguyen, Tudung T; Payton, Jacqueline E; Watson, Mark A; Crosby, Seth D; Head, Richard D; Mitra, Robi D; Nagarajan, Rakesh; Kulkarni, Shashikant; Seibert, Karen; Virgin, Herbert W; Milbrandt, Jeffrey; Pfeifer, John D

    2014-01-01

    Currently, oncology testing includes molecular studies and cytogenetic analysis to detect genetic aberrations of clinical significance. Next-generation sequencing (NGS) allows rapid analysis of multiple genes for clinically actionable somatic variants. The WUCaMP assay uses targeted capture for NGS analysis of 25 cancer-associated genes to detect mutations at actionable loci. We present clinical validation of the assay and a detailed framework for design and validation of similar clinical assays. Deep sequencing of 78 tumor specimens (≥ 1000× average unique coverage across the capture region) achieved high sensitivity for detecting somatic variants at low allele fraction (AF). Validation revealed sensitivities and specificities of 100% for detection of single-nucleotide variants (SNVs) within coding regions, compared with SNP array sequence data (95% CI = 83.4-100.0 for sensitivity and 94.2-100.0 for specificity) or whole-genome sequencing (95% CI = 89.1-100.0 for sensitivity and 99.9-100.0 for specificity) of HapMap samples. Sensitivity for detecting variants at an observed 10% AF was 100% (95% CI = 93.2-100.0) in HapMap mixes. Analysis of 15 masked specimens harboring clinically reported variants yielded concordant calls for 13/13 variants at AF of ≥ 15%. The WUCaMP assay is a robust and sensitive method to detect somatic variants of clinical significance in molecular oncology laboratories, with reduced time and cost of genetic analysis allowing for strategic patient management. Copyright © 2014 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
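    The exact confidence intervals quoted above have a closed form in the all-successes case, which is how a 100% observed sensitivity still yields a finite lower bound. A sketch of the Clopper-Pearson lower limit (the 20/20 example is illustrative, not taken from the paper's exact counts):

```python
def exact_lower_bound(successes, trials, alpha=0.05):
    """One-sided exact (Clopper-Pearson) lower confidence bound for a
    proportion when every trial succeeded: (alpha/2)**(1/n) for a
    two-sided (1 - alpha) interval. Only the all-successes case has
    this closed form."""
    if successes != trials:
        raise ValueError("closed form applies only when successes == trials")
    return (alpha / 2.0) ** (1.0 / trials)

# E.g. 20/20 concordant calls:
print(round(100 * exact_lower_bound(20, 20), 1), "% lower bound")
```

    Larger validation sets tighten the bound, which is why the intervals above narrow as the number of compared variants grows.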

  2. Evaluation of sampling schemes for in-service inspection of steam generator tubing

    International Nuclear Information System (INIS)

    Hanlen, R.C.

    1990-03-01

    This report is a follow-on to work initially sponsored by the US Nuclear Regulatory Commission (Bowen et al. 1989). The work presented here is funded by EPRI and is jointly sponsored by the Electric Power Research Institute (EPRI) and the US Nuclear Regulatory Commission (NRC). The goal of this research was to evaluate fourteen sampling schemes or plans. The main criterion used for evaluating plan performance was the effectiveness of sampling in detecting and plugging defective tubes. The performance criterion was evaluated across several choices of distributions of degraded/defective tubes, probability of detection (POD) curves and eddy-current sizing models. Conclusions from this study are dependent upon the tube defect distributions, sample size, and expansion rules considered. Where degraded/defective tubes form "clusters" (i.e., maps 6A, 8A and 13A), the smaller sample sizes provide a capability of detecting and sizing defective tubes that approaches 100% inspection. When there is little or no clustering (i.e., maps 1A, 20 and 21), sample efficiency is approximately equal to the initial sample size taken. There is an indication (though not statistically significant) that the systematic sampling plans are better than the random sampling plans for equivalent initial sample size. There was no indication of an effect due to modifying the threshold value for the second-stage expansion. The lack of an indication is likely due to the specific tube flaw sizes considered for the six tube maps. 1 ref., 11 figs., 19 tabs
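    A two-stage scheme of the kind evaluated here (initial random sample, expansion around detections, imperfect POD) can be mocked up to show why clustered defects favour small initial samples. All parameter values below are invented:

```python
import random

def simulate(n_tubes=1000, defect_cluster=range(100, 120),
             initial_frac=0.2, pod=0.8, trials=500, seed=7):
    """Crude two-stage inspection model: inspect a random initial
    sample; on each detection (probability `pod`), expand the sample
    to the tube's neighbours. Returns the average fraction of
    defective tubes found across trials."""
    rng = random.Random(seed)
    defects = set(defect_cluster)
    found_frac = []
    for _ in range(trials):
        sample = set(rng.sample(range(n_tubes), int(initial_frac * n_tubes)))
        found = set()
        queue = list(sample)
        while queue:
            t = queue.pop()
            if t in defects and t not in found and rng.random() < pod:
                found.add(t)
                # second-stage expansion: inspect adjacent tubes
                for nb in (t - 1, t + 1):
                    if 0 <= nb < n_tubes and nb not in sample:
                        sample.add(nb)
                        queue.append(nb)
        found_frac.append(len(found) / len(defects))
    return sum(found_frac) / trials

print(simulate())
```

    With a contiguous defect cluster, even a modest initial sample almost always lands in the cluster, and the expansion rule then sweeps most of it, the behaviour the report describes for the clustered maps.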

  3. The effect of therapeutic x-radiation on a sample of pacemaker generators

    International Nuclear Information System (INIS)

    Maxted, K.J.

    1984-01-01

    Tests were made on nineteen generators, comprising seventeen types from five manufacturers and including four programmable units, at x-ray energies of about 4 MeV. The bipolar generators suffered no measurable damage, and radiotherapy patients using these would appear not to be exposed to any hazard. Nor were any of the generators using entirely CMOS circuitry, including the programmable types, affected. Generators using hybrid bipolar and CMOS circuitry were damaged by radiation exposure, the majority of these being Medtronic pacemakers. Transient recovery of function followed by total failure suggested that even a transient loss of function must be regarded as a precursor to permanent damage. This transient effect indicates that it is most likely the CMOS circuit element that is affected. (U.K.)

  4. Entropy generation as an environmental impact indicator and a sample application to freshwater ecosystems eutrophication

    International Nuclear Information System (INIS)

    Diaz-Mendez, S.E.; Sierra-Grajeda, J.M.T.; Hernandez-Guerrero, A.; Rodriguez-Lelis, J.M.

    2013-01-01

    Generally speaking, an ecosystem is seen as a complex whole composed of different biotic and abiotic parts. Each part has specific functions related to mass and energy; those functions influence the other parts directly and indirectly, and they are subject to the basic laws of thermodynamics. If each part of the ecosystem is taken as a thermodynamic system, its entropy generation can be evaluated; the total entropy generation of the ecosystem must then be the sum of the entropy generation in each part, in accordance with the Gouy-Stodola theorem. With this in mind, in this work an environmental indicator for any kind of ecosystem is determined as a function of the ratio between the total entropy generation of a reference state (for instance, a healthy forest) and the entropy generation of a new, different state of the same ecosystem (for instance, after deforestation). Thermodynamic concepts are thus applied to study the eutrophication of freshwater ecosystems; the strategy is based on techniques that integrate assumptions of the entropy generation methodology within ecosystems. The results show that the smaller the entropy generation with respect to the reference state, the greater the sustainability of the ecosystem. - Highlights: • We estimate an environmental impact indicator using the concept of entropy generation. • It can be a useful tool for assessing the environmental impact of society on the environment. • It can be a useful tool to compare new technologies and further improve their efficiencies. • It can help toward a better understanding of the concept of entropy and its role in various classes of processes. • It can help to reduce environmental concerns and increase the sustainability of the planet
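    The indicator described, a ratio between a state's total entropy generation and that of a healthy reference state, reduces to a small calculation via the Gouy-Stodola theorem (S_gen = W_lost/T0). The sketch below takes the new state relative to the reference (the abstract's exact normalisation is ambiguous), and all numbers are invented for illustration:

```python
def entropy_generation(exergy_destroyed_w, t0_kelvin=298.15):
    """Gouy-Stodola theorem: S_gen = W_lost / T0, in W/K."""
    return exergy_destroyed_w / t0_kelvin

def impact_indicator(s_gen_state, s_gen_reference):
    """Ratio of a state's total entropy generation to that of the
    reference state: values near 1 suggest the ecosystem is close to
    its healthy reference, larger values suggest degradation."""
    return s_gen_state / s_gen_reference

ref = entropy_generation(5_000.0)     # hypothetical healthy forest
state = entropy_generation(12_000.0)  # hypothetical degraded state
print(round(impact_indicator(state, ref), 2))
```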

  5. Generating Random Samples of a Given Size Using Social Security Numbers.

    Science.gov (United States)

    Erickson, Richard C.; Brauchle, Paul E.

    1984-01-01

    The purposes of this article are (1) to present a method by which social security numbers may be used to draw cluster samples of a predetermined size and (2) to describe procedures used to validate this method of drawing random samples. (JOW)
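    One way the described SSN-based method might work (a sketch of the idea only, not the authors' validated procedure): treat the last two SSN digits as uniform random labels and select all holders of k randomly chosen endings, giving an expected sampling fraction of k/100:

```python
import random

def ssn_cluster_sample(people, sample_fraction, seed=None):
    """Cluster sample of predetermined expected size: pick
    k = fraction * 100 two-digit SSN endings at random and keep
    everyone whose SSN ends in one of them. Hypothetical sketch;
    the article's exact procedure may differ."""
    k = round(sample_fraction * 100)
    rng = random.Random(seed)
    chosen = set(rng.sample(range(100), k))
    return [p for p in people if int(p["ssn"][-2:]) in chosen]

# Synthetic roster: endings are uniform, so a 10% draw is exactly 100 here.
people = [{"name": f"p{i}", "ssn": f"000-00-{i:04d}"} for i in range(1000)]
sample = ssn_cluster_sample(people, 0.10, seed=3)
print(len(sample))
```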

  6. Sample preparation for arsenic speciation analysis in baby food by generation of substituted arsines with atomic absorption spectrometry detection

    Czech Academy of Sciences Publication Activity Database

    Huber, C. S.; Vale, M. G. R.; Dessuy, M. B.; Svoboda, Milan; Musil, Stanislav; Dědina, Jiří

    2017-01-01

    Roč. 175, DEC (2017), s. 406-412 ISSN 0039-9140 R&D Projects: GA MŠk(CZ) LH15174 Institutional support: RVO:68081715 Keywords: slurry sampling * methyl-substituted arsenic species * hydride generation-cryotrapping-atomic absorption spectrometry Subject RIV: CB - Analytical Chemistry, Separation OBOR OECD: Analytical chemistry Impact factor: 4.162, year: 2016

  8. Experience with Aerosol Generation During Rotary Mode Core Sampling in the Hanford Single Shell Waste Tanks

    International Nuclear Information System (INIS)

    SCHOFIELD, J.S.

    1999-01-01

    This document provides data on aerosol concentrations in tank head spaces, total mass of aerosols in the tank head space and mass of aerosols sent to the exhauster during Rotary Mode Core Sampling from November 1994 through April 1999

  9. Measurements of tritium (HTO, TFWT, OBT) in environmental samples at varying distances from a nuclear generating station

    International Nuclear Information System (INIS)

    Kotzer, T.G.; Workman, W.J.G.

    1999-12-01

    Concentrations of tritium have been measured in environmental samples (vegetation, water, soil, air) from sites distal and proximal to a CANDU nuclear generating station in Southern Ontario (OPG-Pickering). Levels of tissue-free water tritium (TFWT) and organically bound tritium (OBT) in vegetation are as high as 24,000 TU immediately adjacent to the nuclear generating station and rapidly decrease to levels of tritium which are comparable to natural ambient concentrations for tritium in the environment (approximately ≤ 60 TU). Tritium concentrations (OBT, TFTW) have also been measured in samples of vegetation and tree rings growing substantial distances away from nuclear generating stations and are within a factor of 1 to 2 of the ambient levels of tritium measured in precipitation in several parts of Canada (approximately ≤30 TU). (author)

  11. Speciation without chromatography using selective hydride generation: Inorganic arsenic in rice and samples of marine origin

    Czech Academy of Sciences Publication Activity Database

    Musil, Stanislav; Pétursdóttir, A. H.; Raab, A.; Gunnlaugsdóttir, H.; Krupp, E.; Feldmann, J.

    2014-01-01

    Roč. 86, č. 2 (2014), s. 993-999 ISSN 0003-2700 Grant - others:GA AV ČR(CZ) M200311271 Institutional support: RVO:68081715 Keywords : inorganic arsenic * hydride generation * inductively coupled plasma mass spectrometry Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 5.636, year: 2014

  12. Efficient generation of volatile species for cadmium analysis in seafood and rice samples by a modified chemical vapor generation system coupled with atomic fluorescence spectrometry

    International Nuclear Information System (INIS)

    Yang, Xin-an; Chi, Miao-bin; Wang, Qing-qing; Zhang, Wang-bing

    2015-01-01

    Highlights: • We develop a modified chemical vapor generation method coupled with AFS for the determination of cadmium. • The response of Cd could be increased at least four-fold compared to the conventional thiourea and Co(II) system. • A simple mixing-sequence experiment is designed to study the reaction mechanism. • The interference of transition metal ions can be easily eliminated by adding DDTC. • The method is successfully applied to seafood and rice samples. - Abstract: A vapor generation procedure to determine Cd by atomic fluorescence spectrometry (AFS) has been established. Volatile species of Cd are generated by reaction of the acidified sample containing Fe(II) and L-cysteine (Cys) with sodium tetrahydroborate (NaBH₄). The presence of 5 mg L⁻¹ Fe(II) and 0.05% m/v Cys improves the efficiency of Cd vapor generation substantially, about four-fold compared with the conventional thiourea and Co(II) system. Three experiments with different mixing sequences and reaction times are designed to study the reaction mechanism. The results document that the stability of Cd(II)–Cys complexes is better than that of Cys–THB complexes (THB meaning NaBH₄), while the Cys–THB complexes contribute more to improving the Cd vapor generation efficiency than the Cd(II)–Cys complexes. Meanwhile, the addition of Fe(II) catalyzes the Cd vapor generation. Under the optimized conditions, the detection limit for Cd is 0.012 μg L⁻¹; relative standard deviations vary between 0.8% and 5.5% for replicate measurements of the standard solution. In the presence of 0.01% DDTC, Cu(II), Pb(II) and Zn(II) have no significant influence up to 5 mg L⁻¹, 10 mg L⁻¹ and 10 mg L⁻¹, respectively. The accuracy of the method is verified through analysis of certified reference materials, and the proposed method has been applied to the determination of Cd in seafood and rice samples

  13. Efficient generation of volatile species for cadmium analysis in seafood and rice samples by a modified chemical vapor generation system coupled with atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Xin-an, E-mail: 13087641@qq.com; Chi, Miao-bin, E-mail: 1161306667@qq.com; Wang, Qing-qing, E-mail: wangqq8812@163.com; Zhang, Wang-bing, E-mail: ahutwbzh@163.com

    2015-04-15

    Highlights: • We develop a modified chemical vapor generation method coupled with AFS for the determination of cadmium. • The response of Cd can be increased at least four-fold compared with the conventional thiourea and Co(II) system. • A simple mixing-sequence experiment is designed to study the reaction mechanism. • The interference of transition metal ions can be easily eliminated by adding DDTC. • The method is successfully applied to seafood and rice samples. - Abstract: A vapor generation procedure to determine Cd by atomic fluorescence spectrometry (AFS) has been established. Volatile species of Cd are generated by reaction of the acidified sample, containing Fe(II) and L-cysteine (Cys), with sodium tetrahydroborate (NaBH₄). The presence of 5 mg L⁻¹ Fe(II) and 0.05% m/v Cys improves the efficiency of Cd vapor generation substantially, about four-fold compared with the conventional thiourea and Co(II) system. Three experiments with different mixing sequences and reaction times are designed to study the reaction mechanism. The results document that the stability of the Cd(II)–Cys complexes is better than that of the Cys–THB complexes (THB denotes NaBH₄), while the Cys–THB complexes contribute more to improving the Cd vapor generation efficiency than the Cd(II)–Cys complexes. Meanwhile, the addition of Fe(II) catalyzes the Cd vapor generation. Under the optimized conditions, the detection limit for Cd is 0.012 μg L⁻¹; relative standard deviations vary between 0.8% and 5.5% for replicate measurements of the standard solution. In the presence of 0.01% DDTC, Cu(II), Pb(II) and Zn(II) have no significant influence up to 5 mg L⁻¹, 10 mg L⁻¹ and 10 mg L⁻¹, respectively. The accuracy of the method is verified through analysis of certified reference materials, and the proposed method has been applied to the determination of Cd in seafood and rice samples.
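The detection limit and RSD figures quoted above follow standard analytical conventions. As an illustration only, a minimal sketch of the common 3σ detection-limit criterion and of the RSD of replicate readings; all numeric inputs below are hypothetical, not taken from the paper:

```python
import statistics

def detection_limit(blank_signals, slope):
    """IUPAC-style 3-sigma detection limit: 3 x SD of blank replicates
    divided by the calibration slope (signal units per concentration unit)."""
    return 3 * statistics.stdev(blank_signals) / slope

def relative_std_dev(measurements):
    """Relative standard deviation (%) of replicate measurements."""
    return 100 * statistics.stdev(measurements) / statistics.mean(measurements)

# Hypothetical replicate fluorescence readings of one standard solution
reps = [102.0, 101.0, 103.0, 102.5, 101.5]
print(round(relative_std_dev(reps), 2))
```

The same two functions cover both figures of merit reported in the abstract; only the input data differ (blank replicates and a calibration slope for the detection limit, standard-solution replicates for the RSD).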

  14. Distribution Coefficients (Kd) Generated From A Core Sample Collected From The Saltstone Disposal Facility

    International Nuclear Information System (INIS)

    Almond, P.; Kaplan, D.

    2011-01-01

    Core samples originating from Vault 4, Cell E of the Saltstone Disposal Facility (SDF) were collected in September of 2008 (Hansen and Crawford 2009, Smith 2008) and sent to SRNL to measure chemical and physical properties of the material, including visual uniformity, mineralogy, microstructure, density, porosity, distribution coefficients (Kd), and chemical composition. Some data from these experiments have been reported (Cozzi and Duncan 2010). In this study, leaching experiments were conducted with a single core sample under conditions that are representative of saltstone performance. In separate experiments, reducing and oxidizing environments were targeted to obtain solubility and Kd values from the measurable species identified in the solid and aqueous leachate. This study was designed to provide insight into how readily species immobilized in saltstone will leach from the saltstone under oxidizing conditions, simulating the edge of a saltstone monolith, and under reducing conditions, targeting conditions within the saltstone monolith. Core samples were taken from saltstone poured in December of 2007, giving a cure time of nine months in the cell and a total of thirty months before leaching experiments began in June 2010. The saltstone from Vault 4, Cell E comprises blast furnace slag, class F fly ash, portland cement, and Deliquification, Dissolution, and Adjustment (DDA) Batch 2 salt solution. The salt solution was previously analyzed from a sample of Tank 50 salt solution and characterized in the 4QCY07 Waste Acceptance Criteria (WAC) report (Zeigler and Bibler 2009). Subsequent to the Tank 50 analysis, additional solution was added to the tank solution from the Effluent Treatment Project as well as from in-leakage from the Tank 50 pump bearings (Cozzi and Duncan 2010).
Core samples were taken from three locations and at three depths at each location using a two-inch diameter concrete coring bit (1-1, 1-2, 1-3; 2-1, 2-2, 2-3; 3-1, 3-2, 3-3) (Hansen and Crawford

  15. DISTRIBUTION COEFFICIENTS (KD) GENERATED FROM A CORE SAMPLE COLLECTED FROM THE SALTSTONE DISPOSAL FACILITY

    Energy Technology Data Exchange (ETDEWEB)

    Almond, P.; Kaplan, D.

    2011-04-25

    Core samples originating from Vault 4, Cell E of the Saltstone Disposal Facility (SDF) were collected in September of 2008 (Hansen and Crawford 2009, Smith 2008) and sent to SRNL to measure chemical and physical properties of the material, including visual uniformity, mineralogy, microstructure, density, porosity, distribution coefficients (Kd), and chemical composition. Some data from these experiments have been reported (Cozzi and Duncan 2010). In this study, leaching experiments were conducted with a single core sample under conditions that are representative of saltstone performance. In separate experiments, reducing and oxidizing environments were targeted to obtain solubility and Kd values from the measurable species identified in the solid and aqueous leachate. This study was designed to provide insight into how readily species immobilized in saltstone will leach from the saltstone under oxidizing conditions, simulating the edge of a saltstone monolith, and under reducing conditions, targeting conditions within the saltstone monolith. Core samples were taken from saltstone poured in December of 2007, giving a cure time of nine months in the cell and a total of thirty months before leaching experiments began in June 2010. The saltstone from Vault 4, Cell E comprises blast furnace slag, class F fly ash, portland cement, and Deliquification, Dissolution, and Adjustment (DDA) Batch 2 salt solution. The salt solution was previously analyzed from a sample of Tank 50 salt solution and characterized in the 4QCY07 Waste Acceptance Criteria (WAC) report (Zeigler and Bibler 2009). Subsequent to the Tank 50 analysis, additional solution was added to the tank solution from the Effluent Treatment Project as well as from in-leakage from the Tank 50 pump bearings (Cozzi and Duncan 2010). Core samples were taken from three locations and at three depths at each location using a two-inch diameter concrete coring bit (1-1, 1-2, 1-3; 2-1, 2-2, 2-3; 3-1, 3-2, 3-3) (Hansen and

  16. Observation of a physical matrix effect during cold vapour generation measurement of mercury in emissions samples

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Richard J.C., E-mail: richard.brown@npl.co.uk; Webb, William R.; Goddard, Sharon L.

    2014-05-01

    Highlights: • A matrix effect for CV-AFS measurement of mercury in emissions samples is reported. • This results from the different efficiencies of liberation of reduced mercury. • There is a good correlation between solution density and the size of the effect. • Several methods to overcome the bias are presented and discussed. - Abstract: The observation of a physical matrix effect during the cold vapour generation–atomic fluorescence measurement of mercury in emissions samples is reported. The effect results from the different efficiencies of liberation of reduced mercury from solution as the matrix of the solution under test varies. As a result, peak area to peak height ratios decrease as matrix concentration increases, passing through a minimum, before the ratio increases again as matrix concentration rises further. In the test matrices examined (acidified potassium dichromate and sodium chloride solutions) the possible biases caused by differences between the calibration standard matrix and the test sample matrix were as large as 2.8% (relative), representing peak area to peak height ratios for calibration standards and matrix samples of 45 and 43.75, respectively. For the system considered there is a good correlation between the density of the matrix and the point of optimum liberation of dissolved mercury for both matrix types. Several methods employing matrix matching and mathematical correction to overcome the bias are presented and their relative merits discussed; the most promising is the use of peak area, rather than peak height, for quantification.
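The 2.8% relative bias quoted above follows directly from the two reported peak-area-to-peak-height ratios (45 for calibration standards versus 43.75 for matrix samples); a one-line check:

```python
def relative_bias_percent(ratio_standard, ratio_sample):
    """Relative bias (%) between the calibration-standard and matrix-sample
    peak-area-to-peak-height ratios."""
    return 100 * (ratio_standard - ratio_sample) / ratio_standard

print(round(relative_bias_percent(45.0, 43.75), 1))  # 2.8, the relative bias quoted in the abstract
```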

  17. Experience with Aerosol Generation During Rotary Mode Core Sampling in the Hanford Single Shell Waste Tanks

    International Nuclear Information System (INIS)

    SCHOFIELD, J.S.

    2000-01-01

    This document provides data on aerosol concentrations in tank head spaces, total mass of aerosols in the tank head space and mass of aerosols sent to the exhauster during Rotary Mode Core Sampling from November 1994 through June 1999. A decontamination factor for the RMCS exhauster filter housing is calculated based on operation data

  18. Convolutional neural network using generated data for SAR ATR with limited samples

    Science.gov (United States)

    Cong, Longjian; Gao, Lei; Zhang, Hui; Sun, Peng

    2018-03-01

    Because it can operate in all weather conditions at all times, Synthetic Aperture Radar (SAR) has been a hot research topic in remote sensing. Despite all the well-known advantages of SAR, it is hard to extract features because of its unique imaging methodology, and this challenge has attracted the interest of researchers working on traditional Automatic Target Recognition (ATR) methods. With the development of deep learning technologies, convolutional neural networks (CNNs) offer another way to detect and recognize targets when a huge number of samples is available, but this premise often does not hold when it comes to monitoring a specific type of ship. In this paper, we propose a method to enhance the performance of Faster R-CNN with limited samples to detect and recognize ships in SAR images.

  19. Glyphosate-rich air samples induce IL-33, TSLP and generate IL-13-dependent airway inflammation

    Science.gov (United States)

    Kumar, Sudhir; Khodoun, Marat; Kettleson, Eric M.; McKnight, Christopher; Reponen, Tiina; Grinshpun, Sergey A.; Adhikari, Atin

    2014-01-01

    Several low-molecular-weight compounds have often been implicated in the induction of occupational asthma. Glyphosate, a small-molecule herbicide, is widely used around the world. There is controversy regarding the role of glyphosate in the development of asthma and rhinitis among farmers, and the underlying mechanism is unexplored. The aim of this study was to explore the mechanisms of glyphosate-induced pulmonary pathology by utilizing murine models and real environmental samples. C57BL/6, TLR4−/−, and IL-13−/− mice inhaled extracts of glyphosate-rich air samples collected on farms during spraying of herbicides, or inhaled different doses of glyphosate and ovalbumin. The cellular response, humoral response, and lung function of exposed mice were evaluated. Exposure of the lungs to glyphosate-rich air samples, as well as to glyphosate alone, increased eosinophil and neutrophil counts, mast cell degranulation, and production of IL-33, TSLP, IL-13, and IL-5. In contrast, in vivo systemic IL-4 production was not increased. Co-administration of ovalbumin with glyphosate did not substantially change the inflammatory immune response. However, IL-13 deficiency resulted in a diminished inflammatory response but did not have a significant effect on airway resistance upon methacholine challenge after 7 or 21 days of glyphosate exposure. Glyphosate-rich farm air samples, as well as glyphosate alone, were found to induce pulmonary IL-13-dependent inflammation and promote Th2-type cytokines, but not IL-4 for glyphosate alone. This study, for the first time, provides evidence for the mechanism of glyphosate-induced occupational lung disease. PMID:25172162

  20. Next Generation Offline Approaches to Trace Gas-Phase Organic Compound Speciation: Sample Collection and Analysis

    Science.gov (United States)

    Sheu, R.; Marcotte, A.; Khare, P.; Ditto, J.; Charan, S.; Gentner, D. R.

    2017-12-01

    Intermediate-volatility and semi-volatile organic compounds (I/SVOCs) are major precursors to secondary organic aerosol, and contribute to tropospheric ozone formation. Their wide volatility range, chemical complexity, behavior in analytical systems, and trace concentrations present numerous hurdles to characterization. We present an integrated sampling-to-analysis system for the collection and offline analysis of trace gas-phase organic compounds with the goal of preserving and recovering analytes throughout sample collection, transport, storage, and thermal desorption for accurate analysis. Custom multi-bed adsorbent tubes are used to collect samples for offline analysis by advanced analytical detectors. The analytical instrumentation comprises an automated thermal desorption system that introduces analytes from the adsorbent tubes into a gas chromatograph, which is coupled with an electron ionization mass spectrometer (GC-EIMS) and other detectors. In order to optimize the collection and recovery for a wide range of analyte volatility and functionalization, we evaluated a variety of commercially-available materials, including Res-Sil beads, quartz wool, glass beads, Tenax TA, and silica gel. Key properties for optimization include inertness, versatile chemical capture, minimal affinity for water, and minimal artifacts or degradation byproducts; these properties were assessed with a diverse mix of traditionally-measured and functionalized analytes. Along with a focus on material selection, we provide recommendations spanning the entire sampling-and-analysis process to improve the accuracy of future comprehensive I/SVOC measurements, including oxygenated and other functionalized I/SVOCs. We demonstrate the performance of our system by providing results on speciated VOCs-SVOCs from indoor, outdoor, and chamber studies that establish the utility of our protocols and pave the way for precise laboratory characterization via a mix of detection methods.

  1. Getting DNA copy numbers without control samples

    Directory of Open Access Journals (Sweden)

    Ortiz-Estevez Maria

    2012-08-01

    Full Text Available Abstract Background The selection of the reference used to scale the data in a copy number analysis is of paramount importance for achieving accurate estimates. Usually this reference is generated using control samples included in the study. However, control samples are not always available, and in these cases an artificial reference must be created. A proper generation of this signal is crucial in terms of both noise and bias. We propose NSA (Normality Search Algorithm), a scaling method that works with and without control samples. It is based on the assumption that genomic regions enriched in SNPs with identical copy numbers in both alleles are likely to be normal. These normal regions are predicted for each sample individually and used to calculate the final reference signal. NSA can be applied to any CN data regardless of the microarray technology and preprocessing method. It also finds an optimal weighting of the samples, minimizing possible batch effects. Results Five human datasets (a subset of HapMap samples, and Glioblastoma Multiforme (GBM), Ovarian, Prostate and Lung Cancer experiments) have been analyzed. It is shown that, using only tumoral samples, NSA is able to remove the bias in the copy number estimation, to reduce the noise and, therefore, to increase the ability to detect copy number aberrations (CNAs). These improvements allow NSA to detect recurrent aberrations more accurately than other state-of-the-art methods. Conclusions NSA provides a robust and accurate reference for scaling probe signal data to CN values without the need for control samples. It minimizes the problems of bias, noise and batch effects in the estimation of CNs. Therefore, the NSA scaling approach helps to detect recurrent CNAs better than current methods. 
The automatic selection of references makes it useful to perform bulk analysis of many GEO or ArrayExpress experiments without the need of developing a parser to find the normal samples or possible batches within the

  2. Getting DNA copy numbers without control samples.

    Science.gov (United States)

    Ortiz-Estevez, Maria; Aramburu, Ander; Rubio, Angel

    2012-08-16

    The selection of the reference used to scale the data in a copy number analysis is of paramount importance for achieving accurate estimates. Usually this reference is generated using control samples included in the study. However, control samples are not always available, and in these cases an artificial reference must be created. A proper generation of this signal is crucial in terms of both noise and bias. We propose NSA (Normality Search Algorithm), a scaling method that works with and without control samples. It is based on the assumption that genomic regions enriched in SNPs with identical copy numbers in both alleles are likely to be normal. These normal regions are predicted for each sample individually and used to calculate the final reference signal. NSA can be applied to any CN data regardless of the microarray technology and preprocessing method. It also finds an optimal weighting of the samples, minimizing possible batch effects. Five human datasets (a subset of HapMap samples, and Glioblastoma Multiforme (GBM), Ovarian, Prostate and Lung Cancer experiments) have been analyzed. It is shown that, using only tumoral samples, NSA is able to remove the bias in the copy number estimation, to reduce the noise and, therefore, to increase the ability to detect copy number aberrations (CNAs). These improvements allow NSA to detect recurrent aberrations more accurately than other state-of-the-art methods. NSA provides a robust and accurate reference for scaling probe signal data to CN values without the need for control samples. It minimizes the problems of bias, noise and batch effects in the estimation of CNs. Therefore, the NSA scaling approach helps to detect recurrent CNAs better than current methods. The automatic selection of references makes it useful for performing bulk analysis of many GEO or ArrayExpress experiments without the need to develop a parser to find the normal samples or possible batches within the data. 
The method is available in the open-source R package
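The core scaling step NSA is built on (anchor the predicted-normal, diploid regions of each sample at two copies and rescale everything against them) can be sketched as follows. This is a simplified illustration with hypothetical probe signals, not the published NSA implementation:

```python
import statistics

def scale_to_copy_number(probe_signals, normal_region_idx):
    """Rescale raw probe intensities to copy-number estimates by anchoring
    the median of predicted-normal (diploid) probes at CN = 2.
    `normal_region_idx` lists the probes assumed copy-neutral for this sample."""
    ref = statistics.median(probe_signals[i] for i in normal_region_idx)
    return [2 * s / ref for s in probe_signals]

signals = [1.0, 1.1, 0.9, 2.1, 1.0]  # hypothetical intensities; probe 3 looks amplified
normals = [0, 1, 2, 4]               # probes predicted normal for this sample
cn = scale_to_copy_number(signals, normals)
print([round(x, 2) for x in cn])
```

Because the reference is derived per sample from its own predicted-normal regions, no control sample is needed; the real algorithm additionally combines samples with optimal weights to suppress batch effects.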

  3. Use of respondent driven sampling (RDS) generates a very diverse sample of men who have sex with men (MSM) in Buenos Aires, Argentina.

    Directory of Open Access Journals (Sweden)

    Alex Carballo-Diéguez

    Full Text Available Prior research focusing on men who have sex with men (MSM) conducted in Buenos Aires, Argentina, used convenience samples that included mainly gay-identified men. To increase MSM sample representativeness, we used Respondent Driven Sampling (RDS) for the first time in Argentina. Using RDS, under certain specified conditions, the observed estimates for the percentage of the population with a specific trait are asymptotically unbiased. We describe the diversity of the recruited sample from the point of view of sexual orientation, and contrast the different subgroups in terms of their HIV sexual risk behavior. 500 MSM were recruited using RDS. Behavioral data were collected through face-to-face interviews and Web-based CASI. In contrast with prior studies, RDS generated a very diverse sample of MSM from a sexual identity perspective. Only 24.5% of participants identified as gay; 36.2% identified as bisexual, 21.9% as heterosexual, and 17.4% were grouped as "other." Gay and non-gay-identified MSM differed significantly in their sexual behavior, the former having higher numbers of partners, more frequent sexual contacts and less frequent condom use. One third of the men (gay, 3%; bisexual, 34%; heterosexual, 51%; other, 49%) reported having had sex with men, women and transvestites in the two months prior to the interview. This population requires further study and, potentially, HIV prevention strategies tailored to such diversity of partnerships. Our results highlight the potential effectiveness of using RDS to reach non-gay-identified MSM. They also present lessons learned in the implementation of RDS to recruit MSM, concerning both the importance and limitations of formative work, the need to tailor incentives to the circumstances of less affluent potential participants, the need to prevent masking, and the challenge of assessing network size.
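The asymptotic unbiasedness claimed for RDS rests on weighting respondents by the inverse of their network size, since well-connected people are more likely to be recruited. A minimal sketch of one standard choice, the Volz-Heckathorn (RDS-II) estimator, with hypothetical recruit data (the study's own estimator settings and data are not reproduced here):

```python
def rds_ii_proportion(has_trait, degrees):
    """Volz-Heckathorn (RDS-II) estimator of a population proportion:
    each respondent is weighted by 1/degree (self-reported network size)
    to correct for the higher inclusion probability of high-degree people."""
    w = [1.0 / d for d in degrees]
    num = sum(wi for wi, t in zip(w, has_trait) if t)
    return num / sum(w)

# Hypothetical recruits: trait indicator and self-reported network degree
trait = [True, False, True, False]
deg = [10, 5, 2, 20]
print(rds_ii_proportion(trait, deg))
```

Note how the low-degree respondent with the trait (degree 2) pulls the estimate well above the naive 50% sample proportion, which is exactly the correction RDS needs.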

  4. Uncertainty Determination Methodology, Sampling Maps Generation and Trend Studies with Biomass Thermogravimetric Analysis

    Science.gov (United States)

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    This paper investigates a method for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG analysis) for several lignocellulosic materials (ground olive stone, almond shell, pine pellets and oak pellets), completing previous work by the same authors. A comparison has been made between the results of TG analysis and prompt analysis. Levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis are representative of the overall fuel composition and that no correlation between prompt and TG analysis exists. Additionally, a study of trends and time correlations is presented. These results are particularly interesting for biomass energy applications. PMID:21152292
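The "maximum sampling error" and confidence intervals referred to above are conventionally the t-based half-width of the confidence interval of the mean; a minimal sketch with hypothetical replicate values and a hard-coded critical t value (the paper's own data and exact procedure are not reproduced):

```python
import math
import statistics

def max_sampling_error(values, t_crit):
    """Half-width of the confidence interval of the mean (the 'maximum
    sampling error'): t * s / sqrt(n), with s the sample standard deviation."""
    return t_crit * statistics.stdev(values) / math.sqrt(len(values))

# Hypothetical ash-content replicates (%); t_crit = 2.776 for n = 5, 95% confidence
ash = [4.9, 5.1, 5.0, 5.2, 4.8]
print(round(max_sampling_error(ash, 2.776), 3))
```

The resulting interval is then reported as mean ± error; widening it (more replicates, lower variance) is what lets TG-derived properties be declared representative of the fuel.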

  5. exTAS - next-generation TAS for small samples and extreme conditions

    International Nuclear Information System (INIS)

    Kulda, J.; Hiess, A.

    2011-01-01

    The currently used implementation of horizontally and vertically focusing optics in three-axis spectrometers (TAS) permits efficient studies of excitations in sub-cm³-sized single crystals. With the present proposal we wish to stimulate a further paradigm shift into the domain of mm³-sized samples. exTAS combines highly focused mm-sized focal spots, boosting the sensitivity limits, with a spectrometer layout down-scaled to table-top size to provide high flexibility in optimizing acceptance angles and to achieve sub-millimeter positioning accuracy. (authors)

  6. cn.MOPS: mixture of Poissons for discovering copy number variations in next-generation sequencing data with a low false discovery rate.

    Science.gov (United States)

    Klambauer, Günter; Schwarzbauer, Karin; Mayr, Andreas; Clevert, Djork-Arné; Mitterecker, Andreas; Bodenhofer, Ulrich; Hochreiter, Sepp

    2012-05-01

    Quantitative analyses of next-generation sequencing (NGS) data, such as the detection of copy number variations (CNVs), remain challenging. Current methods detect CNVs as changes in the depth of coverage along chromosomes. Technological or genomic variations in the depth of coverage thus lead to a high false discovery rate (FDR), even upon correction for GC content. In the context of association studies between CNVs and disease, a high FDR means many false CNVs, thereby decreasing the discovery power of the study after correction for multiple testing. We propose 'Copy Number estimation by a Mixture Of PoissonS' (cn.MOPS), a data processing pipeline for CNV detection in NGS data. In contrast to previous approaches, cn.MOPS incorporates modeling of depths of coverage across samples at each genomic position. Therefore, cn.MOPS is not affected by read count variations along chromosomes. Using a Bayesian approach, cn.MOPS decomposes variations in the depth of coverage across samples into integer copy numbers and noise by means of its mixture components and Poisson distributions, respectively. The noise estimate allows for reducing the FDR by filtering out detections having high noise that are likely to be false detections. We compared cn.MOPS with the five most popular methods for CNV detection in NGS data using four benchmark datasets: (i) simulated data, (ii) NGS data from a male HapMap individual with implanted CNVs from the X chromosome, (iii) data from HapMap individuals with known CNVs, (iv) high coverage data from the 1000 Genomes Project. cn.MOPS outperformed its five competitors in terms of precision (1-FDR) and recall for both gains and losses in all benchmark data sets. The software cn.MOPS is publicly available as an R package at http://www.bioinf.jku.at/software/cnmops/ and at Bioconductor.
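The per-position mixture idea (decomposing read counts into integer copy numbers via Poisson components) can be illustrated with a toy posterior for a single sample at a single position. This is a deliberately simplified sketch, not the published cn.MOPS model, which pools information across samples and models noise explicitly:

```python
import math

def copy_number_posterior(read_count, lambda_diploid, max_cn=4, prior=None):
    """Posterior over integer copy numbers for one sample at one position,
    assuming counts ~ Poisson((cn/2) * lambda_diploid). A toy version of the
    Poisson-mixture idea behind cn.MOPS, not the published model."""
    eps = 1e-3  # small rate for cn = 0 so the Poisson stays well-defined

    def likelihood(cn):
        lam = max(cn / 2, eps) * lambda_diploid
        return math.exp(-lam) * lam ** read_count / math.factorial(read_count)

    cns = range(max_cn + 1)
    weights = [likelihood(cn) * (prior[cn] if prior else 1.0) for cn in cns]
    total = sum(weights)
    return [w / total for w in weights]

# A sample showing 40 reads where diploid coverage would predict 20
post = copy_number_posterior(read_count=40, lambda_diploid=20)
best = post.index(max(post))
print(best)
```

With twice the diploid-expected coverage, the posterior mass concentrates on four copies, mirroring how cn.MOPS reads integer copy numbers out of its mixture components.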

  7. An automated laboratory-scale methodology for the generation of sheared mammalian cell culture samples.

    Science.gov (United States)

    Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel

    2017-05-01

    Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing-scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot-scale. The process of scaling down centrifugation has historically been challenging due to the difficulties in mimicking the Energy Dissipation Rates (EDRs) in typical machines. This paper describes an alternative and easy-to-assemble automated capillary-based methodology to generate levels of EDRs consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10⁵ W/kg) are consistent with those obtained through previously published computational fluid dynamic (CFD) studies (2.0×10⁵ W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of the effects of culture hold time, culture temperature and EDRs on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
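The paper's EDR levels come from experiment and CFD, but for orientation the mean EDR in a capillary can be estimated from first principles as dissipated power (pressure drop times flow rate) divided by the mass of fluid in the tube. The sketch below assumes laminar Hagen-Poiseuille flow of a water-like fluid and purely hypothetical dimensions; real capillary devices at such flow rates may be transitional or turbulent, where this estimate understates the local EDR:

```python
import math

def mean_edr_laminar(flow_m3_s, diameter_m, length_m, mu=1e-3, rho=1e3):
    """Mean energy dissipation rate (W/kg) in a capillary under laminar
    Hagen-Poiseuille flow: (pressure drop x flow rate) / fluid mass in tube.
    mu: dynamic viscosity (Pa s), rho: density (kg/m^3); water-like defaults."""
    dp = 128 * mu * length_m * flow_m3_s / (math.pi * diameter_m ** 4)
    mass = rho * math.pi * (diameter_m / 2) ** 2 * length_m
    return dp * flow_m3_s / mass

# Hypothetical: 30 mL/min through a 0.25 mm ID, 10 cm long capillary
print(mean_edr_laminar(30e-6 / 60, 0.25e-3, 0.1))
```

The strong dependence on diameter (EDR scales as d⁻⁶ at fixed flow) is why small changes in capillary internal diameter and flow rate span the wide EDR range the authors needed.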

  8. Real-time colour hologram generation based on ray-sampling plane with multi-GPU acceleration.

    Science.gov (United States)

    Sato, Hirochika; Kakue, Takashi; Ichihashi, Yasuyuki; Endo, Yutaka; Wakunami, Koki; Oi, Ryutaro; Yamamoto, Kenji; Nakayama, Hirotaka; Shimobaba, Tomoyoshi; Ito, Tomoyoshi

    2018-01-24

    Although electro-holography can reconstruct three-dimensional (3D) motion pictures, its computational cost is too heavy to allow for real-time reconstruction of 3D motion pictures. This study explores accelerating colour hologram generation using light-ray information on a ray-sampling (RS) plane with a graphics processing unit (GPU) to realise a real-time holographic display system. We refer to an image corresponding to light-ray information as an RS image. Colour holograms were generated from three RS images with resolutions of 2,048 × 2,048; 3,072 × 3,072 and 4,096 × 4,096 pixels. The computational results indicate that the generation of the colour holograms using multiple GPUs (NVIDIA GeForce GTX 1080) was approximately 300-500 times faster than those generated using a central processing unit. In addition, the results demonstrate that 3D motion pictures were successfully reconstructed from RS images of 3,072 × 3,072 pixels at approximately 15 frames per second using an electro-holographic reconstruction system in which colour holograms were generated from RS images in real time.

  9. Generation, Characterization and Application of Antibodies Directed against HERV-H Gag Protein in Colorectal Samples.

    Science.gov (United States)

    Mullins, Christina S; Hühns, Maja; Krohn, Mathias; Peters, Sven; Cheynet, Valérie; Oriol, Guy; Guillotte, Michèle; Ducrot, Sandrine; Mallet, François; Linnebacher, Michael

    2016-01-01

    A substantial part of the human genome originates from transposable elements, remnants of ancient retroviral infections. Roughly 8% of the human genome consists of about 400,000 LTR elements including human endogenous retrovirus (HERV) sequences. Mainly, the interplay between epigenetic and post-transcriptional mechanisms is thought to silence HERV expression in most physiological contexts. Interestingly, aberrant reactivation of several HERV-H loci appears specific to colorectal carcinoma (CRC). The expression of HERV-H Gag proteins (Gag-H) was assessed using novel monoclonal mouse anti-Gag-H antibodies. In a flow cytometry screen four antibody clones were tested on a panel of primary CRC cell lines and the best-performing ones were subsequently validated by western blot analysis. Finally, Gag-H protein expression was analyzed by immune histology on cell line cytospins and on clinical samples. There, we found a heterogeneous staining pattern with no background staining of endothelial, stromal and infiltrating immune cells but diffuse staining of the cytoplasm for positive tumor and normal crypt cells of the colonic epithelium. Taken together, the Gag-H antibody clone(s) present a valuable tool for staining of cells with colonic origin and thus form the basis for future more detailed investigations. The observed Gag-H protein staining in colonic epithelium crypt cells demands profound analyses of a potential role for Gag-H in the normal physiology of the human gut.

  10. Generation, Characterization and Application of Antibodies Directed against HERV-H Gag Protein in Colorectal Samples.

    Directory of Open Access Journals (Sweden)

    Christina S Mullins

    Full Text Available A substantial part of the human genome originates from transposable elements, remnants of ancient retroviral infections. Roughly 8% of the human genome consists of about 400,000 LTR elements including human endogenous retrovirus (HERV) sequences. Mainly, the interplay between epigenetic and post-transcriptional mechanisms is thought to silence HERV expression in most physiological contexts. Interestingly, aberrant reactivation of several HERV-H loci appears specific to colorectal carcinoma (CRC). The expression of HERV-H Gag proteins (Gag-H) was assessed using novel monoclonal mouse anti-Gag-H antibodies. In a flow cytometry screen four antibody clones were tested on a panel of primary CRC cell lines and the best-performing ones were subsequently validated by western blot analysis. Finally, Gag-H protein expression was analyzed by immune histology on cell line cytospins and on clinical samples. There, we found a heterogeneous staining pattern with no background staining of endothelial, stromal and infiltrating immune cells but diffuse staining of the cytoplasm for positive tumor and normal crypt cells of the colonic epithelium. Taken together, the Gag-H antibody clone(s) present a valuable tool for staining of cells with colonic origin and thus form the basis for future more detailed investigations. The observed Gag-H protein staining in colonic epithelium crypt cells demands profound analyses of a potential role for Gag-H in the normal physiology of the human gut.

  11. Generation, Characterization and Application of Antibodies Directed against HERV-H Gag Protein in Colorectal Samples

    Science.gov (United States)

    Mullins, Christina S.; Hühns, Maja; Krohn, Mathias; Peters, Sven; Cheynet, Valérie; Oriol, Guy; Guillotte, Michèle; Ducrot, Sandrine; Mallet, François; Linnebacher, Michael

    2016-01-01

    Introduction A substantial part of the human genome originates from transposable elements, remnants of ancient retroviral infections. Roughly 8% of the human genome consists of about 400,000 LTR elements including human endogenous retrovirus (HERV) sequences. Mainly, the interplay between epigenetic and post-transcriptional mechanisms is thought to silence HERV expression in most physiological contexts. Interestingly, aberrant reactivation of several HERV-H loci appears specific to colorectal carcinoma (CRC). Results The expression of HERV-H Gag proteins (Gag-H) was assessed using novel monoclonal mouse anti-Gag-H antibodies. In a flow cytometry screen four antibody clones were tested on a panel of primary CRC cell lines and the best-performing ones were subsequently validated by western blot analysis. Finally, Gag-H protein expression was analyzed by immune histology on cell line cytospins and on clinical samples. There, we found a heterogeneous staining pattern with no background staining of endothelial, stromal and infiltrating immune cells but diffuse staining of the cytoplasm for positive tumor and normal crypt cells of the colonic epithelium. Conclusion Taken together, the Gag-H antibody clone(s) present a valuable tool for staining of cells with colonic origin and thus form the basis for future more detailed investigations. The observed Gag-H protein staining in colonic epithelium crypt cells demands profound analyses of a potential role for Gag-H in the normal physiology of the human gut. PMID:27119520

  12. Evaluation of gene expression data generated from expired Affymetrix GeneChip® microarrays using MAQC reference RNA samples

    Directory of Open Access Journals (Sweden)

    Tong Weida

    2010-10-01

    Full Text Available Abstract Background The Affymetrix GeneChip® system is a commonly used platform for microarray analysis but the technology is inherently expensive. Unfortunately, changes in experimental planning and execution, such as the unavailability of previously anticipated samples or a shift in research focus, may render significant numbers of pre-purchased GeneChip® microarrays unprocessed before their manufacturer’s expiration dates. Researchers and microarray core facilities wonder whether expired microarrays are still useful for gene expression analysis. In addition, it was not clear whether the two human reference RNA samples established by the MAQC project in 2005 still maintained their transcriptome integrity over a period of four years. Experiments were conducted to answer these questions. Results Microarray data were generated in 2009 in three replicates for each of the two MAQC samples with either expired Affymetrix U133A or unexpired U133Plus2 microarrays. These results were compared with data obtained in 2005 on the U133Plus2 microarray. The percentage of overlap between the lists of differentially expressed genes (DEGs from U133Plus2 microarray data generated in 2009 and in 2005 was 97.44%. While there was some degree of fold change compression in the expired U133A microarrays, the percentage of overlap between the lists of DEGs from the expired and unexpired microarrays was as high as 96.99%. Moreover, the microarray data generated using the expired U133A microarrays in 2009 were highly concordant with microarray and TaqMan® data generated by the MAQC project in 2005. Conclusions Our results demonstrated that microarray data generated using U133A microarrays, which were more than four years past the manufacturer’s expiration date, were highly specific and consistent with those from unexpired microarrays in identifying DEGs despite some appreciable fold change compression and decrease in sensitivity. Our data also suggested that the

  13. Near-optimal alternative generation using modified hit-and-run sampling for non-linear, non-convex problems

    Science.gov (United States)

    Rosenberg, D. E.; Alafifi, A.

    2016-12-01

    Water resources systems analysis often focuses on finding optimal solutions. Yet an optimal solution is optimal only for the modelled issues, and managers often seek near-optimal alternatives that address un-modelled objectives, preferences, limits, uncertainties, and other issues. Early on, Modelling to Generate Alternatives (MGA) formalized the near-optimal region as the region comprising the original problem constraints plus a new constraint that allowed performance within a specified tolerance of the optimal objective function value. MGA identified a few maximally-different alternatives from the near-optimal region. Subsequent work applied Markov Chain Monte Carlo (MCMC) sampling to generate a larger number of alternatives that span the near-optimal region of linear problems, or select portions of it for non-linear problems. We extend the MCMC Hit-And-Run method to generate alternatives that span the full extent of the near-optimal region for non-linear, non-convex problems. First, start at a feasible hit point within the near-optimal region, then run a random distance in a random direction to a new hit point. Next, repeat until generating the desired number of alternatives. The key step at each iteration is to run a random distance along the line in the specified direction to a new hit point. If linear equality constraints exist, we construct an orthogonal basis and use a null-space transformation to confine hits and runs to a lower-dimensional space. Linear inequality constraints define the convex bounds on the line that runs through the current hit point in the specified direction. We then use slice sampling to identify a new hit point along the line within bounds defined by the non-linear inequality constraints. This technique is computationally efficient compared to prior near-optimal alternative generation techniques such as MGA, MCMC Metropolis-Hastings, evolutionary, or firefly algorithms because search at each iteration is confined to the hit line, the algorithm can move in one
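
The hit-and-run loop described above can be sketched in a few lines. The rejection-style feasibility check below is a simplification (the paper confines runs via null-space transformations and slice sampling rather than simple rejection), and all names, defaults, and the toy tolerance handling are illustrative:

```python
import math
import random

def hit_and_run_alternatives(objective, constraints, x0, f_best, tolerance,
                             n_samples=100, max_step=1.0, seed=0):
    """Generate near-optimal alternatives by hit-and-run sampling.

    The near-optimal region is the set of points satisfying every
    constraint g(x) <= 0 whose objective value is within `tolerance`
    of the best-known value `f_best` (minimization assumed).
    """
    rng = random.Random(seed)
    dim = len(x0)

    def in_region(x):
        return (objective(x) <= f_best + tolerance
                and all(g(x) <= 0 for g in constraints))

    x = list(x0)
    assert in_region(x), "start point must lie inside the near-optimal region"
    samples = []
    while len(samples) < n_samples:
        # Run a random distance in a uniformly random direction
        # from the current hit point.
        d = [rng.gauss(0.0, 1.0) for _ in range(dim)]
        norm = math.sqrt(sum(v * v for v in d))
        step = rng.uniform(-max_step, max_step)
        candidate = [xi + step * di / norm for xi, di in zip(x, d)]
        # Keep the new hit only if it stays inside the (possibly
        # non-convex) near-optimal region; otherwise draw again.
        if in_region(candidate):
            x = candidate
            samples.append(x)
    return samples
```

For example, with objective `x[0]**2 + x[1]**2`, `f_best=0` and `tolerance=1`, every returned alternative stays inside the unit disk.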

  14. EXONSAMPLER: a computer program for genome-wide and candidate gene exon sampling for targeted next-generation sequencing.

    Science.gov (United States)

    Cosart, Ted; Beja-Pereira, Albano; Luikart, Gordon

    2014-11-01

    The computer program EXONSAMPLER automates the sampling of thousands of exon sequences from publicly available reference genome sequences and gene annotation databases. It was designed to provide exon sequences for the efficient, next-generation gene sequencing method called exon capture. The exon sequences can be sampled by a list of gene name abbreviations (e.g. IFNG, TLR1), or by sampling exons from genes spaced evenly across chromosomes. It provides a list of genomic coordinates (a bed file), as well as a set of sequences in fasta format. User-adjustable parameters for collecting exon sequences include a minimum and maximum acceptable exon length, maximum number of exonic base pairs (bp) to sample per gene, and maximum total bp for the entire collection. It allows for partial sampling of very large exons. It can preferentially sample upstream (5′) exons, downstream (3′) exons, both external exons, or all internal exons. It is written in the Python programming language using its free libraries. We describe the use of EXONSAMPLER to collect exon sequences from the domestic cow (Bos taurus) genome for the design of an exon-capture microarray to sequence exons from related species, including the zebu cow and wild bison. We collected ~10% of the exome (~3 million bp), including 155 candidate genes, and ~16,000 exons evenly spaced genome-wide. We prioritized the collection of 5′ exons to facilitate discovery and genotyping of SNPs near upstream gene regulatory DNA sequences, which control gene expression and are often under natural selection. © 2014 John Wiley & Sons Ltd.
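
The user-adjustable length and bp-budget parameters can be illustrated with a small filter. This is a hypothetical sketch of EXONSAMPLER-style selection, not the program's actual code; it assumes BED-style half-open coordinates and exons listed 5′ to 3′ within each gene:

```python
def sample_exons(exons, min_len=120, max_len=10000, max_bp_per_gene=2500,
                 max_total_bp=3_000_000, prefer_5prime=True):
    """Filter exons by length and cap the bp sampled per gene and in total.

    `exons` is a list of (gene, chrom, start, end) tuples with BED-style
    half-open coordinates, ordered 5' to 3' within each gene.
    """
    selected, per_gene_bp, total_bp = [], {}, 0
    # Walking the list forward favors 5' exons; reversed favors 3' exons.
    ordered = exons if prefer_5prime else list(reversed(exons))
    for gene, chrom, start, end in ordered:
        length = end - start
        if not (min_len <= length <= max_len):
            continue  # exon outside the acceptable length window
        if per_gene_bp.get(gene, 0) + length > max_bp_per_gene:
            continue  # would exceed this gene's bp budget
        if total_bp + length > max_total_bp:
            break     # collection-wide budget exhausted
        selected.append((gene, chrom, start, end))
        per_gene_bp[gene] = per_gene_bp.get(gene, 0) + length
        total_bp += length
    return selected, total_bp
```

A real tool would also handle partial sampling of oversized exons; here they are simply skipped.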

  15. Determination of ultra trace arsenic species in water samples by hydride generation atomic absorption spectrometry after cloud point extraction

    Energy Technology Data Exchange (ETDEWEB)

    Ulusoy, Halil Ibrahim, E-mail: hiulusoy@yahoo.com [University of Cumhuriyet, Faculty of Science, Department of Chemistry, TR-58140, Sivas (Turkey); Akcay, Mehmet; Ulusoy, Songuel; Guerkan, Ramazan [University of Cumhuriyet, Faculty of Science, Department of Chemistry, TR-58140, Sivas (Turkey)

    2011-10-10

    Graphical abstract: The possible complex formation mechanism for ultra-trace As determination. Highlights: • A CPE/HGAAS system for arsenic determination and speciation in real samples is applied here for the first time. • The proposed method has the lowest detection limit among comparable CPE studies reported in the literature. • The linear range of the method is wide and suitable for application to real samples. - Abstract: Cloud point extraction (CPE) methodology has successfully been employed for the preconcentration of ultra-trace arsenic species in aqueous samples prior to hydride generation atomic absorption spectrometry (HGAAS). As(III) formed an ion-pairing complex with Pyronine B in the presence of sodium dodecyl sulfate (SDS) at pH 10.0 and was extracted into the non-ionic surfactant polyethylene glycol tert-octylphenyl ether (Triton X-114). After phase separation, the surfactant-rich phase was diluted with 2 mL of 1 M HCl and 0.5 mL of 3.0% (w/v) Antifoam A. Under the optimized conditions, a preconcentration factor of 60 and a detection limit of 0.008 μg L⁻¹ with a correlation coefficient of 0.9918 were obtained with a calibration curve in the range of 0.03-4.00 μg L⁻¹. The proposed preconcentration procedure was successfully applied to the determination of As(III) ions in certified standard water samples (TMDA-53.3 and NIST 1643e, a low-level fortified standard for trace elements) and some real samples, including natural drinking water and tap water.

  16. Synthesis of high generation thermo-sensitive dendrimers for extraction of rivaroxaban from human fluid and pharmaceutic samples.

    Science.gov (United States)

    Parham, Negin; Panahi, Homayon Ahmad; Feizbakhsh, Alireza; Moniri, Elham

    2018-04-13

    In the present study, poly(N-isopropylacrylamide) as a thermo-sensitive agent was grafted onto magnetic nanoparticles; ethylenediamine and methyl methacrylate were then used to synthesize poly(amidoamine) (PAMAM) dendrimers, the two steps alternating until the tenth dendrimer generation. The synthesized nanocomposite was investigated using Fourier transform infrared spectrometry, thermogravimetric analysis, X-ray diffractometry, elemental analysis and vibrating-sample magnetometry. The particle size and morphology were characterized using dynamic light scattering, field emission scanning electron microscopy and transmission electron microscopy. Batch experiments were conducted to investigate the parameters affecting adsorption and desorption of rivaroxaban by the synthesized nanocomposite. The maximum sorption of rivaroxaban by the synthesized nanocomposite was obtained at pH 8. The resulting grafted magnetic nanoparticle dendrimers were applied for the extraction of rivaroxaban from human biological fluids and pharmaceutical samples. The sorption behavior of rivaroxaban on the magnetic nanoparticle dendrimer showed good accessibility and high capacity of the active sites within the dendrimers. Extraction recoveries of more than 92.5% and 99.8% were obtained from urine and drug matrices, respectively. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Post-Flight Microbial Analysis of Samples from the International Space Station Water Recovery System and Oxygen Generation System

    Science.gov (United States)

    Birmele, Michele N.

    2011-01-01

    The Regenerative Environmental Control and Life Support System (ECLSS) on the International Space Station (ISS) includes the Water Recovery System (WRS) and the Oxygen Generation System (OGS). The WRS consists of a Urine Processor Assembly (UPA) and a Water Processor Assembly (WPA). This report describes microbial characterization of wastewater and surface samples collected from the WRS and OGS subsystems and returned to KSC, JSC, and MSFC on consecutive shuttle flights (STS-129 and STS-130) in 2009-10. STS-129 returned two filters that contained fluid samples from the WPA Waste Tank Orbital Replacement Unit (ORU), one from the waste tank and the other from the ISS humidity condensate. Direct count by microscopic enumeration revealed 8.38 × 10⁴ cells per mL in the humidity condensate sample, but none of those cells were recoverable on solid agar media. In contrast, 3.32 × 10⁵ cells per mL were measured from a surface swab of the WRS waste tank, including viable bacteria and fungi recovered after S12 days of incubation on solid agar media. Based on rDNA sequencing and phenotypic characterization, a fungus recovered from the filter was determined to be Lecythophora mutabilis. The bacterial isolate was identified by rDNA sequence data as Methylobacterium radiotolerans. Additional UPA subsystem samples were returned on STS-130 for analysis. Both liquid and solid samples were collected from the Russian urine container (EDV), Distillation Assembly (DA) and Recycle Filter Tank Assembly (RFTA) for post-flight analysis. The bacterium Pseudomonas aeruginosa and the fungus Chaetomium brasiliense were isolated from the EDV samples. No viable bacteria or fungi were recovered from RFTA brine samples (N = 6), but multiple samples (N = 11) from the DA and RFTA were found to contain fungal and bacterial cells. Many recovered cells have been identified to genus by rDNA sequencing and carbon source utilization profiling (Biolog GEN III). The presence of viable bacteria and fungi from WRS

  18. NGSCheckMate: software for validating sample identity in next-generation sequencing studies within and across data types.

    Science.gov (United States)

    Lee, Sejoon; Lee, Soohyun; Ouellette, Scott; Park, Woong-Yang; Lee, Eunjung A; Park, Peter J

    2017-06-20

    In many next-generation sequencing (NGS) studies, multiple samples or data types are profiled for each individual. An important quality control (QC) step in these studies is to ensure that datasets from the same subject are properly paired. Given the heterogeneity of data types, file types and sequencing depths in a multi-dimensional study, a robust program that provides a standardized metric for genotype comparisons would be useful. Here, we describe NGSCheckMate, a user-friendly software package for verifying sample identities from FASTQ, BAM or VCF files. This tool uses a model-based method to compare allele read fractions at known single-nucleotide polymorphisms, considering depth-dependent behavior of similarity metrics for identical and unrelated samples. Our evaluation shows that NGSCheckMate is effective for a variety of data types, including exome sequencing, whole-genome sequencing, RNA-seq, ChIP-seq, targeted sequencing and single-cell whole-genome sequencing, with a minimal requirement for sequencing depth (>0.5X). An alignment-free module can be run directly on FASTQ files for a quick initial check. We recommend using this software as a QC step in NGS studies. https://github.com/parklab/NGSCheckMate. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
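
The idea of comparing allele read fractions at known SNPs can be illustrated as follows. This is a simplified stand-in for NGSCheckMate's model-based method (which additionally models the depth-dependent behavior of the similarity metric); the names and the fixed threshold are hypothetical:

```python
import math

def vaf_correlation(vafs_a, vafs_b):
    """Pearson correlation of variant allele fractions at shared SNPs.

    `vafs_a` and `vafs_b` map SNP id -> allele fraction (0..1).
    Assumes at least two shared SNPs with non-constant fractions.
    """
    shared = sorted(set(vafs_a) & set(vafs_b))
    xs = [vafs_a[s] for s in shared]
    ys = [vafs_b[s] for s in shared]
    n = len(shared)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def same_subject(vafs_a, vafs_b, threshold=0.6):
    # High correlation of genotype-driven allele fractions suggests
    # the two datasets come from the same individual.
    return vaf_correlation(vafs_a, vafs_b) >= threshold
```

In the real tool the decision boundary is calibrated against depth, since low-coverage data adds sampling noise to the allele fractions.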

  19. Sample preparation for arsenic speciation analysis in baby food by generation of substituted arsines with atomic absorption spectrometry detection.

    Science.gov (United States)

    Huber, Charles S; Vale, Maria Goreti R; Dessuy, Morgana B; Svoboda, Milan; Musil, Stanislav; Dědina, Jiři

    2017-12-01

    A slurry sampling procedure for arsenic speciation analysis in baby food by arsane generation, cryogenic trapping and detection with atomic absorption spectrometry is presented. Several procedures were tested for slurry preparation, including different reagents (HNO₃, HCl and tetramethylammonium hydroxide - TMAH) and their concentrations, water bath heating and ultrasound-assisted agitation. The best results for inorganic arsenic (iAs) and dimethylarsinate (DMA) were reached when using 3 mol L⁻¹ HCl under heating and ultrasound-assisted agitation. The developed method was applied to the analysis of five porridge powder and six baby meal samples. The trueness of the method was checked with a certified reference material (CRM) for total arsenic (tAs), iAs and DMA in rice (ERM-BC211). Arsenic recoveries (mass balance) for all samples and the CRM were performed by determining tAs by inductively coupled plasma mass spectrometry (ICP-MS) after microwave-assisted digestion and comparing it against the sum of the results from the speciation analysis. The relative limits of detection were 0.44, 0.24 and 0.16 µg kg⁻¹ for iAs, methylarsonate and DMA, respectively. The concentrations of the most toxic arsenic species (iAs) in the analyzed baby food samples ranged between 4.2 and 99 µg kg⁻¹, below the limits of 300, 200 and 100 µg kg⁻¹ set by the Brazilian, Chinese and European legislation, respectively. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Next-Generation Sequencing Workflow for NSCLC Critical Samples Using a Targeted Sequencing Approach by Ion Torrent PGM™ Platform.

    Science.gov (United States)

    Vanni, Irene; Coco, Simona; Truini, Anna; Rusmini, Marta; Dal Bello, Maria Giovanna; Alama, Angela; Banelli, Barbara; Mora, Marco; Rijavec, Erika; Barletta, Giulia; Genova, Carlo; Biello, Federica; Maggioni, Claudia; Grossi, Francesco

    2015-12-03

    Next-generation sequencing (NGS) is a cost-effective technology capable of screening several genes simultaneously; however, its application in a clinical context requires an established workflow to acquire reliable sequencing results. Here, we report an optimized NGS workflow analyzing 22 lung cancer-related genes to sequence critical samples such as DNA from formalin-fixed paraffin-embedded (FFPE) blocks and circulating free DNA (cfDNA). Snap frozen and matched FFPE gDNA from 12 non-small cell lung cancer (NSCLC) patients, whose gDNA fragmentation status was previously evaluated using a multiplex PCR-based quality control, were successfully sequenced with Ion Torrent PGM™. The robust bioinformatic pipeline allowed us to correctly call both Single Nucleotide Variants (SNVs) and indels with a detection limit of 5%, achieving 100% specificity and 96% sensitivity. This workflow was also validated in 13 FFPE NSCLC biopsies. Furthermore, a specific protocol for low input gDNA capable of producing good sequencing data with high coverage, high uniformity, and a low error rate was also optimized. In conclusion, we demonstrate the feasibility of obtaining gDNA from FFPE samples suitable for NGS by performing appropriate quality controls. The optimized workflow, capable of screening low input gDNA, highlights NGS as a potential tool in the detection, disease monitoring, and treatment of NSCLC.

  1. Cloud point extraction for trace inorganic arsenic speciation analysis in water samples by hydride generation atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Li, Shan, E-mail: ls_tuzi@163.com; Wang, Mei, E-mail: wmei02@163.com; Zhong, Yizhou, E-mail: yizhz@21cn.com; Zhang, Zehua, E-mail: kazuki.0101@aliyun.com; Yang, Bingyi, E-mail: e_yby@163.com

    2015-09-01

    A new cloud point extraction technique was established and used for the determination of trace inorganic arsenic species in water samples combined with hydride generation atomic fluorescence spectrometry (HGAFS). As(III) and As(V) were complexed with ammonium pyrrolidinedithiocarbamate and molybdate, respectively. The complexes were quantitatively extracted with the non-ionic surfactant (Triton X-114) by centrifugation. After addition of antifoam, the surfactant-rich phase containing As(III) was diluted with 5% HCl for HGAFS determination. For As(V) determination, 50% HCl was added to the surfactant-rich phase, and the mixture was placed in an ultrasonic bath at 70 °C for 30 min. As(V) was reduced to As(III) with thiourea-ascorbic acid solution, followed by HGAFS. Under the optimum conditions, limits of detection of 0.009 and 0.012 μg/L were obtained for As(III) and As(V), respectively. Concentration factors of 9.3 and 7.9, respectively, were obtained for a 50 mL sample. The precisions were 2.1% for As(III) and 2.3% for As(V). The proposed method was successfully used for the determination of trace As(III) and As(V) in water samples, with satisfactory recoveries. - Highlights: • Cloud point extraction combined with HGAFS is established for the first time to determine trace inorganic arsenic (As) species. • Separate As(III) and As(V) determinations improve the accuracy. • Ultrasonic release of complexed As(V) enables complete As(V) reduction to As(III). • Direct HGAFS analysis can be performed.
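
Detection limits like those reported above are conventionally derived from the calibration curve and replicate blank measurements. Below is a generic sketch of the 3σ (IUPAC-style) calculation, not the authors' exact procedure; all numbers in the usage are invented for illustration:

```python
import statistics

def linear_fit(x, y):
    """Ordinary least-squares fit of the calibration curve y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

def detection_limit(blank_signals, slope, k=3.0):
    """IUPAC-style LOD: k times the blank standard deviation over the slope."""
    return k * statistics.stdev(blank_signals) / slope
```

For instance, standards at 0-3 μg/L giving signals 0.1, 2.1, 4.1, 6.1 yield a slope of 2.0 per μg/L, and four blank readings with standard deviation ~0.016 give an LOD of roughly 0.024 μg/L.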

  2. An in vitro tag-and-modify protein sample generation method for single-molecule fluorescence resonance energy transfer.

    Science.gov (United States)

    Hamadani, Kambiz M; Howe, Jesse; Jensen, Madeleine K; Wu, Peng; Cate, Jamie H D; Marqusee, Susan

    2017-09-22

    Biomolecular systems exhibit many dynamic and biologically relevant properties, such as conformational fluctuations, multistep catalysis, transient interactions, folding, and allosteric structural transitions. These properties are challenging to detect and engineer using standard ensemble-based techniques. To address this drawback, single-molecule methods offer a way to access conformational distributions, transient states, and asynchronous dynamics inaccessible to these standard techniques. Fluorescence-based single-molecule approaches are parallelizable and compatible with multiplexed detection; to date, however, they have remained limited to serial screens of small protein libraries. This stems from the current absence of methods for generating either individual dual-labeled protein samples at high throughputs or protein libraries compatible with multiplexed screening platforms. Here, we demonstrate that by combining purified and reconstituted in vitro translation, quantitative unnatural amino acid incorporation via AUG codon reassignment, and copper-catalyzed azide-alkyne cycloaddition, we can overcome these challenges for target proteins that are, or can be, methionine-depleted. We present an in vitro parallelizable approach that does not require laborious target-specific purification to generate dual-labeled proteins and ribosome-nascent chain libraries suitable for single-molecule FRET-based conformational phenotyping. We demonstrate the power of this approach by tracking the effects of mutations, C-terminal extensions, and ribosomal tethering on the structure and stability of three protein model systems: barnase, spectrin, and T4 lysozyme. Importantly, dual-labeled ribosome-nascent chain libraries enable single-molecule co-localization of genotypes with phenotypes, are well suited for multiplexed single-molecule screening of protein libraries, and should enable the in vitro directed evolution of proteins with designer single-molecule conformational

  3. Analytical and between-subject variation of thrombin generation measured by calibrated automated thrombography on plasma samples.

    Science.gov (United States)

    Kristensen, Anne F; Kristensen, Søren R; Falkmer, Ursula; Münster, Anna-Marie B; Pedersen, Shona

    2018-05-01

    The Calibrated Automated Thrombography (CAT) is an in vitro thrombin generation (TG) assay that holds promise as a valuable tool within clinical diagnostics. However, the technique has a considerable analytical variation, and we therefore investigated the analytical and between-subject variation of CAT systematically. Moreover, we assessed the application of an internal standard for normalization to diminish variation. 20 healthy volunteers each donated one blood sample, which was subsequently centrifuged, aliquoted and stored at -80 °C prior to analysis. The analytical variation was determined over eight runs, in which plasma from the same seven volunteers was processed in triplicate; for the between-subject variation, TG analysis was performed on plasma from all 20 volunteers. The trigger reagents used for the TG assays included both PPP reagent containing 5 pM tissue factor (TF) and PPPlow with 1 pM TF. Plasma drawn from a single donor was applied to all plates as an internal standard for each TG analysis and subsequently used for normalization. The total analytical variation for TG analysis performed with PPPlow reagent is 3-14%, and 9-13% for PPP reagent. This variation can be only slightly reduced by using an internal standard, and mainly for the ETP (endogenous thrombin potential). The between-subject variation is higher when using PPPlow than PPP, and it is considerably higher than the analytical variation. Thus, TG has a rather high inherent analytical variation, which is nevertheless considerably lower than the between-subject variation when using PPPlow as reagent.
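
Normalization against a single-donor internal standard amounts to a per-plate rescaling of the readouts (e.g. ETP). The sketch below illustrates that arithmetic only; the reference value, readouts, and function name are hypothetical, not the authors' protocol:

```python
def normalize_to_internal_standard(values, internal_standard, reference):
    """Rescale one plate's thrombin-generation readouts (e.g. ETP values).

    `internal_standard` is the single-donor plasma result measured on this
    plate; `reference` is its agreed long-run target value. Dividing out
    the plate-specific ratio damps plate-to-plate analytical variation.
    """
    factor = reference / internal_standard
    return [v * factor for v in values]
```

If a plate reads the internal standard 10% high, every sample on that plate is scaled down by the same ratio.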

  4. Use of oxidative and reducing vapor generation for reducing the detection limits of iodine in biological samples by inductively coupled plasma atomic emission spectrometry

    International Nuclear Information System (INIS)

    Vtorushina, Eh.A.; Saprykin, A.I.; Knapp, G.

    2009-01-01

    Procedures of microwave combustion in an oxygen flow and microwave acid decomposition of biological samples were optimized for the subsequent determination of iodine. A new method was proposed for the generation of molecular iodine from periodate ions using hydrogen peroxide as a reductant. Procedures were developed for determining iodine in biological samples by inductively coupled plasma atomic emission spectrometry (ICP-AES) using oxidative and reducing vapor generation; these allowed the detection limit for iodine to be lowered by 3-4 orders of magnitude. The developed procedures were used to analyze certified reference materials of milk (Skim Milk Powder BCR 150) and seaweed (Sea Lettuce BCR 279) and a Supradyn vitamin complex

  5. On Matrix Sampling and Imputation of Context Questionnaires with Implications for the Generation of Plausible Values in Large-Scale Assessments

    Science.gov (United States)

    Kaplan, David; Su, Dan

    2016-01-01

    This article presents findings on the consequences of matrix sampling of context questionnaires for the generation of plausible values in large-scale assessments. Three studies are conducted. Study 1 uses data from PISA 2012 to examine several different forms of missing data imputation within the chained equations framework: predictive mean…

  6. Studies of gel metal-oxide composite samples as filling materials for W-188/Re-188 generator column

    Czech Academy of Sciences Publication Activity Database

    Iller, E.; Polkowska-Motrenko, H.; Lada, W.; Wawszczak, D.; Sypula, M.; Doner, K.; Konior, M.; Milczarek, J.; Zoladek, J.; Ráliš, Jan

    2009-01-01

    Vol. 281, No. 1 (2009), pp. 83-86. ISSN 0236-5731. [9th International Conference on Nuclear Analytical Methods in the Life Sciences. Lisbon, 07.09.2008-12.09.2008] Institutional research plan: CEZ:AV0Z10480505 Keywords: W-188/Re-188 generator * W-Zr gels * W-Zr composites * Sol-gel process Subject RIV: CH - Nuclear; Quantum Chemistry Impact factor: 0.631, year: 2009

  7. Direct determination of arsenic in soil samples by fast pyrolysis–chemical vapor generation using sodium formate as a reductant followed by nondispersive atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Xuchuan; Zhang, Jingya; Bu, Fanlong

    2015-09-01

    This new study shows for the first time that sodium formate can react with trace arsenic to form volatile species via fast pyrolysis - chemical vapor generation. We found that the presence of thiourea greatly enhanced the generation efficiency and eliminated the interference of copper. We studied the reaction temperature, the volume of sodium formate, the reaction acidity, and the carrier argon flow rate using nondispersive atomic fluorescence spectrometry. Under the optimal conditions of T = 500 °C, volumes of 30% sodium formate and 10% thiourea of 0.2 ml and 0.05 ml, respectively, and a carrier argon flow rate of 300 ml min⁻¹, the detection limit and precision for arsenic were 0.39 ng and 3.25%, respectively. The amount of arsenic in soil can be directly determined by adding a trace amount of hydrochloric acid as a decomposition reagent, without any sample pretreatment. The method was successfully applied to determine trace amounts of arsenic in two soil certified reference materials (GBW07453 and GBW07450), and the results were found to be in agreement with the certified reference values. - Highlights: • Sodium formate can react with trace arsenic to form volatile species via pyrolysis-chemical vapor generation. • Thiourea can enhance the generation efficiency and eliminate the interference of copper. • Arsenic in soil samples can be directly determined without sample pretreatment.

  8. Air sampling to assess potential generation of aerosolized viable bacteria during flow cytometric analysis of unfixed bacterial suspensions

    Science.gov (United States)

    Carson, Christine F; Inglis, Timothy JJ

    2018-01-01

    This study investigated aerosolized viable bacteria in a university research laboratory during operation of an acoustic-assisted flow cytometer for antimicrobial susceptibility testing by sampling room air before, during and after flow cytometer use. The aim was to assess the risk associated with use of an acoustic-assisted flow cytometer analyzing unfixed bacterial suspensions. Air sampling in a nearby clinical laboratory was conducted during the same period to provide context for the existing background of microorganisms that would be detected in the air. The three species of bacteria undergoing analysis by flow cytometer in the research laboratory were Klebsiella pneumoniae, Burkholderia thailandensis and Streptococcus pneumoniae. None of these was detected from multiple 1000 L air samples acquired in the research laboratory environment. The main cultured bacteria in both locations were skin commensal and environmental bacteria, presumed to have been disturbed or dispersed in laboratory air by personnel movements during routine laboratory activities. The concentrations of bacteria detected in research laboratory air samples were reduced after interventional cleaning measures were introduced and were lower than those in the diagnostic clinical microbiology laboratory. We conclude that our flow cytometric analyses of unfixed suspensions of K. pneumoniae, B. thailandensis and S. pneumoniae do not pose a risk to cytometer operators or other personnel in the laboratory but caution against extrapolation of our results to other bacteria and/or different flow cytometric experimental procedures. PMID:29608197

  9. Evaluation of the 99Mo contamination in eluate samples from 99mTc generators in a clinic in Recife, Brazil

    International Nuclear Information System (INIS)

    Andrade, W.G.; Lima, F.F.

    2008-01-01

    This study evaluates the 99Mo content in eluates of 99Mo/99mTc generators used in a nuclear medicine service in Recife. Eluate samples were collected from 5 elutions of each of 10 different generators and measured by the attenuation method on the service's own routine activimeter (model CRC-127R, manufactured by Capintec). The activities of 99mTc and 99Mo were determined, and the MBT (molybdenum breakthrough) was calculated for the 1st, 3rd, 5th, 7th and 9th elution of each generator. One sample presented molybdenum in an amount near the limit set by the United States Pharmacopeia (USP, 0.15 μCi/mCi). A second sample presented a considerably higher value, more than double the USP limit. The results obtained demonstrate the possibility of finding 99Mo in the eluted solution, which reinforces the need to include the molybdenum-content control test for every elution in the quality control programs of nuclear medicine services
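
The MBT acceptance check reduces to a simple ratio against the USP limit quoted above. A minimal sketch using the same units (μCi of 99Mo per mCi of 99mTc); the function names are illustrative:

```python
# USP acceptance limit for molybdenum breakthrough:
# 0.15 uCi of Mo-99 per mCi of Tc-99m at the time of administration.
USP_LIMIT = 0.15

def molybdenum_breakthrough(mo99_activity_uCi, tc99m_activity_mCi):
    """MBT expressed in uCi of Mo-99 per mCi of Tc-99m."""
    return mo99_activity_uCi / tc99m_activity_mCi

def eluate_passes(mo99_activity_uCi, tc99m_activity_mCi):
    """True if the eluate meets the USP molybdenum-breakthrough limit."""
    return molybdenum_breakthrough(mo99_activity_uCi,
                                   tc99m_activity_mCi) <= USP_LIMIT
```

An eluate with 10 μCi of 99Mo in 100 mCi of 99mTc (MBT = 0.10) passes; 35 μCi in the same eluate (MBT = 0.35, more than double the limit) fails.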

  10. The use of 99Mo/99mTc generators in the analysis of low levels of 99Tc in environmental samples by radiochemical methods

    International Nuclear Information System (INIS)

    Dowdall, M.; Selnaes, Oe.G.; Lind, B.; Gwynn, J.P.

    2010-01-01

    The analysis of low levels of 99Tc in environmental samples presents special challenges, particularly with respect to the selection of an appropriate and practicable chemical yield tracer. Of all the tracers available, 99mTc eluted from 99Mo/99mTc generators appears to be the most practicable in terms of availability, ease of use and cost. These factors have led to an increase in the use of such generators for the provision of 99mTc as a yield tracer for 99Tc. For the analysis of low levels ( 3 or kg) of 99Tc in environmental samples, consideration must be given to the radiochemical purity of the tracer solution with respect to contamination with both 99Tc and other radionuclides. Because the extent of the interference varies from tracer solution to tracer solution, it is unwise to try to establish a correction factor for any single generator. The only practical solution to the problem is therefore to run a 'blank' sample with each batch of samples drawn from a single tracer solution. (LN)
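
Using eluted 99mTc as a yield tracer implies decay-correcting the recovered tracer activity (99mTc half-life about 6.01 h) and subtracting a per-batch blank, as recommended above. A sketch of that arithmetic with illustrative function names, not the authors' procedure:

```python
import math

TC99M_HALF_LIFE_H = 6.01  # hours

def decay_correct(activity, elapsed_hours, half_life=TC99M_HALF_LIFE_H):
    """Correct a measured Tc-99m activity back to the spiking time."""
    return activity * math.exp(math.log(2) * elapsed_hours / half_life)

def chemical_yield(tracer_added, tracer_recovered, elapsed_hours):
    """Fraction of Tc carried through the separation, from the Tc-99m tracer."""
    return decay_correct(tracer_recovered, elapsed_hours) / tracer_added

def tc99_result(measured, batch_blank, yield_fraction):
    """Blank-subtract and yield-correct a Tc-99 measurement."""
    return (measured - batch_blank) / yield_fraction
```

For example, a tracer spike fully recovered but counted one half-life later reads at half its activity; decay correction restores a chemical yield of 1.0, and a sample reading of 10.5 units with a 0.5-unit batch blank at 80% yield gives 12.5 units.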

  11. Disambiguate: An open-source application for disambiguating two species in next generation sequencing data from grafted samples.

    Science.gov (United States)

    Ahdesmäki, Miika J; Gray, Simon R; Johnson, Justin H; Lai, Zhongwu

    2016-01-01

    Grafting of cell lines and primary tumours is a crucial step in the drug development process between cell line studies and clinical trials. Disambiguate is a program for computationally separating the sequencing reads of two species derived from grafted samples. Disambiguate operates on DNA or RNA-seq alignments to the two species and separates the components at very high sensitivity and specificity, as illustrated in artificially mixed human-mouse samples. This allows for maximum recovery of data from target tumours for more accurate variant calling and gene expression quantification. Given that no general-use open-source algorithm accessible to the bioinformatics community previously existed for separating two-species data, Disambiguate presents a novel approach and an improvement to sequence analysis of grafted samples. Both Python and C++ implementations are available, and they are integrated into several open and closed source pipelines. Disambiguate is open source and freely available at https://github.com/AstraZeneca-NGS/disambiguate.
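
    The core idea, assigning each read to the species whose genome it aligns to best, can be sketched as follows. This is not Disambiguate's actual algorithm (the tool works on paired alignments with specific tie-breaking rules); the read names and scores below are invented.

```python
# Minimal sketch of score-based read disambiguation between two species,
# in the spirit of (but not identical to) Disambiguate.

def disambiguate(scores_a: dict, scores_b: dict) -> dict:
    """Assign each read name to species A, species B, or 'ambiguous'
    by comparing its best alignment score against each genome."""
    assigned = {"A": [], "B": [], "ambiguous": []}
    for read in set(scores_a) | set(scores_b):
        a = scores_a.get(read, float("-inf"))
        b = scores_b.get(read, float("-inf"))
        if a > b:
            assigned["A"].append(read)
        elif b > a:
            assigned["B"].append(read)
        else:
            assigned["ambiguous"].append(read)
    return assigned

# Hypothetical best alignment scores per read against each genome:
human = {"r1": 60, "r2": 10, "r3": 42}
mouse = {"r1": 20, "r2": 55, "r3": 42}
print(disambiguate(human, mouse))
# {'A': ['r1'], 'B': ['r2'], 'ambiguous': ['r3']}
```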

  12. Comprehensive study on the pressure dependence of shock wave plasma generation under TEA CO2 laser bombardment on metal sample

    International Nuclear Information System (INIS)

    Marpaung, A.M.; Kurniawan, H.; Tjia, M.O.; Kagawa, K.

    2001-01-01

    An experimental study has been carried out on the dynamical process taking place in the plasma generated by a TEA CO 2 laser (400 mJ, 100 ns) on a zinc target when surrounded by helium gas of pressure ranging from 2 Torr to 1 atm. Plasma characteristics were examined in detail on the emission lines of Zn I 481.0 nm and He I 587.6 nm by means of a unique time-resolved spatial distribution technique in addition to an ordinary time-resolved emission measurement technique. The results reveal, for the first time, persistent shock wave characteristics in all cases throughout the entire pressure range considered. Further analysis of the data has clarified the distinct characteristics of laser plasmas generated in different ranges of gas pressure. It is concluded that three types of shock wave plasma can be identified; namely, a target shock wave plasma in the pressure range from 2 Torr to around 50 Torr; a coupling shock wave plasma in the pressure range from around 50 Torr to 200 Torr; and a gas breakdown shock wave plasma in the pressure range from around 200 Torr to 1 atm. These distinct characteristics are found to be ascribable to the different extents of the gas breakdown process taking place at the different gas pressures. These results, obtained for a TEA CO 2 laser, will provide a useful basis for the analyses of plasmas induced by other lasers. (author)

  13. Experimental and numerical examination of eddy (Foucault) currents in rotating micro-coils: Generation of heat and its impact on sample temperature

    Science.gov (United States)

    Aguiar, Pedro M.; Jacquinot, Jacques-François; Sakellariou, Dimitris

    2009-09-01

    The application of nuclear magnetic resonance (NMR) to systems of limited quantity has stimulated the use of micro-coils. Rotating such coils in the static magnetic field induces Foucault (eddy) currents, which generate heat. We report the first data acquired with a 4 mm MACS system and spinning up to 10 kHz. The need to spin faster necessitates improved methods to control heating. We propose an approximate solution to calculate the power losses (heat) from the eddy currents for a solenoidal coil, in order to provide insight into the functional dependencies of Foucault currents. Experimental tests of the dependencies reveal conditions which result in reduced sample heating and negligible temperature distributions over the sample volume.
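
    The functional dependencies discussed above can be illustrated with a rough scaling estimate: for a conductor of characteristic size d in a field component B varying at frequency f, eddy-current dissipation grows as B²f²d²/ρ up to a geometry-dependent constant. The sketch below is illustrative only and is not the paper's approximate solution; all numbers are nominal assumptions.

```python
# Scaling of eddy-current (Foucault) heating with spin rate -- a rough
# illustrative estimate, not the paper's model. For a conductor of
# characteristic size d in a field component B varying at frequency f,
#     P/V ~ (pi**2 * B**2 * f**2 * d**2) / (k * rho)
# where rho is the resistivity and k is a geometry-dependent constant.

import math

def eddy_power_density(B, f, d, rho, k=16.0):
    """Eddy-current power density (W/m^3), up to the geometry factor k."""
    return (math.pi ** 2 * B ** 2 * f ** 2 * d ** 2) / (k * rho)

# Nominal values: 11.7 T field, 0.1 mm copper wire, rho(Cu) ~ 1.7e-8 ohm*m.
# Doubling the spinning rate from 5 kHz to 10 kHz quadruples the heating:
p5 = eddy_power_density(B=11.7, f=5e3, d=0.1e-3, rho=1.7e-8)
p10 = eddy_power_density(B=11.7, f=10e3, d=0.1e-3, rho=1.7e-8)
print(p10 / p5)  # 4.0
```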

  15. An Exploration of Mate Similarity for Criminal Offending Behaviors: Results from a Multi-Generation Sample of Dutch Spouses.

    Science.gov (United States)

    van de Weijer, Steve G A; Beaver, Kevin M

    2017-09-01

    There has been a growing body of research examining mate and spousal similarity on antisocial behaviors. The results of these studies have shown varying degrees of similarity between mates and spouses, but the precise mechanisms accounting for such similarity have remained somewhat elusive. The current study builds off this line of research and examines spousal similarity on criminal offending behaviors. Moreover, we also examine the potential factors that might account for spousal similarity. This study analyzed data drawn from two generations of Dutch spouses. The analyses revealed statistically significant associations between mates on criminal offending prior to marriage, a finding that is directly in line with an assortative mating explanation of spousal similarity. In addition, the analyses also revealed that criminal offending between spouses becomes even more similar after marriage, a finding that is in line with a behavioral contagion explanation of spousal similarity. We conclude by discussing the limitations of the study along with the implications that these findings have for criminological research.

  16. Next-generation sampling: Pairing genomics with herbarium specimens provides species-level signal in Solidago (Asteraceae).

    Science.gov (United States)

    Beck, James B; Semple, John C

    2015-06-01

    The ability to conduct species delimitation and phylogeny reconstruction with genomic data sets obtained exclusively from herbarium specimens would rapidly enhance our knowledge of large, taxonomically contentious plant genera. In this study, the utility of genotyping by sequencing is assessed in the notoriously difficult genus Solidago (Asteraceae) by attempting to obtain an informative single-nucleotide polymorphism data set from a set of specimens collected between 1970 and 2010. Reduced representation libraries were prepared and Illumina-sequenced from 95 Solidago herbarium specimen DNAs, and resulting reads were processed with the nonreference Universal Network-Enabled Analysis Kit (UNEAK) pipeline. Multidimensional clustering was used to assess the correspondence between genetic groups and morphologically defined species. Library construction and sequencing were successful in 93 of 95 samples. The UNEAK pipeline identified 8470 single-nucleotide polymorphisms, and a filtered data set was analyzed for each of three Solidago subsections. Although results varied, clustering identified genomic groups that often corresponded to currently recognized species or groups of closely related species. These results suggest that genotyping by sequencing is broadly applicable to DNAs obtained from herbarium specimens. The data obtained and their biological signal suggest that pairing genomics with large-scale herbarium sampling is a promising strategy in species-rich plant groups.
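
    The clustering step, grouping samples from a single-nucleotide polymorphism genotype matrix, can be sketched with a simple principal-component projection. This is not the UNEAK pipeline or the study's exact clustering method; the toy genotypes below are invented.

```python
# Sketch of genomic clustering: project a SNP genotype matrix (samples x
# sites, coded 0/1/2 with -1 = missing call) onto its top principal
# components. Illustrative only -- not the study's actual pipeline.

import numpy as np

def genotype_pcs(G: np.ndarray, n_pcs: int = 2) -> np.ndarray:
    """Return the first n_pcs principal-component coordinates per sample."""
    G = G.astype(float)
    G[G < 0] = np.nan                        # mark missing calls
    col_means = np.nanmean(G, axis=0)
    G = np.where(np.isnan(G), col_means, G)  # mean-impute missing data
    G -= G.mean(axis=0)                      # center each SNP
    U, S, _ = np.linalg.svd(G, full_matrices=False)
    return U[:, :n_pcs] * S[:n_pcs]

# Two toy 'species': samples 0-2 and samples 3-5 separate cleanly on PC1.
G = np.array([[0, 0, 2, 0], [0, 1, 2, 0], [0, 0, 2, 1],
              [2, 2, 0, 2], [2, 2, 0, 1], [1, 2, 0, 2]])
pcs = genotype_pcs(G)
print(pcs.shape)  # (6, 2)
```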

  17. The beauty of being (label)-free: sample preparation methods for SWATH-MS and next-generation targeted proteomics

    Science.gov (United States)

    Campbell, Kate; Deery, Michael J.; Lilley, Kathryn S.; Ralser, Markus

    2014-01-01

    The combination of qualitative analysis with label-free quantification has greatly facilitated the throughput and flexibility of novel proteomic techniques. However, such methods rely heavily on robust and reproducible sample preparation procedures. Here, we benchmark a selection of in-gel, on-filter, and in-solution digestion workflows for their application in label-free proteomics. Each procedure was associated with differing advantages and disadvantages. The in-gel methods interrogated were cost-effective, but were limited in throughput and digest efficiency. Filter-aided sample preparations facilitated reasonable processing times and yielded a balanced representation of membrane proteins, but led to high signal variation in quantification experiments. Two in-solution digest protocols, however, gave optimal performance for label-free proteomics. A protocol based on the detergent RapiGest led to the highest number of detected proteins at second-best signal stability, while a protocol based on acetonitrile digestion, RapidACN, scored best in throughput and signal stability but came second in protein identification. In addition, we compared label-free data-dependent (DDA) and data-independent (SWATH) acquisition on a TripleTOF 5600 instrument. While largely similar in protein detection, SWATH outperformed DDA in quantification, reducing signal variation and markedly increasing the number of precisely quantified peptides. PMID:24741437
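
    "Signal stability" comparisons of this kind are typically summarized as the coefficient of variation across replicate measurements. A minimal sketch with made-up intensities, not the study's data:

```python
# Coefficient of variation (CV) across replicate injections -- the usual
# summary of signal stability for a sample-prep workflow. Intensities
# below are invented for illustration.

import statistics

def cv_percent(values) -> float:
    """CV (%) = sample stdev / mean * 100 over replicate measurements."""
    return statistics.stdev(values) / statistics.mean(values) * 100

in_solution = [1.00e6, 1.04e6, 0.98e6]  # tight replicates -> low CV
in_gel      = [1.00e6, 1.50e6, 0.70e6]  # scattered replicates -> high CV
print(cv_percent(in_solution) < cv_percent(in_gel))  # True
```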

  18. Technical management plan for sample generation, analysis, and data review for Phase 2 of the Clinch River Environmental Restoration Program

    International Nuclear Information System (INIS)

    Brandt, C.C.; Benson, S.B.; Beeler, D.A.

    1994-03-01

    The Clinch River Remedial Investigation (CRRI) is designed to address the transport, fate, and distribution of waterborne contaminants (radionuclides, metals, and organic compounds) released from the US Department of Energy's (DOE's) Oak Ridge Reservation (ORR) and to assess potential risks to human health and the environment associated with these contaminants. The remedial investigation is entering Phase 2, which has the following objectives: define the nature and extent of the contamination in areas downstream from the DOE ORR, evaluate the human health and ecological risks posed by these contaminants, and perform preliminary identification and evaluation of potential remediation alternatives. This plan describes the requirements, responsibilities, and roles of personnel during sampling, analysis, and data review for the Clinch River Environmental Restoration Program (CR-ERP). The purpose of the plan is to formalize the process for obtaining analytical services, tracking sampling and analysis documentation, and assessing the overall quality of the CR-ERP data collection program to ensure that it will provide the necessary building blocks for the program decision-making process.

  19. Determination of antimony by electrochemical hydride generation atomic absorption spectrometry in samples with high iron content using chelating resins as on-line removal system

    International Nuclear Information System (INIS)

    Bolea, E.; Arroyo, D.; Laborda, F.; Castillo, J.R.

    2006-01-01

    A method for the removal of the interference caused by iron on the electrochemical generation of stibine is proposed. It consists of a Chelex 100 chelating resin column integrated into a flow injection system and coupled to an electrochemical hydride generation quartz tube atomic absorption spectrometer (EcHG-QT-AAS). Iron, as Fe(II), is retained in the column with high efficiency, close to 99.9% under optimal conditions. No significant retention was observed for Sb(III) under the same conditions, and a 97 ± 5% signal recovery was achieved. An electrochemical hydride generator with a concentric configuration and a reticulated vitreous carbon cathode was employed. The system is able to determine antimony concentrations in the range of ng ml -1 in the presence of iron concentrations up to 400 mg l -1 . The procedure was validated by analyzing the PACS-2 marine sediment reference material, with a 4% (w/w) iron content and a [Fe]:[Sb] ratio of 4000:1, which caused total antimony signal suppression in the electrochemical hydride generation system. A compost sample with a high iron content (0.7%, w/w) was also analyzed. Good agreement was found for both samples with the certified value and with the antimony concentration determined by ICP-MS, respectively.

  20. Pre-fractionated Microbial Samples – The Second Generation Natural Products Library at Wyeth

    Directory of Open Access Journals (Sweden)

    Melissa M. Wagenaar

    2008-06-01

    Full Text Available From the beginning of the antibiotic era in the 1940s to the present, Wyeth has sustained an active research program in the area of natural products discovery. This program has continually evolved through the years in order to best align with the “current” drug discovery paradigm in the pharmaceutical industry. The introduction of high-throughput screening and the miniaturization of assays have created a need to optimize natural product samples to better suit these new technologies. Furthermore, natural product programs are faced with an ever-shortening time period from hit detection to lead characterization. To address these issues, Wyeth has created a pre-fractionated natural products library using reversed-phase HPLC to complement its existing library of crude extracts. The details of the pre-fractionated library and a cost-benefit analysis are presented in this review.

  1. Next generation sensing platforms for extended deployments in large-scale, multidisciplinary, adaptive sampling and observational networks

    Science.gov (United States)

    Cross, J. N.; Meinig, C.; Mordy, C. W.; Lawrence-Slavas, N.; Cokelet, E. D.; Jenkins, R.; Tabisola, H. M.; Stabeno, P. J.

    2016-12-01

    New autonomous sensors have dramatically increased the resolution and accuracy of oceanographic data collection, enabling rapid sampling over extremely fine scales. Innovative new autonomous platforms like floats, gliders, drones, and crawling moorings leverage the full potential of these new sensors by extending spatiotemporal reach across varied environments. During 2015 and 2016, the Innovative Technology for Arctic Exploration Program at the Pacific Marine Environmental Laboratory tested several new types of fully autonomous platforms with increased speed, durability, and power and payload capacity, designed to deliver cutting-edge ecosystem assessment sensors to remote or inaccessible environments. The Expendable Ice-Tracking (EXIT) float developed by the NOAA Pacific Marine Environmental Laboratory (PMEL) is moored near bottom during the ice-free season and released on an autonomous timer beneath the ice during the following winter. The float collects a rapid profile during ascent, and continues to collect critical, poorly accessible under-ice data until melt, when data are transmitted via satellite. The autonomous Oculus sub-surface glider developed by the University of Washington and PMEL has a large power and payload capacity and an enhanced buoyancy engine. This 'coastal truck' is designed for the rapid water column ascent required by optical imaging systems. The Saildrone is a solar- and wind-powered unmanned surface vessel (USV) developed by Saildrone, Inc. in partnership with PMEL. This large-payload (200 lbs), fast (1-7 kts), durable (46 kts winds) platform was equipped with 15 sensors designed for ecosystem assessment during 2016, including passive and active acoustic systems specially redesigned for autonomous vehicle deployments. The sensors deployed on these platforms achieved rigorous accuracy and precision standards. These innovative platforms provide new sampling capabilities and cost efficiencies in high-resolution sensor deployment.

  2. A new nebulization device with exchangeable aerosol generation mode as a useful tool to investigate sample introduction processes in inductively coupled plasma atomic emission spectrometry

    International Nuclear Information System (INIS)

    Grotti, Marco; Lagomarsino, Cristina; Frache, Roberto

    2004-01-01

    A new sample introduction device has been designed in order to differentiate between the effects of aerosol production and its subsequent desolvation on the analytical performance of an inductively coupled plasma optical emission spectrometer. This research tool allows easy switching between the pneumatic and ultrasonic aerosol generation modes and the use of a joint desolvation chamber. In this way, a real comparison between aerosol production systems may be attained, and the influence of the aerosol generation process on analytical figures of merit clearly distinguished from that of the desolvation process. In this work, the separate effects of the aerosol generation and desolvation processes on analytical sensitivity and tolerance towards matrix effects have been investigated. Concerning sensitivity, it was found that both processes play an important role in determining emission intensities, with the increase in sensitivity due to desolvation being higher than that due to the improved aerosol generation efficiency. Concerning the matrix effects, a predominant role of the desolvation system was found, while the influence of the aerosol generation mode was much less important. For nitric acid, the depressive effect was mitigated by the presence of a desolvation system, due to partial removal of the acid. On the contrary, the depressive effect of sulfuric acid was enhanced by the presence of a desolvation system, due to degradation of the solvent removal efficiency and to a further decrease in the analyte transport rate caused by clustering phenomena. Concerning the interferences due to sodium and calcium, a depressive effect was observed, which was enhanced by desolvation.

  3. Selective reduction of arsenic species by hydride generation - atomic absorption spectrometry. Part 2 - sample storage and arsenic determination in natural waters

    Directory of Open Access Journals (Sweden)

    Quináia Sueli P.

    2001-01-01

    Full Text Available Total arsenic, arsenite, arsenate and dimethylarsinic acid (DMA) were selectively determined in natural waters by hydride generation - atomic absorption spectrometry, using sodium tetrahydroborate(III) as reductant but in different reduction media. River water samples from the north region of Paraná State, Brazil, were analysed and showed arsenate as the principal arsenical form. Detection limits found for As(III) (citrate buffer), As(III) + DMA (acetic acid) and As(III) + As(V) (hydrochloric acid) were 0.6, 1.1 and 0.5 μg As L-1, respectively. Sample storage in the appropriate reaction medium proved to be a useful way to preserve the water samples.
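
    The As(V)-by-difference step used in speciation schemes like this one is simple arithmetic on consistent units. A small illustrative helper; the function name and the example concentrations are hypothetical:

```python
# As(V) by difference: total inorganic As minus As(III), with tiny
# negative results from measurement noise clamped to zero. Illustrative
# only; units must simply be consistent (e.g. ug As per L).

def arsenate_by_difference(total_inorganic_as: float, as_iii: float) -> float:
    """As(V) = total inorganic As - As(III), floored at zero."""
    return max(total_inorganic_as - as_iii, 0.0)

# A hypothetical river-water result: mostly arsenate.
total, as3 = 12.4, 1.1
print(round(arsenate_by_difference(total, as3), 2))  # 11.3
```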

  4. Multielemental Determination of As, Bi, Ge, Sb, and Sn in Agricultural Samples Using Hydride Generation Coupled to Microwave-Induced Plasma Optical Emission Spectrometry.

    Science.gov (United States)

    Machado, Raquel C; Amaral, Clarice D B; Nóbrega, Joaquim A; Araujo Nogueira, Ana Rita

    2017-06-14

    A microwave-induced plasma optical emission spectrometer with N 2 -based plasma was combined with a multimode sample introduction system (MSIS) for hydride generation (HG) and multielemental determination of As, Bi, Ge, Sb, and Sn in samples of forage, bovine liver, powdered milk, agricultural gypsum, rice, and mineral fertilizer, using a single condition of prereduction and reduction. The accuracy of the developed analytical method was evaluated using certified reference materials of water and mineral fertilizer, and recoveries ranged from 95 to 106%. Addition and recovery experiments were carried out, and the recoveries varied from 85 to 117% for all samples evaluated. The limits of detection for As, Bi, Ge, Sb, and Sn were 0.46, 0.09, 0.19, 0.46, and 5.2 μg/L, respectively, for liquid samples, and 0.18, 0.04, 0.08, 0.19, and 2.1 mg/kg, respectively, for solid samples. The method proposed offers a simple, fast, multielemental, and robust alternative for successful determination of all five analytes in agricultural samples with low operational cost without compromising analytical performance.
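
    The paired liquid and solid LODs reported above are related through the digestion scale: LOD_solid = LOD_liquid × final volume / sample mass. The 0.25 g to 100 mL scale below is an assumption chosen to be consistent with the reported LOD pairs, not a value stated in the abstract:

```python
# Converting a liquid-phase LOD (ug/L) to a solid-sample LOD (mg/kg) via
# the digestion scale. Sample mass and final volume here are assumed,
# not taken from the paper.

def solid_lod_mg_per_kg(liquid_lod_ug_per_l: float,
                        final_volume_l: float,
                        sample_mass_g: float) -> float:
    # ug/L * L / g = ug/g = mg/kg
    return liquid_lod_ug_per_l * final_volume_l / sample_mass_g

# Assuming 0.25 g digested and made up to 100 mL, the 0.46 ug/L LOD for
# As maps onto the reported 0.18 mg/kg solid-sample LOD:
lod_as = solid_lod_mg_per_kg(0.46, final_volume_l=0.100, sample_mass_g=0.25)
print(round(lod_as, 2))  # 0.18
```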

  5. Measurement of microparticle tissue factor activity in clinical samples: A summary of two tissue factor-dependent FXa generation assays.

    Science.gov (United States)

    Hisada, Yohei; Alexander, Wyeth; Kasthuri, Raj; Voorhees, Peter; Mobarrez, Fariborz; Taylor, Angela; McNamara, Coleen; Wallen, Hakan; Witkowski, Marco; Key, Nigel S; Rauch, Ursula; Mackman, Nigel

    2016-03-01

    Thrombosis is a leading cause of morbidity and mortality. Detection of a prothrombotic state using biomarkers would be of great benefit to identify patients at risk of thrombosis who would benefit from thromboprophylaxis. Tissue factor (TF) is a highly procoagulant protein that under normal conditions is not present in the blood. However, increased levels of TF in the blood in the form of microparticles (MPs) (also called extracellular vesicles) are observed under various pathological conditions. In this review, we will discuss studies that have measured MP-TF activity in a variety of diseases using two similar FXa generation assays. One of the most robust signals for MP-TF activity (16-26 fold higher than healthy controls) is observed in pancreatic cancer patients with venous thromboembolism. In this case, the TF+ MPs appear to be derived from the cancer cells. Surprisingly, cirrhosis and acute liver injury are associated with 17-fold and 38-fold increases in MP-TF activity, respectively. Based on mouse models, we speculate that the TF+ MPs are derived from hepatocytes. More modest increases are observed in patients with urinary tract infections (6-fold) and in a human endotoxemia model (9-fold) where monocytes are the likely source of the TF+ MPs. Finally, there is no increase in MP-TF activity in the majority of cardiovascular disease patients. These studies indicate that MP-TF activity may be a useful biomarker to identify patients with particular diseases that have an increased risk of thrombosis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Slurry sampling flow injection chemical vapor generation inductively coupled plasma mass spectrometry for the determination of trace Ge, As, Cd, Sb, Hg and Bi in cosmetic lotions

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Wei-Ni [Department of Chemistry, National Sun Yat-sen University, Kaohsiung 80424, Taiwan (China); Jiang, Shiuh-Jen, E-mail: sjjiang@faculty.nsysu.edu.tw [Department of Chemistry, National Sun Yat-sen University, Kaohsiung 80424, Taiwan (China); Department of Medical Laboratory Science and Biotechnology, Kaohsiung Medical University, Kaohsiung 80708, Taiwan (China); Chen, Yen-Ling [Department of Fragrance and Cosmetic Science, Kaohsiung Medical University, Kaohsiung 80708, Taiwan (China); Sahayam, A.C. [National Centre for Compositional Characterisation of Materials (CCCM), Hyderabad (India)

    2015-02-20

    Highlights: • Determination of Ge, As, Cd, Sb, Hg and Bi in cosmetic lotions in a single run. • Accurate analysis using isotope dilution and standard addition methods. • Vapor generation ICP-MS yielded superior detection limits compared to ETV-ICP-MS. • No sample dissolution, increasing sample throughput. • Analysis of GBW09305 Cosmetic (Cream) reference material for accuracy. - Abstract: A slurry sampling inductively coupled plasma mass spectrometry (ICP-MS) method has been developed for the determination of Ge, As, Cd, Sb, Hg and Bi in cosmetic lotions using flow injection (FI) vapor generation (VG) as the sample introduction system. A slurry containing 2% m/v lotion, 2% m/v thiourea, 0.05% m/v L-cysteine, 0.5 μg mL{sup −1} Co(II), 0.1% m/v Triton X-100 and 1.2% v/v HCl was injected into a VG-ICP-MS system for the determination of Ge, As, Cd, Sb, Hg and Bi without dissolution and mineralization. Because the sensitivities of the analytes in the slurry and in aqueous solution were quite different, an isotope dilution method and a standard addition method were used for the determination. This method has been validated by the determination of Ge, As, Cd, Sb, Hg and Bi in GBW09305 Cosmetic (Cream) reference material. The method was also applied for the determination of Ge, As, Cd, Sb, Hg and Bi in three cosmetic lotion samples obtained locally. The analysis results of the reference material agreed with the certified value and/or ETV-ICP-MS results. The detection limit estimated from the standard addition curve was 0.025, 0.1, 0.2, 0.1, 0.15, and 0.03 ng g{sup −1} for Ge, As, Cd, Sb, Hg and Bi, respectively, in the original cosmetic lotion sample.
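
    Standard-addition quantification, as used here for the slurries, extrapolates a spiked calibration line back to the x-axis: the sample concentration is the magnitude of the x-intercept, i.e. intercept/slope. A generic sketch with invented numbers, not the paper's data:

```python
# Standard-addition calibration in sketch form. Fit signal vs. added
# analyte; the concentration already present in the measured sample is
# intercept/slope (the magnitude of the negative x-intercept).

import numpy as np

def standard_addition_conc(added, signal) -> float:
    """Concentration in the measured sample from a standard-addition line."""
    slope, intercept = np.polyfit(added, signal, 1)
    return intercept / slope

added = np.array([0.0, 1.0, 2.0, 4.0])         # ng/g spiked (hypothetical)
signal = np.array([50.0, 75.0, 100.0, 150.0])  # counts; perfectly linear here
print(round(standard_addition_conc(added, signal), 6))  # 2.0
```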

  8. Sample preconcentration utilizing nanofractures generated by junction gap breakdown assisted by self-assembled monolayer of gold nanoparticles.

    Directory of Open Access Journals (Sweden)

    Chun-Ping Jen

    Full Text Available The preconcentration of proteins at low concentrations can be used to increase the sensitivity and accuracy of detection. A nonlinear electrokinetic flow is induced in a nanofluidic channel due to the overlap of electrical double layers, resulting in the fast accumulation of proteins, referred to as the exclusion-enrichment effect. The proposed chip for protein preconcentration was fabricated using simple standard soft lithography with a polydimethylsiloxane replica. This study extends our previous paper, in which gold nanoparticles were manually deposited onto the surface of a protein preconcentrator. In the present work, nanofractures were formed by electric breakdown assisted by the self-assembly of gold nanoparticles. This reliable method for nanofracture formation, involving self-assembled monolayers of nanoparticles at the junction gap between microchannels, also decreases the required electric breakdown voltage. The experimental results reveal that a high concentration factor of 1.5×10 4 for a protein sample with an extremely low concentration of 1 nM was achieved in 30 min by using the proposed chip, which is faster than our previously proposed chip under the same conditions. Moreover, an immunoassay of bovine serum albumin (BSA) and anti-BSA was carried out to demonstrate the applicability of the proposed chip.

  9. Identification and Characterization of Epstein-Barr Virus Genomes in Lung Carcinoma Biopsy Samples by Next-Generation Sequencing Technology.

    Science.gov (United States)

    Wang, Shanshan; Xiong, Hongchao; Yan, Shi; Wu, Nan; Lu, Zheming

    2016-05-18

    Epstein-Barr virus (EBV) has been detected in the tumor cells of several cancers, including some cases of lung carcinoma (LC). However, the genomic characteristics and diversity of EBV strains associated with LC are poorly understood. In this study, we sequenced the EBV genomes isolated from four primary LC tumor biopsy samples, designated LC1 to LC4. Comparative analysis demonstrated that the LC strains were most closely related to the GD1 strain. Compared to the GD1 reference genome, a total of 520 variations were found, including 498 substitutions, 12 insertions, and 10 deletions. Latent genes were found to harbor the largest number of nonsynonymous mutations. Phylogenetic analysis showed that all LC strains were closely related to Asian EBV strains but distinct from African/American strains. The LC2 genome was distinct from the other three LC genomes, suggesting that at least two parental EBV lineages may exist among the LC genomes. All LC strains could be classified as China 1 and V-val subtypes according to the amino acid sequences of LMP1 and EBNA1, respectively. In conclusion, our results reveal the genomic diversity among EBV genomes isolated from LC, which may help to uncover previously unknown variations of pathogenic significance.

  10. Determination of As(III) and total inorganic As in water samples using an on-line solid phase extraction and flow injection hydride generation atomic absorption spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Sigrist, Mirna, E-mail: msigrist@fiq.unl.edu.ar [Laboratorio Central, Facultad de Ingenieria Quimica, Universidad Nacional del Litoral, Santiago del Estero 2654-Piso 6, (3000) Santa Fe (Argentina); Albertengo, Antonela; Beldomenico, Horacio [Laboratorio Central, Facultad de Ingenieria Quimica, Universidad Nacional del Litoral, Santiago del Estero 2654-Piso 6, (3000) Santa Fe (Argentina); Tudino, Mabel [Laboratorio de Analisis de Trazas, Departamento de Quimica Inorganica, Analitica y Quimica Fisica/INQUIMAE, Facultad de Ciencias Exactas y Naturales, Pabellon II, Ciudad Universitaria (1428), Buenos Aires (Argentina)

    2011-04-15

    A simple and robust on-line sequential injection system based on solid phase extraction (SPE) coupled to a flow injection hydride generation atomic absorption spectrometer (FI-HGAAS) with a heated quartz tube atomizer (QTA) was developed and optimized for the determination of As(III) in groundwater without any kind of sample pretreatment. The method was based on the selective retention of inorganic As(V), which was carried out by passing the filtered original sample through a cartridge containing a chloride-form strong anion exchanger. Thus the most toxic form, inorganic As(III), was determined quickly and directly by AsH{sub 3} generation using 3.5 mol L{sup -1} HCl as carrier solution and 0.35% (m/v) NaBH{sub 4} in 0.025% NaOH as the reductant. Since the uptake of As(V) can be interfered with by several anions naturally occurring in waters, the effect of Cl{sup -}, SO{sub 4}{sup 2-}, NO{sub 3}{sup -}, HPO{sub 4}{sup 2-}, HCO{sub 3}{sup -} on retention was evaluated and discussed. The total soluble inorganic arsenic concentration was determined on aliquots of filtered samples acidified with concentrated HCl and pre-reduced with 5% KI-5% C{sub 6}H{sub 8}O{sub 6} solution. The concentration of As(V) was calculated by difference between the total soluble inorganic arsenic and As(III) concentrations. Detection limits (LODs) of 0.5 {mu}g L{sup -1} and 0.6 {mu}g L{sup -1} for As(III) and inorganic total As, respectively, were obtained for a 500 {mu}L sample volume. The obtained limits of detection allowed testing the water quality according to the national and international regulations. The analytical recovery for water samples spiked with As(III) ranged between 98% and 106%. The sampling throughput for As(III) determination was 60 samples h{sup -1}. The device for groundwater sampling was especially designed by the authors. Metallic components were avoided and the contact between the sample and the atmospheric oxygen was kept to a minimum. On-field arsenic species

  11. Determination of As(III) and total inorganic As in water samples using an on-line solid phase extraction and flow injection hydride generation atomic absorption spectrometry

    International Nuclear Information System (INIS)

    Sigrist, Mirna; Albertengo, Antonela; Beldomenico, Horacio; Tudino, Mabel

    2011-01-01

    A simple and robust on-line sequential injection system based on solid phase extraction (SPE) coupled to a flow injection hydride generation atomic absorption spectrometer (FI-HGAAS) with a heated quartz tube atomizer (QTA) was developed and optimized for the determination of As(III) in groundwater without any kind of sample pretreatment. The method was based on the selective retention of inorganic As(V), carried out by passing the filtered original sample through a cartridge containing a chloride-form strong anion exchanger. The most toxic form, inorganic As(III), was thus determined rapidly and directly by AsH3 generation, using 3.5 mol L-1 HCl as the carrier solution and 0.35% (m/v) NaBH4 in 0.025% NaOH as the reductant. Since the uptake of As(V) could be affected by several anions that occur naturally in waters, the effect of Cl-, SO4(2-), NO3-, HPO4(2-) and HCO3- on retention was evaluated and discussed. The total soluble inorganic arsenic concentration was determined on aliquots of filtered samples acidified with concentrated HCl and pre-reduced with a 5% KI-5% C6H8O6 solution. The As(V) concentration was calculated as the difference between the total soluble inorganic arsenic and As(III) concentrations. Detection limits (LODs) of 0.5 μg L-1 and 0.6 μg L-1 for As(III) and total inorganic As, respectively, were obtained for a 500 μL sample volume. These limits of detection allowed testing water quality according to national and international regulations. The analytical recovery for water samples spiked with As(III) ranged between 98% and 106%. The sampling throughput for As(III) determination was 60 samples h-1. The device for groundwater sampling was especially designed by the authors. Metallic components were avoided and contact between the sample and atmospheric oxygen was kept to a minimum. On-field arsenic species separation was performed through a serial connection of membrane filters and
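
The speciation arithmetic described in this record (As(V) by difference, spike recovery) is simple enough to sketch in a few lines; the function names and the concentrations in the example are ours, purely illustrative, not values from the paper:

```python
def as_v_by_difference(total_inorganic_as, as_iii):
    # As(V) is obtained by difference between total soluble inorganic As
    # and As(III), exactly as the record describes.
    return total_inorganic_as - as_iii

def spike_recovery(measured_spiked, measured_unspiked, spike_added):
    # Analytical recovery (%) for a sample spiked with a known amount.
    return 100.0 * (measured_spiked - measured_unspiked) / spike_added

# Illustrative concentrations in ug/L (not taken from the paper):
print(round(as_v_by_difference(12.4, 4.1), 1))    # 8.3
print(round(spike_recovery(14.2, 4.1, 10.0), 1))  # 101.0
```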

  12. Preliminary report on the development of some indices of relative nutritive value (RNV) of cereal and legume samples, applicable in the early generations of selection

    International Nuclear Information System (INIS)

    Kaul, A.K.; Niemann, E.G.

    1975-01-01

    Rapid screening methods for biuret nitrogen determination and fluorometric lysine estimation are described. While the biuret method has been found suitable for early generation screening for peptide nitrogen determination, fluorescence estimation of dansylated grain meal could be taken as a good index of available lysine in cereal and legume samples. The necessity of rapid and inexpensive tests for the determination of Relative Nutritive Value (RNV) in the advanced generations of screening is discussed. Preliminary data available on two such tests, utilizing the protozoan Tetrahymena pyriformis W. and the flour beetle Tribolium confusum Duval, indicated promise. Both techniques were tried on different cereal and legume samples. The relative lethality of beetle larvae and their nitrogen retention were taken as indices of RNV in legumes. The Larval Nitrogen Retention Index (LNRI) of cereal samples was found to depend both on nitrogen content and on protein quality. It was concluded that both these organisms need to be further investigated for their potential as test animals for RNV determination in advanced segregating populations. (author)

  13. Search for Third Generation Squarks in the Missing Transverse Energy plus Jet Sample at CDF Run II

    Energy Technology Data Exchange (ETDEWEB)

    Marono, Miguel Vidal [Complutense Univ. of Madrid (Spain)]

    2010-03-01

    lightest SUSY particle (LSP), which would provide a candidate for the cold dark matter that accounts for about 23% of the content of the universe, as strongly suggested by recent astrophysical data [1]. The Tevatron is a hadron collider operating at Fermilab, USA. This accelerator provides proton-antiproton (p$\bar{p}$) collisions with a center of mass energy of √s = 1.96 TeV. CDF and D0 are the detectors built to analyse the products of the collisions provided by the Tevatron. Both experiments have produced a very significant scientific output in the last few years, such as the discovery of the top quark and the measurement of Bs mixing. The Tevatron experiments are also reaching sensitivity to the SM Higgs boson. The scientific program of CDF includes a broad spectrum of searches for physics signatures beyond the Standard Model. The Tevatron is still the energy frontier, which means a unique opportunity to produce a discovery in physics beyond the Standard Model. The analyses presented in this thesis focus on the search for third generation squarks in the missing transverse energy plus jets final state. The production of sbottom ($\tilde{b}$) and stop ($\tilde{t}$) quarks could be highly enhanced at the Tevatron, giving the possibility of discovering new physics or limiting the parameter space available in the theory. No signal is found over the predicted Standard Model background in either search. Instead, 95% confidence level limits are set on the production cross section, and then translated into the mass plane of the hypothetical particles. This thesis sketches the basic theory concepts of the Standard Model and the Minimal Supersymmetric Extension in Chapter 2. Chapter 3 describes the Tevatron and CDF. Based on the CDF subsystems information, Chapters 4 and 5 describe the analysis object reconstruction and the heavy flavor tagging tools. The development of the analyses is shown in Chapters 6 and 7. Finally, Chapter 8 is devoted to discussing the results and conclusions

  14. A 12 kV, 1 kHz, Pulse Generator for Breakdown Studies of Samples for CLIC RF Accelerating Structures

    CERN Document Server

    Soares, R H; Kovermann, J; Calatroni, S; Wuensch, W

    2012-01-01

    Compact Linear Collider (CLIC) RF structures must be capable of sustaining high surface electric fields, in excess of 200 MV/m, with a breakdown (BD) rate below 3×10^-7 breakdowns/pulse/m. Achieving such a low rate requires a detailed understanding of all the steps involved in the mechanism of breakdown. One of the fundamental studies is to investigate the statistical characteristics of the BD rate phenomenon at very low values, to understand the origin of an observed dependence on the surface electric field raised to the power of 30. To acquire sufficient BD data in a reasonable period of time, a high repetition rate pulse generator is required for an existing d.c. spark system at CERN. Following BD of the material sample, the pulse generator must deliver a current pulse of several tens of amperes for ~2 μs. A high repetition rate pulse generator has been designed, built and tested; it utilizes pulse forming line technology and employs MOSFET switches. This paper describes the design of the pulse generat...
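
Some back-of-envelope arithmetic shows why a kHz-repetition-rate generator is needed to probe BD rates near 3×10^-7 per pulse; this is our illustration (we ignore the per-metre normalization of the quoted rate), not a calculation from the paper:

```python
def pulses_for_expected_breakdowns(bd_rate_per_pulse, n_breakdowns=1):
    # Expected number of pulses needed to observe n breakdowns at a given
    # per-pulse breakdown probability.
    return n_breakdowns / bd_rate_per_pulse

def hours_at_rep_rate(n_pulses, rep_rate_hz):
    # Wall-clock hours to fire n_pulses at a fixed repetition rate.
    return n_pulses / rep_rate_hz / 3600.0

n = pulses_for_expected_breakdowns(3e-7)  # ~3.3 million pulses per breakdown
print(round(hours_at_rep_rate(n, 1000.0), 2))      # 0.93 (hours at 1 kHz)
print(round(hours_at_rep_rate(n, 1.0) / 24.0, 1))  # 38.6 (days at 1 Hz)
```

The contrast between under an hour at 1 kHz and over a month at 1 Hz is the practical motivation for the generator described in the record.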

  15. Multi-Locus Next-Generation Sequence Typing of DNA Extracted From Pooled Colonies Detects Multiple Unrelated Candida albicans Strains in a Significant Proportion of Patient Samples

    Directory of Open Access Journals (Sweden)

    Ningxin Zhang

    2018-06-01

    The yeast Candida albicans is an important opportunistic human pathogen. For C. albicans strain typing or drug susceptibility testing, a single colony recovered from a patient sample is normally used. This is insufficient when multiple strains are present at the site sampled. How often this is the case is unclear. Previous studies, confined to oral, vaginal and vulvar samples, have yielded conflicting results and have assessed too small a number of colonies per sample to reliably detect the presence of multiple strains. We developed a next-generation sequencing (NGS) modification of the highly discriminatory C. albicans MLST (multilocus sequence typing) method, 100+1 NGS-MLST, for detection and typing of multiple strains in clinical samples. In 100+1 NGS-MLST, DNA is extracted from a pool of colonies from a patient sample and also from one of the colonies. MLST amplicons from both DNA preparations are analyzed by high-throughput sequencing. Using base call frequencies, our bespoke DALMATIONS software determines the MLST type of the single colony. If base call frequency differences between pool and single colony indicate the presence of an additional strain, the differences are used to computationally infer the second MLST type without the need for MLST of additional individual colonies. In mixes of previously typed pairs of strains, 100+1 NGS-MLST reliably detected a second strain. Inferred MLST types of second strains were always more similar to their real MLST types than to those of any of 59 other isolates (22 of 31 inferred types were identical to the real type). Using 100+1 NGS-MLST we found that 7/60 human samples, including three superficial candidiasis samples, contained two unrelated strains. In addition, at least one sample contained two highly similar variants of the same strain. The probability of samples containing unrelated strains appears to differ considerably between body sites.
Our findings indicate the need for wider surveys to
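
The pool-versus-single-colony idea can be illustrated with a toy per-site rule; the function, the 10% noise threshold and the example frequencies are our assumptions for illustration only, not the DALMATIONS algorithm:

```python
def infer_second_strain_base(pool_freqs, single_colony_base):
    # Toy per-site rule: if the pooled colonies show a substantial allele
    # other than the single colony's base, attribute the most frequent
    # extra base to the second strain. The 10% threshold is an arbitrary
    # noise cut-off chosen for this sketch.
    extra = {base: freq for base, freq in pool_freqs.items()
             if base != single_colony_base and freq > 0.10}
    if not extra:
        return single_colony_base  # site does not distinguish the strains
    return max(extra, key=extra.get)

# The pool is roughly 50:50 C/T while the single colony is C,
# so the second strain is inferred to carry T at this site:
print(infer_second_strain_base({"C": 0.52, "T": 0.46, "G": 0.02}, "C"))  # T
```

Applied across all polymorphic MLST sites, a rule of this kind yields the inferred sequence type of the second strain without typing further individual colonies.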

  16. Extraction of Total DNA and RNA from Marine Filter Samples and Generation of a cDNA as Universal Template for Marker Gene Studies.

    Science.gov (United States)

    Schneider, Dominik; Wemheuer, Franziska; Pfeiffer, Birgit; Wemheuer, Bernd

    2017-01-01

    Microbial communities play an important role in marine ecosystem processes. Although the number of studies targeting marker genes such as the 16S rRNA gene has increased in recent years, the vast majority of marine diversity remains unexplored. Moreover, most studies have focused on the entire bacterial community and thus disregarded the active microbial community players. Here, we describe a detailed protocol for the simultaneous extraction of DNA and RNA from marine water samples and for the generation of cDNA from the isolated RNA, which can be used as a universal template in various marker gene studies.

  17. Sampling methods and data generation

    Science.gov (United States)

    The study of forensic microbiology is an inherent blend of forensic science and microbiology, and both disciplines have recently been undergoing rapid advancements in technology that are allowing for exciting new research avenues. The integration of two different disciplines poses challenges becaus...

  18. Optimization of chemical and instrumental parameters in hydride generation laser-induced breakdown spectrometry for the determination of arsenic, antimony, lead and germanium in aqueous samples.

    Science.gov (United States)

    Yeşiller, Semira Unal; Yalçın, Serife

    2013-04-03

    A laser-induced breakdown spectrometry system hyphenated with an on-line continuous flow hydride generation sample introduction system, HG-LIBS, has been used for the determination of arsenic, antimony, lead and germanium in aqueous environments. Optimum chemical and instrumental parameters governing chemical hydride generation, laser plasma formation and detection were investigated for each element under argon and nitrogen atmospheres. Arsenic, antimony and germanium showed strong signal enhancement under an argon atmosphere, while lead showed no sensitivity to the ambient gas type. Detection limits of 1.1 mg L(-1), 1.0 mg L(-1), 1.3 mg L(-1) and 0.2 mg L(-1) were obtained for As, Sb, Pb and Ge, respectively. An improvement of up to 77 times in the detection limit of Pb was obtained, compared to the result from direct analysis of liquids by LIBS. The applicability of the technique to real water samples was tested through spiking experiments, and recoveries higher than 80% were obtained. The results demonstrate that the HG-LIBS approach is suitable for quantitative analysis of toxic elements and sufficiently fast for real-time continuous monitoring in aqueous environments. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. Determination of arsenic species in seafood samples from the Aegean Sea by liquid chromatography-(photo-oxidation)-hydride generation-atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Schaeffer, Richard [Department of Applied Chemistry, Corvinus University, Villanyi ut 29-35, 1118 Budapest (Hungary); Soeroes, Csilla [Department of Applied Chemistry, Corvinus University, Villanyi ut 29-35, 1118 Budapest (Hungary); Ipolyi, Ildiko [Department of Applied Chemistry, Corvinus University, Villanyi ut 29-35, 1118 Budapest (Hungary); Fodor, Peter [Department of Applied Chemistry, Corvinus University, Villanyi ut 29-35, 1118 Budapest (Hungary); Thomaidis, Nikolaos S. [Laboratory of Analytical Chemistry, Department of Chemistry, University of Athens, Panepistiomopolis Zografou, 15776 Athens (Greece)]. E-mail: ntho@chem.uoa.gr

    2005-08-15

    In this study, arsenic compounds were determined in mussels (Mytilus galloprovincialis), anchovies (Engraulis encrasicholus), sea-breams (Sparus aurata), sea bass (Dicentrarchus labrax) and sardines (Sardina pilchardus) collected from the Aegean Sea, using a liquid chromatography-photo-oxidation-hydride generation-atomic fluorescence spectrometry [LC-(PO)-HG-AFS] system. Twelve arsenicals were separated and determined on the basis of their differences in two properties: (i) pKa values and (ii) hydride generation capacity. The separation was carried out both with an anion- and a cation-exchange column, with and without photo-oxidation. In all the samples arsenobetaine (AB) was detected as the major compound (concentrations ranging between 2.7 and 23.1 μg g-1 dry weight), with trace amounts of arsenite, As(III), dimethylarsinic acid (DMA) and arsenocholine (AC) also present. Arsenosugars were detected only in the mussel samples (in concentrations of 0.9-3.6 μg g-1 dry weight), along with an unknown compound which, based on its retention time on the anion-exchange column Hamilton PRP-X100 and a recent communication [E. Schmeisser, R. Raml, K.A. Francesconi, D. Kuehnelt, A. Lindberg, Cs. Soeroes, W. Goessler, Chem. Commun. 16 (2004) 1824], is presumed to be a thio-arsenic analogue.

  20. Assessment of Epstein-Barr virus nucleic acids in gastric but not in breast cancer by next-generation sequencing of pooled Mexican samples

    Science.gov (United States)

    Fuentes-Pananá, Ezequiel M; Larios-Serrato, Violeta; Méndez-Tenorio, Alfonso; Morales-Sánchez, Abigail; Arias, Carlos F; Torres, Javier

    2016-01-01

    Gastric (GC) and breast (BrC) cancer are two of the most common and deadly tumours. Different lines of evidence suggest a possible causative role of viral infections in both GC and BrC. Whole-genome sequencing (WGS) technologies allow searching for viral agents in tissues of patients with cancer. These technologies have already contributed to establishing virus-cancer associations as well as to discovering new tumour viruses. The objective of this study was to document possible associations of viral infection with GC and BrC in Mexican patients. In order to gain an idea of cost-effective conditions for experimental sequencing, we first carried out an in silico simulation of WGS. The next-generation platform Illumina GAIIx was then used to sequence GC and BrC tumour samples. While we did not find viral sequences in tissues from BrC patients, multiple reads matching Epstein-Barr virus (EBV) sequences were found in GC tissues. An end-point polymerase chain reaction confirmed an enrichment of EBV sequences in one of the GC samples sequenced, validating the next-generation sequencing-bioinformatics pipeline. PMID:26910355

  1. Synergetic enhancement effect of ionic liquid and diethyldithiocarbamate on the chemical vapor generation of nickel for its atomic fluorescence spectrometric determination in biological samples

    International Nuclear Information System (INIS)

    Zhang Chuan; Li Yan; Wu Peng; Yan Xiuping

    2009-01-01

    Room-temperature ionic liquid in combination with sodium diethyldithiocarbamate (DDTC) was used to synergetically improve the chemical vapor generation (CVG) of nickel. Volatile species of nickel were effectively generated through reduction of the acidified analyte solution with KBH4 in the presence of 0.02% DDTC and 25 mmol L-1 1-butyl-3-methylimidazolium bromide ([C4mim]Br) at room temperature. Thus, a new flow injection (FI)-CVG-atomic fluorescence spectrometric (FI-CVG-AFS) method was developed for the determination of nickel, with a detection limit of 0.65 μg L-1 (3s) and a sampling frequency of 180 h-1. With consumption of 0.5 mL of sample solution, an enhancement factor of 2400 was obtained. The precision (RSD) for eleven replicate determinations of 20 μg L-1 Ni was 3.4%. The developed FI-CVG-AFS method was successfully applied to the determination of trace Ni in several certified biological reference materials.

  2. Influence of physical properties and chemical composition of sample on formation of aerosol particles generated by nanosecond laser ablation at 213 nm

    Energy Technology Data Exchange (ETDEWEB)

    Hola, Marketa, E-mail: mhola@sci.muni.c [Department of Chemistry, Faculty of Science, Masaryk University, Kotlarska 2, 611 37 Brno (Czech Republic); Konecna, Veronika [Department of Chemistry, Faculty of Science, Masaryk University, Kotlarska 2, 611 37 Brno (Czech Republic); Mikuska, Pavel [Institute of Analytical Chemistry, Academy of Sciences of the Czech Republic v.v.i., Veveri 97, 602 00 Brno (Czech Republic); Kaiser, Jozef [Institute of Physical Engineering, Faculty of Mechanical Engineering, Brno University of Technology, Technicka 2896/2, 616 69 Brno (Czech Republic); Kanicky, Viktor [Department of Chemistry, Faculty of Science, Masaryk University, Kotlarska 2, 611 37 Brno (Czech Republic)

    2010-01-15

    The influence of sample properties and composition on the size and concentration of aerosol particles generated by nanosecond Nd:YAG laser ablation at 213 nm was investigated for three sets of different materials, each containing five specimens with a similar matrix (Co-cemented carbides with variable contents of W and Co, steel samples with minor differences in elemental content, and silica glasses of various colors). The concentration of ablated particles (particle number concentration, PNC) was measured in two size ranges (10-250 nm and 0.25-17 μm) using an optical aerosol spectrometer. The shapes and volumes of the ablation craters were obtained by Scanning Electron Microscopy (SEM) and by an optical profilometer, respectively. Additionally, the structure of the laser-generated particles was studied after their collection on a filter using SEM. The particle concentration measurements showed a significant dominance of particles smaller than 250 nm, irrespective of the kind of material. Even though the number of particles larger than 0.25 μm is negligible (up to 0.1%), the volume of large particles leaving the ablation cell can reach 50% of the total particle volume, depending on the material. Study of the ablation craters and the laser-generated particles showed varying numbers of particles produced by the different ablation mechanisms (particle splashing or condensation), but SEM of particles collected on the membrane filter revealed a similar character of released particles for all materials. The generated aerosol always consisted of two main structures: spherical particles with diameters from tenths of a micrometer to a few micrometers, originally ejected from the molten surface layer, and μm-sized 'fibres' composed of primary agglomerates with diameters in the range between tens and hundreds of nanometers. The shape and structure of the ablation craters were in good agreement with the particle concentration

  3. The HLA-net GENE[RATE] pipeline for effective HLA data analysis and its application to 145 population samples from Europe and neighbouring areas.

    Science.gov (United States)

    Nunes, J M; Buhler, S; Roessli, D; Sanchez-Mazas, A

    2014-05-01

    In this review, we present for the first time an integrated version of the Gene[rate] computer tools, which have been developed over the last 5 years to analyse human leukocyte antigen (HLA) data in human populations, as well as the results of their application to a large dataset of 145 HLA-typed population samples from Europe and its two neighbouring areas, North Africa and West Asia, now forming part of the Gene[va] database. All these computer tools and genetic data are now publicly available through a newly designed bioinformatics platform, HLA-net, presented here as a main achievement of the HLA-NET scientific programme. The Gene[rate] pipeline offers user-friendly computer tools to estimate allele and haplotype frequencies, to test Hardy-Weinberg equilibrium (HWE), selective neutrality and linkage disequilibrium, to recode HLA data, to convert file formats, to display population frequencies of chosen alleles and haplotypes in selected geographic regions, and to perform genetic comparisons among chosen sets of population samples, including new data provided by the user. Both numerical and graphical outputs are generated, the latter being highly explicit and of publication quality. All these analyses can be performed on the pipeline after scrupulous validation of the population sample's characterisation and HLA typing reporting according to HLA-NET recommendations. The Gene[va] database offers direct access to the HLA-A, -B, -C, -DQA1, -DQB1, -DRB1 and -DPB1 frequencies and summary statistics of the 145 population samples that have successfully passed these HLA-NET 'filters', representing three European subregions (South-East, North-East and Central-West Europe) and two neighbouring areas (North Africa, as far as Sudan, and West Asia, as far as South India). The analysis of these data, summarized in this review, shows substantial genetic variation at the regional level in this continental area. These results have important implications for population genetics
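
Among the tests the Gene[rate] pipeline offers, Hardy-Weinberg equilibrium is the simplest to illustrate. The sketch below is a minimal biallelic chi-square statistic, not the Gene[rate] implementation (which handles highly multi-allelic HLA loci with its own estimators):

```python
def hwe_chi_square(n_AA, n_Aa, n_aa):
    # Chi-square statistic comparing observed genotype counts with the
    # Hardy-Weinberg expectations p^2, 2pq, q^2 at a biallelic locus.
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)  # frequency of allele A
    q = 1.0 - p
    expected = {"AA": n * p * p, "Aa": 2 * n * p * q, "aa": n * q * q}
    observed = {"AA": n_AA, "Aa": n_Aa, "aa": n_aa}
    return sum((observed[g] - expected[g]) ** 2 / expected[g] for g in expected)

# A sample in perfect HWE gives a statistic of 0:
print(hwe_chi_square(25, 50, 25))  # 0.0
```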

  4. Calibrated acoustic emission system records M -3.5 to M -8 events generated on a saw-cut granite sample

    Science.gov (United States)

    McLaskey, Gregory C.; Lockner, David A.

    2016-01-01

    Acoustic emission (AE) analyses have been used for decades for rock mechanics testing, but because AE systems are not typically calibrated, the absolute sizes of dynamic microcrack growth and other physical processes responsible for the generation of AEs are poorly constrained. We describe a calibration technique for the AE recording system as a whole (transducers + amplifiers + digitizers + sample + loading frame) that uses the impact of a 4.76-mm free-falling steel ball bearing as a reference source. We demonstrate the technique on a 76-mm diameter cylinder of Westerly granite loaded in a triaxial deformation apparatus at 40 MPa confining pressure. The ball bearing is dropped inside a cavity within the sample while inside the pressure vessel. We compare this reference source to conventional AEs generated during loading of a saw-cut fault in a second granite sample. All located AEs occur on the saw-cut surface and have moment magnitudes ranging from M −5.7 down to at least M −8. Dynamic events rupturing the entire simulated fault surface (stick–slip events) have measurable stress drop and macroscopic slip and radiate seismic waves similar to those from an M −3.5 earthquake. The largest AE events that do not rupture the entire fault are M −5.7. For these events, we also estimate the corner frequency (200–300 kHz), and, assuming the Brune model, we estimate source dimensions of 4–6 mm. These AE sources are larger than the 0.2 mm grain size and smaller than the 76 × 152 mm fault surface.
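
The Brune-model conversion from corner frequency to source dimension quoted above can be checked in one line; the 3000 m/s shear-wave velocity for granite and the function name are our illustrative assumptions:

```python
import math

def brune_source_radius(corner_freq_hz, shear_velocity_m_s=3000.0):
    # Brune (1970): r = 2.34 * beta / (2 * pi * f_c). The 3000 m/s
    # shear-wave velocity for granite is our assumed value.
    return 2.34 * shear_velocity_m_s / (2.0 * math.pi * corner_freq_hz)

# Corner frequencies of 200-300 kHz map to mm-scale sources,
# consistent with the 4-6 mm dimensions quoted in the record:
print(round(brune_source_radius(200e3) * 1000.0, 1))  # 5.6 (mm)
print(round(brune_source_radius(300e3) * 1000.0, 1))  # 3.7 (mm)
```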

  5. Development of novel and sensitive methods for the determination of sulfide in aqueous samples by hydrogen sulfide generation-inductively coupled plasma-atomic emission spectroscopy.

    Science.gov (United States)

    Colon, M; Todolí, J L; Hidalgo, M; Iglesias, M

    2008-02-25

    Two new, simple and accurate methods for the determination of sulfide (S(2-)) at low levels (μg L(-1)) in aqueous samples were developed. The generation of hydrogen sulfide (H(2)S) took place in a coil where sulfide reacted with hydrochloric acid. The resulting H(2)S was then introduced as a vapor into an inductively coupled plasma-atomic emission spectrometer (ICP-AES) and the sulfur emission intensity was measured at 180.669 nm. Compared with the introduction of aqueous sulfide, introducing sulfur as H(2)S enhanced the sulfur emission signal. By setting a gas separator at the end of the reaction coil, reduced sulfur species in the form of H(2)S were removed from the water matrix and interferences could thus be avoided. Alternatively, the gas separator was replaced by a nebulizer/spray chamber combination to introduce the sample matrix and reagents into the plasma. This methodology allowed the determination of both sulfide and sulfate in aqueous samples. For both methods the linear response was found to range from 5 μg L(-1) to 25 mg L(-1) of sulfide. Detection limits of 5 μg L(-1) and 6 μg L(-1) were obtained with and without the gas separator, respectively. These new methods were evaluated by comparison to the standard potentiometric method and were successfully applied to the analysis of reduced sulfur species in environmental waters.

  6. In-line electrochemical reagent generation coupled to a flow injection biamperometric system for the determination of sulfite in beverage samples.

    Science.gov (United States)

    de Paula, Nattany T G; Barbosa, Elaine M O; da Silva, Paulo A B; de Souza, Gustavo C S; Nascimento, Valberes B; Lavorante, André F

    2016-07-15

    This work reports an in-line electrochemical reagent generation system coupled to a flow injection biamperometric procedure for the determination of SO3(2-). The method was based on a redox reaction between the I3(-) and SO3(2-) ions, after the diffusion of SO2 through a gas diffusion chamber. Under optimum experimental conditions, a linear response ranging from 1.0 to 12.0 mg L(-1) (R=0.9999 and n=7), detection and quantification limits estimated at 0.26 and 0.86 mg L(-1), respectively, a relative standard deviation of 0.4% (n=10) for a reference solution of 4.0 mg L(-1) SO3(2-), and a sampling throughput of 40 determinations per hour were achieved. Addition and recovery tests with juice and wine samples were performed, resulting in recoveries between 92% and 110%. There were no significant differences at a 95% confidence level in the analysis of eight samples when comparing the new method with a reference procedure. Copyright © 2016 Elsevier Ltd. All rights reserved.
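
Detection and quantification limits of the kind quoted here are commonly estimated from the calibration as LOD = 3.3σ/slope and LOQ = 10σ/slope; a sketch of that convention with invented blank statistics (the paper's own estimator and data are not given in this record):

```python
def lod_loq(blank_sd, slope):
    # LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope: the common
    # calibration-based convention (the paper's estimator may differ).
    return 3.3 * blank_sd / slope, 10.0 * blank_sd / slope

# Invented blank standard deviation and calibration slope,
# chosen only to show the arithmetic:
lod, loq = lod_loq(blank_sd=0.008, slope=0.1)
print(round(lod, 3), round(loq, 3))  # 0.264 0.8
```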

  7. Surface plasmon resonance sensor based on gold nanoparticles and cold vapour generation technique for the detection of mercury in aqueous samples

    Science.gov (United States)

    Castillo, Jimmy; Chirinos, José; Gutiérrez, Héctor; La Cruz, Marie

    2017-09-01

    In this work, a surface plasmon resonance sensor based on gold nanoparticles was developed for the determination of Hg. The sensor follows the change in the signal from solutions in contact with atomic mercury previously generated by reaction with sodium borohydride. Mie theory predicts that an Hg film as thin as 5 nm induces a significant reduction of the surface plasmon resonance signal of 40 nm gold nanoparticles. This property was used for quantification purposes in the sensor. The device provides a limit of detection of 172 ng/L, which compares with the 91 ng/L obtained with atomic fluorescence, a common technique used for Hg quantification in drinking water. This result is relevant considering that it was not necessary to functionalize the nanoparticles or to use nanoparticles deposited on a substrate. Also, because the Hg is released from the matrix, the surface plasmon resonance signal was not affected by concomitant elements in the sample.

  8. Simple and rapid determination methods for low-level radioactive wastes generated from nuclear research facilities. Guidelines for determination of radioactive waste samples

    International Nuclear Information System (INIS)

    Kameo, Yutaka; Shimada, Asako; Ishimori, Ken-ichiro; Haraga, Tomoko; Katayama, Atsushi; Nakashima, Mikio; Hoshi, Akiko

    2009-10-01

    Analytical methods were developed for the simple and rapid determination of U, Th, and several other nuclides selected as important for the safety assessment of the disposal of wastes generated from research facilities at the Nuclear Science Research Institute and the Oarai Research and Development Center. The present analytical methods are intended to be applied to solidified products made from miscellaneous wastes by plasma melting in the Advanced Volume Reduction Facilities. In order to establish a system to analyze the important nuclides in the solidified products routinely and at low cost, we have advanced the development of a high-efficiency non-destructive measurement technique for γ-ray emitting nuclides, simple and rapid methods for the pretreatment of solidified product samples and subsequent radiochemical separations, and rapid determination methods for long-lived nuclides. In the present paper, we summarize the methods developed as guidelines for the determination of radionuclides in the low-level solidified products. (author)

  9. Imaging of Caenorhabditis elegans samples and sub-cellular localization of new generation photosensitizers for photodynamic therapy, using non-linear microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Filippidis, G [Institute of Electronic Structure and Laser, Foundation of Research and Technology-Hellas, PO Box 1527, 71110 Heraklion (Greece); Kouloumentas, C [Institute of Electronic Structure and Laser, Foundation of Research and Technology-Hellas, PO Box 1527, 71110 Heraklion (Greece); Kapsokalyvas, D [Institute of Electronic Structure and Laser, Foundation of Research and Technology-Hellas, PO Box 1527, 71110 Heraklion (Greece); Voglis, G [Institute of Molecular Biology and Biotechnology, Foundation of Research and Technology, Heraklion 71110, Crete (Greece); Tavernarakis, N [Institute of Molecular Biology and Biotechnology, Foundation of Research and Technology, Heraklion 71110, Crete (Greece); Papazoglou, T G [Institute of Electronic Structure and Laser, Foundation of Research and Technology-Hellas, PO Box 1527, 71110 Heraklion (Greece)

    2005-08-07

    Two-photon excitation fluorescence (TPEF) and second-harmonic generation (SHG) are relatively new and promising tools for the imaging and mapping of biological structures and processes at the microscopic level. The combination of the two image-contrast modes in a single instrument can provide unique and complementary information concerning the structure and function of tissues and individual cells. Wider application of this innovative technique by the biological community is limited by the high price of commercial multiphoton microscopes. In this study, a compact, inexpensive and reliable setup utilizing femtosecond pulses for excitation was developed for TPEF and SHG imaging of biological samples. Specific cell types of the nematode Caenorhabditis elegans were imaged, and the endogenous structural proteins of the worm responsible for the observed SHG signals were detected. Additionally, the binding of different photosensitizers in the HL-60 cell line was investigated using non-linear microscopy. The sub-cellular localization of a new generation of photosensitizers that are very promising for photodynamic therapy (PDT), Hypericum perforatum L. extracts, was achieved. The sub-cellular localization of these novel photosensitizers was linked with their photodynamic action during PDT, and possible mechanisms of cell killing are elucidated.

  10. Non-Destructive Method by Gamma Sampling Measurements for Radiological Characterization of a Steam Generator: Physical and Numerical Modeling for ANIMMA (23-27 June 2013)

    International Nuclear Information System (INIS)

    Auge, G.; Rottner, B.; Dubois, C.

    2013-06-01

    The radiological characterization of a steam generator consists of evaluating the global radiological activity in the tube bundle. In this paper, we present a non-destructive method and the analysis of gamma sampling measurements from a sample of U-tubes in the bundle. On site, the methodology is fairly easy to implement, but the analysis of the results is more complicated because of the long path of the gamma rays (those from 60Co are quite penetrating) and the heterogeneous activity of the U-tube bundle, whose tubes do not all share the same life cycle. We explain why the periodic spatial arrangement of the tubes further complicates the analysis. Furthermore, the environment of every measured tube has to be taken into account because of the external influence of the activity of all the other U-tubes (the nearest, the most distant, and potential hot spots). A great number of independent influence coefficients had to be considered (roughly 18 million). The problem is solved with a physical and numerical model using a Cholesky algorithm, which saves machine time. (authors)
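
The inversion step described above, recovering per-tube activities from measurements coupled by influence coefficients, amounts to solving a large symmetric positive-definite linear system, which is exactly what a Cholesky factorization handles efficiently. A miniature sketch (toy dimensions and random coefficients, not the authors' 18-million-coefficient model):

```python
import numpy as np

# Hypothetical miniature of the measurement model: measured count rates
# m = A @ q, where A holds influence coefficients (contribution of each
# tube's activity q_j to each measurement point i).
rng = np.random.default_rng(0)
n = 6                                      # real problem: millions of coefficients
A = rng.random((n, n)) * 0.1 + np.eye(n)   # diagonally dominant -> well conditioned
q_true = rng.random(n)                     # unknown per-tube activities
m = A @ q_true                             # simulated gamma measurements

# The normal-equation matrix is symmetric positive definite, so factor it
# once with Cholesky and recover q with two cheap triangular solves.
M = A.T @ A
c = np.linalg.cholesky(M)                  # M = c @ c.T, c lower triangular
y = np.linalg.solve(c, A.T @ m)            # forward substitution
q = np.linalg.solve(c.T, y)                # back substitution

assert np.allclose(q, q_true)
```

For repeated right-hand sides (many measurement campaigns over the same bundle geometry), the factorization is reused, which is where the machine-time saving comes from.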

  11. Development of novel and sensitive methods for the determination of sulfide in aqueous samples by hydrogen sulfide generation-inductively coupled plasma-atomic emission spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Colon, M. [Department of Chemistry, University of Girona, Campus Montilivi, 17071 Girona (Spain); Departamento de Quimica Analitica, Nutricion y Bromatologia, University of Alicante, 03080 Alicante (Spain); Todoli, J.L. [Departamento de Quimica Analitica, Nutricion y Bromatologia, University of Alicante, 03080 Alicante (Spain); Hidalgo, M. [Department of Chemistry, University of Girona, Campus Montilivi, 17071 Girona (Spain); Iglesias, M. [Department of Chemistry, University of Girona, Campus Montilivi, 17071 Girona (Spain)], E-mail: monica.iglesias@udg.es

    2008-02-25

    Two new, simple and accurate methods for the determination of sulfide (S{sup 2-}) at low levels ({mu}g L{sup -1}) in aqueous samples were developed. The generation of hydrogen sulfide (H{sub 2}S) took place in a coil where sulfide reacted with hydrochloric acid. The resulting H{sub 2}S was then introduced as a vapor into an inductively coupled plasma-atomic emission spectrometer (ICP-AES) and the sulfur emission intensity was measured at 180.669 nm. Introducing sulfur as H{sub 2}S enhanced the sulfur emission signal compared with the direct introduction of aqueous sulfide. By setting a gas separator at the end of the reaction coil, reduced sulfur species in the form of H{sub 2}S were removed from the water matrix and interferences could thus be avoided. Alternatively, the gas separator was replaced by a nebulizer/spray chamber combination to introduce the sample matrix and reagents into the plasma; this configuration allowed the determination of both sulfide and sulfate in aqueous samples. For both methods the linear response ranged from 5 {mu}g L{sup -1} to 25 mg L{sup -1} of sulfide. Detection limits of 5 {mu}g L{sup -1} and 6 {mu}g L{sup -1} were obtained with and without the gas separator, respectively. These new methods were evaluated by comparison with the standard potentiometric method and were successfully applied to the analysis of reduced sulfur species in environmental waters.
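
Detection limits like those quoted above are conventionally estimated from the standard deviation of blank replicates and the slope of the linear calibration. A sketch of that arithmetic, with purely illustrative numbers (not values from the paper):

```python
import statistics

# Hypothetical blank replicate intensities (arbitrary counts) and a
# hypothetical calibration slope in counts per (ug L-1); both are
# illustrative assumptions, not data from the study.
blank_signals = [102.0, 98.5, 101.2, 99.8, 100.6, 97.9, 100.1, 99.3, 101.9, 98.7]
slope = 12.4  # counts per (ug L-1), from a linear calibration fit

sd_blank = statistics.stdev(blank_signals)  # sample standard deviation, s
lod = 3 * sd_blank / slope                  # 3s detection limit, in ug L-1
print(f"LOD = {lod:.2f} ug/L")
```

The same 3s convention underlies the detection limits reported in several of the analytical records in this listing.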

  12. A Next-Generation Sequencing Data Analysis Pipeline for Detecting Unknown Pathogens from Mixed Clinical Samples and Revealing Their Genetic Diversity.

    Directory of Open Access Journals (Sweden)

    Yu-Nong Gong

    Full Text Available Forty-two cytopathic effect (CPE)-positive isolates were collected from 2008 to 2012. None of these isolates could be identified as a known viral pathogen by routine diagnostic assays. They were pooled into 8 groups of 5-6 isolates to reduce the sequencing cost. Next-generation sequencing (NGS) was conducted for each group of mixed samples, and the proposed data analysis pipeline was used to identify viral pathogens in these mixed samples. Polymerase chain reaction (PCR) or enzyme-linked immunosorbent assay (ELISA) was then conducted individually for each of these 42 isolates, depending on the viral types predicted in each group. Two isolates remained unknown after these tests. Iteration mapping was therefore implemented for each of these 2 isolates, and predicted human parechovirus (HPeV) in both. In summary, our NGS pipeline detected the following viruses among the 42 isolates: 29 human rhinoviruses (HRVs), 10 HPeVs, 1 human adenovirus (HAdV), 1 echovirus and 1 rotavirus. We then focused on the 10 identified Taiwanese HPeVs because of their reported clinical significance over HRVs. Their genomes were assembled and their genetic diversity was explored. One novel 6-bp deletion was found in one HPeV-1 virus. In terms of nucleotide heterogeneity, 64 genetic variants were detected in these HPeVs using the mapped NGS reads. Most importantly, a recombination event was found between our HPeV-3 and a known HPeV-4 strain in the database, and a similar event was detected in the other HPeV-3 strains in the same clade of the phylogenetic tree. These findings demonstrate that the proposed NGS data analysis pipeline identified unknown viruses from the mixed clinical samples, revealed their genetic identity and variants, and characterized their genetic features in terms of viral evolution.
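
The "genetic variants detected from the mapped NGS reads" step amounts to scanning the read pileup at each genome position and flagging sites where a non-reference base exceeds some frequency and depth threshold. A minimal sketch of that idea (toy reference and pileup, hypothetical thresholds, not the study's actual caller):

```python
from collections import Counter

# Toy pileup: position -> string of bases observed in mapped reads at
# that position. Thresholds are illustrative assumptions.
reference = "ACGTAC"
pileup = {
    0: "AAAAAAAAAA",
    1: "CCCCCCCCGG",   # 20% G: candidate variant
    2: "GGGGGGGGGG",
    3: "TTTTTTTTTA",   # 10% A: below threshold
}

def call_variants(reference, pileup, min_freq=0.15, min_depth=10):
    """Report (position, ref_base, alt_base, frequency) for sites where a
    non-reference base reaches min_freq at coverage >= min_depth."""
    variants = []
    for pos, bases in pileup.items():
        if len(bases) < min_depth:
            continue
        for base, n in Counter(bases).items():
            if base != reference[pos] and n / len(bases) >= min_freq:
                variants.append((pos, reference[pos], base, n / len(bases)))
    return variants

print(call_variants(reference, pileup))   # -> [(1, 'C', 'G', 0.2)]
```

Real pipelines additionally weigh base and mapping qualities and strand balance before accepting a variant.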

  13. Highly efficient electrocatalytic vapor generation of methylmercury based on the gold particles deposited glassy carbon electrode: A typical application for sensitive mercury speciation analysis in fish samples.

    Science.gov (United States)

    Shi, Meng-Ting; Yang, Xin-An; Qin, Li-Ming; Zhang, Wang-Bing

    2018-09-26

    A gold-particle-deposited glassy carbon electrode (Au/GCE) was used for the first time in electrochemical vapor generation (ECVG) and demonstrated excellent catalytic properties for the electrochemical conversion of aqueous mercury, especially methylmercury (CH3Hg+), to gaseous mercury. Systematic investigation showed that either highly consistent or distinctly different atomic fluorescence spectroscopy signals of CH3Hg+ and Hg2+ can be achieved by controlling the electrolytic parameters of ECVG. On this basis, a new green and accurate method for mercury speciation analysis was established, exploiting the distinct electrochemical reaction behavior of Hg2+ and CH3Hg+ on the modified electrode. Furthermore, electrochemical impedance spectra and square wave voltammetry indicated that the ECVG reaction of CH3Hg+ may follow an electrocatalytic mechanism. Under the selected conditions, the limits of detection of Hg2+ and CH3Hg+ are 5.3 ng L-1 and 4.4 ng L-1 for liquid samples and 0.53 pg mg-1 and 0.44 pg mg-1 for solid samples, respectively. The precision of five replicate measurements is better than 6% for Hg2+ and CH3Hg+ concentrations ranging from 0.2 to 15.0 μg L-1. The accuracy and practicability of the proposed method were verified by analyzing the mercury content in a certified reference material and in several fish and water samples. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Clinical Application of Picodroplet Digital PCR Technology for Rapid Detection of EGFR T790M in Next-Generation Sequencing Libraries and DNA from Limited Tumor Samples.

    Science.gov (United States)

    Borsu, Laetitia; Intrieri, Julie; Thampi, Linta; Yu, Helena; Riely, Gregory; Nafa, Khedoudja; Chandramohan, Raghu; Ladanyi, Marc; Arcila, Maria E

    2016-11-01

    Although next-generation sequencing (NGS) is a robust technology for comprehensive assessment of EGFR-mutant lung adenocarcinomas with acquired resistance to tyrosine kinase inhibitors, it may not provide sufficiently rapid and sensitive detection of the EGFR T790M mutation, the most clinically relevant resistance biomarker. Here, we describe a digital PCR (dPCR) assay for rapid T790M detection on aliquots of NGS libraries prepared for comprehensive profiling, fully maximizing broad genomic analysis of limited samples. Tumor DNAs from patients with EGFR-mutant lung adenocarcinomas and acquired resistance to epidermal growth factor receptor inhibitors were prepared for Memorial Sloan-Kettering-Integrated Mutation Profiling of Actionable Cancer Targets sequencing, a hybrid capture-based assay interrogating 410 cancer-related genes. Precapture library aliquots were used for rapid EGFR T790M testing by dPCR, and results were compared with NGS and locked nucleic acid-PCR Sanger sequencing (the reference high-sensitivity method). Seventy resistance samples showed 99% concordance with the reference high-sensitivity method in accuracy studies. Input as low as 2.5 ng provided a sensitivity of 1%, which improved further with increasing DNA input. dPCR on libraries required less DNA and performed better than dPCR on genomic DNA directly. dPCR on NGS libraries is a robust and rapid approach to EGFR T790M testing, allowing the most economical utilization of limited material for comprehensive assessment. The same assay can also be performed directly on any limited DNA source and on cell-free DNA. Copyright © 2016 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
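
Droplet-based dPCR quantification, of the kind this record describes, rests on Poisson statistics: because a droplet can hold more than one template copy, the mean copies per droplet is recovered from the fraction p of positive droplets as lambda = -ln(1 - p). A sketch with hypothetical partition counts (the droplet volume is an illustrative assumption, not a parameter of the assay described):

```python
import math

def dpcr_copies(positive, total, droplet_volume_nl=0.005):
    """Estimate template quantity from digital PCR partition counts.

    Poisson correction: with fraction p of positive droplets, the mean
    number of template copies per droplet is lambda = -ln(1 - p).
    droplet_volume_nl is a hypothetical figure for illustration.
    """
    p = positive / total
    lam = -math.log(1.0 - p)        # mean copies per droplet
    copies = lam * total            # total copies in the reaction
    conc = lam / droplet_volume_nl  # copies per nL
    return lam, copies, conc

lam, copies, conc = dpcr_copies(positive=1500, total=20000)
print(f"{lam:.4f} copies/droplet, {copies:.0f} copies total")
```

Without the Poisson correction (i.e., simply counting positives), quantification would be biased low whenever droplets receive multiple copies.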

  15. A Hidden Markov Model Approach for Simultaneously Estimating Local Ancestry and Admixture Time Using Next Generation Sequence Data in Samples of Arbitrary Ploidy.

    Science.gov (United States)

    Corbett-Detig, Russell; Nielsen, Rasmus

    2017-01-01

    Admixture, the mixing of genomes from divergent populations, is increasingly appreciated as a central process in evolution. To characterize and quantify patterns of admixture across the genome, a number of methods have been developed for local ancestry inference. However, existing approaches have a number of shortcomings. First, all local ancestry inference methods require some prior assumption about the expected ancestry tract lengths. Second, existing methods generally require genotypes, which are not feasible to obtain for many next-generation sequencing projects. Third, many methods assume samples are diploid; however, a wide variety of sequencing applications fail to meet this assumption. To address these issues, we introduce a novel hidden Markov model for estimating local ancestry that models the read pileup data rather than genotypes, is generalized to arbitrary ploidy, and can estimate the time since admixture during local ancestry inference. We demonstrate that our method can simultaneously estimate the time since admixture and local ancestry with good accuracy, and that it performs well on samples of high ploidy, i.e., 100 or more chromosomes. As this method is very general, we expect it will be useful for local ancestry inference in a wider variety of populations than has previously been possible. We then applied our method to pooled sequencing data derived from populations of Drosophila melanogaster on an ancestry cline on the east coast of North America. We find that local recombination rates are negatively correlated with the proportion of African ancestry, suggesting that selection against foreign ancestry is least efficient in low-recombination regions. Finally, we show that clinal outlier loci are enriched for genes associated with gene regulatory functions, consistent with a role of regulatory evolution in the ecological adaptation of admixed D. melanogaster populations. Our results illustrate the potential of local ancestry
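
The core machinery of such a method is a hidden Markov model whose hidden states are ancestry configurations and whose emissions are the likelihoods of the observed read data; the switch rate between states encodes the expected tract length (and hence the time since admixture). A toy two-state forward algorithm conveys the structure (all numbers are illustrative, and this is not the authors' model, which handles arbitrary ploidy and pileup likelihoods):

```python
import numpy as np

# Toy forward algorithm for a two-state ancestry HMM over a haploid sample.
states = ["pop_A", "pop_B"]
init = np.array([0.5, 0.5])
# The switch rate reflects recombination since admixture: a higher rate
# means shorter expected ancestry tracts.
switch = 0.01
trans = np.array([[1 - switch, switch],
                  [switch, 1 - switch]])
# emit[t, s]: likelihood of the read data at site t given ancestry s
# (in the real method, computed from the read pileup).
emit = np.array([[0.9, 0.2],
                 [0.8, 0.3],
                 [0.1, 0.7],
                 [0.2, 0.9]])

alpha = init * emit[0]                 # forward variable at the first site
for t in range(1, emit.shape[0]):
    alpha = (alpha @ trans) * emit[t]  # propagate and absorb the emission
likelihood = alpha.sum()               # total data likelihood
print(f"sequence likelihood: {likelihood:.6f}")
```

Maximizing this likelihood over the switch rate is what lets the model estimate the time since admixture jointly with local ancestry.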

  16. Performance of next-generation sequencing on small tumor specimens and/or low tumor content samples using a commercially available platform.

    Directory of Open Access Journals (Sweden)

    Scott Morris

    Full Text Available Next-generation sequencing (NGS) tests are usually performed on relatively small core biopsy or fine needle aspiration (FNA) samples. Data are limited on what amount of tumor by volume or minimum number of FNA passes is needed to yield sufficient material for running NGS. We sought to identify the amount of tumor needed for running the PCDx NGS platform. 2,723 consecutive tumor tissues of all cancer types were queried and reviewed for inclusion. Information on tumor volume, success of performing NGS, and results of NGS were compiled. Assessment of sequence analysis, mutation calling and sensitivity, quality control, drug associations, and data aggregation and analysis was performed. 6.4% of samples were rejected from all testing due to insufficient tumor quantity. The number of genes with insufficient sensitivity to make definitive mutation calls increased as the percentage of tumor decreased, reaching statistical significance below 5% tumor content. The number of drug associations also decreased with a lower percentage of tumor, but this difference only became significant between 1-3%. The number of drug associations did decrease with smaller tissue size, as expected. Neither specimen size nor percentage of tumor affected the ability to pass mRNA quality control. A tumor area of 10 mm2 provides a good margin of error for specimens to yield adequate drug association results. Specimen suitability remains a major obstacle to clinical NGS testing. We determined that PCR-based library creation methods allow smaller specimens, and those with a lower percentage of tumor cells, to be run on the PCDx NGS platform.
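
The loss of calling sensitivity at low tumor content follows from binomial sampling of variant-supporting reads: at tumor fraction f, a heterozygous somatic variant has expected allele fraction f/2, and the chance of observing enough supporting reads at a given depth drops quickly as f falls. A sketch of that relationship (the depth and read-count threshold are illustrative assumptions, not PCDx parameters):

```python
from math import comb

def detection_probability(tumor_fraction, depth, min_alt_reads=4):
    """Probability of observing >= min_alt_reads variant reads at a
    heterozygous somatic site, modelling alt reads as Binomial(depth, f/2).
    Threshold and depth are illustrative assumptions."""
    p = tumor_fraction / 2.0
    below = sum(comb(depth, k) * p**k * (1 - p)**(depth - k)
                for k in range(min_alt_reads))
    return 1.0 - below

for f in (0.20, 0.05, 0.02):
    print(f"tumor {f:.0%}: P(detect) = {detection_probability(f, depth=500):.3f}")
```

Under these toy assumptions detection is essentially certain at 20% tumor content but degrades noticeably by 2%, mirroring the significance threshold near 5% reported above.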

  17. Highlight on the indigenous organic molecules detected on Mars by SAM and potential sources of artifacts and backgrounds generated by the sample preparation

    Science.gov (United States)

    Buch, A.; Belmahdi, I.; Szopa, C.; Freissinet, C.; Glavin, D. P.; Coll, P. J.; Cabane, M.; Millan, M.; Eigenbrode, J. L.; Navarro-Gonzalez, R.; Stern, J. C.; Pinnick, V. T.; Coscia, D.; Teinturier, S.; Stambouli, M.; Dequaire, T.; Mahaffy, P. R.

    2015-12-01

    Among the experiments exploring the martian soil aboard the Curiosity rover, the SAM experiment is mainly dedicated to the search for indigenous organic compounds. To reach its goals SAM can operate in different analysis modes: pyrolysis-GC-MS and pyrolysis-MS (EGA). In addition, SAM includes wet chemistry experiments [1] to support the extraction of polar organic compounds from solid samples, improving their detection either by increasing the release of chemical species from solid sample matrices or by changing their chemical structure to make the compounds more amenable to gas chromatography-mass spectrometry (GCMS). The two wet chemistry capabilities of SAM provide alternatives to the nominal inert thermal desorption/pyrolysis analytical protocol and are better suited to polar components: MTBSTFA derivatization [2-3] and TMAH thermochemolysis [4-5]. Here we focus on the MTBSTFA derivatization experiment. In order to build a support for the interpretation of SAM results, we have investigated the artifact and background sources generated by the whole analytical process. Solid samples were heated up to approximately 840°C at a rate of 35°C/min under He flow. For GC analyses, the majority of the gas released was trapped on a hydrocarbon trap (Tenax®) over a specific temperature range. Volatiles adsorbed on the GC injection trap (IT) were then released into the GC column (CLP-MXT 30m x 0.25mm x 0.25μm) by rapidly heating the IT to 300°C. Then, in order to better understand which of the detected compounds come from internal reactions, we performed several lab experiments to mimic the SAM device. Among the sources of artifacts, we tested: (1) the thermal stability of, and the organic material released during the degradation of, Tenax® and Carbosieve, (2) the impact of MTBSTFA and of a mixture of DMF and MTBSTFA on the adsorbent, (3) the reaction between the different adsorbents (Tenax® and Carbosieve) and calcium perchlorate and then (4) the sources

  18. Application of High-Throughput Next-Generation Sequencing for HLA Typing on Buccal Extracted DNA: Results from over 10,000 Donor Recruitment Samples.

    Directory of Open Access Journals (Sweden)

    Yuxin Yin

    Full Text Available Unambiguous HLA typing is important in hematopoietic stem cell transplantation (HSCT), HLA disease association studies, and solid organ transplantation. However, current molecular typing methods only interrogate the antigen recognition site (ARS) of HLA genes, resulting in many cis-trans ambiguities that require additional typing methods to resolve. Here we report high-resolution HLA typing of 10,063 National Marrow Donor Program (NMDP) registry donors using a long-range PCR next-generation sequencing (NGS) approach on buccal swab DNA. Multiplex long-range PCR primers amplified the full length of the HLA class I genes (A, B, C) from promoter to 3' UTR. Class II genes (DRB1, DQB1) were amplified from exon 2 through part of exon 4. PCR amplicons were pooled and sheared using Covaris fragmentation. Library preparation was performed using the Illumina TruSeq Nano kit on the Beckman FX automated platform. Each sample was tagged with a unique barcode, followed by 2×250 bp paired-end sequencing on the Illumina MiSeq. HLA typing was assigned using Omixon Twin software, which combines two independent computational algorithms to ensure high confidence in allele calling. Consensus sequences and typing results were reported in Histoimmunogenetics Markup Language (HML) format. All homozygous alleles were confirmed by Luminex SSO typing and exon novelties were confirmed by Sanger sequencing. Using this automated workflow, over 10,063 NMDP registry donors were successfully typed at high resolution by NGS. Despite the known challenges of nucleic acid degradation and low DNA concentration commonly associated with buccal-based specimens, 97.8% of samples were successfully amplified using long-range PCR. Among these, 98.2% were successfully reported by NGS, with an accuracy rate of 99.84% in an independent blind Quality Control audit performed by the NMDP. In this study, NGS-HLA typing identified 23 null alleles (0.023%), 92 rare alleles (0.091%) and 42 exon novelties (0.042%). Long

  19. Application of High-Throughput Next-Generation Sequencing for HLA Typing on Buccal Extracted DNA: Results from over 10,000 Donor Recruitment Samples.

    Science.gov (United States)

    Yin, Yuxin; Lan, James H; Nguyen, David; Valenzuela, Nicole; Takemura, Ping; Bolon, Yung-Tsi; Springer, Brianna; Saito, Katsuyuki; Zheng, Ying; Hague, Tim; Pasztor, Agnes; Horvath, Gyorgy; Rigo, Krisztina; Reed, Elaine F; Zhang, Qiuheng

    2016-01-01

    Unambiguous HLA typing is important in hematopoietic stem cell transplantation (HSCT), HLA disease association studies, and solid organ transplantation. However, current molecular typing methods only interrogate the antigen recognition site (ARS) of HLA genes, resulting in many cis-trans ambiguities that require additional typing methods to resolve. Here we report high-resolution HLA typing of 10,063 National Marrow Donor Program (NMDP) registry donors using a long-range PCR next-generation sequencing (NGS) approach on buccal swab DNA. Multiplex long-range PCR primers amplified the full length of the HLA class I genes (A, B, C) from promoter to 3' UTR. Class II genes (DRB1, DQB1) were amplified from exon 2 through part of exon 4. PCR amplicons were pooled and sheared using Covaris fragmentation. Library preparation was performed using the Illumina TruSeq Nano kit on the Beckman FX automated platform. Each sample was tagged with a unique barcode, followed by 2×250 bp paired-end sequencing on the Illumina MiSeq. HLA typing was assigned using Omixon Twin software, which combines two independent computational algorithms to ensure high confidence in allele calling. Consensus sequences and typing results were reported in Histoimmunogenetics Markup Language (HML) format. All homozygous alleles were confirmed by Luminex SSO typing and exon novelties were confirmed by Sanger sequencing. Using this automated workflow, over 10,063 NMDP registry donors were successfully typed at high resolution by NGS. Despite the known challenges of nucleic acid degradation and low DNA concentration commonly associated with buccal-based specimens, 97.8% of samples were successfully amplified using long-range PCR. Among these, 98.2% were successfully reported by NGS, with an accuracy rate of 99.84% in an independent blind Quality Control audit performed by the NMDP. In this study, NGS-HLA typing identified 23 null alleles (0.023%), 92 rare alleles (0.091%) and 42 exon novelties (0.042%). Long
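
Tagging each sample with a unique barcode, as described above, lets a pooled MiSeq run be demultiplexed back to individual donors before typing. A minimal demultiplexer sketch with one-mismatch tolerance (toy barcodes and donor names; the production Illumina software is considerably more sophisticated):

```python
# Toy barcode-to-sample map; real runs use validated index sets.
barcodes = {"ACGTAC": "donor_001", "TGCAGT": "donor_002"}

def hamming(a, b):
    """Number of mismatched positions between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def assign(index_seq, max_mismatch=1):
    """Assign a read's index sequence to a sample, tolerating sequencing
    errors up to max_mismatch; ambiguous or unmatched reads get None."""
    hits = [sample for bc, sample in barcodes.items()
            if hamming(index_seq, bc) <= max_mismatch]
    return hits[0] if len(hits) == 1 else None

print(assign("ACGTAC"))   # exact match
print(assign("ACGTAT"))   # one mismatch, still donor_001
print(assign("GGGGGG"))   # unassigned
```

Index sets are designed so that any two barcodes differ in several positions, which is what makes the one-mismatch tolerance safe.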

  20. Selective determination of four arsenic species in rice and water samples by modified graphite electrode-based electrolytic hydride generation coupled with atomic fluorescence spectrometry.

    Science.gov (United States)

    Yang, Xin-An; Lu, Xiao-Ping; Liu, Lin; Chi, Miao-Bin; Hu, Hui-Hui; Zhang, Wang-Bing

    2016-10-01

    This work describes a novel non-chromatographic approach for the accurate and selective determination of As species by modified graphite electrode-based electrolytic hydride generation (EHG) for sample introduction, coupled with atomic fluorescence spectrometry (AFS) detection. Two sulfydryl-containing modifiers, l-cysteine (Cys) and glutathione (GSH), are used to modify the cathode. The EHG behavior of As is greatly changed at the modified cathode, which has not been reported before. Arsenite [As(III)] can be selectively and quantitatively converted to AsH3 by the GSH-modified graphite electrode (GSH/GE)-based EHG at an applied current of 0.4 A. As(III) and arsenate [As(V)] can be selectively and efficiently converted to arsine by the Cys-modified graphite electrode (Cys/GE) EHG at an applied current of 0.6 A, whereas monomethylarsonic acid (MMA) and dimethylarsinic acid (DMA) form no hydrides, or only less volatile ones, under these conditions. By changing the analytical conditions, we also achieved the determination of total As (tAs) and DMA. Under the optimal conditions, the detection limits (3s) of As(III), iAs and tAs in aqueous solutions are 0.25 μg L(-1), 0.22 μg L(-1) and 0.10 μg L(-1), respectively. The accuracy of the method was verified through the analysis of a standard reference material (SRM 1568a). Copyright © 2016 Elsevier B.V. All rights reserved.
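
In a non-chromatographic scheme like this, the species that are not measured directly follow by difference from the operationally defined fractions. One plausible reading of the measurements above, sketched with purely illustrative concentrations (not data from the paper):

```python
# Speciation-by-difference arithmetic implied by the scheme above.
# All concentrations in ug/L are illustrative assumptions.
as3  = 1.8   # As(III), measured selectively (GSH/GE conditions)
i_as = 4.1   # inorganic As = As(III) + As(V)  (Cys/GE conditions)
t_as = 6.0   # total As under modified conditions
dma  = 1.2   # DMA, measured under DMA-selective conditions

as5 = i_as - as3          # As(V) by difference
mma = t_as - i_as - dma   # MMA as the remaining organic fraction

print(f"As(V) = {as5:.1f} ug/L, MMA = {mma:.1f} ug/L")
```

A drawback of difference-based speciation is error propagation: the uncertainty of each derived species is the combined uncertainty of the fractions it is computed from.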

  1. Separation and sampling of ice nucleation chamber generated ice particles by means of the counterflow virtual impactor technique for the characterization of ambient ice nuclei.

    Science.gov (United States)

    Schenk, Ludwig; Mertes, Stephan; Kästner, Udo; Schmidt, Susan; Schneider, Johannes; Frank, Fabian; Nillius, Björn; Worringen, Annette; Kandler, Konrad; Ebert, Martin; Stratmann, Frank

    2014-05-01

    In 2011, the German research foundation (DFG) research group Ice Nuclei Research Unit (INUIT; FOR 1525, project STR 453/7-1) was established with the objective of achieving a better understanding of heterogeneous ice formation. The presented work is part of INUIT and aims at a better microphysical and chemical characterization of atmospheric aerosol particles that have the potential to act as ice nuclei (IN). For this purpose a counterflow virtual impactor (Kulkarni et al., 2011) system (IN-PCVI) was developed and characterized in order to separate and collect ice particles generated in the Fast Ice Nucleus Chamber (FINCH; Bundke et al., 2008) and to release their IN for further analysis. Here the IN-PCVI was used for the inertial separation of the ice particles produced in the IN counter from smaller drops and interstitial particles. This is realized by a counterflow that matches the FINCH output flow inside the IN-PCVI; the choice of these flows determines the aerodynamic cut-off diameter. The collected ice particles are transferred into the IN-PCVI sample flow, where they are completely evaporated in particle-free and dry carrier air. In this way, the aerosol particles detected as IN by the IN counter can be extracted and distributed to several particle sensors. This coupled setup of FINCH, IN-PCVI and aerosol instrumentation was deployed during the INUIT-JFJ joint measurement field campaign at the research station Jungfraujoch (3580 m a.s.l.). Downstream of the IN-PCVI, the Aircraft-based Laser Ablation Aerosol Mass Spectrometer (ALABAMA; Brands et al., 2011) was attached for the chemical analysis of the atmospheric IN. The number concentration and size distribution of IN were also measured online (TROPOS), and IN impactor samples for electron microscopy (TU Darmstadt) were taken. The IN-PCVI was therefore operated with different flow settings than known from the literature (Kulkarni et al., 2011), which required a further characterisation of its cut

  2. A Survey on Language Use, Attitudes, and Identity in Relation to Philippine English among Young Generation Filipinos: An Initial Sample from a Private University

    Science.gov (United States)

    Borlongan, Ariane Macalinga

    2009-01-01

    This study looks at language use, attitudes, and identity in relation to Philippine English among young-generation Filipinos through a questionnaire survey of a selected group of students from a Philippine private university. The survey findings reveal that most domains of use and verbal activities are dominated by English as the…

  3. One should avoid retro-orbital pharmacokinetic sample collections for intranasal dosing in rats: Illustration of spurious pharmacokinetics generated for anti-migraine drugs zolmitriptan and eletriptan.

    Science.gov (United States)

    Patel, Harilal; Patel, Prakash; Modi, Nirav; Shah, Shaival; Ghoghari, Ashok; Variya, Bhavesh; Laddha, Ritu; Baradia, Dipesh; Dobaria, Nitin; Mehta, Pavak; Srinivas, Nuggehally R

    2017-08-30

    Because it avoids first-pass metabolism through direct and rapid absorption with improved permeability, the intranasal route represents a good alternative for extravascular drug administration. The aim of the study was to investigate the intranasal pharmacokinetics of two anti-migraine drugs (zolmitriptan and eletriptan) using retro-orbital sinus and jugular vein sampling sites. In a parallel study design, healthy male Sprague-Dawley (SD) rats aged between 8 and 12 weeks were divided into groups (n=4 or 5/group). The animals of the individual groups received either intranasal (~1.0 mg/kg) or oral (2.1 mg/kg) doses of zolmitriptan or eletriptan. Serial blood sampling was performed from the jugular vein or retro-orbital site, and plasma samples were analyzed for drug concentrations using an LC-MS/MS assay. Standard pharmacokinetic parameters such as Tmax, Cmax, AUClast, AUC0-inf and T1/2 were calculated, and the derived parameters were compared using an unpaired t-test. After intranasal dosing, the mean pharmacokinetic parameters Cmax and AUCinf of zolmitriptan/eletriptan were about 17-fold and 3-5-fold higher for retro-orbital sampling than for the jugular vein sampling site, whereas after oral administration the parameters derived for both drugs were largely comparable between the two sampling sites and statistically non-significant. In conclusion, the assessment of plasma levels after intranasal administration with retro-orbital sampling would result in spurious and misleading pharmacokinetics. Copyright © 2017 Elsevier B.V. All rights reserved.
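
The non-compartmental parameters compared above (Cmax, Tmax, AUC) are computed directly from each animal's concentration-time profile, with AUC conventionally taken by the linear trapezoidal rule. A sketch with an illustrative profile (toy numbers, not study data):

```python
# Non-compartmental PK parameters from a concentration-time profile:
# Cmax/Tmax by inspection, AUClast by the linear trapezoidal rule.
times = [0.0, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0]      # h
conc  = [0.0, 45.0, 80.0, 60.0, 30.0, 12.0, 3.0]  # ng/mL

cmax = max(conc)
tmax = times[conc.index(cmax)]
# Sum of trapezoid areas between successive sampling times.
auc_last = sum((t2 - t1) * (c1 + c2) / 2
               for t1, t2, c1, c2 in zip(times, times[1:], conc, conc[1:]))

print(f"Cmax={cmax} ng/mL at Tmax={tmax} h, AUClast={auc_last:.1f} ng*h/mL")
```

The site-dependent bias reported in the record shows up precisely in these quantities: with a contaminated sampling site, both Cmax and the computed AUC are inflated even though the underlying dose is identical.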

  4. Archived neonatal dried blood spot samples can be used for accurate whole genome and exome-targeted next-generation sequencing

    DEFF Research Database (Denmark)

    Hollegaard, Mads Vilhelm; Grauholm, Jonas; Nielsen, Ronni

    2013-01-01

    Dried blood spot samples (DBSS) have been collected and stored for decades as part of newborn screening programmes worldwide. Representing almost an entire population under a certain age and collected with virtually no bias, the Newborn Screening Biobanks are of immense value in medical studies..., for example, to examine the genetics of various disorders. We have previously demonstrated that DNA extracted from a fraction (2×3.2 mm discs) of an archived DBSS can be whole genome amplified (wgaDNA) and used for accurate array genotyping. However, until now, it has been uncertain whether wgaDNA from DBSS... can be used for accurate whole genome sequencing (WGS) and exome sequencing (WES). This study examined two individuals, each represented by three different types of samples: whole blood (reference samples), 3-year-old DBSS spotted with reference material (refDBSS), and 27- to 29-year-old archived

  5. Water quality of stormwater generated from an airport in a cold climate, function of an infiltration pond, and sampling strategy with limited resources.

    Science.gov (United States)

    Jia, Yu; Ehlert, Ludwig; Wahlskog, Cecilia; Lundberg, Angela; Maurice, Christian

    2017-12-05

    Monitoring pollutants in stormwater discharge in cold climates is challenging. An environmental survey was performed by sampling the stormwater from Luleå Airport, Northern Sweden, during the period 2010-2013, when urea was used as a main component of aircraft deicing/anti-icing fluids (ADAFs). The stormwater collected from the runway was led through an oil trap to an infiltration pond to store excess water during precipitation periods and enhance infiltration and water treatment. Due to insufficient capacity, an emergency spillway was established and equipped with a flow meter and an automatic sampler. This study proposes a program for effective monitoring of pollutant discharge with a minimum number of sampling occasions when the use of automatic samplers is not possible. The results showed that 90% of the nitrogen discharge occurs during late autumn, before the water pipes freeze, and during snowmelt, regardless of the precipitation during the remaining months, when the pollutant discharge was negligible. The concentrations of other constituents in the discharge were generally low compared to guideline values. The best data quality was obtained using flow-controlled sampling. Intensive time-controlled sampling during late autumn (a few weeks) and snowmelt (2 weeks) would be sufficient to obtain the necessary information. The flow meters installed at the rectangular notch proved difficult to calibrate and gave contradictory results: the spillway was mostly dry, as water infiltrated into the pond, and stagnant water close to the edge might be registered as flow. Water level monitoring revealed that the infiltration capacity gradually decreased with time.
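
The advantage of flow-controlled over time-controlled sampling comes down to what is being estimated: the pollutant load and the flow-weighted mean concentration, which a purely time-based average misrepresents whenever concentration and flow co-vary. A sketch of both computations with illustrative numbers:

```python
# Pollutant load and flow-weighted mean concentration from paired flow and
# concentration observations, the quantities a flow-controlled sampling
# program is designed to estimate. All numbers are illustrative only.
flows = [2.0, 8.0, 15.0, 5.0, 1.0]        # L/s during each sampling interval
concs = [12.0, 40.0, 55.0, 20.0, 10.0]    # mg/L (e.g. total nitrogen)
interval_s = 3600                         # one sample per hour

volumes = [q * interval_s for q in flows]             # L discharged per interval
load_mg = sum(v * c for v, c in zip(volumes, concs))  # total mass, mg
fw_mean = load_mg / sum(volumes)                      # flow-weighted mean, mg/L
t_mean = sum(concs) / len(concs)                      # time-weighted mean, mg/L

print(f"flow-weighted mean {fw_mean:.1f} mg/L vs time-weighted {t_mean:.1f} mg/L")
```

Because high concentrations here coincide with high flows, the time-weighted mean understates the effective discharge concentration, which is why concentrating sampling effort on the high-flow autumn and snowmelt periods captures most of the load.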

  6. Valence properties of tellurium in different chemical systems and its determination in refractory environmental samples using hydride generation – Atomic fluorescence spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yu-Wei; Alzahrani, Ali [Department of Chemistry and Biochemistry, Laurentian University, Sudbury, Ontario P3E 2C6 (Canada); Deng, Tian-Long [College of Marine Science and Engineering, Tianjin University of Science and Technology, Tianjin (China); Belzile, Nelson, E-mail: nbelzile@laurentian.ca [Department of Chemistry and Biochemistry, Laurentian University, Sudbury, Ontario P3E 2C6 (Canada); Cooperative Freshwater Ecology Unit, Laurentian University, Sudbury, Ontario P3E 2C6 (Canada)

    2016-01-28

    Using HG–AFS as a powerful tool to study valence transformations of Te, we found that, in the presence of HCl and at high temperature, Te can form volatile species and be lost during the sample digestion and pre-reduction steps. It was also noticed that the chemical valences of Te can be modified under different chemical and digestion conditions, and even by the samples themselves when they have certain matrices. KBr can reduce Te(VI) to Te(IV) in 3.0 M HCl at 100 °C, but when HNO{sub 3} was >5% (v/v) in solution, Br{sub 2} was formed and caused serious interference with Te measurements. HCl alone can also pre-reduce Te(VI) to Te(IV), but only at concentrations ≥6.0 M (100 °C for 15 min). Among the 10 chemical elements studied, only Cu{sup 2+} caused severe interference. Thiourea is an effective masking agent only when the Cu{sup 2+} concentration is equal to or lower than 10 mg/L. Chemical reagents, the chemical composition of the sample, and the mode of digestion can greatly affect Te valences, reagent blanks and analytical precision. A protocol of two-step digestion followed by elimination of HF is proposed to minimize the reagent blank and increase the signal/noise ratio. It is important to perform a preliminary test to confirm whether a pre-reduction step is necessary; this is especially true for samples with complex matrices, such as those with high sulfide content. The analytical detection limits of this method in a pure solution and a solid sample were 100 ng/L and 0.10 ± 0.02 μg/g, respectively. - Highlights: • HG–AFS is a powerful tool in studies of the chemical valences and forms of Te under different conditions. • Te can be lost in the form of volatile species in the presence of HCl at high temperature. • Metal ions can be classified into 3 categories of interference; thiourea can effectively mask Cu{sup 2+}. • A 2-step digestion allows the elimination of HF, reduces the background and improves analytical precision. • The matrix of a sample can strongly influence Te chemical valence

  7. Disambiguate: An open-source application for disambiguating two species in next generation sequencing data from grafted samples [version 2; referees: 3 approved]

    Directory of Open Access Journals (Sweden)

    Miika J. Ahdesmäki

    2017-01-01

    Full Text Available Grafting of cell lines and primary tumours is a crucial step in the drug development process between cell line studies and clinical trials. Disambiguate is a program for computationally separating the sequencing reads of two species derived from grafted samples. Disambiguate operates on DNA or RNA-seq alignments to the two species and separates the components at very high sensitivity and specificity, as illustrated in artificially mixed human-mouse samples. This allows for maximum recovery of data from target tumours for more accurate variant calling and gene expression quantification. Given that no general-use open-source algorithm accessible to the bioinformatics community exists for separating the data of the two species, the proposed Disambiguate tool presents a novel approach and an improvement to performing sequence analysis of grafted samples. Both Python and C++ implementations are available, and they are integrated into several open and closed source pipelines. Disambiguate is open source and is freely available at https://github.com/AstraZeneca-NGS/disambiguate.
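The core idea, comparing each read's alignment quality against the two references and assigning it to the species where it aligns better, can be sketched as follows. This is a simplified illustration, not Disambiguate's actual implementation; the read names and scores are invented:

```python
def disambiguate(scores_a, scores_b):
    """Partition read names into (species_a, species_b, ambiguous) sets
    by comparing per-read alignment scores against each reference."""
    species_a, species_b, ambiguous = set(), set(), set()
    for read in set(scores_a) | set(scores_b):
        a = scores_a.get(read, float("-inf"))  # unaligned reads score -inf
        b = scores_b.get(read, float("-inf"))
        if a > b:
            species_a.add(read)
        elif b > a:
            species_b.add(read)
        else:
            ambiguous.add(read)  # equal evidence for both species
    return species_a, species_b, ambiguous

# Toy human/mouse example: r1 favours human, r2 favours mouse, r3 ties.
h, m, a = disambiguate({"r1": 60, "r2": 30, "r3": 40},
                       {"r1": 20, "r2": 55, "r3": 40})
print(sorted(h), sorted(m), sorted(a))  # ['r1'] ['r2'] ['r3']
```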

  8. Disambiguate: An open-source application for disambiguating two species in next generation sequencing data from grafted samples [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Miika J. Ahdesmäki

    2016-11-01

    Full Text Available Grafting of cell lines and primary tumours is a crucial step in the drug development process between cell line studies and clinical trials. Disambiguate is a program for computationally separating the sequencing reads of two species derived from grafted samples. Disambiguate operates on alignments to the two species and separates the components at very high sensitivity and specificity, as illustrated in artificially mixed human-mouse samples. This allows for maximum recovery of data from target tumours for more accurate variant calling and gene expression quantification. Given that no general-use open-source algorithm accessible to the bioinformatics community exists for separating the data of the two species, the proposed Disambiguate tool presents a novel approach and an improvement to performing sequence analysis of grafted samples. Both Python and C++ implementations are available, and they are integrated into several open and closed source pipelines. Disambiguate is open source and is freely available at https://github.com/AstraZeneca-NGS/disambiguate.

  9. Pre-Analytical Considerations for Successful Next-Generation Sequencing (NGS): Challenges and Opportunities for Formalin-Fixed and Paraffin-Embedded Tumor Tissue (FFPE) Samples

    Directory of Open Access Journals (Sweden)

    Gladys Arreaza

    2016-09-01

    Full Text Available In cancer drug discovery, it is important to investigate the genetic determinants of response or resistance to cancer therapy, as well as factors that contribute to adverse events in the course of clinical trials. Despite the emergence of new technologies and the ability to measure more diverse analytes (e.g., circulating tumor cells (CTCs), circulating tumor DNA (ctDNA), etc.), tumor tissue is still the most common and reliable source for biomarker investigation. Because of its worldwide use and ability to preserve samples for many decades at ambient temperature, formalin-fixed, paraffin-embedded tumor tissue (FFPE) is likely to remain the preferred choice for tissue preservation in clinical practice for the foreseeable future. Multiple analyses are routinely performed on the same FFPE samples (such as immunohistochemistry (IHC), in situ hybridization, RNA-seq, DNA-seq, TILseq, Methyl-Seq, etc.). Thus, specimen prioritization and optimization of analyte isolation are critical to ensure successful completion of each assay. FFPE is notorious for producing suboptimal DNA quality and low DNA yield. However, commercial vendors tend to request a higher DNA sample mass than is actually required for downstream assays, which restricts the breadth of biomarker work that can be performed. We evaluated multiple genomics service laboratories to assess the current state of NGS pre-analytical processing of FFPE. Significant differences in pre-analytical capabilities were observed. Key aspects are highlighted and recommendations are made to improve current practice in translational research.

  10. A new vapor generation system for mercury species based on the UV irradiation of mercaptoethanol used in the determination of total and methyl mercury in environmental and biological samples by atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Yanmin; Qiu, Jianhua; Yang, Limin [College of Chemistry and Chemical Engineering, Xiamen University, Department of Chemistry and the MOE Key Laboratory of Analytical Sciences, Xiamen (China); Wang, Qiuquan [College of Chemistry and Chemical Engineering, Xiamen University, Department of Chemistry and the MOE Key Laboratory of Analytical Sciences, Xiamen (China); Xiamen University, State Key Laboratory of Marine Environmental Science, Xiamen (China)

    2007-06-15

    A new vapor generation system for mercury (Hg) species based on the irradiation of mercaptoethanol (ME) with UV was developed to provide an effective sample introduction unit for atomic fluorescence spectrometry (AFS). Preliminary investigations of the mechanism of this novel vapor generation system were based on GC-MS and FT-IR studies. Under optimum conditions, the limits of determination for inorganic divalent mercury and methyl mercury were 60 and 50 pg mL{sup -1}, respectively. Certified reference materials (BCR 463 tuna fish and BCR 580 estuarine sediment) were used to validate this new method, and the results agreed well with certified values. This new system provides an attractive alternative method of chemical vapor generation (CVG) of mercury species compared to other developed CVG systems (for example, the traditional KBH{sub 4}/NaOH-acid system). To our knowledge, this is the first systematic report on UV/ME-based Hg species vapor generation and the determination of total and methyl Hg in environmental and biological samples using UV/ME-AFS. (orig.)

  11. Application of hydrocyanic acid vapor generation via focused microwave radiation to the preparation of industrial effluent samples prior to free and total cyanide determinations by spectrophotometric flow injection analysis.

    Science.gov (United States)

    Quaresma, Maria Cristina Baptista; de Carvalho, Maria de Fátima Batista; Meirelles, Francis Assis; Santiago, Vânia Maria Junqueira; Santelli, Ricardo Erthal

    2007-02-01

    A sample preparation procedure for the quantitative determination of free and total cyanides in industrial effluents has been developed that involves hydrocyanic acid vapor generation via focused microwave radiation. Hydrocyanic acid vapor was generated from free cyanides using only 5 min of irradiation time (90 W power) and a purge time of 5 min. The HCN generated was absorbed into an accepting NaOH solution using very simple glassware apparatus that was appropriate for the microwave oven cavity. After that, the cyanide concentration was determined within 90 s using a well-known spectrophotometric flow injection analysis system. Total cyanide analysis required 15 min irradiation time (90 W power), as well as chemical conditions such as the presence of EDTA-acetate buffer solution or ascorbic acid, depending on the effluent to be analyzed (petroleum refinery or electroplating effluents, respectively). The detection limit was 0.018 mg CN l(-1) (quantification limit of 0.05 mg CN l(-1)), and the measured RSD was better than 8% for ten independent analyses of effluent samples (1.4 mg l(-1) cyanide). The accuracy of the procedure was assessed via analyte spiking (with free and complex cyanides) and by performing an independent sample analysis based on the standard methodology recommended by the APHA for comparison. The sample preparation procedure takes only 10 min for free and 20 min for total cyanide, making this procedure much faster than traditional methodologies (conventional heating and distillation), which are time-consuming (they require at least 1 h). Samples from oil (sour and stripping tower bottom waters) and electroplating effluents were analyzed successfully.

  12. Validation of the Hirst-Type Spore Trap for Simultaneous Monitoring of Prokaryotic and Eukaryotic Biodiversities in Urban Air Samples by Next-Generation Sequencing.

    Science.gov (United States)

    Núñez, Andrés; Amo de Paz, Guillermo; Ferencova, Zuzana; Rastrojo, Alberto; Guantes, Raúl; García, Ana M; Alcamí, Antonio; Gutiérrez-Bustillo, A Montserrat; Moreno, Diego A

    2017-07-01

    Pollen, fungi, and bacteria are the main microscopic biological entities present in outdoor air, causing allergy symptoms and disease transmission and playing a significant role in atmosphere dynamics. Despite their relevance, a method for simultaneously monitoring these biological particles in metropolitan environments has not yet been developed. Here, we assessed the use of the Hirst-type spore trap to characterize the global airborne biota by high-throughput DNA sequencing, selecting regions of the 16S rRNA gene and the internal transcribed spacer for taxonomic assignment. We showed that aerobiological communities are well represented by this approach. The operational taxonomic units (OTUs) of two traps working synchronously compiled >87% of the total relative abundance for bacterial diversity collected in each sampler, >89% for fungi, and >97% for pollen. We found a good correspondence between traditional characterization by microscopy and genetic identification, obtaining more accurate taxonomic assignments and detecting a greater diversity using the latter. We also demonstrated that DNA sequencing accurately detects differences in biodiversity between samples. We concluded that high-throughput DNA sequencing applied to aerobiological samples obtained with Hirst spore traps provides reliable results and can be easily implemented for monitoring prokaryotic and eukaryotic entities present in the air of urban areas. IMPORTANCE Detection, monitoring, and characterization of the wide diversity of biological entities present in the air are difficult tasks that require time and expertise in different disciplines. We have evaluated the use of the Hirst spore trap (an instrument broadly employed in aerobiological studies) to detect and identify these organisms by DNA-based analyses. Our results showed a consistent collection of DNA and a good concordance with traditional methods for identification, suggesting that these devices can be used as a tool for continuous
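The >87% agreement figure above is the share of one trap's total relative abundance carried by OTUs that were also recovered in the paired trap. A minimal sketch of that calculation, with invented OTU names and counts:

```python
def shared_abundance_fraction(trap_a, trap_b):
    """Fraction of trap_a's total read count carried by OTUs
    that were also detected in trap_b."""
    total = sum(trap_a.values())
    shared = sum(count for otu, count in trap_a.items() if otu in trap_b)
    return shared / total

# Invented OTU count tables for two co-located traps.
trap1 = {"otu1": 500, "otu2": 300, "otu3": 10}
trap2 = {"otu1": 450, "otu2": 350, "otu4": 5}
print(round(shared_abundance_fraction(trap1, trap2), 3))  # 0.988
```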

  13. Efficiency of Executive Function: A Two-Generation Cross-Cultural Comparison of Samples From Hong Kong and the United Kingdom.

    Science.gov (United States)

    Ellefson, Michelle R; Ng, Florrie Fei-Yin; Wang, Qian; Hughes, Claire

    2017-05-01

    Although Asian preschoolers acquire executive functions (EFs) earlier than their Western counterparts, little is known about whether this advantage persists into later childhood and adulthood. To address this gap, in the current study we gave four computerized EF tasks (providing measures of inhibition, working memory, cognitive flexibility, and planning) to a large sample (n = 1,427) of 9- to 16-year-olds and their parents. All participants lived in either the United Kingdom or Hong Kong. Our findings highlight the importance of combining developmental and cultural perspectives and show both similarities and contrasts across sites. Specifically, adults' EF performance did not differ between the two sites; age-related changes in executive function for both the children and the parents appeared to be culturally invariant, as did a modest intergenerational correlation. In contrast, school-age children and young adolescents in Hong Kong outperformed their United Kingdom counterparts on all four EF tasks, a difference consistent with previous findings from preschool children.

  14. Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF) scores generated from the MMPI-2 and MMPI-2-RF test booklets: internal structure comparability in a sample of criminal defendants.

    Science.gov (United States)

    Tarescavage, Anthony M; Alosco, Michael L; Ben-Porath, Yossef S; Wood, Arcangela; Luna-Jones, Lynn

    2015-04-01

    We investigated the internal structure comparability of Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF) scores derived from the MMPI-2 and MMPI-2-RF booklets in a sample of 320 criminal defendants (229 males and 54 females). After exclusion of invalid protocols, the final sample consisted of 96 defendants who were administered the MMPI-2-RF booklet and 83 who completed the MMPI-2. No statistically significant differences in MMPI-2-RF invalidity rates were observed between the two forms. Individuals in the final sample who completed the MMPI-2-RF did not statistically differ on demographics or referral question from those who were administered the MMPI-2 booklet. Independent t tests showed no statistically significant differences between MMPI-2-RF scores generated with the MMPI-2 and MMPI-2-RF booklets on the test's substantive scales. Statistically significant small differences were observed on the revised Variable Response Inconsistency (VRIN-r) and True Response Inconsistency (TRIN-r) scales. Cronbach's alpha and standard errors of measurement were approximately equal between the booklets for all MMPI-2-RF scales. Finally, MMPI-2-RF intercorrelations produced from the two forms yielded mostly small and a few medium differences, indicating that discriminant validity and test structure are maintained. Overall, our findings reflect the internal structure comparability of MMPI-2-RF scale scores generated from MMPI-2 and MMPI-2-RF booklets. Implications of these results and limitations of these findings are discussed. © The Author(s) 2014.
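The reliability statistics reported here, Cronbach's alpha and the standard error of measurement (SEM = SD_total × sqrt(1 − alpha)), can be computed directly from item-level scores. A minimal sketch with invented data; population variance is used throughout, since the n vs. n−1 factor cancels in the alpha ratio:

```python
from statistics import pvariance, pstdev

def cronbach_alpha(items):
    """Cronbach's alpha from a list of per-item score lists,
    aligned across respondents: k/(k-1) * (1 - sum(item var)/var(total))."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(pvariance(item) for item in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

def sem(items):
    """Standard error of measurement: SD of total scores * sqrt(1 - alpha)."""
    totals = [sum(scores) for scores in zip(*items)]
    return pstdev(totals) * (1 - cronbach_alpha(items)) ** 0.5

# Two perfectly parallel items -> alpha = 1 and SEM = 0 (invented scores).
scale = [[1, 2, 3], [1, 2, 3]]
alpha = cronbach_alpha(scale)
```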

  15. Flow injection electrochemical hydride generation inductively coupled plasma time-of-flight mass spectrometry for the simultaneous determination of hydride forming elements and its application to the analysis of fresh water samples

    International Nuclear Information System (INIS)

    Bings, Nicolas H.; Stefanka, Zsolt; Mallada, Sergio Rodriguez

    2003-01-01

    A flow injection (FI) method was developed using electrochemical hydride generation (EcHG) as a sample introduction system, coupled to an inductively coupled plasma time-of-flight mass spectrometer (ICP-TOFMS), for the rapid and simultaneous determination of six hydride-forming elements (As, Bi, Ge, Hg, Sb and Se). A novel low-volume electrolysis cell, especially suited for FI experiments, was designed, and the conditions for simultaneous electrochemical hydride generation (electrolyte concentrations and flow rates, electrolysis voltage and current) as well as the ICP-TOFMS operational parameters (carrier gas flow rate, modulation pulse width (MPW)) for the simultaneous determination of 12 isotopes were optimized. The compromise operation parameters of the electrolysis were found to be 1.4 and 3 ml min⁻¹ for the anolyte and catholyte flow rates, respectively, using 2 M sulphuric acid. An optimum electrolysis current of 0.7 A (16 V) and an argon carrier gas flow rate of 0.91 l min⁻¹ were chosen. A modulation pulse width of 5 μs, which influences the sensitivity through the amount of ions collected by the MS per single analytical cycle, provided optimum results for the detection of transient signals. The achieved detection limits were compared with those obtained by using FI in combination with conventional nebulization (FI-ICP-TOFMS); values for chemical hydride generation (FI-CHG-ICP-TOFMS) were taken from the literature. By using a 200 μl sample loop, absolute detection limits (3σ) in the range of 10-160 pg for As, Bi, Ge, Hg and Sb, and 1.1 ng for Se, and a precision of 4-8% for seven replicate injections of 20-100 ng ml⁻¹ multielemental sample solutions, were achieved. The analysis of a standard reference material (SRM) 1643d (NIST, 'Trace Elements in Water') showed good agreement with the certified values for As and Sb. Se showed a drastic difference, which is probably due to the presence of hydride-inactive Se species in the sample. Recoveries better than
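The "(3σ)" convention used for these detection limits is three times the standard deviation of replicate blank measurements, divided by the calibration sensitivity. A generic sketch with invented blank readings and slope:

```python
from statistics import stdev

def detection_limit_3sigma(blank_signals, slope):
    """3-sigma detection limit: 3 x SD of replicate blank signals
    divided by the calibration slope (signal per unit analyte mass)."""
    return 3 * stdev(blank_signals) / slope

blanks = [0.52, 0.48, 0.50, 0.51, 0.49, 0.50, 0.50]  # invented blank readings
dl = detection_limit_3sigma(blanks, slope=0.8)       # slope in signal per pg
print(dl)  # about 0.048 pg
```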

  16. A New Generation of Thermal Desorption Technology Incorporating Multi Mode Sampling (NRT/DAAMS/Liquid Agent) for Both on and off Line Analysis of Trace Level Airborne Chemical Warfare Agents

    International Nuclear Information System (INIS)

    Roberts, G. M.

    2007-01-01

    A multifunctional, twin-trap, electrically-cooled thermal desorption (TD) system (TT24-7) will be discussed for the analysis of airborne trace-level chemical warfare agents. This technology can operate in both military environments (CW stockpile or destruction facilities) and civilian locations, where it is used to monitor for accidental or terrorist release of acutely toxic substances. The TD system interfaces to GC, GC-MS or direct MS analytical platforms and provides on-line continuous air monitoring, with no sampling-time blind spots, in a near-real-time (NRT) context. Using this technology enables on-line sub-ppt levels of agent detection from a vapour sample. In addition to continuous sampling, the system has the capacity for off-line single (DAAMS) tube analysis and the ability to receive an external liquid agent injection. The multi-mode sampling functionality provides considerable flexibility to the TD system, allowing continuous monitoring of an environment for toxic substances plus the ability to analyse calibration standards. A calibration solution can be introduced via a conventional sampling tube onto either cold trap, or as a direct liquid injection using a conventional capillary split/splitless injection port within a gas chromatograph. Low-level (linearity) data will be supplied showing the TT24-7 analysing a variety of CW compounds, including free (underivatised) VX, using the three sampling modes described above. Stepwise changes in vapour-generated agent concentrations will be shown and cross-referenced against direct liquid agent introduction and the tube sampling modes. This technology is in use today in several geographies around the world in both static and mobile analytical laboratories. (author)

  17. Speciation of arsenic in water samples by high-performance liquid chromatography-hydride generation-atomic absorption spectrometry at trace levels using a post-column reaction system

    Energy Technology Data Exchange (ETDEWEB)

    Stummeyer, J. [Bundesanstalt fuer Geowissenschaften und Rohstoffe, Hannover (Germany); Harazim, B. [Bundesanstalt fuer Geowissenschaften und Rohstoffe, Hannover (Germany); Wippermann, T. [Bundesanstalt fuer Geowissenschaften und Rohstoffe, Hannover (Germany)

    1996-02-01

    Anion-exchange HPLC has been combined with hydride generation-atomic absorption spectrometry (HG-AAS) for the routine speciation of arsenite, arsenate, monomethylarsonic acid and dimethylarsinic acid. The sensitivity of the AAS detection was increased by a post-column reaction system to achieve complete formation of volatile arsines from the methylated species and arsenate. The system allows the quantitative determination of 0.5 {mu}g/l of each arsenic compound in water samples. The stability of synthetic and natural waters containing arsenic at trace levels was investigated. To preserve stored water samples, a method for quantitative separation of arsenate at high pH values with the basic anion-exchange resin Dowex 1 x 8 was developed. (orig.)

  18. High speed network sampling

    OpenAIRE

    Rindalsholt, Ole Arild

    2005-01-01

    Master's thesis in network and system administration. Classical sampling methods play an important role in the current practice of Internet measurement. With today's high-speed networks, routers cannot generate complete NetFlow data for every packet; they have to perform restricted sampling. This thesis summarizes some of the most important sampling schemes and their applications before analysing the effect of sampling NetFlow records.
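The "restricted sampling" described here is typically 1-in-N packet sampling: the router inspects every N-th packet and scales the resulting counts back up by N. A toy illustration (not router code; the packet stream is invented):

```python
def one_in_n(packets, n):
    """Deterministic 1-in-N sampling: keep every n-th packet."""
    return packets[::n]

# Stand-in for a uniform packet stream: 100,000 packets of 1500 bytes each.
stream = [{"bytes": 1500} for _ in range(100_000)]
sampled = one_in_n(stream, 100)

# Scale the sampled byte count by n to estimate the true traffic volume.
estimated_bytes = sum(p["bytes"] for p in sampled) * 100
print(estimated_bytes)  # 150000000, matching the true total for this stream
```

With uniform traffic the scaled estimate is exact; for real, bursty traffic it is an unbiased estimate whose variance depends on n.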

  19. Boat sampling

    International Nuclear Information System (INIS)

    Citanovic, M.; Bezlaj, H.

    1994-01-01

    This presentation describes essential boat sampling activities: on-site boat sampling process optimization and qualification; boat sampling of base material (beltline region); boat sampling of weld material (weld No. 4); and problems associated with weld crown variation, RPV shell inner-radius tolerance, local corrosion pitting and water clarity. The equipment used for boat sampling is also described. 7 pictures

  20. Graph sampling

    OpenAIRE

    Zhang, L.-C.; Patone, M.

    2017-01-01

    We synthesise the existing theory of graph sampling. We propose a formal definition of sampling in finite graphs, and provide a classification of potential graph parameters. We develop a general approach of Horvitz–Thompson estimation to T-stage snowball sampling, and present various reformulations of some common network sampling methods in the literature in terms of the outlined graph sampling theory.
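The Horvitz–Thompson estimator mentioned above weights each sampled unit by the inverse of its inclusion probability, giving an unbiased estimate of a population total. A minimal sketch with invented values:

```python
def horvitz_thompson_total(sample):
    """Estimate a population total from (value, inclusion_probability) pairs
    by inverse-probability weighting."""
    return sum(y / pi for y, pi in sample)

# Invented population values [10, 20, 30, 40] (true total 100); suppose two
# units were sampled, each with inclusion probability 0.5.
estimate = horvitz_thompson_total([(10, 0.5), (40, 0.5)])
print(estimate)  # 100.0 for this particular sample
```

Averaged over all possible samples under the design, the estimator's expectation equals the true total; individual samples over- or under-shoot.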

  1. Balanced sampling

    NARCIS (Netherlands)

    Brus, D.J.

    2015-01-01

    In balanced sampling a linear relation between the soil property of interest and one or more covariates with known means is exploited in selecting the sampling locations. Recent developments make this sampling design attractive for statistical soil surveys. This paper introduces balanced sampling

  2. Ensemble Sampling

    OpenAIRE

    Lu, Xiuyuan; Van Roy, Benjamin

    2017-01-01

    Thompson sampling has emerged as an effective heuristic for a broad range of online decision problems. In its basic form, the algorithm requires computing and sampling from a posterior distribution over models, which is tractable only for simple special cases. This paper develops ensemble sampling, which aims to approximate Thompson sampling while maintaining tractability even in the face of complex models such as neural networks. Ensemble sampling dramatically expands on the range of applica...
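For context, Thompson sampling in its basic form draws one sample from each action's posterior and plays the action with the largest draw. A minimal Bernoulli-bandit sketch (illustrative only; ensemble sampling, the paper's contribution, replaces the exact posterior with draws from an ensemble of models):

```python
import random

def thompson_step(successes, failures):
    """Pick the arm with the largest draw from its Beta(s+1, f+1) posterior."""
    draws = [random.betavariate(s + 1, f + 1)
             for s, f in zip(successes, failures)]
    return max(range(len(draws)), key=draws.__getitem__)

random.seed(0)
true_p = [0.3, 0.7]            # invented success probabilities
s, f = [0, 0], [0, 0]
for _ in range(2000):
    arm = thompson_step(s, f)
    if random.random() < true_p[arm]:
        s[arm] += 1
    else:
        f[arm] += 1
# After enough rounds the better arm (index 1) receives most of the pulls.
```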

  3. Laser sampling

    International Nuclear Information System (INIS)

    Gorbatenko, A A; Revina, E I

    2015-01-01

    The review is devoted to the major advances in laser sampling. The advantages and drawbacks of the technique are considered. Specific features of combinations of laser sampling with various instrumental analytical methods, primarily inductively coupled plasma mass spectrometry, are discussed. Examples of practical implementation of hybrid methods involving laser sampling as well as corresponding analytical characteristics are presented. The bibliography includes 78 references

  4. Xeml Lab: a tool that supports the design of experiments at a graphical interface and generates computer-readable metadata files, which capture information about genotypes, growth conditions, environmental perturbations and sampling strategy.

    Science.gov (United States)

    Hannemann, Jan; Poorter, Hendrik; Usadel, Björn; Bläsing, Oliver E; Finck, Alex; Tardieu, Francois; Atkin, Owen K; Pons, Thijs; Stitt, Mark; Gibon, Yves

    2009-09-01

    Data mining depends on the ability to access machine-readable metadata that describe genotypes, environmental conditions, and sampling times and strategy. This article presents Xeml Lab. The Xeml Interactive Designer provides an interactive graphical interface at which complex experiments can be designed, and concomitantly generates machine-readable metadata files. It uses a new eXtensible Mark-up Language (XML)-derived dialect termed XEML. Xeml Lab includes a new ontology for environmental conditions, called Xeml Environment Ontology. However, to provide versatility, it is designed to be generic and also accepts other commonly used ontology formats, including OBO and OWL. A review summarizing important environmental conditions that need to be controlled, monitored and captured as metadata is posted in a Wiki (http://www.codeplex.com/XeO) to promote community discussion. The usefulness of Xeml Lab is illustrated by two meta-analyses of a large set of experiments that were performed with Arabidopsis thaliana during 5 years. The first reveals sources of noise that affect measurements of metabolite levels and enzyme activities. The second shows that Arabidopsis maintains remarkably stable levels of sugars and amino acids across a wide range of photoperiod treatments, and that adjustment of starch turnover and the leaf protein content contribute to this metabolic homeostasis.
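A machine-readable metadata file of the kind Xeml Lab generates can be sketched with Python's standard XML tooling. The element and attribute names below are invented for illustration; the actual XEML dialect defines its own schema and ontology terms:

```python
import xml.etree.ElementTree as ET

# Hypothetical element names -- the real XEML schema differs.
exp = ET.Element("experiment", id="exp01", species="Arabidopsis thaliana")
ET.SubElement(exp, "genotype").text = "Col-0"

env = ET.SubElement(exp, "environment")
ET.SubElement(env, "photoperiod", unit="h").text = "12"
ET.SubElement(env, "temperature", unit="C").text = "20"

ET.SubElement(exp, "sampling", timepoint="end_of_day", tissue="rosette")

print(ET.tostring(exp, encoding="unicode"))
```

The point of such a file is exactly what the abstract argues: once genotype, environment and sampling strategy are captured as structured fields, meta-analyses across hundreds of experiments become a query rather than a manual curation effort.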

  5. Soil sampling

    International Nuclear Information System (INIS)

    Fortunati, G.U.; Banfi, C.; Pasturenzi, M.

    1994-01-01

    This study attempts to survey the problems associated with techniques and strategies of soil sampling. Keeping in mind the well defined objectives of a sampling campaign, the aim was to highlight the most important aspect of representativeness of samples as a function of the available resources. Particular emphasis was given to the techniques and particularly to a description of the many types of samplers which are in use. The procedures and techniques employed during the investigations following the Seveso accident are described. (orig.)

  6. Language sampling

    DEFF Research Database (Denmark)

    Rijkhoff, Jan; Bakker, Dik

    1998-01-01

    This article has two aims: [1] to present a revised version of the sampling method that was originally proposed in 1993 by Rijkhoff, Bakker, Hengeveld and Kahrel, and [2] to discuss a number of other approaches to language sampling in the light of our own method. We will also demonstrate how our sampling method is used with different genetic classifications (Voegelin & Voegelin 1977, Ruhlen 1987, Grimes ed. 1997) and argue that, on the whole, our sampling technique compares favourably with other methods, especially in the case of exploratory research.

  7. Sample preparation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Sample preparation prior to HPLC analysis is certainly one of the most important steps to consider in trace or ultratrace analysis, and for many years scientists have tried to simplify it. It is rarely possible to inject a neat liquid sample, or a sample whose preparation is no more complex than dissolution in a given solvent. Dissolution alone can remove insoluble materials, which is especially helpful with samples in complex matrices, provided other interactions do not affect extraction: a large number of components will not dissolve and are therefore eliminated by a simple filtration step. In most cases, however, sample preparation is not as simple as dissolving the component of interest. At times enrichment is necessary; that is, the component of interest is present in a very large volume or mass of material and needs to be concentrated so that a small volume of the concentrated or enriched sample can be injected into the HPLC. 88 refs

  8. Sampling Development

    Science.gov (United States)

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  9. Environmental sampling

    International Nuclear Information System (INIS)

    Puckett, J.M.

    1998-01-01

    Environmental Sampling (ES) is a technology option that can have application in transparency in nuclear nonproliferation. The basic process is to take a sample from the environment, e.g., soil, water, vegetation, or dust and debris from a surface, and through very careful sample preparation and analysis, determine the types, elemental concentration, and isotopic composition of actinides in the sample. The sample is prepared and the analysis performed in a clean chemistry laboratory (CCL). This ES capability is part of the IAEA Strengthened Safeguards System. Such a Laboratory is planned to be built by JAERI at Tokai and will give Japan an intrinsic ES capability. This paper presents options for the use of ES as a transparency measure for nuclear nonproliferation

  10. Spherical sampling

    CERN Document Server

    Freeden, Willi; Schreiner, Michael

    2018-01-01

    This book presents, in a consistent and unified overview, results and developments in the field of today's spherical sampling, particularly as arising in the mathematical geosciences. Although the book often refers to original contributions, the authors have made them accessible to (graduate) students and scientists not only from mathematics but also from the geosciences and geoengineering. Building a library of topics in spherical sampling theory, it shows how advances in this theory lead to new discoveries in mathematical, geodetic and geophysical branches as well as other scientific fields such as neuro-medicine. A must-read for everybody working in the area of spherical sampling.
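One elementary spherical sampling construction (chosen here for illustration, not necessarily the schemes treated in the book) is the golden-angle spiral, which places near-uniformly distributed points on the unit sphere:

```python
import math

def fibonacci_sphere(n):
    """Near-uniform points on the unit sphere via the golden-angle spiral:
    latitudes are equally spaced in z, longitudes advance by the golden angle."""
    golden_angle = math.pi * (3 - math.sqrt(5))
    points = []
    for i in range(n):
        z = 1 - 2 * (i + 0.5) / n          # equal-area bands in z
        r = math.sqrt(1 - z * z)           # radius of the latitude circle
        theta = golden_angle * i
        points.append((r * math.cos(theta), r * math.sin(theta), z))
    return points

pts = fibonacci_sphere(500)
```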

  11. Generating Units

    Data.gov (United States)

    Department of Homeland Security — Generating Units are any combination of physically connected generators, reactors, boilers, combustion turbines, and other prime movers operated together to produce...

  12. Fluidic sampling

    International Nuclear Information System (INIS)

    Houck, E.D.

    1992-01-01

    This paper covers the development of the fluidic sampler and its testing in a fluidic transfer system. The major findings are as follows. Fluidic jet samplers can dependably produce unbiased samples of acceptable volume. The fluidic transfer system with a fluidic sampler in-line will transfer water to a net lift of 37.2--39.9 feet at an average rate of 0.02--0.05 gpm (77--192 cc/min). The fluidic sample system circulation rate compares very favorably with the normal 0.016--0.026 gpm (60--100 cc/min) circulation rate that is commonly produced for this lift and solution with the jet-assisted airlift sample system normally used at ICPP. The volume of the sample taken with a fluidic sampler depends on the motive pressure to the fluidic sampler, the sample bottle size, and the fluidic sampler jet characteristics. The fluidic sampler should be supplied with fluid at a motive pressure of 140--150 percent of the peak vacuum-producing motive pressure for the jet in the sampler. Fluidic transfer systems should be operated by emptying a full pumping chamber to nearly empty or empty during the pumping cycle; this maximizes the solution transfer rate

  13. Generational diversity.

    Science.gov (United States)

    Kramer, Linda W

    2010-01-01

    Generational diversity poses challenges for nurse leaders, and generational values may influence ideas about work and career planning. This article discusses generational gaps, influencing factors and support, and the various generational groups present in today's workplace, as well as the consequences of not addressing these issues. The article ends with a discussion of possible solutions.

  14. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...
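The classic general-purpose technique in this area is inverse-transform sampling: if U is uniform on (0, 1), then F⁻¹(U) is distributed according to F. A minimal sketch for the exponential distribution, whose inverse CDF is −ln(1 − u)/λ:

```python
import math
import random

def sample_exponential(rate, u=None):
    """Inverse-transform sampling for Exp(rate): X = -ln(1 - U) / rate."""
    if u is None:
        u = random.random()
    return -math.log(1.0 - u) / rate

random.seed(1)
xs = [sample_exponential(2.0) for _ in range(100_000)]
mean = sum(xs) / len(xs)
print(mean)  # close to the theoretical mean 1/rate = 0.5
```

This works whenever the inverse CDF is available in closed form; the tailored techniques the book describes cover the many distributions where it is not.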

  15. Sampling methods

    International Nuclear Information System (INIS)

    Loughran, R.J.; Wallbrink, P.J.; Walling, D.E.; Appleby, P.G.

    2002-01-01

    Methods for the collection of soil samples to determine levels of 137 Cs and other fallout radionuclides, such as excess 210 Pb and 7 Be, will depend on the purposes (aims) of the project, site and soil characteristics, analytical capacity, the total number of samples that can be analysed and the sample mass required. The latter two will depend partly on detector type and capabilities. A variety of field methods have been developed for different field conditions and circumstances over the past twenty years, many of them inherited or adapted from soil science and sedimentology. The use of 137 Cs in erosion studies has been widely developed, while the application of fallout 210 Pb and 7 Be is still developing. Although it is possible to measure these nuclides simultaneously, it is common for experiments to be designed around the use of 137 Cs alone. Caesium studies typically involve comparison of the inventories found at eroded or sedimentation sites with that of a 'reference' site. An accurate characterization of the depth distribution of these fallout nuclides is often required in order to apply and/or calibrate the conversion models. However, depending on the tracer involved, the depth distribution, and thus the sampling resolution required to define it, differs. For example, a depth resolution of 1 cm is often adequate when using 137 Cs. However, fallout 210 Pb and 7 Be commonly have very strong surface maxima that decrease exponentially with depth, and fine depth increments are required at or close to the soil surface. Consequently, different depth incremental sampling methods are required when using different fallout radionuclides. Geomorphic investigations also frequently require determination of the depth-distribution of fallout nuclides on slopes and depositional sites as well as their total inventories
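
    The point about depth resolution can be illustrated with the exponential profile the abstract describes, C(z) = C0·exp(−z/h). A minimal sketch computing what fraction of a nuclide's total inventory sits in a given surface increment (the relaxation depths h below are assumed round numbers for illustration, not measured values):

    ```python
    import math

    def inventory_fraction(depth_cm, relaxation_cm):
        """Cumulative fraction of a surface-exponential fallout inventory
        C(z) = C0 * exp(-z / h) contained between the surface and depth z."""
        return 1.0 - math.exp(-depth_cm / relaxation_cm)

    # Assumed relaxation depths: 7Be concentrates near the surface,
    # 137Cs penetrates deeper.
    h_be7, h_cs137 = 0.5, 3.0
    frac_be7 = inventory_fraction(1.0, h_be7)    # ~86% of 7Be in the top 1 cm
    frac_cs = inventory_fraction(1.0, h_cs137)   # ~28% of 137Cs in the top 1 cm
    ```

    This is why fine increments near the surface matter for 7 Be and 210 Pb while 1 cm resolution can suffice for 137 Cs.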

  16. Isotope generator

    International Nuclear Information System (INIS)

    1979-01-01

    The patent describes an isotope generator incorporating the possibility of stopping elution before the elution vessel is completely full. Sterile ventilation of the whole system can then occur, including of both generator reservoir and elution vessel. A sterile, and therefore pharmaceutically acceptable, elution fluid is thus obtained and the interior of the generator is not polluted with non-sterile air. (T.P.)

  17. Instant Generation

    Science.gov (United States)

    Loveland, Elaina

    2017-01-01

    Generation Z students (born between 1995-2010) have replaced millennials on college campuses. Generation Z students are entrepreneurial, desire practical skills with their education, and are concerned about the cost of college. This article presents what needs to be known about this new generation of students.

  18. Optimising generators

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, E.J.; Garcia, A.O.; Graffigna, F.M.; Verdu, C.A. (IMPSA (Argentina). Generators Div.)

    1994-11-01

    A new computer tool, the ARGEN program, has been developed for dimensioning large hydroelectric generators. This results in better designs, and reduces calculation time for engineers. ARGEN performs dimensional tailoring of salient pole synchronous machines in generators, synchronous condensers, and generator-motors. The operation and uses of ARGEN are explained and its advantages are listed in this article. (UK)

  19. Operational air sampling report

    International Nuclear Information System (INIS)

    Lyons, C.L.

    1994-03-01

    Nevada Test Site vertical shaft and tunnel events generate beta/gamma fission products. The REECo air sampling program is designed to measure these radionuclides at various facilities supporting these events. The current testing moratorium and closure of the Decontamination Facility has decreased the scope of the program significantly. Of the 118 air samples collected in the only active tunnel complex, only one showed any airborne fission products. Tritiated water vapor concentrations were very similar to previously reported levels. The 206 air samples collected at the Area-6 decontamination bays and laundry were again well below any Derived Air Concentration calculation standard. Laboratory analyses of these samples were negative for any airborne fission products

  20. Study of five novel non-synonymous polymorphisms in human brain-expressed genes in a Colombian sample.

    Science.gov (United States)

    Ojeda, Diego A; Forero, Diego A

    2014-10-01

    Non-synonymous single nucleotide polymorphisms (nsSNPs) in brain-expressed genes represent interesting candidates for genetic research in neuropsychiatric disorders. Our objective was to study novel nsSNPs in brain-expressed genes in a sample of Colombian subjects. We applied an approach based on in silico mining of available genomic data to identify and select novel nsSNPs in brain-expressed genes. We developed novel genotyping assays, based on allele-specific PCR methods, for these nsSNPs and genotyped them in 171 Colombian subjects. Five common nsSNPs were studied (rs6855837, p.Leu395Ile; rs2305160, p.Thr394Ala; rs10503929, p.Met289Thr; rs2270641, p.Thr4Pro; and rs3822659, p.Ser735Ala), located in the CLOCK, NPAS2, NRG1, SLC18A1 and WWC1 genes. We reported allele and genotype frequencies in a sample of South American healthy subjects. There is previous experimental evidence, arising from genome-wide expression and association studies, for the involvement of these genes in several neuropsychiatric disorders and endophenotypes, such as schizophrenia, mood disorders or memory performance. Frequencies for these nsSNPs in the Colombian samples varied in comparison to different HapMap populations. Future study of these nsSNPs in brain-expressed genes, a synaptogenomics approach, will be important for a better understanding of neuropsychiatric diseases and endophenotypes in different populations.
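
    The allele and genotype frequencies such a study reports derive directly from genotype counts. A minimal sketch for a biallelic SNP, including the Hardy-Weinberg expected genotype frequencies often checked alongside (the counts below are hypothetical, not the study's data):

    ```python
    def allele_genotype_frequencies(n_aa, n_ab, n_bb):
        """Allele and genotype frequencies from genotype counts at a biallelic SNP,
        plus Hardy-Weinberg expected genotype frequencies for comparison."""
        n = n_aa + n_ab + n_bb
        p = (2 * n_aa + n_ab) / (2 * n)  # frequency of allele A
        q = 1.0 - p
        observed = (n_aa / n, n_ab / n, n_bb / n)
        hw_expected = (p * p, 2 * p * q, q * q)
        return p, q, observed, hw_expected

    # Hypothetical counts summing to 171 subjects, for illustration only:
    p, q, obs, exp = allele_genotype_frequencies(50, 85, 36)
    ```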

  1. Waste classification sampling plan

    International Nuclear Information System (INIS)

    Landsman, S.D.

    1998-01-01

    The purpose of this sampling plan is to explain the method used to collect and analyze data necessary to verify and/or determine the radionuclide content of the B-Cell decontamination and decommissioning waste stream so that the correct waste classification for the waste stream can be made, and to collect samples for studies of decontamination methods that could be used to remove fixed contamination present on the waste. The scope of this plan is to establish the technical basis for collecting samples and compiling quantitative data on the radioactive constituents present in waste generated during deactivation activities in B-Cell. Sampling and radioisotopic analysis will be performed on the fixed layers of contamination present on structural material and internal surfaces of process piping and tanks. In addition, dose rate measurements on existing waste material will be performed to determine the fraction of dose rate attributable to both removable and fixed contamination. Samples will also be collected to support studies of decontamination methods that are effective in removing the fixed contamination present on the waste. Sampling performed under this plan will meet criteria established in BNF-2596, Data Quality Objectives for the B-Cell Waste Stream Classification Sampling, J. M. Barnett, May 1998

  2. Wind Generators

    Science.gov (United States)

    1989-01-01

    When Enerpro, Inc. president, Frank J. Bourbeau, attempted to file a patent on a system for synchronizing a wind generator to the electric utility grid, he discovered Marshall Space Flight Center's Frank Nola's power factor controller. Bourbeau advanced the technology and received a NASA license and a patent for his Auto Synchronous Controller (ASC). The ASC reduces generator "inrush current," which occurs when large generators are abruptly brought on line. It controls voltage so the generator is smoothly connected to the utility grid when it reaches its synchronous speed, protecting the components from inrush current damage. Generator efficiency is also increased in light winds by applying lower than rated voltage. Wind energy is utilized to drive turbines to generate electricity for utility companies.

  3. A second generation human haplotype map of over 3.1 million SNPs.

    Science.gov (United States)

    Frazer, Kelly A; Ballinger, Dennis G; Cox, David R; Hinds, David A; Stuve, Laura L; Gibbs, Richard A; Belmont, John W; Boudreau, Andrew; Hardenbol, Paul; Leal, Suzanne M; Pasternak, Shiran; Wheeler, David A; Willis, Thomas D; Yu, Fuli; Yang, Huanming; Zeng, Changqing; Gao, Yang; Hu, Haoran; Hu, Weitao; Li, Chaohua; Lin, Wei; Liu, Siqi; Pan, Hao; Tang, Xiaoli; Wang, Jian; Wang, Wei; Yu, Jun; Zhang, Bo; Zhang, Qingrun; Zhao, Hongbin; Zhao, Hui; Zhou, Jun; Gabriel, Stacey B; Barry, Rachel; Blumenstiel, Brendan; Camargo, Amy; Defelice, Matthew; Faggart, Maura; Goyette, Mary; Gupta, Supriya; Moore, Jamie; Nguyen, Huy; Onofrio, Robert C; Parkin, Melissa; Roy, Jessica; Stahl, Erich; Winchester, Ellen; Ziaugra, Liuda; Altshuler, David; Shen, Yan; Yao, Zhijian; Huang, Wei; Chu, Xun; He, Yungang; Jin, Li; Liu, Yangfan; Shen, Yayun; Sun, Weiwei; Wang, Haifeng; Wang, Yi; Wang, Ying; Xiong, Xiaoyan; Xu, Liang; Waye, Mary M Y; Tsui, Stephen K W; Xue, Hong; Wong, J Tze-Fei; Galver, Luana M; Fan, Jian-Bing; Gunderson, Kevin; Murray, Sarah S; Oliphant, Arnold R; Chee, Mark S; Montpetit, Alexandre; Chagnon, Fanny; Ferretti, Vincent; Leboeuf, Martin; Olivier, Jean-François; Phillips, Michael S; Roumy, Stéphanie; Sallée, Clémentine; Verner, Andrei; Hudson, Thomas J; Kwok, Pui-Yan; Cai, Dongmei; Koboldt, Daniel C; Miller, Raymond D; Pawlikowska, Ludmila; Taillon-Miller, Patricia; Xiao, Ming; Tsui, Lap-Chee; Mak, William; Song, You Qiang; Tam, Paul K H; Nakamura, Yusuke; Kawaguchi, Takahisa; Kitamoto, Takuya; Morizono, Takashi; Nagashima, Atsushi; Ohnishi, Yozo; Sekine, Akihiro; Tanaka, Toshihiro; Tsunoda, Tatsuhiko; Deloukas, Panos; Bird, Christine P; Delgado, Marcos; Dermitzakis, Emmanouil T; Gwilliam, Rhian; Hunt, Sarah; Morrison, Jonathan; Powell, Don; Stranger, Barbara E; Whittaker, Pamela; Bentley, David R; Daly, Mark J; de Bakker, Paul I W; Barrett, Jeff; Chretien, Yves R; Maller, Julian; McCarroll, Steve; Patterson, Nick; Pe'er, Itsik; Price, Alkes; Purcell, Shaun; Richter, 
Daniel J; Sabeti, Pardis; Saxena, Richa; Schaffner, Stephen F; Sham, Pak C; Varilly, Patrick; Altshuler, David; Stein, Lincoln D; Krishnan, Lalitha; Smith, Albert Vernon; Tello-Ruiz, Marcela K; Thorisson, Gudmundur A; Chakravarti, Aravinda; Chen, Peter E; Cutler, David J; Kashuk, Carl S; Lin, Shin; Abecasis, Gonçalo R; Guan, Weihua; Li, Yun; Munro, Heather M; Qin, Zhaohui Steve; Thomas, Daryl J; McVean, Gilean; Auton, Adam; Bottolo, Leonardo; Cardin, Niall; Eyheramendy, Susana; Freeman, Colin; Marchini, Jonathan; Myers, Simon; Spencer, Chris; Stephens, Matthew; Donnelly, Peter; Cardon, Lon R; Clarke, Geraldine; Evans, David M; Morris, Andrew P; Weir, Bruce S; Tsunoda, Tatsuhiko; Mullikin, James C; Sherry, Stephen T; Feolo, Michael; Skol, Andrew; Zhang, Houcan; Zeng, Changqing; Zhao, Hui; Matsuda, Ichiro; Fukushima, Yoshimitsu; Macer, Darryl R; Suda, Eiko; Rotimi, Charles N; Adebamowo, Clement A; Ajayi, Ike; Aniagwu, Toyin; Marshall, Patricia A; Nkwodimmah, Chibuzor; Royal, Charmaine D M; Leppert, Mark F; Dixon, Missy; Peiffer, Andy; Qiu, Renzong; Kent, Alastair; Kato, Kazuto; Niikawa, Norio; Adewole, Isaac F; Knoppers, Bartha M; Foster, Morris W; Clayton, Ellen Wright; Watkin, Jessica; Gibbs, Richard A; Belmont, John W; Muzny, Donna; Nazareth, Lynne; Sodergren, Erica; Weinstock, George M; Wheeler, David A; Yakub, Imtaz; Gabriel, Stacey B; Onofrio, Robert C; Richter, Daniel J; Ziaugra, Liuda; Birren, Bruce W; Daly, Mark J; Altshuler, David; Wilson, Richard K; Fulton, Lucinda L; Rogers, Jane; Burton, John; Carter, Nigel P; Clee, Christopher M; Griffiths, Mark; Jones, Matthew C; McLay, Kirsten; Plumb, Robert W; Ross, Mark T; Sims, Sarah K; Willey, David L; Chen, Zhu; Han, Hua; Kang, Le; Godbout, Martin; Wallenburg, John C; L'Archevêque, Paul; Bellemare, Guy; Saeki, Koji; Wang, Hongguang; An, Daochang; Fu, Hongbo; Li, Qing; Wang, Zhen; Wang, Renwu; Holden, Arthur L; Brooks, Lisa D; McEwen, Jean E; Guyer, Mark S; Wang, Vivian Ota; Peterson, Jane L; Shi, Michael; 
Spiegel, Jack; Sung, Lawrence M; Zacharia, Lynn F; Collins, Francis S; Kennedy, Karen; Jamieson, Ruth; Stewart, John

    2007-10-18

    We describe the Phase II HapMap, which characterizes over 3.1 million human single nucleotide polymorphisms (SNPs) genotyped in 270 individuals from four geographically diverse populations and includes 25-35% of common SNP variation in the populations surveyed. The map is estimated to capture untyped common variation with an average maximum r2 of between 0.9 and 0.96 depending on population. We demonstrate that the current generation of commercial genome-wide genotyping products captures common Phase II SNPs with an average maximum r2 of up to 0.8 in African and up to 0.95 in non-African populations, and that potential gains in power in association studies can be obtained through imputation. These data also reveal novel aspects of the structure of linkage disequilibrium. We show that 10-30% of pairs of individuals within a population share at least one region of extended genetic identity arising from recent ancestry and that up to 1% of all common variants are untaggable, primarily because they lie within recombination hotspots. We show that recombination rates vary systematically around genes and between genes of different function. Finally, we demonstrate increased differentiation at non-synonymous, compared to synonymous, SNPs, resulting from systematic differences in the strength or efficacy of natural selection between populations.
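
    The r² statistic that this abstract uses to quantify how well typed SNPs capture untyped variation is the standard measure of linkage disequilibrium between two biallelic loci. A minimal sketch from allele and haplotype frequencies (the numbers are illustrative, not HapMap values):

    ```python
    def r_squared(p_a, p_b, p_ab):
        """Linkage-disequilibrium r^2 between two biallelic loci.

        p_a, p_b: frequencies of allele A at locus 1 and allele B at locus 2.
        p_ab: frequency of the A-B haplotype.
        """
        d = p_ab - p_a * p_b  # disequilibrium coefficient D
        return d * d / (p_a * (1.0 - p_a) * p_b * (1.0 - p_b))

    # A typed tag SNP in perfect LD with an untyped SNP captures it fully:
    # here the A-B haplotype frequency equals both allele frequencies.
    r2_perfect = r_squared(0.3, 0.3, 0.3)   # r^2 = 1.0
    r2_partial = r_squared(0.3, 0.4, 0.2)   # r^2 < 1.0
    ```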

  4. Generative Semantics.

    Science.gov (United States)

    King, Margaret

    The first section of this paper deals with the attempts within the framework of transformational grammar to make semantics a systematic part of linguistic description, and outlines the characteristics of the generative semantics position. The second section takes a critical look at generative semantics in its later manifestations, and makes a case…

  5. Generative Semantics

    Science.gov (United States)

    Bagha, Karim Nazari

    2011-01-01

    Generative semantics is (or perhaps was) a research program within linguistics, initiated by the work of George Lakoff, John R. Ross, Paul Postal and later McCawley. The approach developed out of transformational generative grammar in the mid 1960s, but stood largely in opposition to work by Noam Chomsky and his students. The nature and genesis of…

  6. Sound generator

    NARCIS (Netherlands)

    Berkhoff, Arthur P.

    2008-01-01

    A sound generator, particularly a loudspeaker, configured to emit sound, comprising a rigid element (2) enclosing a plurality of air compartments (3), wherein the rigid element (2) has a back side (B) comprising apertures (4), and a front side (F) that is closed, wherein the generator is provided

  7. Sound generator

    NARCIS (Netherlands)

    Berkhoff, Arthur P.

    2010-01-01

    A sound generator, particularly a loudspeaker, configured to emit sound, comprising a rigid element (2) enclosing a plurality of air compartments (3), wherein the rigid element (2) has a back side (B) comprising apertures (4), and a front side (F) that is closed, wherein the generator is provided

  8. Pulse Generator

    Science.gov (United States)

    Greer, Lawrence (Inventor)

    2017-01-01

    An apparatus and a computer-implemented method for generating pulses synchronized to a rising edge of a tachometer signal from rotating machinery are disclosed. For example, in one embodiment, a pulse state machine may be configured to generate a plurality of pulses, and a period state machine may be configured to determine a period for each of the plurality of pulses.

  9. Sound generator

    NARCIS (Netherlands)

    Berkhoff, Arthur P.

    2007-01-01

    A sound generator, particularly a loudspeaker, configured to emit sound, comprising a rigid element (2) enclosing a plurality of air compartments (3), wherein the rigid element (2) has a back side (B) comprising apertures (4), and a front side (F) that is closed, wherein the generator is provided

  10. Steam generator

    International Nuclear Information System (INIS)

    Fenet, J.-C.

    1980-01-01

    Steam generator particularly intended for use in the coolant system of a pressurized water reactor for vaporizing a secondary liquid, generally water, by the primary cooling liquid of the reactor and comprising special arrangements for drying the steam before it leaves the generator [fr

  11. Groundwater sampling: Chapter 5

    Science.gov (United States)

    Wang, Qingren; Munoz-Carpena, Rafael; Foster, Adam; Migliaccio, Kati W.; Li, Yuncong; Migliaccio, Kati

    2011-01-01

    About the book: As water quality becomes a leading concern for people and ecosystems worldwide, it must be properly assessed in order to protect water resources for current and future generations. Water Quality Concepts, Sampling, and Analyses supplies practical information for planning, conducting, or evaluating water quality monitoring programs. It presents the latest information and methodologies for water quality policy, regulation, monitoring, field measurement, laboratory analysis, and data analysis. The book addresses water quality issues, water quality regulatory development, monitoring and sampling techniques, best management practices, and laboratory methods related to the water quality of surface and ground waters. It also discusses basic concepts of water chemistry and hydrology related to water sampling and analysis; instrumentation; water quality data analysis; and evaluation and reporting results.

  12. Genotoxicity testing of samples generated during UV/H2O2 treatment of surface water for the production of drinking water using the Ames test in vitro and the Comet assay and the SCE test in vivo

    NARCIS (Netherlands)

    Penders, E.J.M.; Martijn, A.J.; Spenkelink, A.; Alink, G.M.; Rietjens, I.; Hoogenboezem, W.

    2012-01-01

    UV/H2O2 treatment can be part of the process converting surface water to drinking water, but would pose a potential problem when resulting in genotoxicity. This study investigates the genotoxicity of samples collected from the water treatment plant Andijk, applying UV/H2O2 treatment with an

  13. Analysis of industry-generated data. Part 1: a baseline for the development of a tool to assist the milk industry in designing sampling plans for controlling aflatoxin M1 in milk.

    Science.gov (United States)

    Trevisani, Marcello; Farkas, Zsuzsa; Serraino, Andrea; Zambrini, Angelo Vittorio; Pizzamiglio, Valentina; Giacometti, Federica; Ámbrus, Arpád

    2014-01-01

    The presence of aflatoxin M1 (AFM1) in milk was assessed in Italy in the framework of designing a monitoring plan implemented by the milk industry in the period 2005-10. Overall, 21,969 samples were taken from tankers collecting milk from 690 dairy farms. The milk samples were representative of the consignments of co-mingled milk received from multiple (two to six) farms. Systematic, biweekly sampling of consignments involved each of the 121 districts (70 in the North, 17 in the Central and 34 in the South regions of Italy). AFM1 concentration was measured using an enzyme-linked immunoassay method (validated within the range of 5-100 ng kg(-1)), whereas an HPLC method was used for the quantification of levels in the samples that had concentrations higher than 100 ng kg(-1). Process control charts using data collected in three processing plants illustrate, as an example, the seasonal variation of the contamination. The mean concentration of AFM1 was in the range between 11 and 19 ng kg(-1). The 90th and 99th percentile values were 19-34 and 41-91 ng kg(-1), respectively, and values as high as 280 ng kg(-1) were reached in 2008. The number of non-compliant consignments (those with an AFM1 concentration above the statutory limit of 50 ng kg(-1)) varied between 0.3% and 3.1% per year, with peaks in September, after the maize harvest season. The variability between different regions was not significant. The results show that controlling the aflatoxins in feed at farm level was inadequate; consequently, screening of raw milk prior to processing was needed. The evaluation of the AFM1 contamination level observed during a long-term period can provide useful data for defining the frequency of sampling.
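
    The summary statistics the study reports (90th/99th percentiles and the non-compliance rate against the 50 ng/kg limit) are straightforward to compute from raw results. A minimal sketch using a nearest-rank percentile and synthetic data (the input values are invented for illustration, not the study's measurements):

    ```python
    def summarize_afm1(concentrations_ng_kg, limit_ng_kg=50.0):
        """90th and 99th percentiles (nearest-rank) and non-compliance rate
        for a list of AFM1 results in ng/kg."""
        xs = sorted(concentrations_ng_kg)

        def percentile(p):
            k = max(0, min(len(xs) - 1, int(round(p / 100.0 * len(xs))) - 1))
            return xs[k]

        non_compliant = sum(1 for x in xs if x > limit_ng_kg) / len(xs)
        return percentile(90), percentile(99), non_compliant

    # Synthetic batch: 90 compliant and 10 non-compliant consignments.
    p90, p99, frac = summarize_afm1([11.0] * 90 + [60.0] * 10)  # -> (11.0, 60.0, 0.1)
    ```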

  14. Wideband 4-diode sampling circuit

    Science.gov (United States)

    Wojtulewicz, Andrzej; Radtke, Maciej

    2016-09-01

    The objective of this work was to develop a wide-band sampling circuit. The device should be able to collect samples of a very fast signal applied to its input, amplify them, and prepare them for further processing. The study emphasizes the method of sampling-pulse shaping. The use of an ultrafast pulse generator allows sampling of signals with a wide frequency spectrum, reaching several gigahertz. The device uses a pulse transformer to prepare symmetrical pulses. Their final shape is formed with the help of a step-recovery diode, two coplanar strips and a Schottky diode. The device can be used in a sampling oscilloscope, as well as in other measurement systems.

  15. Energy generation

    CSIR Research Space (South Africa)

    Osburn, L

    2009-02-01

    Full Text Available Current perceptions conjure images of photovoltaic panels and wind turbines when green building or sustainable development is discussed. How energy is used and how it is generated are core components of both green building and sustainable...

  16. Radionuclide generators

    International Nuclear Information System (INIS)

    Lambrecht, R.M.; Wollongong Univ.; Tomiyoshi, K.; Sekine, T.

    1997-01-01

    The present status and future directions of research and development on radionuclide generator technology are reported. Recent interest in developing double-neutron-capture reactions for the production of in vivo generators, neutron-rich nuclides for radioimmunotherapeutic pharmaceuticals, and advances with ultra-short-lived generators is highlighted. Emphasis is focused on: production of the parent radionuclide; the selection and evaluation of support materials and eluents with respect to the resultant radiochemical yield of the daughter and the breakthrough of the radionuclide parent; and the uses of radionuclide generators in radiopharmaceutical chemistry and in biomedical and industrial applications. The 62 Zn → 62 Cu, 66 Ni → 66 Cu, 103m Rh → 103 Rh, 188 W → 188 Re and the 225 Ac → 221 Fr → 213 Bi generators are predicted to be emphasized in future development. Coverage of the 99 Mo → 99m Tc generator was excluded, as it is the subject of another review. The literature search ended June 1996. (orig.)

  17. Radionuclide generators

    International Nuclear Information System (INIS)

    Lambrecht, R.M.

    1983-01-01

    The status of radionuclide generators for chemical research and applications related to the life sciences and biomedical research are reviewed. Emphasis is placed upon convenient, efficient and rapid separation of short-lived daughter radionuclides in a chemical form suitable for use without further chemical manipulation. The focus is on the production of the parent, the radiochemistry associated with processing the parent and daughter, the selection and the characteristic separation methods, and yields. Quality control considerations are briefly noted. The scope of this review includes selected references to applications of radionuclide generators in radiopharmaceutical chemistry, and the life sciences, particularly in diagnostic and therapeutic medicine. The 99 Mo-sup(99m)Tc generator was excluded. 202 references are cited. (orig.)

  18. Sampling the Mouse Hippocampal Dentate Gyrus

    OpenAIRE

    Lisa Basler; Stephan Gerdes; David P. Wolfer; Lutz Slomianka

    2017-01-01

    Sampling is a critical step in procedures that generate quantitative morphological data in the neurosciences. Samples need to be representative to allow statistical evaluations, and samples need to deliver a precision that makes statistical evaluations not only possible but also meaningful. Sampling-generated variability should, for example, not be able to hide significant group differences from statistical detection if they are present. Estimators of the coefficient of error (CE) have been develope...
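
    The coefficient of error quantifies how much of the observed variability comes from the sampling itself. A minimal sketch of the plain independent-samples version (SEM divided by the mean); note that dedicated stereological CE estimators such as Gundersen-Jensen, which this abstract alludes to, additionally account for systematic (not independent) sampling:

    ```python
    import math

    def coefficient_of_error(counts):
        """Simple coefficient of error: standard error of the mean / mean.

        Assumes independent repeated samples; stereological CE estimators
        for systematic sampling use different variance terms.
        """
        n = len(counts)
        mean = sum(counts) / n
        var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
        return math.sqrt(var / n) / mean

    # Hypothetical cell counts from five sampled sections:
    ce = coefficient_of_error([100, 110, 90, 105, 95])  # ~0.035
    ```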

  19. Generation Next

    Science.gov (United States)

    Hawkins, B. Denise

    2010-01-01

    There is a shortage of accounting professors with Ph.D.s who can prepare the next generation. To help reverse the faculty deficit, the American Institute of Certified Public Accountants (CPAs) has created the new Accounting Doctoral Scholars program by pooling more than $17 million and soliciting commitments from more than 70 of the nation's…

  20. Generative Contexts

    Science.gov (United States)

    Lyles, Dan Allen

    Educational research has identified how science, technology, engineering, and mathematics (STEM) practice and education have underperforming metrics in racial and gender diversity, despite decades of intervention. These disparities are part of the construction of a culture of science that is alienating to these populations. Recent studies in a social science framework described as "Generative Justice" have suggested that the context of social and scientific practice might be modified to bring about more just and equitable relations among the disenfranchised by circulating the value they and their non-human allies create back to them in unalienated forms. What is not known are the underlying principles of social and material space that make a system more or less generative. I employ an autoethnographic method at four sites: a high school science class; a farm committed to "Black and Brown liberation"; a summer program geared towards youth environmental mapping; and a summer workshop for Harlem middle school students. My findings suggest that by identifying instances where material affinity, participatory voice, and creative solidarity are mutually reinforcing, it is possible to create educational contexts that generate unalienated value and circulate it back to the producers themselves. This cycle of generation may help explain how to create systems of justice that strengthen and grow themselves through successive iterations. The problem of lack of diversity in STEM may be addressed not merely by recruiting the best and the brightest from underrepresented populations, but by changing the context of STEM education to provide tools for its own systematic restructuring.

  1. Steam generators

    International Nuclear Information System (INIS)

    Hayden, R.L.J.

    1979-01-01

    Steam generators for nuclear reactors are designed so that deposition of solids on the surface of the inlet side of the tubesheet or the inlet header with the consequent danger of corrosion and eventual tube failure is obviated or substantially reduced. (U.K.)

  2. Sample Preprocessing For Atomic Spectrometry

    International Nuclear Information System (INIS)

    Kim, Sun Tae

    2004-08-01

    This book describes atomic spectrometry. It covers atomic absorption spectrometry, including the Maxwell-Boltzmann equation and the Beer-Lambert law, as well as solvent extraction, HGAAS, ETAAS and CVAAS; inductively coupled plasma emission spectrometry, including its basic principles, the generation of the plasma, devices and equipment, and interferences; and inductively coupled plasma mass spectrometry, including instrumentation, the pros and cons of ICP/MS, and sample analysis. It also treats reagents, water, acids, fluxes, experimental materials, sampling, sample decomposition, and contamination and loss in open and closed systems.
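
    The Beer-Lambert law mentioned here relates absorbance linearly to path length and concentration, A = ε·l·c. A minimal sketch with illustrative (not textbook-specific) values:

    ```python
    def absorbance(epsilon_l_per_mol_cm, path_cm, conc_mol_per_l):
        """Beer-Lambert law: A = epsilon * l * c (dimensionless absorbance)."""
        return epsilon_l_per_mol_cm * path_cm * conc_mol_per_l

    def transmittance(a):
        """Fraction of incident light transmitted: T = 10**(-A)."""
        return 10.0 ** (-a)

    # Illustrative values: molar absorptivity 1e4 L/(mol*cm), 1 cm cell, 50 uM.
    a = absorbance(1.0e4, 1.0, 5.0e-5)  # A = 0.5
    t = transmittance(a)                # T ~ 0.316
    ```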

  3. Device for sampling HTGR recycle fuel particles

    International Nuclear Information System (INIS)

    Suchomel, R.R.; Lackey, W.J.

    1977-03-01

    Devices for sampling High-Temperature Gas-Cooled Reactor fuel microspheres were evaluated. Analyses of samples obtained with each of two specially designed passive samplers were compared with data generated by more common techniques. A ten-stage two-way sampler was found to produce a representative sample with a constant batch-to-sample ratio

  4. Comparing Generative and Inter-generative Subjectivity in Post-Revolutionary Academic Generations in Iran

    Directory of Open Access Journals (Sweden)

    mehran Sohrabzadeh

    2010-01-01

    Full Text Available Comparative study of the different post-revolutionary generations has been broadly pursued by social scientists; among them, some believe there is a gulf between generations, while others, acknowledging some small differences among generations, emphasize that this variety is natural. Without committing to either of these two views, the present study attempts to compare three different post-revolutionary academic generations using the theory of "generative objects", which explores generations' views about their behaviors, beliefs, and historical monuments. Sampling was carried out among three generations: first, those who were students in the 60s and are now experienced faculty members in the university; second, those who were recently employed as faculty members; and finally, those who are now university students. Results show that within all three generations there are essential intra-generational similarities, while comparative inter-generational analysis reveals some differentiation.

  5. Thermoelectric generator

    International Nuclear Information System (INIS)

    Purdy, D.L.

    1978-01-01

    The main components of a thermoelectric generator are housed in an evacuated cylindrical vessel. In its middle is the radioactive heat source, e.g. 90 Sr or 238 Pu, enclosed by a gamma radiation shield. This is surrounded by a heat-insulating screen of getter material or individual sheets of titanium. In the bottom of the screen, several thermocouples are arranged on a circle. The thermocouples themselves are contained within gas-tight casings filled with an inert gas, e.g. argon. By separating the internal space of the generator vessel from the thermocouple casings, made of e.g. n- and p-doped lead telluride cylinders, the optimal gas state may be obtained for each. (DG) [de

  6. Cluster generator

    Science.gov (United States)

    Donchev, Todor I [Urbana, IL; Petrov, Ivan G [Champaign, IL

    2011-05-31

    Described herein is an apparatus and a method for producing atom clusters based on a gas discharge within a hollow cathode. The hollow cathode includes one or more walls. The one or more walls define a sputtering chamber within the hollow cathode and include a material to be sputtered. A hollow anode is positioned at an end of the sputtering chamber, and atom clusters are formed when a gas discharge is generated between the hollow anode and the hollow cathode.

  7. Photon generator

    Science.gov (United States)

    Srinivasan-Rao, Triveni

    2002-01-01

    A photon generator includes an electron gun for emitting an electron beam, a laser for emitting a laser beam, and an interaction ring wherein the laser beam repetitively collides with the electron beam for emitting a high energy photon beam therefrom in the exemplary form of x-rays. The interaction ring is a closed loop, sized and configured for circulating the electron beam with a period substantially equal to the period of the laser beam pulses for effecting repetitive collisions.

  8. Development of SYVAC sampling techniques

    International Nuclear Information System (INIS)

    Prust, J.O.; Dalrymple, G.J.

    1985-04-01

This report describes the requirements of a sampling scheme for use with the SYVAC radiological assessment model. The constraints on the number of samples that may be taken are considered. The conclusions from earlier studies using the deterministic generator sampling scheme are summarised. The method of Importance Sampling and a High Dose algorithm, which are designed to preferentially sample the high-dose region of the parameter space, are reviewed in the light of experience gained from earlier studies and the requirements of site assessment and sensitivity analyses. In addition, the use of an alternative numerical integration method for estimating risk is discussed. It is recommended that the method of Importance Sampling be developed and tested for use with SYVAC. An alternative numerical integration method is not recommended for investigation at this stage but should be the subject of future work. (author)
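The Importance Sampling idea reviewed above can be sketched in a few lines (a minimal illustration only, not the SYVAC implementation; the standard-normal "dose" model, the proposal shift, and all parameter values are assumptions for the example):

```python
import math
import random

def tail_prob_importance(threshold, n, shift):
    """Estimate P(X > threshold) for X ~ N(0, 1) by drawing from a proposal
    N(shift, 1) concentrated in the rare 'high dose' region, then re-weighting
    each draw by the likelihood ratio target/proposal (the normalizing
    constants of the two densities cancel)."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(shift, 1.0)          # draw from the shifted proposal
        if x > threshold:
            total += math.exp(-0.5 * x * x + 0.5 * (x - shift) ** 2)
    return total / n

random.seed(1)
est = tail_prob_importance(threshold=4.0, n=20000, shift=4.0)
# the true value is 1 - Phi(4), roughly 3.2e-5; crude Monte Carlo with the
# same budget would rarely see the event at all
```

Shifting the proposal into the region of interest is what makes preferential sampling of the high-dose part of the parameter space efficient.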

  9. Event generators

    International Nuclear Information System (INIS)

    Durand, D.; Gulminelli, F.; Lopez, O.; Vient, E.

    1998-01-01

The results obtained in the last two years concerning simulations of heavy-ion collisions at Fermi energies by means of phenomenological models are presented. The event generators essentially accompany the development of analysis methods for data obtained with the INDRA or NAUTILUS 4 π multidetectors. To identify and correctly quantify a phenomenon or a physical quantity, it is necessary to verify by simulation the feasibility and validity of the analysis and also to estimate the bias introduced by the experimental filter. Many studies have shown this, for instance: the determination of the collision reaction plane for flow studies, the determination of the kinematical characteristics of the quasi-projectiles, and the measurement of the excitation energy stored in hot nuclei. Several improvements were added to Eugene, the currently used generator: the introduction of space-time correlations between the different products emitted in the decay of excited nuclei, by calculating the trajectories of the particles in the final phase of the reaction; the inclusion of the discrete levels of the lighter fragments in the decay cascade; and the possibility of schematically describing the explosion of the nucleus by simultaneous emission of multiple fragments. Thus, by comparing the calculations with data on heavy systems studied with the NAUTILUS assembly, it was possible to extract the time scales of nuclear fragmentation. The use of these event generators was extended to the analysis of INDRA data for the determination of the vaporization threshold in Ar + Ni collisions and the search for expansion effects in Xe + Sn collisions at 50 MeV/u

  10. GENERATIVE LEADERSHIP

    Directory of Open Access Journals (Sweden)

    Janina León

    2010-07-01

Full Text Available This article presents the results of a research project that studied leadership from the standpoint of the personal conceptions that influence the behavior of local government leaders, as well as the conceptions desired for generating the social transformation processes required in communities. A qualitative methodology was used. Categories of analysis were created based on Pearson’s (1992) model of psychological archetypes. A relevant finding was the limited progress shown by interviewees regarding self-knowledge, and a fragmented vision between the observer and the observed, which hinders their ability to take on the challenges that current reality demands of them.

  11. Power generation

    International Nuclear Information System (INIS)

    Nunez, Anibal D.

    2001-01-01

In the second half of the twentieth century, nuclear power became an industrial reality. Today, the 433 operating power plants, the 37 plants under construction, and nearly 9000 reactor-years of operation with only one serious accident releasing radioactive material to the environment (Chernobyl) show the maturity of this technology. Nuclear power now contributes 17% of global electricity generation; a 75% increase in electricity demand is estimated by 2020, and demand is expected to triple by 2050. How can this requirement be satisfied? All indicators seem to show that nuclear power will be the solution, given the shortage of other sources, the rising prices of non-renewable fuels and the limited contribution of renewable ones. In addition, the climatic changes produced by the greenhouse effect make nuclear power even more attractive. The situation of Argentina is analyzed and compared with that of other countries. The benefit of increasing the nuclear share of total national generation seems clear, and completion of the construction of the Atucha II nuclear power plant is recommended

  12. Direct impact aerosol sampling by electrostatic precipitation

    Science.gov (United States)

    Braden, Jason D.; Harter, Andrew G.; Stinson, Brad J.; Sullivan, Nicholas M.

    2016-02-02

The present disclosure provides apparatuses for collecting aerosol samples by ionizing an air sample to different degrees. An air flow is generated through a cavity in which at least one corona wire is disposed and electrically charged to form a corona therearound. At least one grounded sample collection plate is provided downstream of the at least one corona wire so that aerosol ions generated within the corona are deposited on the at least one grounded sample collection plate. A plurality of aerosol samples ionized to different degrees can be generated. The at least one corona wire may be perpendicular to the direction of the flow, or may be parallel to the direction of the flow. The apparatus can include a serial connection of a plurality of stages such that each stage is capable of generating at least one aerosol sample, and the air flow passes through the plurality of stages serially.

  13. Generativity Does Not Necessarily Satisfy All Your Needs: Associations among Cultural Demand for Generativity, Generative Concern, Generative Action, and Need Satisfaction in the Elderly in Four Cultures

    Science.gov (United States)

    Hofer, Jan; Busch, Holger; Au, Alma; Polácková Šolcová, Iva; Tavel, Peter; Tsien Wong, Teresa

    2016-01-01

    The present study examines the association between various facets of generativity, that is, cultural demand for generativity, generative concern, and generative action, with the satisfaction of the needs for relatedness, competence, and autonomy in samples of elderly from Cameroon, China (Hong Kong), the Czech Republic, and Germany. Participants…

  14. Generation and analysis of chemical compound libraries

    Science.gov (United States)

    Gregoire, John M.; Jin, Jian; Kan, Kevin S.; Marcin, Martin R.; Mitrovic, Slobodan; Newhouse, Paul F.; Suram, Santosh K.; Xiang, Chengxiang; Zhou, Lan

    2017-10-03

Various samples are generated on a substrate. Each sample includes or consists of one or more analytes. In some instances, the samples are generated through the use of gels or through vapor deposition techniques. The samples are used in an instrument for screening large numbers of analytes by locating the samples between a working electrode and a counter electrode assembly. The instrument also includes one or more light sources for illuminating each of the samples. The instrument is configured to measure the photocurrent formed through a sample as a result of the illumination of the sample.

  15. Robotic system for process sampling

    International Nuclear Information System (INIS)

    Dyches, G.M.

    1985-01-01

    A three-axis cartesian geometry robot for process sampling was developed at the Savannah River Laboratory (SRL) and implemented in one of the site radioisotope separations facilities. Use of the robot reduces personnel radiation exposure and contamination potential by routinely handling sample containers under operator control in a low-level radiation area. This robot represents the initial phase of a longer term development program to use robotics for further sample automation. Preliminary design of a second generation robot with additional capabilities is also described. 8 figs

  16. Sampling Assumptions in Inductive Generalization

    Science.gov (United States)

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  17. The determination by irradiation with a pulsed neutron generator and delayed neutron counting of the amount of fissile material present in a sample; Determination de la quantite de matiere fissile presente dans un echantillon par irradiation au moyen d'une source pulsee de neutrons et comptage des neutrons retardes

    Energy Technology Data Exchange (ETDEWEB)

    Beliard, L; Janot, P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-07-01

A preliminary study was conducted to determine the amount of fissile material present in a sample. The method used consisted in irradiating the sample by means of a pulsed neutron generator and delayed neutron counting. Results show the validity of this method provided some experimental precautions are taken. Checking the residual proportion of fissile material in leached hulls seems possible. (authors)

  18. Next-generation phylogenomics

    Directory of Open Access Journals (Sweden)

    Chan Cheong Xin

    2013-01-01

Full Text Available Abstract Thanks to advances in next-generation technologies, genome sequences are now being generated at breadth (e.g. across environments) and depth (thousands of closely related strains, individuals or samples) unimaginable only a few years ago. Phylogenomics – the study of evolutionary relationships based on comparative analysis of genome-scale data – has so far been developed as industrial-scale molecular phylogenetics, proceeding in the two classical steps: multiple alignment of homologous sequences, followed by inference of a tree (or multiple trees). However, the algorithms typically employed for these steps scale poorly with the number of sequences, such that for an increasing number of problems, high-quality phylogenomic analysis is (or soon will be) computationally infeasible. Moreover, next-generation data are often incomplete and error-prone, and analysis may be further complicated by genome rearrangement, gene fusion and deletion, lateral genetic transfer, and transcript variation. Here we argue that next-generation data require next-generation phylogenomics, including so-called alignment-free approaches. Reviewers Reviewed by Mr Alexander Panchin (nominated by Dr Mikhail Gelfand), Dr Eugene Koonin and Prof Peter Gogarten. For the full reviews, please go to the Reviewers’ comments section.

  19. Plasma generator

    International Nuclear Information System (INIS)

    Omichi, Takeo; Yamanaka, Toshiyuki.

    1976-01-01

Object: To recycle a coolant in a sealed hollow portion formed in the interior of the plasma limiter itself, thereby causing direct contact between the coolant and the plasma limiter and increasing the contact area between them, so as to cool the plasma limiter. Structure: The heat resulting from the plasma generated during operation and applied to the body of the plasma limiter is transmitted to the coolant, which circulates through an inlet and outlet pipe, an inlet and outlet nozzle and a hollow portion, keeping the plasma limiter below a predetermined temperature. At the time of emergency operation, the heater wire is energized to heat the plasma limiter; this heat is transmitted to the limiter body and raises its temperature. However, the coolant circulating in the hollow portion comes into direct contact with the limiter body, and since the plasma limiter surrounds the hollow portion, the amount of heat transmitted from the limiter body to the coolant increases, sufficiently cooling the plasma limiter. (Yoshihara, H.)

  20. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

Exposure to Sampling: Abstract; Introduction; Concepts of Population, Sample, and Sampling. Initial Ramifications: Abstract; Introduction; Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes; Appendix to Chapter 2: More on Equal Probability Sampling, Horvitz-Thompson Estimator, Sufficiency, Likelihood, Non-Existence Theorem. More Intricacies: Abstract; Introduction; Unequal Probability Sampling Strategies; PPS Sampling. Exploring Improved Ways: Abstract; Introduction; Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling. Modeling: Introduction; Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...
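The Horvitz-Thompson estimator listed in the book's appendix admits a very short sketch (illustrative only; the toy sample values and inclusion probabilities below are invented for the example):

```python
def horvitz_thompson_total(sample_values, inclusion_probs):
    """Unbiased estimator of a population total: each sampled unit's value
    is weighted by the inverse of its probability of entering the sample."""
    return sum(y / p for y, p in zip(sample_values, inclusion_probs))

# Simple random sampling of n = 2 from N = 4 gives every unit inclusion
# probability n/N = 0.5, so the estimate is N/n times the sample total.
estimate = horvitz_thompson_total([10.0, 30.0], [0.5, 0.5])   # -> 80.0
```

With unequal-probability (e.g. PPS) designs the same formula applies; only the inclusion probabilities change.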

  1. A multi-sample based method for identifying common CNVs in normal human genomic structure using high-resolution aCGH data.

    Directory of Open Access Journals (Sweden)

    Chihyun Park

Full Text Available BACKGROUND: It is difficult to identify copy number variations (CNVs) in normal human genomic data due to noise and non-linear relationships between different genomic regions and signal intensity. A high-resolution array comparative genomic hybridization (aCGH) containing 42 million probes, which is very large compared to previous arrays, was recently published. Most existing CNV detection algorithms do not work well because of noise associated with the large amount of input data and because most of the current methods were not designed to analyze normal human samples. Normal human genome analysis often requires a joint approach across multiple samples. However, the majority of existing methods can only identify CNVs from a single sample. METHODOLOGY AND PRINCIPAL FINDINGS: We developed a multi-sample-based genomic variations detector (MGVD) that uses segmentation to identify common breakpoints across multiple samples and a k-means-based clustering strategy. Unlike previous methods, MGVD simultaneously considers multiple samples with different genomic intensities and identifies CNVs and CNV zones (CNVZs); a CNVZ is a more precise measure of the location of a genomic variant than the CNV region (CNVR). CONCLUSIONS AND SIGNIFICANCE: We designed a specialized algorithm to detect common CNVs from extremely high-resolution multi-sample aCGH data. MGVD showed high sensitivity and a low false discovery rate for a simulated data set, and outperformed most current methods when real, high-resolution HapMap datasets were analyzed. MGVD also had the fastest runtime compared to the other algorithms evaluated when actual, high-resolution aCGH data were analyzed. The CNVZs identified by MGVD can be used in association studies for revealing relationships between phenotypes and genomic aberrations. Our algorithm was developed with standard C++ and is available in Linux and MS Windows format in the STL library.
It is freely available at: http://embio.yonsei.ac.kr/~Park/mgvd.php.
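The k-means step of such a pipeline is easy to sketch (a generic one-dimensional Lloyd's algorithm applied to toy segment intensities, not the published MGVD code; the log2-ratio values are invented for the example):

```python
def kmeans_1d(values, k, iters=50):
    """Plain 1-D k-means (Lloyd's algorithm) of the kind used to cluster
    mean segment intensities into loss / normal / gain states."""
    # crude initialization: spread initial centers across the sorted values
    centers = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            i = min(range(len(centers)), key=lambda j: abs(v - centers[j]))
            groups[i].append(v)
        # recompute each center as the mean of its assigned values
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return sorted(centers)

# mean log2 ratios of segments pooled across samples (toy values)
segs = [-0.9, -1.1, 0.0, 0.05, -0.02, 0.95, 1.05]
centers = kmeans_1d(segs, 3)   # converges near [-1, 0, 1]: loss, normal, gain
```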

  2. Systematic sampling with errors in sample locations

    DEFF Research Database (Denmark)

    Ziegel, Johanna; Baddeley, Adrian; Dorph-Petersen, Karl-Anton

    2010-01-01

Systematic sampling of points in continuous space is widely used in microscopy and spatial surveys. Classical theory provides asymptotic expressions for the variance of estimators based on systematic sampling as the grid spacing decreases. However, the classical theory assumes that the sample grid is exactly periodic; real physical sampling procedures may introduce errors in the placement of the sample points. This paper studies the effect of errors in sample positioning on the variance of estimators in the case of one-dimensional systematic sampling. First we sketch a general approach to variance analysis using point process methods. We then analyze three different models for the error process, calculate exact expressions for the variances, and derive asymptotic variances. Errors in the placement of sample points can lead to substantial inflation of the variance, dampening of zitterbewegung...
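The effect the authors study can be reproduced numerically (a toy one-dimensional sketch, not the paper's point-process analysis; the integrand, grid spacing and Gaussian placement-error model are assumptions for the example):

```python
import math
import random

def systematic_estimate(f, spacing, jitter_sd=0.0):
    """One systematic sample of the integral of f over [0, 1]: the grid
    offset is uniform in [0, spacing); optionally each sample point
    suffers a Gaussian placement error, clamped to [0, 1]."""
    t = random.uniform(0.0, spacing)
    total = 0.0
    while t < 1.0:
        u = min(max(t + random.gauss(0.0, jitter_sd), 0.0), 1.0) if jitter_sd else t
        total += f(u)
        t += spacing
    return spacing * total

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

random.seed(0)
f = lambda x: math.sin(8 * math.pi * x) ** 2     # integral over [0, 1] is 0.5
exact = [systematic_estimate(f, 0.05) for _ in range(2000)]
noisy = [systematic_estimate(f, 0.05, jitter_sd=0.02) for _ in range(2000)]
# placement errors inflate the variance of the (still nearly unbiased) estimator
```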

  3. Electrical generator

    International Nuclear Information System (INIS)

    Purdy, D.L.

    1976-01-01

A nuclear heart pacer is described, having a heat-to-electricity converter that includes a solid-state thermoelectric unit embedded in rubber, which is compressed to apply hydrostatic precompression to the unit. The converter and the radioactive heat source are enclosed in a container which includes the electrical circuit components for producing and controlling the pulses, the converter and components being embedded in rubber. The portions of the rubber in the converter and in the container through which heat flows between the radioactive primary source and the hot junction, and between the cold junction and the wall of the container, are of thermally conducting silicone rubber. The 238 Pu primary radioactive source material is encapsulated in a refractory casing of WC-222 (T-222), which in turn is encapsulated in a corrosion-resistant casing of platinum-rhodium, a diffusion barrier separating the WC-222 and the Pt--Rh casings. The Pt--Rh casing is in a closed basket of tantalum. The tantalum protects the Pt--Rh from reacting with other materials during cremation of the host, if any. The casings and basket suppress the transmission of hard x rays generated by the alpha particles from the 238 Pu. The outside casing of the pacer is typically of titanium, but its surface is covered by an electrically insulating coating, typically epoxy resin, except over a relatively limited area for effective electrical grounding to the body of the host. It is contemplated that the pacer will be inserted in the host with the exposed titanium engaging a non-muscular region of the body

  4. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2011-01-01

A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue-to-digital converter.

  5. Signal sampling circuit

    NARCIS (Netherlands)

    Louwsma, S.M.; Vertregt, Maarten

    2010-01-01

A sampling circuit for sampling a signal is disclosed. The sampling circuit comprises a plurality of sampling channels adapted to sample the signal in time-multiplexed fashion, each sampling channel comprising a respective track-and-hold circuit connected to a respective analogue-to-digital converter.

  6. Exome sequencing generates high quality data in non-target regions

    Directory of Open Access Journals (Sweden)

    Guo Yan

    2012-05-01

Full Text Available Abstract Background Exome sequencing using next-generation sequencing technologies is a cost-efficient approach to selectively sequencing coding regions of the human genome for detection of disease variants. A significant amount of DNA fragments from the capture process fall outside target regions, and sequence data for positions outside target regions have been mostly ignored after alignment. Result We performed whole exome sequencing on 22 subjects using Agilent SureSelect capture reagent and 6 subjects using Illumina TrueSeq capture reagent. We also downloaded sequencing data for 6 subjects from the 1000 Genomes Project Pilot 3 study. Using these data, we examined the quality of SNPs detected outside target regions by computing the consistency rate with genotypes obtained from SNP chips or the HapMap database, the transition-transversion (Ti/Tv) ratio, and the percentage of SNPs inside dbSNP. For all three platforms, we obtained high-quality SNPs outside target regions, and some far from target regions. In our Agilent SureSelect data, we obtained 84,049 high-quality SNPs outside target regions compared to 65,231 SNPs inside target regions (a 129% increase). For our Illumina TrueSeq data, we obtained 222,171 high-quality SNPs outside target regions compared to 95,818 SNPs inside target regions (a 232% increase). For the data from the 1000 Genomes Project, we obtained 7,139 high-quality SNPs outside target regions compared to 1,548 SNPs inside target regions (a 461% increase). Conclusions These results demonstrate that a significant amount of high-quality genotypes outside target regions can be obtained from exome sequencing data. These data should not be ignored in genetic epidemiology studies.
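The Ti/Tv quality metric used above is straightforward to compute (a generic sketch; the allele pairs below are made up, and the rule-of-thumb reference values in the comment come from general practice, not from this paper):

```python
def titv_ratio(snps):
    """Transition/transversion ratio for a list of (ref, alt) SNP alleles.
    Transitions swap within purines (A<->G) or within pyrimidines (C<->T);
    every other substitution is a transversion. Genome-wide, good human call
    sets typically show Ti/Tv around 2.0-2.1 (higher in coding regions)."""
    transitions = {("A", "G"), ("G", "A"), ("C", "T"), ("T", "C")}
    ti = sum(1 for s in snps if s in transitions)
    tv = len(snps) - ti
    return ti / tv if tv else float("inf")

ratio = titv_ratio([("A", "G"), ("C", "T"), ("G", "A"), ("A", "C"), ("T", "G")])
# 3 transitions and 2 transversions -> 1.5
```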

  7. Magnet Free Generators - 3rd Generation Wind Turbine Generators

    DEFF Research Database (Denmark)

    Jensen, Bogi Bech; Mijatovic, Nenad; Henriksen, Matthew Lee

    2013-01-01

    This paper presents an introduction to superconducting wind turbine generators, which are often referred to as 3rd generation wind turbine generators. Advantages and challenges of superconducting generators are presented with particular focus on possible weight and efficiency improvements. A comp...

  8. Trip generation characteristics of special generators

    Science.gov (United States)

    2010-03-01

    Special generators are introduced in the sequential four-step modeling procedure to represent certain types of facilities whose trip generation characteristics are not fully captured by the standard trip generation module. They are also used in the t...

  9. GET electronics samples data analysis

    International Nuclear Information System (INIS)

    Giovinazzo, J.; Goigoux, T.; Anvar, S.; Baron, P.; Blank, B.; Delagnes, E.; Grinyer, G.F.; Pancin, J.; Pedroza, J.L.; Pibernat, J.; Pollacco, E.; Rebii, A.

    2016-01-01

The General Electronics for TPCs (GET) has been developed to equip a generation of time projection chamber detectors for nuclear physics, and may also be used for a wider range of detector types. The goal of this paper is to propose initial analysis procedures to be applied to raw data samples from the GET system, in order to correct for systematic effects observed in test measurements. We also present a method to estimate the response function of the GET system channels. The response function is required in analyses where the input signal needs to be reconstructed, in terms of time distribution, from the registered output samples.

  10. The ocean sampling day consortium

    DEFF Research Database (Denmark)

    Kopf, Anna; Bicak, Mesude; Kottmann, Renzo

    2015-01-01

Ocean Sampling Day was initiated by the EU-funded Micro B3 (Marine Microbial Biodiversity, Bioinformatics, Biotechnology) project to obtain a snapshot of the marine microbial biodiversity and function of the world’s oceans. It is a simultaneous global mega-sequencing campaign aiming to generate the largest standardized microbial data set in a single day. This will be achievable only through the coordinated efforts of an Ocean Sampling Day Consortium, supportive partnerships and networks between sites. This commentary outlines the establishment, function and aims of the Consortium and describes our

  11. WRAP Module 1 sampling and analysis plan

    International Nuclear Information System (INIS)

    Mayancsik, B.A.

    1995-01-01

This document provides the methodology to sample, screen, and analyze waste that is generated or processed by, or is otherwise the responsibility of, the Waste Receiving and Processing Module 1 facility. This includes Low-Level Waste, Transuranic Waste, Mixed Waste, and Dangerous Waste

  12. WRAP Module 1 sampling and analysis plan

    Energy Technology Data Exchange (ETDEWEB)

    Mayancsik, B.A.

    1995-03-24

This document provides the methodology to sample, screen, and analyze waste that is generated or processed by, or is otherwise the responsibility of, the Waste Receiving and Processing Module 1 facility. This includes Low-Level Waste, Transuranic Waste, Mixed Waste, and Dangerous Waste.

  13. Wyoming CV Pilot Traveler Information Message Sample

    Data.gov (United States)

    Department of Transportation — This dataset contains a sample of the sanitized Traveler Information Messages (TIM) being generated by the Wyoming Connected Vehicle (CV) Pilot. The full set of TIMs...

  14. Automated sampling and control of gaseous simulations

    KAUST Repository

    Huang, Ruoguan; Keyser, John

    2013-01-01

    In this work, we describe a method that automates the sampling and control of gaseous fluid simulations. Several recent approaches have provided techniques for artists to generate high-resolution simulations based on a low-resolution simulation

  15. On the Sampling

    OpenAIRE

    Güleda Doğan

    2017-01-01

This editorial is on statistical sampling, which is one of the two most important reasons for editorial rejection from our journal, Turkish Librarianship. The stages of quantitative research, the stage at which sampling takes place, the importance of sampling for a research study, deciding on sample size, and sampling methods are summarised briefly.
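Deciding on sample size, one of the stages the editorial summarises, is often done with the classic formula for estimating a proportion (a generic textbook sketch; the 95% z-value, the worst-case p = 0.5 and the finite-population correction are standard choices, not taken from the editorial):

```python
import math

def srs_sample_size(margin, z=1.96, p=0.5, population=None):
    """Minimum simple-random-sample size to estimate a proportion to within
    +/- margin at the confidence level implied by z; optionally applies the
    finite-population correction for a known population size."""
    n0 = z ** 2 * p * (1 - p) / margin ** 2
    if population:
        n0 = n0 / (1 + (n0 - 1) / population)   # finite-population correction
    return math.ceil(n0)

n_infinite = srs_sample_size(0.05)                    # classic 385 for +/-5%
n_finite = srs_sample_size(0.05, population=1000)     # smaller for a finite frame
```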

  16. Information sampling behavior with explicit sampling costs

    Science.gov (United States)

    Juni, Mordechai Z.; Gureckis, Todd M.; Maloney, Laurence T.

    2015-01-01

    The decision to gather information should take into account both the value of information and its accrual costs in time, energy and money. Here we explore how people balance the monetary costs and benefits of gathering additional information in a perceptual-motor estimation task. Participants were rewarded for touching a hidden circular target on a touch-screen display. The target’s center coincided with the mean of a circular Gaussian distribution from which participants could sample repeatedly. Each “cue” — sampled one at a time — was plotted as a dot on the display. Participants had to repeatedly decide, after sampling each cue, whether to stop sampling and attempt to touch the hidden target or continue sampling. Each additional cue increased the participants’ probability of successfully touching the hidden target but reduced their potential reward. Two experimental conditions differed in the initial reward associated with touching the hidden target and the fixed cost per cue. For each condition we computed the optimal number of cues that participants should sample, before taking action, to maximize expected gain. Contrary to recent claims that people gather less information than they objectively should before taking action, we found that participants over-sampled in one experimental condition, and did not significantly under- or over-sample in the other. Additionally, while the ideal observer model ignores the current sample dispersion, we found that participants used it to decide whether to stop sampling and take action or continue sampling, a possible consequence of imperfect learning of the underlying population dispersion across trials. PMID:27429991
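The cost-benefit trade-off in this task can be written down directly (a simplified model in the spirit of the study, not the authors' exact ideal-observer analysis; the reward, per-cue cost, cue dispersion and target radius are invented values):

```python
import math

def expected_gain(n, reward, cost, sigma, radius):
    """Expected gain from sampling n cues and then aiming at their mean.
    The mean of n draws from a circular Gaussian (sd sigma per axis) misses
    the true center by a 2-D Gaussian error with sd sigma/sqrt(n), so the
    probability of landing inside a disc of the given radius is
    1 - exp(-n * radius^2 / (2 * sigma^2))."""
    p_hit = 1.0 - math.exp(-n * radius ** 2 / (2.0 * sigma ** 2))
    return (reward - cost * n) * p_hit

def optimal_cues(reward, cost, sigma, radius):
    """Number of cues maximizing expected gain (payoff is <= 0 beyond reward/cost)."""
    n_max = int(reward / cost)
    return max(range(1, n_max + 1),
               key=lambda n: expected_gain(n, reward, cost, sigma, radius))

n_star = optimal_cues(reward=100.0, cost=2.0, sigma=30.0, radius=10.0)
```

Each extra cue raises the hit probability but lowers the prize, which is exactly the stopping problem participants faced.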

  17. How Sample Size Affects a Sampling Distribution

    Science.gov (United States)

    Mulekar, Madhuri S.; Siegel, Murray H.

    2009-01-01

    If students are to understand inferential statistics successfully, they must have a profound understanding of the nature of the sampling distribution. Specifically, they must comprehend the determination of the expected value and standard error of a sampling distribution as well as the meaning of the central limit theorem. Many students in a high…
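The effect of sample size on a sampling distribution is easy to demonstrate by simulation (a minimal classroom-style sketch; the uniform population and the sample sizes 4 and 64 are arbitrary choices for the example):

```python
import random
import statistics

def standard_error_of_mean(population, n, reps, rng):
    """Empirical standard deviation of the sampling distribution of the
    sample mean, for samples of size n drawn with replacement."""
    means = [statistics.fmean(rng.choices(population, k=n)) for _ in range(reps)]
    return statistics.stdev(means)

rng = random.Random(42)
pop = [rng.uniform(0, 10) for _ in range(10000)]   # population sd is about 2.89
se_small = standard_error_of_mean(pop, 4, 3000, rng)
se_large = standard_error_of_mean(pop, 64, 3000, rng)
# by the central limit theorem the standard error shrinks like 1/sqrt(n),
# so quadrupling sqrt(n) (4 -> 64) cuts it roughly fourfold
```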

  18. Radioactivity in environmental samples

    International Nuclear Information System (INIS)

    Fornaro, Laura

    2001-01-01

The objective of this practical work is to familiarize the student with radioactivity measurements in environmental samples. For that purpose, the samples chosen were a salt of natural potassium, a salt of uranium or thorium, and a sample of drinking water

  19. DNA Sampling Hook

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The DNA Sampling Hook is a significant improvement on a method of obtaining a tissue sample from a live fish in situ from an aquatic environment. A tissue sample...

  20. Iowa Geologic Sampling Points

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Point locations of geologic samples/files in the IGS repository. Types of samples include well cuttings, outcrop samples, cores, drillers logs, measured sections,...

  1. Network and adaptive sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Combining the two statistical techniques of network sampling and adaptive sampling, this book illustrates the advantages of using them in tandem to effectively capture sparsely located elements in unknown pockets. It shows how network sampling is a reliable guide in capturing inaccessible entities through linked auxiliaries. The text also explores how adaptive sampling is strengthened in information content through subsidiary sampling with devices to mitigate unmanageable expanding sample sizes. Empirical data illustrates the applicability of both methods.

  2. Self generation, small generation, and embedded generation issues

    International Nuclear Information System (INIS)

    2001-01-01

    The New Brunswick Market Design Committee for electric power restructuring has been directed to examine issues regarding cogeneration and small-scale, on-site generation and how they will fit within the framework of the bilateral contract market. The Committee will also have to deal with issues of generation embedded in a distribution system. The Committee has defined cogeneration as the simultaneous production of electricity and useful thermal energy. Self-generation has been defined as small-scale power generation by an end-user, while embedded generation has been defined as a generation facility that is located within a distribution utility but is not directly connected to the transmission system. The Committee has postponed its decision on whether embedded generation will be eligible to participate under the bilateral contract market for electricity. This report discusses general issues such as the physical support of generation, market support of generation, transition issues and policy issues. It also discusses generation support issues such as operating reserves, transmission tariff issues, and distribution tariffs. Market support issues such as transmission access for generation sales were also considered, along with market access for generation sales, and net metering for behind the meter generation. 7 refs., 1 tab

  3. Generation Y Online Buying Patterns

    Directory of Open Access Journals (Sweden)

    Katija Vojvodić

    2015-12-01

    Full Text Available The advantages of electronic retailing can, among other things, result in uncontrolled buying by online consumers, i.e. in extreme buying behavior. The main purpose of this paper is to analyze and determine the buying patterns of Generation Y online consumers in order to explore the existence of different types of behavior based on the characteristics of online buying. The paper also aims at exploring the relationship between extracted factors and Generation Y consumers’ buying intentions. Empirical research was conducted on a sample of 515 consumers in the Dubrovnik-Neretva County. Based on the factor analysis, research results indicate that Generation Y online consumers are influenced by three factors: compulsivity, impulsivity, and functionality. The analysis of variance reveals that significant differences exist between the extracted factors and Generation Y’s online buying characteristics. In addition, correlation analysis shows a statistically significant correlation between the extracted factors and Generation Y’s buying intentions.

  4. Semantic attributes based texture generation

    Science.gov (United States)

    Chi, Huifang; Gan, Yanhai; Qi, Lin; Dong, Junyu; Madessa, Amanuel Hirpa

    2018-04-01

Semantic attributes are commonly used for texture description. They can describe the information of a texture, such as patterns, textons, distributions, brightness, and so on. Generally speaking, semantic attributes are more concrete descriptors than perceptual features. Therefore, it is practical to generate texture images from semantic attributes. In this paper, we propose to generate high-quality texture images from semantic attributes. Over the last two decades, several works have been done on texture synthesis and generation. Most of them focus on example-based texture synthesis and procedural texture generation; semantic-attribute-based texture generation still deserves more attention. Gan et al. proposed a useful joint model for perception-driven texture generation. However, perceptual features are nonobjective spatial statistics used by humans to distinguish different textures in pre-attentive situations. To give more descriptive information about texture appearance, semantic attributes, which are more in line with human description habits, are desired. In this paper, we use sigmoid cross entropy loss in an auxiliary model to provide enough information for a generator. Consequently, the discriminator is released from the relatively intractable mission of figuring out the joint distribution of condition vectors and samples. To demonstrate the validity of our method, we compare it to Gan et al.'s method through texture-generation experiments on PTD and DTD. All experimental results show that our model can generate textures from semantic attributes.

  5. Sampling procedures and tables

    International Nuclear Information System (INIS)

    Franzkowski, R.

    1980-01-01

Characteristics, defects, defectives - Sampling by attributes and by variables - Sample versus population - Frequency distributions for the number of defectives or the number of defects in the sample - Operating characteristic curve, producer's risk, consumer's risk - Acceptable quality level AQL - Average outgoing quality AOQ - Standard ISO 2859 - Fundamentals of sampling by variables for fraction defective. (RW)
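The quantities listed in this record (operating characteristic curve, producer's risk, consumer's risk) can be illustrated with a short binomial calculation. The plan parameters and quality levels below are arbitrary illustrative choices, not values taken from ISO 2859:

```python
from math import comb

def accept_prob(n, c, p):
    """Probability of accepting a lot under a single sampling plan
    (sample size n, acceptance number c) when the true fraction
    defective is p -- one point on the operating characteristic curve."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(c + 1))

# Hypothetical plan: n = 80, c = 2, with illustrative quality levels.
n, c = 80, 2
aql, ltpd = 0.01, 0.08
producers_risk = 1 - accept_prob(n, c, aql)   # risk of rejecting a good lot
consumers_risk = accept_prob(n, c, ltpd)      # risk of accepting a bad lot
```

Sweeping `p` over a grid of fraction-defective values traces out the full OC curve for the plan.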

  6. Leading Generation Y

    National Research Council Canada - National Science Library

    Newman, Jill M

    2008-01-01

    .... Whether referred to as the Millennial Generation, Generation Y or the Next Generation, the Army needs to consider the gap between Boomers, Generation X and the Soldiers that fill our junior ranks...

  7. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, C. M.; Buchhave, P.; K. George, W.

algorithms: sample-and-hold and the direct spectral estimator without residence time weighting. The computer-generated signal is a Poisson process with a sample rate proportional to velocity magnitude that consists of well-defined frequency content, which makes bias easy to spot. The idea...
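A toy version of the direct estimator on a Poisson-sampled sinusoid shows the basic mechanics. This is an illustrative sketch under assumed parameters (constant mean sample rate, simple normalization, no residence-time weighting), not the authors' implementation:

```python
import cmath
import math
import random

random.seed(1)

# Synthetic randomly sampled signal: Poisson arrival times carrying a
# 5 Hz sinusoid. Rate and duration are arbitrary illustrative choices.
f0, rate, duration = 5.0, 200.0, 20.0
times, t = [], 0.0
while True:
    t += random.expovariate(rate)        # exponential inter-arrival times
    if t >= duration:
        break
    times.append(t)
samples = [math.sin(2 * math.pi * f0 * ti) for ti in times]

def direct_spectrum(times, x, freqs):
    """Direct spectral estimator for irregularly sampled data:
    a plain sum of complex exponentials, no residence-time weighting."""
    n = len(x)
    span = times[-1] - times[0]
    spec = []
    for f in freqs:
        s = sum(xi * cmath.exp(-2j * math.pi * f * ti)
                for ti, xi in zip(times, x))
        spec.append(abs(s) ** 2 * span / n ** 2)
    return spec

spec = direct_spectrum(times, samples, [1.0, 5.0, 9.0])
# the 5 Hz bin should dominate the off-signal bins
```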

  8. Effective sample labeling

    International Nuclear Information System (INIS)

    Rieger, J.T.; Bryce, R.W.

    1990-01-01

Ground-water samples collected for hazardous-waste and radiological monitoring have come under strict regulatory and quality assurance requirements as a result of laws such as the Resource Conservation and Recovery Act. To comply with these laws, the labeling system used to identify environmental samples had to be upgraded to ensure proper handling and to protect collection personnel from exposure to sample contaminants and sample preservatives. The sample label now used at the Pacific Northwest Laboratory is a complete sample document. In the event other paperwork on a labeled sample were lost, the necessary information could be found on the label.

  9. Enhanced conformational sampling using enveloping distribution sampling.

    Science.gov (United States)

    Lin, Zhixiong; van Gunsteren, Wilfred F

    2013-10-14

Lessening the problem of insufficient conformational sampling in biomolecular simulations remains a major challenge in computational biochemistry. In this article, an application of the method of enveloping distribution sampling (EDS) is proposed that addresses this challenge and its sampling efficiency is demonstrated in simulations of a hexa-β-peptide whose conformational equilibrium encompasses two different helical folds, i.e., a right-handed 2.7(10∕12)-helix and a left-handed 3(14)-helix, separated by a high energy barrier. Standard MD simulations of this peptide using the GROMOS 53A6 force field did not reach convergence of the free enthalpy difference between the two helices even after 500 ns of simulation time. The use of soft-core non-bonded interactions in the centre of the peptide did enhance the number of transitions between the helices, but at the same time led to neglect of relevant helical configurations. In the simulations of a two-state EDS reference Hamiltonian that envelops both the physical peptide and the soft-core peptide, sampling of the conformational space of the physical peptide ensures that physically relevant conformations can be visited, and sampling of the conformational space of the soft-core peptide helps to enhance the transitions between the two helices. The EDS simulations sampled many more transitions between the two helices and showed much faster convergence of the relative free enthalpy of the two helices compared with the standard MD simulations, with only a slightly larger computational effort to determine optimized EDS parameters. Combined with various methods to smoothen the potential energy surface, the proposed EDS application will be a powerful technique to enhance the sampling efficiency in biomolecular simulations.

  10. A Consistent System for Coding Laboratory Samples

    Science.gov (United States)

    Sih, John C.

    1996-07-01

    A formal laboratory coding system is presented to keep track of laboratory samples. Preliminary useful information regarding the sample (origin and history) is gained without consulting a research notebook. Since this system uses and retains the same research notebook page number for each new experiment (reaction), finding and distinguishing products (samples) of the same or different reactions becomes an easy task. Using this system multiple products generated from a single reaction can be identified and classified in a uniform fashion. Samples can be stored and filed according to stage and degree of purification, e.g. crude reaction mixtures, recrystallized samples, chromatographed or distilled products.

  11. Sampling in practice

    DEFF Research Database (Denmark)

    Esbensen, Kim Harry; Petersen, Lars

    2005-01-01

    A basic knowledge of the Theory of Sampling (TOS) and a set of only eight sampling unit operations is all the practical sampler needs to ensure representativeness of samples extracted from all kinds of lots: production batches, - truckloads, - barrels, sub-division in the laboratory, sampling...... in nature and in the field (environmental sampling, forestry, geology, biology), from raw materials or manufactory processes etc. We here can only give a brief introduction to the Fundamental Sampling Principle (FSP) and these eight Sampling Unit Operations (SUO’s). Always respecting FSP and invoking only...... the necessary SUO’s (dependent on the practical situation) is the only prerequisite needed for eliminating all sampling bias and simultaneously minimizing sampling variance, and this is in addition a sure guarantee for making the final analytical results trustworthy. No reliable conclusions can be made unless...

  12. Integrative analysis of single nucleotide polymorphisms and gene expression efficiently distinguishes samples from closely related ethnic populations

    Directory of Open Access Journals (Sweden)

    Yang Hsin-Chou

    2012-07-01

    Full Text Available Abstract Background Ancestry informative markers (AIMs are a type of genetic marker that is informative for tracing the ancestral ethnicity of individuals. Application of AIMs has gained substantial attention in population genetics, forensic sciences, and medical genetics. Single nucleotide polymorphisms (SNPs, the materials of AIMs, are useful for classifying individuals from distinct continental origins but cannot discriminate individuals with subtle genetic differences from closely related ancestral lineages. Proof-of-principle studies have shown that gene expression (GE also is a heritable human variation that exhibits differential intensity distributions among ethnic groups. GE supplies ethnic information supplemental to SNPs; this motivated us to integrate SNP and GE markers to construct AIM panels with a reduced number of required markers and provide high accuracy in ancestry inference. Few studies in the literature have considered GE in this aspect, and none have integrated SNP and GE markers to aid classification of samples from closely related ethnic populations. Results We integrated a forward variable selection procedure into flexible discriminant analysis to identify key SNP and/or GE markers with the highest cross-validation prediction accuracy. By analyzing genome-wide SNP and/or GE markers in 210 independent samples from four ethnic groups in the HapMap II Project, we found that average testing accuracies for a majority of classification analyses were quite high, except for SNP-only analyses that were performed to discern study samples containing individuals from two close Asian populations. The average testing accuracies ranged from 0.53 to 0.79 for SNP-only analyses and increased to around 0.90 when GE markers were integrated together with SNP markers for the classification of samples from closely related Asian populations. Compared to GE-only analyses, integrative analyses of SNP and GE markers showed comparable testing
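The forward variable selection scheme this record describes can be sketched on synthetic data. A nearest-centroid classifier stands in for the paper's flexible discriminant analysis, and all data, feature counts, and effect sizes below are invented for illustration:

```python
import random

random.seed(0)

# Synthetic two-class data with six features; only features 0 and 3 are
# informative (hypothetical stand-ins for one SNP and one GE marker).
def make_sample(label):
    x = [random.gauss(0, 1) for _ in range(6)]
    if label == 1:
        x[0] += 2.0
        x[3] += 2.0
    return x

data = [(make_sample(lab), lab) for lab in [0, 1] * 60]

def cv_accuracy(feats, data, folds=5):
    """Cross-validated accuracy of a nearest-centroid classifier
    restricted to the feature subset `feats`."""
    correct = 0
    for i in range(folds):
        train = [d for j, d in enumerate(data) if j % folds != i]
        test = data[i::folds]
        cent = {}
        for lab in (0, 1):
            rows = [x for x, l in train if l == lab]
            cent[lab] = [sum(r[f] for r in rows) / len(rows) for f in feats]
        for x, l in test:
            dist = {lab: sum((x[f] - c) ** 2
                             for f, c in zip(feats, cent[lab]))
                    for lab in (0, 1)}
            correct += int(min(dist, key=dist.get) == l)
    return correct / len(data)

def forward_select(data, n_features, max_k=3):
    """Greedy forward selection: repeatedly add the feature that most
    improves cross-validated prediction accuracy."""
    selected, remaining = [], list(range(n_features))
    while remaining and len(selected) < max_k:
        best = max(remaining, key=lambda f: cv_accuracy(selected + [f], data))
        selected.append(best)
        remaining.remove(best)
    return selected

chosen = forward_select(data, 6)
```

With the shifts above, the two informative features are picked first, mirroring how integrating a second marker type raises classification accuracy.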

  13. Sampling of ore

    International Nuclear Information System (INIS)

    Boehme, R.C.; Nicholas, B.L.

    1987-01-01

This invention relates to a method of and apparatus for ore sampling. The method includes the steps of periodically removing a sample of the output material of a sorting machine, weighing each sample so that each is of the same weight, measuring a characteristic such as the radioactivity, magnetivity or the like of each sample, subjecting at least an equal portion of each sample to chemical analysis to determine the mineral content of the sample, and comparing the characteristic measurement with the mineral content of the chemically analysed portion of the sample to determine the characteristic/mineral ratio of the sample. The apparatus includes an ore sample collector, a deflector for deflecting a sample of ore particles from the output of an ore sorter into the collector, and means for moving the deflector from a first position, in which it is clear of the particle path from the sorter, to a second position, in which it is in the particle path, at predetermined time intervals and for predetermined time periods to deflect the sample particles into the collector. The apparatus conveniently includes an ore crusher for comminuting the sample particles, a sample hopper, means for weighing the hopper, a detector in the hopper for measuring a characteristic such as radioactivity, magnetivity or the like of particles in the hopper, a discharge outlet from the hopper, and means for feeding the particles from the collector to the crusher and then to the hopper.

  14. Genetic Sample Inventory

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database archives genetic tissue samples from marine mammals collected primarily from the U.S. east coast. The collection includes samples from field programs,...

  15. Superposition Enhanced Nested Sampling

    Directory of Open Access Journals (Sweden)

    Stefano Martiniani

    2014-08-01

    Full Text Available The theoretical analysis of many problems in physics, astronomy, and applied mathematics requires an efficient numerical exploration of multimodal parameter spaces that exhibit broken ergodicity. Monte Carlo methods are widely used to deal with these classes of problems, but such simulations suffer from a ubiquitous sampling problem: The probability of sampling a particular state is proportional to its entropic weight. Devising an algorithm capable of sampling efficiently the full phase space is a long-standing problem. Here, we report a new hybrid method for the exploration of multimodal parameter spaces exhibiting broken ergodicity. Superposition enhanced nested sampling combines the strengths of global optimization with the unbiased or athermal sampling of nested sampling, greatly enhancing its efficiency with no additional parameters. We report extensive tests of this new approach for atomic clusters that are known to have energy landscapes for which conventional sampling schemes suffer from broken ergodicity. We also introduce a novel parallelization algorithm for nested sampling.
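For reference, the plain nested sampling algorithm that the method builds on can be written in a few lines. This toy version (uniform prior, 1D Gaussian likelihood, rejection-based replacement) is a minimal Skilling-style sketch, not the superposition-enhanced variant of the paper:

```python
import math
import random

random.seed(2)

LO, HI = -10.0, 10.0                       # uniform prior support

def log_like(x):
    # unit-variance Gaussian likelihood centred at 0
    return -0.5 * x * x - 0.5 * math.log(2 * math.pi)

def logaddexp(a, b):
    if a == -math.inf:
        return b
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

def nested_sampling(n_live=100, n_iter=500):
    """Minimal nested sampling: repeatedly replace the worst live point
    with a prior draw above the current likelihood floor, shrinking the
    enclosed prior mass X by a factor exp(-1/n_live) per iteration."""
    live = [random.uniform(LO, HI) for _ in range(n_live)]
    ll = [log_like(x) for x in live]
    log_z = -math.inf
    log_shell = math.log(1.0 - math.exp(-1.0 / n_live))
    for i in range(n_iter):
        worst = min(range(n_live), key=lambda k: ll[k])
        l_star = ll[worst]
        # shell weight: X_{i-1} - X_i = exp(-i/n_live) * (1 - exp(-1/n_live))
        log_z = logaddexp(log_z, l_star + log_shell - i / n_live)
        while True:                        # rejection draw from the prior
            x = random.uniform(LO, HI)
            if log_like(x) > l_star:
                break
        live[worst], ll[worst] = x, log_like(x)
    # termination: spread the remaining prior mass over the live points
    log_x = -n_iter / n_live
    for l in ll:
        log_z = logaddexp(log_z, l + log_x - math.log(n_live))
    return log_z

log_z = nested_sampling()
# analytic evidence: Z = (1/20) * integral of N(x; 0, 1) ~= 0.05, log Z ~= -3.0
```

The rejection step is where broken ergodicity bites in real problems; the paper's contribution is a smarter way to generate those constrained draws.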

  16. Chorionic villus sampling

    Science.gov (United States)

    ... medlineplus.gov/ency/article/003406.htm Chorionic villus sampling To use the sharing features on this page, please enable JavaScript. Chorionic villus sampling (CVS) is a test some pregnant women have ...

  17. Sampling on Quasicrystals

    OpenAIRE

    Grepstad, Sigrid

    2011-01-01

    We prove that quasicrystals are universal sets of stable sampling in any dimension. Necessary and sufficient density conditions for stable sampling and interpolation sets in one dimension are studied in detail.

  18. Genetic Sample Inventory - NRDA

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database archives genetic tissue samples from marine mammals collected in the North-Central Gulf of Mexico from 2010-2015. The collection includes samples from...

  19. Sample design effects in landscape genetics

    Science.gov (United States)

    Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.

    2012-01-01

An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially-explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), number of alleles per locus (5 and 10), number of individuals sampled (10-300), and generational time after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that random, linear, and systematic sampling regimes performed well with high sample sizes (>200), levels of polymorphism (10 alleles per locus), and number of molecular markers (20). The cluster and single study site sampling regimes were not able to correctly identify the generating process under any conditions and thus, are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales and suggests careful consideration for sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.

  20. Test sample handling apparatus

    International Nuclear Information System (INIS)

    1981-01-01

    A test sample handling apparatus using automatic scintillation counting for gamma detection, for use in such fields as radioimmunoassay, is described. The apparatus automatically and continuously counts large numbers of samples rapidly and efficiently by the simultaneous counting of two samples. By means of sequential ordering of non-sequential counting data, it is possible to obtain precisely ordered data while utilizing sample carrier holders having a minimum length. (U.K.)

  1. Laboratory Sampling Guide

    Science.gov (United States)

    2012-05-11

environment, and by ingestion of foodstuffs that have incorporated C-14 by photosynthesis. Like tritium, C-14 is a very low energy beta emitter and is... bacterial growth and to minimize development of solids in the sample. • Properly identify each sample container with name, SSN, and collection start and... sampling in the same cardboard carton. The sample may be kept cool or frozen during collection to control odor and bacterial growth. • Once

  2. Generation Y preferences towards wine

    DEFF Research Database (Denmark)

    Chrysochou, Polymeros; Krystallis Krontalis, Athanasios; Mocanu, Ana

    2012-01-01

    Purpose – The purpose of this paper is to explore differences in wine preferences between Generation Y and older cohorts in the USA. Design/methodology/approach – A total of 260 US consumers participated in a web-based survey that took place in April 2010. The best-worst scaling method was applied...... measuring the level of importance given by participants to a list of most common attributes used in choice of wine. Independent sample t-tests were applied to compare the best-worst scores between Generation Y and older cohorts. Findings – Differences were found in the level of importance that Generation Y...... gives to wine attributes in comparison to older cohorts. Generation Y was found to attach more importance to attributes such as “Someone recommended it”, “Attractive front label” and “Promotional display in-store”, whereas older cohorts gave more importance to attributes such as “I read about it...

  3. Second harmonic generation microscopy

    DEFF Research Database (Denmark)

    Brüggemann, Dagmar Adeline; Brewer, Jonathan R.; Risbo, Jens

    2010-01-01

    Myofibers and collagen show non-linear optical properties enabling imaging using second harmonic generation (SHG) microscopy. The technique is evaluated for use as a tool for real-time studies of thermally induced changes in thin samples of unfixed and unstained pork. The forward and the backward...... scattered SHG light reveal complementary features of the structures of myofibers and collagen fibers. Upon heating the myofibers show no structural changes before reaching a temperature of 53 °C. At this temperature the SHG signal becomes extinct. The extinction of the SHG at 53 °C coincides with a low......-temperature endotherm peak observable in the differential scanning calorimetry (DSC) thermograms. DSC analysis of epimysium, the connective tissue layer that enfold skeletal muscles, produces one large endotherm starting at 57 °C and peaking at 59.5 °C. SHG microscopy of collagen fibers reveals a variability of thermal...

  4. Material sampling for rotor evaluation

    International Nuclear Information System (INIS)

    Mercaldi, D.; Parker, J.

    1990-01-01

    Decisions regarding continued operation of aging rotating machinery must often be made without adequate knowledge of rotor material conditions. Physical specimens of the material are not generally available due to lack of an appropriate sampling technique or the high cost and inconvenience of obtaining such samples. This is despite the fact that examination of such samples may be critical to effectively assess the degradation of mechanical properties of the components in service or to permit detailed examination of microstructure and surface flaws. Such information permits a reduction in the uncertainty of remaining life estimates for turbine rotors to avoid unnecessarily premature and costly rotor retirement decisions. This paper describes the operation and use of a recently developed material sampling device which machines and recovers an undeformed specimen from the surface of rotor bores or other components for metallurgical analysis. The removal of the thin, wafer-like sample has a negligible effect on the structural integrity of these components, due to the geometry and smooth surface finish of the resulting shallow depression. Samples measuring approximately 0.03 to 0.1 inches (0.76 to 2.5 mm) thick by 0.5 to 1.0 inch (1.3 to 2.5 cm) in diameter can be removed without mechanical deformation or thermal degradation of the sample or the remaining component material. The device is operated remotely from a control console and can be used externally or internally on any surface for which there is at least a three inch (7.6 cm) working clearance. Application of the device in two case studies of turbine-generator evaluations are presented

  5. Mars Sample Handling Functionality

    Science.gov (United States)

    Meyer, M. A.; Mattingly, R. L.

    2018-04-01

    The final leg of a Mars Sample Return campaign would be an entity that we have referred to as Mars Returned Sample Handling (MRSH.) This talk will address our current view of the functional requirements on MRSH, focused on the Sample Receiving Facility (SRF).

  6. IAEA Sampling Plan

    Energy Technology Data Exchange (ETDEWEB)

    Geist, William H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-15

    The objectives for this presentation are to describe the method that the IAEA uses to determine a sampling plan for nuclear material measurements; describe the terms detection probability and significant quantity; list the three nuclear materials measurement types; describe the sampling method applied to an item facility; and describe multiple method sampling.
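The link between detection probability and sample size can be illustrated with the standard attribute-sampling (hypergeometric) calculation. This is a generic textbook computation with made-up numbers, not the IAEA's actual procedure or parameter values:

```python
from math import ceil, comb

def detection_prob(N, d, n):
    """Probability that a random sample of n items from a population of N
    containing d defective items catches at least one defect
    (hypergeometric, sampling without replacement)."""
    return 1.0 - comb(N - d, n) / comb(N, n)

def sample_size(N, d, dp=0.95):
    """Smallest n achieving detection probability dp. The closed-form
    approximation n ~ N * (1 - (1 - dp)**(1/d)) gives the starting guess."""
    n = ceil(N * (1.0 - (1.0 - dp) ** (1.0 / d)))
    while detection_prob(N, d, n) < dp:
        n += 1
    while n > 1 and detection_prob(N, d, n - 1) >= dp:
        n -= 1
    return n

# e.g.: 100 items, detect at least 1 of 5 defective items with 95% probability
n = sample_size(100, 5, 0.95)
```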

  7. Developing Water Sampling Standards

    Science.gov (United States)

    Environmental Science and Technology, 1974

    1974-01-01

Participants in the D-19 symposium on aquatic sampling and measurement for water pollution assessment were informed that determining the extent of waste water stream pollution is not a cut-and-dried procedure. Topics discussed include field sampling, representative sampling from storm sewers, suggested sampler features and application of improved…

  8. Verification of Representative Sampling in RI waste

    International Nuclear Information System (INIS)

    Ahn, Hong Joo; Song, Byung Cheul; Sohn, Se Cheul; Song, Kyu Seok; Jee, Kwang Yong; Choi, Kwang Seop

    2009-01-01

For evaluating the radionuclide inventories of RI wastes, representative sampling is one of the most important parts of the radiochemical assay process. Sampling to characterize RI waste conditions has typically been based on judgment or convenience sampling of individuals or groups. However, it is difficult to obtain a representative sample from among the numerous drums. In addition, RI waste drums may be classified as heterogeneous wastes because they contain cotton, glass, vinyl, gloves, etc. In order to get representative samples, the sample to be analyzed must be collected from every selected drum. Considering the expense and time of analysis, however, the number of samples has to be minimized. In this study, RI waste drums were classified by various conditions: half-life, surface dose, acceptance date, waste form, generator, etc. A sample for radiochemical assay was obtained by mixing samples from each drum. The sample has to be prepared for radiochemical assay, and although it should be reasonably uniform, it is rare that a completely homogeneous material is received. Every sample is shredded to pieces of about 1 ∼ 2 cm², and a representative aliquot is taken for the required analysis. For verification of representative sampling, every classified group is tested to evaluate 'selection of a representative drum in a group' and 'representative sampling in a drum'.

  9. Distributed generation induction and permanent magnet generators

    CERN Document Server

    Lai, L

    2007-01-01

Distributed power generation is a technology that could help to enable efficient, renewable energy production both in the developed and developing world. It includes all use of small electric power generators, whether located on the utility system, at the site of a utility customer, or at an isolated site not connected to the power grid. Induction generators (IGs) are the cheapest and most commonly used technology, compatible with renewable energy resources. Permanent magnet (PM) generators have traditionally been avoided due to high fabrication costs; however, compared with IGs they are more reliable and productive. Distributed Generation thoroughly examines the principles, possibilities and limitations of creating energy with both IGs and PM generators. It takes an electrical engineering approach in the analysis and testing of these generators, and includes diagrams and extensive case study examples to better demonstrate how the integration of energy sources can be accomplished. The book also provides the ...

  10. Generalized sampling in Julia

    DEFF Research Database (Denmark)

    Jacobsen, Christian Robert Dahl; Nielsen, Morten; Rasmussen, Morten Grud

    2017-01-01

    Generalized sampling is a numerically stable framework for obtaining reconstructions of signals in different bases and frames from their samples. For example, one can use wavelet bases for reconstruction given frequency measurements. In this paper, we will introduce a carefully documented toolbox...... for performing generalized sampling in Julia. Julia is a new language for technical computing with focus on performance, which is ideally suited to handle the large size problems often encountered in generalized sampling. The toolbox provides specialized solutions for the setup of Fourier bases and wavelets....... The performance of the toolbox is compared to existing implementations of generalized sampling in MATLAB....

  11. Random pulse generator

    International Nuclear Information System (INIS)

    Guo Ya'nan; Jin Dapeng; Zhao Dixin; Liu Zhen'an; Qiao Qiao; Chinese Academy of Sciences, Beijing

    2007-01-01

Due to the randomness of radioactive decay and nuclear reactions, the signals from detectors are random in time, whereas a normal pulse generator generates periodic pulses. To measure the performance of nuclear electronic devices under random inputs, a random pulse generator is necessary. Types of random pulse generators are reviewed, and two digital random pulse generators are introduced. (authors)
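The timing randomness the abstract refers to is that of a homogeneous Poisson process: inter-arrival intervals are independent and exponentially distributed. A minimal software sketch (rate and duration are arbitrary illustrative parameters, not values from the paper):

```python
import random

random.seed(7)

def poisson_pulse_times(rate_hz, duration_s):
    """Arrival times of a homogeneous Poisson process: inter-arrival
    intervals are independent and exponentially distributed, mimicking
    detector pulses from radioactive decay at a constant mean count rate."""
    t, times = 0.0, []
    while True:
        t += random.expovariate(rate_hz)
        if t >= duration_s:
            return times
        times.append(t)

# hypothetical parameters: 1000 counts/s mean rate for 10 s
pulses = poisson_pulse_times(rate_hz=1000.0, duration_s=10.0)
```

Feeding such a pulse train into a device under test exercises it with realistic pile-up and dead-time conditions that a periodic generator cannot produce.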

  12. The Lunar Sample Compendium

    Science.gov (United States)

    Meyer, Charles

    2009-01-01

The Lunar Sample Compendium is a succinct summary of the data obtained from 40 years of study of Apollo and Luna samples of the Moon. Basic petrographic, chemical and age information is compiled, sample-by-sample, in the form of an advanced catalog in order to provide a basic description of each sample. The LSC can be found online using Google. The initial allocation of lunar samples was done sparingly, because it was realized that scientific techniques would improve over the years and new questions would be formulated. The LSC is important because it enables scientists to select samples within the context of the work that has already been done and facilitates better review of proposed allocations. It also provides backup material for public displays, captures information found only in abstracts, grey literature and curatorial databases, and serves as a ready point of access to the now-vast scientific literature.

  13. Image Sampling with Quasicrystals

    Directory of Open Access Journals (Sweden)

    Mark Grundland

    2009-07-01

    Full Text Available We investigate the use of quasicrystals in image sampling. Quasicrystals produce space-filling, non-periodic point sets that are uniformly discrete and relatively dense, thereby ensuring the sample sites are evenly spread out throughout the sampled image. Their self-similar structure can be attractive for creating sampling patterns endowed with a decorative symmetry. We present a brief general overview of the algebraic theory of cut-and-project quasicrystals based on the geometry of the golden ratio. To assess the practical utility of quasicrystal sampling, we evaluate the visual effects of a variety of non-adaptive image sampling strategies on photorealistic image reconstruction and non-photorealistic image rendering used in multiresolution image representations. For computer visualization of point sets used in image sampling, we introduce a mosaic rendering technique.
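The cut-and-project construction based on the golden ratio can be demonstrated in one dimension. This sketch projects Z² onto a line of slope 1/φ, keeping points whose orthogonal component falls in an acceptance window as wide as the projected unit cell (the window placement chosen here is one arbitrary option):

```python
import math

PHI = (1 + math.sqrt(5)) / 2

def fibonacci_quasicrystal(n_max):
    """Canonical cut-and-project set: project lattice points of Z^2 onto a
    line of slope 1/PHI, keeping those whose orthogonal component falls
    inside an acceptance window as wide as the projected unit square."""
    norm = math.hypot(1.0, 1.0 / PHI)
    c, s = 1.0 / norm, (1.0 / PHI) / norm    # direction cosines of the line
    window = c + s                           # width of the projected unit cell
    points = []
    for i in range(-n_max, n_max + 1):
        for j in range(-n_max, n_max + 1):
            perp = -s * i + c * j            # orthogonal component
            if 0.0 <= perp < window:
                points.append(c * i + s * j) # parallel component
    return sorted(points)

pts = fibonacci_quasicrystal(30)
inner = [x for x in pts if abs(x) <= 15]     # trim edge effects of the patch
gaps = sorted({round(b - a, 6) for a, b in zip(inner, inner[1:])})
# exactly two tile lengths occur, and their ratio is the golden ratio
```

The resulting point set is non-periodic yet uniformly discrete and relatively dense, which is exactly the property the paper exploits for evenly spread image sample sites (there in 2D).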

  14. Urine sample collection protocols for bioassay samples

    Energy Technology Data Exchange (ETDEWEB)

    MacLellan, J.A.; McFadden, K.M.

    1992-11-01

In vitro radiobioassay analyses are used to measure the amount of radioactive material excreted by personnel exposed to the potential intake of radioactive material. The analytical results are then used with various metabolic models to estimate the amount of radioactive material in the subject's body and the original intake of radioactive material. Proper application of these metabolic models requires knowledge of the excretion period. It is normal practice to design the bioassay program based on a 24-hour excretion sample. The Hanford bioassay program simulates a total 24-hour urine excretion sample with urine collection periods lasting from one-half hour before retiring to one-half hour after rising on two consecutive days. Urine passed during the specified periods is collected in three 1-L bottles. Because the daily excretion volume given in Publication 23 of the International Commission on Radiological Protection (ICRP 1975, p. 354) for Reference Man is 1.4 L, it was proposed to use only two 1-L bottles as a cost-saving measure. This raised the broader question of what should be the design capacity of a 24-hour urine sample kit.

  16. Next-Generation Pathology.

    Science.gov (United States)

    Caie, Peter D; Harrison, David J

    2016-01-01

    The field of pathology is rapidly transforming from a semiquantitative and empirical science toward a big data discipline. Large data sets from across multiple omics fields may now be extracted from a patient's tissue sample. Tissue is, however, complex, heterogeneous, and prone to artifact. A reductionist view of tissue and disease progression, which does not take this complexity into account, may lead to single biomarkers failing in clinical trials. The integration of standardized multi-omics big data and the retention of valuable information on spatial heterogeneity are imperative to model complex disease mechanisms. Mathematical modeling through systems pathology approaches is the ideal medium to distill the significant information from these large, multi-parametric, and hierarchical data sets. Systems pathology may also predict the dynamical response of disease progression or response to therapy regimens from a static tissue sample. Next-generation pathology will incorporate big data with systems medicine in order to personalize clinical practice for both prognostic and predictive patient care.

  17. Unbiased Sampling and Meshing of Isosurfaces

    KAUST Repository

    Yan, Dongming

    2014-05-07

    In this paper, we present a new technique to generate unbiased samples on isosurfaces. An isosurface, F(x,y,z) = c, of a function, F, is implicitly defined by trilinear interpolation of background grid points. The key idea of our approach is that of treating the isosurface within a grid cell as a graph (height) function in one of the three coordinate axis directions, restricted to where the slope is not too high, and integrating/sampling from each of these three. We use this unbiased sampling algorithm for applications in Monte Carlo integration, Poisson-disk sampling, and isosurface meshing.
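The graph-function idea can be sketched in a few lines. The following toy sampler is an illustration under stated assumptions, not the paper's implementation: it treats a patch z = g(x, y) as a height function and uses rejection against a slope bound so that accepted points are uniform per unit *surface* area, mirroring the restriction to directions where the slope is not too high.

```python
import math
import random

# Illustrative sketch: unbiased sampling on a surface patch z = g(x, y)
# treated as a height (graph) function. The surface area element is
# sqrt(1 + gx^2 + gy^2); rejection against the maximum allowed area element
# keeps accepted points uniform per unit surface area.
def sample_on_graph(g, gx, gy, n, slope_bound=1.0, seed=0):
    rng = random.Random(seed)
    w_max = math.sqrt(1.0 + 2.0 * slope_bound ** 2)  # largest area element allowed
    pts = []
    while len(pts) < n:
        x, y = rng.random(), rng.random()
        if abs(gx(x, y)) > slope_bound or abs(gy(x, y)) > slope_bound:
            continue  # too steep for this axis; a full implementation switches axes
        w = math.sqrt(1.0 + gx(x, y) ** 2 + gy(x, y) ** 2)
        if rng.random() < w / w_max:  # accept in proportion to surface area
            pts.append((x, y, g(x, y)))
    return pts

# Example: a gently sloped paraboloid patch over the unit square
pts = sample_on_graph(lambda x, y: 0.25 * (x * x + y * y),
                      lambda x, y: 0.5 * x,
                      lambda x, y: 0.5 * y,
                      n=100)
```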

  18. Unbiased Sampling and Meshing of Isosurfaces

    KAUST Repository

    Yan, Dongming; Wallner, Johannes; Wonka, Peter

    2014-01-01

    In this paper, we present a new technique to generate unbiased samples on isosurfaces. An isosurface, F(x,y,z) = c, of a function, F, is implicitly defined by trilinear interpolation of background grid points. The key idea of our approach is that of treating the isosurface within a grid cell as a graph (height) function in one of the three coordinate axis directions, restricted to where the slope is not too high, and integrating/sampling from each of these three. We use this unbiased sampling algorithm for applications in Monte Carlo integration, Poisson-disk sampling, and isosurface meshing.

  19. Power Generation for River and Tidal Generators

    Energy Technology Data Exchange (ETDEWEB)

    Muljadi, Eduard [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wright, Alan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gevorgian, Vahan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Donegan, James [Ocean Renewable Power Company (ORPC), Portland, ME (United States); Marnagh, Cian [Ocean Renewable Power Company (ORPC), Portland, ME (United States); McEntee, Jarlath [Ocean Renewable Power Company (ORPC), Portland, ME (United States)

    2016-06-01

    Renewable energy sources are the second largest contributor to global electricity production, after fossil fuels. The integration of renewable energy continued to grow in 2014 against a backdrop of increasing global energy consumption and a dramatic decline in oil prices during the second half of the year. As renewable generation has become less expensive during recent decades and more widely accepted by the global population, the focus on renewable generation has expanded from primarily wind and solar to include new types with promising future applications, such as hydropower generation, including river and tidal generation. Today, hydropower is considered one of the most important renewable energy sources. In river and tidal generation, the input resource flow is slower but also steadier than it is in wind or solar generation, yet the level of turbulence in the water flow may vary from one place to another. This report focuses on hydrokinetic power conversion.

  20. An Investigation of the Sampling Distribution of the Congruence Coefficient.

    Science.gov (United States)

    Broadbooks, Wendy J.; Elmore, Patricia B.

    This study developed and investigated an empirical sampling distribution of the congruence coefficient. The effects of sample size, number of variables, and population value of the congruence coefficient on the sampling distribution of the congruence coefficient were examined. Sample data were generated on the basis of the common factor model and…
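For readers unfamiliar with the statistic, the congruence coefficient (Tucker's phi) between two factor-loading vectors is a cosine-like similarity. A minimal version of the statistic whose sampling distribution the study examines might look like:

```python
import math

# Tucker's congruence coefficient between two loading vectors:
# phi = sum(x_i * y_i) / sqrt(sum(x_i^2) * sum(y_i^2))
def congruence(x, y):
    num = sum(a * b for a, b in zip(x, y))
    den = math.sqrt(sum(a * a for a in x) * sum(b * b for b in y))
    return num / den

print(congruence([0.8, 0.7, 0.6], [0.8, 0.7, 0.6]))  # identical loadings -> 1.0
```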

  1. Galaxy LIMS for next-generation sequencing

    NARCIS (Netherlands)

    Scholtalbers, J.; Rossler, J.; Sorn, P.; Graaf, J. de; Boisguerin, V.; Castle, J.; Sahin, U.

    2013-01-01

    SUMMARY: We have developed a laboratory information management system (LIMS) for a next-generation sequencing (NGS) laboratory within the existing Galaxy platform. The system provides lab technicians standard and customizable sample information forms, barcoded submission forms, tracking of input

  2. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria

  3. Statistical sampling strategies

    International Nuclear Information System (INIS)

    Andres, T.H.

    1987-01-01

    Systems assessment codes use mathematical models to simulate natural and engineered systems. Probabilistic systems assessment codes carry out multiple simulations to reveal the uncertainty in values of output variables due to uncertainty in the values of the model parameters. In this paper, methods are described for sampling sets of parameter values to be used in a probabilistic systems assessment code. Three Monte Carlo parameter selection methods are discussed: simple random sampling, Latin hypercube sampling, and sampling using two-level orthogonal arrays. Three post-selection transformations are also described: truncation, importance transformation, and discretization. Advantages and disadvantages of each method are summarized
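Of the three selection methods mentioned, Latin hypercube sampling is perhaps the easiest to sketch. The implementation below is illustrative, not the code used by any particular assessment program:

```python
import random

# A minimal Latin hypercube sampler: for n runs and d parameters, each
# parameter's [0, 1) range is split into n equal-probability intervals and
# each interval is used exactly once, at a random position within it.
def latin_hypercube(n, d, seed=42):
    rng = random.Random(seed)
    columns = []
    for _ in range(d):
        col = [(i + rng.random()) / n for i in range(n)]  # one value per stratum
        rng.shuffle(col)                                  # decouple the dimensions
        columns.append(col)
    return list(zip(*columns))  # n points in [0, 1)^d

pts = latin_hypercube(10, 3)
```

Mapping each coordinate through an inverse CDF then yields stratified samples from any target parameter distribution.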

  4. Statistical distribution sampling

    Science.gov (United States)

    Johnson, E. S.

    1975-01-01

    Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.

  5. Comparison of sampling techniques for use in SYVAC

    International Nuclear Information System (INIS)

    Dalrymple, G.J.

    1984-01-01

    The Stephen Howe review (reference TR-STH-1) recommended the use of a deterministic generator (DG) sampling technique for sampling the input values to the SYVAC (SYstems Variability Analysis Code) program. This technique was compared with Monte Carlo simple random sampling (MC) by taking a 1000 run case of SYVAC using MC as the reference case. The results show that DG appears relatively inaccurate for most values of consequence when used with 11 sample intervals. If 22 sample intervals are used then DG generates cumulative distribution functions that are statistically similar to the reference distribution. 400 runs of DG or MC are adequate to generate a representative cumulative distribution function. The MC technique appears to perform better than DG for the same number of runs. However, the DG predicts higher doses and in view of the importance of generating data in the high dose region this sampling technique with 22 sample intervals is recommended for use in SYVAC. (author)
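The comparison of cumulative distribution functions can be illustrated with a toy version of the two schemes. SYVAC's actual deterministic generator is more elaborate; only the one-value-per-interval idea is kept here:

```python
import random

# Toy comparison: Monte Carlo simple random sampling (MC) versus a
# deterministic generator (DG) that takes one value from each of k
# equal-probability intervals of the uniform(0, 1) input distribution.
def mc_sample(n, rng):
    return [rng.random() for _ in range(n)]

def dg_sample(k):
    return [(i + 0.5) / k for i in range(k)]  # midpoint of each interval

def max_cdf_error(xs):
    # maximum deviation of the empirical CDF from the uniform(0, 1) CDF
    n = len(xs)
    xs = sorted(xs)
    return max(max(abs((i + 1) / n - x), abs(x - i / n)) for i, x in enumerate(xs))

print(max_cdf_error(dg_sample(22)))                    # 0.5/22, about 0.023
print(max_cdf_error(mc_sample(22, random.Random(1))))  # typically larger
```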

  6. Analysis of arsenical metabolites in biological samples.

    Science.gov (United States)

    Hernandez-Zavala, Araceli; Drobna, Zuzana; Styblo, Miroslav; Thomas, David J

    2009-11-01

    Quantitation of iAs and its methylated metabolites in biological samples provides dosimetric information needed to understand dose-response relations. Here, methods are described for separation of inorganic and mono-, di-, and trimethylated arsenicals by thin layer chromatography. This method has been extensively used to track the metabolism of the radionuclide [73As] in a variety of in vitro assay systems. In addition, a hydride generation-cryotrapping-gas chromatography-atomic absorption spectrometric method is described for the quantitation of arsenicals in biological samples. This method uses pH-selective hydride generation to differentiate among arsenicals containing trivalent or pentavalent arsenic.

  7. Strategic management of steam generators

    International Nuclear Information System (INIS)

    Hernalsteen, P.; Berthe, J.

    1991-01-01

    This paper addresses the general approach followed in Belgium for managing any kind of generic defect affecting a steam generator tube bundle. This involves the successive steps of: problem detection, dedicated sample monitoring, implementation of preventive methods, development of specific plugging criteria, dedicated 100% inspection, implementation of repair methods, adjusted sample monitoring, and repair versus replacement strategy. These steps are illustrated by the particular case of primary water stress corrosion cracking in tube roll transitions, which is presently the main problem for the two Belgian units Doel-3 and Tihange-2. (author)

  8. Big Data, Small Sample.

    Science.gov (United States)

    Gerlovina, Inna; van der Laan, Mark J; Hubbard, Alan

    2017-05-20

    Multiple comparisons and small sample size, common characteristics of many types of "Big Data" including those that are produced by genomic studies, present specific challenges that affect reliability of inference. Use of multiple testing procedures necessitates calculation of very small tail probabilities of a test statistic distribution. Results based on large deviation theory provide a formal condition that is necessary to guarantee error rate control given practical sample sizes, linking the number of tests and the sample size; this condition, however, is rarely satisfied. Using methods that are based on Edgeworth expansions (relying especially on the work of Peter Hall), we explore the impact of departures of sampling distributions from typical assumptions on actual error rates. Our investigation illustrates how far the actual error rates can be from the declared nominal levels, suggesting potentially widespread problems with error rate control, specifically excessive false positives. This is an important factor that contributes to the "reproducibility crisis". We also review some other commonly used methods (such as permutation and methods based on finite sampling inequalities) in their application to multiple testing/small sample data. We point out that Edgeworth expansions, providing higher order approximations to the sampling distribution, offer a promising direction for data analysis that could improve reliability of studies relying on large numbers of comparisons with modest sample sizes.
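The departure of actual error rates from nominal levels is easy to reproduce in a small simulation. The setup below is illustrative only (a single one-sided t-test on skewed data), not the paper's genomic multiple-testing scenario:

```python
import math
import random
import statistics

# With a skewed parent distribution and a small sample, the t-test's actual
# type I error rate departs from the nominal 5% level -- the kind of
# deviation that Edgeworth expansions quantify.
def t_stat(sample, mu0):
    n = len(sample)
    return (statistics.mean(sample) - mu0) / (statistics.stdev(sample) / math.sqrt(n))

rng = random.Random(0)
reps, n = 20000, 10
T_CRIT = -1.833  # one-sided 5% critical value of t with 9 degrees of freedom
rejections = sum(
    t_stat([rng.expovariate(1.0) for _ in range(n)], mu0=1.0) < T_CRIT
    for _ in range(reps)
)
rate = rejections / reps
print(rate)  # well above the nominal 0.05 for this skewed parent
```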

  9. Sampling system and method

    Science.gov (United States)

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  10. Simple street tree sampling

    Science.gov (United States)

    David J. Nowak; Jeffrey T. Walton; James Baldwin; Jerry. Bond

    2015-01-01

    Information on street trees is critical for management of this important resource. Sampling of street tree populations provides an efficient means to obtain street tree population information. Long-term repeat measures of street tree samples supply additional information on street tree changes and can be used to report damages from catastrophic events. Analyses of...

  11. Sampling or gambling

    Energy Technology Data Exchange (ETDEWEB)

    Gy, P.M.

    1981-12-01

    Sampling can be compared to no other technique. A mechanical sampler must above all be selected according to its aptitude for suppressing or reducing all components of the sampling error. Sampling is said to be correct when it gives all elements making up the batch of matter submitted to sampling a uniform probability of being selected. A sampler must be correctly designed, built, installed, operated and maintained. When the conditions of sampling correctness are not strictly respected, the sampling error can no longer be controlled and can, unknown to the user, be unacceptably large: the sample is no longer representative. The implementation of an incorrect sampler is a form of gambling, and this paper intends to show that at this game the user is nearly always the loser in the long run. The users' and the manufacturers' interests may diverge, and the standards which should safeguard the users' interests very often fail to do so by tolerating or even recommending incorrect techniques, such as the implementation of too-narrow cutters traveling too fast through the stream to be sampled.
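Gy's correctness criterion, a uniform selection probability for every element of the batch, has a well-known software analogue: reservoir sampling guarantees exactly this property for a stream of unknown length. A minimal sketch:

```python
import random

# Reservoir sampling: every item in the stream ends up in the sample of
# size k with the same probability k/N, regardless of stream length N --
# the software counterpart of Gy's sampling-correctness criterion.
def reservoir_sample(stream, k, seed=0):
    rng = random.Random(seed)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            j = rng.randrange(i + 1)  # item i is kept with probability k/(i+1)
            if j < k:
                reservoir[j] = item
    return reservoir

print(reservoir_sample(range(1000), 5))
```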

  12. Sample pretreatment in microsystems

    DEFF Research Database (Denmark)

    Perch-Nielsen, Ivan R.

    2003-01-01

    When a sample, e.g. from a patient, is processed using conventional methods, the sample must be transported to the laboratory where it is analyzed, after which the results are sent back. By integrating the separate steps of the analysis in a micro total analysis system (μTAS), results can be obtained faster and better, preferably with all the processes from sample to signal moved to the bedside of the patient. Of course there is still much to learn and study in the process of miniaturization. DNA analysis is one process subject to integration. There are roughly three steps in a DNA analysis: sample preparation → DNA amplification → DNA analysis. The overall goal of the project is integration of as many as possible of these steps. This thesis covers mainly pretreatment in a microchip. Some methods for sample pretreatment have been tested. Most conventional is fluorescence activated cell sorting...

  13. Biological sample collector

    Science.gov (United States)

    Murphy, Gloria A [French Camp, CA

    2010-09-07

    A biological sample collector is adapted to a collect several biological samples in a plurality of filter wells. A biological sample collector may comprise a manifold plate for mounting a filter plate thereon, the filter plate having a plurality of filter wells therein; a hollow slider for engaging and positioning a tube that slides therethrough; and a slide case within which the hollow slider travels to allow the tube to be aligned with a selected filter well of the plurality of filter wells, wherein when the tube is aligned with the selected filter well, the tube is pushed through the hollow slider and into the selected filter well to sealingly engage the selected filter well and to allow the tube to deposit a biological sample onto a filter in the bottom of the selected filter well. The biological sample collector may be portable.

  14. Magnetohydrodynamic (MHD) power generation

    International Nuclear Information System (INIS)

    Chandra, Avinash

    1980-01-01

    The concept of MHD power generation, principles of operation of the MHD generator, its design, types, MHD generator cycles, technological problems to be overcome, and the current state of the art in the USA and USSR are described. Progress of India's experimental 5 MW water-gas fired open cycle MHD power generator project is reported in brief. (M.G.B.)

  15. Talkin' 'bout My Generation

    Science.gov (United States)

    Rickes, Persis C.

    2010-01-01

    The monikers are many: (1) "Generation Y"; (2) "Echo Boomers"; (3) "GenMe"; (4) the "Net Generation"; (5) "RenGen"; and (6) "Generation Next". One name that appears to be gaining currency is "Millennials," perhaps as a way to better differentiate the current generation from its…

  16. Work Values across Generations

    Science.gov (United States)

    Hansen, Jo-Ida C.; Leuty, Melanie E.

    2012-01-01

    Mainstream publication discussions of differences in generational cohorts in the workplace suggest that individuals of more recent generations, such as Generation X and Y, have different work values than do individuals of the Silent and Baby Boom generations. Although extant research suggests that age may influence work values, few of the…

  17. Minding the Generation Gap

    Science.gov (United States)

    Field, John

    2011-01-01

    Generational conflict is back. After years of relative silence, and mutual ignorance, the young and old are once more at war. With youth unemployment high on the political agenda, the fortunes of the "jobless generation" are being contrasted with those of the "golden generation" of baby boomers, but is one generation really…

  18. Authentication of forensic DNA samples.

    Science.gov (United States)

    Frumkin, Dan; Wasserstrom, Adam; Davidson, Ariane; Grafit, Arnon

    2010-02-01

    Over the past twenty years, DNA analysis has revolutionized forensic science, and has become a dominant tool in law enforcement. Today, DNA evidence is key to the conviction or exoneration of suspects of various types of crime, from theft to rape and murder. However, the disturbing possibility that DNA evidence can be faked has been overlooked. It turns out that standard molecular biology techniques such as PCR, molecular cloning, and recently developed whole genome amplification (WGA), enable anyone with basic equipment and know-how to produce practically unlimited amounts of in vitro synthesized (artificial) DNA with any desired genetic profile. This artificial DNA can then be applied to surfaces of objects or incorporated into genuine human tissues and planted in crime scenes. Here we show that the current forensic procedure fails to distinguish between such samples of blood, saliva, and touched surfaces with artificial DNA, and corresponding samples with in vivo generated (natural) DNA. Furthermore, genotyping of both artificial and natural samples with Profiler Plus® yielded full profiles with no anomalies. In order to effectively deal with this problem, we developed an authentication assay, which distinguishes between natural and artificial DNA based on methylation analysis of a set of genomic loci: in natural DNA, some loci are methylated and others are unmethylated, while in artificial DNA all loci are unmethylated. The assay was tested on natural and artificial samples of blood, saliva, and touched surfaces, with complete success. Adopting an authentication assay for casework samples as part of the forensic procedure is necessary for maintaining the high credibility of DNA evidence in the judiciary system.
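The authentication logic described, methylated loci present in natural DNA but absent in WGA-amplified artificial DNA, reduces to a simple decision rule. The locus names below are invented for illustration; the real assay of course involves wet-lab methylation analysis, not a dictionary lookup:

```python
# Toy decision rule distilled from the abstract's finding: natural DNA shows
# methylation at some assayed loci, while in vitro synthesized (artificial)
# DNA is unmethylated at every locus. Locus names here are hypothetical.
def classify_dna(methylation_calls):
    """methylation_calls: dict mapping locus name -> True if methylated."""
    return "natural" if any(methylation_calls.values()) else "artificial"

natural = {"locus_A": True, "locus_B": False, "locus_C": True}
artificial = {"locus_A": False, "locus_B": False, "locus_C": False}
print(classify_dna(natural), classify_dna(artificial))
```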

  19. Generational Reproduction of child abuse

    OpenAIRE

    García Ampudia, Lupe; Orellana M., Oswaldo; Pomalaya V., Ricardo; Yanac Reynoso, Elisa; Malaver S., Carmela; Herrera F., Edgar; Sotelo L., Noemi; Campos C., Lilia; Sotelo L., Lidia; Orellana García, Daphne; Velasquez M., Katherine

    2014-01-01

    The objective of this research is to study abuse across two generations, parents and children, and to establish the relationship between a history of childhood abuse and the potential for abuse. The sample comprises 441 students and 303 parents who agreed to answer the Memories of Abuse Questionnaire. The instruments used were the Child History Questionnaire, adapted for the purpose of this research, and the Inventory of Potential Child Abuse (Milner, J. 1977), adapted by De Paul, Arru...

  20. Gamma ray generator

    Science.gov (United States)

    Firestone, Richard B; Reijonen, Jani

    2014-05-27

    An embodiment of a gamma ray generator includes a neutron generator and a moderator. The moderator is coupled to the neutron generator. The moderator includes a neutron capture material. In operation, the neutron generator produces neutrons and the neutron capture material captures at least some of the neutrons to produce gamma rays. An application of the gamma ray generator is as a source of gamma rays for calibration of gamma ray detectors.

  1. Sampling from the normal and exponential distributions

    International Nuclear Information System (INIS)

    Chaplin, K.R.; Wills, C.A.

    1982-01-01

    Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions, and for each of these developing a method of sampling usually based on an acceptance rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms
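A classic acceptance-rejection construction of this kind can be sketched in a few lines. This is the textbook exponential-envelope method for the normal distribution, not necessarily the exact subregion scheme of the report:

```python
import math
import random

# Acceptance-rejection sampling of a standard normal: draw a half-normal
# using an Exp(1) proposal, accepting with probability exp(-(x - 1)^2 / 2),
# then attach a random sign.
def normal_by_rejection(rng):
    while True:
        x = rng.expovariate(1.0)                      # exponential proposal
        if rng.random() <= math.exp(-0.5 * (x - 1.0) ** 2):
            return x if rng.random() < 0.5 else -x    # random sign

rng = random.Random(123)
draws = [normal_by_rejection(rng) for _ in range(2000)]
```

The acceptance ratio follows from bounding the half-normal density by a multiple of the Exp(1) density, the same envelope idea that underlies the per-subregion schemes described above.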

  2. PFP Wastewater Sampling Facility

    International Nuclear Information System (INIS)

    Hirzel, D.R.

    1995-01-01

    This test report documents the results obtained while conducting operational testing of the sampling equipment in the 225-WC building, the PFP Wastewater Sampling Facility. The Wastewater Sampling Facility houses equipment to sample and monitor the PFP's liquid effluents before the stream is discharged to the 200 Area Treated Effluent Disposal Facility (TEDF). The majority of the streams are not radioactive and are discharged from the PFP heating, ventilation, and air conditioning (HVAC) systems. The streams that might be contaminated are processed through the Low Level Waste Treatment Facility (LLWTF) before discharging to TEDF. The sampling equipment consists of two flow-proportional composite samplers, an ultrasonic flowmeter, pH and conductivity monitors, a chart recorder, and associated relays and current isolators to interconnect the equipment to allow proper operation. Data signals from the monitors are received in the 234-5Z Shift Office, which contains a chart recorder and alarm annunciator panel. The data signals are also duplicated and sent to the TEDF control room through the Local Control Unit (LCU). Performing the OTP verified the operability of the PFP wastewater sampling system. This Operability Test Report documents the acceptance of the sampling system for use

  3. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international  forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  4. Focused conformational sampling in proteins

    Science.gov (United States)

    Bacci, Marco; Langini, Cassiano; Vymětal, Jiří; Caflisch, Amedeo; Vitalis, Andreas

    2017-11-01

    A detailed understanding of the conformational dynamics of biological molecules is difficult to obtain by experimental techniques due to resolution limitations in both time and space. Computer simulations avoid these in theory but are often too short to sample rare events reliably. Here we show that the progress index-guided sampling (PIGS) protocol can be used to enhance the sampling of rare events in selected parts of biomolecules without perturbing the remainder of the system. The method is very easy to use as it only requires as essential input a set of several features representing the parts of interest sufficiently. In this feature space, new states are discovered by spontaneous fluctuations alone and in unsupervised fashion. Because there are no energetic biases acting on phase space variables or projections thereof, the trajectories PIGS generates can be analyzed directly in the framework of transition networks. We demonstrate the possibility and usefulness of such focused explorations of biomolecules with two loops that are part of the binding sites of bromodomains, a family of epigenetic "reader" modules. This real-life application uncovers states that are structurally and kinetically far away from the initial crystallographic structures and are also metastable. Representative conformations are intended to be used in future high-throughput virtual screening campaigns.

  5. Sample Return Robot

    Data.gov (United States)

    National Aeronautics and Space Administration — This Challenge requires demonstration of an autonomous robotic system to locate and collect a set of specific sample types from a large planetary analog area and...

  6. Ecotoxicology statistical sampling

    International Nuclear Information System (INIS)

    Saona, G.

    2012-01-01

    This presentation introduces general concepts in ecotoxicological sampling design, such as characterizing the distribution of organic or inorganic contaminants, microbiological contamination, and the positioning of ecotoxicological bioassays within an ecosystem.

  7. Mini MAX - Medicaid Sample

    Data.gov (United States)

    U.S. Department of Health & Human Services — To facilitate wider use of MAX, CMS contracted with Mathematica to convene a technical expert panel (TEP) and determine the feasibility of creating a sample file for...

  8. Collecting Samples for Testing

    Science.gov (United States)

    ...that used for CSF in that they require aspiration of a sample of the fluid through a ...

  9. Roadway sampling evaluation.

    Science.gov (United States)

    2014-09-01

    The Florida Department of Transportation (FDOT) has traditionally required that all sampling and testing of asphalt mixtures be performed at the Contractor's production facility. With recent staffing cuts, as well as budget reductions, FDOT has been cons...

  10. Soil Gas Sampling

    Science.gov (United States)

    Field Branches Quality System and Technical Procedures: This document describes general and specific procedures, methods and considerations to be used and observed when collecting soil gas samples for field screening or laboratory analysis.

  11. Soil Sampling Operating Procedure

    Science.gov (United States)

    EPA Region 4 Science and Ecosystem Support Division (SESD) document that describes general and specific procedures, methods, and considerations when collecting soil samples for field screening or laboratory analysis.

  12. HTGR steam generator development

    International Nuclear Information System (INIS)

    Schuetzenduebel, W.G.; Hunt, P.S.; Weber, M.

    1976-01-01

    More than 40 gas-cooled reactor plants have produced in excess of 400 reactor-years of operating experience, which has demonstrated a reasonably high rate of steam generator availability. The steam generators used in these reactors include single U-tube and straight-tube steam generators as well as meander-type and helically wound or involute-tube steam generators. It appears that modern reactors are being equipped with helically wound steam generators of the once-through type as the end product of steam generator evolution in gas-cooled reactor plants. This paper provides a general overview of gas-cooled reactor steam generator evolution and operating experience and shows how design criteria and constraints, research and development, and experience data are factored into the design and development of modern helically wound tube steam generators for the present generation of gas-cooled reactors.

  13. Cylindrical neutron generator

    Science.gov (United States)

    Leung, Ka-Ngo [Hercules, CA

    2008-04-22

    A cylindrical neutron generator is formed with a coaxial RF-driven plasma ion source and target. A deuterium (or deuterium and tritium) plasma is produced by RF excitation in a cylindrical plasma ion generator using an RF antenna. A cylindrical neutron generating target is coaxial with the ion generator, separated by plasma and extraction electrodes which contain many slots. The plasma generator emanates ions radially over 360° and the cylindrical target is thus irradiated by ions over its entire circumference. The plasma generator and target may be as long as desired. The plasma generator may be in the center and the neutron target on the outside, or the plasma generator may be on the outside and the target on the inside. In a nested configuration, several concentric targets and plasma generating regions are nested to increase the neutron flux.

  14. Testing generative thinking among Swazi children | Mushoriwa ...

    African Journals Online (AJOL)

    The survey research design was used, with interviews employed to collect the data. Crosstabs and a two-sample t-test were used to analyse the data. The study found no significant differences in generative thinking between second and fifth graders in the Swazi sample. In the comparative analyses, while significant ...

  15. Statistical sampling plans

    International Nuclear Information System (INIS)

    Jaech, J.L.

    1984-01-01

    In auditing and in inspection, one selects a number of items by some set of procedures and performs measurements which are compared with the operator's values. This session considers the problem of how to select the samples to be measured, and what kinds of measurements to make. In the inspection situation, the ultimate aim is to independently verify the operator's material balance. The effectiveness of the sample plan in achieving this objective is briefly considered. The discussion focuses on the model plant
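One standard calculation behind such sampling plans is the probability that a sample of n items detects at least one of D defective (e.g. falsified) items in a population of N. A minimal sketch of the hypergeometric formula:

```python
from math import comb

# Probability that a random sample of n items from a population of N items,
# D of which are defective, contains at least one defective item:
#   P(detect) = 1 - C(N - D, n) / C(N, n)
def detection_probability(N, D, n):
    if n > N - D:
        return 1.0  # the sample cannot avoid every defective item
    return 1.0 - comb(N - D, n) / comb(N, n)

# E.g. verifying 20 of 100 items when 5 are falsified:
print(round(detection_probability(100, 5, 20), 3))
```

Inverting this relation for a target detection probability gives the required sample size, the core trade-off in designing an inspection plan.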

  16. Two phase sampling

    CERN Document Server

    Ahmad, Zahoor; Hanif, Muhammad

    2013-01-01

    The development of estimators of population parameters based on two-phase sampling schemes has seen a dramatic increase in the past decade. Various authors have developed estimators of population using either one or two auxiliary variables. The present volume is a comprehensive collection of estimators available in single and two phase sampling. The book covers estimators which utilize information on single, two and multiple auxiliary variables of both quantitative and qualitative nature. Th...

  17. Uranium tailings sampling manual

    International Nuclear Information System (INIS)

    Feenstra, S.; Reades, D.W.; Cherry, J.A.; Chambers, D.B.; Case, G.G.; Ibbotson, B.G.

    1985-01-01

    The purpose of this manual is to describe the requisite sampling procedures for the application of uniform high-quality standards to detailed geotechnical, hydrogeological, geochemical and air quality measurements at Canadian uranium tailings disposal sites. The selection and implementation of applicable sampling procedures for such measurements at uranium tailings disposal sites are complicated by two primary factors. Firstly, the physical and chemical nature of uranium mine tailings and effluent is considerably different from natural soil materials and natural waters. Consequently, many conventional methods for the collection and analysis of natural soils and waters are not directly applicable to tailings. Secondly, there is a wide range in the physical and chemical nature of uranium tailings. The composition of the ore, the milling process, the nature of tailings deposition, and effluent treatment vary considerably and are highly site-specific. Therefore, the definition and implementation of sampling programs for uranium tailings disposal sites require considerable evaluation, and often innovation, to ensure that appropriate sampling and analysis methods are used which provide the flexibility to take into account site-specific considerations. The following chapters describe the objective and scope of a sampling program, preliminary data collection, and the procedures for sampling of tailings solids, surface water and seepage, tailings pore-water, and wind-blown dust and radon

  18. Reactor water sampling device

    International Nuclear Information System (INIS)

    Sakamaki, Kazuo.

    1992-01-01

    The present invention concerns a reactor water sampling device for sampling reactor water in an in-core monitor (neutron measuring tube) housing in a BWR type reactor. The upper end portion of a drain pipe of the reactor water sampling device is attached detachably to an in-core monitor flange. A push-up rod is inserted in the drain pipe vertically movably. A sampling vessel and a vacuum pump are connected to the lower end of the drain pipe. A vacuum pump is operated to depressurize the inside of the device and move the push-up rod upwardly. Reactor water in the in-core monitor housing flows between the drain pipe and the push-up rod and flows into the sampling vessel. With such a constitution, reactor water in the in-core monitor housing can be sampled rapidly with neither opening the lid of the reactor pressure vessel nor being in contact with air. Accordingly, operator's exposure dose can be reduced. (I.N.)

  19. Wet gas sampling

    Energy Technology Data Exchange (ETDEWEB)

    Welker, T.F.

    1997-07-01

The quality of gas has changed drastically in the past few years. Most gas is wet with hydrocarbons, water, and heavier contaminants that tend to condense if not handled properly. If a gas stream is contaminated with condensables, the sampling of that stream must be done in a manner that will ensure all of the components in the stream are introduced into the sample container as the composite. The sampling and handling of wet gas is extremely difficult under ideal conditions, and there are no ideal conditions in the real world. The problems related to offshore operations and other wet gas systems, as well as the transportation of the sample, are additional problems that must be overcome if the analysis is to mean anything to the producer and gatherer. The sampling of wet gas systems is decidedly more difficult than sampling conventional dry gas systems. Wet gas systems generally result in the measurement of one heating value at the inlet of the pipe and a drastically reduced heating value of the gas at the outlet end of the system. This is caused by the fallout or accumulation of the heavier products that, at the inlet, may be in the vapor state in the pipeline; hence the high gravity and high BTU. But, in fact, because of pressure and temperature variances, these liquids condense and form a liquid that actually runs down the pipe as a stream or accumulates in drips to be blown from the system. (author)

  20. Lunar Sample Compendium

    Science.gov (United States)

    Meyer, Charles

    2005-01-01

The purpose of the Lunar Sample Compendium will be to inform scientists, astronauts and the public about the various lunar samples that have been returned from the Moon. This Compendium will be organized rock by rock in the manner of a catalog, but will not be as comprehensive, nor as complete, as the various lunar sample catalogs that are available. Likewise, this Compendium will not duplicate the various excellent books and reviews on the subject of lunar samples (Cadogan 1981, Heiken et al. 1991, Papike et al. 1998, Warren 2003, Eugster 2003). However, it is thought that an online Compendium, such as this, will prove useful to scientists proposing to study individual lunar samples and should help provide backup information for lunar sample displays. This Compendium will allow easy access to the scientific literature by briefly summarizing the significant findings of each rock along with the documentation of where the detailed scientific data are to be found. In general, discussion and interpretation of the results is left to the formal reviews found in the scientific literature. An advantage of this Compendium will be that it can be updated, expanded and corrected as need be.

  1. Second harmonic generation and sum frequency generation

    International Nuclear Information System (INIS)

    Pellin, M.J.; Biwer, B.M.; Schauer, M.W.; Frye, J.M.; Gruen, D.M.

    1990-01-01

Second harmonic generation and sum frequency generation are increasingly being used as in situ surface probes. These techniques are coherent and inherently surface sensitive by the nature of the medium's response to intense laser light. Here we will review these two techniques using aqueous corrosion as an example problem. Aqueous corrosion of technologically important materials such as Fe, Ni and Cr proceeds from a reduced metal surface with layer-by-layer growth of oxide films mediated by compositional changes in the chemical makeup of the growing film. Passivation of the metal surface is achieved after growth of only a few tens of atomic layers of metal oxide. Surface Second Harmonic Generation and a related nonlinear laser technique, Sum Frequency Generation, have demonstrated an ability to probe the surface composition of growing films even in the presence of aqueous solutions. 96 refs., 4 figs

  2. Refrigeration generation using expander-generator units

    Science.gov (United States)

    Klimenko, A. V.; Agababov, V. S.; Koryagin, A. V.; Baidakova, Yu. O.

    2016-05-01

The problems of using the expander-generator unit (EGU) to generate refrigeration along with electricity were considered. It is shown that, at the temperature levels of the refrigeration flows an EGU provides, it can supply refrigeration to different consumers: ventilation and air-conditioning plants as well as industrial refrigerators and freezers. The influence of process parameters on the cooling power of the EGU, which depends on the parameters of the gas expansion process in the expander and on the temperature of the medium being cooled, was analyzed. A schematic diagram of a refrigeration generation plant based on an EGU is presented. The features and advantages of the EGU for generating refrigeration, compared with thermotransformers of the vapor-compression and absorption types, are shown, namely: no fuel-derived energy is needed to operate the EGU; the heat delivered to the gas from the flow being cooled is put to beneficial use in equipment operating on gas; and energy is produced along with refrigeration, which makes it possible to create EGU-based trigeneration plants without additional power equipment. It is shown that the temperature levels of the refrigeration flows obtainable by using EGUs at existing decompression stations of transported gas allow supplying refrigeration to various consumers. The refrigeration capacity of an expander-generator unit depends not only on the parameters of the gas expansion process in the expander (flow rate, temperatures and pressures at the inlet and outlet) but is also determined by the temperature needed by a consumer and the initial temperature of the refrigerant flow being cooled. It was concluded that expander-generator units can be used to create trigeneration plants both at major power plants and at small-scale energy facilities.

  3. The Paternal Landscape along the Bight of Benin - Testing Regional Representativeness of West-African Population Samples Using Y-Chromosomal Markers.

    Directory of Open Access Journals (Sweden)

    Maarten H D Larmuseau

Full Text Available Patterns of genetic variation in human populations across the African continent are still not well studied in comparison with Eurasia and America, despite the high genetic and cultural diversity among African populations. In population and forensic genetic studies a single sample is often used to represent a complete African region. In such a scenario, inappropriate sampling strategies and/or the use of local, isolated populations may bias interpretations and pose questions of representativeness at a macrogeographic scale. The non-recombining region of the Y-chromosome (NRY) has great potential to reveal the regional representation of a sample due to its powerful phylogeographic information content. An area poorly characterized for Y-chromosomal data is the West-African region along the Bight of Benin, despite its important history in the trans-Atlantic slave trade and its large number of ethnic groups, languages and lifestyles. In this study, Y-chromosomal haplotypes from four Beninese populations were determined and a global meta-analysis with available Y-SNP and Y-STR data from populations along the Bight of Benin and surrounding areas was performed. A thorough methodology was developed allowing comparison of population samples using Y-chromosomal lineage data based on different Y-SNP panels and phylogenies. Geographic proximity turned out to be the best predictor of genetic affinity between populations along the Bight of Benin. Nevertheless, based on Y-chromosomal data from the literature, two population samples differed strongly from others from the same or neighbouring areas and are not regionally representative within large-scale studies. Furthermore, the analysis of the HapMap sample YRI of a Yoruban population from South-western Nigeria based on Y-SNP and Y-STR data showed for the first time its regional representativeness, a result which is important for standard population and forensic genetic applications using the YRI sample.

  4. Nonuniform sampling by quantiles

    Science.gov (United States)

    Craft, D. Levi; Sonstrom, Reilly E.; Rovnyak, Virginia G.; Rovnyak, David

    2018-03-01

A flexible strategy for choosing samples nonuniformly from a Nyquist grid using the concept of statistical quantiles is presented for broad classes of NMR experimentation. Quantile-directed scheduling is intuitive and flexible for any weighting function, promotes reproducibility and seed independence, and is generalizable to multiple dimensions. In brief, weighting functions are divided into regions of equal probability, which define the samples to be acquired. Quantile scheduling therefore achieves close adherence to a probability distribution function, thereby minimizing gaps for any given degree of subsampling of the Nyquist grid. A characteristic of quantile scheduling is that one-dimensional, weighted NUS schedules are deterministic; however, higher-dimensional schedules are similar within a user-specified jittering parameter. To develop unweighted sampling, we investigated the minimum jitter needed to disrupt subharmonic tracts, and show that this criterion can be met in many cases by jittering within 25-50% of the subharmonic gap. For nD-NUS, three supplemental components to choosing samples by quantiles are proposed in this work: (i) forcing the corner samples to ensure sampling to specified maximum values in indirect evolution times, (ii) providing an option to triangular backfill sampling schedules to promote dense/uniform tracts at the beginning of signal evolution periods, and (iii) providing an option to force the edges of nD-NUS schedules to be identical to the 1D quantiles. Quantile-directed scheduling meets the diverse needs of current NUS experimentation, but can also be used for future NUS implementations such as off-grid NUS and more. A computer program implementing these principles (a.k.a. QSched) in 1D- and 2D-NUS is available under the general public license.
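The equal-probability bin-splitting step described above lends itself to a compact sketch. Below is a minimal, hypothetical implementation of a deterministic 1D weighted schedule (illustrative only, not the QSched program): the weighting function's cumulative distribution is split into bins of equal mass, and the grid index reaching each bin midpoint is acquired.

```python
import numpy as np

def quantile_schedule(weights, n_samples):
    """Pick n_samples Nyquist-grid points so that each carries
    roughly equal probability mass under the weighting function."""
    cdf = np.cumsum(np.asarray(weights, dtype=float))
    cdf /= cdf[-1]
    # Midpoints of n_samples equal-probability bins of [0, 1]:
    targets = (np.arange(n_samples) + 0.5) / n_samples
    # First grid index whose cumulative mass reaches each target;
    # unique() guards against rare collisions for steep weights.
    return np.unique(np.searchsorted(cdf, targets))

# Exponentially weighted schedule: 64 of 256 grid points, denser
# where the decaying signal envelope is largest (values assumed).
grid = np.arange(256)
schedule = quantile_schedule(np.exp(-grid / 100.0), 64)
```

Because the bins are fixed by the weighting function alone, the 1D schedule is seed-independent, matching the determinism noted in the abstract.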

  5. AND/OR Importance Sampling

    OpenAIRE

    Gogate, Vibhav; Dechter, Rina

    2012-01-01

The paper introduces AND/OR importance sampling for probabilistic graphical models. In contrast to importance sampling, AND/OR importance sampling caches samples in the AND/OR space and then extracts a new sample mean from the stored samples. We prove that AND/OR importance sampling may have lower variance than importance sampling, thereby providing a theoretical justification for preferring it over importance sampling. Our empirical evaluation demonstrates that AND/OR importance sampling is ...
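For context, the plain importance-sampling estimator that AND/OR importance sampling refines can be sketched as follows. This is a generic toy (it does not implement the authors' AND/OR-space caching), with an assumed target density and proposal:

```python
import math
import random

def importance_estimate(f, p_pdf, q_pdf, q_draw, n, seed=0):
    # Estimate E_p[f(X)] from samples drawn under proposal q,
    # reweighting each draw by the importance weight p(x)/q(x).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = q_draw(rng)
        total += f(x) * p_pdf(x) / q_pdf(x)
    return total / n

# Toy check: mean of Exp(1) (true value 1), proposal Uniform(0, 10).
est = importance_estimate(
    f=lambda x: x,
    p_pdf=lambda x: math.exp(-x),
    q_pdf=lambda x: 0.1,
    q_draw=lambda rng: rng.uniform(0.0, 10.0),
    n=100_000,
)
```

The variance of this estimator is what the AND/OR variant aims to reduce by reusing cached partial samples.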

  6. Surry steam generator - examination and evaluation

    International Nuclear Information System (INIS)

    Clark, R.A.; Doctor, P.G.; Ferris, R.H.

    1985-10-01

    This report summarizes research conducted during the fourth year of the five year Steam Generator Group Project. During this period the project conducted numerous nondestructive examination (NDE) round robin inspections of the original Surry 2A steam generator. They included data acquisition/analysis and analysis-only round robins using multifrequency bobbin coil eddy current tests. In addition, the generator was nondestructively examined by alternate or advanced techniques including ultrasonics, optical fiber, profilometry and special eddy current instrumentation. The round robin interpretation data were compared. To validate the NDE results and for tube integrity testing, a selection of tubing samples, determined to be representative of the generator, was designated for removal. Initial sample removals from the generator included three sections of tube sheet, two sections of support plate and encompassed tubes, and a number of straight and U-bend tubing sections. Metallographic examination of these sections was initiated. Details of significant results are presented in the following paper. 13 figs

  7. Surry steam generator - examination and evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Clark, R A; Doctor, P G; Ferris, R H

    1987-01-01

    This report summarizes research conducted during the fourth year of the five year Steam Generator Group Project. During this period the project conducted numerous nondestructive examination (NDE) round robin inspections of the original Surry 2A steam generator. They included data acquisition/analysis and analysis-only round robins using multifrequency bobbin coil eddy current tests. In addition, the generator was nondestructively examined by alternate or advanced techniques including ultrasonics, optical fiber, profilometry and special eddy current instrumentation. The round robin interpretation data were compared. To validate the NDE results and for tube integrity testing, a selection of tubing samples, determined to be representative of the generator, was designated for removal. Initial sample removals from the generator included three sections of tube sheet, two sections of support plate and encompassed tubes, and a number of straight and U-bend tubing sections. Metallographic examination of these sections was initiated. Details of significant results are presented in the following paper.

  8. Sample collection and documentation

    International Nuclear Information System (INIS)

    Cullings, Harry M.; Fujita, Shoichiro; Watanabe, Tadaaki; Yamashita, Tomoaki; Tanaka, Kenichi; Endo, Satoru; Shizuma, Kiyoshi; Hoshi, Masaharu; Hasai, Hiromi

    2005-01-01

    Beginning within a few weeks after the bombings and periodically during the intervening decades, investigators in Hiroshima and Nagasaki have collected samples of materials that were in the cities at the time of the bombings. Although some early efforts were not driven by specific measurement objectives, many others were. Even some of the very earliest samples collected in 1945 were based on carefully conceived research plans and detailed specifications for samples appropriate to particular retrospective measurements, i.e., of particular residual quantities remaining from exposure to the neutrons and gamma rays from the bombs. This chapter focuses mainly on the work of groups at two institutions that have actively collaborated since the 1980s in major collection efforts and have shared samples among themselves and with other investigators: the Radiation Effects Research Foundation (RERF) and its predecessor the Atomic Bomb Casualty Commission (ABCC), and Hiroshima University. In addition, a number of others are listed, who also contributed to the literature by their collection of samples. (J.P.N.)

  9. Quantum random number generator

    Science.gov (United States)

    Pooser, Raphael C.

    2016-05-10

    A quantum random number generator (QRNG) and a photon generator for a QRNG are provided. The photon generator may be operated in a spontaneous mode below a lasing threshold to emit photons. Photons emitted from the photon generator may have at least one random characteristic, which may be monitored by the QRNG to generate a random number. In one embodiment, the photon generator may include a photon emitter and an amplifier coupled to the photon emitter. The amplifier may enable the photon generator to be used in the QRNG without introducing significant bias in the random number and may enable multiplexing of multiple random numbers. The amplifier may also desensitize the photon generator to fluctuations in power supplied thereto while operating in the spontaneous mode. In one embodiment, the photon emitter and amplifier may be a tapered diode amplifier.

  10. Steam generator tube extraction

    International Nuclear Information System (INIS)

    Delorme, H.

    1985-05-01

To enable tube examination on steam generators in service, Framatome has now developed a process for removing sections of steam generator tubes. Tube sections can be removed without damage by treating the portion of the tube expanded in the tube sheet.

  11. Generating Seismograms with Deep Neural Networks

    Science.gov (United States)

    Krischer, L.; Fichtner, A.

    2017-12-01

The recent surge of successful uses of deep neural networks in computer vision, speech recognition, and natural language processing, mainly enabled by the availability of fast GPUs and extremely large data sets, is starting to see many applications across all natural sciences. In seismology these are largely confined to classification and discrimination tasks. In this contribution we explore the use of deep neural networks for another class of problems: so-called generative models. Generative modelling is a branch of statistics concerned with generating new observed data samples, usually by drawing from some underlying probability distribution. Samples with specific attributes can be generated by conditioning on input variables. In this work we condition on seismic source (mechanism and location) and receiver (location) parameters to generate multi-component seismograms. The deep neural networks are trained on synthetic data calculated with Instaseis (http://instaseis.net, van Driel et al. (2015)) and waveforms from the global ShakeMovie project (http://global.shakemovie.princeton.edu, Tromp et al. (2010)). The underlying radially symmetric or smoothly three-dimensional Earth structures result in comparatively small waveform differences from similar events or at close receivers, and the networks learn to interpolate between training data samples. Of particular importance is the chosen misfit functional. Generative adversarial networks (Goodfellow et al. (2014)) implement a system in which two networks compete: the generator network creates samples and the discriminator network distinguishes these from the true training examples. Both are trained in an adversarial fashion until the discriminator can no longer distinguish between generated and real samples. We show how this can be applied to seismograms and in particular how it compares to networks trained with more conventional misfit metrics. Last but not least we attempt to shed some light on the black-box nature of

  12. SAAS-CNV: A Joint Segmentation Approach on Aggregated and Allele Specific Signals for the Identification of Somatic Copy Number Alterations with Next-Generation Sequencing Data.

    Science.gov (United States)

    Zhang, Zhongyang; Hao, Ke

    2015-11-01

Cancer genomes exhibit profound somatic copy number alterations (SCNAs). Studying tumor SCNAs using massively parallel sequencing provides unprecedented resolution while giving rise to new challenges in data analysis, complicated by tumor aneuploidy and heterogeneity as well as normal cell contamination. While the majority of read-depth-based methods utilize total sequencing depth alone for SCNA inference, the allele-specific signals are undervalued. We proposed a joint segmentation and inference approach using both signals to meet some of the challenges. Our method consists of four major steps: 1) extracting read depth supporting reference and alternative alleles at each SNP/Indel locus and comparing the total read depth and alternative allele proportion between tumor and matched normal sample; 2) performing joint segmentation on the two signal dimensions; 3) correcting the copy number baseline from which the SCNA state is determined; 4) calling the SCNA state for each segment based on both signal dimensions. The method is applicable to whole exome/genome sequencing (WES/WGS) as well as SNP array data in a tumor-control study. We applied the method to a dataset containing no SCNAs to test the specificity, created by pairing sequencing replicates of a single HapMap sample as normal/tumor pairs, as well as a large-scale WGS dataset consisting of 88 liver tumors along with adjacent normal tissues. Compared with representative methods, our method demonstrated improved accuracy, scalability to large cancer studies, capability in handling both sequencing and SNP array data, and the potential to improve the estimation of tumor ploidy and purity.
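Step 1 of the pipeline, extracting the two signal dimensions at a single SNP locus, might be sketched as follows. All function and variable names here are hypothetical stand-ins, not the SAAS-CNV implementation:

```python
import math

def scna_signals(t_ref, t_alt, n_ref, n_alt):
    # Two signal dimensions at one SNP locus (schematic):
    #  - log2 ratio of tumor vs. matched-normal total read depth
    #  - shift in the alternative-allele proportion
    t_tot, n_tot = t_ref + t_alt, n_ref + n_alt
    log2_ratio = math.log2(t_tot / n_tot)
    baf_shift = t_alt / t_tot - n_alt / n_tot
    return log2_ratio, baf_shift
```

A copy-neutral, heterozygous locus yields (0, 0); a duplicated reference allele in the tumor raises the depth ratio while pushing the allele proportion away from 0.5, which is why joint segmentation on both dimensions is more informative than depth alone.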

  13. Microwatt thermoelectric generator

    International Nuclear Information System (INIS)

    Goslee, D.E.; Bustard, T.S.

    1976-01-01

    A microwatt thermoelectric generator suitable for implanting in the body is described. The generator utilizes a nuclear energy source. Provision is made for temporary electrical connection to the generator for testing purposes, and for ensuring that the heat generated by the nuclear source does not bypass the pile. Also disclosed is a getter which is resistant to shrinkage during sintering, and a foil configuration for controlling the radiation of heat from the nuclear source to the hot plate of the pile

  14. Microwatt thermoelectric generator

    International Nuclear Information System (INIS)

    Goslee, D.E.

    1976-01-01

    A microwatt thermoelectric generator suitable for implanting in the body is described. The disclosed generator utilizes a nuclear energy source. Provision is made for temporary electrical connection to the generator for testing purposes, and for ensuring that the heat generated by the nuclear source does not bypass the pile. Also disclosed is a getter which is resistant to shrinkage during sintering, and a foil configuration for controlling the radiation of heat from the nuclear source to the hot plate of the pile

  15. INEL Sample Management Office

    International Nuclear Information System (INIS)

    Watkins, C.

    1994-01-01

The Idaho National Engineering Laboratory (INEL) Sample Management Office (SMO) was formed as part of the EG&G Idaho Environmental Restoration Program (ERP) in June, 1990. Since then, the SMO has been recognized and sought out by other prime contractors and programs at the INEL. Since December 1991, the DOE-ID Division Directors for the Environmental Restoration Division and Waste Management Division supported the expansion of the INEL ERP SMO into the INEL site wide SMO. The INEL SMO serves as a point of contact for multiple environmental analytical chemistry and laboratory issues (e.g., capacity, capability). The SMO chemists work with project managers during planning to help develop data quality objectives, select appropriate analytical methods, identify special analytical services needs, identify a source for the services, and ensure that requirements for sampling and analysis (e.g., preservations, sample volumes) are clear and technically accurate. The SMO chemists also prepare work scope statements for the laboratories performing the analyses

  16. Radioactive air sampling methods

    CERN Document Server

    Maiello, Mark L

    2010-01-01

Although the field of radioactive air sampling has matured and evolved over decades, it has lacked a single resource that assimilates technical and background information on its many facets. Edited by experts and with contributions from top practitioners and researchers, Radioactive Air Sampling Methods provides authoritative guidance on measuring airborne radioactivity from industrial, research, and nuclear power operations, as well as naturally occurring radioactivity in the environment. Designed for industrial hygienists, air quality experts, and health physicists, the book delves into the applied research advancing and transforming practice with improvements to measurement equipment, human dose modeling of inhaled radioactivity, and radiation safety regulations. To present a wide picture of the field, it covers the international and national standards that guide the quality of air sampling measurements and equipment. It discusses emergency response issues, including radioactive fallout and the assets used ...

  17. Interactive Sample Book (ISB)

    DEFF Research Database (Denmark)

    Heimdal, Elisabeth Jacobsen; Lenau, Torben Anker; Guglielmi, Michel

    2009-01-01

    supervisor Torben A. Lenau. Inspiration to use smart materials Interactive textiles are still quite an unknown phenomenon to many. It is thus often difficult to communicate what kind of potentials lie within these materials. This is why the ISB project was started, as a practice based research project...... and senses in relation to integrated decoration and function primarily to indoor applications. The result of the project will be a number of interactive textiles, to be gathered in an interactive sample book (ISB), in a similar way as the sample books of wallpapers one can take home from the shop and choose...... from. In other words, it is a kind of display material, which in a simple manner can illustrate how different techniques and smart materials work. The sample book should display a number of possibilities where sensor technology, smart materials and textiles are mixed to such an extent that the textile...

  18. Uniform random number generators

    Science.gov (United States)

    Farr, W. R.

    1971-01-01

    Methods are presented for the generation of random numbers with uniform and normal distributions. Subprogram listings of Fortran generators for the Univac 1108, SDS 930, and CDC 3200 digital computers are also included. The generators are of the mixed multiplicative type, and the mathematical method employed is that of Marsaglia and Bray.
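As an illustration of the two ingredients named in the abstract, here is a hedged sketch of a mixed linear congruential generator feeding the Marsaglia polar method for normal deviates. The LCG constants below are modern illustrative choices, not those of the original Univac/SDS/CDC Fortran subprograms:

```python
import math

def mixed_lcg(seed, a=1664525, c=1013904223, m=2**32):
    # Mixed (multiplier plus increment) linear congruential
    # generator; yields uniforms in [0, 1). Constants assumed.
    x = seed % m
    while True:
        x = (a * x + c) % m
        yield x / m

def polar_normals(uniforms):
    # Marsaglia polar method: a rejection step keeps uniform
    # pairs inside the unit disc and maps each accepted pair to
    # two independent standard normal deviates.
    while True:
        u = 2.0 * next(uniforms) - 1.0
        v = 2.0 * next(uniforms) - 1.0
        s = u * u + v * v
        if 0.0 < s < 1.0:
            f = math.sqrt(-2.0 * math.log(s) / s)
            return u * f, v * f

stream = mixed_lcg(12345)
draws = [z for _ in range(5000) for z in polar_normals(stream)]
```

The polar method avoids the trigonometric calls of Box-Muller, which mattered on the machines of that era.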

  19. Generational Pension Plan Designs

    NARCIS (Netherlands)

    Huang, Xiaohong; Mahieu, Ronald

    2010-01-01

    We propose a generational plan for the occupational pension provision in which people from the same generation are pooled in a generational fund. Each fund can set its own policies independently. This plan provides the benefits of differentiation missing in the prevailing collective plan and the

  20. Consolidated nuclear steam generator

    International Nuclear Information System (INIS)

    Jabsen, F.S.; Schluderberg, D.C.; Paulson, A.E.

    1978-01-01

An improved system for providing power has a unique generating means for nuclear reactors, with a number of steam generators in the form of replaceable modular units of the expendable type, to attain the optimum in effective and efficient vaporization of fluid during power generation. The system is most adaptable to underground power plants and marine usage.

  1. Logarithmic-function generator

    Science.gov (United States)

    Caron, P. R.

    1975-01-01

    Solid-state logarithmic-function generator is compact and provides improved accuracy. Generator includes a stable multivibrator feeding into RC circuit. Resulting exponentially decaying voltage is compared with input signal. Generator output is proportional to time required for exponential voltage to decay from preset reference level to level of input signal.
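The timing principle behind the circuit can be checked numerically. The exponential decay reaches the input level at a time proportional to the logarithm of the input; the reference voltage and RC constant below are assumed for illustration:

```python
import math

def decay_time(v_in, v_ref=10.0, rc=1.0e-3):
    # v(t) = v_ref * exp(-t / RC) falls to v_in at
    # t = RC * ln(v_ref / v_in), so the interval measured by the
    # comparator is proportional to log of the input signal.
    return rc * math.log(v_ref / v_in)
```

Doubling the input always shortens the interval by the same amount, RC·ln 2, which is the defining property of a logarithmic converter.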

  2. Generation and Context Memory

    Science.gov (United States)

    Mulligan, Neil W.; Lozito, Jeffrey P.; Rosner, Zachary A.

    2006-01-01

    Generation enhances memory for occurrence but may not enhance other aspects of memory. The present study further delineates the negative generation effect in context memory reported in N. W. Mulligan (2004). First, the negative generation effect occurred for perceptual attributes of the target item (its color and font) but not for extratarget…

  3. Radio-isotope generator

    International Nuclear Information System (INIS)

    Benjamins, H.M.

    1983-01-01

    A device is claimed for interrupting an elution process in a radioisotope generator before an elution vial is entirely filled. The generator is simultaneously exposed to sterile air both in the direction of the generator column and of the elution vial

  4. Analysis of monazite samples

    International Nuclear Information System (INIS)

    Kartiwa Sumadi; Yayah Rohayati

    1996-01-01

The 'monazit' analytical program has been set up for the routine analysis of rare earth elements in monazite and xenotime mineral samples. The total relative error of the analysis is very low, less than 2.50%, and the reproducibility of the counting statistics and the stability of the instrument were excellent. The precision and accuracy of the analytical program are very good, with maximum relative percentages of 5.22% and 1.61%, respectively. The mineral compositions of the 30 monazite samples have also been calculated using their chemical constituents, and the results were compared to grain-counting microscopic analysis

  5. Study of tritium permeation through Peach Bottom Steam Generator tubes

    International Nuclear Information System (INIS)

    Yang, L.; Baugh, W.A.; Baldwin, N.L.

    1977-06-01

    The report describes the equipment developed, samples tested, procedures used, and results obtained in the tritium permeation tests conducted on steam generator tubing samples which were removed from the Peach Bottom Unit No. 1 reactor

  6. HAMMER: Reweighting tool for simulated data samples

    CERN Document Server

    Duell, Stephan; Ligeti, Zoltan; Papucci, Michele; Robinson, Dean

    2016-01-01

Modern flavour physics experiments, such as Belle II or LHCb, require large samples of generated Monte Carlo events. Monte Carlo events are often processed in a sophisticated chain that includes a simulation of the detector response. The generation and reconstruction of large samples is resource-intensive and in principle would need to be repeated if, e.g., parameters of the underlying models change due to new measurements or new insights. To avoid having to regenerate large samples, we are working on a tool, the Helicity Amplitude Module for Matrix Element Reweighting (HAMMER), which allows one to easily reweight existing events in the context of semileptonic b → q ℓ ν̄_ℓ analyses to new model parameters or new-physics scenarios.
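The core idea, reusing stored events by rescaling their weights rather than regenerating them, can be sketched generically. This is not the HAMMER API; the model functions below are hypothetical stand-ins for squared matrix-element evaluations:

```python
def reweight(events, old_weight, new_weight):
    # Rescale each stored MC event's weight by the ratio of the
    # new to the old per-event model weight, so the sample follows
    # the new model without regeneration (schematic; real tools
    # combine helicity amplitudes rather than scalar densities).
    return [w * new_weight(x) / old_weight(x) for x, w in events]

# Toy usage: events carry kinematics x and a generator weight.
events = [(0.5, 1.0), (1.5, 1.0), (2.5, 1.0)]
new_weights = reweight(events,
                       old_weight=lambda x: 1.0,
                       new_weight=lambda x: 1.0 + 0.2 * x)
```

The reweighted sample remains statistically valid as long as the old model has support wherever the new one does.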

  7. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical
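The systematic random sampling scheme underlying the volume-weighted procedure can be sketched in its generic one-dimensional form: one random start, then a fixed stride through the population of candidate blocks. This is a simplification of the point-counting variant described in the guidelines, with toy inputs:

```python
import random

def systematic_sample(items, n, rng=None):
    # Systematic random sampling: pick a random offset within the
    # first stride, then take every k-th item, so every item has
    # equal inclusion probability n / len(items).
    rng = rng or random.Random()
    k = len(items) / n          # sampling stride
    start = rng.uniform(0.0, k) # random start within first stride
    return [items[int(start + i * k)] for i in range(n)]

# Toy usage: choose 10 of 100 candidate tissue blocks.
blocks = systematic_sample(list(range(100)), 10, random.Random(7))
```

Compared with simple random sampling, the fixed stride spreads the sample evenly across the organ, which is what makes the resulting stereological estimates efficient.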

  8. Motor/generator

    Science.gov (United States)

    Hickam, Christopher Dale [Glasford, IL

    2008-05-13

    A motor/generator is provided for connecting between a transmission input shaft and an output shaft of a prime mover. The motor/generator may include a motor/generator housing, a stator mounted to the motor/generator housing, a rotor mounted at least partially within the motor/generator housing and rotatable about a rotor rotation axis, and a transmission-shaft coupler drivingly coupled to the rotor. The transmission-shaft coupler may include a clamp, which may include a base attached to the rotor and a plurality of adjustable jaws.

  9. Generator for radionuclide

    International Nuclear Information System (INIS)

    Weisner, P.S.; Forrest, T.R.F.

    1985-01-01

    This invention provides a radionuclide generator of the kind in which a parent radionuclide, adsorbed on a column of particulate material, generates a daughter radionuclide which is periodically removed from the column. This invention is particularly concerned with technetium generators using single collection vials. The generator comprises a column, a first reservoir for the eluent, a second reservoir to contain the volume of eluent required for a single elution, and means connecting the first reservoir to the second reservoir and the second reservoir to the column. Such a generator is particularly suitable for operation by vacuum elution

  10. Liquid waste sampling device

    International Nuclear Information System (INIS)

    Kosuge, Tadashi

    1998-01-01

    A liquid pumping pressure regulator is disposed on the midway of a pressure control tube which connects the upper portion of a sampling pot and the upper portion of a liquid waste storage vessel. With such a constitution, when the pressure in the sampling pot is made negative, and liquid wastes are sucked to the liquid pumping tube passing through the sampling pot, the difference between the pressure on the entrance of the liquid pumping pressure regulator of the pressure regulating tube and the pressure at the bottom of the liquid waste storage vessel is made constant. An opening degree controlling meter is disposed to control the degree of opening of a pressure regulating valve for sending actuation pressurized air to the liquid pumping pressure regulator. Accordingly, even if the liquid level of liquid wastes in the liquid waste storage vessel is changed, the height for the suction of the liquid wastes in the liquid pumping tube can be kept constant. With such procedures, sampling can be conducted correctly, and the discharge of the liquid wastes to the outside can be prevented. (T.M.)

  11. IXM gas sampling procedure

    International Nuclear Information System (INIS)

    Pingel, L.A.

    1995-01-01

    Ion Exchange Modules (IXMs) are used at the 105-KE and -KW Fuel Storage Basins to control radionuclide concentrations in the water. A potential safety concern relates to production of hydrogen gas by radiolysis of the water trapped in the ion exchange media of spent IXMs. This document provides a procedure for sampling the gases in the head space of the IXM

  12. Request for wood samples

    NARCIS (Netherlands)

    NN,

    1977-01-01

    In recent years the wood collection at the Rijksherbarium was greatly expanded following a renewed interest in wood anatomy as an aid for solving classification problems. Staff members of the Rijksherbarium added to the collection by taking interesting wood samples with them from their expeditions

  13. Check Sample Abstracts.

    Science.gov (United States)

    Alter, David; Grenache, David G; Bosler, David S; Karcher, Raymond E; Nichols, James; Rajadhyaksha, Aparna; Camelo-Piragua, Sandra; Rauch, Carol; Huddleston, Brent J; Frank, Elizabeth L; Sluss, Patrick M; Lewandrowski, Kent; Eichhorn, John H; Hall, Janet E; Rahman, Saud S; McPherson, Richard A; Kiechle, Frederick L; Hammett-Stabler, Catherine; Pierce, Kristin A; Kloehn, Erica A; Thomas, Patricia A; Walts, Ann E; Madan, Rashna; Schlesinger, Kathie; Nawgiri, Ranjana; Bhutani, Manoop; Kanber, Yonca; Abati, Andrea; Atkins, Kristen A; Farrar, Robert; Gopez, Evelyn Valencerina; Jhala, Darshana; Griffin, Sonya; Jhala, Khushboo; Jhala, Nirag; Bentz, Joel S; Emerson, Lyska; Chadwick, Barbara E; Barroeta, Julieta E; Baloch, Zubair W; Collins, Brian T; Middleton, Owen L; Davis, Gregory G; Haden-Pinneri, Kathryn; Chu, Albert Y; Keylock, Joren B; Ramoso, Robert; Thoene, Cynthia A; Stewart, Donna; Pierce, Arand; Barry, Michelle; Aljinovic, Nika; Gardner, David L; Barry, Michelle; Shields, Lisa B E; Arnold, Jack; Stewart, Donna; Martin, Erica L; Rakow, Rex J; Paddock, Christopher; Zaki, Sherif R; Prahlow, Joseph A; Stewart, Donna; Shields, Lisa B E; Rolf, Cristin M; Falzon, Andrew L; Hudacki, Rachel; Mazzella, Fermina M; Bethel, Melissa; Zarrin-Khameh, Neda; Gresik, M Vicky; Gill, Ryan; Karlon, William; Etzell, Joan; Deftos, Michael; Karlon, William J; Etzell, Joan E; Wang, Endi; Lu, Chuanyi M; Manion, Elizabeth; Rosenthal, Nancy; Wang, Endi; Lu, Chuanyi M; Tang, Patrick; Petric, Martin; Schade, Andrew E; Hall, Geraldine S; Oethinger, Margret; Hall, Geraldine; Picton, Avis R; Hoang, Linda; Imperial, Miguel Ranoa; Kibsey, Pamela; Waites, Ken; Duffy, Lynn; Hall, Geraldine S; Salangsang, Jo-Anne M; Bravo, Lulette Tricia C; Oethinger, Margaret D; Veras, Emanuela; Silva, Elvia; Vicens, Jimena; Silva, Elvio; Keylock, Joren; Hempel, James; Rushing, Elizabeth; Posligua, Lorena E; Deavers, Michael T; Nash, Jason W; Basturk, Olca; Perle, Mary Ann; Greco, Alba; Lee, Peng; Maru, Dipen; 
Weydert, Jamie Allen; Stevens, Todd M; Brownlee, Noel A; Kemper, April E; Williams, H James; Oliverio, Brock J; Al-Agha, Osama M; Eskue, Kyle L; Newlands, Shawn D; Eltorky, Mahmoud A; Puri, Puja K; Royer, Michael C; Rush, Walter L; Tavora, Fabio; Galvin, Jeffrey R; Franks, Teri J; Carter, James Elliot; Kahn, Andrea Graciela; Lozada Muñoz, Luis R; Houghton, Dan; Land, Kevin J; Nester, Theresa; Gildea, Jacob; Lefkowitz, Jerry; Lacount, Rachel A; Thompson, Hannis W; Refaai, Majed A; Quillen, Karen; Lopez, Ana Ortega; Goldfinger, Dennis; Muram, Talia; Thompson, Hannis

    2009-02-01

    The following abstracts are compiled from Check Sample exercises published in 2008. These peer-reviewed case studies assist laboratory professionals with continuing medical education and are developed in the areas of clinical chemistry, cytopathology, forensic pathology, hematology, microbiology, surgical pathology, and transfusion medicine. Abstracts for all exercises published in the program will appear annually in AJCP.

  14. 0-6760 : improved trip generation data for Texas using workplace and special generator surveys.

    Science.gov (United States)

    2014-08-01

    Trip generation rates play an important role in transportation planning, which can help in making informed decisions about future transportation investment and design. However, sometimes the rates are derived from small sample sizes or may ...

  15. Vapor generation methods for explosives detection research

    Energy Technology Data Exchange (ETDEWEB)

    Grate, Jay W.; Ewing, Robert G.; Atkinson, David A.

    2012-12-01

    The generation of calibrated vapor samples of explosives compounds remains a challenge due to the low vapor pressures of the explosives, adsorption of explosives on container and tubing walls, and the requirement to manage (typically) multiple temperature zones as the vapor is generated, diluted, and delivered. Methods that have been described to generate vapors can be classified as continuous or pulsed flow vapor generators. Vapor sources for continuous flow generators are typically explosives compounds supported on a solid support, or compounds contained in a permeation or diffusion device. Sources are held at elevated isothermal temperatures. Similar sources can be used for pulsed vapor generators; however, pulsed systems may also use injection of solutions onto heated surfaces with generation of both solvent and explosives vapors, transient peaks from a gas chromatograph, or vapors generated by a programmed thermal desorption. This article reviews vapor generator approaches with emphasis on the method of generating the vapors and on practical aspects of vapor dilution and handling. In addition, a gas chromatographic system with two ovens that is configurable with up to four heating ropes is proposed that could serve as a single integrated platform for explosives vapor generation and device testing. Issues related to standards, calibration, and safety are also discussed.
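    For a continuous-flow permeation or diffusion source, the delivered concentration is simply the source's emission rate divided by the carrier flow, and downstream dilution scales it by the flow ratio. A minimal sketch with hypothetical rates (not values from the article):

```python
def vapor_concentration_ng_per_L(emission_ng_per_min, flow_L_per_min):
    """Steady-state concentration downstream of a permeation/diffusion
    source swept by a carrier gas: C = R / Q."""
    return emission_ng_per_min / flow_L_per_min

def diluted(concentration, source_flow, dilution_flow):
    """Concentration after clean dilution gas is added downstream."""
    return concentration * source_flow / (source_flow + dilution_flow)

c0 = vapor_concentration_ng_per_L(50.0, 0.5)  # source swept at 0.5 L/min
c1 = diluted(c0, 0.5, 4.5)                    # 10x dilution stage
```

This flow-ratio arithmetic is why managing multiple temperature zones matters: any cold spot lets the analyte adsorb and the delivered C drops below the calculated value.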

  16. Solar thermoelectric generator

    Science.gov (United States)

    Toberer, Eric S.; Baranowski, Lauryn L.; Warren, Emily L.

    2016-05-03

    Solar thermoelectric generators (STEGs) are solid state heat engines that generate electricity from concentrated sunlight. A novel detailed balance model for STEGs is provided and applied to both state-of-the-art and idealized materials. STEGs can produce electricity by using sunlight to heat one side of a thermoelectric generator. While concentrated sunlight can be used to achieve extremely high temperatures (and thus improved generator efficiency), the solar absorber also emits a significant amount of black body radiation. This emitted light is the dominant loss mechanism in these generators. In this invention, we propose a solution to this problem that eliminates virtually all of the emitted black body radiation. This enables solar thermoelectric generators to operate at higher efficiency and achieve this efficiency with lower levels of optical concentration. The solution is suitable for both single and dual axis solar thermoelectric generators.
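    The black-body loss described above can be made concrete with a simplified absorber energy balance (this is a textbook-style sketch, not the patent's detailed balance model): the absorber retains a fraction α of the concentrated flux C·G and radiates εσT⁴. All parameter values below are hypothetical.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def absorber_efficiency(alpha, eps, T_abs_K, concentration, G=1000.0):
    """Fraction of incident concentrated sunlight retained by the
    absorber: eta = alpha - eps * sigma * T^4 / (C * G).
    The emitted black-body term is the dominant loss; suppressing the
    emissivity eps (e.g., with a selective surface) recovers it."""
    return alpha - eps * SIGMA * T_abs_K**4 / (concentration * G)

# Hypothetical comparison at 600 K under 100 suns:
eta_black = absorber_efficiency(0.95, 0.95, 600.0, 100.0)
eta_selective = absorber_efficiency(0.95, 0.05, 600.0, 100.0)
```

The comparison shows why cutting emitted radiation lets a STEG reach a given efficiency at lower optical concentration.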

  17. Tissue Sampling Guides for Porcine Biomedical Models.

    Science.gov (United States)

    Albl, Barbara; Haesner, Serena; Braun-Reichhart, Christina; Streckel, Elisabeth; Renner, Simone; Seeliger, Frank; Wolf, Eckhard; Wanke, Rüdiger; Blutke, Andreas

    2016-04-01

    This article provides guidelines for organ and tissue sampling adapted to porcine animal models in translational medical research. Detailed protocols for the determination of sampling locations and numbers as well as recommendations on the orientation, size, and trimming direction of samples from ∼50 different porcine organs and tissues are provided in the Supplementary Material. The proposed sampling protocols include the generation of samples suitable for subsequent qualitative and quantitative analyses, including cryohistology, paraffin, and plastic histology; immunohistochemistry; in situ hybridization; electron microscopy; and quantitative stereology as well as molecular analyses of DNA, RNA, proteins, metabolites, and electrolytes. With regard to the planned extent of sampling efforts, time, and personnel expenses, and dependent upon the scheduled analyses, different protocols are provided. These protocols are adjusted for (I) routine screenings, as used in general toxicity studies or in analyses of gene expression patterns or histopathological organ alterations, (II) advanced analyses of single organs/tissues, and (III) large-scale sampling procedures to be applied in biobank projects. Providing a robust reference for studies of porcine models, the described protocols will ensure the efficiency of sampling, the systematic recovery of high-quality samples representing the entire organ or tissue as well as the intra-/interstudy comparability and reproducibility of results. © The Author(s) 2016.

  18. Searches for Fourth Generation Fermions

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, A.; /Fermilab

    2011-09-01

    We present the results from searches for fourth generation fermions performed using data samples collected by the CDF II and D0 Detectors at the Fermilab Tevatron p{bar p} collider. Many of these results represent the most stringent 95% C. L. limits on masses of new fermions to-date. A fourth chiral generation of massive fermions with the same quantum numbers as the known fermions is one of the simplest extensions of the SM with three generations. The fourth generation is predicted in a number of theories, and although historically considered disfavored, it stands in agreement with electroweak precision data. To avoid the Z {yields} {nu}{bar {nu}} constraint from LEP I, a fourth generation neutrino {nu}{sub 4} must be heavy: m({nu}{sub 4}) > m{sub Z}/2, where m{sub Z} is the mass of the Z boson, and to avoid LEP II bounds a fourth generation charged lepton {ell}{sub 4} must have m({ell}{sub 4}) > 101 GeV/c{sup 2}. At the same time, due to sizeable radiative corrections, masses of fourth generation fermions cannot be much higher than the current lower bounds, and masses of new heavy quarks t' and b' should be in the range of a few hundred GeV/c{sup 2}. In the four-generation model the present bounds on the Higgs are relaxed: the Higgs mass could be as large as 1 TeV/c{sup 2}. Furthermore, the CP violation is significantly enhanced to a magnitude that might account for the baryon asymmetry in the Universe. Additional chiral fermion families can also be accommodated in supersymmetric two-Higgs-doublet extensions of the SM with equivalent effect on the precision fit to the Higgs mass. Another possibility is heavy exotic quarks with vector couplings to the W boson. Contributions to radiative corrections from such quarks with mass M decouple as 1/M{sup 2} and easily evade all experimental constraints. At the Tevatron p{bar p} collider 4-th generation chiral or vector-like quarks can be either produced strongly in pairs or singly via electroweak production, where the

  19. Biological Sampling Variability Study

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hutchison, Janine R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-08

    There are many sources of variability that exist in the sample collection and analysis process. This paper addresses many, but not all, sources of variability. The main focus of this paper was to better understand and estimate variability due to differences between samplers. Variability between days was also studied, as well as random variability within each sampler. Experiments were performed using multiple surface materials (ceramic and stainless steel), multiple contaminant concentrations (10 spores and 100 spores), and with and without the presence of interfering material. All testing was done with sponge sticks using 10-inch by 10-inch coupons. Bacillus atrophaeus was used as the BA surrogate. Spores were deposited using wet deposition. Grime was coated on the coupons which were planned to include the interfering material (Section 3.3). Samples were prepared and analyzed at PNNL using CDC protocol (Section 3.4) and then cultured and counted. Five samplers were trained so that samples were taken using the same protocol. Each sampler randomly sampled eight coupons each day, four coupons with 10 spores deposited and four coupons with 100 spores deposited. Each day consisted of one material being tested. The clean samples (no interfering materials) were run first, followed by the dirty samples (coated with interfering material). There was a significant difference in recovery efficiency between the coupons with 10 spores deposited (mean of 48.9%) and those with 100 spores deposited (mean of 59.8%). There was no general significant difference between the clean and dirty (containing interfering material) coupons or between the two surface materials; however, there was a significant interaction between concentration amount and presence of interfering material. The recovery efficiency was close to the same for coupons with 10 spores deposited, but for the coupons with 100 spores deposited, the recovery efficiency for the dirty samples was significantly larger (65

  20. Designing the next generation (fifth generation computers)

    International Nuclear Information System (INIS)

    Wallich, P.

    1983-01-01

    A description is given of the designs necessary to develop fifth generation computers. An analysis is offered of problems and developments in parallelism, VLSI, artificial intelligence, knowledge engineering and natural language processing. Software developments are outlined including logic programming, object-oriented programming and exploratory programming. Computer architecture is detailed including concurrent computer architecture

  1. Systems and methods for self-synchronized digital sampling

    Science.gov (United States)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.
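    The synchronizer's rule can be sketched as follows: derive the ADC sampling clock from the machine's rotational frequency so that each revolution yields a fixed number of samples, landing at the same shaft angles regardless of speed. The samples-per-revolution parameter below is an assumed illustration, not a figure from the patent.

```python
def sampling_frequency_hz(rotational_hz, samples_per_rev=64):
    """Sampling control rule sketched from the abstract: the sampling
    frequency tracks the rotational frequency signal, keeping the
    sample stream synchronous with the rotating machine."""
    return rotational_hz * samples_per_rev

def sample_angles_deg(samples_per_rev=64):
    """Shaft angles at which samples land; independent of speed."""
    return [360.0 * k / samples_per_rev for k in range(samples_per_rev)]

fs = sampling_frequency_hz(25.0)  # machine spinning at 25 rev/s
angles = sample_angles_deg()
```

Because the sample grid is locked to shaft angle, order-domain analyses (e.g., per-revolution averaging) need no resampling step.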

  2. Nonadiabatic transition path sampling

    International Nuclear Information System (INIS)

    Sherman, M. C.; Corcelli, S. A.

    2016-01-01

    Fewest-switches surface hopping (FSSH) is combined with transition path sampling (TPS) to produce a new method called nonadiabatic path sampling (NAPS). The NAPS method is validated on a model electron transfer system coupled to a Langevin bath. Numerically exact rate constants are computed using the reactive flux (RF) method over a broad range of solvent frictions that span from the energy diffusion (low friction) regime to the spatial diffusion (high friction) regime. The NAPS method is shown to quantitatively reproduce the RF benchmark rate constants over the full range of solvent friction. Integrating FSSH within the TPS framework expands the applicability of both approaches and creates a new method that will be helpful in determining detailed mechanisms for nonadiabatic reactions in the condensed-phase.

  3. Cerenkov fiber sampling calorimeters

    International Nuclear Information System (INIS)

    Arrington, K.; Kefford, D.; Kennedy, J.; Pisani, R.; Sanzeni, C.; Segall, K.; Wall, D.; Winn, D.R.; Carey, R.; Dye, S.; Miller, J.; Sulak, L.; Worstell, W.; Efremenko, Y.; Kamyshkov, Y.; Savin, A.; Shmakov, K.; Tarkovsky, E.

    1994-01-01

    Clear optical fibers were used as a Cerenkov sampling medium in Pb (electromagnetic) and Cu (hadron) absorbers in spaghetti calorimeters, for high rate and high radiation dose experiments, such as the forward region of high energy colliders. The fiber axes were aligned close to the direction of the incident particles (1 degree--7 degree). The 7 λ deep hadron tower contained 2.8% by volume 1.5 mm diameter core clear plastic fibers. The 27 radiation length deep electromagnetic towers had packing fractions of 6.8% and 7.2% of 1 mm diameter core quartz fibers as the active Cerenkov sampling medium. The energy resolution on electrons and pions, energy response, pulse shapes and angular studies are presented

  4. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for which drawing of blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites, was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a., lab-on-a-printed-circuit board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  5. Underwater Sediment Sampling Research

    Science.gov (United States)

    2017-01-01

    The USCG R&D Center sought to develop a bench top system to determine the amount of total ... in impacted sediments, which was found to be directly related to the concentration of crude oil detected in the sediment pore waters. Applying this mathematical ... scattered. The approach here is to sample the interstitial water between the grains of sand and attempt to determine the amount of oil in and on

  6. ITOUGH2 sample problems

    International Nuclear Information System (INIS)

    Finsterle, S.

    1997-11-01

    This report contains a collection of ITOUGH2 sample problems. It complements the ITOUGH2 User's Guide [Finsterle, 1997a], and the ITOUGH2 Command Reference [Finsterle, 1997b]. ITOUGH2 is a program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis. It is based on the TOUGH2 simulator for non-isothermal multiphase flow in fractured and porous media [Pruess, 1987, 1991a]. The report ITOUGH2 User's Guide [Finsterle, 1997a] describes the inverse modeling framework and provides the theoretical background. The report ITOUGH2 Command Reference [Finsterle, 1997b] contains the syntax of all ITOUGH2 commands. This report describes a variety of sample problems solved by ITOUGH2. Table 1.1 contains a short description of the seven sample problems discussed in this report. The TOUGH2 equation-of-state (EOS) module that needs to be linked to ITOUGH2 is also indicated. Each sample problem focuses on a few selected issues shown in Table 1.2. ITOUGH2 input features and the usage of program options are described. Furthermore, interpretations of selected inverse modeling results are given. Problem 1 is a multipart tutorial, describing basic ITOUGH2 input files for the main ITOUGH2 application modes; no interpretation of results is given. Problem 2 focuses on non-uniqueness, residual analysis, and correlation structure. Problem 3 illustrates a variety of parameter and observation types, and describes parameter selection strategies. Problem 4 compares the performance of minimization algorithms and discusses model identification. Problem 5 explains how to set up a combined inversion of steady-state and transient data. Problem 6 provides a detailed residual and error analysis. Finally, Problem 7 illustrates how the estimation of model-related parameters may help compensate for errors in that model
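    The inverse-modeling idea underlying these problems can be illustrated generically: pick the parameter value that minimizes the misfit between a forward model and observations. The sketch below is a toy, not ITOUGH2's algorithm; the exponential "flow model" and all numbers are hypothetical, and the brute-force scan stands in for the minimization algorithms compared in Problem 4.

```python
import math

def model(t, k):
    """Hypothetical forward model (stand-in for a TOUGH2 run):
    normalized pressure decay p(t) = exp(-k t)."""
    return math.exp(-k * t)

def objective(k, times, observed):
    """Sum of squared residuals, the misfit an inversion minimizes."""
    return sum((model(t, k) - p) ** 2 for t, p in zip(times, observed))

# Synthetic observations generated with k_true = 0.3
times = [1.0, 2.0, 4.0, 8.0]
obs = [model(t, 0.3) for t in times]

# Crude one-parameter scan (real codes use gradient-based minimizers)
best_k = min((k / 1000 for k in range(1, 1001)),
             key=lambda k: objective(k, times, obs))
```

With noisy data the minimum flattens out, which is the non-uniqueness and correlation structure examined in Problem 2.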

  7. The Internet of Samples in the Earth Sciences (iSamples)

    Science.gov (United States)

    Carter, M. R.; Lehnert, K. A.

    2015-12-01

    Across most Earth Science disciplines, research depends on the availability of samples collected above, at, and beneath Earth's surface, on the moon and in space, or generated in experiments. Many domains in the Earth Sciences have recently expressed the need for better discovery, access, and sharing of scientific samples and collections (EarthCube End-User Domain workshops, 2012 and 2013, http://earthcube.org/info/about/end-user-workshops), as has the US government (OSTP Memo, March 2014). The Internet of Samples in the Earth Sciences (iSamples) is an initiative funded as a Research Coordination Network (RCN) within the EarthCube program to address this need. iSamples aims to advance the use of innovative cyberinfrastructure to connect physical samples and sample collections across the Earth Sciences with digital data infrastructures to revolutionize their utility for science. iSamples strives to build, grow, and foster a new community of practice, in which domain scientists, curators of sample repositories and collections, computer and information scientists, software developers and technology innovators engage in and collaborate on defining, articulating, and addressing the needs and challenges of physical samples as a critical component of digital data infrastructure. A primary goal of iSamples is to deliver a community-endorsed set of best practices and standards for the registration, description, identification, and citation of physical specimens and define an actionable plan for implementation. iSamples conducted a broad community survey about sample sharing and has created 5 different working groups to address the different challenges of developing the internet of samples - from metadata schemas and unique identifiers to an architecture of a shared cyberinfrastructure for collections, to digitization of existing collections, to education, and ultimately to establishing the physical infrastructure that will ensure preservation and access of the physical

  8. Lunar sample studies

    International Nuclear Information System (INIS)

    1977-01-01

    Lunar samples discussed and the nature of their analyses are: (1) an Apollo 15 breccia which is thoroughly analyzed as to the nature of the mature regolith from which it derived and the time and nature of the lithification process, (2) two Apollo 11 and one Apollo 12 basalts analyzed in terms of chemistry, Cross-Iddings-Pirsson-Washington norms, mineralogy, and petrography, (3) eight Apollo 17 mare basalts, also analyzed in terms of chemistry, Cross-Iddings-Pirsson-Washington norms, mineralogy, and petrography. The first seven are shown to be chemically similar although of two main textural groups; the eighth is seen to be distinct in both chemistry and mineralogy, (4) a troctolitic clast from a Fra Mauro breccia, analyzed and contrasted with other high-temperature lunar mineral assemblages. Two basaltic clasts from the same breccia are shown to have affinities with rock 14053, and (5) the uranium-thorium-lead systematics of three Apollo 16 samples are determined; serious terrestrial-lead contamination of the first two samples is attributed to bandsaw cutting in the lunar curatorial facility

  9. Sustainable Mars Sample Return

    Science.gov (United States)

    Alston, Christie; Hancock, Sean; Laub, Joshua; Perry, Christopher; Ash, Robert

    2011-01-01

    The proposed Mars sample return mission will be completed using natural Martian resources for the majority of its operations. The system uses the following technologies: In-Situ Propellant Production (ISPP), a methane-oxygen propelled Mars Ascent Vehicle (MAV), a carbon dioxide powered hopper, and a hydrogen fueled balloon system (large balloons and small weather balloons). The ISPP system will produce the hydrogen, methane, and oxygen using a Sabatier reactor, a water electrolysis cell, water extracted from the Martian surface, and carbon dioxide extracted from the Martian atmosphere. Indigenous hydrogen will fuel the balloon systems and locally-derived methane and oxygen will fuel the MAV for the return of a 50 kg sample to Earth. The ISPP system will have a production cycle of 800 days and the estimated overall mission length is 1355 days from Earth departure to return to low Earth orbit. Combining these advanced technologies will enable the proposed sample return mission to be executed with reduced initial launch mass and thus be more cost efficient. The successful completion of this mission will serve as the next step in the advancement of Mars exploration technology.
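    The ISPP chemistry rests on the Sabatier reaction, CO2 + 4 H2 → CH4 + 2 H2O, with electrolysis recycling the product water into hydrogen and oxygen. A back-of-the-envelope mass balance using standard molar masses (illustrative only, not mission figures):

```python
# Standard molar masses, g/mol
M = {"CO2": 44.01, "H2": 2.016, "CH4": 16.04, "H2O": 18.015}

def sabatier_inputs_per_kg_ch4():
    """Mass balance for CO2 + 4 H2 -> CH4 + 2 H2O.
    Returns kg of CO2 and kg of H2 consumed per kg of CH4 produced."""
    mol_ch4 = 1000.0 / M["CH4"]               # mol of CH4 in 1 kg
    co2_kg = mol_ch4 * M["CO2"] / 1000.0      # 1 mol CO2 per mol CH4
    h2_kg = mol_ch4 * 4 * M["H2"] / 1000.0    # 4 mol H2 per mol CH4
    return co2_kg, h2_kg

co2_kg, h2_kg = sabatier_inputs_per_kg_ch4()  # ~2.74 kg CO2, ~0.50 kg H2
```

The heavy CO2 input coming from the Martian atmosphere rather than from Earth is exactly what drives the reduced launch mass claimed above.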

  10. Bottom sample taker

    Energy Technology Data Exchange (ETDEWEB)

    Garbarenko, O V; Slonimskiy, L D

    1982-01-01

    In order to improve the quality of the samples taken during offshore exploration from benthic sediments, the proposed design of the sample taker has a device which makes it possible to regulate the depth of submersion of the core lifter. For this purpose the upper part of the core lifter has an inner delimiting ring, and within the core lifter there is a piston suspended on a cable. The position of the piston in relation to the core lifter is previously assigned depending on the compactness of the benthic sediments and is fixed by tension of the cable which is held by a clamp in the cover of the core taker housing. When lowered to the bottom, the core taker is released, and under the influence of hydrostatic pressure of sea water, it enters the sediments. The magnitude of penetration is limited by the distance between the piston and the stopping ring. The piston also guarantees better preservation of the sample when the instrument is lifted to the surface.

  11. Sample-taking apparatus

    Energy Technology Data Exchange (ETDEWEB)

    Tanov, Y I; Ismailov, R A; Orazov, A

    1980-10-07

    The invention refers to the equipment for testing water-bearing levels in loose rocks. Its purpose is to simultaneously remove with the rock sample a separate fluid sample from the assigned interval. The sample-taking apparatus contains a core lifter which can be submerged into the casing string with housing and front endpiece in the form of a rod with a piston which covers the cavity of the core lifter, as well as a mechanism for fixing and moving the endpiece within the core lifter cavity. The device differs from the known similar devices because the upper part of the housing of the core lifter is equipped with a filter and a mobile casing which covers the filter. In this case the casing is connected to the endpiece rod and the endpiece is installed with the possibility of movement which is limited with fixing in the upper position and in the extreme upper position it divides the core lifter cavity into two parts, a filter settling tank and a core-receiving cavity.

  12. UMAPRM: Uniformly sampling the medial axis

    KAUST Repository

    Yeh, Hsin-Yi Cindy

    2014-05-01

    © 2014 IEEE. Maintaining clearance, or distance from obstacles, is a vital component of successful motion planning algorithms. Maintaining high clearance often creates safer paths for robots. Contemporary sampling-based planning algorithms that utilize the medial axis, or the set of all points equidistant to two or more obstacles, produce higher clearance paths. However, they are biased heavily toward certain portions of the medial axis, sometimes ignoring parts critical to planning, e.g., specific types of narrow passages. We introduce Uniform Medial Axis Probabilistic RoadMap (UMAPRM), a novel planning variant that generates samples uniformly on the medial axis of the free portion of Cspace. We theoretically analyze the distribution generated by UMAPRM and show its uniformity. Our results show that UMAPRM's distribution of samples along the medial axis is not only uniform but also preferable to other medial axis samplers in certain planning problems. We demonstrate that UMAPRM has negligible computational overhead over other sampling techniques and can solve problems the others could not, e.g., a bug trap. Finally, we demonstrate UMAPRM successfully generates higher clearance paths in the examples.
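    UMAPRM's uniform sampling over general obstacles is involved, but the core retraction idea can be illustrated in the simplest possible setting: with two point obstacles, the medial axis is their perpendicular bisector, and a random sample can be retracted onto it by projection. The sketch below is that minimal illustration (all names hypothetical), not the paper's algorithm.

```python
import math

def project_to_bisector(q, o1, o2):
    """Retract sample q onto the medial axis of two point obstacles,
    i.e. their perpendicular bisector, where q becomes equidistant
    to both obstacles (maximizing local clearance)."""
    mx, my = (o1[0] + o2[0]) / 2, (o1[1] + o2[1]) / 2  # bisector point
    nx, ny = o2[0] - o1[0], o2[1] - o1[1]              # bisector normal
    norm = math.hypot(nx, ny)
    nx, ny = nx / norm, ny / norm
    d = (q[0] - mx) * nx + (q[1] - my) * ny            # signed offset
    return (q[0] - d * nx, q[1] - d * ny)

p = project_to_bisector((3.0, 1.0), (0.0, 0.0), (4.0, 0.0))
# p lies on the bisector x = 2, equidistant from both obstacles
```

The bias the paper addresses arises because such retractions hit different medial-axis regions with very unequal probability; UMAPRM corrects the sampling so the induced distribution along the axis is uniform.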

  13. Visualizing the Sample Standard Deviation

    Science.gov (United States)

    Sarkar, Jyotirmoy; Rashid, Mamunur

    2017-01-01

    The standard deviation (SD) of a random sample is defined as the square-root of the sample variance, which is the "mean" squared deviation of the sample observations from the sample mean. Here, we interpret the sample SD as the square-root of twice the mean square of all pairwise half deviations between any two sample observations. This…
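    The pairwise interpretation is easy to verify numerically: the squared sample SD equals twice the mean of the squared half-deviations over all unordered pairs. A minimal Python sketch of this identity (function and variable names are illustrative, not from the paper):

```python
import math
import statistics

def sd_from_pairs(xs):
    """Sample SD via the pairwise identity: the sample variance equals
    twice the mean of the squared half-deviations (x_i - x_j)/2 taken
    over all n*(n-1)/2 unordered pairs of observations."""
    n = len(xs)
    sq_half_devs = [((xs[i] - xs[j]) / 2) ** 2
                    for i in range(n) for j in range(i + 1, n)]
    return math.sqrt(2 * sum(sq_half_devs) / len(sq_half_devs))

xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
# agrees with the usual definition sqrt(sum((x - mean)^2) / (n - 1))
assert math.isclose(sd_from_pairs(xs), statistics.stdev(xs))
```

    The appeal of this form is that it never references the sample mean: the SD is built entirely from distances between pairs of observations.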

  14. Generational Accounting in Iran

    Directory of Open Access Journals (Sweden)

    Mahdi Salehi

    2013-09-01

    Full Text Available The aim of this paper is to study generational accounts for Iranian generations. We applied the method of Auerbach, Gokhale and Kotlikoff (1991) to the period 1967-2008 in Iran. Our calculations show that, compared with industrial countries, the fiscal burden on Iran's population is very light, which depends on the fiscal system in Iran. Except in recent years, the tax rate in Iran has been very low. The generational account for older people (40 years old) is 2117 $, but for the future generation (t+1) it is 36985 $. The shares of males and females in this burden are similar over the years. The fiscal burden on Iran's generations is low, but the population must bear another burden, namely inflation: when the government does not receive tax income, a low generational account is transferred to the general price level.

  15. Evaluation of Oconee steam-generator debris. Final report

    International Nuclear Information System (INIS)

    Rigdon, M.A.; Rubright, M.M.; Sarver, L.W.

    1981-10-01

    Pieces of debris were observed near damaged tubes at the 14th support plate elevation in the Oconee 1-B steam generator. A project was initiated to evaluate the physical and chemical nature of the debris, to identify its source, and to determine its role in tube damage at this elevation. Various laboratory techniques were used to characterize several debris and mill scale samples. Data from these samples were then compared with each other and with literature data. It was concluded that seven of eight debris samples were probably formed in the steam generator. Six of these samples were probably formed by high temperature aqueous corrosion early in the life of the steam generator. The seventh sample was probably formed by the deposition and spalling of magnetite on the Inconel steam generator tubes. None of the debris samples resembled any of the mill scale samples

  16. Distributed generation hits market

    International Nuclear Information System (INIS)

    Anon.

    1997-01-01

    The pace at which vendors are developing and marketing gas turbines and reciprocating engines for small-scale applications may signal the widespread growth of distributed generation. Loosely defined to refer to applications in which power generation equipment is located close to end users who have near-term power capacity needs, distributed generation encompasses a broad range of technologies and load requirements. Disagreement is inevitable, but many industry observers associate distributed generation with applications anywhere from 25 kW to 25 MW. Ten years ago, distributed generation users only represented about 2% of the world market. Today, that figure has increased to about 4 or 5%, and probably could settle in the 20% range within a 3-to-5-year period, according to Michael Jones, San Diego, Calif.-based Solar Turbines Inc. power generation marketing manager. The US Energy Information Administration predicts about 175 GW of generation capacity will be added domestically by 2010. If 20% comes from smaller plants, distributed generation could account for about 35 GW. Even with more competition, it's highly unlikely distributed generation will totally replace current market structures and central stations. Distributed generation may be best suited for making market inroads when and where central systems need upgrading, and should prove its worth when the system can't handle peak demands. Typical applications include small reciprocating engine generators at remote customer sites or larger gas turbines to boost the grid. Additional market opportunities include standby capacity, peak shaving, power quality, cogeneration and capacity rental for immediate demand requirements. Integration of distributed generation systems--using gas-fueled engines, gas-fired combustion engines and fuel cells--can upgrade power quality for customers and reduce operating costs for electric utilities

  17. Third generation nuclear plants

    Science.gov (United States)

    Barré, Bertrand

    2012-05-01

    After the Chernobyl accident, a new generation of Light Water Reactors has been designed and is being built. Third generation nuclear plants are equipped with dedicated systems to insure that if the worst accident were to occur, i.e. total core meltdown, no matter how low the probability of such occurrence, radioactive releases in the environment would be minimal. This article describes the EPR, representative of this "Generation III" and a few of its competitors on the world market.

  18. Modular Stirling Radioisotope Generator

    Science.gov (United States)

    Schmitz, Paul C.; Mason, Lee S.; Schifer, Nicholas A.

    2016-01-01

    High-efficiency radioisotope power generators will play an important role in future NASA space exploration missions. Stirling Radioisotope Generators (SRGs) have been identified as a candidate generator technology capable of providing mission designers with an efficient, high-specific-power electrical generator. SRGs' high conversion efficiency has the potential to extend the limited Pu-238 supply when compared with current Radioisotope Thermoelectric Generators (RTGs). Due to budgetary constraints, the Advanced Stirling Radioisotope Generator (ASRG) was canceled in the fall of 2013. Over the past year a joint study by NASA and the Department of Energy (DOE) called the Nuclear Power Assessment Study (NPAS) recommended that Stirling technologies continue to be explored. During the mission studies of the NPAS, spare SRGs were sometimes required to meet mission power system reliability requirements. This led to an additional mass penalty and increased isotope consumption levied on certain SRG-based missions. In an attempt to remove the spare power system, a new generator architecture is considered, which could increase the reliability of a Stirling generator and provide a more fault-tolerant power system. This new generator called the Modular Stirling Radioisotope Generator (MSRG) employs multiple parallel Stirling convertor/controller strings, all of which share the heat from the General Purpose Heat Source (GPHS) modules. For this design, generators utilizing one to eight GPHS modules were analyzed, which provided about 50 to 450 W of direct current (DC) to the spacecraft, respectively. Four Stirling convertors are arranged around each GPHS module resulting in from 4 to 32 Stirling/controller strings. The convertors are balanced either individually or in pairs, and are radiatively coupled to the GPHS modules. Heat is rejected through the housing/radiator, which is similar in construction to the ASRG. Mass and power analysis for these systems indicate that specific

  19. TFTR generator load assessment

    International Nuclear Information System (INIS)

    Heck, F.M.

    1975-10-01

    Typical experimental load demands on the TFTR generators are illustrated based on the electrical characteristics of the field coils, the coil leads, the main bus work, the various auxiliary bus work, the rectifiers, and transformers. The generator MW capacities are shown to be adequate for the proposed experimental operations with allowances made for variations in the final designs. The generator MVA capacities are shown to be adequate provided portions of the TF and EF rectifiers are freewheeled at selected times

  20. Solar fuels generator

    Science.gov (United States)

    Lewis, Nathan S.; Spurgeon, Joshua M.

    2016-10-25

    The solar fuels generator includes an ionically conductive separator between a gaseous first phase and a second phase. A photoanode uses one or more components of the first phase to generate cations during operation of the solar fuels generator. A cation conduit provides a pathway along which the cations travel from the photoanode to the separator. The separator conducts the cations. A second solid cation conduit conducts the cations from the separator to a photocathode.

  1. Factors affecting the rural domestic waste generation

    Directory of Open Access Journals (Sweden)

    A.R. Darban Astane

    2017-12-01

    Full Text Available The current study was carried out to evaluate the quantity and quality of rural domestic waste generation and to identify the factors affecting it in rural areas of Khodabandeh county in Zanjan Province, Iran. Waste samplings consisted of 318 rural households in 11 villages. In order to evaluate the quality and quantity of the rural domestic waste, waste production was classified into 12 groups and 2 main groups of organic waste and solid waste. Moreover, the kriging interpolation technique in ARC-GIS software was used to evaluate the spatial distribution of the generated domestic waste, and ultimately multiple regression analysis was used to evaluate the factors affecting the generation of domestic waste. The results of this study showed that the average waste generated by each person was 0.588 kilograms per day, with the share of organic waste generated by each person being 0.409 kilograms per day and the share of solid waste generated by each person being 0.179 kilograms per day. The results from the spatial distribution of waste generation showed a certain pattern in three groups and a higher rate of waste generation in the northern and northwestern parts, especially in the subdistrict. The results of multiple regression analysis showed that the households’ income, assets, age, and personal attitude are, respectively, the most important variables affecting waste generation. The households’ attitude and indigenous knowledge on efficient use of materials are also key factors which can help reduce waste generation.

  2. Microwatt thermoelectric generator

    International Nuclear Information System (INIS)

    Hittman, F.; Bustard, T.S.

    1976-01-01

    A microwatt thermoelectric generator suitable for implanting in the body is described. The generator utilizes a nuclear energy source. Provision is made for temporary electrical connection to the generator for testing purposes, and for ensuring that the heat generated by the nuclear source does not bypass the pile. Also disclosed is a getter which is resistant to shrinkage during sintering, and a foil configuration for controlling the radiation of heat from the nuclear source to the hot plate of the pile. 2 claims, 4 drawing figures

  3. Microwatt thermoelectric generator

    International Nuclear Information System (INIS)

    Barr, H.N.

    1978-01-01

    A microwatt thermoelectric generator suitable for implanting in the body is described. The disclosed generator utilizes a nuclear energy source. Provision is made for temporary electrical connection to the generator for testing purposes, and for ensuring that the heat generated by the nuclear source does not bypass the pile. Also disclosed is a getter which is resistant to shrinkage during sintering, and a foil configuration for controlling the radiation of heat from the nuclear source to the hot plate of the pile. 4 claims, 4 figures

  4. SMUG: Scientific Music Generator

    DEFF Research Database (Denmark)

    Scirea, Marco; A B Barros, Gabriella; Togelius, Julian

    2015-01-01

    Music is based on the real world. Composers use their day-to-day lives as inspiration to create rhythm and lyrics. Procedural music generators are capable of creating good quality pieces, and while some already use the world as inspiration, there is still much to be explored in this. We describe...... a system to generate lyrics and melodies from real-world data, in particular from academic papers. Through this we want to create a playful experience and establish a novel way of generating content (textual and musical) that could be applied to other domains, in particular to games. For melody generation...

  5. NEGATIVE GATE GENERATOR

    Science.gov (United States)

    Jones, C.S.; Eaton, T.E.

    1958-02-01

    This patent relates to pulse generating circuits and more particularly to rectangular pulse generators. The pulse generator of the present invention incorporates thyratrons as switching elements to discharge a first capacitor through a load resistor to initiate and provide the body of a pulse, and subsequently discharge a second capacitor to impress the potential of its charge, with opposite polarity, across the load resistor to terminate the pulse. Accurate rectangular pulses in the millimicrosecond range are produced across a low impedance by this generator.

  6. Third generation coaching

    DEFF Research Database (Denmark)

    Stelter, Reinhard

    2014-01-01

    Third generation coaching unfolds a new universe for coaching and coaching psychology in the framework of current social research, new learning theories and discourses about personal leadership. Third generation coaching views coaching in a societal perspective. Coaching has become important...... transformation. Coaching thus facilitates new reflections and perspectives, as well as empowerment and support for self-Bildung processes. Third generation coaching focuses on the coach and the coachee in their narrative collaborative partnership. Unlike first generation coaching, where the goal is to help...

  7. Gearless wind power generator

    Energy Technology Data Exchange (ETDEWEB)

    Soederlund, L.; Ridanpaeae, P.; Vihriaelae, H.; Peraelae, R. [Tampere Univ. of Technology (Finland). Lab. of Electricity and Magnetism

    1998-12-31

    During the wind power generator project a design algorithm for a gearless permanent magnet generator with an axially orientated magnetic flux was developed and a 10 kW model machine was constructed. Utilising the test results a variable wind speed system of 100 kW was designed that incorporates a permanent magnet generator, a frequency converter and a fuzzy controller. This system produces about 5-15% more energy than existing types and stresses to the blades are minimised. The type of generator designed in the project represents in general a gearless solution for slow-speed electrical drives. (orig.)

  8. EPRI steam generator programs

    International Nuclear Information System (INIS)

    Martel, L.J.; Passell, T.O.; Bryant, P.E.C.; Rentler, R.M.

    1977-01-01

    The paper describes the current overall EPRI steam generator program plan and some of the ongoing projects. Because of the recent occurrence of a corrosion phenomenon called ''denting,'' which has affected a number of operating utilities, an expanded program plan is being developed which addresses the broad and urgent needs required to achieve improved steam generator reliability. The goal of improved steam generator reliability will require advances in various technologies and also a management philosophy that encourages conscientious efforts to apply the improved technologies to the design, procurement, and operation of plant systems and components that affect the full life reliability of steam generators

  9. Subrandom methods for multidimensional nonuniform sampling.

    Science.gov (United States)

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics. Copyright © 2016 Elsevier Inc. All rights reserved.
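    One classic family of subrandom (low-discrepancy) sequences is the van der Corput sequence, which is fully deterministic and therefore seed-free. The sketch below shows how such a sequence could drive a nonuniform sampling schedule over a grid; it assumes uniform weighting for simplicity and is an illustration of the idea, not the paper's implementation:

```python
def van_der_corput(n, base=2):
    """n-th element of the base-b van der Corput sequence in [0, 1):
    deterministic and seed-free, yet well spread over the interval."""
    q, denom = 0.0, 1.0
    while n:
        denom *= base
        n, r = divmod(n, base)
        q += r / denom
    return q

def subrandom_schedule(grid_size, n_points):
    """Map successive van der Corput values onto distinct grid indices,
    producing a reproducible sampling schedule with no random seed.
    Assumes n_points <= grid_size."""
    picked, i = [], 0
    while len(picked) < n_points:
        idx = int(van_der_corput(i) * grid_size)
        if idx not in picked:
            picked.append(idx)
        i += 1
    return sorted(picked)

print(subrandom_schedule(16, 8))  # → [0, 2, 4, 6, 8, 10, 12, 14]
```

    Because the sequence is deterministic, two runs of the schedule are identical, which is exactly the seed-independence property the abstract highlights.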

  10. Sampling of temporal networks: Methods and biases

    Science.gov (United States)

    Rocha, Luis E. C.; Masuda, Naoki; Holme, Petter

    2017-11-01

    Temporal networks have been increasingly used to model a diversity of systems that evolve in time; for example, human contact structures over which dynamic processes such as epidemics take place. A fundamental aspect of real-life networks is that they are sampled within temporal and spatial frames. Furthermore, one might wish to subsample networks to reduce their size for better visualization or to perform computationally intensive simulations. The sampling method may affect the network structure and thus caution is necessary to generalize results based on samples. In this paper, we study four sampling strategies applied to a variety of real-life temporal networks. We quantify the biases generated by each sampling strategy on a number of relevant statistics such as link activity, temporal paths and epidemic spread. We find that some biases are common in a variety of networks and statistics, but one strategy, uniform sampling of nodes, shows improved performance in most scenarios. Given the particularities of temporal network data and the variety of network structures, we recommend that the choice of sampling methods be problem oriented to minimize the potential biases for the specific research questions on hand. Our results help researchers to better design network data collection protocols and to understand the limitations of sampled temporal network data.
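    Of the strategies compared, uniform node sampling is the simplest to state: draw a node subset uniformly at random and keep only the events whose endpoints both fall in it. A minimal sketch on a timestamped edge list (the `(t, u, v)` event format is an assumption for illustration, not the paper's data model):

```python
import random

def uniform_node_sample(events, fraction, seed=0):
    """Subsample a temporal network by uniform node sampling: keep only
    events whose two endpoints both lie in a random node subset.
    events: iterable of (timestamp, node_u, node_v) tuples."""
    rng = random.Random(seed)
    nodes = sorted({n for _, u, v in events for n in (u, v)})
    kept = set(rng.sample(nodes, max(1, int(fraction * len(nodes)))))
    return [(t, u, v) for t, u, v in events if u in kept and v in kept]

events = [(1, "a", "b"), (2, "b", "c"), (3, "a", "c"), (4, "c", "d")]
sub = uniform_node_sample(events, 0.5)
assert all(e in events for e in sub)
```

    Note that even this simple strategy biases temporal statistics such as path counts, which is why the authors recommend choosing the sampling method per research question.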

  11. Commanding Generation Y: How Generation X Military Leaders Can Better Utilize Generational Tendencies

    Science.gov (United States)

    2013-03-21

    generation (Baby Boomers). Although the profession of arms is a time-honored tradition steeped in discipline...senior leadership generational tendencies. Keywords: Command; Leadership; Generation; Baby Boomer; Generation X; Generation Y. ...enable commanders to better lead Generation Y within the U.S. military. Discussion: Baby Boomers, Generation X, and Generation Y are

  12. Pulsed Corona Discharge Generated By Marx Generator

    Science.gov (United States)

    Sretenovic, G. B.; Obradovic, B. M.; Kovacevic, V. V.; Kuraica, M. M.; Puric, J.

    2010-07-01

    The pulsed plasma has a significant role in new environmental protection technologies. As a part of a pulsed corona system for pollution control applications, Marx type repetitive pulse generator was constructed and tested in arrangement with wire-plate corona reactor. We performed electrical measurements, and obtained voltage and current signals, and also power and energy delivered per pulse. Ozone formation by streamer plasma in air was chosen to monitor chemical activity of the pulsed corona discharge.

  13. Sample Selection for Training Cascade Detectors.

    Science.gov (United States)

    Vállez, Noelia; Deniz, Oscar; Bueno, Gloria

    2015-01-01

    Automatic detection systems usually require large and representative training datasets in order to obtain good detection and false positive rates. Training datasets are such that the positive set has few samples and/or the negative set should represent anything except the object of interest. In this respect, the negative set typically contains orders of magnitude more images than the positive set. However, imbalanced training databases lead to biased classifiers. In this paper, we focus our attention on a negative sample selection method to properly balance the training data for cascade detectors. The method is based on the selection of the most informative false positive samples generated in one stage to feed the next stage. The results show that the proposed cascade detector with sample selection obtains on average better partial AUC and smaller standard deviation than the other compared cascade detectors.
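    The selection rule, feeding each new stage with the false positives the current cascade fails to reject, can be sketched with toy one-dimensional threshold stages. The `fit_threshold` learner and the numeric data are hypothetical stand-ins for the paper's image classifiers:

```python
def fit_threshold(positives, negatives):
    """Toy 1-D stage: accept x when it lies above the midpoint of the
    two class means (a stand-in for a real per-stage classifier)."""
    t = (sum(positives) / len(positives) + sum(negatives) / len(negatives)) / 2
    return lambda x: x > t

def train_cascade(n_stages, positives, negative_pool, n_neg):
    """Train up to n_stages cascade stages. Each stage trains on the
    positives plus the most informative negatives: the pool samples
    that every earlier stage wrongly accepts (false positives)."""
    stages, negatives = [], negative_pool[:n_neg]
    for _ in range(n_stages):
        if not negatives:
            break  # nothing left that the cascade accepts by mistake
        stages.append(fit_threshold(positives, negatives))
        negatives = [x for x in negative_pool
                     if all(s(x) for s in stages)][:n_neg]
    return stages

cascade = train_cascade(5, [9, 10, 11], list(range(9)), 3)
accept = lambda x: all(s(x) for s in cascade)
assert accept(10) and not accept(4)
```

    Training stops early once no pool sample survives the whole cascade, which is also why later stages see progressively harder negatives.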

  14. Sample Selection for Training Cascade Detectors.

    Directory of Open Access Journals (Sweden)

    Noelia Vállez

    Full Text Available Automatic detection systems usually require large and representative training datasets in order to obtain good detection and false positive rates. Training datasets are such that the positive set has few samples and/or the negative set should represent anything except the object of interest. In this respect, the negative set typically contains orders of magnitude more images than the positive set. However, imbalanced training databases lead to biased classifiers. In this paper, we focus our attention on a negative sample selection method to properly balance the training data for cascade detectors. The method is based on the selection of the most informative false positive samples generated in one stage to feed the next stage. The results show that the proposed cascade detector with sample selection obtains on average better partial AUC and smaller standard deviation than the other compared cascade detectors.

  15. Fast Ordered Sampling of DNA Sequence Variants

    Directory of Open Access Journals (Sweden)

    Anthony J. Greenberg

    2018-05-01

    Full Text Available Explosive growth in the amount of genomic data is matched by increasing power of consumer-grade computers. Even applications that require powerful servers can be quickly tested on desktop or laptop machines if we can generate representative samples from large data sets. I describe a fast and memory-efficient implementation of an on-line sampling method developed for tape drives 30 years ago. Focusing on genotype files, I test the performance of this technique on modern solid-state and spinning hard drives, and show that it performs well compared to a simple sampling scheme. I illustrate its utility by developing a method to quickly estimate genome-wide patterns of linkage disequilibrium (LD decay with distance. I provide open-source software that samples loci from several variant format files, a separate program that performs LD decay estimates, and a C++ library that lets developers incorporate these methods into their own projects.
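    The one-pass ("on-line") idea the abstract alludes to can be sketched as reservoir sampling with the original order restored at the end. This is a generic illustration of the technique, not the article's actual algorithm or its variant-file handling:

```python
import random

def ordered_sample(stream, k, seed=0):
    """Draw k items from a stream of unknown length in a single pass,
    using memory proportional to k, and return them in stream order."""
    rng = random.Random(seed)
    reservoir = []  # pairs of (position, item)
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append((i, item))
        else:
            j = rng.randrange(i + 1)
            # item i replaces a reservoir slot with probability k/(i+1)
            if j < k:
                reservoir[j] = (i, item)
    return [item for _, item in sorted(reservoir)]

picked = ordered_sample(range(1000), 10)
assert len(picked) == 10 and picked == sorted(picked)
```

    Because the stream is consumed once and never held in memory, the same routine works for genotype files far larger than RAM, which is the use case the article targets.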

  16. Fast Ordered Sampling of DNA Sequence Variants.

    Science.gov (United States)

    Greenberg, Anthony J

    2018-05-04

    Explosive growth in the amount of genomic data is matched by increasing power of consumer-grade computers. Even applications that require powerful servers can be quickly tested on desktop or laptop machines if we can generate representative samples from large data sets. I describe a fast and memory-efficient implementation of an on-line sampling method developed for tape drives 30 years ago. Focusing on genotype files, I test the performance of this technique on modern solid-state and spinning hard drives, and show that it performs well compared to a simple sampling scheme. I illustrate its utility by developing a method to quickly estimate genome-wide patterns of linkage disequilibrium (LD) decay with distance. I provide open-source software that samples loci from several variant format files, a separate program that performs LD decay estimates, and a C++ library that lets developers incorporate these methods into their own projects. Copyright © 2018 Greenberg.

  17. NID Copper Sample Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kouzes, Richard T.; Zhu, Zihua

    2011-09-12

    The current focal point of the nuclear physics program at PNNL is the MAJORANA DEMONSTRATOR, and the follow-on Tonne-Scale experiment, a large array of ultra-low background high-purity germanium detectors, enriched in 76Ge, designed to search for zero-neutrino double-beta decay (0νββ). This experiment requires the use of germanium isotopically enriched in 76Ge. The MAJORANA DEMONSTRATOR is a DOE and NSF funded project with a major science impact. The DEMONSTRATOR will utilize 76Ge from Russia, but for the Tonne-Scale experiment it is hoped that an alternate technology, possibly one under development at Nonlinear Ion Dynamics (NID), will be a viable, US-based, lower-cost source of separated material. Samples of separated material from NID require analysis to determine the isotopic distribution and impurities. DOE is funding NID through an SBIR grant for development of their separation technology for application to the Tonne-Scale experiment. The Environmental Molecular Sciences facility (EMSL), a DOE user facility at PNNL, has the required mass spectroscopy instruments for making isotopic measurements that are essential to the quality assurance for the MAJORANA DEMONSTRATOR and for the development of the future separation technology required for the Tonne-Scale experiment. A sample of isotopically separated copper was provided by NID to PNNL in January 2011 for isotopic analysis as a test of the NID technology. The results of that analysis are reported here. A second sample of isotopically separated copper was provided by NID to PNNL in August 2011 for isotopic analysis as a test of the NID technology. The results of that analysis are also reported here.

  18. Analysis of IFR samples at ANL-E

    International Nuclear Information System (INIS)

    Bowers, D.L.; Sabau, C.S.

    1993-01-01

    The Analytical Chemistry Laboratory analyzes a variety of samples submitted by the different research groups within IFR. This talk describes the analytical work on samples generated by the Plutonium Electrorefiner, Large Scale Electrorefiner and Waste Treatment Studies. The majority of these samples contain transuranics and necessitate facilities that safely contain these radioisotopes. Details such as sample receiving, dissolution techniques, chemical separations, instrumentation used, and reporting of results are discussed. The importance of interactions between customer and analytical personnel is also demonstrated

  19. Generation and storage of quantum states using cold atoms

    DEFF Research Database (Denmark)

    Dantan, Aurelien Romain; Josse, Vincent; Cviklinski, Jean

    2006-01-01

    Cold cesium or rubidium atomic samples have a good potential both for generation and storage of nonclassical states of light. Generation of nonclassical states of light is possible through the high non-linearity of cold atomic samples excited close to a resonance line. Quadrature squeezing, polar...

  20. Characterization plan for the Hanford Generating Plant (HGP)

    International Nuclear Information System (INIS)

    Marske, S.G.

    1996-09-01

    This characterization plan describes the sample collection and sample analysis activities to characterize the Hanford Generating Plant and associated solid waste management units (SWMUs). The analytical data will be used to identify the radiological contamination in the Hanford Generating Plant as well as the presence of radiological and hazardous materials in the SWMUs to support further estimates of decontamination requirements for demolition

  1. Samples and Sampling Protocols for Scientific Investigations | Joel ...

    African Journals Online (AJOL)

    ... from sampling, through sample preparation, calibration to final measurement and reporting. This paper, therefore offers useful information on practical guidance on sampling protocols in line with best practice and international standards. Keywords: Sampling, sampling protocols, chain of custody, analysis, documentation ...

  2. Systematic Sampling and Cluster Sampling of Packet Delays

    OpenAIRE

    Lindh, Thomas

    2006-01-01

    Based on experiences of a traffic flow performance meter this paper suggests and evaluates cluster sampling and systematic sampling as methods to estimate average packet delays. Systematic sampling facilitates for example time analysis, frequency analysis and jitter measurements. Cluster sampling with repeated trains of periodically spaced sampling units separated by random starting periods, and systematic sampling are evaluated with respect to accuracy and precision. Packet delay traces have been ...
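    The two schemes compared can be sketched directly on a list of per-packet delays. Parameter names here are illustrative; the paper's meter works on live traffic rather than a stored list:

```python
def systematic_sample(delays, period, offset=0):
    """Systematic sampling: every period-th delay value, starting at
    offset (e.g. one packet in every 10)."""
    return delays[offset::period]

def cluster_sample(delays, train_len, gap, offset=0):
    """Cluster sampling: trains of train_len consecutive delay values
    separated by gap unsampled packets (fixed offset shown; the paper
    randomizes the starting periods between trains)."""
    out, i = [], offset
    while i < len(delays):
        out.extend(delays[i:i + train_len])
        i += train_len + gap
    return out

delays = list(range(100))
assert len(systematic_sample(delays, 10)) == 10
assert cluster_sample(delays, 2, 8)[:4] == [0, 1, 10, 11]
```

    The trade-off mirrors the abstract: systematic samples support time- and frequency-domain analysis of the delay process, while trains of consecutive packets capture short-range jitter.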

  3. Design and performance of chromium mist generator

    Directory of Open Access Journals (Sweden)

    Tirgar Aram

    2006-01-01

    Full Text Available Chromium mist generator is an essential tool for conducting researches and making science-based recommendations to evaluate air pollution and its control systems. The purpose of this research was to design and construct a homogenous chromium mist generator and to study some effective factors, including sampling height and the distance between samplers in side-by-side sampling, on the chromium mist sampling method. A mist generator was constructed, using a chromium electroplating bath in pilot scale. Concentration of CrO3 and sulfuric acid in plating solution was 125 g L-1 and 1.25 g L-1, respectively. In order to create permanent air sampling locations, a Plexiglas cylindrical chamber (75 cm height, 55 cm i.d.) was installed above the bath. Sixty holes were produced on the chamber in 3 rows (20 each). The distance between rows and holes was 15 and 7.5 cm, respectively. Homogeneity and effective factors were studied via the side-by-side air sampling method. So, 48 clusters of samples were collected on polyvinyl chloride (PVC) filters housed in sampling cassettes. Cassettes were located 35, 50, and 65 cm above the solution surface with less than 7.5 and/or 7.5-15 cm distance between heads. All samples were analyzed according to NIOSH method 7600. According to the ANOVA test, no significant differences were observed between different sampling locations in side-by-side sampling (P=0.82) or between sampling heights and different sampler distances (P=0.86 and 0.86, respectively). However, there were notable differences between the means of the coefficients of variation (CV) at various heights and distances. It is concluded that the best chromium mist homogeneity could be obtained at a height of 50 cm above the bath solution surface and a sampler distance of < 7.5 cm.

  4. An 'intelligent' approach to radioimmunoassay sample counting employing a microprocessor-controlled sample counter

    International Nuclear Information System (INIS)

    Ekins, R.P.; Sufi, S.; Malan, P.G.

    1978-01-01

    The enormous impact on medical science in the last two decades of microanalytical techniques employing radioisotopic labels has, in turn, generated a large demand for automatic radioisotopic sample counters. Such instruments frequently comprise the most important item of capital equipment required in the use of radioimmunoassay and related techniques and often form a principal bottleneck in the flow of samples through a busy laboratory. It is therefore imperative that such instruments should be used 'intelligently' and in an optimal fashion to avoid both the very large capital expenditure involved in the unnecessary proliferation of instruments and the time delays arising from their sub-optimal use. Most of the current generation of radioactive sample counters nevertheless rely on primitive control mechanisms based on a simplistic statistical theory of radioactive sample counting which preclude their efficient and rational use. The fundamental principle upon which this approach is based is that it is useless to continue counting a radioactive sample for a time longer than that required to yield a significant increase in precision of the measurement. Thus, since substantial experimental errors occur during sample preparation, these errors should be assessed and must be related to the counting errors for that sample. The objective of the paper is to demonstrate that the combination of a realistic statistical assessment of radioactive sample measurement, together with the more sophisticated control mechanisms that modern microprocessor technology make possible, may often enable savings in counter usage of the order of 5- to 10-fold to be made. (author)
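    The underlying statistics can be made concrete: for N recorded counts the Poisson relative error is 1/sqrt(N), so once that falls well below the sample-preparation error, further counting adds essentially nothing. A minimal illustration of this stopping idea (the factor of 3 is an arbitrary illustrative margin, not a value from the paper):

```python
import math

def counts_needed(prep_cv, ratio=3.0):
    """Counts N at which the Poisson counting CV, 1/sqrt(N), sits a
    factor `ratio` below the sample-preparation CV. Counting past N
    barely improves the combined precision, so the counter can stop.
    Solving 1/sqrt(N) = prep_cv/ratio gives N = ratio^2 / prep_cv^2."""
    return math.ceil(ratio ** 2 / prep_cv ** 2)

# with a 5% preparation error, about 3600 counts already suffice
print(counts_needed(0.05))  # → 3600
```

    A counter applying such a rule stops early on strong samples and spends its time where it matters, which is the source of the 5- to 10-fold savings the authors claim.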

  5. 'Intelligent' approach to radioimmunoassay sample counting employing a microprocessor controlled sample counter

    International Nuclear Information System (INIS)

    Ekins, R.P.; Sufi, S.; Malan, P.G.

    1977-01-01

    The enormous impact on medical science in the last two decades of microanalytical techniques employing radioisotopic labels has, in turn, generated a large demand for automatic radioisotopic sample counters. Such instruments frequently comprise the most important item of capital equipment required in the use of radioimmunoassay and related techniques and often form a principal bottleneck in the flow of samples through a busy laboratory. It is therefore particularly imperative that such instruments should be used 'intelligently' and in an optimal fashion to avoid both the very large capital expenditure involved in the unnecessary proliferation of instruments and the time delays arising from their sub-optimal use. The majority of the current generation of radioactive sample counters nevertheless rely on primitive control mechanisms, based on a simplistic statistical theory of radioactive sample counting, which preclude their efficient and rational use. The fundamental principle upon which this approach is based is that it is useless to continue counting a radioactive sample for a time longer than that required to yield a significant increase in the precision of the measurement. Thus, since substantial experimental errors occur during sample preparation, these errors should be assessed and related to the counting errors for that sample. It is the objective of this presentation to demonstrate that the combination of a realistic statistical assessment of radioactive sample measurement with the more sophisticated control mechanisms that modern microprocessor technology makes possible may often enable savings in counter usage of the order of 5- to 10-fold to be made. (orig.) [de

  6. Sampling the Mouse Hippocampal Dentate Gyrus

    Directory of Open Access Journals (Sweden)

    Lisa Basler

    2017-12-01

    Full Text Available Sampling is a critical step in procedures that generate quantitative morphological data in the neurosciences. Samples need to be representative to allow statistical evaluations, and samples need to deliver a precision that makes statistical evaluations not only possible but also meaningful. Sampling-generated variability should, e.g., not be able to hide significant group differences from statistical detection if they are present. Estimators of the coefficient of error (CE) have been developed to provide tentative answers to the question of whether sampling has been “good enough” to provide meaningful statistical outcomes. We tested the performance of the commonly used Gundersen-Jensen CE estimator, using the layers of the mouse hippocampal dentate gyrus as an example (molecular layer, granule cell layer and hilus). We found that this estimator provided useful estimates of the precision that can be expected from samples of different sizes. For all layers, we found that a smoothness factor (m) of 0 generally provided better estimates than an m of 1. Only for the combined layers, i.e., the entire dentate gyrus, could better CE estimates be obtained using an m of 1. The orientation of the sections affected CE sizes. Frontal (coronal) sections are typically most efficient, providing the smallest CEs for a given amount of work. Applying the estimator to 3D-reconstructed layers and using very intense sampling, we observed CE size plots with m = 0 to m = 1 transitions that are expected theoretically but are not often observed in real section series. The data we present also allow the reader to approximate the sampling intervals in frontal, horizontal or sagittal sections that provide CEs of specified sizes for the layers of the mouse dentate gyrus.
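For readers wanting to experiment with the idea, here is a simplified sketch of a Gundersen-Jensen-style CE estimator for a systematic series of section counts. It omits the counting-noise ("nugget") term of the full estimator, and the section counts in the usage line are invented, so treat it as an illustration of the smoothness-factor mechanics rather than the authors' exact implementation:

```python
def gj_ce(counts, m=0):
    """Simplified Gundersen-Jensen coefficient-of-error estimator for a
    systematic series of per-section counts/areas `counts`, for smoothness
    class m = 0 or m = 1.  The counting-noise ('nugget') term is omitted."""
    a = sum(p * p for p in counts)                       # sum P_i * P_i
    b = sum(p * q for p, q in zip(counts, counts[1:]))   # sum P_i * P_{i+1}
    c = sum(p * q for p, q in zip(counts, counts[2:]))   # sum P_i * P_{i+2}
    alpha = 12.0 if m == 0 else 240.0
    var = max(3.0 * a - 4.0 * b + c, 0.0) / alpha
    return var ** 0.5 / sum(counts)

# Invented per-section counts through a structure:
ce0 = gj_ce([120, 180, 210, 190, 140], m=0)
ce1 = gj_ce([120, 180, 210, 190, 140], m=1)
```

Because the m = 1 variant divides the same covariogram combination by 240 rather than 12, its CE estimates are systematically smaller, which is why the choice of m matters for judging whether sampling was "good enough".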

  7. NID Copper Sample Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kouzes, Richard T.; Zhu, Zihua

    2011-02-01

    The current focal point of the nuclear physics program at PNNL is the MAJORANA DEMONSTRATOR, and the follow-on Tonne-Scale experiment, a large array of ultra-low background high-purity germanium detectors, enriched in 76Ge, designed to search for zero-neutrino double-beta decay (0νββ). This experiment requires the use of germanium isotopically enriched in 76Ge. The DEMONSTRATOR will utilize 76Ge from Russia, but for the Tonne-Scale experiment it is hoped that an alternate technology under development at Nonlinear Ion Dynamics (NID) will be a viable, US-based, lower-cost source of separated material. Samples of separated material from NID require analysis to determine the isotopic distribution and impurities. The MAJORANA DEMONSTRATOR is a DOE and NSF funded project with a major science impact. DOE is funding NID through an SBIR grant for development of their separation technology for application to the Tonne-Scale experiment. The Environmental Molecular Sciences facility (EMSL), a DOE user facility at PNNL, has the required mass spectroscopy instruments for making these isotopic measurements that are essential to the quality assurance for the MAJORANA DEMONSTRATOR and for the development of the future separation technology required for the Tonne-Scale experiment. A sample of isotopically separated copper was provided by NID to PNNL for isotopic analysis as a test of the NID technology. The results of that analysis are reported here.

  8. Fluid sampling tool

    Science.gov (United States)

    Garcia, A.R.; Johnston, R.G.; Martinez, R.K.

    1999-05-25

    A fluid sampling tool is described for sampling fluid from a container. The tool has a fluid collecting portion which is drilled into the container wall, thereby affixing it to the wall. The tool may have a fluid extracting section which withdraws fluid collected by the fluid collecting section. The fluid collecting section has a fluted shank with an end configured to drill a hole into a container wall. The shank has a threaded portion for tapping the borehole. The shank is threadably engaged to a cylindrical housing having an inner axial passageway sealed at one end by a septum. A flexible member having a cylindrical portion and a bulbous portion is provided. The housing can be slid into an inner axial passageway in the cylindrical portion and sealed to the flexible member. The bulbous portion has an outer lip defining an opening. The housing is clamped into the chuck of a drill, the lip of the bulbous section is pressed against a container wall until the shank touches the wall, and the user operates the drill. Wall shavings (kerf) are confined in a chamber formed in the bulbous section as it folds when the shank advances inside the container. After sufficient advancement of the shank, an o-ring makes a seal with the container wall. 6 figs.

  9. NID Copper Sample Analysis

    International Nuclear Information System (INIS)

    Kouzes, Richard T.; Zhu, Zihua

    2011-01-01

    The current focal point of the nuclear physics program at PNNL is the MAJORANA DEMONSTRATOR, and the follow-on Tonne-Scale experiment, a large array of ultra-low background high-purity germanium detectors, enriched in 76 Ge, designed to search for zero-neutrino double-beta decay (0νββ). This experiment requires the use of germanium isotopically enriched in 76 Ge. The DEMONSTRATOR will utilize 76 Ge from Russia, but for the Tonne-Scale experiment it is hoped that an alternate technology under development at Nonlinear Ion Dynamics (NID) will be a viable, US-based, lower-cost source of separated material. Samples of separated material from NID require analysis to determine the isotopic distribution and impurities. The MAJORANA DEMONSTRATOR is a DOE and NSF funded project with a major science impact. DOE is funding NID through an SBIR grant for development of their separation technology for application to the Tonne-Scale experiment. The Environmental Molecular Sciences facility (EMSL), a DOE user facility at PNNL, has the required mass spectroscopy instruments for making these isotopic measurements that are essential to the quality assurance for the MAJORANA DEMONSTRATOR and for the development of the future separation technology required for the Tonne-Scale experiment. A sample of isotopically separated copper was provided by NID to PNNL for isotopic analysis as a test of the NID technology. The results of that analysis are reported here.

  10. Fluid sampling tool

    Science.gov (United States)

    Garcia, Anthony R.; Johnston, Roger G.; Martinez, Ronald K.

    1999-05-25

    A fluid sampling tool for sampling fluid from a container. The tool has a fluid collecting portion which is drilled into the container wall, thereby affixing it to the wall. The tool may have a fluid extracting section which withdraws fluid collected by the fluid collecting section. The fluid collecting section has a fluted shank with an end configured to drill a hole into a container wall. The shank has a threaded portion for tapping the borehole. The shank is threadably engaged to a cylindrical housing having an inner axial passageway sealed at one end by a septum. A flexible member having a cylindrical portion and a bulbous portion is provided. The housing can be slid into an inner axial passageway in the cylindrical portion and sealed to the flexible member. The bulbous portion has an outer lip defining an opening. The housing is clamped into the chuck of a drill, the lip of the bulbous section is pressed against a container wall until the shank touches the wall, and the user operates the drill. Wall shavings (kerf) are confined in a chamber formed in the bulbous section as it folds when the shank advances inside the container. After sufficient advancement of the shank, an o-ring makes a seal with the container wall.

  11. Tritium sampling and measurement

    International Nuclear Information System (INIS)

    Wood, M.J.; McElroy, R.G.; Surette, R.A.; Brown, R.M.

    1993-01-01

    Current methods for sampling and measuring tritium are described. Although the basic techniques have not changed significantly over the last 10 y, there have been several notable improvements in tritium measurement instrumentation. The design and quality of commercial ion-chamber-based and gas-flow-proportional-counter-based tritium monitors for tritium-in-air have improved, an indirect result of fusion-related research in the 1980s. For tritium-in-water analysis, commercial low-level liquid scintillation spectrometers capable of detecting tritium-in-water concentrations as low as 0.65 Bq L-1 for counting times of 500 min are available. The most sensitive method for tritium-in-water analysis is still 3He mass spectrometry. Concentrations as low as 0.35 mBq L-1 can be detected with current equipment. Passive tritium-oxide-in-air samplers are now being used for workplace monitoring and even in some environmental sampling applications. The reliability, convenience, and low cost of passive tritium-oxide-in-air samplers make them attractive options for many monitoring applications. Airflow proportional counters currently under development look promising for measuring tritium-in-air in the presence of high gamma and/or noble gas backgrounds. However, these detectors are currently limited by their poor performance in humidities over 30%. 133 refs

  12. Wroclaw neutrino event generator

    International Nuclear Information System (INIS)

    Nowak, J A

    2006-01-01

    A neutrino event generator developed by the Wroclaw Neutrino Group is described. The physical models included in the generator are discussed and illustrated with the results of simulations. The considered processes are quasi-elastic scattering and pion production modelled by combining the Δ resonance excitation and deep inelastic scattering

  13. When Generations Collide

    Science.gov (United States)

    Fogg, Piper

    2008-01-01

    When four generations converge in the academic workplace, it can create serious culture clashes. It is happening across college campuses--in offices as diverse as admissions, student affairs, legal affairs, and technology. It is especially striking in the faculty ranks, where generational challenges have extra significance amid recruiting efforts,…

  14. Photovoltaic Bias Generator

    Science.gov (United States)

    2018-02-01

    [Abstract not available; the record's extract contains only report boilerplate and figure captions: Fig. 3, interior view of the photovoltaic bias generator showing the wrapped-wire side of the circuit board; Fig. 4, interior view of the photovoltaic bias generator showing the component side of the circuit board.]

  15. Third Generation Coaching

    DEFF Research Database (Denmark)

    Stelter, Reinhard

    2016-01-01

    … giving groups or teams new orientation at a deeper level of meaning. In contrast to first-generation coaching, in which the achievement of specific, predefined goals is in the foreground, and in contrast to second-generation coaching, in which desirable future…

  16. Hospitality services generate revenue.

    Science.gov (United States)

    Bizouati, S

    1993-01-01

    An increasing number of hospitals are undertaking external revenue-generating activities to supplement their shrinking budgets. Written at the request of Leadership, this article outlines an example of a successful catering service -- a money-generating business that more Canadian hospitals could profitably consider.

  17. Aerodynamically shaped vortex generators

    DEFF Research Database (Denmark)

    Hansen, Martin Otto Laver; Velte, Clara Marika; Øye, Stig

    2016-01-01

    An aerodynamically shaped vortex generator has been proposed, manufactured and tested in a wind tunnel. The effect on the overall performance when applied on a thick airfoil is an increased lift to drag ratio compared with standard vortex generators. Copyright © 2015 John Wiley & Sons, Ltd....

  18. Understanding portable generators

    Energy Technology Data Exchange (ETDEWEB)

    Hills, A.; Hawkins, B. [Guelph Univ., ON (Canada); Clarke, S. [Ontario Ministry of Agriculture, Food and Rural Affairs, Toronto, ON (Canada)

    2000-06-01

    This factsheet is intended to help consumers select a small portable generator for emergency electrical needs. Interest in standby generators has been heightened ever since the prolonged power outage in Eastern Ontario and Southwestern Quebec during the 1998 ice storm and the concern over Y2K related outages. Farmers, in particular, have been reassessing their need for emergency electrical power supply. This document presents some of the factors that should be considered when purchasing and operating a portable generator in the 3 to 12 kW size. It provides a detailed review of power quality and describes the use of tractor-driven power-take-off generators of 15 kW and larger. Several manufacturers make portable generators in many sizes with a whole range of features. This document includes a table depicting generator Feature/Benefit analysis to help consumers understand the differences between features and benefits. A second table provides a check list for generator feature/benefits. Specific details for the operations of various generators are available from manufacturers, distributors and electrical contractors. 2 tabs., 1 fig.

  19. Generative Processes: Thick Drawing

    Science.gov (United States)

    Wallick, Karl

    2012-01-01

    This article presents techniques and theories of generative drawing as a means for developing complex content in architecture design studios. Appending the word "generative" to drawing adds specificity to the most common representation tool and clarifies that such drawings are not singularly about communication or documentation but are…

  20. Generation Y Perspectives

    Science.gov (United States)

    Skytland, Nicholas; Painting, Kristen; Barrera, Aaron; Fitzpatrick, Garret

    2008-01-01

    This viewgraph presentation reviews the perception of NASA and the importance of engaging those people born between 1977 and 2000, also known as Generation Y. It examines some of the differences in attitudes and experiences, and how it reflects on how they view NASA. It also discusses use of the internet in connecting to the people from that generation.

  1. Neutron generator control system

    International Nuclear Information System (INIS)

    Peelman, H.E.; Bridges, J.R.

    1981-01-01

    A method is described of controlling the neutron output of a neutron generator tube used in neutron well logging. The system operates by monitoring the target beam current and comparing a function of this current with a reference voltage level to develop a control signal used in a series regulator to control the replenisher current of the neutron generator tube. (U.K.)

  2. Generation IV national program

    International Nuclear Information System (INIS)

    Preville, M.; Sadhankar, R.; Brady, D.

    2007-01-01

    This paper outlines the Generation IV National Program. This program involves evolutionary and innovative design with significantly higher efficiencies (∼50% compared to present ∼30%) - sustainable, economical, safe, reliable and proliferation resistant - for future energy security. The Generation IV Forum (GIF) effectively leverages the resources of the participants to meet these goals. Ten countries signed the GIF Charter in 2001

  3. Solar Fuel Generator

    Science.gov (United States)

    Lewis, Nathan S. (Inventor); West, William C. (Inventor)

    2017-01-01

    The disclosure provides conductive membranes for water splitting and solar fuel generation. The membranes comprise an embedded semiconductive/photoactive material and an oxygen or hydrogen evolution catalyst. Also provided are chassis and cassettes containing the membranes for use in fuel generation.

  4. The Next Great Generation?

    Science.gov (United States)

    Brownstein, Andrew

    2000-01-01

    Discusses ideas from a new book, "Millennials Rising: The Next Great Generation," (by Neil Howe and William Strauss) suggesting that youth culture is on the cusp of a radical shift with the generation beginning with this year's college freshmen who are typically team oriented, optimistic, and poised for greatness on a global scale. Includes a…

  5. OMG: Open molecule generator

    NARCIS (Netherlands)

    Peironcely, J.E.; Rojas-Chertó, M.; Fichera, D.; Reijmers, T.; Coulier, L.; Faulon, J.-L.; Hankemeier, T.

    2012-01-01

    Computer Assisted Structure Elucidation has been used for decades to discover the chemical structure of unknown compounds. In this work we introduce the first open source structure generator, Open Molecule Generator (OMG), which for a given elemental composition produces all non-isomorphic chemical

  6. Experience and Its Generation

    Science.gov (United States)

    Youqing, Chen

    2006-01-01

    Experience is an activity that arouses emotions and generates meanings based on vivid sensation and profound comprehension. It is emotional, meaningful, and personal, playing a key role in the course of forming and developing one's qualities. The psychological process of experience generation consists of such links as sensing things, arousing…

  7. On Angular Sampling Methods for 3-D Spatial Channel Models

    DEFF Research Database (Denmark)

    Fan, Wei; Jämsä, Tommi; Nielsen, Jesper Ødum

    2015-01-01

    This paper discusses generating three dimensional (3D) spatial channel models with emphasis on the angular sampling methods. Three angular sampling methods, i.e. modified uniform power sampling, modified uniform angular sampling, and random pairing methods, are proposed and investigated in detail. The random pairing method, which uses only twenty sinusoids in the ray-based model for generating the channels, presents good results if the spatial channel cluster has a small elevation angle spread. For spatial clusters with large elevation angle spreads, however, the random pairing method would fail and the other two methods should be considered.
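A rough sketch of the random pairing idea: draw azimuth and elevation offsets independently, pair them at random, and feed the pairs to a ray-based (sum-of-sinusoids) channel model. The Gaussian angle distributions, twenty-ray count, and all parameter values below are illustrative assumptions, not the paper's channel model:

```python
import cmath, math, random

def random_pairing_rays(n_rays, az_spread_deg, el_spread_deg, rng):
    """Draw azimuth and elevation offsets independently (here from normal
    distributions, an illustrative choice) and pair them at random --
    the 'random pairing' idea for building 3D ray angles."""
    azimuths = [rng.gauss(0.0, math.radians(az_spread_deg)) for _ in range(n_rays)]
    elevations = [rng.gauss(0.0, math.radians(el_spread_deg)) for _ in range(n_rays)]
    rng.shuffle(elevations)              # random pairing of the two lists
    return list(zip(azimuths, elevations))

def channel_sample(t, rays, f_doppler, phases):
    """Ray-based (sum-of-sinusoids) channel coefficient at time t."""
    n = len(rays)
    return sum(
        cmath.exp(1j * (2 * math.pi * f_doppler * t * math.cos(az) * math.cos(el) + ph))
        for (az, el), ph in zip(rays, phases)
    ) / math.sqrt(n)

rng = random.Random(0)
rays = random_pairing_rays(20, az_spread_deg=30.0, el_spread_deg=5.0, rng=rng)
phases = [rng.uniform(0, 2 * math.pi) for _ in rays]
h = channel_sample(0.01, rays, f_doppler=50.0, phases=phases)
```

With a small elevation spread the cos(el) factors stay near 1 and twenty sinusoids reproduce the target Doppler behavior well, which matches the abstract's finding that random pairing works best for clusters with small elevation angle spreads.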

  8. Research results: preserving newborn blood samples.

    Science.gov (United States)

    Lewis, Michelle Huckaby; Scheurer, Michael E; Green, Robert C; McGuire, Amy L

    2012-11-07

    Retention and use, without explicit parental permission, of residual dried blood samples from newborn screening has generated public controversy over concerns about violations of family privacy rights and loss of parental autonomy. The public debate about this issue has included little discussion about the destruction of a potentially valuable public resource that can be used for research that may yield improvements in public health. The research community must advocate for policies and infrastructure that promote retention of residual dried blood samples and their use in biomedical research.

  9. Experience and Its Generation

    Institute of Scientific and Technical Information of China (English)

    Chen Youqing

    2006-01-01

    Experience is an activity that arouses emotions and generates meanings based on vivid sensation and profound comprehension. It is emotional, meaningful, and personal, playing a key role in the course of forming and developing one's qualities. The psychological process of experience generation consists of such links as sensing things, arousing emotions, promoting comprehension and association, generating insights and meanings, and deepening emotional responses. Undergoing things personally by means of direct sensation, taking part in activities, and living life are the most important preconditions of experience generation. Emotional influence, situational edification, and arts edification are external factors that induce experience generation.

  10. Impacts on power generation

    International Nuclear Information System (INIS)

    Myers, J.; Sidebotton, P.

    1998-01-01

    The future impact of the arrival of natural gas in the Maritime provinces on electricity generation in the region was discussed. Currently, electrical generation sources in Nova Scotia include hydro generation (9 per cent), coal generation (80 per cent), heavy fuel oil generation (8 per cent), and light oil, wood chips and purchased power (3 per cent). It is expected that with the introduction of natural gas electric utilities will take advantage of new gas combustion turbines which have high efficiency rates. An overview of Westcoast Power's operations across Canada was also presented. The Company has three projects in the Maritimes - the Courtney Bay project in New Brunswick, the Bayside Power project, the Irving Paper project - in addition to the McMahon cogeneration plant in Taylor, B.C. figs

  11. Quantum random number generator

    Science.gov (United States)

    Soubusta, Jan; Haderka, Ondrej; Hendrych, Martin

    2001-03-01

    Since the reflection or transmission of a quantum particle at a beamsplitter is an inherently random quantum process, a device built on this principle suffers from the drawbacks of neither pseudo-random computer generators nor classical noise sources. Nevertheless, a number of physical conditions necessary for high-quality random number generation must be satisfied. Luckily, in a quantum optics realization they can be well controlled. We present a simple random number generator based on the division of weak light pulses at a beamsplitter. The randomness of the generated bit stream is supported by passing the data through a series of 15 statistical tests. The device generates at a rate of 109.7 kbit/s.
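The point that physical conditions govern bit quality can be illustrated with a classical toy model: a slightly unbalanced beamsplitter yields biased raw bits, and a standard post-processing step removes the bias. Von Neumann debiasing is chosen here purely for illustration; the abstract does not say which, if any, post-processing the device uses, and the PRNG below only stands in for the genuinely quantum process:

```python
import random

def beamsplitter_bits(n, p_transmit=0.52, rng=None):
    """Simulate raw detector bits from a slightly unbalanced beamsplitter
    (classically, with a PRNG -- the real device's randomness is quantum)."""
    rng = rng or random.Random()
    return [1 if rng.random() < p_transmit else 0 for _ in range(n)]

def von_neumann_debias(bits):
    """Classic post-processing: take non-overlapping pairs and keep the
    first bit of each unequal pair.  The output is unbiased provided the
    pairs are independent and identically distributed."""
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

raw = beamsplitter_bits(100_000, p_transmit=0.52, rng=random.Random(1))
clean = von_neumann_debias(raw)
```

The price of debiasing is throughput: on average only p(1-p) of the raw pairs produce an output bit, one reason hardware generators quote a net bit rate well below the raw detection rate.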

  12. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  13. EGG: Empirical Galaxy Generator

    Science.gov (United States)

    Schreiber, C.; Elbaz, D.; Pannella, M.; Merlin, E.; Castellano, M.; Fontana, A.; Bourne, N.; Boutsia, K.; Cullen, F.; Dunlop, J.; Ferguson, H. C.; Michałowski, M. J.; Okumura, K.; Santini, P.; Shu, X. W.; Wang, T.; White, C.

    2018-04-01

    The Empirical Galaxy Generator (EGG) generates fake galaxy catalogs and images with realistic positions, morphologies and fluxes from the far-ultraviolet to the far-infrared. The catalogs are generated by egg-gencat and stored in binary FITS tables (column oriented). Another program, egg-2skymaker, is used to convert the generated catalog into ASCII tables suitable for ingestion by SkyMaker (ascl:1010.066) to produce realistic high resolution images (e.g., Hubble-like), while egg-gennoise and egg-genmap can be used to generate the low resolution images (e.g., Herschel-like). These tools can be used to test source extraction codes, or to evaluate the reliability of any map-based science (stacking, dropout identification, etc.).

  14. TFTR Motor Generator

    International Nuclear Information System (INIS)

    Murray, J.G.; Bronner, G.; Horton, M.

    1977-01-01

    A general description is given of the 475 MVA pulsed motor generators for TFTR at Princeton Plasma Physics Laboratory. Two identical generators operating in parallel are capable of supplying 950 MVA for an equivalent square pulse of 6.77 seconds and 4,500 MJ at 0.7 power factor to provide the energy for the pulsed electrical coils and heating system for TFTR. The description includes the operational features of the 15,000 HP wound rotor motors driving each generator, with its starting equipment and cycloconverter for controlling speed, power factor, and line-voltage regulation during load pulsing, where the generator speed varies over an 87.5 to 60 Hz frequency range to provide the 4,500 MJ of energy. Special design characteristics such as fatigue stress calculations for 10^6 cycles of operation, the forcing factor on the exciter to provide regulation, and the low generator impedance are reviewed

  15. Steam generator life management

    International Nuclear Information System (INIS)

    Tapping, R.L.; Nickerson, J.; Spekkens, P.; Maruska, C.

    1998-01-01

    Steam generators are a critical component of a nuclear power reactor, and can contribute significantly to station unavailability, as has been amply demonstrated in Pressurized Water Reactors (PWRs). CANDU steam generators are not immune to steam generator degradation, and the variety of CANDU steam generator designs and tube materials has led to some unexpected challenges. However, aggressive remedial actions and careful proactive maintenance activities have led to a decrease in steam generator-related station unavailability at Canadian CANDUs. AECL and the CANDU utilities have defined programs that will enable existing or new steam generators to operate effectively for 40 years. Research and development work covers corrosion and mechanical degradation of tube bundles and internals, chemistry, thermal hydraulics, fouling, inspection and cleaning, as well as provision for special tool development for specific problem solving. A major driving force is the development of CANDU-specific fitness-for-service guidelines, including appropriate inspection and monitoring technology to measure steam generator condition. Longer-range work focuses on development of intelligent on-line monitoring for the feedwater system and steam generator. New designs have reduced risk of corrosion and fouling, are more easily inspected and cleaned, and are less susceptible to mechanical damage. The Canadian CANDU utilities have developed programs for remedial actions to combat degradation of performance (Gentilly-2, Point Lepreau, Bruce A/B, Pickering A/B), and have developed strategic plans to ensure good future operation. This report shows how recent advances in cleaning technology are integrated into a life management strategy, discusses downcomer flow measurement as a means of monitoring steam generator condition, and describes recent advances in hideout return as a life management tool. The research and development program, as well as operating experience, has identified

  16. A simulative comparison of respondent driven sampling with incentivized snowball sampling--the "strudel effect".

    Science.gov (United States)

    Gyarmathy, V Anna; Johnston, Lisa G; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A

    2014-02-01

    Respondent driven sampling (RDS) and incentivized snowball sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania ("original sample") to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1-12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called "strudel effect" is discussed in the paper. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
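The "strudel effect", recruitment simply recreating the dominant network component, can be reproduced even in a toy simulation. The network, coupon count, and breadth-first recruitment rule below are illustrative simplifications of RDS, not the authors' simulation procedure:

```python
import random
from collections import deque

def simulate_rds(adj, seeds, n_target, coupons=3, rng=None):
    """Very simplified RDS recruitment: each recruit passes up to `coupons`
    to unrecruited network neighbours, breadth-first, until `n_target`
    people are sampled or recruitment dies out."""
    rng = rng or random.Random()
    sampled, queue = set(seeds), deque(seeds)
    while queue and len(sampled) < n_target:
        person = queue.popleft()
        peers = [p for p in adj[person] if p not in sampled]
        rng.shuffle(peers)
        for peer in peers[:coupons]:
            if len(sampled) >= n_target:
                break
            sampled.add(peer)
            queue.append(peer)
    return sampled

# Toy network: one large component (nodes 0..9, a ring with chords) plus a
# small isolated component (nodes 10, 11).  Seeds placed in the large
# component keep recruitment inside it -- the 'strudel effect' in miniature.
adj = {i: [(i - 1) % 10, (i + 1) % 10, (i + 3) % 10] for i in range(10)}
adj.update({10: [11], 11: [10]})
sample = simulate_rds(adj, seeds=[0], n_target=8, rng=random.Random(2))
```

Because no recruitment chain can cross between components, the isolated pair is never sampled, mirroring the paper's observation that small components are under-sampled while estimates track the large component.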

  17. Container for gaseous samples for irradiation at accelerators

    International Nuclear Information System (INIS)

    Kupsch, H.; Riemenschneider, J.; Leonhardt, J.

    1985-01-01

    The invention concerns a container for gaseous samples for irradiation at accelerators, especially to generate short-lived radioisotopes. The container is also suitable for storage and transport of the target gas and can be reused multiple times

  18. Sampling and examination methods used for TMI-2 samples

    International Nuclear Information System (INIS)

    Marley, A.W.; Akers, D.W.; McIsaac, C.V.

    1988-01-01

    The purpose of this paper is to summarize the sampling and examination techniques that were used in the collection and analysis of TMI-2 samples. Samples ranging from auxiliary building air to core debris were collected and analyzed. Handling of the larger samples and many of the smaller samples had to be done remotely, and many standard laboratory analytical techniques were modified to accommodate the extremely high radiation fields associated with these samples. The TMI-2 samples presented unique problems with sampling and with the laboratory analysis of previously molten fuel debris. 14 refs., 8 figs

  19. RenderGAN: Generating Realistic Labeled Data

    Directory of Open Access Journals (Sweden)

    Leon Sixt

    2018-06-01

Full Text Available Deep Convolutional Neuronal Networks (DCNNs) are showing remarkable performance on many computer vision tasks. Due to their large parameter space, they require many labeled samples when trained in a supervised setting. The costs of annotating data manually can render the use of DCNNs infeasible. We present a novel framework called RenderGAN that can generate large amounts of realistic, labeled images by combining a 3D model and the Generative Adversarial Network framework. In our approach, image augmentations (e.g., lighting, background, and detail) are learned from unlabeled data such that the generated images are strikingly realistic while preserving the labels known from the 3D model. We apply the RenderGAN framework to generate images of barcode-like markers that are attached to honeybees. Training a DCNN on data generated by the RenderGAN yields considerably better performance than training it on various baselines.

  20. Estimation of population mean under systematic sampling

    Science.gov (United States)

    Noor-ul-amin, Muhammad; Javaid, Amjad

    2017-11-01

    In this study we propose a generalized ratio estimator under non-response for systematic random sampling. We also generate a class of estimators through special cases of generalized estimator using different combinations of coefficients of correlation, kurtosis and variation. The mean square errors and mathematical conditions are also derived to prove the efficiency of proposed estimators. Numerical illustration is included using three populations to support the results.
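The paper's generalized estimator is not reproduced in the abstract; as a sketch of the underlying idea, here is the classic ratio estimator applied to a 1-in-k systematic sample (all names and data are illustrative assumptions):

```python
def systematic_sample(population, k, start=0):
    """Take every k-th unit beginning at `start` (1-in-k systematic sample)."""
    return population[start::k]

def ratio_estimate(y_sample, x_sample, x_pop_mean):
    """Classic ratio estimator of the population mean of y:
    ybar_R = ybar * (Xbar / xbar), exploiting correlation between y and x."""
    ybar = sum(y_sample) / len(y_sample)
    xbar = sum(x_sample) / len(x_sample)
    return ybar * (x_pop_mean / xbar)

# y is exactly proportional to the auxiliary variable x (y = 2x), so the
# ratio estimator recovers the population mean of y (21.0) exactly.
x = list(range(1, 21))          # auxiliary variable, known for the population
y = [2 * xi for xi in x]        # study variable
xs = systematic_sample(x, k=5)  # units 1, 6, 11, 16
ys = systematic_sample(y, k=5)
est = ratio_estimate(ys, xs, x_pop_mean=sum(x) / len(x))
```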

  1. Optimal sampling strategy for data mining

    International Nuclear Information System (INIS)

    Ghaffar, A.; Shahbaz, M.; Mahmood, W.

    2013-01-01

Latest technologies like the Internet, corporate intranets, data warehouses, ERPs, satellites, digital sensors, embedded systems and mobile networks are all generating such a massive amount of data that it is getting very difficult to analyze and understand it all, even using data mining tools. Huge datasets are becoming a difficult challenge for classification algorithms. With increasing amounts of data, data mining algorithms are getting slower and analysis is getting less interactive. Sampling can be a solution. Using a fraction of computing resources, sampling can often provide the same level of accuracy. The process of sampling requires much care because there are many factors involved in the determination of correct sample size. The approach proposed in this paper tries to find a solution to this problem. Based on a statistical formula, after setting some parameters, it returns a sample size called the sufficient sample size, which is then selected through probability sampling. Results indicate the usefulness of this technique in coping with the problem of huge datasets. (author)
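The abstract does not give the statistical formula itself; a standard formula of this kind is Cochran's sample-size formula for estimating a proportion, sketched here with an optional finite-population correction (an assumption for illustration, not necessarily the paper's formula):

```python
import math

def cochran_sample_size(z, p, e, N=None):
    """Cochran's formula for estimating a proportion to within margin e
    at the confidence given by z; optional finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)
    if N is not None:
        n0 = n0 / (1 + (n0 - 1) / N)   # finite-population correction
    return math.ceil(n0)

# 95% confidence (z = 1.96), worst-case p = 0.5, 5% margin of error:
n_inf = cochran_sample_size(1.96, 0.5, 0.05)            # 385
n_fpc = cochran_sample_size(1.96, 0.5, 0.05, N=10_000)  # smaller with FPC
```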

  2. Downsampling Non-Uniformly Sampled Data

    Directory of Open Access Journals (Sweden)

    Fredrik Gustafsson

    2007-10-01

Full Text Available Decimating a uniformly sampled signal by a factor D involves low-pass anti-alias filtering with normalized cutoff frequency 1/D followed by picking out every Dth sample. Alternatively, decimation can be done in the frequency domain using the fast Fourier transform (FFT) algorithm, after zero-padding the signal and truncating the FFT. We outline three approaches to decimating non-uniformly sampled signals, which are all based on interpolation. The interpolation is done in different domains, and the inter-sample behavior does not need to be known. The first approach interpolates the signal to uniform sampling, after which standard decimation can be applied. The second interpolates a continuous-time convolution integral that implements the anti-alias filter, after which every Dth sample can be picked out. The third, frequency-domain approach computes an approximate Fourier transform, after which truncation and the IFFT give the desired result. Simulations indicate that the second approach is particularly useful. A thorough analysis is therefore performed for this case, using the assumption that the non-uniformly distributed sampling instants are generated by a stochastic process.
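The first approach (interpolate to a uniform grid, then decimate) can be sketched as follows. This minimal version uses linear interpolation and omits the anti-alias filter, so it is illustrative only; all names are invented here:

```python
def decimate_nonuniform(t, x, D, dt):
    """Decimate a non-uniformly sampled signal (t, x): linearly interpolate
    onto a uniform grid of spacing dt, then keep every D-th grid sample.
    (The anti-alias filtering discussed above is omitted in this sketch.)"""
    n = int(round((t[-1] - t[0]) / dt))
    td, xd = [], []
    j = 0
    for i in range(0, n + 1, D):               # keep every D-th grid point
        tau = t[0] + i * dt
        while j + 1 < len(t) - 1 and t[j + 1] < tau:
            j += 1                             # bracketing interval [t[j], t[j+1]]
        w = (tau - t[j]) / (t[j + 1] - t[j])   # linear interpolation weight
        td.append(tau)
        xd.append((1 - w) * x[j] + w * x[j + 1])
    return td, xd

t = [0.0, 0.3, 1.0, 1.4, 2.0]        # irregular sampling instants
x = [2.0 * ti for ti in t]           # a signal that is linear in t
td, xd = decimate_nonuniform(t, x, D=2, dt=0.25)
# Because the signal is linear, the interpolated samples are exact: xd = 2*td.
```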

  3. Sampling for radionuclides and other trace substances

    International Nuclear Information System (INIS)

    Eberhardt, L.L.

    1976-01-01

    Various problems with the environment and an energy crisis have resulted in considerable emphasis on the analysis and understanding of natural systems. The present generation of ecological models suffers greatly from a lack of attention to use of accurate and efficient sampling methods in obtaining the data on which these models are based. Improving ecological sampling requires first of all that the objectives be clearly defined, since different schemes are required for sampling for totals, for changes over time and space, to determine hazards, or for estimating parameters in models. The frequency distributions of most ecological contaminants are not normal, but seem instead to follow a skewed distribution. Coefficients of variation appear to be relatively constant and typical values may range from 0.1 to 1.0 depending on the substance and circumstances. These typical values may be very useful in designing a sampling plan, either for fixed relative variance, or in terms of the sensitivity of a comparison. Several classes of sampling methods are available for particular kinds of objectives. The notion of optimal sampling for parameter estimates is new to ecology, but may possibly be adapted from work done in industrial experimentation to provide a rationale for sampling in time
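The quoted coefficients of variation translate directly into sample sizes for a fixed relative variance. A minimal sketch, assuming simple random sampling so that the relative standard error of the mean is CV/sqrt(n):

```python
import math

def n_for_relative_error(cv, rel_se):
    """Sample size n such that the relative standard error of the mean,
    SE/mean = cv / sqrt(n), equals rel_se (simple random sampling)."""
    return math.ceil((cv / rel_se) ** 2)

# For the typical coefficients of variation quoted above (0.1 to 1.0),
# a 10% relative standard error needs between 1 and 100 samples:
n_low = n_for_relative_error(0.1, 0.10)   # cv = 0.1
n_high = n_for_relative_error(1.0, 0.10)  # cv = 1.0
```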

  4. ExSample. A library for sampling Sudakov-type distributions

    International Nuclear Information System (INIS)

    Plaetzer, Simon

    2011-08-01

    Sudakov-type distributions are at the heart of generating radiation in parton showers as well as contemporary NLO matching algorithms along the lines of the POWHEG algorithm. In this paper, the C++ library ExSample is introduced, which implements adaptive sampling of Sudakov-type distributions for splitting kernels which are in general only known numerically. Besides the evolution variable, the splitting kernels can depend on an arbitrary number of other degrees of freedom to be sampled, and any number of further parameters which are fixed on an event-by-event basis. (orig.)
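ExSample's adaptive machinery is not shown in the abstract, but the veto algorithm that underlies Sudakov-type sampling can be sketched for a toy kernel f(t) = c/t with an analytically invertible overestimate g(t) = C/t (the kernel, the names, and the parameter values are all assumptions for illustration):

```python
import random

def sample_sudakov(t_max, t_min, f, g, invert_g, rng):
    """Veto algorithm: sample the next emission scale t below t_max for a
    kernel f, using an overestimate g >= f whose Sudakov factor is
    analytically invertible (invert_g maps (t_hi, r) -> trial scale t)."""
    t = t_max
    while True:
        t = invert_g(t, rng.random())   # trial scale from the g-Sudakov
        if t < t_min:
            return None                 # no emission above the cutoff
        if rng.random() < f(t) / g(t):  # accept with probability f/g
            return t

c, C = 0.5, 1.0                         # true and overestimated kernel strength
f = lambda t: c / t
g = lambda t: C / t
invert_g = lambda t_hi, r: t_hi * r ** (1.0 / C)  # solves (t/t_hi)**C = r
rng = random.Random(1)
draws = [sample_sudakov(100.0, 1.0, f, g, invert_g, rng) for _ in range(2000)]
no_emission = sum(d is None for d in draws) / len(draws)
# Exact no-emission probability is (t_min/t_max)**c = 100**-0.5 = 0.1.
```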

  5. Automatic remote sampling and delivery system incorporating decontamination and disposal of sample bottles

    International Nuclear Information System (INIS)

    Savarkar, V.K.; Mishra, A.K.; Bajpai, D.D.; Nair, M.K.T.

    1990-01-01

The present generation of reprocessing plants has sampling and delivery systems that have to be operated manually, with the associated problems. The complete automation and remotisation of the sampling system has hence been considered to reduce manual intervention and personnel exposure. As part of this scheme, an attempt to automate and remotise various steps in the sampling system has been made. This paper discusses in detail the development work carried out in this area as well as the tests conducted to incorporate the same in the existing plants. (author). 3 figs

  6. ExSample. A library for sampling Sudakov-type distributions

    Energy Technology Data Exchange (ETDEWEB)

    Plaetzer, Simon

    2011-08-15

    Sudakov-type distributions are at the heart of generating radiation in parton showers as well as contemporary NLO matching algorithms along the lines of the POWHEG algorithm. In this paper, the C++ library ExSample is introduced, which implements adaptive sampling of Sudakov-type distributions for splitting kernels which are in general only known numerically. Besides the evolution variable, the splitting kernels can depend on an arbitrary number of other degrees of freedom to be sampled, and any number of further parameters which are fixed on an event-by-event basis. (orig.)

  7. Compact neutron generator

    Science.gov (United States)

    Leung, Ka-Ngo; Lou, Tak Pui

    2005-03-22

A compact neutron generator has at its outer circumference a toroidal plasma chamber in which a tritium (or other) plasma is generated. An RF antenna is wrapped around the plasma chamber. A plurality of tritium ion beamlets are extracted through spaced extraction apertures of a plasma electrode on the inner surface of the toroidal plasma chamber and directed inwardly toward the center of the neutron generator. The beamlets pass through spaced acceleration and focusing electrodes to a neutron generating target at the center of the generator. The target is typically made of titanium tubing, through which water is flowed for cooling. The beam can be pulsed rapidly to achieve ultrashort neutron bursts. The target may be moved rapidly up and down so that the average power deposited on the surface of the target is kept at a reasonable level. The neutron generator can produce fast neutrons from a T-T reaction, which can be used for luggage and cargo interrogation applications. A luggage or cargo inspection system has a pulsed T-T neutron generator or source at the center, surrounded by associated gamma detectors and other components for identifying explosives or other contraband.

  8. OMG: Open Molecule Generator.

    Science.gov (United States)

    Peironcely, Julio E; Rojas-Chertó, Miguel; Fichera, Davide; Reijmers, Theo; Coulier, Leon; Faulon, Jean-Loup; Hankemeier, Thomas

    2012-09-17

    Computer Assisted Structure Elucidation has been used for decades to discover the chemical structure of unknown compounds. In this work we introduce the first open source structure generator, Open Molecule Generator (OMG), which for a given elemental composition produces all non-isomorphic chemical structures that match that elemental composition. Furthermore, this structure generator can accept as additional input one or multiple non-overlapping prescribed substructures to drastically reduce the number of possible chemical structures. Being open source allows for customization and future extension of its functionality. OMG relies on a modified version of the Canonical Augmentation Path, which grows intermediate chemical structures by adding bonds and checks that at each step only unique molecules are produced. In order to benchmark the tool, we generated chemical structures for the elemental formulas and substructures of different metabolites and compared the results with a commercially available structure generator. The results obtained, i.e. the number of molecules generated, were identical for elemental compositions having only C, O and H. For elemental compositions containing C, O, H, N, P and S, OMG produces all the chemically valid molecules while the other generator produces more, yet chemically impossible, molecules. The chemical completeness of the OMG results comes at the expense of being slower than the commercial generator. In addition to being open source, OMG clearly showed the added value of constraining the solution space by using multiple prescribed substructures as input. We expect this structure generator to be useful in many fields, but to be especially of great importance for metabolomics, where identifying unknown metabolites is still a major bottleneck.
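The Canonical Augmentation Path itself avoids ever generating duplicates; the simpler isomorph-rejection idea it builds on can be illustrated by brute force on tiny abstract graphs (a sketch of the concept, not OMG's algorithm, and all names here are invented):

```python
from itertools import combinations, permutations

def canonical(edges, n):
    """Canonical form of a simple graph on n labelled vertices: the
    lexicographically smallest sorted edge list over all relabellings
    (brute force, so only practical for very small n)."""
    best = None
    for perm in permutations(range(n)):
        key = tuple(sorted(tuple(sorted((perm[u], perm[v]))) for u, v in edges))
        if best is None or key < best:
            best = key
    return best

def count_nonisomorphic_graphs(n):
    """Generate every edge subset and keep one representative per canonical
    form, counting the non-isomorphic simple graphs on n vertices."""
    all_edges = list(combinations(range(n), 2))
    seen = set()
    for k in range(len(all_edges) + 1):
        for subset in combinations(all_edges, k):
            seen.add(canonical(subset, n))
    return len(seen)

count4 = count_nonisomorphic_graphs(4)  # 11 non-isomorphic graphs on 4 vertices
```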

  9. OMG: Open Molecule Generator

    Directory of Open Access Journals (Sweden)

    Peironcely Julio E

    2012-09-01

Full Text Available Abstract Computer Assisted Structure Elucidation has been used for decades to discover the chemical structure of unknown compounds. In this work we introduce the first open source structure generator, Open Molecule Generator (OMG), which for a given elemental composition produces all non-isomorphic chemical structures that match that elemental composition. Furthermore, this structure generator can accept as additional input one or multiple non-overlapping prescribed substructures to drastically reduce the number of possible chemical structures. Being open source allows for customization and future extension of its functionality. OMG relies on a modified version of the Canonical Augmentation Path, which grows intermediate chemical structures by adding bonds and checks that at each step only unique molecules are produced. In order to benchmark the tool, we generated chemical structures for the elemental formulas and substructures of different metabolites and compared the results with a commercially available structure generator. The results obtained, i.e. the number of molecules generated, were identical for elemental compositions having only C, O and H. For elemental compositions containing C, O, H, N, P and S, OMG produces all the chemically valid molecules while the other generator produces more, yet chemically impossible, molecules. The chemical completeness of the OMG results comes at the expense of being slower than the commercial generator. In addition to being open source, OMG clearly showed the added value of constraining the solution space by using multiple prescribed substructures as input. We expect this structure generator to be useful in many fields, but to be especially of great importance for metabolomics, where identifying unknown metabolites is still a major bottleneck.

  10. Predicting sample size required for classification performance

    Directory of Open Access Journals (Sweden)

    Figueroa Rosa L

    2012-02-01

Full Text Available Abstract Background Supervised learning methods need annotated data in order to generate efficient models. Annotated data, however, is a relatively scarce resource and can be expensive to obtain. For both passive and active learning methods, there is a need to estimate the size of the annotated sample required to reach a performance target. Methods We designed and implemented a method that fits an inverse power law model to points of a given learning curve created using a small annotated training set. Fitting is carried out using nonlinear weighted least squares optimization. The fitted model is then used to predict the classifier's performance and confidence interval for larger sample sizes. For evaluation, the nonlinear weighted curve fitting method was applied to a set of learning curves generated using clinical text and waveform classification tasks with active and passive sampling methods, and predictions were validated using standard goodness of fit measures. As control we used an un-weighted fitting method. Results A total of 568 models were fitted and the model predictions were compared with the observed performances. Depending on the data set and sampling method, it took between 80 and 560 annotated samples to achieve mean average and root mean squared error below 0.01. Results also show that our weighted fitting method outperformed the baseline un-weighted method (p Conclusions This paper describes a simple and effective sample size prediction algorithm that conducts weighted fitting of learning curves. The algorithm outperformed an un-weighted algorithm described in previous literature. It can help researchers determine annotation sample size for supervised machine learning.
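Fitting an inverse power law to a learning curve can be sketched with a closed-form weighted fit in log space. This simplification is an assumption of the sketch (the paper's nonlinear weighted least squares procedure is more general), and the names and synthetic curve are invented:

```python
import math

def fit_inverse_power_law(sizes, errors, weights=None):
    """Fit error(n) ~ a * n**(-b) by (weighted) least squares in log space;
    the fitted curve can then be extrapolated to larger sample sizes."""
    w = weights or [1.0] * len(sizes)
    X = [math.log(n) for n in sizes]
    Y = [math.log(e) for e in errors]
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, X)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, Y)) / sw
    sxy = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, X, Y))
    sxx = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, X))
    b = -sxy / sxx                 # slope in log space is -b
    a = math.exp(ybar + b * xbar)  # intercept gives log(a)
    return a, b

# Noise-free synthetic learning curve: classification error = 0.8 * n**-0.5
sizes = [50, 100, 200, 400, 800]
errors = [0.8 * n ** -0.5 for n in sizes]
a, b = fit_inverse_power_law(sizes, errors)
pred = a * 3200 ** (-b)  # predicted error at a larger annotation budget
```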

  11. Superconducting current generators

    International Nuclear Information System (INIS)

    Genevey, P.

    1970-01-01

After a brief summary of the principle of energy storage and liberation with superconducting coils, two current generators are described that create currents in the range 600 to 1400 A, used for two storage experiments of 25 kJ and 50 kJ respectively. The two current generators are: a) a flux pump and b) a superconducting transformer. Both could be developed into more powerful units. The study shows the advantage of the transformer over the flux pump in order to create large currents. The efficiencies of the two generators are 95 per cent and 40 to 60 per cent respectively. (author) [fr

  12. Random number generation

    International Nuclear Information System (INIS)

    Coveyou, R.R.

    1974-01-01

    The subject of random number generation is currently controversial. Differing opinions on this subject seem to stem from implicit or explicit differences in philosophy; in particular, from differing ideas concerning the role of probability in the real world of physical processes, electronic computers, and Monte Carlo calculations. An attempt is made here to reconcile these views. The role of stochastic ideas in mathematical models is discussed. In illustration of these ideas, a mathematical model of the use of random number generators in Monte Carlo calculations is constructed. This model is used to set up criteria for the comparison and evaluation of random number generators. (U.S.)
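The criteria discussed above are framed philosophically, but one concrete evaluation of this kind is an empirical uniformity test. A sketch using a textbook linear congruential generator and a Pearson chi-square statistic (both are illustrative choices of mine, not constructions from the paper):

```python
def lcg(seed, a=1664525, c=1013904223, m=2 ** 32):
    """Minimal linear congruential generator (Numerical Recipes constants),
    yielding floats in [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

def chi_square_uniformity(samples, bins=10):
    """Pearson chi-square statistic against the uniform distribution:
    one simple empirical criterion for comparing generators."""
    counts = [0] * bins
    for u in samples:
        counts[min(int(u * bins), bins - 1)] += 1
    expected = len(samples) / bins
    return sum((cnt - expected) ** 2 / expected for cnt in counts)

gen = lcg(seed=12345)
stat = chi_square_uniformity([next(gen) for _ in range(10000)])
# For bins - 1 = 9 degrees of freedom, a well-behaved generator gives a
# statistic of roughly 9 on average; very large values indicate bias.
```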

  13. PULSE SYNTHESIZING GENERATOR

    Science.gov (United States)

    Kerns, Q.A.

    1963-08-01

An electronic circuit for synthesizing electrical current pulses having very fast rise times includes several sine-wave generators tuned to progressively higher harmonic frequencies, with signal amplitudes and phases selectable according to the Fourier series of the waveform that is to be synthesized. Phase control is provided by periodically triggering the generators at precisely controlled times. The outputs of the generators are combined in a coaxial transmission line. Any frequency-dependent delays that occur in the transmission line can be readily compensated for so that the desired signal wave shape is obtained at the output of the line. (AEC)
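The synthesis scheme (harmonic sine generators with Fourier-series amplitudes and phases) can be sketched numerically. Here a square pulse is built from odd harmonics with the textbook amplitudes 4/(k*pi); this is a generic Fourier-synthesis example, not the patent's circuit:

```python
import math

def synthesize_square(t, omega, n_harmonics):
    """Fourier synthesis of a square wave: sum the odd harmonics k with
    amplitudes 4/(k*pi), mimicking a bank of triggered sine generators."""
    return sum(4.0 / (math.pi * k) * math.sin(k * omega * t)
               for k in range(1, 2 * n_harmonics, 2))

# At a quarter period (omega*t = pi/2) the synthesized pulse approaches +1
# as more harmonics are included.
value = synthesize_square(math.pi / 2, omega=1.0, n_harmonics=200)
```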

  14. Coal-fired generation

    CERN Document Server

    Breeze, Paul

    2015-01-01

Coal-Fired Generation is a concise, up-to-date and readable guide providing an introduction to this traditional power generation technology. It includes detailed descriptions of coal-fired generation systems, explains how coal-fired technology functions in practice, and explores the economic and environmental risk factors. Engineers, managers, policymakers and those involved in planning and delivering energy resources will find this reference a valuable guide to help establish a reliable power supply and address social and economic objectives. Focuses on the evolution of the traditio

  15. Graph Generator Survey

    Energy Technology Data Exchange (ETDEWEB)

    Lothian, Joshua [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Sarah S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sullivan, Blair D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Baker, Matthew B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Schrock, Jonathan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Poole, Stephen W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2013-10-01

    The benchmarking effort within the Extreme Scale Systems Center at Oak Ridge National Laboratory seeks to provide High Performance Computing benchmarks and test suites of interest to the DoD sponsor. The work described in this report is a part of the effort focusing on graph generation. A previously developed benchmark, SystemBurn, allowed the emulation of different application behavior profiles within a single framework. To complement this effort, similar capabilities are desired for graph-centric problems. This report examines existing synthetic graph generator implementations in preparation for further study on the properties of their generated synthetic graphs.
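As a baseline example of the kind of generator such a survey covers, the Erdős–Rényi G(n, p) model can be sketched in a few lines (this particular model and the names here are illustrative; the report examines more elaborate synthetic generators):

```python
import random
from itertools import combinations

def erdos_renyi(n, p, rng):
    """G(n, p) synthetic graph generator: include each of the n*(n-1)/2
    possible edges independently with probability p."""
    return [e for e in combinations(range(n), 2) if rng.random() < p]

rng = random.Random(42)
edges = erdos_renyi(200, 0.05, rng)
# Expected edge count: 0.05 * 200*199/2 = 995.
```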

  16. Power generation technologies

    CERN Document Server

    Breeze, Paul

    2014-01-01

The new edition of Power Generation Technologies is a concise and readable guide that provides an introduction to the full spectrum of currently available power generation options, from traditional fossil fuels and the better established alternatives such as wind and solar power, to emerging renewables such as biomass and geothermal energy. Technology solutions such as combined heat and power and distributed generation are also explored. However, this book is more than just an account of the technologies: for each method the author explores the economic and environmental costs and risk factors.

  17. Harmonic arbitrary waveform generator

    Science.gov (United States)

    Roberts, Brock Franklin

    2017-11-28

High frequency arbitrary waveforms have applications in radar, communications, medical imaging, therapy, electronic warfare, and charged particle acceleration and control. State-of-the-art arbitrary waveform generators are limited in the frequencies at which they can operate by the speed of the digital-to-analog converters that directly create their arbitrary waveforms. The architecture of the Harmonic Arbitrary Waveform Generator allows the phase and amplitude of the high frequency content of waveforms to be controlled without taxing the digital-to-analog converters that control them. The Harmonic Arbitrary Waveform Generator converts a high frequency input into a precision, adjustable, high frequency arbitrary waveform.

  18. Nanosecond neutron generator

    International Nuclear Information System (INIS)

    Lobov, S.I.; Pavlovskaya, N.G.; Pukhov, S.P.

    1991-01-01

A high-voltage nanosecond neutron generator for obtaining neutrons from the D-T reaction is described. A yield of 6x10^6 neutrons/pulse was generated in a sealed gas-filled diode with a target on the cathode, at an accelerating pulse voltage of approximately 0.5 MV with a length at half-height of 0.5 ns and a deuterium pressure of 6x10^-2 Torr. Ways of increasing the neutron yield and possibilities of creating generators of nanosecond neutron pulses with long service life are considered

  19. PC Scene Generation

    Science.gov (United States)

    Buford, James A., Jr.; Cosby, David; Bunfield, Dennis H.; Mayhall, Anthony J.; Trimble, Darian E.

    2007-04-01

AMRDEC has successfully tested hardware and software for Real-Time Scene Generation for IR and SAL Sensors on COTS PC based hardware and video cards. AMRDEC personnel worked with nVidia and Concurrent Computer Corporation to develop a Scene Generation system capable of frame rates of at least 120 Hz while frame-locked to an external source (such as a missile seeker) with no dropped frames. Latency measurements and image validation were performed using COTS and in-house developed hardware and software. Software for the Scene Generation system was developed using OpenSceneGraph.

  20. Philosophy of power generation

    International Nuclear Information System (INIS)

    Amein, H.; Joyia, Y.; Qureshi, M.N.; Asif, M.

    1995-01-01

In view of the huge power demand in future, the capital investment requirements for the development of power projects to meet future energy requirements are so alarming that the public sector alone cannot manage to raise funds, and participation of the private sector in power generation development has become imperative. This paper discusses a power generation philosophy based on preference for the exploitation of indigenous resources and participation of the private sector. In order to have diversification in generation resources, due consideration has been given to the development of nuclear power and even to non-conventional but promising technologies such as solar, wind, biomass and geothermal. (author)