WorldWideScience

Sample records for sampling design results

  1. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Directory of Open Access Journals (Sweden)

    Lauren Hund

    Full Text Available Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.

  2. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Science.gov (United States)

    Hund, Lauren; Bedrick, Edward J; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.

  3. Sample Processor for Life on Icy Worlds (SPLIce): Design and Test Results

    Science.gov (United States)

    Chinn, Tori N.; Lee, Anthony K.; Boone, Travis D.; Tan, Ming X.; Chin, Matthew M.; McCutcheon, Griffin C.; Horne, Mera F.; Padgen, Michael R.; Blaich, Justin T.; Forgione, Joshua B.; hide

    2017-01-01

    We report the design, development, and testing of the Sample Processor for Life on Icy Worlds (SPLIce) system, a microfluidic sample processor to enable autonomous detection of signatures of life and measurements of habitability parameters in Ocean Worlds. This monolithic fluid processing-and-handling system (Figure 1; mass 0.5 kg) retrieves a 50-µL-volume sample and prepares it to supply a suite of detection instruments, each with unique preparation needs. SPLIce has potential applications in orbiter missions that sample ocean plumes, such as found at Saturn's icy moon Enceladus, or landed missions on the surface of icy satellites, such as Jupiter's moon Europa. Answering the question "Are we alone in the universe?" is captivating and exceptionally challenging. Even general criteria that define life very broadly include a significant role for water [1,2]. Searches for extinct or extant life therefore prioritize locations of abundant water, whether in ancient (Mars) or present (Europa and Enceladus) times. Only two previous planetary missions had onboard fluid processing: the Viking Biology Experiments [3] and Phoenix's Wet Chemistry Laboratory (WCL) [4]. SPLIce differs crucially from those systems, including in its capability to process and distribute µL-volume samples and its integrated autonomous control of a wide range of fluidic functions, including: 1) retrieval of fluid samples from an evacuated sample chamber; 2) onboard multi-year storage of dehydrated reagents; 3) integrated pressure, pH, and conductivity measurement; 4) filtration and retention of insoluble particles for microscopy; 5) dilution or vacuum-driven concentration of samples to accommodate instrument working ranges; 6) removal of gas bubbles from sample aliquots; 7) unidirectional flow (check valves); 8) active flow-path selection (solenoid-actuated valves); 9) metered pumping in 100 nL volume increments. The SPLIce manifold, made of three thermally fused layers of precision-machined cyclo ...

  4. Design and development of multiple sample counting setup

    International Nuclear Information System (INIS)

    Rath, D.P.; Murali, S.; Babu, D.A.R.

    2010-01-01

    Full text: The analysis of active samples on a regular basis, for ambient air activity and floor contamination from the radiochemical lab, accounts for a major chunk of the operational activity under the Health Physicist's responsibility. The requirement for daily air-sample analysis with immediate and delayed counting from various labs, in addition to smear-swipe check samples of the labs, led to the development of a system that could carry out multiple sample analyses in a time-programmed manner from a single sample loading. A multiple-sample alpha/beta counting system was designed and fabricated. It has arrangements for loading 10 samples in slots, in order, to be counted in a time-programmed manner, with results displayed and records maintained on a PC. The paper describes the design and development of the multiple sample counting setup presently in use at the facility, which has reduced the man-hours consumed in counting and recording the results.

  5. Planetary Sample Caching System Design Options

    Science.gov (United States)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

    Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: 1) transfer of a raw sample from the tool to the SHEC subsystem and 2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit change-out as the mechanism for transferring tubes to, and samples in tubes from, the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.

  6. Designing an enhanced groundwater sample collection system

    International Nuclear Information System (INIS)

    Schalla, R.

    1994-10-01

    As part of an ongoing technical support mission to achieve excellence and efficiency in environmental restoration activities at the Laboratory for Energy and Health-Related Research (LEHR), Pacific Northwest Laboratory (PNL) provided guidance on the design and construction of monitoring wells and identified the most suitable type of groundwater sampling pump and accessories for monitoring wells. The goal was to utilize a monitoring well design that would allow for hydrologic testing and reduce turbidity to minimize the impact of sampling. The sampling results of the newly designed monitoring wells were clearly superior to those of the previously installed monitoring wells. The new wells exhibited reduced turbidity, in addition to improved access for instrumentation and hydrologic testing. The variable frequency submersible pump was selected as the best choice for obtaining groundwater samples. The literature references are listed at the end of this report. Despite some initial difficulties, the actual performance of the variable frequency submersible pump and its accessories was effective in reducing sampling time and labor costs, and its ease of use was preferred over the previously used bladder pumps. The surface seal system, called the Dedicator, proved to be a useful accessory, preventing surface contamination while providing easy access for water-level measurements and for connecting the pump. Cost savings resulted from the use of the pre-production pumps (beta units) donated by the manufacturer for the demonstration. However, larger savings resulted from shortened field time due to the ease of using the submersible pumps and the surface seal access system. Proper deployment of the monitoring wells also resulted in cost savings and ensured representative samples.

  7. Estimation of AUC or Partial AUC under Test-Result-Dependent Sampling.

    Science.gov (United States)

    Wang, Xiaofei; Ma, Junling; George, Stephen; Zhou, Haibo

    2012-01-01

    The area under the ROC curve (AUC) and partial area under the ROC curve (pAUC) are summary measures used to assess the accuracy of a biomarker in discriminating true disease status. The standard sampling approach used in biomarker validation studies is often inefficient and costly, especially when ascertaining the true disease status is costly and invasive. To improve efficiency and reduce the cost of biomarker validation studies, we consider a test-result-dependent sampling (TDS) scheme, in which subject selection for determining the disease state is dependent on the result of a biomarker assay. We first estimate the test-result distribution using data arising from the TDS design. With the estimated empirical test-result distribution, we propose consistent nonparametric estimators for AUC and pAUC and establish the asymptotic properties of the proposed estimators. Simulation studies show that the proposed estimators have good finite sample properties and that the TDS design yields more efficient AUC and pAUC estimates than a simple random sampling (SRS) design. A data example based on an ongoing cancer clinical trial is provided to illustrate the TDS design and the proposed estimators. This work can find broad applications in design and analysis of biomarker validation studies.
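
    The empirical AUC that the TDS estimators generalize is, under simple random sampling, just the normalized Mann-Whitney statistic. A minimal sketch with hypothetical biomarker scores (not data from the trial):

```python
def empirical_auc(diseased, healthy):
    """Empirical AUC: the probability that a randomly chosen diseased
    subject's biomarker score exceeds a healthy subject's score, with
    ties counted as 1/2 (the normalized Mann-Whitney U statistic)."""
    wins = sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x in diseased for y in healthy)
    return wins / (len(diseased) * len(healthy))

# Hypothetical scores: higher values indicate disease.
print(empirical_auc([3.1, 2.4, 4.0], [1.2, 2.4, 0.7]))  # -> 0.944...
```

    The TDS estimators replace the equal-weight averaging above with weights derived from the estimated test-result distribution, since subjects are no longer sampled with equal probability.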

  8. [Saarland Growth Study: sampling design].

    Science.gov (United States)

    Danker-Hopfe, H; Zabransky, S

    2000-01-01

    The use of reference data to evaluate the physical development of children and adolescents is part of the daily routine in the paediatric outpatient clinic. Constructing such references requires the collection of extensive data. There are different kinds of reference data: cross-sectional references, which are based on data collected from a large representative cross-sectional sample of the population; longitudinal references, which are based on follow-up surveys of usually smaller samples of individuals from birth to maturity; and mixed longitudinal references, which are a combination of longitudinal and cross-sectional reference data. The advantages and disadvantages of the different methods of data collection and the resulting reference data are discussed. The Saarland Growth Study was conducted for several reasons: growth processes are subject to secular changes, there are no specific reference data for children and adolescents from this part of the country, and the growth charts in use in paediatric practice are possibly no longer appropriate. The Saarland Growth Study therefore served two purposes: a) to create up-to-date regional reference data and b) to create a database for future studies on secular trends in the growth processes of children and adolescents from Saarland. The present contribution focuses on general remarks on the sampling design of (cross-sectional) growth surveys and its implications for the design of the present study.

  9. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

    Full Text Available Non-probability sampling design can be used in ethnobotanical surveys of medicinal plants. However, this method does not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient, 95%; sample error (d), 0.05; and a proportion (p), 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods necessarily have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
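
    The reported minimum of 290 families follows from the standard formula for estimating a proportion under simple random sampling with a finite population correction (z = 1.96 for the 95% confidence coefficient); a sketch reproducing the abstract's numbers:

```python
import math

def srs_sample_size(N, p=0.5, d=0.05, z=1.96):
    """Sample size for estimating a proportion via simple random sampling,
    with finite population correction (Cochran's formula)."""
    n0 = z**2 * p * (1 - p) / d**2   # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / N)      # finite population correction
    return math.ceil(n)

print(srs_sample_size(N=1179))  # -> 290
```

    With N = 1179, p = 0.5 and d = 0.05, the uncorrected size n0 ≈ 384 shrinks to 290 after the correction.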

  10. Extending cluster lot quality assurance sampling designs for surveillance programs.

    Science.gov (United States)

    Hund, Lauren; Pagano, Marcello

    2014-07-20

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance on the basis of the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible nonparametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. Copyright © 2014 John Wiley & Sons, Ltd.
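
    A common first approximation to "appropriately inflate the sample size" for clustering is the design effect DEFF = 1 + (m − 1)·ICC; the paper's nonparametric procedure is more flexible (and handles finite numbers of clusters), but a sketch with hypothetical numbers shows the mechanics:

```python
import math

def cluster_lqas_size(n_srs, cluster_size, icc):
    """Inflate a simple-random-sampling LQAS sample size for a two-stage
    cluster design using the standard design effect
    DEFF = 1 + (m - 1) * ICC, a textbook simplification of the
    clustering adjustment described in the paper."""
    deff = 1 + (cluster_size - 1) * icc
    return math.ceil(n_srs * deff)

# Hypothetical design: 33-subject SRS rule, 5 subjects per cluster, ICC 0.1.
print(cluster_lqas_size(33, 5, 0.1))  # -> 47
```

    With an ICC of zero (or clusters of size one) the design effect is 1 and the SRS size is recovered unchanged.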

  11. Creel survey sampling designs for estimating effort in short-duration Chinook salmon fisheries

    Science.gov (United States)

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2013-01-01

    Chinook Salmon Oncorhynchus tshawytscha sport fisheries in the Columbia River basin are commonly monitored using roving creel survey designs and require precise, unbiased catch estimates. The objective of this study was to examine the relative bias and precision of total catch estimates using various sampling designs to estimate angling effort under the assumption that mean catch rate was known. We obtained information on angling populations based on direct visual observations of portions of Chinook Salmon fisheries in three Idaho river systems over a 23-d period. Based on the angling population, Monte Carlo simulations were used to evaluate the properties of effort and catch estimates for each sampling design. All sampling designs evaluated were relatively unbiased. Systematic random sampling (SYS) resulted in the most precise estimates. The SYS and simple random sampling designs had mean square error (MSE) estimates that were generally half of those observed with cluster sampling designs. The SYS design was more efficient (i.e., higher accuracy per unit cost) than a two-cluster design. Increasing the number of clusters available for sampling within a day decreased the MSE of estimates of daily angling effort, but the MSE of total catch estimates was variable depending on the fishery. The results of our simulations provide guidelines on the relative influence of sample sizes and sampling designs on parameters of interest in short-duration Chinook Salmon fisheries.
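
    The kind of Monte Carlo comparison described above can be sketched as follows, using a hypothetical smooth within-day effort curve (all values illustrative, not from the study); systematic sampling tends to outperform simple random sampling when effort varies smoothly over the day:

```python
import math
import random

# Hypothetical mean angler counts for 24 hourly periods in one day.
effort = [10 + 8 * math.sin(math.pi * h / 12) for h in range(24)]
true_total = sum(effort)

def estimate_total(idx):
    """Expand the mean of the sampled counts to a daily total."""
    return 24 / len(idx) * sum(effort[i] for i in idx)

def mse(design, n=6, reps=5000, seed=1):
    """Monte Carlo mean squared error of the daily-effort estimator."""
    rng = random.Random(seed)
    step = 24 // n
    sq_errs = []
    for _ in range(reps):
        if design == "srs":
            idx = rng.sample(range(24), n)
        else:  # systematic: random start, then every step-th period
            start = rng.randrange(step)
            idx = range(start, 24, step)
        sq_errs.append((estimate_total(idx) - true_total) ** 2)
    return sum(sq_errs) / reps

print(f"SRS MSE: {mse('srs'):.1f}, systematic MSE: {mse('sys'):.2g}")
```

    Because the systematic sample spreads evenly across the day, it captures the diurnal pattern in every draw, which is the qualitative result the simulations above report.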

  12. Sample design effects in landscape genetics

    Science.gov (United States)

    Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.

    2012-01-01

    An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), alleles per locus (5 and 10), individuals sampled (10-300), and generations after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that random, linear, and systematic sampling regimes performed well with high sample sizes (>200), levels of polymorphism (10 alleles per locus), and numbers of molecular markers (20). The cluster and single study site sampling regimes were not able to correctly identify the generating process under any conditions and thus are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales and suggests careful consideration for sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.

  13. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

    This report describes the experimental and sampling design developed to assess sampling approaches and methods for detecting contamination in a building and clearing the building for use after decontamination. An Idaho National Laboratory (INL) building will be contaminated with BG (Bacillus globigii, renamed Bacillus atrophaeus), a simulant for Bacillus anthracis (BA). The contamination, sampling, decontamination, and re-sampling will occur per the experimental and sampling design. This INL-2 Sample Collection Operational Test is being planned by the Validated Sampling Plan Working Group (VSPWG). The primary objectives are: 1) Evaluate judgmental and probabilistic sampling for characterization as well as probabilistic and combined (judgment and probabilistic) sampling approaches for clearance, 2) Conduct these evaluations for gradient contamination (from low or moderate down to absent or undetectable) for different initial concentrations of the contaminant, 3) Explore judgment composite sampling approaches to reduce sample numbers, 4) Collect baseline data to serve as an indication of the actual levels of contamination in the tests. A combined judgmental and random (CJR) approach uses Bayesian methodology to combine judgmental and probabilistic samples to make clearance statements of the form "X% confidence that at least Y% of an area does not contain detectable contamination" (X%/Y% clearance statements). The INL-2 experimental design has five test events, which 1) vary the floor of the INL building on which the contaminant will be released, 2) provide for varying the amount of contaminant released to obtain desired concentration gradients, and 3) investigate overt as well as covert release of contaminants. Desirable contaminant gradients would have moderate to low concentrations of contaminant in rooms near the release point, with concentrations down to zero in other rooms. Such gradients would provide a range of contamination levels to challenge the sampling ...
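
    The flavor of an "X%/Y% clearance statement" can be illustrated with a deliberately simplified Bayesian sketch: a uniform prior on the contaminated fraction, probabilistic samples only, and perfect detection (the actual CJR methodology also folds in judgmental samples and is considerably more elaborate):

```python
def clearance_confidence(n_clean, y=0.95):
    """Confidence X that at least fraction y of the area is free of
    detectable contamination after n_clean all-negative samples.
    With a uniform Beta(1, 1) prior on the contaminated fraction theta,
    the posterior is Beta(1, n_clean + 1), so
    P(theta <= 1 - y) = 1 - y**(n_clean + 1)."""
    return 1 - y ** (n_clean + 1)

# Smallest number of all-negative samples for a 95%/95% statement:
n = 0
while clearance_confidence(n) < 0.95:
    n += 1
print(n)  # -> 58
```

    Under these simplifying assumptions, 58 clean samples support the statement "95% confidence that at least 95% of the area does not contain detectable contamination"; incorporating judgmental samples, as CJR does, reduces the required number further.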

  14. Reliability of impingement sampling designs: An example from the Indian Point station

    International Nuclear Information System (INIS)

    Mattson, M.T.; Waxman, J.B.; Watson, D.A.

    1988-01-01

    A 4-year data base (1976-1979) of daily fish impingement counts at the Indian Point electric power station on the Hudson River was used to compare the precision and reliability of three random-sampling designs: (1) simple random, (2) seasonally stratified, and (3) empirically stratified. The precision of daily impingement estimates improved logarithmically for each design as more days in the year were sampled. Simple random sampling was the least, and empirically stratified sampling was the most precise design, and the difference in precision between the two stratified designs was small. Computer-simulated sampling was used to estimate the reliability of the two stratified-random-sampling designs. A seasonally stratified sampling design was selected as the most appropriate reduced-sampling program for Indian Point station because: (1) reasonably precise and reliable impingement estimates were obtained using this design for all species combined and for eight common Hudson River fish by sampling only 30% of the days in a year (110 d); and (2) seasonal strata may be more precise and reliable than empirical strata if future changes in annual impingement patterns occur. The seasonally stratified design applied to the 1976-1983 Indian Point impingement data showed that selection of sampling dates based on daily species-specific impingement variability gave results that were more precise, but not more consistently reliable, than sampling allocations based on the variability of all fish species combined. 14 refs., 1 fig., 6 tabs

  15. 30 CFR 71.208 - Bimonthly sampling; designated work positions.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Bimonthly sampling; designated work positions... UNDERGROUND COAL MINES Sampling Procedures § 71.208 Bimonthly sampling; designated work positions. (a) Each... standard when quartz is present), respirable dust sampling of designated work positions shall begin on the...

  16. The Apollo lunar samples collection analysis and results

    CERN Document Server

    Young, Anthony

    2017-01-01

    This book focuses on the specific mission planning for lunar sample collection, the equipment used, and the analysis and findings concerning the samples at the Lunar Receiving Laboratory in Texas. Anthony Young documents the collection of Apollo samples for the first time for readers of all backgrounds, and includes interviews with many of those involved in planning and analyzing the samples. NASA contracted with the U.S. Geological Survey to perform classroom and field training of the Apollo astronauts. NASA's Geology Group within the Manned Spacecraft Center in Houston, Texas, helped to establish the goals of sample collection, as well as the design of sample collection tools, bags, and storage containers. In this book, detailed descriptions are given of the design of the lunar sampling tools, the Modular Equipment Transporter used on Apollo 14, and the specific areas of the Lunar Rover vehicle used for the Apollo 15, 16, and 17 missions, which carried the sampling tools, bags, and other related equipment ...

  17. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys

    OpenAIRE

    Hund, Lauren; Bedrick, Edward J.; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we comp...

  18. Sampling designs matching species biology produce accurate and affordable abundance indices

    Directory of Open Access Journals (Sweden)

    Grant Harris

    2013-12-01

    Full Text Available Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km2 cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture sessions ...

  19. Sampling designs matching species biology produce accurate and affordable abundance indices.

    Science.gov (United States)

    Harris, Grant; Farley, Sean; Russell, Gareth J; Butler, Matthew J; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km2 cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture sessions.

  20. Sampling designs matching species biology produce accurate and affordable abundance indices

    Science.gov (United States)

    Farley, Sean; Russell, Gareth J.; Butler, Matthew J.; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km2 cells) and one targeted configuration provided the most accurate results. 
Both placed traps by expert opinion and moved traps between capture sessions, which
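
    The equal-catchability assumption discussed in this abstract can be illustrated with a minimal capture-recapture simulation (all numbers here are hypothetical and unrelated to the bear study). When some animals rarely visit the sampling area, a two-session mark-recapture estimate is biased low:

```python
import random

def chapman(n1, n2, m2):
    # Chapman's small-sample variant of the Lincoln-Petersen estimator
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

def mean_estimate(p_list, reps=400, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        s1 = [rng.random() < p for p in p_list]      # capture session 1
        s2 = [rng.random() < p for p in p_list]      # capture session 2
        n1, n2 = sum(s1), sum(s2)
        m2 = sum(a and b for a, b in zip(s1, s2))    # marked recaptures
        total += chapman(n1, n2, m2)
    return total / reps

N = 1000
equal = mean_estimate([0.175] * N)                    # equal catchability holds
unequal = mean_estimate([0.30] * 500 + [0.05] * 500)  # some animals rarely visit traps
```

    With equal capture probabilities the average estimate is close to the true N = 1000; with the same mean probability spread unequally, the estimate drops well below N, which is the bias the abstract warns about.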

  1. On efficiency of some ratio estimators in double sampling design ...

    African Journals Online (AJOL)

    In this paper, three ratio estimators in double sampling design are proposed, with the intention of finding an alternative double sampling design estimator to the conventional ratio estimator in double sampling design discussed by Cochran (1977), Okafor (2002), Raj (1972) and Raj and Chandhok (1999).
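
    The conventional ratio estimator in double sampling that this paper takes as its baseline can be sketched as follows (a toy illustration with made-up data, not the authors' proposed estimators): a large first-phase sample measures only the cheap auxiliary variable x, and a second-phase subsample measures both y and x.

```python
import random

def double_sampling_ratio(x_phase1, pairs):
    # ybar_R = (ybar / xbar) * xbar', with xbar' from the large first-phase sample
    xbar_prime = sum(x_phase1) / len(x_phase1)
    ybar = sum(y for y, _ in pairs) / len(pairs)
    xbar = sum(x for _, x in pairs) / len(pairs)
    return (ybar / xbar) * xbar_prime

rng = random.Random(42)
x_phase1 = [rng.gauss(50, 10) for _ in range(2000)]       # phase 1: auxiliary variable only
subsample = rng.sample(x_phase1, 200)                     # phase 2: measure y as well
pairs = [(2.0 * x + rng.gauss(0, 2), x) for x in subsample]
estimate = double_sampling_ratio(x_phase1, pairs)         # estimates the mean of y (about 100)
```

    The estimator borrows strength from the large cheap sample through xbar', which is why it outperforms the plain subsample mean when y and x are strongly correlated.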

  2. Sampling designs and methods for estimating fish-impingement losses at cooling-water intakes

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-01-01

    Several systems for estimating fish impingement at power plant cooling-water intakes are compared to determine the most statistically efficient sampling designs and methods. Compared to a simple random sampling scheme, the stratified systematic random sampling scheme, the systematic random sampling scheme, and the stratified random sampling scheme yield higher efficiencies and better estimators for the parameters in two models of fish impingement as a time-series process. Mathematical results and illustrative examples of the applications of the sampling schemes to simulated and real data are given. Some sampling designs applicable to fish-impingement studies are presented in appendixes.
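
    The efficiency gain of systematic over simple random sampling that the abstract reports can be demonstrated with a small simulation (hypothetical data: a population with a smooth time trend, as impingement counts often have):

```python
import random, statistics

rng = random.Random(0)
population = [i + rng.gauss(0, 20) for i in range(1000)]   # counts with a temporal trend
true_mean = statistics.mean(population)
n, k = 50, 20                                              # sample 50 of 1000; interval k = 20

def srs_mean():
    return statistics.mean(rng.sample(population, n))      # simple random sample

def systematic_mean():
    start = rng.randrange(k)                               # random start, then every k-th unit
    return statistics.mean(population[start::k])

srs_mse = statistics.mean([(srs_mean() - true_mean) ** 2 for _ in range(2000)])
sys_mse = statistics.mean([(systematic_mean() - true_mean) ** 2 for _ in range(2000)])
```

    Because the systematic sample spreads evenly across the trend, its mean squared error is far below that of simple random sampling, matching the qualitative finding above.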

  3. Design compliance matrix waste sample container filling system for nested, fixed-depth sampling system

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    This design compliance matrix document provides specific design-related functional characteristics, constraints, and requirements for the container filling system that is part of the nested, fixed-depth sampling system. This document addresses performance, external interfaces, ALARA, Authorization Basis, environmental and design code requirements for the container filling system. The container filling system will interface with the waste stream from the fluidic pumping channels of the nested, fixed-depth sampling system and will fill containers with waste that meets the Resource Conservation and Recovery Act (RCRA) criteria for waste that contains volatile and semi-volatile organic materials. The specifications for the nested, fixed-depth sampling system are described in a Level 2 Specification document (HNF-3483, Rev. 1). The basis for this design compliance matrix document is the Tank Waste Remediation System (TWRS) desk instructions for design compliance matrix documents (PI-CP-008-00, Rev. 0).

  4. Design of sampling tools for Monte Carlo particle transport code JMCT

    International Nuclear Information System (INIS)

    Shangguan Danhua; Li Gang; Zhang Baoyin; Deng Li

    2012-01-01

    A class of sampling tools for the general Monte Carlo particle transport code JMCT is designed. Two ways are provided to sample from distributions: special sampling methods for particular distributions, and general sampling methods for arbitrary discrete distributions and one-dimensional continuous distributions on a finite interval. Some open-source codes are included in the general sampling method for maximum user convenience. The results show that distributions common in particle transport can be sampled correctly with these tools, while assuring the user's convenience. (authors)
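
    A general sampler for an arbitrary discrete distribution, of the kind the abstract describes, is commonly implemented as an inverse-CDF table lookup. The sketch below is a generic illustration of that technique, not JMCT's actual code:

```python
import bisect, random

def make_discrete_sampler(weights, rng):
    # build the cumulative table once; each draw is an O(log n) inverse-CDF lookup
    cdf, acc = [], 0.0
    for w in weights:
        acc += w
        cdf.append(acc)
    total = acc
    return lambda: bisect.bisect_left(cdf, rng.random() * total)

rng = random.Random(7)
draw = make_discrete_sampler([1.0, 2.0, 7.0], rng)   # unnormalized weights are fine
counts = [0, 0, 0]
for _ in range(100_000):
    counts[draw()] += 1
frequencies = [c / 100_000 for c in counts]           # approaches [0.1, 0.2, 0.7]
```

    The same idea extends to one-dimensional continuous distributions on a finite interval by tabulating the CDF on a grid and interpolating between grid points.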

  5. Sample design for the residential energy consumption survey

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    The purpose of this report is to provide detailed information about the multistage area-probability sample design used for the Residential Energy Consumption Survey (RECS). It is intended as a technical report, for use by statisticians, to better understand the theory and procedures followed in the creation of the RECS sample frame. For a more cursory overview of the RECS sample design, refer to the appendix entitled "How the Survey was Conducted," which is included in the statistical reports produced for each RECS survey year.

  6. Stratified sampling design based on data mining.

    Science.gov (United States)

    Kim, Yeonkook J; Oh, Yoonhwan; Park, Sunghoon; Cho, Sungzoon; Park, Hayoung

    2013-09-01

    Our objective was to explore classification rules, based on data mining methodologies, for use in defining strata in stratified sampling of healthcare providers with improved sampling efficiency. We performed k-means clustering to group providers with similar characteristics and then constructed decision trees on the cluster labels to generate stratification rules. We assessed the variance explained by the stratification proposed in this study and by conventional stratification to evaluate the performance of the sampling design. We constructed a study database from health insurance claims data and providers' profile data made available to this study by the Health Insurance Review and Assessment Service of South Korea, and from population data from Statistics Korea. From our database, we used the data for single-specialty clinics or hospitals in two specialties, general surgery and ophthalmology, for the year 2011. Data mining resulted in five strata in general surgery with two stratification variables (the number of inpatients per specialist and the population density of the provider location) and five strata in ophthalmology with two stratification variables (the number of inpatients per specialist and the number of beds). The percentages of variance in annual changes in the productivity of specialists explained by the stratification in general surgery and ophthalmology were 22% and 8%, respectively, whereas conventional stratification by type of provider location and number of beds explained 2% and 0.2% of the variance, respectively. This study demonstrated that data mining methods can be used to design efficient stratified sampling with variables readily available to the insurer and government; it offers an alternative to the existing stratification method that is widely used in healthcare provider surveys in South Korea.
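
    The cluster-then-rule pipeline described above can be sketched in miniature. This toy version (hypothetical workload data, one feature, and a one-split "decision stump" standing in for the decision tree) shows how cluster labels turn into an interpretable stratification rule:

```python
import random, statistics

def kmeans_1d(values, k=2, iters=25, seed=0):
    # tiny one-dimensional k-means; a real design would cluster on several provider features
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            groups[min(range(k), key=lambda j: abs(v - centers[j]))].append(v)
        centers = [statistics.mean(g) if g else centers[j] for j, g in enumerate(groups)]
    return sorted(centers)

# hypothetical provider profiles: inpatients per specialist, two latent groups
rng = random.Random(1)
workload = [rng.gauss(10, 2) for _ in range(300)] + [rng.gauss(40, 5) for _ in range(300)]
low, high = kmeans_1d(workload)
threshold = (low + high) / 2          # the rule a decision tree would learn from cluster labels
strata = [1 if w > threshold else 0 for w in workload]
```

    The recovered threshold separates the two latent provider groups, so a field team can assign any new provider to a stratum from a single readily available variable.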

  7. System design description for sampling fuel in K basins

    International Nuclear Information System (INIS)

    Baker, R.B.

    1996-01-01

    This System Design Description provides: (1) statements of the Spent Nuclear Fuel Project's (SNFP) needs requiring sampling of fuel in the K East and K West Basins, (2) the sampling equipment functions and requirements, (3) a general work plan and the design logic being followed to develop the equipment, and (4) a summary description of the design for the sampling equipment. The report summarizes the integrated application of both the subject equipment and the canister sludge sampler in near-term characterization campaigns at K Basins.

  8. Sampling design for use by the soil decontamination project

    International Nuclear Information System (INIS)

    Rutherford, D.W.; Stevens, J.R.

    1981-01-01

    This report proposes a general approach to the problem and discusses sampling of soil to map the contaminated area and to provide samples for characterization of soil components and contamination. Basic concepts in sample design are reviewed with reference to environmental transuranic studies. Common designs are reviewed and evaluated for use with specific objectives that might be required by the soil decontamination project. Examples of a hierarchical design pilot study and a combined hierarchical and grid study are proposed for the Rocky Flats 903 pad area.

  9. Tank 241-AW-105, grab samples, analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final report for tank 241-AW-105 grab samples. Twenty grab samples were collected from risers 10A and 15A on August 20 and 21, 1996, of which eight were designated for the K Basin sludge compatibility and mixing studies. This document presents the analytical results for the remaining twelve samples. Analyses were performed in accordance with the Compatibility Grab Sampling and Analysis Plan (TSAP) and the Data Quality Objectives for the Tank Farms Waste Compatibility Program (DQO). The results for the previous sampling of this tank were reported in WHC-SD-WM-DP-149, Rev. 0, 60-Day Waste Compatibility Safety Issue and Final Results for Tank 241-AW-105, Grab Samples 5AW-95-1, 5AW-95-2 and 5AW-95-3. Three supernate samples exceeded the TOC notification limit (30,000 microg C/g dry weight), and appropriate notifications were made; no immediate notifications were required for any other analyte. The TSAP requested analyses for polychlorinated biphenyls (PCB) for all liquid and centrifuged solid subsamples. The PCB analysis of the liquid samples has been delayed and will be presented in a revision to this document.

  10. Bionic Design for Mars Sampling Scoop Inspired by Himalayan Marmot Claw

    Directory of Open Access Journals (Sweden)

    Long Xue

    2016-01-01

    Cave animals are often adapted to digging and life underground, with claw toes similar in structure and function to a sampling scoop. In this paper, the clawed toes of the Himalayan marmot were selected as a biological prototype for bionic research. Based on geometric parameter optimization of the clawed toes, a bionic sampling scoop for use on Mars was designed. Using a 3D laser scanner, the point cloud data of the second front claw toe was acquired. Parametric equations and contour curves for the claw were then built with cubic polynomial fitting. We obtained 18 characteristic curve equations for the internal and external contours of the claw. A bionic sampling scoop was designed according to the structural parameters of Curiosity’s sampling shovel and the contours of the Himalayan marmot’s claw. Verification test results showed that when the penetration angle was 45° and the sampling speed was 0.33 r/min, the bionic sampling scoop’s resistance torque was 49.6% less than that of the prototype sampling scoop. When the penetration angle was 60° and the sampling speed was 0.22 r/min, the resistance torque of the bionic sampling scoop was 28.8% lower than that of the prototype sampling scoop.

  11. Practical iterative learning control with frequency domain design and sampled data implementation

    CERN Document Server

    Wang, Danwei; Zhang, Bin

    2014-01-01

    This book is on iterative learning control (ILC), with a focus on design and implementation. We approach ILC design based on frequency-domain analysis and address ILC implementation based on sampled-data methods. This is the first book on ILC from the frequency-domain and sampled-data methodologies. The frequency-domain design methods offer ILC users insight into convergence performance, which is of practical benefit. This book presents a comprehensive framework with various methodologies to ensure that the learnable bandwidth in the ILC system is set with a balance between learning performance and learning stability. The sampled-data implementation ensures effective execution of ILC in practical dynamic systems. The presented sampled-data ILC methods also ensure the balance of performance and stability of the learning process. Furthermore, the presented theories and methodologies are tested with an ILC-controlled robotic system. The experimental results show that the machines can work in much h...

  12. Thermal probe design for Europa sample acquisition

    Science.gov (United States)

    Horne, Mera F.

    2018-01-01

    The planned lander missions to the surface of Europa will access samples from the subsurface of the ice in a search for signs of life. A small thermal drill (probe) is proposed to meet the sample requirement of the Science Definition Team's (SDT) report for the Europa mission. The probe is 2 cm in diameter and 16 cm in length and is designed to access the subsurface to a depth of 10 cm and to collect five ice samples of approximately 7 cm3 each. The energy required to penetrate the top 10 cm of ice in a vacuum is approximately 26 Wh, and the energy required to melt 7 cm3 of ice is approximately 1.2 Wh. The requirement stated in the SDT report of collecting samples from five different sites can be accommodated with repeated use of the same thermal drill. For smaller sample sizes, a smaller probe of 1.0 cm in diameter with the same 16 cm length could be utilized; it would require approximately 6.4 Wh to penetrate the top 10 cm of ice and 0.02 Wh to collect 0.1 g of sample. The thermal drill has the advantages of simplicity of design and operation and the ability to penetrate ice over a range of densities and hardness while maintaining sample integrity.
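
    The quoted ~1.2 Wh to melt a 7 cm3 sample can be checked with a back-of-envelope heat balance. The material constants and the ~100 K starting temperature below are assumptions on my part (typical handbook values for ice and Europa's surface), not figures from the abstract:

```python
# rough check of the ~1.2 Wh quoted for melting a 7 cm^3 ice sample
rho_ice = 0.917          # g/cm^3, density of ice
c_ice = 2.0              # J/(g*K), mean specific heat of cold ice (assumed value)
latent_fusion = 334.0    # J/g, latent heat of fusion
volume = 7.0             # cm^3
delta_t = 273.0 - 100.0  # K, warming from ~100 K surface temperature to the melting point

mass = rho_ice * volume
energy_j = mass * (c_ice * delta_t + latent_fusion)   # sensible heat + latent heat
energy_wh = energy_j / 3600.0
```

    The result lands near 1.2 Wh, consistent with the figure in the abstract.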

  13. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of 3 units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...
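
    Once individual inclusion probabilities are derived, as in the design above, design-unbiased estimation typically proceeds via the Horvitz-Thompson estimator. The following generic sketch (hypothetical tree data, simple Poisson sampling rather than the paper's triangulation protocol) verifies the unbiasedness numerically:

```python
import random

def horvitz_thompson_total(sample):
    # design-unbiased total: sum of y_i / pi_i over the sampled units
    return sum(y / pi for y, pi in sample)

rng = random.Random(3)
y = [rng.uniform(1.0, 5.0) for _ in range(400)]     # e.g. per-tree basal area
pi = [0.05 + 0.9 * v / 5.0 for v in y]              # unequal, size-related inclusion probs
true_total = sum(y)

estimates = []
for _ in range(3000):                               # include unit i with probability pi_i
    sample = [(yi, pii) for yi, pii in zip(y, pi) if rng.random() < pii]
    estimates.append(horvitz_thompson_total(sample))
mean_estimate = sum(estimates) / len(estimates)
```

    Averaged over repeated samples, the estimator recovers the true total despite the unequal inclusion probabilities, which is the property "design-unbiased" refers to.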

  14. ACS sampling system: design, implementation, and performance evaluation

    Science.gov (United States)

    Di Marcantonio, Paolo; Cirami, Roberto; Chiozzi, Gianluca

    2004-09-01

    By means of ACS (ALMA Common Software) framework we designed and implemented a sampling system which allows sampling of every Characteristic Component Property with a specific, user-defined, sustained frequency limited only by the hardware. Collected data are sent to various clients (one or more Java plotting widgets, a dedicated GUI or a COTS application) using the ACS/CORBA Notification Channel. The data transport is optimized: samples are cached locally and sent in packets with a lower and user-defined frequency to keep network load under control. Simultaneous sampling of the Properties of different Components is also possible. Together with the design and implementation issues we present the performance of the sampling system evaluated on two different platforms: on a VME based system using VxWorks RTOS (currently adopted by ALMA) and on a PC/104+ embedded platform using Red Hat 9 Linux operating system. The PC/104+ solution offers, as an alternative, a low cost PC compatible hardware environment with free and open operating system.

  15. Multi-saline sample distillation apparatus for hydrogen isotope analyses : design and accuracy

    Science.gov (United States)

    Hassan, Afifa Afifi

    1981-01-01

    A distillation apparatus for saline water samples was designed and tested. Six samples may be distilled simultaneously. The temperature was maintained at 400 °C to ensure complete dehydration of the precipitating salts. Consequently, the error in the measured ratio of stable hydrogen isotopes resulting from incomplete dehydration of hydrated salts during distillation was eliminated. (USGS)

  16. Mobile Variable Depth Sampling System Design Study

    International Nuclear Information System (INIS)

    BOGER, R.M.

    2000-01-01

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study

  17. Mobile Variable Depth Sampling System Design Study

    Energy Technology Data Exchange (ETDEWEB)

    BOGER, R.M.

    2000-08-25

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study.

  18. Conditional estimation of exponential random graph models from snowball sampling designs

    NARCIS (Netherlands)

    Pattison, Philippa E.; Robins, Garry L.; Snijders, Tom A. B.; Wang, Peng

    2013-01-01

    A complete survey of a network in a large population may be prohibitively difficult and costly. So it is important to estimate models for networks using data from various network sampling designs, such as link-tracing designs. We focus here on snowball sampling designs, designs in which the members

  19. An Alternative View of Some FIA Sample Design and Analysis Issues

    Science.gov (United States)

    Paul C. Van Deusen

    2005-01-01

    Sample design and analysis decisions are the result of compromises and inputs from many sources. The end result would likely change if different individuals or groups were involved in the planning process. Discussed here are some alternatives to the procedures that are currently being used for the annual inventory. The purpose is to indicate that alternatives exist and...

  20. ANL small-sample calorimeter system design and operation

    International Nuclear Information System (INIS)

    Roche, C.T.; Perry, R.B.; Lewis, R.N.; Jung, E.A.; Haumann, J.R.

    1978-07-01

    The Small-Sample Calorimetric System is a portable instrument designed to measure the thermal power produced by radioactive decay of plutonium-containing fuels. The small-sample calorimeter is capable of measuring samples producing power up to 32 milliwatts at a rate of one sample every 20 min. The instrument is contained in two packages: a data-acquisition module consisting of a microprocessor with an 8K-byte nonvolatile memory, and a measurement module consisting of the calorimeter and a sample preheater. The total weight of the system is 18 kg

  1. Study designs may influence results

    DEFF Research Database (Denmark)

    Johansen, Christoffer; Schüz, Joachim; Andreasen, Anne-Marie Serena

    2017-01-01

    appeared to show an inverse association, whereas nested case-control and cohort studies showed no association. For allergies, the inverse association was observed irrespective of study design. We recommend that the questionnaire-based case-control design be placed lower in the hierarchy of studies...... for establishing cause-and-effect for diseases such as glioma. We suggest that a state-of-the-art case-control study should, as a minimum, be accompanied by extensive validation of the exposure assessment methods and the representativeness of the study sample with regard to the exposures of interest. Otherwise...

  2. Adaptive designs for the one-sample log-rank test.

    Science.gov (United States)

    Schmidt, Rene; Faldum, Andreas; Kwiecien, Robert

    2017-09-22

    Traditional designs in phase IIa cancer trials are single-arm designs with a binary outcome, for example, tumor response. In some settings, however, a time-to-event endpoint might appear more appropriate, particularly in the presence of loss to follow-up. Then the one-sample log-rank test might be the method of choice. It allows the survival curve of the patients under treatment to be compared with a prespecified reference survival curve, which usually represents the expected survival under standard of care. In this work, convergence of the one-sample log-rank statistic to Brownian motion is proven using Rebolledo's martingale central limit theorem while accounting for staggered entry times of the patients. On this basis, a confirmatory adaptive one-sample log-rank test is proposed in which provision is made for data-dependent sample size reassessment. The focus is on applying the inverse normal method, in two different directions. The first strategy exploits the independent-increments property of the one-sample log-rank statistic. The second strategy is based on the patient-wise separation principle. It is shown by simulation that the proposed adaptive test might help to rescue an underpowered trial and at the same time lower the average sample number (ASN) under the null hypothesis, as compared with a single-stage fixed-sample design. © 2017, The International Biometric Society.
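
    The fixed-sample one-sample log-rank statistic underlying this design can be sketched as follows. This is a textbook O-minus-E form checked under the null by simulation (hypothetical reference curve and trial sizes; no censoring, and none of the paper's adaptive machinery):

```python
import math, random, statistics

def one_sample_logrank(times, events, cum_hazard):
    # O = observed deaths; E = sum of reference cumulative hazards at the follow-up times
    observed = sum(events)
    expected = sum(cum_hazard(t) for t in times)
    return (observed - expected) / math.sqrt(expected)   # approximately N(0,1) under H0

lam = math.log(2) / 12.0          # reference curve: exponential survival, median 12 months
rng = random.Random(5)
z_values = []
for _ in range(2000):             # trials simulated under H0: treatment equals reference
    times = [rng.expovariate(lam) for _ in range(100)]
    z_values.append(one_sample_logrank(times, [1] * 100, lambda t: lam * t))
```

    Under the null hypothesis the statistic behaves approximately as standard normal, which is the large-sample property the paper's Brownian-motion result refines for staggered entry and interim looks.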

  3. The Study on Mental Health at Work: Design and sampling

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-01-01

    Aims: The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. Methods: S-MGA is a representative study of German employees aged 31–60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing population, gross sample and respondents. Results: In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. Conclusions: There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment. PMID:28673202

  4. Baseline Design Compliance Matrix for the Rotary Mode Core Sampling System

    International Nuclear Information System (INIS)

    LECHELT, J.A.

    2000-01-01

    The purpose of the design compliance matrix (DCM) is to provide a single-source document of all design requirements associated with the fifteen subsystems that make up the rotary mode core sampling (RMCS) system. It is intended to be the baseline requirement document for the RMCS system and to be used in governing all future design and design verification activities associated with it. This document is the DCM for the RMCS system used on Hanford single-shell radioactive waste storage tanks. This includes the Exhauster System, Rotary Mode Core Sample Trucks, Universal Sampling System, Diesel Generator System, Distribution Trailer, X-Ray Cart System, Breathing Air Compressor, Nitrogen Supply Trailer, Casks and Cask Truck, Service Trailer, Core Sampling Riser Equipment, Core Sampling Support Trucks, Foot Clamp, Ramps and Platforms and Purged Camera System. Excluded items are tools such as light plants and light stands. Other items such as the breather inlet filter are covered by a different design baseline. In this case, the inlet breather filter is covered by the Tank Farms Design Compliance Matrix

  5. Outcome-Dependent Sampling Design and Inference for Cox's Proportional Hazards Model.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P; Zhou, Haibo

    2016-11-01

    We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small-sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with existing real data from the Cancer Incidence and Mortality of Uranium Miners Study.

  6. Precision, time, and cost: a comparison of three sampling designs in an emergency setting

    Science.gov (United States)

    Deitchler, Megan; Deconinck, Hedwig; Bergeron, Gilles

    2008-01-01

    The conventional method to collect data on the health, nutrition, and food security status of a population affected by an emergency is a 30 × 30 cluster survey. This sampling method can be time and resource intensive and, accordingly, may not be the most appropriate one when data are needed rapidly for decision making. In this study, we compare the precision, time and cost of the 30 × 30 cluster survey with two alternative sampling designs: a 33 × 6 cluster design (33 clusters, 6 observations per cluster) and a 67 × 3 cluster design (67 clusters, 3 observations per cluster). Data for each sampling design were collected concurrently in West Darfur, Sudan in September-October 2005 in an emergency setting. Results of the study show the 30 × 30 design to provide more precise results (i.e. narrower 95% confidence intervals) than the 33 × 6 and 67 × 3 design for most child-level indicators. Exceptions are indicators of immunization and vitamin A capsule supplementation coverage which show a high intra-cluster correlation. Although the 33 × 6 and 67 × 3 designs provide wider confidence intervals than the 30 × 30 design for child anthropometric indicators, the 33 × 6 and 67 × 3 designs provide the opportunity to conduct a LQAS hypothesis test to detect whether or not a critical threshold of global acute malnutrition prevalence has been exceeded, whereas the 30 × 30 design does not. For the household-level indicators tested in this study, the 67 × 3 design provides the most precise results. However, our results show that neither the 33 × 6 nor the 67 × 3 design are appropriate for assessing indicators of mortality. In this field application, data collection for the 33 × 6 and 67 × 3 designs required substantially less time and cost than that required for the 30 × 30 design. The findings of this study suggest the 33 × 6 and 67 × 3 designs can provide useful time- and resource-saving alternatives to the 30 × 30 method of data collection in emergency
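
    The precision ordering reported above follows from the cluster design effect. The sketch below compares the three designs' binomial confidence-interval half-widths using the standard deff = 1 + (m - 1) * rho formula; the prevalence and intra-cluster correlation values are assumptions for illustration, not figures from the Darfur survey:

```python
import math

def ci_halfwidth(p, clusters, per_cluster, icc):
    # 95% CI half-width for a proportion, inflated by the design effect 1 + (m - 1) * rho
    n = clusters * per_cluster
    deff = 1 + (per_cluster - 1) * icc
    return 1.96 * math.sqrt(deff * p * (1 - p) / n)

p, icc = 0.20, 0.10      # assumed prevalence and intra-cluster correlation
designs = {"30x30": (30, 30), "33x6": (33, 6), "67x3": (67, 3)}
halfwidths = {name: ci_halfwidth(p, c, m, icc) for name, (c, m) in designs.items()}
```

    With these assumptions the 30 × 30 design gives the narrowest interval, and 67 × 3 beats 33 × 6 because spreading the same number of observations over more clusters shrinks the design effect, mirroring the ordering the study observed for most indicators.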

  7. Precision, time, and cost: a comparison of three sampling designs in an emergency setting

    Directory of Open Access Journals (Sweden)

    Deconinck Hedwig

    2008-05-01

    The conventional method to collect data on the health, nutrition, and food security status of a population affected by an emergency is a 30 × 30 cluster survey. This sampling method can be time and resource intensive and, accordingly, may not be the most appropriate one when data are needed rapidly for decision making. In this study, we compare the precision, time and cost of the 30 × 30 cluster survey with two alternative sampling designs: a 33 × 6 cluster design (33 clusters, 6 observations per cluster) and a 67 × 3 cluster design (67 clusters, 3 observations per cluster). Data for each sampling design were collected concurrently in West Darfur, Sudan in September-October 2005 in an emergency setting. Results of the study show the 30 × 30 design to provide more precise results (i.e. narrower 95% confidence intervals) than the 33 × 6 and 67 × 3 designs for most child-level indicators. Exceptions are indicators of immunization and vitamin A capsule supplementation coverage, which show a high intra-cluster correlation. Although the 33 × 6 and 67 × 3 designs provide wider confidence intervals than the 30 × 30 design for child anthropometric indicators, the 33 × 6 and 67 × 3 designs provide the opportunity to conduct a LQAS hypothesis test to detect whether or not a critical threshold of global acute malnutrition prevalence has been exceeded, whereas the 30 × 30 design does not. For the household-level indicators tested in this study, the 67 × 3 design provides the most precise results. However, our results show that neither the 33 × 6 nor the 67 × 3 design are appropriate for assessing indicators of mortality. In this field application, data collection for the 33 × 6 and 67 × 3 designs required substantially less time and cost than that required for the 30 × 30 design. The findings of this study suggest the 33 × 6 and 67 × 3 designs can provide useful time- and resource-saving alternatives to the 30 × 30 method of data

  8. A Frequency Domain Design Method For Sampled-Data Compensators

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Jannerup, Ole Erik

    1990-01-01

    A new approach to the design of a sampled-data compensator in the frequency domain is investigated. The starting point is a continuous-time compensator for the continuous-time system which satisfies specific design criteria. The new design method will graphically show how the discrete

  9. Sampling design for long-term regional trends in marine rocky intertidal communities

    Science.gov (United States)

    Irvine, Gail V.; Shelley, Alice

    2013-01-01

    Probability-based designs reduce bias and allow inference of results to the pool of sites from which they were chosen. We developed and tested probability-based designs for monitoring marine rocky intertidal assemblages at Glacier Bay National Park and Preserve (GLBA), Alaska. A multilevel design was used that varied in scale and inference. The levels included aerial surveys, extensive sampling of 25 sites, and more intensive sampling of 6 sites. Aerial surveys of a subset of intertidal habitat indicated that the original target habitat of bedrock-dominated sites with slope ≤30° was rare. This unexpected finding illustrated one value of probability-based surveys and led to a shift in the target habitat type to include steeper, more mixed rocky habitat. Subsequently, we evaluated the statistical power of different sampling methods and sampling strategies to detect changes in the abundances of the predominant sessile intertidal taxa: barnacles Balanomorpha, the mussel Mytilus trossulus, and the rockweed Fucus distichus subsp. evanescens. There was greatest power to detect trends in Mytilus and lesser power for barnacles and Fucus. Because of its greater power, the extensive, coarse-grained sampling scheme was adopted in subsequent years over the intensive, fine-grained scheme. The sampling attributes that had the largest effects on power included sampling of “vertical” line transects (vs. horizontal line transects or quadrats) and increasing the number of sites. We also evaluated the power of several management-set parameters. Given equal sampling effort, sampling more sites fewer times had greater power. The information gained through intertidal monitoring is likely to be useful in assessing changes due to climate, including ocean acidification; invasive species; trampling effects; and oil spills.
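
    The kind of power comparison described above, where sampling more sites yields greater power for a fixed effort, can be illustrated with a simple two-sided z-test power calculation. All numbers below (effect size, between-site standard deviation, site counts) are hypothetical, not the GLBA study's values:

```python
import math

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power_two_sample_z(delta, sd, n_sites, z_alpha=1.96):
    # power of a two-sided z-test for a change in mean cover between two survey years
    se = sd * math.sqrt(2.0 / n_sites)
    return normal_cdf(delta / se - z_alpha) + normal_cdf(-delta / se - z_alpha)

# hypothetical numbers: detect a 15-point change in percent cover, between-site SD of 25
power_few_sites = power_two_sample_z(15.0, 25.0, 6)     # intensive design, 6 sites
power_many_sites = power_two_sample_z(15.0, 25.0, 25)   # extensive design, 25 sites
```

    Under these assumptions the 25-site design roughly triples the power of the 6-site design, which is the qualitative reason the extensive, coarse-grained scheme was adopted.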

  10. Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization

    OpenAIRE

    Trevelin, Leonardo Carreira; Novaes, Roberto Leonan Morim; Colas-Rosas, Paul François; Benathar, Thayse Cristhina Melo; Peres, Carlos A.

    2017-01-01

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include logistical implementation, standardization and sampling representativeness. Nonetheless, study designs still have to deal with issues of detectability related to how different species behave and use the environment. Yet there is considerable sampling heterogeneity across available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the co...

  11. Sample design considerations of indoor air exposure surveys

    International Nuclear Information System (INIS)

    Cox, B.G.; Mage, D.T.; Immerman, F.W.

    1988-01-01

Concern about the potential for indoor air pollution has prompted recent surveys of radon and NO2 concentrations in homes and personal exposure studies of volatile organics, carbon monoxide and pesticides, to name a few. The statistical problems in designing sample surveys that measure the physical environment are diverse and more complicated than those encountered in traditional surveys of human attitudes and attributes. This paper addresses issues encountered when designing indoor air quality (IAQ) studies. General statistical concepts related to target population definition, frame creation, and sample selection for area household surveys and telephone surveys are presented. The implications of different measurement approaches are discussed, and response rate considerations are described.

  12. Estimating HIES Data through Ratio and Regression Methods for Different Sampling Designs

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2007-01-01

Full Text Available In this study, a comparison has been made of different sampling designs, using the HIES data of North West Frontier Province (NWFP) for 2001-02 and 1998-99 collected from the Federal Bureau of Statistics, Statistical Division, Government of Pakistan, Islamabad. The performance of the estimators has also been assessed using the bootstrap and jackknife. A two-stage stratified random sample design is adopted by HIES. In the first stage, enumeration blocks and villages are treated as the first-stage Primary Sampling Units (PSUs). The sample PSUs are selected with probability proportional to size. Secondary Sampling Units (SSUs), i.e., households, are selected by systematic sampling with a random start. HIES used a single study variable. We have compared the HIES technique with several other designs: stratified simple random sampling, stratified systematic sampling, stratified ranked set sampling, and stratified two-phase sampling. Ratio and regression methods were applied with two study variables: income (y) and household size (x). Jackknife and bootstrap were used for replication-based variance estimation. Simple random sampling with sample sizes of 462 to 561 gave moderate variances by both jackknife and bootstrap. Applying systematic sampling, we obtained moderate variance with a sample size of 467. In the jackknife with systematic sampling, the variance of the regression estimator exceeded that of the ratio estimator for sample sizes of 467 to 631; at a sample size of 952 the variance of the ratio estimator becomes greater than that of the regression estimator. The most efficient design turns out to be ranked set sampling: with jackknife and bootstrap it gives the minimum variance even at the smallest sample size (467). Two-phase sampling gave poor performance. The multi-stage sampling applied by HIES gave large variances, especially when used with a single study variable.
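The ratio and regression estimators compared in this record, together with the delete-one jackknife used for variance estimation, can be sketched as follows. The income/household-size data below are synthetic stand-ins, not HIES values, and the known population mean `X_bar` is invented for illustration:

```python
import random
from statistics import mean

# Hypothetical income (y) and household size (x) sample; X_bar is the known
# population mean of x (all values illustrative, not HIES data).
random.seed(1)
x = [random.randint(2, 9) for _ in range(40)]
y = [1500 * xi + random.gauss(0, 800) for xi in x]
X_bar = 5.5

def ratio_estimate(y, x, X_bar):
    # Ratio estimator of the population mean of y: (y_bar / x_bar) * X_bar.
    return mean(y) / mean(x) * X_bar

def regression_estimate(y, x, X_bar):
    # Regression estimator: y_bar + b * (X_bar - x_bar), b the OLS slope.
    mx, my = mean(x), mean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my + b * (X_bar - mx)

def jackknife_variance(estimator, y, x, X_bar):
    # Delete-one jackknife variance of an estimator of the mean of y.
    n = len(y)
    reps = [estimator(y[:i] + y[i+1:], x[:i] + x[i+1:], X_bar)
            for i in range(n)]
    rbar = mean(reps)
    return (n - 1) / n * sum((r - rbar) ** 2 for r in reps)

for est in (ratio_estimate, regression_estimate):
    print(est.__name__, round(jackknife_variance(est, y, x, X_bar), 1))
```

With data this strongly correlated, both auxiliary-variable estimators typically show much smaller jackknife variance than the plain sample mean would.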

  13. Lagoa Real design. Description and evaluation of sampling system

    International Nuclear Information System (INIS)

    Hashizume, B.K.

    1982-10-01

This report describes the sample preparation system for drill cores from the Lagoa Real Project, aimed at obtaining representative fractions of the drill-core halves. The combined sampling-plus-analysis error and the analytical accuracy were determined by delayed neutron analysis. (author)

  14. Statistical Analysis Of Tank 19F Floor Sample Results

    International Nuclear Information System (INIS)

    Harris, S.

    2010-01-01

Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by a UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
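Under a normality assumption, a UCL95% computed from the number of samples, the average, and the standard deviation is the familiar one-sided Student-t bound. A minimal sketch; the concentration values and the t quantile below are illustrative, not Tank 19F data:

```python
import math
from statistics import mean, stdev

def ucl95(results, t_095):
    # One-sided upper 95% confidence limit on the mean concentration:
    # x_bar + t_{0.95, n-1} * s / sqrt(n), assuming approximate normality.
    n = len(results)
    return mean(results) + t_095 * stdev(results) / math.sqrt(n)

# Six hypothetical scrape-sample results (units arbitrary);
# t_{0.95, df=5} is approximately 2.015.
concentrations = [12.1, 10.8, 11.5, 13.0, 12.4, 11.2]
print(round(ucl95(concentrations, 2.015), 2))
```

The bound shrinks with sqrt(n), which is why pooling all six scrape samples, once the statistical tests justified it, tightens the UCL95% on each analyte.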

  15. Design review report for rotary mode core sample truck (RMCST) modifications for flammable gas tanks, preliminary design

    International Nuclear Information System (INIS)

    Corbett, J.E.

    1996-02-01

    This report documents the completion of a preliminary design review for the Rotary Mode Core Sample Truck (RMCST) modifications for flammable gas tanks. The RMCST modifications are intended to support core sampling operations in waste tanks requiring flammable gas controls. The objective of this review was to validate basic design assumptions and concepts to support a path forward leading to a final design. The conclusion reached by the review committee was that the design was acceptable and efforts should continue toward a final design review

  16. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Harris, S.

    2010-09-02

Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).

  17. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    Science.gov (United States)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. 
Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the

  18. Are quantitative trait-dependent sampling designs cost-effective for analysis of rare and common variants?

    Science.gov (United States)

    Yilmaz, Yildiz E; Bull, Shelley B

    2011-11-29

Use of trait-dependent sampling designs in whole-genome association studies of sequence data can reduce total sequencing costs with modest losses of statistical efficiency. In a quantitative trait (QT) analysis of data from the Genetic Analysis Workshop 17 mini-exome for unrelated individuals in the Asian subpopulation, we investigate alternative designs that sequence only 50% of the entire cohort. In addition to a simple random sampling design, we consider extreme-phenotype designs that are of increasing interest in genetic association analysis of QTs, especially in studies concerned with the detection of rare genetic variants. We also evaluate a novel sampling design in which all individuals have a nonzero probability of being selected into the sample but in which individuals with extreme phenotypes have a proportionately larger probability. We take differential sampling of individuals with informative trait values into account by inverse probability weighting using standard survey methods, so that inference generalizes to the source population. In replicate 1 data, we applied the designs in association analysis of Q1 with both rare and common variants in the FLT1 gene, based on knowledge of the generating model. Using all 200 replicate data sets, we similarly analyzed Q1 and Q4 (which is known to be free of association with FLT1) to evaluate relative efficiency, type I error, and power. Simulation study results suggest that the QT-dependent selection designs generally yield greater than 50% relative efficiency compared to using the entire cohort, implying cost-effectiveness of 50% sample selection and a worthwhile reduction of sequencing costs.
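The inverse-probability-weighting idea in this record can be sketched on synthetic data: individuals above an extreme-phenotype threshold get a larger selection probability, and weighting each sampled value by 1/p recovers an approximately unbiased cohort mean while the unweighted mean is pulled towards the oversampled tail. The threshold, probabilities, and trait distribution below are all invented for illustration:

```python
import random
from statistics import mean

random.seed(7)

# Hypothetical cohort of 2000 quantitative-trait values; we sequence roughly
# half of it, with upper-extreme phenotypes given a larger selection probability.
cohort = [random.gauss(0, 1) for _ in range(2000)]
hi = sorted(cohort)[1800]                 # ~90th percentile of the trait

def selection_prob(q):
    # Everyone has nonzero probability; extremes are oversampled.
    return 0.9 if q > hi else 0.47        # averages to ~50% of the cohort

sample, weights = [], []
for q in cohort:
    p = selection_prob(q)
    if random.random() < p:
        sample.append(q)
        weights.append(1.0 / p)           # inverse probability weight

# The weighted (Horvitz-Thompson style) mean generalizes to the cohort;
# the unweighted sample mean is biased towards the oversampled upper tail.
ipw_mean = sum(w * q for w, q in zip(weights, sample)) / sum(weights)
naive_mean = mean(sample)
print(round(ipw_mean, 3), round(naive_mean, 3))
```

The same weights carry over to weighted regression or score equations in the association analysis itself.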

  19. Mechanical design and simulation of an automatized sample exchanger

    International Nuclear Information System (INIS)

    Lopez, Yon; Gora, Jimmy; Bedregal, Patricia; Hernandez, Yuri; Baltuano, Oscar; Gago, Javier

    2013-01-01

The design of a turntable-type sample exchanger for irradiation, with a capacity of up to 20 capsules, was performed. Its function is to automatically send samples contained in polyethylene capsules, via a pneumatic system, to the grid position of the reactor core for irradiation and subsequent neutron activation analysis. This study shows the structural design analysis and the calculations behind the selection of motors and actuators. This development will improve analysis efficiency, reducing the manual workload of the operators and their radiation exposure time. (authors).

  20. Classifier-guided sampling for discrete variable, discontinuous design space exploration: Convergence and computational performance

    Energy Technology Data Exchange (ETDEWEB)

    Backlund, Peter B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shahan, David W. [HRL Labs., LLC, Malibu, CA (United States); Seepersad, Carolyn Conner [Univ. of Texas, Austin, TX (United States)

    2014-04-22

    A classifier-guided sampling (CGS) method is introduced for solving engineering design optimization problems with discrete and/or continuous variables and continuous and/or discontinuous responses. The method merges concepts from metamodel-guided sampling and population-based optimization algorithms. The CGS method uses a Bayesian network classifier for predicting the performance of new designs based on a set of known observations or training points. Unlike most metamodeling techniques, however, the classifier assigns a categorical class label to a new design, rather than predicting the resulting response in continuous space, and thereby accommodates nondifferentiable and discontinuous functions of discrete or categorical variables. The CGS method uses these classifiers to guide a population-based sampling process towards combinations of discrete and/or continuous variable values with a high probability of yielding preferred performance. Accordingly, the CGS method is appropriate for discrete/discontinuous design problems that are ill-suited for conventional metamodeling techniques and too computationally expensive to be solved by population-based algorithms alone. In addition, the rates of convergence and computational properties of the CGS method are investigated when applied to a set of discrete variable optimization problems. Results show that the CGS method significantly improves the rate of convergence towards known global optima, on average, when compared to genetic algorithms.
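A toy version of the classifier-guided sampling loop can be sketched as follows, with a 1-nearest-neighbour classifier standing in for the paper's Bayesian network classifier and a made-up discontinuous objective playing the role of the expensive response; everything here is invented for illustration:

```python
import random

random.seed(3)

# Discrete design space: x in {0..31}^2; a discontinuous objective stands in
# for an expensive simulation (the modulo term creates non-smooth penalties).
def objective(x):
    return (x[0] - 20) ** 2 + (x[1] - 7) ** 2 + (25 if (x[0] + x[1]) % 3 == 0 else 0)

def nearest_label(x, train):
    # 1-nearest-neighbour classifier: a simple stand-in for the Bayesian
    # network classifier used by the CGS method.
    return min(train, key=lambda t: (t[0][0] - x[0]) ** 2 + (t[0][1] - x[1]) ** 2)[1]

# Initial random observations; "good" means in the best half seen so far.
designs = [(random.randrange(32), random.randrange(32)) for _ in range(20)]
scores = {d: objective(d) for d in designs}

for generation in range(5):
    cutoff = sorted(scores.values())[len(scores) // 2]
    train = [(d, s <= cutoff) for d, s in scores.items()]
    # Sample candidates, but spend evaluations only where the classifier
    # predicts preferred performance.
    candidates = [(random.randrange(32), random.randrange(32)) for _ in range(200)]
    for c in candidates:
        if c not in scores and nearest_label(c, train):
            scores[c] = objective(c)

best = min(scores, key=scores.get)
print(best, scores[best])
```

The categorical good/bad label is what lets the approach tolerate the nondifferentiable, discontinuous response that defeats continuous metamodels.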

  1. Outcome-Dependent Sampling Design and Inference for Cox’s Proportional Hazards Model

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P.; Zhou, Haibo

    2016-01-01

We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with existing real data from the Cancer Incidence and Mortality of Uranium Miners Study. PMID:28090134

  2. Low-sensitivity H∞ filter design for linear delta operator systems with sampling time jitter

    Science.gov (United States)

    Guo, Xiang-Gui; Yang, Guang-Hong

    2012-04-01

This article is concerned with the problem of designing H∞ filters for a class of linear discrete-time systems with low-sensitivity to sampling time jitter via delta operator approach. Delta-domain model is used to avoid the inherent numerical ill-condition resulting from the use of the standard shift-domain model at high sampling rates. Based on projection lemma in combination with the descriptor system approach often used to solve problems related to delay, a novel bounded real lemma with three slack variables for delta operator systems is presented. A sensitivity approach based on this novel lemma is proposed to mitigate the effects of sampling time jitter on system performance. Then, the problem of designing a low-sensitivity filter can be reduced to a convex optimisation problem. An important consideration in the design of correlation filters is the optimal trade-off between the standard H∞ criterion and the sensitivity of the transfer function with respect to sampling time jitter. Finally, a numerical example demonstrating the validity of the proposed design method is given.

  3. Recent Results from the SAMPLE Experiment

    International Nuclear Information System (INIS)

    Ito, Takeyasu M.

    2004-01-01

The previous two SAMPLE experiments yielded a measurement of the axial e-N form factor G_A^e substantially different from the theoretical estimate. In order to confirm this observation, a third SAMPLE experiment was carried out at a lower beam energy of 125 MeV (Q² = 0.038 (GeV/c)²) on a deuterium target. The data analysis is now at the final stage and the results are consistent with the theoretical prediction of the axial form factor G_A^e. Also, reevaluation of the background dilution factor and the electromagnetic radiative correction for the 200 MeV deuterium data leads to updated results, which are also consistent with the theoretical prediction

  4. Dealing with trade-offs in destructive sampling designs for occupancy surveys.

    Directory of Open Access Journals (Sweden)

    Stefano Canessa

Full Text Available Occupancy surveys should be designed to minimise false absences. This is commonly achieved by increasing replication or increasing the efficiency of surveys. In the case of destructive sampling designs, in which searches of individual microhabitats represent the repeat surveys, minimising false absences leads to an inherent trade-off. Surveyors can sample more low quality microhabitats, bearing the resultant financial costs and producing wider-spread impacts, or they can target high quality microhabitats where the focal species is more likely to be found and risk more severe impacts on local habitat quality. We show how this trade-off can be solved with a decision-theoretic approach, using the Millewa Skink Hemiergis millewae from southern Australia as a case study. Hemiergis millewae is an endangered reptile that is best detected using destructive sampling of grass hummocks. Within sites that were known to be occupied by H. millewae, logistic regression modelling revealed that lizards were more frequently detected in large hummocks. If this model is an accurate representation of the detection process, searching large hummocks is more efficient and requires less replication, but this strategy also entails destruction of the best microhabitats for the species. We developed an optimisation tool to calculate the minimum combination of the number and size of hummocks to search to achieve a given cumulative probability of detecting the species at a site, incorporating weights to reflect the sensitivity of the results to a surveyor's priorities. The optimisation showed that placing high weight on minimising volume necessitates impractical replication, whereas placing high weight on minimising replication requires searching very large hummocks which are less common and may be vital for H. millewae. While destructive sampling methods are sometimes necessary, surveyors must be conscious of the ecological impacts of these methods. This study provides a
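The replication trade-off in this record rests on the cumulative detection probability 1-(1-p)^n: larger microhabitats raise the per-search detection probability p and so lower the number of searches n needed to reach a target. A sketch with a hypothetical logistic detection model; the coefficients and volumes are invented for illustration, not the fitted H. millewae model:

```python
import math

def detection_prob(volume_l):
    # Hypothetical logistic model: larger hummocks yield higher per-search
    # detection probability (intercept -2.0 and slope 0.08 are illustrative).
    return 1.0 / (1.0 + math.exp(-(-2.0 + 0.08 * volume_l)))

def searches_needed(volume_l, target=0.95):
    # Smallest n such that cumulative detection 1-(1-p)^n >= target.
    p = detection_prob(volume_l)
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))

for vol in (10, 30, 60):
    print(vol, searches_needed(vol))
```

An optimisation like the one described would then weight the destroyed volume (n times the hummock size) against n itself when choosing which combination to search.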

  5. Design development of robotic system for on line sampling in fuel reprocessing

    International Nuclear Information System (INIS)

    Balasubramanian, G.R.; Venugopal, P.R.; Padmashali, G.K.

    1990-01-01

This presentation describes the design and developmental work being carried out for an automated sampling system for fast reactor fuel reprocessing plants. The plant proposes to use an integrated sampling system. Samples are taken across regular process streams from any intermediate hold-up pot. A robot system is planned to take the sample from the sample pot, transfer it to the sample bottle, cap the bottle and transfer the bottle to a pneumatic conveying station. The system covers a large number of sample pots. Alternative automated systems are also examined (1). (author). 4 refs., 2 figs

  6. Design, analysis, and interpretation of field quality-control data for water-sampling projects

    Science.gov (United States)

    Mueller, David K.; Schertz, Terry L.; Martin, Jeffrey D.; Sandstrom, Mark W.

    2015-01-01

    The process of obtaining and analyzing water samples from the environment includes a number of steps that can affect the reported result. The equipment used to collect and filter samples, the bottles used for specific subsamples, any added preservatives, sample storage in the field, and shipment to the laboratory have the potential to affect how accurately samples represent the environment from which they were collected. During the early 1990s, the U.S. Geological Survey implemented policies to include the routine collection of quality-control samples in order to evaluate these effects and to ensure that water-quality data were adequately representing environmental conditions. Since that time, the U.S. Geological Survey Office of Water Quality has provided training in how to design effective field quality-control sampling programs and how to evaluate the resultant quality-control data. This report documents that training material and provides a reference for methods used to analyze quality-control data.

  7. Statistical literacy and sample survey results

    Science.gov (United States)

    McAlevey, Lynn; Sullivan, Charles

    2010-10-01

    Sample surveys are widely used in the social sciences and business. The news media almost daily quote from them, yet they are widely misused. Using students with prior managerial experience embarking on an MBA course, we show that common sample survey results are misunderstood even by those managers who have previously done a statistics course. In general, they fare no better than managers who have never studied statistics. There are implications for teaching, especially in business schools, as well as for consulting.

  8. Latent spatial models and sampling design for landscape genetics

    Science.gov (United States)

    Hanks, Ephraim M.; Hooten, Mevin B.; Knick, Steven T.; Oyler-McCance, Sara J.; Fike, Jennifer A.; Cross, Todd B.; Schwartz, Michael K.

    2016-01-01

    We propose a spatially-explicit approach for modeling genetic variation across space and illustrate how this approach can be used to optimize spatial prediction and sampling design for landscape genetic data. We propose a multinomial data model for categorical microsatellite allele data commonly used in landscape genetic studies, and introduce a latent spatial random effect to allow for spatial correlation between genetic observations. We illustrate how modern dimension reduction approaches to spatial statistics can allow for efficient computation in landscape genetic statistical models covering large spatial domains. We apply our approach to propose a retrospective spatial sampling design for greater sage-grouse (Centrocercus urophasianus) population genetics in the western United States.

  9. Planning Considerations for a Mars Sample Receiving Facility: Summary and Interpretation of Three Design Studies

    Science.gov (United States)

    Beaty, David W.; Allen, Carlton C.; Bass, Deborah S.; Buxbaum, Karen L.; Campbell, James K.; Lindstrom, David J.; Miller, Sylvia L.; Papanastassiou, Dimitri A.

    2009-10-01

    It has been widely understood for many years that an essential component of a Mars Sample Return mission is a Sample Receiving Facility (SRF). The purpose of such a facility would be to take delivery of the flight hardware that lands on Earth, open the spacecraft and extract the sample container and samples, and conduct an agreed-upon test protocol, while ensuring strict containment and contamination control of the samples while in the SRF. Any samples that are found to be non-hazardous (or are rendered non-hazardous by sterilization) would then be transferred to long-term curation. Although the general concept of an SRF is relatively straightforward, there has been considerable discussion about implementation planning. The Mars Exploration Program carried out an analysis of the attributes of an SRF to establish its scope, including minimum size and functionality, budgetary requirements (capital cost, operating costs, cost profile), and development schedule. The approach was to arrange for three independent design studies, each led by an architectural design firm, and compare the results. While there were many design elements in common identified by each study team, there were significant differences in the way human operators were to interact with the systems. In aggregate, the design studies provided insight into the attributes of a future SRF and the complex factors to consider for future programmatic planning.

  10. Visual Sample Plan (VSP) Software: Designs and Data Analyses for Sampling Contaminated Buildings

    International Nuclear Information System (INIS)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Nuffer, Lisa L.; Hassig, Nancy L.

    2005-01-01

A new module of the Visual Sample Plan (VSP) software has been developed to provide sampling designs and data analyses for potentially contaminated buildings. An important application is assessing levels of contamination in buildings after a terrorist attack. This new module, funded by DHS through the Combating Terrorism Technology Support Office, Technical Support Working Group, was developed to provide a tailored, user-friendly and visually oriented buildings module within the existing VSP software toolkit, the latest version of which can be downloaded from http://dqo.pnl.gov/vsp. In case of, or when planning against, a chemical, biological, or radionuclide release within a building, the VSP module can be used to quickly and easily develop and visualize technically defensible sampling schemes for walls, floors, ceilings, and other surfaces to statistically determine if contamination is present, its magnitude and extent throughout the building and if decontamination has been effective. This paper demonstrates the features of this new VSP buildings module, which include: the ability to import building floor plans or to easily draw, manipulate, and view rooms in several ways; being able to insert doors, windows and annotations into a room; 3-D graphic room views with surfaces labeled and floor plans that show building zones that have separate air handling units. The paper will also discuss the statistical design and data analysis options available in the buildings module. Design objectives supported include comparing an average to a threshold when the data distribution is normal or unknown, and comparing measurements to a threshold to detect hotspots or to ensure most of the area is uncontaminated when the data distribution is normal or unknown.

  11. Designing a monitoring program to estimate estuarine survival of anadromous salmon smolts: simulating the effect of sample design on inference

    Science.gov (United States)

    Romer, Jeremy D.; Gitelman, Alix I.; Clements, Shaun; Schreck, Carl B.

    2015-01-01

    A number of researchers have attempted to estimate salmonid smolt survival during outmigration through an estuary. However, it is currently unclear how the design of such studies influences the accuracy and precision of survival estimates. In this simulation study we consider four patterns of smolt survival probability in the estuary, and test the performance of several different sampling strategies for estimating estuarine survival assuming perfect detection. The four survival probability patterns each incorporate a systematic component (constant, linearly increasing, increasing and then decreasing, and two pulses) and a random component to reflect daily fluctuations in survival probability. Generally, spreading sampling effort (tagging) across the season resulted in more accurate estimates of survival. All sampling designs in this simulation tended to under-estimate the variation in the survival estimates because seasonal and daily variation in survival probability are not incorporated in the estimation procedure. This under-estimation results in poorer performance of estimates from larger samples. Thus, tagging more fish may not result in better estimates of survival if important components of variation are not accounted for. The results of our simulation incorporate survival probabilities and run distribution data from previous studies to help illustrate the tradeoffs among sampling strategies in terms of the number of tags needed and distribution of tagging effort. This information will assist researchers in developing improved monitoring programs and encourage discussion regarding issues that should be addressed prior to implementation of any telemetry-based monitoring plan. We believe implementation of an effective estuary survival monitoring program will strengthen the robustness of life cycle models used in recovery plans by providing missing data on where and how much mortality occurs in the riverine and estuarine portions of smolt migration. 
These data
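The effect of spreading versus clustering tagging effort can be illustrated with a small simulation in the spirit of this record; the "two pulse" survival pattern and all parameters below are invented, and detection is assumed perfect as in the study:

```python
import random
from statistics import mean

random.seed(11)

def daily_survival(day):
    # Hypothetical daily estuarine survival over a 60-day season: a
    # systematic "two pulse" pattern plus random daily fluctuation.
    base = 0.55
    pulse = 0.25 if day in range(10, 15) or day in range(40, 45) else 0.0
    return min(0.95, max(0.05, base + pulse + random.gauss(0, 0.03)))

season = [daily_survival(d) for d in range(60)]
true_mean = mean(season)

def estimate(tag_days, tags_per_day=50):
    # Each tagged smolt survives the estuary as a Bernoulli trial on its day.
    survivors = sum(sum(random.random() < season[d] for _ in range(tags_per_day))
                    for d in tag_days)
    return survivors / (tags_per_day * len(tag_days))

spread = estimate(range(0, 60, 6))     # 10 tagging days across the season
clustered = estimate(range(25, 35))    # 10 consecutive mid-season days
print(round(true_mean, 3), round(spread, 3), round(clustered, 3))
```

Spread tagging samples both pulses, so over repeated runs it tracks the season-wide survival better; the clustered design can miss the pulses entirely, which is the bias the simulation study describes.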

  12. Rapid Sampling of Hydrogen Bond Networks for Computational Protein Design.

    Science.gov (United States)

    Maguire, Jack B; Boyken, Scott E; Baker, David; Kuhlman, Brian

    2018-05-08

    Hydrogen bond networks play a critical role in determining the stability and specificity of biomolecular complexes, and the ability to design such networks is important for engineering novel structures, interactions, and enzymes. One key feature of hydrogen bond networks that makes them difficult to rationally engineer is that they are highly cooperative and are not energetically favorable until the hydrogen bonding potential has been satisfied for all buried polar groups in the network. Existing computational methods for protein design are ill-equipped for creating these highly cooperative networks because they rely on energy functions and sampling strategies that are focused on pairwise interactions. To enable the design of complex hydrogen bond networks, we have developed a new sampling protocol in the molecular modeling program Rosetta that explicitly searches for sets of amino acid mutations that can form self-contained hydrogen bond networks. For a given set of designable residues, the protocol often identifies many alternative sets of mutations/networks, and we show that it can readily be applied to large sets of residues at protein-protein interfaces or in the interior of proteins. The protocol builds on a recently developed method in Rosetta for designing hydrogen bond networks that has been experimentally validated for small symmetric systems but was not extensible to many larger protein structures and complexes. The sampling protocol we describe here not only recapitulates previously validated designs with performance improvements but also yields viable hydrogen bond networks for cases where the previous method fails, such as the design of large, asymmetric interfaces relevant to engineering protein-based therapeutics.

  13. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics, which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based approach founded on geostatistics.

  14. A proposal of optimal sampling design using a modularity strategy

    Science.gov (United States)

    Simone, A.; Giustolisi, O.; Laucelli, D. B.

    2016-08-01

    Real water distribution networks (WDNs) contain thousands of nodes, and the optimal placement of pressure and flow observations is a relevant issue for several management tasks. Planning pressure observations, in terms of their number and spatial distribution, is known as sampling design and has traditionally been approached from the standpoint of model calibration. Nowadays, the design of system monitoring is a relevant issue for water utilities, e.g., to manage background leakage, detect anomalies and bursts, and guarantee service quality. In recent years, the optimal location of flow observations, related to the design of optimal district metering areas (DMAs) and to leakage management, has been addressed through optimal network segmentation and the modularity index using a multiobjective strategy. Optimal network segmentation identifies network modules by means of optimal conceptual cuts, which are the candidate locations of the closed gates or flow meters that create the DMAs. Starting from the WDN-oriented modularity index as a metric for WDN segmentation, this paper proposes a new way to perform sampling design, i.e., the optimal location of pressure meters, using a newly developed sampling-oriented modularity index. The strategy optimizes the pressure monitoring system based mainly on network topology, with weights assigned to pipes according to the specific technical tasks. A multiobjective optimization minimizes the cost of pressure meters while maximizing the sampling-oriented modularity index. The methodology is presented and discussed using the Apulian and Exnet networks.
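For readers unfamiliar with the modularity index, a minimal sketch of Newman's modularity for a partition of a small toy network (not the Apulian or Exnet networks, and without the WDN- or sampling-oriented weighting the paper develops) is:

```python
def modularity(edges, partition):
    """Newman's modularity Q for an undirected graph.
    edges: list of (u, v) pairs; partition: dict node -> community id."""
    m = len(edges)
    intra = {}       # intra-community edge counts
    degree_sum = {}  # total degree per community
    for u, v in edges:
        cu, cv = partition[u], partition[v]
        degree_sum[cu] = degree_sum.get(cu, 0) + 1
        degree_sum[cv] = degree_sum.get(cv, 0) + 1
        if cu == cv:
            intra[cu] = intra.get(cu, 0) + 1
    return sum(intra.get(c, 0) / m - (degree_sum[c] / (2 * m)) ** 2
               for c in degree_sum)

# toy network: two triangles joined by a single bridge pipe
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
two_modules = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}
one_module = {n: 0 for n in range(6)}
q2 = modularity(edges, two_modules)  # cutting the bridge scores high
q1 = modularity(edges, one_module)   # a single module scores zero
```

Segmentation algorithms search for the partition (i.e., the set of conceptual cuts) maximizing Q; the paper's contribution is an analogous index tailored to pressure-meter placement.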

  15. Multi-saline sample distillation apparatus for hydrogen isotope analyses: design and accuracy. Water-resources investigations

    International Nuclear Information System (INIS)

    Hassan, A.A.

    1981-04-01

    A distillation apparatus for saline water samples was designed and tested. Six samples may be distilled simultaneously. The temperature was maintained at 400 °C to ensure complete dehydration of the precipitating salts. Consequently, the error in the measured ratio of stable hydrogen isotopes resulting from incomplete dehydration of hydrated salts during distillation was eliminated.

  16. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

    Under cohort sampling designs, additional covariate data are collected on cases of a specific type and a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, an interest may arise to reuse them for studying associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.

  17. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    Science.gov (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
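As an illustration of the method-of-moments (Matheron) estimator discussed above, here is a minimal sketch applied to a synthetic, spatially uncorrelated lognormal field; the plot size, sample size, and lag bins are arbitrary choices for the example, not the study's settings.

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Matheron's method-of-moments estimator:
    gamma(h) = (1 / (2 N(h))) * sum (z_i - z_j)^2
    over pairs whose separation distance falls in lag bin h."""
    n = len(values)
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(n, k=1)          # each pair counted once
    dist, sqd = d[iu], sq[iu]
    gamma, counts = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (dist >= lo) & (dist < hi)
        counts.append(int(mask.sum()))
        gamma.append(0.5 * sqd[mask].mean() if mask.any() else np.nan)
    return np.array(gamma), np.array(counts)

rng = np.random.default_rng(1)
coords = rng.uniform(0, 50, size=(150, 2))   # 150 points on a 50 m plot
values = rng.lognormal(0.0, 0.5, size=150)   # skewed, throughfall-like data
gamma, counts = empirical_variogram(coords, values,
                                    np.array([0.0, 5.0, 10.0, 20.0, 40.0]))
```

The sensitivity to outliers visible in the squared-difference term is exactly why the study compares this estimator against robust and residual-maximum-likelihood alternatives.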

  18. Sampling Design of Soil Physical Properties in a Conilon Coffee Field

    Directory of Open Access Journals (Sweden)

    Eduardo Oliveira de Jesus Santos

    Establishing the number of samples required to determine values of soil physical properties ultimately results in optimization of labor and allows better representation of such attributes. The objective of this study was to analyze the spatial variability of soil physical properties in a Conilon coffee field and propose a soil sampling method better attuned to conditions of the management system. The experiment was performed in a Conilon coffee field in Espírito Santo state, Brazil, under a 3.0 × 2.0 × 1.0 m double-spacing design (4,000 plants ha⁻¹). An irregular grid, with dimensions of 107 × 95.7 m and 65 sampling points, was set up. Soil samples were collected from the 0.00-0.20 m depth at each sampling point. Data were analyzed using descriptive statistics and geostatistics. Using statistical parameters, the adequate number of samples for analyzing the attributes under study was established, ranging from 1 to 11 sampling points. With the exception of particle density, all soil physical properties showed a spatial dependence structure best fitted to the spherical model. Establishing the number of samples and the spatial variability of soil physical properties may be useful in developing sampling strategies that minimize costs for farmers within a tolerable and predictable level of error.
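The "adequate number of samples" step is commonly computed from a Cochran-style formula relating a property's coefficient of variation to a tolerable error around its mean. A sketch, using hypothetical CVs rather than the paper's measured values:

```python
import math

def samples_needed(cv_percent, allowed_error_percent, t=2.0):
    """Approximate number of samples so the mean is estimated within
    +/- allowed_error_percent of its value, given the coefficient of
    variation (Cochran-style formula; t ~ 2 for ~95% confidence)."""
    return math.ceil((t * cv_percent / allowed_error_percent) ** 2)

# hypothetical CVs for two soil physical properties
n_bulk_density = samples_needed(cv_percent=5.0, allowed_error_percent=10.0)
n_macroporosity = samples_needed(cv_percent=35.0, allowed_error_percent=20.0)
```

Low-variability properties such as particle or bulk density typically need only a single sample per field, while highly variable pore-space properties drive the sample count up, matching the 1-to-11 range reported above in spirit.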

  19. Sample Results from MCU Solids Outage

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T.; Washington, A.; Oji, L.; Coleman, C.; Poirier, M.

    2014-09-22

    Savannah River National Laboratory (SRNL) has received several solid and liquid samples from MCU in an effort to understand and recover from the system outage that began on April 6, 2014. SRNL concludes that the presence of solids in the Salt Solution Feed Tank (SSFT) is the likely root cause of the outage, based upon the following discoveries: a solids sample from extraction contactor #1 proved to be mostly sodium oxalate; a solids sample from scrub contactor #1 proved to be mostly sodium oxalate; a solids sample from the SSFT proved to be mostly sodium oxalate; an archived sample from Tank 49H taken last year was shown to contain a fine precipitate of sodium oxalate; a solids sample from ; a liquid sample from the SSFT was shown to have elevated levels of oxalate anion compared to the expected concentration in the feed. Visual inspection of the SSFT indicated the presence of precipitated or transferred solids, which were likely also in the Salt Solution Receipt Tank (SSRT). The presence of the solids, coupled with agitation performed to maintain feed temperature, resulted in oxalate solids migration through the MCU system and caused hydraulic issues that resulted in unplanned phase carryover from the extraction into the scrub, and ultimately the strip, contactors. Not only did this carryover push the Strip Effluent (SE) out of waste acceptance specification, it also deposited solids in several of the contactors. At the same time, extensive deposits of aluminosilicates were found in the drain tube of extraction contactor #1; however, it is not known at this time how the aluminosilicate solids are related to the oxalate solids. The solids were successfully cleaned out of the MCU system, but future consideration must be given to excluding oxalate solids from the MCU system. Fifty-three recommendations for improving operations were recently identified.

  20. Extending cluster Lot Quality Assurance Sampling designs for surveillance programs

    OpenAIRE

    Hund, Lauren; Pagano, Marcello

    2014-01-01

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance based on the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than ...

  1. Evaluation of optimized bronchoalveolar lavage sampling designs for characterization of pulmonary drug distribution.

    Science.gov (United States)

    Clewe, Oskar; Karlsson, Mats O; Simonsson, Ulrika S H

    2015-12-01

    Bronchoalveolar lavage (BAL) is a pulmonary sampling technique for characterization of drug concentrations in epithelial lining fluid and alveolar cells. Two hypothetical drugs with different pulmonary distribution rates (fast and slow) were considered. An optimized BAL sampling design was generated assuming no previous information regarding the pulmonary distribution (rate and extent) and with a maximum of two samples per subject. Simulations were performed to evaluate the impact of the number of samples per subject (1 or 2) and the sample size on the relative bias and relative root mean square error of the parameter estimates (rate and extent of pulmonary distribution). The optimized BAL sampling design depends on a characterized plasma concentration time profile, a population plasma pharmacokinetic model, and the limit of quantification (LOQ) of the BAL method, and involves only two BAL sample time points, one early and one late. The early sample should be taken as early as possible, where concentrations in the BAL fluid ≥ LOQ. The second sample should be taken at a time point in the declining part of the plasma curve, where the plasma concentration is equivalent to the plasma concentration in the early sample. Simulated data generated with the final BAL sampling design, using a previously described general pulmonary distribution model linked to a plasma population pharmacokinetic model, enabled characterization of both the rate and extent of pulmonary distribution. The optimized BAL sampling design enables characterization of both the rate and extent of the pulmonary distribution for both fast and slowly equilibrating drugs.
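The two-point rule described above can be illustrated with a hypothetical one-compartment oral-absorption plasma profile (all parameters and the LOQ are invented, and the plasma curve stands in for the BAL-fluid concentration at the early time point): the early sample is the first time the concentration reaches the LOQ, and the late sample is where the declining plasma curve falls back to that same concentration.

```python
import math

def plasma_conc(t, dose=100.0, V=30.0, ka=1.5, ke=0.2):
    """Assumed one-compartment model with first-order absorption."""
    return dose / V * ka / (ka - ke) * (math.exp(-ke * t) - math.exp(-ka * t))

loq = 0.2                                   # assumed assay LOQ (mg/L)
times = [i * 0.05 for i in range(1, 481)]   # 0.05 h grid out to 24 h

# early sample: first time the concentration reaches the LOQ
t_early = next(t for t in times if plasma_conc(t) >= loq)
c_early = plasma_conc(t_early)

# late sample: time on the declining limb where the plasma concentration
# has fallen back to the early-sample concentration
t_max = max(times, key=plasma_conc)
t_late = next(t for t in times if t > t_max and plasma_conc(t) <= c_early)
```

Placing the two samples at equal plasma concentrations on the rising and declining limbs is what lets the design separate the rate of pulmonary distribution from its extent.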

  2. Optimal experiment design in a filtering context with application to sampled network data

    OpenAIRE

    Singhal, Harsh; Michailidis, George

    2010-01-01

    We examine the problem of optimal design in the context of filtering multiple random walks. Specifically, we define the steady state E-optimal design criterion and show that the underlying optimization problem leads to a second order cone program. The developed methodology is applied to tracking network flow volumes using sampled data, where the design variable corresponds to controlling the sampling rate. The optimal design is numerically compared to a myopic and a naive strategy. Finally, w...

  3. Design/Operations review of core sampling trucks and associated equipment

    International Nuclear Information System (INIS)

    Shrivastava, H.P.

    1996-01-01

    A systematic review of the design and operations of the core sampling trucks was commissioned by Characterization Equipment Engineering of the Westinghouse Hanford Company in October 1995. The review team reviewed the design documents, specifications, operating procedures, training manuals and safety analysis reports. The review process, findings and corrective actions are summarized in this supporting document.

  4. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually-evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact statistical outcome. Power analysis based on the study results indicates that optimum power is derived when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling
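A minimal sketch of the stratified estimator implied above: with beaches grouped into strata, the estimated total oiled area and its variance follow from the standard stratified-sampling formulas (the strata, unit counts, and measurements here are made up for illustration).

```python
def stratified_estimate(strata):
    """Stratified estimate of total oiled area and its variance.
    strata: list of (N_h, samples), where N_h is the number of sampling
    units in stratum h and samples are measured oiled areas (m^2)."""
    total, var = 0.0, 0.0
    for N_h, samples in strata:
        n_h = len(samples)
        mean = sum(samples) / n_h
        s2 = sum((x - mean) ** 2 for x in samples) / (n_h - 1)
        total += N_h * mean
        # variance of the stratum total, with finite population correction
        var += N_h ** 2 * (1 - n_h / N_h) * s2 / n_h
    return total, var

strata = [
    (200, [0.0, 0.0, 1.2, 0.0, 0.4]),   # lightly oiled stratum
    (50, [3.1, 5.0, 2.2, 4.7, 6.0]),    # heavily oiled stratum
]
total_oiled, variance = stratified_estimate(strata)
```

The variance term is what enables the statistical assessment of temporal trends that visual SCAT surveys cannot provide; it also shows why unnecessary strata (small N_h with few samples) waste power.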

  5. Sample Results From Tank 48H Samples HTF-48-14-158, -159, -169, and -170

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hang, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-04-28

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 48H in support of determining the cause for the unusually high dose rates at the sampling points for this tank. A set of two samples was taken from the quiescent tank, and two additional samples were taken after the contents of the tank were mixed. The results of the analyses of all the samples show that the contents of the tank have changed very little since the analysis of the previous sample in 2012. The solids are almost exclusively composed of tetraphenylborate (TPB) salts, and there is no indication of acceleration in the TPB decomposition. The filtrate composition shows a moderate increase in salt concentration and density, which is attributable to the addition of NaOH for the purposes of corrosion control. An older modeling simulation of the TPB degradation was updated, and the supernate results from a 2012 sample were run in the model. This result was compared to the 2014 sample results reported in this document. The model indicates there is no change in the TPB degradation from 2012 to 2014. SRNL measured the buoyancy of the TPB solids in Tank 48H simulant solutions. It was determined that a solution of density 1.279 g/mL (~6.5 M sodium) was capable of indefinitely suspending the TPB solids evenly throughout the solution. A solution of density 1.296 g/mL (~7 M sodium) caused a significant fraction of the solids to float on the solution surface. As the experiments could not include the effect of additional buoyancy elements such as benzene or hydrogen generation, the buoyancy measurements provide an upper bound estimate of the density in Tank 48H required to float the solids.

  6. Sample requirements and design of an inter-laboratory trial for radiocarbon laboratories

    International Nuclear Information System (INIS)

    Bryant, Charlotte; Carmi, Israel; Cook, Gordon; Gulliksen, Steinar; Harkness, Doug; Heinemeier, Jan; McGee, Edward; Naysmith, Philip; Possnert, Goran; Scott, Marian; Plicht, Hans van der; Strydonck, Mark van

    2000-01-01

    An ongoing inter-comparison programme focused on assessing and establishing consensus protocols for the identification, selection and sub-sampling of materials for subsequent 14C analysis is described. The outcome of the programme will provide a detailed quantification of the uncertainties associated with 14C measurements, including the issues of accuracy and precision. Such projects have become recognised as a fundamental aspect of continuing laboratory quality assurance schemes, providing a mechanism for the harmonisation of measurements and for demonstrating the traceability of results. The design of this study and its rationale are described. In summary, a suite of core samples has been defined which will be made available to both AMS and radiometric laboratories. These core materials are representative of routinely dated material and their ages span the full range of the applied 14C time-scale. Two of the samples are of wood from the German and Irish dendrochronologies, thus providing a direct connection to the master dendrochronological calibration curve. Further samples link this new inter-comparison to past studies. Sample size and precision have been identified as being of paramount importance in defining dating confidence, and so several core samples have been identified for more in-depth study of these practical issues. In addition to the core samples, optional samples have been identified and prepared specifically for either AMS and/or radiometric laboratories. For AMS laboratories, these include bone, textile, leather and parchment samples. Participation in the study requires a commitment to a minimum of 10 core analyses, with results to be returned within a year

  7. HPLC/DAD determination of rosmarinic acid in Salvia officinalis: sample preparation optimization by factorial design

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Karina B. de [Universidade Federal do Parana (UFPR), Curitiba, PR (Brazil). Dept. de Farmacia; Oliveira, Bras H. de, E-mail: bho@ufpr.br [Universidade Federal do Parana (UFPR), Curitiba, PR (Brazil). Dept. de Quimica

    2013-01-15

    Sage (Salvia officinalis) contains high amounts of the biologically active rosmarinic acid (RA) and other polyphenolic compounds. RA is easily oxidized, and may undergo degradation during sample preparation for analysis. The objective of this work was to develop and validate an analytical procedure for determination of RA in sage, using factorial design of experiments for optimizing sample preparation. The statistically significant variables for improving RA extraction yield were determined initially and then used in the optimization step, using central composite design (CCD). The analytical method was then fully validated, and used for the analysis of commercial samples of sage. The optimized procedure involved extraction with aqueous methanol (40%) containing an antioxidant mixture (ascorbic acid and ethylenediaminetetraacetic acid (EDTA)), with sonication at 45 °C for 20 min. The samples were then injected in a system containing a C18 column, using methanol (A) and 0.1% phosphoric acid in water (B) in step gradient mode (45A:55B, 0-5 min; 80A:20B, 5-10 min) with a flow rate of 1.0 mL min⁻¹ and detection at 330 nm. Under these conditions, RA concentrations were 50% higher when compared to extractions without antioxidants (98.94 ± 1.07% recovery). Auto-oxidation of RA during sample extraction was prevented by the use of antioxidants, resulting in more reliable analytical results. The method was then used for the analysis of commercial samples of sage. (author)

  9. Optimizing incomplete sample designs for item response model parameters

    NARCIS (Netherlands)

    van der Linden, Willem J.

    Several models for optimizing incomplete sample designs with respect to information on the item parameters are presented. The following cases are considered: (1) known ability parameters; (2) unknown ability parameters; (3) item sets with multiple ability scales; and (4) response models with

  10. Cluster designs to assess the prevalence of acute malnutrition by lot quality assurance sampling: a validation study by computer simulation.

    Science.gov (United States)

    Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J

    2009-04-01

    Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67x3 (67 clusters of three observations) and a 33x6 (33 clusters of six observations) sampling scheme, to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67x3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size required for data collection. The presence of intercluster correlation can dramatically impact the classification error associated with LQAS analysis.
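Under simple random sampling, the two LQAS classification errors come directly from the binomial distribution. A sketch using the 67x3 total sample size (n = 201), with an assumed decision rule and prevalence thresholds (both are illustrative choices, not the study's values):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k + 1))

# LQAS rule: sample n children and classify GAM prevalence as "high"
# if more than d sampled children are malnourished.
n, d = 201, 19                 # 67 clusters x 3 children; d is an assumed rule
p_low, p_high = 0.05, 0.15     # assumed programme thresholds

alpha = 1 - binom_cdf(d, n, p_low)  # P(classify high | prevalence is low)
beta = binom_cdf(d, n, p_high)      # P(classify low | prevalence is high)
```

The study's point is that when observations are clustered, the counts are no longer binomial (they are over-dispersed, e.g. beta-binomial), so errors computed this way understate the true misclassification risk unless the clustering parameterization is accounted for.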

  11. The Study on Mental Health at Work: Design and sampling.

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-08-01

    The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. S-MGA is a representative study of German employees aged 31-60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used to compare the population, the gross sample and the respondents. In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment.
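The two-stage cluster draw described above can be sketched as follows (the municipality count and total address target come from the text; the address frame and the roughly 66 addresses per municipality are simplifying assumptions):

```python
import random

rng = random.Random(2017)
municipalities = [f"muni_{i}" for i in range(12227)]

# stage 1: simple random sample of 206 municipalities
selected = rng.sample(municipalities, 206)

# stage 2: draw addresses within each selected municipality
# (~66 per municipality here, approximating the 13,590 total)
addresses = {m: rng.sample(range(10000), 66) for m in selected}
n_addresses = sum(len(a) for a in addresses.values())
```

Clustering the addresses inside 206 municipalities keeps face-to-face fieldwork feasible at the cost of a design effect that the analysis stage must account for.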

  12. SAMPLE RESULTS FROM THE INTEGRATED SALT DISPOSITION PROGRAM MACROBATCH 4 TANK 21H QUALIFICATION SAMPLES

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T.; Fink, S.

    2011-06-22

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H to qualify them for use in the Integrated Salt Disposition Program (ISDP) Batch 4 processing. All sample results agree with expectations based on prior analyses where available. No issues with the projected Salt Batch 4 strategy are identified. This revision includes additional data points that were not available in the original issue of the document, such as additional plutonium results, the results of the monosodium titanate (MST) sorption test, and the extraction, scrub, and strip (ESS) test. This report covers the revision to the Tank 21H qualification sample results for Macrobatch (Salt Batch) 4 of the Integrated Salt Disposition Program (ISDP). A previous document covers initial characterization, which includes results for a number of non-radiological analytes. These results were used to perform aluminum solubility modeling to determine the hydroxide needs for Salt Batch 4 to prevent the precipitation of solids. Sodium hydroxide was then added to Tank 21 and additional samples were pulled for the analyses discussed in this report. This work was specified by Task Technical Request and by Task Technical and Quality Assurance Plan (TTQAP).

  13. Adaptive clinical trial designs with pre-specified rules for modifying the sample size: understanding efficient types of adaptation.

    Science.gov (United States)

    Levin, Gregory P; Emerson, Sarah C; Emerson, Scott S

    2013-04-15

    Adaptive clinical trial design has been proposed as a promising new approach that may improve the drug discovery process. Proponents of adaptive sample size re-estimation promote its ability to avoid 'up-front' commitment of resources, better address the complicated decisions faced by data monitoring committees, and minimize accrual to studies having delayed ascertainment of outcomes. We investigate aspects of adaptation rules, such as timing of the adaptation analysis and magnitude of sample size adjustment, that lead to greater or lesser statistical efficiency. Owing in part to the recent Food and Drug Administration guidance that promotes the use of pre-specified sampling plans, we evaluate alternative approaches in the context of well-defined, pre-specified adaptation. We quantify the relative costs and benefits of fixed sample, group sequential, and pre-specified adaptive designs with respect to standard operating characteristics such as type I error, maximal sample size, power, and expected sample size under a range of alternatives. Our results build on others' prior research by demonstrating in realistic settings that simple and easily implemented pre-specified adaptive designs provide only very small efficiency gains over group sequential designs with the same number of analyses. In addition, we describe optimal rules for modifying the sample size, providing efficient adaptation boundaries on a variety of scales for the interim test statistic for adaptation analyses occurring at several different stages of the trial. We thus provide insight into what are good and bad choices of adaptive sampling plans when the added flexibility of adaptive designs is desired. Copyright © 2012 John Wiley & Sons, Ltd.

  14. Sampling design and procedures for fixed surface-water sites in the Georgia-Florida coastal plain study unit, 1993

    Science.gov (United States)

    Hatzell, H.H.; Oaksford, E.T.; Asbury, C.E.

    1995-01-01

    The implementation of design guidelines for the National Water-Quality Assessment (NAWQA) Program has resulted in the development of new sampling procedures and the modification of existing procedures commonly used in the Water Resources Division of the U.S. Geological Survey. The Georgia-Florida Coastal Plain (GAFL) study unit began the intensive data collection phase of the program in October 1992. This report documents the implementation of the NAWQA guidelines by describing the sampling design and procedures for collecting surface-water samples in the GAFL study unit in 1993. This documentation is provided for agencies that use water-quality data and for future study units that will be entering the intensive phase of data collection. The sampling design is intended to account for large- and small-scale spatial variations, and temporal variations in water quality for the study area. Nine fixed sites were selected in drainage basins of different sizes and different land-use characteristics located in different land-resource provinces. Each of the nine fixed sites was sampled regularly for a combination of six constituent groups composed of physical and chemical constituents: field measurements, major ions and metals, nutrients, organic carbon, pesticides, and suspended sediments. Some sites were also sampled during high-flow conditions and storm events. Discussion of the sampling procedure is divided into three phases: sample collection, sample splitting, and sample processing. A cone splitter was used to split water samples for the analysis of the sampling constituent groups except organic carbon from approximately nine liters of stream water collected at four fixed sites that were sampled intensively. An example of the sample splitting schemes designed to provide the sample volumes required for each sample constituent group is described in detail. 
Information about onsite sample processing has been organized into a flowchart that describes a pathway for each of

  15. Assessment of long-term gas sampling design at two commercial manure-belt layer barns.

    Science.gov (United States)

    Chai, Li-Long; Ni, Ji-Qin; Chen, Yan; Diehl, Claude A; Heber, Albert J; Lim, Teng T

    2010-06-01

    -sample data were equal at alpha = 0.05 (P > 0.05) were accepted for both gases. The results proved that the long-term gas sampling design was valid in this instance and suggested that the gas sampling design in these two barns was one of the best on the basis of available long-term monitoring instrumentation at reasonable cost.

  16. Most Recent Sampling Results for Annex III Building

    Science.gov (United States)

    Contains email from Scott Miller, US EPA to Scott Kramer. Subject: Most Recent Sampling Results for Annex III Building. (2:52 PM) and Gore(TM) Surveys Analytical Results U.S. Geological Survey, Montgomery, AL.

  17. DESIGN AND CALIBRATION OF A VIBRATING SAMPLE MAGNETOMETER: CHARACTERIZATION OF MAGNETIC MATERIALS

    Directory of Open Access Journals (Sweden)

    Freddy P. Guachun

    2018-01-01

    Full Text Available This paper presents the process followed in the implementation of a vibrating sample magnetometer (VSM), constructed with materials commonly found in an electromagnetism laboratory. It describes the design, construction, calibration, and use of the instrument in the characterization of some magnetic materials. A VSM measures the magnetic moment of a sample as it is vibrated perpendicular to a uniform magnetic field; magnetization and magnetic susceptibility can be determined from these readings. The instrument stands out for its simplicity, versatility, and low cost, yet it is very sensitive and capable of eliminating or minimizing many sources of error found in other measurement methods, allowing very accurate and reliable results to be obtained. Its operation is based on the Faraday-Lenz law of magnetic induction: the voltage induced in detection coils by the variation of the magnetic flux crossing them is measured. The VSM was calibrated with a standard sample (magnetite) and verified with a test sample (nickel).
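
    The calibration step described above amounts to a linear scale factor between the induced voltage and the known moment of the standard sample. A minimal sketch of that idea; the moment and voltage values below are illustrative assumptions, not taken from the paper:

```python
# Hypothetical VSM calibration: a standard sample of known moment fixes the
# proportionality between the lock-in voltage and the magnetic moment.
def calibrate(v_standard, m_standard):
    """Calibration constant (moment units per volt) from a standard sample."""
    return m_standard / v_standard

def magnetic_moment(v_sample, k):
    """Convert an induced-voltage reading to a magnetic moment."""
    return k * v_sample

# Illustrative numbers (not from the paper): magnetite standard
k = calibrate(v_standard=2.0e-3, m_standard=92.0)
print(round(magnetic_moment(v_sample=1.5e-3, k=k), 6))  # 69.0
```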

  18. Recent results of the investigation of a micro-fluidic sampling chip and sampling system for hot cell aqueous processing streams

    International Nuclear Information System (INIS)

    Tripp, J.; Smith, T.; Law, J.

    2013-01-01

    A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially sampling technologies were evaluated and micro-fluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. Different sampling volumes have been tested. It appears that the 10 μl volume has produced data that had much smaller relative standard deviations than the 2 μl volume. In addition, the production of a less expensive, mass produced sampling chip was investigated to avoid chip reuse thus increasing sampling reproducibility/accuracy. The micro-fluidic-based robotic sampling system's mechanical elements were tested to ensure analytical reproducibility and the optimum robotic handling of micro-fluidic sampling chips. (authors)
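
    The comparison of sampling volumes above rests on the relative standard deviation (RSD) of replicate analyses. A minimal sketch of that comparison, with invented replicate concentrations rather than the paper's data:

```python
import statistics

# Relative standard deviation (RSD) of replicate analyses; a smaller RSD
# indicates better sampling reproducibility.
def relative_std_dev(readings):
    return 100.0 * statistics.stdev(readings) / statistics.mean(readings)

# Invented replicate concentrations for the two sampling volumes:
ten_microliter = [101.2, 99.8, 100.5, 100.1]
two_microliter = [95.0, 108.0, 98.5, 104.0]
# Larger sampling volume -> tighter replicates -> smaller RSD in this sketch
print(relative_std_dev(ten_microliter) < relative_std_dev(two_microliter))  # True
```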

  19. Incorporating covariance estimation uncertainty in spatial sampling design for prediction with trans-Gaussian random fields

    Directory of Open Access Journals (Sweden)

    Gunter eSpöck

    2015-05-01

    Full Text Available Recently, Spock and Pilz [38] demonstrated that the spatial sampling design problem for the Bayesian linear kriging predictor can be transformed to an equivalent experimental design problem for a linear regression model with stochastic regression coefficients and uncorrelated errors. The stochastic regression coefficients derive from the polar spectral approximation of the residual process. Thus, standard optimal convex experimental design theory can be used to calculate optimal spatial sampling designs. The design functionals considered in Spock and Pilz [38] did not take into account the fact that kriging is actually a plug-in predictor which uses the estimated covariance function. The resulting optimal designs were close to space-filling configurations, because the design criterion did not consider the uncertainty of the covariance function. In this paper we also assume that the covariance function is estimated, e.g., by restricted maximum likelihood (REML). We then develop a design criterion that fully takes account of the covariance uncertainty. The resulting designs are less regular and space-filling compared to those ignoring covariance uncertainty. The new designs, however, also require some closely spaced samples in order to improve the estimate of the covariance function. We also relax the assumption of Gaussian observations and assume that the data is transformed to Gaussianity by means of the Box-Cox transformation. The resulting prediction method is known as trans-Gaussian kriging. We apply the Smith and Zhu [37] approach to this kriging method and show that the resulting optimal designs also depend on the available data. We illustrate our results with a data set of monthly rainfall measurements from Upper Austria.
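
    The Box-Cox step mentioned above can be sketched directly. Lambda is treated as known here, although in practice it is estimated from the data:

```python
import math

# Box-Cox transform toward Gaussianity and its inverse back-transform,
# as used in trans-Gaussian kriging.
def box_cox(x, lam):
    if lam == 0.0:
        return math.log(x)
    return (x ** lam - 1.0) / lam

def box_cox_inverse(y, lam):
    if lam == 0.0:
        return math.exp(y)
    return (lam * y + 1.0) ** (1.0 / lam)

# Round trip on an illustrative monthly rainfall value (mm):
y = box_cox(48.0, lam=0.5)
print(round(box_cox_inverse(y, lam=0.5), 6))  # 48.0
```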

  20. Optimization of sampling pattern and the design of Fourier ptychographic illuminator.

    Science.gov (United States)

    Guo, Kaikai; Dong, Siyuan; Nanda, Pariksheet; Zheng, Guoan

    2015-03-09

    Fourier ptychography (FP) is a recently developed imaging approach that facilitates high-resolution imaging beyond the cutoff frequency of the employed optics. In the original FP approach, a periodic LED array is used for sample illumination, and therefore, the scanning pattern is a uniform grid in the Fourier space. Such a uniform sampling scheme leads to three major problems for FP, namely: 1) it requires a large number of raw images, 2) it introduces raster grid artifacts in the reconstruction process, and 3) it requires a high-dynamic-range detector. Here, we investigate scanning sequences and sampling patterns to optimize the FP approach. For most biological samples, signal energy is concentrated in the low-frequency region, and as such, we can perform non-uniform Fourier sampling in FP by considering the signal structure. In contrast, conventional ptychography performs uniform sampling over the entire real space. To implement the non-uniform Fourier sampling scheme in FP, we have designed and built an illuminator using LEDs mounted on a 3D-printed plastic case. The advantages of this illuminator are threefold in that: 1) it reduces the number of image acquisitions by at least 50% (68 raw images versus 137 in the original FP setup), 2) it departs from the translational symmetry of sampling to solve the raster grid artifact problem, and 3) it reduces the dynamic range of the captured images 6-fold. The results reported in this paper significantly shorten acquisition time and improve the quality of FP reconstructions. They may provide new insights for developing Fourier ptychographic imaging platforms and find important applications in digital pathology.
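
    The non-uniform sampling idea can be illustrated by an LED layout whose density falls off with radius in Fourier space, compared against a uniform grid. The ring radii and LED counts below are illustrative assumptions, not the authors' design:

```python
import math

# Sketch of a low-frequency-weighted LED pattern: concentric rings whose
# angular sampling gets sparser with radius (dense near the Fourier center).
def ring_pattern(ring_radii, leds_per_ring):
    positions = []
    for r, n in zip(ring_radii, leds_per_ring):
        for k in range(n):
            theta = 2.0 * math.pi * k / n
            positions.append((r * math.cos(theta), r * math.sin(theta)))
    return positions

nonuniform = ring_pattern([0.0, 1.0, 2.0, 3.0], [1, 8, 16, 24])
uniform_grid = [(i, j) for i in range(-5, 6) for j in range(-5, 6)]
# Fewer than half the acquisitions of the uniform grid, as in the paper's
# 68-versus-137 comparison (counts here are illustrative):
print(len(nonuniform), len(uniform_grid))  # 49 121
```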

  1. Design of a gravity corer for near shore sediment sampling

    Digital Repository Service at National Institute of Oceanography (India)

    Bhat, S.T.; Sonawane, A.V.; Nayak, B.U.

    For the purpose of geotechnical investigation, a gravity corer has been designed and fabricated to obtain undisturbed sediment core samples from near shore waters. The corer was successfully operated at 75 stations up to a water depth of 30 m. Simplicity...

  2. Analytical results of Tank 38H core samples -- Fall 1999

    International Nuclear Information System (INIS)

    Swingle, R.F.

    2000-01-01

    Two samples were pulled from Tank 38H in the Fall of 1999: a variable depth sample (VDS) of the supernate was pulled in October, and a core sample from the salt layer was pulled in December. Analysis of the rinse from the outside of the core sample indicated no sign of volatile or semivolatile organics. Both the supernate and solids from the VDS and the dried core sample solids were analyzed for isotopes that could pose a criticality concern and for elements that could serve as neutron poisons, as well as other elements. Results of the elemental analyses show that elements capable of mitigating the potential for nuclear criticality are present in significant quantities. However, it should be noted that the results given for the VDS solids elemental analyses may be higher than the actual concentrations in the solids, since the filter paper was dissolved along with the sample solids.

  3. A binary logistic regression model with complex sampling design of ...

    African Journals Online (AJOL)

    2017-09-03

    Bivariable and multivariable binary logistic regression models with complex sampling design were fitted. Data were entered into STATA-12 and analyzed using SPSS-21.

  4. Design and construction of a prototype vaporization calorimeter for the assay of radioisotopic samples

    International Nuclear Information System (INIS)

    Tormey, T.V.

    1979-10-01

    A prototype vaporization calorimeter has been designed and constructed for use in the assay of low power output radioisotopic samples. The prototype calorimeter design was based on that of a previous experimental instrument used by H.P. Stephens to establish the feasibility of the vaporization calorimetry technique for this type of power measurement. The calorimeter is composed of a mechanical calorimeter assembly together with a data acquisition and control system. Detailed drawings of the calorimeter assembly are included and additional drawings are referenced. The data acquisition system is based on an HP 9825A programmable calculator. A description of the hardware is provided together with a listing of all system software programs. The operating procedure is outlined, including initial setup and operation of all related equipment. Preliminary system performance was evaluated by making a series of four measurements on two nominal 1.5 W samples and on a nominal 0.75 W sample. Data for these measurements indicate that the absolute accuracy (one standard deviation) is approximately 0.0035 W in this power range, resulting in an estimated relative one-standard-deviation accuracy of 0.24% at 1.5 W and 0.48% at 0.75 W.
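
    The quoted relative accuracies follow from dividing the absolute accuracy by the sample power. Back-computing suggests an unrounded absolute accuracy near 0.0036 W, consistent with the "approximately 0.0035 W" figure; the 0.0036 W value below is an assumption made only to reproduce the rounding:

```python
# Relative one-standard-deviation accuracy = absolute accuracy / power.
# 0.0036 W is an assumed unrounded value that reproduces both quoted
# percentages (the report rounds it to ~0.0035 W).
absolute_sigma = 0.0036  # W, assumed
print(round(100.0 * absolute_sigma / 1.5, 2))   # 0.24
print(round(100.0 * absolute_sigma / 0.75, 2))  # 0.48
```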

  5. ANALYTICAL RESULTS OF MOX COLEMANITE CONCRETE SAMPLES POURED AUGUST 29, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Best, D.; Cozzi, A.; Reigel, M.

    2012-12-20

    The Mixed Oxide Fuel Fabrication Facility (MFFF) will use colemanite-bearing concrete neutron absorber panels credited with attenuating neutron flux in the criticality design analyses and shielding operators from radiation. The Savannah River National Laboratory is tasked with measuring the total density, partial hydrogen density, and partial boron density of the colemanite concrete. Samples poured 8/29/12 were received on 9/20/2012 and analyzed. The average total density of each of the samples, measured by ASTM method C 642, was within the lower bound of 1.88 g/cm³. The average partial hydrogen density of samples 8.6.1, 8.7.1, and 8.5.3, as measured using method ASTM E 1311, met the lower bound of 6.04E-02 g/cm³. The average measured partial boron density of each sample met the lower bound of 1.65E-01 g/cm³, measured by the ASTM C 1301 method. The average partial hydrogen density of samples 8.5.1, 8.6.3, and 8.7.3 did not meet the lower bound. The samples, as received, were not wrapped in a moist towel as previous samples were and appeared to be somewhat drier. This may explain the lower partial hydrogen density with respect to previous samples.

  6. Sample size reassessment for a two-stage design controlling the false discovery rate.

    Science.gov (United States)

    Zehetmayer, Sonja; Graf, Alexandra C; Posch, Martin

    2015-11-01

    Sample size calculations for gene expression microarray and NGS-RNA-Seq experiments are challenging because the overall power depends on unknown quantities such as the proportion of true null hypotheses and the distribution of the effect sizes under the alternative. We propose a two-stage design with an adaptive interim analysis where these quantities are estimated from the interim data. The second-stage sample size is chosen based on these estimates to achieve a specific overall power. The proposed procedure controls the power in all considered scenarios except for very low first-stage sample sizes. The false discovery rate (FDR) is controlled despite the data-dependent choice of sample size. The two-stage design can be a useful tool for determining the sample size of high-dimensional studies when there is high uncertainty in the planning phase regarding the expected effect sizes and variability.
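
    The second-stage sample-size choice can be sketched with a normal approximation for a per-gene two-sample comparison. This is not the authors' procedure, only the underlying idea: `alpha_star` stands in for the per-test significance level implied by the FDR target and the interim estimates, and all numbers are illustrative:

```python
import math
from statistics import NormalDist

# Second-stage n per group for a standardized effect, normal approximation.
# alpha_star and power are illustrative stand-ins for quantities that the
# adaptive interim analysis would estimate.
def stage2_n_per_group(effect, alpha_star=0.001, power=0.9):
    z = NormalDist().inv_cdf
    n = 2.0 * (z(1.0 - alpha_star / 2.0) + z(power)) ** 2 / effect ** 2
    return math.ceil(n)

# Interim data suggest a standardized effect of 1.0:
print(stage2_n_per_group(1.0))  # 42
```

Halving the assumed effect size roughly quadruples the required second-stage sample, which is why reliable interim estimates matter.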

  7. Comparison of sampling designs for estimating deforestation from landsat TM and MODIS imagery: a case study in Mato Grosso, Brazil.

    Science.gov (United States)

    Zhu, Shanyou; Zhang, Hailong; Liu, Ronggao; Cao, Yun; Zhang, Guixin

    2014-01-01

    Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.
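
    The stratified estimator underlying this comparison expands each stratum's sample mean by its block count. A minimal sketch with invented strata and deforestation values:

```python
import statistics

# Stratified expansion estimator: each stratum's sample mean times the
# number of blocks in that stratum. Strata here are imagined as built from
# MODIS-derived hotspot intensity; all values are invented.
def stratified_total(strata):
    """strata: list of (blocks_in_stratum, sampled_block_values)."""
    return sum(n * statistics.mean(sample) for n, sample in strata)

strata = [
    (10, [12.0, 15.0, 9.0]),   # high-deforestation hotspot stratum (km^2/block)
    (40, [3.0, 4.0, 2.0]),     # medium
    (100, [0.2, 0.4, 0.0]),    # low
]
print(round(stratified_total(strata), 6))  # 260.0
```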

  8. Results of Plutonium Intercalibration in Seawater and Seaweed Samples

    International Nuclear Information System (INIS)

    Fukai, R.; Murray, C.N.

    1976-01-01

    The results of the intercalibration exercise for the measurement of plutonium-239 and plutonium-238 in two seawater samples, SW-I-1 and SW-I-2, and a marine algae sample, AG-I-1, are presented. Seventeen laboratories from eight countries, as well as the IAEA International Laboratory of Marine Radioactivity, took part. A discussion of the results and the methods used in the analysis is given. It is concluded that, in spite of the complicated chemical procedures involved in plutonium analysis, the scatter of the reported results was much smaller than that for fission product radionuclides such as strontium-90, ruthenium-106, and cesium-137. (author)

  9. A two-phase sampling design for increasing detections of rare species in occupancy surveys

    Science.gov (United States)

    Pacifici, Krishna; Dorazio, Robert M.; Dorazio, Michael J.

    2012-01-01

    1. Occupancy estimation is a commonly used tool in ecological studies owing to the ease with which data can be collected and the large spatial extent that can be covered. One major obstacle to using an occupancy-based approach is the complication of designing and implementing an efficient survey. These logistical challenges become magnified when working with rare species, when effort can be wasted in areas with few or no individuals. 2. Here, we develop a two-phase sampling approach that mitigates these problems by using a design that places more effort in areas with higher predicted probability of occurrence. We compare our new sampling design to traditional single-season occupancy estimation under a range of conditions and population characteristics. We develop an intuitive measure of predictive error to compare the two approaches and use simulations to assess the relative accuracy of each approach. 3. Our two-phase approach exhibited lower predictive error rates than the traditional single-season approach in highly spatially correlated environments. The difference was greatest when detection probability was high (0.75) regardless of the habitat or sample size. When the true occupancy rate was below 0.4 (0.05-0.4), we found that allocating 25% of the sample to the first phase resulted in the lowest error rates. 4. In the majority of scenarios, the two-phase approach showed lower error rates than the traditional single-season approach, suggesting our new approach is fairly robust to a broad range of conditions and design factors and merits use under a wide variety of settings. 5. Synthesis and applications. Conservation and management of rare species are a challenging task facing natural resource managers. It is critical for studies involving rare species to efficiently allocate effort and resources as they are usually of a finite nature. We believe our approach provides a framework for optimal allocation of effort while
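
    The 25% first-phase allocation can be sketched as a simple budget split followed by targeting the sites ranked highest by predicted occupancy. The scores and the top-half targeting rule below are illustrative assumptions, not the authors' model:

```python
# Two-phase survey budget sketch: a broad first phase on 25% of the visits,
# then the remainder concentrated where predicted occupancy is high.
def allocate(total_visits, site_scores, phase1_fraction=0.25):
    phase1 = int(total_visits * phase1_fraction)
    phase2 = total_visits - phase1
    # Rank sites by predicted occupancy; target the top half in phase 2:
    ranked = sorted(site_scores, key=site_scores.get, reverse=True)
    targets = ranked[: max(1, len(ranked) // 2)]
    return phase1, phase2, targets

p1, p2, targets = allocate(100, {"A": 0.8, "B": 0.1, "C": 0.6, "D": 0.3})
print(p1, p2, targets)  # 25 75 ['A', 'C']
```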

  10. The 2003 Australian Breast Health Survey: survey design and preliminary results

    Directory of Open Access Journals (Sweden)

    Favelle Simone

    2008-01-01

    Full Text Available Abstract Background The Breast Health Surveys, conducted by the National Breast Cancer Centre (NBCC) in 1996 and 2003, are designed to gain insight into the knowledge, attitudes and behaviours of a nationally representative sample of Australian women on issues relevant to breast cancer. In this article, we focus on major aspects of the design and present results on respondents' knowledge about mammographic screening. Methods The 2003 BHS surveyed English-speaking Australian women aged 30–69 without a history of breast cancer using computer-assisted telephone interviewing. Questions covered the following themes: knowledge and perceptions about incidence, mortality and risk; knowledge and behaviour regarding early detection, symptoms and diagnosis; mammographic screening; treatment; and accessibility and availability of information and services. Respondents were selected using a complex sample design involving stratification. Sample weights against Australian population benchmarks were used in all statistical analyses. Means and proportions for the entire population and by age group and area of residence were calculated. Statistical tests were conducted using a level of significance of 0.01. Results Of the 3,144 respondents who consented to being interviewed, 138 (4.4%) had a previous diagnosis of breast cancer and were excluded, leaving 3,006 completed interviews eligible for analysis. A majority of respondents (61.1%) reported ever having had a mammogram and 29.1% identified mammography as being the best way of finding breast cancer. A majority of women (85.9%) had heard of the BreastScreen Australia (BSA) program, the national mammographic screening program providing free biennial screening mammograms, with 94.5% believing that BSA attendance was available regardless of the presence or absence of symptoms. There have been substantial gains in women's knowledge about mammographic screening over the seven years between the two surveys.
Conclusion The

  11. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Science.gov (United States)

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical

  12. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Directory of Open Access Journals (Sweden)

    David C Pavlacky

    Full Text Available Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous

  13. Design for mosquito abundance, diversity, and phenology sampling within the National Ecological Observatory Network

    Science.gov (United States)

    Hoekman, D.; Springer, Yuri P.; Barker, C.M.; Barrera, R.; Blackmore, M.S.; Bradshaw, W.E.; Foley, D. H.; Ginsberg, Howard; Hayden, M. H.; Holzapfel, C. M.; Juliano, S. A.; Kramer, L. D.; LaDeau, S. L.; Livdahl, T. P.; Moore, C. G.; Nasci, R.S.; Reisen, W.K.; Savage, H. M.

    2016-01-01

    The National Ecological Observatory Network (NEON) intends to monitor mosquito populations across its broad geographical range of sites because of their prevalence in food webs, sensitivity to abiotic factors and relevance for human health. We describe the design of mosquito population sampling in the context of NEON's long-term, continental-scale monitoring program, emphasizing the sampling design schedule, priorities and collection methods. Freely available NEON data and associated field and laboratory samples will increase our understanding of how mosquito abundance, demography, diversity and phenology are responding to land use and climate change.

  14. Comparison of Sampling Designs for Estimating Deforestation from Landsat TM and MODIS Imagery: A Case Study in Mato Grosso, Brazil

    Directory of Open Access Journals (Sweden)

    Shanyou Zhu

    2014-01-01

    Full Text Available Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.

  15. Test results of the first 50 kA NbTi full size sample for ITER

    International Nuclear Information System (INIS)

    Ciazynski, D.; Zani, L.; Huber, S.; Stepanov, B.; Karlemo, B.

    2003-01-01

    Within the framework of the research studies for the International Thermonuclear Experimental Reactor (ITER) project, the first full size NbTi conductor sample was fabricated in industry and tested in the SULTAN facility (Villigen, Switzerland). This sample (PF-FSJS), which is relevant to the Poloidal Field coils of ITER, is composed of two parallel straight bars of conductor, connected at the bottom through a joint designed according to the CEA twin-box concept. The two conductor legs are identical except for the use of different strands: a nickel-plated NbTi strand with a pure copper matrix in one leg, and a bare NbTi strand with a copper matrix and an internal CuNi barrier in the other leg. The two conductors and the joint were extensively tested for DC (direct current) and AC (alternating current) properties. This paper reports on the test results and their analysis, stressing the differences between the two conductor legs and discussing the impact of the test results on the ITER design criteria for the conductor and joint. While the joint DC resistance and the conductor and joint AC losses fulfilled the ITER requirements, neither conductor could reach its current sharing temperature at relevant ITER currents, due to instabilities. Although the drop in temperature is slight for the CuNi strand cable, it is more significant for the Ni-plated strand cable. (authors)

  16. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments

    International Nuclear Information System (INIS)

    Gopalakrishnan, V.; Subramanian, V.; Baskaran, R.; Venkatraman, B.

    2015-01-01

    Wireless based custom built aerosol sampling network is designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems are used in field measurement campaign, in which sodium aerosol dispersion experiments have been conducted as a part of environmental impact studies related to sodium cooled fast reactor. The sampling network contains 40 aerosol sampling units and each contains custom built sampling head and the wireless control networking designed with Programmable System on Chip (PSoC™) and Xbee Pro RF modules. The base station control is designed using graphical programming language LabView. The sampling network is programmed to operate in a preset time and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environment sampling system deployed in wide area and uneven terrain where manual operation is difficult due to the requirement of simultaneous operation and status logging

  17. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments

    Energy Technology Data Exchange (ETDEWEB)

    Gopalakrishnan, V.; Subramanian, V.; Baskaran, R.; Venkatraman, B. [Radiation Impact Assessment Section, Radiological Safety Division, Indira Gandhi Centre for Atomic Research, Kalpakkam 603 102 (India)

    2015-07-15

    Wireless based custom built aerosol sampling network is designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems are used in field measurement campaign, in which sodium aerosol dispersion experiments have been conducted as a part of environmental impact studies related to sodium cooled fast reactor. The sampling network contains 40 aerosol sampling units and each contains custom built sampling head and the wireless control networking designed with Programmable System on Chip (PSoC™) and Xbee Pro RF modules. The base station control is designed using graphical programming language LabView. The sampling network is programmed to operate in a preset time and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environment sampling system deployed in wide area and uneven terrain where manual operation is difficult due to the requirement of simultaneous operation and status logging.

  18. A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin

    Science.gov (United States)

    Blaschek, Michael; Duttmann, Rainer

    2015-04-01

    The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand, depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km² river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design has been applied, aiming at the spatial prediction of soil property values at individual locations. The stratification was based on quantiles from density functions of two land-surface parameters, topographic wetness index and potential incoming solar radiation, derived from a digital elevation model. Combined with four main geological units, the applied procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly, excluding areas smaller than 1 ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection, masking out roads and buildings using a 20 m buffer. The selection procedure was repeated ten times, and the set of polygons with the best geographical spread was chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that had either not been considered during the first phase at all or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points.
The selection of sample point locations has been done using
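The quantile-based stratification described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' code: the covariate values, tercile cuts, and four geology ids below are made-up stand-ins for the TWI, solar-radiation and geological layers of the study.

```python
import random
from statistics import quantiles

# Hypothetical sketch of quantile-based stratification: each raster cell is
# classed by terciles of two terrain covariates crossed with a geological
# unit, then up to six first-stage units are drawn at random per stratum.
random.seed(42)

# Fake cells: (topographic wetness index, solar radiation, geology id)
cells = [(random.uniform(2, 15), random.uniform(0.5, 1.2), random.randrange(4))
         for _ in range(1000)]

twi_cuts = quantiles([c[0] for c in cells], n=3)   # two tercile boundaries
rad_cuts = quantiles([c[1] for c in cells], n=3)

def stratum(cell):
    twi, rad, geo = cell
    twi_cls = sum(twi > cut for cut in twi_cuts)   # 0, 1 or 2
    rad_cls = sum(rad > cut for cut in rad_cuts)
    return (twi_cls, rad_cls, geo)                 # up to 3 * 3 * 4 strata

strata = {}
for cell in cells:
    strata.setdefault(stratum(cell), []).append(cell)

# First-stage selection: at most six units per stratum, drawn at random
sample = {s: random.sample(members, min(6, len(members)))
          for s, members in strata.items()}
```

In the study the units are polygons with area and accessibility filters applied before the draw; here the same idea is shown on bare cells.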

  19. Report: Independent Environmental Sampling Shows Some Properties Designated by EPA as Available for Use Had Some Contamination

    Science.gov (United States)

    Report #15-P-0221, July 21, 2015. Some OIG sampling results showed contamination was still present at sites designated by the EPA as ready for reuse. This was unexpected and could signal a need to implement changes to ensure human health protection.

  20. Shielding design of highly activated sample storage at reactor TRIGA PUSPATI

    International Nuclear Information System (INIS)

    Naim Syauqi Hamzah; Julia Abdul Karim; Mohamad Hairie Rabir; Muhd Husamuddin Abdul Khalil; Mohd Amin Sharifuldin Salleh

    2010-01-01

    Radiation protection has always been one of the most important considerations in the management of Reaktor Triga PUSPATI (RTP). Currently, demand for sample activation is increasing from a variety of applicants in different research fields. Radiological hazards may occur if sample evaluations are misjudged or miscalculated. At present, there is no appropriate storage for highly activated samples. For that purpose, a special irradiated-sample storage box should be provided in order to segregate highly activated samples that produce high dose levels from typical activated samples that produce lower dose levels (1 - 2 mR/hr). In this study, the thicknesses required by common shielding materials such as lead and concrete to reduce a highly activated radiotracer sample (potassium bromide) with an initial exposure dose of 5 R/hr to background level (0.05 mR/hr) were determined. Analyses were done using several methods, including the conventional shielding equation, half-value-layer calculation and the MicroShield computer code. A design for a new irradiated-sample storage box for RTP capable of containing high-level gamma radioactivity is then proposed. (author)
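The half-value-layer calculation mentioned in this abstract reduces to simple arithmetic: count the halvings needed to bring 5 R/hr down to 0.05 mR/hr. A minimal sketch follows; the HVL figures for lead and concrete are rough illustrative values for a ~0.66 MeV gamma, not numbers taken from the study.

```python
import math

# How many half-value layers (HVLs) reduce the surface dose rate of
# 5 R/hr (= 5000 mR/hr) to the 0.05 mR/hr background level?
initial_mR_hr = 5000.0
target_mR_hr = 0.05

attenuation = initial_mR_hr / target_mR_hr      # factor of 1e5
n_hvl = math.log2(attenuation)                  # ~16.6 halvings

# Illustrative HVLs in cm at ~0.66 MeV (assumed; energy-dependent in reality)
hvl_cm = {"lead": 0.65, "concrete": 4.8}
thickness_cm = {m: n_hvl * h for m, h in hvl_cm.items()}
```

Under these assumed HVLs the estimate comes to roughly 11 cm of lead or 80 cm of concrete; the study's MicroShield runs would refine such a first-order estimate with build-up factors and the actual gamma spectrum.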

  1. Development of a standard data base for FBR core nuclear design (XIII). Analysis of small sample reactivity experiments at ZPPR-9

    International Nuclear Information System (INIS)

    Sato, Wakaei; Fukushima, Manabu; Ishikawa, Makoto

    2000-09-01

    A comprehensive study to evaluate and accumulate the abundant results of fast reactor physics is now in progress at the O-arai Engineering Center to improve analytical methods and the prediction accuracy of nuclear design for large fast breeder cores such as future commercial FBRs. The present report summarizes the analytical results of the sample reactivity experiments at the ZPPR-9 core, which had not yet been evaluated by the latest analytical method. The intention of the work is to extend and further generalize the standard data base for FBR core nuclear design. The analytical results of the sample reactivity experiments (samples: PU-30, U-6, DU-6, SS-1 and B-1) at the ZPPR-9 core in the JUPITER series, obtained with the latest nuclear data library JENDL-3.2 and the analytical method established by the JUPITER analysis, can be summarized as follows. The region-averaged final C/E values generally agreed with unity within 5% in the inner core region. However, the C/E values of every sample showed a radial space-dependency, increasing from the center to the core edge; the discrepancy for B-1 was the largest, at about 10%. Next, the influence of the present analytical results for the ZPPR-9 sample reactivity on the cross-section adjustment was evaluated. The reference case was the unified cross-section set ADJ98 based on the recent JUPITER analysis. In conclusion, the present analytical results have sufficient physical consistency with other JUPITER data and qualify as part of the standard data base for FBR nuclear design. (author)

  2. Design of a Clean Room for Quality Control of an Environmental Sampling in KINAC

    International Nuclear Information System (INIS)

    Yoon, Jongho; Ahn, Gil Hoon; Seo, Hana; Han, Kitek; Park, Il Jin

    2014-01-01

    The objective of environmental sampling and analysis for safeguards is to characterize the nuclear materials handled and the activities conducted at specific locations. KINAC is responsible for the conclusions drawn from the analytical results provided by the analytical laboratories. To assure itself of the continuity of the quality of the analytical results provided by the laboratories, KINAC will implement a quality control (QC) programme. One element of the QC programme is the preparation of QC samples. The establishment of a clean room is needed to handle QC samples because of the stringent contamination control required. KINAC designed a clean facility with a cleanliness of ISO Class 6, the Clean Room for Estimation and Assay of trace Nuclear materials (CREAN), to meet the conflicting requirements of a clean room and of nuclear material handling under Korean law. The clean room is expected to acquire a radiation safety license under these conditions this year, with continuing improvements to follow. The construction of the CREAN facility will be completed by the middle of 2015. The establishment of a clean room is essential to the QC programme; it will not only support the quality control system for the national environmental sampling programme but will also allow environmental sample analysis techniques to be applied to nuclear forensics.

  3. Effects of sample survey design on the accuracy of classification tree models in species distribution models

    Science.gov (United States)

    Thomas C. Edwards; D. Richard Cutler; Niklaus E. Zimmermann; Linda Geiser; Gretchen G. Moisen

    2006-01-01

    We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by...

  4. Two specialized delayed-neutron detector designs for assays of fissionable elements in water and sediment samples

    International Nuclear Information System (INIS)

    Balestrini, S.J.; Balagna, J.P.; Menlove, H.O.

    1976-01-01

    Two specialized neutron-sensitive detectors are described which are employed for rapid assays of fissionable elements by sensing the delayed neutrons emitted by samples after they have been irradiated in a nuclear reactor. The more sensitive of the two detectors, designed to assay uranium in water samples, is 40% efficient; the other, designed for sediment sample assays, is 27% efficient. Both detectors are also designed to operate under water, an inexpensive shield against neutron leakage from the reactor and neutrons from cosmic rays. (Auth.)

  5. A phoswich detector design for improved spatial sampling in PET

    Science.gov (United States)

    Thiessen, Jonathan D.; Koschan, Merry A.; Melcher, Charles L.; Meng, Fang; Schellenberg, Graham; Goertzen, Andrew L.

    2018-02-01

    Block detector designs, utilizing a pixelated scintillator array coupled to a photosensor array in a light-sharing design, are commonly used for positron emission tomography (PET) imaging applications. In practice, the spatial sampling of these designs is limited by the crystal pitch, which must be large enough for individual crystals to be resolved in the detector flood image. Replacing the conventional 2D scintillator array with an array of phoswich elements, each consisting of an optically coupled side-by-side scintillator pair, may improve spatial sampling in one direction of the array without requiring smaller crystal elements to be resolved. To test the feasibility of this design, a 4 × 4 phoswich array was constructed, with each phoswich element consisting of two optically coupled, 3.17 × 1.58 × 10 mm³ LSO crystals co-doped with cerium and calcium. The amount of calcium doping was varied to create a 'fast' LSO crystal with a decay time of 32.9 ns and a 'slow' LSO crystal with a decay time of 41.2 ns. Using a Hamamatsu R8900U-00-C12 position-sensitive photomultiplier tube (PS-PMT) and a CAEN V1720 250 MS/s waveform digitizer, we were able to show effective discrimination of the fast and slow LSO crystals in the phoswich array. Although a side-by-side phoswich array is feasible, reflections at the crystal boundary due to a mismatch between the refractive index of the optical adhesive (n = 1.5) and LSO (n = 1.82) caused it to behave optically as an 8 × 4 array rather than a 4 × 4 array. Direct coupling of each phoswich element to individual photodetector elements may be necessary with the current phoswich array design. Alternatively, in order to implement this phoswich design with a conventional light-sharing PET block detector, a high-refractive-index optical adhesive is necessary to closely match the refractive index of LSO.
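The fast/slow crystal discrimination reported above can be illustrated with a toy pulse-shape discriminator. Assuming ideal single-exponential scintillation pulses with the quoted decay times, the fraction of light arriving after a fixed gate separates the two crystal types; the 40 ns gate below is an arbitrary choice for illustration, not a parameter from the paper.

```python
import math

# Toy tail-fraction discriminator for the 32.9 ns ('fast') and 41.2 ns
# ('slow') LSO:Ce,Ca crystals, assuming ideal exponential pulses.
def tail_fraction(tau_ns, gate_ns=40.0):
    """Fraction of total scintillation light emitted after gate_ns."""
    return math.exp(-gate_ns / tau_ns)

fast = tail_fraction(32.9)
slow = tail_fraction(41.2)
threshold = (fast + slow) / 2   # midpoint cut between the two populations

def classify(fraction):
    return "slow" if fraction > threshold else "fast"
```

In a real detector the same cut would be applied to digitized waveforms (e.g., late-gate charge over total charge), with the threshold tuned on flood data rather than computed analytically.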

  6. Quality-control design for surface-water sampling in the National Water-Quality Network

    Science.gov (United States)

    Riskin, Melissa L.; Reutter, David C.; Martin, Jeffrey D.; Mueller, David K.

    2018-04-10

    The data-quality objectives for samples collected at surface-water sites in the National Water-Quality Network include estimating the extent to which contamination, matrix effects, and measurement variability affect interpretation of environmental conditions. Quality-control samples provide insight into how well the samples collected at surface-water sites represent the true environmental conditions. Quality-control samples used in this program include field blanks, replicates, and field matrix spikes. This report describes the design for collection of these quality-control samples and the data management needed to properly identify these samples in the U.S. Geological Survey’s national database.

  7. AN EVALUATION OF PRIMARY DATA-COLLECTION MODES IN AN ADDRESS-BASED SAMPLING DESIGN.

    Science.gov (United States)

    Amaya, Ashley; Leclere, Felicia; Carris, Kari; Liao, Youlian

    2015-01-01

    As address-based sampling becomes increasingly popular for multimode surveys, researchers continue to refine data-collection best practices. While much work has been conducted to improve efficiency within a given mode, additional research is needed on how multimode designs can be optimized across modes. Previous research has not evaluated the consequences of mode sequencing on multimode mail and phone surveys, nor has significant research been conducted to evaluate mode sequencing on a variety of indicators beyond response rates. We conducted an experiment within the Racial and Ethnic Approaches to Community Health across the U.S. Risk Factor Survey (REACH U.S.) to evaluate two multimode case-flow designs: (1) phone followed by mail (phone-first) and (2) mail followed by phone (mail-first). We compared response rates, cost, timeliness, and data quality to identify differences across case-flow design. Because surveys often differ on the rarity of the target population, we also examined whether changes in the eligibility rate altered the choice of optimal case flow. Our results suggested that, on most metrics, the mail-first design was superior to the phone-first design. Compared with phone-first, mail-first achieved a higher yield rate at a lower cost with equivalent data quality. While the phone-first design initially achieved more interviews compared to the mail-first design, over time the mail-first design surpassed it and obtained the greatest number of interviews.

  8. Design of Field Experiments for Adaptive Sampling of the Ocean with Autonomous Vehicles

    Science.gov (United States)

    Zheng, H.; Ooi, B. H.; Cho, W.; Dao, M. H.; Tkalich, P.; Patrikalakis, N. M.

    2010-05-01

    Due to the highly non-linear and dynamical nature of oceanic phenomena, the predictive capability of various ocean models depends on the availability of operational data. A practical method to improve the accuracy of the ocean forecast is to use a data assimilation methodology to combine in-situ measured and remotely acquired data with numerical forecast models of the physical environment. Autonomous surface and underwater vehicles with various sensors are economical and efficient tools for exploring and sampling the ocean for data assimilation; however, such vehicles have limited energy, and thus effective resource allocation for adaptive sampling is required to optimize the efficiency of exploration. In this paper, we use physical oceanography forecasts of the coastal zone of Singapore for the design of a set of field experiments to acquire useful data for model calibration and data assimilation. The design process of our experiments relied on the oceanography forecast, including the current speed, its gradient, and vorticity, in a given region of interest for which permits for field experiments could be obtained and for time intervals that correspond to strong tidal currents. Based on these maps, resources available to our experimental team, including an Autonomous Surface Craft (ASC), were allocated so as to capture the oceanic features that result from jets and vortices behind bluff bodies (e.g., islands) in the tidal current. Results are summarized from this resource allocation process and field experiments conducted in January 2009.

  9. Analytical results from Tank 38H criticality Sample HTF-093

    International Nuclear Information System (INIS)

    Wilmarth, W.R.

    2000-01-01

    Resumption of processing in the 242-16H Evaporator could cause salt dissolution in the Waste Concentration Receipt Tank (Tank 38H). Therefore, High Level Waste personnel sampled the tank at the salt surface. Results of elemental analysis of the dried sludge solids from this sample (HTF-093) show significant quantities of neutron poisons (i.e., sodium, iron, and manganese) present to mitigate the potential for nuclear criticality. Comparison of this sample with previous chemical and radiometric analyses of H-Area Evaporator samples shows high poison-to-actinide ratios.

  10. Architectural Design Space Exploration of an FPGA-based Compressed Sampling Engine

    DEFF Research Database (Denmark)

    El-Sayed, Mohammad; Koch, Peter; Le Moullec, Yannick

    2015-01-01

    We present the architectural design space exploration of a compressed sampling engine for use in a wireless heart-rate monitoring system. We show how parallelism affects execution time at the register transfer level. Furthermore, two example solutions (modified semi-parallel and full...

  11. Spatiotemporally Representative and Cost-Efficient Sampling Design for Validation Activities in Wanglang Experimental Site

    Directory of Open Access Journals (Sweden)

    Gaofei Yin

    2017-11-01

    Full Text Available Spatiotemporally representative Elementary Sampling Units (ESUs) are required for capturing the temporal variations in surface spatial heterogeneity through field measurements. Since inaccessibility often coexists with heterogeneity, a cost-efficient sampling design is mandatory. We proposed a sampling strategy to generate spatiotemporally representative and cost-efficient ESUs based on the conditioned Latin hypercube sampling scheme. The proposed strategy was constrained by multi-temporal Normalized Difference Vegetation Index (NDVI) imagery, and the ESUs were limited to a sampling-feasible region established based on accessibility criteria. A novel criterion based on the Overlapping Area (OA) between the NDVI frequency distribution histogram from the sampled ESUs and that from the entire study area was used to assess the sampling efficiency. A case study in Wanglang National Nature Reserve in China showed that the proposed strategy improves the spatiotemporal representativeness of sampling (mean annual OA = 74.7%) compared to the single-temporally constrained (OA = 68.7%) and the random sampling (OA = 63.1%) strategies. The introduction of the feasible-region constraint significantly reduces labour-intensive in-situ characterization requirements, at the expense of about a 9% loss in the spatiotemporal representativeness of the sampling. Our study will support the validation activities in the Wanglang experimental site by providing a benchmark for locating the nodes of automatic observation systems (e.g., LAINet), which need a spatially distributed and temporally fixed sampling design.
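The Overlapping Area criterion described above can be computed as the shared area of two normalized histograms. A minimal sketch, with synthetic NDVI values standing in for the imagery (the bin count and the beta-distributed fake field are assumptions for illustration):

```python
import random

# OA = shared area of two normalized NDVI histograms: the sampled ESUs
# versus the whole study area. OA = 1.0 means identical distributions.
def overlap_area(sample, population, bins=20, lo=0.0, hi=1.0):
    width = (hi - lo) / bins
    def hist(values):
        counts = [0] * bins
        for v in values:
            i = min(int((v - lo) / width), bins - 1)
            counts[i] += 1
        return [c / len(values) for c in counts]
    return sum(min(a, b) for a, b in zip(hist(sample), hist(population)))

random.seed(1)
ndvi_area = [random.betavariate(4, 2) for _ in range(5000)]   # fake NDVI field
ndvi_esus = random.sample(ndvi_area, 60)                      # 60 sampled ESUs
oa = overlap_area(ndvi_esus, ndvi_area)
```

In the paper this score is averaged over the multi-temporal NDVI stack; here a single date suffices to show the mechanics.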

  12. Comparison of the Multiple-sample means with composite sample results for fecal indicator bacteria by quantitative PCR and culture

    Science.gov (United States)

    ABSTRACT: Few studies have addressed the efficacy of composite sampling for measurement of indicator bacteria by QPCR. In this study, composite results were compared to single sample results for culture- and QPCR-based water quality monitoring. Composite results for both methods ...

  13. Exploring Technostress: Results of a Large Sample Factor Analysis

    OpenAIRE

    Jonušauskas, Steponas; Raišienė, Agota Giedrė

    2016-01-01

    With reference to the results of a large sample factor analysis, the article aims to propose a framework for examining technostress in a population. The survey and principal component analysis of a sample consisting of 1,013 individuals who use ICT in their everyday work were implemented in the research. Thirteen factors combine 68 questions and explain 59.13 per cent of the answer dispersion. Based on the factor analysis, the questionnaire was reframed and prepared to reasonably analyze the respondents’ an...

  14. An Optimization-Based Reconfigurable Design for a 6-Bit 11-MHz Parallel Pipeline ADC with Double-Sampling S&H

    Directory of Open Access Journals (Sweden)

    Wilmar Carvajal

    2012-01-01

    Full Text Available This paper presents a 6 bit, 11 MS/s time-interleaved pipeline A/D converter design. The specification process, from block level down to elementary circuits, is covered step by step to draw out a design methodology. Both power consumption and mismatch between the parallel chain elements are reduced by using techniques such as double and bottom-plate sampling, fully differential circuits, RSD digital correction, and geometric programming (GP) optimization of the elementary analog circuit designs (OTAs and comparators). Prelayout simulations of the complete ADC are presented to characterize the designed converter, which consumes 12 mW while sampling a 500 kHz input signal. Moreover, the block inside the ADC with the most stringent requirements in power, speed, and precision was sent for fabrication in a CMOS 0.35 μm AMS technology, and some postlayout results are shown.

  15. Mars Rover Sample Return aerocapture configuration design and packaging constraints

    Science.gov (United States)

    Lawson, Shelby J.

    1989-01-01

    This paper discusses the aerodynamics requirements and the volume and mass constraints that lead to a biconic aeroshell vehicle design that protects the Mars Rover Sample Return (MRSR) mission elements from launch to Mars landing. The aerodynamic requirements for Mars aerocapture and entry and the packaging constraints for the MRSR elements result in a symmetric biconic aeroshell that develops an L/D of 1.0 at a 27.0 deg angle of attack. A significant problem in the study is obtaining a cg that provides adequate aerodynamic stability and performance within the mission-imposed constraints. Packaging methods that relieve the cg problems include forward placement of the aeroshell propellant tanks and incorporating aeroshell structure as lander structure. The MRSR missions developed during the pre-phase A study are discussed, with dimensional and mass data included. Further study is needed for some missions to minimize MRSR element volume so that launch mass constraints can be met.

  16. Sampling design optimisation for rainfall prediction using a non-stationary geostatistical model

    Science.gov (United States)

    Wadoux, Alexandre M. J.-C.; Brus, Dick J.; Rico-Ramirez, Miguel A.; Heuvelink, Gerard B. M.

    2017-09-01

    The accuracy of spatial predictions of rainfall by merging rain-gauge and radar data is partly determined by the sampling design of the rain-gauge network. Optimising the locations of the rain-gauges may increase the accuracy of the predictions. Existing spatial sampling design optimisation methods are based on minimisation of the spatially averaged prediction error variance under the assumption of intrinsic stationarity. Over the past years, substantial progress has been made to deal with non-stationary spatial processes in kriging. Various well-documented geostatistical models relax the assumption of stationarity in the mean, while recent studies show the importance of considering non-stationarity in the variance for environmental processes occurring in complex landscapes. We optimised the sampling locations of rain-gauges using an extension of the Kriging with External Drift (KED) model for prediction of rainfall fields. The model incorporates both non-stationarity in the mean and in the variance, which are modelled as functions of external covariates such as radar imagery, distance to radar station and radar beam blockage. Spatial predictions are made repeatedly over time, each time recalibrating the model. The space-time averaged KED variance was minimised by Spatial Simulated Annealing (SSA). The methodology was tested using a case study predicting daily rainfall in the north of England for a one-year period. Results show that (i) the proposed non-stationary variance model outperforms the stationary variance model, and (ii) a small but significant decrease of the rainfall prediction error variance is obtained with the optimised rain-gauge network. In particular, it pays off to place rain-gauges at locations where the radar imagery is inaccurate, while keeping the distribution over the study area sufficiently uniform.
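Spatial Simulated Annealing as used above can be sketched with a stand-in objective. The snippet below perturbs one gauge at a time and applies Metropolis acceptance; the mean distance to the nearest gauge replaces the space-time averaged KED variance (which would require the full geostatistical model), and all tuning constants are illustrative.

```python
import math
import random

# SSA sketch: minimize a proxy criterion (mean distance from prediction
# nodes to the nearest rain-gauge) by perturbing one gauge per iteration.
random.seed(0)
grid = [(x / 14, y / 14) for x in range(15) for y in range(15)]

def criterion(gauges):
    return sum(min(math.dist(p, g) for g in gauges) for p in grid) / len(grid)

gauges = [(random.random(), random.random()) for _ in range(8)]
start = current = best = criterion(gauges)
temp = 0.01
for _ in range(1500):
    i = random.randrange(len(gauges))
    cand = list(gauges)
    x, y = cand[i]
    cand[i] = (min(1.0, max(0.0, x + random.gauss(0, 0.05))),
               min(1.0, max(0.0, y + random.gauss(0, 0.05))))
    value = criterion(cand)
    # Metropolis rule: keep improvements, occasionally accept worse moves
    if value < current or random.random() < math.exp((current - value) / temp):
        gauges, current = cand, value
    best = min(best, current)
    temp *= 0.998   # geometric cooling
```

Swapping `criterion` for the recalibrated KED variance (and adding the non-stationary variance model) recovers the structure of the optimization described in the abstract.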

  17. Results of EDS uranium samples characterization after hydrogen loading

    International Nuclear Information System (INIS)

    Chicea, D.; Dash, J.

    2003-01-01

    Several experiments on loading natural uranium foils with hydrogen were performed. Electrolysis was used for loading hydrogen into uranium because it is the most efficient means of H loading. The composition of the surface and near-surface of the samples was determined using an Oxford EDS spectrometer on a scanning electron microscope manufactured by ISI. Images were taken at several magnifications up to 3.4KX. Results reveal that when a low current density was used, the surface patterns changed from granules on the surface with a typical size of 2-4 microns to pits under the surface with a typical size under one micron. When a high current density was used, the surface changed and presented deep fissures. The deep fissures are the result of the mechanical strain induced by the lattice expansion caused by hydrogen absorption. The surface composition was determined before and after hydrogen loading. Uranium, thorium, platinum and carbon concentrations were measured. The experiments suggest that the amount of thorium on the uranium sample increases with the total electric charge transported through the electrolyte. The carbon concentration on the surface of the sample was found to decrease as the total electric charge transported through the electrolyte increased. Platinum is used in electrolysis experiments as the anode primarily because it does not dissolve in the electrolyte and therefore is not electro-deposited on the cathode surface. The results of the platinum concentration measurements on the surface of the samples we loaded with hydrogen reveal that the platinum concentration increased dramatically as the current density increased, creating platinum spots on the cathode surface. Work on the subject is in progress. (authors)

  18. Results of tritium measurement in environmental samples and drainage

    International Nuclear Information System (INIS)

    Koike, Ryoji; Hirai, Yasuo

    1983-01-01

    In Ibaraki prefecture, the tritium concentration in the drainage from nuclear facilities has been measured since 1974. With the start of operation of the fuel reprocessing plant in 1977, the tritium concentration in environmental samples was also measured in order to examine the effect of the drainage on the environment. The results of tritium measurement in Ibaraki prefecture up to about 1980 are described: sampling points, sampling and measuring methods, and the tritium concentrations in the drainage, air, inland water and seawater, respectively. The drainage samples have been taken from the Japan Atomic Power Company, the Japan Atomic Energy Research Institute, and the Power Reactor and Nuclear Fuel Development Corporation (with the fuel reprocessing plant). The samples of air, inland water and seawater have been taken in the areas concerned. The tritium concentration was measured with a low-background liquid scintillation counter. The measured values in the environment have generally been at a low level, not different from other areas. (Mori, K.)

  19. A novel sampling design to explore gene-longevity associations

    DEFF Research Database (Denmark)

    De Rango, Francesco; Dato, Serena; Bellizzi, Dina

    2008-01-01

    To investigate the genetic contribution to familial similarity in longevity, we set up a novel experimental design where cousin-pairs born from siblings who were concordant or discordant for the longevity trait were analyzed. To check this design, two chromosomal regions already known to encompass... from concordant and discordant siblings. In addition, we analyzed haplotype transmission from centenarians to offspring, and a statistically significant Transmission Ratio Distortion (TRD) was observed for both chromosomal regions in the discordant families (P=0.007 for 6p21.3 and P=0.015 for 11p15.5). In concordant families, a marginally significant TRD was observed at 6p21.3 only (P=0.06). Although no significant difference emerged between the two groups of cousin-pairs, our study gave new insights on the hindrances to recruiting a suitable sample to obtain significant IBD data on longevity...

  20. The significance of Sampling Design on Inference: An Analysis of Binary Outcome Model of Children’s Schooling Using Indonesian Large Multi-stage Sampling Data

    OpenAIRE

    Ekki Syamsulhakim

    2008-01-01

    This paper aims to examine a rather recent trend in applied microeconometrics, namely the effect of sampling design on statistical inference, especially on binary outcome models. Much theoretical research in econometrics has shown the inappropriateness of applying i.i.d.-assumed statistical analysis to non-i.i.d. data. This research has provided proofs showing that applying i.i.d.-assumed analysis to non-i.i.d. observations results in inflated standard errors, which could make the esti...
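The standard-error distortion the paper refers to is easy to demonstrate by simulation. A hedged sketch (cluster counts, sizes and variance components below are arbitrary): with an intra-cluster correlation of 0.5 and clusters of 20, the design effect 1 + (m - 1) * ICC = 10.5 implies the i.i.d. formula understates the true SE by a factor of about sqrt(10.5), i.e. roughly 3.

```python
import random
import statistics

# Simulate multi-stage (clustered) sampling: observations share a
# cluster-level effect, so they are not i.i.d.; the naive SE formula
# ignores that dependence and is too small.
random.seed(7)

def cluster_sample(n_clusters=50, m=20, between_sd=1.0, within_sd=1.0):
    data = []
    for _ in range(n_clusters):
        u = random.gauss(0, between_sd)            # shared cluster effect
        data.extend(random.gauss(u, within_sd) for _ in range(m))
    return data

# Empirical SE of the sample mean over repeated cluster samples
means = [statistics.mean(cluster_sample()) for _ in range(300)]
true_se = statistics.stdev(means)

# Naive i.i.d. SE computed from a single clustered sample
one = cluster_sample()
naive_se = statistics.stdev(one) / len(one) ** 0.5
inflation = true_se / naive_se                     # roughly sqrt(DEFF)
```

Note the direction: here the *true* SE is larger than the naive one, which is the usual multi-stage-sampling case; survey-adjusted estimators (e.g., cluster-robust variance) aim to report `true_se` rather than `naive_se`.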

  1. Sampling design considerations for demographic studies: a case of colonial seabirds

    Science.gov (United States)

    Kendall, William L.; Converse, Sarah J.; Doherty, Paul F.; Naughton, Maura B.; Anders, Angela; Hines, James E.; Flint, Elizabeth

    2009-01-01

    For the purposes of making many informed conservation decisions, the main goal for data collection is to assess population status and allow prediction of the consequences of candidate management actions. Reducing the bias and variance of estimates of population parameters reduces uncertainty in population status and projections, thereby reducing the overall uncertainty under which a population manager must make a decision. In capture-recapture studies, imperfect detection of individuals, unobservable life-history states, local movement outside study areas, and tag loss can cause bias or precision problems with estimates of population parameters. Furthermore, excessive disturbance to individuals during capture-recapture sampling may be of concern because disturbance may have demographic consequences. We address these problems using as an example a monitoring program for Black-footed Albatross (Phoebastria nigripes) and Laysan Albatross (Phoebastria immutabilis) nesting populations in the northwestern Hawaiian Islands. To mitigate these estimation problems, we describe a synergistic combination of sampling design and modeling approaches. Solutions include multiple capture periods per season and multistate, robust design statistical models, dead recoveries and incidental observations, telemetry and data loggers, buffer areas around study plots to neutralize the effect of local movements outside study plots, and double banding and statistical models that account for band loss. We also present a variation on the robust capture-recapture design and a corresponding statistical model that minimizes disturbance to individuals. For the albatross case study, this less invasive robust design was more time efficient and, when used in combination with a traditional robust design, reduced the standard error of detection probability by 14% with only two hours of additional effort in the field.
These field techniques and associated modeling approaches are applicable to studies of

  2. Failure Probability Estimation Using Asymptotic Sampling and Its Dependence upon the Selected Sampling Scheme

    Directory of Open Access Journals (Sweden)

    Martinásková Magdalena

    2017-12-01

    Full Text Available The article examines the use of Asymptotic Sampling (AS) for the estimation of failure probability. The AS algorithm requires samples of multidimensional Gaussian random vectors, which may be obtained by many alternative means that influence the performance of the AS method. Several reliability problems (test functions) have been selected in order to test AS with various sampling schemes: (i) Monte Carlo designs; (ii) LHS designs optimized using the Periodic Audze-Eglājs (PAE) criterion; (iii) designs prepared using Sobol' sequences. All results are compared with the exact failure probability value.
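The comparison above hinges on how the Gaussian vectors are generated. A minimal sketch of a scheme-(ii)-style stratified design versus plain Monte Carlo, using only the standard library (the PAE optimization step is omitted; this is ordinary Latin hypercube sampling mapped through the inverse normal CDF):

```python
import random
from statistics import NormalDist

def lhs_gaussian(n_samples, n_dims, rng):
    """Latin hypercube sample of standard-normal vectors: one point per
    equal-probability stratum in each dimension, shuffled across rows."""
    cols = []
    for _ in range(n_dims):
        # one uniform per stratum [k/n, (k+1)/n), then shuffled
        u = [(k + rng.random()) / n_samples for k in range(n_samples)]
        rng.shuffle(u)
        cols.append([NormalDist().inv_cdf(p) for p in u])
    return list(zip(*cols))            # rows are n_dims-dimensional samples

rng = random.Random(3)
mc = [tuple(NormalDist().inv_cdf(rng.random()) for _ in range(2))
      for _ in range(100)]             # plain Monte Carlo counterpart
lhs = lhs_gaussian(100, 2, rng)
```

Either `mc` or `lhs` could feed the AS algorithm's scaled limit-state evaluations; the article's point is that this choice measurably changes the estimator's performance.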

  3. The 2003 Australian Breast Health Survey: survey design and preliminary results.

    Science.gov (United States)

    Villanueva, Elmer V; Jones, Sandra; Nehill, Caroline; Favelle, Simone; Steel, David; Iverson, Donald; Zorbas, Helen

    2008-01-14

    The Breast Health Surveys, conducted by the National Breast Cancer Centre (NBCC) in 1996 and 2003, are designed to gain insight into the knowledge, attitudes and behaviours of a nationally representative sample of Australian women on issues relevant to breast cancer. In this article, we focus on major aspects of the design and present results on respondents' knowledge about mammographic screening. The 2003 BHS surveyed English-speaking Australian women aged 30-69 without a history of breast cancer using computer-assisted telephone interviewing. Questions covered the following themes: knowledge and perceptions about incidence, mortality and risk; knowledge and behaviour regarding early detection, symptoms and diagnosis; mammographic screening; treatment; and accessibility and availability of information and services. Respondents were selected using a complex sample design involving stratification. Sample weights against Australian population benchmarks were used in all statistical analyses. Means and proportions for the entire population and by age group and area of residence were calculated. Statistical tests were conducted using a level of significance of 0.01. Of the 3,144 respondents who consented to being interviewed, 138 (4.4%) had a previous diagnosis of breast cancer and were excluded, leaving 3,006 completed interviews eligible for analysis. A majority of respondents (61.1%) reported ever having had a mammogram, and 29.1% identified mammography as the best way of finding breast cancer. A majority of women (85.9%) had heard of the BreastScreen Australia (BSA) program, the national mammographic screening program providing free biennial screening mammograms, with 94.5% believing that BSA attendance was available regardless of the presence or absence of symptoms. There have been substantial gains in women's knowledge about mammographic screening over the seven years between the two surveys. The NBCC Breast Health Surveys provide a valuable picture of the

  4. Designing a two-rank acceptance sampling plan for quality inspection of geospatial data products

    Science.gov (United States)

    Tong, Xiaohua; Wang, Zhenhua; Xie, Huan; Liang, Dan; Jiang, Zuoqin; Li, Jinchao; Li, Jun

    2011-10-01

    To address the disadvantages of classical sampling plans designed for traditional industrial products, we propose a two-rank acceptance sampling plan (TRASP) for the inspection of geospatial data outputs based on the acceptance quality level (AQL). The first-rank sampling plan inspects a lot consisting of map sheets, and the second inspects a lot consisting of the features in an individual map sheet. The TRASP design is formulated as an optimization problem with respect to sample size and acceptance number, covering two lot-size cases. In the first case, for small lot sizes, nonconformities are modeled by a hypergeometric distribution function; in the second, for larger lot sizes, by a Poisson distribution function. The proposed TRASP is illustrated through two empirical case studies. Our analysis demonstrates that (1) the proposed TRASP provides a general approach for quality inspection of geospatial data outputs consisting of non-uniform items, and (2) the acceptance sampling plan based on TRASP performs better than classical sampling plans. It overcomes the drawbacks of percent sampling, i.e., "strictness for large lot size, toleration for small lot size," and those of a national standard used specifically for industrial outputs, i.e., "lots with different sizes corresponding to the same sampling plan."
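    The two lot-size cases in the abstract correspond to two standard models for the acceptance probability of a single-sampling plan (n, c). A minimal sketch of those two models follows; this is not the paper's TRASP optimization itself, and the function names and example numbers are our own:

```python
from math import comb, exp, factorial

def accept_prob_hypergeom(N, D, n, c):
    """P(accept) for a lot of N items containing D nonconforming ones,
    sampling n items without replacement, acceptance number c."""
    return sum(
        comb(D, d) * comb(N - D, n - d) / comb(N, n)
        for d in range(min(c, D, n) + 1)
    )

def accept_prob_poisson(n, p, c):
    """Large-lot approximation: the number of nonconforming items found
    in the sample is modeled as Poisson with mean n * p."""
    lam = n * p
    return sum(exp(-lam) * lam ** d / factorial(d) for d in range(c + 1))
```

    For a small lot (N = 50, D = 5) with plan (n = 10, c = 1), the exact hypergeometric acceptance probability is about 0.74, and the Poisson approximation at the same quality level p = D/N is already within a couple of percent.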

  5. Spreading Design of Radioactivity in Sea Water, Algae and Fish Samples inthe Coastal of Muria Peninsula Area

    International Nuclear Information System (INIS)

    Sutjipto; Muryono; Sumining

    2000-01-01

    The spatial distribution of radioactivity in sea water, brown algae (Phaeophyceae), and kerapu fish (Epinephelus) samples from the coastal Muria peninsula area has been studied. The study was designed not only to characterize each type of radioactivity but also to map its distribution in relation to the content of Pu-239 and Cs-137. Sample collection, preparation, and analysis followed the standard procedures for environmental radioactivity analysis. The instruments used were an alpha counter with a ZnS detector, a low-level beta counter modified by P3TM-BATAN with a GM detector, and a gamma spectrometer with a Ge(Li) detector. The alpha radioactivity of sea water, algae, and fish fluctuated at natural background levels. Pu-239 was not detected in the samples because its concentration remained below the detection limit, which was 1.10 Bq/g for algae and fish and 0.07 Bq/mL for sea water. The highest alpha radioactivity, 1.56 x 10^-2 Bq/g, was found in kerapu fish; the highest beta radioactivity, 1.75 x 10^2 mBq/L, in sea water; the highest gamma radioactivity of K-40, 3.72 x 10^-2 Bq/g, in brown algae; and the gamma radioactivity of Tl-208 in the same fish was 1.35 x 10^-2 Bq/g. No gamma peaks at the Cs-137 energies were detected, indicating that Cs-137 was absent from the samples. The radioactivity distribution in the coastal Muria peninsula area was thus characterized by alpha radioactivity in kerapu fish, beta radioactivity in sea water, and gamma radioactivity in brown algae and kerapu fish. (author)

  6. Optimal sampling designs for large-scale fishery sample surveys in Greece

    Directory of Open Access Journals (Sweden)

    G. BAZIGOS

    2007-12-01

    The paper deals with the optimization of the following three large-scale sample surveys: the biological sample survey of commercial landings (BSCL), the experimental fishing sample survey (EFSS), and the commercial landings and effort sample survey (CLES).

  7. SEAMIST trademark soil sampling for tritiated water: First year's results

    International Nuclear Information System (INIS)

    Mallon, B.; Martins, S.A.; Houpis, J.L.; Lowry, W.; Cremer, C.D.

    1992-01-01

    SEAMIST trademark is a recently developed sampling system that enables one to measure various soil parameters by means of an inverted, removable, impermeable membrane tube inserted in a borehole. This membrane tube can have various measuring devices installed on it, such as gas ports, adsorbent pads, and electrical sensors. The membrane tubes are made of a laminated polymer. The Lawrence Livermore National Laboratory in Livermore, California, has installed two of these systems to monitor tritium in soil resulting from a leak in an underground storage tank. One tube is equipped with gas ports to sample soil vapor and the other with adsorbent pads to sample soil moisture. Borehole stability was maintained using either sand-filled or air-inflated tubes. Both system implementations yielded concentrations or activities that compared well with the measured concentrations of tritium in the soil samples taken during borehole construction. In addition, an analysis of the data suggests that both systems prevented the vertical migration of tritium in the boreholes. Also, a neutron probe was successfully used in a blank membrane inserted in one of the boreholes to monitor the moisture in the soil without exposing the probe to the tritium. The neutron log showed excellent agreement with the soil moisture content measured in soil samples taken during borehole construction. This paper describes the two SEAMIST trademark systems used and presents sampling results and comparisons.

  8. Testing results of Monte Carlo sampling processes in MCSAD

    International Nuclear Information System (INIS)

    Pinnera, I.; Cruz, C.; Abreu, Y.; Leyva, A.; Correa, C.; Demydenko, C.

    2009-01-01

    The Monte Carlo Simulation of Atom Displacements (MCSAD) is a code implemented by the authors to simulate the complete process of atom displacement (AD) formation. The code uses the Monte Carlo (MC) method to sample all the processes involved in gamma and electron radiation transport through matter. The kernel of the calculations relies on an algorithm developed by the authors, which first separates multiple electron elastic scattering events from single events at higher scattering angles, and then samples, from the latter, those events leading to AD at high transferred atomic recoil energies. Tests have been developed to check the sampling algorithms against the corresponding theoretical distribution functions. Satisfactory results have been obtained, which indicates the soundness of the methods and subroutines used in the code. (Author)
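    Checking a sampling routine against its theoretical distribution function, as described above, is typically done by comparing the empirical CDF of the generated samples with the analytic CDF. A generic illustration (not the MCSAD code): inverse-transform sampling of an exponential distribution, validated with the Kolmogorov-Smirnov distance.

```python
import math, random

def sample_exponential(lam, n, seed=0):
    # inverse-transform sampling: apply the inverse CDF to uniform draws
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

def ks_statistic(samples, cdf):
    """Kolmogorov-Smirnov distance between the empirical CDF of the
    samples and a theoretical CDF."""
    xs = sorted(samples)
    n = len(xs)
    return max(
        max(abs((i + 1) / n - cdf(x)), abs(i / n - cdf(x)))
        for i, x in enumerate(xs)
    )
```

    For 5,000 draws, a correctly implemented sampler should give a KS distance of roughly 1/sqrt(n), i.e. on the order of 0.01; a buggy sampler shows up as a distance far above the usual critical values.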

  9. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    KAUST Repository

    Beck, Joakim

    2018-02-19

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized for a specified error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a single-loop Monte Carlo method that uses the Laplace approximation of the return value of the inner loop. The first demonstration example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.

  10. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    Science.gov (United States)

    Beck, Joakim; Dia, Ben Mansour; Espath, Luis F. R.; Long, Quan; Tempone, Raúl

    2018-06-01

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized according to the desired error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a more recent single-loop Monte Carlo method that uses the Laplace method as an approximation of the return value of the inner loop. The first example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.
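    The classical double-loop Monte Carlo estimator that both records above improve upon can be sketched on a toy model of our choosing: a linear-Gaussian experiment y = xi * theta + noise with a standard normal prior, for which the expected information gain has the closed form 0.5 * log(1 + xi^2 / sigma^2). The model and function name are ours, not from the paper:

```python
import math, random

def dlmc_eig(xi, sigma, n_outer=500, n_inner=500, seed=0):
    """Double-loop Monte Carlo estimate of expected information gain for
    y = xi * theta + noise, theta ~ N(0, 1), noise ~ N(0, sigma^2)."""
    rng = random.Random(seed)

    def loglik(y, theta):
        r = y - xi * theta
        return -0.5 * (r / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

    total = 0.0
    for _ in range(n_outer):
        theta = rng.gauss(0.0, 1.0)           # draw from the prior
        y = xi * theta + rng.gauss(0.0, sigma)  # simulate the observation
        # inner loop: Monte Carlo estimate of the evidence p(y); with too few
        # inner samples this average can underflow, the drawback noted above
        evidence = sum(
            math.exp(loglik(y, rng.gauss(0.0, 1.0))) for _ in range(n_inner)
        ) / n_inner
        total += loglik(y, theta) - math.log(evidence)
    return total / n_outer
```

    With xi = 1 and sigma = 0.5 the analytic value is 0.5 * log(5) ~ 0.805, and the double-loop estimate lands near it at the cost of n_outer * n_inner likelihood evaluations, which is precisely the expense the Laplace-based importance sampling of the paper reduces.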

  11. Single-subject withdrawal designs in delayed matching-to-sample procedures

    OpenAIRE

    Eilifsen, Christoffer; Arntzen, Erik

    2011-01-01

    In most studies of delayed matching-to-sample (DMTS) and stimulus equivalence, the delay has remained fixed throughout a single experimental condition. We wanted to expand on the DMTS and stimulus equivalence literature by examining the effects of using titrating delays with different starting points during the establishment of conditional discriminations prerequisite for stimulus equivalence. In Experiment 1, a variation of a single-subject withdrawal design was used. Ten adults were exposed...

  12. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    Science.gov (United States)

    Debasish Saha; Armen R. Kemanian; Benjamin M. Rau; Paul R. Adler; Felipe Montes

    2017-01-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (...

  13. The SDSS-IV MaNGA Sample: Design, Optimization, and Usage Considerations

    Science.gov (United States)

    Wake, David A.; Bundy, Kevin; Diamond-Stanic, Aleksandar M.; Yan, Renbin; Blanton, Michael R.; Bershady, Matthew A.; Sánchez-Gallego, José R.; Drory, Niv; Jones, Amy; Kauffmann, Guinevere; Law, David R.; Li, Cheng; MacDonald, Nicholas; Masters, Karen; Thomas, Daniel; Tinker, Jeremy; Weijmans, Anne-Marie; Brownstein, Joel R.

    2017-09-01

    We describe the sample design for the SDSS-IV MaNGA survey and present the final properties of the main samples along with important considerations for using these samples for science. Our target selection criteria were developed while simultaneously optimizing the size distribution of the MaNGA integral field units (IFUs), the IFU allocation strategy, and the target density to produce a survey defined in terms of maximizing signal-to-noise ratio, spatial resolution, and sample size. Our selection strategy makes use of redshift limits that depend only on I-band absolute magnitude (M_I), or, for a small subset of our sample, M_I and color (NUV - I). Such a strategy ensures that all galaxies span the same range in angular size irrespective of luminosity and are therefore covered evenly by the adopted range of IFU sizes. We define three samples: the Primary and Secondary samples are selected to have a flat number density with respect to M_I and are targeted to have spectroscopic coverage to 1.5 and 2.5 effective radii (R_e), respectively. The Color-Enhanced supplement increases the number of galaxies in the low-density regions of color-magnitude space by extending the redshift limits of the Primary sample in the appropriate color bins. The samples cover the stellar mass range 5 x 10^8 <= M* <= 3 x 10^11 M_sun h^-2 and are sampled at median physical resolutions of 1.37 and 2.5 kpc for the Primary and Secondary samples, respectively. We provide weights that statistically correct for our luminosity- and color-dependent selection function and IFU allocation strategy, thus correcting the observed sample to a volume-limited sample.

  14. Organic analysis of ambient samples collected near Tank 241-C-103: Results from samples collected on May 12, 1994

    International Nuclear Information System (INIS)

    Clauss, T.W.; Ligotke, M.W.; McVeety, B.D.; Lucke, R.B.; Young, J.S.; McCulloch, M.; Fruchter, J.S.; Goheen, S.C.

    1995-06-01

    This report describes organic analysis results from ambient samples collected both upwind and through the vapor sampling system (VSS) near Hanford waste storage Tank 241-C-103 (referred to as Tank C-103). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is provided. Quantitative results were obtained for organic compounds. Five organic tentatively identified compounds (TICs) were observed above the detection limit of ca. 10 ppbv, but standards for most of these were not available at the time of analysis, and the reported concentrations are semiquantitative estimates. In addition, we looked for the 40 standard TO-14 analytes and observed 39. Of these, only one was observed above the 2-ppbv calibrated instrument detection limit. Dichloromethane was above the detection limits using both methods, but the result from the TO-14 method is traceable to a standard gas mixture and is considered more accurate. Organic analytes were found only in the sample collected through the VSS, suggesting that these compounds were residual contamination from a previous sampling job. Detailed descriptions of the results appear in the text.

  15. A UAV-Based Fog Collector Design for Fine-Scale Aerobiological Sampling

    Science.gov (United States)

    Gentry, Diana; Guarro, Marcello; Demachkie, Isabella Siham; Stumfall, Isabel; Dahlgren, Robert P.

    2017-01-01

    Airborne microbes are found throughout the troposphere and into the stratosphere. Knowing how the activity of airborne microorganisms can alter water, carbon, and other geochemical cycles is vital to a full understanding of local and global ecosystems. Just as on the land or in the ocean, atmospheric regions vary in habitability; the underlying geochemical, climatic, and ecological dynamics must be characterized at different scales to be effectively modeled. Most aerobiological studies have focused on a high level: 'How high are airborne microbes found?' and 'How far can they travel?' Most fog and cloud water studies collect from stationary ground stations (point) or along flight transects (1D). To complement and provide context for this data, we have designed a UAV-based modified fog and cloud water collector to retrieve 4D-resolved samples for biological and chemical analysis. Our design uses a passive impacting collector hanging from a rigid rod suspended between two multi-rotor UAVs. The suspension design reduces the effect of turbulence and potential for contamination from the UAV downwash. The UAVs are currently modeled in a leader-follower configuration, taking advantage of recent advances in modular UAVs, UAV swarming, and flight planning. The collector itself is a hydrophobic mesh. Materials including Tyvek, PTFE, nylon, and polypropylene monofilament, fabricated via laser cutting, CNC knife, or 3D printing, were characterized for droplet collection efficiency using a benchtop atomizer and particle counter. Because the meshes can be easily and inexpensively fabricated, a set can be pre-sterilized and brought to the field for 'hot swapping' to decrease cross-contamination between flight sessions or use as negative controls. An onboard sensor and logging system records the time and location of each sample; when combined with flight tracking data, the samples can be resolved into a 4D volumetric map of the fog bank. Collected samples can be returned to the lab for

  16. Accuracy assessment of the National Forest Inventory map of Mexico: sampling designs and the fuzzy characterization of landscapes

    Directory of Open Access Journals (Sweden)

    Stéphane Couturier

    2009-10-01

    There is no record so far in the literature of a comprehensive method to assess the accuracy of regional-scale Land Cover/Land Use (LCLU) maps in the sub-tropical belt. The elevated biodiversity and the presence of highly fragmented classes hamper the use of sampling designs commonly employed in previous assessments of mainly temperate zones. A sampling design for assessing the accuracy of the Mexican National Forest Inventory (NFI) map at community level is presented. A pilot study was conducted on the Cuitzeo Lake watershed region covering 400 000 ha of the 2000 Landsat-derived map. Various sampling designs were tested in order to find a trade-off between operational costs, a good spatial distribution of the sample and the inclusion of all scarcely distributed classes ('rare classes'). A two-stage sampling design, where the selection of Primary Sampling Units (PSU) was done under separate schemes for commonly and scarcely distributed classes, showed the best characteristics. A total of 2,023 punctual secondary sampling units were verified against their NFI map label. Issues regarding the assessment strategy and trends of class confusions are discussed.

  17. A comparison of results for samples collected with bailers constructed of different materials

    International Nuclear Information System (INIS)

    Thomey, N.; Ogle, R.; Jackson, J.

    1992-01-01

    A bailer is one of the most common sampling devices used to collect ground water samples. Bailers constructed from various materials are available; Teflon, polyvinyl chloride (PVC), polyethylene, and stainless steel are all commonly used. It is widely recognized that sample results can be affected by the material from which the bailer is constructed. Teflon and stainless steel are usually recommended because of their inert properties. The cost of these bailers is significantly higher than that of other types. For petroleum storage tank investigations, sampling devices that do not compromise sample quality but are more economical than Teflon or stainless steel would be especially desirable. Water samples were collected using the different types of bailers: Teflon, stainless steel, PVC, and polyethylene. Split samples were analyzed for benzene, toluene, ethylbenzene, total xylenes, and Total Petroleum Hydrocarbons. The analytical results were compared to determine whether differences were due to normal analytical variance or to interaction of the sample with the sampling device. No difference was noted in the results obtained.

  18. A two-stage Bayesian design with sample size reestimation and subgroup analysis for phase II binary response trials.

    Science.gov (United States)

    Zhong, Wei; Koopmeiners, Joseph S; Carlin, Bradley P

    2013-11-01

    Frequentist sample size determination for binary outcome data in a two-arm clinical trial requires initial guesses of the event probabilities for the two treatments. Misspecification of these event rates may lead to a poor estimate of the necessary sample size. In contrast, the Bayesian approach, which treats the treatment effect as a random variable having some distribution, may offer a better, more flexible approach. The Bayesian sample size proposed by Whitehead et al. (2008) for exploratory studies on efficacy justifies the acceptable minimum sample size by a "conclusiveness" condition. In this work, we introduce a new two-stage Bayesian design with sample size reestimation at the interim stage. Our design inherits the properties of good interpretation and easy implementation from Whitehead et al. (2008), generalizes their method to a two-sample setting, and uses a fully Bayesian predictive approach to reduce an overly large initial sample size when necessary. Moreover, our design can be extended to allow patient-level covariates via logistic regression, now adjusting sample size within each subgroup based on interim analyses. We illustrate the benefits of our approach with a design in non-Hodgkin lymphoma with a simple binary covariate (patient gender), offering an initial step toward within-trial personalized medicine. Copyright © 2013 Elsevier Inc. All rights reserved.
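    A "conclusiveness"-style rule on a binary endpoint can be sketched for a single arm with a conjugate Beta prior: the posterior tail probability that the response rate exceeds a threshold has an exact binomial-tail form for integer shape parameters. This is a minimal illustration in the spirit of the design, not the paper's two-stage procedure; the function names and the 0.95 cutoff are ours:

```python
from math import comb

def posterior_prob_exceeds(successes, failures, threshold, a0=1, b0=1):
    """P(response rate > threshold) under the Beta(a0 + successes,
    b0 + failures) posterior, via the exact binomial-tail identity
    valid for integer shape parameters."""
    a = a0 + successes
    b = b0 + failures
    n = a + b - 1
    return sum(
        comb(n, j) * threshold ** j * (1 - threshold) ** (n - j)
        for j in range(a)
    )

def conclusive(successes, failures, threshold, eta=0.95):
    """Illustrative stop rule: the data are conclusive when the posterior
    is decisive in either direction."""
    p = posterior_prob_exceeds(successes, failures, threshold)
    return p >= eta or p <= 1 - eta
```

    For example, 8 responses in 10 patients against a 50% threshold gives a posterior probability of about 0.967, which clears a 0.95 conclusiveness bar, while a 5/10 split does not.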

  19. Headspace vapor characterization of Hanford Waste Tank SX-102: Results from samples collected on July 19, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    McVeety, B.D.; Evans, J.C.; Clauss, T.W.; Pool, K.H.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-SX-102 (Tank SX-102) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5046. Samples were collected by WHC on July 19, 1995, using the vapor sampling system (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace

  20. Headspace vapor characterization of Hanford Waste Tank AX-103: Results from samples collected on June 21, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Ligotke, M.W.; Pool, K.H.; Clauss, T.W.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-AX-103 (Tank AX-103) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5029. Samples were collected by WHC on June 21, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace

  1. Headspace vapor characterization of Hanford Waste Tank AX-101: Results from samples collected on June 15, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Pool, K.H.; Clauss, T.W.; Evans, J.C.; McVeety, B.D.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-AX-101 (Tank AX-101) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5028. Samples were collected by WHC on June 15, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace

  2. Economic Design of Acceptance Sampling Plans in a Two-Stage Supply Chain

    Directory of Open Access Journals (Sweden)

    Lie-Fern Hsu

    2012-01-01

    Supply Chain Management, which is concerned with material and information flows between facilities and the final customers, has become one of the most popular operations strategies for improving organizational competitiveness. With the advanced development of computer technology, it is getting easier to derive an acceptance sampling plan satisfying both the producer's and consumer's quality and risk requirements. However, all the available QC tables and computer software determine the sampling plan on a noneconomic basis. In this paper, we design an economic model to determine the optimal sampling plan in a two-stage supply chain that minimizes the producer's and the consumer's total quality cost while satisfying both the producer's and consumer's quality and risk requirements. Numerical examples show that the optimal sampling plan is quite sensitive to the producer's product quality. The product's inspection, internal failure, and postsale failure costs also have an effect on the optimal sampling plan.
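    The cost components named above (inspection, internal failure, postsale failure) can be combined into an expected-total-cost function of the sampling plan. The following is a simplified illustrative model of our own construction, not the paper's economic model: rejected lots are fully screened, and defectives in accepted lots incur the postsale failure cost.

```python
from math import comb

def accept_prob(n, c, p):
    """Binomial probability of accepting a lot at quality level p
    under single-sampling plan (n, c)."""
    return sum(comb(n, d) * p ** d * (1 - p) ** (n - d) for d in range(c + 1))

def expected_total_cost(N, n, c, p, c_inspect, c_internal, c_postsale):
    """Illustrative expected cost of plan (n, c) for a lot of size N:
    sampled items are always inspected; on rejection the remainder is
    screened (internal failures); on acceptance the remaining defectives
    reach the customer (postsale failures)."""
    pa = accept_prob(n, c, p)
    sample_cost = n * c_inspect + n * p * c_internal
    accepted = pa * (N - n) * p * c_postsale
    rejected = (1 - pa) * ((N - n) * c_inspect + (N - n) * p * c_internal)
    return sample_cost + accepted + rejected
```

    An optimizer would then search over (n, c) for the minimum of this function, which is the kind of sensitivity to the producer's quality level p that the numerical examples in the paper report.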

  3. Optimizing trial design in pharmacogenetics research: comparing a fixed parallel group, group sequential, and adaptive selection design on sample size requirements.

    Science.gov (United States)

    Boessen, Ruud; van der Baan, Frederieke; Groenwold, Rolf; Egberts, Antoine; Klungel, Olaf; Grobbee, Diederick; Knol, Mirjam; Roes, Kit

    2013-01-01

    Two-stage clinical trial designs may be efficient in pharmacogenetics research when there is some but inconclusive evidence of effect modification by a genomic marker. Two-stage designs allow stopping early for efficacy or futility and can offer the additional opportunity to enrich the study population to a specific patient subgroup after an interim analysis. This study compared sample size requirements for fixed parallel group, group sequential, and adaptive selection designs with equal overall power and control of the family-wise type I error rate. The designs were evaluated across scenarios that defined the effect sizes in the marker-positive and marker-negative subgroups and the prevalence of marker-positive patients in the overall study population. Effect sizes were chosen to reflect realistic planning scenarios, where at least some effect is present in the marker-negative subgroup. In addition, scenarios were considered in which the assumed 'true' subgroup effects (i.e., the postulated effects) differed from those hypothesized at the planning stage. As expected, both two-stage designs generally required fewer patients than a fixed parallel group design, and the advantage increased as the difference between subgroups increased. The adaptive selection design added little further reduction in sample size, as compared with the group sequential design, when the postulated effect sizes were equal to those hypothesized at the planning stage. However, when the postulated effects deviated strongly in favor of enrichment, the comparative advantage of the adaptive selection design increased, which precisely reflects the adaptive nature of the design. Copyright © 2013 John Wiley & Sons, Ltd.

  4. Design-based estimators for snowball sampling

    OpenAIRE

    Shafie, Termeh

    2010-01-01

    Snowball sampling, where existing study subjects recruit further subjects from among their acquaintances, is a popular approach when sampling from hidden populations. Since people with many in-links are more likely to be selected, there will be a selection bias in the samples obtained. In order to eliminate this bias, the sample data must be weighted. However, the exact selection probabilities are unknown for snowball samples and need to be approximated in an appropriate way. This paper proposes d...

  5. Sampling and energy evaluation challenges in ligand binding protein design.

    Science.gov (United States)

    Dou, Jiayi; Doyle, Lindsey; Jr Greisen, Per; Schena, Alberto; Park, Hahnbeom; Johnsson, Kai; Stoddard, Barry L; Baker, David

    2017-12-01

    The steroid hormone 17α-hydroxyprogesterone (17-OHP) is a biomarker for congenital adrenal hyperplasia, and hence there is considerable interest in the development of sensors for this compound. We used computational protein design to generate protein models with binding sites for 17-OHP containing an extended, nonpolar, shape-complementary binding pocket for the four-ring core of the compound, and hydrogen-bonding residues at the base of the pocket to interact with the carbonyl and hydroxyl groups at the more polar end of the ligand. Eight of 16 designed proteins experimentally tested bind 17-OHP with micromolar affinity. A co-crystal structure of one of the designs revealed that 17-OHP is rotated 180° around a pseudo-two-fold axis in the compound and displays multiple binding modes within the pocket, while still interacting with all of the designed residues in the engineered site. Subsequent rounds of mutagenesis and binding selection improved the ligand affinity to the nanomolar range, while appearing to constrain the ligand to a single bound conformation that maintains the same "flipped" orientation relative to the original design. We trace the discrepancy in the design calculations to two sources: first, a failure to model subtle backbone changes which alter the distribution of sidechain rotameric states and second, an underestimation of the energetic cost of desolvating the carbonyl and hydroxyl groups of the ligand. The difference between design model and crystal structure thus arises from both sampling limitations and energy function inaccuracies that are exacerbated by the near two-fold symmetry of the molecule. © 2017 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.

  6. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246
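    The design-effect comparison at the heart of this abstract uses the standard definition DE = Var(design) / Var(SRS). For planning purposes, DE for one-stage cluster sampling is commonly approximated from the intracluster correlation; a minimal sketch with our own helper names (this is the textbook approximation, not the NSM estimator itself):

```python
from math import ceil

def design_effect(avg_cluster_size, icc):
    """Approximate design effect for one-stage cluster sampling:
    DE = 1 + (m - 1) * rho, where m is the average cluster size and
    rho the intracluster correlation."""
    return 1 + (avg_cluster_size - 1) * icc

def required_sample_size(n_srs, deff):
    """Inflate a simple-random-sample size by the design effect to
    reach the same precision under the clustered design."""
    return ceil(n_srs * deff)
```

    For example, clusters of 10 with rho = 0.05 give DE = 1.45, so an SRS requirement of 384 interviews grows to 557 under the clustered design; a design with DE near 1, as reported for NSM, needs almost no such inflation.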

  7. Research results: preserving newborn blood samples.

    Science.gov (United States)

    Lewis, Michelle Huckaby; Scheurer, Michael E; Green, Robert C; McGuire, Amy L

    2012-11-07

    Retention and use, without explicit parental permission, of residual dried blood samples from newborn screening has generated public controversy over concerns about violations of family privacy rights and loss of parental autonomy. The public debate about this issue has included little discussion about the destruction of a potentially valuable public resource that can be used for research that may yield improvements in public health. The research community must advocate for policies and infrastructure that promote retention of residual dried blood samples and their use in biomedical research.

  8. SAMPLE RESULTS FROM THE INTEGRATED SALT DISPOSITION PROGRAM MACROBATCH 5 TANK 21H QUALIFICATION MST, ESS AND PODD SAMPLES

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T.; Fink, S.

    2012-04-24

    Savannah River National Laboratory (SRNL) performed experiments on qualification material for use in Integrated Salt Disposition Program (ISDP) Batch 5 processing. This qualification material was a composite created from recent samples from Tank 21H and archived samples from Tank 49H to match the projected blend from these two tanks. Samples of the composite were used in Actinide Removal Process (ARP) and extraction-scrub-strip (ESS) tests; both sets of test results met expectations. A sample from Tank 21H was also analyzed against the Performance Objectives Demonstration Document (PODD) requirements, and SRNL met all of the requirements, including the desired detection limits for all PODD analytes. This report details the results of the ARP, ESS, and PODD samples of Macrobatch (Salt Batch) 5 of the ISDP.

  9. Within-otolith variability in chemical fingerprints: implications for sampling designs and possible environmental interpretation.

    Directory of Open Access Journals (Sweden)

    Antonio Di Franco

    Largely used as a natural biological tag in studies of dispersal/connectivity of fish, otolith elemental fingerprinting is usually analyzed by laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). LA-ICP-MS produces an elemental fingerprint at a discrete time-point in the life of a fish and can generate data on within-otolith variability of that fingerprint. The presence of within-otolith variability has been previously acknowledged but not incorporated into experimental designs on the presumed, but untested, grounds of both its negligibility compared to among-otolith variability and of spatial autocorrelation among multiple ablations within an otolith. Here, using a hierarchical sampling design of spatial variation at multiple scales in otolith chemical fingerprints for two Mediterranean coastal fishes, we explore: 1) whether multiple ablations within an otolith can be used as independent replicates for significance tests among otoliths, and 2) the implications of incorporating within-otolith variability when assessing spatial variability in otolith chemistry at a hierarchy of spatial scales (different fish, from different sites, at different locations on the Apulian Adriatic coast). We find that multiple ablations along the same daily rings do not necessarily exhibit spatial dependency within the otolith and can be used to estimate residual variability in a hierarchical sampling design. Inclusion of within-otolith measurements reveals that individuals at the same site can show significant variability in elemental uptake. Within-otolith variability examined across the spatial hierarchy identifies differences between the two fish species investigated, and this finding leads to discussion of the potential for within-otolith variability to be used as a marker for fish exposure to stressful conditions. We also demonstrate that a 'cost'-optimal allocation of sampling effort should typically include some level of within-otolith replication.

  10. Within-otolith variability in chemical fingerprints: implications for sampling designs and possible environmental interpretation.

    Science.gov (United States)

    Di Franco, Antonio; Bulleri, Fabio; Pennetta, Antonio; De Benedetto, Giuseppe; Clarke, K Robert; Guidetti, Paolo

    2014-01-01

    Largely used as a natural biological tag in studies of dispersal/connectivity of fish, otolith elemental fingerprinting is usually analyzed by laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). LA-ICP-MS produces an elemental fingerprint at a discrete time-point in the life of a fish and can generate data on within-otolith variability of that fingerprint. The presence of within-otolith variability has been previously acknowledged but not incorporated into experimental designs on the presumed, but untested, grounds of both its negligibility compared to among-otolith variability and of spatial autocorrelation among multiple ablations within an otolith. Here, using a hierarchical sampling design of spatial variation at multiple scales in otolith chemical fingerprints for two Mediterranean coastal fishes, we explore: 1) whether multiple ablations within an otolith can be used as independent replicates for significance tests among otoliths, and 2) the implications of incorporating within-otolith variability when assessing spatial variability in otolith chemistry at a hierarchy of spatial scales (different fish, from different sites, at different locations on the Apulian Adriatic coast). We find that multiple ablations along the same daily rings do not necessarily exhibit spatial dependency within the otolith and can be used to estimate residual variability in a hierarchical sampling design. Inclusion of within-otolith measurements reveals that individuals at the same site can show significant variability in elemental uptake. Within-otolith variability examined across the spatial hierarchy identifies differences between the two fish species investigated, and this finding leads to discussion of the potential for within-otolith variability to be used as a marker for fish exposure to stressful conditions. We also demonstrate that a 'cost'-optimal allocation of sampling effort should typically include some level of within-otolith replication.
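
The 'cost'-optimal allocation mentioned here can be illustrated with the classic two-stage allocation formula (a standard textbook result, not code from the study): the variance-minimizing number of within-otolith ablations per fish depends on the ratio of per-stage costs and of the variance components.

```python
import math

def optimal_subsamples(var_within: float, var_among: float,
                       cost_unit: float, cost_sub: float) -> float:
    """Two-stage 'cost-optimal' allocation: number of subsamples (ablations)
    per primary unit (fish) minimizing variance at fixed budget,
    m* = sqrt((c_unit / c_sub) * (var_within / var_among))."""
    return math.sqrt((cost_unit / cost_sub) * (var_within / var_among))
```

Large within-otolith variance or cheap ablations (relative to collecting another fish) both push the optimum toward more within-otolith replication, consistent with the authors' conclusion.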

  11. Space station interior design: Results of the NASA/AIA space station interior national design competition

    Science.gov (United States)

    Haines, R. F.

    1975-01-01

    The results of the NASA/AIA space station interior national design competition held during 1971 are presented to make the outcomes of this design activity available to those who work in the architectural, engineering, and interior design fields. In the competition, the interiors of several space-shuttle-sized modules were designed for optimal habitability, and each design entry also included a final configuration of all modules into a complete space station. A brief history of the competition is presented along with the competition guidelines and constraints. The first-place entry is presented in detail, and specific features from other selected designs are discussed. This is followed by a discussion of how some of these design features might be applied to terrestrial as well as space situations.

  12. OSIRIS-REx Touch-and-Go (TAG) Mission Design for Asteroid Sample Collection

    Science.gov (United States)

    May, Alexander; Sutter, Brian; Linn, Timothy; Bierhaus, Beau; Berry, Kevin; Mink, Ron

    2014-01-01

    The Origins Spectral Interpretation Resource Identification Security Regolith Explorer (OSIRIS-REx) mission is a NASA New Frontiers mission launching in September 2016 to rendezvous with the near-Earth asteroid Bennu in October 2018. After several months of proximity operations to characterize the asteroid, OSIRIS-REx flies a Touch-And-Go (TAG) trajectory to the asteroid's surface to collect at least 60 g of pristine regolith sample for Earth return. This paper provides mission and flight system overviews, with more detail on the TAG mission design and the key events that must occur to safely and successfully collect the sample. The navigation performed relative to a chosen sample site, along with the maneuvers to reach the desired site, is described. Safety monitoring during descent is performed with onboard sensors, providing an option to abort, troubleshoot, and try again if necessary. Sample collection occurs using a collection device at the end of an articulating robotic arm during a brief five-second contact period, while a constant-force spring mechanism in the arm helps rebound the spacecraft away from the surface. Finally, the sample is measured quantitatively using the law of conservation of angular momentum, along with qualitative data from imagery of the sampling device. Upon sample mass verification, the arm places the sample into the Stardust-heritage Sample Return Capsule (SRC) for return to Earth in September 2023.
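
The quantitative mass measurement can be sketched with a point-mass model (illustrative only, not the flight algorithm): a known angular impulse applied before and after sampling gives the spacecraft's moment of inertia in each state, and the inertia change at the arm's radius yields the sample mass.

```python
def inertia_from_impulse(delta_L: float, delta_omega: float) -> float:
    """I = dL / d(omega) for a rigid body spun up by a known angular impulse."""
    return delta_L / delta_omega

def sample_mass(I_before: float, I_after: float, arm_radius: float) -> float:
    """Point-mass approximation: dm = dI / r^2 for sample held at radius r."""
    return (I_after - I_before) / arm_radius ** 2
```

All quantities here (impulse, radius) are hypothetical placeholders; the real measurement must also account for fuel usage and flexible-body effects.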

  13. Evaluation of analytical results on DOE Quality Assessment Program Samples

    International Nuclear Information System (INIS)

    Jaquish, R.E.; Kinnison, R.R.; Mathur, S.P.; Sastry, R.

    1985-01-01

    Criteria were developed for evaluating participants' analytical results in the DOE Quality Assessment Program (QAP). Historical data from previous QAP studies were analyzed using descriptive statistical methods to determine the interlaboratory precision that had been attained. Performance criteria used in other similar programs were also reviewed. Using these data, precision values and control limits were recommended for each type of analysis performed in the QA program. Results of the analyses performed by the QAP participants on the November 1983 samples were statistically analyzed and evaluated. The Environmental Measurements Laboratory (EML) values were used as the known values, and 3-sigma precision values were used as control limits. Results were submitted by 26 participating laboratories for 49 different radionuclide-media combinations. The participants reported 419 results, of which 350 (84%) were within control limits. Special attention was given to the data from gamma spectral analysis of air filters and water samples. Both normal-probability and box plots were prepared for each nuclide to help evaluate the distribution of the data. Results outside the expected range were identified, and it was suggested that laboratories check calculations and procedures for these results.
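
The control-limit screening described above amounts to checking each reported value against the EML known value within ±3 sigma; a minimal sketch (function and variable names are illustrative):

```python
import numpy as np

def within_limits(reported, known: float, sigma: float, k: float = 3.0):
    """Flag results inside +/- k*sigma control limits around the known value."""
    reported = np.asarray(reported, dtype=float)
    return np.abs(reported - known) <= k * sigma

# Fraction in control for a set of results:
# in_control_rate = within_limits(results, eml_value, precision).mean()
```

Applied per radionuclide-media combination, the mean of these flags reproduces the kind of "84% within control limits" summary reported here.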

  14. Robotic Irradiated Sample Handling Concept Design in Reactor TRIGA PUSPATI using Simulation Software

    International Nuclear Information System (INIS)

    Mohd Khairulezwan Abdul Manan; Mohd Sabri Minhat; Ridzuan Abdul Mutalib; Zareen Khan Abdul Jalil Khan; Nurfarhana Ayuni Joha

    2015-01-01

    This paper introduces the concept design of a robotic irradiated-sample handling machine using a graphical software application, designed as a general, flexible, and open platform for robotics work. Webots has proven to be a useful tool in many fields of robotics, such as manipulator programming, mobile robot control (wheeled, sub-aquatic, and walking robots), distance computation, sensor simulation, collision detection, and motion planning. Webots is used as the common interface for all the applications. Some practical cases and applications of this concept design are illustrated in the paper to present the possibilities of this simulation software. (author)

  15. Evaluation of single and two-stage adaptive sampling designs for estimation of density and abundance of freshwater mussels in a large river

    Science.gov (United States)

    Smith, D.R.; Rogala, J.T.; Gray, B.R.; Zigler, S.J.; Newton, T.J.

    2011-01-01

    Reliable estimates of abundance are needed to assess consequences of proposed habitat restoration and enhancement projects on freshwater mussels in the Upper Mississippi River (UMR). Although there is general guidance on sampling techniques for population assessment of freshwater mussels, the actual performance of sampling designs can depend critically on the population density and spatial distribution at the project site. To evaluate various sampling designs, we simulated sampling of populations, which varied in density and degree of spatial clustering. Because of logistics and costs of large river sampling and spatial clustering of freshwater mussels, we focused on adaptive and non-adaptive versions of single and two-stage sampling. The candidate designs performed similarly in terms of precision (CV) and probability of species detection for fixed sample size. Both CV and species detection were determined largely by density, spatial distribution and sample size. However, designs did differ in the rate that occupied quadrats were encountered. Occupied units had a higher probability of selection using adaptive designs than conventional designs. We used two measures of cost: sample size (i.e. number of quadrats) and distance travelled between the quadrats. Adaptive and two-stage designs tended to reduce distance between sampling units, and thus performed better when distance travelled was considered. Based on the comparisons, we provide general recommendations on the sampling designs for the freshwater mussels in the UMR, and presumably other large rivers.
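
The dependence of precision on density, clustering, and sample size can be reproduced in miniature. The sketch below (not the authors' simulation) draws clustered quadrat counts from a negative binomial distribution and reports the CV of the estimated mean density:

```python
import numpy as np

def simulate_cv(density: float, dispersion_k: float, n_quadrats: int,
                reps: int = 2000, seed: int = 1) -> float:
    """CV of the mean-density estimator when quadrat counts follow a
    negative binomial with mean `density` and dispersion k
    (smaller k = stronger spatial clustering)."""
    rng = np.random.default_rng(seed)
    p = dispersion_k / (dispersion_k + density)
    counts = rng.negative_binomial(dispersion_k, p, size=(reps, n_quadrats))
    estimates = counts.mean(axis=1)
    return estimates.std() / estimates.mean()
```

As the abstract notes, CV falls with sample size and rises with clustering; adaptive designs change which quadrats are visited, not these underlying drivers.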

  16. Design and characterization of poly(dimethylsiloxane)-based valves for interfacing continuous-flow sampling to microchip electrophoresis.

    Science.gov (United States)

    Li, Michelle W; Huynh, Bryan H; Hulvey, Matthew K; Lunte, Susan M; Martin, R Scott

    2006-02-15

    This work describes the fabrication and evaluation of a poly(dimethyl)siloxane (PDMS)-based device that enables the discrete injection of a sample plug from a continuous-flow stream into a microchannel for subsequent analysis by electrophoresis. Devices were fabricated by aligning valving and flow channel layers followed by plasma sealing the combined layers onto a glass plate that contained fittings for the introduction of liquid sample and nitrogen gas. The design incorporates a reduced-volume pneumatic valve that actuates (on the order of hundreds of milliseconds) to allow analyte from a continuously flowing sampling channel to be injected into a separation channel for electrophoresis. The injector design was optimized to include a pushback channel to flush away stagnant sample associated with the injector dead volume. The effect of the valve actuation time, the pushback voltage, and the sampling stream flow rate on the performance of the device was characterized. Using the optimized design and an injection frequency of 0.64 Hz showed that the injection process is reproducible (RSD of 1.77%, n = 15). Concentration change experiments using fluorescein as the analyte showed that the device could achieve a lag time as small as 14 s. Finally, to demonstrate the potential uses of this device, the microchip was coupled to a microdialysis probe to monitor a concentration change and sample a fluorescein dye mixture.
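
The reproducibility figure quoted above (RSD of 1.77%, n = 15) is the percent relative standard deviation of repeated injections; for reference:

```python
import statistics

def relative_std_dev(values) -> float:
    """Percent RSD = 100 * sample standard deviation / mean,
    as used to report injection-to-injection reproducibility."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)
```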

  17. Sampling design for the Study of Cardiovascular Risks in Adolescents (ERICA)

    Directory of Open Access Journals (Sweden)

    Mauricio Teixeira Leite de Vasconcellos

    2015-05-01

    The Study of Cardiovascular Risk in Adolescents (ERICA) aims to estimate the prevalence of cardiovascular risk factors and metabolic syndrome in adolescents (12-17 years) enrolled in public and private schools of the 273 municipalities with over 100,000 inhabitants in Brazil. The study population was stratified into 32 geographical strata (27 capitals and five sets with other municipalities in each macro-region of the country) and a sample of 1,251 schools was selected with probability proportional to size. In each school three combinations of shift (morning and afternoon) and grade were selected, and within each of these combinations, one class was selected. All eligible students in the selected classes were included in the study. The design sampling weights were calculated by the product of the reciprocals of the inclusion probabilities in each sampling stage, and were later calibrated considering the projections of the numbers of adolescents enrolled in schools located in the geographical strata by sex and age.
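
The base design weight described here, before calibration, is the product of the reciprocals of the stage-wise inclusion probabilities (school, shift/grade combination, class); a minimal sketch with illustrative probabilities:

```python
def design_weight(stage_probs) -> float:
    """Base design weight: product of reciprocals of the inclusion
    probabilities at each sampling stage."""
    w = 1.0
    for p in stage_probs:
        w *= 1.0 / p
    return w

# e.g. school selected with prob 0.5, shift/grade with 0.25, class with 1.0:
# design_weight([0.5, 0.25, 1.0]) -> each sampled student represents 8 students
```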

  18. PEP-II design update and R&D results

    International Nuclear Information System (INIS)

    Zisman, M.S.

    1993-01-01

    We describe the present status of the PEP-II asymmetric B factory design undertaken by SLAC, LBL, and LLNL. Design optimization during the past year and changes from the original CDR design are described. R&D activities have focused primarily on the key technology areas of vacuum, RF, and feedback system design. Recent progress in these areas is described. The R&D results have verified our design assumptions and provide further confidence in the design of PEP-II.

  19. A design-based approximation to the Bayes Information Criterion in finite population sampling

    Directory of Open Access Journals (Sweden)

    Enrico Fabrizi

    2014-05-01

    In this article, various issues related to the implementation of the usual Bayesian Information Criterion (BIC) are critically examined in the context of modelling a finite population. A suitable design-based approximation to the BIC is proposed in order to avoid the derivation of the exact likelihood of the sample, which is often very complex in finite population sampling. The approximation is justified using a theoretical argument and a Monte Carlo simulation study.
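
For context, the quantity being approximated is the usual BIC, which requires the log-likelihood of the sample; the article's design-based variant substitutes an approximation precisely because that likelihood is hard to derive under complex designs. The standard formula:

```python
import math

def bic(log_lik: float, k_params: int, n_obs: int) -> float:
    """Standard Bayesian Information Criterion: BIC = -2*logL + k*ln(n)."""
    return -2.0 * log_lik + k_params * math.log(n_obs)
```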

  20. Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology

    Science.gov (United States)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then to ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and is specifically focused to address comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models are developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the models of the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques were more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, locations, etc. of the spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal.
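
A TAG contact of the kind G-TAG simulates can be caricatured in one dimension: the spacecraft arrives at a slow descent rate, and a constant-force spring decelerates it and pushes it back off the surface. All numbers below are illustrative placeholders, not G-TAG model parameters:

```python
def tag_contact(v_descent: float = 0.1, mass: float = 1300.0,
                spring_force: float = 60.0, dt: float = 1e-3):
    """1-D sketch of a TAG contact. The spacecraft arrives at v_descent (m/s);
    a constant upward spring force acts while in contact.
    Returns (contact_time_s, rebound_speed_m_s)."""
    v = -v_descent          # negative = toward the surface
    x = 0.0                 # spring deflection (negative = compressed)
    t = 0.0
    a = spring_force / mass
    while True:
        v += a * dt         # semi-implicit Euler integration
        x += v * dt
        t += dt
        if x >= 0.0 and v > 0.0:    # spring back to free length, moving away
            return t, v
```

With these placeholder values the contact lasts a few seconds and the rebound speed roughly equals the descent speed, since the constant force is conservative.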

  1. Design and building of a homemade sample changer for automation of the irradiation in neutron activation analysis technique

    International Nuclear Information System (INIS)

    Gago, Javier; Hernandez, Yuri; Baltuano, Oscar; Bedregal, Patricia; Lopez, Yon; Urquizo, Rafael

    2014-01-01

    Because the RP-10 research reactor operates during weekends, it was necessary to design and build a sample changer for irradiation as part of the automation of the neutron activation analysis technique. The device consists of an aluminum turntable disk that can accommodate 19 polyethylene capsules containing samples to be sent, using the pneumatic transfer system, from the laboratory to the irradiation position. The system is operated from a control switchboard that sends and returns capsules after a variable preset time and by two different routes, allowing the determination of short-, medium-, and long-lived radionuclides. Another mechanism, called an 'exchange valve', was designed for changing travel paths (pipelines), allowing irradiated samples to be stored for a longer time in the reactor hall. The system design has allowed complete automation of this technique, enabling the irradiation of samples without the presence of an analyst. The design, construction, and operation of the device are described in this article. (authors)

  2. Design of a sample acquisition system for the Mars exobiological penetrator

    Science.gov (United States)

    Thomson, Ron; Gwynne, Owen

    1988-01-01

    The Mars Exobiological Penetrator will be embedded at several locations on the Martian surface. It contains various scientific instruments, such as an Alpha-Particle Instrument (API), Differential Scanning Calorimeter (DSC), Evolved Gas Analyzer (EGA), and accelerometers. A sample is required for analysis in the API and DSC. To avoid impact-contaminated material, this sample must be taken from soil more than 2 cm away from the penetrator shell. This study examines the design of a dedicated sampling system, including deployment, suspension, fore/after body coupling, sample gathering, and placement. To prevent subsurface material from entering the penetrator sampling compartment during impact, a plug is placed in the exit hole of the wall. A U-lever device holds this plug in the penetrator wall; the U-lever rotates upon initial motion of the core-grinder mechanism (CGM), releasing the plug. Research points to a combination of coring and grinding as a plausible solution to the problem of dry drilling. The CGM, driven by two compressed springs, is deployed along a tracking system. A slowly varying load (i.e., springs) is favored over a fixed-displacement motion because of its adaptability to different material hardnesses. However, to accommodate sampling in low-density soil, two dashpots set a maximum transverse velocity. In addition, minimal power use is achieved by unidirectional motion of the CGM. The sample is transported to the scientific instruments on a sample placement tray driven by a compressed spring to avoid unnecessary power usage. This paper also explores possible modifications for size, weight, and time, as well as possible future studies.

  3. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Directory of Open Access Journals (Sweden)

    R. Eric Heidel

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
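
The five components map directly into the standard a priori calculation; for example, for a two-sided comparison of two means with a standardized effect size d (a textbook normal-approximation formula, not specific to this article):

```python
import math
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05,
                power: float = 0.80) -> int:
    """A priori n per group for a two-sided two-sample comparison of means:
    n = 2 * (z_{1-alpha/2} + z_{power})^2 / d^2, d = Cohen's effect size."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)
```

The isomorphism the author describes is visible here: halving the effect size quadruples the required n, and raising the desired power raises n as well.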

  4. Baseline Design Compliance Matrix for the Type 4 In Situ Vapor Samplers and Supernate and Sludge and Soft Saltcake Grab Sampling

    International Nuclear Information System (INIS)

    BOGER, R.M.

    2000-01-01

    The DOE has identified a need to sample vapor space, exhaust ducts, supernate, sludge, and soft saltcake in waste tanks that store radioactive waste. This document provides the Design Compliance Matrix (DCM) for the Type 4 In-Situ Vapor Sampling (ISVS) system and the Grab Sampling System that are used for completing this type of sampling function. The DCM identifies the design requirements and the source of those requirements for the Type 4 ISVS system and the Grab Sampling system. The DCM is a single-source compilation of design requirements for sampling and sampling-support equipment and supports the configuration management of these systems.

  5. Headspace vapor characterization of Hanford Waste Tank 241-T-110: Results from samples collected on August 31, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    McVeety, B.D.; Thomas, B.L.; Evans, J.C.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-T-110 (Tank T-110) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5056. Samples were collected by WHC on August 31, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  6. Headspace vapor characterization of Hanford Waste Tank 241-TX-111: Results from samples collected on October 12, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Pool, K.H.; Clauss, T.W.; Evans, J.C.

    1996-06-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-TX-111 (Tank TX-111) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5069. Samples were collected by WHC on October 12, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  7. Headspace vapor characterization of Hanford Waste Tank 241-SX-109: Results from samples collected on August 1, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Pool, K.H.; Clauss, T.W.; Evans, J.C.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-SX-109 (Tank SX-109) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5048. Samples were collected by WHC on August 1, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  8. Headspace vapor characterization of Hanford Waste Tank 241-SX-104: Results from samples collected on July 25, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Thomas, B.L.; Clauss, T.W.; Evans, J.C.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-SX-104 (Tank SX-104) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5049. Samples were collected by WHC on July 25, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  9. Headspace vapor characterization of Hanford Waste Tank 241-S-112: Results from samples collected on July 11, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Clauss, T.W.; Pool, K.H.; Evans, J.C.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-S-112 (Tank S-112) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5044. Samples were collected by WHC on July 11, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  10. Headspace vapor characterization of Hanford Waste Tank 241-SX-105: Results from samples collected on July 26, 1995. Tank Vapor Characterization Project

    International Nuclear Information System (INIS)

    Pool, K.H.; Clauss, T.W.; Evans, J.C.

    1996-05-01

    This report describes the results of vapor samples taken from the headspace of waste storage tank 241-SX-105 (Tank SX-105) at the Hanford Site in Washington State. Pacific Northwest National Laboratory (PNNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze samples for inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The analytical work was performed by the PNNL Vapor Analytical Laboratory (VAL) under the Tank Vapor Characterization Project. Work performed was based on a sample and analysis plan (SAP) prepared by WHC. The SAP provided job-specific instructions for samples, analyses, and reporting. The SAP for this sample job was "Vapor Sampling and Analysis Plan", and the sample job was designated S5047. Samples were collected by WHC on July 26, 1995, using the Vapor Sampling System (VSS), a truck-based sampling method using a heated probe inserted into the tank headspace.

  11. Exploring Technostress: Results of a Large Sample Factor Analysis

    Directory of Open Access Journals (Sweden)

    Steponas Jonušauskas

    2016-06-01

Full Text Available With reference to the results of a large sample factor analysis, the article aims to propose a frame for examining technostress in a population. A survey and principal component analysis of a sample of 1013 individuals who use ICT in their everyday work were implemented in the research. Thirteen factors combine 68 questions and explain 59.13 per cent of the dispersion of the answers. Based on the factor analysis, the questionnaire was reframed and prepared to analyze the respondents’ answers soundly, revealing technostress causes and consequences as well as technostress prevalence in the population in a statistically validated pattern. The key elements of technostress identified by the factor analysis can serve for the construction of technostress measurement scales in further research.
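The principal component step in this record can be sketched in miniature: the snippet below simulates a few questionnaire items driven by one latent "technostress" factor and estimates the share of variance captured by the first principal component via power iteration on the correlation matrix. All data sizes and parameter values here are invented for illustration; the actual survey used 68 items and 1013 respondents.

```python
import math
import random

def correlation_matrix(data):
    """Pearson correlation matrix for the columns of `data` (list of rows)."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    sds = [math.sqrt(sum((row[j] - means[j]) ** 2 for row in data) / n) for j in range(p)]
    z = [[(row[j] - means[j]) / sds[j] for j in range(p)] for row in data]
    return [[sum(z[i][a] * z[i][b] for i in range(n)) / n for b in range(p)]
            for a in range(p)]

def leading_eigenvalue(m, iters=200):
    """Largest eigenvalue of a symmetric matrix `m` by power iteration."""
    p = len(m)
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(m[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    mv = [sum(m[a][b] * v[b] for b in range(p)) for a in range(p)]
    return sum(v[a] * mv[a] for a in range(p))  # Rayleigh quotient

random.seed(1)
# simulated answers: 4 items, each = latent factor + independent noise
latent = [random.gauss(0, 1) for _ in range(500)]
data = [[l + random.gauss(0, 0.5) for _ in range(4)] for l in latent]
lam = leading_eigenvalue(correlation_matrix(data))
share = lam / 4  # fraction of total variance explained by the first component
```

With items this strongly driven by one factor, the first component explains most of the variance; real questionnaire data, as in the record, spreads the variance over many factors.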

  12. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

The report describes how to use the codes MUP (Monte Carlo Uncertainty Propagation), for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design), for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design), for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems
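The Latin Hypercube Sampling technique underlying STRADE can be sketched as follows. This is a minimal stand-alone illustration, not the STRADE code itself: each dimension's range is cut into n equal strata, each stratum is sampled exactly once, and the per-dimension draws are shuffled before being paired across dimensions.

```python
import random

def latin_hypercube(n, dims, rng=None):
    """n stratified samples in [0, 1)^dims: each dimension's range is cut
    into n equal strata and every stratum is sampled exactly once."""
    rng = rng or random.Random(42)
    cols = []
    for _ in range(dims):
        pts = [(i + rng.random()) / n for i in range(n)]  # one point per stratum
        rng.shuffle(pts)                                  # random pairing across dims
        cols.append(pts)
    return [tuple(col[i] for col in cols) for i in range(n)]

sample = latin_hypercube(10, 2)
```

Unlike simple random sampling, the marginal coverage of each input variable is guaranteed, which is why the technique is attractive for experimental design with few runs.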

  13. [Design of standard voice sample text for subjective auditory perceptual evaluation of voice disorders].

    Science.gov (United States)

    Li, Jin-rang; Sun, Yan-yan; Xu, Wen

    2010-09-01

To design a speech voice sample text containing all phonemes in Mandarin for subjective auditory perceptual evaluation of voice disorders. The principles for the design of such a text are: the short text should include the 21 initials and 39 finals, so that it may cover all the phonemes in Mandarin; the short text should also be meaningful. A short text was composed. It had 155 Chinese words and included 21 initials and 38 finals (the final ê was not included because it is rarely used in Mandarin). The text also covered 17 light tones and one "Erhua". The constituent ratios of the initials and finals presented in this short text were statistically similar to those in Mandarin according to the method of similarity of the sample and population (r = 0.742, P text were statistically not similar to those in Mandarin (r = 0.731, P > 0.05). A speech voice sample text with all phonemes in Mandarin was produced. The constituent ratios of the initials and finals presented in this short text are similar to those in Mandarin. Its value for subjective auditory perceptual evaluation of voice disorders needs further study.

  14. A novel ultra-short scanning nuclear microprobe: Design and preliminary results

    International Nuclear Information System (INIS)

    Lebed, S.; Butz, T.; Vogt, J.; Reinert, T.; Spemann, D.; Heitmann, J.; Stachura, Z.; Lekki, J.; Potempa, A.; Styczen, J.; Sulkio-Cleff, B.

    2001-01-01

The paper describes an optimized scanning nuclear microprobe (MP) with a new ultra-short (total length of 1.85 m) probe-forming system based on a divided Russian quadruplet (DRQ) of magnetic quadrupole lenses. Modern electrostatic accelerators have a comparatively high beam brightness of about 10-25 pA/μm 2 /mrad 2 /MeV. This allows the proposed MP to provide a high lateral resolution even with large (1%) parasitic (sextupole and octupole) pole-tip field components in all lenses. The features of the design permit MP operation in both high-current and low-current modes with a short working distance and inexpensive quadrupole lenses. A new quadrupole doublet design has been developed for the MP. In the present work the calculated features of the new MP are compared with preliminary experimental results obtained with a similar system (total length of 2.3 m) at the INP in Cracow. The new MP is promising for studies of solid or biological samples with high resolution (0.08-2 μm) in both modes under ambient conditions. A vertical version of the ultra-short MP can be very useful for single-ion bombardment of living cells

  15. Gas and liquid sampling for closed canisters in K-West basins - functional design criteria

    International Nuclear Information System (INIS)

    Pitkoff, C.C.

    1994-01-01

    The purpose of this document is to provide functions and requirements for the design and fabrication of equipment for sampling closed canisters in the K-West basin. The samples will be used to help determine the state of the fuel elements in closed canisters. The characterization information obtained will support evaluation and development of processes required for safe storage and disposition of Spent Nuclear Fuel (SNF) materials

  16. Spillway design implications resulting from changes in rainfall extremes

    International Nuclear Information System (INIS)

    Muzik, I.

    1999-01-01

A study was conducted to determine how serious the implications for spillway design of small dams would be if flood frequencies and magnitudes changed because of changes in rainfall regime, brought on in turn by climate change due to carbon dioxide accumulation in the atmosphere. The region selected for study was the central Alberta foothills and adjacent prairie environment. A study watershed, representative of the region, was chosen to assess the present and possible future flood frequency-magnitude relationships. A Monte Carlo simulation method was used in conjunction with rainfall-runoff modelling of the study watershed to generate data for flood frequency analysis of maximum annual flood series corresponding to the present and future climate scenarios. The impact of the resulting differences in design floods for small dams on spillway design was investigated using the Prairie Farm Rehabilitation Administration small dam design method. Changes in the mean and standard deviation of rainfall depth of design storms in a region will result in new probability distributions of the maximum annual flood flows. A 25% increase in the mean and standard deviation of design rainfall depth resulted in greater increases of the 1:2 and 1:100 flood flows than a 50% increase in the standard deviation alone did. Under scenario 1, the 1:2 flood flows increased more than did the 1:100 flows. Scenario 2 produced the opposite result, whereby the 1:100 flows increased more than did the 1:2 flows. It seems that a climate change of the type of scenario 1 would result in a more severe increase in flood flows than scenario 2 would. Retrofitting existing spillways of small dams would in most cases require increasing the flow capacities of both operating and auxiliary spillways. 23 refs
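The Monte Carlo approach described in the record can be sketched in toy form: sample annual-maximum rainfall depths from a distribution, convert them to flows with an assumed runoff relation, and compare a flood quantile before and after inflating the rainfall mean and standard deviation by 25%. The lognormal rainfall model, the parameter values and the 0.8 runoff coefficient below are illustrative assumptions, not the study's calibrated rainfall-runoff model.

```python
import math
import random

def flood_quantile(mean, sd, prob, n=20000, rng=None):
    """Monte Carlo estimate of the flood flow exceeded with annual
    probability `prob`, using a toy lognormal model for annual maximum
    rainfall depth and a linear rainfall-runoff relation."""
    rng = rng or random.Random(0)
    sigma2 = math.log(1 + (sd / mean) ** 2)   # lognormal parameters matching
    mu = math.log(mean) - sigma2 / 2          # the given mean and sd
    depths = sorted(rng.lognormvariate(mu, math.sqrt(sigma2)) for _ in range(n))
    flows = [0.8 * d for d in depths]         # assumed runoff coefficient
    return flows[int((1 - prob) * n)]         # empirical (1 - prob) quantile

q100_base = flood_quantile(60.0, 20.0, 0.01)
q100_scen = flood_quantile(75.0, 25.0, 0.01)  # mean and sd increased by 25%
```

Repeating this for the 1:2 and 1:100 quantiles under each scenario reproduces the kind of comparison the study reports.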

  17. Understanding Creative Design Processes by Integrating Sketching and CAD Modelling Design Environments: A Preliminary Protocol Result from Architectural Designers

    Directory of Open Access Journals (Sweden)

    Yi Teng Shih

    2015-11-01

Full Text Available This paper presents the results of a preliminary protocol study of the cognitive behaviour of architectural designers during the design process. The aim is to better understand the similarities and differences in cognitive behaviour under Sequential Mixed Media (SMM) and Alternative Mixed Media (AMM) approaches, and how switching between media may impact design processes. Two participants, each with at least one year's professional design experience, a Bachelor of Design degree, and competence in both sketching and computer-aided design (CAD) modelling, participated in the study. Video recordings of participants working on different projects were coded using the Function-Behaviour-Structure (FBS) coding scheme. Participants were also interviewed, and their explanations of their switching behaviours were categorised into three types: S→C, S/C↹R and C→S. Preliminary results indicate that switching between media may influence how designers identify problems and develop solutions. In particular, two design issues were identified. These relate to the FBS coding scheme, where structure (S) and behaviour derived from structure (Bs) change to documentation (D) after switching from sketching to CAD modelling (S→C). These switches make it possible for designers to integrate both approaches into one design medium and facilitate their design processes in AMM design environments.

  18. Results of the Characterization and Dissolution Tests of Samples from Tank 16H

    International Nuclear Information System (INIS)

    Hay, M.S.

    1999-01-01

Samples from the Tank 16H annulus and one sample from the tank interior were characterized to provide a source term for use in fate and transport modeling. Four of the annulus samples appeared similar based on visual examination and were combined to form a composite. One annulus sample appeared different from the other four based on visual examination and was analyzed separately. The analytical results for the tank interior sample indicate the sample is composed predominantly of iron-containing compounds. Both of the annulus samples are composed mainly of sodium salts; however, the composite sample contained significantly more sludge/sand material of low solubility. The characterization of the Tank 16H annulus and tank interior samples was hampered by the high dose rate and the nature of the samples. These difficulties resulted in large uncertainties in the analytical data. The large uncertainties, coupled with the number of important species below detection limits, indicate the need for reanalysis of the Tank 16H samples as funding becomes available. Recommendations on potential remedies for these difficulties are provided. In general, none of the reagents appeared to be effective in dissolving the composite sample, even after two contacts at elevated temperature. In contrast to the composite sample, all of the reagents dissolved a large percentage of the HTF-087 solids after two contacts at ambient temperature

  19. Evaluation of design flood estimates with respect to sample size

    Science.gov (United States)

    Kobierska, Florian; Engeland, Kolbjorn

    2016-04-01

Estimation of design floods forms the basis for hazard management related to flood risk and is a legal obligation when building infrastructure such as dams, bridges and roads close to water bodies. Flood inundation maps used for land use planning are also produced based on design flood estimates. In Norway, the current guidelines for design flood estimates give recommendations on which data, probability distribution, and method to use depending on the length of the local record. If less than 30 years of local data are available, an index flood approach is recommended, where the local observations are used for estimating the index flood and regional data are used for estimating the growth curve. For 30-50 years of data, a 2-parameter distribution is recommended, and for more than 50 years of data, a 3-parameter distribution should be used. Many countries have national guidelines for flood frequency estimation, and recommended distributions include the log-Pearson type III, generalized logistic and generalized extreme value distributions. For estimating distribution parameters, ordinary and linear moments, maximum likelihood and Bayesian methods are used. The aim of this study is to re-evaluate the guidelines for local flood frequency estimation. In particular, we wanted to answer the following questions: (i) Which distribution gives the best fit to the data? (ii) Which estimation method provides the best fit to the data? (iii) Does the answer to (i) and (ii) depend on local data availability? To answer these questions we set up a test bench for local flood frequency analysis using data-based cross-validation methods. The criteria were based on indices describing the stability and reliability of design flood estimates. Stability is used as a criterion since design flood estimates should not depend excessively on the data sample. The reliability indices describe the degree to which design flood predictions can be trusted.
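The stability criterion can be illustrated with a toy resampling experiment: fit a 2-parameter Gumbel (EV1) distribution by the method of moments to records of different lengths and watch the spread of the 100-year estimate shrink as the record grows. The synthetic "flood record" and the fitting choices below are assumptions for illustration, not the study's test bench.

```python
import math
import random
import statistics

def gumbel_q(sample, T):
    """T-year flood from a Gumbel (EV1) fit by the method of moments."""
    m, s = statistics.mean(sample), statistics.stdev(sample)
    beta = s * math.sqrt(6) / math.pi
    mu = m - 0.5772 * beta            # 0.5772: Euler-Mascheroni constant
    return mu - beta * math.log(-math.log(1 - 1 / T))

rng = random.Random(7)
population = [rng.gauss(100, 30) for _ in range(5000)]  # stand-in flood record

def estimate_spread(n, reps=200):
    """Stability index: spread of the 100-year estimate over resamples of size n."""
    est = [gumbel_q(rng.sample(population, n), 100) for _ in range(reps)]
    return statistics.stdev(est)

s30, s80 = estimate_spread(30), estimate_spread(80)  # longer record -> more stable
```

A cross-validation test bench of the kind described in the record generalizes this idea: candidate distributions and estimation methods are scored on how stable and reliable their quantile estimates are across held-out subsamples.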

  20. Impacts of Sample Design for Validation Data on the Accuracy of Feedforward Neural Network Classification

    Directory of Open Access Journals (Sweden)

    Giles M. Foody

    2017-08-01

Full Text Available Validation data are often used to evaluate the performance of a trained neural network and used in the selection of a network deemed optimal for the task at hand. Optimality is commonly assessed with a measure, such as overall classification accuracy. The latter is often calculated directly from a confusion matrix showing the counts of cases in the validation set with particular labelling properties. The sample design used to form the validation set can, however, influence the estimated magnitude of the accuracy. Commonly, the validation set is formed with a stratified sample to give balanced classes, but also via random sampling, which reflects class abundance. It is suggested that if the ultimate aim is to accurately classify a dataset in which the classes vary in abundance, a validation set formed via random, rather than stratified, sampling is preferred. This is illustrated with the classification of simulated and remotely-sensed datasets. With both datasets, statistically significant differences in the accuracy with which the data could be classified arose from the use of validation sets formed via random and stratified sampling (z = 2.7 and 1.9 for the simulated and real datasets respectively; both p < 0.05). The accuracy of the classifications that used a stratified sample in validation was smaller, a result of cases of an abundant class being commissioned into a rarer class. Simple means to address the issue are suggested.
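The effect described in the record is easy to reproduce in a small simulation: when per-class accuracies differ and classes are imbalanced, a stratified (balanced) validation set estimates a different overall accuracy than a random sample that reflects class abundance. The class abundances and per-class accuracies below are invented for illustration.

```python
import random

rng = random.Random(3)
# population: 90% class A, 10% class B; the classifier is assumed to be
# correct on 95% of A cases but only 70% of B cases
POP = ["A" if rng.random() < 0.9 else "B" for _ in range(100000)]
ACC = {"A": 0.95, "B": 0.70}

def accuracy(sample):
    """Estimated overall accuracy from a validation sample."""
    return sum(rng.random() < ACC[c] for c in sample) / len(sample)

acc_random = accuracy(rng.sample(POP, 2000))       # sample reflects abundance
acc_strat = accuracy(["A"] * 1000 + ["B"] * 1000)  # stratified, balanced classes
# population accuracy is 0.9*0.95 + 0.1*0.70 = 0.925, while the stratified
# estimate is pulled toward (0.95 + 0.70)/2 = 0.825
```

The random sample tracks the accuracy actually achievable on the population, whereas the balanced set over-weights the rarer, harder class, mirroring the smaller stratified-validation accuracies reported in the record.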

  1. Test of a sample container for shipment of small size plutonium samples with PAT-2

    International Nuclear Information System (INIS)

    Kuhn, E.; Aigner, H.; Deron, S.

    1981-11-01

A light-weight container for the air transport of plutonium, designated PAT-2, has been developed in the USA and is presently undergoing licensing. The very limited effective space for bearing plutonium required the design of small-size sample canisters to meet the needs of international safeguards for the shipment of plutonium samples. The applicability of a small canister for the sampling of small-size powder and solution samples has been tested in an intralaboratory experiment. The results of the experiment, based on the concept of pre-weighed samples, show that the tested canister can successfully be used for the sampling of small-size PuO 2 powder samples of homogeneous source material, as well as for dried aliquots of plutonium nitrate solutions. (author)

  2. Sampling designs for contaminant temporal trend analyses using sedentary species exemplified by the snails Bellamya aeruginosa and Viviparus viviparus.

    Science.gov (United States)

    Yin, Ge; Danielsson, Sara; Dahlberg, Anna-Karin; Zhou, Yihui; Qiu, Yanling; Nyberg, Elisabeth; Bignert, Anders

    2017-10-01

Environmental monitoring typically assumes samples and sampling activities to be representative of the population being studied. Given a limited budget, an appropriate sampling strategy is essential to support detection of temporal trends of contaminants. In the present study, based on real chemical analysis data on polybrominated diphenyl ethers in snails collected from five subsites in Tianmu Lake, computer simulation was performed to evaluate three sampling strategies by estimating the sample size required to detect an annual change of 5% with a statistical power of 80% and 90% at a significance level of 5%. The results showed that sampling from an arbitrarily selected sampling spot is the worst strategy, requiring many more individual analyses to achieve the above criteria compared with the other two approaches. A fixed sampling site requires the lowest sample size but may not be representative of the intended study object, e.g. a lake, and is also sensitive to changes at that particular sampling site. In contrast, sampling at multiple sites along the shore each year, and using pooled samples when the cost to collect and prepare individual specimens is much lower than the cost of chemical analysis, would be the most robust and cost-efficient strategy in the long run. Using statistical power as the criterion, the results demonstrated quantitatively the consequences of various sampling strategies, and could guide users with respect to the sample sizes required for different sampling designs in long-term monitoring programs. Copyright © 2017 Elsevier Ltd. All rights reserved.
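The power-based sample-size logic can be sketched as follows: simulate monitoring series with a 5% annual decline plus between-year noise, test for a log-linear trend, and count how often the decline is detected. The noise level, the regression test, and the fixed critical value are illustrative assumptions, not the study's actual design.

```python
import math
import random

def trend_power(n_years, annual_change=0.05, cv=0.3, reps=400, rng=None):
    """Fraction of simulated monitoring series in which a log-linear
    regression detects an `annual_change` decline (two-sided test)."""
    rng = rng or random.Random(5)
    detections = 0
    for _ in range(reps):
        # log concentrations with a true 5% annual decline and lognormal noise
        y = [math.log(100 * (1 - annual_change) ** t) + rng.gauss(0, cv)
             for t in range(n_years)]
        x = list(range(n_years))
        mx, my = sum(x) / n_years, sum(y) / n_years
        sxx = sum((xi - mx) ** 2 for xi in x)
        slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
        resid = [yi - my - slope * (xi - mx) for xi, yi in zip(x, y)]
        se = math.sqrt(sum(r * r for r in resid) / (n_years - 2) / sxx)
        if abs(slope / se) > 2.0:  # approximate t critical value at alpha = 5%
            detections += 1
    return detections / reps

p10, p20 = trend_power(10), trend_power(20)  # power grows with series length
```

Running such simulations while varying the number of years, specimens per year, and pooling scheme is how the required sample sizes in the record can be compared across sampling strategies.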

  3. Design of sample analysis device for iodine adsorption efficiency test in NPPs

    International Nuclear Information System (INIS)

    Ji Jinnan

    2015-01-01

In nuclear power plants, the iodine adsorption efficiency test is used to check the iodine adsorption efficiency of the iodine adsorber. The iodine adsorption efficiency can be calculated through analysis of the test sample, which determines whether the performance of the adsorber meets the requirements for equipment operation and emission. Considering the test process and actual demand, a special device for the analysis of this kind of test sample is designed in this paper. Application shows that the device offers convenient operation, high reliability and accurate calculation, improves experiment efficiency and reduces experiment risk. (author)

  4. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    Energy Technology Data Exchange (ETDEWEB)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  5. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    International Nuclear Information System (INIS)

    Shine, E. P.; Poirier, M. R.

    2013-01-01

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  6. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.
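The concern about shrinking test portions can be made concrete with Gy's fundamental sampling error relation, in which the relative variance of the error scales with the cube of the top particle size divided by the test-portion mass. The lumped constant below stands in for Gy's shape, liberation and composition factors and is purely illustrative.

```python
import math

def fundamental_error_rsd(d_cm, mass_g, c_factor=0.5):
    """Relative standard deviation (as a fraction) of Gy's fundamental
    sampling error: rel. variance = C * d^3 / m, with top particle size
    d_cm (cm) and test-portion mass m (g). C lumps Gy's shape, liberation
    and composition factors (g/cm^3); the value here is illustrative."""
    return math.sqrt(c_factor * d_cm ** 3 / mass_g)

# shrinking the test portion from 10 g to 100 mg at the same particle size
rsd_10g = fundamental_error_rsd(0.1, 10.0)   # ~0.7% relative error
rsd_100mg = fundamental_error_rsd(0.1, 0.1)  # ~7% relative error
```

A 100-fold mass reduction inflates the fundamental error tenfold unless the sample is comminuted first, which is exactly why 100 mg test portions demand validated sample-processing steps.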

  7. Field Investigation Plan for 1301-N and 1325-N Facilities Sampling to Support Remedial Design

    International Nuclear Information System (INIS)

    Weiss, S. G.

    1998-01-01

This field investigation plan (FIP) provides for the sampling and analysis activities supporting the remedial design planning for the planned removal action for the 1301-N and 1325-N Liquid Waste Disposal Facilities (LWDFs), which are treatment, storage, and disposal (TSD) units (cribs/trenches). The planned removal action involves excavation, transportation, and disposal of contaminated material at the Environmental Restoration Disposal Facility (ERDF). An engineering study (BHI 1997) was performed to develop and evaluate various options that are predominantly influenced by the volume of high- and low-activity contaminated soil requiring removal. The study recommended that additional sampling be performed to supplement historical data for use in the remedial design

  8. The design of high-temperature thermal conductivity measurements apparatus for thin sample size

    Directory of Open Access Journals (Sweden)

    Hadi Syamsul

    2017-01-01

Full Text Available This study presents the design, construction and validation of a thermal conductivity apparatus using the steady-state heat-transfer technique, with the capability of testing a material at high temperatures. The design is an improvement on the ASTM D5470 standard, in which meter-bars of equal cross-sectional area are used to extrapolate surface temperatures and measure heat transfer across a sample. There were two meter-bars in the apparatus, each fitted with three thermocouples. The apparatus uses a 1,000 W heater and cooling water to reach a stable condition. The applied pressure was 3.4 MPa on the 113.09 mm2 meter-bar cross-section, and thermal grease was used to minimize interfacial thermal contact resistance. To determine its performance, the apparatus was validated by comparing the results with thermal conductivities obtained with a THB 500 made by LINSEIS. The tests showed thermal conductivities of 15.28 Wm-1K-1 for stainless steel and 38.01 Wm-1K-1 for bronze, differing from the THB 500 by −2.55% and 2.49% respectively. Furthermore, this apparatus can measure thermal conductivity up to a temperature of 400°C, where the result for stainless steel is 19.21 Wm-1K-1 and the difference was 7.93%.
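The meter-bar calculation implied by the ASTM D5470-style design can be sketched as follows: fit a line through each bar's three thermocouple readings, extrapolate the two sample-surface temperatures, derive the heat flow from the hot-bar gradient, and apply Fourier's law across the sample. Only the 113.09 mm2 cross-section comes from the record; the geometry, bar material, and readings below are invented for illustration and are chosen to yield a stainless-steel-like value near the record's 15.28 Wm-1K-1.

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept through thermocouple readings."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

area = 113.09e-6   # meter-bar cross-section from the record, m^2
k_bar = 385.0      # assumed meter-bar material (copper), W/(m K)
t_sample = 0.002   # assumed sample thickness, m; sample spans x = 0.030..0.032 m

# invented thermocouple positions (m) and readings (deg C) along each bar
hot_x, hot_T = [0.005, 0.015, 0.025], [90.0, 88.0, 86.0]
cold_x, cold_T = [0.037, 0.047, 0.057], [74.0, 72.0, 70.0]

s_hot, b_hot = fit_line(hot_x, hot_T)
s_cold, b_cold = fit_line(cold_x, cold_T)

q = -k_bar * area * s_hot            # Fourier's law on the hot bar, W
T_hot = b_hot + s_hot * 0.030        # extrapolated hot sample surface
T_cold = b_cold + s_cold * 0.032     # extrapolated cold sample surface
k_sample = q * t_sample / (area * (T_hot - T_cold))  # sample conductivity, W/(m K)
```

Because the meter-bars have the same cross-section, the area cancels in the final expression; in practice it is retained because the measured heat flow and contact resistances are evaluated per unit area.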

  9. Development of sample assay system equipped with 3He Alternative Neutron Detectors (ASAS). (2) Results of ASAS measurement test

    International Nuclear Information System (INIS)

    Tanigawa, Masafumi; Mukai, Yasunobu; Kurita, Tsutomu; Makino, Risa; Nakamura, Hironobu; Tobita, Hiroshi; Ohzu, Akira; Kureta, Masatoshi; Seya, Michio

    2015-01-01

Against the background of the serious shortage of 3 He gas, a new detector equipped with ZnS/ 10 B 2 O 3 ceramic scintillation neutron detectors was designed and developed in JAEA with the support of the government (the Ministry of Education, Culture, Sports, Science and Technology). The design of the alternative 3 He detector is based on INVS (INVentory Sample assay system, an HLNCC (High Level Neutron Coincidence Counter) type system), which is being used for the verification of MOX powder etc., and it is named ASAS (Alternative Sample Assay System). In order to prove its Pu quantification performance as an alternative technology, several measurement tests and a comparison test with INVS were conducted using ASAS. In these tests, the fundamental performance (counting efficiency and die-away time) and the measurement uncertainties were evaluated. As a result, although the fundamental performance of ASAS did not reach that of INVS, we could confirm that ASAS has almost the same Pu quantification performance, including measurement uncertainty, as INVS. (author)

  10. Gamma ray spectrometry results from core samples collected for RESUME 95

    International Nuclear Information System (INIS)

    Sanderson, D.C.W.; Allyson, J.D.; Toivonen, H.; Honkamaa, T.

    1997-01-01

Field sampling of an airfield at Vesivehmaa, near Vääksy, Finland (Area I) was carried out between 26-29 May 1995, to establish the radionuclide deposition and inventory of Chernobyl-derived 137 Cs and natural radionuclides. The objective was to establish a common calibration site for in-situ and airborne gamma spectrometers for Exercise RESUME 95, conducted in August 1995. The report presents the sampling details, handling and treatment. The analyses are discussed with particular emphasis on the 137 Cs, 134 Cs, 40 K, 214 Bi and 208 radionuclides, and the quantification of their respective deposition and inventories. The results have been used to estimate the effective concentrations of nuclides at the calibration site for in-situ and airborne gamma spectrometry, and the depth distribution. For 137 Cs the weighted mean activity per unit area takes on values of 50.7±5.2 kBq m -2 at 1 m ground clearance, 51.1±6.9 kBq m -2 at 50 m height and 47.9±8.5 kBq m -2 at 100 m. The similarity of these values confirms the suitability of the Vesivehmaa site for comparison of in-situ and airborne results despite variations of a factor of two between results from individual cores. The mean α/ρ value for 137 Cs in Area I is 0.77±0.10 cm 2 g -1 (relaxation mass per unit area, β 1.31±0.15 gcm -2 ). Additional soil sampling across parts of Area II (a 6x3 km area selected for mapping Chernobyl deposition) was carried out. The mean level of 137 Cs activity from these samples was 92.4±63 kBq m -2 , a sample taken near Laihansuo showing the largest value obtained at 172 kBq m -2 . (EG)
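Combining deposition estimates from individual cores of differing uncertainty into a single site value, as in the weighted means quoted above, is commonly done with an inverse-variance weighted mean. A minimal sketch follows; the per-core values and uncertainties are invented for illustration and are not the report's data.

```python
import math

def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean and its standard uncertainty."""
    w = [1 / s ** 2 for s in sigmas]
    mean = sum(wi * v for wi, v in zip(w, values)) / sum(w)
    return mean, math.sqrt(1 / sum(w))

# hypothetical per-core 137Cs deposition estimates (kBq/m^2) with 1-sigma errors
vals = [48.0, 52.0, 55.0, 47.0]
sigs = [5.0, 6.0, 7.0, 5.0]
m, u = weighted_mean(vals, sigs)  # combined site estimate and its uncertainty
```

The combined uncertainty is always smaller than the best single core's, which is why a weighted site mean can be quoted with a few kBq/m2 even when individual cores vary by a factor of two.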

  11. Results-Based Organization Design for Technology Entrepreneurs

    Directory of Open Access Journals (Sweden)

    Chris McPhee

    2012-05-01

Full Text Available Faced with considerable uncertainty, entrepreneurs would benefit from clearly defined objectives, a plan to achieve these objectives (including a reasonable expectation that this plan will work), and a means to measure progress and make requisite course corrections. In this article, the author combines the benefits of results-based management with the benefits of organization design to describe a practical approach that technology entrepreneurs can use to design their organizations so that they deliver desired outcomes. This approach links insights from theory and practice, builds logical connections between entrepreneurial activities and desired outcomes, and measures progress toward those outcomes. This approach also provides a mechanism for entrepreneurs to make continual adjustments and improvements to their design and direction in response to data, customer and stakeholder feedback, and changes in their business environment.

  12. Design of modified annulus air sampling system for the detection of leakage in waste transfer line

    International Nuclear Information System (INIS)

    Deokar, U.V; Khot, A.R.; Mathew, P.; Ganesh, G.; Tripathi, R.M.; Srivastava, Srishti

    2018-01-01

Various liquid waste streams are generated during the operation of a reprocessing plant. The High Level (HL), Intermediate Level (IL) and Low Level (LL) liquid wastes generated are transferred from the reprocessing plant to the Waste Management Facility. These waste streams are transferred through pipe-in-pipe lines along a shielded concrete trench. To detect radioactive leakage from the primary waste transfer line into the secondary line, the annulus air between the two pipes is sampled. The currently installed pressurized annulus air sampling system has no provision for online leakage detection; hence, there are chances of personnel exposure and airborne activity in the working area. To overcome these design flaws, a free-air-flow modified online annulus air sampling system with more safety features was designed

  13. Influence of a sampling review process for radiation oncology quality assurance in cooperative group clinical trials -- results of the Radiation Therapy Oncology Group (RTOG) analysis

    International Nuclear Information System (INIS)

    Martin, Linda A.; Krall, John M.; Curran, Walter J.; Leibel, Steven A.; Cox, James D.

    1995-01-01

    The Radiation Therapy Oncology Group (RTOG) designed a random sampling process and observed its influence upon radiotherapy review mechanisms in cooperative group clinical trials. The method of sampling cases for review was modeled on sampling techniques commonly used in pharmaceutical quality assurance programs, and applied to the initial (on-study) review of protocol cases. 'In control' (IC) status is defined for a given facility as the ability to meet minimum compliance standards. Upon achieving IC status, activation of the sampling process was linked to the rate of continued patient accrual for each participating institution in a given protocol. The sampling design specified that a noncompliance rate of ≥ 30% would be detected with 80% power. A total of 458 cases was analyzed for initial review findings in four RTOG Phase III protocols. Initial review findings were compared with retrospective (final) review results. Of the 458 cases analyzed, 370 underwent initial review at on-study, while 88 did not require review because they were enrolled from institutions that had demonstrated protocol compliance. In the group that had both initial and final review, 345/370 (93%) were found to have followed the protocol or had a minor variation. Of the exempted cases, 79/88 (90%) were found to be per protocol or a minor variant. The sampling process proved cost-effective and resulted in a noticeable reduction in workload, thus providing an improved approach to resource allocation for the group. Continued evaluation of the sampling mechanism is appropriate as study designs and participants vary over time, and as more data become available to study. Further investigation of individual protocol compliance is appropriate to identify problems specific to new trial investigations
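
The quoted power requirement — detecting a facility whose noncompliance rate is ≥ 30% with 80% power — can be illustrated with a simple zero-acceptance binomial plan (an illustrative sketch only; the abstract does not give RTOG's exact acceptance rule, and `min_sample_size` is a hypothetical name):

```python
def min_sample_size(p_noncompliant: float, power: float) -> int:
    """Smallest n such that, sampling cases independently, at least one
    noncompliant case is observed with probability >= power."""
    n = 1
    # P(at least one noncompliant case in n reviews) = 1 - (1 - p)^n
    while 1.0 - (1.0 - p_noncompliant) ** n < power:
        n += 1
    return n

# Under this simple "reject on first noncompliant case" rule, a 30%
# noncompliance rate is detected with 80% power by only a few reviews.
n_needed = min_sample_size(0.30, 0.80)
```

Real acceptance-sampling plans usually allow a small number of noncompliant cases before rejecting, which raises the required sample size; the zero-acceptance case above just shows the shape of the calculation.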

  14. Clean Sampling of an Englacial Conduit at Blood Falls, Antarctica - Some Experimental and Numerical Results

    Science.gov (United States)

    Kowalski, Julia; Francke, Gero; Feldmann, Marco; Espe, Clemens; Heinen, Dirk; Digel, Ilya; Clemens, Joachim; Schüller, Kai; Mikucki, Jill; Tulaczyk, Slawek M.; Pettit, Erin; Berry Lyons, W.; Dachwald, Bernd

    2017-04-01

    There is significant interest in sampling subglacial environments for geochemical and microbiological studies, yet those environments are typically difficult to access. Existing ice-drilling technologies make it cumbersome to maintain microbiologically clean access for sample acquisition and environmental stewardship of potentially fragile subglacial aquatic ecosystems. With the "IceMole", a minimally invasive, maneuverable subsurface ice probe, we have developed a clean glacial exploration technology for in-situ analysis and sampling of glacial ice and sub- and englacial materials. Its design is based on combining melting and mechanical stabilization, using an ice screw at the tip of the melting head to maintain firm contact between the melting head and the ice. The IceMole can change its melting direction by differential heating of the melting head and optional side-wall heaters. Downward, horizontal and upward melting, as well as curve driving and penetration of particulate-laden layers, have already been demonstrated in several field tests. This maneuverability of the IceMole also necessitates a sophisticated on-board navigation system capable of autonomous operations. Therefore, between 2012 and 2014, a more advanced probe was developed as part of the "Enceladus Explorer" (EnEx) project. The EnEx-IceMole offers systems for accurate positioning based on in-ice attitude determination, acoustic positioning, and ultrasonic obstacle and target detection, all integrated through a high-level sensor fusion algorithm. In December 2014, the EnEx-IceMole was used for clean access into a unique subglacial aquatic environment at Blood Falls, Antarctica, where an englacial brine sample was successfully obtained after about 17 meters of oblique melting. Particular attention was paid to clean protocols for sampling for geochemical and microbiological analysis. In this contribution, we will describe the general technological approach of the IceMole and report on the

  15. Summary of Test Results for Daya Bay Rock Samples

    International Nuclear Information System (INIS)

    Onishi, Celia Tiemi; Dobson, Patrick; Nakagawa, Seiji

    2004-01-01

    A series of analytical tests was conducted on a suite of granitic rock samples from the Daya Bay region of southeast China. The objective of these analyses was to determine key rock properties that would affect the suitability of this location for the siting of a neutrino oscillation experiment. This report contains the results of chemical analyses, rock property measurements, and a calculation of the mean atomic weight

  16. Coupling methods for multistage sampling

    OpenAIRE

    Chauvet, Guillaume

    2015-01-01

    Multistage sampling is commonly used for household surveys when there exists no sampling frame, or when the population is scattered over a wide area. Multistage sampling usually introduces a complex dependence in the selection of the final units, which makes asymptotic results quite difficult to prove. In this work, we consider multistage sampling with simple random sampling without replacement at the first stage, and with an arbitrary sampling design for further stages. We consider coupling ...
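
The design described — simple random sampling without replacement (SRSWOR) of primary units, then a further design within each — can be sketched as follows (illustrative only; `two_stage_sample` and the village/household frame are my own invented names, and plain SRSWOR stands in for the "arbitrary" second-stage design):

```python
import random

def two_stage_sample(clusters, n_clusters, n_units, seed=None):
    """Stage 1: SRS without replacement of primary units (clusters).
    Stage 2: SRS without replacement within each selected cluster,
    standing in for an arbitrary second-stage design."""
    rng = random.Random(seed)
    chosen = rng.sample(sorted(clusters), n_clusters)
    return {c: rng.sample(clusters[c], min(n_units, len(clusters[c])))
            for c in chosen}

# Hypothetical frame: 50 villages of 20 households each.
frame = {f"village_{i}": [f"hh_{i}_{j}" for j in range(20)] for i in range(50)}
sample = two_stage_sample(frame, n_clusters=5, n_units=4, seed=42)
```

The dependence the abstract mentions comes from the fact that second-stage draws only happen inside the clusters selected at stage 1, so final units are not selected independently of one another.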

  17. Results of Fall 1994 sampling of gunite and associated tanks at the Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    1995-06-01

    This Technical Memorandum was developed under Work Breakdown Structure 1.4.12.6.1.01.41.12.02.11 (Activity Data Sheet 3301, "WAG 1"). This document provides the Environmental Restoration Program with analytical results from liquid and sludge samples from the Gunite and Associated Tanks (GAAT). Information provided in this report forms part of the technical basis for criticality safety, systems safety, engineering design, and waste management as they apply to the GAAT treatability study and remediation

  18. Software documentation and user's manual for fish-impingement sampling design and estimation method computer programs

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-11-01

    This report contains a description of three computer programs that implement the theory of sampling designs and the methods for estimating fish-impingement at the cooling-water intakes of nuclear power plants as described in companion report ANL/ES-60. Complete FORTRAN listings of these programs, named SAMPLE, ESTIMA, and SIZECO, are given and augmented with examples of how they are used

  19. Sample preparation for the HAW project and experimental results from the HFR

    International Nuclear Information System (INIS)

    Garcia Celma, A.; Wees, H. van; Miralles, L.

    1990-09-01

    This report deals with the preparation and analysis of samples, during the period May 1989-November 1989, for the High-Active Waste (HAW) project, a large-scale in situ test being performed underground in the Asse salt mine, Remlingen, FRG. The development of the required technical procedures, and the scientific results, which mostly concern the characterization of the Potasas del Llobregat sample, are reported. Prior to use in both the HAW and HFR experiments, the samples have to be machined to fit their holders. Technical improvements for machining salt samples are reported. (H.W.). 9 refs.; 68 figs.; 10 tabs

  20. Experimental study of glass sampling devices

    International Nuclear Information System (INIS)

    Jouan, A.; Moncouyoux, J.P.; Meyere, A.

    1992-01-01

    Two high-level liquid waste containment glass sampling systems have been designed and built. The first device fits entirely inside a standard glass storage canister, and may thus be used in facilities not initially designed for this function. It has been tested successfully in the nonradioactive prototype unit at Marcoule. The work primarily covered the design and construction of an articulated arm supporting the sampling vessel, and the mechanisms necessary for filling the vessel and recovering the sample. System actuation and operation are fully automatic, and the resulting sample is representative of the glass melt. Implementation of the device is delicate however, and its reliability is estimated at about 75%. A second device was designed specifically for new vitrification facilities. It is installed directly on the glass melting furnace, and meets process operating and quality control requirements. Tests conducted at the Marcoule prototype vitrification facility demonstrated the feasibility of the system. Special attention was given to the sampling vessel transfer mechanisms, with two filling and controlled sample cooling options

  1. Analysing designed experiments in distance sampling

    Science.gov (United States)

    Stephen T. Buckland; Robin E. Russell; Brett G. Dickson; Victoria A. Saab; Donal N. Gorman; William M. Block

    2009-01-01

    Distance sampling is a survey technique for estimating the abundance or density of wild animal populations. Detection probabilities of animals inherently differ by species, age class, habitats, or sex. By incorporating the change in an observer's ability to detect a particular class of animals as a function of distance, distance sampling leads to density estimates...

  2. Waste tank vapor project: Vapor space characterization of waste tank 241-BY-104: Results from samples collected on June 24, 1994

    International Nuclear Information System (INIS)

    Clauss, T.W.; Ligotke, M.W.; McVeety, B.D.; Pool, K.H.; Lucke, R.B.; Fruchter, J.S.; Goheen, S.C.

    1994-11-01

    This report describes results of the analyses of tank-headspace samples taken from Hanford waste Tank 241-BY-104 (referred to as Tank BY-104) on June 24, 1994. The Pacific Northwest Laboratory (PNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and analyze inorganic and organic samples collected from the tank headspace. The sample job was designated S4019 and was performed by WHC on June 24, 1994 using the vapor sampling system (VSS). The results of the analyses are expected to be used in the determination of safety and toxicological issues related to the tank-headspace gas as described in the WHC report entitled Data Quality Objectives for Generic In-Tank Health and Safety Vapor Issue Resolution, WHC-SD-WM-DQO-002, Rev. 0. Sampling devices, including 16 sorbent trains (for inorganic analyses), and 5 SUMMA trademark canisters (for organic analyses), were supplied to the WHC sampling staff on June 20, 1994. Samples were taken (by WHC) on June 24. The samples were returned from the field on June 27. The inorganic samples delivered to PNL on chain-of-custody (COC) 006893 included 16 sorbent trains as described in Tables 2.2, 2.3, and 2.4. Additional inorganic blank spikes were obtained from related sample jobs. SUMMA trademark samples delivered to PNL on COC 006896 included one ambient air sample, one ambient-air sample through the sampling system, and three tank-headspace SUMMA trademark canister samples. The samples were inspected upon delivery to the 326/23B laboratory and logged into PNL laboratory record book 55408. Custody of the sorbent trains was transferred to PNL personnel performing the inorganic analysis and stored at refrigerated (≤10 degrees C) temperature until the time of analysis. Access to the 326/23B laboratory is limited to PNL personnel working on the waste-tank safety program

  3. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    Science.gov (United States)

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative for accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
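
The sampling rule described — one uniform random start, then equidistant sites across the structure — can be sketched in one dimension (an illustrative stand-in for the stepping a motorized stage, or the authors' software, would perform; the function name is hypothetical):

```python
import random

def systematic_random_sites(extent: float, interval: float, seed=None):
    """One uniform random start in [0, interval), then sampling sites at
    equidistant steps until the end of the structure is reached."""
    rng = random.Random(seed)
    start = rng.uniform(0.0, interval)
    sites = []
    x = start
    while x < extent:
        sites.append(x)
        x += interval  # pre-determined, equidistant spacing
    return sites

# e.g. a 100-um section sampled every 10 um from a random offset
sites = systematic_random_sites(extent=100.0, interval=10.0, seed=7)
```

Because only the start is random, every point of the structure has the same inclusion probability, which is what makes systematic random sampling an unbiased and efficient design.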

  4. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  5. Sampling Lesbian, Gay, and Bisexual Populations

    Science.gov (United States)

    Meyer, Ilan H.; Wilson, Patrick A.

    2009-01-01

    Sampling has been the single most influential component of conducting research with lesbian, gay, and bisexual (LGB) populations. Poor sampling designs can result in biased results that will mislead other researchers, policymakers, and practitioners. Investigators wishing to study LGB populations must therefore devote significant energy and…

  6. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...

  7. Gamma ray spectrometry results from core samples collected for RESUME 95

    Energy Technology Data Exchange (ETDEWEB)

    Sanderson, D.C.W.; Allyson, J.D. [SURRC, East Kilbride, Scotland (United Kingdom); Toivonen, H.; Honkamaa, T. [STUK, Helsinki (Finland)

    1997-12-31

    Field sampling of an airfield at Vesivehmaa, near Vääksy, Finland (Area I) was carried out between 26-29 May 1995, to establish the radionuclide deposition and inventory of Chernobyl-derived {sup 137}Cs and natural radionuclides. The objective was to establish a common calibration site for in-situ and airborne gamma spectrometers for Exercise RESUME 95, conducted in August 1995. The report presents the sampling details, handling and treatment. The analyses are discussed with particular emphasis given to the {sup 137}Cs, {sup 134}Cs, {sup 40}K, {sup 214}Bi and {sup 208} radionuclides, and the quantification of their respective deposition and inventories. The results have been used to estimate the effective concentrations of nuclides at the calibration site for in-situ and airborne gamma spectrometry, and the depth distribution. For {sup 137}Cs the weighted mean activity per unit area takes on values of 50.7{+-}5.2 kBq m{sup -2} at 1 m ground clearance, 51.1{+-}6.9 kBq m{sup -2} at 50 m height and 47.9{+-}8.5 kBq m{sup -2} at 100 m. The similarity of these values confirms the suitability of the Vesivehmaa site for comparison of in-situ and airborne results despite variations of a factor of two between results from individual cores. The mean {alpha}/{rho} value for {sup 137}Cs in Area I is 0.77{+-}0.10 cm{sup 2}g{sup -1} (relaxation mass per unit area, {beta} 1.31{+-}0.15 gcm{sup -2}). Additional soil sampling across parts of Area II (a 6x3 km area selected for mapping Chernobyl deposition) was carried out. The mean level of {sup 137}Cs activity from these samples was 92.4{+-}63 kBq m{sup -2}, a sample taken near Laihansuo showing the largest value obtained at 172 kBq m{sup -2}. (EG). 17 refs.
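
A weighted mean with an uncertainty, like the per-height 137Cs values quoted above, is commonly formed as an inverse-variance weighted mean of the individual measurements (a generic sketch; the report's actual weighting scheme is not stated in the abstract, and the function name is my own):

```python
def inverse_variance_mean(values, sigmas):
    """Weighted mean with weights 1/sigma^2, and its standard error.
    values: individual measurements; sigmas: their 1-sigma uncertainties."""
    weights = [1.0 / s ** 2 for s in sigmas]
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    return mean, total ** -0.5

# e.g. combining the three 137Cs results (kBq/m^2) at 1 m, 50 m and 100 m
mean, se = inverse_variance_mean([50.7, 51.1, 47.9], [5.2, 6.9, 8.5])
```

More precise measurements (smaller sigma) dominate the combined value, and the combined standard error is always smaller than the smallest individual uncertainty.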

  9. Urine sample collection protocols for bioassay samples

    Energy Technology Data Exchange (ETDEWEB)

    MacLellan, J.A.; McFadden, K.M.

    1992-11-01

    In vitro radiobioassay analyses are used to measure the amount of radioactive material excreted by personnel exposed to the potential intake of radioactive material. The analytical results are then used with various metabolic models to estimate the amount of radioactive material in the subject's body and the original intake of radioactive material. Proper application of these metabolic models requires knowledge of the excretion period. It is normal practice to design the bioassay program based on a 24-hour excretion sample. The Hanford bioassay program simulates a total 24-hour urine excretion sample with urine collection periods lasting from one-half hour before retiring to one-half hour after rising on two consecutive days. Urine passed during the specified periods is collected in three 1-L bottles. Because the daily excretion volume given in Publication 23 of the International Commission on Radiological Protection (ICRP 1975, p. 354) for Reference Man is 1.4 L, it was proposed to use only two 1-L bottles as a cost-saving measure. This raised the broader question of what should be the design capacity of a 24-hour urine sample kit.

  11. REXEBIS, design and initial commissioning results

    CERN Document Server

    Wenander, F; Jonson, B; Liljeby, L; Nyman, G H; Rensfelt, K G; Skeppstedt, Ö; Wolf, B

    2001-01-01

    REXEBIS is an Electron Beam Ion Source (EBIS) developed particularly for charge breeding of rare and short-lived isotopes produced at ISOLDE for the REX-ISOLDE post-accelerator. Bunches of singly charged radioactive ions are injected into the EBIS and charge bred to a charge-to-mass ratio of approximately 1/4, and thereafter extracted and injected into a short LINAC. This novel concept, employing a Penning trap to bunch and cool the ions from an on-line mass separator prior to charge breeding in an EBIS, results in an efficient and compact system. In this article the final REXEBIS design is presented together with results from the first tests. (19 refs).

  12. Design of an automatic sample changer for the measurement of neutron flux by gamma spectrometry

    International Nuclear Information System (INIS)

    Gago, Javier; Bruna, Ruben; Baltuano, Oscar; Montoya, Eduardo; Descreaux, Killian

    2014-01-01

    This paper presents the calculations, component selection, and design for the construction of an automatic system to measure the neutron flux in an operating nuclear reactor by gamma spectrometry, using samples irradiated in the RP-10 core. The system will interchange and measure 100 samples in a programmed, automatic way, reducing the operator's time and yielding more accurate measurements. (authors).

  13. SOLVENT HOLD TANK SAMPLE RESULTS FOR MCU-13-189, MCU-13-190, AND MCU-13-191: QUARTERLY SAMPLE FROM SEPTEMBER 2013

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F.; Taylor-Pashow, K.

    2013-10-31

    Savannah River National Laboratory (SRNL) analyzed solvent samples from the Modular Caustic-Side Solvent Extraction Unit (MCU) in support of continuing operations. A quarterly analysis of the solvent is required to maintain the solvent composition within specifications. Analytical results for the Solvent Hold Tank (SHT) samples MCU-13-189, MCU-13-190, and MCU-13-191, received on September 4, 2013, are reported. The results show that the solvent (the remaining heel in the SHT) at MCU contains excess Isopar L and deficit concentrations of modifier and trioctylamine (TOA) when compared to the standard MCU solvent. As with the previous solvent sample results, these analyses indicate that the solvent does not require Isopar L trimming at this time. Since MCU is switching to NGS, there is no need to add TOA or modifier. SRNL also analyzed the SHT sample for {sup 137}Cs content and determined the measured value is within tolerance and has returned to levels observed in 2011.

  14. Results For The Third Quarter 2010 Tank 50 WAC Slurry Sample: Chemical And Radionuclide Contaminant Results

    International Nuclear Information System (INIS)

    Reigel, M.; Bibler, N.

    2010-01-01

    This report details the chemical and radionuclide contaminant results for the characterization of the 2010 Third Quarter sampling of Tank 50 for the Saltstone Waste Acceptance Criteria (WAC). Information from this characterization will be used by Liquid Waste Operations (LWO) to support the transfer of low-level aqueous waste from Tank 50 to the Salt Feed Tank in the Saltstone Facility in Z-Area, where the waste will be immobilized. This information is also used to update the Tank 50 Waste Characterization System. The following conclusions are drawn from the analytical results provided in this report: (i) The concentrations of the reported chemical and radioactive contaminants were less than their respective WAC targets or limits unless noted in this section. (ii) The reported detection limits for 94Nb, 247Cm and 249Cf are above the requested limits from Reference 4. However, they are below the limits established in Reference 3. (iii) The reported detection limit for 242mAm is greater than the requested limit from Attachment 8.4 of the WAC. (iv) The reported detection limit for Isopar L is greater than the limit from Table 3 of the WAC. (v) The reported concentration of isopropanol is greater than the limit from Table 4 of the WAC. (vi) Isopar L and Norpar 13 have limited solubility in aqueous solutions, making it difficult to obtain consistent and reliable sub-samples. The values reported in this memo are the concentrations in the sub-sample as detected by the GC/MS; however, the results may not accurately represent the concentrations of the analytes in Tank 50.

  15. Sampling Key Populations for HIV Surveillance: Results From Eight Cross-Sectional Studies Using Respondent-Driven Sampling and Venue-Based Snowball Sampling.

    Science.gov (United States)

    Rao, Amrita; Stahlman, Shauna; Hargreaves, James; Weir, Sharon; Edwards, Jessie; Rice, Brian; Kochelani, Duncan; Mavimbela, Mpumelelo; Baral, Stefan

    2017-10-20

    In using regularly collected or existing surveillance data to characterize engagement in human immunodeficiency virus (HIV) services among marginalized populations, differences in sampling methods may produce different pictures of the target population and may therefore result in different priorities for response. The objective of this study was to use existing data to evaluate the sample distribution of eight studies of female sex workers (FSW) and men who have sex with men (MSM), who were recruited using different sampling approaches in two locations within Sub-Saharan Africa: Manzini, Swaziland and Yaoundé, Cameroon. MSM and FSW participants were recruited using either respondent-driven sampling (RDS) or venue-based snowball sampling. Recruitment took place between 2011 and 2016. Participants at each study site were administered a face-to-face survey to assess sociodemographics, along with the prevalence of self-reported HIV status, frequency of HIV testing, stigma, and other HIV-related characteristics. Crude and RDS-adjusted prevalence estimates were calculated. Crude prevalence estimates from the venue-based snowball samples were compared with the overlap of the RDS-adjusted prevalence estimates, between both FSW and MSM in Cameroon and Swaziland. RDS samples tended to be younger (MSM aged 18-21 years in Swaziland: 47.6% [139/310] in RDS vs 24.3% [42/173] in Snowball, in Cameroon: 47.9% [99/306] in RDS vs 20.1% [52/259] in Snowball; FSW aged 18-21 years in Swaziland 42.5% [82/325] in RDS vs 8.0% [20/249] in Snowball; in Cameroon 15.6% [75/576] in RDS vs 8.1% [25/306] in Snowball). 
They were less educated (MSM: primary school completed or less in Swaziland 42.6% [109/310] in RDS vs 4.0% [7/173] in Snowball, in Cameroon 46.2% [138/306] in RDS vs 14.3% [37/259] in Snowball; FSW: primary school completed or less in Swaziland 86.6% [281/325] in RDS vs 23.9% [59/247] in Snowball, in Cameroon 87.4% [520/576] in RDS vs 77.5% [238/307] in Snowball) than the snowball samples.
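
A common way to produce the "RDS-adjusted" prevalence estimates mentioned above is the RDS-II (Volz-Heckathorn) estimator, which weights each respondent by the inverse of their reported network size; a minimal sketch (the studies' exact adjustment may differ, and the function name is my own):

```python
def rds_ii_prevalence(outcomes, degrees):
    """RDS-II estimator: inverse-degree weighted prevalence.
    outcomes: 0/1 indicator per respondent (e.g. self-reported HIV status);
    degrees:  each respondent's reported network size."""
    weights = [1.0 / d for d in degrees]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, outcomes)) / total
```

The intuition is that well-connected respondents are over-sampled by chain referral, so down-weighting them by degree corrects the crude (unweighted) proportion toward the population value.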

  16. Stirling cryocooler test results and design model verification

    International Nuclear Information System (INIS)

    Shimko, M.A.; Stacy, W.D.; McCormick, J.A.

    1990-01-01

    This paper reports on progress in developing a long-life Stirling cycle cryocooler for spaceborne applications. It presents the results from tests on a preliminary breadboard version of the cryocooler used to demonstrate the feasibility of the technology and to validate the regenerator design code used in its development. This machine achieved a cold-end temperature of 65 K while carrying a 1/2 Watt cooling load. The basic machine is a double-acting, flexure-bearing, split Stirling design with linear electromagnetic drives for the expander and compressors. Flat metal diaphragms replace pistons for both sweeping and sealing the machine working volumes. In addition, the double-acting expander couples to a laminar-channel counterflow recuperative heat exchanger for regeneration. A PC-compatible design code was developed for this design approach that calculates regenerator loss including heat transfer irreversibilities, pressure drop, and axial conduction in the regenerator walls

  17. Verification of aseismic design model by using experimental results

    International Nuclear Information System (INIS)

    Mizuno, N.; Sugiyama, N.; Suzuki, T.; Shibata, Y.; Miura, K.; Miyagawa, N.

    1985-01-01

    A lattice model is applied as the analysis model for the aseismic design of the Hamaoka nuclear reactor building. To verify the validity of this design model, two reinforced concrete blocks were constructed on the ground and forced-vibration tests were carried out. The test results are reproduced well by simulation analyses using the lattice model. The damping value of the ground obtained from the tests is more conservative than the design value. (orig.)

  18. Rheology and TIC/TOC results of ORNL tank samples

    International Nuclear Information System (INIS)

    Pareizs, J. M.; Hansen, E. K.

    2013-01-01

    The Savannah River National Laboratory (SRNL) was requested by Oak Ridge National Laboratory (ORNL) to perform total inorganic carbon (TIC), total organic carbon (TOC), and rheological measurements on several Oak Ridge tank samples. As-received slurry samples were diluted and submitted to SRNL-Analytical for TIC and TOC analyses. The settled-solids yield stress (also known as settled shear strength) of the as-received settled sludge samples was determined using the vane method; these measurements were obtained 24 hours after the samples were allowed to settle undisturbed. Rheological or flow properties (Bingham plastic viscosity and Bingham plastic yield stress) were determined from flow curves of the homogenized, well-mixed samples. Samples at other targeted total suspended solids (TSS) concentrations were also analyzed for flow properties; these samples were obtained by diluting the as-received sample with de-ionized (DI) water
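
The Bingham plastic parameters mentioned (yield stress and plastic viscosity) come from a straight-line fit of the flow curve, tau = tau0 + mu_p * gamma_dot; a minimal least-squares sketch (stdlib only, hypothetical function name, not SRNL's actual analysis code):

```python
def bingham_fit(shear_rates, shear_stresses):
    """Ordinary least-squares fit of the Bingham plastic model
    tau = tau0 + mu_p * gamma_dot to a measured flow curve."""
    n = len(shear_rates)
    mx = sum(shear_rates) / n
    my = sum(shear_stresses) / n
    sxx = sum((x - mx) ** 2 for x in shear_rates)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(shear_rates, shear_stresses))
    mu_p = sxy / sxx            # Bingham plastic viscosity (Pa*s)
    tau0 = my - mu_p * mx       # Bingham yield stress (Pa), the intercept
    return tau0, mu_p
```

The slope of the fitted line is the plastic viscosity and the extrapolated zero-shear-rate intercept is the Bingham yield stress; in practice the fit is usually restricted to the high-shear, linear portion of the curve.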

  19. Optimizing sampling design to deal with mist-net avoidance in Amazonian birds and bats.

    Directory of Open Access Journals (Sweden)

    João Tiago Marques

    Full Text Available Mist netting is a widely used technique to sample bird and bat assemblages. However, captures often decline with time because animals learn and avoid the locations of nets. This avoidance or net shyness can substantially decrease sampling efficiency. We quantified the day-to-day decline in captures of Amazonian birds and bats with mist nets set at the same location for four consecutive days. We also evaluated how net avoidance influences the efficiency of surveys under different logistic scenarios using re-sampling techniques. Net avoidance caused substantial declines in bird and bat captures, although it was more accentuated in the latter. Most of the decline occurred between the first and second days of netting: 28% in birds and 47% in bats. Captures of commoner species were more affected. The numbers of species detected also declined. Moving nets daily to minimize the avoidance effect increased captures by 30% in birds and 70% in bats. However, moving the location of nets may cause a reduction in netting time and captures. When moving the nets caused the loss of one netting day, it was no longer advantageous to move the nets frequently; in bird surveys, that could even decrease the number of individuals captured and species detected. Net avoidance can greatly affect sampling efficiency, but adjustments in survey design can minimize this effect. Whenever nets can be moved without losing netting time and the objective is to capture many individuals, they should be moved daily. If the main objective is to survey the species present, then nets should still be moved for bats, but not for birds. However, if relocating nets causes a significant loss of netting time, moving them to reduce the effects of shyness will not improve sampling efficiency in either group. Overall, our findings can improve the design of mist netting sampling strategies in other tropical areas.
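
The trade-off this abstract describes (moving nets restores day-one capture rates, unless relocation costs netting time) can be sketched as a toy calculation. The capture numbers below are invented for illustration, loosely echoing the ~28% day-two decline reported for birds; they are not data from the paper:

```python
def compare_designs(day_captures, days_available, move_cost_days):
    """Expected captures under two designs.
    stay: nets left in place, so net shyness reduces captures each day.
    move: nets relocated daily, so every day restarts at day-1 rates,
    at the cost of `move_cost_days` of forfeited netting time."""
    stay = sum(day_captures[:days_available])
    move = day_captures[0] * max(0, days_available - move_cost_days)
    return stay, move

# Illustrative bird captures over four days at one fixed location
captures = [100, 72, 65, 60]
stay, move = compare_designs(captures, 4, move_cost_days=0)    # no time lost
stay2, move2 = compare_designs(captures, 4, move_cost_days=1)  # one day lost
```

With no time cost, moving wins clearly; once relocation forfeits a full netting day, the two designs end up nearly even, which mirrors the paper's conclusion.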

  20. Sample Results from the Interim Salt Disposition Program Macrobatch 8 Tank 21H Qualification Samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Washington, A. L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-01-01

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of qualification of Macrobatch (Salt Batch) 8 for the Interim Salt Disposition Program (ISDP). An Actinide Removal Process (ARP) and several Extraction-Scrub-Strip (ESS) tests were also performed. This document reports characterization data on the samples of Tank 21H as well as simulated performance of ARP and the Modular Caustic Side Solvent Extraction (CSSX) Unit (MCU). No issues with the projected Salt Batch 8 strategy are identified. A demonstration of the monosodium titanate (MST) (0.2 g/L) removal of strontium and actinides provided acceptable average decontamination factors for plutonium of 2.62 (4 hour) and 2.90 (8 hour); and average strontium decontamination factors of 21.7 (4 hour) and 21.3 (8 hour). These values are consistent with results from previous salt batch ARP tests. The two ESS tests also showed acceptable performance with extraction distribution ratios (D(Cs)) values of 52.5 and 50.4 for the Next Generation Solvent (NGS) blend (from MCU) and NGS (lab prepared), respectively. These values are consistent with results from previous salt batch ESS tests. Even though the performance is acceptable, SRNL recommends that a model for predicting extraction behavior for cesium removal for the blended solvent and NGS be developed in order to improve our predictive capabilities for the ESS tests.
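
The decontamination factors and distribution ratios quoted above are simple concentration ratios. A hedged sketch of how such figures are computed (the function names and example concentrations are illustrative only; they are not values from the report):

```python
def decontamination_factor(feed_conc, product_conc):
    """DF = analyte concentration before treatment / after treatment.
    A DF of 2 means half of the analyte was removed."""
    return feed_conc / product_conc

def distribution_ratio(organic_conc, aqueous_conc):
    """D(Cs) = cesium concentration in the organic (solvent) phase
    divided by that remaining in the aqueous phase after extraction."""
    return organic_conc / aqueous_conc

df = decontamination_factor(1.00, 0.38)   # illustrative Pu concentrations
d_cs = distribution_ratio(5.25, 0.10)     # illustrative Cs concentrations
```

In practice each ratio would be averaged over replicate contacts (e.g. the 4-hour and 8-hour MST contacts in the record above).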

  1. Sample results from the Interim Salt Disposition Program Macrobatch 8 Tank 21H qualification samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Washington, II, A. L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-01-13

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of qualification of Macrobatch (Salt Batch) 8 for the Interim Salt Disposition Program (ISDP). An Actinide Removal Process (ARP) and several Extraction-Scrub-Strip (ESS) tests were also performed. This document reports characterization data on the samples of Tank 21H as well as simulated performance of ARP and the Modular Caustic Side Solvent Extraction (CSSX) Unit (MCU). No issues with the projected Salt Batch 8 strategy are identified. A demonstration of the monosodium titanate (MST) (0.2 g/L) removal of strontium and actinides provided acceptable average decontamination factors for plutonium of 2.62 (4 hour) and 2.90 (8 hour); and average strontium decontamination factors of 21.7 (4 hour) and 21.3 (8 hour). These values are consistent with results from previous salt batch ARP tests. The two ESS tests also showed acceptable performance with extraction distribution ratios (D(Cs)) values of 52.5 and 50.4 for the Next Generation Solvent (NGS) blend (from MCU) and NGS (lab prepared), respectively. These values are consistent with results from previous salt batch ESS tests. Even though the performance is acceptable, SRNL recommends that a model for predicting extraction behavior for cesium removal for the blended solvent and NGS be developed in order to improve our predictive capabilities for the ESS tests.

  2. [Confirming Indicators of Qualitative Results by Chromatography-mass Spectrometry in Biological Samples].

    Science.gov (United States)

    Liu, S D; Zhang, D M; Zhang, W; Zhang, W F

    2017-04-01

    Because of the exist of complex matrix, the confirming indicators of qualitative results for toxic substances in biological samples by chromatography-mass spectrometry are different from that in non-biological samples. Even in biological samples, the confirming indicators are different in various application areas. This paper reviews the similarities and differences of confirming indicators for the analyte in biological samples by chromatography-mass spectrometry in the field of forensic toxicological analysis and other application areas. These confirming indicators include retention time (RT), relative retention time (RRT), signal to noise (S/N), characteristic ions, relative abundance of characteristic ions, parent ion-daughter ion pair and abundance ratio of ion pair, etc. Copyright© by the Editorial Department of Journal of Forensic Medicine.

  3. Sampling effects on the identification of roadkill hotspots: Implications for survey design.

    Science.gov (United States)

    Santos, Sara M; Marques, J Tiago; Lourenço, André; Medinas, Denis; Barbosa, A Márcia; Beja, Pedro; Mira, António

    2015-10-01

    Although locating wildlife roadkill hotspots is essential to mitigate road impacts, the influence of study design on hotspot identification remains uncertain. We evaluated how sampling frequency affects the accuracy of hotspot identification, using a dataset of vertebrate roadkills (n = 4427) recorded over a year of daily surveys along 37 km of roads. "True" hotspots were identified using this baseline dataset, as the 500-m segments where the number of road-killed vertebrates exceeded the upper 95% confidence limit of the mean, assuming a Poisson distribution of roadkills per segment. "Estimated" hotspots were identified likewise, using datasets representing progressively lower sampling frequencies, which were produced by extracting data from the baseline dataset at appropriate time intervals (1-30 days). Overall, 24.3% of segments were "true" hotspots, concentrating 40.4% of roadkills. For different groups, "true" hotspots accounted for from 6.8% (bats) to 29.7% (small birds) of road segments, concentrating up to 60% (lizards, lagomorphs, carnivores) of roadkills. Spatial congruence between "true" and "estimated" hotspots declined rapidly with increasing time interval between surveys, due primarily to increasing false negatives (i.e., missing "true" hotspots). There were also false positives (i.e., wrong "estimated" hotspots), particularly at low sampling frequencies. The decay in spatial accuracy with increasing time interval between surveys was higher for smaller-bodied (amphibians, reptiles, small birds, small mammals) than for larger-bodied species (birds of prey, hedgehogs, lagomorphs, carnivores). Results suggest that widely used surveys at weekly or longer intervals may produce poor estimates of roadkill hotspots, particularly for small-bodied species. Surveying daily or at two-day intervals may be required to achieve high accuracy in hotspot identification for multiple species. Copyright © 2015 Elsevier Ltd. All rights reserved.
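
The hotspot rule above (flag 500-m segments whose counts exceed an upper 95% limit under a Poisson assumption) can be sketched as follows. This uses the upper 95% quantile of a Poisson fitted to the mean segment count as a stand-in for the paper's exact limit construction, and the segment counts are invented for illustration:

```python
import math

def poisson_upper_limit(mu, q=0.95):
    """Smallest k with P(X <= k) >= q for X ~ Poisson(mu)."""
    k = 0
    pmf = math.exp(-mu)   # P(X = 0)
    cdf = pmf
    while cdf < q:
        k += 1
        pmf *= mu / k     # recurrence: P(X = k) = P(X = k-1) * mu / k
        cdf += pmf
    return k

def hotspot_segments(counts, q=0.95):
    """Flag segments whose roadkill count exceeds the upper q-quantile
    of a Poisson distribution fitted to the mean segment count."""
    mu = sum(counts) / len(counts)
    threshold = poisson_upper_limit(mu, q)
    return [c > threshold for c in counts]

# Invented roadkill counts for ten 500-m segments
flags = hotspot_segments([2, 3, 1, 12, 2, 4, 0, 15, 3, 2])
```

Only the two segments with clearly elevated counts are flagged; lowering the sampling frequency thins every segment's count and is what drives the false negatives the paper reports.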

  4. Vapor space characterization of waste tank 241-C-101: Results from samples collected on 9/1/94

    International Nuclear Information System (INIS)

    Lucke, R.B.; Clauss, T.W.; Ligotke, M.W.

    1995-11-01

    This report describes results of the analyses of tank-headspace samples taken from the Hanford waste Tank 241-C-101 (referred to as Tank C-101) and of the ambient air collected approximately 30 ft upwind of the tank and through the vapor sampling system (VSS) near the tank. Pacific Northwest Laboratory (PNL) contracted with Westinghouse Hanford Company (WHC) to provide sampling devices and to analyze inorganic and organic analytes collected from the tank headspace and ambient air near the tank. The sample job was designated S4056, and samples were collected by WHC on September 1, 1994, using the VSS. The samples were inspected upon delivery to the 326/23B laboratory and logged into PNL record book 55408 before implementation of PNL Technical Procedure PNL-TVP-07. Custody of the sorbent traps was transferred to PNL personnel performing the inorganic analysis, and the traps were stored at refrigerated (≤ 10 degrees C) temperature until the time of analysis. The canisters were stored in the 326/23B laboratory at ambient (25 degrees C) temperature until the time of analysis. Access to the 326/23B laboratory is limited to PNL personnel working on the waste-tank safety program. Analyses described in this report were performed at PNL in the 300 area of the Hanford Reservation. Analytical methods that were used are described in the text. In summary, sorbent traps for inorganic analyses containing sample materials were either weighed (for water analysis) or desorbed with the appropriate aqueous solutions (for NH{sub 3}, NO{sub 2}, and NO analyses). The aqueous extracts were analyzed either by selective electrode or by ion chromatography (IC). Organic analyses were performed using cryogenic preconcentration followed by gas chromatography/mass spectrometry (GC/MS).

  5. ANALYTICAL RESULTS OF MOX COLEMANITE CONCRETE SAMPLE PBC-44.2

    Energy Technology Data Exchange (ETDEWEB)

    Best, D.; Cozzi, A.; Reigel, M.

    2012-12-20

    The Mixed Oxide Fuel Fabrication Facility (MFFF) will use colemanite-bearing concrete neutron absorber panels credited with attenuating neutron flux in the criticality design analyses and with shielding operators from radiation. The Savannah River National Laboratory is tasked with measuring the total density, partial hydrogen density, and partial boron density of the colemanite concrete. Sample PBC-44.2 was received on 9/20/2012 and analyzed. The average total density measured by ASTM method C 642 was 2.03 g/cm{sup 3}, exceeding the lower bound of 1.88 g/cm{sup 3}. The average partial hydrogen density, measured using method ASTM E 1311, was 6.64E-02 g/cm{sup 3} and met the lower bound of 6.04E-02 g/cm{sup 3}. The average partial boron density, measured by the ASTM C 1301 method, was 1.70E-01 g/cm{sup 3}, which met the lower bound of 1.65E-01 g/cm{sup 3}.

  6. Fixed-location hydroacoustic monitoring designs for estimating fish passage using stratified random and systematic sampling

    International Nuclear Information System (INIS)

    Skalski, J.R.; Hoffman, A.; Ransom, B.H.; Steig, T.W.

    1993-01-01

    Five alternate sampling designs are compared using 15 d of 24-h continuous hydroacoustic data to identify the most favorable approach to fixed-location hydroacoustic monitoring of salmonid outmigrants. Four alternative approaches to systematic sampling are compared among themselves and with stratified random sampling (STRS). Stratifying systematic sampling (STSYS) on a daily basis is found to reduce sampling error in multiday monitoring studies. Although sampling precision was predictable with varying levels of effort in STRS, neither magnitude nor direction of change in precision was predictable when effort was varied in systematic sampling (SYS). Furthermore, modifying systematic sampling to include replicated (e.g., nested) sampling (RSYS) is further shown to provide unbiased point and variance estimates as does STRS. Numerous short sampling intervals (e.g., 12 samples of 1-min duration per hour) must be monitored hourly using RSYS to provide efficient, unbiased point and interval estimates. For equal levels of effort, STRS outperformed all variations of SYS examined. Parametric approaches to confidence interval estimates are found to be superior to nonparametric interval estimates (i.e., bootstrap and jackknife) in estimating total fish passage. 10 refs., 1 fig., 8 tabs
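
A minimal sketch of the stratified random sampling (STRS) estimator this study favors: strata are days, sampling units are short intervals within each day, and the point and variance estimates expand the within-stratum sample means. All names and numbers here are illustrative, not from the paper:

```python
import random
import statistics

def strs_estimate(strata_counts, n_per_stratum, seed=0):
    """Stratified random sampling: within each stratum (e.g. one day),
    observe a simple random sample of intervals and expand the mean.
    strata_counts holds the fish counts for *every* interval; in
    practice only the sampled intervals would be observed."""
    rng = random.Random(seed)
    total_est, var_est = 0.0, 0.0
    for counts in strata_counts:
        N = len(counts)                         # intervals in this stratum
        sample = rng.sample(counts, n_per_stratum)
        mean = statistics.fmean(sample)
        s2 = statistics.variance(sample)
        total_est += N * mean                   # expanded stratum total
        var_est += N * (N - n_per_stratum) * s2 / n_per_stratum  # with fpc
    return total_est, var_est

# Degenerate check: constant passage gives an exact, zero-variance estimate
est, var = strs_estimate([[5] * 24, [3] * 24], n_per_stratum=6)
```

Stratifying by day is what confines day-to-day run variation to the between-stratum component, which is the mechanism behind the error reduction the abstract attributes to STSYS as well.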

  7. Concepts in sample size determination

    Directory of Open Access Journals (Sweden)

    Umadevi K Rao

    2012-01-01

    Full Text Available Investigators involved in clinical, epidemiological or translational research have the drive to publish their results so that they can extrapolate their findings to the population. This begins with the preliminary step of deciding the topic to be studied, the subjects and the type of study design. In this context, the researcher must determine how many subjects would be required for the proposed study. Thus, the number of individuals to be included in the study, i.e., the sample size, is an important consideration in the design of many clinical studies. The sample size determination should be based on the difference in the outcome between the two groups studied, as in an analytical study, as well as on the accepted p value for statistical significance and the required statistical power to test a hypothesis. The accepted risk of type I error or alpha value, which by convention is set at the 0.05 level in biomedical research, defines the cutoff point at which the p value obtained in the study is judged as significant or not. The power in clinical research is the likelihood of finding a statistically significant result when it exists and is typically set to >80%. This is necessary since even the most rigorously executed studies may fail to answer the research question if the sample size is too small. Alternatively, a study with too large a sample size will be difficult to conduct and will result in a waste of time and resources. Thus, the goal of sample size planning is to estimate an appropriate number of subjects for a given study design. This article describes the concepts in estimating the sample size.
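
For the common two-group comparison of means described above, the per-group sample size follows from alpha, power, the expected difference delta, and the common standard deviation sigma via the standard normal-approximation formula n = 2(z(1-alpha/2) + z(1-beta))^2 * sigma^2 / delta^2. A sketch:

```python
import math
from statistics import NormalDist

def n_per_group(sigma, delta, alpha=0.05, power=0.80):
    """Per-group sample size to detect a difference in means `delta`
    between two groups with common SD `sigma`, at two-sided type I
    error `alpha` and the stated power (normal approximation,
    rounded up to the next whole subject)."""
    z = NormalDist().inv_cdf
    n = 2 * ((z(1 - alpha / 2) + z(power)) * sigma / delta) ** 2
    return math.ceil(n)

# Detecting a half-SD difference (delta = 5 with sigma = 10)
n = n_per_group(sigma=10, delta=5)
```

Note how the quadratic dependence on sigma/delta captures the article's point: halving the detectable difference quadruples the required sample.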

  8. The radiologic technologists' health study in South Korea: study design and baseline results.

    Science.gov (United States)

    Lee, Won Jin; Ha, Mina; Hwang, Seung-sik; Lee, Kyoung-Mu; Jin, Young-Woo; Jeong, Meeseon; Jun, Jae Kwan; Cha, Eun Shil; Ko, Yousun; Choi, Kyung-Hwa; Lee, Jung-Eun

    2015-08-01

    To describe the study design, methods, and baseline results of a prospective cohort of radiologic technologists that we have initiated in South Korea. The cohort participants were enrolled through a self-administered questionnaire survey administered from April 2012 to May 2013. Survey data were linked with radiation dosimetry, a cancer registry, and health insurance data by personal identification numbers. A nationwide representative survey was also conducted using a stratified random sampling design with face-to-face interviews. A total of 12,387 radiologic technologists were enrolled, accounting for approximately 63% of all diagnostic radiologic technologists working in South Korea. For the nationwide survey, 585 workers were interviewed using the detailed questionnaire, and buccal cells were also collected by scraping the inside of the cheek. The majority of study subjects were male workers under 50 years of age. The average annual effective dose of radiation declined for both men (from 2.75 to 1.43 mSv) and women (from 1.34 to 0.95 mSv) over the period 1996-2011. A total of 99 cancers (66 in men and 33 in women) were reported from 1992 to 2010. The standardized incidence ratio for all cancers combined was significantly lower in men (SIR = 0.75, 95% CI 0.58-0.96) than in the general population, but the ratios for thyroid cancer were significantly higher than expected among both men and women. This cohort provides comprehensive information on the work activities and health status of diagnostic radiologic technologists. In addition, the nationwide representative sample provides unique opportunities compared with previous radiologic technologist studies.
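
The standardized incidence ratio quoted above is the observed over expected case count, with a Poisson-based confidence interval on the observed count. A sketch using Byar's approximation; the expected count of 88 below is back-calculated for illustration from SIR = 0.75 with 66 observed cases, not a figure taken from the paper (and the paper's exact CI method may differ slightly):

```python
import math

def sir_with_ci(observed, expected, z=1.96):
    """Standardized incidence ratio O/E with an approximate 95% CI
    from Byar's approximation to the Poisson limits on O."""
    o = observed
    lo = o * (1 - 1 / (9 * o) - z / (3 * math.sqrt(o))) ** 3 / expected
    hi = (o + 1) * (1 - 1 / (9 * (o + 1))
                    + z / (3 * math.sqrt(o + 1))) ** 3 / expected
    return o / expected, lo, hi

sir, lo, hi = sir_with_ci(observed=66, expected=88)
```

An interval entirely below 1.0, as here, is what makes the deficit of cancers in men "significant" in the abstract's sense.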

  9. The radiologic technologists' health study in South Korea. Study design and baseline results

    International Nuclear Information System (INIS)

    Lee, Won Jin; Ha, Mina; Hwang, Seung-sik

    2015-01-01

    To describe the study design, methods, and baseline results of a prospective cohort of radiologic technologists that we have initiated in South Korea. The cohort participants were enrolled through a self-administered questionnaire survey administered from April 2012 to May 2013. Survey data were linked with radiation dosimetry, a cancer registry, and health insurance data by personal identification numbers. A nationwide representative survey was also conducted using a stratified random sampling design with face-to-face interviews. A total of 12,387 radiologic technologists were enrolled, accounting for approximately 63 % of all diagnostic radiologic technologists working in South Korea. For the nationwide survey, 585 workers were interviewed using the detailed questionnaire, and buccal cells were also collected by scraping the inside of the cheek. The majority of study subjects were male workers under 50 years of age. The average annual effective dose of radiation declined for both men (from 2.75 to 1.43 mSv) and women (from 1.34 to 0.95 mSv) over the period 1996-2011. A total of 99 cancers (66 in men and 33 in women) were reported from 1992 to 2010. The standardized incidence ratio for all cancers combined was significantly lower in men (SIR = 0.75, 95 % CI 0.58-0.96) than in the general population, but the ratios for thyroid cancer were significantly higher than expected among both men and women. This cohort provides comprehensive information on the work activities and health status of diagnostic radiologic technologists. In addition, the nationwide representative sample provides unique opportunities compared with previous radiologic technologist studies.

  10. Effect of order of draw of blood samples during phlebotomy on routine biochemistry results.

    Science.gov (United States)

    Sulaiman, Raashda A; Cornes, Michael P; Whitehead, Simon J; Othonos, Nadia; Ford, Clare; Gama, Rousseau

    2011-11-01

    To investigate whether an incorrect order of draw of blood samples during phlebotomy causes in vitro potassium ethylenediaminetetraacetic acid (kEDTA) contamination of blood samples. Serum kEDTA, potassium, calcium, magnesium, alkaline phosphatase, zinc and iron concentrations were measured in blood samples drawn before and after collecting blood into kEDTA-containing sample tubes by an experienced phlebotomist using the Sarstedt Safety Monovette system. EDTA was undetectable in all samples. The concentrations of other analytes were similar in blood samples drawn before and after collection of the EDTA blood sample. Order of draw of blood samples using the Sarstedt Safety Monovette system has no effect on serum biochemistry results when samples are taken by an experienced phlebotomist.

  11. Analytical results from salt batch 9 routine DSSHT and SEHT monthly samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-06-01

    Strip Effluent Hold Tank (SEHT) and Decontaminated Salt Solution Hold Tank (DSSHT) samples from several of the “microbatches” of Integrated Salt Disposition Project (ISDP) Salt Batch (“Macrobatch”) 9 have been analyzed for 238Pu, 90Sr, 137Cs, cations (Inductively Coupled Plasma Emission Spectroscopy - ICPES), and anions (Ion Chromatography Anions - IC-A). The analytical results from the current microbatch samples are similar to those from previous macrobatch samples. The Cs removal continues to be acceptable, with decontamination factors (DF) averaging 25700 (107% RSD). The bulk chemistry of the DSSHT and SEHT samples does not show any signs of unusual behavior, other than lacking the anticipated degree of dilution that is calculated to occur during Modular Caustic-Side Solvent Extraction Unit (MCU) processing.

  12. Results of groundwater monitoring and vegetation sampling at Everest, Kansas, in 2009.

    Energy Technology Data Exchange (ETDEWEB)

    LaFreniere, L. M.; Environmental Science Division

    2010-05-13

    In April 2008, the Commodity Credit Corporation of the U.S. Department of Agriculture (CCC/USDA) conducted groundwater sampling for the analysis of volatile organic compounds (VOCs) in the existing network of monitoring points at Everest, Kansas (Argonne 2008). The objective of the 2008 investigation was to monitor the distribution of carbon tetrachloride contamination in groundwater previously identified in CCC/USDA site characterization and groundwater sampling studies at Everest in 2000-2006 (Argonne 2001, 2003, 2006a,b). The work at Everest is being undertaken on behalf of the CCC/USDA by Argonne National Laboratory, under the oversight of the Kansas Department of Health and Environment (KDHE). The findings of the 2008 investigation were as follows: (1) Measurements of groundwater levels obtained manually and through the use of automatic recorders demonstrated a consistent pattern of groundwater flow - and inferred contaminant migration - to the north-northwest from the former CCC/USDA facility toward the Nigh property, and then west-southwest from the Nigh property toward the intermittent creek that lies west of the former CCC/USDA facility and the Nigh property. (2) The range of concentrations and the areal distribution of carbon tetrachloride identified in the groundwater at Everest in April 2008 were generally consistent with previous results. The results of the 2008 sampling (reflecting the period from 2006 to 2008) and the earlier investigations at Everest (representing the period from 2000 to 2006) show that no significant downgradient extension of the carbon tetrachloride plume occurred from 2000 to 2008. (3) The slow contaminant migration indicated by the monitoring data is qualitatively consistent with the low groundwater flow rates in the Everest aquifer unit estimated previously on the basis of site-specific hydraulic testing (Argonne 2006a,b). (4) The April 2008 and earlier sampling results demonstrate that the limits of the plume have been

  13. Results of modeling advanced BWR fuel designs using CASMO-4

    International Nuclear Information System (INIS)

    Knott, D.; Edenius, M.

    1996-01-01

    Advanced BWR fuel designs from General Electric, Siemens and ABB-Atom have been analyzed using CASMO-4 and compared against fission rate distributions and control rod worths from MCNP. Included in the analysis were fuel storage rack configurations and proposed mixed oxide (MOX) designs. Results are also presented from several cycles of SIMULATE-3 core follow analysis, using nodal data generated by CASMO-4, for cycles in transition from 8x8 designs to advanced fuel designs. (author)

  14. Design and Development of a Robot-Based Automation System for Cryogenic Crystal Sample Mounting at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Shu, D.; Preissner, C.; Nocher, D.; Han, Y.; Barraza, J.; Lee, P.; Lee, W.-K.; Cai, Z.; Ginell, S.; Alkire, R.; Lazarski, K.; Schuessler, R.; Joachimiak, A.

    2004-01-01

    X-ray crystallography is the primary method to determine the 3D structures of complex macromolecules at high resolution. In the years to come, the Advanced Photon Source (APS) and similar 3rd-generation synchrotron sources elsewhere will become the most powerful tools for studying atomic structures of biological molecules. One of the major bottlenecks in the x-ray data collection process is the constant need to change and realign the crystal sample. This is a very time- and manpower-consuming task. An automated sample mounting system will help to solve this bottleneck problem. We have developed a novel robot-based automation system for cryogenic crystal sample mounting at the APS. Design of the robot-based automation system, as well as its on-line test results at the Argonne Structural Biology Center (SBC) 19-BM experimental station, are presented in this paper

  15. Gasbuggy, New Mexico, Hydrologic and Natural Gas Sampling and Analysis Results for 2009

    International Nuclear Information System (INIS)

    2009-11-01

    The U.S. Department of Energy (DOE) Office of Legacy Management conducted hydrologic and natural gas sampling for the Gasbuggy, New Mexico, site on June 16, and 17, 2009. Hydrologic sampling consists of collecting water samples from water wells and surface water locations. Natural gas sampling consists of collecting both gas samples and samples of produced water from gas production wells. The water well samples were analyzed for gamma-emitting radionuclides and tritium. Surface water samples were analyzed for tritium. Water samples from gas production wells were analyzed for gamma-emitting radionuclides, gross alpha, gross beta, and tritium. Natural gas samples were analyzed for tritium and carbon-14. Water samples were analyzed by ALS Laboratory Group in Fort Collins, Colorado, and natural gas samples were analyzed by Isotech Laboratories in Champaign, Illinois. Concentrations of tritium and gamma-emitting radionuclides in water samples collected in the vicinity of the Gasbuggy site continue to demonstrate that the sample locations have not been impacted by detonation-related contaminants. Results from the sampling of natural gas from producing wells demonstrate that the gas wells nearest the Gasbuggy site are not currently impacted by detonation-related contaminants. Annual sampling of the gas production wells nearest the Gasbuggy site for gas and produced water will continue for the foreseeable future. The sampling frequency of water wells and surface water sources in the surrounding area will be reduced to once every 5 years. The next hydrologic sampling event at water wells, springs, and ponds will be in 2014.

  16. Characterization of Tank 16H Annulus Samples Part II: Leaching Results

    International Nuclear Information System (INIS)

    Hay, M.; Reboul, S.

    2012-01-01

    The closure of Tank 16H will require removal of material from the annulus of the tank. Samples from the Tank 16H annulus were characterized and tested to provide information for evaluating various alternatives for removing the annulus waste. The analysis found all four annulus samples to be composed mainly of Si, Na, and Al, with lesser amounts of other elements. The XRD data indicate quartz (SiO{sub 2}) and sodium aluminum nitrate silicate hydrate (Na{sub 8}(Al{sub 6}Si{sub 6}O{sub 24})(NO{sub 3}){sub 2}·4H{sub 2}O) as the predominant crystalline mineral phases in the samples. The XRD data also indicate the presence of crystalline sodium nitrate (NaNO{sub 3}), sodium nitrite (NaNO{sub 2}), gibbsite (Al(OH){sub 3}), hydrated sodium bicarbonate (Na{sub 3}H(CO{sub 3}){sub 2}·2H{sub 2}O), and muscovite (KAl{sub 2}(AlSi{sub 3}O{sub 10})(OH){sub 2}). Based on the weight of solids remaining at the end of the test, the water leaching test results indicate that 20-35% of the solids dissolved after three contacts with an approximately 3:1 volume of water at 45 °C. The chemical analysis of the leachates and the XRD results on the remaining solids indicate that sodium salts of nitrate, nitrite, sulfate, and possibly carbonate/bicarbonate make up the majority of the dissolved material. The majority of these salts were dissolved in the first water contact and simply diluted with each subsequent water contact. The water leaching removed large amounts of the uranium in two of the samples and approximately one third of the {sup 99}Tc from all four samples. Most of the other radionuclides analyzed showed low solubility in the water leaching test. The oxalic acid leaching test results indicate that approximately 34-47% of the solids in the four annulus samples will dissolve after three contacts with an approximately 3:1 volume of acid to solids at 45 °C. The same sodium salts found in the water leaching test comprise the majority of dissolved material in the oxalic acid leaching test. However, the oxalic acid was somewhat more effective in dissolving radionuclides than the water leach. In

  17. The design and results of an algorithm for intelligent ground vehicles

    Science.gov (United States)

    Duncan, Matthew; Milam, Justin; Tote, Caleb; Riggins, Robert N.

    2010-01-01

    This paper addresses the design, design method, test platform, and test results of an algorithm used in autonomous navigation for intelligent vehicles. The Bluefield State College (BSC) team created this algorithm for its 2009 Intelligent Ground Vehicle Competition (IGVC) robot called Anassa V. The BSC robotics team comprises undergraduate computer science, engineering technology, and marketing students, and one robotics faculty advisor. The team has participated in IGVC since the year 2000. A major part of the design process that the BSC team uses each year for IGVC is a fully documented "Post-IGVC Analysis." Over the nine years since 2000, the lessons the students learned from these analyses have resulted in an ever-improving, highly successful autonomous algorithm. The algorithm employed in Anassa V is a culmination of past successes and new ideas, resulting in Anassa V earning several excellent IGVC 2009 performance awards, including third place overall. The paper will discuss all aspects of the design of this autonomous robotic system, beginning with the design process and ending with test results for both simulation and real environments.

  18. A liquid scintillation counter specifically designed for samples deposited on a flat matrix

    International Nuclear Information System (INIS)

    Potter, C.G.; Warner, G.T.

    1986-01-01

    A prototype liquid scintillation counter has been designed to count samples deposited as a 6x16 array on a flat matrix. Applications include the counting of labelled cells processed by a cell harvester from 96-well microtitration plates onto glass fibre filters, and of DNA samples directly deposited onto nitrocellulose or nylon transfer membranes (e.g. 'Genescreen', NEN) for genetic studies by dot-blot hybridisation. The whole filter is placed in a bag with 4-12 ml of scintillant, sufficient to count all 96 samples. Nearest-neighbour intersample cross talk ranged from 0.004% for {sup 3}H to 0.015% for {sup 32}P. Background was 1.4 counts/min for glass fibre and 0.7 counts/min for 'Genescreen' in the {sup 3}H channel; for {sup 14}C the respective figures were 5.3 and 4.3 counts/min. Counting efficiency for {sup 3}H-labelled cells on glass fibre was 54% (E{sup 2}/B=2053) and 26% for tritiated thymidine spotted on 'Genescreen' (E{sup 2}/B=980). Similar {sup 14}C samples gave figures of 97% (E{sup 2}/B=1775) and 81% (E{sup 2}/B=1526), respectively. Electron emission counting from samples containing {sup 125}I and {sup 51}Cr was also possible. (U.K.)

  19. NEON terrestrial field observations: designing continental scale, standardized sampling

    Science.gov (United States)

    R. H. Kao; C.M. Gibson; R. E. Gallery; C. L. Meier; D. T. Barnett; K. M. Docherty; K. K. Blevins; P. D. Travers; E. Azuaje; Y. P. Springer; K. M. Thibault; V. J. McKenzie; M. Keller; L. F. Alves; E. L. S. Hinckley; J. Parnell; D. Schimel

    2012-01-01

    Rapid changes in climate and land use and the resulting shifts in species distributions and ecosystem functions have motivated the development of the National Ecological Observatory Network (NEON). Integrating across spatial scales from ground sampling to remote sensing, NEON will provide data for users to address ecological responses to changes in climate, land use,...

  20. Sample design and gamma-ray counting strategy of neutron activation system for triton burnup measurements in KSTAR

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Jungmin [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of); Cheon, Mun Seong [ITER Korea, National Fusion Research Institute, Daejeon (Korea, Republic of); Chung, Kyoung-Jae, E-mail: jkjlsh1@snu.ac.kr [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of); Hwang, Y.S. [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of)

    2016-11-01

    Highlights: • Sample design for triton burnup ratio measurement is carried out. • Samples for 14.1 MeV neutron measurements are selected for KSTAR. • Si and Cu are the most suitable materials for d-t neutron measurements. • Appropriate γ-ray counting strategies for each selected sample are established. - Abstract: For the purpose of triton burnup measurements in Korea Superconducting Tokamak Advanced Research (KSTAR) deuterium plasmas, appropriate neutron activation system (NAS) samples for 14.1 MeV d-t neutron measurements have been designed and a gamma-ray counting strategy has been established. Neutronics calculations are performed with the MCNP5 neutron transport code for the KSTAR neutral beam heated deuterium plasma discharges. Based on those calculations and the assumed d-t neutron yield, the activities induced by d-t neutrons are estimated with the inventory code FISPACT-2007 for candidate sample materials: Si, Cu, Al, Fe, Nb, Co, Ti, and Ni. It is found that Si, Cu, Al, and Fe are suitable for the KSTAR NAS in terms of the minimum detectable activity (MDA), calculated based on the standard deviation of blank measurements. Considering background gamma-rays radiated from surrounding structures activated by thermalized fusion neutrons, an appropriate gamma-ray counting strategy for each selected sample is established.
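
    The MDA criterion mentioned above is, in one common formulation (Currie's), derived from the counting statistics of blank measurements. The sketch below uses entirely hypothetical numbers; the paper's own MDA is computed from the standard deviation of its blank runs and may differ from this formula:

    ```python
    import math

    # Currie-style minimum detectable activity (Bq) from a blank measurement.
    # epsilon: absolute detection efficiency, live_time_s: counting time (s),
    # gamma_yield: emission probability of the counted gamma line.
    # All input values below are illustrative, not from the paper.
    def mda_bq(blank_counts: float, efficiency: float, live_time_s: float,
               gamma_yield: float) -> float:
        detection_limit_counts = 2.71 + 4.65 * math.sqrt(blank_counts)
        return detection_limit_counts / (efficiency * live_time_s * gamma_yield)

    print(mda_bq(blank_counts=400.0, efficiency=0.05, live_time_s=3600.0,
                 gamma_yield=0.99))  # ≈ 0.54 Bq
    ```

    A candidate sample material is usable only if the activity induced by the expected d-t neutron fluence comfortably exceeds an MDA of this kind.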

  1. LEDA RF distribution system design and component test results

    International Nuclear Information System (INIS)

    Roybal, W.T.; Rees, D.E.; Borchert, H.L.; McCarthy, M.; Toole, L.

    1998-01-01

    The 350 MHz and 700 MHz RF distribution systems for the Low Energy Demonstration Accelerator (LEDA) have been designed and are currently being installed at Los Alamos National Laboratory. Since 350 MHz is a familiar frequency used at other accelerator facilities, most of the major high-power components were available. The 700 MHz, 1.0 MW, CW RF delivery system designed for LEDA is a new development; therefore, high-power circulators, waterloads, phase shifters, switches, and harmonic filters had to be designed and built for this application. The final Accelerator Production of Tritium (APT) RF distribution system design will be based on much of the same technology as the LEDA systems and will incorporate many of the RF components tested for LEDA. Low-power and high-power tests performed on various components of these LEDA systems, and their results, are presented here.

  2. Relative Efficiencies of a Three-Stage Versus a Two-Stage Sample Design For a New NLS Cohort Study. 22U-884-38.

    Science.gov (United States)

    Folsom, R. E.; Weber, J. H.

    Two sampling designs were compared for the planned 1978 national longitudinal survey of high school seniors with respect to statistical efficiency and cost. The 1972 survey used a stratified two-stage sample of high schools and seniors within schools. In order to minimize interviewer travel costs, an alternate sampling design was proposed,…

  3. Visual Sample Plan (VSP) - FIELDS Integration

    Energy Technology Data Exchange (ETDEWEB)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Hassig, Nancy L.; Carlson, Deborah K.; Bing-Canar, John; Cooper, Brian; Roth, Chuck

    2003-04-19

    Two software packages, VSP 2.1 and FIELDS 3.5, are being used by environmental scientists to plan the number and type of samples required to meet project objectives, display those samples on maps, query a database of past sample results, produce spatial models of the data, and analyze the data in order to arrive at defensible decisions. VSP 2.0 is an interactive tool to calculate optimal sample size and optimal sample location based on user goals, risk tolerance, and variability in the environment and in lab methods. FIELDS 3.0 is a set of tools to explore the sample results in a variety of ways to make defensible decisions with quantified levels of risk and uncertainty. However, FIELDS 3.0 has only a small sample design module. VSP 2.0, on the other hand, offers over 20 sampling goals, allowing the user to input site-specific assumptions such as non-normality of sample results and separate variability between field and laboratory measurements, make two-sample comparisons, perform confidence interval estimation, use sequential search sampling methods, and much more. Over 1,000 copies of VSP are in use today. FIELDS is used in nine of the ten U.S. EPA regions, by state regulatory agencies, and most recently in several other countries. Both software packages have been peer-reviewed, enjoy broad usage, and have been accepted by regulatory agencies as well as site project managers as key tools to help collect data and make environmental cleanup decisions. Recently, the two software packages were integrated, allowing the user to take advantage of the many design options of VSP and the analysis and modeling options of FIELDS. The transition between the two is simple for the user – VSP can be called from within FIELDS, automatically passing a map to VSP and automatically retrieving sample locations and design information when the user returns to FIELDS. This paper will describe the integration, give a demonstration of the integrated package, and give users download
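
    The kind of sample-size computation VSP automates can be illustrated with the textbook formula for estimating a mean to within ±δ at a given confidence level. This is a simplified sketch with hypothetical inputs; VSP's actual sampling goals add finite-population corrections, non-normal options, and separate field/lab variance terms:

    ```python
    import math

    # Sample size to estimate a population mean within +/- delta at a given
    # confidence; sigma is the anticipated standard deviation, z the two-sided
    # normal quantile (1.96 for 95% confidence). Illustrative only.
    def sample_size_mean(sigma: float, delta: float, z: float = 1.96) -> int:
        return math.ceil((z * sigma / delta) ** 2)

    # Hypothetical site: sigma = 10 ppm, want the mean within +/- 2.5 ppm
    print(sample_size_mean(sigma=10.0, delta=2.5))  # 62
    ```

    Tightening the precision requirement (smaller δ) or raising the confidence level grows the required sample size quadratically, which is why interactive tools like VSP let planners trade these off against cost.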

  4. A weighted sampling algorithm for the design of RNA sequences with targeted secondary structure and nucleotide distribution.

    Science.gov (United States)

    Reinharz, Vladimir; Ponty, Yann; Waldispühl, Jérôme

    2013-07-01

    The design of RNA sequences folding into predefined secondary structures is a milestone for many synthetic biology and gene therapy studies. Most current software packages use similar local search strategies (i.e. a random seed is progressively adapted to acquire the desired folding properties) and, more importantly, do not allow the user to explicitly control the nucleotide distribution, such as the GC-content, of their sequences. However, the latter is an important criterion for large-scale applications, as it could presumably be used to design sequences with better transcription rates and/or structural plasticity. In this article, we introduce IncaRNAtion, a novel algorithm to design RNA sequences folding into target secondary structures with a predefined nucleotide distribution. IncaRNAtion uses a global sampling approach and weighted sampling techniques. We show that our approach is fast (i.e. running time comparable to or better than local search methods), seedless (we remove the bias of the seed in local search heuristics), and successfully generates high-quality (i.e. thermodynamically stable) sequences for any GC-content. To complete this study, we develop a hybrid method combining our global sampling approach with local search strategies. Remarkably, our glocal methodology outperforms both local and global approaches for sampling sequences with a specific GC-content and target structure. IncaRNAtion is available at csb.cs.mcgill.ca/incarnation/. Supplementary data are available at Bioinformatics online.
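
    The weighting idea can be illustrated with a toy sampler that biases nucleotide draws toward a target GC-content. This sketch ignores secondary structure entirely (IncaRNAtion samples whole sequences from the structure ensemble), so it shows only the nucleotide-distribution half of the problem:

    ```python
    import random

    # Draw an RNA sequence whose expected GC-content matches gc_target by
    # weighting G and C draws accordingly. Toy illustration only: real design
    # tools must also satisfy the target secondary structure.
    def sample_sequence(length: int, gc_target: float, rng: random.Random) -> str:
        weights = {"G": gc_target / 2, "C": gc_target / 2,
                   "A": (1 - gc_target) / 2, "U": (1 - gc_target) / 2}
        bases, probs = zip(*weights.items())
        return "".join(rng.choices(bases, probs, k=length))

    rng = random.Random(0)
    seq = sample_sequence(300, 0.6, rng)
    gc = sum(seq.count(b) for b in "GC") / len(seq)
    print(f"GC-content of sampled sequence: {gc:.2f}")
    ```

    In the full problem, the weights are adjusted iteratively so that sequences drawn from the structure-constrained ensemble hit the target GC-content on average, rather than per draw.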

  5. Tank 241-C-111 headspace gas and vapor sample results - August 1993 samples

    International Nuclear Information System (INIS)

    Huckaby, J.L.

    1994-01-01

    Tank 241-C-111 is on the ferrocyanide Watch List. Gas and vapor samples were collected to assure safe conditions before planned intrusive work was performed. Sample analyses showed that hydrogen in the tank headspace is about ten times higher than in ambient air, and nitrous oxide is about sixty times higher than ambient levels. The hydrogen cyanide concentration was below 0.04 ppbv, and the average NOx concentration was 8.6 ppmv.

  6. Design and evaluation of a new Peltier-cooled laser ablation cell with on-sample temperature control.

    Science.gov (United States)

    Konz, Ioana; Fernández, Beatriz; Fernández, M Luisa; Pereiro, Rosario; Sanz-Medel, Alfredo

    2014-01-27

    A new custom-built Peltier-cooled laser ablation cell is described. The proposed cryogenic cell combines a small internal volume (20 cm³) with a unique and reliable on-sample temperature control. The use of a flexible temperature sensor, located directly on the sample surface, ensures rigorous control of the sample temperature throughout the entire analysis time and allows instant response to any possible fluctuation. In this way sample integrity and, therefore, reproducibility can be guaranteed during the ablation. The refrigeration of the proposed cryogenic cell combines an internal refrigeration system, controlled by a sensitive thermocouple, with an external refrigeration system. Cooling of the sample is carried out directly by 8 small (1 cm×1 cm) Peltier elements placed in a circular arrangement in the base of the cell, below a copper plate where the sample is placed. Due to the small size of the cooling electronics and their circular arrangement, it was possible to maintain a peephole under the sample for illumination, allowing much better visualization of the sample, a factor especially important when working with structurally complex tissue sections. The analytical performance of the cryogenic cell was studied using a glass reference material (SRM NIST 612) at room temperature and at -20°C. The proposed cell design shows a reasonable signal washout (signal decays to background level within 10 s), high sensitivity and good signal stability (in the range 6.6-11.7%). Furthermore, high precision (0.4-2.6%) and accuracy (0.3-3.9%) in the isotope ratio measurements were also observed operating the cell both at room temperature and at -20°C. Finally, experimental results obtained for the cell application to qualitative elemental imaging of structurally complex tissue samples (e.g. eye sections from a native frozen porcine eye and fresh flower leaves) demonstrate that working in cryogenic conditions is critical in such

  7. Transfer function design based on user selected samples for intuitive multivariate volume exploration

    KAUST Repository

    Zhou, Liang

    2013-02-01

    Multivariate volumetric datasets are important to both science and medicine. We propose a transfer function (TF) design approach based on user selected samples in the spatial domain to make multivariate volumetric data visualization more accessible for domain users. Specifically, the user starts the visualization by probing features of interest on slices and the data values are instantly queried by user selection. The queried sample values are then used to automatically and robustly generate high dimensional transfer functions (HDTFs) via kernel density estimation (KDE). Alternatively, 2D Gaussian TFs can be automatically generated in the dimensionality reduced space using these samples. With the extracted features rendered in the volume rendering view, the user can further refine these features using segmentation brushes. Interactivity is achieved in our system and different views are tightly linked. Use cases show that our system has been successfully applied for simulation and complicated seismic data sets. © 2013 IEEE.

  8. Transfer function design based on user selected samples for intuitive multivariate volume exploration

    KAUST Repository

    Zhou, Liang; Hansen, Charles

    2013-01-01

    Multivariate volumetric datasets are important to both science and medicine. We propose a transfer function (TF) design approach based on user selected samples in the spatial domain to make multivariate volumetric data visualization more accessible for domain users. Specifically, the user starts the visualization by probing features of interest on slices and the data values are instantly queried by user selection. The queried sample values are then used to automatically and robustly generate high dimensional transfer functions (HDTFs) via kernel density estimation (KDE). Alternatively, 2D Gaussian TFs can be automatically generated in the dimensionality reduced space using these samples. With the extracted features rendered in the volume rendering view, the user can further refine these features using segmentation brushes. Interactivity is achieved in our system and different views are tightly linked. Use cases show that our system has been successfully applied for simulation and complicated seismic data sets. © 2013 IEEE.
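
    The KDE step in the two records above can be sketched in one dimension: a Gaussian density built from the user-probed sample values scores each data value, and high-density values would be mapped to visible opacity. The names and the thresholding idea here are illustrative, not the papers' implementation, which builds high-dimensional transfer functions over multiple variables:

    ```python
    import math

    # 1-D Gaussian kernel density estimate over "user-selected" sample values.
    # The returned closure scores any data value; a transfer function could
    # assign opacity where the score is high. Hypothetical sketch only.
    def kde(samples, bandwidth):
        norm = 1.0 / (len(samples) * bandwidth * math.sqrt(2 * math.pi))
        def density(x):
            return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                              for s in samples)
        return density

    selected = [0.42, 0.45, 0.47, 0.44]   # values probed on a slice (made up)
    tf = kde(selected, bandwidth=0.02)
    print(tf(0.44) > tf(0.90))  # True: probed feature scores far above background
    ```

    Because the density is estimated rather than hand-drawn, the user never edits the transfer function widget directly; refining the probed sample set reshapes the density and hence the rendered feature.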

  9. Preliminary Results from a Superconducting Photocathode Sample Cavity

    CERN Document Server

    Kneisel, Peter; Lipski, Andrzej; Sekutowicz, Jacek

    2005-01-01

    Pure niobium has been proposed as a photocathode material, and recently a successful test was conducted with a niobium single-cell cavity to extract photo-currents from the surface of this cavity. However, the quantum efficiency of niobium is ~2·10⁻⁴, whereas electrodeposited lead has a ~15 times higher quantum efficiency. We have designed and tested a photo-injector niobium cavity, which can be used to insert photo-cathodes made of different materials in the high electric field region of the cavity. Experiments conducted with niobium and lead show that neither the Q-values of the cavity nor the obtainable surface fields are significantly lowered. This paper reports the results from these tests.

  10. Scintigraphy and venous sampling in endocrine adrenal diseases. Clinical results in 85 patients

    International Nuclear Information System (INIS)

    Feltrin, G.P.; Maffessanti, M.; Miotto, D.; Mantero, F.; Macri, G.; Romani, S.

    1979-01-01

    The results obtained by adrenal scanning and venous sampling in 85 patients affected by various forms of adrenal pathology are reported and discussed. Pheochromocytoma rarely needs venous catheterization and blood sampling, since arteriography is almost always capable of visualizing it. Scintigraphy alone is generally accurate enough to distinguish between bilateral hyperplasia and tumors in Cushing's and adrenogenital syndromes (100% of personal observations); only a tumoral situation benefits from venous catheterization. Blood sampling and venography must be preceded by scintigraphy in Conn's syndrome.

  11. Results for the first quarter calendar year 2017 tank 50H salt solution sample

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, C. L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-04-12

    In this memorandum, the chemical and radionuclide contaminant results from the First Quarter Calendar Year 2017 (CY17) sample of Tank 50H salt solution are presented in tabulated form. The First Quarter CY17 Tank 50H samples [a 200 mL sample obtained 6” below the surface (HTF-50-17-7) and a 1 L sample obtained 66” from the tank bottom (HTF-50-17-8)] were obtained on January 15, 2017 and received at Savannah River National Laboratory (SRNL) on January 16, 2017. Prior to obtaining the samples from Tank 50H, a single pump was run at least 4.4 hours, and the samples were pulled immediately after pump shutdown. All volatile organic analyses (VOA) and semi-volatile organic analyses (SVOA) were performed on the surface sample, and all other analyses were performed on the variable-depth sample. The information from this characterization will be used by Savannah River Remediation (SRR) for the transfer of aqueous waste from Tank 50H to the Saltstone Production Facility, where the waste will be treated and disposed of in the Saltstone Disposal Facility. This memorandum compares results, where applicable, to Saltstone Waste Acceptance Criteria (WAC) limits and targets. The chemical and radionuclide contaminant results from the characterization of the First Quarter CY17 sampling of Tank 50H were requested by SRR personnel, and details of the testing are presented in the SRNL Task Technical and Quality Assurance Plan (TTQAP). This memorandum is part of Deliverable 2 from the SRR request. Data pertaining to the regulatory limits for Resource Conservation and Recovery Act (RCRA) metals will be documented at a later time per the TTQAP for the Tank 50H saltstone task.

  12. Results of stainless steel canister corrosion studies and environmental sample investigations

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, Charles R. [Sandia National Laboratories, Albuquerque, NM (United States); Enos, David [Sandia National Laboratories, Albuquerque, NM (United States)

    2014-12-01

    This progress report describes work being done at Sandia National Laboratories (SNL) to assess the localized corrosion performance of container/cask materials used in the interim storage of used nuclear fuel. The work involves both characterization of the potential physical and chemical environment on the surface of the storage canisters and how it might evolve through time, and testing to evaluate performance of the canister materials under anticipated storage conditions. To evaluate the potential environment on the surface of the canisters, SNL is working with the Electric Power Research Institute (EPRI) to collect and analyze dust samples from the surface of in-service SNF storage canisters. In FY13, SNL analyzed samples from the Calvert Cliffs Independent Spent Fuel Storage Installation (ISFSI); here, results are presented for samples collected from two additional near-marine ISFSI sites, Hope Creek, NJ, and Diablo Canyon, CA. The Hope Creek site is located on the shores of the Delaware River within the tidal zone; the water is brackish and wave action is normally minor. The Diablo Canyon site is located on a rocky Pacific Ocean shoreline with breaking waves. Two types of samples were collected: SaltSmart™ samples, which leach the soluble salts from a known surface area of the canister, and dry pad samples, which collected surface salt and dust using a swipe method with a mildly abrasive ScotchBrite™ pad. The dry samples were used to characterize the mineralogy and texture of the soluble and insoluble components in the dust via microanalytical techniques, including mapping X-ray Fluorescence spectroscopy and Scanning Electron Microscopy. For both Hope Creek and Diablo Canyon canisters, dust loadings were much higher on the flat upper surfaces of the canisters than on the vertical sides. Maximum dust sizes collected at both sites were slightly larger than 20 μm, but Phragmites grass seeds, ~1 mm in size, were observed on the tops of the Hope Creek canisters.

  13. Enhanced CANDU 6 design assist probabilistic safety assessment results and insights

    International Nuclear Information System (INIS)

    Torabi, T.; Bettig, R.; Iliescu, P.; Robinson, J.; Santamaura, P.; Skorupska, B.; Tyagi, A.K.; Vencel, I.

    2013-01-01

    The Enhanced CANDU 6 (EC6) is a 700 MWe reactor that has evolved from the well-established CANDU line of heavy-water-moderated, heavy-water-cooled horizontal pressure tube reactors using natural uranium fuel. The EC6 design retains the generic CANDU design features while incorporating innovations and state-of-the-art technologies to ensure competitiveness with other designs with respect to operation, performance and economics. A design-assist probabilistic safety assessment (PSA) was conducted during the design change phase of the project. The purpose of the assessment was to assess internal events during at-power operation and to identify the design improvements and additional features needed to comply with the latest regulatory requirements in Canada and to compete with other reactor designs internationally. The PSA results show that the EC6 plant response to the postulated initiating events is well balanced and that the design meets its safety objectives. This paper summarizes the results and insights gained during the development of the PSA models for at-power internal events. (author)

  14. Solvent hold tank sample results for MCU-16-1363-1365. November 2016 monthly sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-03-22

    Savannah River National Laboratory (SRNL) received one set of three Solvent Hold Tank (SHT) samples (MCU-16-1363-1364-1365), pulled on 11/15/2016, for analysis. The samples were combined and analyzed for composition. Analysis of the composite sample MCU-16-1363-1364-1365 indicated the Isopar™ L concentration is at its nominal level (100%). The extractant (MaxCalix) and the modifier (CS-7SB) are 8% and 2% below their nominal concentrations, and the suppressor (TiDG) is 7% below its nominal concentration. This analysis confirms the trim and Isopar™ L additions to the solvent in November and indicates the solvent did not require further additions. Based on the current monthly sample, the levels of TiDG, Isopar™ L, MaxCalix, and modifier are sufficient for continuing operation but are expected to decrease with time. Periodic characterization and trimming additions to the solvent are recommended.

  15. Finding Biomarker Signatures in Pooled Sample Designs: A Simulation Framework for Methodological Comparisons

    Directory of Open Access Journals (Sweden)

    Anna Telaar

    2010-01-01

    Detection of discriminating patterns in gene expression data can be accomplished by using various methods of statistical learning. It has been proposed that sample pooling in this context would have negative effects; however, pooling cannot always be avoided. We propose a simulation framework to explicitly investigate the parameters of patterns, experimental design, noise, and choice of method in order to find out which effects on classification performance are to be expected. We use a two-group classification task and simulated gene expression data with independent differentially expressed genes as well as bivariate linear patterns and the combination of both. Our results show a clear increase in prediction error with pool size. For pooled training sets, powered partial least squares discriminant analysis outperforms discriminant analysis, random forests, and support vector machines with linear or radial kernels for two of three simulated scenarios. The proposed simulation approach can be implemented to systematically investigate a number of additional scenarios of practical interest.

  16. Results For The Third Quarter Calendar Year 2016 Tank 50H Salt Solution Sample

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, C. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-10-13

    In this memorandum, the chemical and radionuclide contaminant results from the Third Quarter Calendar Year 2016 (CY16) sample of Tank 50H salt solution are presented in tabulated form. The Third Quarter CY16 Tank 50H samples (a 200 mL sample obtained 6” below the surface (HTF-5-16-63) and a 1 L sample obtained 66” from the tank bottom (HTF-50-16-64)) were obtained on July 14, 2016 and received at Savannah River National Laboratory (SRNL) on the same day. Prior to obtaining the samples from Tank 50H, a single pump was run at least 4.4 hours, and the samples were pulled immediately after pump shut down. The information from this characterization will be used by Defense Waste Processing Facility (DWPF) & Saltstone Facility Engineering for the transfer of aqueous waste from Tank 50H to the Saltstone Production Facility, where the waste will be treated and disposed of in the Saltstone Disposal Facility. This memorandum compares results, where applicable, to Saltstone Waste Acceptance Criteria (WAC) limits and targets. Data pertaining to the regulatory limits for Resource Conservation and Recovery Act (RCRA) metals will be documented at a later time per the Task Technical and Quality Assurance Plan (TTQAP) for the Tank 50H saltstone task. The chemical and radionuclide contaminant results from the characterization of the Third Quarter CY16 sampling of Tank 50H were requested by Savannah River Remediation (SRR) personnel and details of the testing are presented in the SRNL TTQAP.

  17. Design Review Report for formal review of safety class features of exhauster system for rotary mode core sampling

    International Nuclear Information System (INIS)

    JANICEK, G.P.

    2000-01-01

    Report documenting the Formal Design Review conducted on portable exhausters used to support rotary mode core sampling of Hanford underground radioactive waste tanks, with focus on Safety Class design features and control requirements for flammable gas environment operation and air discharge permitting compliance.

  18. Design Review Report for formal review of safety class features of exhauster system for rotary mode core sampling

    Energy Technology Data Exchange (ETDEWEB)

    JANICEK, G.P.

    2000-06-08

    Report documenting the Formal Design Review conducted on portable exhausters used to support rotary mode core sampling of Hanford underground radioactive waste tanks, with focus on Safety Class design features and control requirements for flammable gas environment operation and air discharge permitting compliance.

  19. Practical experience applied to the design of injection and sample manifolds to perform in-place surveillance tests according to ANSI/ASME N-510

    Energy Technology Data Exchange (ETDEWEB)

    Banks, E.M.; Wikoff, W.O.; Shaffer, L.L. [NUCON International, Inc., Columbus, OH (United States)

    1997-08-01

    At the current level of maturity and experience in the nuclear industry regarding testing of air treatment systems, it is now possible to design and qualify injection and sample manifolds for most applications. While the qualification of sample manifolds is still in its infancy, injection manifolds have reached a mature stage that helps to eliminate the "hit or miss" type of design. During the design phase, manifolds can be adjusted to compensate for poor airflow distribution and laminar flow conditions, and to take advantage of any system attributes. Experience has shown that knowing the system attributes before the design phase begins is an essential element of a successful manifold design. The use of a spreadsheet-type program commonly found on most personal computers can afford greater flexibility and a reduction in time spent in the design phase. The experience gained from several generations of manifold design has culminated in a set of general design guidelines. Use of these guidelines, along with a good understanding of the type of testing (theoretical and practical), can result in a good manifold design requiring little or no field modification. The requirements for manifolds came about because of the use of multiple banks of components and unconventional housing inlet configurations. Multiple banks of adsorbers and pre- and post-HEPAs required that each bank be tested to ensure that it does not exceed a specific allowable leakage criterion. 5 refs., 5 figs., 1 tab.

  20. Results of the Washington Passive Solar Design/Build Competition

    Energy Technology Data Exchange (ETDEWEB)

    Nylen, N.

    1981-01-01

    In an effort to encourage the design, construction, and marketing of moderately priced passive solar homes in Washington state, the Western Solar Utilization Network (Western SUN) recently sponsored the Washington Passive Solar Design/Build Competition. The competition drew an overwhelming response from designers and builders throughout Washington. Thermal performance of the designs was evaluated by a technical review committee, and final selections were made by the Competition Jury in accordance with the following criteria: perceived market acceptance, thermal performance, cost effectiveness, simplicity of design and operation, and completeness of the passive concept. Design contract awards totaling $50,000 were made available to winners in four categories, including single and multi-family, new and remodeled residences. In order to receive the award in its entirety, winning design/build teams are required to construct their design by April, 1983. As a result of the competition, a great deal was learned about the attitudes and knowledge of professionals and the general public regarding the use of solar energy in Washington state. Among the points that will be highlighted in this paper are the following: (1) a design/build competition is an effective vehicle for promoting solar energy among professionals in the housing community as well as the general public; (2) passive solar techniques can contribute significantly to the heating and cooling needs of residential housing throughout the state of Washington; (3) there is a great deal of interest and talent among the designers and builders of solar residences in Washington; and (4) follow-up activities, including the promotion of winning designs, the systematic collection of performance data, and identification of the major obstacles confronting designers and builders of solar homes, are critical to the success of the program in achieving both its short-term and long-term goals.

  1. Assessing representativeness of sampling methods for reaching men who have sex with men: a direct comparison of results obtained from convenience and probability samples.

    Science.gov (United States)

    Schwarcz, Sandra; Spindler, Hilary; Scheer, Susan; Valleroy, Linda; Lansky, Amy

    2007-07-01

    Convenience samples are used to determine HIV-related behaviors among men who have sex with men (MSM) without measuring the extent to which the results are representative of the broader MSM population. We compared results from a cross-sectional survey of MSM recruited from gay bars between June and October 2001 to a random digit dial telephone survey conducted between June 2002 and January 2003. The men in the probability sample were older, better educated, and had higher incomes than men in the convenience sample; the convenience sample enrolled more employed men and men of color. Substance use around the time of sex was higher in the convenience sample, but other sexual behaviors were similar. HIV testing was common among men in both samples. Periodic validation, through comparison of data collected by different sampling methods, may be useful when relying on survey data for program and policy development.

  2. Estimation after classification using lot quality assurance sampling: corrections for curtailed sampling with application to evaluating polio vaccination campaigns.

    Science.gov (United States)

    Olives, Casey; Valadez, Joseph J; Pagano, Marcello

    2014-03-01

    To assess the bias incurred when curtailment of Lot Quality Assurance Sampling (LQAS) is ignored, to present unbiased estimators, to consider the impact of cluster sampling by simulation and to apply our method to published polio immunization data from Nigeria. We present estimators of coverage when using two kinds of curtailed LQAS strategies: semicurtailed and curtailed. We study the proposed estimators with independent and clustered data using three field-tested LQAS designs for assessing polio vaccination coverage, with samples of size 60 and decision rules of 9, 21 and 33, and compare them to biased maximum likelihood estimators. Lastly, we present estimates of polio vaccination coverage from previously published data in 20 local government authorities (LGAs) from five Nigerian states. Simulations illustrate substantial bias if one ignores the curtailed sampling design. Proposed estimators show no bias. Clustering does not affect the bias of these estimators. Across simulations, standard errors show signs of inflation as clustering increases. Neither sampling strategy nor LQAS design influences estimates of polio vaccination coverage in 20 Nigerian LGAs. When coverage is low, semicurtailed LQAS strategies considerably reduce the sample size required to make a decision. Curtailed LQAS designs further reduce the sample size when coverage is high. Results presented dispel the misconception that curtailed LQAS data are unsuitable for estimation. These findings augment the utility of LQAS as a tool for monitoring vaccination efforts by demonstrating that unbiased estimation using curtailed designs is not only possible, but that these designs also reduce the sample size. © 2014 John Wiley & Sons Ltd.
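The curtailment bias that these estimators correct can be reproduced with a small Monte Carlo sketch (generic Python, not the authors' code; the stopping convention below — accept once successes exceed the decision rule d, reject once failures reach n − d — is an assumption, with n = 60 and d = 33 taken from the field-tested designs above):

```python
import random

def curtailed_lqas_estimate(p_true, n=60, d=33, trials=20000, seed=1):
    """Simulate a curtailed LQAS survey: sampling stops as soon as the
    classification decision is determined, i.e. when successes exceed the
    decision rule d ("accept") or failures reach n - d ("reject").
    The naive estimate successes / observed is biased because the
    stopping time depends on the observed data."""
    random.seed(seed)
    total = 0.0
    for _ in range(trials):
        succ = fail = 0
        while succ <= d and fail < n - d:
            if random.random() < p_true:
                succ += 1
            else:
                fail += 1
        total += succ / (succ + fail)
    return total / trials

# Mean of the naive estimator under curtailment; compare with the
# true coverage of 0.70.
naive = curtailed_lqas_estimate(0.70)
print(naive)
```

Averaging the naive proportion over many curtailed surveys shows it drifts away from the true coverage, which is the bias the proposed estimators remove.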

  3. Results from tests of TFL Hydragard sampling loop

    International Nuclear Information System (INIS)

    Steimke, J.L.

    1995-03-01

    When the Defense Waste Processing Facility (DWPF) is operational, processed radioactive sludge will be transferred in batches to the Slurry Mix Evaporator (SME), where glass frit will be added and the contents concentrated by boiling. Batches of the slurry mixture are transferred from the SME to the Melter Feed Tank (MFT). Hydragard® sampling systems are used on the SME and the MFT for collecting slurry samples in vials for chemical analysis. An accurate replica of the Hydragard sampling system was built and tested in the Thermal Fluids Laboratory (TFL) to determine the Hydragard accuracy. It was determined that the original Hydragard valve frequently drew a non-representative sample stream through the sample vial that ranged from frit enriched to frit depleted. The Hydragard valve was modified by moving the plunger and its seat backwards so that the outer surface of the plunger was flush with the inside diameter of the transfer line when the valve was open. The slurry flowing through the vial then accurately represented the composition of the slurry in the reservoir for two types of slurries, different dilution factors, a range of transfer flows, and a range of vial flows. It was then found that the 15 ml of slurry left in the vial when the Hydragard valve was closed, which is what will be analyzed at DWPF, had a lower ratio of frit to sludge (as characterized by the lithium to iron ratio) than the slurry flowing through it. The reason for these differences is not understood at this time, but it is recommended that additional experimentation be performed with the TFL Hydragard loop to determine the cause.

  4. Solvent Hold Tank Sample Results for MCU-16-934-935-936: June 2016 Monthly Sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-08-30

    Savannah River National Laboratory (SRNL) received one set of Solvent Hold Tank (SHT) samples (MCU-16-934-935-936), pulled on 07/01/2016, for analysis. The samples were combined and analyzed for composition. Analysis of the composite sample MCU-16-934-935-936 indicated the Isopar™L concentration is above its nominal level (101%). The modifier (CS-7SB) and TiDG concentrations are 8% and 29% below their nominal concentrations, respectively. This analysis confirms the solvent may require the addition of TiDG, and possibly of modifier. Based on the current monthly sample, the levels of TiDG, Isopar™L, MaxCalix, and modifier are sufficient for continuing operation but are expected to decrease with time. Periodic characterization and trimming additions to the solvent are recommended. No impurities above the 1000 ppm level were found in this solvent by Semi-Volatile Organic Analysis (SVOA). No impurities were observed by hydrogen nuclear magnetic resonance (HNMR). However, up to 21.1 ± 4 micrograms of mercury per gram of solvent (or 17.5 μg/mL) was detected in this sample, as determined by XRF analysis of the undigested sample. The current gamma level (1.41E5 dpm/mL) confirmed that the gamma concentration has returned to the levels of the late 2015 samples, when the process operated normally and as expected.

  5. 60-Day waste compatibility safety issues and final results for AY-102 grab samples

    Energy Technology Data Exchange (ETDEWEB)

    Nuzum, J.L.

    1997-01-31

    Four grab samples (2AY-96-15, 2AY-96-16, 2AY-96-17, and 2AY-96-18) were taken from Riser 15D of Tank 241-AY-102 on October 8, 1996, and received by 222-S Laboratory on October 8, 1996. These samples were analyzed in accordance with Compatibility Grab Sampling and Analysis Plan (TSAP) and Data Quality Objectives for Tank Farms Waste Compatibility Program (DQO) in support of the Waste Compatibility Program. No notifications were required based on sample results.

  6. Design of cross-sensitive temperature and strain sensor based on sampled fiber grating

    Directory of Open Access Journals (Sweden)

    Zhang Xiaohang

    2017-02-01

    Full Text Available In this paper, a cross-sensitive temperature and strain sensor based on sampled fiber grating is designed. Its temperature measurement range is -50 to 200 °C, and the strain measurement range is 0 to 2000 με. The characteristics of the sensor are obtained using a simulation method. Utilizing SPSS software, we found the dual-parameter matrix equations for the measurement of temperature and strain, and calibrated the four sensing coefficients of the matrix equations.
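The dual-parameter measurement described above reduces to inverting a 2×2 matrix of calibrated sensing coefficients relating the two wavelength shifts to temperature and strain changes. A minimal sketch of the demodulation step (the coefficient values below are hypothetical placeholders, not the calibrated coefficients from this paper):

```python
import numpy as np

# Hypothetical sensing coefficients (pm/°C and pm/με); the four actual
# values must be calibrated for the specific sampled fiber grating.
K = np.array([[10.0, 1.2],    # peak 1: [K_T, K_eps]
              [ 6.5, 0.9]])   # peak 2: [K_T, K_eps]

def demodulate(d_lambda1, d_lambda2):
    """Recover (delta_T, delta_eps) from the two measured wavelength
    shifts by solving the 2x2 matrix equation K @ [dT, deps] = shifts."""
    dT, deps = np.linalg.solve(K, np.array([d_lambda1, d_lambda2]))
    return dT, deps

# Forward-simulate the shifts for dT = 50 °C and deps = 300 με, then invert.
shifts = K @ np.array([50.0, 300.0])
dT, deps = demodulate(*shifts)
print(round(float(dT), 6), round(float(deps), 6))
```

The inversion recovers ΔT = 50 °C and Δε = 300 με, provided the coefficient matrix is well conditioned (i.e., the two peaks respond differently enough to temperature and strain).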

  7. System design specification for rotary mode core sample trucks No. 2, 3, and 4 programmable logic controller

    International Nuclear Information System (INIS)

    Dowell, J.L.; Akers, J.C.

    1995-01-01

    The system this document describes controls several functions of the Core Sample Trucks used to obtain nuclear waste samples from various underground storage tanks at Hanford. The system will monitor the sampling process and provide alarms and other feedback to ensure the sampling process is performed within the prescribed operating envelope. The intended audience for this document is anyone associated with rotary or push mode core sampling. This document describes the alarm and control logic installed on Rotary Mode Core Sample Trucks (RMCST) #2, 3, and 4. It is intended to define the particular requirements of the RMCST alarm and control operation (not defined elsewhere) sufficiently for detailed design to implement on a Programmable Logic Controller (PLC).

  8. RANKED SET SAMPLING FOR ECOLOGICAL RESEARCH: ACCOUNTING FOR THE TOTAL COSTS OF SAMPLING

    Science.gov (United States)

    Researchers aim to design environmental studies that optimize precision and allow for generalization of results, while keeping the costs of associated field and laboratory work at a reasonable level. Ranked set sampling is one method to potentially increase precision and reduce ...
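The precision gain from ranked set sampling can be illustrated with a short simulation under perfect ranking (a generic sketch, not this study's method; the uniform population and set size of 3 are arbitrary choices for illustration):

```python
import random
import statistics

def srs_mean(pop_draw, n):
    """Mean of a simple random sample of n quantified units."""
    return statistics.mean(pop_draw() for _ in range(n))

def rss_mean(pop_draw, set_size):
    """One ranked-set-sampling cycle: draw set_size sets of set_size
    units, rank each set (here by the value itself, i.e. perfect
    ranking), and quantify only the i-th ranked unit of the i-th set."""
    vals = []
    for i in range(set_size):
        ranked_set = sorted(pop_draw() for _ in range(set_size))
        vals.append(ranked_set[i])
    return statistics.mean(vals)

random.seed(7)
draw = random.random  # Uniform(0, 1) stand-in for the population
m = 3
srs = [srs_mean(draw, m) for _ in range(5000)]
rss = [rss_mean(draw, m) for _ in range(5000)]
print(statistics.variance(srs), statistics.variance(rss))
```

With the same number of quantified (lab-analyzed) units, the RSS mean shows a markedly smaller variance than the SRS mean, which is the precision-versus-cost trade-off the abstract refers to; in practice the ranking step uses a cheap auxiliary variable rather than the measurement itself.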

  9. Balanced sampling

    NARCIS (Netherlands)

    Brus, D.J.

    2015-01-01

    In balanced sampling a linear relation between the soil property of interest and one or more covariates with known means is exploited in selecting the sampling locations. Recent developments make this sampling design attractive for statistical soil surveys. This paper introduces balanced sampling

  10. Considerations on sample holder design and custom-made non-polarizable electrodes for Spectral Induced Polarization measurements on unsaturated soils

    Science.gov (United States)

    Kaouane, C.; Chouteau, M. C.; Fauchard, C.; Cote, P.

    2014-12-01

    Spectral Induced Polarization (SIP) is a geophysical method sensitive to water content, saturation and grain size distribution. It could be used as an alternative to nuclear probes to assess the compaction of soils in road works. To evaluate the potential of SIP as a practical tool, we designed an experiment for complex conductivity measurements on unsaturated soil samples. The literature presents a large variety of sample holders and designs, each depending on the context. Although a precise description of a sample holder may be available, exact replication is not always possible. Furthermore, the potential measurements are often done using custom-made Ag/AgCl electrodes, and very few indications are given on their reliability over time and temperature. Our objective is to perform complex conductivity measurements on soil samples compacted in a PVC cylindrical mould (10 cm long, 5 cm diameter) according to geotechnical standards. To obtain a homogeneous current density, electrical current is transmitted through the sample via chambers filled with agar gel. Agar gel is a good non-polarizable conductor within the frequency range of interest (1 mHz to 20 kHz), but its electrical properties are not well characterized. We measured an increase in agar gel electrical conductivity over time and modelled the influence of this variation on the measurement; if the electrodes are located on the sample, the effect is minimized. Because of the dimensions at stake and the need for a simple design, the potential electrodes are located outside the sample, hence the gel contributes to the measurements. Since the gel is fairly conductive, we expect to overestimate the sample conductivity. The potential electrodes are non-polarizable Ag/AgCl electrodes. To avoid any leakage, the KCl solution in the electrodes is replaced by saturated KCl-agar gel. These electrodes are low cost and show a low, stable self-potential (<1 mV). In addition, the electrode fabrication technique is easily reproduced, and storage and maintenance are simple.

  11. A belt charging system for the Vivitron - design, early results

    International Nuclear Information System (INIS)

    Helleboid, J.M.; Gaudiot, G.

    1990-10-01

    A specific belt charging system has been designed, built, and assembled for the 35 MV Vivitron; a 100 m long belt is used. The main features of the design, the experimental studies, tests in a pilot machine, and the results of the very early tests of the real system are reviewed.

  12. Determination of Sr-90 in milk samples from the study of statistical results

    Directory of Open Access Journals (Sweden)

    Otero-Pazos Alberto

    2017-01-01

    Full Text Available The determination of 90Sr in milk samples is a main objective of radiation monitoring laboratories because of its environmental importance. In this paper the activity concentration of 39 milk samples was obtained through radiochemical separation based on selective retention of Sr in a cationic resin (Dowex 50WX8, 50-100 mesh) and subsequent determination by a low-level gas proportional counter. The results were checked by measuring the Sr concentration with the flame atomic absorption spectroscopy technique, to finally obtain the mass of 90Sr. From the data obtained, a statistical treatment was performed using linear regressions. A reliable estimate of the mass of 90Sr was obtained, first from the gravimetric technique and, second, from the counts per minute of the third measurement at the 90Sr/90Y equilibrium, without having to perform the full analysis. These estimates have been verified with 19 milk samples, obtaining overlapping results. The novelty of the manuscript is the possibility of determining the concentration of 90Sr in milk samples without the need to perform the third measurement at equilibrium.

  13. Shock-induced explosive chemistry in a deterministic sample configuration.

    Energy Technology Data Exchange (ETDEWEB)

    Stuecker, John Nicholas; Castaneda, Jaime N.; Cesarano, Joseph, III (,; ); Trott, Wayne Merle; Baer, Melvin R.; Tappan, Alexander Smith

    2005-10-01

    Explosive initiation and energy release have been studied in two sample geometries designed to minimize stochastic behavior in shock-loading experiments. These sample concepts include a design with explosive material occupying the hole locations of a close-packed bed of inert spheres and a design that utilizes infiltration of a liquid explosive into a well-defined inert matrix. Wave profiles transmitted by these samples in gas-gun impact experiments have been characterized by both velocity interferometry diagnostics and three-dimensional numerical simulations. Highly organized wave structures associated with the characteristic length scales of the deterministic samples have been observed. Initiation and reaction growth in an inert matrix filled with sensitized nitromethane (a homogeneous explosive material) result in wave profiles similar to those observed with heterogeneous explosives. Comparison of experimental and numerical results indicates that energetic material studies in deterministic sample geometries can provide an important new tool for validation of models of energy release in numerical simulations of explosive initiation and performance.

  14. The radiologic technologists' health study in South Korea. Study design and baseline results

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jin [Korea Univ. College of Medicine, Seoul (Korea, Republic of). Dept. of Preventive Medicine; Ha, Mina [Dankook Univ. College of Medicine, Cheonan (Korea, Republic of). Dept. of Preventive Medicine; Hwang, Seung-sik [Inha Univ. School of Medicine, Incheon (Korea, Republic of). Dept. of Social and Preventive Medicine; and others

    2015-08-15

    To describe the study design, methods, and baseline results of a prospective cohort of radiologic technologists that we have initiated in South Korea. The cohort participants were enrolled through a self-administered questionnaire survey conducted from April 2012 to May 2013. Survey data were linked with radiation dosimetry, cancer registry, and health insurance data by personal identification numbers. A nationwide representative survey was also conducted using a stratified random sampling design with face-to-face interviews. A total of 12,387 radiologic technologists were enrolled, accounting for approximately 63% of all diagnostic radiologic technologists working in South Korea. For the nationwide survey, 585 workers were interviewed using the detailed questionnaire, and buccal cells were collected by scraping the inside of the cheek. The majority of study subjects were male and under 50 years old. The average annual effective dose of radiation declined for both men (from 2.75 to 1.43 mSv) and women (from 1.34 to 0.95 mSv) over the period 1996-2011. A total of 99 cancers (66 in men and 33 in women) were reported from 1992 to 2010. The standardized incidence ratio for all cancers combined was significantly lower in men (SIR = 0.75, 95% CI 0.58-0.96) than in the general population, but the ratios for thyroid cancer were significantly higher than expected among both men and women. This cohort provides comprehensive information on the work activities and health status of diagnostic radiologic technologists. In addition, the nationwide representative sample provides unique opportunities compared with previous radiologic technologist studies.

  15. Results of Remediation and Verification Sampling for the 600-270 Horseshoe Landfill

    Energy Technology Data Exchange (ETDEWEB)

    W. S. Thompson

    2005-12-14

    This report presents the results of the 2005 remedial action and verification soil sampling conducted at the 600-270 waste site after removal of soil containing residual concentrations of dichlorodiphenyl trichloroethane (DDT) and its breakdown products dichlorodiphenyl dichloroethylene and dichlorodiphenyl dichloroethane. The remediation was performed in response to post-closure surface soil sampling performed between 1998 and 2003 that indicated the presence of residual DDT contamination exceeding the 1 mg/kg cleanup criterion established for the original 1994 cleanup activities in the Record of Decision for the 1100 Area National Priorities List site.

  16. Results of Remediation and Verification Sampling for the 600-270 Horseshoe Landfill

    International Nuclear Information System (INIS)

    Thompson, W.S.

    2005-01-01

    This report presents the results of the 2005 remedial action and verification soil sampling conducted at the 600-270 waste site after removal of soil containing residual concentrations of dichlorodiphenyl trichloroethane (DDT) and its breakdown products dichlorodiphenyl dichloroethylene and dichlorodiphenyl dichloroethane. The remediation was performed in response to post-closure surface soil sampling performed between 1998 and 2003 that indicated the presence of residual DDT contamination exceeding the 1 mg/kg cleanup criterion established for the original 1994 cleanup activities in the Record of Decision for the 1100 Area National Priorities List site.

  17. An R package for spatial coverage sampling and random sampling from compact geographical strata by k-means

    NARCIS (Netherlands)

    Walvoort, D.J.J.; Brus, D.J.; Gruijter, de J.J.

    2010-01-01

    Both for mapping and for estimating spatial means of an environmental variable, the accuracy of the result will usually be increased by dispersing the sample locations so that they cover the study area as uniformly as possible. We developed a new R package for designing spatial coverage samples for
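The idea behind spatial coverage sampling by k-means can be sketched without the R package itself: cluster the coordinates of the grid cells discretising the study area and use the cluster centroids (or one random cell per cluster) as sample locations. A minimal Python version of the principle (an illustration, not the package's implementation):

```python
import numpy as np

def spatial_coverage_sample(cells, k, iters=50, seed=0):
    """Plain k-means (Lloyd's algorithm) on the coordinates of the grid
    cells discretising the study area.  The final centroids are spatially
    well-spread sampling locations; alternatively, drawing one random
    cell per cluster gives random sampling from compact geographical
    strata."""
    rng = np.random.default_rng(seed)
    centroids = cells[rng.choice(len(cells), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(cells[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centroids; keep the old one if a cluster goes empty.
        centroids = np.array([cells[labels == j].mean(axis=0)
                              if np.any(labels == j) else centroids[j]
                              for j in range(k)])
    return centroids, labels

# Square study area discretised into a 20 x 20 grid of cell centres.
xs, ys = np.meshgrid(np.arange(20) + 0.5, np.arange(20) + 0.5)
cells = np.column_stack([xs.ravel(), ys.ravel()])
centroids, labels = spatial_coverage_sample(cells, k=8)
print(centroids.round(1))
```

The centroids minimise the mean squared distance from grid cells to their nearest sampling location, which is what disperses the sample uniformly over the area.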

  18. Design, Results and Plans for Power Beaming Competitive Challenge

    International Nuclear Information System (INIS)

    Shelef, Ben

    2008-01-01

    In our context, Power Beaming refers to the extraction of useable electrical power from a directed electromagnetic beam. In order to promote interest in this technology, the Spaceward Foundation proposed and is managing a technology prize challenge based on a Space Elevator design scenario. The challenge has a prize purse of $2M, provided by NASA's Centennial Challenges office. This paper covers the considerations that went into the design of the challenge, a brief chronology of past results, and plans for the future

  19. Understanding the cluster randomised crossover design: a graphical illustration of the components of variation and a sample size tutorial.

    Science.gov (United States)

    Arnup, Sarah J; McKenzie, Joanne E; Hemming, Karla; Pilcher, David; Forbes, Andrew B

    2017-08-15

    In a cluster randomised crossover (CRXO) design, a sequence of interventions is assigned to a group, or 'cluster', of individuals. Each cluster receives each intervention in a separate period of time, forming 'cluster-periods'. Sample size calculations for CRXO trials need to account for both the cluster randomisation and crossover aspects of the design. Formulae are available for the two-period, two-intervention, cross-sectional CRXO design; however, implementation of these formulae is known to be suboptimal. The aims of this tutorial are to illustrate the intuition behind the design and to provide guidance on performing sample size calculations. Graphical illustrations are used to describe the effect of the cluster randomisation and crossover aspects of the design on the correlation between individual responses in a CRXO trial. Sample size calculations for binary and continuous outcomes are illustrated using parameters estimated from the Australia and New Zealand Intensive Care Society - Adult Patient Database (ANZICS-APD) for patient mortality and length of stay (LOS). The similarity between individual responses in a CRXO trial can be understood in terms of three components of variation: variation in the cluster mean response; variation in the cluster-period mean response; and variation between individual responses within a cluster-period; or equivalently in terms of the correlation between individual responses in the same cluster-period (within-cluster within-period correlation, WPC) and between individual responses in the same cluster but in different periods (within-cluster between-period correlation, BPC). The BPC lies between zero and the WPC. When the WPC and BPC are equal, the precision gained by the crossover aspect of the CRXO design equals the precision lost by cluster randomisation. When the BPC is zero, there is no advantage of a CRXO over a parallel-group cluster randomised trial.
Sample size calculations illustrate that small changes in the specification of
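The role of the WPC and BPC in the sample size calculation can be sketched with the design effect commonly used for the two-period cross-sectional CRXO design, 1 + (m − 1)·WPC − m·BPC, where m is the number of individuals per cluster-period (a simplified illustration, not the tutorial's full procedure):

```python
from math import ceil

def crxo_sample_size(n_individual, m, wpc, bpc):
    """Total sample size for a two-period cross-sectional cluster
    randomised crossover trial: inflate the individually randomised
    parallel-group sample size n_individual by the design effect
    1 + (m - 1)*WPC - m*BPC, where m is the cluster-period size."""
    if not 0 <= bpc <= wpc < 1:
        raise ValueError("require 0 <= BPC <= WPC < 1")
    deff = 1 + (m - 1) * wpc - m * bpc
    # Round before ceil to guard against floating-point noise.
    return ceil(round(n_individual * deff, 6))

# BPC = 0: no crossover benefit -- same inflation as a parallel cluster trial.
print(crxo_sample_size(1000, m=50, wpc=0.05, bpc=0.0))   # deff = 3.45 -> 3450
# BPC close to WPC: much of the clustering penalty is recovered.
print(crxo_sample_size(1000, m=50, wpc=0.05, bpc=0.04))  # deff = 1.45 -> 1450
```

The two limiting cases mirror the abstract: a BPC of zero removes any advantage over a parallel-group cluster trial, while a BPC approaching the WPC recovers most of the precision lost to cluster randomisation.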

  20. Analysis of Clinical Cohort Data Using Nested Case-control and Case-cohort Sampling Designs. A Powerful and Economical Tool.

    Science.gov (United States)

    Ohneberg, K; Wolkewitz, M; Beyersmann, J; Palomar-Martinez, M; Olaechea-Astigarraga, P; Alvarez-Lerma, F; Schumacher, M

    2015-01-01

    Sampling from a large cohort in order to derive a subsample that would be sufficient for statistical analysis is a frequently used method for handling large data sets in epidemiological studies with limited resources for exposure measurement. For clinical studies, however, when interest is in the influence of a potential risk factor, cohort studies are often the first choice, with all individuals entering the analysis. Our aim is to close the gap between epidemiological and clinical studies with respect to design and power considerations. Schoenfeld's formula for the number of events required for a Cox proportional hazards model is fundamental. Our objective is to compare the power of analyzing the full cohort with the power of a nested case-control and a case-cohort design. We compare formulas for power for sampling designs and cohort studies. In our data example we simultaneously apply a nested case-control design with a varying number of controls matched to each case, a case-cohort design with varying subcohort size, a random subsample, and a full cohort analysis. For each design we calculate the standard error for estimated regression coefficients and the mean number of distinct persons for whom covariate information is required. The formula for the power of a nested case-control design and the power of a case-cohort design is directly connected to the power of a cohort study using the well-known Schoenfeld formula. The loss in precision of parameter estimates is relatively small compared to the saving in resources. Nested case-control and case-cohort studies, but not random subsamples, yield an attractive alternative for analyzing clinical studies in the situation of a low event rate. Power calculations can be conducted straightforwardly to quantify the loss of power compared to the savings in the number of patients when using a sampling design instead of analyzing the full cohort.
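Schoenfeld's formula, which the comparison above builds on, is straightforward to compute. The sketch below also notes the commonly quoted (m + 1)/m event inflation for a nested case-control design with m controls per case; that approximation is stated here as an assumption, not as the authors' exact calculation:

```python
from math import ceil, log
from statistics import NormalDist

def schoenfeld_events(hazard_ratio, alpha=0.05, power=0.80, p_exposed=0.5):
    """Schoenfeld's formula: number of events required to detect a given
    hazard ratio for a binary covariate in a Cox proportional hazards
    model, at two-sided significance level alpha."""
    z = NormalDist().inv_cdf
    numerator = (z(1 - alpha / 2) + z(power)) ** 2
    denominator = p_exposed * (1 - p_exposed) * log(hazard_ratio) ** 2
    return ceil(numerator / denominator)

print(schoenfeld_events(2.0))  # 66 events for HR = 2 at 80% power

# Approximate inflation for a nested case-control design with m controls
# per case: roughly (m + 1) / m times as many events are needed.
m = 4
print(ceil(schoenfeld_events(2.0) * (m + 1) / m))
```

With several controls per case the inflation factor is close to 1, which is the "small loss in precision compared to the saving in resources" the abstract reports.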

  1. Tank farms backlog soil sample and analysis results supporting a contained-in determination

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, C.L., Fluor Daniel Hanford

    1997-02-27

    Soil waste is generated from Tank Farms and associated Tank Farms facilities operations. The soil is a mixed waste because it is an environmental medium which contains tank waste, a listed mixed waste. The soil is designated with the listed waste codes (F001 through F005) which have been applied to all tank wastes. The scope of this report includes Tank Farms soil managed under the Backlog program. The Backlog Tank Farm soil in storage consists of drums and 5 boxes (originally 828 drums). The Backlog Waste Program dealt with 2276 containers of solid waste generated by Tank Farms operations during the time period from 1989 through early 1993. The containers were mismanaged by being left in the field for an extended period of time without being placed into permitted storage. As a corrective action for this situation, these containers were placed in interim storage at the Central Waste Complex (CWC) pending additional characterization. The Backlog Waste Analysis Plan (BWAP) (RL 1993) was written to define how Backlog wastes would be evaluated for proper designation and storage. The BWAP was approved in August 1993, and all work required by the BWAP was completed by July 1994. This document presents results of testing performed in 1992 and 1996 that support the attainment of a Contained-In Determination for Tank Farm Backlog soils. The analytical data contained in this report are evaluated against a prescribed decision rule. If the decision rule is satisfied, then the Washington State Department of Ecology (Ecology) may grant a Contained-In Determination. A Contained-In Determination for disposal to an unlined burial trench will be requested from Ecology. The decision rule and testing requirements provided by Ecology are described in the Tank Farms Backlog Soil Sample Analysis Plan (SAP) (WHC 1996).

  2. Results of initial analyses of the salt (macro) batch 11 Tank 21H qualification samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-10-23

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of qualification of Interim Salt Disposition Project (ISDP) Salt (Macro) Batch 11 for processing through the Actinide Removal Process (ARP) and the Modular Caustic-Side Solvent Extraction Unit (MCU). This document reports the initial results of the analyses of samples of Tank 21H. Analysis of the Tank 21H Salt (Macro) Batch 11 composite sample indicates that the material does not display any unusual characteristics or observations, such as floating solids, the presence of large amounts of solids, or unusual colors. Further sample results will be reported in a future document. This memo satisfies part of Deliverable 3 of the Technical Task Request (TTR).

  3. Will the alphabet soup of design criteria affect discrete choice experiment results?

    DEFF Research Database (Denmark)

    Olsen, Søren Bøye; Meyerhoff, Jürgen

    2017-01-01

    Every discrete choice experiment needs one, but the impacts of a statistical design on the results are still not well understood. Comparative studies have found that efficient designs outperform especially orthogonal designs. What has been little studied is whether efficient designs come at a cos...

  4. 45-Day safety screen results for Tank 241-C-101, auger sample 95-AUG-019

    International Nuclear Information System (INIS)

    Sasaki, L.M.

    1995-01-01

    One auger sample from Tank 241-C-101 was received by the 222-S Laboratory and underwent safety screening analyses--differential scanning calorimetry (DSC), thermogravimetric analysis (TGA), and total alpha analysis--in accordance with the tank characterization plan. Analytical results for the TGA on the crust sample (the uppermost portion of the auger sample) (sample number S95T000823) were less than the safety screening notification limit of 17 weight percent water. Verbal and written notifications were made on May 3, 1995. No exotherms were observed in the DSC analyses and the total alpha results were well below the safety screening notification limit. This report includes the primary safety screening results obtained from the analyses and copies of all DSC and TGA raw data scans as requested per the TCP. Although not included in this report, a photograph of the extruded sample was taken and is available. This report also includes bulk density measurements required by Characterization Plant Engineering. Additional analyses (pH, total organic carbon, and total inorganic carbon) are being performed on the drainable liquid at the request of Characterization Process Control; these analyses will be reported at a later date in a final report for this auger sample. Tank C-101 is not part of any of the four Watch Lists

  5. Solvent Hold Tank Sample Results for MCU-16-596-597-598: April 2016 Monthly Sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL). Advanced Characterization and Processing; Jones, D. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL). Research Support

    2016-07-12

    Savannah River National Laboratory (SRNL) received one set of Solvent Hold Tank (SHT) samples (MCU-16-596-597-598), pulled on 04/30/2016 for analysis. The samples were combined and analyzed for composition. Analysis of the composite sample MCU-16-596-597-598 indicated the Isopar™L concentration is above its nominal level (102%). The modifier (CS-7SB) is 14% below its nominal concentration, while the TiDG and MaxCalix concentrations are at and above their nominal concentrations, respectively. This analysis confirms the solvent may require the addition of modifier. Based on the current monthly sample, the levels of TiDG, Isopar™L, MaxCalix, and modifier are sufficient for continuing operation but are expected to decrease with time. Periodic characterization and trimming additions to the solvent are recommended.

  6. Cast Stone Oxidation Front Evaluation: Preliminary Results For Samples Exposed To Moist Air

    International Nuclear Information System (INIS)

    Langton, C. A.; Almond, P. M.

    2013-01-01

    The rate of oxidation is important to the long-term performance of reducing salt waste forms because the solubility of some contaminants, e.g., technetium, is a function of oxidation state. TcO4- in the salt solution is reduced to Tc(IV) and has been shown to react with ingredients in the waste form to precipitate low-solubility sulfide and/or oxide phases. Upon exposure to oxygen, the compounds containing Tc(IV) oxidize to the pertechnetate ion, Tc(VII)O4-, which is very soluble. Consequently, the rate of technetium oxidation front advancement into a monolith and the technetium leaching profile as a function of depth from an exposed surface are important to waste form performance and groundwater concentration predictions. An approach for measuring contaminant oxidation rate (effective contaminant-specific oxidation rate) based on leaching of select contaminants of concern is described in this report. In addition, the relationship between reduction capacity and contaminant oxidation is addressed. Chromate (Cr(VI)) was used as a non-radioactive surrogate for pertechnetate, Tc(VII), in Cast Stone samples prepared with 5 M Simulant. Cast Stone spiked with pertechnetate was also prepared and tested. Depth-discrete subsamples spiked with Cr were cut from Cast Stone exposed to Savannah River Site (SRS) outdoor ambient temperature fluctuations and moist air. Depth-discrete subsamples spiked with Tc-99 were cut from Cast Stone exposed to laboratory ambient temperature fluctuations and moist air. Similar conditions are expected to be encountered in the Cast Stone curing container. The leachability of Cr and Tc-99 and the reduction capacities, measured by the Angus-Glasser method, were determined for each subsample as a function of depth from the exposed surface. The results obtained to date were focused on continued method development; they are preliminary and apply to the sample composition and curing/exposure conditions described in this report.
The Cr oxidation front

  7. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
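    The simplest case the book's subject covers, estimating a proportion to a given margin of error with an optional finite population correction, can be sketched with the standard textbook formulas (the function below is illustrative and not taken from this volume):

```python
import math

def sample_size_proportion(p, margin, z=1.96, population=None):
    """Sample size to estimate a proportion p within +/- margin at ~95%
    confidence; applies the finite population correction when a
    population size is given."""
    n0 = z**2 * p * (1 - p) / margin**2          # infinite-population size
    if population is None:
        return math.ceil(n0)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

print(sample_size_proportion(0.5, 0.05))                  # worst-case p: 385
print(sample_size_proportion(0.5, 0.05, population=2000)) # smaller with FPC
```

    Taking p = 0.5 maximizes p(1 − p) and therefore gives the most conservative sample size when the proportion is unknown in advance.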

  8. Tank 101-SY Window E core sample: Interpretation of results

    International Nuclear Information System (INIS)

    Reynolds, D.A.

    1993-02-01

    A full depth core sample was taken from tank 241-SY-101 in December 1991, during a time period called ''Window E.'' This was the second full depth core sample from this tank during the year. The core had two major portions, known as the convective zone and the nonconvective zone. A crust was on the top of the tank but was poorly sampled. The analysis of the Window E core sample stressed segment-composite chemical analysis instead of segment-by-segment analysis as in Window C. Adiabatic calorimetry on samples from both cores showed a slow self-heating reaction above 150 °C on dried samples. The exothermic events were milder than those in similar synthetic samples. The chemical and physical properties complemented the information from Window C. The Window E material from the convective zone was more viscous than the Window C convective zone material. The nonconvective zone viscosities were similar for both cores. Heating and dilution tests were made to test mitigation concepts

  9. Tank 101-SY Window E core sample: Interpretation of results

    Energy Technology Data Exchange (ETDEWEB)

    Reynolds, D.A.

    1993-02-01

    A full depth core sample was taken from tank 241-SY-101 in December 1991, during a time period called ''Window E.'' This was the second full depth core sample from this tank during the year. The core had two major portions, known as the convective zone and the nonconvective zone. A crust was on the top of the tank but was poorly sampled. The analysis of the Window E core sample stressed segment-composite chemical analysis instead of segment-by-segment analysis as in Window C. Adiabatic calorimetry on samples from both cores showed a slow self-heating reaction above 150 °C on dried samples. The exothermic events were milder than those in similar synthetic samples. The chemical and physical properties complemented the information from Window C. The Window E material from the convective zone was more viscous than the Window C convective zone material. The nonconvective zone viscosities were similar for both cores. Heating and dilution tests were made to test mitigation concepts.

  10. Predictive Sampling of Rare Conformational Events in Aqueous Solution: Designing a Generalized Orthogonal Space Tempering Method.

    Science.gov (United States)

    Lu, Chao; Li, Xubin; Wu, Dongsheng; Zheng, Lianqing; Yang, Wei

    2016-01-12

    In aqueous solution, solute conformational transitions are governed by intimate interplays of the fluctuations of solute-solute, solute-water, and water-water interactions. To promote molecular fluctuations to enhance sampling of essential conformational changes, a common strategy is to construct an expanded Hamiltonian through a series of Hamiltonian perturbations and thereby broaden the distribution of certain interactions of focus. Due to a lack of active sampling of the configuration response to Hamiltonian transitions, it is challenging for common expanded Hamiltonian methods to robustly explore solvent-mediated rare conformational events. The orthogonal space sampling (OSS) scheme, as exemplified by the orthogonal space random walk and orthogonal space tempering methods, provides a general framework for synchronous acceleration of slow configuration responses. To more effectively sample conformational transitions in aqueous solution, in this work we devised a generalized orthogonal space tempering (gOST) algorithm. Specifically, in the Hamiltonian perturbation part, a solvent-accessible-surface-area-dependent term is introduced to implicitly perturb near-solute water-water fluctuations; more importantly, in the orthogonal space response part, the generalized force order parameter is extended to a two-dimensional order parameter set, in which essential solute-solvent and solute-solute components are treated separately. The gOST algorithm is evaluated through a molecular dynamics simulation study on the explicitly solvated deca-alanine (Ala10) peptide. On the basis of a fully automated sampling protocol, the gOST simulation enabled repetitive folding and unfolding of the solvated peptide within a single continuous trajectory and allowed for detailed construction of Ala10 folding/unfolding free energy surfaces. The gOST result reveals that solvent cooperative fluctuations play a pivotal role in Ala10 folding/unfolding transitions. In addition, our assessment

  11. Sample results from the interim salt disposition program macrobatch 9 tank 21H qualification samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-11-01

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of qualification of Macrobatch (Salt Batch) 9 for the Interim Salt Disposition Program (ISDP). This document reports characterization data on the samples of Tank 21H.

  12. Blood gas sample spiking with total parenteral nutrition, lipid emulsion, and concentrated dextrose solutions as a model for predicting sample contamination based on glucose result.

    Science.gov (United States)

    Jara-Aguirre, Jose C; Smeets, Steven W; Wockenfus, Amy M; Karon, Brad S

    2018-05-01

    Evaluate the effects of blood gas sample contamination with total parenteral nutrition (TPN)/lipid emulsion and dextrose 50% (D50) solutions on blood gas and electrolyte measurement; and determine whether glucose concentration can predict blood gas sample contamination with TPN/lipid emulsion or D50. Residual lithium heparin arterial blood gas samples were spiked with TPN/lipid emulsion (0 to 15%) and D50 solutions (0 to 2.5%). Blood gas (pH, pCO2, pO2), electrolytes (Na+, K+, ionized calcium) and hemoglobin were measured with a Radiometer ABL90. Glucose concentration was measured in separated plasma on a Roche Cobas c501. A chart review of neonatal blood gas results with glucose >300 mg/dL (>16.65 mmol/L) over a seven-month period was performed to determine whether repeat (within 4 h) blood gas results suggested pre-analytical errors in blood gas results. Results were used to determine whether a glucose threshold could predict contamination resulting in blood gas and electrolyte results with greater than laboratory-defined allowable error. Samples spiked with 5% or more TPN/lipid emulsion solution or 1% D50 showed glucose concentrations >500 mg/dL (>27.75 mmol/L) and produced blood gas (pH, pO2, pCO2) results with greater than laboratory-defined allowable error. TPN/lipid emulsion, but not D50, produced greater than allowable error in electrolyte (Na+, K+, Ca++, Hb) results at these concentrations. Based on a chart review of 144 neonatal blood gas results with glucose >250 mg/dL received over seven months, four of ten neonatal intensive care unit (NICU) patients with glucose results >500 mg/dL and repeat blood gas results within 4 h had results highly suggestive of pre-analytical error. Only 3 of 36 NICU patients with glucose results of 300-500 mg/dL and repeat blood gas results within 4 h had clear pre-analytical errors in blood gas results. Glucose concentration can be used as an indicator of significant blood sample contamination with either TPN

  13. Trade-off results and preliminary designs of Near-Term Hybrid Vehicles

    Science.gov (United States)

    Sandberg, J. J.

    1980-01-01

    Phase I of the Near-Term Hybrid Vehicle Program involved the development of preliminary designs of electric/heat engine hybrid passenger vehicles. The preliminary designs were developed on the basis of mission analysis, performance specification, and design trade-off studies conducted independently by four contractors. The resulting designs involve parallel hybrid (heat engine/electric) propulsion systems with significant variation in component selection, power train layout, and control strategy. Each of the four designs is projected by its developer as having the potential to substitute electrical energy for 40% to 70% of the petroleum fuel consumed annually by its conventional counterpart.

  14. Results of Self-Absorption Study on the Versapor 3000 Filters for Radioactive Particulate Air Sampling

    International Nuclear Information System (INIS)

    Barnett, J.M.

    2008-01-01

    Since the mid-1980s the Pacific Northwest National Laboratory (PNNL) has used a value of 0.85 as a correction factor for the self-absorption of activity of particulate radioactive air samples. More recently, an effort was made to evaluate the current particulate radioactive air sample filters (Versapor® 3000) used at PNNL for self-absorption effects. Two methods were used in the study: (1) comparing the radioactivity concentration obtained by direct gas-flow proportional counting of the filter to the results obtained after acid digestion of the filter and counting again by gas-flow proportional detection, and (2) evaluating sample filters by high-resolution visual/infrared microscopy to determine the depth of material loading on or in the filter fiber material. Sixty samples were selected from the archive for acid digestion in the first method, and about 30 samples were selected for high-resolution visual/infrared microscopy. Mass loading effects were also considered. From the sample filter analysis, a large error is associated with the average self-absorption factor; however, when the data are compared directly one-to-one, there appears to be good statistical correlation between the two analytical methods. The mass loading of the filters evaluated was <0.2 mg cm-2 and was also compared against other published results. The microscopy analysis shows that the sample material remains on top of the filter paper and does not embed into the filter media. Results of the microscopy evaluation lead to the conclusion that there is no mechanism for significant self-absorption. The overall conclusion is that self-absorption is not a significant factor in the analysis of filters used at PNNL for radioactive air stack sampling of radionuclide particulates and that an applied correction factor is conservative in determining overall sample activity. A new self-absorption factor of 1.0 is recommended

  15. Design-based Sample and Probability Law-Assumed Sample: Their Role in Scientific Investigation.

    Science.gov (United States)

    Ojeda, Mario Miguel; Sahai, Hardeo

    2002-01-01

    Discusses some key statistical concepts in probabilistic and non-probabilistic sampling to provide an overview for understanding the inference process. Suggests a statistical model constituting the basis of statistical inference and provides a brief review of the finite population descriptive inference and a quota sampling inferential theory.…

  16. Development of a sampling strategy and sample size calculation to estimate the distribution of mammographic breast density in Korean women.

    Science.gov (United States)

    Jun, Jae Kwan; Kim, Mi Jin; Choi, Kui Son; Suh, Mina; Jung, Kyu-Won

    2012-01-01

    Mammographic breast density is a known risk factor for breast cancer. To conduct a survey to estimate the distribution of mammographic breast density in Korean women, appropriate sampling strategies for representative and efficient sampling design were evaluated through simulation. Using the target population from the National Cancer Screening Programme (NCSP) for breast cancer in 2009, we verified the distribution estimate by repeating the simulation 1,000 times using stratified random sampling to investigate the distribution of breast density of 1,340,362 women. According to the simulation results, using a sampling design stratifying the nation into three groups (metropolitan, urban, and rural), with a total sample size of 4,000, we estimated the distribution of breast density in Korean women at a level of 0.01% tolerance. Based on the results of our study, a nationwide survey for estimating the distribution of mammographic breast density among Korean women can be conducted efficiently.
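    The repeated-simulation idea behind such a design can be sketched as follows; the stratum sizes and density probabilities are invented for illustration and are not the NCSP data:

```python
import random

random.seed(42)

# Hypothetical population: stratum -> (women in stratum, P(dense breast))
strata = {
    "metropolitan": (600_000, 0.45),
    "urban":        (500_000, 0.40),
    "rural":        (240_000, 0.30),
}
total = sum(n for n, _ in strata.values())

def stratified_estimate(sample_size):
    """Draw a proportionally allocated stratified sample and return the
    weighted estimate of the proportion of women with dense breasts."""
    est = 0.0
    for name, (n, p_dense) in strata.items():
        n_h = round(sample_size * n / total)           # proportional allocation
        hits = sum(random.random() < p_dense for _ in range(n_h))
        est += (n / total) * (hits / n_h)              # stratum weight x mean
    return est

# Repeat the simulation to see how tightly a 4,000-woman sample pins down
# the population value (about 0.4045 for these invented numbers).
estimates = [stratified_estimate(4000) for _ in range(200)]
print(f"spread over 200 simulations: {max(estimates) - min(estimates):.4f}")
```

    Repeating the draw many times, as in the study, shows directly whether the chosen stratification and sample size keep the estimate within the desired tolerance.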

  17. Design and field results of a walk-through EDS

    Science.gov (United States)

    Wendel, Gregory J.; Bromberg, Edward E.; Durfee, Memorie K.; Curby, William A.

    1997-01-01

    A walk-through portal sampling module which incorporates active sampling has been developed. The module uses opposing wands which actively brush the subject's exterior clothing to disturb explosive traces. These traces are entrained in an air stream and transported to a high-speed GC-chemiluminescence explosives detection system. This combination provides automatic screening of passengers at rates of 10 per minute. The system exhibits sensitivity and selectivity which equal or better those available from commercially available manual equipment. The system has been developed for deployment at border crossings, airports and other security screening points. Detailed results of laboratory tests and airport field trials are reviewed.

  18. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.

    Science.gov (United States)

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils

    2017-09-15

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association between antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period when the actual data collection was carried out. In the cross-sectional study, samples were taken from 681 Danish pig farms during five weeks from February to March 2015.
The evaluation showed that the sampling

  19. Solvent hold tank sample results for MCU-16-1363-1364-1365: November 2016 monthly sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-06-16

    Savannah River National Laboratory (SRNL) received one set of three Solvent Hold Tank (SHT) samples (MCU-16-1363-1364-1365), pulled on 11/15/2016 for analysis. The samples were combined and analyzed for composition. Analysis of the composite sample MCU-16-1363-1364-1365 indicated the Isopar™L concentration is at its nominal level (100%). The extractant (MaxCalix) and the modifier (CS-7SB) are 8% and 2% below their nominal concentrations. The suppressor (TiDG) is 7% below its nominal concentration. A summary of the concentration of the relevant solvent components is shown below.

  20. Solvent hold tank sample results for MCU-16-1317-1318-1319: September 2016 monthly sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-06-16

    Savannah River National Laboratory (SRNL) received one set of three Solvent Hold Tank (SHT) samples (MCU-16-1317-1318-1319), pulled on 09/12/2016 for analysis. The samples were combined and analyzed for composition. Analysis of the composite sample MCU-16-1317-1318-1319 indicated the Isopar™L concentration is above its nominal level (102%). The extractant (MaxCalix) and the modifier (CS-7SB) are 5% and 9% below their nominal concentrations. The suppressor (TiDG) is 76% below its nominal concentration. A summary of the concentration of the relevant solvent components is shown below.

  1. Solvent hold tank sample results for MCU-16-1247-1248-1249: August 2016 monthly sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-06-16

    Savannah River National Laboratory (SRNL) received one set of Solvent Hold Tank (SHT) samples (MCU-16-1247-1248-1249), pulled on 08/22/2016 for analysis. The samples were combined and analyzed for composition. Analysis of the composite sample MCU-16-1247-1248-1249 indicated the Isopar™L concentration is above its nominal level (101%). The extractant (MaxCalix) and the modifier (CS-7SB) are 7% and 9% below their nominal concentrations. The suppressor (TiDG) is 63% below its nominal concentration. A summary of the concentration of the relevant solvent components is shown below.

  2. What Makes Professional Development Effective? Results from a National Sample of Teachers.

    Science.gov (United States)

    Garet, Michael S.; Porter, Andrew C.; Desimone, Laura; Birman, Beatrice F.; Yoon, Kwang Suk

    2001-01-01

    Used a national probability sample of 1,027 mathematics and science teachers to provide a large-scale empirical comparison of effects of different characteristics of professional development on teachers' learning. Results identify three core features of professional development that have significant positive effects on teachers' self-reported…

  3. Sample size re-assessment leading to a raised sample size does not inflate type I error rate under mild conditions.

    Science.gov (United States)

    Broberg, Per

    2013-07-19

    One major concern with adaptive designs, such as sample size adjustable designs, has been the fear of inflating the type I error rate. In (Stat Med 23:1023-1038, 2004) it is, however, proven that when observations follow a normal distribution and the interim result shows promise, meaning that the conditional power exceeds 50%, the type I error rate is protected. This bound and the distributional assumptions may seem to impose undesirable restrictions on the use of these designs. In (Stat Med 30:3267-3284, 2011) the possibility of going below 50% is explored, and a region that permits an increased sample size without inflation is defined in terms of the conditional power at the interim. A criterion which is implicit in (Stat Med 30:3267-3284, 2011) is derived by elementary methods and expressed in terms of the test statistic at the interim to simplify practical use. Mathematical and computational details concerning this criterion are exhibited. Under very general conditions the type I error rate is preserved under sample size adjustable schemes that permit a raise. The main result states that, for normally distributed observations, raising the sample size when the result looks promising, where the definition of promising depends on the amount of knowledge gathered so far, guarantees the protection of the type I error rate. Also, in the many situations where the test statistic approximately follows a normal law, the deviation from the main result remains negligible. This article provides details regarding the Weibull and binomial distributions and indicates how one may approach these distributions within the current setting. There is thus reason to consider such designs more often, since they offer a means of adjusting an important design feature at little or no cost in terms of error rate.
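    The main result can be probed with a small Monte Carlo sketch: simulate under the null hypothesis, raise the second-stage sample size only when the interim z-statistic looks promising, and apply the naive pooled z-test at the end. All numbers are illustrative, and the interim threshold of 1.0 is a crude stand-in for the conditional-power condition (for these sizes, 50% conditional power with the raised total corresponds to an interim z of about 0.98):

```python
import math
import random

random.seed(1)

def trial(n1=50, n2=50, n2_raised=150, z_crit=1.96):
    """One trial under H0 (mean 0, sd 1): interim look after n1
    observations, optional raise of the second stage, naive pooled test."""
    stage1 = [random.gauss(0, 1) for _ in range(n1)]
    z1 = sum(stage1) / math.sqrt(n1)
    promising = z1 > 1.0                  # proxy for conditional power > 50%
    m = n2_raised if promising else n2
    stage2 = [random.gauss(0, 1) for _ in range(m)]
    z = (sum(stage1) + sum(stage2)) / math.sqrt(n1 + m)
    return z > z_crit                     # one-sided test at alpha = 0.025

rate = sum(trial() for _ in range(20000)) / 20000
print(f"empirical one-sided type I error: {rate:.4f}")
```

    The empirical rejection rate stays at or below the nominal 0.025 (up to Monte Carlo noise), consistent with the protection result described in the abstract.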

  4. Tank 241-SY-102, January 2000 Compatibility Grab Samples Analytical Results for the Final Report

    International Nuclear Information System (INIS)

    BELL, K.E.

    2000-01-01

    This document is the format IV, final report for the tank 241-SY-102 (SY-102) grab samples taken in January 2000 to address waste compatibility concerns. Chemical, radiochemical, and physical analyses of the tank SY-102 samples were performed as directed in Compatibility Grab Sampling and Analysis Plan for Fiscal Year 2000 (Sasaki 1999). No notification limits were exceeded. Preliminary data on samples 2SY-99-5, -6, and -7 were reported in ''Format II Report on Tank 241-SY-102 Waste Compatibility Grab Samples Taken in January 2000'' (Lockrem 2000). The data presented here represent the final results

  5. Interval estimation methods of the mean in small sample situation and the results' comparison

    International Nuclear Information System (INIS)

    Wu Changli; Guo Chunying; Jiang Meng; Lin Yuangen

    2009-01-01

    The methods of interval estimation of the sample mean, namely the classical method, the Bootstrap method, the Bayesian Bootstrap method, the Jackknife method and the spread method of the empirical characteristic distribution function, are described. Numerical calculation of the sample mean intervals is carried out for sample sizes of 4, 5, and 6. The results indicate that the Bootstrap method and the Bayesian Bootstrap method are much more appropriate than the others in small sample situations. (authors)
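    A minimal percentile-bootstrap sketch of the kind of small-sample interval compared in this work (the data values are invented; the Bayesian Bootstrap differs in drawing random resampling weights rather than resampling with equal probability):

```python
import random

random.seed(0)

def bootstrap_mean_ci(sample, n_boot=10_000, alpha=0.05):
    """Percentile-bootstrap confidence interval for the mean."""
    boot_means = []
    for _ in range(n_boot):
        resample = [random.choice(sample) for _ in sample]  # with replacement
        boot_means.append(sum(resample) / len(resample))
    boot_means.sort()
    lo = boot_means[int(n_boot * alpha / 2)]
    hi = boot_means[int(n_boot * (1 - alpha / 2))]
    return lo, hi

data = [4.1, 5.6, 4.9, 5.2, 4.4]        # a small sample, n = 5
lo, hi = bootstrap_mean_ci(data)
print(f"95% bootstrap CI for the mean: ({lo:.2f}, {hi:.2f})")
```

    With only a handful of observations the bootstrap distribution is discrete and the interval can be noticeably asymmetric, which is why small-sample comparisons such as this study's are informative.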

  6. Design + energy: results of a national student design competition

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    A national competition for students in schools of architecture was conducted during the spring of 1980. The competition was the first of a series that emphasized the integration of architectural design and energy considerations in medium-scale building projects, specifically the application of passive solar design strategies and the appropriate use of brick masonry materials. Some 300 faculty members and over 2,200 students representing 80 of the 92 US architecture schools participated in the program. A summary of the program and the range of submissions, grouped by problem type and general climatic region, is presented.

  7. Sampling flies or sampling flaws? Experimental design and inference strength in forensic entomology.

    Science.gov (United States)

    Michaud, J-P; Schoenly, Kenneth G; Moreau, G

    2012-01-01

    Forensic entomology is an inferential science because postmortem interval estimates are based on the extrapolation of results obtained in field or laboratory settings. Although enormous gains in scientific understanding and methodological practice have been made in forensic entomology over the last few decades, a majority of the field studies we reviewed do not meet the standards for inference, which are 1) adequate replication, 2) independence of experimental units, and 3) experimental conditions that capture a representative range of natural variability. Using a mock case-study approach, we identify design flaws in field and lab experiments and suggest methodological solutions for increasing inference strength that can inform future casework. Suggestions for improving data reporting in future field studies are also proposed.

  8. 75 Ah and 10 boilerplate nickel-hydrogen battery designs and test results

    Science.gov (United States)

    Daman, M. E.; Manzo, Michelle A.; Chang, R.; Cruz, E.

    1992-01-01

    The results of initial characterization testing of 75 Ah actively cooled bipolar battery designs and 10 boilerplate nickel-hydrogen battery designs are presented. The results demonstrate the extended cycle life capability of the 75 Ah batteries and the high capacity utilizations at various discharge rates of the nickel-hydrogen batteries.

  9. Status report on the Zagreb Radiocarbon Laboratory - AMS and LSC results of VIRI intercomparison samples

    Science.gov (United States)

    Sironić, Andreja; Krajcar Bronić, Ines; Horvatinčić, Nada; Barešić, Jadranka; Obelić, Bogomil; Felja, Igor

    2013-01-01

    A new line for preparation of graphite samples for 14C dating by Accelerator Mass Spectrometry (AMS) in the Zagreb Radiocarbon Laboratory has been validated by preparing graphite from various materials distributed within the Fifth International Radiocarbon Intercomparison (VIRI) study. 14C activity of the prepared graphite was measured at the SUERC AMS facility. The results are statistically evaluated by means of the z-score and u-score values. The mean z-score value of 28 prepared VIRI samples is (0.06 ± 0.23), showing excellent agreement with the consensus VIRI values. Only one sample resulted in a u-score value above the limit of acceptability (defined for the confidence interval of 99%), and this was probably caused by random contamination of the graphitization rig. After the rig had been moved to the new adapted and isolated room, all u-score values lay within the acceptable limits. Our LSC results for VIRI intercomparison samples are also presented, and they are all accepted according to the u-score values.
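    The two acceptance statistics follow directly from their standard definitions; the activity value and uncertainties below are hypothetical, not VIRI data:

```python
import math

def z_score(value, consensus, sigma_target):
    """Deviation from the consensus value in units of the target
    standard deviation."""
    return (value - consensus) / sigma_target

def u_score(value, u_value, consensus, u_consensus):
    """Deviation weighted by the combined uncertainty of the measured
    and consensus values."""
    return abs(value - consensus) / math.sqrt(u_value**2 + u_consensus**2)

# Hypothetical 14C activity for one sample, in pMC
z = z_score(104.8, consensus=105.2, sigma_target=0.8)
u = u_score(104.8, 0.4, consensus=105.2, u_consensus=0.3)
print(f"z = {z:+.2f}, u = {u:.2f}")  # u < 2.58 -> acceptable at the 99% level
```

    A |z| below about 2 is conventionally taken as satisfactory, while the u-score limit quoted in the abstract corresponds to the 99% confidence bound of roughly 2.58.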

  10. Status report on the Zagreb Radiocarbon Laboratory - AMS and LSC results of VIRI intercomparison samples

    Energy Technology Data Exchange (ETDEWEB)

    Sironić, Andreja [Department of Experimental Physics, Ruđer Bošković Institute, Bijenička 54, 10000 Zagreb (Croatia); Krajcar Bronić, Ines, E-mail: krajcar@irb.hr [Department of Experimental Physics, Ruđer Bošković Institute, Bijenička 54, 10000 Zagreb (Croatia); Horvatinčić, Nada; Barešić, Jadranka; Obelić, Bogomil; Felja, Igor [Department of Experimental Physics, Ruđer Bošković Institute, Bijenička 54, 10000 Zagreb (Croatia)

    2013-01-15

    A new line for preparation of graphite samples for 14C dating by Accelerator Mass Spectrometry (AMS) in the Zagreb Radiocarbon Laboratory has been validated by preparing graphite from various materials distributed within the Fifth International Radiocarbon Intercomparison (VIRI) study. 14C activity of the prepared graphite was measured at the SUERC AMS facility. The results are statistically evaluated by means of the z-score and u-score values. The mean z-score value of 28 prepared VIRI samples is (0.06 ± 0.23), showing excellent agreement with the consensus VIRI values. Only one sample resulted in a u-score value above the limit of acceptability (defined for the confidence interval of 99%), and this was probably caused by random contamination of the graphitization rig. After the rig had been moved to the new adapted and isolated room, all u-score values lay within the acceptable limits. Our LSC results for VIRI intercomparison samples are also presented, and they are all accepted according to the u-score values.

  11. Groundwater-quality data in 12 GAMA study units: Results from the 2006–10 initial sampling period and the 2008–13 trend sampling period, California GAMA Priority Basin Project

    Science.gov (United States)

    Mathany, Timothy M.

    2017-03-09

    The Priority Basin Project (PBP) of the Groundwater Ambient Monitoring and Assessment (GAMA) program was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey in cooperation with the California State Water Resources Control Board. From 2004 through 2012, the GAMA-PBP collected samples and assessed the quality of groundwater resources that supply public drinking water in 35 study units across the State. Selected sites in each study unit were sampled again approximately 3 years after initial sampling as part of an assessment of temporal trends in water quality by the GAMA-PBP. Twelve of the study units, initially sampled during 2006–11 (initial sampling period) and sampled a second time during 2008–13 (trend sampling period) to assess temporal trends, are the subject of this report. The initial sampling was designed to provide a spatially unbiased assessment of the quality of untreated groundwater used for public water supplies in the 12 study units. In these study units, 550 sampling sites were selected by using a spatially distributed, randomized, grid-based method to provide spatially unbiased representation of the areas assessed (grid sites, also called “status sites”). After the initial sampling period, 76 of the previously sampled status sites (approximately 10 percent in each study unit) were randomly selected for trend sampling (“trend sites”). The 12 study units sampled both during the initial sampling period and during the trend sampling period were distributed among 6 hydrogeologic provinces: Coastal (Northern and Southern), Transverse Ranges and Selected Peninsular Ranges, Klamath, Modoc Plateau and Cascades, and Sierra Nevada Hydrogeologic Provinces. For the purposes of this trend report, the six hydrogeologic provinces were grouped into two hydrogeologic regions based on location: Coastal and Mountain. The groundwater samples were analyzed for a number of synthetic organic

  12. Design and experimental results of the 1-T Bitter Electromagnet Testing Apparatus (BETA)

    Science.gov (United States)

    Bates, E. M.; Birmingham, W. J.; Romero-Talamás, C. A.

    2018-05-01

    The Bitter Electromagnet Testing Apparatus (BETA) is a 1-Tesla (T) technical prototype of the 10 T Adjustable Long Pulsed High-Field Apparatus. BETA's final design specifications, including electromagnetic, thermal, and stress analyses, are highlighted in this paper. We discuss here the design and fabrication of BETA's core, vessel, cooling, and electrical subsystems. The electrical system of BETA is composed of a scalable solid-state DC breaker circuit. Experimental results display the stable operation of BETA at 1 T. These results are compared to both analytical design calculations and finite element calculations. The experimental results validate the analytical magnet design methods developed at the Dusty Plasma Laboratory. The theoretical steady-state maxima and the limits of BETA's design are explored in this paper.

  13. Statistical sampling methods for soils monitoring

    Science.gov (United States)

    Ann M. Abbott

    2010-01-01

    Development of the best sampling design to answer a research question should be an interactive venture between the land manager or researcher and statisticians, and is the result of answering various questions. A series of questions that can be asked to guide the researcher in making decisions that will arrive at an effective sampling plan are described, and a case...

  14. Comparison of leach results from field and laboratory prepared samples

    International Nuclear Information System (INIS)

    Oblath, S.B.; Langton, C.A.

    1985-01-01

    The leach behavior of saltstone prepared in the laboratory agrees well with that of samples mixed in the field using the Littleford mixer. Leach rates of nitrates and cesium from the current reference formulation saltstone were compared. The laboratory samples were prepared using simulated salt solution; those in the field used Tank 50 decontaminated supernate. For both nitrate and cesium, the field and laboratory samples showed nearly identical leach rates for the first 30 to 50 days. For the remaining period of the test, the field samples showed higher leach rates, with the maximum difference being less than a factor of three. Ruthenium and antimony were present in the Tank 50 supernate in known amounts. Antimony-125 was observed in the leachate, and its fractional leach rate was calculated to be at least a factor of ten less than that of ¹³⁷Cs. No ¹⁰⁶Ru was observed in the leachate, so its release rate was not calculated; however, based on the detection limits of the analysis, the ruthenium leach rate must also be at least a factor of ten less than that of cesium. These data are the first measurements of the leach rates of Ru and Sb from saltstone. The nitrate leach rates for these samples were 5 × 10⁻⁵ grams of nitrate per square cm per day after 100 days for the laboratory samples and after 200 days for the field samples. These values are consistent with previously measured leach rates for reference formulation saltstone. The relative standard deviation in the leach rate is about 15% for the field samples, which were all produced from one batch of saltstone, and about 35% for the laboratory samples, which came from different batches. These are the first recorded estimates of the error in leach rates for saltstone.

  15. Administration of antibiotic agents before intraoperative sampling in orthopedic infections alters culture results.

    Science.gov (United States)

    Al-Mayahi, Mohamed; Cian, Anais; Lipsky, Benjamin A; Suvà, Domizio; Müller, Camillo; Landelle, Caroline; Miozzari, Hermès H; Uçkay, Ilker

    2015-11-01

    Many physicians and surgeons think that prescribing antibiotics before intraoperative sampling does not alter the microbiological results. Case-control study of adult patients hospitalized with orthopedic infections. Among 2740 episodes of orthopedic infections, 1167 (43%) had received antibiotic therapy before surgical sampling. Among these, 220 (19%) grew no pathogens, while the proportion of culture-negative results in the 1573 episodes without preoperative antibiotic therapy was only 6%. By multivariate analyses, preoperative antibiotic exposure was associated with significantly more culture-negative results (odds ratio 2.8, 95% confidence interval 2.1-3.7) and more non-fermenting rods and skin commensals (odds ratios 2.8 and 3.0, respectively). Even a single preoperative dose of antibiotic was significantly associated with subsequent culture-negative results (19/93 vs. 297/2350; χ²-test, p = 0.01) and skin commensals (17/74 vs. 274/2350; p = 0.01) compared to episodes without preceding prophylaxis. Prior antibiotic use, including single-dose prophylactic administrations, is associated with an approximately three-fold increase in culture-negative results, non-fermenting rods, and resistant skin commensals. Copyright © 2015 The British Infection Association. Published by Elsevier Ltd. All rights reserved.

  16. Sample size determination for mediation analysis of longitudinal data.

    Science.gov (United States)

    Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying

    2018-03-27

    Sample size planning for longitudinal data is crucial when designing mediation studies, because sufficient statistical power is not only required in grant applications and peer-reviewed publications but is also essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of longitudinal designs. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides, via simulation, the sample sizes required to achieve 80% power under various sizes of the mediation effect, within-subject correlations, and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution of the product method, and the bootstrap method. Among the three methods of testing mediation effects, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution of the product method performed similarly and were more powerful than Sobel's method, as reflected by the relatively smaller sample sizes. For all three methods, the sample size required to achieve 80% power depended on the value of the ICC (i.e., the within-subject correlation): a larger ICC typically required a larger sample size. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the scenarios most often encountered in practice have also been published for convenient use. Extensive simulation studies showed that the distribution of the product method and the bootstrapping method are superior to Sobel's method, and the distribution of the product method is recommended in practice because it requires less computation time than bootstrapping. An R package has been developed for sample size determination by the distribution of the product method in longitudinal mediation study design.
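Sobel's method referenced above has a closed form: the indirect effect a·b divided by its delta-method standard error. A minimal Python sketch (the path coefficients below are hypothetical illustrations, not values from the article's simulations):

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Sobel z statistic for an indirect (mediation) effect a*b.

    a: X -> M path coefficient, se_a: its standard error;
    b: M -> Y path coefficient (controlling for X), se_b: its standard error.
    """
    se_ab = math.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)
    return (a * b) / se_ab

# Hypothetical path estimates for illustration:
z = sobel_z(a=0.4, se_a=0.1, b=0.3, se_b=0.1)  # z = 2.4
```

Comparing |z| to 1.96 gives a 5%-level test; the normal approximation to the product a·b is conservative, which is consistent with the abstract's finding that Sobel's method needs the largest sample size.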

  17. Sampling for quality assurance of grading decisions in diabetic retinopathy screening: designing the system to detect errors.

    Science.gov (United States)

    Slattery, Jim

    2005-01-01

    To evaluate various designs for a quality assurance system to detect and control human errors in a national screening programme for diabetic retinopathy. A computer simulation was performed of some possible ways of sampling the referral decisions made during grading and of different criteria for initiating more intensive QA investigations. The effectiveness of QA systems was assessed by the ability to detect a grader making occasional errors in referral. Substantial QA sample sizes are needed to ensure against inappropriate failure to refer. Detection of a grader who failed to refer one in ten cases can be achieved with a probability of 0.58 using an annual sample size of 300 and 0.77 using a sample size of 500. An unmasked verification of a sample of non-referrals by a specialist is the most effective method of internal QA for the diabetic retinopathy screening programme. Preferential sampling of those with some degree of disease may improve the efficiency of the system.
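The detection probabilities quoted above can be approximated with a simple binomial model. A sketch under stated assumptions: suppose (hypothetically) that roughly 3% of screened cases are referable and the faulty grader misses 1 in 10 of them, so each sampled decision is an erroneous non-referral with probability about 0.003.

```python
def detection_prob(n, p_error):
    """Probability that a QA sample of n grading decisions contains at
    least one erroneous non-referral, treating each sampled decision as
    independently erroneous with probability p_error."""
    return 1.0 - (1.0 - p_error) ** n

p = 0.003  # assumed per-decision error probability (hypothetical)
detection_prob(300, p)  # ~0.59, close to the 0.58 reported
detection_prob(500, p)  # ~0.78, close to the 0.77 reported
```

Preferential sampling of cases with some degree of disease raises the effective p_error, which is why the abstract notes it may improve the efficiency of the system.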

  18. First Total Reflection X-Ray Fluorescence round-robin test of water samples: Preliminary results

    Energy Technology Data Exchange (ETDEWEB)

    Borgese, Laura; Bilo, Fabjola [Chemistry for Technologies Laboratory, University of Brescia, Brescia (Italy); Tsuji, Kouichi [Graduate School of Engineering, Osaka City University, Osaka (Japan); Fernández-Ruiz, Ramón [Servicio Interdepartamental de Investigación (SIdI), Laboratorio de TXRF, Universidad Autónoma de Madrid, Madrid (Spain); Margui, Eva [Department of Chemistry, University of Girona, Girona (Spain); Streli, Christina [TU Wien, Atominstitut,Radiation Physics, Vienna (Austria); Pepponi, Giancarlo [Fondazione Bruno Kessler, Povo, Trento (Italy); Stosnach, Hagen [Bruker Nano GmbH, Berlin (Germany); Yamada, Takashi [Rigaku Corporation, Takatsuki, Osaka (Japan); Vandenabeele, Peter [Department of Archaeology, Ghent University, Ghent (Belgium); Maina, David M.; Gatari, Michael [Institute of Nuclear Science and Technology, University of Nairobi, Nairobi (Kenya); Shepherd, Keith D.; Towett, Erick K. [World Agroforestry Centre (ICRAF), Nairobi (Kenya); Bennun, Leonardo [Laboratorio de Física Aplicada, Departamento de Física, Universidad de Concepción (Chile); Custo, Graciela; Vasquez, Cristina [Gerencia Química, Laboratorio B025, Centro Atómico Constituyentes, San Martín (Argentina); Depero, Laura E., E-mail: laura.depero@unibs.it [Chemistry for Technologies Laboratory, University of Brescia, Brescia (Italy)

    2014-11-01

    Total Reflection X-Ray Fluorescence (TXRF) is a mature technique to evaluate quantitatively the elemental composition of liquid samples deposited on clean and well polished reflectors. In this paper the results of the first worldwide TXRF round-robin test of water samples, involving 18 laboratories in 10 countries are presented and discussed. The test was performed within the framework of the VAMAS project, interlaboratory comparison of TXRF spectroscopy for environmental analysis, whose aim is to develop guidelines and a standard methodology for biological and environmental analysis by means of the TXRF analytical technique. - Highlights: • The discussion of the first worldwide TXRF round-robin test of water samples (18 laboratories of 10 countries) is reported. • Drinking, waste, and desalinated water samples were tested. • Data dispersion sources were identified: sample concentration, preparation, fitting procedure, and quantification. • The protocol for TXRF analysis of drinking water is proposed.

  19. Solvent Hold Tank Sample Results for MCU-16-1247-1248-1249: August 2016 Monthly Sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-01-12

    Savannah River National Laboratory (SRNL) received one set of Solvent Hold Tank (SHT) samples (MCU-16-1247-1248-1249), pulled on 08/22/2016, for analysis. The samples were combined and analyzed for composition. Analysis of the composite sample MCU-16-1247-1248-1249 indicated the Isopar™L concentration is above its nominal level (101%). The extractant (MaxCalix) and the modifier (CS-7SB) are 7% and 9% below their nominal concentrations. The suppressor (TiDG) is 63% below its nominal concentration. This analysis confirms the solvent may require the addition of TiDG, and possibly of modifier and MaxCalix, to restore them to nominal levels. Based on the current monthly sample, the levels of TiDG, Isopar™L, MaxCalix, and modifier are sufficient for continuing operation but are expected to decrease with time. Periodic characterization and trimming additions to the solvent are recommended. At the time of writing this report, a solvent trim batch containing TiDG, modifier, and MaxCalix had been added to the SHT (October 2016); the concentrations of these components are expected to be at their nominal values.

  20. Results of Macroinvertebrate Sampling Conducted at 33 SRS Stream Locations, July--August 1993

    Energy Technology Data Exchange (ETDEWEB)

    Specht, W.L.

    1994-12-01

    In order to assess the health of the macroinvertebrate communities of SRS streams, the macroinvertebrate communities at 30 stream locations on SRS were sampled during the summer of 1993, using Hester-Dendy multiplate samplers. In addition, three off-site locations in the Upper Three Runs drainage were sampled in order to assess the potential for impact from off-site activities. In interpreting the data, it is important to recognize that these data were from a single set of collections. Macroinvertebrate communities often undergo considerable temporal variation, and are also greatly influenced by such factors as water depth, water velocity, and available habitat. These stations were selected with the intent of developing an on-going sampling program at a smaller number of stations, with the selection of the stations to be based largely upon the results of this preliminary sampling program. When stations within a given stream showed similar results, fewer stations would be sampled in the future. Similarly, if a stream appeared to be perturbed, additional stations or chemical analyses might be added so that the source of the perturbation could be identified. In general, unperturbed streams will contain more taxa than perturbed streams, and the distribution of taxa among orders or families will differ. Some groups of macroinvertebrates, such as Ephemeroptera (mayflies), Plecoptera (stoneflies) and Trichoptera (caddisflies), which are collectively called EPT taxa, are considered to be relatively sensitive to most kinds of stream perturbation; therefore a reduced number of EPT taxa generally indicates that the stream has been subject to chemical or physical stressors. In coastal plain streams, EPT taxa are generally less dominant than in streams with rocky substrates, while Chironomidae (midges) are more abundant. (Abstract Truncated)
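The EPT-richness metric described above is straightforward to compute from a taxa list. A minimal sketch (the taxa shown are hypothetical examples, not the SRS collection data):

```python
# Orders collectively called EPT taxa in the report.
EPT_ORDERS = {"Ephemeroptera", "Plecoptera", "Trichoptera"}

def ept_richness(taxa):
    """Count EPT taxa among (order, taxon) pairs from one sampler."""
    return sum(1 for order, _name in taxa if order in EPT_ORDERS)

site_taxa = [
    ("Ephemeroptera", "Baetis sp."),     # mayfly
    ("Trichoptera", "Hydropsyche sp."),  # caddisfly
    ("Diptera", "Chironomus sp."),       # midge, not an EPT taxon
]
ept_richness(site_taxa)  # -> 2
```

A reduced EPT count relative to a reference station is then the signal of perturbation the abstract describes.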

  1. A multi-crucible core-catcher concept: Design considerations and basic results

    International Nuclear Information System (INIS)

    Szabo, I.

    1995-01-01

    A multi-crucible core-catcher concept to be implemented in new light water reactor containments has recently been proposed. This paper deals with conceptual design considerations and the various ways this type of core-catcher could be designed to meet requirements for reactor application. A systematic functional analysis of the multi-crucible core-catcher concept and the results of the preliminary design calculation are presented. Finally, the adequacy of the multi-crucible core-catcher concept for reactor application is discussed. (orig.)

  2. Solvent hold tank sample results for MCU-16-1317-1318-1319. September 2016 monthly sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-01-01

    Savannah River National Laboratory (SRNL) received one set of Solvent Hold Tank (SHT) samples (MCU-16-1317-1318-1319), pulled on 09/12/2016, for analysis. The samples were combined and analyzed for composition. Analysis of the composite sample MCU-16-1317-1318-1319 indicated the Isopar™L concentration is above its nominal level (102%). The extractant (MaxCalix) and the modifier (CS-7SB) are 5% and 10% below their nominal concentrations. The suppressor (TiDG) is 77% below its nominal concentration. A summary of the concentrations of the relevant solvent components is shown below. This analysis confirms the Isopar™ addition to the solvent in August. This analysis also indicates the solvent may require the addition of TiDG, and possibly of modifier, to restore them to nominal levels.

  3. Sample Acquisition for Materials in Planetary Exploration (SAMPLE), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ORBITEC proposes to analyze, design, and develop a device for autonomous lunar surface/subsurface sampling and processing applications. The Sample Acquisition for...

  4. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    Science.gov (United States)

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. To guide the design of multiplier method population size estimation studies using respondent-driven sampling surveys to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
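The estimator described above is N = M / P. A sketch with an approximate delta-method confidence interval (M, P, and the standard error below are hypothetical values, not the Harare estimates):

```python
def multiplier_estimate(M, p_hat, se_p, z=1.96):
    """Multiplier-method population size estimate N = M / P, with an
    approximate 95% CI from the delta method (M treated as known;
    se_p should already reflect the survey's design effect)."""
    n_hat = M / p_hat
    se_n = M * se_p / p_hat ** 2  # |dN/dP| * se_p
    return n_hat, (n_hat - z * se_n, n_hat + z * se_n)

# Hypothetical: 1200 unique objects distributed; 30% of survey
# respondents report receiving one (SE 0.04 after design-effect inflation).
n_hat, ci = multiplier_estimate(M=1200, p_hat=0.30, se_p=0.04)  # n_hat = 4000
```

Because se_n scales with 1/p_hat², the interval widens sharply as P shrinks, which is the formal version of the abstract's advice to choose designs that push the estimate of P higher.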

  5. Characterization of a low-level radioactive waste grout: Sampling and test results

    International Nuclear Information System (INIS)

    Martin, P.F.C.; Lokken, R.O.

    1992-12-01

    WHC manages and operates the grout treatment facility at Hanford as part of a DOE program to clean up wastes stored at federal nuclear production sites. PNL provides support to the grout disposal program through pilot-scale tests, performance assessments, and formulation verification activities. In 1988 and 1989, over one million gallons of a low-level radioactive liquid waste was processed through the facility to produce a grout waste that was then deposited in an underground vault. The liquid waste was phosphate/sulfate waste (PSW) generated in decontamination of the N Reactor. PNL sampled and tested the grout produced during the second half of the PSW campaign to support quality verification activities prior to grout vault closure. Samples of grout were obtained by inserting nested-tube samplers into the grout slurry in the vault. After the grout had cured, the inner tube of the sampler was removed and the grout samples extracted. Tests for compressive strength, sonic velocity, and leaching were used to assess grout quality; results were compared to those from pilot-scale test grouts made with a simulated PSW. The grout produced during the second half of the PSW campaign exceeded the compressive strength and leachability formulation criteria. The nested-tube samplers were effective in collecting samples of grout, although their use introduced greater variability into the compressive strength data.

  6. Statistical properties of mean stand biomass estimators in a LIDAR-based double sampling forest survey design.

    Science.gov (United States)

    H.E. Anderson; J. Breidenbach

    2007-01-01

    Airborne laser scanning (LIDAR) can be a valuable tool in double-sampling forest survey designs. LIDAR-derived forest structure metrics are often highly correlated with important forest inventory variables, such as mean stand biomass, and LIDAR-based synthetic regression estimators have the potential to be highly efficient compared to single-stage estimators, which...

  7. Analyzing Repeated Measures Marginal Models on Sample Surveys with Resampling Methods

    Directory of Open Access Journals (Sweden)

    James D. Knoke

    2005-12-01

    Full Text Available Packaged statistical software for analyzing categorical, repeated measures marginal models on sample survey data with binary covariates does not appear to be available. Consequently, this report describes a customized SAS program which accomplishes such an analysis on survey data with jackknifed replicate weights for which the primary sampling unit information has been suppressed for respondent confidentiality. First, the program employs the Macro Language and the Output Delivery System (ODS) to estimate the means and covariances of indicator variables for the response variables, taking the design into account. Then, it uses PROC CATMOD and ODS, ignoring the survey design, to obtain the design matrix and hypothesis test specifications. Finally, it enters these results into another run of CATMOD, which performs automated direct input of the survey design specifications and accomplishes the appropriate analysis. This customized SAS program can be employed, with minor editing, to analyze general categorical, repeated measures marginal models on sample surveys with replicate weights. Finally, the results of our analysis accounting for the survey design are compared to the results of two alternate analyses of the same data. This comparison confirms that such alternate analyses, which do not properly account for the design, do not produce useful results.

  8. Retained Gas Sampling Results for the Flammable Gas Program

    International Nuclear Information System (INIS)

    Bates, J.M.; Mahoney, L.A.; Dahl, M.E.; Antoniak, Z.I.

    1999-01-01

    The key phenomena of the Flammable Gas Safety Issue are generation of the gas mixture, the modes of gas retention, and the mechanisms causing release of the gas. An understanding of the mechanisms of these processes is required for final resolution of the safety issue. Central to understanding is gathering information from such sources as historical records, tank sampling data, tank process data (temperatures, ventilation rates, etc.), and laboratory evaluations conducted on tank waste samples

  9. Retained Gas Sampling Results for the Flammable Gas Program

    Energy Technology Data Exchange (ETDEWEB)

    J.M. Bates; L.A. Mahoney; M.E. Dahl; Z.I. Antoniak

    1999-11-18

    The key phenomena of the Flammable Gas Safety Issue are generation of the gas mixture, the modes of gas retention, and the mechanisms causing release of the gas. An understanding of the mechanisms of these processes is required for final resolution of the safety issue. Central to understanding is gathering information from such sources as historical records, tank sampling data, tank process data (temperatures, ventilation rates, etc.), and laboratory evaluations conducted on tank waste samples.

  10. Coastal California's Fog as a Unique Habitable Niche: Design for Autonomous Sampling and Preliminary Aerobiological Characterization

    Science.gov (United States)

    Gentry, Diana; Cynthia Ouandji; Arismendi, Dillon; Guarro, Marcello; Demachkie, Isabella; Crosbie, Ewan; Dadashazar, Hossein; MacDonald, Alex B.; Wang, Zhen; Sorooshian, Armin; hide

    2017-01-01

    Just as on the land or in the ocean, atmospheric regions may be more or less hospitable to life. The aerobiosphere, or collection of living things in Earth's atmosphere, is poorly understood due to the small number and ad hoc nature of samples studied. However, we know viable airborne microbes play important roles, such as providing cloud condensation nuclei. Knowing the distribution of such microorganisms and how their activity can alter water, carbon, and other geochemical cycles is key to developing criteria for planetary habitability, particularly for potential habitats with wet atmospheres but little stable surface water. Coastal California has regular, dense fog known to play a major transport role in the local ecosystem. In addition to the significant local (1 km) geographical variation in typical fog, previous studies have found that changes in height above surface of as little as a few meters can yield significant differences in typical concentrations, populations and residence times. No single current sampling platform (ground-based impactors, towers, balloons, aircraft) is capable of accessing all of these regions of interest. A novel passive fog and cloud water sampler, consisting of a lightweight passive impactor suspended from autonomous aerial vehicles (UAVs), is being developed to allow 4D point sampling within a single fog bank, allowing closer study of small-scale (100 m) system dynamics. Fog and cloud droplet water samples from low-altitude aircraft flights in nearby coastal waters were collected and assayed to estimate the required sample volumes, flight times, and sensitivity thresholds of the system under design. A total of 125 cloud water samples were collected from 16 flights of the Center for Interdisciplinary Remotely Piloted Aircraft Studies (CIRPAS) instrumented Twin Otter, equipped with a sampling tube collector, occurring between 18 July and 12 August 2016 below 1 km altitude off the central coast. The collector was flushed first with 70% ethanol

  11. Inclusion of mobile phone numbers into an ongoing population health survey in New South Wales, Australia: design, methods, call outcomes, costs and sample representativeness.

    Science.gov (United States)

    Barr, Margo L; van Ritten, Jason J; Steel, David G; Thackway, Sarah V

    2012-11-22

    In Australia telephone surveys have been the method of choice for ongoing jurisdictional population health surveys. Although it was estimated in 2011 that nearly 20% of the Australian population were mobile-only phone users, the inclusion of mobile phone numbers into these existing landline population health surveys has not occurred. This paper describes the methods used for the inclusion of mobile phone numbers into an existing ongoing landline random digit dialling (RDD) health survey in an Australian state, the New South Wales Population Health Survey (NSWPHS). This paper also compares the call outcomes, costs and the representativeness of the resultant sample to that of the previous landline sample. After examining several mobile phone pilot studies conducted in Australia and possible sample designs (screening dual-frame and overlapping dual-frame), mobile phone numbers were included into the NSWPHS using an overlapping dual-frame design. Data collection was consistent, where possible, with the previous years' landline RDD phone surveys and between frames. Survey operational data for the frames were compared and combined. Demographic information from the interview data for mobile-only phone users, both, and total were compared to the landline frame using χ² tests. Demographic information for each frame, landline and the mobile-only (equivalent to a screening dual frame design), and the frames combined (with appropriate overlap adjustment) were compared to the NSW demographic profile from the 2011 census using χ² tests. In the first quarter of 2012, 3395 interviews were completed with 2171 respondents (63.9%) from the landline frame (17.6% landline only) and 1224 (36.1%) from the mobile frame (25.8% mobile only). Overall combined response, contact and cooperation rates were 33.1%, 65.1% and 72.2% respectively. As expected from previous research, the demographic profile of the mobile-only phone respondents differed most (more that were young, males, Aboriginal
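In an overlapping dual-frame design like the one above, respondents reachable by both frames are typically down-weighted by a compositing factor so the two frames can be combined without double counting. A minimal sketch of this standard Hartley-style adjustment (the factor 0.5 and the weights are illustrative assumptions, not the NSWPHS values):

```python
def overlap_adjusted_weight(w, frame, dual_user, theta=0.5):
    """Composite the two frames: respondents contactable via both
    frames keep theta of their design weight if sampled from the
    landline frame and (1 - theta) if sampled from the mobile frame;
    single-frame respondents keep their full design weight w."""
    if not dual_user:
        return w
    return w * (theta if frame == "landline" else 1.0 - theta)

overlap_adjusted_weight(100.0, "landline", dual_user=True)  # -> 50.0
overlap_adjusted_weight(100.0, "mobile", dual_user=False)   # -> 100.0
```

In practice theta can be tuned (e.g., proportional to each frame's effective sample size) rather than fixed at 0.5.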

  12. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    Science.gov (United States)

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
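The binomial reasoning described above amounts to choosing a sample size n so that, at a given tracer incorporation rate, the sample contains at least one tracer with high probability. A hedged sketch (the incorporation rate and sample size are hypothetical, not from the study):

```python
import math

def prob_at_least_k(n, p, k=1):
    """P(X >= k) for X ~ Binomial(n, p): the chance that a grain
    sample of n units captures at least k embedded tracers, where p
    is the per-unit probability of drawing a tracer."""
    return 1.0 - sum(math.comb(n, i) * p ** i * (1.0 - p) ** (n - i)
                     for i in range(k))

# Hypothetical: tracers blended at 1 per 10,000 units; a 30,000-unit
# sample then recovers at least one tracer with probability ~0.95.
prob_at_least_k(30_000, 1e-4)
```

Extending k > 1 gives the multinomial-style requirement of recovering enough tracers to read the lot code reliably.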

  13. The variance quadtree algorithm: use for spatial sampling design

    NARCIS (Netherlands)

    Minasny, B.; McBratney, A.B.; Walvoort, D.J.J.

    2007-01-01

    Spatial sampling schemes are mainly developed to determine sampling locations that can cover the variation of environmental properties in the area of interest. Here we proposed the variance quadtree algorithm for sampling in an area with prior information represented as ancillary or secondary

  14. Improving the Acquisition and Management of Sample Curation Data

    Science.gov (United States)

    Todd, Nancy S.; Evans, Cindy A.; Labasse, Dan

    2011-01-01

    This paper discusses the current sample documentation processes used during and after a mission, examines the challenges and special considerations needed for designing effective sample curation data systems, and looks at the results of a simulated sample return mission and the lessons learned from this simulation. In addition, it introduces a new data architecture for an integrated sample curation data system being implemented at the NASA Astromaterials Acquisition and Curation department and discusses how it improves on existing data management systems.

  15. Mars Sample Return Architecture Assessment Study

    Science.gov (United States)

    Centuori, S.; Hermosín, P.; Martín, J.; De Zaiacomo, G.; Colin, S.; Godfrey, A.; Myles, J.; Johnson, H.; Sachdev, T.; Ahmed, R.

    2018-04-01

    This paper presents the results of the ESA-funded activity "Mars Sample Return Architecture Assessment Study", carried out by DEIMOS Space, Lockheed Martin UK Ampthill, and MDA Corporation, in which more than 500 mission design options were studied.

  16. Reporting altered test results in hemolyzed samples: is the cure worse than the disease?

    Science.gov (United States)

    Lippi, Giuseppe; Cervellin, Gianfranco; Plebani, Mario

    2017-07-26

    The management of laboratory data in unsuitable (hemolyzed) samples remains an almost unresolved dilemma. Whether or not laboratory test results obtained by measuring unsuitable specimens should be made available to clinicians has been a matter of fierce debate over the past decades. Recently, an intriguing alternative to suppressing test results and recollecting the specimen has been put forward, entailing the definition and implementation of specific algorithms that would allow reporting a preanalytically altered laboratory value along with a specific comment about its measurement uncertainty. This approach carries some advantages, namely the timely communication of potentially life-threatening laboratory values, but also some drawbacks. These especially include the challenging definition of validated performance specifications for hemolyzed samples, the need to produce reliable data with the lowest possible uncertainty, the short turnaround time for repeating most laboratory tests, the risk that the comments may be overlooked in short-stay and frequently overcrowded units (e.g. the emergency department), as well as the many clinical advantages of a direct communication with the physician in charge of the patient. Although the debate remains open, we continue to support the suggestion that suppressing data in unsuitable (hemolyzed) samples and promptly notifying the clinicians about the need to recollect the samples remains the safest practice, both clinically and analytically.

  17. Performance test of SAUNA xenon mobile sampling system

    International Nuclear Information System (INIS)

    Hu Dan; Yang Bin; Yang Weigeng; Jia Huaimao; Wang Shilian; Li Qi; Zhao Yungang; Fan Yuanqing; Chen Zhanying; Chang Yinzhong; Liu Shujiang; Zhang Xinjun; Wang Jun

    2011-01-01

    In this article, the structure and basic functions of the SAUNA noble gas xenon mobile sampling system are introduced. The sampling capability of the system is about 2.2 mL per day, as determined from a 684-hour operation. The system can be conveniently transported to designated locations to collect xenon samples for routine or emergency environmental monitoring. (authors)

  18. Analysis and radiological assessment of survey results and samples from the beaches around Sellafield

    International Nuclear Information System (INIS)

    Webb, G.A.M.; Fry, F.A.

    1983-12-01

After radioactive sea debris had been found on beaches near the BNFL Sellafield plant, NRPB was asked by the Department of the Environment to analyse some of the samples collected and to assess the radiological hazard to members of the public. A report is presented containing an analysis of survey reports for the period 19 November - 4 December 1983 and preliminary results of the analysis of all samples received, together with the Board's recommendations. (author)

  19. Simulated tempering distributed replica sampling: A practical guide to enhanced conformational sampling

    Energy Technology Data Exchange (ETDEWEB)

    Rauscher, Sarah; Pomes, Regis, E-mail: pomes@sickkids.ca

    2010-11-01

    Simulated tempering distributed replica sampling (STDR) is a generalized-ensemble method designed specifically for simulations of large molecular systems on shared and heterogeneous computing platforms [Rauscher, Neale and Pomes (2009) J. Chem. Theor. Comput. 5, 2640]. The STDR algorithm consists of an alternation of two steps: (1) a short molecular dynamics (MD) simulation; and (2) a stochastic temperature jump. Repeating these steps thousands of times results in a random walk in temperature, which allows the system to overcome energetic barriers, thereby enhancing conformational sampling. The aim of the present paper is to provide a practical guide to applying STDR to complex biomolecular systems. We discuss the details of our STDR implementation, which is a highly-parallel algorithm designed to maximize computational efficiency while simultaneously minimizing network communication and data storage requirements. Using a 35-residue disordered peptide in explicit water as a test system, we characterize the efficiency of the STDR algorithm with respect to both diffusion in temperature space and statistical convergence of structural properties. Importantly, we show that STDR provides a dramatic enhancement of conformational sampling compared to a canonical MD simulation.
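As a sketch of step (2), the stochastic temperature jump can be written as a Metropolis acceptance test between neighbouring rungs of a temperature ladder. The four-rung ladder, the fixed "energy" in reduced units, and the flat free-energy weights below are illustrative assumptions only; in the actual STDR algorithm the weights are computed adaptively and the energy comes from the preceding MD segment.

```python
import math
import random

def temperature_jump(energy, idx, betas, weights, rng=random):
    """One stochastic temperature-jump step: propose a neighbouring rung
    on the temperature ladder and accept it with the Metropolis
    criterion for simulated tempering."""
    j = idx + rng.choice([-1, 1])
    if j < 0 or j >= len(betas):
        return idx  # reflect at the ends of the ladder
    log_acc = (betas[idx] - betas[j]) * energy + (weights[j] - weights[idx])
    if rng.random() < math.exp(min(0.0, log_acc)):
        return j
    return idx

# Toy demonstration: a state of fixed potential energy (reduced units)
# random-walking across a four-rung ladder with flat weights.
random.seed(1)
betas = [1.0 / t for t in (300.0, 330.0, 363.0, 400.0)]
weights = [0.0] * len(betas)
idx, visits = 0, [0] * len(betas)
for _ in range(5000):
    idx = temperature_jump(2.5, idx, betas, weights)
    visits[idx] += 1
```

Repeated thousands of times, this step yields the random walk in temperature described above; in this toy setting every rung is visited many times.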

  20. MCMC-ODPR: Primer design optimization using Markov Chain Monte Carlo sampling

    Directory of Open Access Journals (Sweden)

    Kitchen James L

    2012-11-01

Abstract Background Next-generation sequencing technologies often require numerous primer designs with good target coverage, which can be financially costly. We aimed to develop a system that implements primer reuse to design degenerate primers around SNPs, thereby finding the fewest necessary primers at the lowest cost while maintaining acceptable coverage, and thus providing a cost-effective solution. We have implemented Metropolis-Hastings Markov chain Monte Carlo for optimizing primer reuse, an algorithm we call Markov Chain Monte Carlo Optimized Degenerate Primer Reuse (MCMC-ODPR). Results After running the program 1020 times to assess the variance, an average of 17.14% fewer primers were found to be necessary using MCMC-ODPR for an equivalent coverage than without primer reuse. The algorithm was able to reuse primers up to five times. We compared MCMC-ODPR with the single-sequence primer design programs Primer3 and Primer-BLAST and achieved a lower primer cost per amplicon base covered of 0.21, 0.19, and 0.18 primer nucleotides on three separate gene sequences, respectively. With multiple sequences, MCMC-ODPR achieved a lower cost per base covered of 0.19 primer nucleotides than the programs BatchPrimer3 and PAMPS, which achieved 0.25 and 0.64, respectively. Conclusions MCMC-ODPR is a useful tool for designing primers at various melting temperatures with good target coverage. By combining degeneracy with optimal primer reuse, the user may increase the coverage of sequences amplified by the designed primers at significantly lower cost. Our analyses showed that overall MCMC-ODPR outperformed the other primer-design programs in our study in terms of cost per covered base.
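As an illustration of the Metropolis-Hastings machinery underlying this approach (not the published MCMC-ODPR implementation), the sketch below minimizes a toy primer-cost function in which reusing a primer across targets is free and each distinct primer's length is paid once; the primer set, lengths, and temperature are invented for the example.

```python
import math
import random

def mh_minimize(cost, propose, x0, steps=2000, temp=1.0, rng=random):
    """Generic Metropolis-Hastings minimizer: always accept an
    improvement, accept a worsening move with prob. exp(-increase/temp),
    and track the best design visited."""
    x, c = x0, cost(x0)
    best, best_c = x, c
    for _ in range(steps):
        y = propose(x, rng)
        cy = cost(y)
        if cy <= c or rng.random() < math.exp((c - cy) / temp):
            x, c = y, cy
            if c < best_c:
                best, best_c = x, c
    return best, best_c

# Hypothetical toy: assign one of three primers to 5 targets; a primer's
# length (in nucleotides) is paid once no matter how many targets reuse it.
random.seed(0)
lengths = {0: 18, 1: 20, 2: 22}

def cost(assign):
    return sum(lengths[p] for p in set(assign))

def propose(assign, rng):
    a = list(assign)
    a[rng.randrange(len(a))] = rng.choice(sorted(lengths))
    return tuple(a)

best, best_c = mh_minimize(cost, propose, x0=(0, 1, 2, 0, 1))
```

The sampler quickly discovers that collapsing targets onto shared primers lowers the total nucleotide cost, which is the intuition behind rewarding primer reuse.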

  1. Design and experimental results of coaxial circuits for gyroklystron amplifiers

    International Nuclear Information System (INIS)

    Flaherty, M.K.E.; Lawson, W.; Cheng, J.; Calame, J.P.; Hogan, B.; Latham, P.E.; Granatstein, V.L.

    1994-01-01

At the University of Maryland, high power microwave source development for use in linear accelerator applications continues with the design and testing of coaxial circuits for gyroklystron amplifiers. This presentation includes experimental results from a coaxial gyroklystron that was tested on the current microwave test bed, and designs for second harmonic coaxial circuits for use in the next generation of the gyroklystron program. The authors present test results for a second harmonic coaxial circuit. As in previous second harmonic experiments, the input cavity resonated at 9.886 GHz and the output frequency was 19.772 GHz. The coaxial insert was positioned in the input cavity and drift region. The inner conductor consisted of a tungsten rod with copper and ceramic cylinders covering its length. Two tungsten rods that bridged the space between the inner and outer conductors supported the whole assembly. The tube produced over 20 MW of output power with 17% efficiency. Beam interception by the tungsten rods resulted in minor damage. Comparisons with previous non-coaxial circuits showed that the coaxial configuration increased the parameter space over which stable operation was possible. Future experiments will feature an upgraded modulator and beam formation system capable of producing 300 MW of beam power. The fundamental frequency of operation is 8.568 GHz. A second harmonic coaxial gyroklystron circuit was designed for use in the new system. A scattering matrix code predicts a resonant frequency of 17.136 GHz and a Q of 260 for the cavity, with 95% of the outgoing microwaves in the desired TE032 mode. Efficiency studies of this second harmonic output cavity show 20% expected efficiency. Shorter second harmonic output cavity designs are also being investigated, with expected efficiencies near 34%.

  2. Random-effects linear modeling and sample size tables for two special crossover designs of average bioequivalence studies: the four-period, two-sequence, two-formulation and six-period, three-sequence, three-formulation designs.

    Science.gov (United States)

    Diaz, Francisco J; Berg, Michel J; Krebill, Ron; Welty, Timothy; Gidal, Barry E; Alloway, Rita; Privitera, Michael

    2013-12-01

Due to concern and debate in the epilepsy medical community and to the current interest of the US Food and Drug Administration (FDA) in revising approaches to the approval of generic drugs, the FDA is currently supporting ongoing bioequivalence studies of antiepileptic drugs, the EQUIGEN studies. During the design of these crossover studies, the researchers could not find commercial or non-commercial statistical software that quickly allowed computation of sample sizes for their designs, particularly software implementing the FDA requirement of using random-effects linear models for the analyses of bioequivalence studies. This article presents tables for sample-size evaluations of average bioequivalence studies based on the two crossover designs used in the EQUIGEN studies: the four-period, two-sequence, two-formulation design, and the six-period, three-sequence, three-formulation design. Sample-size computations assume that random-effects linear models are used in bioequivalence analyses with crossover designs. Random-effects linear models have traditionally been viewed by many pharmacologists and clinical researchers as mere mathematical devices for analyzing repeated-measures data. In contrast, a modern view attributes to these models an important mathematical role in theoretical formulations of personalized medicine, because they not only have parameters that represent average patients but also parameters that represent individual patients. Moreover, the notation and language of random-effects linear models have evolved over the years. Thus, another goal of this article is to provide a presentation of the statistical modeling of data from bioequivalence studies that highlights this modern view, with special emphasis on power analyses and sample-size computations.

  3. Results for the Fourth Quarter Calendar Year 2015 Tank 50H Salt Solution Sample

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, C. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-01-11

    In this memorandum, the chemical and radionuclide contaminant results from the Fourth Quarter Calendar Year 2015 (CY15) sample of Tank 50H salt solution are presented in tabulated form. The Fourth Quarter CY15 Tank 50H samples were obtained on October 29, 2015 and received at Savannah River National Laboratory (SRNL) on October 30, 2015. The information from this characterization will be used by Defense Waste Processing Facility (DWPF) & Saltstone Facility Engineering for the transfer of aqueous waste from Tank 50H to the Salt Feed Tank in the Saltstone Production Facility, where the waste will be treated and disposed of in the Saltstone Disposal Facility. This memorandum compares results, where applicable, to Saltstone Waste Acceptance Criteria (WAC) limits and targets. Data pertaining to the regulatory limits for Resource Conservation and Recovery Act (RCRA) metals will be documented at a later time per the Task Technical and Quality Assurance Plan (TTQAP) for the Tank 50H saltstone task. The chemical and radionuclide contaminant results from the characterization of the Fourth Quarter Calendar Year 2015 (CY15) sampling of Tank 50H were requested by SRR personnel and details of the testing are presented in the SRNL Task Technical and Quality Assurance Plan.

  4. Sampling Development

    Science.gov (United States)

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  5. Compilation of PRF Canyon Floor Pan Sample Analysis Results

    Energy Technology Data Exchange (ETDEWEB)

    Pool, Karl N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Minette, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wahl, Jon H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Greenwood, Lawrence R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coffey, Deborah S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McNamara, Bruce K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bryan, Samuel A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Scheele, Randall D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Delegard, Calvin H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sinkov, Sergey I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Soderquist, Chuck Z. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fiskum, Sandra K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brown, Garrett N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clark, Richard A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-06-30

    On September 28, 2015, debris collected from the PRF (236-Z) canyon floor, Pan J, was observed to exhibit chemical reaction. The material had been transferred from the floor pan to a collection tray inside the canyon the previous Friday. Work in the canyon was stopped to allow Industrial Hygiene to perform monitoring of the material reaction. Canyon floor debris that had been sealed out was sequestered at the facility, a recovery plan was developed, and drum inspections were initiated to verify no additional reactions had occurred. On October 13, in-process drums containing other Pan J material were inspected and showed some indication of chemical reaction, limited to discoloration and degradation of inner plastic bags. All Pan J material was sealed back into the canyon and returned to collection trays. Based on the high airborne levels in the canyon during physical debris removal, ETGS (Encapsulation Technology Glycerin Solution) was used as a fogging/lock-down agent. On October 15, subject matter experts confirmed a reaction had occurred between nitrates (both Plutonium Nitrate and Aluminum Nitrate Nonahydrate (ANN) are present) in the Pan J material and the ETGS fixative used to lower airborne radioactivity levels during debris removal. Management stopped the use of fogging/lock-down agents containing glycerin on bulk materials, declared a Management Concern, and initiated the Potential Inadequacy in the Safety Analysis determination process. Additional drum inspections and laboratory analysis of both reacted and unreacted material are planned. This report compiles the results of many different sample analyses conducted by the Pacific Northwest National Laboratory on samples collected from the Plutonium Reclamation Facility (PRF) floor pans by the CH2MHill’s Plateau Remediation Company (CHPRC). Revision 1 added Appendix G that reports the results of the Gas Generation Rate and methodology. The scope of analyses requested by CHPRC includes the determination of

  6. Status report on the Zagreb Radiocarbon Laboratory – AMS and LSC results of VIRI intercomparison samples

    International Nuclear Information System (INIS)

    Sironić, Andreja; Krajcar Bronić, Ines; Horvatinčić, Nada; Barešić, Jadranka; Obelić, Bogomil; Felja, Igor

    2013-01-01

A new line for preparation of graphite samples for 14C dating by Accelerator Mass Spectrometry (AMS) in the Zagreb Radiocarbon Laboratory has been validated by preparing graphite from various materials distributed within the Fifth International Radiocarbon Intercomparison (VIRI) study. The 14C activity of the prepared graphite was measured at the SUERC AMS facility. The results are statistically evaluated by means of z-score and u-score values. The mean z-score value of 28 prepared VIRI samples is (0.06 ± 0.23), showing excellent agreement with the consensus VIRI values. Only one sample resulted in a u-score value above the limit of acceptability (defined for the confidence interval of 99%), and this was probably caused by random contamination of the graphitization rig. After the rig had been moved to a new, adapted and isolated room, all u-score values lay within the acceptable limits. Our LSC results for the VIRI intercomparison samples are also presented; all are accepted according to their u-score values.
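The acceptance screening described here can be reproduced with simple formulas. A minimal sketch, in which the numerical values (in pMC) and uncertainties are hypothetical and the exact uncertainty treatment used by VIRI may differ:

```python
import math

def z_score(value, consensus, sigma):
    """z-score of a laboratory result against the consensus value,
    scaled by the target standard deviation sigma."""
    return (value - consensus) / sigma

def u_score(value, u_value, consensus, u_consensus):
    """u-score: absolute deviation weighted by the combined standard
    uncertainties; values above ~2.58 fall outside the 99% confidence
    interval used as the limit of acceptability."""
    return abs(value - consensus) / math.sqrt(u_value**2 + u_consensus**2)

# Hypothetical 14C intercomparison result (assumed numbers, pMC units).
z = z_score(104.2, consensus=104.0, sigma=0.5)
u = u_score(104.2, u_value=0.4, consensus=104.0, u_consensus=0.3)
```

Both scores here evaluate to 0.4, comfortably inside the acceptance limits; a u-score above 2.58 would flag the sample as the study's criterion does.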

  7. Results For The Third Quarter 2013 Tank 50 WAC Slurry Sample

    Energy Technology Data Exchange (ETDEWEB)

    Bannochie, Christopher J.

    2013-11-26

    This report details the chemical and radionuclide contaminant results for the characterization of the 2013 Third Quarter sampling of Tank 50 for the Saltstone Waste Acceptance Criteria (WAC) in effect at that time. Information from this characterization will be used by DWPF & Saltstone Facility Engineering (DSFE) to support the transfer of low-level aqueous waste from Tank 50 to the Salt Feed Tank in the Saltstone Facility in Z-Area, where the waste will be immobilized. This information is also used to update the Tank 50 Waste Characterization System.

  8. Design and sampling plan optimization for RT-qPCR experiments in plants: a case study in blueberry

    Directory of Open Access Journals (Sweden)

    Jose V Die

    2016-03-01

The qPCR assay has become a routine technology in plant biotechnology and agricultural research. The technique itself is unlikely to be improved substantially, but challenges remain in minimizing the variability of results and in transparency when reporting technical data in support of the conclusions of a study. A number of aspects of the pre- and post-assay workflow contribute to variability of results. Here, by studying the introduction of error in qPCR measurements at different stages of the workflow, we describe the most important causes of technical variability in a case study using blueberry. We found that the stage at which increasing the number of replicates is most beneficial depends on the tissue used. For example, we would recommend more RT replicates when working with leaf tissue, and more sampling (RNA extraction) replicates when working with stems or fruits. The use of more qPCR replicates provides the least benefit, as it is the most reproducible step. Knowing the distribution of error over an entire experiment and the costs at each step, we have developed a script to identify the optimal sampling plan within the limits of a given budget. These findings should help plant scientists improve the design of qPCR experiments and refine their laboratory practices in order to conduct qPCR assays in a more reliable manner and produce more consistent and reproducible data.
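The idea behind such an optimization script can be sketched as a brute-force search over replicate numbers that minimizes the variance of the experiment mean while staying within a budget. The variance components, unit costs, and budget below are hypothetical, and the authors' actual cost model may differ.

```python
from itertools import product

def variance_of_mean(n_s, n_rt, n_q, s2_s, s2_rt, s2_q):
    """Variance of the overall mean for a nested design with n_s
    sampling (RNA extraction) replicates, n_rt RT replicates per
    extraction, and n_q qPCR replicates per RT reaction."""
    return s2_s / n_s + s2_rt / (n_s * n_rt) + s2_q / (n_s * n_rt * n_q)

def best_plan(budget, unit_costs, s2, max_n=6):
    """Exhaustively search replicate numbers (1..max_n each) for the
    plan that minimizes variance at a total cost within budget."""
    c_s, c_rt, c_q = unit_costs
    best = None
    for n_s, n_rt, n_q in product(range(1, max_n + 1), repeat=3):
        cost = n_s * (c_s + n_rt * (c_rt + n_q * c_q))
        if cost <= budget:
            v = variance_of_mean(n_s, n_rt, n_q, *s2)
            if best is None or v < best[0]:
                best = (v, (n_s, n_rt, n_q))
    return best

# Hypothetical per-step error variances (sampling, RT, qPCR) and costs.
v, (n_s, n_rt, n_q) = best_plan(budget=100, unit_costs=(10, 4, 1),
                                s2=(1.0, 0.5, 0.1))
```

With these made-up numbers the search favours spending on extraction and RT replicates over qPCR replicates, mirroring the paper's finding that the qPCR step is the most reproducible.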

  9. Advancing the Use of Passive Sampling in Risk Assessment and Management of Sediments Contaminated with Hydrophobic Organic Chemicals: Results of an International Ex Situ Passive Sampling Interlaboratory Comparison.

    Science.gov (United States)

    Jonker, Michiel T O; van der Heijden, Stephan A; Adelman, Dave; Apell, Jennifer N; Burgess, Robert M; Choi, Yongju; Fernandez, Loretta A; Flavetta, Geanna M; Ghosh, Upal; Gschwend, Philip M; Hale, Sarah E; Jalalizadeh, Mehregan; Khairy, Mohammed; Lampi, Mark A; Lao, Wenjian; Lohmann, Rainer; Lydy, Michael J; Maruya, Keith A; Nutile, Samuel A; Oen, Amy M P; Rakowska, Magdalena I; Reible, Danny; Rusina, Tatsiana P; Smedes, Foppe; Wu, Yanwen

    2018-03-20

This work presents the results of an international interlaboratory comparison on ex situ passive sampling in sediments. The main objectives were to map the state of the science in passively sampling sediments, identify sources of variability, provide recommendations and practical guidance for standardized passive sampling, and advance the use of passive sampling in regulatory decision making by increasing confidence in the use of the technique. The study was performed by a consortium of 11 laboratories and included experiments with 14 passive sampling formats on 3 sediments for 25 target chemicals (PAHs and PCBs). The resulting overall interlaboratory variability was large (a factor of ∼10), but standardization of methods halved this variability. The remaining variability was primarily due to factors not related to passive sampling itself, i.e., sediment heterogeneity and analytical chemistry. Excluding the latter source of variability, by performing all analyses in one laboratory, showed that passive sampling results can have a high precision and a very low intermethod variability. These results demonstrate that passive sampling, irrespective of the specific method used, is fit for implementation in risk assessment and management of contaminated sediments, provided that method setup and performance, as well as the chemical analyses, are quality-controlled.

  10. Characterization Results for the January 2017 H-Tank Farm 2H Evaporator Overhead Sample

    Energy Technology Data Exchange (ETDEWEB)

    Truong, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Nicholson, J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-04-11

    This report contains the radioanalytical results of the 2H evaporator overhead sample received at SRNL on January 19, 2017. Specifically, concentrations of 137Cs, 90Sr, and 129I are reported and compared to the corresponding Waste Acceptance Criteria (WAC) limits of the Effluent Treatment Project (ETP) Waste Water Collection Tank (WWCT) (rev. 6). All of the radionuclide concentrations in the sample were found to be in compliance with the ETP WAC limits.

  11. Two‐phase designs for joint quantitative‐trait‐dependent and genotype‐dependent sampling in post‐GWAS regional sequencing

    Science.gov (United States)

    Espin‐Garcia, Osvaldo; Craiu, Radu V.

    2017-01-01

ABSTRACT We evaluate two-phase designs to follow up findings from a genome-wide association study (GWAS) when the cost of regional sequencing in the entire cohort is prohibitive. We develop novel expectation-maximization-based inference under a semiparametric maximum likelihood formulation tailored for post-GWAS inference. A GWAS SNP (where SNP is single nucleotide polymorphism) serves as a surrogate covariate in inferring association between a sequence variant and a normally distributed quantitative trait (QT). We assess test validity and quantify efficiency and power of joint QT- and SNP-dependent sampling and analysis under alternative sample allocations by simulations. Joint allocation balanced on SNP genotype and extreme-QT strata yields significant power improvements compared to marginal QT- or SNP-based allocations. We illustrate the proposed method and evaluate the sensitivity of sample allocation to sampling variation using data from a sequencing study of systolic blood pressure. PMID:29239496

  12. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals

    Directory of Open Access Journals (Sweden)

    Parsons Nick R

    2012-04-01

Abstract Background The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, statistics is still often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. Methods A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. Results The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval: 10–26%) of the studies investigated the conclusions were not clearly justified by the results; in 39% (30–49%) of studies a different analysis should have been undertaken; and in 17% (10–26%) a different analysis could have made a difference to the overall conclusions. Conclusion It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.
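The bracketed intervals (e.g. 17% with CI 10–26% for n = 100) appear consistent with an exact binomial (Clopper-Pearson) interval; that attribution is my assumption, not stated in the abstract. A stdlib-only check by bisection on the binomial tails:

```python
from math import comb

def tail_ge(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def solve_increasing(f, target, tol=1e-8):
    """Bisection for f(p) = target, with f increasing on [0, 1]."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def clopper_pearson(k, n, alpha=0.05):
    """Exact two-sided (1 - alpha) confidence interval for a binomial
    proportion: lower solves P(X >= k) = alpha/2, upper solves
    P(X <= k) = alpha/2."""
    lower = 0.0 if k == 0 else solve_increasing(lambda p: tail_ge(k, n, p), alpha / 2)
    upper = 1.0 if k == n else solve_increasing(lambda p: tail_ge(k + 1, n, p), 1 - alpha / 2)
    return lower, upper

lo, hi = clopper_pearson(17, 100)  # brackets the reported 10-26%
```

The same function reproduces the other reported intervals (39%: 30–49%) when called with `clopper_pearson(39, 100)`.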

  13. The economic impact of poor sample quality in clinical chemistry laboratories: results from a global survey.

    Science.gov (United States)

    Erdal, Erik P; Mitra, Debanjali; Khangulov, Victor S; Church, Stephen; Plokhoy, Elizabeth

    2017-03-01

    Background Despite advances in clinical chemistry testing, poor blood sample quality continues to impact laboratory operations and the quality of results. While previous studies have identified the preanalytical causes of lower sample quality, few studies have examined the economic impact of poor sample quality on the laboratory. Specifically, the costs associated with workarounds related to fibrin and gel contaminants remain largely unexplored. Methods A quantitative survey of clinical chemistry laboratory stakeholders across 10 international regions, including countries in North America, Europe and Oceania, was conducted to examine current blood sample testing practices, sample quality issues and practices to remediate poor sample quality. Survey data were used to estimate costs incurred by laboratories to mitigate sample quality issues. Results Responses from 164 participants were included in the analysis, which was focused on three specific issues: fibrin strands, fibrin masses and gel globules. Fibrin strands were the most commonly reported issue, with an overall incidence rate of ∼3%. Further, 65% of respondents indicated that these issues contribute to analyzer probe clogging, and the majority of laboratories had visual inspection and manual remediation practices in place to address fibrin- and gel-related quality problems (55% and 70%, respectively). Probe maintenance/replacement, visual inspection and manual remediation were estimated to carry significant costs for the laboratories surveyed. Annual cost associated with lower sample quality and remediation related to fibrin and/or gel globules for an average US laboratory was estimated to be $100,247. Conclusions Measures to improve blood sample quality present an important step towards improved laboratory operations.

  14. Monitoring well design and sampling techniques at NAPL sites

    International Nuclear Information System (INIS)

    Collins, M.; Rohrman, W.R.; Drake, K.D.

    1992-01-01

    The existence of Non-Aqueous Phase Liquids (NAPLs) at many Superfund and RCRA hazardous waste sites has become a recognized problem in recent years. The large number of sites exhibiting this problem results from the fact that many of the most frequently used industrial solvents and petroleum products can exist as NAPLs. Hazardous waste constituents occurring as NAPLs possess a common characteristic that causes great concern during groundwater contamination evaluation: while solubility in water is generally very low, it is sufficient to cause groundwater to exceed Maximum Contamination Levels (MCLs). Thus, even a small quantity of NAPL within a groundwater regime can act as a point source with the ability to contaminate vast quantities of groundwater over time. This property makes it imperative that groundwater investigations focus heavily on characterizing the nature, extent, and migration pathways of NAPLs at sites where it exists. Two types of NAPLs may exist in a groundwater system. Water-immiscible liquid constituents having a specific gravity greater than one are termed Dense Non-Aqueous Phase Liquids, while those with a specific gravity less than one are considered Light Non-Aqueous Phase Liquids. For a groundwater investigation to properly characterize the two types of NAPLs, careful consideration must be given to the placement and sampling of groundwater monitoring wells. Unfortunately, technical reviewers at EPA Region VII and the Corps of Engineers find that many groundwater investigations fall short in characterizing NAPLs because several basic considerations were overlooked. Included among these are monitoring well location and screen placement with respect to the water table and significant confining units, and the ability of the well sampling method to obtain samples of NAPL. Depending on the specific gravity of the NAPL that occurs at a site, various considerations can substantially enhance adequate characterization of NAPL contaminants

  15. Biased representation of disturbance rates in the roadside sampling frame in boreal forests: implications for monitoring design

    Directory of Open Access Journals (Sweden)

    Steven L. Van Wilgenburg

    2015-12-01

The North American Breeding Bird Survey (BBS) is the principal source of data to inform researchers about the status of and trends for boreal forest birds. Unfortunately, little BBS coverage is available in the boreal forest, where increasing concern over the status of species breeding there has increased interest in northward expansion of the BBS. However, high disturbance rates in the boreal forest may complicate roadside monitoring. If the roadside sampling frame does not capture variation in disturbance rates, because of either road placement or the use of roads for resource extraction, biased trend estimates might result. In this study, we examined roadside bias in the proportional representation of habitat disturbance via spatial data on forest "loss," forest fires, and anthropogenic disturbance. For each of 455 BBS routes, the area disturbed within multiple buffers away from the road was calculated and compared against the area disturbed in degree blocks and BBS strata. We found a nonlinear relationship between bias and distance from the road, suggesting forest loss and forest fires were underrepresented below 75 and 100 m, respectively. In contrast, anthropogenic disturbance was overrepresented at distances below 500 m and underrepresented thereafter. After accounting for distance from road, BBS routes were reasonably representative of the degree blocks they were within, with only a few strata showing biased representation. In general, anthropogenic disturbance is overrepresented in southern strata, and forest fires are underrepresented in almost all strata. Similar biases exist when comparing the entire road network and the subset sampled by BBS routes against the amount of disturbance within BBS strata; however, the magnitude of the biases differed. Based on our results, we recommend that spatial stratification and rotating panel designs be used to spread limited BBS and off-road sampling effort in an unbiased fashion and that new BBS routes…

  16. Results from the Industrial Design Workshop at DEWI

    International Nuclear Information System (INIS)

    Soeker, H.

    1994-01-01

Like any other manufacturer of industrial goods, wind energy companies face harsh conditions such as the growing technical maturity of the wind energy market, shortened product development phases and increasing competition. DEWI's Industrial Design Workshop, which took place in early March 1994, suggested the integration of an industrial design vision into product development as a remedy. Presentations were given on the need for industrial design in wind energy technology, as well as on its self-image and potential. The discussion showed the still prevailing mutual non-acceptance of competence between industry executives and designers. In a second sequence, five design projects and studies carried out by professional designers and industrial design students were presented. Again, passionate discussions followed each presentation. In summary, a clear rapprochement between the participating wind energy developers and industrial designers can be noted. A detailed documentation of the workshop in German will soon be available. For more information call ++49-4421-48 08 25. (orig.)

  17. Design of real-time monitoring and control system of 222Rn/220Rn sampling for radon chamber

    International Nuclear Information System (INIS)

    Wu Rongyan; Zhao Xiuliang; Zhang Meiqin; Yu Hong

    2008-01-01

This paper describes the design of a 222Rn/220Rn sampling monitoring and control system based on an Intel 51-series single-chip microcomputer. The hardware design covers the selection and use of the sensor chips, A/D converter chip, USB interface chip, keyboard chip, digital display chip, photoelectric-coupling isolation chips, and driver circuit chips for the direct-current pump. The software comprises a personal computer (PC) program and single-chip microcomputer (SCM) firmware. Data acquisition and conversion and the flow control of the direct-current pump are implemented in Visual Basic and assembly language, and the program flow charts are given. Furthermore, the stability of the direct-current pump was improved by means of a PID control algorithm. (authors)
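A PID control loop of the kind mentioned can be sketched in a few lines. The gains, setpoint, and the simplistic first-order pump response below are hypothetical, and the original system runs in assembly on an Intel 51-series SCM rather than in a high-level language.

```python
class PID:
    """Minimal positional PID controller (illustrative gains only)."""

    def __init__(self, kp, ki, kd, setpoint, dt=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint, self.dt = setpoint, dt
        self.integral = 0.0
        self.prev_err = None

    def update(self, measured):
        """Return the control output for one sampling interval."""
        err = self.setpoint - measured
        self.integral += err * self.dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Toy loop: drive a crude pump-flow model toward a 2.0 L/min setpoint.
pid = PID(kp=0.8, ki=0.2, kd=0.05, setpoint=2.0)
flow = 0.0
for _ in range(50):
    flow += 0.5 * pid.update(flow)  # assumed actuator response
```

After a brief overshoot the loop settles at the setpoint; on the SCM the same three terms would be computed in fixed point inside the timer interrupt that paces the pump drive.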

  18. Designing optimal sampling schemes for field visits

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2008-10-01

This is a presentation of a statistical method for deriving optimal spatial sampling schemes. The research focuses on ground verification of minerals derived from hyperspectral data. Spectral angle mapper (SAM) and spectral feature fitting (SFF...

  19. Results From a Channel Restoration Project: Hydraulic Design Considerations

    Science.gov (United States)

    Karle, K.F.; Densmore, R.V.; ,

    2001-01-01

    Techniques for the hydraulic restoration of placer-mined streams and floodplains were developed in Denali National Park and Preserve, Alaska. The two-year study at Glen Creek focused on a design of stream and floodplain geometry using hydraulic capacity and shear stress equations. Slope and sinuosity values were based on regional relationships. Design requirements included a channel capacity for a bankfull discharge and a floodplain capacity for a 1.5- to 100-year discharge. Several bio-engineering techniques using alder and willow, including anchored brush bars, streambank hedge layering, seedlings, and cuttings, were tested to dissipate floodwater energy and encourage sediment deposition until natural revegetation stabilized the new floodplains. Permanently monumented cross-sections installed throughout the project site were surveyed every one to three years. Nine years after the project began, a summer flood caused substantial damage to the channel form, including a change in width/depth ratio, slope, and thalweg location. Many of the alder brush bars were heavily damaged or destroyed, resulting in significant bank erosion. This paper reviews the original hydraulic design process, and describes changes to the channel and floodplain geometry over time, based on nine years of cross-section surveys.

  20. The Design of Sample Driver System for Gamma Irradiator Facility at Thermal Column of Kartini Reactor

    International Nuclear Information System (INIS)

    Suyamto; Tasih Mulyono; Setyo Atmojo

    2007-01-01

    The design and construction of a sample driver system for the gamma irradiator facility at the thermal column of the Kartini reactor (post operation) has been carried out. The design and construction are based on the space available in the thermal column and on a sample rotation speed that has to be as low as possible so that the irradiation process is more homogeneous. The electrical and mechanical calculations were done after fixing the electrical motor and transmission system to be applied. Assuming a maximum sample weight of 50 kg, the electric motor specification was decided from its rating: single-phase induction motor, run-capacitor type, 0.5 HP; 220 V; 3.61 A, CCW and CW, rotation speed 1430 rpm. To achieve the low load rotation speed, the motor speed was reduced twice, using a conical reduction gear with a reduction ratio of 3.9 and a thread reduction gear with a reduction ratio of 60. From the calculation it is found that the motor power is 118.06 W and the rotation speed of the load sample is 6.11 rpm at the no-load motor speed of 1430 rpm. Tests with loads varied up to 75 kg showed that the device operates well in both directions, with average motor and load speeds of 1486 rpm and 6.3 rpm, respectively, so that the slip is 0.268% and 0.314% for the no-load and full-load conditions. The difference in motor input current between the no-load and full-load conditions is relatively small, i.e. 0.14 A. The safety factor of the motor is 316%, which corresponds to a load weight of 158 kg. (author)
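The reported load speed follows directly from the two reduction ratios given in the abstract; a short check, with the helper function name being illustrative:

```python
# Reproduce the gear-train speed reduction reported for the sample driver.
# The numeric values (1430 rpm motor, 3.9:1 conical gear, 60:1 thread gear)
# are taken from the abstract; the function name is illustrative.

def output_speed(motor_rpm: float, *ratios: float) -> float:
    """Speed after passing through successive reduction stages."""
    speed = motor_rpm
    for r in ratios:
        speed /= r
    return speed

rpm = output_speed(1430, 3.9, 60)
print(round(rpm, 2))  # ≈ 6.11 rpm, matching the reported sample rotation speed
```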

  1. Vanishing auxiliary variables in PPS sampling - with applications in microscopy

    DEFF Research Database (Denmark)

    Andersen, Ina Trolle; Hahn, Ute; Jensen, Eva B. Vedel

    Recently, non-uniform sampling has been suggested in microscopy to increase efficiency. More precisely, sampling proportional to size (PPS) has been introduced, where the probability of sampling a unit in the population is proportional to the value of an auxiliary variable. Unfortunately, vanishing auxiliary variables are a common phenomenon in microscopy and, accordingly, part of the population is not accessible using PPS sampling. We propose a modification of the design, for which an optimal solution can be found using a model assisted approach. The optimal design has independent interest in sampling theory. We verify robustness of the new approach by numerical results, and we use real data to illustrate the applicability.
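A toy illustration of why vanishing auxiliary variables break plain PPS sampling: units whose auxiliary value is zero can never be selected, so a design-unbiased estimator targets the accessible part of the population only. The population values below are invented, and the estimator shown is the standard Hansen-Hurwitz form, not the paper's modified design:

```python
import random

# Hypothetical population: y is the target variable, x the auxiliary size variable.
y = [4.0, 9.0, 7.0, 12.0, 3.0]
x = [2.0, 5.0, 3.0, 6.0, 0.0]    # last unit has a vanishing auxiliary variable

total_x = sum(x)
p = [xi / total_x for xi in x]   # selection probabilities; p[4] == 0

random.seed(1)
n = 1000
sample = random.choices(range(len(y)), weights=x, k=n)  # PPS with replacement

# Hansen-Hurwitz estimator of the population total
est_total = sum(y[i] / p[i] for i in sample) / n
# The true total is 35, but the unit with x == 0 is never sampled,
# so the estimator converges to 32: the total over accessible units only.
print(round(est_total, 1))
```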

  2. JWST-MIRI spectrometer main optics design and main results

    Science.gov (United States)

    Navarro, Ramón; Schoenmaker, Ton; Kroes, Gabby; Oudenhuysen, Ad; Jager, Rieks; Venema, Lars

    2017-11-01

    MIRI ('Mid InfraRed Instrument') is the combined imager and integral field spectrometer for the 5-29 micron wavelength range under development for the James Webb Space Telescope JWST. The flight acceptance tests of the Spectrometer Main Optics flight models (SMO), part of the MIRI spectrometer, are completed in the summer of 2008 and the system is delivered to the MIRI-JWST consortium. The two SMO arms contain 14 mirrors and form the MIRI optical system together with 12 selectable gratings on grating wheels. The entire system operates at a temperature of 7 Kelvin and is designed on the basis of a 'no adjustments' philosophy. This means that the optical alignment precision depends strongly on the design, tolerance analysis and detailed knowledge of the manufacturing process. Because in principle no corrections are needed after assembly, continuous tracking of the alignment performance during the design and manufacturing phases is important. The flight hardware is inspected with respect to performance parameters like alignment and image quality. The stability of these parameters is investigated after exposure to various vibration levels and successive cryogenic cool downs. This paper describes the philosophy behind the acceptance tests, the chosen test strategy and reports the results of these tests. In addition the paper covers the design of the optical test setup, focusing on the simulation of the optical interfaces of the SMO. Also the relation to the SMO qualification and verification program is addressed.

  3. Network Model-Assisted Inference from Respondent-Driven Sampling Data.

    Science.gov (United States)

    Gile, Krista J; Handcock, Mark S

    2015-06-01

    Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population.
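The model-assisted estimator itself is beyond a short sketch, but the baseline design-based idea it builds on — weighting respondents by the inverse of their reported network degree, as in the RDS-II (Volz-Heckathorn) estimator — can be illustrated with hypothetical data:

```python
# Inverse-degree weighting for RDS data (RDS-II style estimator).
# The degrees and trait indicators below are hypothetical.

degrees = [2, 5, 10, 4, 8, 3]     # reported personal network sizes
infected = [1, 0, 1, 0, 0, 1]     # trait indicator per respondent

w = [1.0 / d for d in degrees]    # high-degree respondents are over-sampled,
                                  # so they receive smaller weights
prevalence = sum(wi * yi for wi, yi in zip(w, infected)) / sum(w)

naive = sum(infected) / len(infected)
print(round(naive, 3), round(prevalence, 3))
```

Here the weighted estimate differs from the naive sample proportion because the trait happens to concentrate among low-degree respondents, exactly the kind of degree-trait association that makes unweighted RDS estimates biased.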

  4. Penn State geoPebble system: Design,Implementation, and Initial Results

    Science.gov (United States)

    Urbina, J. V.; Anandakrishnan, S.; Bilen, S. G.; Fleishman, A.; Burkett, P.

    2014-12-01

    The Penn State geoPebble system is a new network of wirelessly interconnected seismic and GPS sensor nodes with a flexible architecture. This network will be used for studies of ice sheets in Antarctica and Greenland, as well as to investigate mountain glaciers. The network will consist of ~150 geoPebbles that can be deployed in a user-defined spatial geometry. We present our design methodology, which has enabled us to develop these state-of-the-art sensors using commercial off-the-shelf hardware combined with custom-designed hardware and software. Each geoPebble is a self-contained, wirelessly connected sensor for collecting seismic measurements and position information. Key elements of each node encompass a three-component seismic recorder, which includes an amplifier, filter, and 24-bit analog-to-digital converter that can sample at up to 10 kHz. Each unit also includes a microphone channel to record the ground-coupled airwave. The timing for each node is available from GPS measurements and a local precision oscillator that is conditioned by the GPS timing pulses. In addition, we record the carrier-phase measurement of the L1 GPS signal in order to determine location at sub-decimeter accuracy (relative to other geoPebbles within a few kilometers' radius). Each geoPebble includes 16 GB of solid-state storage, wireless communications capability to a central supervisory unit, and auxiliary measurement capability (including tilt from accelerometers, absolute orientation from magnetometers, and temperature). A novel aspect of the geoPebble is a wireless charging system for the internal battery (using inductive coupling techniques). The geoPebbles include all the sensors (geophones, GPS, microphone), communications (WiFi), and power (battery and charging) internally, so the geoPebble system can operate without any cabling connections (though we do provide an external connector so that different geophones can be used). We report initial field-deployment results and

  5. Light scattering by ultrasonically-controlled small particles: system design, calibration, and measurement results

    Science.gov (United States)

    Kassamakov, Ivan; Maconi, Göran; Penttilä, Antti; Helander, Petteri; Gritsevich, Maria; Puranen, Tuomas; Salmi, Ari; Hæggström, Edward; Muinonen, Karri

    2018-02-01

    We present the design of a novel scatterometer for precise measurement of the angular Mueller matrix profile of a mm- to µm-sized sample held in place by sound. The scatterometer comprises a tunable multimode argon-krypton laser (with the possibility to select 1 of 12 wavelengths in the visible range), linear polarizers, a reference photomultiplier tube (PMT) for monitoring the beam intensity, and a micro-PMT module mounted radially towards the sample at an adjustable radius. The measurement angle is controlled by a motor-driven rotation stage with an accuracy of 15'. The system is fully automated using LabVIEW, including the FPGA-based data acquisition and the instrument's user interface. The calibration protocol ensures accurate measurements by using a control sphere sample (diameter 3 mm, refractive index 1.5) fixed first on a static holder, followed by accurate multi-wavelength measurements of the same sample levitated ultrasonically. To demonstrate the performance of the scatterometer, we conducted detailed measurements of light scattered by a particle derived from the Chelyabinsk meteorite, as well as by planetary analogue materials. The measurements are the first of this kind, since they are obtained using controlled spectral angular scattering, including linear polarization effects, for arbitrarily shaped objects. Thus, our novel approach permits a non-destructive, disturbance-free measurement with control of the orientation and location of the scattering object.

  6. LDEF materials results for spacecraft applications: Executive summary

    Science.gov (United States)

    Whitaker, A. F.; Dooling, D.

    1995-03-01

    To address the challenges of space environmental effects, NASA designed the Long Duration Exposure Facility (LDEF) for an 18-month mission to expose thousands of samples of candidate materials that might be used on a space station or other orbital spacecraft. LDEF was launched in April 1984 and was to have been returned to Earth in 1985. Changes in mission schedules postponed retrieval until January 1990, after 69 months in orbit. Analyses of the samples recovered from LDEF have provided spacecraft designers and managers with the most extensive database on space materials phenomena. Many LDEF samples were greatly changed by extended space exposure. Among even the most radically altered samples, NASA and its science teams are finding a wealth of surprising conclusions and tantalizing clues about the effects of space on materials. Many were discussed at the first two LDEF results conferences and in subsequent professional papers. The LDEF Materials Results for Spacecraft Applications Conference was convened in Huntsville to discuss implications for spacecraft design. Already, paint and thermal blanket selections for space station and other spacecraft have been affected by LDEF data. This volume synopsizes those results.

  7. Objective sampling design in a highly heterogeneous landscape - characterizing environmental determinants of malaria vector distribution in French Guiana, in the Amazonian region.

    Science.gov (United States)

    Roux, Emmanuel; Gaborit, Pascal; Romaña, Christine A; Girod, Romain; Dessay, Nadine; Dusfour, Isabelle

    2013-12-01

    Sampling design is a key issue when establishing species inventories and characterizing habitats within highly heterogeneous landscapes. Sampling efforts in such environments may be constrained, and many field studies rely only on subjective and/or qualitative approaches to design their collection strategy. The region of Cacao, in French Guiana, provides an excellent study site to understand the presence and abundance of Anopheles mosquitoes, their species dynamics and the transmission risk of malaria across various environments. We propose an objective methodology to define a stratified sampling design. Following thorough environmental characterization, a factorial analysis of mixed groups allows the data to be reduced and non-collinear principal components to be identified while balancing the influences of the different environmental factors. These components define new variables that can then be used in a robust k-means clustering procedure. We identified five clusters that corresponded to our sampling strata and selected sampling sites in each stratum. We validated our method by comparing the species overlap of entomological collections from the selected sites with the environmental similarities of the same sites. The Morisita index was significantly correlated (Pearson linear correlation) with environmental similarity based on i) the balanced environmental variable groups considered jointly (p = 0.001) and ii) land cover/use, supporting the proposed sampling approach. Land cover/use maps (based on high spatial resolution satellite images) were shown to be particularly useful when studying the presence, density and diversity of Anopheles mosquitoes at local scales and in very heterogeneous landscapes.
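The stratification step — clustering sites on reduced, standardized environmental components and treating each cluster as a sampling stratum — can be sketched with a minimal k-means over synthetic component scores. The data, the choice of k = 3, and the stratum-size report are illustrative assumptions, not the paper's actual FAMD output or its five strata:

```python
import numpy as np

# Synthetic stand-ins for standardized principal-component scores of sites.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.3, size=(30, 2)) for m in (-2.0, 0.0, 2.0)])

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each site to its nearest centroid
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        # recompute centroids, keeping the old one if a cluster empties
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(X, k=3)
# Each cluster becomes one sampling stratum; report stratum sizes.
for j in range(3):
    print("stratum", j, "sites:", int((labels == j).sum()))
```

Sampling sites would then be drawn within each stratum, which is what makes the resulting design stratified rather than purely opportunistic.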

  8. FFTF thermal-hydraulic testing results affecting piping and vessel component design in LMFBR's

    International Nuclear Information System (INIS)

    Stover, R.L.; Beaver, T.R.; Chang, S.C.

    1983-01-01

    The Fast Flux Test Facility completed four years of pre-operational testing in April 1982. This paper describes thermal-hydraulic testing results from this period which impact piping and vessel component design in LMFBRs. Data discussed are piping flow oscillations, piping thermal stratification and vessel upper plenum stratification. Results from testing verified that plant design limits were met.

  9. 40 CFR 761.298 - Decisions based on PCB concentration measurements resulting from sampling.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Decisions based on PCB concentration... Cleanup and On-Site Disposal of Bulk PCB Remediation Waste and Porous Surfaces in Accordance With § 761.61(a)(6) § 761.298 Decisions based on PCB concentration measurements resulting from sampling. (a) For...

  10. Implications for monitoring: study designs and interpretation of results

    International Nuclear Information System (INIS)

    Green, R. H.; Montagna, P.

    1996-01-01

    Two innovative statistical approaches to the interpretation and generalization of the results from the study of long-term environmental impacts of offshore oil and gas exploration and production in the Gulf of Mexico were described. The first of the two methods, the Sediment Quality Triad approach, relies on a test of coherence of responses, whereas the second approach uses small scale spatial heterogeneity of response as evidence of impact. As far as the study design was concerned, it was argued that differing objectives which are demanded of the same study (e.g. generalization about environmental impact of similar platforms versus the spatial pattern of impact around individual platforms) are frequently in conflict. If at all possible, they should be avoided since the conflicting demands tend to compromise the design for both situations. 31 refs., 5 figs

  11. Evaluation of sampling strategies to estimate crown biomass

    Directory of Open Access Journals (Sweden)

    Krishna P Poudel

    2015-01-01

    Full Text Available Background Depending on tree and site characteristics, crown biomass accounts for a significant portion of the total aboveground biomass of a tree. Crown biomass estimation is useful for different purposes, including evaluating the economic feasibility of crown utilization for energy production or forest products, fuel load assessments and fire management strategies, and wildfire modeling. However, crown biomass is difficult to predict because of the variability within and among species and sites. Thus the allometric equations used for predicting crown biomass should be based on data collected with precise and unbiased sampling strategies. In this study, we evaluate the performance of different sampling strategies for estimating crown biomass and the effect of sample size on those estimates. Methods Using data collected from 20 destructively sampled trees, we evaluated 11 different sampling strategies using six evaluation statistics: bias, relative bias, root mean square error (RMSE), relative RMSE, amount of biomass sampled, and relative biomass sampled. We also evaluated the performance of the selected sampling strategies when different numbers of branches (3, 6, 9, and 12) are selected from each tree. A tree-specific log-linear model with branch diameter and branch length as covariates was used to obtain individual branch biomass. Results Compared to all other methods, stratified sampling with the probability-proportional-to-size estimation technique produced better results when three or six branches per tree were sampled. However, systematic sampling with the ratio estimation technique was best when at least nine branches per tree were sampled. Under the stratified sampling strategy, selecting an unequal number of branches per stratum produced results approximately similar to simple random sampling, but RMSE decreased further when information on branch diameter was used in the design and estimation phases. Conclusions Use of
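The ratio estimation technique named in the Results can be illustrated in a few lines: biomass is measured on a subsample of branches, and the sample biomass-to-auxiliary ratio is expanded over the auxiliary total for the whole crown. All branch values below are hypothetical, and squared basal diameter is only one plausible auxiliary:

```python
# Ratio estimator sketch: estimate total crown biomass of one tree from a
# subsample of destructively sampled branches, using branch basal
# diameter squared as the auxiliary variable. Numbers are hypothetical.

diam2 = [d * d for d in [2.0, 3.5, 1.5, 4.0, 2.5, 3.0]]   # all branches, cm^2
biomass = [1.1, 3.4, 0.6, 4.5, 1.7, 2.6]                  # kg (unknown in practice)

sampled = [1, 3, 5]                # indices of the destructively sampled branches
r = sum(biomass[i] for i in sampled) / sum(diam2[i] for i in sampled)
est_total = r * sum(diam2)         # ratio expansion to the whole crown

print(round(est_total, 2), round(sum(biomass), 2))
```

The estimator works well when biomass is roughly proportional to the auxiliary, which is exactly why the paper finds that using branch diameter in the estimation phase reduces RMSE.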

  12. Recent results relevant to ignition physics and machine design issues

    International Nuclear Information System (INIS)

    Coppi, B.; Airoldi, A.; Bombarda, F.

    2001-01-01

    The plasma regimes under which ignition can be achieved involve a characteristic range of parameters and issues on which information has been provided by recent experiments. In particular, these results have motivated a new, in-depth analysis of the expected performance of the Ignitor machine as well as of the plasma processes that it can investigate. The main results and recent advances in the design of key systems of the machine are reported. (author)

  13. Recent results relevant to ignition physics and machine design issues

    International Nuclear Information System (INIS)

    Coppi, B.; Airoldi, A.; Bombarda, F.

    1999-01-01

    The plasma regimes under which ignition can be achieved involve a characteristic range of parameters and issues on which information has been provided by recent experiments. In particular, these results have motivated a new, in-depth analysis of the expected performance of the Ignitor machine as well as of the plasma processes that it can investigate. The main results and recent advances in the design of key systems of the machine are reported. (author)

  14. New sample carrier systems for thermogravimetric analysis under forced flow conditions and their influence on microkinetic results.

    Science.gov (United States)

    Seibel, C; Fieback, T M

    2015-09-01

    For thermogravimetric analysis, it has been shown that, depending on the type of sample container, different kinetic results can be obtained for the same reaction under constant conditions. This is due to limiting macrokinetic effects which are strongly dependent on the type of sample carrying system. This prompted the need for sample containers which deliver results minimally limited by diffusive mass transport. To this end, two container systems were developed, both characterized by a forced flow stream through a solid, porous bed: one from bottom to top (counter-current flow) and one from top to bottom (co-current flow). Optical test measurements were performed, the results indicating that the reaction progress is almost fully independent of the geometrical shape of the sample containers. The Boudouard reaction was investigated with a standard crucible and the newly developed systems; the reaction rates determined differed significantly, by up to a factor of 6.2 at 1373 K.

  15. 40 CFR 761.316 - Interpreting PCB concentration measurements resulting from this sampling scheme.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Interpreting PCB concentration... § 761.79(b)(3) § 761.316 Interpreting PCB concentration measurements resulting from this sampling... composite is 20 µg/100 cm2, then the entire 9.5 square meters has a PCB surface concentration of 20 µg/100...

  16. Final Sampling and Analysis Plan for Background Sampling, Fort Sheridan, Illinois

    National Research Council Canada - National Science Library

    1995-01-01

    .... This Background Sampling and Analysis Plan (BSAP) is designed to address this issue through the collection of additional background samples at Fort Sheridan to support the statistical analysis and the Baseline Risk Assessment (BRA...

  17. Theory of NMR probe design

    International Nuclear Information System (INIS)

    Schnall, M.D.

    1988-01-01

    The NMR probe is the intrinsic part of the NMR system which allows transmission of a stimulus to a sample and reception of the resulting signal from the sample. NMR probes are used in both imaging and spectroscopy. Optimal probe design is important for producing adequate signal/noise. It is important for anyone using NMR techniques to understand how NMR probes work and how to optimize probe design.
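A basic calculation in probe design is tuning the probe's resonant circuit to the Larmor frequency: for a simple LC circuit, f = 1/(2π√(LC)), so the tuning capacitance is C = 1/((2πf)²L). This standard relation is not specific to the cited chapter, and the coil inductance and field strength below are illustrative:

```python
import math

# Tuning-capacitance sketch for a simple NMR probe resonant circuit.
# f = 1 / (2*pi*sqrt(L*C))  =>  C = 1 / ((2*pi*f)**2 * L)
# The 100 nH coil value is illustrative.

def tuning_capacitance(f_hz: float, L_henry: float) -> float:
    return 1.0 / ((2 * math.pi * f_hz) ** 2 * L_henry)

# 100 nH coil tuned to 63.87 MHz (1H Larmor frequency at 1.5 T):
C = tuning_capacitance(63.87e6, 100e-9)
print(f"{C * 1e12:.1f} pF")
```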

  18. Headspace vapor characterization of Hanford waste Tank 241-BX-110: Results from samples collected on 04/30/96

    International Nuclear Information System (INIS)

    Evans, J.C.; Pool, K.H.; Thomas, B.L.; Olsen, K.B.; Fruchter, J.S.; Silvers, K.L.

    1997-01-01

    This report describes the analytical results of vapor samples taken from the headspace of the waste storage tank 241-BX-110 (Tank BX-110) at the Hanford Site in Washington State. The results described in this report were obtained to characterize the vapors present in the tank headspace and to support safety evaluations and tank farm operations. The results include air concentrations of selected inorganic and organic analytes and grouped compounds from samples obtained by Westinghouse Hanford Company (WHC) and provided for analysis to Pacific Northwest National Laboratory (PNNL). Analyses were performed by the Vapor Analytical Laboratory (VAL) at PNNL. Analyte concentrations were based on analytical results and, where appropriate, sample volumes provided by WHC. A summary of the inorganic analytes, permanent gases, and total non-methane organic compounds is listed in a table. The three highest concentration analytes detected in SUMMA trademark canister and triple sorbent trap samples are also listed in the table. Detailed descriptions of the analytical results appear in the appendices

  19. Reliability of environmental sampling culture results using the negative binomial intraclass correlation coefficient.

    Science.gov (United States)

    Aly, Sharif S; Zhao, Jianyang; Li, Ben; Jiang, Jiming

    2014-01-01

    The Intraclass Correlation Coefficient (ICC) is commonly used to estimate the similarity between quantitative measures obtained from different sources. Overdispersed data are traditionally transformed so that a linear mixed model (LMM) based ICC can be estimated; a common transformation is the natural logarithm. The reliability of environmental sampling of fecal slurry on freestall pens has been estimated for Mycobacterium avium subsp. paratuberculosis using natural-logarithm-transformed culture results. Recently, the negative binomial ICC was defined based on a generalized linear mixed model for negative binomial distributed data. The current study reports a negative binomial ICC estimate that includes fixed effects, using culture results of environmental samples. Simulations over a wide variety of inputs and negative binomial distribution parameters (r; p) showed better performance of the new negative binomial ICC compared to the LMM-based ICC, even when the negative binomial data were logarithm- and square-root-transformed. A second comparison that targeted a wider range of ICC values showed that the mean of the estimated ICC closely approximated the true ICC.
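The traditional route the paper compares against — transform the overdispersed counts, then estimate a variance-components ICC — can be sketched with a one-way ANOVA moment estimator on simulated clustered counts. This is not the paper's negative binomial ICC, and all parameters are synthetic, not the MAP sampling data:

```python
import numpy as np

# One-way ANOVA (moment) estimator of the ICC on log-transformed simulated
# clustered counts, illustrating the traditional LMM-style approach.
rng = np.random.default_rng(42)
k, n = 40, 6                       # pens (clusters) x samples per pen
pen_effect = rng.normal(0, 0.8, size=(k, 1))
counts = rng.poisson(np.exp(2.0 + pen_effect + rng.normal(0, 0.5, (k, n))))
y = np.log(counts + 1.0)           # log(x + 1) transform of culture counts

group_means = y.mean(axis=1, keepdims=True)
msb = n * ((group_means - y.mean()) ** 2).sum() / (k - 1)      # between-pen MS
msw = ((y - group_means) ** 2).sum() / (k * (n - 1))           # within-pen MS
icc = (msb - msw) / (msb + (n - 1) * msw)
print(round(icc, 3))
```

The paper's point is that this transform-then-LMM route can perform worse than an ICC defined directly on the negative binomial scale, especially under strong overdispersion.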

  20. Tank 241-U-104 headspace gas and vapor characterization results from samples collected on July 16, 1996

    International Nuclear Information System (INIS)

    Pool, K.H.; Evans, J.C.; Hayes, J.C.; Mitroshkov, A.V.; Edwards, J.A.; Julya, J.L.; Thornton, B.M.; Fruchter, J.S.; Silvers, K.L.

    1997-08-01

    This report presents the results from analyses of samples taken from the headspace of waste storage tank 241-U-104 (Tank U-104) at the Hanford Site in Washington State. Tank headspace samples collected by Westinghouse Hanford Company (WHC) were analyzed by Pacific Northwest National Laboratory (PNNL) to determine headspace concentrations of selected non-radioactive analytes. Analyses were performed by the Vapor Analytical Laboratory (VAL) at PNNL. Vapor concentrations from sorbent trap samples are based on measured sample volumes provided by WHC. No analytes were determined to be above the immediate notification limits specified by the sampling and analysis plan. None of the flammable constituents were present at concentrations above the analytical instrument detection limits. Total headspace flammability was estimated to be <0.108% of the lower flammability limit. Average measured concentrations of targeted gases, inorganic vapors, and selected organic vapors are provided in a table. A summary of experimental methods, including sampling methodology, analytical procedures, and quality assurance and control methods are presented in Section 2.0. Detailed descriptions of the analytical results are provided in Section 3.0
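Headspace flammability expressed as a percentage of the lower flammability limit (LFL), as quoted above, is conventionally combined across constituents with Le Chatelier's mixing rule. A sketch with hypothetical concentrations — these are not the Tank U-104 measurements:

```python
# Le Chatelier mixing rule for the combined flammability of a gas mixture:
#   %LFL = 100 * sum(C_i / LFL_i), with each C_i and LFL_i in vol%.
# Concentrations below are hypothetical.

lfl = {"hydrogen": 4.0, "ammonia": 15.0, "methane": 5.0}        # vol% LFLs
conc = {"hydrogen": 0.002, "ammonia": 0.006, "methane": 0.001}  # measured vol%

pct_lfl = 100.0 * sum(conc[g] / lfl[g] for g in conc)
print(f"{pct_lfl:.3f} % of LFL")
```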

  1. Headspace vapor characterization of Hanford waste tank 241-U-108: Results from samples collected on 8/29/95

    International Nuclear Information System (INIS)

    Thomas, B.L.; Clauss, T.W.; Evans, J.C.; McVeety, B.D.; Pool, K.H.; Olsten, K.B.; Fruchter, J.S.; Ligotke, M.W.

    1996-05-01

    This report describes the analytical results of vapor samples taken from the headspace of the waste storage tank 241-U-108 (Tank U-108) at the Hanford Site in Washington State. The results described in the report were obtained to characterize the vapors present in the tank headspace and to support safety evaluations and tank farm operations. The results include air concentrations of selected inorganic and organic analytes and grouped compounds from samples obtained by Westinghouse Hanford Company (WHC) and provided for analysis to Pacific Northwest National Laboratory (PNNL). Analyte concentrations were based on analytical results and, where appropriate, sample volumes provided by WHC

  2. Design and Demonstration of a Material-Plasma Exposure Target Station for Neutron Irradiated Samples

    International Nuclear Information System (INIS)

    Rapp, Juergen; Aaron, A. M.; Bell, Gary L.; Burgess, Thomas W.; Ellis, Ronald James; Giuliano, D.; Howard, R.; Kiggans, James O.; Lessard, Timothy L.; Ohriner, Evan Keith; Perkins, Dale E.; Varma, Venugopal Koikal

    2015-01-01

    5-20 MW/m² and ion fluxes up to 10²⁴ m⁻²s⁻¹. Since PFCs will have to withstand neutron irradiation displacement damage up to 50 dpa, the target station design must accommodate radioactive specimens (materials to be irradiated in HFIR or at SNS) to enable investigations of the impact of neutron damage on materials. Therefore, the system will have to be able to install and extract irradiated specimens using equipment and methods to avoid sample modification, control contamination, and minimize worker dose. Included in the design considerations will be an assessment of all the steps between neutron irradiation and post-exposure materials examination/characterization, as well as an evaluation of the facility hazard categorization. In particular, the factors associated with the acquisition of radioactive specimens and their preparation, transportation, experimental configuration at the plasma-specimen interface, post-plasma-exposure sample handling, and specimen preparation will be evaluated. Neutronics calculations to determine the dose rates of the samples were carried out for a large number of potential plasma-facing materials.

  3. Web-Face-to-Face Mixed-Mode Design in a Longitudinal Survey: Effects on Participation Rates, Sample Composition, and Costs

    Directory of Open Access Journals (Sweden)

    Bianchi Annamaria

    2017-06-01

    Full Text Available Sequential mixed-mode designs are increasingly considered as an alternative to interviewer-administered data collection, allowing researchers to take advantage of the benefits of each mode. We assess the effects of the introduction of a sequential web-face-to-face mixed-mode design over three waves of a longitudinal survey in which members were previously interviewed face-to-face. Findings are reported from a large-scale randomised experiment carried out on the UK Household Longitudinal Study. No differences are found between the mixed-mode design and face-to-face design in terms of cumulative response rates and only minimal differences in terms of sample composition. On the other hand, potential cost savings are evident.

  4. Problematic Social Media Use: Results from a Large-Scale Nationally Representative Adolescent Sample.

    Science.gov (United States)

    Bányai, Fanni; Zsila, Ágnes; Király, Orsolya; Maraz, Aniko; Elekes, Zsuzsanna; Griffiths, Mark D; Andreassen, Cecilie Schou; Demetrovics, Zsolt

    2017-01-01

    Despite social media use being one of the most popular activities among adolescents, prevalence estimates among teenage samples of social media (problematic) use are lacking in the field. The present study surveyed a nationally representative Hungarian sample comprising 5,961 adolescents as part of the European School Survey Project on Alcohol and Other Drugs (ESPAD). Using the Bergen Social Media Addiction Scale (BSMAS) and based on latent profile analysis, 4.5% of the adolescents belonged to the at-risk group, and reported low self-esteem, high level of depression symptoms, and elevated social media use. Results also demonstrated that BSMAS has appropriate psychometric properties. It is concluded that adolescents at-risk of problematic social media use should be targeted by school-based prevention and intervention programs.

  5. Network Model-Assisted Inference from Respondent-Driven Sampling Data

    Science.gov (United States)

    Gile, Krista J.; Handcock, Mark S.

    2015-01-01

    Respondent-Driven Sampling is a widely used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population. PMID:26640328
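
    As background for the design-based framing above, the classical Horvitz-Thompson estimator that a model-assisted RDS estimator builds on can be sketched as follows. The values, inclusion probabilities, and population size below are invented for illustration; in respondent-driven sampling the inclusion probabilities are not known and must be approximated, e.g. from a working network model.

```python
def horvitz_thompson_mean(values, inclusion_probs, N):
    """Design-based Horvitz-Thompson estimator of a population mean:
    each sampled unit's value is weighted by the inverse of its
    inclusion probability, and the weighted total is divided by the
    population size N."""
    total = sum(y / p for y, p in zip(values, inclusion_probs))
    return total / N

# Toy example: binary outcomes (e.g. infection indicators) with
# hypothetical, roughly degree-proportional inclusion probabilities
values = [1, 0, 1, 1]
probs = [0.4, 0.1, 0.2, 0.4]
print(horvitz_thompson_mean(values, probs, N=50))
```

    The estimator is unbiased when the inclusion probabilities are correct, which is precisely what is hard to guarantee in RDS and motivates the model-assisted approach.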

  6. Algorithm/Architecture Co-design of the Generalized Sampling Theorem Based De-Interlacer.

    NARCIS (Netherlands)

    Beric, A.; Haan, de G.; Sethuraman, R.; Meerbergen, van J.

    2005-01-01

    De-interlacing is a major determinant of image quality in a modern display processing chain. The de-interlacing method based on the generalized sampling theorem (GST) applied to motion estimation and motion compensation provides the best de-interlacing results. With HDTV interlaced input material

  7. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals.

    Science.gov (United States)

    Parsons, Nick R; Price, Charlotte L; Hiskens, Richard; Achten, Juul; Costa, Matthew L

    2012-04-25

    The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject which is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval; 10-26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30-49%) of studies a different analysis should have been undertaken and in 17% (10-26%) a different analysis could have made a difference to the overall conclusions. It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.

  8. The development of a Martian atmospheric Sample collection canister

    Science.gov (United States)

    Kulczycki, E.; Galey, C.; Kennedy, B.; Budney, C.; Bame, D.; Van Schilfgaarde, R.; Aisen, N.; Townsend, J.; Younse, P.; Piacentine, J.

    The collection of an atmospheric sample from Mars would provide significant insight into the elemental composition and sub-surface out-gassing rates of noble gases. A team of engineers at the Jet Propulsion Laboratory (JPL), California Institute of Technology, has developed an atmospheric sample collection canister for Martian application. The engineering strategy has two basic elements: first, to collect two separately sealed 50 cubic centimeter unpressurized atmospheric samples with minimal sensing and actuation in a self-contained pressure vessel; and second, to package this atmospheric sample canister so that it can be easily integrated into the orbiting sample capsule for collection and return to Earth. Sample collection and integrity were demonstrated by emulating the atmospheric collection portion of the Mars Sample Return mission on a compressed timeline. The test results were achieved by varying the pressure inside a thermal vacuum chamber while opening and closing the valve on the sample canister at Mars ambient pressure. A commercial off-the-shelf medical-grade micro-valve was utilized in the first iteration of this design to enable rapid testing of the system. The valve has been independently leak tested at JPL to quantify and separate the leak rates associated with the canister. The results are factored into an overall system design that quantifies mass, power, and sensing requirements for a Martian Atmospheric Sample Collection (MASC) canister as outlined in the Mars Sample Return mission profile. Qualitative results include the selection of materials to minimize sample contamination, preliminary science requirements, priorities in sample composition, flight valve selection criteria, a storyboard from sample collection to loading in the orbiting sample capsule, and contributions to maintaining "Earth-clean" exterior surfaces on the orbiting sample capsule.

  9. 2015 Long-Term Hydrologic Monitoring Program Sampling and Analysis Results at Rio Blanco, Colorado

    Energy Technology Data Exchange (ETDEWEB)

    Findlay, Rick [Nararro Research and Engineering, Oak Ridge, TN (United States); Kautsky, Mark [US Department of Energy, Washington, DC (United States). Office of Legacy Management

    2015-12-01

    The U.S. Department of Energy (DOE) Office of Legacy Management conducted annual sampling at the Rio Blanco, Colorado, Site for the Long-Term Hydrologic Monitoring Program (LTHMP) on May 20–21, 2015. This report documents the analytical results of the Rio Blanco annual monitoring event, the trip report, and the data validation package. The groundwater and surface water monitoring samples were shipped to the GEL Group Inc. laboratories for conventional analysis of tritium and analysis of gamma-emitting radionuclides by high-resolution gamma spectrometry. A subset of water samples collected from wells near the Rio Blanco site was also sent to GEL Group Inc. for enriched tritium analysis. All requested analyses were successfully completed. Samples were collected from a total of four onsite wells, including two that are privately owned. Samples were also collected from two additional private wells at nearby locations and from nine surface water locations. Samples were analyzed for gamma-emitting radionuclides by high-resolution gamma spectrometry, and they were analyzed for tritium using the conventional method with a detection limit on the order of 400 picocuries per liter (pCi/L). Four locations (one well and three surface locations) were analyzed using the enriched tritium method, which has a detection limit on the order of 3 pCi/L. The enriched locations included the well at the Brennan Windmill and surface locations at CER-1, CER-4, and Fawn Creek 500 feet upstream.

  10. Specified assurance level sampling procedure

    International Nuclear Information System (INIS)

    Willner, O.

    1980-11-01

    In the nuclear industry, design specifications for certain quality characteristics require that the final product be inspected by a sampling plan which can demonstrate product conformance to stated assurance levels. The Specified Assurance Level (SAL) Sampling Procedure has been developed to permit the direct selection of attribute sampling plans which can meet commonly used assurance levels. The SAL procedure contains sampling plans which yield the minimum sample size at stated assurance levels. The SAL procedure also provides sampling plans with acceptance numbers ranging from 0 to 10, thus making available to the user a wide choice of plans, all designed to comply with a stated assurance level.
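
    The idea of selecting a minimum-sample-size attribute plan for a stated assurance level can be illustrated with a minimal sketch (our own construction from the standard binomial operating characteristic, not the SAL procedure's actual tables): for a given acceptance number c, search for the smallest sample size n whose plan rejects a lot of a given defect fraction with probability at least the assurance level.

```python
from math import comb

def min_sample_size(c, defect_fraction, assurance):
    """Smallest n such that a lot with the given defect fraction is
    rejected with probability >= assurance, where the lot is accepted
    when at most c defects are found in the sample of n."""
    n = c + 1
    while True:
        p_accept = sum(
            comb(n, k) * defect_fraction**k * (1 - defect_fraction) ** (n - k)
            for k in range(c + 1)
        )
        if 1 - p_accept >= assurance:
            return n
        n += 1

# e.g. 90% assurance of rejecting lots that are 10% defective
print(min_sample_size(0, 0.10, 0.90))  # 22
print(min_sample_size(1, 0.10, 0.90))  # 38
```

    Note how larger acceptance numbers demand larger samples for the same assurance level, which is the trade-off a table of plans like the SAL procedure organizes for the user.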

  11. Development of system design in recent Siemens/KWU PWR influenced by PSA results

    International Nuclear Information System (INIS)

    Feigel, A.; Fabian, H.

    1989-01-01

    This paper reports on the design of the latest Siemens/KWU PWRs (Convoy plants), which was checked by a PSA performed as a Δ-analysis against the German Risk Study (GPRA), which used a PWR 1300 MW in commercial operation since 1977 as a reference plant. The 10-year difference in design between the reference plant and the Convoy plants led to design changes, due to operational experience and findings from GPRA, Phase A. These were evaluated quantitatively by a PSA with respect to plant safety level and balance of the safety concept. The results gained from the Convoy PSA showed the importance and appropriateness of these modifications. Even if the latest results from GPRA, Phase B, are considered with respect to additional accident sequences, it can be demonstrated that the new design is balanced with respect to these additional sequences, so there is no need for further improvement of the new design.

  12. Relationship of Indoor, Outdoor and Personal Air (RIOPA) study: study design, methods and quality assurance/control results.

    Science.gov (United States)

    Weisel, Clifford P; Zhang, Junfeng; Turpin, Barbara J; Morandi, Maria T; Colome, Steven; Stock, Thomas H; Spektor, Dalia M; Korn, Leo; Winer, Arthur; Alimokhtari, Shahnaz; Kwon, Jaymin; Mohan, Krishnan; Harrington, Robert; Giovanetti, Robert; Cui, William; Afshar, Masoud; Maberti, Silvia; Shendell, Derek

    2005-03-01

    The Relationship of Indoor, Outdoor and Personal Air (RIOPA) Study was undertaken to evaluate the contribution of outdoor sources of air toxics, as defined in the 1990 Clean Air Act Amendments, to indoor concentrations and personal exposures. The concentrations of 18 volatile organic compounds (VOCs), 17 carbonyl compounds, and fine particulate matter mass (PM(2.5)) were measured using 48-h outdoor, indoor and personal air samples collected simultaneously. PM(2.5) mass, as well as several component species (elemental carbon, organic carbon, polyaromatic hydrocarbons and elemental analysis), was also measured; only PM(2.5) mass is reported here. Questionnaires were administered to characterize homes, neighborhoods and personal activities that might affect exposures. The air exchange rate was also measured in each home. Homes in close proximity (<0.5 km) to sources of air toxics were preferentially (2:1) selected for sampling. Approximately 100 non-smoking households in each of Elizabeth, NJ, Houston, TX, and Los Angeles, CA were sampled (100, 105, and 105, respectively), with second visits performed at 84, 93, and 81 homes in each city, respectively. VOC samples were collected at all homes, carbonyls at 90%, and PM(2.5) at 60% of the homes. Personal samples were collected from nonsmoking adults and a portion of children living in the target homes. This manuscript provides the RIOPA study design and quality control and assurance data. The results from the RIOPA study can potentially provide information on the influence of ambient sources on indoor air concentrations and exposure for many air toxics and will furnish an opportunity to evaluate exposure models for these compounds.

  13. Tank Farm WM-182 and WM-183 Heel Slurry Samples PSD Results

    International Nuclear Information System (INIS)

    Batcheller, T.A.; Huestis, G.M.

    2000-01-01

    Particle size distribution (PSD) analysis of INTEC Tank Farm WM-182 and WM-183 heel slurry samples was performed using a modified Horiba LA-300 PSD analyzer at the RAL facility. Two types of testing were performed: typical PSD analysis and settling rate testing. Although the heel slurry samples were obtained from two separate vessels, the particle size distribution results were quite similar. The slurry solids ranged from a minimum particle size of approximately 0.5 μm to a maximum of 230 μm, with about 90% of the material between 2 and 133 μm and the cumulative 50% value at approximately 20 μm. This testing also revealed that high-frequency sonication with an ultrasonic element may break up larger particles in the WM-182 and WM-183 tank farm heel slurries. This finding represents useful information regarding ultimate tank heel waste processing. Settling rate testing results were also fairly consistent for material from both vessels, in that most of the mass of solids settles to an agglomerated, yet easily redispersed, layer at the bottom. Dispersed and suspended material remained in the "clear" layer above the settled layer after about one-half hour of settling time. This material had a statistical mode of approximately 5 μm and a maximum particle size of 30 μm.

  14. Automatic Sample Changer for X-Ray Spectrometry

    International Nuclear Information System (INIS)

    Morales Tarre, Orlando; Diaz Castro, Maikel; Rivero Ramirez, Doris; Lopez Pino, Neivy

    2011-01-01

    The design and construction of an automatic sample changer for Nuclear Analysis Laboratory's X-ray spectrometer at InSTEC is presented by giving basic details about its mechanical structure, control circuits and the software application developed to interact with the data acquisition software of the multichannel analyzer. Results of some test experiments performed with the automatic sample changer are also discussed. The system is currently in use at InSTEC. (Author)

  15. Appreciating the difference between design-based and model-based sampling strategies in quantitative morphology of the nervous system.

    Science.gov (United States)

    Geuna, S

    2000-11-20

    Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterizes the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.

  16. Importance of sampling design and analysis in animal population studies: a comment on Sergio et al

    Science.gov (United States)

    Kery, M.; Royle, J. Andrew; Schmid, Hans

    2008-01-01

    1. The use of predators as indicators and umbrellas in conservation has been criticized. In the Trentino region, Sergio et al. (2006; hereafter SEA) counted almost twice as many bird species in quadrats located in raptor territories than in controls. However, SEA detected astonishingly few species. We used contemporary Swiss Breeding Bird Survey data from an adjacent region and a novel statistical model that corrects for overlooked species to estimate the expected number of bird species per quadrat in that region. 2. There are two anomalies in SEA which render their results ambiguous. First, SEA detected on average only 6.8 species, whereas a value of 32 might be expected. Hence, they probably overlooked almost 80% of all species. Secondly, the precision of their mean species counts was greater in two-thirds of cases than in the unlikely case that all quadrats harboured exactly the same number of equally detectable species. This suggests that they detected consistently only a biased, unrepresentative subset of species. 3. Conceptually, expected species counts are the product of true species number and species detectability p. Plenty of factors may affect p, including date, hour, observer, previous knowledge of a site and mobbing behaviour of passerines in the presence of predators. Such differences in p between raptor and control quadrats could have easily created the observed effects. Without a method that corrects for such biases, or without quantitative evidence that species detectability was indeed similar between raptor and control quadrats, the meaning of SEA's counts is hard to evaluate. Therefore, the evidence presented by SEA in favour of raptors as indicator species for enhanced levels of biodiversity remains inconclusive. 4. Synthesis and application. Ecologists should pay greater attention to sampling design and analysis in animal population estimation. Species richness estimation means sampling a community. Samples should be representative for the

  17. Sample size requirements for separating out the effects of combination treatments: Randomised controlled trials of combination therapy vs. standard treatment compared to factorial designs for patients with tuberculous meningitis

    Directory of Open Access Journals (Sweden)

    Farrar Jeremy

    2011-02-01

    Full Text Available Background: In certain diseases clinical experts may judge that the intervention with the best prospects is the addition of two treatments to the standard of care. This can either be tested with a simple randomized trial of combination versus standard treatment or with a 2 × 2 factorial design. Methods: We compared the two approaches using the design of a new trial in tuberculous meningitis as an example. In that trial the combination of two drugs added to standard treatment is assumed to reduce the hazard of death by 30%, and the sample size of the combination trial to achieve 80% power is 750 patients. We calculated the power of corresponding factorial designs with one- to sixteen-fold the sample size of the combination trial, depending on the contribution of each individual drug to the combination treatment effect and the strength of an interaction between the two. Results: In the absence of an interaction, an eight-fold increase in sample size for the factorial design as compared to the combination trial is required to get 80% power to jointly detect effects of both drugs if the contribution of the less potent treatment to the total effect is at least 35%. An eight-fold sample size increase also provides a power of 76% to detect a qualitative interaction at the one-sided 10% significance level if the individual effects of both drugs are equal. Factorial designs with a lower sample size have a high chance to be underpowered, to show significance of only one drug even if both are equally effective, and to miss important interactions. Conclusions: Pragmatic combination trials of multiple interventions versus standard therapy are valuable in diseases with a limited patient pool if all interventions test the same treatment concept, it is considered likely that either both or none of the individual interventions are effective, and only moderate drug interactions are suspected. An adequately powered 2 × 2 factorial design to detect effects of
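
    The eight-fold figure can be reproduced with a back-of-envelope calculation, assuming Schoenfeld's approximation for the number of events required by a two-arm log-rank test and treatment effects that combine additively on the log-hazard scale (both are our assumptions for illustration, not details stated in the abstract): required events scale with the inverse square of the log hazard ratio, so a drug carrying 35% of the combined log-effect needs (1/0.35)² ≈ 8.2 times the events.

```python
from math import log
from statistics import NormalDist

def events_needed(hazard_ratio, alpha=0.05, power=0.80):
    """Schoenfeld's approximation: total events required for a two-arm
    log-rank test with 1:1 allocation to detect the given hazard ratio."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    return 4 * (z_alpha + z_beta) ** 2 / log(hazard_ratio) ** 2

full_effect = events_needed(0.7)        # combination reduces hazard by 30%
weak_drug = events_needed(0.7 ** 0.35)  # drug carrying 35% of the log-hazard effect
print(round(weak_drug / full_effect, 2))  # 8.16
```

    This sketch only reproduces the sample-size scaling; the trial's quoted 750 patients additionally depends on accrual, follow-up and event rates, which are not modeled here.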

  18. Headspace vapor characterization of Hanford waste tank 241-U-109: Results from samples collected on 8/10/95

    International Nuclear Information System (INIS)

    Evans, J.C.; Thomas, B.L.; Pool, K.H.; Olsen, K.B.; Fruchter, J.S.; Silvers, K.L.

    1996-05-01

    This report describes the analytical results of vapor samples taken from the headspace of waste storage tank 241-U-109 (Tank U-109) at the Hanford Site in Washington State. The results described in this report were obtained to characterize the vapors present in the tank headspace and to support safety evaluations and tank farm operations. This tank is on the Hydrogen Waste List. The results include air concentrations of selected inorganic and organic analytes and grouped compounds from samples obtained by Westinghouse Hanford Company (WHC) and provided for analysis to Pacific Northwest National Laboratory (PNNL). Analyses were performed by the Vapor Analytical Laboratory (VAL) at PNNL. Analyte concentrations were based on analytical results and, where appropriate, sample volumes provided by WHC. A summary of the inorganic analytes, permanent gases, and total non-methane hydrocarbons is listed in a table, as are the three highest-concentration analytes detected in SUMMA™ canister and triple sorbent trap samples. Detailed descriptions of the analytical results appear in the text.

  19. Box-Behnken design in modeling of solid-phase tea waste extraction for the removal of uranium from water samples

    Energy Technology Data Exchange (ETDEWEB)

    Khajeh, Mostafa; Jahanbin, Elham; Ghaffari-Moghaddam, Mansour; Moghaddam, Zahra Safaei [Zabol Univ. (Iran, Islamic Republic of). Dept. of Chemistry; Bohlooli, Mousa [Zabol Univ. (Iran, Islamic Republic of). Dept. of Biology

    2015-07-01

    In this study, a solid-phase tea waste procedure was used for the separation, preconcentration and determination of uranium from water samples by UV-Vis spectrophotometry. A Box-Behnken experimental design was employed to investigate the influence of six variables, including pH, mass of adsorbent, eluent volume, amount of 1-(2-pyridylazo)-2-naphthol (PAN), and sample and eluent flow rates, on the extraction of the analyte. A high determination coefficient (R{sup 2}) of 0.972 and adjusted R{sup 2} of 0.943 showed the satisfactory fit of the polynomial regression model. The method was used for the extraction of uranium from real water samples.

  20. Box-Behnken design in modeling of solid-phase tea waste extraction for the removal of uranium from water samples

    International Nuclear Information System (INIS)

    Khajeh, Mostafa; Jahanbin, Elham; Ghaffari-Moghaddam, Mansour; Moghaddam, Zahra Safaei; Bohlooli, Mousa

    2015-01-01

    In this study, a solid-phase tea waste procedure was used for the separation, preconcentration and determination of uranium from water samples by UV-Vis spectrophotometry. A Box-Behnken experimental design was employed to investigate the influence of six variables, including pH, mass of adsorbent, eluent volume, amount of 1-(2-pyridylazo)-2-naphthol (PAN), and sample and eluent flow rates, on the extraction of the analyte. A high determination coefficient (R²) of 0.972 and adjusted R² of 0.943 showed the satisfactory fit of the polynomial regression model. The method was used for the extraction of uranium from real water samples.
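
    For readers unfamiliar with the design, the coded points of a Box-Behnken design can be generated as in the sketch below (a generic construction in coded -1/0/+1 units; the paper's actual run count and number of center points are not reported in the abstract, so the figures here are illustrative only).

```python
from itertools import combinations

def box_behnken(k, center_points=3):
    """Box-Behnken design for k factors in coded units (-1, 0, +1):
    every pair of factors takes the four two-level factorial settings
    while all remaining factors stay at their mid-level, plus
    replicated center runs."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
    runs += [[0] * k for _ in range(center_points)]
    return runs

design = box_behnken(6)
print(len(design))  # 63 runs: 15 factor pairs x 4 settings + 3 centers
```

    Keeping every run at or inside the faces of the cube (no corner points) is what makes the design economical for fitting a second-order polynomial such as the one reported here.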

  1. Problematic Social Media Use: Results from a Large-Scale Nationally Representative Adolescent Sample.

    Directory of Open Access Journals (Sweden)

    Fanni Bányai

    Full Text Available Despite social media use being one of the most popular activities among adolescents, prevalence estimates of (problematic) social media use among teenage samples are lacking in the field. The present study surveyed a nationally representative Hungarian sample comprising 5,961 adolescents as part of the European School Survey Project on Alcohol and Other Drugs (ESPAD). Using the Bergen Social Media Addiction Scale (BSMAS) and based on latent profile analysis, 4.5% of the adolescents belonged to the at-risk group, and reported low self-esteem, high level of depression symptoms, and elevated social media use. Results also demonstrated that BSMAS has appropriate psychometric properties. It is concluded that adolescents at-risk of problematic social media use should be targeted by school-based prevention and intervention programs.

  2. Long-term frozen storage of urine samples: a trouble to get PCR results in Schistosoma spp. DNA detection?

    Science.gov (United States)

    Fernández-Soto, Pedro; Velasco Tirado, Virginia; Carranza Rodríguez, Cristina; Pérez-Arellano, José Luis; Muro, Antonio

    2013-01-01

    Human schistosomiasis remains a serious worldwide public health problem. At present, a sensitive and specific assay for routine diagnosis of schistosome infection is not yet available. The potential for detecting schistosome-derived DNA by PCR-based methods in human clinical samples is currently being investigated as a diagnostic tool with potential application in routine schistosomiasis diagnosis. Collection of diagnostic samples such as stool or blood is usually difficult in some populations. However, urine is a biological sample that can be collected non-invasively, is easy to obtain from people of all ages and easy to manage, but as a sample for PCR diagnosis it is still not widely used. This could be due to the high variability in the reported efficiency of detection, resulting from variation in urine sample storage and handling conditions and in DNA preservation and extraction methods. We evaluated different commercial DNA extraction methods on a series of long-term frozen human urine samples from patients with parasitologically confirmed schistosomiasis in order to assess the PCR effectiveness for Schistosoma spp. detection. Patients' urine samples were frozen for 18 months up to 7 years until use. Results were compared with those obtained in PCR assays using fresh healthy human urine artificially contaminated with Schistosoma mansoni DNA and urine samples from mice experimentally infected with S. mansoni cercariae stored frozen for at least 12 months before use. PCR results with fresh artificially contaminated human urine samples using different DNA extraction methods were much more effective than those obtained when long-term frozen human urine samples were used as the source of DNA template. Long-term frozen human urine samples are probably not a good source of DNA template for PCR detection of Schistosoma spp., regardless of the DNA extraction method used.

  3. Sample similarity analysis of angles of repose based on experimental results for DEM calibration

    Science.gov (United States)

    Tan, Yuan; Günthner, Willibald A.; Kessler, Stephan; Zhang, Lu

    2017-06-01

    As a fundamental material property, the particle-particle friction coefficient is usually calibrated against the angle of repose, which can be obtained experimentally. In the present study, the bottomless cylinder test was carried out to investigate this friction coefficient for a biomass material, willow chips. Because of their irregular shape and varying particle size distribution, calculation of the angle becomes less applicable and decisive. In previous studies only one section of the uneven slope is chosen in most cases, although standard methods for defining a representative section are barely found. Hence, we present an efficient and reliable method based on a new technology, 3D scanning, which was used to digitize the surface of the heaps and generate a point cloud. Two tangent lines of any selected section were then calculated through linear least-squares regression (LLSR), such that the left and right angles of repose of a pile could be derived. As the next step, a number of sections were stochastically selected, and the calculations were repeated correspondingly to obtain a sample of angles, which was plotted in Cartesian coordinates as a scatter diagram. Subsequently, different samples were acquired through various selections of sections. By applying similarity and difference analysis to these samples, the reliability of the proposed method was verified. These results provide a realistic criterion for reducing the deviation between experiment and simulation that arises from the random selection of a single angle, which will be compared with simulation results in future work.

  4. Sample similarity analysis of angles of repose based on experimental results for DEM calibration

    Directory of Open Access Journals (Sweden)

    Tan Yuan

    2017-01-01

    Full Text Available As a fundamental material property, the particle-particle friction coefficient is usually calibrated against the angle of repose, which can be obtained experimentally. In the present study, the bottomless cylinder test was carried out to investigate this friction coefficient for a biomass material, willow chips. Because of their irregular shape and varying particle size distribution, calculation of the angle becomes less applicable and decisive. In previous studies only one section of the uneven slope is chosen in most cases, although standard methods for defining a representative section are barely found. Hence, we present an efficient and reliable method based on a new technology, 3D scanning, which was used to digitize the surface of the heaps and generate a point cloud. Two tangent lines of any selected section were then calculated through linear least-squares regression (LLSR), such that the left and right angles of repose of a pile could be derived. As the next step, a number of sections were stochastically selected, and the calculations were repeated correspondingly to obtain a sample of angles, which was plotted in Cartesian coordinates as a scatter diagram. Subsequently, different samples were acquired through various selections of sections. By applying similarity and difference analysis to these samples, the reliability of the proposed method was verified. These results provide a realistic criterion for reducing the deviation between experiment and simulation that arises from the random selection of a single angle, which will be compared with simulation results in future work.
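
    The tangent-line step can be sketched as follows, with invented profile coordinates standing in for one flank of a heap section extracted from the scanned point cloud: a line z = m·x + c is fitted by linear least squares and the angle of repose is the arctangent of its slope.

```python
import math

def angle_of_repose(points):
    """Fit a line z = m*x + c to (x, z) profile points of a heap flank
    by linear least squares and return the slope angle in degrees."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sz = sum(z for _, z in points)
    sxx = sum(x * x for x, _ in points)
    sxz = sum(x * z for x, z in points)
    m = (n * sxz - sx * sz) / (n * sxx - sx * sx)
    return math.degrees(math.atan(abs(m)))

# Hypothetical profile points (metres) from one section of a scanned pile
left_flank = [(0.0, 0.0), (0.1, 0.07), (0.2, 0.15), (0.3, 0.22)]
print(round(angle_of_repose(left_flank), 1))  # slope angle in degrees
```

    Repeating this over many randomly chosen sections yields the sample of left and right angles whose similarity the study analyzes.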

  5. The results of experimental studies of VLF–ULF electromagnetic emission by rock samples due to mechanical action

    OpenAIRE

    A. A. Panfilov

    2013-01-01

    The paper presents the results of laboratory experiments on electromagnetic emission excitation (electric component of electromagnetic field) by rock samples due to different forms of mechanical stress applications. It was shown that samples generate electric impulses with different spectra when the impact action, gradual loading or dynamic friction is applied. It was ascertained that level and spectral compositions of signals, generated by rock samples, cha...

  6. Design, placement, and sampling of groundwater monitoring wells for the management of hazardous waste disposal facilities

    International Nuclear Information System (INIS)

    Tsai, S.Y.

    1988-01-01

    Groundwater monitoring is an important technical requirement in managing hazardous waste disposal facilities. The purpose of monitoring is to assess whether and how a disposal facility is affecting the underlying groundwater system. This paper focuses on the regulatory and technical aspects of the design, placement, and sampling of groundwater monitoring wells for hazardous waste disposal facilities. Such facilities include surface impoundments, landfills, waste piles, and land treatment facilities. 8 refs., 4 figs

  7. Optimal color design of psychological counseling room by design of experiments and response surface methodology.

    Science.gov (United States)

    Liu, Wenjuan; Ji, Jianlin; Chen, Hua; Ye, Chenyu

    2014-01-01

    Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design, and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients' perception into color elements. Six indices, pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive, were used to assess patients' impressions. A total of 75 patients participated, 42 in Experiment 1 and 33 in Experiment 2. In Experiment 1, 27 representative color samples were designed, and the color sample (L = 75, a = 0, b = -60) was the most preferred. In Experiment 2, this color sample was set as the 'central point', and three color attributes were optimized to maximize the patients' satisfaction. The experimental results show that the proposed method can obtain the optimal solution for color design of a counseling room.

  8. Evaluation of various conventional methods for sampling weeds in potato and spinach crops

    Directory of Open Access Journals (Sweden)

    David Jamaica

    2014-04-01

Full Text Available This study aimed to evaluate, at an exploratory level, some of the different conventional sampling designs in a section of a potato crop and in a commercial crop of spinach. Weeds were sampled in a 16 x 48 m section of a potato crop with a set grid of 192 sections. The cover and density of the weeds were registered in squares of 0.25 to 64 m². The results were used to create a database that allowed for the simulation of different sampling designs: variables and square size. A second sampling was carried out with these results in a spinach crop of 1.16 ha with a set grid of 6 x 6 m cells, evaluating the cover in 4 m² squares. Another database was created with this information, which was used to simulate other sampling designs, such as the distribution and quantity of sampling squares. According to the obtained results, a good approximation of the number of squares for diverse samples is 10-12 squares (4 m²) for richness per hectare and 18 or more squares for abundance per hectare. This square size is optimal since it allows for sampling a larger area without losing sight of low-profile species, with the cover variable best representing the abundance of the weeds.
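The square-count recommendation above can be explored with a small simulation. The sketch below is a toy illustration (not the authors' dataset or code): it treats a fully censused grid as a list of per-square species sets, draws random 4 m² squares, and tracks how mean observed species richness grows with the number of squares.

```python
import random

def simulate_richness(field, n_squares, trials=200, seed=0):
    """Mean number of distinct species observed in n randomly chosen squares.

    `field` is a list of per-square species sets from a full census grid.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        chosen = rng.sample(field, n_squares)   # squares without replacement
        total += len(set().union(*chosen))      # distinct species seen
    return total / trials

# Hypothetical grid: 100 squares, 3 common species present everywhere,
# 5 rare patchy species each present in roughly 10% of squares.
rng = random.Random(1)
field = [{"A", "B", "C"} | {s for s in "DEFGH" if rng.random() < 0.1}
         for _ in range(100)]

for n in (4, 8, 12):
    print(n, round(simulate_richness(field, n), 1))
```

As the number of squares grows, the richness curve flattens: most of the gain from extra squares comes from catching rare, patchy species, which is the behavior behind recommendations like 10-12 squares for richness.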

  9. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

Exposure to Sampling: Introduction; Concepts of Population, Sample, and Sampling. Initial Ramifications: Introduction; Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes; Appendix to Chapter 2: More on Equal Probability Sampling; Horvitz-Thompson Estimator; Sufficiency; Likelihood; Non-Existence Theorem. More Intricacies: Introduction; Unequal Probability Sampling Strategies; PPS Sampling. Exploring Improved Ways: Introduction; Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling. Modeling: Introduction; Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...

  10. Solvent Hold Tank Sample Results for MCU-16-991-992-993: July 2016 Monthly sample and MCU-16-1033-1034-1035: July 2016 Superwashed Sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-11-25

SRNL received one set of SHT samples (MCU-16-991, MCU-16-992, and MCU-16-993), pulled on 07/13/2016, and another set of SHT samples (MCU-16-1033, MCU-16-1034, and MCU-16-1035), pulled on 07/24/2016 after the solvent was superwashed with 300 mM sodium hydroxide, for analysis. Samples MCU-16-991, MCU-16-992, and MCU-16-993 were combined into one sample (MCU-16-991-992-993), and samples MCU-16-1033, MCU-16-1034, and MCU-16-1035 were combined into one sample (MCU-16-1033-1034-1035). Of the two composite samples, MCU-16-1033-1034-1035 represents the current chemical state of the solvent at MCU. All analytical conclusions are based on the chemical analysis of MCU-16-1033-1034-1035. There were no chemical differences between MCU-16-991-992-993 and the superwashed MCU-16-1033-1034-1035.

  11. Vapor space characterization of Waste Tank 241-C-103: Inorganic results from sample Job 7B (May 12-25, 1994)

    International Nuclear Information System (INIS)

    Ligotke, M.W.; Pool, K.H.; Lerner, B.D.

    1994-10-01

This report provides analytical results for use in safety and toxicological evaluations of the vapor space of Hanford single-shell waste storage tank 241-C-103. Samples were analyzed to determine concentrations of ammonia, nitric oxide, nitrogen dioxide, sulfur oxides, and hydrogen cyanide. In addition to the samples, controls were analyzed that included blanks, spiked blanks, and spiked samples. These controls provided information about the suitability of the sampling and analytical methods. Also included are the following: information describing the methods and sampling procedures used; results of sample analyses; and conclusions and recommendations.

  12. Design and Demonstration of a Material-Plasma Exposure Target Station for Neutron Irradiated Samples

    Energy Technology Data Exchange (ETDEWEB)

    Rapp, Juergen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Aaron, A. M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bell, Gary L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Burgess, Thomas W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ellis, Ronald James [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Giuliano, D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Howard, R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kiggans, James O. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lessard, Timothy L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ohriner, Evan Keith [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Perkins, Dale E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Varma, Venugopal Koikal [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-10-20

…steady-state heat fluxes of 5–20 MW/m² and ion fluxes up to 10²⁴ m⁻²s⁻¹. Since PFCs will have to withstand neutron irradiation displacement damage up to 50 dpa, the target station design must accommodate radioactive specimens (materials to be irradiated in HFIR or at SNS) to enable investigations of the impact of neutron damage on materials. Therefore, the system will have to be able to install and extract irradiated specimens using equipment and methods to avoid sample modification, control contamination, and minimize worker dose. Included in the design considerations will be an assessment of all the steps between neutron irradiation and post-exposure materials examination/characterization, as well as an evaluation of the facility hazard categorization. In particular, the factors associated with the acquisition of radioactive specimens and their preparation, transportation, experimental configuration at the plasma-specimen interface, post-plasma-exposure sample handling, and specimen preparation will be evaluated. Neutronics calculations to determine the dose rates of the samples were carried out for a large number of potential plasma-facing materials.

  13. Designing testing service at baristand industri Medan’s liquid waste laboratory

    Science.gov (United States)

    Kusumawaty, Dewi; Napitupulu, Humala L.; Sembiring, Meilita T.

    2018-03-01

Baristand Industri Medan is a technical implementation unit under the Industrial Research and Development Agency of the Ministry of Industry. One of its most frequently used services is liquid waste testing. The company set a service standard of nine working days for testing services. In 2015, 89.66% of liquid waste testing services did not meet this standard because many samples accumulated. The purpose of this research is to design an online service to schedule the arrival of liquid waste samples. The method used is information system design, consisting of model design, output design, input design, database design, and technology design. The resulting online liquid waste testing information system consists of three pages: one for the customer, one for the sample recipient, and one for the laboratory. The simulation results with scheduled samples show that the nine-working-day service standard can be met.

  14. 2015 Long-Term Hydrologic Monitoring Program Sampling and Analysis Results Report for Project Rulison, Co

    Energy Technology Data Exchange (ETDEWEB)

    Findlay, Rick [Navarro Research and Engineering, Oak Ridge, TN (United States); Kautsky, Mark [US Department of Energy, Washington, DC (United States). Office of Legacy Management

    2015-12-01

The U.S. Department of Energy (DOE) Office of Legacy Management conducted annual sampling at the Rulison, Colorado, Site for the Long-Term Hydrologic Monitoring Program (LTHMP) on May 20–22 and 27, 2015. Several of the landowners were not available to allow access to their respective properties, which necessitated several sample collection trips. This report documents the analytical results of the Rulison monitoring event and includes the trip report and the data validation package (Appendix A). The groundwater and surface water samples were shipped to the GEL Group Inc. laboratories for analysis. All requested analyses were successfully completed. Samples were analyzed for gamma-emitting radionuclides by high-resolution gamma spectrometry. Tritium was analyzed using two methods: the conventional tritium method, which has a detection limit on the order of 400 picocuries per liter (pCi/L), and the enriched method (for selected samples), which has a detection limit on the order of 3 pCi/L.

  15. Design of a New Concentration Series for the Orthogonal Sample Design Approach and Estimation of the Number of Reactions in Chemical Systems.

    Science.gov (United States)

    Shi, Jiajia; Liu, Yuhai; Guo, Ran; Li, Xiaopei; He, Anqi; Gao, Yunlong; Wei, Yongju; Liu, Cuige; Zhao, Ying; Xu, Yizhuang; Noda, Isao; Wu, Jinguang

    2015-11-01

A new concentration series is proposed for the construction of a two-dimensional (2D) synchronous spectrum for orthogonal sample design analysis to probe intermolecular interaction between solutes dissolved in the same solutions. The obtained 2D synchronous spectrum possesses the following two properties: (1) cross peaks in the 2D synchronous spectra can be used to reflect intermolecular interaction reliably, since interference portions that have nothing to do with intermolecular interaction are completely removed; and (2) the two-dimensional synchronous spectrum produced can effectively avoid accidental collinearity. Hence, the correct number of nonzero eigenvalues can be obtained so that the number of chemical reactions can be estimated. In a real chemical system, noise present in one-dimensional spectra may also produce nonzero eigenvalues. To get the correct number of chemical reactions, we classified nonzero eigenvalues into significant nonzero eigenvalues and insignificant nonzero eigenvalues. Significant nonzero eigenvalues can be identified by inspecting the pattern of the corresponding eigenvector with the help of the Durbin-Watson statistic. As a result, the correct number of chemical reactions can be obtained from significant nonzero eigenvalues. This approach provides a solid basis to obtain insight into subtle spectral variations caused by intermolecular interaction.
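The eigenvalue-classification idea can be sketched numerically. The example below uses synthetic spectra (two Gaussian bands with complementary concentrations, not the authors' data or concentration series): it builds a synchronous 2D correlation matrix, then uses the Durbin-Watson statistic of each eigenvector to separate significant eigenvalues (smooth eigenvectors, DW far below 2) from noise eigenvalues (noise-like eigenvectors, DW near 2).

```python
import numpy as np

def durbin_watson(v):
    """DW statistic of a sequence: smooth vectors give values near 0,
    white-noise-like vectors give values near 2."""
    d = np.diff(v)
    return np.sum(d ** 2) / np.sum(v ** 2)

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 200)
band1 = np.exp(-((x - 0.3) / 0.05) ** 2)        # spectral band of species 1
band2 = np.exp(-((x - 0.7) / 0.05) ** 2)        # spectral band of species 2
c1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
c2 = 6.0 - c1                                   # species 2 shrinks as species 1 grows
spectra = np.outer(c1, band1) + np.outer(c2, band2)
spectra += 0.01 * rng.standard_normal(spectra.shape)  # measurement noise

sync = spectra.T @ spectra                      # synchronous 2D correlation matrix
vals, vecs = np.linalg.eigh(sync)
vals, vecs = vals[::-1], vecs[:, ::-1]          # sort eigenvalues descending

# Two underlying concentration profiles -> two significant eigenvalues;
# the remaining nonzero eigenvalues come from noise.
dw = [durbin_watson(vecs[:, k]) for k in range(4)]
```

Here the first two eigenvectors inherit the smooth band shapes (small DW), while later eigenvectors are noise-dominated (DW near 2), so the count of significant eigenvalues recovers the number of independent chemical species.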

  16. Sampling pig farms at the abattoir in a cross-sectional study − Evaluation of a sampling method

    DEFF Research Database (Denmark)

    Birkegård, Anna Camilla; Hisham Beshara Halasa, Tariq; Toft, Nils

    2017-01-01

A cross-sectional study design is relatively inexpensive, fast, and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) …

  17. Design description and validation results for the IFMIF High Flux Test Module as outcome of the EVEDA phase

    Directory of Open Access Journals (Sweden)

    F. Arbeiter

    2016-12-01

Full Text Available During the Engineering Validation and Engineering Design Activities (EVEDA) phase (2007-2014) of the International Fusion Materials Irradiation Facility (IFMIF), an advanced engineering design of the High Flux Test Module (HFTM) has been developed with the objective to facilitate the controlled irradiation of steel samples in the high flux area directly behind the IFMIF neutron source. The development process included manufacturing techniques, CAD, and neutronic, thermal-hydraulic, and mechanical analyses, complemented by a series of validation activities. Validation included manufacturing of 1:1 parts and mockups, testing of prototypes in the FLEX and HELOKA-LP helium loops of KIT for verification of the thermal and mechanical properties, and irradiation of specimen-filled capsule prototypes in the BR2 test reactor. The prototyping activities were backed by several R&D studies addressing focused issues such as handling of liquid NaK (as a filling medium) and insertion of Small Specimen Test Technique (SSTT) specimens into the irradiation capsules. This paper provides an up-to-date design description of the HFTM irradiation device and reports on the achieved performance criteria relative to the requirements. Results of the validation activities are accounted for, and the most important issues for further development are identified.

  18. Strategies for achieving high sequencing accuracy for low diversity samples and avoiding sample bleeding using illumina platform.

    Science.gov (United States)

    Mitra, Abhishek; Skrzypczak, Magdalena; Ginalski, Krzysztof; Rowicka, Maga

    2015-01-01

    Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations are causing substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications, and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems with quality of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer's, for mixing low diversity samples with diverse samples and lowering cluster density, which in our experience consistently produces high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. 
Alternatively, we discuss how

  19. Strategies for achieving high sequencing accuracy for low diversity samples and avoiding sample bleeding using illumina platform.

    Directory of Open Access Journals (Sweden)

    Abhishek Mitra

Full Text Available Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations are causing substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications, and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems with quality of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer's, for mixing low diversity samples with diverse samples and lowering cluster density, which in our experience consistently produces high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol.
Alternatively

  20. Vapor space characterization of waste Tank 241-BY-108: Results from samples collected on 10/27/94

    International Nuclear Information System (INIS)

    McVeety, B.D.; Clauss, T.W.; Ligotke, M.W.

    1995-10-01

This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-BY-108 (referred to as Tank BY-108). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH3), nitrogen dioxide (NO2), nitric oxide (NO), and water vapor (H2O). Trends in NH3 and H2O samples indicated a possible sampling problem. Sampling for hydrogen cyanide (HCN) and sulfur oxides (SOx) was not requested. In addition, the authors looked for the 40 TO-14 compounds plus an additional 15 analytes. Of these, 17 were observed above the 5-ppbv reporting cutoff. Also, eighty-one organic tentatively identified compounds (TICs) were observed above the reporting cutoff (ca. 10 ppbv) and are reported with concentrations that are semiquantitative estimates based on internal standard response factors. The nine organic analytes with the highest estimated concentrations are listed in Summary Table 1 and account for approximately 48% of the total organic components in the headspace of Tank BY-108. Three permanent gases, hydrogen (H2), carbon dioxide (CO2), and nitrous oxide (N2O), were also detected. Tank BY-108 is on the Ferrocyanide Watch List.

  1. Results Of Analytical Sample Crosschecks For Next Generation Solvent Extraction Samples Isopar L Concentration And pH

    International Nuclear Information System (INIS)

    Peters, T.; Fink, S.

    2011-01-01

As part of the implementation process for the Next Generation Cesium Extraction Solvent (NGCS), SRNL and F/H Lab performed a series of analytical cross-checks to ensure that the components in the NGCS solvent system do not constitute an undue analytical challenge. For measurement of entrained Isopar® L in aqueous solutions, both labs performed similarly, with results more reliable at higher concentrations (near 50 mg/L). Low bias occurred in both labs, as seen previously in comparable blind studies for the baseline solvent system. SRNL recommends consideration of the use of Teflon™ caps on all sample containers used for this purpose. For pH measurements, the labs showed reasonable agreement but considerable positive bias for dilute boric acid solutions. SRNL recommends consideration of an alternate analytical method for qualification of boric acid concentrations.

  2. Vapor space characterization of waste tank 241-BY-105 (in situ): Results from samples collected on May 9, 1994

    International Nuclear Information System (INIS)

    McVeety, B.D.; Pool, K.H.; Ligotke, M.W.; Clauss, T.W.; Lucke, R.B.; Sharma, A.K.; McCulloch, M.; Fruchter, J.S.; Goheen, S.C.

    1995-05-01

This report describes inorganic and organic analyses results from in situ samples obtained from the tank headspace of the Hanford waste storage Tank 241-BY-105 (referred to as Tank BY-105). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds NH3, NO2, NO, HCN, and H2O. Sampling for sulfur oxides was not requested. Results of the inorganic samples were affected by sampling errors that led to an undefined uncertainty in sample volume. Consequently, tank-headspace concentrations are estimates only. Thirty-nine tentatively identified organic analytes were observed above the detection limit of ca. 10 ppbv, but standards for most of these were not available at the time of analysis, and their quantitation is beyond the scope of this study. In addition, we looked for the 41 standard TO-14 analytes. Of these, only a few were observed above the 2-ppbv detection limit. The 16 organic analytes with the highest estimated concentrations are listed. These 16 analytes account for approximately 68% of the total organic components in Tank BY-105.

  3. Headspace vapor characterization of Hanford Waste Tank 241-U-112: Results from samples collected on 7/09/96

    International Nuclear Information System (INIS)

    Evans, J.C.; Pool, K.H.; Thomas, B.L.; Olsen, K.B.; Fruchter, J.S.; Silvers, K.L.

    1997-01-01

    This report describes the analytical results of vapor samples taken from the headspace of the waste storage tank 241-U-112 at the Hanford Site in Washington State. The results described in this report were obtained to characterize the vapors present in the tank headspace and to support safety evaluations and tank farm operations. The results include air concentrations of selected inorganic and organic analytes and grouped compounds from samples obtained by Westinghouse Hanford Company

  4. Long-term frozen storage of urine samples: a trouble to get PCR results in Schistosoma spp. DNA detection?

    Directory of Open Access Journals (Sweden)

    Pedro Fernández-Soto

Full Text Available BACKGROUND: Human schistosomiasis remains a serious worldwide public health problem. At present, a sensitive and specific assay for routine diagnosis of schistosome infection is not yet available. The potential for detecting schistosome-derived DNA by PCR-based methods in human clinical samples is currently being investigated as a diagnostic tool with potential application in routine schistosomiasis diagnosis. Collection of diagnostic samples such as stool or blood is usually difficult in some populations. Urine, however, is a biological sample that can be collected non-invasively, is easy to obtain from people of all ages, and is easy to handle, but it is still not widely used as a sample for PCR diagnosis. This could be due to the high variability in the reported efficiency of detection, a result of the high variation in urine sample storage and handling conditions and in DNA preservation and extraction methods. METHODOLOGY/PRINCIPAL FINDINGS: We evaluated different commercial DNA extraction methods on a series of long-term frozen human urine samples from patients with parasitologically confirmed schistosomiasis in order to assess the PCR effectiveness for Schistosoma spp. detection. Patient urine samples were frozen for 18 months up to 7 years until use. Results were compared with those obtained in PCR assays using fresh healthy human urine artificially contaminated with Schistosoma mansoni DNA and urine samples from mice experimentally infected with S. mansoni cercariae, stored frozen for at least 12 months before use. PCR results in fresh artificially contaminated human urine samples using different DNA-based extraction methods were much more effective than those obtained when long-term frozen human urine samples were used as the source of DNA template.
CONCLUSIONS/SIGNIFICANCE: Long-term frozen human urine samples are probably not a good source for DNA extraction for use as a template in PCR detection of Schistosoma spp., regardless of the DNA

  5. Effect of sample stratification on dairy GWAS results

    Directory of Open Access Journals (Sweden)

    Ma Li

    2012-10-01

Full Text Available Abstract Background Artificial insemination and genetic selection are major factors contributing to population stratification in dairy cattle. In this study, we analyzed the effect of sample stratification and the effect of stratification correction on results of a dairy genome-wide association study (GWAS). Three methods for stratification correction were used: the efficient mixed-model association expedited (EMMAX) method accounting for correlation among all individuals, a generalized least squares (GLS) method based on half-sib intraclass correlation, and a principal component analysis (PCA) approach. Results Historical pedigree data revealed that the 1,654 contemporary cows in the GWAS were all related when traced through approximately 10–15 generations of ancestors. Genome and phenotype stratifications had a striking overlap with the half-sib structure. A large elite half-sib family of cows contributed to the detection of favorable alleles that had low frequencies in the general population and high frequencies in the elite cows and contributed to the detection of X chromosome effects. All three methods for stratification correction reduced the number of significant effects. The EMMAX method had the most severe reduction in the number of significant effects, and the PCA method using 20 principal components and GLS had similar significance levels. Removal of the elite cows from the analysis without using stratification correction removed many effects that were also removed by the three methods for stratification correction, indicating that stratification correction could have removed some true effects due to the elite cows. SNP effects with good consensus between different methods and effect size distributions from USDA's Holstein genomic evaluation included the DGAT1-NIBP region of BTA14 for production traits, a SNP 45 kb upstream of PIGY on BTA6, and two SNPs in NIBP on BTA14 for protein percentage. However, most of these consensus effects had
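The PCA correction described above can be illustrated with a toy simulation (synthetic genotypes, not the study's cattle data; the 20-PC choice mirrors the abstract). Two subpopulations with different allele frequencies confound the phenotype; projecting the top principal components out of both phenotype and genotypes removes the spurious associations.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 500, 1000                               # individuals x SNPs
# Two subpopulations with different allele frequencies (stratification).
group = rng.integers(0, 2, n)
freqs = np.where(group[:, None] == 0, 0.2, 0.4)
G = rng.binomial(2, freqs, size=(n, m))        # 0/1/2 allele counts
y = group + rng.standard_normal(n)             # phenotype confounded by group only

# PCA correction: residualize phenotype and genotypes on the top PCs.
Gc = G - G.mean(axis=0)
U, S, Vt = np.linalg.svd(Gc, full_matrices=False)
pcs = U[:, :20]                                # top 20 principal components
ry = y - y.mean()
ry_adj = ry - pcs @ (pcs.T @ ry)
G_adj = Gc - pcs @ (pcs.T @ Gc)

def corr_cols(M, v):
    """Correlation of each (centered) column of M with the centered vector v."""
    return (M * v[:, None]).sum(0) / (
        np.linalg.norm(M, axis=0) * np.linalg.norm(v) + 1e-12)

r_raw = corr_cols(Gc, ry)        # inflated by population structure
r_adj = corr_cols(G_adj, ry_adj)  # structure projected out
```

Since no SNP has a true effect here, the drop in mean |correlation| after correction shows the reduction in significant effects that the abstract reports for the stratification-corrected analyses.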

  6. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice of a population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as simple random sampling or stratified random sampling) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
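The two probability sampling methods named above can be sketched in a few lines. This is an illustrative example (the patient list, clinic strata, and 10% sampling fraction are invented): simple random sampling gives every unit an equal chance of selection, while stratified random sampling draws proportionally from each stratum.

```python
import random

def simple_random_sample(population, n, seed=0):
    """Probability sampling: every unit has an equal chance of selection."""
    rng = random.Random(seed)
    return rng.sample(population, n)

def stratified_random_sample(population, strata_key, frac, seed=0):
    """Draw the same fraction from each stratum (proportional allocation)."""
    rng = random.Random(seed)
    strata = {}
    for unit in population:
        strata.setdefault(strata_key(unit), []).append(unit)
    sample = []
    for units in strata.values():
        k = max(1, round(frac * len(units)))  # at least one unit per stratum
        sample.extend(rng.sample(units, k))
    return sample

# Hypothetical sampling frame: 90 patients from two clinics.
patients = [{"id": i, "clinic": "A" if i % 3 else "B"} for i in range(90)]
srs = simple_random_sample(patients, 10)
strat = stratified_random_sample(patients, lambda p: p["clinic"], 0.1)
```

A convenience sample, by contrast, would simply take the first patients who happen to be available, which is why its results cannot be generalized in the same way.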

  7. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice of a population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as simple random sampling or stratified random sampling) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  8. Headspace vapor characterization of Hanford waste Tank 241-C-201: Results from samples collected on 06/19/96

    International Nuclear Information System (INIS)

    Thomas, B.L.; Evans, J.C.; Pool, K.H.; Olsen, K.B.; Fruchter, J.S.; Silvers, K.L.

    1997-01-01

This report describes the analytical results of vapor samples taken from the headspace of the waste storage tank 241-C-201 (Tank C-201) at the Hanford Site in Washington State. The results described in this report were obtained to characterize the vapors present in the tank headspace and to support safety evaluations and tank farm operations. The results include air concentrations of selected inorganic and organic analytes and grouped compounds from samples obtained by Westinghouse Hanford Company (WHC) and provided for analysis to Pacific Northwest National Laboratory (PNNL). Analyses were performed by the Vapor Analytical Laboratory (VAL) at PNNL. Analyte concentrations were based on analytical results and, where appropriate, on sample volumes provided by WHC. A summary of the inorganic analytes, permanent gases, and total non-methane organic compounds is listed in a table. Detailed descriptions of the analytical results appear in the appendices.

  9. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the required precision, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over the nonprobability sampling techniques because the results of the study can be generalized to the target population.
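
    The relationship the abstract describes between precision and sample size can be sketched with the standard formula for estimating a proportion, n = z²·p·(1−p)/d². The helper names below are hypothetical, and the numbers are illustrative, not from the article:

    ```python
    import math

    def sample_size_proportion(p, margin, z=1.96):
        """Minimum n to estimate a proportion p within +/- margin,
        at the confidence level implied by z (1.96 ~ 95%)."""
        return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

    def finite_population_correction(n, population_size):
        """Shrink n when the study population itself is small."""
        return math.ceil(n / (1 + (n - 1) / population_size))

    # Greater required precision (smaller margin) -> larger required sample:
    n_wide = sample_size_proportion(p=0.5, margin=0.10)   # 97
    n_tight = sample_size_proportion(p=0.5, margin=0.05)  # 385
    ```

    Using p = 0.5 is the conservative choice when the expected proportion is unknown, since it maximises p·(1−p) and therefore the required n.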

  10. The results of experimental studies of VLF-ULF electromagnetic emission by rock samples due to mechanical action

    Science.gov (United States)

    Panfilov, A. A.

    2014-06-01

    The paper presents the results of laboratory experiments on the excitation of electromagnetic emissions (the electric component of electromagnetic fields) by rock samples subjected to different forms of mechanical stress. It was shown that samples generate electric impulses with different spectra under impact, gradual loading or dynamic friction. It was ascertained that the level and spectral composition of the signals generated by rock samples change with an increasing number of hits. It was found that strong electromagnetic signals, generated while rock samples were fracturing, were accompanied by repetitive weak but perceptible variations in the electric field intensity in narrow frequency ranges.

  11. Parameter sampling capabilities of sequential and simultaneous data assimilation: II. Statistical analysis of numerical results

    International Nuclear Information System (INIS)

    Fossum, Kristian; Mannseth, Trond

    2014-01-01

    We assess and compare parameter sampling capabilities of one sequential and one simultaneous Bayesian, ensemble-based, joint state-parameter (JS) estimation method. In the companion paper, part I (Fossum and Mannseth 2014 Inverse Problems 30 114002), analytical investigations lead us to propose three claims, essentially stating that the sequential method can be expected to outperform the simultaneous method for weakly nonlinear forward models. Here, we assess the reliability and robustness of these claims through statistical analysis of results from a range of numerical experiments. Samples generated by the two approximate JS methods are compared to samples from the posterior distribution generated by a Markov chain Monte Carlo method, using four approximate measures of distance between probability distributions. Forward-model nonlinearity is assessed from a stochastic nonlinearity measure allowing for sufficiently large model dimensions. Both toy models (with low computational complexity, and where the nonlinearity is fairly easy to control) and two-phase porous-media flow models (corresponding to down-scaled versions of problems to which the JS methods have been frequently applied recently) are considered in the numerical experiments. Results from the statistical analysis show strong support of all three claims stated in part I. (paper)
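
    The abstract compares samples from the approximate JS methods against MCMC posterior samples using four approximate distance measures, which it does not name. One common example of such a measure, shown here purely as a generic illustration (not the authors' code), is the two-sample Kolmogorov-Smirnov statistic in one dimension:

    ```python
    import bisect

    def ks_statistic(sample_a, sample_b):
        """Two-sample Kolmogorov-Smirnov statistic in 1-D: the largest
        vertical gap between the two empirical CDFs."""
        a, b = sorted(sample_a), sorted(sample_b)
        gap = 0.0
        for x in a + b:  # the gap can only be maximal at an observed point
            f_a = bisect.bisect_right(a, x) / len(a)
            f_b = bisect.bisect_right(b, x) / len(b)
            gap = max(gap, abs(f_a - f_b))
        return gap
    ```

    Identical samples give a statistic of 0, fully separated samples give 1, so the value quantifies how far an approximate sampler's output sits from a reference posterior sample along one parameter.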

  12. Strain measurement of objects subjected to aerodynamic heating using digital image correlation: experimental design and preliminary results.

    Science.gov (United States)

    Pan, Bing; Jiang, Tianyun; Wu, Dafang

    2014-11-01

    In thermomechanical testing of hypersonic materials and structures, direct observation and quantitative strain measurement of the front surface of a test specimen directly exposed to severe aerodynamic heating has been considered a very challenging task. In this work, a novel quartz infrared heating device with an observation window is designed to reproduce the transient thermal environment experienced by hypersonic vehicles. The specially designed experimental system allows the capture of the test article's surface images at various temperatures using an optical system outfitted with a bandpass filter. The captured images are post-processed by digital image correlation to extract full-field thermal deformation. To verify the viability and accuracy of the established system, thermal strains of a chromium-nickel austenite stainless steel sample heated from room temperature up to 600 °C were determined. The preliminary results indicate that the air disturbance between the camera and the specimen due to heat haze induces apparent distortions in the recorded images and large errors in the measured strains, but the average values of the measured strains are accurate enough. Limitations and further improvements of the proposed technique are discussed.

  13. Implicit Leadership Theory: Are Results Generalizable from Student to Professional Samples?

    Science.gov (United States)

    Singer, Ming

    1990-01-01

    Explores whether student subjects' implicit leadership theories are generalizable to professional subjects. Samples consisted of 220 undergraduates and 152 government employees in New Zealand. Finds the mean importance ratings were similar for the 2 samples, except students placed greater importance on factors beyond individual control. (DB)

  14. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice, or a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study. PMID:27688438

  15. Headspace vapor characterization of Hanford waste tank 241-B-107: Results from samples collected on 7/23/96

    International Nuclear Information System (INIS)

    Evans, J.C.; Pool, K.H.; Thomas, B.L.; Olsen, K.B.; Fruchter, J.S.; Silvers, K.L.

    1997-01-01

    This report describes the analytical results of vapor samples taken from the headspace of the waste storage tank 241-B-107 (Tank B-107) at the Hanford Site in Washington State. The results described in this report were obtained to characterize the vapors present in the tank headspace and to support safety evaluations and tank farm operations. The results include air concentrations of selected inorganic and organic analytes and grouped compounds from samples obtained by Westinghouse Hanford Company (WHC) and provided for analysis to Pacific Northwest National Laboratory (PNNL). A summary of the inorganic analytes, permanent gases, and total non-methane organic compounds is listed in a table. The three highest concentration analytes detected in SUMMA trademark canister and triple sorbent trap samples are also listed in the same table. Detailed descriptions of the analytical results appear in the appendices.

  16. Headspace vapor characterization of Hanford waste tank 241-S-106: Results from samples collected on 06/13/96

    International Nuclear Information System (INIS)

    Evans, J.C.; Pool, K.H.; Thomas, B.L.; Olsen, K.B.; Fruchter, J.S.; Silvers, K.L.

    1997-01-01

    This report describes the analytical results of vapor samples taken from the headspace of the waste storage tank 241-S-106 (Tank S-106) at the Hanford Site in Washington State. The results described in this report were obtained to characterize the vapors present in the tank headspace and to support safety evaluations and tank farm operations. The results include air concentrations of selected inorganic and organic analytes and grouped compounds from samples obtained by Westinghouse Hanford Company (WHC) and provided for analysis to Pacific Northwest National Laboratory (PNNL). A summary of the inorganic analytes, permanent gases, and total non-methane organic compounds is listed in a table. The three highest concentration analytes detected in SUMMA trademark canister and triple sorbent trap samples are also listed in the same table. Detailed descriptions of the analytical results appear in the appendices.

  17. Towards Representative Metallurgical Sampling and Gold Recovery Testwork Programmes

    Directory of Open Access Journals (Sweden)

    Simon C. Dominy

    2018-05-01

    Full Text Available When developing a process flowsheet, the risks in achieving positive financial outcomes are minimised by ensuring representative metallurgical samples and high-quality testwork. The quality and type of samples used are as important as the testwork itself. The key characteristic required of any set of samples is that they represent a given domain and quantify its variability. There are those who think that simply stating that a sample is representative makes it so, without justification. There is a need to consider both (1) in-situ and (2) testwork sub-sample representativity. Early ore/waste characterisation and domain definition are required, so that sampling and testwork protocols can be designed to suit the style of mineralisation in question. The Theory of Sampling (TOS) provides an insight into the causes and magnitude of errors that may occur during the sampling of particulate materials (e.g., broken rock) and is wholly applicable to metallurgical sampling. Quality assurance/quality control (QAQC) is critical throughout all programmes. Metallurgical sampling and testwork should be fully integrated into geometallurgical studies. Traditional metallurgical testwork is critical for plant design and is an inherent part of geometallurgy. In a geometallurgical study, multiple spatially distributed small-scale tests are used as proxies for process parameters. These will be validated against traditional testwork results. This paper focusses on sampling and testwork for gold recovery determination. It aims to provide the reader with the background to move towards the design, implementation and reporting of representative and fit-for-purpose sampling and testwork programmes. While the paper does not intend to provide a definitive commentary, it critically assesses the hard-rock sampling methods used and their optimal collection and preparation. The need for representative sampling and quality testwork to avoid financial and intangible losses is emphasised.

  18. Design aspects of automation system for initial processing of fecal samples

    International Nuclear Information System (INIS)

    Sawant, Pramilla D.; Prabhu, Supreetha P.; Suja, A.; Wankhede, Sonal; Chaudhary, Seema; Rao, D.D.; Pradeepkumar, K.S.; Das, A.P.; Badodkar, B.D.

    2014-01-01

    The procedure for initial handling of the fecal samples at Bioassay Lab., Trombay is as follows: overnight fecal samples are collected from the worker in a kit consisting of a polythene bag placed in a wide-mouth polythene container closed with an inner lid and a screw cap. The occupational worker collects the sample in the polythene bag. On receiving the sample, the polythene container along with the sample is weighed; the polythene bag containing the fecal sample is then lifted out of the container using a pair of tongs, placed inside a crucible and ashed inside a muffle furnace at 450°C. After complete ashing, the crucible containing white ash is taken up for further radiochemical processing. This paper describes the various steps in developing a prototype automated system for the initial handling of fecal samples, intended to automate the procedure described above. Once developed, the system will eliminate manual intervention up to the ashing stage and reduce the biological hazard involved in handling such samples.

  19. Direct calibration of PICKY-designed microarrays

    Directory of Open Access Journals (Sweden)

    Ronald Pamela C

    2009-10-01

    Full Text Available Abstract Background Few microarrays have been quantitatively calibrated to identify optimal hybridization conditions because it is difficult to precisely determine the hybridization characteristics of a microarray using biologically variable cDNA samples. Results Using synthesized samples with known concentrations of specific oligonucleotides, a series of microarray experiments was conducted to evaluate microarrays designed by PICKY, an oligo microarray design software tool, and to test a direct microarray calibration method based on the PICKY-predicted, thermodynamically closest nontarget information. The complete set of microarray experiment results is archived in the GEO database with series accession number GSE14717. Additional data files and Perl programs described in this paper can be obtained from the website http://www.complex.iastate.edu under the PICKY Download area. Conclusion PICKY-designed microarray probes are highly reliable over a wide range of hybridization temperatures and sample concentrations. The microarray calibration method reported here allows researchers to experimentally optimize their hybridization conditions. Because this method is straightforward, uses existing microarrays and relatively inexpensive synthesized samples, it can be used by any lab that uses microarrays designed by PICKY. In addition, other microarrays can be reanalyzed by PICKY to obtain the thermodynamically closest nontarget information for calibration.

  20. Vapor space characterization of Waste Tank 241-TY-104: Results from samples collected on 4/27/95

    International Nuclear Information System (INIS)

    Klinger, G.S.; Olsen, K.B.; Clauss, T.W.

    1995-10-01

    This report describes inorganic and organic analysis results from samples obtained from the headspace of the Hanford waste storage Tank 241-TY-104 (referred to as Tank TY-104). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH3), nitrogen dioxide (NO2), nitric oxide (NO), and water (H2O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SOx) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, 8 were observed above the 5-ppbv reporting cutoff. Five tentatively identified compounds (TICs) were observed above the reporting cutoff of ca. 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The 10 organic analytes with the highest estimated concentrations are listed in Table 1 and account for approximately 94% of the total organic components in Tank TY-104. Nitrous oxide (N2O) was the only permanent gas detected in the tank-headspace samples. Tank TY-104 is on the Ferrocyanide Watch List.

  1. Vapor space characterization of Waste Tank 241-U-105: Results from samples collected on 2/24/95

    International Nuclear Information System (INIS)

    Pool, K.H.; Clauss, T.W.; Ligotke, M.W.

    1995-10-01

    This report describes inorganic and organic analysis results from samples obtained from the headspace of the Hanford waste storage Tank 241-U-105 (referred to as Tank U-105). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH3), nitrogen dioxide (NO2), nitric oxide (NO), and water (H2O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SOx) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, six were observed above the 5-ppbv reporting cutoff. Three tentatively identified compounds (TICs) were observed above the reporting cutoff of ca. 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. All nine of the organic analytes identified are listed in Table 1 and account for 100% of the total organic components in Tank U-105. Nitrous oxide (N2O) was the only permanent gas detected in the tank-headspace sample. Tank U-105 is on the Hydrogen Watch List.

  2. Vapor space characterization of waste Tank 241-SX-103: Results from samples collected on 3/23/95

    International Nuclear Information System (INIS)

    Ligotke, M.W.; Clauss, T.W.; Pool, K.H.; McVeety, B.D.; Klinger, G.S.; Olsen, K.B.; Bredt, O.P.; Fruchter, J.S.; Goheen, S.C.

    1995-11-01

    This report describes inorganic and organic analysis results from samples obtained from the headspace of the Hanford waste storage tank 241-SX-103 (referred to as Tank SX-103). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH3), nitrogen dioxide (NO2), nitric oxide (NO), and water vapor (H2O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SOx) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, two were observed above the 5-ppbv reporting cutoff. Two tentatively identified compounds (TICs) were observed above the reporting cutoff of ca. 10 ppbv and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The four organic analytes identified are listed in Table 1 and account for approximately 100% of the total organic components in Tank SX-103. Carbon dioxide (CO2) was the only permanent gas detected in the tank-headspace samples. Tank SX-103 is on the Hydrogen Watch List.

  3. Cryogenic Liquid Sample Acquisition System for Remote Space Applications

    Science.gov (United States)

    Mahaffy, Paul; Trainer, Melissa; Wegel, Don; Hawk, Douglas; Melek, Tony; Johnson, Christopher; Amato, Michael; Galloway, John

    2013-01-01

    There is a need to acquire autonomously cryogenic hydrocarbon liquid sample from remote planetary locations such as the lakes of Titan for instruments such as mass spectrometers. There are several problems that had to be solved relative to collecting the right amount of cryogenic liquid sample into a warmer spacecraft, such as not allowing the sample to boil off or fractionate too early; controlling the intermediate and final pressures within carefully designed volumes; designing for various particulates and viscosities; designing to thermal, mass, and power-limited spacecraft interfaces; and reducing risk. Prior art inlets for similar instruments in spaceflight were designed primarily for atmospheric gas sampling and are not useful for this front-end application. These cryogenic liquid sample acquisition system designs for remote space applications allow for remote, autonomous, controlled sample collections of a range of challenging cryogenic sample types. The design can control the size of the sample, prevent fractionation, control pressures at various stages, and allow for various liquid sample levels. It is capable of collecting repeated samples autonomously in difficult low-temperature conditions often found in planetary missions. It is capable of collecting samples for use by instruments from difficult sample types such as cryogenic hydrocarbon (methane, ethane, and propane) mixtures with solid particulates such as found on Titan. The design with a warm actuated valve is compatible with various spacecraft thermal and structural interfaces. The design uses controlled volumes, heaters, inlet and vent tubes, a cryogenic valve seat, inlet screens, temperature and cryogenic liquid sensors, seals, and vents to accomplish its task.

  4. SSC 40 mm cable results and 50 mm design discussions

    International Nuclear Information System (INIS)

    Christopherson, D.; Capone, D.; Hannaford, R.; Remsbottom, R.; Delashmit, R.; Jayakumar, R.J.; Snitchler, G.; Scanlan, R.; Royet, J.

    1991-01-01

    This paper presents a summary of the cable produced for the 1990 40 mm Dipole Program. The cable design parameters for the 50 mm Dipole Program are discussed, as well as portions of the SSC specification draft. Considerations leading to the final cable configuration and the results of preliminary trials are included. The first iteration of a strand mapping program to automate cable strand maps is introduced.

  5. Design and development of a highly sensitive, field portable plasma source instrument for on-line liquid stream monitoring and real-time sample analysis

    International Nuclear Information System (INIS)

    Duan, Yixiang; Su, Yongxuan; Jin, Zhe; Abeln, Stephen P.

    2000-01-01

    The development of a highly sensitive, field portable, low-powered instrument for on-site, real-time liquid waste stream monitoring is described in this article. A series of factors such as system sensitivity and portability, plasma source, sample introduction, desolvation system, power supply, and the instrument configuration were carefully considered in the design of the portable instrument. A newly designed, miniature, modified microwave plasma source was selected as the emission source for spectroscopy measurement, and an integrated small spectrometer with a charge-coupled device detector was installed for signal processing and detection. An innovative beam collection system with optical fibers was designed and used for emission signal collection. Microwave plasma can be sustained with various gases at relatively low power, and it possesses high detection capabilities for both metal and nonmetal pollutants, making it desirable for on-site, real-time liquid waste stream monitoring. An effective in situ sampling system was coupled with a high-efficiency desolvation device for direct sampling of liquid samples into the plasma. A portable computer control system is used for data processing. The new, integrated instrument can be easily used for on-site, real-time monitoring in the field. The system possesses a series of advantages, including high sensitivity for metal and nonmetal elements; in situ sampling; compact structure; low cost; and ease of operation and handling. These advantages will significantly overcome the limitations of previous monitoring techniques and make great contributions to environmental restoration and monitoring.

  6. Control sample design using a geodemographic discriminator: An application of Super Profiles

    Science.gov (United States)

    Brown, Peter J. B.; McCulloch, Peter G.; Williams, Evelyn M. I.; Ashurst, Darren C.

    The development and application of an innovative sampling framework for use in a British study of the early detection of gastric cancer are described. The Super Profiles geodemographic discriminator is used in the identification of geographically distinct control and contrast areas from which samples of cancer registry case records may be drawn for comparison with the records of patients participating in the gastric cancer intervention project. Preliminary results of the application of the framework are presented and confirm its effectiveness in satisfactorily reflecting known patterns of variation in cancer occurrence by age, gender and social class. The method works well for cancers with a known and clear social gradient, such as lung and breast cancer, moderately well for gastric cancer and somewhat less well for oesophageal cancer, where the social class gradient is less clear.

  7. Urine sampling techniques in symptomatic primary-care patients

    DEFF Research Database (Denmark)

    Holm, Anne; Aabenhus, Rune

    2016-01-01

    Background: Choice of urine sampling technique in urinary tract infection may impact diagnostic accuracy and thus lead to possible over- or undertreatment. Currently no evidence-based consensus exists regarding correct sampling technique of urine from women with symptoms of urinary tract infection … a randomized or paired design to compare the result of urine culture obtained with two or more collection techniques in adult, female, non-pregnant patients with symptoms of urinary tract infection. We evaluated quality of the studies and compared accuracy based on dichotomized outcomes. Results: We included … in infection rate between mid-stream-clean-catch, mid-stream-urine and random samples. Conclusions: At present, no evidence suggests that sampling technique affects the accuracy of the microbiological diagnosis in non-pregnant women with symptoms of urinary tract infection in primary care. However …

  8. Present status of NMCC and sample preparation method for bio-samples

    International Nuclear Information System (INIS)

    Futatsugawa, S.; Hatakeyama, S.; Saitou, S.; Sera, K.

    1993-01-01

    At NMCC (Nishina Memorial Cyclotron Center) we are conducting research on PET (Positron Emission Computed Tomography) in nuclear medicine and PIXE (Particle Induced X-ray Emission) analysis using a compactly designed small cyclotron. The NMCC facilities have been open to researchers from other institutions since April 1993. The present status of NMCC is described. Bio-samples (medical samples, plants, animals and environmental samples) have mainly been analyzed by PIXE at NMCC. Small amounts of bio-samples for PIXE are decomposed quickly and easily in a sealed PTFE (polytetrafluoroethylene) vessel with a microwave oven. This sample preparation method is also described. (author)

  9. Preconcentration and determination of ceftazidime in real samples using dispersive liquid-liquid microextraction and high-performance liquid chromatography with the aid of experimental design.

    Science.gov (United States)

    Razmi, Rasoul; Shahpari, Behrouz; Pourbasheer, Eslam; Boustanifar, Mohammad Hasan; Azari, Zhila; Ebadi, Amin

    2016-11-01

    A rapid and simple method for the extraction and preconcentration of ceftazidime in aqueous samples has been developed using dispersive liquid-liquid microextraction followed by high-performance liquid chromatography analysis. The extraction parameters, such as the volumes of the extraction and disperser solvents, salt effect, sample volume, centrifuge rate, centrifuge time, extraction time, and temperature in the dispersive liquid-liquid microextraction process, were studied and optimized using experimental design methods. First, the Taguchi design was used for preliminary screening of the parameters; the fractional factorial design was then used to optimize the significant factors. Under the optimum conditions, the calibration curves for ceftazidime showed good linearity over the range of 0.001-10 μg/mL with correlation coefficients higher than 0.98, and the limits of detection were 0.13 and 0.17 ng/mL for water and urine samples, respectively. The proposed method was successfully employed to determine ceftazidime in water and urine samples, and good agreement between the experimental data and predicted values was achieved. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
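
    The linearity and detection-limit figures quoted in this abstract come from a fitted calibration curve. A generic sketch of that computation is shown below; the function names and the calibration points are illustrative assumptions, not the paper's data, and the LOD convention used (3.3·σ_blank/slope) is one common choice rather than necessarily the authors':

    ```python
    def linear_fit(x, y):
        """Ordinary least-squares slope and intercept for a calibration curve."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        slope = sxy / sxx
        return slope, my - slope * mx

    def detection_limit(blank_sd, slope):
        """LOD estimated as 3.3 * sigma(blank) / slope (a common convention)."""
        return 3.3 * blank_sd / slope

    # Illustrative calibration points (concentration -> detector response):
    conc = [0.0, 1.0, 2.0, 4.0]
    area = [0.1, 2.1, 4.1, 8.1]
    slope, intercept = linear_fit(conc, area)
    lod = detection_limit(blank_sd=0.01, slope=slope)
    ```

    Unknown samples are then quantified by inverting the fitted line: concentration = (response − intercept) / slope.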

  10. Sample Results From The Interim Salt Disposition Program Macrobatch 7 Tank 21H Qualification Samples

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B.; Washington, A. L. II

    2013-08-08

    Savannah River National Laboratory (SRNL) analyzed samples from Tank 21H in support of qualification of Macrobatch (Salt Batch) 7 for the Interim Salt Disposition Program (ISDP). An ARP and several ESS tests were also performed. This document reports characterization data on the samples of Tank 21H as well as simulated performance of ARP/MCU. No issues with the projected Salt Batch 7 strategy are identified, other than the presence of visible quantities of dark-colored solids. A demonstration of the monosodium titanate (0.2 g/L) removal of strontium and actinides provided acceptable 4-hour average decontamination factors for Pu and Sr of 3.22 and 18.4, respectively. The four ESS tests also showed acceptable behavior, with distribution ratio (D(Cs)) values of 15.96, 57.1, 58.6, and 65.6 for the MCU, cold blend, hot blend, and Next Generation Solvent (NGS), respectively. The predicted value for the MCU solvent was 13.2. Currently, there are no models that would allow a prediction of extraction behavior for the other three solvents. SRNL recommends that a model for predicting extraction behavior for cesium removal for the blended solvent and NGS be developed. While no outstanding issues were noted, the presence of solids in the samples should be investigated in future work. It is possible that the solids may represent a potential reservoir of material (such as potassium) that could have an impact on MCU performance if they were to dissolve back into the feed solution. This salt batch is intended to be the first batch to be processed through MCU entirely using the new NGS-MCU solvent.
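
    The decontamination factors and distribution ratios quoted above follow the standard definitions from separations chemistry, sketched below. The function names and the numeric inputs are illustrative only, not measurements from the report:

    ```python
    def decontamination_factor(feed_activity, product_activity):
        """DF: contaminant activity (or concentration) in the feed divided by
        the activity remaining in the product after treatment."""
        return feed_activity / product_activity

    def distribution_ratio(conc_organic, conc_aqueous):
        """D: solute concentration in the organic (solvent) phase divided by
        its concentration in the aqueous phase at equilibrium."""
        return conc_organic / conc_aqueous

    # Illustrative values only -- not data from the Tank 21H samples:
    df = decontamination_factor(feed_activity=100.0, product_activity=5.0)  # 20.0
    d_cs = distribution_ratio(conc_organic=30.0, conc_aqueous=2.0)          # 15.0
    ```

    A higher DF means more complete removal of the contaminant; a higher D(Cs) means cesium partitions more strongly into the extraction solvent.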

  11. Research on PCPV for BWR - physical model as design tool - main results

    International Nuclear Information System (INIS)

    Fumagalli, E.; Verdelli, G.

    1975-01-01

    ISMES (Experimental Institute for Models and Structures) is now carrying out a series of tests on physical models as part of a research programme sponsored by DSR (Studies and Research Direction) of ENEL (Italian State Electricity Board) on behalf of CPN (Nuclear Design and Construction Centre) of ENEL, with the aim of testing a thin-walled PCPV for a BWR. The physical model, together with the mathematical model and the rheological model of the materials, is intended as a meaningful design tool. The mathematical model covers the overall structural design phase (geometries) and the linear behaviour, whereas the physical model, besides providing global information to be compared with the results of the mathematical model, supplies data on the non-linear behaviour up to failure and on local conditions (penetration area, etc.). The aim of the first phase of this research programme is to compare calculation and experiment with respect to the thicknesses of the wall and the bottom slab, whereas the second phase of the research deals with the behaviour of the removable lid and its connection with the main structure. To this end, a model in scale 1:10 has been designed which symmetrically reproduces, with respect to the equator, the bottom part of the structure. In the bottom slab the penetrations of the prototype design are reproduced, whereas the upper slab is plain. This paper describes the model and illustrates the main results, underlining the different behaviour of the upper and bottom slabs up to collapse.

  12. Infant feeding bottle design, growth and behaviour: results from a randomised trial

    Directory of Open Access Journals (Sweden)

    Fewtrell MS

    2012-03-01

    Full Text Available Abstract Background Whether the design of an anti-vacuum infant feeding bottle influences infant milk intake, growth or behavior is unknown, and was the subject of this randomized trial. Methods Subjects 63 (36 male) healthy, exclusively formula-fed term infants. Intervention Randomisation to use Bottle A (n = 31, one-way air valve: Philips Avent) versus Bottle B (n = 32, internal venting system: Dr Browns). 74 breast-fed reference infants were recruited, with randomisation (n = 24) to bottle A (n = 11) or B (n = 13) if bottle-feeding was subsequently introduced. Randomisation was stratified by gender and parity, with computer-based telephone randomisation by an independent clinical trials unit. Setting Infant home. Primary outcome measure infant weight gain to 4 weeks. Secondary outcomes (i) milk intake; (ii) infant behaviour measured at 2 weeks (validated 3-day diary); (iii) risk of infection; (iv) continuation of breastfeeding following introduction of mixed feeding. Results Number analysed for primary outcome Bottle A n = 29, Bottle B n = 25. Primary outcome There was no significant difference in weight gain between randomised groups (0-4 weeks Bottle A 0.74 (SD 1.2) SDS versus Bottle B 0.51 (0.39); mean difference 0.23 (95% CI -0.31 to 0.77)). Secondary outcomes Infants using Bottle A had significantly less reported fussing (mean 46 versus 74 minutes/day). Breast-fed reference group There were no significant differences in primary or secondary outcomes between breast-fed and formula-fed infants. The likelihood of breastfeeding at 3 months was not significantly different in infants subsequently randomised to Bottle A or B. Conclusion Bottle design may have short-term effects on infant behaviour which merit further investigation. No significant effects were seen on milk intake or growth; confidence in these findings is limited by the small sample size, and this needs confirmation in a larger study. Trial registration ClinicalTrials.gov NCT00325208.
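
The reported mean difference and 95% CI can be approximated from the group summaries with the standard normal-approximation formula for a difference in two independent means. A hedged sketch follows; it will not exactly reproduce the trial's own (possibly adjusted) interval:

```python
import math

def mean_diff_ci(m1, sd1, n1, m2, sd2, n2, z=1.96):
    """Approximate 95% CI for a difference in two independent means
    (plain normal approximation; a trial's published analysis may be adjusted)."""
    diff = m1 - m2
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return diff, diff - z * se, diff + z * se

# group summaries as reported: Bottle A 0.74 SDS (SD 1.2, n = 29),
# Bottle B 0.51 SDS (SD 0.39, n = 25)
diff, lo, hi = mean_diff_ci(0.74, 1.2, 29, 0.51, 0.39, 25)
```

Because the interval straddles zero, the weight-gain difference is not statistically significant, matching the trial's conclusion.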

  13. Sample triage : an overview of Environment Canada's program

    Energy Technology Data Exchange (ETDEWEB)

    Lambert, P.; Goldthorp, M.; Fingas, M. [Environment Canada, Ottawa, ON (Canada). Emergencies Science and Technology Division, Environmental Technology Centre, Science and Technology Branch

    2006-07-01

    The Chemical, biological and radiological/nuclear Research and Technology Initiative (CRTI) is a program led by Canada's Department of National Defence in an effort to improve the capability of providing technical and analytical support in the event of a terrorist-related event. This paper summarized the findings from the CRTI Sample Triage Working Group and reviewed information on Environment Canada's triage program and its mobile sample inspection facility, which was designed to help examine samples of hazardous materials in a controlled environment to minimize the risk of exposure. A sample triage program is designed to deal with administrative, health and safety issues by facilitating the safe transfer of samples to an analytical laboratory. It refers to the collation of all results, including field screening information, intelligence and observations, for the purpose of prioritizing and directing the sample to the appropriate laboratory for analysis. A central component of Environment Canada's Emergency Response Program has been its capacity to respond on site during an oil or chemical spill. As such, the Emergencies Science and Technology Division acquired a new mobile sample inspection facility in 2004. It is constructed to work with a custom-designed decontamination unit and Ford F450 tow vehicle. The criteria and general design of the trailer facility were described. This paper also outlined the steps taken following a spill of hazardous materials into the environment so that potentially dangerous samples could be safely assessed. Several field trials will be carried out in order to develop standard operating procedures for the mobile sample inspection facility. 6 refs., 6 figs., 4 appendices.

  14. Sensitive Metamaterial Sensor for Distinction of Authentic and Inauthentic Fuel Samples

    Science.gov (United States)

    Tümkaya, Mehmet Ali; Dinçer, Furkan; Karaaslan, Muharrem; Sabah, Cumali

    2017-08-01

    A metamaterial-based sensor has been realized to distinguish authentic and inauthentic fuel samples in the microwave frequency regime. Unlike many studies in the literature on metamaterial-based sensor applications, this study focuses on a compact metamaterial-based sensor operating in the X-band frequency range. Firstly, electromagnetic properties of authentic and inauthentic fuel samples were obtained experimentally in a laboratory environment. Secondly, these experimental results were used to design and create a highly efficient metamaterial-based sensor with easy fabrication characteristics and a simple design structure. The experimental results for the sensor were in good agreement with the numerical ones. The proposed sensor offers an efficient design and can be used to detect fuel and multiple other liquids in various application fields, from medical to military, in several frequency regimes.

  15. Aerosol sampling of an experimental fluidized bed coal combustor

    International Nuclear Information System (INIS)

    Newton, G.J.; Peele, E.R.; Carpenter, R.L.; Yeh, H.C.

    1977-01-01

    Fluidized bed combustion of coal, lignite or other materials has a potential for widespread use in central electric generating stations in the near future. This technology may allow widespread use of low-grade and/or high-sulfur fuels due to its high energy utilization at low combustion temperature and its ability to meet emission criteria by using limestone bed material. Particulate and gaseous products resulting from fuel combustion and fluidization of bed material are discharged and pass through the exhaust clean-up system. Sampling philosophy, methodology and equipment used to obtain aerosol samples from the exhaust system of the 18-inch fluidized bed combustor (FBC) at the Morgantown Energy Research Center (MERC) are described. Identification of sampling sites led to design of an aerosol sampling train which allowed a known quantity of the effluent streams to be sampled. Depending on the position, a 15 to 25 l/min sample is extracted from the duct, immediately diluted and transferred to a sampling/aging chamber. Transmission and scanning electron microscope samples, two types of cascade impactor samples, vapor-phase and particulate-phase organic samples, spiral duct aerosol centrifuge samples, optical size measurements and filter samples were obtained. Samples are undergoing physical, chemical and biological tests to help establish human health risk estimates for fluidized bed coal combustion and to provide information for use in design and evaluation of control technologies.

  16. Semiautomatic exchanger of samples for carrying out neutron activation analysis

    International Nuclear Information System (INIS)

    Aguilar H, F.; Quintana C, G.; Torres R, C. E.; Mejia J, J. O.

    2015-09-01

    In this paper, the design methodology and implementation of a semiautomatic sample exchanger for the Neutron Activation Analysis Laboratory of the Reactor Department is presented. Taking the antecedents into account, the needs for improvement are described, as well as the equipment the laboratory previously contained. The semiautomatic sample exchanger project was developed at the Instituto Nacional de Investigaciones Nucleares with in-house technology, to reduce dependence on commercial equipment. Each element of the semiautomatic sample exchanger is described in both the design and construction phases. The results achieved are positive and encouraging for the fulfillment of the proposed objective, which is to increase the capacity of the laboratory. (Author)

  17. Hyper-X Mach 7 Scramjet Design, Ground Test and Flight Results

    Science.gov (United States)

    Ferlemann, Shelly M.; McClinton, Charles R.; Rock, Ken E.; Voland, Randy T.

    2005-01-01

    The successful Mach 7 flight test of the Hyper-X (X-43) research vehicle has provided the major, essential demonstration of the capability of the airframe integrated scramjet engine. This flight was a crucial first step toward realizing the potential for airbreathing hypersonic propulsion for application to space launch vehicles. However, it is not sufficient to have just achieved a successful flight. The more useful knowledge gained from the flight is how well the prediction methods matched the actual test results in order to have confidence that these methods can be applied to the design of other scramjet engines and powered vehicles. The propulsion predictions for the Mach 7 flight test were calculated using the computer code, SRGULL, with input from computational fluid dynamics (CFD) and wind tunnel tests. This paper will discuss the evolution of the Mach 7 Hyper-X engine, ground wind tunnel experiments, propulsion prediction methodology, flight results and validation of design methods.

  18. Optimal relaxed causal sampler using sampled-data system theory

    NARCIS (Netherlands)

    Shekhawat, Hanumant; Meinsma, Gjerrit

    This paper studies the design of an optimal relaxed causal sampler using sampled-data system theory. A lifted frequency domain approach is used to obtain the existence conditions and the optimal sampler. A state space formulation of the results is also provided. The resulting optimal relaxed causal

  19. Fabrication of an electrochemical sensor based on computationally designed molecularly imprinted polymer for the determination of mesalamine in real samples

    Energy Technology Data Exchange (ETDEWEB)

    Torkashvand, M. [Department of Analytical Chemistry, Razi University, Kermanshah (Iran, Islamic Republic of); Gholivand, M.B., E-mail: mbgholivand@yahoo.com [Department of Analytical Chemistry, Razi University, Kermanshah (Iran, Islamic Republic of); Taherkhani, F. [Department of Physical Chemistry, Razi University, Kermanshah (Iran, Islamic Republic of)

    2015-10-01

    A novel electrochemical sensor based on mesalamine molecularly imprinted polymer (MIP) film on a glassy carbon electrode was fabricated. Density functional theory (DFT) in gas and solution phases was developed to study the intermolecular interactions in the pre-polymerization mixture and to find the suitable functional monomers in MIP preparation. On the basis of computational results, o-phenylenediamine (OP), gallic acid (GA) and p-aminobenzoic acid (ABA) were selected as functional monomers. The MIP film was cast on the glassy carbon electrode by electropolymerization of a solution containing the ternary monomers, followed by deposition of Ag dendrites (AgDs) with nanobranches. The surface of the modified electrode (AgDs/MIP/GCE) was characterized by scanning electron microscopy (SEM) and electrochemical impedance spectroscopy (EIS). Under the optimal experimental conditions, the peak current was proportional to the concentration of mesalamine ranging from 0.05 to 100 μM, with a detection limit of 0.015 μM. The proposed sensor was applied successfully for mesalamine determination in real samples. - Highlights: • The determination of MES using AgDs/MIP/GCE is reported for the first time. • The computer-assisted design of terpolymer MIPs was used to screen monomers. • Theoretical results of the DFT approach were in agreement with experimental results. • The sensor displayed a high selectivity for the template in the presence of interferents. • The developed sensor has been applied to determine mesalamine in real samples.

  20. Fabrication of an electrochemical sensor based on computationally designed molecularly imprinted polymer for the determination of mesalamine in real samples

    International Nuclear Information System (INIS)

    Torkashvand, M.; Gholivand, M.B.; Taherkhani, F.

    2015-01-01

    A novel electrochemical sensor based on mesalamine molecularly imprinted polymer (MIP) film on a glassy carbon electrode was fabricated. Density functional theory (DFT) in gas and solution phases was developed to study the intermolecular interactions in the pre-polymerization mixture and to find the suitable functional monomers in MIP preparation. On the basis of computational results, o-phenylenediamine (OP), gallic acid (GA) and p-aminobenzoic acid (ABA) were selected as functional monomers. The MIP film was cast on the glassy carbon electrode by electropolymerization of a solution containing the ternary monomers, followed by deposition of Ag dendrites (AgDs) with nanobranches. The surface of the modified electrode (AgDs/MIP/GCE) was characterized by scanning electron microscopy (SEM) and electrochemical impedance spectroscopy (EIS). Under the optimal experimental conditions, the peak current was proportional to the concentration of mesalamine ranging from 0.05 to 100 μM, with a detection limit of 0.015 μM. The proposed sensor was applied successfully for mesalamine determination in real samples. - Highlights: • The determination of MES using AgDs/MIP/GCE is reported for the first time. • The computer-assisted design of terpolymer MIPs was used to screen monomers. • Theoretical results of the DFT approach were in agreement with experimental results. • The sensor displayed a high selectivity for the template in the presence of interferents. • The developed sensor has been applied to determine mesalamine in real samples.

  1. Advancing the Use of Passive Sampling in Risk Assessment and Management of Sediments Contaminated with Hydrophobic Organic Chemicals: Results of an International Ex Situ Passive Sampling Interlaboratory Comparison

    Science.gov (United States)

    This work presents the results of an international interlaboratory comparison on ex situ passive sampling in sediments. The main objectives were to map the state of the science in passively sampling sediments, identify sources of variability, provide recommendations and practica...

  2. An ultrasonic corer for planetary rock sample retrieval

    International Nuclear Information System (INIS)

    Harkness, P; Cardoni, A; Lucas, M

    2009-01-01

    Several recent and planned space projects have been focussed on surface rovers for planetary missions, such as the U.S. Mars Exploration Rovers and the European ExoMars. The main functions of similar extraterrestrial vehicles in the future will be moving across planetary surfaces and retrieving rock samples. This paper presents a novel ultrasonic rock sampling tool tuned in a longitudinal-torsional mode along with the conceptual design of a full coring apparatus for preload delivery and core removal. Drilling and coring bits have been designed so that a portion of the longitudinal motion supplied by the ultrasonic transducer is converted into torsional motion. Results of drilling/coring trials are also presented.

  3. Off-road sampling reveals a different grassland bird community than roadside sampling: implications for survey design and estimates to guide conservation

    Directory of Open Access Journals (Sweden)

    Troy I. Wellicome

    2014-06-01

    concern. Our results highlight the need to develop appropriate corrections for bias in estimates derived from roadside sampling, and the need to design surveys that sample bird communities across a more representative cross-section of the landscape, both near and far from roads.

  4. Bayesian assessment of the expected data impact on prediction confidence in optimal sampling design

    Science.gov (United States)

    Leube, P. C.; Geiges, A.; Nowak, W.

    2012-02-01

    Incorporating hydro(geo)logical data, such as head and tracer data, into stochastic models of (subsurface) flow and transport helps to reduce prediction uncertainty. Because of financial limitations for investigation campaigns, information needs toward modeling or prediction goals should be satisfied efficiently and rationally. Optimal design techniques find the best one among a set of investigation strategies. They optimize the expected impact of data on prediction confidence or related objectives prior to data collection. We introduce a new optimal design method, called PreDIA(gnosis) (Preposterior Data Impact Assessor). PreDIA derives the relevant probability distributions and measures of data utility within a fully Bayesian, generalized, flexible, and accurate framework. It extends the bootstrap filter (BF) and related frameworks to optimal design by marginalizing utility measures over the yet unknown data values. PreDIA is a strictly formal information-processing scheme free of linearizations. It works with arbitrary simulation tools, provides full flexibility concerning measurement types (linear, nonlinear, direct, indirect), allows for any desired task-driven formulations, and can account for various sources of uncertainty (e.g., heterogeneity, geostatistical assumptions, boundary conditions, measurement values, model structure uncertainty, a large class of model errors) via Bayesian geostatistics and model averaging. Existing methods fail to simultaneously provide these crucial advantages, which our method buys at relatively higher computational costs. We demonstrate the applicability and advantages of PreDIA over conventional linearized methods in a synthetic example of subsurface transport. In the example, we show that informative data is often invisible to linearized methods that confuse zero correlation with statistical independence. Hence, PreDIA will often lead to substantially better sampling designs. Finally, we extend our example to specifically
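
The preposterior idea described above — averaging a posterior prediction variance over data values that are still unknown at design time, via likelihood re-weighting of a prior ensemble — can be illustrated with a toy Monte Carlo sketch. This is not the authors' PreDIA code; the measurement model, noise level, and prediction goal below are invented for illustration:

```python
import math
import random
import statistics

# Toy preposterior data-impact sketch (not the authors' PreDIA code): weight a
# prior ensemble by the likelihood of a simulated observation (a bootstrap-
# filter-style update), then average the posterior prediction variance over
# many possible observations. All model choices here are hypothetical.
random.seed(1)

ensemble = [random.gauss(0.0, 1.0) for _ in range(2000)]  # prior parameter samples
predict = lambda theta: 2.0 * theta                       # prediction goal z(theta)
observe = lambda theta: theta                             # candidate measurement y(theta)
noise_sd = 0.5

def posterior_var(y):
    """Variance of the prediction after observing y, via importance weights."""
    w = [math.exp(-0.5 * ((y - observe(t)) / noise_sd) ** 2) for t in ensemble]
    s = sum(w)
    mean = sum(wi * predict(t) for wi, t in zip(w, ensemble)) / s
    return sum(wi * (predict(t) - mean) ** 2 for wi, t in zip(w, ensemble)) / s

prior_var = statistics.pvariance([predict(t) for t in ensemble])

# Marginalize over the unknown data: simulate plausible observations from
# randomly chosen ensemble members and average the resulting posterior variance.
expected_var = statistics.mean(
    posterior_var(observe(t) + random.gauss(0.0, noise_sd))
    for t in random.sample(ensemble, 200)
)
```

For an informative candidate measurement the expected posterior variance falls well below the prior variance; an uninformative design would leave it essentially unchanged, which is the basis for ranking designs.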

  5. Single blood-Hg samples can result in exposure misclassification: temporal monitoring within the Japanese community (United States

    Directory of Open Access Journals (Sweden)

    Tsuchiya Ami

    2012-06-01

    Full Text Available Abstract Background The most prominent non-occupational source of exposure to methylmercury is the consumption of fish. In this study we examine a fish-consuming population to determine the extent of temporal exposure and investigate the extent to which single-time estimates of methylmercury exposure based on blood-Hg concentration can provide reliable estimates of longer-term average exposure. Methods Blood-mercury levels were obtained from a portion of the Arsenic Mercury Intake Biometric Study (AMIBS) cohort. Specifically, 56 Japanese women residing in the Puget Sound area of Washington State, US, were sampled on three occasions across a one-year period. Results An average of 135 days separated samples, with mean blood-mercury levels for the visits being 5.1, 6.6 and 5.0 μg/l and geometric means being 2.7, 4.5 and 3.1 μg/l. The blood-mercury levels in this group exceed national averages, with geometric means for two of the visits being between the 90th and 95th percentiles of nationally observed levels and the lowest geometric mean being between the 75th and 90th percentile. Group means were not significantly different across sampling periods, suggesting that exposure of combined subjects remained relatively constant. Comparing intra-individual results over time did not reveal a strong correlation among visits (r = 0.19, 0.50, 0.63 between 1st and 2nd, 2nd and 3rd, and 1st and 3rd sample results, respectively). In comparing blood-mercury levels across two-sample interval combinations (1st and 2nd, 2nd and 3rd, and 1st and 3rd visits, respectively), 58% (n = 34), 53% (n = 31) and 29% (n = 17) of the individuals had at least a 100% difference in blood-Hg levels. Conclusions Point estimates of blood-mercury, when compared with three-sample averages, may not reflect temporal variability, and individual exposures estimated on the basis of single blood samples should be treated with caution as indicators of long-term exposure.
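
The two summaries used above — visit-to-visit Pearson correlations and the share of subjects whose paired blood-Hg values differ by at least 100% (i.e., a ratio of 2 or more) — can be sketched as follows. The values here are hypothetical, not the study's data:

```python
import math

# Illustrative sketch (hypothetical values, not the study's data) of the two
# summaries reported above: Pearson correlation between two visits, and the
# fraction of subjects whose paired blood-Hg values differ by >= 100%.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def frac_doubled(xs, ys):
    """Fraction of pairs where the larger value is at least twice the smaller."""
    return sum(max(x, y) / min(x, y) >= 2.0 for x, y in zip(xs, ys)) / len(xs)

visit1 = [2.0, 5.5, 1.2, 8.0, 3.3]   # hypothetical blood-Hg, ug/L
visit2 = [4.5, 2.4, 1.4, 9.0, 0.9]
r12 = pearson_r(visit1, visit2)
share = frac_doubled(visit1, visit2)
```

Weak visit-to-visit correlation together with a large doubled-value share is exactly the pattern that makes a single blood draw an unreliable proxy for long-term exposure.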

  6. Economic Statistical Design of Variable Sampling Interval $\overline{X}$ Control Chart Based on Surrogate Variable Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Lee Tae-Hoon

    2016-12-01

    Full Text Available In many cases, an $\overline{X}$ control chart based on a performance variable is used in industrial fields. Typically, the control chart monitors the measurements of the performance variable itself. However, if the performance variable is too costly or impossible to measure, and a less expensive surrogate variable is available, the process may be more efficiently controlled using surrogate variables. In this paper, we present a model for the economic statistical design of a VSI (Variable Sampling Interval) $\overline{X}$ control chart using a surrogate variable that is linearly correlated with the performance variable. We derive the total average profit model from an economic viewpoint, apply the model to a Very High Temperature Reactor (VHTR) nuclear fuel measurement system, and derive the optimal result using genetic algorithms. Compared with the control chart based on a performance variable, the proposed model gives a larger expected net income per unit of time in the long run if the correlation between the performance variable and the surrogate variable is relatively high. The proposed model was confined to the sample mean control chart under the assumption that a single assignable cause occurs according to a Poisson process. However, the model may also be extended to other types of control charts using single or multiple assignable-cause assumptions, such as the VSS (Variable Sample Size) $\overline{X}$ control chart, EWMA, CUSUM charts and so on.

  7. Active vibration absorber for CSI evolutionary model: Design and experimental results

    Science.gov (United States)

    Bruner, Anne M.; Belvin, W. Keith; Horta, Lucas G.; Juang, Jer-Nan

    1991-01-01

    The development of control technology for large flexible structures must include practical demonstrations to aid in the understanding and characterization of controlled structures in space. To support this effort, a testbed facility was developed to study practical implementation of new control technologies under realistic conditions. The design of a second-order, acceleration feedback controller, which acts as an active vibration absorber, is discussed. This controller provides guaranteed stability margins for collocated sensor/actuator pairs in the absence of sensor/actuator dynamics and computational time delay. The primary performance objective considered is damping augmentation of the first nine structural modes. A comparison of experimental and predicted closed-loop damping is presented, including test and simulation time histories for open- and closed-loop cases. Although the simulation and test results are not in full agreement, robustness of this design under model uncertainty is demonstrated. The basic advantage of this second-order controller design is that the stability of the controller is model independent.

  8. The effect of clustering on lot quality assurance sampling: a probabilistic model to calculate sample sizes for quality assessments.

    Science.gov (United States)

    Hedt-Gauthier, Bethany L; Mitsunaga, Tisha; Hund, Lauren; Olives, Casey; Pagano, Marcello

    2013-10-26

    Traditional Lot Quality Assurance Sampling (LQAS) designs assume observations are collected using simple random sampling. Alternatively, randomly sampling clusters of observations and then individuals within clusters reduces costs but decreases the precision of the classifications. In this paper, we develop a general framework for designing the cluster(C)-LQAS system and illustrate the method with the design of data quality assessments for the community health worker program in Rwanda. To determine sample size and decision rules for C-LQAS, we use the beta-binomial distribution to account for the inflated risk of errors introduced by sampling clusters at the first stage. We present general theory and code for sample size calculations. The C-LQAS sample sizes provided in this paper constrain misclassification risks below user-specified limits. Multiple C-LQAS systems meet the specified risk requirements, but numerous considerations, including per-cluster versus per-individual sampling costs, help identify optimal systems for distinct applications. We show the utility of C-LQAS for data quality assessments, but the method generalizes to numerous applications. This paper provides the necessary technical detail and supplemental code to support the design of C-LQAS for specific programs.
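
The beta-binomial calculation at the heart of C-LQAS can be sketched directly: the pmf follows from mixing a binomial over a Beta(a, b) prevalence distribution, and the probability of a lot passing a decision rule d is the lower tail. This is a minimal illustration, not the authors' published code; the a, b, n, d values below are arbitrary:

```python
import math

# Minimal beta-binomial sketch behind cluster LQAS (illustrative only; the
# a, b, n, d values are arbitrary, not from the paper).
def log_beta(a, b):
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def beta_binom_pmf(k, n, a, b):
    """P(X = k) when X ~ Binomial(n, p) with p ~ Beta(a, b); the extra-binomial
    spread models the clustering-inflated variance at the first sampling stage."""
    log_choose = math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
    return math.exp(log_choose + log_beta(k + a, n - k + b) - log_beta(a, b))

def accept_prob(n, d, a, b):
    """Probability the lot passes the rule 'accept if at most d failures in n'."""
    return sum(beta_binom_pmf(k, n, a, b) for k in range(d + 1))
```

A C-LQAS design then searches over (n, d) so that the acceptance probability is low when the Beta parameters encode the "bad" prevalence and high when they encode the "good" one, keeping both misclassification risks under the user-specified limits.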

  9. Characteristics On Culinary Packaging Design Of Ayam Betutu In Denpasar

    Directory of Open Access Journals (Sweden)

    Ni Luh Desi In Diana Sari

    2017-05-01

    Full Text Available The specific objective to be achieved in this research is to identify the characteristics of the packaging design used to package the culinary dish Ayam Betutu in the city of Denpasar. The results of this identification are helpful in formulating criteria for the culinary packaging design of Ayam Betutu, which can be used as a guide in creating innovative packaging designs (Packaging Design Brief). This research is a qualitative descriptive study. Sources of data were obtained through the purposive sampling method, performed with an accidental sampling technique. Sampling was performed in restaurants selling Ayam Betutu spread throughout the city of Denpasar. Data collection techniques were observation, documentation and interview. The results achieved are the types of packaging used to pack the culinary dish Ayam Betutu in the city of Denpasar, classified by material, visual appeal, practical appeal and packaging criteria. Keywords: Packaging design, Ayam Betutu, Denpasar, Bali typical souvenirs.

  10. Report of the Results of an IMS LEarning Design Expert Workshop

    NARCIS (Netherlands)

    Neumann, Susanne; Klebl, Michael; Griffiths, David; Hernández-Leo, Davinia; De la Fuente-Valentin, Luis; Hummel, Hans; Brouns, Francis; Derntl, Michael; Oberhuemer, Petra

    2009-01-01

    Neumann, S., Klebl, M., Griffiths, D., Hernández-Leo, D., de la Fuente Valentín, L., Hummel, H., Brouns, F., Derntl, M., & Oberhuemer, P. (2010). Report of the Results of an IMS Learning Design Expert Workshop. International Journal Of Emerging Technologies In Learning (IJET), 5(1), pp.

  11. Solvent Hold Tank Sample Results for MCU-16-701-702-703: May 2016 Monthly Sample and MCU-16-710-711-712: May 2016 Superwashed Sample

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Jones, D. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-08-30

    The Savannah River National Laboratory (SRNL) received for analysis one set of Solvent Hold Tank (SHT) samples (MCU-16-701, MCU-16-702 and MCU-16-703), pulled on 05/23/2016, and another set of SHT samples (MCU-16-710, MCU-16-711, and MCU-16-712), pulled on 05/28/2016 after the solvent was superwashed with 300 mM sodium hydroxide. Samples MCU-16-701, MCU-16-702, and MCU-16-703 were combined into one sample (MCU-16-701-702-703), and samples MCU-16-710, MCU-16-711, and MCU-16-712 were combined into one sample (MCU-16-710-711-712). Of the two composite samples, MCU-16-710-711-712 represents the current chemical state of the solvent at MCU. All analytical conclusions are based on the chemical analysis of MCU-16-710-711-712. There were no chemical differences between MCU-16-701-702-703 and superwashed MCU-16-710-711-712. Analysis of the composited sample MCU-16-710-711-712 indicated the Isopar™L concentration is above its nominal level (102%). The modifier (CS-7SB) is 16% below its nominal concentration, while the TiDG and MaxCalix concentrations are at and above their nominal concentrations, respectively. The TiDG level has begun to decrease, and it is 7% below its nominal level as of May 28, 2016. Based on this current analysis, the levels of TiDG, Isopar™L, MaxCalix, and modifier are sufficient for continuing operation but are expected to decrease with time. Periodic characterization and trimming additions to the solvent are recommended.

  12. Radiological results for samples collected on paired glass- and cellulose-fiber filters at the Sandia complex, Tonopah Test Range, Nevada

    International Nuclear Information System (INIS)

    Mizell, Steve A.; Shadel, Craig A.

    2016-01-01

    Airborne particulates are collected at U.S. Department of Energy sites that exhibit radiological contamination on the soil surface to help assess the potential for wind to transport radionuclides from the contamination sites. Collecting these samples was originally accomplished by drawing air through a cellulose-fiber filter. These filters were replaced with glass-fiber filters in March 2011. Airborne particulates were collected side by side on the two filter materials between May 2013 and May 2014. Comparisons of the sample mass and the radioactivity determinations for the side-by-side samples were undertaken to determine if the change in the filter medium produced significantly different results. The differences in the results obtained using the two filter types were assessed visually, by evaluating the time series and correlation plots, and statistically, by conducting a nonparametric matched-pair sign test. Generally, the glass-fiber filters collect larger samples of particulates and produce higher radioactivity values for the gross alpha, gross beta, and gamma spectroscopy analyses. However, the correlation between the radioanalytical results for the glass-fiber filters and the cellulose-fiber filters was not strong enough to generate a linear regression function to estimate the glass-fiber filter sample results from the cellulose-fiber filter sample results.

  13. Vapor space characterization of Waste Tank 241-U-107: Results from samples collected on 2/17/95

    International Nuclear Information System (INIS)

    McVeety, B.D.; Clauss, T.W.; Ligotke, M.W.

    1995-10-01

    This report describes inorganic and organic analyses results from samples obtained from the headspace of the Hanford waste storage Tank 241-U-107 (referred to as Tank U-107). The results described here were obtained to support safety and toxicological evaluations. A summary of the results for inorganic and organic analytes is listed in Table 1. Detailed descriptions of the results appear in the text. Quantitative results were obtained for the inorganic compounds ammonia (NH3), nitrogen dioxide (NO2), nitric oxide (NO), and water (H2O). Sampling for hydrogen cyanide (HCN) and sulfur oxides (SOx) was not requested. In addition, quantitative results were obtained for the 39 TO-14 compounds plus an additional 14 analytes. Of these, 10 were observed above the 5-ppbv reporting cutoff. Sixteen organic tentatively identified compounds (TICs) were observed above the reporting cutoff of ca. 10 ppbv, and are reported with concentrations that are semiquantitative estimates based on internal-standard response factors. The 10 organic analytes with the highest estimated concentrations are listed in Table 1 and account for approximately 88% of the total organic components in Tank U-107. Nitrous oxide (N2O) was the only permanent gas detected in the tank-headspace samples. Tank U-107 is on the Organic and the Hydrogen Watch Lists.

  14. Design of a low cost miniaturized SFCW GPR with initial results

    Science.gov (United States)

    Duggal, Swati; Sinha, Piyush; Gupta, Manish; Patel, Anand; Vedam, V. V.; Mevada, Pratik; Chavda, Rajesh; Shah, Amita; Putrevu, Deepak

    2016-05-01

    This paper discusses the design and development of a Ground Penetrating Radar (GPR), various scientific and commercial applications of GPR, and the testing and results of the GPR at Antarctica for ice thickness measurement. GPR instruments are categorised by their frequency of operation, which is inversely proportional to the depth of penetration, and by their method of operation, which is time-domain or frequency-domain. The Indian market presently procures GPRs only from foreign suppliers. Space Applications Centre (SAC) took up GPR as an R&D technological development with a view to benchmarking the technology, which may then be transferred to local industry for mass production of the instrument at a relatively cheaper cost (~20 times cheaper). Hence, this instrument presents a viable indigenous alternative. The design and configuration were also targeted for terrestrial as well as future interplanetary (Lander/Rover) missions of ISRO to map subsurface features. The developed GPR has a very large bandwidth (100%, i.e. a bandwidth of 500 MHz at a centre frequency of 500 MHz) and high dynamic range, along with the advantage of being highly portable (<10 kg). The system was configured as a Stepped-Frequency Continuous-Wave (SFCW) GPR, a frequency-domain GPR, with the aim of increasing detection capability with respect to current systems. To achieve this goal, innovative electronic equipment has been designed and developed. Three prototypes were developed; two of them were delivered for the Indian Scientific Expedition to Antarctica (ISEA) in 2013 and 2014-15, respectively, and promising results have been obtained which compare closely with those from commercial GPRs.

  15. Visual Sample Plan Version 7.0 User's Guide

    Energy Technology Data Exchange (ETDEWEB)

    Matzke, Brett D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Newburn, Lisa LN [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hathaway, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bramer, Lisa M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wilson, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dowson, Scott T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sego, Landon H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pulsipher, Brent A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-03-01

This user's guide describes Visual Sample Plan (VSP) Version 7.0 and provides instructions for using the software. VSP selects the appropriate number and location of environmental samples to ensure that the results of statistical tests performed to provide input to risk decisions have the required confidence and performance. VSP Version 7.0 provides the sample-size equations or algorithms needed by specific statistical tests appropriate for specific environmental sampling objectives. It also provides data quality assessment and statistical analysis functions to support evaluation of the data and to determine whether the data support decisions regarding sites suspected of contamination. The easy-to-use program is highly visual and graphic. VSP runs on personal computers with Microsoft Windows operating systems (XP, Vista, Windows 7, and Windows 8). Designed primarily for project managers and users without expertise in statistics, VSP is applicable to two- and three-dimensional populations to be sampled (e.g., rooms and buildings, surface soil, a defined layer of subsurface soil, water bodies, and other similar applications) for studies of environmental quality. VSP is also applicable for designing sampling plans for assessing chemical/radiological/biological threat and hazard identification within rooms and buildings, and for designing geophysical surveys for unexploded ordnance (UXO) identification.
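
    As a concrete illustration of the kind of sample-size equation such software implements, the sketch below computes the familiar normal-approximation sample size for testing a mean against an action level. This is a generic textbook formula, not VSP's actual internal algorithm; the function name and the default error rates are my own choices.

    ```python
    import math
    from statistics import NormalDist

    def sample_size_mean_test(sigma, delta, alpha=0.05, beta=0.10):
        # n >= ((z_{1-alpha} + z_{1-beta}) * sigma / delta)^2, rounded up,
        # where sigma is the estimated standard deviation and delta is the
        # smallest difference from the action level that must be detected
        # with false-positive rate alpha and false-negative rate beta.
        z = NormalDist()
        n = ((z.inv_cdf(1 - alpha) + z.inv_cdf(1 - beta)) * sigma / delta) ** 2
        return math.ceil(n)

    print(sample_size_mean_test(sigma=1.0, delta=0.5))  # 35
    ```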

  16. Studying the sampling representativeness in the NPP ventilation ducts

    International Nuclear Information System (INIS)

    Sosnovskij, R.I.; Fedchenko, T.K.; Minin, S.A.

    2000-01-01

Measurements of the volumetric activity of gases and aerosols in NPP ventilation ducts are an important source of information on the release of radioactive contaminants into the environment. These measurements include sampling, sample transport, and the measurements themselves. This work is devoted to calculating the metrological characteristics of sampling systems for NPP gas-aerosol releases for various parameters of these systems and ventilation ducts. The results obtained are intended for use in the design of such systems and in their metrological certification [ru
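
    One metrological characteristic relevant to such duct sampling systems is the aspiration efficiency of the probe under anisokinetic conditions. The sketch below uses the classical Belyaev-Levin correlation purely as an illustration; it is not taken from this paper, and its stated ranges of validity (roughly 0.17 < R < 5.6 and 0.18 < Stk < 2) are not enforced here.

    ```python
    def aspiration_efficiency(stk, r):
        # Belyaev-Levin correlation for probe aspiration efficiency, where
        # stk is the particle Stokes number and r = U0/U is the ratio of
        # duct gas velocity to probe inlet velocity (r = 1 is isokinetic).
        beta = 1.0 - 1.0 / (1.0 + (2.0 + 0.617 / r) * stk)
        return 1.0 + (r - 1.0) * beta

    print(aspiration_efficiency(0.5, 1.0))  # 1.0: isokinetic sampling is unbiased
    ```

    Sub-isokinetic sampling (r < 1) under-samples large particles and super-isokinetic sampling (r > 1) over-samples them, which is why representativeness studies like this one matter.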

  17. A Novel Analog Integrated Circuit Design Course Covering Design, Layout, and Resulting Chip Measurement

    Science.gov (United States)

    Lin, Wei-Liang; Cheng, Wang-Chuan; Wu, Chen-Hao; Wu, Hai-Ming; Wu, Chang-Yu; Ho, Kuan-Hsuan; Chan, Chueh-An

    2010-01-01

    This work describes a novel, first-year graduate-level analog integrated circuit (IC) design course. The course teaches students analog circuit design; an external manufacturer then produces their designs in three different silicon chips. The students, working in pairs, then test these chips to verify their success. All work is completed within…

  18. Analytical results for 544 water samples collected in the Attean Quartz Monzonite in the vicinity of Jackman, Maine

    Science.gov (United States)

    Ficklin, W.H.; Nowlan, G.A.; Preston, D.J.

    1983-01-01

Water samples were collected in the vicinity of Jackman, Maine, as part of a study of the relationship between dissolved constituents in water and the sediments subjacent to the water. Each sample was analyzed for specific conductance, alkalinity, acidity, pH, fluoride, chloride, sulfate, phosphate, nitrate, sodium, potassium, calcium, magnesium, and silica. Trace elements determined were copper, zinc, molybdenum, lead, iron, manganese, arsenic, cobalt, nickel, and strontium. The longitude and latitude of each sample location and a sample site map are included in the report, as well as a table of the analytical results.
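
    A routine quality check applied to major-ion water analyses like these is the charge-balance error. The sketch below is a generic illustration, not a procedure from the report; the equivalent weights are standard values and the example concentrations are invented.

    ```python
    # Equivalent weights (mg per milliequivalent) for common major ions
    EQ_WT = {"Na": 22.99, "K": 39.10, "Ca": 20.04, "Mg": 12.15,
             "Cl": 35.45, "SO4": 48.03, "HCO3": 61.02}
    CATIONS = {"Na", "K", "Ca", "Mg"}

    def charge_balance_error(mg_per_l):
        # Percent error: 100 * (cations - anions) / (cations + anions),
        # with all concentrations first converted from mg/L to meq/L.
        cat = sum(v / EQ_WT[k] for k, v in mg_per_l.items() if k in CATIONS)
        an = sum(v / EQ_WT[k] for k, v in mg_per_l.items() if k not in CATIONS)
        return 100.0 * (cat - an) / (cat + an)

    # Invented example: 1 meq/L of Na balanced exactly by 1 meq/L of Cl
    print(charge_balance_error({"Na": 22.99, "Cl": 35.45}))  # 0.0
    ```

    Analyses with an error beyond a few percent are typically flagged for re-analysis.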

  19. Validity test of design calculations of a PGNAA setup

    International Nuclear Information System (INIS)

    Naqvi, A.A.; Garwan, M.A.

    2004-01-01

A rectangular moderator has been designed for the prompt gamma ray neutron activation analysis (PGNAA) setup at King Fahd University of Petroleum and Minerals (KFUPM) to analyze Portland cement samples. The design of the moderator assembly was obtained using Monte Carlo calculations. The design calculations for the new rectangular moderator of the KFUPM PGNAA setup have been verified experimentally through prompt gamma ray yield measurements as a function of front moderator thickness. In this study, the yields of the 3.54 and 4.94 MeV prompt gamma rays from silicon in a soil sample were measured as a function of the thickness of the front moderator. The experimental results were compared with the results of the Monte Carlo simulations, and good agreement was achieved between the two. The experimental results have provided useful information about PGNAA setup performance, neutron moderation, and gamma ray attenuation in the PGNAA sample.
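
    A simple way to quantify the experiment-versus-simulation agreement the abstract reports is a reduced chi-square over the measured yields. This is a generic sketch with invented data, not the comparison actually performed in the paper.

    ```python
    import numpy as np

    def reduced_chi_square(measured, simulated, sigma):
        # Reduced chi-square between measured and simulated gamma-ray yields;
        # values near 1 indicate agreement consistent with the uncertainties.
        measured, simulated, sigma = (np.asarray(a, dtype=float)
                                      for a in (measured, simulated, sigma))
        chi2 = np.sum(((measured - simulated) / sigma) ** 2)
        return chi2 / (measured.size - 1)

    # Invented yields (counts) at four front-moderator thicknesses
    meas = [1020.0, 980.0, 905.0, 840.0]
    sim = [1000.0, 990.0, 900.0, 850.0]
    err = [30.0, 30.0, 25.0, 25.0]
    print(reduced_chi_square(meas, sim, err))
    ```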

  20. A comprehensive summary of the ORNL Groundwater Compliance and Surveillance Sampling Results Software System

    International Nuclear Information System (INIS)

    Loffman, R.S.

    1995-01-01

Groundwater compliance and surveillance activities are conducted at ORNL to fulfill federal and state requirements for environmental monitoring. Information management is an important aspect of this work and encompasses many activities, which usually have specific time frames and schedules. In addition to fulfilling these immediate requirements, the results from the monitoring activities are also used to determine the need for environmental remediation. ORNL performs this groundwater results data management and reporting using a group of SAS® applications and tools that provide the ability to track samples, capture field measurements, verify and validate result data, manage data, and report results in a variety of ways and in a timely manner. This paper provides a comprehensive summary of these applications and tools.