WorldWideScience

Sample records for binomial sampling plan

  1. Enumerative and binomial sequential sampling plans for the multicolored Asian lady beetle (Coleoptera: Coccinellidae) in wine grapes.

    Science.gov (United States)

    Galvan, T L; Burkness, E C; Hutchison, W D

    2007-06-01

    To develop a practical integrated pest management (IPM) system for the multicolored Asian lady beetle, Harmonia axyridis (Pallas) (Coleoptera: Coccinellidae), in wine grapes, we assessed the spatial distribution of H. axyridis and developed eight sampling plans to estimate adult density or infestation level in grape clusters. We used 49 data sets collected from commercial vineyards in 2004 and 2005, in Minnesota and Wisconsin. Enumerative plans were developed using two precision levels (0.10 and 0.25); the six binomial plans reflected six unique action thresholds (3, 7, 12, 18, 22, and 31% of cluster samples infested with at least one H. axyridis). The spatial distribution of H. axyridis in wine grapes was aggregated, independent of cultivar and year, but it was more randomly distributed as mean density declined. The average sample number (ASN) for each sampling plan was determined using resampling software. For research purposes, an enumerative plan with a precision level of 0.10 (SE/X) resulted in a mean ASN of 546 clusters. For IPM applications, the enumerative plan with a precision level of 0.25 resulted in a mean ASN of 180 clusters. In contrast, the binomial plans resulted in much lower ASNs and provided high probabilities of arriving at correct "treat or no-treat" decisions, making these plans more efficient for IPM applications. For a tally threshold of one adult per cluster, the operating characteristic curves for the six action thresholds provided binomial sequential sampling plans with mean ASNs of only 19-26 clusters, and probabilities of making correct decisions between 83 and 96%. The benefits of the binomial sampling plans are discussed within the context of improving IPM programs for wine grapes.
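A binomial sequential plan of the kind described here rests on Wald's sequential probability ratio test. As a hedged sketch (not the authors' code; the decision boundaries 0.04 and 0.10 below are illustrative values bracketing a 7% action threshold, not parameters taken from the paper), the SPRT stop lines for presence/absence sampling can be computed as:

```python
from math import log

def sprt_stop_lines(p0, p1, alpha=0.1, beta=0.1):
    """Wald SPRT stop lines for presence/absence (binomial) sampling.

    p0, p1      : lower and upper decision boundaries (proportions infested)
    alpha, beta : nominal error rates
    After n sampling units, stop and treat if the infested count reaches
    slope*n + h1; stop and do not treat if it falls to slope*n - h0;
    otherwise keep sampling.
    """
    g = log(p1 / p0) + log((1 - p0) / (1 - p1))
    slope = log((1 - p0) / (1 - p1)) / g
    h1 = log((1 - beta) / alpha) / g   # intercept of the "treat" line
    h0 = log((1 - alpha) / beta) / g   # intercept of the "no treat" line
    return slope, h0, h1

# Illustrative boundaries bracketing a 7% action threshold
slope, h0, h1 = sprt_stop_lines(0.04, 0.10)
```

Tightening alpha and beta pushes the intercepts h0 and h1 apart and raises the average sample number; the low ASNs reported above (19-26 clusters) reflect decision boundaries that are well separated relative to field densities.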

  2. Comparison and Field Validation of Binomial Sampling Plans for Oligonychus perseae (Acari: Tetranychidae) on Hass Avocado in Southern California.

    Science.gov (United States)

    Lara, Jesus R; Hoddle, Mark S

    2015-08-01

Oligonychus perseae Tuttle, Baker, & Abatiello is a foliar pest of 'Hass' avocados [Persea americana Miller (Lauraceae)]. The recommended action threshold is 50-100 motile mites per leaf, but this count range and other ecological factors associated with O. perseae infestations limit the application of enumerative sampling plans in the field. Consequently, a comprehensive modeling approach was implemented to compare the practical application of various binomial sampling models for decision-making for O. perseae in California. An initial set of sequential binomial sampling models were developed using three mean-proportion modeling techniques (i.e., Taylor's power law, maximum likelihood, and an empirical model) in combination with two leaf-infestation tally thresholds of either one or two mites. Model performance was evaluated using a robust mite count database consisting of >20,000 Hass avocado leaves infested with varying densities of O. perseae and collected from multiple locations. Operating characteristic and average sample number results for sequential binomial models were used as the basis to develop and validate a standardized fixed-size binomial sampling model with guidelines on sample tree and leaf selection within blocks of avocado trees. This final validated model requires a sampling cost of 30 leaves and takes into account the spatial dynamics of O. perseae to make reliable mite density classifications for a 50-mite action threshold. Recommendations for implementing this fixed-size binomial sampling plan to assess densities of O. perseae in commercial California avocado orchards are discussed. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. Forward selection two sample binomial test

    Science.gov (United States)

    Wong, Kam-Fai; Wong, Weng-Kee; Lin, Miao-Shan

    2016-01-01

Fisher’s exact test (FET) is a conditional method that is frequently used to analyze data in a 2 × 2 table for small samples. This test is conservative, and attempts have been made to make it less so. For example, Crans and Shuster (2008) proposed adding more points to the rejection region to make the test more powerful. We provide another way to make the test less conservative by using two independent binomial distributions as the reference distribution for the test statistic. We compare our new test with several existing methods and show that it has advantages in terms of control of the Type I and Type II errors. We reanalyze results from an oncology trial using our proposed method and our software, which is freely available to the reader. PMID:27335577
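The conservativeness being corrected here can be seen by computing the exact size of Fisher's test under the null by full enumeration. A minimal sketch (this assumes the common "sum of tables no more probable than the observed" convention for the two-sided p-value; it is not the authors' proposed test):

```python
from math import comb

def fisher_p(x1, n1, x2, n2):
    """Two-sided Fisher exact p-value for a 2x2 table, conditioning on the
    margin s = x1 + x2 (hypergeometric) and summing the probabilities of
    all tables no more probable than the observed one."""
    s = x1 + x2
    lo, hi = max(0, s - n2), min(n1, s)
    total = comb(n1 + n2, s)
    probs = {k: comb(n1, k) * comb(n2, s - k) / total for k in range(lo, hi + 1)}
    cutoff = probs[x1] * (1 + 1e-9)        # small tolerance for float ties
    return sum(p for p in probs.values() if p <= cutoff)

def exact_size(n1, n2, p, alpha=0.05):
    """True rejection probability of FET under H0: p1 = p2 = p."""
    size = 0.0
    for x1 in range(n1 + 1):
        for x2 in range(n2 + 1):
            if fisher_p(x1, n1, x2, n2) <= alpha:
                size += (comb(n1, x1) * p**x1 * (1 - p)**(n1 - x1)
                         * comb(n2, x2) * p**x2 * (1 - p)**(n2 - x2))
    return size

size = exact_size(10, 10, 0.5)   # strictly below the nominal 0.05
```

The gap between this exact size and the nominal level is the conservativeness that an unconditional two-binomial reference distribution is designed to reduce.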

  4. Implementing reduced-risk integrated pest management in fresh-market cabbage: influence of sampling parameters, and validation of binomial sequential sampling plans for the cabbage looper (Lepidoptera: Noctuidae).

    Science.gov (United States)

    Burkness, Eric C; Hutchison, W D

    2009-10-01

Populations of cabbage looper, Trichoplusia ni (Lepidoptera: Noctuidae), were sampled in experimental plots and commercial fields of cabbage (Brassica spp.) in Minnesota during 1998-1999 as part of a larger effort to implement an integrated pest management program. Using a resampling approach and Wald's sequential probability ratio test, sampling plans with different sampling parameters were evaluated using independent presence/absence and enumerative data. Evaluations and comparisons of the different sampling plans were made based on the operating characteristic and average sample number functions generated for each plan and through the use of a decision probability matrix. Values for upper and lower decision boundaries, sequential error rates (alpha, beta), and tally threshold were modified to determine parameter influence on the operating characteristic and average sample number functions. The following parameters resulted in the most desirable operating characteristic and average sample number functions: action threshold of 0.1 proportion of plants infested, tally threshold of 1, alpha = beta = 0.1, upper boundary of 0.15, lower boundary of 0.05, and resampling with replacement. We found that sampling parameters can be modified and evaluated using resampling software to achieve desirable operating characteristic and average sample number functions. Moreover, management of T. ni by using binomial sequential sampling should provide a good balance between cost and reliability by minimizing sample size and maintaining a high level of correct decisions (>95%) to treat or not treat.
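The resampling evaluation described here can be sketched with a small Monte Carlo loop using the parameters the authors settled on (boundaries 0.05 and 0.15 around the 0.1 action threshold, alpha = beta = 0.1). This is an illustrative reimplementation, not the authors' software:

```python
import random
from math import log

def sprt_lines(p0, p1, alpha, beta):
    # Wald SPRT: treat when count >= slope*n + h1; no-treat when <= slope*n - h0
    g = log(p1 / p0) + log((1 - p0) / (1 - p1))
    slope = log((1 - p0) / (1 - p1)) / g
    return slope, log((1 - alpha) / beta) / g, log((1 - beta) / alpha) / g

def oc_asn(p, p0=0.05, p1=0.15, alpha=0.1, beta=0.1,
           reps=2000, max_n=500, seed=1):
    """Monte Carlo estimate of P(treat decision) and the average sample
    number when the true proportion of infested plants is p."""
    slope, h0, h1 = sprt_lines(p0, p1, alpha, beta)
    rng = random.Random(seed)
    treats = samples = 0
    for _ in range(reps):
        count = 0
        for n in range(1, max_n + 1):
            count += rng.random() < p      # one presence/absence sample
            if count >= slope * n + h1:    # cross upper line: treat
                treats += 1
                break
            if count <= slope * n - h0:    # cross lower line: no treat
                break
        samples += n                       # n samples used this replicate
    return treats / reps, samples / reps

p_treat_low, asn_low = oc_asn(0.05)    # at the lower boundary
p_treat_high, asn_high = oc_asn(0.15)  # at the upper boundary
```

Sweeping p from 0 to 0.3 traces out the operating characteristic and average sample number curves that the paper uses to compare candidate parameter sets.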

  5. Sample size calculation for comparing two negative binomial rates.

    Science.gov (United States)

    Zhu, Haiyuan; Lakkis, Hassan

    2014-02-10

The negative binomial model has been increasingly used to model count data in recent clinical trials, and it is frequently chosen over the Poisson model for the overdispersed count data commonly seen in such trials. One of the challenges of applying the negative binomial model in clinical trial design is sample size estimation; in practice, simulation methods have frequently been used for this purpose. In this paper, an explicit formula is developed to calculate sample size based on the negative binomial model. Depending on the approach used to estimate the variance under the null hypothesis, three variations of the sample size formula are proposed and discussed. Important characteristics of the formula include its accuracy and its ability to explicitly incorporate the dispersion parameter and exposure time. The performance of each variation of the formula is assessed using simulations. Copyright © 2013 John Wiley & Sons, Ltd.
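One variation of such a formula can be sketched as follows (hedged: this evaluates the variance of the log rate ratio under the alternative for a two-sided Wald test on log rates, and is not necessarily the exact variant proposed in the paper), with dispersion kappa in Var = mu + kappa*mu^2 and a common exposure time t:

```python
from math import ceil, log
from statistics import NormalDist

def nb_two_rate_n(r0, r1, kappa, t=1.0, alpha=0.05, power=0.9):
    """Per-group sample size to detect rate ratio r1/r0 with a two-sided
    Wald test on log rates, negative binomial counts with dispersion
    kappa (Var = mu + kappa*mu**2), and common exposure time t.
    Variance is evaluated under the alternative."""
    z = NormalDist().inv_cdf
    za, zb = z(1 - alpha / 2), z(power)
    # Var of log(rate ratio) per subject pair: 1/(t*r0) + 1/(t*r1) + 2*kappa
    var = 1 / (r0 * t) + 1 / (r1 * t) + 2 * kappa
    return ceil((za + zb) ** 2 * var / log(r1 / r0) ** 2)
```

Setting kappa = 0 recovers the Poisson case, so the formula makes the variance inflation caused by overdispersion explicit.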

  6. Binomial and enumerative sampling of Tetranychus urticae (Acari: Tetranychidae) on peppermint in California.

    Science.gov (United States)

    Tollerup, Kris E; Marcum, Daniel; Wilson, Rob; Godfrey, Larry

    2013-08-01

The two-spotted spider mite, Tetranychus urticae Koch, is an economic pest on peppermint [Mentha x piperita (L.), 'Black Mitcham'] grown in California. A sampling plan for T. urticae was developed under Pacific Northwest conditions in the early 1980s and has been used by California growers since approximately 1998. This sampling plan, however, is cumbersome and a poor predictor of T. urticae densities in California. Between June and August, the numbers of immature and adult T. urticae were counted on leaves at three commercial peppermint fields (sites) in 2010 and a single field in 2011. In each of seven locations per site, 45 leaves were sampled, that is, nine leaves from each of five stems. Leaf samples were stratified by collecting three leaves each from the top, middle, and bottom strata per stem. The on-plant distribution of T. urticae did not differ significantly among the stem strata through the growing season. Binomial and enumerative sampling plans were developed using generic Taylor's power law coefficient values. The best fit of our data for binomial sampling occurred using a tally threshold of T = 0. The optimum number of leaves required for T. urticae at the critical density of five mites per leaf was 20 for the binomial and 23 for the enumerative sampling plan, respectively. Sampling models were validated using Resampling for Validation of Sampling Plan software.
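The link between Taylor's power law, the negative binomial aggregation parameter, and a tally-threshold T = 0 binomial plan can be sketched as follows (the coefficients a = 2.0 and b = 1.4 are purely illustrative, not the generic values used in the paper):

```python
from math import ceil

def nb_k_from_tpl(m, a, b):
    # Negative binomial k implied by Taylor's power law s^2 = a*m^b
    # (valid only where a*m^b > m, i.e., the counts are aggregated)
    return m * m / (a * m**b - m)

def prop_infested(m, a, b):
    # Expected proportion of leaves with >= 1 mite (tally threshold T = 0)
    k = nb_k_from_tpl(m, a, b)
    return 1.0 - (1.0 + m / k) ** (-k)

def enumerative_n(m, a, b, D=0.25):
    # Leaves needed for fixed relative precision D (Karandinos-style),
    # using the TPL variance at mean density m
    return ceil(a * m ** (b - 2) / D**2)

p5 = prop_infested(5, 2.0, 1.4)          # at the 5-mites-per-leaf threshold
n5 = enumerative_n(5, 2.0, 1.4, D=0.25)
```

At the five-mites-per-leaf critical density this maps mean density onto the proportion of infested leaves, which is the quantity a T = 0 binomial plan actually classifies.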

  7. Binomial Distribution Sample Confidence Intervals Estimation 1. Sampling and Medical Key Parameters Calculation

    Directory of Open Access Journals (Sweden)

    Tudor DRUGAN

    2003-08-01

Full Text Available The aim of the paper was to present the usefulness of the binomial distribution in studying contingency tables, and the problems of approximating the binomial distribution to normality (its limits, advantages, and disadvantages). Classifying the medical key parameters reported in the medical literature and expressing them in contingency-table units based on their mathematical expressions reduces the discussion of confidence intervals from 34 parameters to 9 mathematical expressions. The problem of obtaining different information from the computed confidence interval for a specified method (confidence interval boundaries, percentages of the experimental errors, the standard deviation of the experimental errors, and the deviation relative to the significance level) was solved by implementing original algorithms in the PHP programming language. Expressions containing two binomial variables were treated separately, and an original method of computing the confidence interval for two-variable expressions was proposed and implemented. Graphically representing an expression of two binomial variables in which the variation domain of one variable depends on the other was a real problem, because most software uses interpolation in graphical representation and produces quadratic rather than triangular surface maps. Based on an original algorithm, a PHP module was implemented to draw triangular surface plots. All of the implementations described above were used to compute confidence intervals and to estimate their performance for a range of binomial sample sizes and variables.

  8. Reliability of environmental sampling culture results using the negative binomial intraclass correlation coefficient.

    Science.gov (United States)

    Aly, Sharif S; Zhao, Jianyang; Li, Ben; Jiang, Jiming

    2014-01-01

The intraclass correlation coefficient (ICC) is commonly used to estimate the similarity between quantitative measures obtained from different sources. Overdispersed data are traditionally transformed so that a linear mixed model (LMM)-based ICC can be estimated; a common transformation is the natural logarithm. The reliability of environmental sampling of fecal slurry on freestall pens has been estimated for Mycobacterium avium subsp. paratuberculosis using natural-logarithm-transformed culture results. Recently, the negative binomial ICC was defined based on a generalized linear mixed model for negative binomial distributed data. The current study reports a negative binomial ICC estimate that includes fixed effects, using culture results of environmental samples. Simulations using a wide variety of inputs and negative binomial distribution parameters (r; p) showed better performance of the new negative binomial ICC compared to the LMM-based ICC, even when the negative binomial data were logarithm- or square-root-transformed. A second comparison targeting a wider range of ICC values showed that the mean of the estimated ICC closely approximated the true ICC.

  9. Sample size determination for a three-arm equivalence trial of Poisson and negative binomial responses.

    Science.gov (United States)

    Chang, Yu-Wei; Tsong, Yi; Zhao, Zhigen

    2017-01-01

Assessing equivalence or similarity has drawn much attention recently as many drug products have lost or will lose their patents in the next few years, especially certain best-selling biologics. To claim equivalence between the test treatment and the reference treatment when assay sensitivity is well established from historical data, one has to demonstrate both superiority of the test treatment over placebo and equivalence between the test treatment and the reference treatment. Thus, there is urgency for practitioners to derive a practical way to calculate sample size for a three-arm equivalence trial. The primary endpoints of a clinical trial may not always be continuous, but may be discrete. In this paper, the authors derive the power function and discuss the sample size requirement for a three-arm equivalence trial with Poisson and negative binomial clinical endpoints. In addition, the authors examine the effect of the dispersion parameter on the power and the sample size by varying its coefficient from small to large. In extensive numerical studies, the authors demonstrate that the required sample size depends heavily on the dispersion parameter. Therefore, misusing a Poisson model for negative binomial data can easily lose up to 20% power, depending on the value of the dispersion parameter.

  10. Binomial Distribution Sample Confidence Intervals Estimation 7. Absolute Risk Reduction and ARR-like Expressions

    Directory of Open Access Journals (Sweden)

    Andrei ACHIMAŞ CADARIU

    2004-08-01

Full Text Available Assessing a controlled clinical trial requires interpreting key parameters such as the control event rate, experimental event rate, relative risk, absolute risk reduction, relative risk reduction, and number needed to treat when the effect of the treatment is a dichotomous variable. Defined as the difference in event rate between the treatment and control groups, the absolute risk reduction is the parameter from which the number needed to treat is computed. The absolute risk reduction is computed when the experimental treatment reduces the risk of an undesirable outcome/event. In the medical literature, when the absolute risk reduction is reported with its confidence interval, the asymptotic method is used, even though it is well known that it may be inadequate. The aim of this paper is to introduce and assess nine methods of computing confidence intervals for the absolute risk reduction and absolute risk reduction-like functions. Computer implementations of the methods use the PHP language. The methods are compared using the experimental errors, the standard deviations, and the deviation relative to the imposed significance level for specified sample sizes. Six methods of computing confidence intervals for the absolute risk reduction and absolute risk reduction-like functions were assessed using random binomial variables and random sample sizes. The experiments show that the ADAC and ADAC1 methods achieve the best overall performance in computing confidence intervals for the absolute risk reduction.

  11. IAEA Sampling Plan

    Energy Technology Data Exchange (ETDEWEB)

    Geist, William H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-15

    The objectives for this presentation are to describe the method that the IAEA uses to determine a sampling plan for nuclear material measurements; describe the terms detection probability and significant quantity; list the three nuclear materials measurement types; describe the sampling method applied to an item facility; and describe multiple method sampling.

  12. Chain binomial models and binomial autoregressive processes.

    Science.gov (United States)

    Weiss, Christian H; Pollett, Philip K

    2012-09-01

We establish a connection between a class of chain-binomial models of use in ecology and epidemiology and binomial autoregressive (AR) processes. New results are obtained for the latter, including expressions for the lag-conditional distribution and related quantities. We focus on two types of chain-binomial model, extinction-colonization and colonization-extinction models, and present two approaches to parameter estimation. The asymptotic distributions of the resulting estimators are studied, as well as their finite-sample performance, and we give an application to real data. A connection is made with standard AR models, which also has implications for parameter estimation. © 2011, The International Biometric Society.
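A binomial AR(1) process of the kind connected to chain-binomial models here can be simulated with two binomial thinnings. This is a sketch of the standard construction (parameter names are ours, not the paper's notation):

```python
import random

def binomial_ar1(n, alpha, beta, steps, seed=0):
    """Simulate a binomial AR(1) process
        X_t = alpha o X_{t-1} + beta o (n - X_{t-1}),
    where 'o' denotes binomial thinning. The stationary marginal is
    Binomial(n, pi) with pi = beta / (1 - alpha + beta), and the
    lag-1 autocorrelation is alpha - beta."""
    rng = random.Random(seed)

    def thin(count, p):
        # binomial thinning: keep each of `count` units with probability p
        return sum(rng.random() < p for _ in range(count))

    pi = beta / (1 - alpha + beta)
    x = round(n * pi)                  # start near the stationary mean
    path = []
    for _ in range(steps):
        x = thin(x, alpha) + thin(n - x, beta)
        path.append(x)
    return path

path = binomial_ar1(n=20, alpha=0.6, beta=0.2, steps=2000)
```

With alpha = 0.6 and beta = 0.2 the stationary mean is n * beta / (1 - alpha + beta) = 20/3, which a long simulated path should hover around.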

  13. Statistical inference involving binomial and negative binomial parameters.

    Science.gov (United States)

    García-Pérez, Miguel A; Núñez-Antón, Vicente

    2009-05-01

    Statistical inference about two binomial parameters implies that they are both estimated by binomial sampling. There are occasions in which one aims at testing the equality of two binomial parameters before and after the occurrence of the first success along a sequence of Bernoulli trials. In these cases, the binomial parameter before the first success is estimated by negative binomial sampling whereas that after the first success is estimated by binomial sampling, and both estimates are related. This paper derives statistical tools to test two hypotheses, namely, that both binomial parameters equal some specified value and that both parameters are equal though unknown. Simulation studies are used to show that in small samples both tests are accurate in keeping the nominal Type-I error rates, and also to determine sample size requirements to detect large, medium, and small effects with adequate power. Additional simulations also show that the tests are sufficiently robust to certain violations of their assumptions.

  14. Sample Size Estimation for Negative Binomial Regression Comparing Rates of Recurrent Events with Unequal Follow-Up Time.

    Science.gov (United States)

    Tang, Yongqiang

    2015-01-01

A sample size formula is derived for negative binomial regression for the analysis of recurrent events, in which subjects can have unequal follow-up time. We obtain sharp lower and upper bounds on the required size, which are easy to compute. The upper bound is generally only slightly larger than the required size, and hence can be used to approximate the sample size. The lower and upper size bounds can be decomposed into two terms. The first term relies on the mean number of events in each group, and the second term depends on two factors that measure, respectively, the extent of between-subject variability in event rates and in follow-up time. Simulation studies are conducted to assess the performance of the proposed method. An application of our formulae to a multiple sclerosis trial is provided.

  15. Sample size for comparing negative binomial rates in noninferiority and equivalence trials with unequal follow-up times.

    Science.gov (United States)

    Tang, Yongqiang

    2017-05-25

    We derive the sample size formulae for comparing two negative binomial rates based on both the relative and absolute rate difference metrics in noninferiority and equivalence trials with unequal follow-up times, and establish an approximate relationship between the sample sizes required for the treatment comparison based on the two treatment effect metrics. The proposed method allows the dispersion parameter to vary by treatment groups. The accuracy of these methods is assessed by simulations. It is demonstrated that ignoring the between-subject variation in the follow-up time by setting the follow-up time for all individuals to be the mean follow-up time may greatly underestimate the required size, resulting in underpowered studies. Methods are provided for back-calculating the dispersion parameter based on the published summary results.

  16. Bipartite binomial heaps

    DEFF Research Database (Denmark)

    Elmasry, Amr; Jensen, Claus; Katajainen, Jyrki

    2017-01-01

    the (total) number of elements stored in the data structure(s) prior to the operation. As the resulting data structure consists of two components that are different variants of binomial heaps, we call it a bipartite binomial heap. Compared to its counterpart, a multipartite binomial heap, the new structure...

  17. QNB: differential RNA methylation analysis for count-based small-sample sequencing data with a quad-negative binomial model.

    Science.gov (United States)

    Liu, Lian; Zhang, Shao-Wu; Huang, Yufei; Meng, Jia

    2017-08-31

As a newly emerged research area, RNA epigenetics has drawn increasing attention recently for the participation of RNA methylation and other modifications in a number of crucial biological processes. Thanks to high-throughput sequencing techniques such as MeRIP-Seq, transcriptome-wide RNA methylation profiles are now available in the form of count-based data, with which it is often of interest to study the dynamics at the epitranscriptomic layer. However, the sample size of an RNA methylation experiment is usually very small due to its cost; additionally, there usually exist a large number of genes whose methylation level cannot be accurately estimated due to their low expression level, making differential RNA methylation analysis a difficult task. We present QNB, a statistical approach for differential RNA methylation analysis with count-based small-sample sequencing data. Compared with previous approaches such as the DRME model, which is based on a statistical test covering the IP samples only with two negative binomial distributions, QNB is based on four independent negative binomial distributions with their variances and means linked by local regressions, so that the input control samples are also properly taken care of. In addition, unlike the DRME approach, which relies on the input control samples alone for estimating the background, QNB uses a more robust estimator of gene expression that combines information from both input and IP samples, which can largely improve the testing performance for very lowly expressed genes. QNB showed improved performance on both simulated and real MeRIP-Seq datasets when compared with competing algorithms. The QNB model is also applicable to other datasets related to RNA modifications, including but not limited to RNA bisulfite sequencing, m1A-Seq, PAR-CLIP, RIP-Seq, etc.

  18. CUMBIN - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

The cumulative binomial program, CUMBIN, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), can be used independently of one another. CUMBIN can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CUMBIN calculates the probability that a system of n components has at least k operating if the probability that any one is operating is p and the components are independent. Equivalently, this is the reliability of a k-out-of-n system having independent components with common reliability p. CUMBIN can evaluate the incomplete beta distribution for two positive integer arguments. CUMBIN can also evaluate the cumulative F distribution and the negative binomial distribution, and can determine the sample size in a test design. CUMBIN is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. The CUMBIN program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMBIN was developed in 1988.
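The core quantity CUMBIN computes, the reliability of a k-out-of-n system of independent components, is a straightforward cumulative binomial sum. A minimal sketch (not the original C program):

```python
from math import comb

def k_out_of_n(k, n, p):
    """Reliability of a k-out-of-n system: the probability that at least
    k of n independent components, each working with probability p,
    are operating."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

r = k_out_of_n(3, 5, 0.9)   # 3-out-of-5 system with component reliability 0.9
```

For large n a numerically stabler route is the regularized incomplete beta function, which is the equivalence the abstract alludes to.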

  19. Special nuclear material inventory sampling plans

    International Nuclear Information System (INIS)

    Vaccaro, H.; Goldman, A.

    1987-01-01

Since their introduction in 1942, sampling inspection procedures have been common quality assurance practice. The U.S. Department of Energy (DOE) supports such sampling of special nuclear materials inventories. DOE Order 5630.7 states that Operations Offices "may develop and use statistically valid sampling plans appropriate for their site-specific needs." The benefits for nuclear facility operations include reduced worker exposure and reduced work load. Improved procedures have been developed for obtaining statistically valid sampling plans that maximize these benefits. The double sampling concept is described, and the resulting sample sizes for double sampling plans are compared with those of other plans. An algorithm is given for finding optimal double sampling plans that assist in choosing the appropriate detection and false alarm probabilities for various sampling plans.

  20. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    Science.gov (United States)

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...

  1. Sample Lesson Plans. Management for Effective Teaching.

    Science.gov (United States)

    Fairfax County Public Schools, VA. Dept. of Instructional Services.

This guide is part of the Management for Effective Teaching (MET) support kit, a pilot project developed by the Fairfax County (Virginia) Public Schools to assist elementary school teachers in planning, managing, and implementing the county's curriculum, Program of Studies (POS). In this guide, a sample lesson plan of a teaching-learning activity…

  2. Nitrate Waste Treatment Sampling and Analysis Plan

    Energy Technology Data Exchange (ETDEWEB)

    Vigil-Holterman, Luciana R. [Los Alamos National Laboratory; Martinez, Patrick Thomas [Los Alamos National Laboratory; Garcia, Terrence Kerwin [Los Alamos National Laboratory

    2017-07-05

    This plan is designed to outline the collection and analysis of nitrate salt-bearing waste samples required by the New Mexico Environment Department- Hazardous Waste Bureau in the Los Alamos National Laboratory (LANL) Hazardous Waste Facility Permit (Permit).

  3. Feasible sampling plan for Bemisia tabaci control decision-making in watermelon fields.

    Science.gov (United States)

    Lima, Carlos Ho; Sarmento, Renato A; Pereira, Poliana S; Galdino, Tarcísio Vs; Santos, Fábio A; Silva, Joedna; Picanço, Marcelo C

    2017-11-01

The silverleaf whitefly Bemisia tabaci is one of the most important pests of watermelon fields worldwide. Conventional sampling plans are the starting point for the generation of decision-making systems in integrated pest management programs. The aim of this study was to determine a conventional sampling plan for B. tabaci in watermelon fields. The optimal leaf for sampling B. tabaci adults was the 6th most apical leaf, and direct counting was the best sampling technique. Crop pest densities fitted the negative binomial distribution and had a common aggregation parameter (common K). The sampling plan consists of evaluating 103 samples per plot; sampling took 56 min, cost US$2.22 per sampling, and had a 10% maximum evaluation error. The sampling plan determined in this study can be adopted by farmers because it enables adequate evaluation of B. tabaci populations in watermelon fields (10% maximum evaluation error) and is a low-cost (US$2.22 per sampling), fast (56 min per sampling), and feasible (it may be used in a standardized way throughout the crop cycle) technique. © 2017 Society of Chemical Industry.
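The sample-size arithmetic behind a plan like this, fixed-precision estimation of a negative binomial mean with aggregation parameter k, follows a standard Karandinos-style formula. A hedged sketch (the mean density and k below are illustrative values, not the field densities or common K from the paper):

```python
from math import ceil

def nb_sample_size(mean, k, D=0.10):
    """Samples needed to estimate the mean of a negative binomial count
    with aggregation parameter k to a target precision D, where D is the
    ratio SE/mean (Karandinos-style): n = (1/mean + 1/k) / D**2."""
    return ceil((1 / mean + 1 / k) / D**2)

n25 = nb_sample_size(mean=5.0, k=0.3143, D=0.25)  # research-grade precision
n10 = nb_sample_size(mean=5.0, k=0.3143, D=0.10)  # tighter precision
```

Because 1/k dominates 1/mean for strongly aggregated pests (small k), the aggregation parameter, not the mean density, usually drives the required sample size.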

  4. Sampling Plans for the Thrips Frankliniella schultzei (Thysanoptera: Thripidae) in Three Lettuce Varieties.

    Science.gov (United States)

    Silva, Alisson R; Rodrigues-Silva, Nilson; Pereira, Poliana S; Sarmento, Renato A; Costa, Thiago L; Galdino, Tarcísio V S; Picanço, Marcelo C

    2017-12-05

The common blossom thrips, Frankliniella schultzei Trybom (Thysanoptera: Thripidae), is an important lettuce pest worldwide. Conventional sampling plans are the first step in implementing decision-making systems into integrated pest management programs. However, this tool is not available for F. schultzei infesting lettuce crops. Thus, the objective of this work was to develop a conventional sampling plan for F. schultzei in lettuce crops. Two sampling techniques (direct counting and leaf beating on a white plastic tray) were compared in crisphead, looseleaf, and Boston lettuce varieties before and during head formation. The frequency distributions of F. schultzei densities in lettuce crops were assessed, and the number of samples required to compose the sampling plan was determined. Leaf beating on a white plastic tray was the best sampling technique. F. schultzei densities obtained with this technique were fitted to the negative binomial distribution with a common aggregation parameter (common K = 0.3143). The developed sampling plan comprises 91 samples per field and presents low estimation error (at most 20%), fast execution time (at most 47 min), and low cost (at most US$1.67 per sampling area). This sampling plan can be used as a tool for integrated pest management in lettuce crops, assisting with reliable decision making in different lettuce varieties before and during head formation. © The Author(s) 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. Negative binomial multiplicity distribution from binomial cluster production

    International Nuclear Information System (INIS)

    Iso, C.; Mori, K.

    1990-01-01

A two-step interpretation of the negative binomial multiplicity distribution as a compound of binomial cluster production and a negative-binomial-like cluster decay distribution is proposed. In this model we can expect the average multiplicity for cluster production to increase with increasing energy, in contrast to a compound Poisson-logarithmic distribution. (orig.)

  6. Binomial collisions and near collisions

    OpenAIRE

    Blokhuis, Aart; Brouwer, Andries; de Weger, Benne

    2017-01-01

    We describe efficient algorithms to search for cases in which binomial coefficients are equal or almost equal, give a conjecturally complete list of all cases where two binomial coefficients differ by 1, and give some identities for binomial coefficients that seem to be new.
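The authors' algorithms are far more efficient than brute force, but the kind of coincidence they search for can be sketched by tabulating coefficients over a small range (the search bounds and the symmetry restriction 2 <= k <= n/2 are illustrative choices, not theirs):

```python
from math import comb
from collections import defaultdict

# Brute-force search for coincidences C(n, k) = C(m, j), restricting to
# 2 <= k <= n/2 so the trivial symmetry C(n, k) = C(n, n-k) is skipped.
hits = defaultdict(list)
for n in range(4, 100):
    for k in range(2, n // 2 + 1):
        hits[comb(n, k)].append((n, k))

collisions = {value: pairs for value, pairs in hits.items() if len(pairs) > 1}
print(collisions[3003])  # [(14, 6), (15, 5), (78, 2)]
```

The value 3003 is the classic example: it is the smallest number known to appear in Pascal's triangle eight times (counting symmetric positions).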

  7. Hanford site transuranic waste sampling plan

    International Nuclear Information System (INIS)

    GREAGER, T.M.

    1999-01-01

    This sampling plan (SP) describes the selection of containers for sampling of homogeneous solids and soil/gravel and for visual examination of transuranic and mixed transuranic (collectively referred to as TRU) waste generated at the U.S. Department of Energy (DOE) Hanford Site. The activities described in this SP will be conducted under the Hanford Site TRU Waste Certification Program. This SP is designed to meet the requirements of the Transuranic Waste Characterization Quality Assurance Program Plan (CAO-94-1010) (DOE 1996a) (QAPP), site-specific implementation of which is described in the Hanford Site Transuranic Waste Characterization Program Quality Assurance Project Plan (HNF-2599) (Hanford 1998b) (QAPP). The QAPP defines the quality assurance (QA) requirements and protocols for TRU waste characterization activities at the Hanford Site. In addition, the QAPP identifies responsible organizations, describes required program activities, outlines sampling and analysis strategies, and identifies procedures for characterization activities. The QAPP identifies specific requirements for TRU waste sampling plans. Table 1-1 presents these requirements and indicates sections in this SP where these requirements are addressed

  8. Integrating Public Perspectives in Sample Return Planning

    Science.gov (United States)

    Race, Margaret S.; MacGregor, G.

    2001-01-01

    Planning for extraterrestrial sample returns, whether from Mars or other solar system bodies, must be done in a way that integrates planetary protection concerns with the usual mission technical and scientific considerations. Understanding and addressing legitimate societal concerns about the possible risks of sample return will be a critical part of the public decision making process ahead. This paper presents the results of two studies, one with lay audiences, the other with expert microbiologists, designed to gather information on attitudes and concerns about sample return risks and planetary protection. Focus group interviews with lay subjects, using generic information about Mars sample return and a preliminary environmental impact assessment, were designed to obtain an indication of how the factual content is perceived and understood by the public. A research survey of microbiologists gathered information on experts' views and attitudes about sample return, risk management approaches, and space exploration risks. These findings, combined with earlier research results on risk perception, will be useful in identifying levels of concern and potential conflicts in understanding between experts and the public about sample return risks. The information will be helpful in guiding development of the environmental impact statement and also has applicability to proposals for sample return from other solar system bodies where scientific uncertainty about extraterrestrial life may persist at the time of mission planning.

  9. Hydrogeologic investigations sampling plan: Revision 0

    International Nuclear Information System (INIS)

    1988-11-01

    The goal of this sampling plan is to identify and develop specific plans for those investigative actions necessary to: (1) characterize the hydrologic regime; (2) define the extent and impact of contamination; and (3) predict future contaminant migration for the Weldon Spring Site (WSS) and vicinity. The plan is part of the Weldon Spring Site Remedial Action Project (WSSRAP) sponsored by the US Department of Energy (DOE) and has been developed in accordance with US EPA Remedial Investigation (RI) and Data Quality Objective (DQO) guidelines. The plan consists of a sequence of activities including the evaluation of data, development of a conceptual model, identification of data uses and needs, and the design and implementation of a data collection program. Data will be obtained to: (1) confirm the presence or absence of contaminants; (2) define contaminant sources and modes of transport; (3) delineate extent of contaminant migration and predict future migration; and (4) provide information to support the evaluation and selection of remedial actions. 81 refs., 62 figs., 26 tabs

  10. Adaptive estimation of binomial probabilities under misclassification

    NARCIS (Netherlands)

    Albers, Willem/Wim; Veldman, H.J.

    1984-01-01

    If misclassification occurs the standard binomial estimator is usually seriously biased. It is known that an improvement can be achieved by using more than one observer in classifying the sample elements. Here it will be investigated which number of observers is optimal given the total number of

  11. Adaptive bayesian analysis for binomial proportions

    CSIR Research Space (South Africa)

    Das, Sonali

    2008-10-01

    The authors consider the problem of statistical inference of binomial proportions for non-matched, correlated samples, under the Bayesian framework. Such inference can arise when the same group is observed at a different number of times with the aim...

  12. CROSSER - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, CROSSER, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), can be used independently of one another. CROSSER can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CROSSER calculates the point at which the reliability of a k-out-of-n system equals the common reliability of the n components. It is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The CROSSER program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CROSSER was developed in 1988.
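CROSSER's source is not reproduced in this record, but the computation it describes, the point at which a k-out-of-n system's reliability equals the common component reliability, can be sketched with Newton's method (function names and the starting guess are illustrative):

```python
from math import comb

def k_of_n_reliability(p, k, n):
    """Reliability of a k-out-of-n system of independent components of reliability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def crossing_point(k, n, tol=1e-12):
    """Newton's method for the interior p where system reliability equals p."""
    p = 0.5  # interior starting guess
    for _ in range(100):
        f = k_of_n_reliability(p, k, n) - p
        # closed-form derivative of the k-out-of-n reliability, minus 1
        df = n * comb(n - 1, k - 1) * p**(k - 1) * (1 - p)**(n - k) - 1.0
        step = f / df
        p -= step
        if abs(step) < tol:
            break
    return p

print(crossing_point(2, 3))  # 0.5: a 2-out-of-3 system crosses at p = 1/2
```

The 2-out-of-3 case can be checked by hand: 3p^2 - 2p^3 = p has the interior root p = 1/2.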

  13. Visual Sample Plan (VSP) - FIELDS Integration

    Energy Technology Data Exchange (ETDEWEB)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Hassig, Nancy L.; Carlson, Deborah K.; Bing-Canar, John; Cooper, Brian; Roth, Chuck

    2003-04-19

    Two software packages, VSP 2.1 and FIELDS 3.5, are being used by environmental scientists to plan the number and type of samples required to meet project objectives, display those samples on maps, query a database of past sample results, produce spatial models of the data, and analyze the data in order to arrive at defensible decisions. VSP 2.0 is an interactive tool to calculate optimal sample size and optimal sample location based on user goals, risk tolerance, and variability in the environment and in lab methods. FIELDS 3.0 is a set of tools to explore the sample results in a variety of ways to make defensible decisions with quantified levels of risk and uncertainty. However, FIELDS 3.0 has a small sample design module. VSP 2.0, on the other hand, has over 20 sampling goals, allowing the user to input site-specific assumptions such as non-normality of sample results, separate variability between field and laboratory measurements, make two-sample comparisons, perform confidence interval estimation, use sequential search sampling methods, and much more. Over 1,000 copies of VSP are in use today. FIELDS is used in nine of the ten U.S. EPA regions, by state regulatory agencies, and most recently by several international countries. Both software packages have been peer-reviewed, enjoy broad usage, and have been accepted by regulatory agencies as well as site project managers as key tools to help collect data and make environmental cleanup decisions. Recently, the two software packages were integrated, allowing the user to take advantage of the many design options of VSP, and the analysis and modeling options of FIELDS. The transition between the two is simple for the user – VSP can be called from within FIELDS, automatically passing a map to VSP and automatically retrieving sample locations and design information when the user returns to FIELDS. This paper will describe the integration, give a demonstration of the integrated package, and give users download

  14. Distinguishing between Binomial, Hypergeometric and Negative Binomial Distributions

    Science.gov (United States)

    Wroughton, Jacqueline; Cole, Tarah

    2013-01-01

    Recognizing the differences between three discrete distributions (Binomial, Hypergeometric and Negative Binomial) can be challenging for students. We present an activity designed to help students differentiate among these distributions. In addition, we present assessment results in the form of pre- and post-tests that were designed to assess the…
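The distinction the activity targets becomes concrete when the three probability mass functions are written out side by side; a stdlib-only sketch (parameterizations follow the common textbook conventions, which the activity itself may state differently):

```python
from math import comb

# binomial: k successes in n draws WITH replacement, success probability p
def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# hypergeometric: k successes in n draws WITHOUT replacement
# from a population of N items containing K successes
def hypergeom_pmf(k, n, K, N):
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# negative binomial: k failures observed before the r-th success
def negbinom_pmf(k, r, p):
    return comb(k + r - 1, k) * p**r * (1 - p)**k

# With a large population, sampling without replacement barely differs
# from sampling with replacement, so the two pmfs nearly coincide:
print(round(binom_pmf(3, 10, 0.3), 4))
print(round(hypergeom_pmf(3, 10, 300, 1000), 4))
```

The key teaching point: the binomial fixes the number of trials, the hypergeometric removes replacement, and the negative binomial fixes the number of successes instead of trials.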

  15. Transuranic waste characterization sampling and analysis plan

    International Nuclear Information System (INIS)

    1994-01-01

    Los Alamos National Laboratory (the Laboratory) is located approximately 25 miles northwest of Santa Fe, New Mexico, situated on the Pajarito Plateau. Technical Area 54 (TA-54), one of the Laboratory's many technical areas, is a radioactive and hazardous waste management and disposal area located within the Laboratory's boundaries. The purpose of this transuranic waste characterization, sampling, and analysis plan (CSAP) is to provide a methodology for identifying, characterizing, and sampling approximately 25,000 containers of transuranic waste stored at Pads 1, 2, and 4, Dome 48, and the Fiberglass Reinforced Plywood Box Dome at TA-54, Area G, of the Laboratory. Transuranic waste currently stored at Area G was generated primarily from research and development activities, processing and recovery operations, and decontamination and decommissioning projects. This document was created to facilitate compliance with several regulatory requirements and program drivers that are relevant to waste management at the Laboratory, including concerns of the New Mexico Environment Department

  16. NEWTONP - CUMULATIVE BINOMIAL PROGRAMS

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative binomial program, NEWTONP, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), can be used independently of one another. NEWTONP can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. NEWTONP calculates the probability p required to yield a given system reliability V for a k-out-of-n system. It can also be used to determine the Clopper-Pearson confidence limits (either one-sided or two-sided) for the parameter p of a Bernoulli distribution. NEWTONP can determine Bayesian probability limits for a proportion (if the beta prior has positive integer parameters). It can determine the percentiles of incomplete beta distributions with positive integer parameters. It can also determine the percentiles of F distributions and the median plotting positions in probability plotting. NEWTONP is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. NEWTONP is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. It also lists the number of iterations of Newton's method required to calculate the answer within the given error. The NEWTONP program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. NEWTONP was developed in 1988.
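NEWTONP itself uses Newton's method; the Clopper-Pearson limits it computes can be sketched more simply with interval halving on the binomial tail probabilities (this is an illustrative reimplementation, not the program's code):

```python
from math import comb

def binom_tail_ge(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def solve(f, lo, hi, iters=200):
    """Interval halving; requires f(lo) < 0 < f(hi) (lo/hi need not be ordered)."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) two-sided limits for p after k successes in n trials."""
    lower = 0.0 if k == 0 else solve(
        lambda p: binom_tail_ge(k, n, p) - alpha / 2, 0.0, 1.0)
    upper = 1.0 if k == n else solve(
        lambda p: (1 - binom_tail_ge(k + 1, n, p)) - alpha / 2, 1.0, 0.0)
    return lower, upper

print(clopper_pearson(5, 10))  # ≈ (0.187, 0.813)
```

The lower limit solves P(X >= k | p) = alpha/2 and the upper solves P(X <= k | p) = alpha/2, which is exactly the incomplete-beta inversion the abstract alludes to.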

  17. Hanford Sampling Quality Management Plan (HSQMP)

    International Nuclear Information System (INIS)

    Hyatt, J.E.

    1995-01-01

    This document provides a management tool for evaluating and designing the appropriate elements of a field sampling program. This document provides discussion of the elements of a program and is to be used as a guidance document during the preparation of project and/or function specific documentation. This document does not specify how a sampling program shall be organized. The HSQMP is to be used as a companion document to the Hanford Analytical Services Quality Assurance Plan (HASQAP) DOE/RL-94-55. The generation of this document was enhanced by conducting baseline evaluations of current sampling organizations. Valuable input was received from members of field and Quality Assurance organizations. The HSQMP is expected to be a living document. Revisions will be made as regulations and/or Hanford Site conditions warrant changes in the best management practices. Appendices included are: summary of the sampling and analysis work flow process, a user's guide to the Data Quality Objective process, and a self-assessment checklist

  18. Binomial Rings: Axiomatisation, Transfer and Classification

    OpenAIRE

    Xantcha, Qimh Richey

    2011-01-01

    Hall's binomial rings, rings with binomial coefficients, are given an axiomatisation and proved identical to the numerical rings studied by Ekedahl. The Binomial Transfer Principle is established, enabling combinatorial proofs of algebraical identities. The finitely generated binomial rings are completely classified. An application to modules over binomial rings is given.

  19. On a Fractional Binomial Process

    Science.gov (United States)

    Cahoy, Dexter O.; Polito, Federico

    2012-02-01

    The classical binomial process has been studied by Jakeman (J. Phys. A 23:2815-2825, 1990) (and the references therein) and has been used to characterize a series of radiation states in quantum optics. In particular, he studied a classical birth-death process where the chance of birth is proportional to the difference between a larger fixed number and the number of individuals present. It is shown that at large times, an equilibrium is reached which follows a binomial process. In this paper, the classical binomial process is generalized using the techniques of fractional calculus and is called the fractional binomial process. The fractional binomial process is shown to preserve the binomial limit at large times while expanding the class of models that include non-binomial fluctuations (non-Markovian) at regular and small times. As a direct consequence, the generality of the fractional binomial model makes the proposed model more desirable than its classical counterpart in describing real physical processes. More statistical properties are also derived.

  20. 40 CFR 141.802 - Coliform sampling plan.

    Science.gov (United States)

    2010-07-01

    40 CFR 141.802 (2010), Coliform sampling plan. (a) Each air carrier under this subpart must develop a coliform sampling plan covering each... required actions, including repeat and follow-up sampling, corrective action, and notification of...

  1. The Binomial Distribution in Shooting

    Science.gov (United States)

    Chalikias, Miltiadis S.

    2009-01-01

    The binomial distribution is used to predict the winner of the 49th International Shooting Sport Federation World Championship in double trap shooting held in 2006 in Zagreb, Croatia. The outcome of the competition was definitely unexpected.
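The prediction in the abstract rests on the binomial tail probability of clearing at least a given number of targets; a sketch with purely illustrative hit probabilities (the championship data are not reproduced in this record):

```python
from math import comb

def p_at_least(k, n, p):
    """P(at least k hits in n shots), modeling shots as independent Bernoulli trials."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Illustrative, not the championship data: chance that a 90%-accuracy shooter
# versus an 85%-accuracy shooter clears at least 140 of 150 doubles-trap targets.
print(round(p_at_least(140, 150, 0.90), 3))
print(round(p_at_least(140, 150, 0.85), 3))
```

A small gap in per-shot accuracy compounds over 150 targets into a large gap in the probability of a high score, which is why upsets remain possible but unexpected.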

  2. Introduction to Sample Plan Package for Farms

    Science.gov (United States)

    An example of a completed and self-certified Tier I Qualified Facility SPCC Plan using the template found in Appendix G of the SPCC rule (40 CFR part 112). This example illustrates how to develop an SPCC Plan using a farm scenario.

  3. Tank 241-BY-105 rotary core sampling and analysis plan

    International Nuclear Information System (INIS)

    Sasaki, L.M.

    1995-01-01

    This Sampling and Analysis Plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for two rotary-mode core samples from tank 241-BY-105 (BY-105)

  4. Long-Term Ecological Monitoring Field Sampling Plan for 2007

    International Nuclear Information System (INIS)

    T. Haney R. VanHorn

    2007-01-01

    This field sampling plan describes the field investigations planned for the Long-Term Ecological Monitoring Project at the Idaho National Laboratory Site in 2007. This plan and the Quality Assurance Project Plan for Waste Area Groups 1, 2, 3, 4, 5, 6, 7, 10, and Removal Actions constitute the sampling and analysis plan supporting long-term ecological monitoring sampling in 2007. The data collected under this plan will become part of the long-term ecological monitoring data set that is being collected annually. The data will be used to determine the requirements for the subsequent long-term ecological monitoring. This plan guides the 2007 investigations, including sampling, quality assurance, quality control, analytical procedures, and data management. As such, this plan will help to ensure that the resulting monitoring data will be scientifically valid, defensible, and of known and acceptable quality

  5. Long-Term Ecological Monitoring Field Sampling Plan for 2007

    Energy Technology Data Exchange (ETDEWEB)

    T. Haney

    2007-07-31

    This field sampling plan describes the field investigations planned for the Long-Term Ecological Monitoring Project at the Idaho National Laboratory Site in 2007. This plan and the Quality Assurance Project Plan for Waste Area Groups 1, 2, 3, 4, 5, 6, 7, 10, and Removal Actions constitute the sampling and analysis plan supporting long-term ecological monitoring sampling in 2007. The data collected under this plan will become part of the long-term ecological monitoring data set that is being collected annually. The data will be used to determine the requirements for the subsequent long-term ecological monitoring. This plan guides the 2007 investigations, including sampling, quality assurance, quality control, analytical procedures, and data management. As such, this plan will help to ensure that the resulting monitoring data will be scientifically valid, defensible, and of known and acceptable quality.

  6. Hanford Sampling Quality Management Plan (HSQMP)

    International Nuclear Information System (INIS)

    Hyatt, J.E.

    1995-06-01

    HSQMP establishes quality requirements in response to DOE Order 5700.6C and to 10 Code of Federal Regulations 830.120. HSQMP is designed to meet the needs of Richland Operations Office for controlling the quality of services provided by sampling operations. It is issued through the Analytical Services Program of the Waste Programs Division. This document describes the Environmental Sampling and Analysis Program activities considered to represent the best management activities necessary to achieve a sampling program with adequate control

  7. Application of binomial-edited CPMG to shale characterization.

    Science.gov (United States)

    Washburn, Kathryn E; Birdwell, Justin E

    2014-09-01

    Unconventional shale resources may contain a significant amount of hydrogen in organic solids such as kerogen, but it is not possible to directly detect these solids with many NMR systems. Binomial-edited pulse sequences capitalize on magnetization transfer between solids, semi-solids, and liquids to provide an indirect method of detecting solid organic materials in shales. When the organic solids can be directly measured, binomial-editing helps distinguish between different phases. We applied a binomial-edited CPMG pulse sequence to a range of natural and experimentally-altered shale samples. The most substantial signal loss is seen in shales rich in organic solids while fluids associated with inorganic pores seem essentially unaffected. This suggests that binomial-editing is a potential method for determining fluid locations, solid organic content, and kerogen-bitumen discrimination.

  8. Optimal Designing of Variables Chain Sampling Plan by Minimizing the Average Sample Number

    Directory of Open Access Journals (Sweden)

    S. Balamurali

    2013-01-01

    We investigate the optimal designing of chain sampling plan for the application of normally distributed quality characteristics. The chain sampling plan is one of the conditional sampling procedures and this plan under variables inspection will be useful when testing is costly and destructive. The advantages of this proposed variables plan over variables single sampling plan and variables double sampling plan are discussed. Tables are also constructed for the selection of optimal parameters of known and unknown standard deviation variables chain sampling plan for specified two points on the operating characteristic curve, namely, the acceptable quality level and the limiting quality level, along with the producer’s and consumer’s risks. The optimization problem is formulated as a nonlinear programming where the objective function to be minimized is the average sample number and the constraints are related to lot acceptance probabilities at acceptable quality level and limiting quality level under the operating characteristic curve.
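The abstract concerns a variables chain plan, whose OC curve requires the normal distribution; the conditional idea behind chaining is easiest to see in Dodge's attributes version, ChSP-1, which has a closed-form OC curve (the parameters below are illustrative, not the paper's):

```python
from math import comb

def chsp1_accept_prob(p, n, i):
    """OC curve of Dodge's ChSP-1 attributes chain sampling plan: accept on zero
    defects, or on one defect provided the preceding i samples had none."""
    p0 = (1 - p)**n                    # P(0 defects in a sample of n)
    p1 = n * p * (1 - p)**(n - 1)      # P(exactly 1 defect)
    return p0 + p1 * p0**i

# Chaining past results lifts the acceptance probability at low defect
# rates relative to a plain "accept only on zero defects" plan:
p = 0.01
print(round((1 - p)**10, 4))                   # single sampling, n=10, c=0
print(round(chsp1_accept_prob(p, 10, 3), 4))   # chained, i=3 prior samples
```

This is the trade-off the paper optimizes in the variables setting: chaining reduces the average sample number while holding the producer's and consumer's risk points on the OC curve.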

  9. WRAP Module 1 sampling and analysis plan

    International Nuclear Information System (INIS)

    Mayancsik, B.A.

    1995-01-01

    This document provides the methodology to sample, screen, and analyze waste generated, processed, or otherwise the responsibility of the Waste Receiving and Processing Module 1 facility. This includes Low-Level Waste, Transuranic Waste, Mixed Waste, and Dangerous Waste

  10. B-cell waste classification sampling plan

    Energy Technology Data Exchange (ETDEWEB)

    HOBART, R.L.

    1999-11-20

    This report documents the methods used to collect samples and analyze data necessary to verify and/or determine the radionuclide content of the 324 Facility B-Cell decontamination and decommissioning waste stream.

  11. Integer Solutions of Binomial Coefficients

    Science.gov (United States)

    Gilbertson, Nicholas J.

    2016-01-01

    A good formula is like a good story, rich in description, powerful in communication, and eye-opening to readers. The formula presented in this article for determining the coefficients of the binomial expansion of (x + y)n is one such "good read." The beauty of this formula is in its simplicity--both describing a quantitative situation…
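Whatever form the featured formula takes in the full article, the coefficients of the expansion of (x + y)^n are the binomial coefficients C(n, k), which a few stdlib lines can tabulate:

```python
from math import comb

def expansion_coefficients(n):
    """Coefficients of (x + y)**n, i.e. row n of Pascal's triangle."""
    return [comb(n, k) for k in range(n + 1)]

print(expansion_coefficients(5))  # [1, 5, 10, 10, 5, 1]
```

Setting x = y = 1 gives the familiar check that row n sums to 2**n.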

  12. Nevada National Security Site Integrated Groundwater Sampling Plan, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Marutzky, Sam; Farnham, Irene

    2014-10-01

    The purpose of the Nevada National Security Site (NNSS) Integrated Sampling Plan (referred to herein as the Plan) is to provide a comprehensive, integrated approach for collecting and analyzing groundwater samples to meet the needs and objectives of the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Field Office (NNSA/NFO) Underground Test Area (UGTA) Activity. Implementation of this Plan will provide high-quality data required by the UGTA Activity for ensuring public protection in an efficient and cost-effective manner. The Plan is designed to ensure compliance with the UGTA Quality Assurance Plan (QAP). The Plan’s scope comprises sample collection and analysis requirements relevant to assessing the extent of groundwater contamination from underground nuclear testing. This Plan identifies locations to be sampled by corrective action unit (CAU) and location type, sampling frequencies, sample collection methodologies, and the constituents to be analyzed. In addition, the Plan defines data collection criteria such as well-purging requirements, detection levels, and accuracy requirements; identifies reporting and data management requirements; and provides a process to ensure coordination between NNSS groundwater sampling programs for sampling of interest to UGTA. This Plan does not address compliance with requirements for wells that supply the NNSS public water system or wells involved in a permitted activity.

  13. 384 Power plant waste water sampling and analysis plan

    International Nuclear Information System (INIS)

    Hagerty, K.J.; Knotek, H.M.

    1995-01-01

    This document presents the 384 Power House Sampling and Analysis Plan. The Plan describes sampling methods, locations, frequency, analytes, and stream descriptions. The effluent streams from 384 were characterized in 1989 in support of the Stream Specific Report (WHC-EP-0342, Addendum 1)

  14. Sampling Plans for Monitoring Quality Control Process at a Plastic ...

    African Journals Online (AJOL)

    While the problem of the company's process control system was being studied, management considered other options that include sampling plans. The sampling plans were evaluated by considering the competing goals of decreasing the shipment of substandard goods, decreasing the percentage of acceptable lot that are ...

  15. UMTRA water sampling and analysis plan, Green River, Utah

    International Nuclear Information System (INIS)

    Papusch, R.

    1993-12-01

    The purpose of this water sampling and analysis plan (WSAP) is to provide a basis for groundwater and surface water sampling at the Green River Uranium Mill Tailing Remedial Action (UMTRA) Project site. This WSAP identifies and justifies the sampling locations, analytical parameters, detection limits, and sampling frequency for the monitoring locations

  16. Bayesian analysis of a correlated binomial model

    OpenAIRE

    Diniz, Carlos A. R.; Tutia, Marcelo H.; Leite, Jose G.

    2010-01-01

    In this paper a Bayesian approach is applied to the correlated binomial model, CB(n, p, ρ), proposed by Luceño (Comput. Statist. Data Anal. 20 (1995) 511–520). The data augmentation scheme is used in order to overcome the complexity of the mixture likelihood. MCMC methods, including Gibbs sampling and Metropolis within Gibbs, are applied to estimate the posterior marginal for the probability of success p and for the correlation coefficient ρ. The sensitivity of the posterior is studied taking...
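Luceño's correlated binomial model and the authors' data-augmentation scheme are beyond a short sketch, but the MCMC flavor they use can be illustrated on the plain binomial proportion with a random-walk Metropolis sampler (the Beta(1,1) prior, tuning scale, and all other settings below are illustrative assumptions):

```python
import math
import random

def log_post(p, k, n, a=1.0, b=1.0):
    """Log posterior of Binomial(n, p) data under a Beta(a, b) prior, up to a constant."""
    if not 0.0 < p < 1.0:
        return -math.inf
    return (k + a - 1) * math.log(p) + (n - k + b - 1) * math.log(1 - p)

def metropolis(k, n, steps=20000, scale=0.1, seed=1):
    """Random-walk Metropolis draws from the posterior of p."""
    random.seed(seed)
    p, draws = 0.5, []
    for _ in range(steps):
        proposal = p + random.gauss(0.0, scale)
        log_ratio = log_post(proposal, k, n) - log_post(p, k, n)
        if random.random() < math.exp(min(0.0, log_ratio)):
            p = proposal
        draws.append(p)
    return draws[steps // 2:]  # discard the first half as burn-in

draws = metropolis(k=7, n=10)
print(round(sum(draws) / len(draws), 2))  # ≈ Beta(8, 4) posterior mean, 8/12 ≈ 0.67
```

In the conjugate case the exact posterior is Beta(k + a, n - k + b), which makes the chain easy to validate before moving to non-conjugate targets like the correlated model.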

  17. Smoothness in Binomial Edge Ideals

    Directory of Open Access Journals (Sweden)

    Hamid Damadi

    2016-06-01

    In this paper we study some geometric properties of the algebraic set associated to the binomial edge ideal of a graph, in particular its singularity and smoothness. Some of these algebraic sets are irreducible and some of them are reducible. If every irreducible component of the algebraic set is smooth we call the graph an edge smooth graph; otherwise it is called an edge singular graph. We show that complete graphs are edge smooth and introduce two conditions such that the graph G is edge singular if and only if it satisfies these conditions. Then, it is shown that cycles and most trees are edge singular. In addition, it is proved that complete bipartite graphs are edge smooth.

  18. Sampling and Analysis Plan for PUREX canyon vessel flushing

    International Nuclear Information System (INIS)

    Villalobos, C.N.

    1995-01-01

    A sampling and analysis plan is necessary to provide direction for the sampling and analytical activities determined by the data quality objectives. This document defines the sampling and analysis necessary to support the deactivation of the Plutonium-Uranium Extraction (PUREX) facility vessels that are regulated pursuant to Washington Administrative Code 173-303

  19. Final Sampling and Analysis Plan for Background Sampling, Fort Sheridan, Illinois

    National Research Council Canada - National Science Library

    1995-01-01

    .... This Background Sampling and Analysis Plan (BSAP) is designed to address this issue through the collection of additional background samples at Fort Sheridan to support the statistical analysis and the Baseline Risk Assessment (BRA...

  20. Adjusted Wald Confidence Interval for a Difference of Binomial Proportions Based on Paired Data

    Science.gov (United States)

    Bonett, Douglas G.; Price, Robert M.

    2012-01-01

    Adjusted Wald intervals for binomial proportions in one-sample and two-sample designs have been shown to perform about as well as the best available methods. The adjusted Wald intervals are easy to compute and have been incorporated into introductory statistics courses. An adjusted Wald interval for paired binomial proportions is proposed here and…
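The paired-data interval proposed in the abstract is not specified in this record, but the one-sample adjusted Wald (Agresti-Coull) interval it builds on is simple to state: add z^2/2 successes and z^2/2 failures, then apply the ordinary Wald formula. A sketch with z = 1.96 for 95% coverage:

```python
import math

def adjusted_wald(k, n, z=1.96):
    """Agresti-Coull adjusted Wald interval for one binomial proportion."""
    n_adj = n + z**2
    p_adj = (k + z**2 / 2) / n_adj
    half = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return max(0.0, p_adj - half), min(1.0, p_adj + half)

print(tuple(round(x, 3) for x in adjusted_wald(7, 10)))
```

For z ≈ 2 this is the familiar "add two successes and two failures" rule, which is why the interval is easy enough for introductory courses yet performs close to exact methods.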

  1. IP Sample Plan #5 | NCI Technology Transfer Center | TTC

    Science.gov (United States)

    A sample Intellectual Property Management Plan in the form of a legal agreement between a University and its collaborators which addresses data sharing, sharing of research tools and resources and intellectual property management.

  2. UMTRA water sampling and analysis plan, Lakeview, Oregon

    International Nuclear Information System (INIS)

    1993-01-01

    The purpose of this document is to provide background, guidance, and justification for water sampling activities for the Lakeview, Oregon, Uranium Mill Tailings Remedial Action (UMTRA) processing and disposal sites. This water sampling and analysis plan will form the basis for groundwater sampling and analysis work orders (WSAWO) to be implemented during 1993. Monitoring at the former Lakeview processing site is for characterization purposes and in preparation for the risk assessment, scheduled for the fall of 1993. Compliance monitoring was conducted at the disposal site. Details of the sampling plan are discussed in Section 5.0

  3. UMTRA project water sampling and analysis plan, Tuba City, Arizona

    International Nuclear Information System (INIS)

    1996-02-01

    Planned, routine ground water sampling activities at the U.S. Department of Energy (DOE) Uranium Mill Tailings Remedial Action (UMTRA) Project site in Tuba City, Arizona, are described in the following sections of this water sampling and analysis plan (WSAP). This plan identifies and justifies the sampling locations, analytical parameters, detection limits, and sampling frequency for the stations routinely monitored at the site. The ground water data are used for site characterization and risk assessment. The regulatory basis for routine ground water monitoring at UMTRA Project sites is derived from the U.S. Environmental Protection Agency (EPA) regulations in 40 CFR Part 192 (1994) and the final EPA standards of 1995 (60 FR 2854). Sampling procedures are guided by the UMTRA Project standard operating procedures (SOP) (JEG, n.d.), and the most effective technical approach for the site

  4. UMTRA project water sampling and analysis plan, Gunnison, Colorado

    International Nuclear Information System (INIS)

    1994-06-01

    This water sampling and analysis plan summarizes the results of previous water sampling activities and the plan for water sampling activities for calendar year 1994. A buffer zone monitoring plan is included as an appendix. The buffer zone monitoring plan is designed to protect the public from residual contamination that entered the ground water as a result of former milling operations. Surface remedial action at the Gunnison Uranium Mill Tailings Remedial Action Project site began in 1992; completion is expected in 1995. Ground water and surface water will be sampled semiannually in 1994 at the Gunnison processing site (GUN-01) and disposal site (GUN-08). Results of previous water sampling at the Gunnison processing site indicate that ground water in the alluvium is contaminated by the former uranium processing activities. Background ground water conditions have been established in the uppermost aquifer (Tertiary gravels) at the Gunnison disposal site. The monitor well locations provide a representative distribution of sampling points to characterize ground water quality and ground water flow conditions in the vicinity of the sites. The list of analytes has been modified with time to reflect constituents that are related to uranium processing activities and the parameters needed for geochemical evaluation. Water sampling will be conducted at least semiannually during and one year following the period of construction activities, to comply with the ground water protection strategy discussed in the remedial action plan (DOE, 1992a)

  5. Sampling and Analysis Plan for the 216-A-29 Ditch

    International Nuclear Information System (INIS)

    Petersen, S.W.

    1998-06-01

    This sampling and analysis plan defines procedures to be used for collecting and handling samples to be obtained from the 216-A-29 Ditch, and identifies requirements for field and laboratory measurements. The sampling strategy described here is derived from a Data Quality Objectives workshop conducted in January 1997 to support sampling to assure worker safety during construction and to assess the validity of a 1988 ditch sampling campaign and the effectiveness of subsequent stabilization. The purpose of the proposed sampling and analysis activities is to characterize soil contamination in the vicinity of a proposed road over the 216-A-29 Ditch

  6. Sampling plans in attribute mode with multiple levels of precision

    International Nuclear Information System (INIS)

    Franklin, M.

    1986-01-01

    This paper describes a method for deriving sampling plans for nuclear material inventory verification. The method presented is different from the classical approach which envisages two levels of measurement precision corresponding to NDA and DA. In the classical approach the precisions of the two measurement methods are taken as fixed parameters. The new approach is based on multiple levels of measurement precision. The design of the sampling plan consists of choosing the number of measurement levels, the measurement precision to be used at each level and the sample size to be used at each level

  7. Nevada National Security Site Integrated Groundwater Sampling Plan, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Farnham, Irene

    2018-03-01

    The purpose is to provide a comprehensive, integrated approach for collecting and analyzing groundwater samples to meet the needs and objectives of the DOE/EM Nevada Program’s UGTA Activity. Implementation of this Plan will provide high-quality data required by the UGTA Activity for ensuring public protection in an efficient and cost-effective manner. The Plan is designed to ensure compliance with the UGTA Quality Assurance Plan (QAP) (NNSA/NFO, 2015); Federal Facility Agreement and Consent Order (FFACO) (1996, as amended); and DOE Order 458.1, Radiation Protection of the Public and the Environment (DOE, 2013). The Plan’s scope comprises sample collection and analysis requirements relevant to assessing both the extent of groundwater contamination from underground nuclear testing and impact of testing on water quality in downgradient communities. This Plan identifies locations to be sampled by CAU and location type, sampling frequencies, sample collection methodologies, and the constituents to be analyzed. In addition, the Plan defines data collection criteria such as well purging, detection levels, and accuracy requirements/recommendations; identifies reporting and data management requirements; and provides a process to ensure coordination between NNSS groundwater sampling programs for sampling analytes of interest to UGTA. Information used in the Plan development—including the rationale for selection of wells, sampling frequency, and the analytical suite—is discussed under separate cover (N-I, 2014) and is not reproduced herein. This Plan does not address compliance for those wells involved in a permitted activity. Sampling and analysis requirements associated with these wells are described in their respective permits and are discussed in NNSS environmental reports (see Section 5.2). In addition, sampling for UGTA CAUs that are in the Closure Report (CR) stage are not included in this Plan. Sampling requirements for these CAUs are described in the CR

  8. Estimation of Log-Linear-Binomial Distribution with Applications

    Directory of Open Access Journals (Sweden)

    Elsayed Ali Habib

    2010-01-01

    The log-linear-binomial distribution was introduced for describing the behavior of the sum of dependent Bernoulli random variables. The distribution is a generalization of the binomial distribution that allows construction of a broad class of distributions. In this paper, we consider the problem of estimating the two parameters of the log-linear-binomial distribution by moment and maximum likelihood methods. The distribution is used to fit genetic data and to obtain the sampling distribution of the sign test under dependence among trials.

  9. Liquid effluent Sampling and Analysis Plan (SAP) implementation summary report

    International Nuclear Information System (INIS)

    Lueck, K.J.

    1995-01-01

    This report summarizes liquid effluent analytical data collected during the Sampling and Analysis Plan (SAP) Implementation Program, evaluates whether the sampling performed meets the requirements of the individual SAPs, and compares the results to the WAC 173-200 Ground Water Quality Standards. Presented in the report are results from liquid effluent samples collected (1992-1994) from 18 of the 22 streams identified in the Consent Order (No. DE 91NM-177) requiring SAPs

  10. Sampling and Analysis Plan for the 221-U Facility

    International Nuclear Information System (INIS)

    Rugg, J.E.

    1998-02-01

    This sampling and analysis plan (SAP) presents the rationale and strategy for the sampling and analysis activities proposed to be conducted to support the evaluation of alternatives for the final disposition of the 221-U Facility. This SAP will describe general sample locations and the minimum number of samples required. It will also identify the specific contaminants of potential concern (COPCs) and the required analysis. This SAP does not define the exact sample locations and equipment to be used in the field due to the nature of unknowns associated with the 221-U Facility

  11. Beta-binomial regression and bimodal utilization.

    Science.gov (United States)

    Liu, Chuan-Fen; Burgess, James F; Manning, Willard G; Maciejewski, Matthew L

    2013-10-01

    To illustrate how the analysis of bimodal U-shaped distributed utilization can be modeled with beta-binomial regression, which is rarely used in health services research. Veterans Affairs (VA) administrative data and Medicare claims in 2001-2004 for 11,123 Medicare-eligible VA primary care users in 2000. We compared means and distributions of VA reliance (the proportion of all VA/Medicare primary care visits occurring in VA) predicted from beta-binomial, binomial, and ordinary least-squares (OLS) models. Beta-binomial model fits the bimodal distribution of VA reliance better than binomial and OLS models due to the nondependence on normality and the greater flexibility in shape parameters. Increased awareness of beta-binomial regression may help analysts apply appropriate methods to outcomes with bimodal or U-shaped distributions. © Health Research and Educational Trust.
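A minimal sketch of the modelling idea described above, assuming SciPy's `scipy.stats.betabinom` distribution (the variable names, sample size, and shape parameters are illustrative, not taken from the study): with Beta shape parameters below 1, the beta-binomial places mass at both extremes of the count range, mimicking bimodal U-shaped utilization, and its two shape parameters can be fit by maximum likelihood.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
n_trials = 8                  # binomial denominator (e.g., visits per patient)
a_true, b_true = 0.4, 0.6     # a, b < 1 -> U-shaped, bimodal outcome
y = stats.betabinom.rvs(n_trials, a_true, b_true, size=2000, random_state=rng)

def neg_loglik(params):
    a, b = np.exp(params)     # optimize on the log scale so a, b stay positive
    return -stats.betabinom.logpmf(y, n_trials, a, b).sum()

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
a_hat, b_hat = np.exp(res.x)

# A plain binomial fit cannot reproduce the excess spread in the data:
p_hat = y.mean() / n_trials
binom_var = n_trials * p_hat * (1 - p_hat)
```

The sample variance of `y` greatly exceeds `binom_var`, which is the overdispersion the beta-binomial's extra shape parameter absorbs.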

  12. Zero-truncated negative binomial - Erlang distribution

    Science.gov (United States)

    Bodhisuwan, Winai; Pudprommarat, Chookait; Bodhisuwan, Rujira; Saothayanun, Luckhana

    2017-11-01

    The zero-truncated negative binomial-Erlang distribution is introduced. It is developed from the negative binomial-Erlang distribution. In this work, the probability mass function is derived and some properties are included. The parameters of the zero-truncated negative binomial-Erlang distribution are estimated by using maximum likelihood estimation. Finally, the proposed distribution is applied to real data on methamphetamine in Bangkok, Thailand. Based on the results, the zero-truncated negative binomial-Erlang distribution provided a better fit than the zero-truncated Poisson, zero-truncated negative binomial, zero-truncated generalized negative binomial and zero-truncated Poisson-Lindley distributions for these data.
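As a simplified illustration of the truncation mechanics (using the plain zero-truncated negative binomial, not the Erlang mixture of the paper, with invented parameter values): the truncated log-likelihood is the ordinary one renormalized by 1 − P(X = 0), and both parameters can be recovered by maximum likelihood from the positive counts alone.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
r_true, p_true = 2.0, 0.3
x = stats.nbinom.rvs(r_true, p_true, size=3000, random_state=rng)
x = x[x > 0]                              # zeros are unobservable by design

def neg_loglik(params):
    r = np.exp(params[0])                 # r > 0
    p = 1.0 / (1.0 + np.exp(-params[1]))  # 0 < p < 1
    # truncated pmf: P(X = k | X > 0) = pmf(k) / (1 - pmf(0))
    ll = stats.nbinom.logpmf(x, r, p) - np.log1p(-stats.nbinom.pmf(0, r, p))
    return -ll.sum()

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
r_hat = np.exp(res.x[0])
p_hat = 1.0 / (1.0 + np.exp(-res.x[1]))
```

Dropping the `log1p` correction would bias both estimates, since the fitted model would waste probability mass on the unobservable zero class.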

  13. Risk-Based Approach to Developing National Residue Sampling Plan

    OpenAIRE

    Scientific Committee of the Food Safety Authority of Ireland

    2014-01-01

    A ranking system for veterinary medicinal products and medicated feed additives has been developed as a tool to be applied in a risk-based approach to the residue testing programme for foods of animal origin in the National Residue Control Plan. In the context of food sampling and residue testing for the National Residue Control Plan, there is firstly, the risk to human health from residues of chemical substances in food and secondly, the issue of non-compliance with regulations relating ...

  14. Exact Group Sequential Methods for Estimating a Binomial Proportion

    Directory of Open Access Journals (Sweden)

    Zhengjia Chen

    2013-01-01

    We first review existing sequential methods for estimating a binomial proportion. Afterward, we propose a new family of group sequential sampling schemes for estimating a binomial proportion with prescribed margin of error and confidence level. In particular, we establish the uniform controllability of coverage probability and the asymptotic optimality for such a family of sampling schemes. Our theoretical results establish the possibility that the parameters of this family of sampling schemes can be determined so that the prescribed level of confidence is guaranteed with little waste of samples. Analytic bounds for the cumulative distribution functions and expectations of sample numbers are derived. Moreover, we discuss the inherent connection of various sampling schemes. Numerical issues are addressed for improving the accuracy and efficiency of computation. Computational experiments are conducted for comparing sampling schemes. Illustrative examples are given for applications in clinical trials.
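A toy version of the group-sequential idea (far simpler than the exact schemes of the paper, and using the Clopper-Pearson interval as an assumed stopping criterion; all names and values are illustrative): sample in groups and stop once the interval half-width falls below the prescribed margin of error.

```python
import numpy as np
from scipy import stats

def sequential_estimate(sample, margin=0.05, conf=0.95, group=50, max_n=5000):
    """Group-sequential estimation of a binomial proportion: keep drawing
    groups until the Clopper-Pearson interval half-width <= margin."""
    alpha = 1.0 - conf
    n = k = 0
    while n < max_n:
        draws = sample(group)
        n += group
        k += int(draws.sum())
        lo = stats.beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
        hi = stats.beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
        if (hi - lo) / 2 <= margin:
            break
    return k / n, (lo, hi), n

rng = np.random.default_rng(2)
p_hat, (lo, hi), n = sequential_estimate(lambda m: rng.random(m) < 0.3)
```

Because the sample size adapts to the observed variance, a proportion near 0 or 1 stops much earlier than one near 0.5, which is the sample-saving behaviour the paper formalizes.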

  15. Decision Models for Determining the Optimal Life Test Sampling Plans

    Science.gov (United States)

    Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Strelchonok, Vladimir F.

    2010-11-01

    Life test sampling plan is a technique, which consists of sampling, inspection, and decision making in determining the acceptance or rejection of a batch of products by experiments for examining the continuous usage time of the products. In life testing studies, the lifetime is usually assumed to be distributed as either a one-parameter exponential distribution, or a two-parameter Weibull distribution with the assumption that the shape parameter is known. Such oversimplified assumptions can facilitate the follow-up analyses, but may overlook the fact that the lifetime distribution can significantly affect the estimation of the failure rate of a product. Moreover, sampling costs, inspection costs, warranty costs, and rejection costs are all essential, and ought to be considered in choosing an appropriate sampling plan. The choice of an appropriate life test sampling plan is a crucial decision problem because a good plan not only can help producers save testing time, and reduce testing cost; but it also can positively affect the image of the product, and thus attract more consumers to buy it. This paper develops the frequentist (non-Bayesian) decision models for determining the optimal life test sampling plans with an aim of cost minimization by identifying the appropriate number of product failures in a sample that should be used as a threshold in judging the rejection of a batch. The two-parameter exponential and Weibull distributions with two unknown parameters are assumed to be appropriate for modelling the lifetime of a product. A practical numerical application is employed to demonstrate the proposed approach.
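One building block of such a plan can be sketched in a deliberately simpler setting than the paper's (exponential lifetimes with a fixed test duration, rather than Weibull with unknown parameters; `n`, `t`, and `c` are invented): the operating-characteristic value is the probability of accepting a batch as a function of its true mean life.

```python
import numpy as np
from scipy import stats

def accept_prob(theta, n=20, t=100.0, c=2):
    """OC-curve point for a simple life test: put n units on test for time t
    and accept the batch if at most c fail. With exponential lifetimes of
    mean theta, each unit independently fails by time t with probability
    1 - exp(-t / theta), so the failure count is Binomial(n, p_fail)."""
    p_fail = 1.0 - np.exp(-t / theta)
    return stats.binom.cdf(c, n, p_fail)

oc_good = accept_prob(2000.0)   # long mean life: almost always accepted
oc_bad = accept_prob(50.0)      # short mean life: almost always rejected
```

Scanning `accept_prob` over a grid of `theta` values traces the OC curve, and the cost terms in the paper would then be weighted by these acceptance probabilities.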

  16. A comparison of observation-level random effect and Beta-Binomial models for modelling overdispersion in Binomial data in ecology & evolution.

    Science.gov (United States)

    Harrison, Xavier A

    2015-01-01

    Overdispersion is a common feature of models of biological data, but researchers often fail to model the excess variation driving the overdispersion, resulting in biased parameter estimates and standard errors. Quantifying and modeling overdispersion when it is present is therefore critical for robust biological inference. One means to account for overdispersion is to add an observation-level random effect (OLRE) to a model, where each data point receives a unique level of a random effect that can absorb the extra-parametric variation in the data. Although some studies have investigated the utility of OLRE to model overdispersion in Poisson count data, studies doing so for Binomial proportion data are scarce. Here I use a simulation approach to investigate the ability of both OLRE models and Beta-Binomial models to recover unbiased parameter estimates in mixed effects models of Binomial data under various degrees of overdispersion. In addition, as ecologists often fit random intercept terms to models when the random effect sample size is low, I also assessed the performance of both model types across a range of random effect sample sizes. OLRE models performed poorly when overdispersion was caused by zero-inflation in the Binomial mixture model, leading to biased slope and intercept estimates, but performed well for overdispersion generated by adding random noise to the linear predictor. Comparison of parameter estimates from an OLRE model with those from its corresponding Beta-Binomial model readily identified when OLRE were performing poorly due to disagreement between effect sizes, and this strategy should be employed whenever OLRE are used for Binomial data to assess their reliability. Beta-Binomial models performed well across all contexts, but showed a tendency to underestimate effect sizes when modelling non-Beta-Binomial data. Finally, both OLRE and Beta-Binomial models performed poorly when models contained few levels of the random intercept term. These results suggest that OLRE are a useful tool for modelling overdispersion in Binomial data, but that they do not perform well in all circumstances, and researchers should take care to verify the robustness of parameter estimates of OLRE models.

  17. Efficient sampling algorithms for Monte Carlo based treatment planning

    International Nuclear Information System (INIS)

    DeMarco, J.J.; Solberg, T.D.; Chetty, I.; Smathers, J.B.

    1998-01-01

    Efficient sampling algorithms are necessary for producing a fast Monte Carlo based treatment planning code. This study evaluates several aspects of a photon-based tracking scheme and the effect of optimal sampling algorithms on the efficiency of the code. Four areas were tested: pseudo-random number generation, generalized sampling of a discrete distribution, sampling from the exponential distribution, and delta scattering as applied to photon transport through a heterogeneous simulation geometry. Generalized sampling of a discrete distribution using the cutpoint method can produce speedup gains of one order of magnitude versus conventional sequential sampling. Photon transport modifications based upon the delta scattering method were implemented and compared with a conventional boundary and collision checking algorithm. The delta scattering algorithm is faster by a factor of six versus the conventional algorithm for a boundary size of 5 mm within a heterogeneous geometry. A comparison of portable pseudo-random number algorithms and exponential sampling techniques is also discussed
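The cutpoint method mentioned above can be sketched as follows (a generic illustration, not the authors' code): precompute m cutpoints into the discrete CDF so that each uniform draw starts its sequential search near the answer instead of scanning from the first outcome.

```python
import numpy as np

def make_cutpoint_sampler(pmf, m=None):
    """Cutpoint (indexed-search) sampler for a discrete distribution.
    Cutpoint j stores the first outcome whose CDF exceeds j/m, so a draw
    u in [j/m, (j+1)/m) begins its search at that outcome."""
    pmf = np.asarray(pmf, dtype=float)
    cdf = np.cumsum(pmf)
    m = m or 4 * len(pmf)
    cut = np.searchsorted(cdf, np.arange(m) / m, side="right")

    def draw(u):
        i = cut[int(u * m)]       # jump to the precomputed starting index
        while cdf[i] <= u:        # short sequential search from there
            i += 1
        return i

    return draw

rng = np.random.default_rng(3)
pmf = np.array([0.1, 0.5, 0.2, 0.15, 0.05])
draw = make_cutpoint_sampler(pmf)
samples = np.array([draw(u) for u in rng.random(20000)])
```

The expensive part (building `cut`) happens once; each subsequent draw does only a table lookup plus, on average, a step or two of search, which is the source of the speedup over sequential CDF inversion reported above.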

  18. 400 area secondary cooling water sampling and analysis plan

    Energy Technology Data Exchange (ETDEWEB)

    Penn, L.L.

    1996-10-29

    This is a total rewrite of the Sampling and Analysis Plan in response to, and to ensure compliance with, the State Waste Discharge Permit ST 4501 issued on July 31, 1996. This revision describes changes in facility status and implements requirements of the permit.

  19. IP Sample Plan #3 | NCI Technology Transfer Center | TTC

    Science.gov (United States)

    Sample Research Resources and Intellectual Property Plan for use by an Institution and its Collaborators for intellectual property protection strategies covering pre-existing intellectual property, agreements with commercial sources, privacy, and licensing.

  20. IP Sample Plan #4 | NCI Technology Transfer Center | TTC

    Science.gov (United States)

    Sample letter from Research Institutes and their principal investigator and consultants, describing a data and research tool sharing plan and procedures for sharing data, research materials, and patent and licensing of intellectual property. This letter is designed to be included as part of an application.

  1. IP Sample Plan #1 | NCI Technology Transfer Center | TTC

    Science.gov (United States)

    Sample letter that shows how Universities and their co-investigators, consultants, and collaborators can describe a data and research tool sharing plan and procedures for exercising intellectual property rights. The letter is to be used as part of the University's application.

  2. Test plan for core sampling drill bit temperature monitor

    International Nuclear Information System (INIS)

    Francis, P.M.

    1994-01-01

    At WHC, one of the functions of the Tank Waste Remediation System division is sampling waste tanks to characterize their contents. The push-mode core sampling truck is currently used to take samples of liquid and sludge. Sampling of tanks containing hard salt cake is to be performed with the rotary-mode core sampling system, consisting of the core sample truck, mobile exhauster unit, and ancillary subsystems. When drilling through the salt cake material, friction and heat can be generated in the drill bit. Based upon tank safety reviews, it has been determined that the drill bit temperature must not exceed 180 C, due to the potential reactivity of tank contents at this temperature. Consequently, a drill bit temperature limit of 150 C was established for operation of the core sample truck to provide an adequate margin of safety. Unpredictable factors, such as localized heating, necessitate this large margin. The most desirable safeguard against exceeding this threshold is bit temperature monitoring. This document describes the recommended plan for testing the prototype of a drill bit temperature monitor developed for core sampling by Sandia National Labs. The device will be tested at their facilities. This test plan documents the tests that Westinghouse Hanford Company considers necessary for effective testing of the system

  3. In situ sampling cart development engineering task plan

    International Nuclear Information System (INIS)

    DeFord, D.K.

    1995-01-01

    This Engineering Task Plan (ETP) supports the development for facility use of the next generation in situ sampling system for characterization of tank vapors. In situ sampling refers to placing sample collection devices (primarily sorbent tubes) directly into the tank headspace, then drawing tank gases through the collection devices to obtain samples. The current in situ sampling system is functional but was not designed to provide the accurate flow measurement required by today's data quality objectives (DQOs) for vapor characterization. The new system will incorporate modern instrumentation to achieve much tighter control. The next generation system will be referred to in this ETP as the New In Situ System (NISS) or New System. The report describes the current sampling system and the modifications that are required for more accuracy

  4. System-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, NEWTONP, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), used independently of one another. Program finds probability required to yield given system reliability. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
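NEWTONP itself is a C program built around Newton's method; as a hedged stand-in, the same root-finding problem can be posed with SciPy using Brent's method instead (the function name and example numbers below are invented): given a target reliability for a k-out-of-n system of identical components, solve the cumulative binomial equation for the required component reliability.

```python
from scipy import optimize, stats

def required_component_reliability(n, k, target):
    """Solve P(X >= k) = target for p, where X ~ Binomial(n, p): the common
    component reliability needed so that a k-out-of-n system (works when at
    least k of its n parts work) meets the target system reliability."""
    f = lambda p: stats.binom.sf(k - 1, n, p) - target
    return optimize.brentq(f, 1e-12, 1.0 - 1e-12)

# e.g., a 3-out-of-5 system that must be 99.9% reliable
p_req = required_component_reliability(n=5, k=3, target=0.999)
```

Since `stats.binom.sf(k - 1, n, p)` is monotonically increasing in p, the bracket [0, 1] always contains exactly one root and the solver is guaranteed to converge.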

  5. A class of orthogonal nonrecursive binomial filters.

    Science.gov (United States)

    Haddad, R. A.

    1971-01-01

    The time- and frequency-domain properties of the orthogonal binomial sequences are presented. It is shown that these sequences, or digital filters based on them, can be generated using adders and delay elements only. The frequency-domain behavior of these nonrecursive binomial filters suggests a number of applications as low-pass Gaussian filters or as inexpensive bandpass filters.
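A small sketch of the construction (illustrative, not the author's realization): cascading two-tap [1, 1] averaging stages requires only adders and delay elements, the resulting taps are a row of Pascal's triangle, and the frequency response magnitude is cos^N(ω/2), the Gaussian-like low-pass behavior noted above.

```python
import numpy as np

def binomial_lowpass(order):
    """Binomial FIR taps built by cascading `order` two-tap [1, 1]
    averaging stages -- realizable with adders and delays only."""
    h = np.array([1.0])
    for _ in range(order):
        h = np.convolve(h, [1.0, 1.0])
    return h / h.sum()              # normalize to unit gain at DC

h = binomial_lowpass(6)             # taps proportional to 1 6 15 20 15 6 1
w = np.pi / 3                       # check the cos^N(w/2) magnitude response
H = np.abs(np.sum(h * np.exp(-1j * w * np.arange(h.size))))
```

As the order grows, the taps converge (after scaling) to a sampled Gaussian, which is why these filters serve as inexpensive Gaussian approximations.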

  6. Common-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CROSSER, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), used independently of one another. Point of equality between reliability of system and common reliability of components found. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.

  7. Problems on Divisibility of Binomial Coefficients

    Science.gov (United States)

    Osler, Thomas J.; Smoak, James

    2004-01-01

    Twelve unusual problems involving divisibility of the binomial coefficients are presented in this article. The problems are listed in "The Problems" section. All twelve problems have short solutions, which are listed in "The Solutions" section. These problems could be assigned to students in any course in which the binomial theorem and Pascal's…

  8. Sampling plans for pest mites on physic nut.

    Science.gov (United States)

    Rosado, Jander F; Sarmento, Renato A; Pedro-Neto, Marçal; Galdino, Tarcísio V S; Marques, Renata V; Erasmo, Eduardo A L; Picanço, Marcelo C

    2014-08-01

    The starting point for generating a pest control decision-making system is a conventional sampling plan. Because the mites Polyphagotarsonemus latus and Tetranychus bastosi are among the most important pests of the physic nut (Jatropha curcas), in the present study, we aimed to establish sampling plans for these mite species on physic nut. Mite densities were monitored in 12 physic nut crops. Based on the obtained results, sampling of P. latus and T. bastosi should be performed by assessing the number of mites per cm² in 160 samples using a handheld 20× magnifying glass. The optimal sampling region for T. bastosi is the abaxial surface of the 4th most apical leaf on the branch of the middle third of the canopy. On the abaxial surface, T. bastosi should then be observed on the side parts of the middle portion of the leaf, near its edge. As for P. latus, the optimal sampling region is the abaxial surface of the 4th most apical leaf on the branch of the apical third of the canopy. Polyphagotarsonemus latus should then be assessed on the side parts of the leaf's petiole insertion. Each sampling procedure requires 4 h and costs US$ 7.31.

  9. Development of sample size allocation program using hypergeometric distribution

    International Nuclear Information System (INIS)

    Kim, Hyun Tae; Kwack, Eun Ho; Park, Wan Soo; Min, Kyung Soo; Park, Chan Sik

    1996-01-01

    The objective of this research is the development of a sample allocation program using the hypergeometric distribution with an object-oriented method. When the IAEA (International Atomic Energy Agency) performs inspection, it simply applies a standard binomial distribution, which describes sampling with replacement, instead of a hypergeometric distribution, which describes sampling without replacement, in sample allocation to up to three verification methods. The objective of the IAEA inspection is the timely detection of diversion of significant quantities of nuclear material; therefore, game theory is applied to its sampling plan. It is necessary to use the hypergeometric distribution directly, or an approximate distribution, to secure statistical accuracy. The improved binomial approximation developed by Mr. J. L. Jaech and the correctly applied binomial approximation are closer to the hypergeometric distribution in sample size calculation than the simply applied binomial approximation of the IAEA. Object-oriented programs for (1) sample approximate-allocation with the correctly applied standard binomial approximation, (2) sample approximate-allocation with the improved binomial approximation, and (3) sample approximate-allocation with the hypergeometric distribution were developed with Visual C++, and corresponding programs were developed with EXCEL (using Visual Basic for Applications). 8 tabs., 15 refs. (Author)
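The distinction the entry draws can be checked numerically with SciPy (an illustrative comparison with invented numbers, not the authors' Visual C++/Excel code): for a fixed sample size, the binomial "with replacement" approximation understates the detection probability given exactly by the hypergeometric model, which is why binomial-based plans call for somewhat larger samples.

```python
from scipy import stats

# Inventory verification: population of N items, D of them falsified;
# "detection" means finding at least one falsified item in a sample of n.
N, D, n = 200, 10, 40

# Exact model: sampling without replacement (hypergeometric)
p_detect_hyper = 1.0 - stats.hypergeom.pmf(0, N, D, n)

# Simple approximation: sampling with replacement (binomial)
p_detect_binom = 1.0 - stats.binom.pmf(0, n, D / N)
```

Here `p_detect_hyper` exceeds `p_detect_binom`, and the gap widens as the sampling fraction n/N grows, matching the entry's point that the binomial shortcut loses accuracy for small populations.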

  10. 105-F and DR Phase 1 Sampling and Analysis Plan

    International Nuclear Information System (INIS)

    Curry, L.R.

    1998-06-01

    This SAP presents the rationale and strategy for characterization of specific rooms within the 105-F and 105-DR reactor buildings. Figures 1-1 and 1-2 identify the rooms that are the subject of this SAP. These rooms are to be decontaminated and demolished as an initial step (Phase 1 ) in the Interim Safe Storage process for these reactors. Section 1.0 presents the background and sites history for the reactor buildings and summarizes the data quality objective process, which provides the logical basis for this SAP. Preliminary surveys indicate that little radiochemical contamination is present. Section 2.0 presents the quality assurance project plan, which includes a project management structure, sampling methods and quality control, and oversight of the sampling process. Section 2.2.1 summarizes the sampling methods, reflecting the radiological and chemical sampling designs presented in Tables 1-17 and 1-18. Section 3.0 presents the Field Sampling Plan for Phase 1. The sampling design is broken into two stages. Stage 1 will verify the list of radioactive constituents of concern and generate the isotopic distribution. The objectives of Stage 2 are to estimate the radionuclide inventories of room debris, quantify chemical contamination, and survey room contents for potential salvage or recycle. Table 3-1 presents the sampling activities to be performed in Stage 1. Tables 1-17 and 1-18 identify samples to be collected in Stage 2. Stage 2 will consist primarily of survey data collection, with fixed laboratory samples to be collected in areas showing visible stains. Quality control sampling requirements are presented in Table 3-2

  11. Estimating negative binomial parameters from occurrence data with detection times.

    Science.gov (United States)

    Hwang, Wen-Han; Huggins, Richard; Stoklosa, Jakub

    2016-11-01

    The negative binomial distribution is a common model for the analysis of count data in biology and ecology. In many applications, we may not observe the complete frequency count in a quadrat but only that a species occurred in the quadrat. If only occurrence data are available, then the two parameters of the negative binomial distribution, the aggregation index and the mean, are not identifiable. This can be overcome by data augmentation or through modeling the dependence between quadrat occupancies. Here, we propose to record the (first) detection time while collecting occurrence data in a quadrat. We show that under what we call proportionate sampling, where the time to survey a region is proportional to the area of the region, both negative binomial parameters are estimable. When the mean parameter is larger than two, our proposed approach is more efficient than the data augmentation method developed by Solow and Smith (Am. Nat. 176, 96-98), and in general is cheaper to conduct. We also investigate the effect of misidentification when collecting negative binomially distributed data, and conclude that, in general, the effect can be simply adjusted for, provided that the mean and variance of the misidentification probabilities are known. The results are demonstrated in a simulation study and illustrated in several real examples. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Nonradioactive Dangerous Waste Landfill sampling and analysis plan and data quality objectives process summary report

    International Nuclear Information System (INIS)

    Smith, R.C.

    1997-08-01

    This sampling and analysis plan defines the sampling and analytical activities and associated procedures that will be used to support the Nonradioactive Dangerous Waste Landfill soil-gas investigation. This SAP consists of three sections: this introduction, the field sampling plan, and the quality assurance project plan. The field sampling plan defines the sampling and analytical methodologies to be performed

  13. Visual Sample Plan (VSP) Models and Code Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, Richard O.; Davidson, James R.; Wilson, John E.; Pulsipher, Brent A.

    2001-03-06

    VSP is an easy to use, visual and graphic software tool being developed to select the right number and location of environmental samples so that the results of statistical tests performed to provide input to environmental decisions have the required confidence and performance. It is a significant help for implementing the 6th and 7th steps of the Data Quality Objectives (DQO) planning process ("Specify Tolerable Limits on Decision Errors" and "Optimize the Design for Obtaining Data," respectively).

  14. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard

  15. Correlated binomial models and correlation structures

    International Nuclear Information System (INIS)

    Hisakado, Masato; Kitsukawa, Kenji; Mori, Shintaro

    2006-01-01

    We discuss a general method to construct correlated binomial distributions by imposing several consistent relations on the joint probability function. We obtain self-consistency relations for the conditional correlations and conditional probabilities. The beta-binomial distribution is derived by a strong symmetric assumption on the conditional correlations. Our derivation clarifies the 'correlation' structure of the beta-binomial distribution. It is also possible to study the correlation structures of other probability distributions of exchangeable (homogeneous) correlated Bernoulli random variables. We study some distribution functions and discuss their behaviours in terms of their correlation structures
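One concrete instance of the correlation structure discussed above: for the beta-binomial, the pairwise correlation of the exchangeable Bernoulli trials is ρ = 1/(a + b + 1), which inflates the binomial variance by the factor 1 + (n − 1)ρ. A quick check with SciPy's `betabinom` (the parameter values are arbitrary illustrations):

```python
from scipy import stats

a, b, n = 2.0, 3.0, 10
rho = 1.0 / (a + b + 1.0)        # pairwise correlation of the Bernoulli trials
p = a / (a + b)                  # marginal success probability of each trial

# Beta-binomial variance = binomial variance inflated by 1 + (n - 1) * rho
var_formula = n * p * (1 - p) * (1 + (n - 1) * rho)
var_scipy = stats.betabinom.var(n, a, b)
```

The two quantities agree exactly, illustrating how the single correlation parameter of the exchangeable construction maps onto the beta-binomial's overdispersion.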

  16. The Toggle Local Planner for sampling-based motion planning

    KAUST Repository

    Denny, Jory

    2012-05-01

    Sampling-based solutions to the motion planning problem, such as the probabilistic roadmap method (PRM), have become commonplace in robotics applications. These solutions are the norm as the dimensionality of the planning space grows, i.e., d > 5. An important primitive of these methods is the local planner, which is used for validation of simple paths between two configurations. The most common is the straight-line local planner which interpolates along the straight line between the two configurations. In this paper, we introduce a new local planner, Toggle Local Planner (Toggle LP), which extends local planning to a two-dimensional subspace of the overall planning space. If no path exists between the two configurations in the subspace, then Toggle LP is guaranteed to correctly return false. Intuitively, more connections could be found by Toggle LP than by the straight-line planner, resulting in better connected roadmaps. As shown in our results, this is the case, and additionally, the extra cost, in terms of time or storage, for Toggle LP is minimal. Additionally, our experimental analysis of the planner shows the benefit for a wide array of robots, with DOF as high as 70. © 2012 IEEE.
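For contrast with Toggle LP, the baseline straight-line local planner described above can be sketched as follows; the collision checker, step size, and toy obstacle are hypothetical stand-ins, not the paper's implementation.

```python
import math

def straight_line_local_planner(q1, q2, is_valid, step=0.05):
    """Validate the straight segment between configurations q1 and q2 by
    collision-checking interpolated configurations at a fixed resolution.
    is_valid is a user-supplied validity checker (hypothetical here)."""
    n_steps = max(1, int(math.dist(q1, q2) / step))
    for i in range(n_steps + 1):
        t = i / n_steps
        q = tuple(a + t * (b - a) for a, b in zip(q1, q2))
        if not is_valid(q):
            return False      # an intermediate configuration is invalid
    return True

# a toy 2-D C-space whose only obstacle is a unit disc at the origin
def free_of_disc(q, radius=1.0):
    return q[0] ** 2 + q[1] ** 2 > radius ** 2
```

A segment passing through the disc fails, while one that skirts it validates; Toggle LP's contribution is to search a two-dimensional subspace for an alternative connection when this straight segment is blocked.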

  17. Final work plan for targeted sampling at Webber, Kansas.

    Energy Technology Data Exchange (ETDEWEB)

    LaFreniere, L. M.; Environmental Science Division

    2006-05-01

    This Work Plan outlines the scope of work for targeted sampling at Webber, Kansas (Figure 1.1). This activity is being conducted at the request of the Kansas Department of Health and Environment (KDHE), in accordance with Section V of the Intergovernmental Agreement between the KDHE and the Commodity Credit Corporation of the U.S. Department of Agriculture (CCC/USDA). Data obtained in this sampling event will be used to (1) evaluate the current status of previously detected contamination at Webber and (2) determine whether the site requires further action. This work is being performed on behalf of the CCC/USDA by the Environmental Science Division of Argonne National Laboratory. Argonne is a nonprofit, multidisciplinary research center operated by the University of Chicago for the U.S. Department of Energy (DOE). The CCC/USDA has entered into an interagency agreement with DOE, under which Argonne provides technical assistance to the CCC/USDA with environmental site characterization and remediation at its former grain storage facilities. Argonne has issued a Master Work Plan (Argonne 2002) that describes the general scope of and guidance for all investigations at former CCC/USDA facilities in Kansas. The Master Work Plan, approved by the KDHE, contains the materials common to investigations at all locations in Kansas. This document should be consulted for complete details of the technical activities proposed at the former CCC/USDA facility in Webber.

  18. Compatibility grab sampling and analysis plan for fiscal year 1999

    International Nuclear Information System (INIS)

    SASAKI, L.M.

    1999-01-01

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for grab samples obtained to address waste compatibility. Analytical requirements are taken from two revisions of the Compatibility data quality objectives (DQOs). Revision 1 of the DQO (Fowler 1995) listed analyses to be performed to meet both safety and operational data needs for the Compatibility program. Revision 2A of the DQO (Mulkey and Miller 1998) addresses only the safety-related requirements; the operational requirements of Fowler (1995) have not been superseded by Mulkey and Miller (1998). Therefore, safety-related data needs are taken from Mulkey and Miller (1998) and operational-related data needs are taken from Fowler (1995). Ammonia and total alpha analyses are also performed in accordance with Fowler (1998a, 1998b)

  19. Validation of Statistical Sampling Algorithms in Visual Sample Plan (VSP): Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Nuffer, Lisa L; Sego, Landon H.; Wilson, John E.; Hassig, Nancy L.; Pulsipher, Brent A.; Matzke, Brett D.

    2009-02-18

    The U.S. Department of Homeland Security, Office of Technology Development (OTD) contracted with a set of U.S. Department of Energy national laboratories, including the Pacific Northwest National Laboratory (PNNL), to write a Remediation Guidance for Major Airports After a Chemical Attack. The report identifies key activities and issues that should be considered by a typical major airport following an incident involving release of a toxic chemical agent. Four experimental tasks were identified that would require further research in order to supplement the Remediation Guidance. One of the tasks, Task 4, OTD Chemical Remediation Statistical Sampling Design Validation, dealt with statistical sampling algorithm validation. This report documents the results of the sampling design validation conducted for Task 4. In 2005, the Government Accountability Office (GAO) performed a review of the past U.S. responses to Anthrax terrorist cases. Part of the motivation for this PNNL report was a major GAO finding that there was a lack of validated sampling strategies in the U.S. response to Anthrax cases. The report (GAO 2005) recommended that probability-based methods be used for sampling design in order to address confidence in the results, particularly when all sample results showed no remaining contamination. The GAO also expressed a desire that the methods be validated, which is the main purpose of this PNNL report. The objective of this study was to validate probability-based statistical sampling designs and the algorithms pertinent to within-building sampling that allow the user to prescribe or evaluate confidence levels of conclusions based on data collected as guided by the statistical sampling designs. Specifically, the designs found in the Visual Sample Plan (VSP) software were evaluated. VSP was used to calculate the number of samples and the sample location for a variety of sampling plans applied to an actual release site. Most of the sampling designs validated are

  20. Newton Binomial Formulas in Schubert Calculus

    OpenAIRE

    Cordovez, Jorge; Gatto, Letterio; Santiago, Taise

    2008-01-01

    We prove Newton's binomial formulas for Schubert Calculus to determine numbers of base point free linear series on the projective line with prescribed ramification divisor supported at given distinct points.

  1. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
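The cumulative binomial quantity CUMBIN computes for k-out-of-n reliability can be sketched directly (a minimal illustration in Python, not the C-language CUMBIN source):

```python
from math import comb

def cumulative_binomial(n, p, k):
    """P(X >= k) for X ~ Binomial(n, p): the probability that at least k
    of n independent components, each reliable with probability p, work."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i)
               for i in range(k, n + 1))

# reliability of a 2-out-of-3 system with component reliability 0.9:
# 3 * 0.9^2 * 0.1 + 0.9^3 = 0.972
r_2oo3 = cumulative_binomial(3, 0.9, 2)
```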

  2. Log-binomial models: exploring failed convergence.

    Science.gov (United States)

    Williamson, Tyler; Eliasziw, Misha; Fick, Gordon Hilton

    2013-12-13

    Relative risk is a summary metric that is commonly used in epidemiological investigations. Increasingly, epidemiologists are using log-binomial models to study the impact of a set of predictor variables on a single binary outcome, as they naturally offer relative risks. However, standard statistical software may report failed convergence when attempting to fit log-binomial models in certain settings. The methods that have been proposed in the literature for dealing with failed convergence use approximate solutions to avoid the issue. This research looks directly at the log-likelihood function for the simplest log-binomial model where failed convergence has been observed, a model with a single linear predictor with three levels. The possible causes of failed convergence are explored and potential solutions are presented for some cases. Among the principal causes is a failure of the fitting algorithm to converge despite the log-likelihood function having a single finite maximum. Despite these limitations, log-binomial models are a viable option for epidemiologists wishing to describe the relationship between a set of predictors and a binary outcome where relative risk is the desired summary measure. Epidemiologists are encouraged to continue to use log-binomial models and advocate for improvements to the fitting algorithms to promote the widespread use of log-binomial models.
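To make the model concrete: for a single binary predictor the log-binomial MLE has a closed form (the fitted probabilities are the observed group risks), and the parameter region is bounded because exp(b0 + b1*x) must stay below 1, which is where fitting algorithms run into trouble. A minimal sketch with hypothetical toy data:

```python
from math import exp, log

def log_binomial_loglik(b0, b1, data):
    """Log-likelihood of the log-binomial model P(Y=1 | x) = exp(b0 + b1*x)
    over (x, y) pairs; returns -inf outside the valid region where a
    fitted probability would reach or exceed 1."""
    ll = 0.0
    for x, y in data:
        p = exp(b0 + b1 * x)
        if not 0.0 < p < 1.0:
            return float("-inf")
        ll += y * log(p) + (1 - y) * log(1 - p)
    return ll

# hypothetical toy data: risk 0.2 when x = 0, risk 0.4 when x = 1
data = [(0, 1)] * 20 + [(0, 0)] * 80 + [(1, 1)] * 40 + [(1, 0)] * 60

# closed-form MLE for a binary predictor: log baseline risk and log relative risk
b0_hat, b1_hat = log(0.2), log(0.4 / 0.2)
rr = exp(b1_hat)   # the relative risk, read directly off the coefficient
```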

  3. The Validation of a Beta-Binomial Model for Overdispersed Binomial Data.

    Science.gov (United States)

    Kim, Jongphil; Lee, Ji-Hyun

    2017-01-01

The beta-binomial model has been widely used as an analytically tractable alternative that captures the overdispersion of an intra-correlated binomial random variable, X. However, the model validation for X has rarely been investigated. As a beta-binomial mass function takes on a few different shapes, the model validation is examined for each of the classified shapes in this paper. Further, the mean square error (MSE) is illustrated for each shape for the maximum likelihood estimator (MLE) based on a beta-binomial model approach and for the method of moments estimator (MME), in order to gauge when and how much the MLE is biased.
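A minimal sketch of the method of moments estimator (MME) mentioned above, under the assumption of replicate counts with a common number of trials n (the simulated data and parameters are hypothetical, not the paper's):

```python
import random

def beta_binomial_mme(xs, n):
    """Method of moments estimates (a, b) for beta-binomial counts xs,
    each observed out of n trials.  A sketch: it assumes the data really
    are overdispersed relative to a plain binomial."""
    m = sum(xs) / len(xs)
    s2 = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    p = m / n
    r = s2 / (n * p * (1 - p))       # variance ratio vs. Binomial(n, p)
    if r <= 1.0:
        raise ValueError("no overdispersion: a plain binomial already fits")
    theta = (n - r) / (r - 1.0)      # theta = a + b from the variance equation
    return p * theta, (1 - p) * theta

# simulate beta-binomial data (hypothetical parameters) and invert the moments
random.seed(0)
a_true, b_true, n = 2.0, 3.0, 20
xs = []
for _ in range(5000):
    p_i = random.betavariate(a_true, b_true)
    xs.append(sum(random.random() < p_i for _ in range(n)))
a_hat, b_hat = beta_binomial_mme(xs, n)
```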

  4. Sampling and Analysis Plan for the 105-N Basin Water

    International Nuclear Information System (INIS)

    R.O. Mahood

    1997-01-01

    This sampling and analysis plan defines the strategy, and field and laboratory methods that will be used to characterize 105-N Basin water. The water will be shipped to the 200 Area Effluent Treatment Facility for treatment and disposal as part of N Reactor deactivation. These analyses are necessary to ensure that the water will meet the acceptance criteria of the ETF, as established in the Memorandum of Understanding for storage and treatment of water from N-Basin (Appendix A), and the characterization requirements for 100-N Area water provided in a letter from ETF personnel (Appendix B)

  5. 241-Z-361 Sludge Characterization Sampling and Analysis Plan

    Energy Technology Data Exchange (ETDEWEB)

    BANNING, D.L.

    1999-08-05

This sampling and analysis plan (SAP) identifies the type, quantity, and quality of data needed to support characterization of the sludge that remains in Tank 241-Z-361. The procedures described in this SAP are based on the results of the 241-Z-361 Sludge Characterization Data Quality Objectives (DQO) (BWHC 1999) process for the tank. The primary objectives of this project are to evaluate the contents of Tank 241-Z-361 in order to resolve safety and safeguards issues and to assess alternatives for sludge removal and disposal.

  6. 241-Z-361 Sludge Characterization Sampling and Analysis Plan

    Energy Technology Data Exchange (ETDEWEB)

    BANNING, D.L.

    1999-07-29

This sampling and analysis plan (SAP) identifies the type, quantity, and quality of data needed to support characterization of the sludge that remains in Tank 241-Z-361. The procedures described in this SAP are based on the results of the 241-Z-361 Sludge Characterization Data Quality Objectives (DQO) (BWHC 1999) process for the tank. The primary objectives of this project are to evaluate the contents of Tank 241-Z-361 in order to resolve safety and safeguards issues and to assess alternatives for sludge removal and disposal.

  7. C-018H Pre-Operational Baseline Sampling Plan

    International Nuclear Information System (INIS)

    Guzek, S.J.

    1993-01-01

The objective of this task is to field characterize and sample the soil at selected locations along the proposed effluent line routes for Project C-018H. The overall purpose of this effort is to meet the proposed plan to discontinue the disposal of contaminated liquids into the Hanford soil column as described by DOE (1987). Detailed information describing the proposed transport pipeline route, together with the associated Kaiser Engineers Hanford Company (KEH) preliminary drawings (H288746...755), has been prepared by KEH (1992). The information developed from field monitoring and sampling will be utilized to characterize surface and subsurface soil along the proposed C-018H effluent pipeline and its associated facilities. Potentially existing contaminant levels may be encountered; therefore, soil characterization will provide a preoperational construction baseline reference, support development of personnel safety requirements, and determine the need for any changes in the proposed routes prior to construction of the pipeline

  8. Statistical sampling plan for the TRU waste assay facility

    International Nuclear Information System (INIS)

    Beauchamp, J.J.; Wright, T.; Schultz, F.J.; Haff, K.; Monroe, R.J.

    1983-08-01

    Due to limited space, there is a need to dispose appropriately of the Oak Ridge National Laboratory transuranic waste which is presently stored below ground in 55-gal (208-l) drums within weather-resistant structures. Waste containing less than 100 nCi/g transuranics can be removed from the present storage and be buried, while waste containing greater than 100 nCi/g transuranics must continue to be retrievably stored. To make the necessary measurements needed to determine the drums that can be buried, a transuranic Neutron Interrogation Assay System (NIAS) has been developed at Los Alamos National Laboratory and can make the needed measurements much faster than previous techniques which involved γ-ray spectroscopy. The previous techniques are reliable but time consuming. Therefore, a validation study has been planned to determine the ability of the NIAS to make adequate measurements. The validation of the NIAS will be based on a paired comparison of a sample of measurements made by the previous techniques and the NIAS. The purpose of this report is to describe the proposed sampling plan and the statistical analyses needed to validate the NIAS. 5 references, 4 figures, 5 tables
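The planned validation is a paired comparison of the two assay techniques on the same drums. A minimal sketch of such a comparison, with hypothetical assay values (not data from the report), computes the paired t statistic on the per-drum differences:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t_statistic(first, second):
    """t statistic for paired measurements of the same items by two
    techniques; a large value would signal a systematic bias."""
    diffs = [a - b for a, b in zip(first, second)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# hypothetical nCi/g assays of six drums by the two techniques
gamma_spec = [85.0, 102.0, 97.0, 110.0, 93.0, 88.0]
nias       = [83.5, 104.0, 96.0, 108.5, 94.0, 87.0]
t_stat = paired_t_statistic(gamma_spec, nias)
```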

  9. Sampling and Analysis Plan for K Basins Debris

    Energy Technology Data Exchange (ETDEWEB)

    WESTCOTT, J.L.

    2000-06-21

    This Sampling and Analysis Plan presents the rationale and strategy for sampling and analysis activities to support removal of debris from the K-East and K-West Basins located in the 100K Area at the Hanford Site. This project is focused on characterization to support waste designation for disposal of waste at the Environmental Restoration Disposal Facility (ERDF). This material has previously been dispositioned at the Hanford Low-Level Burial Grounds or Central Waste Complex. The structures that house the basins are classified as radioactive material areas. Therefore, all materials removed from the buildings are presumed to be radioactively contaminated. Because most of the materials that will be addressed under this plan will be removed from the basins, and because of the cost associated with screening materials for release, it is anticipated that all debris will be managed as low-level waste. Materials will be surveyed, however, to estimate radionuclide content for disposal and to determine that the debris is not contaminated with levels of transuranic radionuclides that would designate the debris as transuranic waste.

  10. Sampling and Analysis Plan for K Basins Debris

    International Nuclear Information System (INIS)

    WESTCOTT, J.L.

    2000-01-01

    This Sampling and Analysis Plan presents the rationale and strategy for sampling and analysis activities to support removal of debris from the K-East and K-West Basins located in the 100K Area at the Hanford Site. This project is focused on characterization to support waste designation for disposal of waste at the Environmental Restoration Disposal Facility (ERDF). This material has previously been dispositioned at the Hanford Low-Level Burial Grounds or Central Waste Complex. The structures that house the basins are classified as radioactive material areas. Therefore, all materials removed from the buildings are presumed to be radioactively contaminated. Because most of the materials that will be addressed under this plan will be removed from the basins, and because of the cost associated with screening materials for release, it is anticipated that all debris will be managed as low-level waste. Materials will be surveyed, however, to estimate radionuclide content for disposal and to determine that the debris is not contaminated with levels of transuranic radionuclides that would designate the debris as transuranic waste

  11. Zero inflated negative binomial-generalized exponential distributionand its applications

    Directory of Open Access Journals (Sweden)

    Sirinapa Aryuyuen

    2014-08-01

In this paper, we propose a new zero inflated distribution, namely, the zero inflated negative binomial-generalized exponential (ZINB-GE) distribution. The new distribution is used for count data with extra zeros and is an alternative for the analysis of over-dispersed count data. Some characteristics of the distribution are given, such as the mean, variance, skewness, and kurtosis. Parameters of the ZINB-GE distribution are estimated by the maximum likelihood estimation (MLE) method. Simulated and observed data are employed to examine this distribution. The results show that the MLE method appears to have high efficiency for large sample sizes. Moreover, the mean square error of the parameter estimates increases as the zero proportion grows. For the real data sets, this new zero inflated distribution provides a better fit than the zero inflated Poisson and zero inflated negative binomial distributions.
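The zero-inflation construction is a two-component mixture: a structural zero with probability pi, otherwise a draw from a base count distribution. A minimal sketch using a Poisson base for brevity (the paper's base distribution is the negative binomial-generalized exponential, which this does not implement):

```python
from math import exp, factorial

def zero_inflated_pmf(k, pi, base_pmf):
    """Mixture pmf: a structural zero with probability pi, otherwise a
    draw from base_pmf with probability 1 - pi."""
    p = (1 - pi) * base_pmf(k)
    return p + pi if k == 0 else p

def poisson_pmf_factory(lam):
    """Poisson(lam) pmf, used here as a stand-in base distribution."""
    return lambda k: exp(-lam) * lam ** k / factorial(k)

pi, lam = 0.3, 2.0
zip_pmf = lambda k: zero_inflated_pmf(k, pi, poisson_pmf_factory(lam))
p_zero = zip_pmf(0)                       # inflated beyond exp(-lam)
total = sum(zip_pmf(k) for k in range(100))
```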

  12. Compatibility Grab Sampling and Analysis Plan for FY 2000

    International Nuclear Information System (INIS)

    SASAKI, L.M.

    1999-01-01

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for grab samples obtained to address waste compatibility. It is written in accordance with requirements identified in Data Quality Objectives for Tank Farms Waste Compatibility Program (Mulkey et al. 1999) and Tank Farm Waste Transfer Compatibility Program (Fowler 1999). In addition to analyses to support Compatibility, the Waste Feed Delivery program has requested that tank samples obtained for Compatibility also be analyzed to confirm the high-level waste and/or low-activity waste envelope(s) for the tank waste (Baldwin 1999). The analytical requirements to confirm waste envelopes are identified in Data Quality Objectives for TWRS Privatization Phase I: Confirm Tank T is an Appropriate Feed Source for Low-Activity Waste Feed Batch X (Nguyen 1999a) and Data Quality Objectives for RPP Privatization Phase I: Confirm Tank T is an Appropriate Feed Source for High-Level Waste Feed Batch X (Nguyen 1999b)

  13. Abbreviated sampling and analysis plan for planning decontamination and decommissioning at Test Reactor Area (TRA) facilities

    International Nuclear Information System (INIS)

    1994-10-01

The objective is to sample and analyze for the presence of gamma emitting isotopes and hazardous constituents within certain areas of the Test Reactor Area (TRA), prior to D and D activities. The TRA is composed of three major reactor facilities and three smaller reactors built in support of programs studying the performance of reactor materials and components under high neutron flux conditions. The Materials Testing Reactor (MTR) and Engineering Test Reactor (ETR) facilities are currently pending D and D. Work consists of pre-D and D sampling of designated TRA (primarily ETR) process areas. This report addresses only a limited subset of the samples that will eventually be required to characterize MTR and ETR and plan their D and D. Sampling addressed in this document is intended to support planned D and D work that is funded at the present time. Biased sampling, based on process knowledge and plant configuration, is to be performed. The multiple process areas that may potentially be sampled will be initially characterized by obtaining data for upstream source areas which, based on facility configuration, would affect downstream, as yet unsampled, process areas. Sampling and analysis will be conducted to determine the level of gamma emitting isotopes and hazardous constituents present in designated areas within buildings TRA-612, 642, 643, 644, 645, 647, 648, 663; and in the soils surrounding Facility TRA-611. These data will be used to plan the D and D and help determine disposition of material by D and D personnel. Both MTR and ETR facilities will eventually be decommissioned by total dismantlement so that the area can be restored to its original condition.

  14. Abbreviated sampling and analysis plan for planning decontamination and decommissioning at Test Reactor Area (TRA) facilities

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-10-01

The objective is to sample and analyze for the presence of gamma emitting isotopes and hazardous constituents within certain areas of the Test Reactor Area (TRA), prior to D and D activities. The TRA is composed of three major reactor facilities and three smaller reactors built in support of programs studying the performance of reactor materials and components under high neutron flux conditions. The Materials Testing Reactor (MTR) and Engineering Test Reactor (ETR) facilities are currently pending D and D. Work consists of pre-D and D sampling of designated TRA (primarily ETR) process areas. This report addresses only a limited subset of the samples that will eventually be required to characterize MTR and ETR and plan their D and D. Sampling addressed in this document is intended to support planned D and D work that is funded at the present time. Biased sampling, based on process knowledge and plant configuration, is to be performed. The multiple process areas that may potentially be sampled will be initially characterized by obtaining data for upstream source areas which, based on facility configuration, would affect downstream, as yet unsampled, process areas. Sampling and analysis will be conducted to determine the level of gamma emitting isotopes and hazardous constituents present in designated areas within buildings TRA-612, 642, 643, 644, 645, 647, 648, 663; and in the soils surrounding Facility TRA-611. These data will be used to plan the D and D and help determine disposition of material by D and D personnel. Both MTR and ETR facilities will eventually be decommissioned by total dismantlement so that the area can be restored to its original condition.

  15. Visual Sample Plan (VSP) Software: Designs and Data Analyses for Sampling Contaminated Buildings

    International Nuclear Information System (INIS)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Nuffer, Lisa L.; Hassig, Nancy L.

    2005-01-01

A new module of the Visual Sample Plan (VSP) software has been developed to provide sampling designs and data analyses for potentially contaminated buildings. An important application is assessing levels of contamination in buildings after a terrorist attack. This new module, funded by DHS through the Combating Terrorism Technology Support Office, Technical Support Working Group, was developed to provide a tailored, user-friendly and visually oriented buildings module within the existing VSP software toolkit, the latest version of which can be downloaded from http://dqo.pnl.gov/vsp. In case of, or when planning against, a chemical, biological, or radionuclide release within a building, the VSP module can be used to quickly and easily develop and visualize technically defensible sampling schemes for walls, floors, ceilings, and other surfaces to statistically determine if contamination is present, its magnitude and extent throughout the building, and if decontamination has been effective. This paper demonstrates the features of this new VSP buildings module, which include: the ability to import building floor plans or to easily draw, manipulate, and view rooms in several ways; being able to insert doors, windows and annotations into a room; 3-D graphic room views with surfaces labeled and floor plans that show building zones that have separate air handling units. The paper will also discuss the statistical design and data analysis options available in the buildings module. Design objectives supported include comparing an average to a threshold when the data distribution is normal or unknown, and comparing measurements to a threshold to detect hotspots or to ensure most of the area is uncontaminated when the data distribution is normal or unknown

  16. A sample lesson plan for the course English Composition II

    Directory of Open Access Journals (Sweden)

    Córdoba Cubillo, Patricia

    2005-03-01

The goal of this article is to present a lesson plan and a series of sample tasks to help instructors of the course English Composition II, at the School of Modern Languages of the University of Costa Rica, guide students in writing an essay that integrates the four skills: listening, speaking, reading, and writing. These activities will be a source of comprehensible input for learners and should result in a good piece of writing.

  17. UMTRA Project water sampling and analysis plan, Salt Lake City, Utah. Revision 1

    International Nuclear Information System (INIS)

    1995-06-01

    This water sampling and analysis plan describes planned, routine ground water sampling activities at the US Department of Energy Uranium Mill Tailings Remedial Action Project site in Salt Lake City, Utah. This plan identifies and justifies sampling locations, analytical parameters, detection limits, and sampling frequencies for routine monitoring of ground water, sediments, and surface waters at monitoring stations on the site

  18. UMTRA project water sampling and analysis plan, Mexican Hat, Utah

    International Nuclear Information System (INIS)

    1994-04-01

The Mexican Hat, Utah, Uranium Mill Tailings Remedial Action (UMTRA) Project site is a former uranium mill that is undergoing surface remediation in the form of on-site tailings stabilization. Contaminated surface materials from the Monument Valley, Arizona, UMTRA Project site have been transported to the Mexican Hat site and are being consolidated with the Mexican Hat tailings. The scheduled completion of the tailings disposal cell is August 1995. Water is found in two geologic units at the site: the Halgaito Shale Formation and the Honaker Trail Formation. The tailings rest on the Halgaito Shale, and water contained in that unit is a result of milling activities and, to a lesser extent, water released from the tailings by compaction during remedial action construction of the disposal cell. Water in the Halgaito Shale flows through fractures and discharges at seeps along nearby arroyos. Flow from the seeps will diminish as water drains from the unit. Ground water in the lower unit, the Honaker Trail Formation, is protected from contamination by an upward hydraulic gradient. There are no nearby water supply wells because of widespread poor background ground water quality and quantity, and the San Juan River shows no impacts from the site. This water sampling and analysis plan (WSAP) recommends sampling six seeps and one upgradient monitor well completed in the Honaker Trail Formation. Samples will be taken in April 1994 (representative of high ground water levels) and September 1994 (representative of low ground water levels). Analyses will be performed on filtered samples for plume indicator parameters

  19. UMTRA project water sampling and analysis plan, Grand Junction, Colorado

    International Nuclear Information System (INIS)

    1994-07-01

Surface remedial action will be completed at the Grand Junction processing site during the summer of 1994. Results of 1993 water sampling indicate that ground water flow conditions and ground water quality at the processing site have remained relatively constant over time. Uranium concentrations in ground water continue to exceed the maximum concentration limits, providing the best indication of the extent of contaminated ground water. Evaluation of the surface water quality of the Colorado River indicates no impact from uranium processing activities. No compliance monitoring at the Cheney disposal site has been proposed because ground water in the Dakota Sandstone (uppermost aquifer) is classified as limited-use (Class III) and because the disposal cell is hydrogeologically isolated from the uppermost aquifer. The following water sampling and water level monitoring activities are planned for calendar year 1994: (i) Semiannual (early summer and late fall) sampling of six existing monitor wells at the former Grand Junction processing site. Analytical results from this sampling will be used to continue characterizing hydrogeochemical trends in background ground water quality and in the contaminated ground water area resulting from source term (tailings) removal. (ii) Water level monitoring of approximately three proposed monitor wells projected to be installed in the alluvium at the processing site in September 1994. Data loggers will be installed in these wells, and water levels will be electronically monitored six times a day. These long-term, continuous ground water level data will be collected to better understand the relationship between surface and ground water at the site. Water level and water quality data eventually will be used in future ground water modeling to establish boundary conditions in the vicinity of the Grand Junction processing site. Modeling results will be used to help demonstrate and document the potential remedial alternative of natural flushing

  20. Tank 241-Z-361 vapor sampling and analysis plan

    Energy Technology Data Exchange (ETDEWEB)

    BANNING, D.L.

    1999-02-23

    Tank 241-Z-361 is identified in the Hanford Federal Facility Agreement and Consent Order (commonly referred to as the Tri-Party Agreement), Appendix C, (Ecology et al. 1994) as a unit to be remediated under the authority of the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA). As such, the U.S. Environmental Protection Agency will serve as the lead regulatory agency for remediation of this tank under the CERCLA process. At the time this unit was identified as a CERCLA site under the Tri-Party Agreement, it was placed within the 200-ZP-2 Operable Unit. In 1997, The Tri-parties redefined 200 Area Operable Units into waste groupings (Waste Site Grouping for 200 Areas Soils Investigations [DOE-RL 1992 and 1997]). A waste group contains waste sites that share similarities in geological conditions, function, and types of waste received. Tank 241-Z-361 is identified within the CERCLA Plutonium/Organic-rich Process Condensate/Process Waste Group (DOE-RL 1992). The Plutonium/Organic-rich Process Condensate/Process Waste Group has been prioritized for remediation beginning in the year 2004. Results of Tank 216-Z-361 sampling and analysis described in this Sampling and Analysis Plan (SAP) and in the SAP for sludge sampling (to be developed) will determine whether expedited response actions are required before 2004 because of the hazards associated with tank contents. Should data conclude that remediation of this tank should occur earlier than is planned for the other sites in the waste group, it is likely that removal alternatives will be analyzed in a separate Engineering Evaluation/Cost Analysis (EE/CA). Removal actions would proceed after the U.S. Environmental Protection Agency (EPA) signs an Action Memorandum describing the selected removal alternative for Tank 216-Z-361. If the data conclude that there is no immediate threat to human health and the environment from this tank, remedial actions for the tank will be defined in a

  1. Tank 241-U-105 push mode core sampling and analysis plan

    International Nuclear Information System (INIS)

    Bell, K.E.

    1995-01-01

    This Sampling and Analysis Plan (SAP) will identify characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for vapor samples and two push mode core samples from tank 241-U-105 (U-105)

  2. 7 CFR 52.38 - Sampling plans and procedures for determining lot compliance.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Sampling plans and procedures for determining lot... Sampling § 52.38 Sampling plans and procedures for determining lot compliance. (a) Except as otherwise... Administrator, samples shall be selected from each lot in the exact number of sample units indicated for the lot...

  3. Binomial distribution for the charge asymmetry parameter

    International Nuclear Information System (INIS)

    Chou, T.T.; Yang, C.N.

    1984-01-01

    It is suggested that for high energy collisions the distribution with respect to the charge asymmetry z = nsub(F) - nsub(B) is binomial, where nsub(F) and nsub(B) are the forward and backward charge multiplicities. (orig.)

  4. Binomial test models and item difficulty

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1979-01-01

    In choosing a binomial test model, it is important to know exactly what conditions are imposed on item difficulty. In this paper these conditions are examined for both a deterministic and a stochastic conception of item responses. It appears that they are more restrictive than is generally

  5. Binomial vs poisson statistics in radiation studies

    International Nuclear Information System (INIS)

    Foster, J.; Kouris, K.; Spyrou, N.M.; Matthews, I.P.; Welsh National School of Medicine, Cardiff

    1983-01-01

    The processes of radioactive decay, decay and growth of radioactive species in a radioactive chain, prompt emission(s) from nuclear reactions, conventional activation and cyclic activation are discussed with respect to their underlying statistical density function. By considering the transformation(s) that each nucleus may undergo it is shown that all these processes are fundamentally binomial. Formally, when the number of experiments N is large and the probability of success p is close to zero, the binomial is closely approximated by the Poisson density function. In radiation and nuclear physics, N is always large: each experiment can be conceived of as the observation of the fate of each of the N nuclei initially present. Whether p, the probability that a given nucleus undergoes a prescribed transformation, is close to zero depends on the process and nuclide(s) concerned. Hence, although a binomial description is always valid, the Poisson approximation is not always adequate. Therefore further clarification is provided as to when the binomial distribution must be used in the statistical treatment of detected events. (orig.)
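
The binomial-to-Poisson limit the abstract describes can be checked numerically. A minimal sketch with illustrative parameters (not taken from the paper), using only the Python standard library:

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    """Probability that exactly k of n nuclei undergo the transformation."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """Poisson approximation with rate lam = n * p."""
    return lam**k * exp(-lam) / factorial(k)

# N large, p close to zero: the Poisson approximation is adequate.
n, p = 10_000, 3e-4
for k in range(10):
    assert abs(binom_pmf(k, n, p) - poisson_pmf(k, n * p)) < 1e-3

# When p is not small, the two distributions differ noticeably.
n, p = 20, 0.4
diff = max(abs(binom_pmf(k, n, p) - poisson_pmf(k, n * p)) for k in range(n + 1))
print(f"max pmf difference at p=0.4: {diff:.3f}")
```

This mirrors the abstract's point: the binomial description is always valid, but the Poisson approximation is only adequate when p is close to zero.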

  6. The Normal Distribution From Binomial to Normal

    Indian Academy of Sciences (India)

Resonance – Journal of Science Education; Volume 2, Issue 6. The Normal Distribution From Binomial to Normal. S Ramasubramanian. Series Article, June 1997, pp. 15-24. Permanent link: http://www.ias.ac.in/article/fulltext/reso/002/06/0015-0024

  7. Longitudinal beta-binomial modeling using GEE for overdispersed binomial data.

    Science.gov (United States)

    Wu, Hongqian; Zhang, Ying; Long, Jeffrey D

    2017-03-15

Longitudinal binomial data are frequently generated from multiple questionnaires and assessments in various scientific settings, and such data are often overdispersed. The standard generalized linear mixed effects model may severely underestimate the standard errors of estimated regression parameters in such cases and hence potentially bias the statistical inference. In this paper, we propose a longitudinal beta-binomial model for overdispersed binomial data and estimate the regression parameters under a probit model using the generalized estimating equation (GEE) method. A hybrid algorithm combining Fisher scoring and the method of moments is implemented for the computation. Extensive simulation studies are conducted to justify the validity of the proposed method. Finally, the proposed method is applied to analyze functional impairment in subjects who are at risk of Huntington disease, using data from a multisite observational study of prodromal Huntington disease. Copyright © 2016 John Wiley & Sons, Ltd.

  8. Simulation on Poisson and negative binomial models of count road accident modeling

    Science.gov (United States)

    Sapuan, M. S.; Razali, A. M.; Zamzuri, Z. H.; Ibrahim, K.

    2016-11-01

Accident count data often exhibit overdispersion and may also contain excess zeros. A simulation study was conducted to create scenarios in which accidents occur at a T-junction, with the dependent variable generated from either a Poisson or a negative binomial distribution at sample sizes ranging from n=30 to n=500. The study objective was accomplished by fitting Poisson regression, negative binomial regression, and hurdle negative binomial models to the simulated data. Model fit was compared across sample sizes, and the results show that not every model fits the data well even when the data are generated from that model's own distribution, especially at larger sample sizes. Furthermore, larger sample sizes yield more zero accident counts in the dataset.
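
As a rough illustration of the overdispersion such studies target, a negative binomial count can be simulated as a Gamma-mixed Poisson. This is a sketch with arbitrary parameters, not the paper's simulation design:

```python
import random
from math import exp
from statistics import mean, variance

random.seed(42)

def poisson(lam):
    """Knuth's method for sampling a Poisson variate (fine for small lam)."""
    L, k, p = exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def neg_binomial(r, p_success):
    """Gamma-Poisson mixture: an NB count is Poisson with a Gamma-distributed rate."""
    lam = random.gammavariate(r, (1 - p_success) / p_success)
    return poisson(lam)

n = 2000
pois_counts = [poisson(2.0) for _ in range(n)]
nb_counts = [neg_binomial(1.0, 1 / 3) for _ in range(n)]  # mean 2, variance 6

# Poisson: variance tracks the mean; negative binomial: variance exceeds it.
print("Poisson  mean/var:", round(mean(pois_counts), 2), round(variance(pois_counts), 2))
print("NB       mean/var:", round(mean(nb_counts), 2), round(variance(nb_counts), 2))
```

Fitting both models to such data (e.g., with a regression package) then shows why a variance-equals-mean model is a poor fit for overdispersed counts.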

  9. Expansion around half-integer values, binomial sums, and inverse binomial sums

    International Nuclear Information System (INIS)

    Weinzierl, Stefan

    2004-01-01

    I consider the expansion of transcendental functions in a small parameter around rational numbers. This includes in particular the expansion around half-integer values. I present algorithms which are suitable for an implementation within a symbolic computer algebra system. The method is an extension of the technique of nested sums. The algorithms allow in addition the evaluation of binomial sums, inverse binomial sums and generalizations thereof

  10. Group Acceptance Sampling Plan for Lifetime Data Using Generalized Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Muhammad Aslam

    2010-02-01

Full Text Available In this paper, a group acceptance sampling plan (GASP) is introduced for situations in which the lifetime of the items follows the generalized Pareto distribution. The design parameters, such as minimum group size and acceptance number, are determined when the consumer's risk and the test termination time are specified. The proposed sampling plan is compared with the existing sampling plan and is found to perform better in terms of the minimum sample size required to reach the same decision.
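
The acceptance-number logic common to such plans can be sketched with a single-sampling operating characteristic (OC) computation. The plan below (n=50, c=2) is hypothetical, not the GASP design from the paper:

```python
from math import comb

def accept_prob(n, c, p):
    """OC curve point: probability of accepting a lot with defect rate p
    when n items are sampled and at most c defectives are allowed."""
    return sum(comb(n, d) * p**d * (1 - p)**(n - d) for d in range(c + 1))

# Hypothetical plan: sample 50 items, accept the lot if <= 2 defectives.
for p in (0.01, 0.05, 0.10):
    print(f"p={p:.2f}  P(accept)={accept_prob(50, 2, p):.3f}")
```

Producer's and consumer's risks are read off this curve: the producer's risk is 1 - P(accept) at an acceptable quality level, and the consumer's risk is P(accept) at a rejectable quality level.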

  11. Tomography of binomial states of the radiation field

    NARCIS (Netherlands)

Bazrafkan, MR; Man'ko

    2004-01-01

    The symplectic, optical, and photon-number tomographic symbols of binomial states of the radiation field are studied. Explicit relations for all tomograms of the binomial states are obtained. Two measures for nonclassical properties of these states are discussed.

  12. A New System of Skip-Lot Sampling Plans including Resampling

    Science.gov (United States)

    Jun, Chi-Hyuck

    2014-01-01

    Skip-lot sampling plans have been widely used in industries to reduce the inspection efforts when products have good quality records. These schemes are known as economically advantageous and useful to minimize the cost of the inspection of the final lots. A new system of skip-lot sampling plan called SkSP-R is proposed in this paper. The performance measures for the proposed SkSP-R plan are derived using the Markov chain formulation. The proposed plan is found to be more efficient than the single sampling plan and the SkSP-2 plan. PMID:24574871

  13. A binomial random sum of present value models in investment analysis

    OpenAIRE

    Βουδούρη, Αγγελική; Ντζιαχρήστος, Ευάγγελος

    1997-01-01

    Stochastic present value models have been widely adopted in financial theory and practice and play a very important role in capital budgeting and profit planning. The purpose of this paper is to introduce a binomial random sum of stochastic present value models and offer an application in investment analysis.

  14. Confidence Intervals for Asbestos Fiber Counts: Approximate Negative Binomial Distribution.

    Science.gov (United States)

    Bartley, David; Slaven, James; Harper, Martin

    2017-03-01

    The negative binomial distribution is adopted for analyzing asbestos fiber counts so as to account for both the sampling errors in capturing only a finite number of fibers and the inevitable human variation in identifying and counting sampled fibers. A simple approximation to this distribution is developed for the derivation of quantiles and approximate confidence limits. The success of the approximation depends critically on the use of Stirling's expansion to sufficient order, on exact normalization of the approximating distribution, on reasonable perturbation of quantities from the normal distribution, and on accurately approximating sums by inverse-trapezoidal integration. Accuracy of the approximation developed is checked through simulation and also by comparison to traditional approximate confidence intervals in the specific case that the negative binomial distribution approaches the Poisson distribution. The resulting statistics are shown to relate directly to early research into the accuracy of asbestos sampling and analysis. Uncertainty in estimating mean asbestos fiber concentrations given only a single count is derived. Decision limits (limits of detection) and detection limits are considered for controlling false-positive and false-negative detection assertions and are compared to traditional limits computed assuming normal distributions. Published by Oxford University Press on behalf of the British Occupational Hygiene Society 2017.

  15. Jefferson Proving Ground Site-Specific Sampling Design Plan

    National Research Council Canada - National Science Library

    1992-01-01

    The purpose of this document is to outline field sampling and laboratory analyses that are to be conducted as part of the Jefferson Proving Ground Site-Specific Sampling and Analysis (SSSA) program...

  16. Comparison of the efficiency between two sampling plans for aflatoxins analysis in maize

    Science.gov (United States)

    Mallmann, Adriano Olnei; Marchioro, Alexandro; Oliveira, Maurício Schneider; Rauber, Ricardo Hummes; Dilkin, Paulo; Mallmann, Carlos Augusto

    2014-01-01

Variance and performance of two sampling plans for aflatoxin quantification in maize were evaluated. Eight lots of maize were sampled using two plans: manual, using a sampling spear for kernels; and automatic, using a continuous flow to collect milled maize. Total variance and the sampling, preparation, and analysis variances were determined and compared between plans through multifactor analysis of variance. Four theoretical distribution models were used to compare aflatoxin quantification distributions in the eight maize lots. The acceptance and rejection probabilities for a lot at a given aflatoxin concentration were determined using the variance and the selected distribution model to build the operating characteristic (OC) curves. Sampling and total variances were lower for the automatic plan, and its OC curve reduced both consumer and producer risks in comparison to the manual plan. The automatic plan is more efficient than the manual one because it more accurately reflects the real aflatoxin contamination in maize. PMID:24948911

  17. Pooling overdispersed binomial data to estimate event rate.

    Science.gov (United States)

    Young-Xu, Yinong; Chan, K Arnold

    2008-08-19

    The beta-binomial model is one of the methods that can be used to validly combine event rates from overdispersed binomial data. Our objective is to provide a full description of this method and to update and broaden its applications in clinical and public health research. We describe the statistical theories behind the beta-binomial model and the associated estimation methods. We supply information about statistical software that can provide beta-binomial estimations. Using a published example, we illustrate the application of the beta-binomial model when pooling overdispersed binomial data. In an example regarding the safety of oral antifungal treatments, we had 41 treatment arms with event rates varying from 0% to 13.89%. Using the beta-binomial model, we obtained a summary event rate of 3.44% with a standard error of 0.59%. The parameters of the beta-binomial model took the values of 1.24 for alpha and 34.73 for beta. The beta-binomial model can provide a robust estimate for the summary event rate by pooling overdispersed binomial data from different studies. The explanation of the method and the demonstration of its applications should help researchers incorporate the beta-binomial method as they aggregate probabilities of events from heterogeneous studies.
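
The summary rate in the abstract can be recovered from the moments of the fitted Beta mixing distribution. A minimal sketch using the alpha and beta values reported above (1.24 and 34.73); the result agrees with the reported 3.44% up to rounding and estimation method:

```python
# Moments of the fitted Beta mixing distribution from the abstract's example.
alpha, beta = 1.24, 34.73

# Mean of Beta(alpha, beta): the pooled summary event rate.
mean_rate = alpha / (alpha + beta)

# Variance of Beta(alpha, beta): between-arm heterogeneity in the true rate.
var_rate = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))

print(f"summary event rate ~ {mean_rate:.2%}")
```

The nonzero variance of the mixing distribution is what distinguishes the beta-binomial pooling from a naive binomial pooling of all arms.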

  18. Pooling overdispersed binomial data to estimate event rate

    Directory of Open Access Journals (Sweden)

    Chan K Arnold

    2008-08-01

    Full Text Available Abstract Background The beta-binomial model is one of the methods that can be used to validly combine event rates from overdispersed binomial data. Our objective is to provide a full description of this method and to update and broaden its applications in clinical and public health research. Methods We describe the statistical theories behind the beta-binomial model and the associated estimation methods. We supply information about statistical software that can provide beta-binomial estimations. Using a published example, we illustrate the application of the beta-binomial model when pooling overdispersed binomial data. Results In an example regarding the safety of oral antifungal treatments, we had 41 treatment arms with event rates varying from 0% to 13.89%. Using the beta-binomial model, we obtained a summary event rate of 3.44% with a standard error of 0.59%. The parameters of the beta-binomial model took the values of 1.24 for alpha and 34.73 for beta. Conclusion The beta-binomial model can provide a robust estimate for the summary event rate by pooling overdispersed binomial data from different studies. The explanation of the method and the demonstration of its applications should help researchers incorporate the beta-binomial method as they aggregate probabilities of events from heterogeneous studies.

  19. Number-Phase Wigner Representation and Entropic Uncertainty Relations for Binomial and Negative Binomial States

    International Nuclear Information System (INIS)

    Amitabh, J.; Vaccaro, J.A.; Hill, K.E.

    1998-01-01

We study the recently defined number-phase Wigner function S_NP(n,θ) for a single-mode field considered to be in binomial and negative binomial states. These states interpolate between Fock and coherent states and between coherent and quasi-thermal states, respectively, and thus provide a set of states with properties ranging from uncertain phase and sharp photon number to sharp phase and uncertain photon number. The distribution function S_NP(n,θ) gives a graphical representation of the complementary nature of the number and phase properties of these states. We highlight important differences between Wigner's quasi-probability function, which is associated with the position and momentum observables, and S_NP(n,θ), which is associated directly with the photon number and phase observables. We also discuss the number-phase entropic uncertainty relation for the binomial and negative binomial states, and we show that negative binomial states give a lower phase entropy than states which minimize the phase variance

  20. Background Information for the Nevada National Security Site Integrated Sampling Plan, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    Farnham, Irene; Marutzky, Sam

    2014-12-01

    This document describes the process followed to develop the Nevada National Security Site (NNSS) Integrated Sampling Plan (referred to herein as the Plan). It provides the Plan’s purpose and objectives, and briefly describes the Underground Test Area (UGTA) Activity, including the conceptual model and regulatory requirements as they pertain to groundwater sampling. Background information on other NNSS groundwater monitoring programs—the Routine Radiological Environmental Monitoring Plan (RREMP) and Community Environmental Monitoring Program (CEMP)—and their integration with the Plan are presented. Descriptions of the evaluations, comments, and responses of two Sampling Plan topical committees are also included.

  1. Wildlife Conservation Planning Using Stochastic Optimization and Importance Sampling

    Science.gov (United States)

    Robert G. Haight; Laurel E. Travis

    1997-01-01

    Formulations for determining conservation plans for sensitive wildlife species must account for economic costs of habitat protection and uncertainties about how wildlife populations will respond. This paper describes such a formulation and addresses the computational challenge of solving it. The problem is to determine the cost-efficient level of habitat protection...

  2. UMTRA project water sampling and analysis plan, Naturita, Colorado

    International Nuclear Information System (INIS)

    1994-04-01

    Surface remedial action is scheduled to begin at the Naturita UMTRA Project processing site in the spring of 1994. No water sampling was performed during 1993 at either the Naturita processing site (NAT-01) or the Dry Flats disposal site (NAT-12). Results of previous water sampling at the Naturita processing site indicate that ground water in the alluvium is contaminated as a result of uranium processing activities. Baseline ground water conditions have been established in the uppermost aquifer at the Dry Flats disposal site. Water sampling activities scheduled for April 1994 include preconstruction sampling of selected monitor wells at the processing site, surface water sampling of the San Miguel River, sampling of several springs/seeps in the vicinity of the disposal site, and sampling of two monitor wells in Coke Oven Valley. The monitor well locations provide sampling points to characterize ground water quality and flow conditions in the vicinity of the sites. The list of analytes has been updated to reflect constituents related to uranium processing activities and the parameters needed for geochemical evaluation. Water sampling will be conducted annually at minimum during the period of construction activities

  3. UMTRA Project water sampling and analysis plan, Grand Junction, Colorado. Revision 1, Version 6

    International Nuclear Information System (INIS)

    1995-09-01

    This water sampling and analysis plan describes the planned, routine ground water sampling activities at the Grand Junction US DOE Uranium Mill Tailings Remedial Action (UMTRA) Project site (GRJ-01) in Grand Junction, Colorado, and at the Cheney Disposal Site (GRJ-03) near Grand Junction. The plan identifies and justifies the sampling locations, analytical parameters, detection limits, and sampling frequencies for the routine monitoring stations at the sites. Regulatory basis is in the US EPA regulations in 40 CFR Part 192 (1994) and EPA ground water quality standards of 1995 (60 FR 2854). This plan summarizes results of past water sampling activities, details water sampling activities planned for the next 2 years, and projects sampling activities for the next 5 years

  4. Determination of Optimal Double Sampling Plan using Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Sampath Sundaram

    2012-03-01

Full Text Available Designing a double sampling plan requires identification of sample sizes and acceptance numbers. In this paper a genetic algorithm has been designed for the selection of optimal acceptance numbers and sample sizes for specified producer's and consumer's risks. Implementation of the algorithm is illustrated numerically for different choices of the quantities involved in a double sampling plan.

  1. Remedial investigation sampling and analysis plan for J-Field, Aberdeen Proving Ground, Maryland. Volume 1: Field Sampling Plan

    Energy Technology Data Exchange (ETDEWEB)

    Benioff, P.; Biang, R.; Dolak, D.; Dunn, C.; Martino, L.; Patton, T.; Wang, Y.; Yuen, C.

    1995-03-01

The Environmental Management Division (EMD) of Aberdeen Proving Ground (APG), Maryland, is conducting a remedial investigation and feasibility study (RI/FS) of the J-Field area at APG pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), as amended. J-Field is within the Edgewood Area of APG in Harford County, Maryland (Figure 1.1). Since World War II, activities in the Edgewood Area have included the development, manufacture, testing, and destruction of chemical agents and munitions. These materials were destroyed at J-Field by open burning and open detonation (OB/OD). Considerable archival information about J-Field exists as a result of efforts by APG staff to characterize the hazards associated with the site. Contamination of J-Field was first detected during an environmental survey of the Edgewood Area conducted in 1977 and 1978 by the US Army Toxic and Hazardous Materials Agency (USATHAMA) (predecessor to the US Army Environmental Center [AEC]). As part of a subsequent USATHAMA environmental survey, 11 wells were installed and sampled at J-Field. Contamination at J-Field was also detected during a munitions disposal survey conducted by Princeton Aqua Science in 1983. The Princeton Aqua Science investigation involved the installation and sampling of nine wells and the collection and analysis of surficial and deep composite soil samples. In 1986, a Resource Conservation and Recovery Act (RCRA) permit (MD3-21-002-1355) requiring a basewide RCRA Facility Assessment (RFA) and a hydrogeologic assessment of J-Field was issued by the US Environmental Protection Agency (EPA). In 1987, the US Geological Survey (USGS) began a two-phased hydrogeologic assessment in which data were collected to model groundwater flow at J-Field.
Soil gas investigations were conducted, several well clusters were installed, a groundwater flow model was developed, and groundwater and surface water monitoring programs were established that continue today.

  2. Operable Unit 3-13, Group 3, Other Surface Soils (Phase II) Field Sampling Plan

    Energy Technology Data Exchange (ETDEWEB)

    G. L. Schwendiman

    2006-07-27

    This Field Sampling Plan describes the Operable Unit 3-13, Group 3, Other Surface Soils, Phase II remediation field sampling activities to be performed at the Idaho Nuclear Technology and Engineering Center located within the Idaho National Laboratory Site. Sampling activities described in this plan support characterization sampling of new sites, real-time soil spectroscopy during excavation, and confirmation sampling that verifies that the remedial action objectives and remediation goals presented in the Final Record of Decision for Idaho Nuclear Technology and Engineering Center, Operable Unit 3-13 have been met.

  3. Failure-censored accelerated life test sampling plans for Weibull distribution under expected test time constraint

    International Nuclear Information System (INIS)

    Bai, D.S.; Chun, Y.R.; Kim, J.G.

    1995-01-01

    This paper considers the design of life-test sampling plans based on failure-censored accelerated life tests. The lifetime distribution of products is assumed to be Weibull with a scale parameter that is a log linear function of a (possibly transformed) stress. Two levels of stress higher than the use condition stress, high and low, are used. Sampling plans with equal expected test times at high and low test stresses which satisfy the producer's and consumer's risk requirements and minimize the asymptotic variance of the test statistic used to decide lot acceptability are obtained. The properties of the proposed life-test sampling plans are investigated

  4. Modeling Tetanus Neonatorum case using the regression of negative binomial and zero-inflated negative binomial

    Science.gov (United States)

    Amaliana, Luthfatul; Sa'adah, Umu; Wayan Surya Wardhani, Ni

    2017-12-01

Tetanus Neonatorum is an infectious disease that can be prevented by immunization. The number of Tetanus Neonatorum cases in East Java Province was the highest in Indonesia through 2015. The Tetanus Neonatorum data exhibit overdispersion and a large proportion of zeros. Negative binomial (NB) regression is an alternative method when overdispersion is present in Poisson regression. However, data containing both overdispersion and zero-inflation are more appropriately analyzed using zero-inflated negative binomial (ZINB) regression. The purposes of this study are: (1) to model Tetanus Neonatorum cases in East Java Province, with a 71.05 percent proportion of zeros, using NB and ZINB regression, and (2) to obtain the best model. The results indicate that ZINB is better than NB regression, with a smaller AIC.
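
The difference between the two count models above comes down to how they place probability mass at zero. A sketch with illustrative parameters (pi, r, p are not estimates from the study):

```python
from math import comb

def nb_pmf(k, r, p):
    """Negative binomial NB(r, p): probability of observing k events."""
    return comb(k + r - 1, k) * (1 - p)**k * p**r

def zinb_pmf(k, pi, r, p):
    """Zero-inflated NB: extra mass pi at zero on top of the NB count process."""
    base = nb_pmf(k, r, p)
    return pi + (1 - pi) * base if k == 0 else (1 - pi) * base

# With ~70% structural zeros (in the spirit of the abstract's 71.05%),
# ZINB puts far more mass at zero than plain NB with the same count parameters.
print(round(nb_pmf(0, 2, 0.5), 3), round(zinb_pmf(0, 0.7, 2, 0.5), 3))
```

When the observed zero fraction greatly exceeds what NB alone can produce, the zero-inflated variant typically wins on AIC, as the study reports.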

  5. Discovering Binomial Identities with PascGaloisJE

    Science.gov (United States)

    Evans, Tyler J.

    2008-01-01

    We describe exercises in which students use PascGaloisJE to formulate conjectures about certain binomial identities which hold when the binomial coefficients are interpreted as elements in the cyclic group Z[subscript p] of integers modulo a prime integer "p". In addition to having an appealing visual component, these exercises are open-ended and…

  6. Wigner Function of Density Operator for Negative Binomial Distribution

    International Nuclear Information System (INIS)

    Xu Xinglei; Li Hongqi

    2008-01-01

    By using the technique of integration within an ordered product (IWOP) of operator we derive Wigner function of density operator for negative binomial distribution of radiation field in the mixed state case, then we derive the Wigner function of squeezed number state, which yields negative binomial distribution by virtue of the entangled state representation and the entangled Wigner operator

  7. Penggunaan Model Binomial Pada Penentuan Harga Opsi Saham Karyawan

    Directory of Open Access Journals (Sweden)

    Dara Puspita Anggraeni

    2015-11-01

Full Text Available Binomial Model for Valuing Employee Stock Options. Employee stock options (ESOs) differ from standard exchange-traded options. There are three main differences in a valuation model for employee stock options: the vesting period, the exit rate, and non-transferability. In this thesis, a model for valuing employee stock options is discussed and implemented with a generalized binomial model.
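
The lattice underlying such a valuation is the standard binomial tree. A minimal Cox-Ross-Rubinstein sketch for a plain European call; the thesis's generalized model layers vesting, exit rate, and non-transferability on top of this lattice, which are omitted here:

```python
from math import exp, sqrt

def binomial_call(S0, K, T, r, sigma, steps):
    """Cox-Ross-Rubinstein binomial tree price of a European call."""
    dt = T / steps
    u = exp(sigma * sqrt(dt))          # up factor
    d = 1 / u                          # down factor
    q = (exp(r * dt) - d) / (u - d)    # risk-neutral up-probability
    disc = exp(-r * dt)
    # Terminal payoffs, indexed by number of up moves j.
    values = [max(S0 * u**j * d**(steps - j) - K, 0.0) for j in range(steps + 1)]
    # Backward induction through the tree.
    for _ in range(steps):
        values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

print(round(binomial_call(100, 100, 1.0, 0.05, 0.2, 200), 2))
```

With 200 steps this converges close to the Black-Scholes value for the same inputs, which is the usual sanity check before adding ESO-specific features.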

  8. Media Exposure: How Models Simplify Sampling

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl

    1998-01-01

In media planning, the distribution of exposures to multiple ad spots in multiple media (print, TV, radio) is crucial to the evaluation of a campaign. If such information were to be sampled, it would only be possible in expensive panel studies (e.g., TV-meter panels). Alternatively, the distribution of exposures may be modelled statistically, using the beta distribution combined with the binomial distribution. Examples are given.
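
The beta-binomial combination described above can be written down directly. A sketch with hypothetical campaign parameters (10 spot opportunities, shape parameters a and b chosen for illustration):

```python
from math import comb, lgamma, exp

def beta_binom_pmf(k, n, a, b):
    """P(k exposures out of n spot opportunities) when individual exposure
    probabilities follow a Beta(a, b) mixing distribution."""
    log_beta = lambda x, y: lgamma(x) + lgamma(y) - lgamma(x + y)
    return comb(n, k) * exp(log_beta(k + a, n - k + b) - log_beta(a, b))

# Hypothetical campaign: 10 spots, heterogeneous audience (a=0.8, b=2.0).
dist = [beta_binom_pmf(k, 10, 0.8, 2.0) for k in range(11)]
print([round(p, 3) for p in dist])
```

The Beta mixing captures audience heterogeneity (heavy and light viewers), which a plain binomial with one shared exposure probability cannot.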

  9. Group SkSP-R sampling plan for accelerated life tests

    Indian Academy of Sciences (India)

    This study presents a group skip-lot sampling plan using resampling (SkSP-R) for accelerated life tests. It is assumed that the lifetime of a product follows Weibull distribution with known shape parameter under the use condition, while the scale parameter can be obtained from acceleration factor. The plan parameters ...

  10. Group SkSP-R sampling plan for accelerated life tests

    Indian Academy of Sciences (India)

    Muhammad Aslam

    2017-09-15

Sep 15, 2017 ... accelerated life tests. There is a lack of study in the literature on the design of SkSP-R using a group sampling plan under accelerated life testing as a reference plan. In this paper, we focus on the design of the SkSP-R accelerated life test by assuming that the lifetime of a product follows the Weibull distribution. ...

  11. Engineering Task Plan to Expand the Environmental Operational Envelope of Core Sampling

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    This Engineering Task Plan authorizes the development of an Alternative Generation and Analysis (AGA). The AGA will determine how to expand the environmental operating envelope during core sampling operations

  12. Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites

    Energy Technology Data Exchange (ETDEWEB)

    None

    2012-10-24

    This plan incorporates U.S. Department of Energy (DOE) Office of Legacy Management (LM) standard operating procedures (SOPs) into environmental monitoring activities and will be implemented at all sites managed by LM. This document provides detailed procedures for the field sampling teams so that samples are collected in a consistent and technically defensible manner. Site-specific plans (e.g., long-term surveillance and maintenance plans, environmental monitoring plans) document background information and establish the basis for sampling and monitoring activities. Information will be included in site-specific tabbed sections to this plan, which identify sample locations, sample frequencies, types of samples, field measurements, and associated analytes for each site. Additionally, within each tabbed section, program directives will be included, when developed, to establish additional site-specific requirements to modify or clarify requirements in this plan as they apply to the corresponding site. A flowchart detailing project tasks required to accomplish routine sampling is displayed in Figure 1. LM environmental procedures are contained in the Environmental Procedures Catalog (LMS/PRO/S04325), which incorporates American Society for Testing and Materials (ASTM), DOE, and U.S. Environmental Protection Agency (EPA) guidance. Specific procedures used for groundwater and surface water monitoring are included in Appendix A. If other environmental media are monitored, SOPs used for air, soil/sediment, and biota monitoring can be found in the site-specific tabbed sections in Appendix D or in site-specific documents. The procedures in the Environmental Procedures Catalog are intended as general guidance and require additional detail from planning documents in order to be complete; the following sections fulfill that function and specify additional procedural requirements to form SOPs. Routine revision of this Sampling and Analysis Plan will be conducted annually at the

  13. Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites

    International Nuclear Information System (INIS)

    2012-01-01

    This plan incorporates U.S. Department of Energy (DOE) Office of Legacy Management (LM) standard operating procedures (SOPs) into environmental monitoring activities and will be implemented at all sites managed by LM. This document provides detailed procedures for the field sampling teams so that samples are collected in a consistent and technically defensible manner. Site-specific plans (e.g., long-term surveillance and maintenance plans, environmental monitoring plans) document background information and establish the basis for sampling and monitoring activities. Information will be included in site-specific tabbed sections to this plan, which identify sample locations, sample frequencies, types of samples, field measurements, and associated analytes for each site. Additionally, within each tabbed section, program directives will be included, when developed, to establish additional site-specific requirements to modify or clarify requirements in this plan as they apply to the corresponding site. A flowchart detailing project tasks required to accomplish routine sampling is displayed in Figure 1. LM environmental procedures are contained in the Environmental Procedures Catalog (LMS/PRO/S04325), which incorporates American Society for Testing and Materials (ASTM), DOE, and U.S. Environmental Protection Agency (EPA) guidance. Specific procedures used for groundwater and surface water monitoring are included in Appendix A. If other environmental media are monitored, SOPs used for air, soil/sediment, and biota monitoring can be found in the site-specific tabbed sections in Appendix D or in site-specific documents. The procedures in the Environmental Procedures Catalog are intended as general guidance and require additional detail from planning documents in order to be complete; the following sections fulfill that function and specify additional procedural requirements to form SOPs. Routine revision of this Sampling and Analysis Plan will be conducted annually at the

  14. Ground-water sample collection and analysis plan for the ground-water surveillance project

    International Nuclear Information System (INIS)

    Bryce, R.W.; Evans, J.C.; Olsen, K.B.

    1991-12-01

    The Pacific Northwest Laboratory performs ground-water sampling activities at the US Department of Energy's (DOE's) Hanford Site in support of DOE's environmental surveillance responsibilities. The purpose of this document is to translate DOE's General Environmental Protection Program (DOE Order 5400.1) into a comprehensive ground-water sample collection and analysis plan for the Hanford Site. This sample collection and analysis plan sets forth the environmental surveillance objectives applicable to ground water, identifies the strategy for selecting sample collection locations, and lists the analyses to be performed to meet those objectives

  15. Planning Considerations Related to Collecting and Analyzing Samples of the Martian Soils

    Science.gov (United States)

    Liu, Yang; Mellon, Mike T.; Ming, Douglas W.; Morris, Richard V.; Noble, Sarah K.; Sullivan, Robert J.; Taylor, Lawrence A.; Beaty, David W.

    2014-01-01

The Mars Sample Return (MSR) End-to-End International Science Analysis Group (E2E-iSAG [1]) established scientific objectives associated with Mars returned-sample science that require the return and investigation of one or more soil samples. Soil is defined here as loose, unconsolidated materials with no implication for the presence or absence of organic components. The proposed Mars 2020 (M-2020) rover is likely to collect and cache soil in addition to rock samples [2], which could be followed by future sample retrieval and return missions. Here we discuss key scientific considerations for sampling and caching soil samples on the proposed M-2020 rover, as well as the state in which samples would need to be preserved when received by analysts on Earth. We are seeking feedback on these draft plans as input to mission requirement formulation. A related planning exercise on rocks is reported in an accompanying abstract [3].

  16. Marginalized zero-inflated negative binomial regression with application to dental caries.

    Science.gov (United States)

    Preisser, John S; Das, Kalyan; Long, D Leann; Divaris, Kimon

    2016-05-10

    The zero-inflated negative binomial regression model (ZINB) is often employed in diverse fields such as dentistry, health care utilization, highway safety, and medicine to examine relationships between exposures of interest and overdispersed count outcomes exhibiting many zeros. The regression coefficients of ZINB have latent class interpretations for a susceptible subpopulation at risk for the disease/condition under study with counts generated from a negative binomial distribution and for a non-susceptible subpopulation that provides only zero counts. The ZINB parameters, however, are not well-suited for estimating overall exposure effects, specifically, in quantifying the effect of an explanatory variable in the overall mixture population. In this paper, a marginalized zero-inflated negative binomial regression (MZINB) model for independent responses is proposed to model the population marginal mean count directly, providing straightforward inference for overall exposure effects based on maximum likelihood estimation. Through simulation studies, the finite sample performance of MZINB is compared with marginalized zero-inflated Poisson, Poisson, and negative binomial regression. The MZINB model is applied in the evaluation of a school-based fluoride mouthrinse program on dental caries in 677 children. Copyright © 2015 John Wiley & Sons, Ltd.
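To make the latent-class versus marginal distinction concrete, the contrast can be sketched in a few lines of pure Python (our illustration, not the paper's code; the values of pi, mu, and k are invented): the susceptible subpopulation has negative binomial mean mu, but the overall population mean targeted by a marginalized model is (1 - pi) * mu.

```python
import math, random

random.seed(42)

def poisson(lam):
    """Knuth's Poisson sampler; fine for the small means used here."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= L:
            return k - 1

def zinb(pi, mu, k):
    """Zero-inflated NB: structural zero with probability pi, otherwise
    an NB(mu, k) count drawn as a Poisson-gamma mixture."""
    if random.random() < pi:
        return 0
    lam = random.gammavariate(k, mu / k)   # shape k, scale mu/k -> mean mu
    return poisson(lam)

pi, mu, k = 0.3, 4.0, 2.0                  # invented illustrative values
draws = [zinb(pi, mu, k) for _ in range(20000)]
marginal_mean = sum(draws) / len(draws)
# The latent-class (susceptible) mean is mu = 4.0, but the overall
# population mean that a marginalized model parameterizes directly is
# (1 - pi) * mu = 2.8.
print(round(marginal_mean, 2))
```

Latent-class ZINB coefficients describe mu for the susceptible class only; the marginalized parameterization models the (1 - pi) * mu = 2.8 directly, which is what an overall exposure effect refers to.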

  17. UMTRA Project water sampling and analysis plan, Durango, Colorado. Revision 1

    International Nuclear Information System (INIS)

    1995-09-01

    Planned, routine ground water sampling activities at the US Department of Energy (DOE) Uranium Mill Tailings Remedial Action (UMTRA) Project site in Durango, Colorado, are described in this water sampling and analysis plan. The plan identifies and justifies the sampling locations, analytical parameters, detection limits, and sampling frequency for the routine monitoring stations at the site. The ground water data are used to characterize the site ground water compliance strategies and to monitor contaminants of potential concern identified in the baseline risk assessment (DOE, 1995a). Regulatory basis for routine ground water monitoring at UMTRA Project sites is derived from the US EPA regulations in 40 CFR Part 192 (1994) and EPA standards of 1995 (60 FR 2854). Sampling procedures are guided by the UMTRA Project standard operating procedures (SOP) (JEG, n.d.), the Technical Approach Document (TAD) (DOE, 1989), and the most effective technical approach for the site

  18. UMTRA Project water sampling and analysis plan, Gunnison, Colorado: Revision 1

    International Nuclear Information System (INIS)

    1994-11-01

    This water sampling and analysis plan summarizes the results of previous water sampling activities and the plan for future water sampling activities, in accordance with the Guidance Document for Preparing Sampling and Analysis Plans for UMTRA Sites. A buffer zone monitoring plan for the Dos Rios Subdivision is included as an appendix. The buffer zone monitoring plan was developed to ensure continued protection to the public from residual contamination. The buffer zone is beyond the area depicted as contaminated ground water due to former milling operations. Surface remedial action at the Gunnison Uranium Mill Tailings Remedial Action Project site began in 1992; completion is expected in 1995. Ground water and surface water will be sampled semiannually at the Gunnison processing site and disposal site. Results of previous water sampling at the Gunnison processing site indicate that ground water in the alluvium is contaminated by the former uranium processing activities. Background ground water conditions have been established in the uppermost aquifer at the Gunnison disposal site. The monitor well locations provide a representative distribution of sampling points to characterize ground water quality and ground water flow conditions in the vicinity of the sites. The list of analytes has been modified with time to reflect constituents that are related to uranium processing activities and the parameters needed for geochemical evaluation

  19. Engineering task plan for the development, fabrication and installation of rotary mode core sample truck bellows

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

The Rotary Mode Core Sampling Trucks (RMCSTs) currently use a multi-sectioned bellows between the grapple box and the quill rod to compensate for drill head motion and to provide a path for purge gas. The current bellows, which is detailed on drawing H-2-690059, is expensive to procure, has a lengthy procurement cycle, and is prone to failure. Therefore, a task has been identified to design, fabricate, and install a replacement bellows. This Engineering Task Plan (ETP) is the management plan document for accomplishing the identified tasks. Any changes in scope of the ETP shall require formal direction by the Characterization Engineering manager. This document shall also be considered the work planning document for developmental control per Development Control Requirements (HNF 1999a). This ETP is the management plan document for accomplishing the design, fabrication, and installation of a replacement bellows assembly for Rotary Mode Core Sampling Trucks 3 and 4 (RMCSTs)

  20. 50 CFR 260.61 - Sampling plans and procedures for determining lot compliance.

    Science.gov (United States)

    2010-10-01

    ... determining lot compliance. 260.61 Section 260.61 Wildlife and Fisheries NATIONAL MARINE FISHERIES SERVICE... Sampling plans and procedures for determining lot compliance. (a) Except as otherwise provided for in this... shall be selected from each lot in the exact number of sample units indicated for the lot size in the...

  1. 10 CFR Appendix A to Subpart U of... - Sampling Plan for Enforcement Testing of Electric Motors

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Sampling Plan for Enforcement Testing of Electric Motors A Appendix A to Subpart U of Part 431 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY EFFICIENCY... mean energy efficiency of the first sample (X 1) is equal to or greater than the lower control limit...

  2. Sampling and analysis plan for the former Atomic Energy Commission bus lot property

    Energy Technology Data Exchange (ETDEWEB)

    Nielson, R.R.

    1998-07-01

This sampling and analysis plan (SAP) presents the rationale and strategy for the sampling and analysis activities proposed in support of an initial investigation of the former Atomic Energy Commission (AEC) bus lot property currently owned by Battelle Memorial Institute. The purpose of the proposed sampling and analysis activity is to investigate the potential for contamination above established action levels. The SAP will provide defensible data of sufficient quality and quantity to support recommendations of whether any further action within the study area is warranted. To assist in preparing sampling plans and reports, the Washington State Department of Ecology (Ecology) has published Guidance on Sampling and Data Analysis Methods. To specifically address sampling plans for petroleum-contaminated sites, Ecology has also published Guidance for Remediation of Petroleum Contaminated Sites. Both documents were used as guidance in preparing this plan. In 1992, a soil sample was taken within the current study area as part of a project to remove two underground storage tanks (USTs) at Battelle's Sixth Street Warehouse Petroleum Dispensing Station (Section 1.3). The results showed that the sample contained elevated levels of total petroleum hydrocarbons (TPH) in the heavy distillate range. This current study was initiated in part as a result of that discovery. The following topics are considered: the historical background of the site, current site conditions, previous investigations performed at the site, an evaluation based on the available data, and the contaminants of potential concern (COPC).

  3. Reachable Distance Space: Efficient Sampling-Based Planning for Spatially Constrained Systems

    KAUST Repository

    Xinyu Tang,

    2010-01-25

Motion planning for spatially constrained robots is difficult due to additional constraints placed on the robot, such as closure constraints for closed chains or requirements on end-effector placement for articulated linkages. It is usually computationally too expensive to apply sampling-based planners to these problems since it is difficult to generate valid configurations. We overcome this challenge by redefining the robot's degrees of freedom and constraints into a new set of parameters, called reachable distance space (RD-space), in which all configurations lie in the set of constraint-satisfying subspaces. This enables us to directly sample the constrained subspaces with complexity linear in the number of the robot's degrees of freedom. In addition to supporting efficient sampling of configurations, we show that the RD-space formulation naturally supports planning and, in particular, we design a local planner suitable for use by sampling-based planners. We demonstrate the effectiveness and efficiency of our approach for several systems including closed chain planning with multiple loops, restricted end-effector sampling, and on-line planning for drawing/sculpting. We can sample single-loop closed chain systems with 1,000 links in time comparable to open chain sampling, and we can generate samples for 1,000-link multi-loop systems of varying topologies in less than a second. © 2010 The Author(s).
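The reachable-distance idea can be sketched for a kinematic chain reduced to end-to-end distances. The pure-Python toy below (our own illustration, not the authors' implementation) recursively samples mutually consistent subchain distances for a 1,000-link closed chain: each divide-and-conquer step draws distances only from intervals where the triangle inequality can still be satisfied, which is what confines samples to the constraint-satisfying subspace.

```python
import random

random.seed(7)

def interval(links):
    """Reachable end-to-end distance interval [lo, hi] of a subchain."""
    total, biggest = sum(links), max(links)
    return (max(0.0, 2 * biggest - total), total)

def sample_rd(links, d, out):
    """Recursively sample subchain distances consistent with the target
    end-to-end distance d for this subchain (divide and conquer)."""
    if len(links) == 1:
        return                              # leaf: d is the link length
    mid = len(links) // 2
    left, right = links[:mid], links[mid:]
    (a1, a2), (b1, b2) = interval(left), interval(right)
    # Feasible range for the left distance so that a right distance exists.
    da = random.uniform(max(a1, b1 - d, d - b2), min(a2, b2 + d))
    db = random.uniform(max(b1, abs(d - da)), min(b2, d + da))
    out.append((da, db, d))
    sample_rd(left, da, out)
    sample_rd(right, db, out)

def sample_closed_chain(links):
    """Closure constraint in RD-space: the two halves of the loop must
    span the same distance."""
    mid = len(links) // 2
    (a1, a2), (b1, b2) = interval(links[:mid]), interval(links[mid:])
    t = random.uniform(max(a1, b1), min(a2, b2))
    out = [(t, t, 0.0)]
    sample_rd(links[:mid], t, out)
    sample_rd(links[mid:], t, out)
    return out

triples = sample_closed_chain([1.0] * 1000)
ok = all(da <= db + d + 1e-9 and db <= da + d + 1e-9 and d <= da + db + 1e-9
         for da, db, d in triples)
print(len(triples), ok)
```

Every rejection-free draw costs one uniform sample per subchain, illustrating the claimed complexity linear in the degrees of freedom; recovering Cartesian joint coordinates from these distances is a separate step not shown here.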

  4. Test plan for K Basin Sludge Canister and Floor Sampling Device

    International Nuclear Information System (INIS)

    Meling, T.A.

    1995-01-01

This document provides the test plan and procedure forms for conducting the functional and operational acceptance testing of the K Basin Sludge Canister and Floor Sampling Device(s). These samplers collect sludge from the floor of the 100K Basins and from 100K fuel storage canisters

  5. Sampling and analysis plan for the former Atomic Energy Commission bus lot property

    International Nuclear Information System (INIS)

    Nielson, R.R.

    1998-07-01

    This sampling and analysis plan (SAP) presents the rationale and strategy for the sampling and analysis activities proposed in support of an initial investigation of the former Atomic Energy Commission (AEC) bus lot property currently owned by Battelle Memorial Institute. The purpose of the proposed sampling and analysis activity is to investigate the potential for contamination above established action levels. The SAP will provide defensible data of sufficient quality and quantity to support recommendations of whether any further action within the study area is warranted. To assist in preparing sampling plans and reports, the Washington State Department of Ecology (Ecology) has published Guidance on Sampling and Data Analysis Methods. To specifically address sampling plans for petroleum-contaminated sites, Ecology has also published Guidance for Remediation of Petroleum Contaminated Sites. Both documents were used as guidance in preparing this plan. In 1992, a soil sample was taken within the current study area as part of a project to remove two underground storage tanks (USTs) at Battelle's Sixth Street Warehouse Petroleum Dispensing Station (Section 1.3). The results showed that the sample contained elevated levels of total petroleum hydrocarbons (TPH) in the heavy distillate range. This current study was initiated in part as a result of that discovery. The following topics are considered: the historical background of the site, current site conditions, previous investigations performed at the site, an evaluation based on the available data, and the contaminants of potential concern (COPC)

  6. Acceptance Sampling Plans Based on Truncated Life Tests for Sushila Distribution

    Directory of Open Access Journals (Sweden)

    Amer Ibrahim Al-Omari

    2018-03-01

An acceptance sampling plan problem based on truncated life tests, when the lifetime follows a Sushila distribution, is considered in this paper. For various acceptance numbers, confidence levels, and values of the ratio between the fixed experiment time and the specified mean lifetime, the minimum sample sizes required to ascertain a specified mean life were found. The operating characteristic function values of the suggested sampling plans and the producer's risk are presented. Some tables are provided and the results are illustrated by an example of a real data set.
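The minimum sample sizes in such plans come from a binomial tail condition: with acceptance number c, the lot is accepted if at most c of the n items fail before the truncation time. In the paper the failure probability p would be the Sushila distribution function evaluated at the truncation time; the sketch below (ours, with made-up numbers) simply treats p as given.

```python
from math import comb

def accept_prob(n, c, p):
    """P(at most c failures among n items), each item failing before the
    truncation time with probability p (binomial model)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(c + 1))

def min_sample_size(c, p, conf=0.95):
    """Smallest n such that a lot with failure probability p is accepted
    with probability at most 1 - conf."""
    n = c + 1
    while accept_prob(n, c, p) > 1 - conf:
        n += 1
    return n

# Illustrative: acceptance number c = 2, failure probability p = 0.25
# at the truncation time, 95% confidence.
print(min_sample_size(2, 0.25, 0.95))   # -> 23
```

Tables like those in the paper are just this computation repeated over a grid of acceptance numbers, confidence levels, and time/mean-life ratios.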

  7. Validity of the negative binomial distribution in particle production

    International Nuclear Information System (INIS)

    Cugnon, J.; Harouna, O.

    1987-01-01

    Some aspects of the clan picture for particle production in nuclear and in high-energy processes are examined. In particular, it is shown that the requirement of having logarithmic distribution for the number of particles within a clan in order to generate a negative binomial should not be taken strictly. Large departures are allowed without distorting too much the negative binomial. The question of the undetected particles is also studied. It is shown that, under reasonable circumstances, the latter do not affect the negative binomial character of the multiplicity distribution
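The statement being relaxed here — that logarithmic clan sizes compounded over a Poisson number of clans yield exactly a negative binomial — is easy to verify by simulation. A pure-Python sketch (illustrative parameters, not from the paper):

```python
import math, random

random.seed(11)

def log_sample(p):
    """Draw from the logarithmic distribution P(X=k) = -p**k / (k*ln(1-p))."""
    u, k = random.random(), 1
    pk = -p / math.log(1 - p)          # P(X = 1)
    cum = pk
    while u > cum:
        k += 1
        pk *= p * (k - 1) / k          # P(k) = P(k-1) * p * (k-1)/k
        cum += pk
    return k

def clan_event(lam, p):
    """Total multiplicity: Poisson number of clans, logarithmic clan sizes."""
    n_clans = 0
    L, prod = math.exp(-lam), random.random()   # Knuth Poisson sampler
    while prod > L:
        n_clans += 1
        prod *= random.random()
    return sum(log_sample(p) for _ in range(n_clans))

lam, p = 3.0, 0.5                      # invented illustrative values
draws = [clan_event(lam, p) for _ in range(20000)]
emp_mean = sum(draws) / len(draws)
# Exact compound Poisson-logarithmic => NB with shape k = -lam/ln(1-p)
# and mean k*p/(1-p):
nb_mean = lam * p / (-(1 - p) * math.log(1 - p))
print(round(emp_mean, 2), round(nb_mean, 2))
```

Swapping log_sample for a non-logarithmic clan-size law is exactly the kind of departure the abstract argues can still leave the multiplicity distribution close to a negative binomial.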

Zweifache Stichprobenpruefplaene fuer Qualitative und Quantitative Merkmale mit Minimaler Maximaler ASN (Double Sample Plans for Qualitative and Quantitative Features with Minimal Maximal Average Sample Number (ASN))

    National Research Council Canada - National Science Library

    Mueller, Kai

    1998-01-01

... with minimal maximal Average Sample Number (ASN) in chapter three. Chapter four features single and double variable sample plans and, as expected, chapter five provides the determination of these plans with minimal maximal ASN...

  9. Gas liquid sampling for closed canisters in KW Basin - test plan

    International Nuclear Information System (INIS)

    Pitkoff, C.C.

    1995-01-01

This document presents test procedures for the gas/liquid sampler. Characterization of the spent nuclear fuel (SNF) sealed in canisters at the KW Basin is needed to determine the condition of SNF stored wet. Samples of the liquid and the gas in the closed canisters will be taken to obtain characterization information. Sampling equipment has been designed to retrieve gas and liquid from the closed canisters in the KW Basin. This plan outlines the test requirements for this developmental sampling equipment

  10. Sampling and analysis plan for Wayne Interim Storage Site (WISS), Wayne, New Jersey

    International Nuclear Information System (INIS)

    Brown, K.S.; Murray, M.E.; Rodriguez, R.E.

    1998-10-01

This field sampling plan describes the methodology to perform an independent radiological verification survey and chemical characterization of a remediated area of the subpile at the Wayne Interim Storage Site, Wayne, New Jersey. Data obtained from collection and analysis of systematic and biased soil samples will be used to assess the status of remediation at the site and verify the final radiological status. The objective of this plan is to describe the methods for obtaining sufficient and valid measurements and analytical data to supplement and verify a radiological profile already established by the Project Remediation Management Contractor (PMC). The plan describes the procedure for obtaining sufficient and valid analytical data on soil samples following remediation of the first layer of the subpile. Samples will be taken from an area of the subpile measuring approximately 30 m by 80 m from which soil has been excavated to a depth of approximately 20 feet to confirm that the soil beneath the excavated area does not exceed radiological guidelines established for the site or chemical regulatory limits for inorganic metals. After the WISS has been fully remediated, the Department of Energy will release it for industrial/commercial land use in accordance with the Record of Decision. This plan provides supplemental instructions to guidelines and procedures established for sampling and analysis activities. Procedures will be referenced throughout this plan as applicable, and are available for review if necessary

  11. An Internationally Coordinated Science Management Plan for Samples Returned from Mars

    Science.gov (United States)

    Haltigin, T.; Smith, C. L.

    2015-12-01

    Mars Sample Return (MSR) remains a high priority of the planetary exploration community. Such an effort will undoubtedly be too large for any individual agency to conduct itself, and thus will require extensive global cooperation. To help prepare for an eventual MSR campaign, the International Mars Exploration Working Group (IMEWG) chartered the international Mars Architecture for the Return of Samples (iMARS) Phase II working group in 2014, consisting of representatives from 17 countries and agencies. The overarching task of the team was to provide recommendations for progressing towards campaign implementation, including a proposed science management plan. Building upon the iMARS Phase I (2008) outcomes, the Phase II team proposed the development of an International MSR Science Institute as part of the campaign governance, centering its deliberations around four themes: Organization: including an organizational structure for the Institute that outlines roles and responsibilities of key members and describes sample return facility requirements; Management: presenting issues surrounding scientific leadership, defining guidelines and assumptions for Institute membership, and proposing a possible funding model; Operations & Data: outlining a science implementation plan that details the preliminary sample examination flow, sample allocation process, and data policies; and Curation: introducing a sample curation plan that comprises sample tracking and routing procedures, sample sterilization considerations, and long-term archiving recommendations. This work presents a summary of the group's activities, findings, and recommendations, highlighting the role of international coordination in managing the returned samples.

  12. Sampling-Based Coverage Path Planning for Complex 3D Structures

    Science.gov (United States)

    2012-09-01

not only are divers at risk of serious injury, but there is a possibility that hidden ordnance may go undetected if any portion of the hull is missed...coverage planning problem. Definition 6 (Probabilistic Completeness of a Local Coverage Planning Algorithm). Let LCA be a proposed algorithm for the...clearance from obstacles along the full length of the path, the probability that such a path is found by LCA approaches one as the number of samples drawn

  13. Statistical Inference for a Class of Multivariate Negative Binomial Distributions

    DEFF Research Database (Denmark)

    Rubak, Ege H.; Møller, Jesper; McCullagh, Peter

    This paper considers statistical inference procedures for a class of models for positively correlated count variables called -permanental random fields, and which can be viewed as a family of multivariate negative binomial distributions. Their appealing probabilistic properties have earlier been...

  14. Multifractal structure of multiplicity distributions and negative binomials

    International Nuclear Information System (INIS)

    Malik, S.; Delhi, Univ.

    1997-01-01

The paper presents experimental results of the multifractal structure analysis in proton-emulsion interactions at 800 GeV. The multiplicity moments have a power law dependence on the mean multiplicity in varying bin sizes of pseudorapidity. The values of generalised dimensions are calculated from the slope value. The multifractal characteristics are also examined in the light of negative binomials. The observed multiplicity moments and those derived from the negative-binomial fits agree well with each other. Also the values of D q, both observed and derived from the negative-binomial fits, not only decrease with q, typifying multifractality, but also agree well with each other, showing consistency with the negative-binomial form

  15. Sampling and analysis plan for the 100-D Ponds voluntary remediation project

    International Nuclear Information System (INIS)

    1996-08-01

    This Sampling and Analysis Plan (SAP) describes the sampling and analytical activities which will be performed to support closure of the 100-D Ponds Resource Conservation and Recovery Act (RCRA) treatment, storage, and/or disposal (TSD) unit. This SAP includes the Field Sampling Plan (FSP) presented in Section 2.0, and the Quality Assurance Project Plan (QAPjP) described in Section 3.0. The FSP defines the sampling and analytical methodologies to be performed, and the QAPjP provides or includes information on the requirements for precision, accuracy, representativeness, comparability, and completeness of the analytical data. This sampling and analysis plan was developed using the Environmental Protection Agency's Seven-Step Data Quality Objectives (DQO) Guidance (EPA, 1994). The purpose of the DQO meetings was (1) to identify the contaminants of concern and their cleanup levels under the Washington State Model Toxics Control Act (MTCA, WAC-173-340) Method B, and (2) to determine the number and locations of samples necessary to verify that the 100-D Ponds meet the cleanup criteria. The data collected will be used to support RCRA closure of this TSD unit

  16. Entanglement of Generalized Two-Mode Binomial States and Teleportation

    International Nuclear Information System (INIS)

    Wang Dongmei; Yu Youhong

    2009-01-01

The entanglement of the generalized two-mode binomial states in the phase damping channel is studied by making use of the relative entropy of the entanglement. It is shown that the factors q and p play crucial roles in controlling the relative entropy of the entanglement. Furthermore, we propose a scheme for teleporting an unknown state via the generalized two-mode binomial states, and calculate the mean fidelity of the scheme. (general)

  17. Detecting non-binomial sex allocation when developmental mortality operates.

    Science.gov (United States)

    Wilkinson, Richard D; Kapranas, Apostolos; Hardy, Ian C W

    2016-11-07

    Optimal sex allocation theory is one of the most intricately developed areas of evolutionary ecology. Under a range of conditions, particularly under population sub-division, selection favours sex being allocated to offspring non-randomly, generating non-binomial variances of offspring group sex ratios. Detecting non-binomial sex allocation is complicated by stochastic developmental mortality, as offspring sex can often only be identified on maturity with the sex of non-maturing offspring remaining unknown. We show that current approaches for detecting non-binomiality have limited ability to detect non-binomial sex allocation when developmental mortality has occurred. We present a new procedure using an explicit model of sex allocation and mortality and develop a Bayesian model selection approach (available as an R package). We use the double and multiplicative binomial distributions to model over- and under-dispersed sex allocation and show how to calculate Bayes factors for comparing these alternative models to the null hypothesis of binomial sex allocation. The ability to detect non-binomial sex allocation is greatly increased, particularly in cases where mortality is common. The use of Bayesian methods allows for the quantification of the evidence in favour of each hypothesis, and our modelling approach provides an improved descriptive capability over existing approaches. We use a simulation study to demonstrate substantial improvements in power for detecting non-binomial sex allocation in situations where current methods fail, and we illustrate the approach in real scenarios using empirically obtained datasets on the sexual composition of groups of gregarious parasitoid wasps. Copyright © 2016 Elsevier Ltd. All rights reserved.
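For orientation, the kind of "non-binomial variance" at issue can be seen with a classical dispersion statistic on fully sexed broods. This is the style of test the authors argue loses power once developmental mortality hides offspring sex, and the data below are invented for illustration:

```python
def binomial_dispersion(broods):
    """Variance (dispersion) test. broods is a list of (males, brood_size).
    Returns the pooled sex ratio, a chi-square-style statistic, and its
    degrees of freedom: values well below the dof suggest under-dispersion
    (precise sex allocation), values well above suggest over-dispersion."""
    total_m = sum(m for m, n in broods)
    total_n = sum(n for m, n in broods)
    p = total_m / total_n
    x2 = sum((m - n * p) ** 2 / (n * p * (1 - p)) for m, n in broods)
    return p, x2, len(broods) - 1

# Precise allocation: exactly 1 male in every brood of 5 (under-dispersed).
p1, x2_under, dof = binomial_dispersion([(1, 5)] * 12)
# Split broods: all-male or all-female (over-dispersed).
p2, x2_over, _ = binomial_dispersion([(0, 5), (5, 5)] * 6)
print(round(x2_under, 2), round(x2_over, 2), dof)
```

With unsexed (dead) offspring removed from each brood, both patterns are pulled toward the binomial expectation, which is why the paper replaces this approach with an explicit sex-allocation-plus-mortality model compared via Bayes factors.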

  18. Negative Binomial Distribution and the multiplicity moments at the LHC

    International Nuclear Information System (INIS)

    Praszalowicz, Michal

    2011-01-01

    In this work we show that the latest LHC data on multiplicity moments C 2 -C 5 are well described by a two-step model in the form of a convolution of the Poisson distribution with energy-dependent source function. For the source function we take Γ Negative Binomial Distribution. No unexpected behavior of Negative Binomial Distribution parameter k is found. We give also predictions for the higher energies of 10 and 14 TeV.
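The moments in question can be checked numerically against the Negative Binomial Distribution itself: for an NBD with mean n-bar and shape parameter k, C2 = 1 + 1/n-bar + 1/k. A short sketch with arbitrary illustrative values (not fitted LHC parameters):

```python
from math import lgamma, log, exp

def nbd_pmf(n, nbar, k):
    """NBD: P(n) = Gamma(n+k)/(Gamma(k) n!) * (nbar/(nbar+k))**n
                   * (k/(nbar+k))**k, computed via log-gammas."""
    return exp(lgamma(n + k) - lgamma(k) - lgamma(n + 1)
               + n * log(nbar / (nbar + k)) + k * log(k / (nbar + k)))

def moment_ratio(q, nbar, k, n_max=600):
    """C_q = <n**q> / <n>**q computed directly from the pmf."""
    mean = sum(n * nbd_pmf(n, nbar, k) for n in range(n_max))
    mq = sum(n**q * nbd_pmf(n, nbar, k) for n in range(n_max))
    return mq / mean**q

nbar, k = 10.0, 2.0            # illustrative values
c2 = moment_ratio(2, nbar, k)
# Analytically C2 = 1 + 1/nbar + 1/k = 1.6 for these values.
print(round(c2, 6))
```

The convolution step in the paper's two-step model modifies these pure-NBD moments; the sketch only fixes the NBD baseline they are compared against.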

  19. Negative binomial properties and clan structure in multiplicity distributions

    International Nuclear Information System (INIS)

    Giovannini, A.; Van Hove, L.

    1988-01-01

    We review the negative binomial properties measured recently for many multiplicity distributions of high energy hadronic, semi-leptonic reactions in selected rapidity intervals. We analyse them in terms of the ''clan'' structure which can be defined for any negative binomial distribution. By comparing reactions we exhibit a number of regularities for the average number N-bar of clans and the average charged multiplicity (n-bar) c per clan. 22 refs., 6 figs. (author)
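For any negative binomial with mean n-bar and parameter k, the clan variables reviewed here have closed forms: N-bar = k ln(1 + n-bar/k) and the mean multiplicity per clan is n-bar / N-bar. A minimal helper (parameter values are illustrative, not taken from the reviewed reactions):

```python
from math import log

def clan_parameters(nbar, k):
    """Clan decomposition of an NBD: average number of clans Nbar and
    average multiplicity per clan nc, with Nbar = k * ln(1 + nbar/k)."""
    Nbar = k * log(1 + nbar / k)
    return Nbar, nbar / Nbar

Nbar, nc = clan_parameters(10.0, 2.0)
print(round(Nbar, 3), round(nc, 3))   # -> 3.584 2.791
```

In the Poisson limit k → ∞ one gets N-bar → n-bar and n-bar per clan → 1, i.e. every clan contains a single particle.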

  20. On some binomial [Formula: see text]-difference sequence spaces.

    Science.gov (United States)

    Meng, Jian; Song, Meimei

    2017-01-01

In this paper, we introduce the binomial sequence spaces [Formula: see text], [Formula: see text] and [Formula: see text] by combining the binomial transformation and difference operator. We prove the BK-property and some inclusion relations. Furthermore, we obtain Schauder bases and compute the α-, β- and γ-duals of these sequence spaces. Finally, we characterize matrix transformations on the sequence space [Formula: see text].

  1. Using Load Balancing to Scalably Parallelize Sampling-Based Motion Planning Algorithms

    KAUST Repository

    Fidel, Adam

    2014-05-01

Motion planning, which is the problem of computing feasible paths in an environment for a movable object, has applications in many domains ranging from robotics, to intelligent CAD, to protein folding. The best methods for solving this PSPACE-hard problem are so-called sampling-based planners. Recent work introduced uniform spatial subdivision techniques for parallelizing sampling-based motion planning algorithms that scaled well. However, such methods are prone to load imbalance, as planning time depends on region characteristics and, for most problems, the heterogeneity of the subproblems increases as the number of processors increases. In this work, we introduce two techniques to address load imbalance in the parallelization of sampling-based motion planning algorithms: an adaptive work stealing approach and bulk-synchronous redistribution. We show that applying these techniques to representatives of the two major classes of parallel sampling-based motion planning algorithms, probabilistic roadmaps and rapidly-exploring random trees, results in a more scalable and load-balanced computation on more than 3,000 cores. © 2014 IEEE.

  2. Partitioning detectability components in populations subject to within-season temporary emigration using binomial mixture models.

    Science.gov (United States)

    O'Donnell, Katherine M; Thompson, Frank R; Semlitsch, Raymond D

    2015-01-01

Detectability of individual animals is highly variable and nearly always less than one; we used hierarchical binomial mixture models to account for multiple sources of variation in detectability. The state process of the hierarchical model describes ecological mechanisms that generate spatial and temporal patterns in abundance, while the observation model accounts for the imperfect nature of counting individuals due to temporary emigration and false absences. We illustrate our model's potential advantages, including the allowance of temporary emigration between sampling periods, with a case study of southern red-backed salamanders Plethodon serratus. We fit our model and a standard binomial mixture model to counts of terrestrial salamanders surveyed at 40 sites during 3-5 surveys each spring and fall 2010-2012. Our models generated similar parameter estimates to standard binomial mixture models. Aspect was the best predictor of salamander abundance in our case study; abundance increased as aspect became more northeasterly. Increased time-since-rainfall strongly decreased salamander surface activity (i.e. availability for sampling), while higher amounts of woody cover objects and rocks increased conditional detection probability (i.e. probability of capture, given an animal is exposed to sampling). By explicitly accounting for both components of detectability, we increased congruence between our statistical modeling and our ecological understanding of the system. We stress the importance of choosing survey locations and protocols that maximize species availability and conditional detection probability to increase population parameter estimate reliability.
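The state/observation split described here builds on the binomial mixture (N-mixture) likelihood, which marginalizes the latent abundance N out of repeated counts at a site. A minimal sketch of that basic likelihood follows; it is our simplification and omits the paper's temporary-emigration (availability) layer:

```python
from math import comb, exp

def site_likelihood(counts, lam, p, n_max=200):
    """Marginal likelihood of repeated counts y_1..y_J at one site under a
    binomial mixture model: N ~ Poisson(lam), y_j | N ~ Binomial(N, p)."""
    total, pois = 0.0, exp(-lam)          # pois = P(N = 0)
    for n in range(n_max + 1):
        if n >= max(counts):              # Binomial(N, p) needs N >= y
            like = 1.0
            for y in counts:
                like *= comb(n, y) * p**y * (1 - p)**(n - y)
            total += pois * like
        pois *= lam / (n + 1)             # advance pmf to P(N = n + 1)
    return total

# Sanity check via Poisson thinning: with a single visit the marginal
# count is Poisson(lam * p), here Poisson(2) evaluated at 3.
lik = site_likelihood([3], lam=5.0, p=0.4)
print(round(lik, 6))
```

Adding within-season temporary emigration amounts to inserting an availability probability between N and the detection step, so that only available animals can be counted.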

  3. Comparison of multinomial and binomial proportion methods for analysis of multinomial count data.

    Science.gov (United States)

    Galyean, M L; Wester, D B

    2010-10-01

Simulation methods were used to generate 1,000 experiments, each with 3 treatments and 10 experimental units/treatment, in completely randomized (CRD) and randomized complete block designs. Data were counts in 3 ordered or 4 nominal categories from multinomial distributions. For the 3-category analyses, category probabilities were 0.6, 0.3, and 0.1, respectively, for 2 of the treatments, and 0.5, 0.35, and 0.15 for the third treatment. In the 4-category analysis (CRD only), probabilities were 0.3, 0.3, 0.2, and 0.2 for treatments 1 and 2 vs. 0.4, 0.4, 0.1, and 0.1 for treatment 3. The 3-category data were analyzed with generalized linear mixed models as an ordered multinomial distribution with a cumulative logit link or by regrouping the data (e.g., counts in 1 category/sum of counts in all categories), followed by analysis of single categories as binomial proportions. Similarly, the 4-category data were analyzed as a nominal multinomial distribution with a glogit link or by grouping data as binomial proportions. For the 3-category CRD analyses, empirically determined type I error rates based on pair-wise comparisons (F- and Wald chi-square tests) did not differ between multinomial and individual binomial category analyses with 10 (P = 0.38 to 0.60) or 50 (P = 0.19 to 0.67) sampling units/experimental unit. When analyzed as binomial proportions, power estimates varied among categories, with analysis of the category with the greatest counts yielding power similar to the multinomial analysis. Agreement between methods (percentage of experiments with the same results for the overall test for treatment effects) varied considerably among categories analyzed and sampling unit scenarios for the 3-category CRD analyses. Power (F-test) was 24.3, 49.1, 66.9, 83.5, 86.8, and 99.7% for 10, 20, 30, 40, 50, and 100 sampling units/experimental unit for the 3-category multinomial CRD analyses. Results with randomized complete block design simulations were similar to those with the CRD.
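The binomial regrouping step the abstract describes (counts in one category over the sum of counts in all categories) can be sketched in a few lines of Python; the simulation setup and function names below are illustrative, not the authors' code:

```python
import random

def simulate_unit(probs, n_samples, rng):
    """Draw n_samples categorical outcomes; return counts per category."""
    counts = [0] * len(probs)
    for _ in range(n_samples):
        u = rng.random()
        cum, idx = 0.0, len(probs) - 1  # fall through to last category on rounding
        for i, p in enumerate(probs):
            cum += p
            if u < cum:
                idx = i
                break
        counts[idx] += 1
    return counts

def binomial_regroup(counts, category):
    """Regroup multinomial counts as (successes, trials) for one category."""
    return counts[category], sum(counts)

rng = random.Random(42)
# one experimental unit with 50 sampling units and category probabilities 0.6/0.3/0.1
counts = simulate_unit([0.6, 0.3, 0.1], 50, rng)
k, n = binomial_regroup(counts, 0)
```

The regrouped pair (k, n) is what a single-category binomial analysis would operate on, in place of the full multinomial count vector.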

  4. Sampling and Analysis Plan for N-Springs ERA pump-and-treat waste media

    International Nuclear Information System (INIS)

    Stankovich, M.T.

    1996-07-01

    This Sampling and Analysis Plan details the administrative procedures to be used to conduct sampling activities for characterization of spent ion-exchange resin, clinoptilolite, generated from the N-Springs pump-and-treat expedited response action. N-Springs (riverbank seeps) is located in the 100-N Area of the Hanford Site. Groundwater contained in the 100-NR-2 Operable Unit is contaminated with various radionuclides derived from wastewater disposal practices and spills associated with 100-N Reactor Operations

  5. Chemical Hygiene Plan for Onsite Measurement and Sample Shipping Facility Activities

    International Nuclear Information System (INIS)

    Price, W.H.

    1998-01-01

    This chemical hygiene plan presents the requirements established to ensure the protection of employee health while performing work in mobile laboratories, the sample shipping facility, and at the onsite radiological counting facility. This document presents the measures to be taken to promote safe work practices and to minimize worker exposure to hazardous chemicals. Specific hazardous chemicals present in the mobile laboratories, the sample shipping facility, and in the radiological counting facility are presented in Appendices A through G

  6. Diagnosis disclosure and advance care planning in Alzheimer disease: opinions of a sample of Italian citizens.

    Science.gov (United States)

    Riva, Maddalena; Caratozzolo, Salvatore; Cerea, Erica; Gottardi, Federica; Zanetti, Marina; Vicini Chilovi, Barbara; Cristini, Carlo; Padovani, Alessandro; Rozzini, Luca

    2014-08-01

In current Alzheimer disease (AD) research there is a growing asymmetry between the modest benefits of the currently available treatments and the possibility of diagnosing AD early in its natural history. This complex situation raises a number of important ethical issues about diagnosis disclosure and end-of-life decisions that need to be addressed. The principal aim of the study was to investigate attitudes towards disclosure of a diagnosis of AD and disposition towards completion of advance care planning in a sample of Italian citizens. A convenience sample of 1,111 Italian citizens recruited from a community hospital in Brescia was interviewed using a structured questionnaire with both yes/no and multiple-choice questions about AD. The majority of the sample (83%) wanted disclosure for themselves. Women and caregivers were significantly less likely to agree that their hypothetically afflicted relative should be informed of a diagnosis of AD. The majority of the sample (81%) was in favor of advance care planning completion, most of all younger participants and non-caregivers. Less than a third of the sample (24%) was aware of the existence of a judicially appointed guardian for patients affected by dementia. The majority of the participants wanted a potential diagnosis of AD to be disclosed to them and to their relatives if they were to be afflicted. The utility of completing advance care planning and designating a judicially appointed guardian was frequently endorsed by the sample.

  7. Practical sampling plans for Varroa destructor (Acari: Varroidae) in Apis mellifera (Hymenoptera: Apidae) colonies and apiaries.

    Science.gov (United States)

    Lee, K V; Moon, R D; Burkness, E C; Hutchison, W D; Spivak, M

    2010-08-01

The parasitic mite Varroa destructor Anderson & Trueman (Acari: Varroidae) is arguably the most detrimental pest of the European-derived honey bee, Apis mellifera L. Unfortunately, beekeepers lack a standardized sampling plan to make informed treatment decisions. Based on data from 31 commercial apiaries, we developed sampling plans for use by beekeepers and researchers to estimate the density of mites in individual colonies or whole apiaries. Beekeepers can estimate a colony's mite density with a chosen level of precision by dislodging mites from approximately 300 adult bees taken from one brood box frame in the colony, and they can extrapolate to mite density on a colony's adults and pupae combined by doubling the number of mites on adults. For sampling whole apiaries, beekeepers can repeat the process in each of n = 8 colonies, regardless of apiary size. Researchers desiring greater precision can estimate mite density in an individual colony by examining three 300-bee sample units. Extrapolation to density on adults and pupae may require independent estimates of numbers of adults, of pupae, and of their respective mite densities. Researchers can estimate apiary-level mite density by taking one 300-bee sample unit per colony, but should do so from a variable number of colonies, depending on apiary size. These practical sampling plans will allow beekeepers and researchers to quantify mite infestation levels and enhance understanding and management of V. destructor.
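The colony- and apiary-level arithmetic implied by the plan (mites per 100 bees from a ~300-bee sample, the doubling rule for adults plus pupae, and an apiary mean over n = 8 colonies) is simple enough to sketch; the function names below are ours, not the paper's:

```python
def colony_mite_density(mites_dislodged, bees_sampled=300):
    """Mites per 100 adult bees, from one ~300-bee sample unit."""
    return 100.0 * mites_dislodged / bees_sampled

def colony_total_estimate(mites_on_adults):
    """Rule of thumb from the plan: mites on adults + pupae ~= 2x mites on adults."""
    return 2 * mites_on_adults

def apiary_density(colony_counts, bees_per_sample=300):
    """Apiary-level mites per 100 bees from one sample unit per colony
    (the plan suggests n = 8 colonies for beekeepers, whatever the apiary size)."""
    return 100.0 * sum(colony_counts) / (bees_per_sample * len(colony_counts))

density = colony_mite_density(9)                      # 3.0 mites per 100 bees
apiary = apiary_density([9, 3, 0, 6, 12, 3, 0, 9])    # 1.75 mites per 100 bees
```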

  8. Food safety assurance systems: Microbiological testing, sampling plans, and microbiological criteria

    NARCIS (Netherlands)

    Zwietering, M.H.; Ross, T.; Gorris, L.G.M.

    2014-01-01

    Microbiological criteria give information about the quality or safety of foods. A key component of a microbiological criterion is the sampling plan. Considering: (1) the generally low level of pathogens that are deemed tolerable in foods, (2) large batch sizes, and (3) potentially substantial

  9. Group SkSP-R sampling plan for accelerated life tests

    Indian Academy of Sciences (India)

    Muhammad Aslam

    2017-09-15

... determined through a non-linear optimisation problem for fixed values of producer's risk and consumer's risk. The advantages of the proposed plan over the existing one are explained with some practical examples. Keywords: SkSP-R sampling; life test; Weibull distribution; producer's risk; consumer's risk.

  10. Application of a binomial cusum control chart to monitor one drinking water indicator

    Directory of Open Access Journals (Sweden)

    Elisa Henning

    2014-02-01

The aim of this study is to analyze the use of a binomial cumulative sum (CUSUM) chart to monitor the presence of total coliforms, biological indicators of the quality of water supplies, in water treatment processes. The sample series were taken monthly from a water treatment plant and were analyzed from 2007 to 2009. The statistical treatment of the data was performed using GNU R, and routines were created for the approximation of the upper limit of the binomial CUSUM chart. Furthermore, a comparative study was conducted to investigate whether there is a significant difference in sensitivity between the CUSUM chart and the traditional Shewhart chart, the most commonly used chart in process monitoring. The results obtained demonstrate that this study was essential for making the right choice in selecting a chart for the statistical analysis of this process.
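A minimal upper binomial CUSUM of the kind monitored here can be sketched as follows, using the standard sequential-probability-ratio reference value; the parameter choices are illustrative and not taken from the study:

```python
import math

def cusum_reference(n, p0, p1):
    """Reference value k for an upper binomial CUSUM detecting a shift from
    in-control rate p0 to out-of-control rate p1 (p1 > p0), with n trials per sample."""
    num = math.log((1 - p0) / (1 - p1))
    den = math.log(p1 * (1 - p0) / (p0 * (1 - p1)))
    return n * num / den

def binomial_cusum(xs, k, h):
    """Tabular CUSUM on per-sample positive counts xs.
    Returns the statistic path and the index of the first signal (or None)."""
    s, path, signal = 0.0, [], None
    for t, x in enumerate(xs):
        s = max(0.0, s + x - k)
        path.append(s)
        if signal is None and s > h:
            signal = t
    return path, signal

# e.g. 10 water samples per month, in-control positive rate 5%, shift to 20%
k = cusum_reference(10, 0.05, 0.20)
# rounded reference value used below so the path is easy to follow by hand
path, signal = binomial_cusum([0, 0, 5, 5], k=1.0, h=3.0)
```

With these numbers the statistic stays at zero while counts are in control and signals on the first elevated sample.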

  11. EMP Attachment 1 DOE-SC PNNL Site Sampling and Analysis Plan

    Energy Technology Data Exchange (ETDEWEB)

    Meier, Kirsten M.

    2011-11-10

    This Sampling and Analysis Plan (SAP) is written for the radiological environmental air surveillance program for the DOE-SC PNNL Site, Richland Washington. It provides the requirements for planning sampling events, and the requirements imposed on the analytical laboratory analyzing the air samples. The actual air sampling process is in procedure EPRP-AIR-029. The rationale for analyte selection, media, and sampling site location has been vetted through the data quality objectives (DQO) process (Barnett et al. 2010). The results from the DQO process have been reviewed and approved by the Washington State Department of Health. The DQO process (Barnett et al. 2010) identified seven specific radionuclides for analysis along with the need for gross alpha and gross beta radiological analyses. The analytes are {sup 241}Am, {sup 243}Am, {sup 244}Cm, {sup 60}Co, {sup 238}Pu, {sup 239}Pu, and {sup 233}U. The report also determined that air samples for particulates are the only sample matrix required for the monitoring program. These samples are collected on 47-mm glass-fiber filters.

  12. Visual Sample Plan (VSP) Statistical Software as Related to the CTBTO's On-Site Inspection Procedure

    International Nuclear Information System (INIS)

    Pulsipher, Trenton C.; Walsh, Stephen J.; Pulsipher, Brent A.; Milbrath, Brian D.

    2010-01-01

In the event of a potential nuclear weapons test, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is commissioned to conduct an on-site inspection (OSI) of the suspected test site in an effort to find confirmatory evidence of the test. The OSI activities include collecting air, surface soil, and underground samples to search for indications of a nuclear weapons test; these indicators include radionuclides and radioactive isotopes of Ar and Xe. This report investigates the capability of the Visual Sample Plan (VSP) software to contribute to the sampling activities of the CTBTO during an OSI. VSP is statistical sampling design software, constructed under data quality objectives, which has been adapted for environmental remediation and contamination detection problems for the EPA, US Army, DoD, and DHS, among others. This report provides discussion of a number of VSP sample designs that may be pertinent to the work undertaken during an OSI. Examples and descriptions of such designs include hot spot sampling, combined random and judgment sampling, multiple increment sampling, radiological transect surveying, and a brief description of other potentially applicable sampling methods. Further, this work highlights a potential need for the use of statistically based sample designs in OSI activities. The use of such designs may enable canvassing a sample area without full sampling, provide a measure of confidence that radionuclides are not present, and allow investigators to refocus resources in other areas of concern.

  13. Tank 241-AZ-101 Mixer Pump Test Vapor Sampling and Analysis Plan

    Energy Technology Data Exchange (ETDEWEB)

    TEMPLETON, A.M.

    2000-04-10

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for vapor samples obtained during the operation of mixer pumps in tank 241-AZ-101. The primary purpose of the mixer pump test (MPT) is to demonstrate that the two 300 horsepower mixer pumps installed in tank 241-AZ-101 can mobilize the settled sludge so that it can be retrieved for treatment and vitrification. Sampling will be performed in accordance with Tank 241-AZ-101 Mixer Pump Test Data Quality Objective (Banning 1999) and Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis (Mulkey 1999). The sampling will verify if current air emission estimates used in the permit application are correct and provide information for future air permit applications.

  14. Tank 241-AZ-101 Mixer Pump Test Vapor Sampling and Analysis Plan

    Energy Technology Data Exchange (ETDEWEB)

    TEMPLETON, A.M.

    2000-01-31

This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for vapor samples obtained during the operation of mixer pumps in tank 241-AZ-101. The primary purpose of the mixer pump test (MPT) is to demonstrate that the two 300 horsepower mixer pumps installed in tank 241-AZ-101 can mobilize the settled sludge so that it can be retrieved for treatment and vitrification. Sampling will be performed in accordance with Tank 241-AZ-101 Mixer Pump Test Data Quality Objective (Banning 1999) and Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis (Mulkey 1999). The sampling will verify if current air emission estimates used in the permit application are correct and provide information for future air permit applications.

  15. Tank 241-AZ-101 Mixer Pump Test Vapor Sampling and Analysis Plan

    Energy Technology Data Exchange (ETDEWEB)

    TEMPLETON, A.M.

    2000-03-06

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for vapor samples obtained during the operation of mixer pumps in tank 241-AZ-101. The primary purpose of the mixer pump test (MPT) is to demonstrate that the two 300 horsepower mixer pumps installed in tank 241-AZ-101 can mobilize the settled sludge so that it can be retrieved for treatment and vitrification. Sampling will be performed in accordance with Tank 241-AZ-101 Mixer Pump Test Data Quality Objective (Banning 1999) and Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis (Mulkey 1999). The sampling will verify if current air emission estimates used in the permit application are correct and provide information for future air permit applications.

  16. Tank 241-AZ-101 Mixer Pump Test Vapor Sampling and Analysis Plan

    International Nuclear Information System (INIS)

    TEMPLETON, A.M.

    2000-01-01

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for vapor samples obtained during the operation of mixer pumps in tank 241-AZ-101. The primary purpose of the mixer pump test (MPT) is to demonstrate that the two 300 horsepower mixer pumps installed in tank 241-AZ-101 can mobilize the settled sludge so that it can be retrieved for treatment and vitrification. Sampling will be performed in accordance with Tank 241-AZ-101 Mixer Pump Test Data Quality Objective (Banning 1999) and Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis (Mulkey 1999). The sampling will verify if current air emission estimates used in the permit application are correct and provide information for future air permit applications

  17. Planning for the Collection and Analysis of Samples of Martian Granular Materials Potentially to be Returned by Mars Sample Return

    Science.gov (United States)

    Carrier, B. L.; Beaty, D. W.

    2017-12-01

NASA's Mars 2020 rover is scheduled to land on Mars in 2021 and will be equipped with a sampling system capable of collecting rock cores, as well as a specialized drill bit for collecting unconsolidated granular material. A key mission objective is to collect a set of samples that have enough scientific merit to justify returning to Earth. In the case of granular materials, we would like to catalyze community discussion on what we would do with these samples if they arrived in our laboratories, as input to decision-making related to sampling the regolith. Numerous scientific objectives have been identified which could be achieved or significantly advanced via the analysis of martian rocks, "regolith," and gas samples. The term "regolith" has more than one definition, including one that is general and one that is much more specific. For the purpose of this analysis we use the term "granular materials" to encompass the most general meaning and restrict "regolith" to a subset of that. Our working taxonomy includes the following: 1) globally sourced airfall dust (dust); 2) saltation-sized particles (sand); 3) locally sourced decomposed rock (regolith); 4) crater ejecta (ejecta); and, 5) other. Analysis of martian granular materials could serve to advance our understanding of areas including habitability and astrobiology, surface-atmosphere interactions, chemistry, mineralogy, geology and environmental processes. Results of these analyses would also provide input into planning for future human exploration of Mars, elucidating possible health and mechanical hazards caused by the martian surface material, as well as providing valuable information regarding available resources for ISRU and civil engineering purposes. Results would also be relevant to matters of planetary protection and ground-truthing orbital observations. We will present a preliminary analysis of the following, in order to generate community discussion and feedback on all issues relating to: What are the

  18. UMTRA project water sampling and analysis plan, Falls City, Texas. Revision 1

    International Nuclear Information System (INIS)

    1995-09-01

Planned, routine ground water sampling activities at the US Department of Energy (DOE) Uranium Mill Tailings Remedial Action (UMTRA) Project site near Falls City, Texas, are described in this water sampling and analysis plan (WSAP). The following plan identifies and justifies the sampling locations, analytical parameters, and sampling frequency for the routine monitoring stations at the site. The ground water data are used for site characterization and risk assessment. The regulatory basis for routine ground water monitoring at UMTRA Project sites is derived from the US Environmental Protection Agency (EPA) regulations in 40 CFR Part 192. Sampling procedures are guided by the UMTRA Project standard operating procedures (SOP) (JEG, n.d.), the Technical Approach Document (TAD) (DOE, 1989), and the most effective technical approach for the site. The Falls City site is in Karnes County, Texas, approximately 8 miles (13 kilometers) southwest of the town of Falls City and 46 mi (74 km) southeast of San Antonio, Texas. Before surface remedial action, the tailings site consisted of two parcels. Parcel A consisted of the mill site, one mill building, five tailings piles, and one tailings pond south of Farm-to-Market (FM) Road 1344 and west of FM 791. A sixth tailings pile designated Parcel B was north of FM 791 and east of FM 1344.

  19. Health plan auditing: 100-percent-of-claims vs. random-sample audits.

    Science.gov (United States)

    Sillup, George P; Klimberg, Ronald K

    2011-01-01

    The objective of this study was to examine the relative efficacy of two different methodologies for auditing self-funded medical claim expenses: 100-percent-of-claims auditing versus random-sampling auditing. Multiple data sets of claim errors or 'exceptions' from two Fortune-100 corporations were analysed and compared to 100 simulated audits of 300- and 400-claim random samples. Random-sample simulations failed to identify a significant number and amount of the errors that ranged from $200,000 to $750,000. These results suggest that health plan expenses of corporations could be significantly reduced if they audited 100% of claims and embraced a zero-defect approach.
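The core finding, that a 300-claim random sample can only surface the errors that happen to fall in the sample, is easy to reproduce with a toy Monte Carlo; the population figures below are hypothetical, not the study's data:

```python
import random

def simulate_audit(claims, sample_size, rng):
    """Total error dollars detected by auditing a random sample of claims."""
    return sum(rng.sample(claims, sample_size))

rng = random.Random(1)
# hypothetical population: 10,000 claims, 1% of which carry a $500 error
claims = [500.0 if i < 100 else 0.0 for i in range(10_000)]
total_error = sum(claims)               # $50,000 in the full population
found = simulate_audit(claims, 300, rng)
# a 300-claim sample can, in expectation, surface only 3% of the error dollars;
# a 100%-of-claims audit detects total_error by construction
```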

  20. [Using log-binomial model for estimating the prevalence ratio].

    Science.gov (United States)

    Ye, Rong; Gao, Yan-hui; Yang, Yi; Chen, Yue

    2010-05-01

To estimate prevalence ratios using a log-binomial model with or without continuous covariates. Prevalence ratios for individuals' attitudes towards smoking-ban legislation associated with smoking status, estimated using a log-binomial model, were compared with odds ratios estimated by a logistic regression model. In the log-binomial modeling, the maximum likelihood method was used when there were no continuous covariates, and the COPY approach was used if the model did not converge, for example due to the presence of continuous covariates. We examined the association between individuals' attitude towards smoking-ban legislation and smoking status in men and women. Prevalence ratio and odds ratio estimation provided similar results for the association in women, since smoking was not common. In men, however, the odds ratio estimates were markedly larger than the prevalence ratios due to a higher prevalence of the outcome. The log-binomial model did not converge when age was included as a continuous covariate, and the COPY method was used to deal with this situation. All analyses were performed in SAS. The prevalence ratio seemed to measure the association better than the odds ratio when prevalence is high. SAS programs were provided to calculate prevalence ratios with or without continuous covariates in log-binomial regression analysis.
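The gap between the two measures at high prevalence is easy to see from a single 2x2 table; the sketch below computes both directly rather than fitting the log-binomial model discussed in the abstract:

```python
def prevalence_ratio(a, b, c, d):
    """PR from a 2x2 table: cases/total in the exposed (a, b) vs unexposed (c, d) rows."""
    return (a / (a + b)) / (c / (c + d))

def odds_ratio(a, b, c, d):
    """OR from the same table: (a/b) / (c/d) = ad/bc."""
    return (a * d) / (b * c)

# a common outcome (prevalence around 50%): the OR overstates the PR
pr = prevalence_ratio(60, 40, 40, 60)   # (60/100) / (40/100) = 1.5
orr = odds_ratio(60, 40, 40, 60)        # (60*60) / (40*40) = 2.25
```

When the outcome is rare, b ≈ a + b and d ≈ c + d, so the two measures nearly coincide, which matches the abstract's result for women.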

  1. Analysis of hypoglycemic events using negative binomial models.

    Science.gov (United States)

    Luo, Junxiang; Qu, Yongming

    2013-01-01

Negative binomial regression is a standard model for analyzing hypoglycemic events in diabetes clinical trials. Adjusting for baseline covariates could potentially increase the estimation efficiency of negative binomial regression. However, adjusting for covariates raises concerns about model misspecification, to which negative binomial regression is not robust because of its strong model assumptions. Some literature suggested correcting the standard error of the maximum likelihood estimator by introducing overdispersion, which can be estimated by the deviance or Pearson chi-square. We proposed conducting negative binomial regression using sandwich estimation to calculate the covariance matrix of the parameter estimates, together with Pearson overdispersion correction (denoted NBSP). In this research, we compared several commonly used negative binomial model options with our proposed NBSP. Simulations and real data analyses showed that NBSP is the most robust to model misspecification, and that estimation efficiency is improved by adjusting for baseline hypoglycemia. Copyright © 2013 John Wiley & Sons, Ltd.
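The overdispersion that motivates the negative binomial model can be illustrated by drawing counts from its gamma-mixed Poisson representation; this is a stdlib-only sketch, not the authors' simulation code:

```python
import math
import random

def negative_binomial_sample(mu, alpha, rng):
    """One negative binomial count as a gamma-mixed Poisson:
    E[X] = mu, Var[X] = mu + alpha * mu**2 (the NB2 parameterisation)."""
    lam = rng.gammavariate(1.0 / alpha, mu * alpha)  # E[lam] = mu, Var[lam] = alpha*mu^2
    # Poisson draw by inversion (adequate for moderate lam)
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

rng = random.Random(7)
draws = [negative_binomial_sample(4.0, 0.5, rng) for _ in range(20_000)]
mean = sum(draws) / len(draws)
var = sum((x - mean) ** 2 for x in draws) / len(draws)
# overdispersion: the sample variance should sit near mu + alpha*mu^2 = 4 + 8 = 12,
# well above the Poisson value (variance = mean = 4)
```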

  2. Energy-Aware Path Planning for UAS Persistent Sampling and Surveillance

    Science.gov (United States)

    Shaw-Cortez, Wenceslao

The focus of this work is to develop an energy-aware path planning algorithm that maximizes UAS endurance, while performing sampling and surveillance missions in a known, stationary wind environment. The energy-aware aspect is specifically tailored to extract energy from the wind to reduce thrust use, thereby increasing aircraft endurance. Wind energy extraction is performed by static soaring and dynamic soaring. Static soaring involves using upward wind currents to increase altitude and potential energy. Dynamic soaring involves taking advantage of wind gradients to exchange potential and kinetic energy. The path planning algorithm developed in this work uses optimization to combine these soaring trajectories with the overarching sampling and surveillance mission. The path planning algorithm uses a simplified aircraft model to tractably optimize soaring trajectories. This aircraft model is presented along with the derivation of the equations of motion. A nonlinear program is used to create the soaring trajectories based on a given optimization problem. This optimization problem is defined using a heuristic decision tree, which defines appropriate problems given a sampling and surveillance mission and a wind model. Simulations are performed to assess the path planning algorithm. The results are used to identify properties of soaring trajectories as well as to determine what wind conditions support minimal thrust soaring. Additional results show how the path planning algorithm can be tuned between maximizing aircraft endurance and performing the sampling and surveillance mission. A means of trajectory stitching is demonstrated to show how the periodic soaring segments can be combined together to provide a full solution to an infinite/long horizon problem.

  3. DOE responses to Ecology review comments for ''Sampling and analysis plans for the 100-D Ponds voluntary remediation project''

    International Nuclear Information System (INIS)

    1996-01-01

    The Sampling and Analysis Plan describes the sampling and analytical activities which will be performed to support closure of the 100-D Ponds at the Hanford Reservation. This report contains responses by the US Department of Energy to Ecology review for ''Sampling and Analysis Plan for the 100-D Ponds Voluntary Remediation Project.''

  4. Guidance document for preparing water sampling and analysis plans for UMTRA Project sites. Revision 1

    International Nuclear Information System (INIS)

    1995-09-01

A water sampling and analysis plan (WSAP) is prepared for each Uranium Mill Tailings Remedial Action (UMTRA) Project site to provide the rationale for routine ground water sampling at disposal sites and former processing sites. The WSAP identifies and justifies the sampling locations, analytical parameters, detection limits, and sampling frequency for the routine ground water monitoring stations at each site. This guidance document has been prepared by the Technical Assistance Contractor (TAC) for the US Department of Energy (DOE). Its purpose is to provide a consistent technical approach for sampling and monitoring activities performed under the WSAP and to provide a consistent format for the WSAP documents. It is designed for use by the TAC in preparing WSAPs and by the DOE, US Nuclear Regulatory Commission, state and tribal agencies, other regulatory agencies, and the public in evaluating the content of WSAPs.

  5. [Planning of sampling studies of morbidity and hospitalization at large territories].

    Science.gov (United States)

    Voronenko, Iu V

    1990-01-01

This paper provides the methodological principles for planning sample surveys of hospital morbidity at the level of large economic regions and the republic as a whole. Using the example of a socio-hygienic study of burn injuries requiring hospital treatment, and of the organization of hospital care for burn patients, it describes methods for designing sample surveys, including stratified cluster selection with optimal placement of sample units. The methodological scheme minimizes the sample size while yielding representative estimates of the needed indices over large territories. In a study of 23 sample units out of the 796 formed in the republic, estimates of a number of indices virtually coincided with the indices calculated from the complete reported data.

  6. Planning spatial sampling of the soil from an uncertain reconnaissance variogram

    Science.gov (United States)

    Lark, R. Murray; Hamilton, Elliott M.; Kaninga, Belinda; Maseka, Kakoma K.; Mutondo, Moola; Sakala, Godfrey M.; Watts, Michael J.

    2017-12-01

    An estimated variogram of a soil property can be used to support a rational choice of sampling intensity for geostatistical mapping. However, it is known that estimated variograms are subject to uncertainty. In this paper we address two practical questions. First, how can we make a robust decision on sampling intensity, given the uncertainty in the variogram? Second, what are the costs incurred in terms of oversampling because of uncertainty in the variogram model used to plan sampling? To achieve this we show how samples of the posterior distribution of variogram parameters, from a computational Bayesian analysis, can be used to characterize the effects of variogram parameter uncertainty on sampling decisions. We show how one can select a sample intensity so that a target value of the kriging variance is not exceeded with some specified probability. This will lead to oversampling, relative to the sampling intensity that would be specified if there were no uncertainty in the variogram parameters. One can estimate the magnitude of this oversampling by treating the tolerable grid spacing for the final sample as a random variable, given the target kriging variance and the posterior sample values. We illustrate these concepts with some data on total uranium content in a relatively sparse sample of soil from agricultural land near mine tailings in the Copperbelt Province of Zambia.
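A crude version of the spacing decision can be sketched by screening candidate grid spacings against posterior variogram draws, using the semivariance at half the spacing as a stand-in for the kriging variance (the paper computes the kriging variance properly); the posterior values below are invented for illustration:

```python
import math

def exp_variogram(h, c0, c1, a):
    """Exponential variogram: nugget c0, partial sill c1, distance parameter a."""
    return c0 + c1 * (1.0 - math.exp(-h / a))

def max_spacing(posterior, target, prob, spacings):
    """Largest grid spacing whose proxy prediction variance, gamma(spacing/2),
    stays at or below `target` for at least a fraction `prob` of posterior draws."""
    best = None
    for s in sorted(spacings):
        ok = sum(1 for (c0, c1, a) in posterior
                 if exp_variogram(s / 2.0, c0, c1, a) <= target)
        if ok / len(posterior) >= prob:
            best = s
    return best

# invented posterior draws of (nugget, partial sill, range parameter)
posterior = [(0.10, 0.9, 200.0), (0.15, 1.0, 150.0), (0.05, 0.8, 250.0)]
spacing = max_spacing(posterior, target=0.5, prob=0.9, spacings=[50, 100, 150, 200])
```

Requiring the target to hold over most of the posterior, rather than at a single point estimate, is what produces the deliberate oversampling the paper quantifies.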

  7. UMTRA Project water sampling and analysis plan, Canonsburg, Pennsylvania. Revision 1

    International Nuclear Information System (INIS)

    1995-09-01

Surface remedial action was completed at the US Department of Energy (DOE) Canonsburg and Burrell Uranium Mill Tailings Remedial Action (UMTRA) Project sites in southwestern Pennsylvania in 1985 and 1987, respectively. The Burrell disposal site, included in the UMTRA Project as a vicinity property, was remediated in conjunction with the remedial action at Canonsburg. On 27 May 1994, the Nuclear Regulatory Commission (NRC) accepted the DOE final Long-Term Surveillance Plan (LTSP) (DOE, 1993) for Burrell, thus establishing the site under the general license in 10 CFR section 40.27 (1994). In accordance with the DOE guidance document for long-term surveillance (DOE, 1995), all NRC/DOE interaction on the Burrell site's long-term care is now conducted with the DOE Grand Junction Projects Office in Grand Junction, Colorado, and is no longer the responsibility of the DOE UMTRA Project Team in Albuquerque, New Mexico. Therefore, the planned sampling activities described in this water sampling and analysis plan (WSAP) are limited to the Canonsburg site. This WSAP identifies and justifies the sampling locations, analytical parameters, detection limits, and sampling frequencies for routine monitoring at the Canonsburg site for calendar years 1995 and 1996. Currently, the analytical data further the site characterization and demonstrate that the disposal cell's initial performance is in accordance with design requirements.

  8. Zero inflated Poisson and negative binomial regression models: application in education.

    Science.gov (United States)

    Salehi, Masoud; Roudbari, Masoud

    2015-01-01

The numbers of failed courses and semesters are indicators of students' performance. These counts have zero-inflated (ZI) distributions. Using ZI Poisson and ZI negative binomial distributions, we can model such count data to find the associated factors and estimate the parameters. This study aims to investigate the important factors related to the educational performance of students. This cross-sectional study was performed in 2008-2009 at Iran University of Medical Sciences (IUMS), with a population of almost 6000 students; 670 students were selected using stratified random sampling. The educational and demographic data were collected from University records. The study design was approved at IUMS and the students' data were kept confidential. Descriptive statistics and ZI Poisson and negative binomial regressions were used to analyze the data, using STATA. For the number of failed semesters, in both the ZI Poisson and ZI negative binomial models, students' total average and the quota system had the largest effects. For the number of failed courses, total average and being at the undergraduate or master level had the largest effects in both models. In all models the total average had the greatest effect on the number of failed courses or semesters. The next most important factors were the quota system for failed semesters and undergraduate or master level for failed courses. Therefore, the total average has an important inverse effect on the numbers of failed courses and semesters.
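The zero-inflated Poisson model used here mixes a point mass at zero with a Poisson component; its probability mass function and moments can be written down directly (a sketch, not the study's STATA code):

```python
import math

def zip_pmf(k, pi, lam):
    """Zero-inflated Poisson: structural zeros with probability pi,
    otherwise Poisson(lam)."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi + (1 - pi) * poisson if k == 0 else (1 - pi) * poisson

def zip_mean(pi, lam):
    return (1 - pi) * lam

def zip_var(pi, lam):
    # the variance exceeds the mean whenever pi > 0, unlike a plain Poisson
    return (1 - pi) * lam * (1 + pi * lam)

p0 = zip_pmf(0, 0.3, 2.0)   # inflated zero probability: 0.3 + 0.7 * exp(-2)
```

The excess of zeros (p0 well above the Poisson value exp(-2)) is exactly the feature of failed-course counts that motivates the ZI models in the abstract.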

  9. Extra-binomial variation approach for analysis of pooled DNA sequencing data

    Science.gov (United States)

    Wallace, Chris

    2012-01-01

    Motivation: The invention of next-generation sequencing technology has made it possible to study the rare variants that are more likely to pinpoint causal disease genes. To make such experiments financially viable, DNA samples from several subjects are often pooled before sequencing. This induces large between-pool variation which, together with other sources of experimental error, creates over-dispersed data. Statistical analysis of pooled sequencing data needs to appropriately model this additional variance to avoid inflating the false-positive rate. Results: We propose a new statistical method based on an extra-binomial model to address the over-dispersion and apply it to pooled case-control data. We demonstrate that our model provides a better fit to the data than either a standard binomial model or a traditional extra-binomial model proposed by Williams and can analyse both rare and common variants with lower or more variable pool depths compared to the other methods. Availability: Package ‘extraBinomial’ is on http://cran.r-project.org/ Contact: chris.wallace@cimr.cam.ac.uk Supplementary information: Supplementary data are available at Bioinformatics Online. PMID:22976083
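The extra-binomial (over-dispersed) behaviour the method addresses is conveniently described by the beta-binomial model, in which the pool-level success probability itself varies; a stdlib sketch, illustrative rather than the 'extraBinomial' package's actual algorithm:

```python
import math

def beta_binomial_pmf(k, n, a, b):
    """Beta-binomial: a binomial whose success probability is drawn
    from Beta(a, b) for each pool. P(k) = C(n,k) B(k+a, n-k+b) / B(a, b)."""
    def log_beta(x, y):
        return math.lgamma(x) + math.lgamma(y) - math.lgamma(x + y)
    return math.exp(math.log(math.comb(n, k))
                    + log_beta(k + a, n - k + b) - log_beta(a, b))

def beta_binomial_var(n, a, b):
    """Variance n*p*(1-p)*(1 + (n-1)*rho): inflated over the binomial
    whenever the over-dispersion parameter rho is positive."""
    p = a / (a + b)
    rho = 1.0 / (a + b + 1.0)
    return n * p * (1 - p) * (1 + (n - 1) * rho)
```

Fitting a plain binomial to data generated this way understates the variance, which is the inflation of false-positive rates the abstract warns about.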

  10. Visual Sample Plan Version 7.0 User's Guide

    Energy Technology Data Exchange (ETDEWEB)

    Matzke, Brett D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Newburn, Lisa LN [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hathaway, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bramer, Lisa M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wilson, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dowson, Scott T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sego, Landon H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pulsipher, Brent A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-03-01

    This user's guide describes Visual Sample Plan (VSP) Version 7.0 and provides instructions for using the software. VSP selects the appropriate number and location of environmental samples to ensure that the results of statistical tests performed to provide input to risk decisions have the required confidence and performance. VSP Version 7.0 provides sample-size equations or algorithms needed by specific statistical tests appropriate for specific environmental sampling objectives. It also provides data quality assessment and statistical analysis functions to support evaluation of the data and determine whether the data support decisions regarding sites suspected of contamination. The easy-to-use program is highly visual and graphic. VSP runs on personal computers with Microsoft Windows operating systems (XP, Vista, Windows 7, and Windows 8). Designed primarily for project managers and users without expertise in statistics, VSP is applicable to two- and three-dimensional populations to be sampled (e.g., rooms and buildings, surface soil, a defined layer of subsurface soil, water bodies, and other similar applications) for studies of environmental quality. VSP is also applicable for designing sampling plans for assessing chem/rad/bio threat and hazard identification within rooms and buildings, and for designing geophysical surveys for unexploded ordnance (UXO) identification.
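    The kind of sample-size equation VSP implements can be illustrated with the classic one-sample z-test formula n = ((z_{1-alpha} + z_{1-beta}) * sigma / delta)^2. This is a generic, dependency-free sketch of that style of calculation, not VSP's actual code; the normal quantile is obtained by simple bisection on math.erf.

```python
import math

def z_quantile(p: float) -> float:
    """Standard normal quantile, found by bisection on the normal CDF
    (written with math.erf so no external library is needed)."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def one_sample_n(alpha: float, beta: float, sd: float, delta: float) -> int:
    """Samples needed so a one-sample z-test with false-positive rate alpha
    and false-negative rate beta can detect a mean shift of delta."""
    z = z_quantile(1 - alpha) + z_quantile(1 - beta)
    return math.ceil((z * sd / delta) ** 2)

print(one_sample_n(0.05, 0.10, sd=2.0, delta=1.0))  # 35 samples
```

Halving the detectable shift delta roughly quadruples the required n, which is why tools like VSP make the precision/cost trade-off explicit.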

  11. Planning considerations for a Mars Sample Receiving Facility: summary and interpretation of three design studies.

    Science.gov (United States)

    Beaty, David W; Allen, Carlton C; Bass, Deborah S; Buxbaum, Karen L; Campbell, James K; Lindstrom, David J; Miller, Sylvia L; Papanastassiou, Dimitri A

    2009-10-01

    It has been widely understood for many years that an essential component of a Mars Sample Return mission is a Sample Receiving Facility (SRF). The purpose of such a facility would be to take delivery of the flight hardware that lands on Earth, open the spacecraft and extract the sample container and samples, and conduct an agreed-upon test protocol, ensuring strict containment and contamination control of the samples while they are in the SRF. Any samples that are found to be non-hazardous (or are rendered non-hazardous by sterilization) would then be transferred to long-term curation. Although the general concept of an SRF is relatively straightforward, there has been considerable discussion about implementation planning. The Mars Exploration Program carried out an analysis of the attributes of an SRF to establish its scope, including minimum size and functionality, budgetary requirements (capital cost, operating costs, cost profile), and development schedule. The approach was to arrange for three independent design studies, each led by an architectural design firm, and compare the results. While there were many design elements in common identified by each study team, there were significant differences in the way human operators were to interact with the systems. In aggregate, the design studies provided insight into the attributes of a future SRF and the complex factors to consider for future programmatic planning.

  12. Sampling and analysis plan for assessment of beryllium in soils surrounding TA-40 building 15

    Energy Technology Data Exchange (ETDEWEB)

    Ruedig, Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-12-19

    Technical Area (TA) 40 Building 15 (40-15) is an active firing site at Los Alamos National Laboratory. The weapons facility operations (WFO) group plans to build an enclosure over the site in 2017, so that test shots may be conducted year-round. The enclosure project is described in PRID 16P-0209. 40-15 is listed on LANL OSH-ISH’s beryllium inventory, which reflects the potential for beryllium in/on soils and building surfaces at 40-15. Some areas in and around 40-15 have previously been sampled for beryllium, but past sampling efforts did not achieve complete spatial coverage of the area. This Sampling and Analysis Plan (SAP) investigates the area surrounding 40-15 via 9 deep (≥1-ft.) soil samples and 11 shallow (6-in.) soil samples. These samples will fill the spatial data gaps for beryllium at 40-15, and will be used to support OSH-ISH’s final determination of 40-15’s beryllium registry status. This SAP has been prepared by the Environmental Health Physics program in consultation with the Industrial Hygiene program. Industrial Hygiene is the owner of LANL’s beryllium program, and will make a final determination with regard to the regulatory status of beryllium at 40-15.

  13. Chromosome aberration analysis based on a beta-binomial distribution

    International Nuclear Information System (INIS)

    Otake, Masanori; Prentice, R.L.

    1983-10-01

    Analyses carried out here generalize earlier studies of chromosomal aberrations in the populations of Hiroshima and Nagasaki by allowing extra-binomial variation in aberrant cell counts, corresponding to within-subject correlations in cell aberrations. Strong within-subject correlations were detected, with corresponding standard errors for the average number of aberrant cells that were often substantially larger than previously assumed. The extra-binomial variation is accommodated in the analysis in the present report, as described in the section on dose-response models, by using a beta-binomial (B-B) variance structure. Generally satisfactory agreement is obtained between the observed and the B-B fitted frequencies by city-dose category. The chromosomal aberration data considered here are not extensive enough to allow a precise discrimination between competing dose-response models; a quadratic gamma-ray and linear neutron model, however, most closely fits the chromosome data. (author)
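    The beta-binomial variance structure mentioned above can be sketched as follows: drawing each subject's aberration probability from a Beta(a, b) distribution inflates the binomial variance n*p*(1-p) by the factor 1 + (n-1)*rho, where rho = 1/(a+b+1) is the within-subject correlation. The parameter values below (100 cells per subject, mean rate 0.05) are hypothetical, not the paper's estimates.

```python
import math

def beta_binomial_pmf(k: int, n: int, a: float, b: float) -> float:
    """Beta-binomial: binomial with success probability drawn from Beta(a, b).
    Computed via log-gamma for numerical stability."""
    logp = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + math.lgamma(k + a) + math.lgamma(n - k + b) - math.lgamma(n + a + b)
            + math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b))
    return math.exp(logp)

# Hypothetical settings: n = 100 cells scored per subject, mean aberration
# rate p = a/(a+b) = 0.05, within-subject correlation rho = 1/(a+b+1).
a, b, n = 1.0, 19.0, 100
p, rho = a / (a + b), 1.0 / (a + b + 1.0)

mean = sum(k * beta_binomial_pmf(k, n, a, b) for k in range(n + 1))
var = sum((k - mean) ** 2 * beta_binomial_pmf(k, n, a, b) for k in range(n + 1))

print(mean)   # n*p = 5
print(var)    # n*p*(1-p)*(1 + (n-1)*rho), well above the binomial's 4.75
```

Ignoring rho and using the plain binomial variance is exactly the mistake the abstract warns about: it understates the standard error of the mean aberrant-cell count.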

  14. Hadronic multiplicity distributions: the negative binomial and its alternatives

    International Nuclear Information System (INIS)

    Carruthers, P.

    1986-01-01

    We review properties of the negative binomial distribution, along with its many possible statistical or dynamical origins. Considering the relation of the multiplicity distribution to the density matrix for boson systems, we re-introduce the partially coherent laser distribution, which allows for coherent as well as incoherent hadronic emission from the k fundamental cells, and provides equally good phenomenological fits to existing data. The broadening of non-single-diffractive hadron-hadron distributions can equally well be due to the decrease of coherence with increasing energy as to the large (and rapidly decreasing) values of k deduced from negative binomial fits. Similarly, the narrowness of the e+e- multiplicity distribution is due to nearly coherent (therefore nearly Poissonian) emission from a small number of jets, in contrast to the negative binomial with enormous values of k. 31 refs

  16. The k-Binomial Transforms and the Hankel Transform

    Science.gov (United States)

    Spivey, Michael Z.; Steil, Laura L.

    2006-01-01

    We give a new proof of the invariance of the Hankel transform under the binomial transform of a sequence. Our method of proof leads to three variations of the binomial transform; we call these the k-binomial transforms. We give a simple means of constructing these transforms via a triangle of numbers. We show how the exponential generating function of a sequence changes after our transforms are applied, and we use this to prove that several sequences in the On-Line Encyclopedia of Integer Sequences are related via our transforms. In the process, we prove three conjectures in the OEIS. Addressing a question of Layman, we then show that the Hankel transform of a sequence is invariant under one of our transforms, and we show how the Hankel transform changes after the other two transforms are applied. Finally, we use these results to determine the Hankel transforms of several integer sequences.
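    The invariance described above is easy to check numerically for a specific sequence. The sketch below uses the Catalan numbers, whose Hankel transform is the all-ones sequence, and verifies that applying the binomial transform first leaves the Hankel transform unchanged; exact rational arithmetic avoids floating-point determinant issues.

```python
from fractions import Fraction
from math import comb

def binomial_transform(a):
    """b_n = sum_{k=0}^{n} C(n, k) * a_k."""
    return [sum(comb(n, k) * a[k] for k in range(n + 1)) for n in range(len(a))]

def det(m):
    """Exact determinant by Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in m]
    d = Fraction(1)
    for i in range(len(m)):
        piv = next((r for r in range(i, len(m)) if m[r][i]), None)
        if piv is None:
            return Fraction(0)
        if piv != i:
            m[i], m[piv] = m[piv], m[i]
            d = -d
        d *= m[i][i]
        for r in range(i + 1, len(m)):
            f = m[r][i] / m[i][i]
            for c in range(i, len(m)):
                m[r][c] -= f * m[i][c]
    return d

def hankel_transform(a):
    """h_n = det of the (n+1)x(n+1) Hankel matrix [a_{i+j}]."""
    out, n = [], 0
    while 2 * n < len(a):
        out.append(det([[a[i + j] for j in range(n + 1)] for i in range(n + 1)]))
        n += 1
    return out

catalan = [1, 1, 2, 5, 14, 42, 132, 429, 1430]
print(hankel_transform(catalan))                      # all 1s for Catalan
print(hankel_transform(binomial_transform(catalan)))  # unchanged: invariance
```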

  17. Sampling and analysis plan for sampling of liquid waste streams generated by 222-S Laboratory Complex operations

    International Nuclear Information System (INIS)

    Benally, A.B.

    1997-01-01

    This Sampling and Analysis Plan (SAP) establishes the requirements and guidelines to be used by Waste Management Federal Services of Hanford, Inc. personnel in characterizing liquid waste generated at the 222-S Laboratory Complex. The characterization process to verify the accuracy of process knowledge used for designation and subsequent management of wastes consists of three steps: to prepare the technical rationale and the appendix in accordance with the steps outlined in this SAP; to implement the SAP by sampling and analyzing the requested waste streams; and to compile the report and evaluate the findings against the objectives of this SAP. This SAP applies to portions of the 222-S Laboratory Complex defined as a Generator under the Resource Conservation and Recovery Act (RCRA). Any portion of the 222-S Laboratory Complex that is defined or permitted under RCRA as a treatment, storage, or disposal (TSD) facility is excluded from this document. This SAP applies to the liquid waste generated in the 222-S Laboratory Complex. Because the analytical data obtained will be used to manage waste properly, including waste compatibility and waste designation, this SAP will provide directions for obtaining and maintaining the information as required by WAC 173-303

  18. Savannah River Site sample and analysis plan for Clemson Technical Center waste

    International Nuclear Information System (INIS)

    Hagstrom, T.

    1998-04-01

    The purpose of this sampling and analysis plan is to determine the chemical, physical, and radiological properties of the SRS radioactive Polychlorinated Biphenyl (PCB) liquid waste stream, to verify that it conforms to the Waste Acceptance Criteria of the Department of Energy (DOE) East Tennessee Technology Park (ETTP) Toxic Substance Control Act (TSCA) Incineration Facility. Waste being sent to the ETTP TSCA Incinerator for treatment must be sufficiently characterized to ensure that the waste stream meets the waste acceptance criteria and to ensure proper handling, classification, and processing of incoming waste under the Waste Storage and Treatment Facility's Operating Permits. This sampling and analysis plan is limited to WSRC container(s) of homogeneous or multiphasic radioactive PCB-contaminated liquids generated in association with a treatability study at Clemson Technical Center (CTC) and currently stored at the WSRC Solid Waste Division Mixed Waste Storage Facility (MWSF)

  19. Interpretations and implications of negative binomial distributions of multiparticle productions

    International Nuclear Information System (INIS)

    Arisawa, Tetsuo

    2006-01-01

    The number of particles produced in high energy experiments is approximated by a negative binomial distribution. Deriving a representation of the distribution from a stochastic equation, conditions for the process to satisfy the distribution are clarified. Based on them, it is proposed that multiparticle production consists of spontaneous and induced production. The rate of the induced production is proportional to the number of existing particles. The ratio of the two production rates remains constant during the process. The "NBD space" is also defined, where the number of particles produced in its subspaces follows negative binomial distributions with different parameters
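    The spontaneous-plus-induced mechanism can be sketched with a small Monte Carlo simulation: a pure birth process with a constant spontaneous rate a and an induced rate b per existing particle (total rate a + b*n) is negative-binomially distributed at any fixed time, with shape parameter k = a/b. The rates and time horizon below are illustrative, not taken from the paper.

```python
import random

def simulate_production(a, b, t_end, rng):
    """One realization of particle production with spontaneous rate a and
    induced rate b per existing particle (Gillespie-style jump simulation)."""
    n, t = 0, 0.0
    while True:
        t += rng.expovariate(a + b * n)   # waiting time to the next production
        if t > t_end:
            return n
        n += 1

rng = random.Random(0)                    # fixed seed for reproducibility
a, b, t_end = 1.0, 0.5, 2.0              # illustrative rates; k = a/b = 2
counts = [simulate_production(a, b, t_end, rng) for _ in range(2000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(mean, var)   # var > mean: the over-dispersion typical of the NBD
```

The theoretical mean here is (a/b)*(exp(b*t_end) - 1), about 3.44, while the variance is larger by the factor exp(b*t_end), which is the negative binomial signature.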

  20. Spatial distribution pattern and sequential sampling plans for Bactrocera oleae (Gmelin) (Dip.: Tephritidae) in olive orchards

    Directory of Open Access Journals (Sweden)

    A. Arbab

    2016-04-01

    The distribution of adults and larvae of Bactrocera oleae (Diptera: Tephritidae), a key pest of olive, was studied in olive orchards. The first objective was to analyze the dispersion of this insect on olive, and the second was to develop sampling plans based on fixed levels of precision for estimating B. oleae populations. Taylor's power law and Iwao's patchiness regression models were used to analyze the data. Our results document that Iwao's patchiness regression provided a better description of the variance-mean relationship. Taylor's b and Iwao's β were both significantly greater than 1, indicating that adults and larvae had an aggregated spatial distribution. This result was further supported by the calculated common k of 2.17 and 4.76 for adults and larvae, respectively. Iwao's α for larvae was significantly less than 0, indicating that the basic distribution component of B. oleae is the individual insect. Optimal sample sizes for fixed precision levels of 0.10 and 0.25 were estimated with Iwao's patchiness coefficients. The optimum sample size for adults and larvae fluctuated throughout the seasons and depended upon fly density and the desired level of precision. For adults, it generally ranged from 2 to 11 and from 7 to 15 traps to achieve precision levels of 0.25 and 0.10, respectively. With respect to optimum sample size, the developed fixed-precision sequential sampling plans were suitable for estimating fly density at a precision level of D = 0.25. The sampling plans presented here should be a tool for research on pest management decisions for B. oleae.
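    The fixed-precision machinery in this abstract can be sketched with stdlib tools: fit Taylor's power law s^2 = a*m^b by least squares on the log scale, then compute the sample size needed for a target precision D. Green's formula n = a*m^(b-2)/D^2 is shown alongside the Iwao-based form n = ((alpha+1)/m + beta-1)/D^2 that the paper actually uses; all numbers below are synthetic, not the paper's estimates.

```python
import math

def fit_power_law(means, variances):
    """Least-squares fit of log(s^2) = log(a) + b*log(m) (Taylor's power law)."""
    xs = [math.log(m) for m in means]
    ys = [math.log(v) for v in variances]
    xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    a = math.exp(ybar - b * xbar)
    return a, b

def green_sample_size(a, b, m, precision):
    """Green's fixed-precision plan from Taylor's coefficients:
    n = a * m**(b - 2) / D**2."""
    return math.ceil(a * m ** (b - 2) / precision ** 2)

def iwao_sample_size(alpha, beta, m, precision):
    """Iwao-based fixed-precision plan: n = ((alpha+1)/m + beta - 1) / D**2."""
    return math.ceil(((alpha + 1) / m + beta - 1) / precision ** 2)

# Synthetic mean/variance pairs following s^2 = 2 * m^1.5 exactly (b > 1
# mimics the aggregation reported in the paper).
means = [0.5, 1.0, 2.0, 4.0, 8.0]
variances = [2.0 * m ** 1.5 for m in means]
a, b = fit_power_law(means, variances)
print(a, b)                                         # recovers a = 2, b = 1.5
print(green_sample_size(a, b, m=2.0, precision=0.25))
print(green_sample_size(a, b, m=2.0, precision=0.10))   # tighter D, more traps
print(iwao_sample_size(1.0, 1.4, m=2.0, precision=0.25))  # hypothetical alpha, beta
```

Note how the required trap count falls as mean density rises (for b < 2) and grows as D shrinks, matching the season-dependent sample sizes the abstract reports.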

  1. Health and safety plan for characterization sampling of ETR and MTR facilities

    International Nuclear Information System (INIS)

    Baxter, D.E.

    1994-10-01

    This health and safety plan establishes the procedures and requirements that will be used to minimize health and safety risks to persons performing Engineering Test Reactor and Materials Test Reactor characterization sampling activities, as required by the Occupational Safety and Health Administration standard, 29 CFR 1910.120. It contains information about the hazards involved in performing the tasks, and the specific actions and equipment that will be used to protect persons working at the site

  2. Laboratory Guide for Residual Stress Sample Alignment and Experiment Planning-October 2011 Version

    International Nuclear Information System (INIS)

    Cornwell, Paris A.; Bunn, Jeffrey R.; Schmidlin, Joshua E.; Hubbard, Camden R.

    2012-01-01

    The December 2010 version of the guide, ORNL/TM-2008/159, by Jeff Bunn, Josh Schmidlin, Camden Hubbard, and Paris Cornwell, has been further revised due to a major change in the GeoMagic Studio software for constructing a surface model. The Studio software update also includes a plug-in module to operate the FARO Scan Arm. Other revisions for clarity were also made. The purpose of this revision document is to guide the reader through the process of laser alignment used by NRSF2 at HFIR and VULCAN at SNS. This system was created to increase the spatial accuracy of the measurement points in a sample, reduce the use of neutron time used for alignment, improve experiment planning, and reduce operator error. The need for spatial resolution has been driven by the reduction in gauge volumes to the sub-millimeter level, steep strain gradients in some samples, and requests to mount multiple samples within a few days for relating data from each sample to a common sample coordinate system. The first step in this process involves mounting the sample on an indexer table in a laboratory set up for offline sample mounting and alignment in the same manner it would be mounted at either instrument. In the shared laboratory, a FARO ScanArm is used to measure the coordinates of points on the sample surface ('point cloud'), specific features and fiducial points. A Sample Coordinate System (SCS) needs to be established first. This is an advantage of the technique because the SCS can be defined in such a way to facilitate simple definition of measurement points within the sample. Next, samples are typically mounted to a frame of 80/20 and fiducial points are attached to the sample or frame then measured in the established sample coordinate system. The laser scan probe on the ScanArm can then be used to scan in an 'as-is' model of the sample as well as mounting hardware. GeoMagic Studio 12 is the software package used to construct the model from the point cloud the scan arm creates. 

  3. Laboratory Guide for Residual Stress Sample Alignment and Experiment Planning-October 2011 Version

    Energy Technology Data Exchange (ETDEWEB)

    Cornwell, Paris A [ORNL; Bunn, Jeffrey R [ORNL; Schmidlin, Joshua E [ORNL; Hubbard, Camden R [ORNL

    2012-04-01

    The December 2010 version of the guide, ORNL/TM-2008/159, by Jeff Bunn, Josh Schmidlin, Camden Hubbard, and Paris Cornwell, has been further revised due to a major change in the GeoMagic Studio software for constructing a surface model. The Studio software update also includes a plug-in module to operate the FARO Scan Arm. Other revisions for clarity were also made. The purpose of this revision document is to guide the reader through the process of laser alignment used by NRSF2 at HFIR and VULCAN at SNS. This system was created to increase the spatial accuracy of the measurement points in a sample, reduce the use of neutron time used for alignment, improve experiment planning, and reduce operator error. The need for spatial resolution has been driven by the reduction in gauge volumes to the sub-millimeter level, steep strain gradients in some samples, and requests to mount multiple samples within a few days for relating data from each sample to a common sample coordinate system. The first step in this process involves mounting the sample on an indexer table in a laboratory set up for offline sample mounting and alignment in the same manner it would be mounted at either instrument. In the shared laboratory, a FARO ScanArm is used to measure the coordinates of points on the sample surface ('point cloud'), specific features and fiducial points. A Sample Coordinate System (SCS) needs to be established first. This is an advantage of the technique because the SCS can be defined in such a way to facilitate simple definition of measurement points within the sample. Next, samples are typically mounted to a frame of 80/20 and fiducial points are attached to the sample or frame then measured in the established sample coordinate system. The laser scan probe on the ScanArm can then be used to scan in an 'as-is' model of the sample as well as mounting hardware. GeoMagic Studio 12 is the software package used to construct the model from the point cloud the scan arm creates.

  4. Supplement to the UMTRA Project water sampling and analysis plan, Maybell, Colorado

    International Nuclear Information System (INIS)

    1995-09-01

    This water sampling and analysis plan (WSAP) supplement supports the regulatory and technical basis for water sampling at the Maybell, Colorado, Uranium Mill Tailings Remedial Action (UMTRA) Project site, as defined in the 1994 WSAP document for Maybell (DOE, 1994a). Further, this supplement serves to confirm our present understanding of the site relative to the hydrogeology and contaminant distribution as well as our intention to continue to use the sampling strategy as presented in the 1994 WSAP document for Maybell. Ground water and surface water monitoring activities are derived from the US Environmental Protection Agency regulations in 40 CFR Part 192 (1994) and 60 CFR 2854 (1995). Sampling procedures are guided by the UMTRA Project standard operating procedures (JEG, n.d.), the Technical Approach Document (DOE, 1989), and the most effective technical approach for the site. Additional site-specific documents relevant to the Maybell site are the Maybell Baseline Risk Assessment (currently in progress), the Maybell Remedial Action Plan (RAP) (DOE, 1994b), and the Maybell Environmental Assessment (DOE, 1995)

  5. Generation of the reciprocal-binomial state for optical fields

    International Nuclear Information System (INIS)

    Valverde, C.; Avelar, A.T.; Baseia, B.; Malbouisson, J.M.C.

    2003-01-01

    We compare the efficiencies of two interesting schemes to generate truncated states of the light field in running modes, namely the 'quantum scissors' and the 'beam-splitter array' schemes. The latter is applied to create the reciprocal-binomial state as a travelling wave, required to implement recent experimental proposals of phase-distribution determination and of quantum lithography

  6. Improved binomial charts for monitoring high-quality processes

    NARCIS (Netherlands)

    Albers, Willem/Wim

    2009-01-01

    For processes concerning attribute data with (very) small failure rate p, often negative binomial control charts are used. The decision whether to stop or continue is made each time r failures have occurred, for some r≥1. Finding the optimal r for detecting a given increase of p first requires

  7. Statistical inference for a class of multivariate negative binomial distributions

    DEFF Research Database (Denmark)

    Rubak, Ege Holger; Møller, Jesper; McCullagh, Peter

    This paper considers statistical inference procedures for a class of models for positively correlated count variables called α-permanental random fields, and which can be viewed as a family of multivariate negative binomial distributions. Their appealing probabilistic properties have earlier been...

  8. Improved binomial charts for high-quality processes

    NARCIS (Netherlands)

    Albers, Willem/Wim

    For processes concerning attribute data with (very) small failure rate p, often negative binomial control charts are used. The decision whether to stop or continue is made each time r failures have occurred, for some r≥1. Finding the optimal r for detecting a given increase of p first requires

  9. Calculation of generalized secant integral using binomial coefficients

    International Nuclear Information System (INIS)

    Guseinov, I.I.; Mamedov, B.A.

    2004-01-01

    A single series expansion relation is derived for the generalized secant (GS) integral in terms of binomial coefficients, exponential integrals and incomplete gamma functions. The convergence of the series is tested for concrete cases of the parameters. The formulas given in this study for the evaluation of the GS integral show a good rate of convergence and numerical stability

  10. Selecting Tools to Model Integer and Binomial Multiplication

    Science.gov (United States)

    Pratt, Sarah Smitherman; Eddy, Colleen M.

    2017-01-01

    Mathematics teachers frequently provide concrete manipulatives to students during instruction; however, the rationale for using certain manipulatives in conjunction with concepts may not be explored. This article focuses on area models that are currently used in classrooms to provide concrete examples of integer and binomial multiplication. The…

  11. Using the β-binomial distribution to characterize forest health

    Science.gov (United States)

    S.J. Zarnoch; R.L. Anderson; R.M. Sheffield

    1995-01-01

    The β-binomial distribution is suggested as a model for describing and analyzing the dichotomous data obtained from programs monitoring the health of forests in the United States. Maximum likelihood estimation of the parameters is given as well as asymptotic likelihood ratio tests. The procedure is illustrated with data on dogwood anthracnose infection (caused...

  12. Currency lookback options and observation frequency: A binomial approach

    NARCIS (Netherlands)

    T.H.F. Cheuk; A.C.F. Vorst (Ton)

    1997-01-01

    In the last decade, interest in exotic options has been growing, especially in the over-the-counter currency market. In this paper we consider lookback currency options, which are path-dependent. We show that a one-state-variable binomial model for currency lookback options can
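    The path dependence of lookback options can be made concrete by brute force: enumerate every path of a Cox-Ross-Rubinstein binomial tree and average the discounted payoff. This is exponential in the number of steps, which is exactly the blow-up a one-state-variable model avoids; the sketch below is a didactic check with illustrative parameters, not the paper's method.

```python
from itertools import product
from math import exp, sqrt

def crr_params(r, sigma, dt):
    """Up/down factors and risk-neutral up-probability for a CRR tree."""
    u = exp(sigma * sqrt(dt))
    d = 1 / u
    p = (exp(r * dt) - d) / (u - d)
    return u, d, p

def lookback_call_bruteforce(s0, r, sigma, t, steps):
    """Floating-strike lookback call (payoff S_T - min S over the path),
    priced by enumerating all 2**steps paths."""
    dt = t / steps
    u, d, p = crr_params(r, sigma, dt)
    price = 0.0
    for path in product((u, d), repeat=steps):
        s, s_min, prob = s0, s0, 1.0
        for move in path:
            s *= move
            s_min = min(s_min, s)
            prob *= p if move == u else 1 - p
        price += prob * (s - s_min)
    return exp(-r * t) * price

def vanilla_call_bruteforce(s0, k, r, sigma, t, steps):
    """Ordinary European call on the same tree, for comparison."""
    dt = t / steps
    u, d, p = crr_params(r, sigma, dt)
    price = 0.0
    for path in product((u, d), repeat=steps):
        s, prob = s0, 1.0
        for move in path:
            s *= move
            prob *= p if move == u else 1 - p
        price += prob * max(s - k, 0.0)
    return exp(-r * t) * price

lb = lookback_call_bruteforce(100, 0.05, 0.2, 1.0, steps=10)
van = vanilla_call_bruteforce(100, 100, 0.05, 0.2, 1.0, steps=10)
print(lb, van)   # the lookback dominates the at-the-money vanilla
```

Since the running minimum never exceeds the starting price, the lookback payoff dominates the at-the-money call payoff on every path, so its price is strictly higher.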

  13. Application of Negative Binomial Regression for Assessing Public ...

    African Journals Online (AJOL)

    Because the variance was nearly two times greater than the mean, the negative binomial regression model provided an improved fit to the data and accounted better for overdispersion than the Poisson regression model, which assumed that the mean and variance are the same. The level of education and race were found
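    The overdispersion point above can be sketched with the NB2 parameterization of the negative binomial, whose variance mu + mu^2/k exceeds the Poisson variance mu whenever k is finite. The values below (mu = 3, k = 2, giving a variance 2.5 times the mean, close to the abstract's situation) are illustrative.

```python
import math

def neg_binomial_pmf(y: int, mu: float, k: float) -> float:
    """NB2 parameterization: mean mu, variance mu + mu**2 / k."""
    logp = (math.lgamma(y + k) - math.lgamma(k) - math.lgamma(y + 1)
            + k * math.log(k / (k + mu)) + y * math.log(mu / (k + mu)))
    return math.exp(logp)

mu, k = 3.0, 2.0
mean = sum(y * neg_binomial_pmf(y, mu, k) for y in range(200))
var = sum((y - mean) ** 2 * neg_binomial_pmf(y, mu, k) for y in range(200))
print(mean, var)   # variance mu + mu^2/k = 7.5, well above the Poisson's 3
```

A Poisson regression forces var = mean; when the data's variance is roughly double the mean, as in the abstract, the extra mu^2/k term is what the negative binomial model supplies.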

  14. Sampling and analysis plan (SAP) for WESF drains and TK-100 sump

    International Nuclear Information System (INIS)

    Simmons, F.M.

    1998-01-01

    The intent of this project is to determine whether the Waste Encapsulation and Storage Facility (WESF) floor drain piping and the TK-100 sump are free from contamination. TK-100 is currently used as a catch tank to transfer low level liquid waste from WESF to Tank Farms via B Plant. This system is being modified as part of the WESF decoupling since B Plant is being deactivated. As a result of the 1,1,1-trichloroethane (TCA) discovery in TK-100, the associated WESF floor drains and the pit sump need to be sampled. Breakdown constituents have been reviewed and found to be non-hazardous. There are 29 floor drains that tie into a common header leading into the tank. To prevent high exposure during sampling of the drains, TK-100 will be removed into the B Plant canyon and a new tank will be placed in the pit before any floor drain samples are taken. The sump will be sampled prior to TK-100 removal. A sample of the sludge and any liquid in the sump will be taken and analyzed for TCA and polychlorinated biphenyl (PCB). After the sump has been sampled, the vault floor will be flushed. The flush will be transferred from the sump into TK-100. TK-100 will be moved into B Plant. The vault will then be cleaned of debris and visually inspected. If there is no visual indication of TCA or PCB staining, the vault will be painted and a new tank installed. If there is an indication of TCA or PCB from laboratory analysis or staining, further negotiations will be required to determine a path forward. A total of 8 sets of three 40ml samples will be required for all of the floor drains and sump. The sump set will include one 125ml solid sample. The only analysis required will be for TCA in liquids. PCBs will be checked in sump solids only. The Sampling and Analysis Plan (SAP) is written to provide direction for the sampling and analytical activities of the 29 WESF floor drains and the TK-100 sump. The intent of this plan is to define the responsibilities of the various organizations

  15. 40 CFR Appendix Xi to Part 86 - Sampling Plans for Selective Enforcement Auditing of Light-Duty Vehicles

    Science.gov (United States)

    2010-07-01

    Appendix XI to 40 CFR Part 86: Sampling Plans for Selective Enforcement Auditing of Light-Duty Vehicles (Protection of Environment, 2010-07-01 edition); 40% AQL. Table 1: Sampling Plan Code Letter by annual sales of...

  16. WIPP Sampling and Analysis Plan for Solid Waste Management Units and Areas of Concern

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2000-05-23

    This Sampling and Analysis Plan (SAP) has been prepared to fulfill requirements of Module VII, Section VII.M.2 and Table VII.1, requirement 4 of the Waste Isolation Pilot Plant (WIPP) Hazardous Waste Permit, NM4890139088-TSDF (the Permit); (NMED [New Mexico Environment Department], 1999a). This SAP describes the approach for investigation of the Solid Waste Management Units (SWMU) and Areas of Concern (AOC) specified in the Permit. This SAP addresses the current Permit requirements for a RCRA Facility Investigation (RFI) of SWMUs and AOCs. It uses the results of previous investigations performed at WIPP and expands the investigations as required by the Permit. As an alternative to the RFI specified in Module VII of the Permit, current NMED guidance identifies an Accelerated Corrective Action Approach (ACAA) that may be used for any SWMU or AOC (NMED, 1998). This accelerated approach is used to replace the standard RFI work plan and report sequence with a more flexible decision-making approach. The ACAA process allows a facility to exit the schedule of compliance contained in the facility's Hazardous and Solid Waste Amendments (HSWA) permit module and proceed on an accelerated time frame. Thus, the ACAA process can be entered either before or after a RFI work plan. According to NMED's guidance, a facility can prepare a RFI work plan or SAP for any SWMU or AOC (NMED, 1998).

  17. WIPP Sampling and Analysis Plan for Solid Waste Management Units and Areas of Concern

    International Nuclear Information System (INIS)

    2000-01-01

    This Sampling and Analysis Plan (SAP) has been prepared to fulfill requirements of Module VII, Section VII.M.2 and Table VII.1, requirement 4 of the Waste Isolation Pilot Plant (WIPP) Hazardous Waste Permit, NM4890139088-TSDF (the Permit); (NMED [New Mexico Environment Department], 1999a). This SAP describes the approach for investigation of the Solid Waste Management Units (SWMU) and Areas of Concern (AOC) specified in the Permit. This SAP addresses the current Permit requirements for a RCRA Facility Investigation (RFI) of SWMUs and AOCs. It uses the results of previous investigations performed at WIPP and expands the investigations as required by the Permit. As an alternative to the RFI specified in Module VII of the Permit, current NMED guidance identifies an Accelerated Corrective Action Approach (ACAA) that may be used for any SWMU or AOC (NMED, 1998). This accelerated approach is used to replace the standard RFI work plan and report sequence with a more flexible decision-making approach. The ACAA process allows a facility to exit the schedule of compliance contained in the facility's Hazardous and Solid Waste Amendments (HSWA) permit module and proceed on an accelerated time frame. Thus, the ACAA process can be entered either before or after a RFI work plan. According to NMED's guidance, a facility can prepare a RFI work plan or SAP for any SWMU or AOC (NMED, 1998).

  18. FIRM: Sampling-based feedback motion-planning under motion uncertainty and imperfect measurements

    KAUST Repository

    Agha-mohammadi, A.-a.

    2013-11-15

    In this paper we present feedback-based information roadmap (FIRM), a multi-query approach for planning under uncertainty which is a belief-space variant of probabilistic roadmap methods. The crucial feature of FIRM is that the costs associated with the edges are independent of each other, and in this sense it is the first method that generates a graph in belief space that preserves the optimal substructure property. From a practical point of view, FIRM is a robust and reliable planning framework. It is robust since the solution is a feedback and there is no need for expensive replanning. It is reliable because accurate collision probabilities can be computed along the edges. In addition, FIRM is a scalable framework, where the complexity of planning with FIRM is a constant multiplier of the complexity of planning with PRM. In this paper, FIRM is introduced as an abstract framework. As a concrete instantiation of FIRM, we adopt stationary linear quadratic Gaussian (SLQG) controllers as belief stabilizers and introduce the so-called SLQG-FIRM. In SLQG-FIRM we focus on kinematic systems and then extend to dynamical systems by sampling in the equilibrium space. We investigate the performance of SLQG-FIRM in different scenarios. © The Author(s) 2013.

  19. Oral health workforce planning. Part 1: Data available in a sample of FDI member countries.

    Science.gov (United States)

    Yamalik, Nermin; Ensaldo-Carrasco, Eduardo; Bourgeois, Denis

    2013-12-01

    Workforce planning is a resource for measuring and comparing the current workforce against future needs. Organised dentistry needs to focus on the benefits, determinants and various systems of workforce planning, together with the challenges, new trends and threats. The aim of the study was to identify data sources from countries relating to a selection of oral health indicators in a sample of FDI member countries. The potential for differences between developed and developing countries was also examined. A cross-sectional survey study was carried out among FDI member countries classified in developed and developing countries between October 2011 and January/February 2012. A questionnaire was developed addressing the availability of 40 selected indicators distributed in four domains. Mann-Whitney U-tests to identify differences between developed and developing countries and chi-square tests for the degree of information regularly available were carried out. There is an important lack of information about indicators relevant to oral health among FDI participating countries regardless of their level of economic development. Although not significant, the availability of indicators for developing countries showed higher variability and minimum values of zero for all domains. Surveys were the source of information more frequently reported. Standardised and reliable methodologies are needed to gather information for successful workforce planning. It is of utmost importance to increase the awareness and understanding of the member National Dental Associations regarding the role, basic elements, benefits, challenges, models and critical elements of an ideal workforce planning system.

  20. Short-Run Contexts and Imperfect Testing for Continuous Sampling Plans

    Directory of Open Access Journals (Sweden)

    Mirella Rodriguez

    2018-04-01

    Continuous sampling plans are used to ensure a high level of quality for items produced in long-run contexts. The basic idea of these plans is to alternate between 100% inspection and a reduced rate of inspection frequency. Any inspected item that is found to be defective is replaced with a non-defective item. Because not all items are inspected, some defective items will escape to the customer. Analytical formulas have been developed that measure both the customer perceived quality and also the level of inspection effort. The analysis of continuous sampling plans does not apply to short-run contexts, where only a finite-size batch of items is to be produced. In this paper, a simulation algorithm is designed and implemented to analyze the customer perceived quality and the level of inspection effort for short-run contexts. A parameter representing the effectiveness of the test used during inspection is introduced to the analysis, and an analytical approximation is discussed. An application of the simulation algorithm that helped answer questions for the U.S. Navy is discussed.
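The alternate-between-100%-and-fractional-inspection scheme described above can be sketched as a small Monte Carlo in the spirit of the paper's simulation algorithm. This is an illustrative sketch only: the clearance number `i`, sampling fraction `f`, and test `sensitivity` values are assumed parameters, not those of the Navy study.

```python
import random

def simulate_csp1(batch, i=10, f=0.1, sensitivity=1.0, rng=None):
    """Monte Carlo sketch of a CSP-1-style continuous sampling plan
    on a finite (short-run) batch.

    batch: list of booleans, True = defective.
    i: clearance number (consecutive good items before switching to
       reduced, fractional inspection).
    f: sampling fraction during reduced inspection.
    sensitivity: probability an inspection detects a true defective
                 (the imperfect-testing parameter; 1.0 = perfect test).
    Returns (escaped_defectives, items_inspected).
    """
    rng = rng or random.Random(0)
    inspecting_all = True
    run = 0
    escaped = inspected = 0
    for defective in batch:
        inspect = inspecting_all or rng.random() < f
        if inspect:
            inspected += 1
            if defective and rng.random() < sensitivity:
                # Defective found: replace item, restart 100% inspection.
                inspecting_all = True
                run = 0
                continue
            if defective:
                escaped += 1  # inspected but missed by the imperfect test
            run += 1
            if inspecting_all and run >= i:
                inspecting_all = False  # switch to fractional inspection
        elif defective:
            escaped += 1  # not inspected, defect passes to the customer
    return escaped, inspected

# Example: a 1,000-item batch with a 5% defect rate and a 90%-sensitive test.
rng = random.Random(42)
batch = [rng.random() < 0.05 for _ in range(1000)]
escaped, inspected = simulate_csp1(batch, i=10, f=0.1, sensitivity=0.9)
```

Repeating the simulation over many random batches estimates the customer perceived quality (escaped defectives per item) against the inspection effort (fraction of items inspected).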

  1. A scalable method for parallelizing sampling-based motion planning algorithms

    KAUST Repository

    Jacobs, Sam Ade

    2012-05-01

    This paper describes a scalable method for parallelizing sampling-based motion planning algorithms. It subdivides configuration space (C-space) into (possibly overlapping) regions and independently, in parallel, uses standard (sequential) sampling-based planners to construct roadmaps in each region. Next, in parallel, regional roadmaps in adjacent regions are connected to form a global roadmap. By subdividing the space and restricting the locality of connection attempts, we reduce the work and inter-processor communication associated with nearest neighbor calculation, a critical bottleneck for scalability in existing parallel motion planning methods. We show that our method is general enough to handle a variety of planning schemes, including the widely used Probabilistic Roadmap (PRM) and Rapidly-exploring Random Trees (RRT) algorithms. We compare our approach to two other existing parallel algorithms and demonstrate that our approach achieves better and more scalable performance. Our approach achieves almost linear scalability on a 2400 core LINUX cluster and on a 153,216 core Cray XE6 petascale machine. © 2012 IEEE.

  2. Sampling and analysis plan for the 116-C-5 retention basins characteristic dangerous waste determination

    International Nuclear Information System (INIS)

    Bauer, R.G.; Dunks, K.L.

    1996-03-01

    Cooling water flow from the rear face of the 100-B and 100-C reactors was diverted to large retention basins prior to discharge to the Columbia River. These retention basins delayed the release of the reactor coolant for decay of the short-lived activation products and for thermal cooling. Some of the activation products were deposited in sludge that settled in the basins and discharge lines. In addition, some contamination was deposited in soil around the basins and associated piping. The sampling objective of this project is to determine if regulated levels of leachable lead are present in the abrasive materials used to decontaminate the retention basin tank walls, in the material between the tank base plate and the concrete foundation, and in the soils immediately surrounding the perimeter of the retention basins. Sampling details, including sampling locations, frequencies, and analytical requirements, are discussed. Also described is the quality assurance plan for this project

  3. Growth Estimators and Confidence Intervals for the Mean of Negative Binomial Random Variables with Unknown Dispersion

    Directory of Open Access Journals (Sweden)

    David Shilane

    2013-01-01

    The negative binomial distribution becomes highly skewed under extreme dispersion. Even at moderately large sample sizes, the sample mean exhibits a heavy right tail. The standard normal approximation often does not provide adequate inferences about the data's expected value in this setting. In previous work, we have examined alternative methods of generating confidence intervals for the expected value. These methods were based upon Gamma and Chi Square approximations or tail probability bounds such as Bernstein's inequality. We now propose growth estimators of the negative binomial mean. Under high dispersion, zero values are likely to be overrepresented in the data. A growth estimator constructs a normal-style confidence interval by effectively removing a small, predetermined number of zeros from the data. We propose growth estimators based upon multiplicative adjustments of the sample mean and direct removal of zeros from the sample. These methods do not require estimating the nuisance dispersion parameter. We will demonstrate that the growth estimators' confidence intervals provide improved coverage over a wide range of parameter values and asymptotically converge to the sample mean. Interestingly, the proposed methods succeed despite adding both bias and variance to the normal approximation.
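The zero-removal idea behind the growth estimator can be illustrated in a few lines. Note this is only a sketch of the direct-removal variant under assumed settings; the paper's actual adjustments (including the multiplicative form and the choice of how many zeros to remove) may differ in detail.

```python
import math

def growth_ci(data, zeros_removed=1, z=1.96):
    """Illustrative 'growth' confidence interval for a highly dispersed
    count sample: drop a small, predetermined number of zeros, then form
    the usual normal-approximation interval on the adjusted sample.
    (The exact adjustment in the paper may differ; this sketch only
    demonstrates the zero-removal mechanism.)"""
    adj = sorted(data)
    removed = 0
    while removed < zeros_removed and adj and adj[0] == 0:
        adj.pop(0)          # remove one zero from the sample
        removed += 1
    n = len(adj)
    mean = sum(adj) / n
    var = sum((x - mean) ** 2 for x in adj) / (n - 1)
    half = z * math.sqrt(var / n)
    return mean - half, mean + half

# A highly dispersed count sample: many zeros, occasional large values.
sample = [0] * 12 + [1, 2, 3, 9, 27]
lo, hi = growth_ci(sample, zeros_removed=2)
```

Removing zeros shifts the interval upward, countering the heavy right tail's tendency to make the standard interval undercover the true mean.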

  4. Stochastic analysis of complex reaction networks using binomial moment equations.

    Science.gov (United States)

    Barzel, Baruch; Biham, Ofer

    2012-09-01

    The stochastic analysis of complex reaction networks is a difficult problem because the number of microscopic states in such systems increases exponentially with the number of reactive species. Direct integration of the master equation is thus infeasible and is most often replaced by Monte Carlo simulations. While Monte Carlo simulations are a highly effective tool, equation-based formulations are more amenable to analytical treatment and may provide deeper insight into the dynamics of the network. Here, we present a highly efficient equation-based method for the analysis of stochastic reaction networks. The method is based on the recently introduced binomial moment equations [Barzel and Biham, Phys. Rev. Lett. 106, 150602 (2011)]. The binomial moments are linear combinations of the ordinary moments of the probability distribution function of the population sizes of the interacting species. They capture the essential combinatorics of the reaction processes reflecting their stoichiometric structure. This leads to a simple and transparent form of the equations, and allows a highly efficient and surprisingly simple truncation scheme. Unlike ordinary moment equations, in which the inclusion of high order moments is prohibitively complicated, the binomial moment equations can be easily constructed up to any desired order. The result is a set of equations that enables the stochastic analysis of complex reaction networks under a broad range of conditions. The number of equations is dramatically reduced from the exponential proliferation of the master equation to a polynomial (and often quadratic) dependence on the number of reactive species in the binomial moment equations. The aim of this paper is twofold: to present a complete derivation of the binomial moment equations; to demonstrate the applicability of the moment equations for a representative set of example networks, in which stochastic effects play an important role.
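The binomial moments the abstract refers to are expectations of binomial coefficients of the population size, B_k = E[C(N, k)], which are linear combinations of the ordinary moments E[N^j] for j ≤ k. A minimal sketch, checked against the standard identity that a Poisson(a) population has B_k = a^k / k! (the network equations themselves are beyond a few lines and are not reproduced here):

```python
import math
from math import comb

def binomial_moment(pmf, k):
    """k-th binomial moment B_k = E[C(N, k)] of a population-size
    distribution given as a dict {n: probability}."""
    return sum(p * comb(n, k) for n, p in pmf.items())

# Check against the Poisson identity B_k = a**k / k! for a = 2:
# E[C(N,2)] should be close to 2**2 / 2! = 2.0.
a = 2.0
pmf = {n: math.exp(-a) * a**n / math.factorial(n) for n in range(60)}
b2 = binomial_moment(pmf, 2)
```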

  5. Mission Planning and Decision Support for Underwater Glider Networks: A Sampling on-Demand Approach.

    Science.gov (United States)

    Ferri, Gabriele; Cococcioni, Marco; Alvarez, Alberto

    2015-12-26

    This paper describes an optimal sampling approach to support glider fleet operators and marine scientists during the complex task of planning the missions of fleets of underwater gliders. Optimal sampling, which has gained considerable attention in the last decade, consists in planning the paths of gliders to minimize a specific criterion pertinent to the phenomenon under investigation. Different criteria (e.g., A, G, or E optimality), used in geosciences to obtain an optimum design, lead to different sampling strategies. In particular, the A criterion produces paths for the gliders that minimize the overall level of uncertainty over the area of interest. However, there are commonly operative situations in which the marine scientists may prefer not to minimize the overall uncertainty of a certain area, but instead they may be interested in achieving an acceptable uncertainty sufficient for the scientific or operational needs of the mission. We propose and discuss here an approach named sampling on-demand that explicitly addresses this need. In our approach the user provides an objective map, setting both the amount and the geographic distribution of the uncertainty to be achieved after assimilating the information gathered by the fleet. A novel optimality criterion, called A_η, is proposed and the resulting minimization problem is solved by using a Simulated Annealing based optimizer that takes into account the constraints imposed by the glider navigation features, the desired geometry of the paths and the problems of reachability caused by ocean currents. This planning strategy has been implemented in a Matlab toolbox called SoDDS (Sampling on-Demand and Decision Support). The tool is able to automatically download the ocean fields data from MyOcean repository and also provides graphical user interfaces to ease the input process of mission parameters and targets. The results obtained by running SoDDS on three different scenarios are provided.

  6. Mission Planning and Decision Support for Underwater Glider Networks: A Sampling on-Demand Approach

    Directory of Open Access Journals (Sweden)

    Gabriele Ferri

    2015-12-01

    This paper describes an optimal sampling approach to support glider fleet operators and marine scientists during the complex task of planning the missions of fleets of underwater gliders. Optimal sampling, which has gained considerable attention in the last decade, consists in planning the paths of gliders to minimize a specific criterion pertinent to the phenomenon under investigation. Different criteria (e.g., A, G, or E optimality), used in geosciences to obtain an optimum design, lead to different sampling strategies. In particular, the A criterion produces paths for the gliders that minimize the overall level of uncertainty over the area of interest. However, there are commonly operative situations in which the marine scientists may prefer not to minimize the overall uncertainty of a certain area, but instead they may be interested in achieving an acceptable uncertainty sufficient for the scientific or operational needs of the mission. We propose and discuss here an approach named sampling on-demand that explicitly addresses this need. In our approach the user provides an objective map, setting both the amount and the geographic distribution of the uncertainty to be achieved after assimilating the information gathered by the fleet. A novel optimality criterion, called A_η, is proposed and the resulting minimization problem is solved by using a Simulated Annealing based optimizer that takes into account the constraints imposed by the glider navigation features, the desired geometry of the paths and the problems of reachability caused by ocean currents. This planning strategy has been implemented in a Matlab toolbox called SoDDS (Sampling on-Demand and Decision Support). The tool is able to automatically download the ocean fields data from MyOcean repository and also provides graphical user interfaces to ease the input process of mission parameters and targets. The results obtained by running SoDDS on three different scenarios are provided.
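The simulated-annealing step at the heart of the SoDDS optimizer can be sketched generically. This is a bare-bones minimiser on a toy 2-D "residual uncertainty" surface; the real planner's cost function, path encoding and navigation/current constraints (all omitted here) are what make the actual problem hard.

```python
import math
import random

def simulated_annealing(cost, neighbour, x0, t0=1.0, cooling=0.995,
                        steps=5000, seed=0):
    """Generic simulated-annealing minimiser: accept downhill moves
    always, uphill moves with Boltzmann probability exp(-dE / T),
    while the temperature T cools geometrically."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbour(x, rng)
        fy = cost(y)
        if fy <= fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy quadratic 'uncertainty' surface with its minimum at (3, -1).
cost = lambda p: (p[0] - 3) ** 2 + (p[1] + 1) ** 2
neighbour = lambda p, rng: (p[0] + rng.gauss(0, 0.3), p[1] + rng.gauss(0, 0.3))
best, fbest = simulated_annealing(cost, neighbour, (0.0, 0.0))
```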

  7. Sample size planning with the cost constraint for testing superiority and equivalence of two independent groups.

    Science.gov (United States)

    Guo, Jiin-Huarng; Chen, Hubert J; Luh, Wei-Ming

    2011-11-01

    The allocation of sufficient participants into different experimental groups for various research purposes under given constraints is an important practical problem faced by researchers. We address the problem of sample size determination between two independent groups for unequal and/or unknown variances when both the power and the differential cost are taken into consideration. We apply the well-known Welch approximate test to derive various sample size allocation ratios by minimizing the total cost or, equivalently, maximizing the statistical power. Two types of hypotheses including superiority/non-inferiority and equivalence of two means are each considered in the process of sample size planning. A simulation study is carried out and the proposed method is validated in terms of Type I error rate and statistical power. As a result, the simulation study reveals that the proposed sample size formulas are very satisfactory under various variances and sample size allocation ratios. Finally, a flowchart, tables, and figures of several sample size allocations are presented for practical reference. ©2011 The British Psychological Society.
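The core allocation result the abstract builds on is classical: minimising Var(x̄1 − x̄2) = σ1²/n1 + σ2²/n2 under a cost constraint c1·n1 + c2·n2 = C yields the square-root rule n1/n2 = (σ1/σ2)·√(c2/c1). A sketch of that textbook rule follows; the paper's Welch-based formulas add power and Type I error requirements on top of it, which this sketch omits.

```python
import math

def optimal_allocation(sigma1, sigma2, c1, c2, total_cost):
    """Cost-optimal sample sizes for two independent groups with
    unequal variances: minimising s1^2/n1 + s2^2/n2 subject to
    c1*n1 + c2*n2 = C gives n1/n2 = (s1/s2) * sqrt(c2/c1).
    Returns (n1, n2) as real numbers (round up in practice)."""
    r = (sigma1 / sigma2) * math.sqrt(c2 / c1)   # optimal ratio n1 / n2
    n2 = total_cost / (c1 * r + c2)
    n1 = r * n2
    return n1, n2

# The noisier, cheaper group gets the larger share of the budget.
n1, n2 = optimal_allocation(sigma1=4.0, sigma2=2.0, c1=1.0, c2=4.0,
                            total_cost=400.0)
```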

  8. Optimal sampling plan for clean development mechanism lighting projects with lamp population decay

    International Nuclear Information System (INIS)

    Ye, Xianming; Xia, Xiaohua; Zhang, Jiangfeng

    2014-01-01

    Highlights: • A metering cost minimisation model is built with the lamp population decay to optimise the sampling plan of CDM lighting projects. • The model minimises the total metering cost and optimises the annual sample size during the crediting period. • The required 90/10 criterion sampling accuracy is satisfied for each CDM monitoring report. - Abstract: This paper proposes a metering cost minimisation model that minimises metering cost under the constraints of sampling accuracy requirement for clean development mechanism (CDM) energy efficiency (EE) lighting projects. Usually small scale (SSC) CDM EE lighting projects expect a crediting period of 10 years, during which the lighting population decays. The SSC CDM sampling guideline requires that the monitored key parameters for the carbon emission reduction quantification must satisfy the sampling accuracy of 90% confidence and 10% precision, known as the 90/10 criterion. For the existing registered CDM lighting projects, sample sizes are either decided by professional judgment or by rule-of-thumb without considering any optimisation. Lighting samples are randomly selected and their energy consumptions are monitored continuously by power meters. In this study, the sampling size determination problem is formulated as a metering cost minimisation model by incorporating a linear lighting decay model as given by the CDM guideline AMS-II.J. The 90/10 criterion is formulated as constraints to the metering cost minimisation problem. Optimal solutions to the problem minimise the metering cost whilst satisfying the 90/10 criterion for each reporting period. The proposed metering cost minimisation model is applicable to other CDM lighting projects with different population decay characteristics as well.
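The 90/10 constraint the model enforces can be sketched with a commonly used simplified sample-size formula: n0 = (z·cv/p)² with a finite-population correction, where z = 1.645 for 90% confidence, p = 0.10 precision, and cv = 0.5 is a conventional default coefficient of variation. This is illustrative only; the paper couples this constraint with lamp-population decay and metering cost over a 10-year crediting period, which the sketch omits.

```python
import math

def sample_size_90_10(population, cv=0.5, z=1.645, precision=0.10):
    """Sample size meeting the CDM 90/10 criterion (90% confidence,
    10% precision) in a common simplified form: n0 = (z*cv/p)^2,
    reduced by a finite-population correction for small populations."""
    n0 = (z * cv / precision) ** 2
    n = n0 / (1.0 + (n0 - 1.0) / population)
    return math.ceil(n)

# Year-1 lamp population of 10,000 vs a decayed later-year population.
n_year1 = sample_size_90_10(10000)
n_later = sample_size_90_10(3000)
```

As the population decays the required sample shrinks only slightly, which is why re-optimising the annual sample size against metering cost matters over a long crediting period.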

  9. The importance of distribution-choice in modeling substance use data: a comparison of negative binomial, beta binomial, and zero-inflated distributions.

    Science.gov (United States)

    Wagner, Brandie; Riggs, Paula; Mikulich-Gilbertson, Susan

    2015-01-01

    It is important to correctly understand the associations among addiction to multiple drugs and between co-occurring substance use and psychiatric disorders. Substance-specific outcomes (e.g. number of days used cannabis) have distributional characteristics which range widely depending on the substance and the sample being evaluated. We recommend a four-part strategy for determining the appropriate distribution for modeling substance use data. We demonstrate this strategy by comparing the model fit and resulting inferences from applying four different distributions to model use of substances that range greatly in the prevalence and frequency of their use. Using Timeline Followback (TLFB) data from a previously-published study, we used negative binomial, beta-binomial and their zero-inflated counterparts to model proportion of days during treatment of cannabis, cigarettes, alcohol, and opioid use. The fit for each distribution was evaluated with statistical model selection criteria, visual plots and a comparison of the resulting inferences. We demonstrate the feasibility and utility of modeling each substance individually and show that no single distribution provides the best fit for all substances. Inferences regarding use of each substance and associations with important clinical variables were not consistent across models and differed by substance. Thus, the distribution chosen for modeling substance use must be carefully selected and evaluated because it may impact the resulting conclusions. Furthermore, the common procedure of aggregating use across different substances may not be ideal.
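The model-selection step of the recommended strategy can be sketched by comparing AIC across candidate count distributions fitted by maximum likelihood. As a minimal stand-in for the paper's four candidates (negative binomial, beta-binomial and their zero-inflated counterparts), the sketch below contrasts Poisson and negative binomial fits on simulated overdispersed counts; the data and parameters are synthetic, not the TLFB data.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
# Overdispersed counts (negative binomial, mean 8, variance 40), loosely
# mimicking heavy, clumped substance-use day counts.
y = rng.negative_binomial(2, 0.2, size=500)

def nb_negloglik(params):
    r, p = params
    return -stats.nbinom.logpmf(y, r, p).sum()

# Poisson MLE is just the sample mean; the NB fit needs numerical optimisation.
lam = y.mean()
ll_pois = stats.poisson.logpmf(y, lam).sum()
res = optimize.minimize(nb_negloglik, x0=[1.0, 0.5],
                        bounds=[(1e-6, None), (1e-6, 1 - 1e-6)])
ll_nb = -res.fun

aic_pois = 2 * 1 - 2 * ll_pois   # one fitted parameter
aic_nb = 2 * 2 - 2 * ll_nb       # two fitted parameters
```

A lower AIC for the negative binomial flags the overdispersion a Poisson model cannot absorb, which is exactly the kind of distribution-choice evidence the abstract argues should drive the final model.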

  10. [Application of negative binomial regression and modified Poisson regression in the research of risk factors for injury frequency].

    Science.gov (United States)

    Cao, Qingqing; Wu, Zhenqiang; Sun, Ying; Wang, Tiezhu; Han, Tengwei; Gu, Chaomei; Sun, Yehuan

    2011-11-01

    To explore the application of negative binomial regression and modified Poisson regression in analyzing the influential factors for injury frequency and the risk factors leading to increased injury frequency. 2917 primary and secondary school students were selected from Hefei by the cluster random sampling method and surveyed by questionnaire. The data on count-based injury events were used to fit modified Poisson regression and negative binomial regression models. The risk factors increasing the frequency of unintentional injury among juvenile students were explored, so as to probe the efficiency of these two models in studying the influential factors for injury frequency. The Poisson model showed over-dispersion, so the negative binomial regression model and the modified Poisson regression model were fitted instead. Both showed that male gender, younger age, a father working outside of the hometown, a guardian educated above junior high school and smoking might be associated with higher injury frequencies. For clustered frequency data on injury events, both modified Poisson regression analysis and negative binomial regression analysis can be used. However, based on our data, the modified Poisson regression fitted better and this model could give a more accurate interpretation of relevant factors affecting the frequency of injury.

  11. Modeling random telegraph signal noise in CMOS image sensor under low light based on binomial distribution

    International Nuclear Information System (INIS)

    Zhang Yu; Wang Guangyi; Lu Xinmiao; Hu Yongcai; Xu Jiangtao

    2016-01-01

    The random telegraph signal noise in the pixel source follower MOSFET is the principal component of the noise in the CMOS image sensor under low light. In this paper, the physical and statistical model of the random telegraph signal noise in the pixel source follower based on the binomial distribution is set up. The number of electrons captured or released by the oxide traps per unit time is described as a random variable that obeys the binomial distribution. As a result, the output states and the corresponding probabilities of the first and the second samples of the correlated double sampling circuit are acquired. The standard deviation of the output states after the correlated double sampling circuit can be obtained accordingly. In the simulation section, one hundred thousand samples of the source follower MOSFET have been simulated, and the simulation results show that the proposed model has similar statistical characteristics to the existing models under the effect of the channel length and the density of the oxide trap. Moreover, the noise histogram of the proposed model has been evaluated at different environmental temperatures. (paper)
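The binomial trap-occupancy idea can be illustrated with a toy Monte Carlo: at each of the two correlated-double-sampling (CDS) reads, the number of occupied oxide traps is a Binomial(n, p) draw, each occupied trap shifting the source-follower output by a fixed step. All parameter values below (trap count, occupancy probability, per-trap voltage step) are illustrative assumptions, not values from the paper.

```python
import random

def cds_rts_noise(n_traps=4, p_capture=0.1, step_volts=50e-6,
                  n_pixels=100_000, seed=0):
    """Toy Monte Carlo of RTS noise after CDS: the trap occupancy at
    each of the two CDS samples is Binomial(n_traps, p_capture), and
    the CDS output is the difference of the two reads scaled by the
    per-trap voltage step. Returns the standard deviation of the CDS
    output over many pixels."""
    rng = random.Random(seed)
    def occupied():
        # One Binomial(n_traps, p_capture) draw by direct simulation.
        return sum(rng.random() < p_capture for _ in range(n_traps))
    diffs = [(occupied() - occupied()) * step_volts for _ in range(n_pixels)]
    m = sum(diffs) / n_pixels
    return (sum((d - m) ** 2 for d in diffs) / n_pixels) ** 0.5

sigma = cds_rts_noise()
```

For these parameters the theoretical CDS variance is 2·n·p·(1−p)·step², i.e. a standard deviation of about 0.85 of one voltage step, which the simulated value approaches as the pixel count grows.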

  12. Supplement to the UMTRA project water sampling and analysis plan, Slick Rock, Colorado

    International Nuclear Information System (INIS)

    1995-09-01

    The water sampling and analysis plan (WSAP) provides the regulatory and technical basis for ground water and surface water sampling at the Uranium Mill Tailings Remedial Action (UMTRA) Project Union Carbide (UC) and North Continent (NC) processing sites and the Burro Canyon disposal site near Slick Rock, Colorado. The initial WSAP was finalized in August 1994 and will be completely revised in accordance with the WSAP guidance document (DOE, 1995) in late 1996. This version supplements the initial WSAP, reflects only minor changes in sampling that occurred in 1995, covers sampling scheduled for early 1996, and provides a preliminary projection of the next 5 years of sampling and monitoring activities. Once surface remedial action is completed at the former processing sites, additional and more detailed hydrogeologic characterization may be needed to develop the Ground Water Program conceptual ground water model and proposed compliance strategy. In addition, background ground water quality needs to be clearly defined to ensure that the baseline risk assessment accurately estimated risks from the contaminants of potential concern in contaminated ground water at the UC and NC sites

  13. The Sequential Probability Ratio Test: An efficient alternative to exact binomial testing for Clean Water Act 303(d) evaluation.

    Science.gov (United States)

    Chen, Connie; Gribble, Matthew O; Bartroff, Jay; Bay, Steven M; Goldstein, Larry

    2017-05-01

    The United States's Clean Water Act stipulates in section 303(d) that states must identify impaired water bodies for which total maximum daily loads (TMDLs) of pollution inputs into water bodies are developed. Decision-making procedures about how to list, or delist, water bodies as impaired, or not, per Clean Water Act 303(d) differ across states. In states such as California, whether or not a particular monitoring sample suggests that water quality is impaired can be regarded as a binary outcome variable, and California's current regulatory framework invokes a version of the exact binomial test to consolidate evidence across samples and assess whether the overall water body complies with the Clean Water Act. Here, we contrast the performance of California's exact binomial test with one potential alternative, the Sequential Probability Ratio Test (SPRT). The SPRT uses a sequential testing framework, testing samples as they become available and evaluating evidence as it emerges, rather than measuring all the samples and calculating a test statistic at the end of the data collection process. Through simulations and theoretical derivations, we demonstrate that the SPRT on average requires fewer samples to be measured to have comparable Type I and Type II error rates as the current fixed-sample binomial test. Policymakers might consider efficient alternatives such as SPRT to current procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.
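The sequential framework the abstract contrasts with the fixed-sample binomial test is Wald's classical SPRT, which can be sketched directly. The hypothesised proportions and error rates below are illustrative placeholders, not California's regulatory values.

```python
import math

def sprt_binomial(samples, p0=0.05, p1=0.25, alpha=0.05, beta=0.20):
    """Wald's Sequential Probability Ratio Test for a binomial
    proportion, H0: p = p0 ('not impaired') vs H1: p = p1 ('impaired').
    Samples are evaluated as they arrive; the test stops as soon as the
    log-likelihood ratio crosses a boundary. Boundaries use Wald's
    approximations A = (1-beta)/alpha and B = beta/(1-alpha)."""
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, exceeds in enumerate(samples, start=1):
        if exceeds:   # sample exceeded the water-quality threshold
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "impaired", n
        if llr <= lower:
            return "not impaired", n
    return "undecided", len(samples)

# 20 monitoring samples, none exceeding the threshold: the SPRT reaches
# a 'not impaired' decision well before all 20 are measured.
decision, n_used = sprt_binomial([False] * 20)
```

This early stopping is exactly the source of the average-sample-number savings the paper reports over the fixed-sample exact binomial test.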

  14. 300-FF-1 remedial action sampling and analysis plan DQO process summary report

    International Nuclear Information System (INIS)

    Carlson, R.A.

    1997-01-01

    The scope of the 300-FF-1 Operable Unit (OU) data quality objective (DQO) process includes the soil and debris from the waste sites identified in the Proposed Plan for the 300-FF-1 and 300-FF-5 Operable units and, as the DQO process evolved, refined by the 300-FF-1 Record of Decision (ROD). Exclusions are identified in Section 1.3. The 300-FF-1 waste sites are addressed throughout the DQO process as they were subdivided in the proposed plan. The waste sites are divided into areas or zones known to be above the cleanup standards, zones below the cleanup standards, and zones where the contamination level is currently unknown. The project objectives are (1) to identify the criteria by which 300-FF-1 contaminated soil and debris may be shipped to the Environmental Restoration Disposal Facility (ERDF), (2) to identify the criteria by which the 300-FF-1 waste sites can be demonstrated as meeting the cleanup criteria, and (3) to develop a sampling and analysis plan (SAP) that adequately allows the evaluation of soil and debris analysis against the criteria

  15. Generalization of Binomial Coefficients to Numbers on the Nodes of Graphs

    NARCIS (Netherlands)

    Khmelnitskaya, A.; van der Laan, G.; Talman, Dolf

    2016-01-01

    The triangular array of binomial coefficients, or Pascal's triangle, is formed by starting with an apex of 1. Every row of Pascal's triangle can be seen as a line-graph, to each node of which the corresponding binomial coefficient is assigned. We show that the binomial coefficient of a node is equal

  16. Generalization of binomial coefficients to numbers on the nodes of graphs

    NARCIS (Netherlands)

    Khmelnitskaya, Anna Borisovna; van der Laan, Gerard; Talman, Dolf

    The triangular array of binomial coefficients, or Pascal's triangle, is formed by starting with an apex of 1. Every row of Pascal's triangle can be seen as a line-graph, to each node of which the corresponding binomial coefficient is assigned. We show that the binomial coefficient of a node is equal
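The row-by-row construction the abstract describes follows from the standard recurrence C(n, k) = C(n−1, k−1) + C(n−1, k): each node of a row's line-graph is the sum of its two "parents" in the row above (with 1s at the boundary). A minimal sketch:

```python
def pascal_row(n):
    """Row n of Pascal's triangle, built by the recurrence
    C(n, k) = C(n-1, k-1) + C(n-1, k), starting from the apex [1]."""
    row = [1]
    for _ in range(n):
        # Interior nodes sum their two neighbours in the previous row.
        row = [1] + [a + b for a, b in zip(row, row[1:])] + [1]
    return row

rows = [pascal_row(n) for n in range(5)]
```

The papers above generalise exactly this node-labelling from line-graphs (Pascal's rows) to arbitrary graphs.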

  17. A comparison of methods for the analysis of binomial clustered outcomes in behavioral research.

    Science.gov (United States)

    Ferrari, Alberto; Comelli, Mario

    2016-12-01

    In behavioral research, data consisting of a per-subject proportion of "successes" and "failures" over a finite number of trials often arise. These clustered binary data are usually non-normally distributed, which can distort inference if the usual general linear model is applied and the sample size is small. A number of more advanced methods are available, but they are often technically challenging and a comparative assessment of their performances in behavioral setups has not been performed. We studied the performances of some methods applicable to the analysis of proportions; namely linear regression, Poisson regression, beta-binomial regression and Generalized Linear Mixed Models (GLMMs). We report on a simulation study evaluating power and Type I error rate of these models in hypothetical scenarios met by behavioral researchers; plus, we describe results from the application of these methods on data from real experiments. Our results show that, while GLMMs are powerful instruments for the analysis of clustered binary outcomes, beta-binomial regression can outperform them in a range of scenarios. Linear regression gave results consistent with the nominal level of significance, but was overall less powerful. Poisson regression, instead, mostly led to anticonservative inference. GLMMs and beta-binomial regression are generally more powerful than linear regression; yet linear regression is robust to model misspecification in some conditions, whereas Poisson regression suffers heavily from violations of the assumptions when used to model proportion data. We conclude by providing directions to behavioral scientists dealing with clustered binary data and small sample sizes. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Phase 1 Characterization sampling and analysis plan West Valley demonstration project.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, R. L. (Environmental Science Division)

    2011-06-30

    The Phase 1 Characterization Sampling and Analysis Plan (CSAP) provides details about environmental data collection that will be taking place to support Phase 1 decommissioning activities described in the Phase 1 Decommissioning Plan for the West Valley Demonstration Project, Revision 2 (Phase I DP; DOE 2009). The four primary purposes of CSAP data collection are: (1) pre-design data collection, (2) remedial support, (3) post-remediation status documentation, and (4) Phase 2 decision-making support. Data collection to support these four main objectives is organized into two distinct data collection efforts. The first is data collection that will take place prior to the initiation of significant Phase 1 decommissioning activities (e.g., the Waste Management Area [WMA] 1 and WMA 2 excavations). The second is data collection that will occur during and immediately after environmental remediation in support of remediation activities. Both data collection efforts have a set of well-defined objectives that encompass the data needs of the four main CSAP data collection purposes detailed in the CSAP. The main body of the CSAP describes the overall data collection strategies that will be used to satisfy data collection objectives. The details of pre-remediation data collection are organized by WMA. The CSAP contains an appendix for each WMA that describes the details of WMA-specific pre-remediation data collection activities. The CSAP is intended to expand upon the data collection requirements identified in the Phase 1 Decommissioning Plan. The CSAP is intended to tightly integrate with the Phase 1 Final Status Survey Plan (FSSP). Data collection described by the CSAP is consistent with the FSSP where appropriate and to the extent possible.

  19. UMTRA project water sampling and analysis plan, Ambrosia Lake, New Mexico

    International Nuclear Information System (INIS)

    1994-02-01

    This water sampling and analysis plan (WSAP) provides the basis for ground water sampling at the Ambrosia Lake Uranium Mill Tailings Remedial Action (UMTRA) Project site during fiscal year 1994. It identifies and justifies the sampling locations, analytical parameters, detection limits, and sampling frequency for the monitoring locations and will be updated annually. The Ambrosia Lake site is in McKinley County, New Mexico, about 40 kilometers (km) (25 miles [mi]) north of Grants, New Mexico, and 1.6 km (1 mi) east of New Mexico Highway 509 (Figure 1.1). The town closest to the tailings pile is San Mateo, about 16 km (10 mi) southeast (Figure 1.2). The former mill and tailings pile are in Section 28, and two holding ponds are in Section 33, Township 14 North, Range 9 West. The site is shown on the US Geological Survey (USGS) map (USGS, 1980). The site is approximately 2100 meters (m) (7000 feet [ft]) above sea level.

  20. USE OF SCALED SEMIVARIOGRAMS IN THE PLANNING SAMPLE OF SOIL PHYSICAL PROPERTIES IN SOUTHERN AMAZONAS, BRAZIL

    Directory of Open Access Journals (Sweden)

    Renato Eleotério de Aquino

    2015-02-01

    There is a great lack of information from soil surveys in the southern part of the state of Amazonas, Brazil. The use of tools such as geostatistics may improve environmental planning, use, and management. In this study, we aimed to use scaled semivariograms in the sample design of soil physical properties in some environments in Amazonas. We selected five areas located in the south of the state of Amazonas, Brazil, with varied soil uses, such as forest, archaeological dark earth (ADE), pasture, sugarcane cropping, and agroforestry. Regular mesh grids were set up in these areas with 64 sample points spaced at 10 m from each other. At these points, we determined the particle size composition, soil resistance to penetration, moisture, soil bulk density and particle density, macroporosity, microporosity, total porosity, and aggregate stability in water at a depth of 0.00-0.20 m. Descriptive and geostatistical analyses were performed. The sample density requirements were lower in the pasture area but higher in the forest. We concluded that managed environments had differences in their soil physical properties compared to the natural forest; notably, the soil in the ADE environment is physically improved in relation to the others. The physical properties evaluated showed a structure of spatial dependence, with slight variability in the forest compared to the others. The use of the range parameter of the semivariogram analysis proved to be effective in determining an ideal sample density.
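    The range parameter used for sample planning comes from an empirical semivariogram. Below is a minimal sketch of the classical Matheron estimator on a regular grid like the study's 64-point, 10 m mesh; the grid size, lag bins, and trending values are illustrative assumptions, not the study's data.

```python
import math

def empirical_semivariogram(points, values, lag_edges):
    # Matheron estimator: gamma(h) = sum (z_i - z_j)^2 / (2 N(h)),
    # over point pairs whose separation distance falls in each lag bin.
    sums = [0.0] * (len(lag_edges) - 1)
    counts = [0] * (len(lag_edges) - 1)
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            h = math.dist(points[i], points[j])
            for k in range(len(lag_edges) - 1):
                if lag_edges[k] <= h < lag_edges[k + 1]:
                    sums[k] += (values[i] - values[j]) ** 2
                    counts[k] += 1
                    break
    return [s / (2 * c) if c else float("nan") for s, c in zip(sums, counts)]

# 8 x 8 grid at 10 m spacing, with a value that trends east-west, so
# semivariance grows with lag distance (illustrative data only)
points = [(10 * x, 10 * y) for x in range(8) for y in range(8)]
values = [p[0] for p in points]
gamma = empirical_semivariogram(points, values, [0, 12, 24, 36])
```

    Fitting a model (spherical, exponential, etc.) to `gamma` versus lag then yields the range: the distance beyond which samples are effectively independent, and hence the widest defensible sample spacing.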

  1. Soil Sampling Plan for the transuranic storage area soil overburden and final report: Soil overburden sampling at the RWMC transuranic storage area

    Energy Technology Data Exchange (ETDEWEB)

    Stanisich, S.N.

    1994-12-01

    This Soil Sampling Plan (SSP) has been developed to provide detailed procedural guidance for field sampling and chemical and radionuclide analysis of selected areas of soil covering waste stored at the Transuranic Storage Area (TSA) at the Idaho National Engineering Laboratory's (INEL) Radioactive Waste Management Complex (RWMC). The format and content of this SSP represent a complementary hybrid of INEL Waste Management--Environmental Restoration Program and Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) Remedial Investigation/Feasibility Study (RI/FS) sampling guidance documentation. This sampling plan also functions as a Quality Assurance Project Plan (QAPP). The QAPP serves as a controlling mechanism during sampling to ensure that all data collected are valid, reliable, and defensible. This document outlines organization, objectives, and quality assurance/quality control (QA/QC) activities to achieve the desired data quality goals. The QA/QC requirements for this project are outlined in the Data Collection Quality Assurance Plan (DCQAP) for the Buried Waste Program. The DCQAP is a program plan and does not outline the site-specific requirements for the scope of work covered by this SSP.

  2. Soil Sampling Plan for the transuranic storage area soil overburden and final report: Soil overburden sampling at the RWMC transuranic storage area

    International Nuclear Information System (INIS)

    Stanisich, S.N.

    1994-12-01

    This Soil Sampling Plan (SSP) has been developed to provide detailed procedural guidance for field sampling and chemical and radionuclide analysis of selected areas of soil covering waste stored at the Transuranic Storage Area (TSA) at the Idaho National Engineering Laboratory's (INEL) Radioactive Waste Management Complex (RWMC). The format and content of this SSP represent a complementary hybrid of INEL Waste Management--Environmental Restoration Program and Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) Remedial Investigation/Feasibility Study (RI/FS) sampling guidance documentation. This sampling plan also functions as a Quality Assurance Project Plan (QAPP). The QAPP serves as a controlling mechanism during sampling to ensure that all data collected are valid, reliable, and defensible. This document outlines organization, objectives, and quality assurance/quality control (QA/QC) activities to achieve the desired data quality goals. The QA/QC requirements for this project are outlined in the Data Collection Quality Assurance Plan (DCQAP) for the Buried Waste Program. The DCQAP is a program plan and does not outline the site-specific requirements for the scope of work covered by this SSP.

  3. Determination of finite-difference weights using scaled binomial windows

    KAUST Repository

    Chu, Chunlei

    2012-05-01

    The finite-difference method evaluates a derivative through a weighted summation of function values from neighboring grid nodes. Conventional finite-difference weights can be calculated either from Taylor series expansions or by Lagrange interpolation polynomials. The finite-difference method can be interpreted as a truncated convolutional counterpart of the pseudospectral method in the space domain. For this reason, we can also derive finite-difference operators by truncating the convolution series of the pseudospectral method. Various truncation windows can be employed for this purpose, and they result in finite-difference operators with different dispersion properties. We found that there exist two families of scaled binomial windows that can be used to derive conventional finite-difference operators analytically. With a minor change, these scaled binomial windows can also be used to derive optimized finite-difference operators with enhanced dispersion properties. © 2012 Society of Exploration Geophysicists.
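    The conventional weights the abstract refers to can be obtained from the Taylor-series route by solving a small Vandermonde system; the sketch below shows that standard derivation (not the paper's binomial-window construction), in exact rational arithmetic. The stencil offsets used are illustrative.

```python
from fractions import Fraction
from math import factorial

def fd_weights(offsets, m):
    # Solve sum_j w_j * x_j^k = m! * delta(k, m) for k = 0..n-1:
    # the Taylor-series (Vandermonde) system for the weights of an
    # m-th derivative approximation on the given node offsets.
    n = len(offsets)
    A = [[Fraction(x) ** k for x in offsets] for k in range(n)]
    b = [Fraction(factorial(m)) if k == m else Fraction(0) for k in range(n)]
    for col in range(n):  # Gauss-Jordan elimination with partial pivoting
        piv = next(r for r in range(col, n) if A[r][col] != 0)
        A[col], A[piv], b[col], b[piv] = A[piv], A[col], b[piv], b[col]
        inv = A[col][col]
        A[col] = [a / inv for a in A[col]]
        b[col] /= inv
        for r in range(n):
            if r != col and A[r][col] != 0:
                f = A[r][col]
                A[r] = [a - f * c for a, c in zip(A[r], A[col])]
                b[r] -= f * b[col]
    return b

# Classic 3-point second-derivative stencil weights: 1, -2, 1
print(fd_weights([-1, 0, 1], 2))
```

    Windowed-pseudospectral derivations, including the scaled binomial windows of the paper, modify these baseline weights to trade truncation accuracy for better dispersion behavior.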

  4. Fat suppression in MR imaging with binomial pulse sequences

    International Nuclear Information System (INIS)

    Baudovin, C.J.; Bryant, D.J.; Bydder, G.M.; Young, I.R.

    1989-01-01

    This paper reports on a study to develop pulse sequences allowing suppression of the fat signal on MR images without eliminating signal from other tissues with short T1. The authors developed such a technique, involving selective excitation of protons in water, based on a binomial pulse sequence. Imaging is performed at 0.15 T. Careful shimming is performed to maximize separation of the fat and water peaks. A spin-echo 1,500/80 sequence is used, employing a 90-degree pulse with the transmit frequency optimized for water and null excitation at a 20 Hz offset, followed by a section-selective 180-degree pulse. With use of the binomial sequence for imaging, a reduction in fat signal is seen on images of the pelvis and legs of volunteers. Patient studies show dramatic improvement in visualization of prostatic carcinoma compared with standard sequences.
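    Why a binomial subpulse train nulls fat can be seen in the standard small-tip-angle approximation, where the net excitation of an n-interval binomial train is proportional to |(1 + e^(i*phi))^n|, phi being the phase an off-resonant spin accrues during each interpulse delay. This is a textbook-style sketch, not the paper's own model or parameters.

```python
import cmath
import math

def binomial_response(n, phi):
    # Small-tip-angle response of an n-interval binomial pulse train
    # (subpulse amplitudes follow the binomial coefficients), normalized
    # so that an on-resonance spin (phi = 0) gives 1.
    return abs((1 + cmath.exp(1j * phi)) ** n) / 2 ** n

# Water on resonance (phi = 0) is fully excited; fat, with the interpulse
# delay chosen so its chemical shift accrues phi = pi, is nulled -- and the
# null gets broader (flatter around phi = pi) as n grows.
for n in (1, 2, 3):
    print(n, binomial_response(n, 0.0), binomial_response(n, math.pi))
```

    At the paper's 0.15 T, the roughly 20 Hz fat-water shift implies an interpulse delay of about 1/(2 x 20 Hz) = 25 ms to reach phi = pi at the fat resonance.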

  5. On pricing futures options on random binomial tree

    International Nuclear Information System (INIS)

    Bayram, Kamola; Ganikhodjaev, Nasir

    2013-01-01

    The discrete-time approach to real option valuation has typically been implemented in the finance literature using a binomial tree framework. Instead, we develop a new model by randomizing the environment and call such a model a random binomial tree. Whereas the usual model has only one environment (u, d), in which the price of the underlying asset can move up by a factor u or down by a factor d, with the pair (u, d) constant over the life of the underlying asset, in our new model the underlying security moves in two environments, (u1, d1) and (u2, d2). Thus we obtain two volatilities, σ1 and σ2. This new approach enables calculations that better reflect the real market, since it considers two states of the market: normal and extraordinary. In this paper we define and study futures options for such models.
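    The single-environment baseline that the paper generalizes is the Cox-Ross-Rubinstein (CRR) tree, in which one pair (u, d) is derived from a single volatility. A minimal sketch of CRR pricing for a European call follows; the parameter values are illustrative, and this is the standard model, not the paper's two-environment extension.

```python
import math

def crr_call(s0, strike, r, sigma, t, n):
    # Cox-Ross-Rubinstein binomial tree for a European call: one
    # environment (u, d) derived from a single volatility sigma.
    dt = t / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1 / u
    q = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up-probability
    price = 0.0
    for k in range(n + 1):
        # Probability of k up-moves times the terminal payoff on that path
        prob = math.comb(n, k) * q ** k * (1 - q) ** (n - k)
        payoff = max(s0 * u ** k * d ** (n - k) - strike, 0.0)
        price += prob * payoff
    return math.exp(-r * t) * price

# At-the-money call; converges toward the Black-Scholes value (about 10.45)
price = crr_call(100.0, 100.0, 0.05, 0.2, 1.0, 500)
print(round(price, 2))
```

    The randomized model of the abstract would, in effect, mix two such trees, one per environment (u1, d1) and (u2, d2), so that two volatilities enter the valuation.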

  6. Application of Negative Binomial Regression to Handle Overdispersion in Poisson Regression

    Directory of Open Access Journals (Sweden)

    PUTU SUSAN PRADAWATI

    2013-09-01

    Full Text Available Poisson regression was used to analyze the count data which Poisson distributed. Poisson regression analysis requires state equidispersion, in which the mean value of the response variable is equal to the value of the variance. However, there are deviations in which the value of the response variable variance is greater than the mean. This is called overdispersion. If overdispersion happens and Poisson Regression analysis is being used, then underestimated standard errors will be obtained. Negative Binomial Regression can handle overdispersion because it contains a dispersion parameter. From the simulation data which experienced overdispersion in the Poisson Regression model it was found that the Negative Binomial Regression was better than the Poisson Regression model.

  7. Improvement of sampling plans for Salmonella detection in pooled table eggs by use of real-time PCR

    DEFF Research Database (Denmark)

    Pasquali, Frédérique; De Cesare, Alessandra; Valero, Antonio

    2014-01-01

    Eggs and egg products have been described as the most critical food vehicles of salmonellosis. The prevalence and level of contamination of Salmonella on table eggs are low, which severely affects the sensitivity of sampling plans applied voluntarily in some European countries, where one to five ... to improve the efficiency of sampling and reduce the number of samples to be tested.

  8. Data analysis using the Binomial Failure Rate common cause model

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1983-09-01

    This report explains how to use the Binomial Failure Rate (BFR) method to estimate common cause failure rates. The entire method is described, beginning with the conceptual model, and covering practical issues of data preparation, treatment of variation in the failure rates, Bayesian estimation of the quantities of interest, checking the model assumptions for lack of fit to the data, and the ultimate application of the answers.

  9. e+e- hadronic multiplicity distributions: negative binomial or Poisson

    International Nuclear Information System (INIS)

    Carruthers, P.; Shih, C.C.

    1986-01-01

    On the basis of fits to the multiplicity distributions for variable rapidity windows and the forward-backward correlation for the two-jet subset of e+e- data, it is impossible to distinguish between a global negative binomial and its generalization, the partially coherent distribution. It is suggested that intensity interferometry, especially the Bose-Einstein correlation, gives information which will discriminate among dynamical models. 16 refs.
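    The difficulty of distinguishing the two distributions is easy to reproduce numerically: the negative binomial converges to the Poisson as its shape parameter k grows, so for large k the two probability mass functions are nearly indistinguishable. The sketch below (illustrative mean and k values, not fitted to the e+e- data) compares them in log space.

```python
import math

def poisson_pmf(n, mean):
    return math.exp(-mean + n * math.log(mean) - math.lgamma(n + 1))

def neg_binomial_pmf(n, mean, k):
    # P(n) = C(n+k-1, n) * (k/(k+mean))^k * (mean/(k+mean))^n,
    # evaluated in log space; tends to Poisson(mean) as k -> infinity.
    return math.exp(
        math.lgamma(n + k) - math.lgamma(k) - math.lgamma(n + 1)
        + k * math.log(k / (k + mean)) + n * math.log(mean / (k + mean))
    )

mean = 10.0
# Small k: visibly broader than Poisson.  Huge k: practically identical.
gap_small_k = max(abs(neg_binomial_pmf(n, mean, 2.0) - poisson_pmf(n, mean))
                  for n in range(40))
gap_large_k = max(abs(neg_binomial_pmf(n, mean, 1e6) - poisson_pmf(n, mean))
                  for n in range(40))
print(gap_small_k, gap_large_k)
```

    Hence, as the abstract argues, single-distribution fits saturate, and correlation observables (intensity interferometry) are needed to discriminate between the underlying dynamical models.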

  10. Hits per trial: Basic analysis of binomial data

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-09-01

    This report presents simple statistical methods for analyzing binomial data, such as the number of failures in some number of demands. It gives point estimates, confidence intervals, and Bayesian intervals for the failure probability. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model when the failure probability varies randomly. Examples and SAS programs are given
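    For failures-in-demands data of this kind, one common closed-form confidence interval for the failure probability is the Wilson score interval; the sketch below is that standard construction, not code from the report (whose examples are in SAS), and the 1-failure-in-100-demands input is illustrative.

```python
import math

def wilson_interval(failures, demands, z=1.96):
    # Wilson score interval for a binomial proportion (z = 1.96 gives
    # an approximate 95% interval); well behaved even when the point
    # estimate failures/demands is 0 or 1.
    p = failures / demands
    denom = 1 + z ** 2 / demands
    center = (p + z ** 2 / (2 * demands)) / denom
    half = z * math.sqrt(p * (1 - p) / demands
                         + z ** 2 / (4 * demands ** 2)) / denom
    return center - half, center + half

# One failure observed in 100 demands
lo, hi = wilson_interval(1, 100)
print(f"point estimate 0.010, 95% CI ({lo:.4f}, {hi:.4f})")
```

    Note how the interval is asymmetric about the point estimate, which matters for the rare-failure data the report targets; exact (Clopper-Pearson) and Bayesian intervals, as covered in the report, are the usual alternatives.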

  11. Hits per trial: Basic analysis of binomial data

    Energy Technology Data Exchange (ETDEWEB)

    Atwood, C.L.

    1994-09-01

    This report presents simple statistical methods for analyzing binomial data, such as the number of failures in some number of demands. It gives point estimates, confidence intervals, and Bayesian intervals for the failure probability. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model when the failure probability varies randomly. Examples and SAS programs are given.

  12. Correlation Structures of Correlated Binomial Models and Implied Default Distribution

    OpenAIRE

    S. Mori; K. Kitsukawa; M. Hisakado

    2006-01-01

    We show how to analyze and interpret the correlation structures, the conditional expectation values and correlation coefficients of exchangeable Bernoulli random variables. We study implied default distributions for the iTraxx-CJ tranches and some popular probabilistic models, including the Gaussian copula model, Beta binomial distribution model and long-range Ising model. We interpret the differences in their profiles in terms of the correlation structures. The implied default distribution h...

  13. Sampling and Analysis Plan for Supplemental Environmental Project: Aquatic Life Surveys

    Energy Technology Data Exchange (ETDEWEB)

    Berryhill, Jesse Tobias [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gaukler, Shannon Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-26

    As part of a settlement agreement for nuclear waste incidents in 2014, several supplemental environmental projects (SEPs) were initiated at Los Alamos National Laboratory (LANL or the Laboratory) between the U.S. Department of Energy and the state of New Mexico. One SEP from this agreement consists of performing aquatic life surveys and will be used to assess the applicability of using generic ambient water-quality criteria (AWQC) for aquatic life. AWQC are generic criteria developed by the U.S. Environmental Protection Agency (EPA) to cover a broad range of aquatic species and are not unique to a specific region or state. AWQC are established by a composition of toxicity data, called species sensitivity distributions (SSDs), and are determined by LC50 (lethal concentration of 50% of the organisms studied) acute toxicity experiments for chemicals of interest. It is of interest to determine whether aquatic species inhabiting waters on the Pajarito Plateau are adequately protected using the current generic AWQC. This study will determine which aquatic species are present in ephemeral, intermittent, and perennial waters within LANL boundaries and in reference waters adjacent to LANL. If the species identified from these waters do not generally represent species used in the SSDs, then SSDs may need to be modified and AWQC may need to be updated. This sampling and analysis plan details the sampling methodology, surveillance locations, temporal scheduling, and analytical approaches that will be used to complete aquatic life surveys. A significant portion of this sampling and analysis plan was formalized by referring to Appendix E: SEP Aquatic Life Surveys DQO (Data Quality Objectives).

  14. Standardized binomial models for risk or prevalence ratios and differences.

    Science.gov (United States)

    Richardson, David B; Kinlaw, Alan C; MacLehose, Richard F; Cole, Stephen R

    2015-10-01

    Epidemiologists often analyse binary outcomes in cohort and cross-sectional studies using multivariable logistic regression models, yielding estimates of adjusted odds ratios. It is widely known that the odds ratio closely approximates the risk or prevalence ratio when the outcome is rare, and it does not do so when the outcome is common. Consequently, investigators may decide to directly estimate the risk or prevalence ratio using a log binomial regression model. We describe the use of a marginal structural binomial regression model to estimate standardized risk or prevalence ratios and differences. We illustrate the proposed approach using data from a cohort study of coronary heart disease status in Evans County, Georgia, USA. The approach reduces problems with model convergence typical of log binomial regression by shifting all explanatory variables except the exposures of primary interest from the linear predictor of the outcome regression model to a model for the standardization weights. The approach also facilitates evaluation of departures from additivity in the joint effects of two exposures. Epidemiologists should consider reporting standardized risk or prevalence ratios and differences in cohort and cross-sectional studies. These are readily obtained using the SAS, Stata and R statistical software packages. The proposed approach estimates the exposure effect in the total population. © The Author 2015; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
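    The "standardized" estimands here are ordinary direct-standardization arithmetic: average each arm's stratum-specific risks over the total population's confounder distribution, then take the ratio or difference. A minimal sketch with made-up stratified counts (not the Evans County data, and without the paper's weight-model machinery):

```python
# Stratified 2x2 data: per confounder stratum, (cases_exposed, n_exposed,
# cases_unexposed, n_unexposed).  Numbers are illustrative only.
strata = [
    (20, 100, 10, 100),
    (15, 50, 30, 150),
]

# Weights = each stratum's share of the total population
totals = [ne + nu for _, ne, _, nu in strata]
weights = [t / sum(totals) for t in totals]

# Standardize each arm's risk to the total-population stratum distribution
risk_exposed = sum(w * ce / ne for w, (ce, ne, _, _) in zip(weights, strata))
risk_unexposed = sum(w * cu / nu for w, (_, _, cu, nu) in zip(weights, strata))

risk_ratio = risk_exposed / risk_unexposed
risk_difference = risk_exposed - risk_unexposed
print(f"standardized RR {risk_ratio:.3f}, RD {risk_difference:.3f}")
```

    The marginal structural model of the abstract estimates the same total-population contrast, but moves the confounders into a weight model so that the outcome regression (and hence log-binomial convergence) only involves the exposure.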

  15. Abstract knowledge versus direct experience in processing of binomial expressions.

    Science.gov (United States)

    Morgan, Emily; Levy, Roger

    2016-12-01

    We ask whether word order preferences for binomial expressions of the form A and B (e.g. bread and butter) are driven by abstract linguistic knowledge of ordering constraints referencing the semantic, phonological, and lexical properties of the constituent words, or by prior direct experience with the specific items in questions. Using forced-choice and self-paced reading tasks, we demonstrate that online processing of never-before-seen binomials is influenced by abstract knowledge of ordering constraints, which we estimate with a probabilistic model. In contrast, online processing of highly frequent binomials is primarily driven by direct experience, which we estimate from corpus frequency counts. We propose a trade-off wherein processing of novel expressions relies upon abstract knowledge, while reliance upon direct experience increases with increased exposure to an expression. Our findings support theories of language processing in which both compositional generation and direct, holistic reuse of multi-word expressions play crucial roles. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  16. Microbiological sampling plan based on risk classification to verify supplier selection and production of served meals in food service operation.

    Science.gov (United States)

    Lahou, Evy; Jacxsens, Liesbeth; Van Landeghem, Filip; Uyttendaele, Mieke

    2014-08-01

    Food service operations are confronted with a diverse range of raw materials and served meals. The implementation of a microbial sampling plan in the framework of verification of suppliers and their own production process (functionality of their prerequisite and HACCP program), demands selection of food products and sampling frequencies. However, these are often selected without a well described scientifically underpinned sampling plan. Therefore, an approach on how to set-up a focused sampling plan, enabled by a microbial risk categorization of food products, for both incoming raw materials and meals served to the consumers is presented. The sampling plan was implemented as a case study during a one-year period in an institutional food service operation to test the feasibility of the chosen approach. This resulted in 123 samples of raw materials and 87 samples of meal servings (focused on high risk categorized food products) which were analyzed for spoilage bacteria, hygiene indicators and food borne pathogens. Although sampling plans are intrinsically limited in assessing the quality and safety of sampled foods, it was shown to be useful to reveal major non-compliances and opportunities to improve the food safety management system in place. Points of attention deduced in the case study were control of Listeria monocytogenes in raw meat spread and raw fish as well as overall microbial quality of served sandwiches and salads. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Beta-binomial model for meta-analysis of odds ratios.

    Science.gov (United States)

    Bakbergenuly, Ilyas; Kulinskaya, Elena

    2017-05-20

    In meta-analysis of odds ratios (ORs), heterogeneity between the studies is usually modelled via the additive random effects model (REM). An alternative, multiplicative REM for ORs uses overdispersion. The multiplicative factor in this overdispersion model (ODM) can be interpreted as an intra-class correlation (ICC) parameter. This model naturally arises when the probabilities of an event in one or both arms of a comparative study are themselves beta-distributed, resulting in beta-binomial distributions. We propose two new estimators of the ICC for meta-analysis in this setting. One is based on the inverted Breslow-Day test, and the other on the improved gamma approximation by Kulinskaya and Dollinger (2015, p. 26) to the distribution of Cochran's Q. The performance of these and several other estimators of ICC on bias and coverage is studied by simulation. Additionally, the Mantel-Haenszel approach to estimation of ORs is extended to the beta-binomial model, and we study performance of various ICC estimators when used in the Mantel-Haenszel or the inverse-variance method to combine ORs in meta-analysis. The results of the simulations show that the improved gamma-based estimator of ICC is superior for small sample sizes, and the Breslow-Day-based estimator is the best for n⩾100. The Mantel-Haenszel-based estimator of OR is very biased and is not recommended. The inverse-variance approach is also somewhat biased for ORs≠1, but this bias is not very large in practical settings. Developed methods and R programs, provided in the Web Appendix, make the beta-binomial model a feasible alternative to the standard REM for meta-analysis of ORs. © 2017 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  18. Sampling and analysis plan for the preoperational environmental survey for the immobilized low activity waste (ILAW) project W-465

    International Nuclear Information System (INIS)

    Mitchell, R.M.

    1998-01-01

    This document provides a detailed description of the Sampling and Analysis Plan for the Preoperational Survey to be conducted at the Immobilized Low Activity Waste (ILAW) Project Site in the 200 East Area.

  19. Properties of parameter estimation techniques for a beta-binomial failure model. Final technical report

    International Nuclear Information System (INIS)

    Shultis, J.K.; Buranapan, W.; Eckhoff, N.D.

    1981-12-01

    Of considerable importance in the safety analysis of nuclear power plants are methods to estimate the probability of failure-on-demand, p, of a plant component that normally is inactive and that may fail when activated or stressed. Properties of five methods for estimating, from failure-on-demand data, the parameters of the beta prior distribution in a compound beta-binomial probability model are examined. Simulated failure data generated from a known beta-binomial marginal distribution are used to estimate values of the beta parameters by (1) matching moments of the prior distribution to those of the data, (2) the maximum likelihood method based on the prior distribution, (3) a weighted marginal matching moments method, (4) an unweighted marginal matching moments method, and (5) the maximum likelihood method based on the marginal distribution. For small sample sizes (N ≤ 10) with data typical of low-failure-probability components, it was found that the simple prior matching moments method is often superior (e.g., smallest bias and mean squared error), while for larger sample sizes the marginal maximum likelihood estimators appear to be best.
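    The core arithmetic of the matching-moments idea, inverting the Beta mean and variance for the parameters (a, b), is compact enough to sketch. This is the generic moment inversion, not the report's full estimators, and the mean/variance inputs are illustrative.

```python
def beta_from_moments(mean, variance):
    # Method-of-moments ("matching moments") inversion for Beta(a, b):
    #   E[p]   = a / (a + b)
    #   Var[p] = a*b / ((a + b)**2 * (a + b + 1))
    # Requires 0 < variance < mean * (1 - mean).
    common = mean * (1 - mean) / variance - 1
    return mean * common, (1 - mean) * common

a, b = beta_from_moments(0.2, 0.01)
# Beta(3, 12) has mean 0.2 and variance 0.01, so (a, b) recovers (3, 12)
print(a, b)
```

    In the report's setting, the sample mean and variance of the per-component failure-on-demand fractions would be plugged in as `mean` and `variance`; the marginal (beta-binomial) methods it compares instead account for the extra binomial sampling noise in those fractions.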

  20. Sampling and analysis plan for site assessment during the closure or replacement of nonradioactive underground storage tanks

    Energy Technology Data Exchange (ETDEWEB)

    Gitt, M.J.

    1990-08-01

    The Tank Management Program is responsible for closure or replacement of nonradioactive underground storage tanks throughout the Idaho National Engineering Laboratory (INEL). A Sampling and Analysis Plan (SAP) has been developed that complies with EPA regulations and with INEL Tank Removal Procedures for sampling activities associated with site assessment during these closure or replacement activities. The SAP will ensure that all data are valid, and it also will function as a Quality Assurance Project Plan. 18 refs., 8 figs., 11 tabs.

  1. 2018 Annual Terrestrial Sampling Plan for Sandia National Laboratories/New Mexico on Kirtland Air Force Base.

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, Stacy R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2018-01-01

    The 2018 Annual Terrestrial Sampling Plan for Sandia National Laboratories/New Mexico on Kirtland Air Force Base has been prepared in accordance with the “Letter of Agreement Between Department of Energy, National Nuclear Security Administration, Sandia Field Office (DOE/NNSA/SFO) and 377th Air Base Wing (ABW), Kirtland Air Force Base (KAFB) for Terrestrial Sampling” (signed January 2017), Sandia National Laboratories, New Mexico (SNL/NM). The Letter of Agreement requires submittal of an annual terrestrial sampling plan.

  2. 2017 Annual Terrestrial Sampling Plan for Sandia National Laboratories/New Mexico on Kirtland Air Force Base

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, Stacy R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    The 2017 Annual Terrestrial Sampling Plan for Sandia National Laboratories/New Mexico on Kirtland Air Force Base has been prepared in accordance with the “Letter of Agreement Between Department of Energy, National Nuclear Security Administration, Sandia Field Office (DOE/NNSA/SFO) and 377th Air Base Wing (ABW), Kirtland Air Force Base (KAFB) for Terrestrial Sampling” (signed January 2017), Sandia National Laboratories, New Mexico (SNL/NM). The Letter of Agreement requires submittal of an annual terrestrial sampling plan.

  3. Preconception health: awareness, planning, and communication among a sample of US men and women.

    Science.gov (United States)

    Mitchell, Elizabeth W; Levis, Denise M; Prue, Christine E

    2012-01-01

    It is important to educate both men and women about preconception health (PCH), but limited research exists in this area. This paper examines men's and women's awareness of exposure to PCH information and of specific PCH behaviors, PCH planning, and PCH discussions with their partners. Data from Porter Novelli's 2007 Healthstyles survey were used. Women and men of reproductive age were included in the analysis (n = 2,736) to understand their awareness, planning, and conversations around PCH. Only 27.9% of women and men reported consistently using an effective birth control method. The majority of men (52%) and women (43%) were unaware of any exposure to PCH messages; few received information from their health care provider. Women were more aware than men of specific pre-pregnancy health behaviors. Women in the sample reported having more PCH conversations with their partners than did men. PCH education should focus on both women and men. Communication about PCH is lacking, both between couples and among men and women and their health care providers. PCH education might benefit from brand development so that consumers know what to ask for and providers know what to deliver.

  4. Sampling and analysis plan for sludge located in fuel storage canisters of the 105-K West basin

    International Nuclear Information System (INIS)

    Baker, R.B.

    1997-01-01

    This Sampling and Analysis Plan (SAP) provides direction for the first sampling of sludge from the K West Basin spent fuel canisters. The specially developed sampling equipment removes representative samples of sludge while maintaining the radioactive sample underwater in the basin pool (equipment is described in WHC-SD-SNF-SDD-004). Included are the basic background logic for sample selection, the overall laboratory analyses required and the laboratory reporting required. These are based on requirements put forth in the data quality objectives (WHC-SD-SNF-DQO-012) established for this sampling and characterization activity

  5. Distribuição espacial e plano de amostragem de Calacarus heveae (Acari) em seringueira / Spatial distribution and sampling plan for Calacarus heveae (Acari) on rubber trees

    Directory of Open Access Journals (Sweden)

    Noeli Juarez Ferla

    2007-12-01

    representative sampling unit and to develop a sampling plan to determine the population fluctuation. This study was conducted with clones PB 260 and IAN 873, in Itiquira and Pontes e Lacerda, respectively, both in the state of Mato Grosso. In Itiquira, significant differences were observed on four occasions in relation to the average number of mites per leaf in the different plant strata. In the samplings carried out in Pontes e Lacerda, no significant differences were observed between strata in relation to that parameter. Only in Itiquira, on one occasion, was a significant difference between strata verified in relation to the proportion of infested leaves. No significant differences were verified in relation to the average number of mites per leaf and the proportion of leaves infested by C. heveae at different depths in the canopy. Calacarus heveae exhibits aggregated distribution in the field. To estimate the density of C. heveae, numerical sampling plans were developed.
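    A quick first check for the aggregated (clumped) distribution reported here is the variance-to-mean ratio, or index of dispersion. The sketch below uses made-up per-leaf counts, not the study's data; formal sampling plans would go further, fitting Taylor's power law or a negative binomial to set sample sizes.

```python
def dispersion_index(counts):
    # Variance-to-mean ratio: about 1 for a random (Poisson) spatial
    # pattern, > 1 for an aggregated (clumped) one, < 1 for a uniform one.
    mean = sum(counts) / len(counts)
    var = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1)
    return var / mean

# Illustrative per-leaf mite counts showing a clumped pattern
leaf_counts = [0, 0, 1, 12, 0, 3, 0, 15, 2, 0]
print(round(dispersion_index(leaf_counts), 2))  # → 9.23
```

    Aggregation of this kind is exactly why enumerative plans for such pests need large sample sizes, and why binomial (presence/absence) plans are often preferred for management decisions.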

  6. Sampling-Based Motion Planning Algorithms for Replanning and Spatial Load Balancing

    Energy Technology Data Exchange (ETDEWEB)

    Boardman, Beth Leigh [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-12

    The common theme of this dissertation is sampling-based motion planning with the two key contributions being in the area of replanning and spatial load balancing for robotic systems. Here, we begin by recalling two sampling-based motion planners: the asymptotically optimal rapidly-exploring random tree (RRT*), and the asymptotically optimal probabilistic roadmap (PRM*). We also provide a brief background on collision cones and the Distributed Reactive Collision Avoidance (DRCA) algorithm. The next four chapters detail novel contributions for motion replanning in environments with unexpected static obstacles, for multi-agent collision avoidance, and spatial load balancing. First, we show improved performance of the RRT* when using the proposed Grandparent-Connection (GP) or Focused-Refinement (FR) algorithms. Next, the Goal Tree algorithm for replanning with unexpected static obstacles is detailed and proven to be asymptotically optimal. A multi-agent collision avoidance problem in obstacle environments is approached via the RRT*, leading to the novel Sampling-Based Collision Avoidance (SBCA) algorithm. The SBCA algorithm is proven to guarantee collision free trajectories for all of the agents, even when subject to uncertainties in the knowledge of the other agents’ positions and velocities. Given that a solution exists, we prove that livelocks and deadlock will lead to the cost to the goal being decreased. We introduce a new deconfliction maneuver that decreases the cost-to-come at each step. This new maneuver removes the possibility of livelocks and allows a result to be formed that proves convergence to the goal configurations. Finally, we present a limited range Graph-based Spatial Load Balancing (GSLB) algorithm which fairly divides a non-convex space among multiple agents that are subject to differential constraints and have a limited travel distance. The GSLB is proven to converge to a solution when maximizing the area covered by the agents. The analysis
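    For orientation, the tree-growing loop that underlies the RRT* and the variants above can be sketched with a plain RRT (no rewiring, so no asymptotic-optimality guarantee) in an obstacle-free square. Workspace bounds, step size, and iteration budget are illustrative assumptions, and this is not the dissertation's code.

```python
import math
import random

def rrt(start, goal, bounds=10.0, step=0.5, iters=3000, goal_tol=0.5, seed=3):
    # Minimal RRT in an obstacle-free square [0, bounds]^2: repeatedly sample
    # a random point, extend the nearest tree node toward it by at most
    # `step`, and stop once a node lands within `goal_tol` of the goal.
    rng = random.Random(seed)
    nodes, parent = [start], {0: None}
    for _ in range(iters):
        q = (rng.uniform(0, bounds), rng.uniform(0, bounds))
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], q))
        near = nodes[i]
        d = math.dist(near, q)
        if d == 0:
            continue
        scale = min(1.0, step / d)
        new = (near[0] + scale * (q[0] - near[0]),
               near[1] + scale * (q[1] - near[1]))
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < goal_tol:
            path, j = [], len(nodes) - 1
            while j is not None:  # walk parent links back to the root
                path.append(nodes[j])
                j = parent[j]
            return path[::-1]
    return None  # no path found within the iteration budget

path = rrt((0.5, 0.5), (9.0, 9.0))
```

    RRT* additionally chooses each new node's parent from a neighborhood by cost-to-come and rewires neighbors through the new node, which is what the Grandparent-Connection and Focused-Refinement heuristics above accelerate.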

  7. Tank 241-AX-104 upper vadose zone cone penetrometer demonstration sampling and analysis plan

    Energy Technology Data Exchange (ETDEWEB)

    FIELD, J.G.

    1999-02-02

    This sampling and analysis plan (SAP) is the primary document describing field and laboratory activities and requirements for the tank 241-AX-104 upper vadose zone cone penetrometer (CP) demonstration. It is written in accordance with Hanford Tank Initiative Tank 241-AX-104 Upper Vadose Zone Demonstration Data Quality Objective (Banning 1999). This technology demonstration, to be conducted at tank 241-AX-104, is being performed by the Hanford Tanks Initiative (HTI) Project as a part of the Tank Waste Remediation System (TWRS) Retrieval Program (EM-30) and the Office of Science and Technology (EM-50) Tanks Focus Area. Sample results obtained as part of this demonstration will provide additional information for subsequent revisions to the Retrieval Performance Evaluation (RPE) report (Jacobs 1998). The RPE report is the result of an evaluation of a single tank farm (AX Tank Farm) used as the basis for demonstrating a methodology for developing the data and analyses necessary to support making tank waste retrieval decisions within the context of tank farm closure requirements. The RPE includes a study of vadose zone contaminant transport mechanisms, including analysis of projected tank leak characteristics, hydrogeologic characteristics of tank farm soils, and the observed distribution of contaminants in the vadose zone in the tank farms. With limited characterization information available, large uncertainties exist as to the nature and extent of contaminants that may exist in the upper vadose zone in the AX Tank Farm. Traditionally, data have been collected from soils in the vadose zone through the installation of boreholes and wells. Soil samples are collected as the borehole is advanced, and samples are screened on site and/or sent to a laboratory for analysis. Some in-situ geophysical methods of contaminant analysis can be used to evaluate radionuclide levels in the soils adjacent to an existing borehole. However, geophysical methods require compensation for well

  8. 40 CFR Appendix A to Subpart G of... - Sampling Plans for Selective Enforcement Auditing of Marine Engines

    Science.gov (United States)

    2010-07-01

    ... Enforcement Auditing of Marine Engines A Appendix A to Subpart G of Part 91 Protection of Environment...-IGNITION ENGINES Selective Enforcement Auditing Regulations Pt. 91, Subpt. G, App. A Appendix A to Subpart G of Part 91—Sampling Plans for Selective Enforcement Auditing of Marine Engines Table 1—Sampling...

  9. 40 CFR Appendix A to Subpart F of... - Sampling Plans for Selective Enforcement Auditing of Nonroad Engines

    Science.gov (United States)

    2010-07-01

    ... Enforcement Auditing of Nonroad Engines A Appendix A to Subpart F of Part 89 Protection of Environment... NONROAD COMPRESSION-IGNITION ENGINES Selective Enforcement Auditing Pt. 89, Subpt. F, App. A Appendix A to Subpart F of Part 89—Sampling Plans for Selective Enforcement Auditing of Nonroad Engines Table 1—Sampling...

  10. Corganiser: a web-based software tool for planning time-sensitive sampling of whole rounds during scientific drilling

    DEFF Research Database (Denmark)

    Marshall, Ian

    2014-01-01

    Corganiser is a software tool developed to simplify the process of preparing whole-round sampling plans for time-sensitive microbiology and geochemistry sampling during scientific drilling. It was developed during the Integrated Ocean Drilling Program (IODP) Expedition 347, but is designed to work...

  11. Sampling and Analysis Plan for White Oak Creek Watershed Remedial Investigation supplemental sampling, Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    1996-05-01

    This Sampling and Analysis Plan (SAP) presents the project requirements for proposed soil sampling to support the White Oak Creek Remedial Investigation/Feasibility Study at Oak Ridge National Laboratory. During the Data Quality Objectives process for the project, it was determined that limited surface soil sampling is needed to supplement the historical environmental characterization database. The primary driver for the additional sampling is the need to identify potential human health and ecological risks at various sites that have not yet proceeded through a remedial investigation. These sites include Waste Area Grouping (WAG) 3, WAG 4, WAG 7, and WAG 9. WAG 4 efforts are limited to nonradiological characterization since recent seep characterization activities at the WAG have defined the radiological problem there

  12. Geographically weighted negative binomial regression applied to zonal level safety performance models.

    Science.gov (United States)

    Gomes, Marcos José Timbó Lima; Cunto, Flávio; da Silva, Alan Ricardo

    2017-09-01

    Generalized Linear Models (GLM) with a negative binomial error distribution have been widely used to estimate safety at the level of transportation planning. The limited ability of this technique to take spatial effects into account can be overcome through the use of local models from spatial regression techniques, such as Geographically Weighted Poisson Regression (GWPR). Although GWPR is a system that deals with spatial dependency and heterogeneity and has already been used in some road safety studies at the planning level, it fails to account for the possible overdispersion that can be found in observations of road-traffic crashes. Two approaches were adopted for the Geographically Weighted Negative Binomial Regression (GWNBR) model to allow discrete data to be modeled in a non-stationary form and to account for the overdispersion of the data: the first assumes a constant overdispersion parameter for all the traffic zones, and the second estimates it for each spatial unit. This research conducts a comparative analysis between non-spatial global crash prediction models and spatial local GWPR and GWNBR at the level of traffic zones in Fortaleza/Brazil. A geographic database of 126 traffic zones was compiled from the available data on exposure, network characteristics, socioeconomic factors and land use. The models were calibrated by using the frequency of injury crashes as a dependent variable, and the results showed that GWPR and GWNBR achieved a better performance than GLM for the average residuals and likelihood, as well as reducing the spatial autocorrelation of the residuals; the GWNBR model was better able to capture the spatial heterogeneity of the crash frequency. Copyright © 2017 Elsevier Ltd. All rights reserved.
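
The geographic weighting that distinguishes GWPR/GWNBR from a global GLM can be sketched in a few lines. This is a hypothetical illustration (made-up zone centroids and a Gaussian distance-decay kernel), not the calibration used in the study:

```python
import math

def gaussian_kernel_weights(zones, focal, bandwidth):
    """Weight each traffic zone by its distance to the focal zone.

    zones: list of (x, y) centroids; focal: (x, y) of the zone being fit;
    bandwidth: kernel scale. Weights lie in (0, 1] and equal 1 at the
    focal zone, so nearby zones dominate the local likelihood.
    """
    weights = []
    for (x, y) in zones:
        d = math.hypot(x - focal[0], y - focal[1])
        weights.append(math.exp(-((d / bandwidth) ** 2)))
    return weights

# Hypothetical centroids (km) for four traffic zones.
zones = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0), (3.0, 4.0)]
w = gaussian_kernel_weights(zones, focal=(0.0, 0.0), bandwidth=2.0)
```

A local model is then fit once per focal zone by maximizing the weighted (negative binomial) likelihood with these weights, which is what lets the coefficients and the overdispersion parameter vary over space.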

  13. Statistical properties of nonlinear intermediate states: binomial state

    Energy Technology Data Exchange (ETDEWEB)

    Abdalla, M Sebawe [Mathematics Department, College of Science, King Saud University, PO Box 2455, Riyadh 11451 (Saudi Arabia); Obada, A-S F [Department of Mathematics, Faculty of Science, Al-Azhar University, Nasr City 11884, Cairo (Egypt); Darwish, M [Department of Physics, Faculty of Education, Suez Canal University, Al-Arish (Egypt)

    2005-12-01

    In the present paper we introduce a nonlinear binomial state (the state which interpolates between the nonlinear coherent and number states). The main investigation concentrates on the statistical properties of such a state, where we consider the squeezing phenomenon by examining the variation in the quadrature variances for both normal and amplitude-squared squeezing. Examinations of the quasi-probability distribution functions (the Wigner W-function and the Q-function) are also given for both diagonal and off-diagonal terms. The quadrature distribution and the phase distribution, as well as the phase variances, are discussed. Moreover, we give in detail a generation scheme for such a state.

  14. Microbial comparative pan-genomics using binomial mixture models

    DEFF Research Database (Denmark)

    Ussery, David; Snipen, L; Almøy, T

    2009-01-01

    The size of the core- and pan-genome of bacterial species is a topic of increasing interest due to the growing number of sequenced prokaryote genomes, many from the same species. Attempts to estimate these quantities have been made, using regression methods or mixture models. We extend the latter...... occurring genes in the population. Conclusion: Analyzing pan-genomics data with binomial mixture models is a way to handle dependencies between genomes, which we find are always present. A bottleneck in the estimation procedure is the annotation of rarely occurring genes....
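
A binomial mixture of this kind can be illustrated with a small EM fit. The data layout (occurrence counts of gene families across n genomes) follows the abstract, but the two-component model, starting values, and toy counts below are illustrative assumptions, not the authors' multi-component procedure:

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def em_two_binomials(counts, n, iters=200):
    """EM for a two-component binomial mixture.

    counts: number of genomes (out of n) in which each gene family occurs.
    Returns (weight of component 1, p1, p2) with p1 <= p2.
    """
    w, p1, p2 = 0.5, 0.2, 0.8          # crude starting values
    for _ in range(iters):
        # E-step: responsibility of component 1 for each gene family.
        r = []
        for k in counts:
            a = w * binom_pmf(k, n, p1)
            b = (1 - w) * binom_pmf(k, n, p2)
            r.append(a / (a + b))
        # M-step: update mixing weight and per-component detection rates.
        w = sum(r) / len(counts)
        p1 = sum(ri * k for ri, k in zip(r, counts)) / (n * sum(r))
        p2 = sum((1 - ri) * k for ri, k in zip(r, counts)) / (n * sum(1 - ri for ri in r))
    return (w, p1, p2) if p1 <= p2 else (1 - w, p2, p1)

# Hypothetical occurrence counts for 12 gene families across n = 10 genomes:
# a cloud of rarely occurring genes plus a near-universal core.
counts = [1, 1, 2, 1, 2, 1, 9, 10, 10, 9, 10, 10]
w, p_rare, p_core = em_two_binomials(counts, n=10)
```

The fitted low-detection component plays the role of the rarely occurring genes that the abstract identifies as the bottleneck: their small p makes them easy to miss in any one genome.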

  15. Correlation Structures of Correlated Binomial Models and Implied Default Distribution

    Science.gov (United States)

    Mori, Shintaro; Kitsukawa, Kenji; Hisakado, Masato

    2008-11-01

    We show how to analyze and interpret the correlation structures, the conditional expectation values and correlation coefficients of exchangeable Bernoulli random variables. We study implied default distributions for the iTraxx-CJ tranches and some popular probabilistic models, including the Gaussian copula model, Beta binomial distribution model and long-range Ising model. We interpret the differences in their profiles in terms of the correlation structures. The implied default distribution has singular correlation structures, reflecting the credit market implications. We point out two possible origins of the singular behavior.
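
For the beta-binomial member of this model family, the default-count distribution and the pairwise correlation of the exchangeable Bernoulli variables have closed forms. A minimal sketch (hypothetical portfolio size and Beta parameters, not the iTraxx-CJ calibration):

```python
from math import comb, gamma

def beta_fn(x, y):
    """Euler beta function via the gamma function."""
    return gamma(x) * gamma(y) / gamma(x + y)

def beta_binomial_pmf(k, n, a, b):
    """P(k defaults out of n names) when the common default probability
    theta is mixed over a Beta(a, b) distribution."""
    return comb(n, k) * beta_fn(k + a, n - k + b) / beta_fn(a, b)

n, a, b = 50, 0.5, 9.5                  # hypothetical: 50 names, Beta(0.5, 9.5)
pmf = [beta_binomial_pmf(k, n, a, b) for k in range(n + 1)]

p_default = a / (a + b)                 # E[theta]: marginal default probability
rho = 1.0 / (a + b + 1.0)               # default correlation of any pair of names
```

The pair (p_default, rho) is the kind of correlation-structure summary compared across models in the paper; holding the mean fixed while shrinking a + b raises rho and fattens the upper tail of the loss distribution.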

  16. Seeps and springs sampling and analysis plan for the environmental monitoring plan for Waste Area Grouping 6 at Oak Ridge National Laboratory, Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    This Sampling and Analysis Plan addresses the monitoring, sampling, and analysis activities that will be conducted at seeps and springs and at two french drain outlets in support of the Environmental Monitoring Plan for Waste Area Grouping (WAG) 6. WAG 6 is a shallow-land-burial disposal facility for low-level radioactive waste at Oak Ridge National Laboratory, a research facility owned by the US Department of Energy and operated by Martin Marietta Energy Systems, Inc. Initially, sampling will be conducted at as many as 15 locations within WAG 6 (as many as 13 seeps and 2 french drain outlets). After evaluating the results obtained and reviewing the observations made by field personnel during the first round of sampling, several seeps and springs will be chosen as permanent monitoring points, together with the two french drain outlets. Baseline sampling of these points will then be conducted quarterly for 1 year (i.e., four rounds of sampling after the initial round). The samples will be analyzed for various geochemical, organic, inorganic, and radiological parameters. Permanent sampling points having suitable flow rates and conditions may be outfitted with automatic flow-monitoring equipment. The results of the sampling and flow-monitoring efforts will help to quantify flux moving across the ungauged perimeter of the site and will help to identify changes in releases from the contaminant sources.

  17. Seeps and springs sampling and analysis plan for the environmental monitoring plan for Waste Area Grouping 6 at Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    1994-08-01

    This Sampling and Analysis Plan addresses the monitoring, sampling, and analysis activities that will be conducted at seeps and springs and at two french drain outlets in support of the Environmental Monitoring Plan for Waste Area Grouping (WAG) 6. WAG 6 is a shallow-land-burial disposal facility for low-level radioactive waste at Oak Ridge National Laboratory, a research facility owned by the US Department of Energy and operated by Martin Marietta Energy Systems, Inc. Initially, sampling will be conducted at as many as 15 locations within WAG 6 (as many as 13 seeps and 2 french drain outlets). After evaluating the results obtained and reviewing the observations made by field personnel during the first round of sampling, several seeps and springs will be chosen as permanent monitoring points, together with the two french drain outlets. Baseline sampling of these points will then be conducted quarterly for 1 year (i.e., four rounds of sampling after the initial round). The samples will be analyzed for various geochemical, organic, inorganic, and radiological parameters. Permanent sampling points having suitable flow rates and conditions may be outfitted with automatic flow-monitoring equipment. The results of the sampling and flow-monitoring efforts will help to quantify flux moving across the ungauged perimeter of the site and will help to identify changes in releases from the contaminant sources

  18. Rgbp: An R Package for Gaussian, Poisson, and Binomial Random Effects Models with Frequency Coverage Evaluations

    Directory of Open Access Journals (Sweden)

    Hyungsuk Tak

    2017-06-01

    Rgbp is an R package that provides estimates and verifiable confidence intervals for random effects in two-level conjugate hierarchical models for overdispersed Gaussian, Poisson, and binomial data. Rgbp models aggregate data from k independent groups summarized by observed sufficient statistics for each random effect, such as sample means, possibly with covariates. Rgbp uses approximate Bayesian machinery with unique improper priors for the hyper-parameters, which leads to good repeated sampling coverage properties for random effects. A special feature of Rgbp is an option that generates synthetic data sets to check whether the interval estimates for random effects actually meet the nominal confidence levels. Additionally, Rgbp provides inference statistics for the hyper-parameters, e.g., regression coefficients.
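
Rgbp's approximate Bayesian machinery is more involved, but the shrinkage behaviour it delivers for binomial data can be illustrated with a plain conjugate sketch (hypothetical data and prior strength; this is not the package's algorithm):

```python
def beta_binomial_shrinkage(successes, trials, prior_strength):
    """Shrink each group's observed rate toward the pooled rate.

    Posterior mean under a Beta(m*p0, m*(1-p0)) prior, where p0 is the
    pooled rate and m = prior_strength acts as a prior sample size.
    """
    p0 = sum(successes) / sum(trials)
    m = prior_strength
    return [(y + m * p0) / (n + m) for y, n in zip(successes, trials)]

# Hypothetical data: k = 4 groups with binomial outcomes.
successes = [3, 15, 40, 8]
trials = [10, 50, 100, 20]
est = beta_binomial_shrinkage(successes, trials, prior_strength=10.0)
```

Each estimate lands between the group's raw rate and the pooled rate, with small groups pulled hardest; Rgbp's frequency-coverage check then asks whether intervals built around such shrunken estimates still cover the true random effects at the nominal rate.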

  19. Fast shading correction for cone beam CT in radiation therapy via sparse sampling on planning CT.

    Science.gov (United States)

    Shi, Linxi; Tsui, Tiffany; Wei, Jikun; Zhu, Lei

    2017-05-01

    The image quality of cone beam computed tomography (CBCT) is limited by severe shading artifacts, hindering its quantitative applications in radiation therapy. In this work, we propose an image-domain shading correction method using planning CT (pCT) as prior information which is highly adaptive to the clinical environment. We propose to perform shading correction via sparse sampling on pCT. The method starts with a coarse mapping between the first-pass CBCT images obtained from the Varian TrueBeam system and the pCT. The scatter correction method embedded in the Varian commercial software removes some image errors, but the CBCT images still contain severe shading artifacts. The difference images between the mapped pCT and the CBCT are considered as shading errors, but only sparse shading samples are selected for correction using empirical constraints to avoid carrying over false information from pCT. A Fourier-transform-based technique, referred to as local filtration, is proposed to efficiently process the sparse data for effective shading correction. The performance of the proposed method is evaluated on one anthropomorphic pelvis phantom and 17 patients who were scheduled for radiation therapy. (The code for the proposed method and sample data can be downloaded from https://sites.google.com/view/linxicbct.) Results: The proposed shading correction substantially improves the CBCT image quality on both the phantom and the patients to a level close to that of the pCT images. On the phantom, the spatial nonuniformity (SNU) difference between CBCT and pCT is reduced from 74 to 1 HU. The root-mean-square difference of SNU between CBCT and pCT is reduced from 83 to 10 HU on the pelvis patients, and from 101 to 12 HU on the thorax patients. The robustness of the proposed shading correction is fully investigated with simulated registration errors between CBCT and pCT on the phantom and mis-registration on patients. The sparse sampling scheme of our method successfully

  20. Sampling and Analysis Plan for Verification Sampling of LANL-Derived Residual Radionuclides in Soils within Tract A-18-2 for Land Conveyance

    Energy Technology Data Exchange (ETDEWEB)

    Ruedig, Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-30

    Public Law 105-119 directs the U.S. Department of Energy (DOE) to convey or transfer parcels of land to the Incorporated County of Los Alamos or their designees and to the Department of Interior, Bureau of Indian Affairs, in trust for the Pueblo de San Ildefonso. Los Alamos National Security is tasked to support DOE in conveyance and/or transfer of identified land parcels no later than September 2022. Under DOE Order 458.1, Radiation Protection of the Public and the Environment (O458.1, 2013) and Los Alamos National Laboratory (LANL or the Laboratory) implementing Policy 412 (P412, 2014), real property with the potential to contain residual radioactive material must meet the criteria for clearance and release to the public. This Sampling and Analysis Plan (SAP) is a second investigation of Tract A-18-2 for the purpose of verifying the previous sampling results (LANL 2017). This sample plan requires 18 project-specific soil samples for use in radiological clearance decisions consistent with LANL Procedure ENV-ES-TP-238 (2015a) and guidance in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM, 2000). The sampling work will be conducted by LANL, and samples will be evaluated by a LANL-contracted independent lab. However, there will be federal review (verification) of all steps of the sampling process.

  1. Test Plan for Rotary Mode Core Sample Truck Grapple Hoist Level Wind System

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    A Grapple Hoist Assembly is currently used on the Rotary Mode Core Sampling Trucks (RMCSTs) to actuate the sampler and retrieve the pintle rod during sampling operations. The hoist assembly includes a driven drum approximately two inches wide and six inches in diameter that rotates to pay out or reel in the 5/32-in. cable. The current Grapple Hoist Assembly, detailed on drawing H-2-690057, is prone to ''bird nesting'' the cable on the drum. ''Bird nesting'' is a condition in which the cable does not wind onto the drum in a uniformly layered manner, but winds in a random fashion where the cable essentially ''piles up'' inappropriately on the drum and, on some occasions, winds on the drum drive shaft. A system to help control this ''bird nesting'' problem has been designed as an addition to the existing components of the Grapple Hoist Assembly. The new design consists of a mechanism that is timed with, and driven by, the shaft that drives the drum. This mechanism traverses back and forth across the width of the drum to lay the cable on the drum in a uniformly layered manner. This test plan establishes the acceptance criteria, test procedure, and test conditions. It also describes the test apparatus necessary to verify the adequacy of the level wind system design. The test is defined as qualification testing (LMHC 1999b) and as such will be performed at conditions beyond the parameters within which the Grapple Hoist Assembly is allowed to operate by the Safety Equipment List (SEL) (LMHC 1998)

  2. Test Plan for Rotary Mode Core Sample Truck Grapple Hoist Level Wind System

    Energy Technology Data Exchange (ETDEWEB)

    BOGER, R.M.

    1999-12-09

    A Grapple Hoist Assembly is currently used on the Rotary Mode Core Sampling Trucks (RMCSTs) to actuate the sampler and retrieve the pintle rod during sampling operations. The hoist assembly includes a driven drum approximately two inches wide and six inches in diameter that rotates to pay out or reel in the 5/32-in. cable. The current Grapple Hoist Assembly, detailed on drawing H-2-690057, is prone to ''bird nesting'' the cable on the drum. ''Bird nesting'' is a condition in which the cable does not wind onto the drum in a uniformly layered manner, but winds in a random fashion where the cable essentially ''piles up'' inappropriately on the drum and, on some occasions, winds on the drum drive shaft. A system to help control this ''bird nesting'' problem has been designed as an addition to the existing components of the Grapple Hoist Assembly. The new design consists of a mechanism that is timed with, and driven by, the shaft that drives the drum. This mechanism traverses back and forth across the width of the drum to lay the cable on the drum in a uniformly layered manner. This test plan establishes the acceptance criteria, test procedure, and test conditions. It also describes the test apparatus necessary to verify the adequacy of the level wind system design. The test is defined as qualification testing (LMHC 1999b) and as such will be performed at conditions beyond the parameters within which the Grapple Hoist Assembly is allowed to operate by the Safety Equipment List (SEL) (LMHC 1998).

  3. Sampling and analysis plan for the preoperational environmental survey of the spent nuclear fuel project facilities

    Energy Technology Data Exchange (ETDEWEB)

    MITCHELL, R.M.

    1999-04-01

    This sampling and analysis plan will support the preoperational environmental monitoring for construction, development, and operation of the Spent Nuclear Fuel (SNF) Project facilities, which have been designed for the conditioning and storage of spent nuclear fuels, particularly the fuel elements associated with the operation of N-Reactor. The SNF consists principally of irradiated metallic uranium, and therefore includes plutonium and mixed fission products. The primary effort will consist of removing the SNF from the storage basins in the K East and K West Areas, placing it in multicanister overpacks, vacuum drying, conditioning, and subsequent dry vault storage in the 200 East Area. The primary purpose and need for this action is to reduce the risks to public health and safety and to the environment. Specifically, these include prevention of the release of radioactive materials into the air or to the soil surrounding the K Basins, prevention of the potential migration of radionuclides through the soil column to the nearby Columbia River, reduction of occupational radiation exposure, and elimination of the risks to the public and to workers from the deterioration of SNF in the K Basins.

  4. Sampling and analysis plan for the preoperational environmental survey of the spent nuclear fuel project facilities

    International Nuclear Information System (INIS)

    MITCHELL, R.M.

    1999-01-01

    This sampling and analysis plan will support the preoperational environmental monitoring for construction, development, and operation of the Spent Nuclear Fuel (SNF) Project facilities, which have been designed for the conditioning and storage of spent nuclear fuels, particularly the fuel elements associated with the operation of N-Reactor. The SNF consists principally of irradiated metallic uranium, and therefore includes plutonium and mixed fission products. The primary effort will consist of removing the SNF from the storage basins in the K East and K West Areas, placing it in multicanister overpacks, vacuum drying, conditioning, and subsequent dry vault storage in the 200 East Area. The primary purpose and need for this action is to reduce the risks to public health and safety and to the environment. Specifically, these include prevention of the release of radioactive materials into the air or to the soil surrounding the K Basins, prevention of the potential migration of radionuclides through the soil column to the nearby Columbia River, reduction of occupational radiation exposure, and elimination of the risks to the public and to workers from the deterioration of SNF in the K Basins

  5. Negative binomial models for abundance estimation of multiple closed populations

    Science.gov (United States)

    Boyce, Mark S.; MacKenzie, Darry I.; Manly, Bryan F.J.; Haroldson, Mark A.; Moody, David W.

    2001-01-01

    Counts of uniquely identified individuals in a population offer opportunities to estimate abundance. However, for various reasons such counts may be burdened by heterogeneity in the probability of being detected. Theoretical arguments and empirical evidence demonstrate that the negative binomial distribution (NBD) is a useful characterization for counts from biological populations with heterogeneity. We propose a method that focuses on estimating multiple populations by simultaneously using a suite of models derived from the NBD. We used this approach to estimate the number of female grizzly bears (Ursus arctos) with cubs-of-the-year in the Yellowstone ecosystem, for each year, 1986-1998. Akaike's Information Criterion (AIC) indicated that a negative binomial model with a constant level of heterogeneity across all years was best for characterizing the sighting frequencies of female grizzly bears. A lack-of-fit test indicated the model adequately described the collected data. Bootstrap techniques were used to estimate standard errors and 95% confidence intervals. We provide a Monte Carlo technique, which confirms that the Yellowstone ecosystem grizzly bear population increased during the period 1986-1998.
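
As a minimal illustration of characterizing heterogeneous sighting frequencies with the NBD, the sketch below fits a negative binomial by the method of moments; the paper itself uses a likelihood-based suite of models with AIC, and the counts here are made up:

```python
def negative_binomial_moments(counts):
    """Method-of-moments fit of a negative binomial to count data.

    Returns (r, p) for the parameterisation with mean r(1-p)/p and
    variance r(1-p)/p**2; requires variance > mean (overdispersion),
    which is exactly the heterogeneity signature described above.
    """
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    if var <= mean:
        raise ValueError("no overdispersion: a Poisson fit is adequate")
    p = mean / var
    r = mean * p / (1 - p)
    return r, p

# Hypothetical sighting frequencies of individually identified females.
counts = [0, 1, 1, 2, 0, 4, 7, 1, 0, 2, 3, 0, 1, 5, 2]
r, p = negative_binomial_moments(counts)
```

A small r relative to the mean indicates strong heterogeneity in detectability; as r grows large the NBD approaches the Poisson.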

  6. Low reheating temperatures in monomial and binomial inflationary models

    International Nuclear Information System (INIS)

    Rehagen, Thomas; Gelmini, Graciela B.

    2015-01-01

    We investigate the allowed range of reheating temperature values in light of the Planck 2015 results and the recent joint analysis of Cosmic Microwave Background (CMB) data from the BICEP2/Keck Array and Planck experiments, using monomial and binomial inflationary potentials. While the well studied ϕ^2 inflationary potential is no longer favored by current CMB data, as well as ϕ^p with p > 2, a ϕ^1 potential and canonical reheating (w_re = 0) provide a good fit to the CMB measurements. In this last case, we find that the Planck 2015 68% confidence limit upper bound on the spectral index, n_s, implies an upper bound on the reheating temperature of T_re ≲ 6×10^10 GeV, and excludes instantaneous reheating. The low reheating temperatures allowed by this model open the possibility that dark matter could be produced during the reheating period instead of when the Universe is radiation dominated, which could lead to very different predictions for the relic density and momentum distribution of WIMPs, sterile neutrinos, and axions. We also study binomial inflationary potentials and show the effects of a small departure from a ϕ^1 potential. We find that as a subdominant ϕ^2 term in the potential increases, first instantaneous reheating becomes allowed, and then the lowest possible reheating temperature of T_re = 4 MeV is excluded by the Planck 2015 68% confidence limit

  7. Estimation of adjusted rate differences using additive negative binomial regression.

    Science.gov (United States)

    Donoghoe, Mark W; Marschner, Ian C

    2016-08-15

    Rate differences are an important effect measure in biostatistics and provide an alternative perspective to rate ratios. When the data are event counts observed during an exposure period, adjusted rate differences may be estimated using an identity-link Poisson generalised linear model, also known as additive Poisson regression. A problem with this approach is that the assumption of equality of mean and variance rarely holds in real data, which often show overdispersion. An additive negative binomial model is the natural alternative to account for this; however, standard model-fitting methods are often unable to cope with the constrained parameter space arising from the non-negativity restrictions of the additive model. In this paper, we propose a novel solution to this problem using a variant of the expectation-conditional maximisation either (ECME) algorithm. Our method provides a reliable way to fit an additive negative binomial regression model and also permits flexible generalisations using semi-parametric regression functions. We illustrate the method using a placebo-controlled clinical trial of fenofibrate treatment in patients with type II diabetes, where the outcome is the number of laser therapy courses administered to treat diabetic retinopathy. An R package is available that implements the proposed method. Copyright © 2016 John Wiley & Sons, Ltd.
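
In the simplest two-group setting with no covariates, the identity-link (additive) model reduces to a difference of crude rates. The sketch below shows that special case, with a simple variance-inflation factor standing in for negative binomial overdispersion; it does not reproduce the paper's constrained ECME fit for covariate-adjusted models, and the counts are hypothetical:

```python
import math

def rate_difference(events1, time1, events0, time0, phi=1.0):
    """Crude rate difference with a large-sample Wald interval.

    events/time are counts and person-time per group; phi >= 1 inflates
    the Poisson variance as a crude stand-in for overdispersion.
    """
    rd = events1 / time1 - events0 / time0
    var = phi * (events1 / time1**2 + events0 / time0**2)
    se = math.sqrt(var)
    return rd, (rd - 1.96 * se, rd + 1.96 * se)

# Hypothetical trial arms: 30 events in 200 person-years vs 12 in 180.
rd, ci = rate_difference(30, 200.0, 12, 180.0)
rd_od, ci_od = rate_difference(30, 200.0, 12, 180.0, phi=2.0)
```

The point estimate is unchanged by overdispersion, but the interval widens, which is the practical consequence of moving from additive Poisson to additive negative binomial regression.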

  8. Soil sampling and analysis plan for the 3718-F Alkali Metal Treatment and Storage Facility closure activities

    Energy Technology Data Exchange (ETDEWEB)

    Sonnichsen, J.C.

    1997-05-01

    Amendment V.13.B.b to the approved closure plan (DOE-RL 1995a) requires that a soil sampling and analysis plan be prepared and submitted to the Washington State Department of Ecology (Ecology) for review and approval. Amendment V.13.B.c requires that a diagram of the 3718-F Alkali Metal Treatment and Storage Facility unit (the treatment, storage, and disposal [TSD] unit) boundary that is to be closed, including the maximum extent of operation, be prepared and submitted as part is of the soil sampling and analysis plan. This document describes the sampling and analysis that is to be performed in response to these requirements and amends the closure plan. Specifically, this document supersedes Section 6.2, lines 43--46, and Section 7.3.6 of the closure plan. Results from the analysis will be compared to cleanup levels identified in the closure plan. These cleanup levels will be established using residential exposure assumptions in accordance with the Model Toxics Control Act (MTCA) Cleanup Regulation (Washington Administrative Code [WAC] 173-340) as required in Amendment V.13.B.I. Results of all sampling, including the raw analytical data, a summary of analytical results, a data validation package, and a narrative summary with conclusions will be provided to Ecology as specified in Amendment V.13.B.e. The results and process used to collect and analyze the soil samples will be certified by a licensed professional engineer. These results and a certificate of closure for the balance of the TSD unit, as outlined in Chapter 7.0 of the approved closure plan (storage shed, concrete pad, burn building, scrubber, and reaction tanks), will provide the basis for a closure determination.

  9. Soil sampling and analysis plan for the 3718-F Alkali Metal Treatment and Storage Facility closure activities

    International Nuclear Information System (INIS)

    Sonnichsen, J.C.

    1997-01-01

    Amendment V.13.B.b to the approved closure plan (DOE-RL 1995a) requires that a soil sampling and analysis plan be prepared and submitted to the Washington State Department of Ecology (Ecology) for review and approval. Amendment V.13.B.c requires that a diagram of the 3718-F Alkali Metal Treatment and Storage Facility unit (the treatment, storage, and disposal [TSD] unit) boundary that is to be closed, including the maximum extent of operation, be prepared and submitted as part is of the soil sampling and analysis plan. This document describes the sampling and analysis that is to be performed in response to these requirements and amends the closure plan. Specifically, this document supersedes Section 6.2, lines 43--46, and Section 7.3.6 of the closure plan. Results from the analysis will be compared to cleanup levels identified in the closure plan. These cleanup levels will be established using residential exposure assumptions in accordance with the Model Toxics Control Act (MTCA) Cleanup Regulation (Washington Administrative Code [WAC] 173-340) as required in Amendment V.13.B.I. Results of all sampling, including the raw analytical data, a summary of analytical results, a data validation package, and a narrative summary with conclusions will be provided to Ecology as specified in Amendment V.13.B.e. The results and process used to collect and analyze the soil samples will be certified by a licensed professional engineer. These results and a certificate of closure for the balance of the TSD unit, as outlined in Chapter 7.0 of the approved closure plan (storage shed, concrete pad, burn building, scrubber, and reaction tanks), will provide the basis for a closure determination

  10. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  11. Validation of fixed sample size plans for monitoring lepidopteran pests of Brassica oleracea crops in North Korea.

    Science.gov (United States)

    Hamilton, A J; Waters, E K; Kim, H J; Pak, W S; Furlong, M J

    2009-06-01

    The combined action of two lepidopteran pests, Plutella xylostella L. (Plutellidae) and Pieris rapae L. (Pieridae), causes significant yield losses in cabbage (Brassica oleracea variety capitata) crops in the Democratic People's Republic of Korea. Integrated pest management (IPM) strategies for these cropping systems are in their infancy, and sampling plans have not yet been developed. We used statistical resampling to assess the performance of fixed sample size plans (ranging from 10 to 50 plants). First, the precision (D = SE/mean) of the plans in estimating the population mean was assessed. There was substantial variation in achieved D for all sample sizes, and sample sizes of at least 20 and 45 plants were required to achieve the acceptable precision level of D ≤ 0.3 at least 50 and 75% of the time, respectively. Second, the performance of the plans in classifying the population density relative to an economic threshold (ET) was assessed. To account for the different damage potentials of the two species, the ETs were defined in terms of standard insects (SIs), where 1 SI = 1 P. rapae = 5 P. xylostella larvae. The plans were implemented using different ETs for the three growth stages of the crop: precupping (1 SI/plant), cupping (0.5 SI/plant), and heading (4 SI/plant). Improvement in classification certainty with increasing sample size could be seen in the increasing steepness of the operating characteristic curves. Rather than prescribe a particular plan, we suggest that the results of these analyses be used to inform practitioners of the relative merits of the different sample sizes.
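The precision criterion described above can be explored with a small resampling sketch. This is an illustrative simulation, not the paper's data or code: the precision target D = SE/mean ≤ 0.3 follows the abstract, while the field counts and their negative binomial clumping are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-plant counts (the paper's data are not reproduced here);
# negative-binomial-style clumping is typical of field insect counts.
field = rng.negative_binomial(n=2, p=0.5, size=500)

def achieved_precision(pop, sample_size, n_boot=2000):
    """Fraction of resampled fixed-size plans achieving D = SE/mean <= 0.3."""
    ok = 0
    for _ in range(n_boot):
        s = rng.choice(pop, size=sample_size, replace=True)
        m = s.mean()
        if m == 0:
            continue  # precision undefined when no insects are found
        d = s.std(ddof=1) / np.sqrt(sample_size) / m
        ok += d <= 0.3
    return ok / n_boot

# Larger plans achieve the precision target more reliably:
for n in (10, 20, 30, 45):
    print(n, round(achieved_precision(field, n), 2))
```

As in the abstract, the achieved precision varies between resamples, so each sample size is summarized by the fraction of runs meeting the target rather than a single D value.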

  12. Estimating spatial and temporal components of variation in count data using negative binomial mixed models

    Science.gov (United States)

    Irwin, Brian J.; Wagner, Tyler; Bence, James R.; Kepler, Megan V.; Liu, Weihai; Hayes, Daniel B.

    2013-01-01

    Partitioning total variability into its component temporal and spatial sources is a powerful way to better understand time series and elucidate trends. The data available for such analyses of fish and other populations are usually nonnegative integer counts of the number of organisms, often dominated by many low values with few observations of relatively high abundance. These characteristics are not well approximated by the Gaussian distribution. We present a detailed description of a negative binomial mixed-model framework that can be used to model count data and quantify temporal and spatial variability. We applied these models to data from four fishery-independent surveys of Walleyes Sander vitreus across the Great Lakes basin. Specifically, we fitted models to gill-net catches from Wisconsin waters of Lake Superior; Oneida Lake, New York; Saginaw Bay in Lake Huron, Michigan; and Ohio waters of Lake Erie. These long-term monitoring surveys varied in overall sampling intensity, the total catch of Walleyes, and the proportion of zero catches. Parameter estimation included the negative binomial scaling parameter, and we quantified the random effects as the variations among gill-net sampling sites, the variations among sampled years, and site × year interactions. This framework (i.e., the application of a mixed model appropriate for count data in a variance-partitioning context) represents a flexible approach that has implications for monitoring programs (e.g., trend detection) and for examining the potential of individual variance components to serve as response metrics to large-scale anthropogenic perturbations or ecological changes.
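The variance-partitioning idea can be illustrated by simulating the model structure the abstract describes: a log-scale linear predictor with site, year, and site × year random effects, and negative binomial observation noise. All numbers below (component standard deviations, the negative binomial size parameter, grid dimensions) are invented for illustration, and no model fitting is shown.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative simulation (not the surveys' data):
# log mu_ij = b0 + site_i + year_j + (site x year)_ij, counts ~ NegBin(mu, k).
n_sites, n_years = 12, 15
site = rng.normal(0.0, 0.6, n_sites)               # among-site effects
year = rng.normal(0.0, 0.4, n_years)               # among-year effects
interact = rng.normal(0.0, 0.3, (n_sites, n_years))  # site x year effects
b0, k = 1.0, 2.0                                   # intercept, NB size parameter

mu = np.exp(b0 + site[:, None] + year[None, :] + interact)
# numpy's (n, p) parameterization gives mean mu when p = k / (k + mu):
counts = rng.negative_binomial(k, k / (k + mu))

# Share of linear-predictor variance attributable to each random component:
total = site.var() + year.var() + interact.var()
for name, v in [("site", site.var()), ("year", year.var()),
                ("site x year", interact.var())]:
    print(f"{name}: {v / total:.2f}")
```

A fitted mixed model would estimate these shares from data; the sketch only shows what "partitioning total variability" means for counts generated this way.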

  13. Reducing Monte Carlo error in the Bayesian estimation of risk ratios using log-binomial regression models.

    Science.gov (United States)

    Salmerón, Diego; Cano, Juan A; Chirlaque, María D

    2015-08-30

    In cohort studies, binary outcomes are very often analyzed by logistic regression. However, it is well known that when the goal is to estimate a risk ratio, the logistic regression is inappropriate if the outcome is common. In these cases, a log-binomial regression model is preferable. On the other hand, the estimation of the regression coefficients of the log-binomial model is difficult owing to the constraints that must be imposed on these coefficients. Bayesian methods allow a straightforward approach for log-binomial regression models and produce smaller mean squared errors in the estimation of risk ratios than the frequentist methods, and the posterior inferences can be obtained using the software WinBUGS. However, Markov chain Monte Carlo methods implemented in WinBUGS can lead to large Monte Carlo errors in the approximations to the posterior inferences because they produce correlated simulations, and the accuracy of the approximations is inversely related to this correlation. To reduce correlation and to improve accuracy, we propose a reparameterization based on a Poisson model and a sampling algorithm coded in R. Copyright © 2015 John Wiley & Sons, Ltd.
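A minimal worked example of why the link function matters when the target is a risk ratio; the 2×2 cohort counts are hypothetical. With a single binary exposure and no other covariates the log-binomial model is saturated, so its slope can be read off directly as log(RR):

```python
import math

# Hypothetical cohort counts (illustration only):
exposed_events, exposed_n = 60, 100      # risk 0.60 among exposed
unexposed_events, unexposed_n = 30, 100  # risk 0.30 among unexposed

p1 = exposed_events / exposed_n
p0 = unexposed_events / unexposed_n

risk_ratio = p1 / p0
beta = math.log(risk_ratio)  # log-binomial slope in the saturated case

# A logistic model would instead target the odds ratio, which overstates
# the association when the outcome is common, as here:
odds_ratio = (p1 / (1 - p1)) / (p0 / (1 - p0))
print(risk_ratio, odds_ratio)  # RR = 2.0 but OR = 3.5
```

The constraint problem the abstract mentions arises because the log link does not force fitted probabilities below 1, which is what motivates the Bayesian and Poisson-reparameterization approaches.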

  14. Theoretical Antecedents of Standing at Work: An Experience Sampling Approach Using the Theory of Planned Behavior

    Science.gov (United States)

    Meyer, M. Renée Umstattd; Wu, Cindy; Walsh, Shana M.

    2016-01-01

    Time spent sitting has been associated with an increased risk of diabetes, cancer, obesity, and mental health impairments. However, 75% of Americans spend most of their days sitting, with work-sitting accounting for 63% of total daily sitting time. Little research examining theory-based antecedents of standing or sitting has been conducted. This lack of solid groundwork makes it difficult to design effective intervention strategies to decrease sitting behaviors. Using the Theory of Planned Behavior (TPB) as our theoretical lens to better understand factors related with beneficial standing behaviors already being practiced, we examined relationships between TPB constructs and time spent standing at work among “positive deviants” (those successful in behavior change). Experience sampling methodology (ESM), 4 times a day (midmorning, before lunch, afternoon, and before leaving work) for 5 consecutive workdays (Monday to Friday), was used to assess employees' standing time. TPB scales assessing attitude (α = 0.81–0.84), norms (α = 0.83), perceived behavioral control (α = 0.77), and intention (α = 0.78) were developed using recommended methods and collected once on the Friday before the ESM surveys started. ESM data are hierarchically nested, therefore we tested our hypotheses using multilevel structural equation modeling with Mplus. Hourly full-time university employees (n = 50; 70.6% female, 84.3% white, mean age = 44 (SD = 11), 88.2% in full-time staff positions) with sedentary occupation types (time at desk while working ≥6 hours/day) participated. A total of 871 daily surveys were completed. Only perceived behavioral control (β = 0.45, p < 0.05) was related with work-standing at the event-level (model fit: just fit); mediation through intention was not supported. These findings suggest using a positive deviance approach to enhance perceived behavioral control, in addition to implementing environmental changes like installing standing desks. PMID:29546189

  15. Adaptive local learning in sampling based motion planning for protein folding.

    Science.gov (United States)

    Ekenna, Chinwe; Thomas, Shawna; Amato, Nancy M

    2016-08-01

    Simulating protein folding motions is an important problem in computational biology. Motion planning algorithms, such as Probabilistic Roadmap Methods, have been successful in modeling the folding landscape. Probabilistic Roadmap Methods and variants contain several phases (i.e., sampling, connection, and path extraction). Most of the time is spent in the connection phase, and selecting which variant to employ is a difficult task. Global machine learning has been applied to the connection phase but is inefficient in situations with varying topology, such as those typical of folding landscapes. We develop a local learning algorithm that exploits the past performance of methods within the neighborhood of the current connection attempts as a basis for learning. It is sensitive not only to different types of landscapes but also to differing regions in the landscape itself, removing the need to explicitly partition the landscape. We perform experiments on 23 proteins of varying secondary structure makeup with 52-114 residues. We compare the success rates of our method and other methods, demonstrate a clear need for learning (i.e., only learning methods were able to validate against all available experimental data), and show that local learning is superior to global learning, producing in many cases significantly higher-quality results than the other methods. We present an algorithm that uses local learning to select appropriate connection methods in the context of roadmap construction for protein folding. Our method removes the burden of deciding which method to use, leverages the strengths of the individual input methods, and is extendable to include other future connection methods.

  16. Engineering task plan for development, fabrication, and deployment of nested, fixed depth fluidic sampling and at-tank analysis systems

    International Nuclear Information System (INIS)

    REICH, F.R.

    1999-01-01

    An engineering task plan was developed that presents the resources, responsibilities, and schedules for the development, test, and deployment of the nested, fixed-depth fluidic sampling and at-tank analysis system. The sampling system, deployed in the privatization contract double-shell tank feed tank, will provide waste samples for assuring the readiness of the tank for shipment to the privatization contractor for vitrification. The at-tank analysis system will provide "real-time" assessments of the sampled wastes' chemical and physical properties. These systems support the Hanford Phase 1B Privatization Contract

  17. Innovative solutions: sample financial management business plan: neurosurgical intensive care unit.

    Science.gov (United States)

    Villanueva-Baldonado, Analiza; Barrett-Sheridan, Shirley E

    2010-01-01

    This article describes one institution's intention to implement a financial management business plan for a neurosurgical intensive care unit in a level I trauma center. The financial objective of this proposed business plan includes a service increase in the patient population requiring critical care in a way that will help control costs.

  18. Group SkSP-R sampling plan for accelerated life tests

    Indian Academy of Sciences (India)

    ... use condition, while the scale parameter can be obtained from the acceleration factor. The plan parameters are determined through a non-linear optimisation problem for fixed values of producer's risk and consumer's risk. The advantages of the proposed plan over the existing one are explained with some practical examples.

  19. LAZIO REGION: DEVELOPMENT OF A SAMPLING PLAN FOR OFFICIAL CONTROL OF FOODSTUFFS

    Directory of Open Access Journals (Sweden)

    S. Saccares

    2009-06-01

    This paper describes the criteria and methodologies used in planning microbiological checks on food in the context of the wider Plan of Integrated Controls (PRIC) of the Lazio Region, as set by Regulation (EC) 882/2004.

  20. A Flexible, Efficient Binomial Mixed Model for Identifying Differential DNA Methylation in Bisulfite Sequencing Data

    Science.gov (United States)

    Lea, Amanda J.

    2015-01-01

    Identifying sources of variation in DNA methylation levels is important for understanding gene regulation. Recently, bisulfite sequencing has become a popular tool for investigating DNA methylation levels. However, modeling bisulfite sequencing data is complicated by dramatic variation in coverage across sites and individual samples, and because of the computational challenges of controlling for genetic covariance in count data. To address these challenges, we present a binomial mixed model and an efficient, sampling-based algorithm (MACAU: Mixed model association for count data via data augmentation) for approximate parameter estimation and p-value computation. This framework allows us to simultaneously account for both the over-dispersed, count-based nature of bisulfite sequencing data, as well as genetic relatedness among individuals. Using simulations and two real data sets (whole genome bisulfite sequencing (WGBS) data from Arabidopsis thaliana and reduced representation bisulfite sequencing (RRBS) data from baboons), we show that our method provides well-calibrated test statistics in the presence of population structure. Further, it improves power to detect differentially methylated sites: in the RRBS data set, MACAU detected 1.6-fold more age-associated CpG sites than a beta-binomial model (the next best approach). Changes in these sites are consistent with known age-related shifts in DNA methylation levels, and are enriched near genes that are differentially expressed with age in the same population. Taken together, our results indicate that MACAU is an efficient, effective tool for analyzing bisulfite sequencing data, with particular salience to analyses of structured populations. MACAU is freely available at www.xzlab.org/software.html. PMID:26599596
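A sketch of the overdispersion problem that motivates such models, using a beta-binomial (the comparison model named in the abstract) on simulated counts. Coverages and shape parameters are invented, and this does not reproduce MACAU itself; it only shows that a plain binomial underfits overdispersed methylated-read counts.

```python
import numpy as np
from scipy.stats import betabinom, binom

rng = np.random.default_rng(0)

# Toy bisulfite-style data: variable read coverage per site, with
# methylated-read counts drawn from an overdispersed beta-binomial.
coverage = rng.integers(5, 60, size=400)
a, b = 2.0, 3.0  # assumed beta-binomial shape parameters
meth = betabinom.rvs(coverage, a, b, random_state=1)

p_hat = meth.sum() / coverage.sum()  # pooled methylation level, near a/(a+b)

# Total log-likelihood under a plain binomial vs the beta-binomial:
ll_binom = binom.logpmf(meth, coverage, p_hat).sum()
ll_bb = betabinom.logpmf(meth, coverage, a, b).sum()
print(ll_binom, ll_bb)  # the beta-binomial accommodates the extra variance
```

MACAU goes further by adding a genetic-relatedness random effect to the binomial model, which neither distribution above captures.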

  1. Phase 2 sampling and analysis plan, Quality Assurance Project Plan, and environmental health and safety plan for the Clinch River Remedial Investigation: An addendum to the Clinch River RCRA Facility Investigation plan

    International Nuclear Information System (INIS)

    Cook, R.B.; Adams, S.M.; Beauchamp, J.J.; Bevelhimer, M.S.; Blaylock, B.G.; Brandt, C.C.; Etnier, E.L.; Ford, C.J.; Frank, M.L.; Gentry, M.J.; Greeley, M.S.; Halbrook, R.S.; Harris, R.A.; Holladay, S.K.; Hook, L.A.; Howell, P.L.; Kszos, L.A.; Levine, D.A.; Skiles, J.L.; Suter, G.W.

    1992-12-01

    This document contains a three-part addendum to the Clinch River Resource Conservation and Recovery Act (RCRA) Facility Investigation Plan. The Clinch River RCRA Facility Investigation began in 1989, as part of the comprehensive remediation of facilities on the US Department of Energy Oak Ridge Reservation (ORR). The ORR was added to the National Priorities List in December 1989. The regulatory agencies have encouraged the adoption of Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) terminology; therefore, the Clinch River activity is now referred to as the Clinch River Remedial Investigation (CRRI), not the Clinch River RCRA Facility Investigation. Part 1 of this document is the plan for sampling and analysis (S&A) during Phase 2 of the CRRI. Part 2 is a revision of the Quality Assurance Project Plan for the CRRI, and Part 3 is a revision of the Environmental Health and Safety Plan for the CRRI. The Clinch River RI (CRRI) is designed to address the transport, fate, and distribution of waterborne contaminants (radionuclides, metals, and organic compounds) released from the DOE Oak Ridge Reservation (ORR) and to assess potential risks to human health and the environment associated with these contaminants. Primary areas of investigation are Melton Hill Reservoir, the Clinch River from Melton Hill Dam to its confluence with the Tennessee River, Poplar Creek, and Watts Bar Reservoir. The contaminants identified in the Clinch River/Watts Bar Reservoir (CR/WBR) downstream of the ORR are those associated with the water, suspended particles, deposited sediments, aquatic organisms, and wildlife feeding on aquatic organisms. The purpose of the Phase 2 S&A Plan is to describe the proposed tasks and subtasks developed to meet the primary objectives of the CRRI

  3. Covering Resilience: A Recent Development for Binomial Checkpointing

    Energy Technology Data Exchange (ETDEWEB)

    Walther, Andrea; Narayanan, Sri Hari Krishna

    2016-09-12

    In terms of computing time, adjoint methods offer a very attractive alternative for computing gradient information required, e.g., for optimization purposes. However, together with this very favorable temporal complexity comes a memory requirement that is in essence proportional to the operation count of the underlying function, e.g., if algorithmic differentiation is used to provide the adjoints. For this reason, checkpointing approaches in many variants have become popular. This paper analyzes an extension of the so-called binomial approach to also cover possible failures of the computing system. Such a measure of precaution is of special interest for massively parallel simulations and adjoint calculations where the mean time between failures of the large-scale computing system is smaller than the time needed to complete the calculation of the adjoint information. We describe the extensions of standard checkpointing approaches required for such resilience, provide a corresponding implementation, and discuss first numerical results.
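For context, the "binomial" in binomial checkpointing refers to the classical revolve bound: with c checkpoints and at most r repeated forward sweeps, up to C(c + r, c) time steps can be reversed. A sketch of that background result (not of the paper's resilience extension):

```python
from math import comb

def max_steps(checkpoints: int, sweeps: int) -> int:
    """Maximum number of time steps whose adjoint can be computed with the
    given number of checkpoints and forward sweeps, per the binomial
    checkpointing bound C(checkpoints + sweeps, checkpoints)."""
    return comb(checkpoints + sweeps, checkpoints)

# A handful of checkpoints suffices for long runs if some recomputation
# (here up to 5 sweeps) is acceptable:
for c in (3, 10, 30):
    print(c, max_steps(c, 5))
```

The resilience question the abstract addresses is what happens to such schedules when a checkpoint or a sweep can be lost to a machine failure mid-computation.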

  4. A Bayesian equivalency test for two independent binomial proportions.

    Science.gov (United States)

    Kawasaki, Yohei; Shimokawa, Asanao; Yamada, Hiroshi; Miyaoka, Etsuo

    2016-01-01

    In clinical trials, it is often necessary to perform an equivalence study. An equivalence study requires actively demonstrating equivalence between two different drugs or treatments. Since equivalence cannot be asserted merely because it is not rejected by a superiority test, statistical methods known as equivalency tests have been suggested. These methods are based on the frequentist framework; however, there are few such methods in the Bayesian framework. Hence, this article proposes a new index that indicates the equivalency of binomial proportions, constructed within the Bayesian framework. We provide two methods for calculating the index and compare the probabilities obtained by these two calculation methods. Moreover, we apply the index to the results of actual clinical trials to demonstrate its utility.
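One simple way to construct a Bayesian equivalence index of this kind is the posterior probability that the two proportions differ by less than a margin δ, sketched here with independent uniform priors and plain Monte Carlo. This is a generic construction under assumed priors, not necessarily either of the paper's two calculation methods, and the trial counts are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

def equivalence_prob(x1, n1, x2, n2, delta=0.1, draws=100_000):
    """Posterior P(|p1 - p2| < delta) under independent Beta(1, 1) priors,
    approximated by Monte Carlo draws from the two Beta posteriors."""
    p1 = rng.beta(1 + x1, 1 + n1 - x1, draws)
    p2 = rng.beta(1 + x2, 1 + n2 - x2, draws)
    return float(np.mean(np.abs(p1 - p2) < delta))

# Hypothetical two-arm results:
print(equivalence_prob(45, 100, 47, 100))  # similar arms: index is high
print(equivalence_prob(30, 100, 60, 100))  # clearly different arms: near zero
```

Declaring equivalence when the index exceeds a preset cutoff plays the role of the frequentist two one-sided tests procedure.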

  5. Comprehensive work plan and health and safety plan for the 7500 Area Contamination Site sampling at Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    Burman, S.N.; Landguth, D.C.; Uziel, M.S.; Hatmaker, T.L.; Tiner, P.F.

    1992-05-01

    As part of the Environmental Restoration Program sponsored by the US Department of Energy's Office of Environmental Restoration and Waste Management, this plan has been developed for the environmental sampling efforts at the 7500 Area Contamination Site, Oak Ridge National Laboratory (ORNL), Oak Ridge, Tennessee. This plan was developed by the Measurement Applications and Development Group (MAD) of the Health and Safety Research Division of ORNL and will be implemented by ORNL/MAD. Major components of the plan include (1) a quality assurance project plan that describes the scope and objectives of ORNL/MAD activities at the 7500 Area Contamination Site, assigns responsibilities, and provides emergency information for contingencies that may arise during field operations; (2) sampling and analysis sections; (3) a site-specific health and safety section that describes general site hazards, hazards associated with specific tasks, personnel protection requirements, and mandatory safety procedures; (4) procedures and requirements for equipment decontamination and responsibilities for generated wastes, waste management, and contamination control; and (5) a discussion of form completion and reporting required to document activities at the 7500 Area Contamination Site

  8. 40 CFR Appendix A to Subpart F of... - Sampling Plans for Selective Enforcement Auditing of Small Nonroad Engines

    Science.gov (United States)

    2010-07-01

    Appendix A to Subpart F of Part 90 (Protection of Environment; control of emissions from nonroad spark-ignition engines at or below 19 kilowatts): Sampling Plans for Selective Enforcement Auditing of Small Nonroad Engines.

  9. Planning and processing multistage samples with a computer program—MUST.

    Science.gov (United States)

    John W. Hazard; Larry E. Stewart

    1974-01-01

    A computer program was written to handle multistage sampling designs in insect populations. It is, however, general enough to be used for any population where the number of stages does not exceed three. The program handles three types of sampling situations, all of which assume equal probability sampling. Option 1 takes estimates of sample variances, costs, and either...

  10. Using a Simple Binomial Model to Assess Improvement in Predictive Capability: Sequential Bayesian Inference, Hypothesis Testing, and Power Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sigeti, David E. [Los Alamos National Laboratory; Pelak, Robert A. [Los Alamos National Laboratory

    2012-09-11

    We present a Bayesian statistical methodology for identifying improvement in predictive simulations, including an analysis of the number of (presumably expensive) simulations that will need to be made in order to establish with a given level of confidence that an improvement has been observed. Our analysis assumes the ability to predict (or postdict) the same experiments with legacy and new simulation codes and uses a simple binomial model for the probability, θ, that, in an experiment chosen at random, the new code will provide a better prediction than the old. This model makes it possible to do statistical analysis with an absolute minimum of assumptions about the statistics of the quantities involved, at the price of discarding some potentially important information in the data. In particular, the analysis depends only on whether or not the new code predicts better than the old in any given experiment, and not on the magnitude of the improvement. We show how the posterior distribution for θ may be used, in a kind of Bayesian hypothesis testing, both to decide if an improvement has been observed and to quantify our confidence in that decision. We quantify the predictive probability that should be assigned, prior to taking any data, to the possibility of achieving a given level of confidence, as a function of sample size. We show how this predictive probability depends on the true value of θ and, in particular, how there will always be a region around θ = 1/2 where it is highly improbable that we will be able to identify an improvement in predictive capability, although the width of this region will shrink to zero as the sample size goes to infinity. We show how the posterior standard deviation may be used as a kind of 'plan B metric' in the case that the analysis shows that θ is close to 1/2, and argue that such a plan B should generally be part of hypothesis testing. All the analysis presented in the paper is done with a
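Under a uniform prior, the binomial model above gives a closed-form Beta posterior, so the two quantities discussed (confidence that θ > 1/2, and the posterior standard deviation as the fallback metric) are one line each. A sketch with invented success counts; the uniform prior is an assumption not stated in the abstract.

```python
from scipy.stats import beta

def improvement_summary(successes: int, trials: int):
    """Posterior for theta = P(new code beats old) under a uniform prior,
    i.e. Beta(1 + successes, 1 + failures). Returns P(theta > 1/2) and the
    posterior standard deviation (the fallback 'plan B'-style metric)."""
    post = beta(1 + successes, 1 + trials - successes)
    return post.sf(0.5), post.std()

# Hypothetical comparison outcomes over 20 paired predictions:
print(improvement_summary(14, 20))  # new code mostly better: high confidence
print(improvement_summary(11, 20))  # near theta = 1/2: little can be concluded
```

The second case illustrates the region around θ = 1/2 the abstract describes, where no realistic sample size yields a confident verdict.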

  11. Theoretical Antecedents of Standing at Work: An Experience Sampling Approach Using the Theory of Planned Behavior

    Directory of Open Access Journals (Sweden)

    M. Renée Umstattd Meyer

    2016-09-01

    Time spent sitting has been associated with an increased risk of diabetes, cancer, obesity, and mental health impairments. However, 75% of Americans spend most of their days sitting, with work-sitting accounting for 63% of total daily sitting time. Little research examining theory-based antecedents of standing or sitting has been conducted. This lack of solid groundwork makes it difficult to design effective intervention strategies to decrease sitting behaviors. Using the Theory of Planned Behavior (TPB) as our theoretical lens to better understand factors related with beneficial standing behaviors already being practiced, we examined relationships between TPB constructs and time spent standing at work among “positive deviants” (those successful in behavior change). Experience sampling methodology (ESM), 4 times a day (midmorning, before lunch, afternoon, and before leaving work) for 5 consecutive workdays (Monday to Friday), was used to assess employees’ standing time. TPB scales assessing attitude (α = 0.81–0.84), norms (α = 0.83), perceived behavioral control (α = 0.77), and intention (α = 0.78) were developed using recommended methods and collected once on the Friday before the ESM surveys started. ESM data are hierarchically nested, therefore we tested our hypotheses using multilevel structural equation modeling with Mplus. Hourly full-time university employees (n = 50; 70.6% female, 84.3% white, mean age = 44 (SD = 11), 88.2% in full-time staff positions) with sedentary occupation types (time at desk while working ≥6 hours/day) participated. A total of 871 daily surveys were completed. Only perceived behavioral control (β = 0.45, p < 0.05) was related with work-standing at the event-level (model fit: just fit); mediation through intention was not supported. This is the first study to examine theoretical antecedents of real-time work-standing in a naturalistic field setting among positive deviants. These relationships should be further

  12. Sampling and analysis plan for sludge located on the floor and in the pits of the 105-K basins

    International Nuclear Information System (INIS)

    BAKER, R.B.

    1998-01-01

    This Sampling and Analysis Plan (SAP) provides direction for the sampling of the sludge found on the floor and in the remote pits of the 105-K Basins to provide: (1) basic data for the sludges that have not been characterized to date and (2) representative sludge material for process tests to be made by the SNF Project/K Basins sludge treatment process subproject. The sampling equipment developed will remove representative samples of the radioactive sludge from underwater at the K Basins, depositing them in shielded containers for transport to the Hanford Site laboratories. Included in the present document is the basic background logic for selection of the samples to meet the requirements established in the Data Quality Objectives (DQO), HNF-2033, for this sampling activity. The present document also includes the laboratory analyses, methods, procedures, and reporting that will be required to meet the DQO

  13. Engineering Task Plan for Development, Fabrication, and Deployment of a Mobile, Variable-Depth Sampling System and an At-Tank Analysis System

    International Nuclear Information System (INIS)

    BOGER, R.M.

    2000-01-01

    This engineering task plan identifies the resources, responsibilities, and schedules for the development and deployment of a mobile, variable-depth sampling system and an at-tank analysis system. The mobile, variable-depth sampling system concept was developed after a cost assessment indicated a high cost for multiple deployments of the nested, fixed-depth sampling system. The sampling will provide double-shell tank (DST) staging tank waste samples to assure the readiness of the waste for shipment to the LAW/HLW plant for treatment and immobilization. The at-tank analysis system will provide "real-time" assessments of the samples' chemical and physical properties. These systems support the Hanford Phase 1B vitrification project.

  14. Revealing Word Order: Using Serial Position in Binomials to Predict Properties of the Speaker

    Science.gov (United States)

    Iliev, Rumen; Smirnova, Anastasia

    2016-01-01

    Three studies test the link between word order in binomials and psychological and demographic characteristics of a speaker. While linguists have already suggested that psychological, cultural and societal factors are important in choosing word order in binomials, the vast majority of relevant research was focused on general factors and on broadly…

  15. Modeling and Predistortion of Envelope Tracking Power Amplifiers using a Memory Binomial Model

    DEFF Research Database (Denmark)

    Tafuri, Felice Francesco; Sira, Daniel; Larsen, Torben

    2013-01-01

    The model definition is based on binomial series, hence the name memory binomial model (MBM). The MBM is here applied to measured data-sets acquired from an ET measurement set-up. When used as a PA model, the MBM showed an NMSE (Normalized Mean Squared Error) as low as −40 dB and an ACEPR (Adjacent Channel...

  16. Some normed binomial difference sequence spaces related to the [Formula: see text] spaces.

    Science.gov (United States)

    Song, Meimei; Meng, Jian

    2017-01-01

    The aim of this paper is to introduce the normed binomial sequence spaces [Formula: see text] by combining the binomial transformation and difference operator, where [Formula: see text]. We prove that these spaces are linearly isomorphic to the spaces [Formula: see text] and [Formula: see text], respectively. Furthermore, we compute Schauder bases and the α-, β-, and γ-duals of these sequence spaces.

  17. An efficient binomial model-based measure for sequence comparison and its application.

    Science.gov (United States)

    Liu, Xiaoqing; Dai, Qi; Li, Lihua; He, Zerong

    2011-04-01

    Sequence comparison is one of the major tasks in bioinformatics, which could serve as evidence of structural and functional conservation, as well as of evolutionary relations. There are several similarity/dissimilarity measures for sequence comparison, but challenges remain. This paper presents a binomial model-based measure to analyze biological sequences. With the help of a random indicator, the occurrence of a word at any position of a sequence can be regarded as a random Bernoulli variable, and the distribution of the sum of word occurrences is well known to be binomial. Using a recursive formula, we computed the binomial probability of the word count and proposed a binomial model-based measure based on the relative entropy. The proposed measure was tested by extensive experiments including classification of HEV genotypes and phylogenetic analysis, and further compared with alignment-based and alignment-free measures. The results demonstrate that the proposed measure based on the binomial model is more efficient.
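
    The abstract does not reproduce the paper's exact recursion or word statistics, so the following is only an illustrative sketch of the general idea: model per-position word occurrence as Bernoulli, build the binomial word-count distribution with the standard PMF recurrence, and compare two sequences by relative entropy. The example sequences and the chosen word are invented for demonstration.

```python
from math import log

def binomial_pmf(n: int, p: float):
    """Binomial PMF built by the recurrence P(k+1) = P(k) * (n-k)/(k+1) * p/(1-p)."""
    pmf = [(1.0 - p) ** n]
    for k in range(n):
        pmf.append(pmf[-1] * (n - k) / (k + 1) * p / (1.0 - p))
    return pmf

def word_prob(seq: str, word: str) -> float:
    """Empirical per-position probability that `word` occurs (Bernoulli indicator)."""
    n = len(seq) - len(word) + 1
    hits = sum(seq[i:i + len(word)] == word for i in range(n))
    return hits / n

def relative_entropy(p_dist, q_dist, eps=1e-12):
    """Kullback-Leibler divergence between two word-count distributions."""
    return sum(pi * log((pi + eps) / (qi + eps)) for pi, qi in zip(p_dist, q_dist))

# Toy sequences and word, purely for illustration.
s1, s2, w = "ACGTACGTAC", "ACGGTTGTTT", "ACG"
n = min(len(s1), len(s2)) - len(w) + 1
d = relative_entropy(binomial_pmf(n, word_prob(s1, w)),
                     binomial_pmf(n, word_prob(s2, w)))
```

    A real measure would aggregate this over all words of a fixed length; here a single word is enough to show the mechanics.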

  18. Remedial investigation sampling and analysis plan for J-Field, Aberdeen Proving Ground, Maryland: Volume 2, Quality Assurance Project Plan

    Energy Technology Data Exchange (ETDEWEB)

    Prasad, S.; Martino, L.; Patton, T.

    1995-03-01

    J-Field encompasses about 460 acres at the southern end of the Gunpowder Neck Peninsula in the Edgewood Area of APG (Figure 2.1). Since World War II, the Edgewood Area of APG has been used to develop, manufacture, test, and destroy chemical agents and munitions. These materials were destroyed at J-Field by open burning and open detonation (OB/OD). For the purposes of this project, J-Field has been divided into eight geographic areas or facilities that are designated as areas of concern (AOCs): the Toxic Burning Pits (TBP), the White Phosphorus Burning Pits (WPP), the Riot Control Burning Pit (RCP), the Robins Point Demolition Ground (RPDG), the Robins Point Tower Site (RPTS), the South Beach Demolition Ground (SBDG), the South Beach Trench (SBT), and the Prototype Building (PB). The scope of this project is to conduct a remedial investigation/feasibility study (RI/FS) and ecological risk assessment to evaluate the impacts of past disposal activities at the J-Field site. Sampling for the RI will be carried out in three stages (I, II, and III) as detailed in the FSP. A phased approach will be used for the J-Field ecological risk assessment (ERA).

  19. Semiparametric Allelic Tests for Mapping Multiple Phenotypes: Binomial Regression and Mahalanobis Distance.

    Science.gov (United States)

    Majumdar, Arunabha; Witte, John S; Ghosh, Saurabh

    2015-12-01

    Binary phenotypes commonly arise due to multiple underlying quantitative precursors, and genetic variants may impact multiple traits in a pleiotropic manner. Hence, simultaneously analyzing such correlated traits may be more powerful than analyzing individual traits. Various genotype-level methods, e.g., MultiPhen (O'Reilly et al. []), have been developed to identify genetic factors underlying a multivariate phenotype. For univariate phenotypes, the usefulness and applicability of allele-level tests have been investigated. The test of allele frequency difference among cases and controls is commonly used for mapping case-control association. However, allelic methods for multivariate association mapping have not been studied much. In this article, we explore two allelic tests of multivariate association: one using a Binomial regression model based on inverted regression of genotype on phenotype (Binomial regression-based Association of Multivariate Phenotypes [BAMP]), and the other employing the Mahalanobis distance between two sample means of the multivariate phenotype vector for two alleles at a single-nucleotide polymorphism (Distance-based Association of Multivariate Phenotypes [DAMP]). These methods can incorporate both discrete and continuous phenotypes. Some theoretical properties for BAMP are studied. Using simulations, the power of the methods for detecting multivariate association is compared with that of the genotype-level test MultiPhen. The allelic tests yield marginally higher power than MultiPhen for multivariate phenotypes. For one/two binary traits under a recessive mode of inheritance, allelic tests are found to be substantially more powerful. All three tests are applied to two different real data sets, and the results offer some support for the simulation study. 
We propose a hybrid approach for testing multivariate association that implements MultiPhen when Hardy-Weinberg Equilibrium (HWE) is violated and BAMP otherwise, because the allelic approaches assume HWE.
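
    DAMP's statistic is described only as "the Mahalanobis distance between two sample means of the multivariate phenotype vector for two alleles"; the sketch below computes that distance for two hypothetical trait-mean vectors. The means and pooled covariance are invented for illustration, not taken from the study.

```python
def mahalanobis_sq(mean1, mean2, cov_inv):
    """Squared Mahalanobis distance between two mean vectors, given an inverse covariance."""
    d = [a - b for a, b in zip(mean1, mean2)]
    return sum(d[i] * cov_inv[i][j] * d[j]
               for i in range(len(d)) for j in range(len(d)))

def inv2x2(m):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[m[1][1] / det, -m[0][1] / det],
            [-m[1][0] / det, m[0][0] / det]]

# Hypothetical two-trait phenotype means for the two alleles at a SNP.
mean_A, mean_a = [1.2, 0.8], [1.0, 0.5]
pooled_cov = [[1.0, 0.3], [0.3, 1.0]]   # assumed pooled covariance of the traits
d2 = mahalanobis_sq(mean_A, mean_a, inv2x2(pooled_cov))
```

    In the actual test, d2 would be compared against a reference distribution to obtain a p-value; that calibration step is omitted here.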

  20. Inverse sampled Bernoulli (ISB) procedure for estimating a population proportion, with nuclear material applications

    International Nuclear Information System (INIS)

    Wright, T.

    1982-01-01

    A new sampling procedure is introduced for estimating a population proportion. The procedure combines the ideas of inverse binomial sampling and Bernoulli sampling. An unbiased estimator is given with its variance. The procedure can be viewed as a generalization of inverse binomial sampling.
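
    The abstract does not give the ISB estimator itself. For the classical inverse binomial component alone, a standard unbiased estimator of the proportion p (due to Haldane) samples until a fixed number k of successes is reached in n trials and reports (k−1)/(n−1). A sketch of that component only, with the Bernoulli-sampling combination left out:

```python
import random

def inverse_binomial_sample(p_true: float, k: int, rng: random.Random) -> int:
    """Draw Bernoulli(p_true) trials until k successes; return the trial count."""
    trials = successes = 0
    while successes < k:
        trials += 1
        if rng.random() < p_true:
            successes += 1
    return trials

def haldane_estimate(k: int, n: int) -> float:
    """Unbiased estimator of p under inverse binomial sampling: (k-1)/(n-1)."""
    return (k - 1) / (n - 1)

# Monte Carlo check of unbiasedness with an arbitrary true proportion.
rng = random.Random(42)
k, p_true = 5, 0.3
estimates = [haldane_estimate(k, inverse_binomial_sample(p_true, k, rng))
             for _ in range(20000)]
mean_est = sum(estimates) / len(estimates)
```

    The average of the estimates should sit close to the true p, which is the property the ISB procedure generalizes.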

  1. Sampling and analysis plan for remediation of Operable Unit 100-IU-3 waste site 600-104

    International Nuclear Information System (INIS)

    1997-08-01

    This sampling and analysis plan (SAP) presents the rationale and strategy for the sampling and analysis activities to support remediation of 100-IU-3 Operable Unit waste site 600-104. The purpose of the proposed sampling and analysis activities is to demonstrate that time-critical remediation of the waste site for soil containing 2,4-dichlorophenoxyacetic acid salts and esters (2,4-D) and dioxin/furan isomers at concentrations that exceed cleanup levels has been effective. This shall be accomplished by sampling various locations of the waste site before and after remediation, analyzing the samples, and comparing the results to action levels set by the Washington State Department of Ecology.

  2. Environmental hazards in the developing world, a sample study of Pakistan: assessments, impacts and plans

    International Nuclear Information System (INIS)

    Khan, A.

    2003-01-01

    Centralized planning policies and lack of democratic participation of the masses at the community level have not only created uneven and unsustainable development and rural-urban bias, but have also generated various issues of water, air, and land pollution, adversely affecting human development in the developing world in general and in Pakistan in particular. (author)

  3. 34 CFR Appendix A to Subpart N of... - Sample Default Prevention Plan

    Science.gov (United States)

    2010-07-01

    ... implement when developing a default prevention plan. I. Core Default Reduction Strategies 1. Establish your.... Establish a process to ensure the accuracy of your rate. II. Additional Default Reduction Strategies 1... debt management activities. 2. Enhance the enrollment retention and academic persistence of borrowers...

  4. Quality Assurance Program Plan for the Waste Sampling and Characterization Facility

    Energy Technology Data Exchange (ETDEWEB)

    Grabbe, R.R.

    1995-03-02

    The objective of this Quality Assurance Plan is to provide quality assurance (QA) guidance, implementation of regulatory QA requirements, and quality control (QC) specifications for analytical services. This document follows the Department of Energy (DOE)-issued Hanford Analytical Services Quality Assurance Plan (HASQAP) and additional federal [10 US Code of Federal Regulations (CFR) 830.120] QA requirements that HASQAP does not cover. This document describes how the laboratory implements QA requirements to meet federal or state requirements, states the default QC specifications, and identifies the procedural information that governs how the laboratory operates. In addition, this document meets the objectives of the Quality Assurance Program provided in WHC-CM-4-2, Section 2.1. This document also covers QA elements that are required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Product Plans (QAMS-005) from the Environmental Protection Agency (EPA). A QA index is provided in Appendix A.

  5. Counting, enumerating and sampling of execution plans in a cost-based query optimizer

    NARCIS (Netherlands)

    F. Waas; C.A. Galindo-Legaria

    1999-01-01

    Testing an SQL database system by running large sets of deterministic or stochastic SQL statements is common practice in commercial database development. However, code defects often remain undetected, as the query optimizer's choice of an execution plan depends not only on

  6. Counting, Enumerating and Sampling of Execution Plans in a Cost-Based Query Optimizer

    NARCIS (Netherlands)

    F. Waas; C.A. Galindo-Legaria

    2000-01-01

    Testing an SQL database system by running large sets of deterministic or stochastic SQL statements is common practice in commercial database development. However, code defects often remain undetected, as the query optimizer's choice of an execution plan depends not only on the query

  7. Quality Assurance Program Plan for the Waste Sampling and Characterization Facility

    International Nuclear Information System (INIS)

    Grabbe, R.R.

    1995-01-01

    The objective of this Quality Assurance Plan is to provide quality assurance (QA) guidance, implementation of regulatory QA requirements, and quality control (QC) specifications for analytical services. This document follows the Department of Energy (DOE)-issued Hanford Analytical Services Quality Assurance Plan (HASQAP) and additional federal [10 US Code of Federal Regulations (CFR) 830.120] QA requirements that HASQAP does not cover. This document describes how the laboratory implements QA requirements to meet federal or state requirements, states the default QC specifications, and identifies the procedural information that governs how the laboratory operates. In addition, this document meets the objectives of the Quality Assurance Program provided in WHC-CM-4-2, Section 2.1. This document also covers QA elements that are required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Product Plans (QAMS-005) from the Environmental Protection Agency (EPA). A QA index is provided in Appendix A.

  8. Field Sampling Plan for Closure of the Central Facilities Area Sewage Treatment Plant Lagoon 3 and Land Application Area

    International Nuclear Information System (INIS)

    Lewis, Michael George

    2016-01-01

    This field sampling plan describes sampling of the soil/liner of Lagoon 3 at the Central Facilities Area Sewage Treatment Plant. The lagoon is to be closed, and samples obtained from the soil/liner will provide information to determine whether Lagoon 3 and the land application area can be closed in a manner that is protective of human health and the environment. Samples collected under this field sampling plan will be compared to Idaho National Laboratory background soil concentrations. If the concentrations of constituents of concern exceed the background levels, they will be compared to Comprehensive Environmental Response, Compensation, and Liability Act preliminary remediation goals and Resource Conservation and Recovery Act levels. If the concentrations of constituents of concern are lower than the background levels, the Resource Conservation and Recovery Act levels, or the preliminary remediation goals, then Lagoon 3 and the land application area will be closed. If the Resource Conservation and Recovery Act levels and/or the Comprehensive Environmental Response, Compensation, and Liability Act preliminary remediation goals are exceeded, additional sampling and action may be required.

  9. Field Sampling Plan for Closure of the Central Facilities Area Sewage Treatment Plant Lagoon 3 and Land Application Area

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, Michael George [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-10-01

    This field sampling plan describes sampling of the soil/liner of Lagoon 3 at the Central Facilities Area Sewage Treatment Plant. The lagoon is to be closed, and samples obtained from the soil/liner will provide information to determine whether Lagoon 3 and the land application area can be closed in a manner that is protective of human health and the environment. Samples collected under this field sampling plan will be compared to Idaho National Laboratory background soil concentrations. If the concentrations of constituents of concern exceed the background levels, they will be compared to Comprehensive Environmental Response, Compensation, and Liability Act preliminary remediation goals and Resource Conservation and Recovery Act levels. If the concentrations of constituents of concern are lower than the background levels, the Resource Conservation and Recovery Act levels, or the preliminary remediation goals, then Lagoon 3 and the land application area will be closed. If the Resource Conservation and Recovery Act levels and/or the Comprehensive Environmental Response, Compensation, and Liability Act preliminary remediation goals are exceeded, additional sampling and action may be required.

  10. UMTRA Project water sampling and analysis plan, Belfield and Bowman, North Dakota

    International Nuclear Information System (INIS)

    1994-08-01

    Surface remedial action is scheduled to begin at the Belfield and Bowman Uranium Mill Tailings Remedial Action (UMTRA) Project sites in the spring of 1996. Water sampling was conducted in 1993 at both the Belfield processing site and the Bowman processing/disposal site. Results of the sampling at both sites indicate that ground water conditions have remained relatively stable over time. Water sampling activities are not scheduled for 1994 because ground water conditions at the two sites are relatively stable, the 1993 sampling was comprehensive, and surface remediation activities are not scheduled to start until 1996. The next water sampling event is scheduled before the start of remedial activities and will include sampling selected monitor wells at both sites and several domestic wells in the vicinity

  11. Discrete Sampling Test Plan for the 200-BP-5 Operable Unit

    Energy Technology Data Exchange (ETDEWEB)

    Sweeney, Mark D.

    2010-02-04

    The Discrete Groundwater Sampling Project is conducted by the Pacific Northwest National Laboratory (PNNL) on behalf of CH2M HILL Plateau Remediation Company. The project is focused on delivering groundwater samples from prescribed horizons within select groundwater wells residing in the 200-BP-5 Operable Unit (200-BP-5 OU) on the Hanford Site. This document provides the scope, schedule, methodology, and other details of the PNNL discrete sampling effort.

  12. Meteorological monitoring sampling and analysis plan for the environmental monitoring plan at Waste Area Grouping 6, Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    1995-09-01

    This Sampling and Analysis Plan addresses meteorological monitoring activities that will be conducted in support of the Environmental Monitoring Plan for Waste Area Grouping (WAG) 6. WAG 6 is a shallow-burial land disposal facility for low-level radioactive waste at the Oak Ridge National Laboratory, a research facility owned by the US Department of Energy and managed by Lockheed Martin Energy Systems, Inc. Meteorological data for various climatological parameters (e.g., temperature, wind speed, humidity) will be collected by instruments installed at WAG 6. Data will be recorded electronically at frequencies varying from 5-min intervals to 1-h intervals, depending on the parameter. The data will be downloaded every 2 weeks, evaluated, compressed, and uploaded into a WAG 6 database for subsequent use. The meteorological data will be used in water balance calculations in support of the WAG 6 hydrogeological model.

  13. Fixed-Precision Sequential Sampling Plans for Estimating Alfalfa Caterpillar, Colias lesbia, Egg Density in Alfalfa, Medicago sativa, Fields in Córdoba, Argentina

    Science.gov (United States)

    Serra, Gerardo V.; Porta, Norma C. La; Avalos, Susana; Mazzuferi, Vilma

    2013-01-01

    The alfalfa caterpillar, Colias lesbia (Fabricius) (Lepidoptera: Pieridae), is a major pest of alfalfa, Medicago sativa L. (Fabales: Fabaceae), crops in Argentina. Its management is based mainly on chemical control of larvae whenever the larvae exceed the action threshold. To develop and validate fixed-precision sequential sampling plans, an intensive sampling programme for C. lesbia eggs was carried out in two alfalfa plots located in the Province of Córdoba, Argentina, from 1999 to 2002. Using Resampling for Validation of Sampling Plans software, 12 additional independent data sets were used to validate the sequential sampling plan with precision levels of 0.10 and 0.25 (SE/mean), respectively. For a range of mean densities of 0.10 to 8.35 eggs/sample, an average sample size of only 27 and 26 sample units was required to achieve a desired precision level of 0.25 for the sampling plans of Green and Kuno, respectively. As the precision level was increased to 0.10, average sample size increased to 161 and 157 sample units for the sampling plans of Green and Kuno, respectively. We recommend using Green's sequential sampling plan because it is less sensitive to changes in egg density. These sampling plans are a valuable tool for researchers to study population dynamics and to evaluate integrated pest management strategies. PMID:23909840
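
    Green's stop line is not given explicitly in the abstract. Under Taylor's power law s² = a·mᵇ, fixing the precision C = SE/mean and substituting m = Tₙ/n gives the standard form of the stop line, Tₙ = (C²·n^(b−1)/a)^(1/(b−2)): sampling stops once the cumulative count crosses it. A sketch with illustrative Taylor coefficients and hypothetical egg counts (none of these values are the study's fitted parameters):

```python
def green_stop_line(n: int, a: float, b: float, precision: float) -> float:
    """Green's (1970) stop line: stop once the cumulative count T_n over n
    sample units reaches (precision**2 * n**(b-1) / a) ** (1 / (b-2))."""
    return (precision ** 2 * n ** (b - 1) / a) ** (1.0 / (b - 2))

# Assumed Taylor's power law coefficients and the coarser precision level (0.25).
a, b, precision = 2.0, 1.5, 0.25

# Hypothetical per-unit egg counts examined sequentially.
counts = [0, 3, 1, 4, 2, 5, 0, 6, 3, 4, 2, 7,
          1, 5, 3, 4, 2, 4, 3, 5, 2, 4, 3, 6]
total = 0
for n, c in enumerate(counts, start=1):
    total += c
    if total >= green_stop_line(n, a, b, precision):
        break  # desired precision reached after n sample units
```

    With these assumed coefficients the stop line simplifies to 1024/n, so sampling ends as soon as n × Tₙ reaches 1024; tighter precision (0.10) pushes the line up and lengthens sampling, matching the larger average sample numbers reported above.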

  14. Field Sampling Plan for the HWMA/RCRA Closure Certification of the TRA-731 Caustic and Acid Storage Tank System - 1997 Notice of Violation Consent Order

    International Nuclear Information System (INIS)

    Evans, S.K.

    2002-01-01

    This Field Sampling Plan for the HWMA/RCRA Closure Certification of the TRA-731 Caustic and Acid Storage Tank System is one of two documents that comprise the Sampling and Analysis Plan for the HWMA/RCRA closure certification of the TRA-731 caustic and acid storage tank system at the Idaho National Engineering and Environmental Laboratory. This plan, which provides information about sampling design, required analyses, and sample collection and handling procedures, is to be used in conjunction with the Quality Assurance Project Plan for the HWMA/RCRA Closure Certification of the TRA-731 Caustic and Acid Storage Tank System

  15. Sampling and Analysis Plan for Release of the 105-C Below-Grade Structures and Underlying Soils

    International Nuclear Information System (INIS)

    Bauer, R.G.

    1998-02-01

    This sampling and analysis plan (SAP) presents the rationale and strategies for the sampling, field measurements, and analyses of the below-grade concrete structures from the Hanford 105-C Reactor Building and the underlying soils consistent with the land-use assumptions in the Record of Decision for the U.S. Department of Energy 100-BC-1, 100-DR-1, and 100-HR-1 Operable Units (ROD) (EPA 1995). This structure is one of nine surplus production reactors in the 100 Area of the Hanford Site. This SAP is based on the data quality objectives developed for the 105-C below-grade structures and underlying soils (BHI 1997)

  16. Sampling and analysis plan for the characterization of eight drums at the 200-BP-5 pump-and-treat systems

    International Nuclear Information System (INIS)

    Laws, J.R.

    1995-01-01

    Samples will be collected and analyzed to provide sufficient information for characterization of mercury and aluminum contamination in drums from the final rinse of the tanks in the two pump-and-treat systems supporting the 200-BP-5 Operable Unit. The data will be used to determine the type of contamination in the drums to properly designate the waste for disposal or treatment. This sampling plan does not replace the standing sampling requirements; it covers a separate sampling event to manage eight drums containing waste generated during an unanticipated contamination of the process water with mercury and aluminum nitrate nonahydrate (ANN). The Toxicity Characteristic Leaching Procedure (TCLP) will be used for extraction, and standard US Environmental Protection Agency (EPA) methods will be used for analysis.

  17. Microbial comparative pan-genomics using binomial mixture models

    Directory of Open Access Journals (Sweden)

    Ussery David W

    2009-08-01

    Background: The size of the core- and pan-genome of bacterial species is a topic of increasing interest due to the growing number of sequenced prokaryote genomes, many from the same species. Attempts to estimate these quantities have been made using regression methods or mixture models. We extend the latter approach by using statistical ideas developed for capture-recapture problems in ecology and epidemiology. Results: We estimate core- and pan-genome sizes for 16 different bacterial species. The results reveal a complex dependency structure for most species, manifested as heterogeneous detection probabilities. Estimated pan-genome sizes range from small (around 2,600 gene families in Buchnera aphidicola) to large (around 43,000 gene families in Escherichia coli). Results for Escherichia coli show that as more data become available, a larger diversity is estimated, indicating an extensive pool of rarely occurring genes in the population. Conclusion: Analyzing pan-genomics data with binomial mixture models is a way to handle dependencies between genomes, which we find are always present. A bottleneck in the estimation procedure is the annotation of rarely occurring genes.
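
    The fitting procedure is not detailed in the abstract; a minimal EM sketch for a mixture of binomial detection probabilities over gene-family counts is shown below. The component count, genome number, and detection counts are all invented for illustration, and the real method additionally corrects for unobserved (never-detected) families, which this sketch omits.

```python
from math import comb

def em_binomial_mixture(counts, n_genomes, n_comp=2, iters=200):
    """Fit a mixture of binomials by EM.
    counts[i] = number of genomes in which gene family i was detected."""
    # Crude initialisation: spread component probabilities over (0, 1).
    p = [(j + 1) / (n_comp + 1) for j in range(n_comp)]
    w = [1.0 / n_comp] * n_comp
    for _ in range(iters):
        # E-step: responsibility of component j for gene family i.
        resp = []
        for k in counts:
            lik = [w[j] * comb(n_genomes, k) * p[j] ** k
                   * (1 - p[j]) ** (n_genomes - k) for j in range(n_comp)]
            tot = sum(lik)
            resp.append([l / tot for l in lik])
        # M-step: update mixing weights and detection probabilities.
        for j in range(n_comp):
            rj = sum(r[j] for r in resp)
            w[j] = rj / len(counts)
            p[j] = sum(r[j] * k for r, k in zip(resp, counts)) / (rj * n_genomes)
    return w, p

# Hypothetical detection counts for 12 gene families across 10 genomes:
# a near-core cluster (high counts) and a rare-gene cluster (low counts).
counts = [10, 10, 9, 10, 10, 1, 2, 1, 10, 9, 1, 2]
w, p = em_binomial_mixture(counts, n_genomes=10)
```

    The fitted components separate into a high-probability (core-like) group and a low-probability (rare) group; heterogeneous detection probabilities of this kind are what the paper reports for most species.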

  18. Dynamic prediction of cumulative incidence functions by direct binomial regression.

    Science.gov (United States)

    Grand, Mia K; de Witte, Theo J M; Putter, Hein

    2018-03-25

    In recent years there have been a series of advances in the field of dynamic prediction. Among those is the development of methods for dynamic prediction of the cumulative incidence function in a competing risks setting. These models enable the predictions to be updated as time progresses and more information becomes available; for example, when a patient comes back for a follow-up visit after completing a year of treatment, the risks of death and adverse events may have changed since treatment initiation. One approach to modeling the cumulative incidence function in competing risks is direct binomial regression, where right censoring of the event times is handled by inverse probability of censoring weights. We extend the approach by combining it with landmarking to enable dynamic prediction of the cumulative incidence function. The proposed models are very flexible, as they allow the covariates to have complex time-varying effects, and we illustrate how to investigate possible time-varying structures using Wald tests. The models are fitted using generalized estimating equations. The method is applied to bone marrow transplant data and the performance is investigated in a simulation study. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Confidence limits for parameters of Poisson and binomial distributions

    International Nuclear Information System (INIS)

    Arnett, L.M.

    1976-04-01

    The confidence limits for the frequency in a Poisson process and for the proportion of successes in a binomial process were calculated and tabulated for the situations in which the observed values of the frequency or proportion and an a priori distribution of these parameters are available. Methods are used that produce limits with exactly the stated confidence levels. The confidence interval [a, b] is calculated so that Pr[a ≤ λ ≤ b | c, μ] equals the stated confidence level, where c is the observed value of the parameter and μ is the a priori hypothesis of the distribution of this parameter. A Bayesian-type analysis is used. The intervals calculated are narrower than, and appreciably different from, results (known to be conservative) that are often used in problems of this type. Pearson and Hartley recognized the characteristics of their methods and contemplated that exact methods could someday be used. The calculation of the exact intervals requires involved numerical analyses readily implemented only on digital computers, which were not available to Pearson and Hartley. A Monte Carlo experiment was conducted to verify a selected interval from those calculated. This numerical experiment confirmed the results of the analytical methods and the prediction of Pearson and Hartley that their published tables give conservative results.
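
    The report's tables are not reproduced here, but the Bayesian-type calculation it describes can be sketched by numerical integration: normalize prior(λ) × Poisson likelihood on a grid and walk the posterior CDF to the two tail quantiles. The flat prior, observed count, and grid resolution below are illustrative choices, not the report's.

```python
from math import exp, factorial

def poisson_credible_interval(c: int, prior, level=0.95, lam_max=30.0, steps=30000):
    """Equal-tailed interval [a, b] with Pr(a <= lambda <= b | c, prior) = level,
    computed by normalising prior(lam) * Poisson likelihood on a grid."""
    dx = lam_max / steps
    grid = [(i + 0.5) * dx for i in range(steps)]
    post = [prior(lam) * lam ** c * exp(-lam) / factorial(c) for lam in grid]
    norm = sum(post) * dx
    post = [v / norm for v in post]
    # Walk the CDF to the (1-level)/2 and (1+level)/2 quantiles.
    tail = (1.0 - level) / 2.0
    cdf, a, b = 0.0, None, None
    for lam, v in zip(grid, post):
        cdf += v * dx
        if a is None and cdf >= tail:
            a = lam
        if b is None and cdf >= 1.0 - tail:
            b = lam
            break
    return a, b

# Flat prior as an illustrative a priori hypothesis; observed count c = 3.
a, b = poisson_credible_interval(c=3, prior=lambda lam: 1.0)
```

    With a flat prior the posterior is Gamma(c + 1, 1), so the interval can be checked against gamma quantiles; a sharper prior narrows the interval, which is the effect the report exploits.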

  20. Phase 1 sampling and analysis plan for the 304 Concretion Facility closure activities

    International Nuclear Information System (INIS)

    Adler, J.G.

    1994-01-01

    This document provides guidance for the initial (Phase 1) sampling and analysis activities associated with the proposed Resource Conservation and Recovery Act of 1976 (RCRA) clean closure of the 304 Concretion Facility. Over its service life, the 304 Concretion Facility housed the pilot plants associated with cladding uranium cores; was used to store engineering equipment and product chemicals; was used to treat low-level radioactive mixed waste, recyclable scrap uranium generated during nuclear fuel fabrication, and uranium-titanium alloy chips; and was used for the repackaging of spent halogenated solvents from the nuclear fuels manufacturing process. The strategy for clean closure of the 304 Concretion Facility is to decontaminate, sample (Phase 1 sampling), and evaluate results. If the evaluation indicates that a limited area requires additional decontamination for clean closure, the limited area will be decontaminated, resampled (Phase 2 sampling), and the results evaluated. If the evaluation indicates that the constituents of concern are below action levels, the facility will be clean closed; if the constituents of concern are present above action levels, the condition of the facility will be evaluated and appropriate action taken. There are a total of 37 sampling locations, comprising 12 concrete core, 1 concrete chip, 9 soil, 11 wipe, and 4 asphalt core sampling locations. Analysis for inorganics and volatile organics will be performed on the concrete core and soil samples. Separate concrete core samples will be required for the inorganic and volatile organic analyses (VOA). Analysis for inorganics only will be performed on the concrete chip, wipe, and asphalt samples.

  1. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    Science.gov (United States)

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise because the within-study correlation and between-study heterogeneity must be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having a closed-form expression of the likelihood function, and no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is based only on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration. Copyright © 2015 John Wiley & Sons, Ltd.

  2. UMTRA Project water sampling and analysis plan, Gunnison, Colorado. Revision 2

    International Nuclear Information System (INIS)

    1995-09-01

    Surface remedial action at the Gunnison Uranium Mill Tailings Remedial Action Project site began in 1992; completion is expected in 1995. Ground water and surface water will be sampled semiannually at the Gunnison processing site (GUN-01) and disposal site (GUN-08). Results of previous water sampling at the Gunnison processing site indicate that ground water in the alluvium is contaminated by the former uranium processing activities. Background ground water conditions have been established in the uppermost aquifer (Tertiary gravels) at the Gunnison disposal site. Semiannual water sampling is scheduled for the spring and fall. Water quality sampling is conducted at the processing site (1) to ensure protection of human health and the environment, (2) for ground water compliance monitoring during remedial action construction, and (3) to define the extent of contamination. At the processing site, the frequency and duration of sampling will be dependent upon the nature and extent of residual contamination and the compliance strategy chosen. The monitor well locations provide a representative distribution of sampling points to characterize ground water quality and ground water flow conditions in the vicinity of the sites. The list of analytes has been modified with time to reflect constituents that are related to uranium processing activities and the parameters needed for geochemical evaluation

  3. Sampling based motion planning with reachable volumes: Application to manipulators and closed chain systems

    KAUST Repository

    McMahon, Troy

    2014-09-01

    © 2014 IEEE. Reachable volumes are a geometric representation of the regions the joints of a robot can reach. They can be used to generate constraint satisfying samples for problems including complicated linkage robots (e.g. closed chains and graspers). They can also be used to assist robot operators and to help in robot design. We show that reachable volumes have an O(1) complexity in unconstrained problems as well as in many constrained problems. We also show that reachable volumes can be computed in linear time and that reachable volume samples can be generated in linear time in problems without constraints. We experimentally validate reachable volume sampling, both with and without constraints on end effectors and/or internal joints. We show that reachable volume samples are less likely to be invalid due to self-collisions, making reachable volume sampling significantly more efficient for higher dimensional problems. We also show that these samples are easier to connect than others, resulting in better connected roadmaps. We demonstrate that our method can be applied to 262-dof, multi-loop, and tree-like linkages including combinations of planar, prismatic and spherical joints. In contrast, existing methods either cannot be used for these problems or do not produce good quality solutions.

  4. On the revival of the negative binomial distribution in multiparticle production

    International Nuclear Information System (INIS)

    Ekspong, G.

    1990-01-01

    This paper is based on published and some unpublished material pertaining to the revival of interest in and success of applying the negative binomial distribution to multiparticle production since 1983. After a historically oriented introduction going farther back in time, the main part of the paper is devoted to an unpublished derivation of the negative binomial distribution based on empirical observations of forward-backward multiplicity correlations. Some physical processes leading to the negative binomial distribution are mentioned and some comments made on published criticisms

  5. Comparison of multiplicity distributions to the negative binomial distribution in muon-proton scattering

    International Nuclear Information System (INIS)

    Arneodo, M.; Ferrero, M.I.; Peroni, C.; Bee, C.P.; Bird, I.; Coughlan, J.; Sloan, T.; Braun, H.; Brueck, H.; Drees, J.; Edwards, A.; Krueger, J.; Montgomery, H.E.; Peschel, H.; Pietrzyk, U.; Poetsch, M.; Schneider, A.; Dreyer, T.; Ernst, T.; Haas, J.; Kabuss, E.M.; Landgraf, U.; Mohr, W.; Rith, K.; Schlagboehmer, A.; Schroeder, T.; Stier, H.E.; Wallucks, W.

    1987-01-01

    The multiplicity distributions of charged hadrons produced in deep inelastic muon-proton scattering at 280 GeV are analysed in various rapidity intervals, as a function of the total hadronic centre of mass energy W, ranging from 4 to 20 GeV. Multiplicity distributions for the backward and forward hemispheres are also analysed separately. The data can be well parameterized by negative binomial distributions, extending their range of applicability to the case of lepton-proton scattering. The energy and rapidity dependence of the parameters is presented, and a smooth transition from the negative binomial distribution via Poissonian to the ordinary binomial is observed. (orig.)
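
    The transition reported above can be reproduced numerically: a negative binomial with mean m and shape parameter k tends to a Poisson of the same mean as k grows. A minimal sketch with purely illustrative parameters:

```python
# Illustrative parameters only: negative binomial with mean m and shape k.
import numpy as np
from scipy.stats import nbinom, poisson

m = 5.0                  # mean multiplicity (hypothetical)
x = np.arange(0, 30)

def tv_gap(k):
    # Total-variation distance between NB(mean m, shape k) and Poisson(m);
    # scipy's (n, p) parameterization uses n = k, p = k / (k + m).
    p = k / (k + m)
    return 0.5 * np.abs(nbinom.pmf(x, k, p) - poisson.pmf(x, m)).sum()

gaps = {k: tv_gap(k) for k in (2.0, 20.0, 2000.0)}
print(gaps)  # the gap to the Poisson shrinks as the shape parameter k grows
```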

  6. Groundwater level monitoring sampling and analysis plan for the environmental monitoring plan at waste area grouping 6, Oak Ridge National Laboratory, Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This document is the Groundwater Level Monitoring Sampling and Analysis Plan (SAP) for Waste Area Grouping (WAG) 6 at Oak Ridge National Laboratory (ORNL). Note that this document is referred to as a SAP even though no sampling and analysis will be conducted. The term SAP is used for consistency. The procedures described herein are part of the Environmental Monitoring Plan (EMP) for WAG 6, which also includes monitoring tasks for seeps and springs, groundwater quality, surface water, and meteorological parameters. Separate SAPs are being issued concurrently to describe each of these monitoring programs. This SAP has been written for the use of the field personnel responsible for implementation of the EMP, with the intent that the field personnel will be able to take these documents to the field and quickly find the appropriate steps required to complete a specific task. In many cases, Field Operations Procedures (FOPs) will define the steps required for an activity. The FOPs for the EMP are referenced and briefly described in the relevant sections of the SAPs, and are contained within the FOP Manual. Both these documents (the SAP and the FOP Manual) will be available to personnel in the field.

  7. Groundwater level monitoring sampling and analysis plan for the environmental monitoring plan at waste area grouping 6, Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    1995-09-01

    This document is the Groundwater Level Monitoring Sampling and Analysis Plan (SAP) for Waste Area Grouping (WAG) 6 at Oak Ridge National Laboratory (ORNL). Note that this document is referred to as a SAP even though no sampling and analysis will be conducted. The term SAP is used for consistency. The procedures described herein are part of the Environmental Monitoring Plan (EMP) for WAG 6, which also includes monitoring tasks for seeps and springs, groundwater quality, surface water, and meteorological parameters. Separate SAPs are being issued concurrently to describe each of these monitoring programs. This SAP has been written for the use of the field personnel responsible for implementation of the EMP, with the intent that the field personnel will be able to take these documents to the field and quickly find the appropriate steps required to complete a specific task. In many cases, Field Operations Procedures (FOPs) will define the steps required for an activity. The FOPs for the EMP are referenced and briefly described in the relevant sections of the SAPs, and are contained within the FOP Manual. Both these documents (the SAP and the FOP Manual) will be available to personnel in the field

  8. Characterization of spatial distribution of Tetranychus urticae in peppermint in California and implication for improving sampling plan.

    Science.gov (United States)

    Rijal, Jhalendra P; Wilson, Rob; Godfrey, Larry D

    2016-02-01

    Twospotted spider mite, Tetranychus urticae Koch, is an important pest of peppermint in California, USA. Spider mite feeding on peppermint leaves causes physiological changes in the plant, which, coupled with favorable environmental conditions, can lead to increased mite infestations. Significant yield loss can occur in the absence of pest monitoring and timely management. Understanding the within-field spatial distribution of T. urticae is critical for the development of a reliable sampling plan. The study reported here aims to characterize the spatial distribution of mite infestation in four commercial peppermint fields in northern California using two spatial techniques: variograms and Spatial Analysis by Distance IndicEs (SADIE). Variogram analysis revealed strong evidence for a spatially dependent (aggregated) mite population on 13 of 17 sampling dates, with the physical distance of aggregation reaching a maximum of 7 m in peppermint fields. Using SADIE, 11 of 17 sampling dates showed an aggregated distribution pattern of mite infestation. Combining results from the variogram and SADIE analyses, spatial aggregation of T. urticae was evident in all four fields on all 17 sampling dates evaluated. Comparing spatial association using SADIE, ca. 62% of the total sampling pairs showed a positive association of mite spatial distribution patterns between two consecutive sampling dates, which indicates strong spatial and temporal stability of mite infestation in peppermint fields. These results are discussed in relation to the behavior of spider mite distribution within fields and the implications for improving sampling guidelines that are essential for effective pest monitoring and management.

  9. 40 CFR Appendix X to Part 86 - Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Sampling Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks X Appendix X to Part 86 Protection of... Plans for Selective Enforcement Auditing of Heavy-Duty Engines and Light-Duty Trucks Table 1—Sampling...

  10. Sampling and analysis plan for groundwater and surface water monitoring at the Y-12 Plant during calendar year 1995

    International Nuclear Information System (INIS)

    1994-10-01

    This plan provides a description of the groundwater and surface-water quality monitoring activities planned for calendar year (CY) 1995 at the Department of Energy Y-12 Plant. Included in this plan are the monitoring activities managed by the Y-12 Plant Health, Safety, Environment, and Accountability (HSEA) Organization through the Y-12 Plant Groundwater Protection Program (GWPP). Other groundwater and surface water monitoring activities (e.g., selected Environmental Restoration Program activities and National Pollutant Discharge Elimination System (NPDES) monitoring) not managed through the Y-12 Plant GWPP are not addressed in this report. Several monitoring programs will be implemented in three hydrogeologic regimes: the Bear Creek Hydrogeologic Regime (Bear Creek Regime), the Upper East Fork Poplar Creek Hydrogeologic Regime (East Fork Regime), and the Chestnut Ridge Hydrogeologic Regime (Chestnut Ridge Regime). The Bear Creek and East Fork regimes are located within Bear Creek Valley, and the Chestnut Ridge Regime is located south of the Y-12 Plant. For various reasons, modifications to the 1995 monitoring programs may be necessary during implementation. For example, changes in regulatory requirements may alter the parameters specified for selected wells, or wells could be added to or deleted from the monitoring network. All modifications to the monitoring programs will be approved by the Y-12 Plant GWPP manager and documented as addenda to this sampling and analysis plan

  11. Field Sampling Plan for the Operable Units 6-05 and 10-04 Remedial Action, Phase IV

    Energy Technology Data Exchange (ETDEWEB)

    R. Wells

    2006-11-14

    This Field Sampling Plan outlines the collection and analysis of samples in support of Phase IV of the Waste Area Group 10, Operable Units 6-05 and 10-04 remedial action. Phase IV addresses the remedial actions to areas with the potential for unexploded ordnance at the Idaho National Laboratory Site. These areas include portions of the Naval Proving Ground, the Arco High-Altitude Bombing Range, and the Twin Buttes Bombing Range. The remedial action consists of removal and disposal of ordnance by high-order detonation, followed by sampling to determine the extent, if any, of soil that might have been contaminated by the detonation activities associated with the disposal of ordnance during the Phase IV activities and explosives during the Phase II activities.

  12. USE OF SCALED SEMIVARIOGRAMS IN THE PLANNING SAMPLE OF SOIL CHEMICAL PROPERTIES IN SOUTHERN AMAZONAS, BRAZIL

    Directory of Open Access Journals (Sweden)

    Ivanildo Amorim de Oliveira

    2015-02-01

    The lack of information concerning the variability of soil properties has been a major concern of researchers in the Amazon region. Thus, the aim of this study was to evaluate the spatial variability of soil chemical properties and determine the minimal sampling density needed to characterize the variability of these properties in five environments located in the south of the State of Amazonas, Brazil. The five environments were archaeological dark earth (ADE), forest, pasture land, an agroforestry operation, and a sugarcane crop. Regular 70 × 70 m mesh grids were set up in these areas, with 64 sample points spaced 10 m apart. Soil samples were collected at the 0.0-0.1 m depth. The chemical properties pH in water, OM, P, K, Ca, Mg, H+Al, SB, CEC, and V were determined at these points. Data were analyzed by descriptive and geostatistical analyses. A large part of the data analyzed showed spatial dependence. Chemical properties were best fitted by the spherical model in almost all the environments evaluated, except for the sugarcane field, which was better fitted by the exponential model. The ADE and sugarcane areas had greater heterogeneity of soil chemical properties, showing greater ranges and requiring higher sampling density, whereas the forest and agroforestry areas had less variability of chemical properties.
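
    The semivariogram computation underlying such an analysis can be sketched as follows; the grid, spacing, and values below are synthetic stand-ins (echoing the 64-point, 10 m mesh), not the study's data:

```python
# Synthetic illustration: empirical semivariogram on a hypothetical 8 x 8
# grid with 10 m spacing; values are random stand-ins for, e.g., pH or OM.
import numpy as np

rng = np.random.default_rng(0)
xs, ys = np.meshgrid(np.arange(8) * 10.0, np.arange(8) * 10.0)
coords = np.column_stack([xs.ravel(), ys.ravel()])
values = rng.normal(size=coords.shape[0])

def empirical_semivariogram(coords, values, lags, tol=5.0):
    """gamma(h): half the mean squared difference over pairs ~h apart."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    return np.array([sq[(np.abs(d - h) <= tol) & (d > 0)].mean()
                     for h in lags])

gamma = empirical_semivariogram(coords, values, lags=[10.0, 20.0, 30.0])
print(gamma)  # for uncorrelated values, gamma stays flat near the variance
```

    Fitting a spherical or exponential model to such empirical gamma(h) values, as the study does, would be the next step.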

  13. 10 CFR Appendix B to Subpart F of... - Sampling Plan For Enforcement Testing

    Science.gov (United States)

    2010-01-01

    ... standard deviation (s 1) of the measured energy or water performance of the (n 1) units in the first sample as follows: ER18MR98.011 Step 4. Compute the standard error (SX 1) of the measured energy or water... the t-statistic has the value obtained in Step 5. Step 11(a). For an Energy Efficiency Standard...

  14. Risk indicators of oral health status among young adults aged 18 years analyzed by negative binomial regression.

    Science.gov (United States)

    Lu, Hai-Xia; Wong, May Chun Mei; Lo, Edward Chin Man; McGrath, Colman

    2013-08-19

    Limited information is available on the oral health status of young adults aged 18 years, and no such data exist for Hong Kong. The aims of this study were to investigate the oral health status and its risk indicators among young adults in Hong Kong using negative binomial regression. A survey was conducted on a representative sample of Hong Kong young adults aged 18 years. Clinical examinations were performed to assess oral health status using the DMFT index and the Community Periodontal Index (CPI) according to WHO criteria. Negative binomial regressions for the DMFT score and the number of sextants with healthy gums were performed to identify the risk indicators of oral health status. A total of 324 young adults were examined. The prevalence of dental caries experience among the subjects was 59% and the overall mean DMFT score was 1.4. Most subjects (95%) had a score of 2 as their highest CPI score. Negative binomial regression analyses revealed that subjects who had a dental visit within 3 years had significantly higher DMFT scores (IRR = 1.68, p < 0.001). Subjects who brushed their teeth more frequently (IRR = 1.93, p < 0.001) and those with better dental knowledge (IRR = 1.09, p = 0.002) had significantly more sextants with healthy gums. The dental caries experience of 18-year-old young adults in Hong Kong was not high, but their periodontal condition was unsatisfactory. Their oral health status was related to their dental visit behavior, oral hygiene habits, and oral health knowledge.

  15. Measured PET Data Characterization with the Negative Binomial Distribution Model.

    Science.gov (United States)

    Santarelli, Maria Filomena; Positano, Vincenzo; Landini, Luigi

    2017-01-01

    An accurate statistical model of PET measurements is a prerequisite for correct image reconstruction when using statistical image reconstruction algorithms, or when pre-filtering operations must be performed. Although radioactive decay follows a Poisson distribution, deviation from Poisson statistics occurs in projection data prior to reconstruction due to physical effects, measurement errors, and correction of scatter and random coincidences. Modelling projection data can aid in understanding the statistical nature of the data in order to develop efficient processing methods and to reduce noise. This paper outlines the statistical behaviour of measured emission data by evaluating the goodness of fit of the negative binomial (NB) distribution model to PET data for a wide range of emission activity values. An NB distribution model is characterized by the mean of the data and the dispersion parameter α that describes the deviation from Poisson statistics. Monte Carlo simulations were performed to evaluate: (a) the performance of the dispersion parameter α estimator, and (b) the goodness of fit of the NB model for a wide range of activity values. We focused on the effect produced by correction for random and scatter events in the projection (sinogram) domain, due to their importance in the quantitative analysis of PET data. The analysis developed herein allowed us to assess the accuracy of the NB distribution model in fitting corrected sinogram data, and to evaluate the sensitivity of the dispersion parameter α in quantifying deviation from Poisson statistics. The sinogram ROI-based analysis demonstrated that deviation of the measured data from Poisson statistics can be quantitatively characterized by the dispersion parameter α, under any noise conditions and corrections.
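
    The role of the dispersion parameter can be illustrated with a small simulation (synthetic counts, not PET data): for an NB model the variance is mean + α·mean², so a method-of-moments estimate recovers α, and α → 0 corresponds to pure Poisson statistics:

```python
# Simulated counts (not PET data): NB variance is mean + alpha * mean**2,
# so a method-of-moments estimate of the dispersion alpha is available.
import numpy as np

rng = np.random.default_rng(7)
mu_true, alpha_true = 50.0, 0.1

# Gamma-Poisson mixture draws NB counts with the intended mean and dispersion
lam = rng.gamma(shape=1.0 / alpha_true, scale=mu_true * alpha_true, size=50000)
counts = rng.poisson(lam)

m, v = counts.mean(), counts.var()
alpha_hat = (v - m) / m**2   # alpha -> 0 recovers pure Poisson statistics
print(alpha_hat)
```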

  16. On bounds in Poisson approximation for distributions of independent negative-binomial distributed random variables.

    Science.gov (United States)

    Hung, Tran Loc; Giang, Le Truong

    2016-01-01

    Using the Stein-Chen method some upper bounds in Poisson approximation for distributions of row-wise triangular arrays of independent negative-binomial distributed random variables are established in this note.

  17. Distribution-free Inference of Zero-inflated Binomial Data for Longitudinal Studies.

    Science.gov (United States)

    He, H; Wang, W J; Hu, J; Gallop, R; Crits-Christoph, P; Xia, Y L

    2015-10-01

    Count responses with structural zeros are very common in medical and psychosocial research, especially in alcohol and HIV research, and the zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models are widely used for modeling such outcomes. However, as alcohol drinking outcomes such as days of drinking are counts within a given period, their distributions are bounded above by an upper limit (the total days in the period) and thus inherently follow a binomial or zero-inflated binomial (ZIB) distribution, rather than a Poisson or ZIP distribution, in the presence of structural zeros. In this paper, we develop a new semiparametric approach for modeling ZIB-like count responses for cross-sectional as well as longitudinal data. We illustrate this approach with both simulated and real study data.
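
    The ZIB distribution itself is simple to write down: with probability π the response is a structural zero, otherwise it follows Binomial(n, p). A minimal sketch with illustrative parameters (not the paper's semiparametric estimator):

```python
# Minimal sketch with illustrative parameters: a zero-inflated binomial puts
# a structural zero with probability pi, otherwise Binomial(n, p).
import numpy as np
from scipy.stats import binom

def zib_pmf(k, n, p, pi):
    base = binom.pmf(k, n, p)
    return np.where(k == 0, pi + (1.0 - pi) * base, (1.0 - pi) * base)

# e.g. drinking days in a 30-day period with 40% structural abstainers
k = np.arange(31)
pmf = zib_pmf(k, n=30, p=0.25, pi=0.4)
print(pmf[0], pmf.sum())  # elevated mass at zero; total probability is 1
```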

  18. Difference of Sums Containing Products of Binomial Coefficients and Their Logarithms

    National Research Council Canada - National Science Library

    Miller, Allen R; Moskowitz, Ira S

    2005-01-01

    Properties of the difference of two sums containing products of binomial coefficients and their logarithms which arise in the application of Shannon's information theory to a certain class of covert channels are deduced...

  19. Difference of Sums Containing Products of Binomial Coefficients and their Logarithms

    National Research Council Canada - National Science Library

    Miller, Allen R; Moskowitz, Ira S

    2004-01-01

    Properties of the difference of two sums containing products of binomial coefficients and their logarithms which arise in the application of Shannon's information theory to a certain class of covert channels are deduced...

  20. Binomial Test Method for Determining Probability of Detection Capability for Fracture Critical Applications

    Science.gov (United States)

    Generazio, Edward R.

    2011-01-01

    The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that, for a minimum flaw size and all greater flaw sizes, there is 0.90 probability of detection with 95% confidence (90/95 POD). Directed design of experiments for probability of detection (DOEPOD) has been developed to provide an efficient and accurate methodology that yields estimates of POD and confidence bounds for both hit-miss and signal amplitude testing, where signal amplitudes are reduced to hit-miss data by applying a signal threshold. Directed DOEPOD uses a nonparametric approach for the analysis of inspection data that does not require any assumptions about the particular functional form of a POD function. The DOEPOD procedure identifies, for a given sample set, whether or not the minimum requirement of 0.90 probability of detection with 95% confidence is demonstrated for a minimum flaw size and for all greater flaw sizes (90/95 POD). The DOEPOD procedures are executed sequentially in order to minimize the number of samples needed to demonstrate that there is a 90/95 POD lower confidence bound at a given flaw size and that the POD is monotonic for flaw sizes exceeding that 90/95 POD flaw size. The conservativeness of the DOEPOD methodology results is discussed. Validated guidelines for binomial estimation of POD for fracture critical inspection are established.
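
    The binomial arithmetic behind the 90/95 criterion can be sketched with a one-sided Clopper-Pearson lower bound (our own illustration; DOEPOD itself is a sequential procedure and is not reproduced here):

```python
# Our own illustration of the binomial calculation behind 90/95 POD
# (DOEPOD itself is a sequential procedure and is not reproduced here).
from scipy.stats import beta

def pod_lower_bound(hits, trials, confidence=0.95):
    """One-sided Clopper-Pearson lower confidence bound on the POD."""
    if hits == 0:
        return 0.0
    return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

# 29 hits in 29 trials just clears a 0.90 lower bound at 95% confidence,
# while 28 of 28 falls slightly short.
print(pod_lower_bound(29, 29))  # ~0.902 (meets 90/95)
print(pod_lower_bound(28, 28))  # ~0.898 (falls short)
```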

  1. Sampling plan design and analysis for a low level radioactive waste disposal program

    International Nuclear Information System (INIS)

    Hassig, N.L.; Wanless, J.W.

    1989-01-01

    Low-level wastes that are candidates for BRC (below regulatory concern) disposal must be subjected to an extensive monitoring program to ensure the wastes meet (potential) bulk property and contamination concentration BRC criteria for disposal. This paper addresses the statistical implications of using various methods to verify BRC criteria. While surface and volumetric monitoring each have their advantages and disadvantages, a dual, sequential monitoring process is the preferred choice from a statistical reliability perspective. With dual monitoring, measurements on the contamination are verifiable, and sufficient to allow for a complete characterization of the wastes. As these characterizations become more reliable and stable, something less than 100% sampling may be possible for release of wastes for BRC disposal. This paper provides a survey of the issues involved in the selection of a monitoring and sampling program for the disposal of BRC wastes

  2. Household waste behaviours among a community sample in Iran: an application of the theory of planned behaviour.

    Science.gov (United States)

    Pakpour, Amir H; Zeidi, Isa Mohammadi; Emamjomeh, Mohammad Mahdi; Asefzadeh, Saeed; Pearson, Heidi

    2014-06-01

    Understanding the factors influencing recycling behaviour can lead to better and more effective recycling programs in a community. The goal of this study was to examine factors associated with household waste behaviours in the context of the theory of planned behaviour (TPB) among a community sample of Iranians that included data collection at time 1 and at follow-up one year later at time 2. Study participants were sampled from households under the coverage of eight urban health centers in the city of Qazvin. Of 2000 invited households, 1782 agreed to participate in the study. A self-reported questionnaire was used for assessing socio-demographic factors and the TPB constructs (i.e. attitude, subjective norms, perceived behavioural control, and intention). Furthermore, questions regarding moral obligation, self-identity, action planning, and past recycling behaviour were asked, creating an extended TPB. At time 2, participants were asked to complete a follow-up questionnaire on self-reported recycling behaviours. All TPB constructs had positive and significant correlations with each other. Recycling behaviour at time 1 (past behaviour) significantly related to household waste behaviour at time 2. The extended TPB explained 47% of the variance in household waste behaviour at time 2. Attitude, perceived behavioural control, intention, moral obligation, self-identity, action planning, and past recycling behaviour were significant predictors of household waste behaviour at time 2 in all models. The fact that the expanded TPB constructs significantly predicted household waste behaviours holds great promise for developing effective public campaigns and behaviour-changing interventions in a region where overall rates of household waste reduction behaviours are low. Our results indicate that educational materials which target moral obligation and action planning may be particularly effective. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Y-12 Plant Groundwater Protection Program: Groundwater and surface water sampling and analysis plan for Calendar Year 1998

    International Nuclear Information System (INIS)

    1997-09-01

    This plan provides a description of the groundwater and surface water quality monitoring activities planned for calendar year (CY) 1998 at the Department of Energy (DOE) Y-12 Plant. These monitoring activities are managed by the Y-12 Plant Environmental Compliance Organization through the Y-12 Plant Groundwater Protection Program (GWPP). Groundwater and surface water monitoring during CY 1998 will be performed in three hydrogeologic regimes at the Y-12 Plant: the Bear Creek Hydrogeologic Regime (Bear Creek Regime), the Upper East Fork Poplar Creek Hydrogeologic Regime (East Fork Regime), and the Chestnut Ridge Hydrogeologic Regime (Chestnut Ridge Regime). The Bear Creek and East Fork regimes are located within Bear Creek Valley, and the Chestnut Ridge Regime is located south of the Y-12 Plant. Groundwater and surface water monitoring will be performed during CY 1998 to comply with: (1) requirements specified in Resource Conservation and Recovery Act (RCRA) post-closure permits regarding RCRA corrective action monitoring and RCRA detection monitoring; (2) Tennessee Department of Environment and Conservation regulations governing detection monitoring at nonhazardous solid waste management facilities; and (3) DOE Order 5400.1 surveillance monitoring and exit pathway monitoring. Data from some of the sampling locations in each regime will be used to meet the requirements of more than one of the monitoring drivers listed above. Modifications to the CY 1998 monitoring program may be necessary during implementation. For example, changes in regulatory requirements may alter the parameters specified for selected monitoring wells, or wells could be removed from the planned monitoring network. All modifications to the monitoring program will be approved by the Y-12 Plant GWPP manager and documented as addenda to this sampling and analysis plan

  4. Solvent suppression using phase-modulated binomial-like sequences and applications to diffusion measurements

    Science.gov (United States)

    Zheng, Gang; Torres, Allan M.; Price, William S.

    2008-09-01

    Two phase-modulated binomial-like π pulses have been developed by simultaneously optimizing pulse durations and phases. In combination with excitation sculpting, both of the new binomial-like sequences outperform the well-known 3-9-19 sequence in selectivity and inversion width. The new sequences provide similar selectivity and inversion width to the W5 sequence but with significantly shorter sequence durations. When used in PGSTE-WATERGATE, they afford highly selective solvent suppression in diffusion experiments.

  5. Comparison of the Binomial Method and the Black-Scholes Method for Option Pricing

    Directory of Open Access Journals (Sweden)

    Surya Amami Pramuditya

    2016-04-01

    An option is a contract between a holder (buyer) and a writer (seller) in which the writer gives the holder the right, but not the obligation, to buy or sell an asset at a specified price (the strike or exercise price) and at a specified future time (the expiry date or maturity). There are several ways to determine the price of an option, including the Black-Scholes method and the binomial method. The binomial method is based on a model of stock price movement that divides the time interval [0, T] into n subintervals of equal length, while the Black-Scholes method models the stock price movement as a stochastic process. As the number of time partitions n in the binomial method grows, the binomial option value converges to the Black-Scholes option value. Key words: options, Binomial, Black-Scholes
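
    The convergence described in this abstract is easy to demonstrate with one standard binomial scheme (Cox-Ross-Rubinstein; the paper may use a variant), priced against the Black-Scholes formula for a European call. Parameters below are illustrative:

```python
# Illustrative parameters; Cox-Ross-Rubinstein binomial tree vs Black-Scholes.
from math import erf, exp, log, sqrt

def bs_call(S, K, r, sigma, T):
    # Black-Scholes price of a European call
    N = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

def binomial_call(S, K, r, sigma, T, n):
    # n-step binomial tree price of the same call
    dt = T / n
    u = exp(sigma * sqrt(dt))
    d = 1.0 / u
    p = (exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = exp(-r * dt)
    v = [max(S * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    for step in range(n, 0, -1):      # backward induction through the tree
        v = [disc * (p * v[j + 1] + (1 - p) * v[j]) for j in range(step)]
    return v[0]

S, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
exact = bs_call(S, K, r, sigma, T)
for n in (10, 100, 1000):
    print(n, abs(binomial_call(S, K, r, sigma, T, n) - exact))
```

    The printed gap shrinks roughly like 1/n, the convergence the abstract refers to.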

  6. Biased binomial assessment of cross-validated estimation of classification accuracies illustrated in diagnosis predictions

    Directory of Open Access Journals (Sweden)

    Quentin Noirhomme

    2014-01-01

    Multivariate classification is used in neuroimaging studies to infer brain activation or in medical applications to infer diagnosis. Their results are often assessed through either a binomial or a permutation test. Here, we simulated classification results of generated random data to assess the influence of the cross-validation scheme on the significance of results. Distributions built from classification of random data with cross-validation did not follow the binomial distribution. The binomial test is therefore not adapted. On the contrary, the permutation test was unaffected by the cross-validation scheme. The influence of the cross-validation was further illustrated on real data from a brain–computer interface experiment in patients with disorders of consciousness and from an fMRI study on patients with Parkinson disease. Three out of 16 patients with disorders of consciousness had significant accuracy on binomial testing, but only one showed significant accuracy using permutation testing. In the fMRI experiment, the mental imagery of gait could discriminate significantly between idiopathic Parkinson's disease patients and healthy subjects according to the permutation test but not according to the binomial test. Hence, binomial testing could lead to biased estimation of significance and false positive or negative results. In our view, permutation testing is thus recommended for clinical application of classification with cross-validation.

  7. Biased binomial assessment of cross-validated estimation of classification accuracies illustrated in diagnosis predictions.

    Science.gov (United States)

    Noirhomme, Quentin; Lesenfants, Damien; Gomez, Francisco; Soddu, Andrea; Schrouff, Jessica; Garraux, Gaëtan; Luxen, André; Phillips, Christophe; Laureys, Steven

    2014-01-01

    Multivariate classification is used in neuroimaging studies to infer brain activation or in medical applications to infer diagnosis. Their results are often assessed through either a binomial or a permutation test. Here, we simulated classification results of generated random data to assess the influence of the cross-validation scheme on the significance of results. Distributions built from classification of random data with cross-validation did not follow the binomial distribution. The binomial test is therefore not appropriate. By contrast, the permutation test was unaffected by the cross-validation scheme. The influence of the cross-validation was further illustrated on real data from a brain-computer interface experiment in patients with disorders of consciousness and from an fMRI study on patients with Parkinson disease. Three out of 16 patients with disorders of consciousness had significant accuracy on binomial testing, but only one showed significant accuracy using permutation testing. In the fMRI experiment, the mental imagery of gait could discriminate significantly between idiopathic Parkinson's disease patients and healthy subjects according to the permutation test but not according to the binomial test. Hence, binomial testing could lead to biased estimation of significance and false positive or negative results. In our view, permutation testing is thus recommended for clinical application of classification with cross-validation.

  8. Sampling plan for using a motorized penetrometer in soil compaction evaluation

    Directory of Open Access Journals (Sweden)

    Lindolfo Storck

    2016-03-01

    Full Text Available ABSTRACT This study aimed to estimate the size of blocks of observations of resistance to penetration, obtained with a motorized digital penetrometer, and the number of blocks needed for a semi-amplitude of the confidence interval between 5 and 20% of the mean penetration resistance, for different soil depth ranges and cone diameters. Data were collected in two contrasting plots of a crop-livestock integration experiment located in Abelardo Luz, SC, Brazil. Ten blocks were delimited and the resistance to penetration was determined at 20 points spaced 20 cm apart, using a motorized digital soil penetrometer. To estimate the mean resistance to penetration with a semi-amplitude of the confidence interval equal to 10% of the mean (1 - p = 0.95), 12 blocks of four points per experimental plot should be used. Alternatively, 20 random points may be sampled to estimate the mean penetration resistance for a semi-amplitude of the confidence interval of 10% of the mean (1 - p = 0.95). The sample size for the 0-10 cm layer is larger than for the deeper layers (0-20, 0-30 and 0-40 cm) and smaller for cones of larger diameter.
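
    Sample sizes of this kind follow from the standard relation between sample size and confidence-interval half-width. The sketch below is our reconstruction, not the authors' code: it uses a normal approximation in place of the t quantile and computes the number of points n needed so that the half-width is D percent of the mean, n ≈ (z·CV/D)^2, with CV the coefficient of variation in percent.

```python
# Hedged sketch (ours): sample size for a target confidence-interval
# half-width expressed as a percentage of the mean, using the normal
# approximation to the t quantile. Example values are illustrative.
import math
from statistics import NormalDist

def sample_size(cv_percent, d_percent, conf=0.95):
    """Points needed so the CI half-width is d_percent of the mean."""
    z = NormalDist().inv_cdf(0.5 + conf / 2)  # two-sided normal quantile
    return math.ceil((z * cv_percent / d_percent) ** 2)

# e.g. a coefficient of variation of 30% and a half-width of 10% of the mean:
print(sample_size(30, 10))
```

    Halving the target half-width roughly quadruples the required number of points, which is why the shallow, more variable 0-10 cm layer demands larger samples.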

  9. Learning about Sampling with Boxer.

    Science.gov (United States)

    Picciotto, Henri; Ploger, Don

    1991-01-01

    Described is an introductory probability and statistics class focused on teaching the concepts of sampling and binomial distributions through a strategy based on teacher and student generated simulation using the Boxer computer language. The value of integrating programing with teaching subject matter is demonstrated, and sample student work is…
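
    A minimal Python analogue of the kind of classroom simulation described (hypothetical; the class itself used the Boxer language): draw many samples of 10 coin flips and compare the empirical distribution of heads with the Binomial(10, 0.5) probabilities.

```python
# Sketch (ours, not the article's Boxer code): empirical vs. theoretical
# binomial distribution from repeated sampling.
import random
from collections import Counter
from math import comb

random.seed(1)
n, p, trials = 10, 0.5, 20000
counts = Counter(sum(random.random() < p for _ in range(n)) for _ in range(trials))

for k in range(n + 1):
    empirical = counts[k] / trials
    theoretical = comb(n, k) * p**k * (1 - p)**(n - k)
    print(k, round(empirical, 3), round(theoretical, 3))
```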

  10. Automatic Motion Generation for Robotic Milling Optimizing Stiffness with Sample-Based Planning

    Directory of Open Access Journals (Sweden)

    Julian Ricardo Diaz Posada

    2017-01-01

    Full Text Available Optimal and intuitive robotic machining is still a challenge. One of the main reasons for this is the lack of robot stiffness, which also depends on the robot's positioning in Cartesian space. To make up for this deficiency, and with the aim of increasing robot machining accuracy, this contribution describes a solution approach for optimizing the stiffness over a desired milling path using the free degree of freedom of the machining process. The optimal motion is computed based on the semantic and mathematical interpretation of the manufacturing process, modeled in terms of its components: product, process and resource; and by automatically configuring a sample-based motion problem and the transition-based rapidly-exploring random tree (T-RRT) algorithm for computing an optimal motion. The approach is simulated in CAM software for a machining path, demonstrating its functionality and outlining future potential for optimal motion generation in robotic machining processes.

  11. [Evaluation of estimation of prevalence ratio using Bayesian log-binomial regression model].

    Science.gov (United States)

    Gao, W L; Lin, H; Liu, X N; Ren, X W; Li, J S; Shen, X P; Zhu, S L

    2017-03-10

    To evaluate the estimation of the prevalence ratio (PR) by a Bayesian log-binomial regression model and its application, we estimated the PR of medical care-seeking prevalence relative to caregivers' recognition of risk signs of diarrhea in their infants, using a Bayesian log-binomial regression model in the OpenBUGS software. The results showed that caregivers' recognition of infants' risk signs of diarrhea was significantly associated with a 13% increase in medical care-seeking. We also compared the point and interval estimates of the PR, and the convergence of three models (model 1: not adjusting for covariates; model 2: adjusting for duration of caregivers' education; model 3: adjusting for distance between village and township and child age in months, based on model 2), between the Bayesian log-binomial regression model and the conventional log-binomial regression model. All three Bayesian log-binomial regression models converged, with estimated PRs of 1.130 (95% CI: 1.005-1.265), 1.128 (95% CI: 1.001-1.264) and 1.132 (95% CI: 1.004-1.267), respectively. Conventional log-binomial regression models 1 and 2 converged, with PRs of 1.130 (95% CI: 1.055-1.206) and 1.126 (95% CI: 1.051-1.203), respectively, but model 3 did not converge, so the COPY method was used to estimate the PR, which was 1.125 (95% CI: 1.051-1.200). The point and interval estimates of the PRs from the three Bayesian log-binomial regression models differed only slightly from those of the conventional log-binomial regression model, and the two approaches were highly consistent in estimating the PR. Therefore, the Bayesian log-binomial regression model can effectively estimate the PR with fewer convergence problems and offers advantages over the conventional log-binomial regression model.
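
    A minimal numerical illustration (ours, not the paper's OpenBUGS model; the 2x2 counts are hypothetical and chosen to give a PR near the reported 1.13) of why a log-binomial model estimates a prevalence ratio: with a single binary exposure, exp(beta) in log(p) = alpha + beta*x is exactly p1/p0.

```python
# Sketch (ours): for one binary covariate, the saturated log-binomial fit
# gives exp(beta) = p1/p0, i.e. the prevalence ratio. Counts are invented.
import math

# Hypothetical 2x2 data: (events, total) among unexposed and exposed.
events0, n0 = 30, 200   # caregivers who did NOT recognize risk signs
events1, n1 = 50, 295   # caregivers who DID recognize risk signs

p0, p1 = events0 / n0, events1 / n1
alpha = math.log(p0)              # intercept of the log-binomial fit
beta = math.log(p1) - math.log(p0)
pr = math.exp(beta)               # prevalence ratio
print(round(pr, 3))
```

    Unlike logistic regression, whose exp(beta) is an odds ratio, the log link makes exp(beta) directly interpretable as the ratio of prevalences, which is why log-binomial models are preferred for common outcomes.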

  12. Y-12 Groundwater Protection Program Groundwater and Surface Water Sampling and Analysis Plan For Calendar Year 2009

    Energy Technology Data Exchange (ETDEWEB)

    Elvado Environmental LLC

    2008-12-01

    This plan provides a description of the groundwater and surface water quality monitoring activities planned for calendar year (CY) 2009 at the U.S. Department of Energy (DOE) Y-12 National Security Complex (Y-12) that will be managed by the Y-12 Groundwater Protection Program (GWPP). Groundwater and surface water monitoring performed by the GWPP during CY 2009 will be in accordance with DOE Order 540.1 requirements and the following goals: (1) to protect the worker, the public, and the environment; (2) to maintain surveillance of existing and potential groundwater contamination sources; (3) to provide for the early detection of groundwater contamination and determine the quality of groundwater and surface water where contaminants are most likely to migrate beyond the Oak Ridge Reservation property line; (4) to identify and characterize long-term trends in groundwater quality at Y-12; and (5) to provide data to support decisions concerning the management and protection of groundwater resources. Groundwater and surface water monitoring during CY 2009 will be performed primarily in three hydrogeologic regimes at Y-12: the Bear Creek Hydrogeologic Regime (Bear Creek Regime), the Upper East Fork Poplar Creek Hydrogeologic Regime (East Fork Regime), and the Chestnut Ridge Hydrogeologic Regime (Chestnut Ridge Regime). The Bear Creek and East Fork regimes are located in Bear Creek Valley, and the Chestnut Ridge Regime is located south of Y-12 (Figure A.1). Additional surface water monitoring will be performed north of Pine Ridge, along the boundary of the Oak Ridge Reservation. Modifications to the CY 2009 monitoring program may be necessary during implementation. Changes in programmatic requirements may alter the analytes specified for selected monitoring wells or may add or remove wells from the planned monitoring network. All modifications to the monitoring program will be approved by the Y-12 GWPP manager and documented as addenda to this sampling and analysis plan

  13. Y-12 Groundwater Protection Program Groundwater And Surface Water Sampling And Analysis Plan For Calendar Year 2008

    Energy Technology Data Exchange (ETDEWEB)

    Elvado Environmental LLC

    2007-09-01

    This plan provides a description of the groundwater and surface water quality monitoring activities planned for calendar year (CY) 2008 at the U.S. Department of Energy (DOE) Y-12 National Security Complex (Y-12) that will be managed by the Y-12 Groundwater Protection Program (GWPP). Groundwater and surface water monitoring performed by the GWPP during CY 2008 will be in accordance with DOE Order 540.1 requirements and the following goals: (1) to protect the worker, the public, and the environment; (2) to maintain surveillance of existing and potential groundwater contamination sources; (3) to provide for the early detection of groundwater contamination and determine the quality of groundwater and surface water where contaminants are most likely to migrate beyond the Oak Ridge Reservation property line; (4) to identify and characterize long-term trends in groundwater quality at Y-12; and (5) to provide data to support decisions concerning the management and protection of groundwater resources. Groundwater and surface water monitoring during CY 2008 will be performed primarily in three hydrogeologic regimes at Y-12: the Bear Creek Hydrogeologic Regime (Bear Creek Regime), the Upper East Fork Poplar Creek Hydrogeologic Regime (East Fork Regime), and the Chestnut Ridge Hydrogeologic Regime (Chestnut Ridge Regime). The Bear Creek and East Fork regimes are located in Bear Creek Valley, and the Chestnut Ridge Regime is located south of Y-12 (Figure A.1). Additional surface water monitoring will be performed north of Pine Ridge, along the boundary of the Oak Ridge Reservation (Figure A.1). Modifications to the CY 2008 monitoring program may be necessary during implementation. Changes in programmatic requirements may alter the analytes specified for selected monitoring wells or may add or remove wells from the planned monitoring network. All modifications to the monitoring program will be approved by the Y-12 GWPP manager and documented as addenda to this sampling and analysis plan

  14. Beta-Binomial Model for the Detection of Rare Mutations in Pooled Next-Generation Sequencing Experiments.

    Science.gov (United States)

    Jakaitiene, Audrone; Avino, Mariano; Guarracino, Mario Rosario

    2017-04-01

    Despite diminishing costs, next-generation sequencing (NGS) still remains expensive for studies with a large number of individuals. To save costs, sequencing the genomes of pools containing multiple samples can be used. Currently, many software tools are available for the detection of single-nucleotide polymorphisms (SNPs). Sensitivity and specificity depend on the model used and the data analyzed, indicating that all tools have room for improvement. We use a beta-binomial model to detect rare mutations in untagged pooled NGS experiments. We propose a multireference framework for pooled data with the ability to be specific down to two patients affected by neuromuscular disorders (NMD). We assessed the results by comparison with The Genome Analysis Toolkit (GATK), CRISP, SNVer, and FreeBayes. Our results show that the multireference approach applying the beta-binomial model is accurate in predicting rare mutations at a 0.01 fraction. Finally, we explored the concordance of mutations between the model and the other software, checking their involvement in any NMD-related gene. We detected seven novel SNPs, for which the functional analysis produced enriched terms related to locomotion and musculature.
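
    To make the modeling idea concrete, here is a hedged sketch (ours, not the paper's pipeline): a beta-binomial upper-tail test of whether the alternate-allele read count at a site in a pool exceeds what an overdispersed sequencing-error model predicts. All parameter values (read depth, error rate, overdispersion) are invented.

```python
# Sketch (ours): beta-binomial test for rare variants in pooled reads.
# The beta-binomial arises from Binomial(n, p) with p ~ Beta(a, b), and its
# extra-binomial variance models site-to-site error-rate fluctuation.
import math

def betabinom_pmf(k, n, a, b):
    """Beta-binomial pmf computed via log-gamma for numerical stability."""
    logB = lambda x, y: math.lgamma(x) + math.lgamma(y) - math.lgamma(x + y)
    logC = math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
    return math.exp(logC + logB(k + a, n - k + b) - logB(a, b))

def upper_tail(k, n, a, b):
    """P(X >= k): p-value for an observed alternate-allele count k."""
    return sum(betabinom_pmf(i, n, a, b) for i in range(k, n + 1))

# Error model: mean error rate a/(a+b) = 1% with overdispersion (a=2, b=198).
n_reads = 500
p_err_site = upper_tail(4, n_reads, 2, 198)   # modest count: not surprising
p_var_site = upper_tail(40, n_reads, 2, 198)  # 8% alternate reads: a candidate
print(p_err_site, p_var_site)
```

    The overdispersion parameters (a, b) would in practice be estimated from error-only sites; the heavier-than-binomial tail is what keeps false positives down at noisy positions.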

  15. Technical management plan for sample generation, analysis, and data review for Phase 2 of the Clinch River Environmental Restoration Program

    International Nuclear Information System (INIS)

    Brandt, C.C.; Benson, S.B.; Beeler, D.A.

    1994-03-01

    The Clinch River Remedial Investigation (CRRI) is designed to address the transport, fate, and distribution of waterborne contaminants (radionuclides, metals, and organic compounds) released from the US Department of Energy's (DOE's) Oak Ridge Reservation (ORR) and to assess potential risks to human health and the environment associated with these contaminants. The remedial investigation is entering Phase 2, which has the following items as its objectives: define the nature and extent of the contamination in areas downstream from the DOE ORR, evaluate the human health and ecological risks posed by these contaminants, and perform preliminary identification and evaluation of potential remediation alternatives. This plan describes the requirements, responsibilities, and roles of personnel during sampling, analysis, and data review for the Clinch River Environmental Restoration Program (CR-ERP). The purpose of the plan is to formalize the process for obtaining analytical services, tracking sampling and analysis documentation, and assessing the overall quality of the CR-ERP data collection program to ensure that it will provide the necessary building blocks for the program decision-making process

  16. Y-12 Plant Groundwater Protection Program Groundwater and Surface Water sampling and Analysis Plan for Calendar Year 2000

    International Nuclear Information System (INIS)

    1999-01-01

    This plan provides a description of the groundwater and surface water quality monitoring activities planned for calendar year (CY) 2000 at the U.S. Department of Energy (DOE) Y-12 Plant that will be managed by the Y-12 Plant Groundwater Protection Program (GWPP). Groundwater and surface water monitoring during CY 2000 will be performed in three hydrogeologic regimes at the Y-12 Plant: the Bear Creek Hydrogeologic Regime (Bear Creek Regime), the Upper East Fork Poplar Creek Hydrogeologic Regime (East Fork Regime), and the Chestnut Ridge Hydrogeologic Regime (Chestnut Ridge Regime). The Bear Creek and East Fork regimes are located in Bear Creek Valley, and the Chestnut Ridge Regime is located south of the Y-12 Plant (Figure 1). Groundwater and surface water monitoring performed under the auspices of the Y-12 Plant GWPP during CY 2000 will comply with: Tennessee Department of Environment and Conservation regulations governing detection monitoring at nonhazardous Solid Waste Disposal Facilities (SWDF); and DOE Order 5400.1 surveillance monitoring and exit pathway/perimeter monitoring. Some of the data collected for these monitoring drivers also will be used to meet monitoring requirements of the Integrated Water Quality Program, which is managed by Bechtel Jacobs Company LLC. Data from five wells that are monitored for SWDF purposes in the Chestnut Ridge Regime will be used to comply with requirements specified in the Resource Conservation and Recovery Act post closure permit regarding corrective action monitoring. Modifications to the CY 2000 monitoring program may be necessary during implementation. Changes in regulatory or programmatic requirements may alter the analytes specified for selected monitoring wells, or wells could be added or removed from the planned monitoring network. All modifications to the monitoring program will be approved by the Y-12 Plant GWPP manager and documented as addenda to this sampling and analysis plan

  17. Groundwater quality sampling and analysis plan for environmental monitoring in Waste Area Grouping 6 at Oak Ridge National Laboratory, Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    1994-03-01

    This Sampling and Analysis Plan addresses groundwater quality sampling and analysis activities that will be conducted in support of the Environmental Monitoring Plan for Waste Area Grouping (WAG) 6. WAG 6 is a shallow-burial land disposal facility for low-level radioactive waste at the Oak Ridge National Laboratory, a research facility owned by the US Department of Energy and managed by Martin Marietta Energy Systems, Inc. (Energy Systems). Groundwater sampling will be conducted by Energy Systems at 45 wells within WAG 6. The samples will be analyzed for various organic, inorganic, and radiological parameters. The information derived from the groundwater quality monitoring, sampling, and analysis will aid in evaluating relative risk associated with contaminants migrating off-WAG, and also will fulfill Resource Conservation and Recovery Act (RCRA) interim permit monitoring requirements. The sampling steps described in this plan are consistent with the steps that have previously been followed by Energy Systems when conducting RCRA sampling.

  18. Groundwater quality sampling and analysis plan for environmental monitoring in Waste Area Grouping 6 at Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    1994-03-01

    This Sampling and Analysis Plan addresses groundwater quality sampling and analysis activities that will be conducted in support of the Environmental Monitoring Plan for Waste Area Grouping (WAG) 6. WAG 6 is a shallow-burial land disposal facility for low-level radioactive waste at the Oak Ridge National Laboratory, a research facility owned by the US Department of Energy and managed by Martin Marietta Energy Systems, Inc. (Energy Systems). Groundwater sampling will be conducted by Energy Systems at 45 wells within WAG 6. The samples will be analyzed for various organic, inorganic, and radiological parameters. The information derived from the groundwater quality monitoring, sampling, and analysis will aid in evaluating relative risk associated with contaminants migrating off-WAG, and also will fulfill Resource Conservation and Recovery Act (RCRA) interim permit monitoring requirements. The sampling steps described in this plan are consistent with the steps that have previously been followed by Energy Systems when conducting RCRA sampling

  19. Groundwater Quality Sampling and Analysis Plan for Environmental Monitoring Waste Area Grouping 6 at Oak Ridge National Laboratory. Environmental Restoration Program

    International Nuclear Information System (INIS)

    1995-09-01

    This Sampling and Analysis Plan addresses groundwater quality sampling and analysis activities that will be conducted in support of the Environmental Monitoring Plan for Waste Area Grouping (WAG) 6. WAG 6 is a shallow-burial land disposal facility for low-level radioactive waste at the Oak Ridge National Laboratory, a research facility owned by the US Department of Energy and managed by Martin Marietta Energy Systems, Inc. (Energy Systems). Groundwater sampling will be conducted by Energy Systems at 45 wells within WAG 6. The samples will be analyzed for various organic, inorganic, and radiological parameters. The information derived from the groundwater quality monitoring, sampling, and analysis will aid in evaluating relative risk associated with contaminants migrating off-WAG, and also will fulfill Resource Conservation and Recovery Act (RCRA) interim permit monitoring requirements. The sampling steps described in this plan are consistent with the steps that have previously been followed by Energy Systems when conducting RCRA sampling

  20. Groundwater Quality Sampling and Analysis Plan for Environmental Monitoring Waste Area Grouping 6 at Oak Ridge National Laboratory. Environmental Restoration Program

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This Sampling and Analysis Plan addresses groundwater quality sampling and analysis activities that will be conducted in support of the Environmental Monitoring Plan for Waste Area Grouping (WAG) 6. WAG 6 is a shallow-burial land disposal facility for low-level radioactive waste at the Oak Ridge National Laboratory, a research facility owned by the US Department of Energy and managed by Martin Marietta Energy Systems, Inc. (Energy Systems). Groundwater sampling will be conducted by Energy Systems at 45 wells within WAG 6. The samples will be analyzed for various organic, inorganic, and radiological parameters. The information derived from the groundwater quality monitoring, sampling, and analysis will aid in evaluating relative risk associated with contaminants migrating off-WAG, and also will fulfill Resource Conservation and Recovery Act (RCRA) interim permit monitoring requirements. The sampling steps described in this plan are consistent with the steps that have previously been followed by Energy Systems when conducting RCRA sampling.

  1. UMTRA project water sampling and analysis plan, Old and New Rifle, Colorado

    International Nuclear Information System (INIS)

    1994-07-01

    Surface remedial action at the Rifle, Colorado, Uranium Mill Tailings Remedial Action (UMTRA) Project site began in the spring of 1992. Results of water sampling at the Old and New Rifle processing sites for recent years indicate that ground water contamination occurs in the shallow unconfined alluvial aquifer (the uppermost aquifer) and less extensively in the underlying Wasatch Formation. Uranium and sulfate continue to exceed background ground water concentrations and/or maximum concentration limits at and downgradient from the former processing sites. These constituents provide the best indication of changes in contaminant distribution. Contamination in the uppermost (alluvial) aquifer at New Rifle extends a minimum of approximately 5,000 feet (ft) (1,524 meters [m]) downgradient. At Old Rifle, the extent of contamination in the alluvial aquifer is much less (a minimum of approximately 1,000 ft [305 m]), partially due to differences in hydrologic regime. For example, the Old Rifle site lies in a relatively narrow alluvial floodplain; the New Rifle site lies in a broad floodplain. Data gathering for the Rifle baseline risk assessment is under way. The purpose of this effort is to determine with greater precision the background ground water quality and extent of ground water contamination at the processing sites. Historical surface water quality indicates that the Colorado River has not been affected by uranium processing activities. No compliance monitoring of the Estes Gulch disposal cell has been proposed, because ground water in the underlying Wasatch Formation is limited use (Class III) ground water and because the disposal cell is hydrogeologically isolated from the uppermost aquifer

  2. Y-12 Groundwater Protection Program Groundwater And Surface Water Sampling And Analysis Plan For Calendar Year 2011

    Energy Technology Data Exchange (ETDEWEB)

    Elvado Environmental LLC

    2010-12-01

    This plan provides a description of the groundwater and surface water quality monitoring activities planned for calendar year (CY) 2011 at the U.S. Department of Energy (DOE) Y-12 National Security Complex (Y-12) that will be managed by the Y-12 Groundwater Protection Program (GWPP). Groundwater and surface water monitoring performed by the GWPP during CY 2011 will be in accordance with requirements of DOE Order 540.1A and the following goals: (1) to protect the worker, the public, and the environment; (2) to maintain surveillance of existing and potential groundwater contamination sources; (3) to provide for the early detection of groundwater contamination and determine the quality of groundwater and surface water where contaminants are most likely to migrate beyond the Oak Ridge Reservation property line; (4) to identify and characterize long-term trends in groundwater quality at Y-12; and (5) to provide data to support decisions concerning the management and protection of groundwater resources. Groundwater and surface water monitoring during CY 2011 will be performed primarily in three hydrogeologic regimes at Y-12: the Bear Creek Hydrogeologic Regime (Bear Creek Regime), the Upper East Fork Poplar Creek Hydrogeologic Regime (East Fork Regime), and the Chestnut Ridge Hydrogeologic Regime (Chestnut Ridge Regime). The Bear Creek and East Fork regimes are located in Bear Creek Valley and the Chestnut Ridge Regime is located south of Y-12 (Figure A.1). Additional surface water monitoring will be performed north of Pine Ridge along the boundary of the Oak Ridge Reservation. Modifications to the CY 2011 monitoring program may be necessary during implementation. Changes in programmatic requirements may alter the analytes specified for selected monitoring wells or may add or remove wells from the planned monitoring network. All modifications to the monitoring program will be approved by the Y-12 GWPP manager and documented as addenda to this sampling and analysis plan

  3. Y-12 Groundwater Protection Program Groundwater And Surface Water Sampling And Analysis Plan For Calendar Year 2010

    Energy Technology Data Exchange (ETDEWEB)

    Elvado Environmental LLC

    2009-09-01

    This plan provides a description of the groundwater and surface water quality monitoring activities planned for calendar year (CY) 2010 at the U.S. Department of Energy (DOE) Y-12 National Security Complex (Y-12) that will be managed by the Y-12 Groundwater Protection Program (GWPP). Groundwater and surface water monitoring performed by the GWPP during CY 2010 will be in accordance with requirements of DOE Order 540.1A and the following goals: (1) to protect the worker, the public, and the environment; (2) to maintain surveillance of existing and potential groundwater contamination sources; (3) to provide for the early detection of groundwater contamination and determine the quality of groundwater and surface water where contaminants are most likely to migrate beyond the Oak Ridge Reservation property line; (4) to identify and characterize long-term trends in groundwater quality at Y-12; and (5) to provide data to support decisions concerning the management and protection of groundwater resources. Groundwater and surface water monitoring during CY 2010 will be performed primarily in three hydrogeologic regimes at Y-12: the Bear Creek Hydrogeologic Regime (Bear Creek Regime), the Upper East Fork Poplar Creek Hydrogeologic Regime (East Fork Regime), and the Chestnut Ridge Hydrogeologic Regime (Chestnut Ridge Regime). The Bear Creek and East Fork regimes are located in Bear Creek Valley, and the Chestnut Ridge Regime is located south of Y-12 (Figure A.1). Additional surface water monitoring will be performed north of Pine Ridge, along the boundary of the Oak Ridge Reservation. Modifications to the CY 2010 monitoring program may be necessary during implementation. Changes in programmatic requirements may alter the analytes specified for selected monitoring wells or may add or remove wells from the planned monitoring network. All modifications to the monitoring program will be approved by the Y-12 GWPP manager and documented as addenda to this sampling and analysis plan

  4. Y-12 Groundwater Protection Program Groundwater And Surface Water Sampling And Analysis Plan For Calendar Year 2012

    Energy Technology Data Exchange (ETDEWEB)

    Elvado Environmental, LLC

    2011-09-01

    This plan provides a description of the groundwater and surface water quality monitoring activities planned for calendar year (CY) 2012 at the U.S. Department of Energy (DOE) Y-12 National Security Complex (Y-12) that will be managed by the Y-12 Groundwater Protection Program (GWPP). Groundwater and surface water monitoring performed by the GWPP during CY 2012 is in accordance with the following goals: (1) to protect the worker, the public, and the environment; (2) to maintain surveillance of existing and potential groundwater contamination sources; (3) to provide for the early detection of groundwater contamination and determine the quality of groundwater and surface water where contaminants are most likely to migrate beyond the Oak Ridge Reservation property line; (4) to identify and characterize long-term trends in groundwater quality at Y-12; and (5) to provide data to support decisions concerning the management and protection of groundwater resources. Groundwater and surface water monitoring will be performed in three hydrogeologic regimes at Y-12: the Bear Creek Hydrogeologic Regime (Bear Creek Regime), the Upper East Fork Poplar Creek Hydrogeologic Regime (East Fork Regime), and the Chestnut Ridge Hydrogeologic Regime (Chestnut Ridge Regime). The Bear Creek and East Fork regimes are located in Bear Creek Valley and the Chestnut Ridge Regime is located south of Y-12 (Figure A.1). Additional surface water monitoring will be performed north of Pine Ridge along the boundary of the Oak Ridge Reservation. Modifications to the CY 2012 monitoring program may be necessary during implementation. Changes in programmatic requirements may alter the analytes specified for selected monitoring wells or may add or remove wells from the planned monitoring network. Each modification to the monitoring program will be approved by the Y-12 GWPP manager and documented as an addendum to this sampling and analysis plan. The following sections of this report provide details regarding

  5. Type I error probability spending for post-market drug and vaccine safety surveillance with binomial data.

    Science.gov (United States)

    Silva, Ivair R

    2018-01-15

    Type I error probability spending functions are commonly used for designing sequential analysis of binomial data in clinical trials, and they are also quickly emerging for near-continuous sequential analysis in post-market drug and vaccine safety surveillance. It is well known that, for clinical trials, when the null hypothesis is not rejected, it is still important to minimize the sample size. In post-market drug and vaccine safety surveillance, by contrast, that is not the priority. In post-market safety surveillance, especially when the surveillance involves identification of potential signals, the meaningful statistical performance measure to be minimized is the expected sample size when the null hypothesis is rejected. The present paper shows that, instead of the convex Type I error spending shape conventionally used in clinical trials, a concave shape is more suitable for post-market drug and vaccine safety surveillance. This is shown for both continuous and group sequential analysis. Copyright © 2017 John Wiley & Sons, Ltd.
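
    The convex-versus-concave distinction can be illustrated with the power family of spending functions (our illustration, not the paper's specific functions): F(t) = alpha * t^rho spends Type I error slowly early on when convex (rho > 1) and quickly early on when concave (rho < 1), the shape the paper recommends for safety surveillance.

```python
# Sketch (ours): power-family alpha-spending functions. A concave shape
# (rho < 1) allocates more Type I error to early looks, favoring early
# signal detection; a convex shape (rho > 1) saves it for later looks.
alpha = 0.05

def spent(t, rho):
    """Cumulative Type I error spent by information fraction t in [0, 1]."""
    return alpha * t**rho

for t in (0.1, 0.25, 0.5, 1.0):
    print(t, round(spent(t, 2.0), 4), round(spent(t, 0.5), 4))
```

    At any interim fraction t < 1 the concave function has spent more alpha than the convex one, while both exhaust exactly alpha at t = 1.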

  6. Sample size planning of two-arm superiority and noninferiority survival studies with discrete follow-up.

    Science.gov (United States)

    Wellek, Stefan

    2017-09-10

    In clinical trials using lifetime as primary outcome variable, it is more the rule than the exception that even for patients who are failing in the course of the study, survival time does not become known exactly, since follow-up takes place according to a restricted schedule with fixed, possibly long intervals between successive visits. In practice, the discreteness of the data obtained under such circumstances is plainly ignored both in data analysis and in sample size planning of survival time studies. As a framework for analyzing the impact of not distinguishing between continuous and discrete recording of failure times, we use a scenario in which the partially observed times are assigned to the points of the grid of inspection times in the natural way. Evaluating the treatment effect in a two-arm trial fitting into this framework by means of ordinary methods based on Cox's relative risk model is shown to produce biased estimates and/or confidence bounds whose actual coverage exhibits marked discrepancies from the nominal confidence level. Not surprisingly, the amount of these distorting effects turns out to be the larger the coarser the grid of inspection times has been chosen. As a promising approach to correctly analyzing and planning studies generating discretely recorded failure times, we use large-sample likelihood theory for parametric models accommodating the key features of the scenario under consideration. The main result is an easily implementable representation of the expected information and hence of the asymptotic covariance matrix of the maximum likelihood estimators of all parameters contained in such a model. In two real examples of large-scale clinical trials, sample size calculation based on this result is contrasted with the traditional approach, which consists of applying the usual methods for exactly observed failure times. Copyright © 2017 John Wiley & Sons, Ltd.

  7. Sampling mobile oceanic fishes and sharks: implications for fisheries and conservation planning.

    Science.gov (United States)

    Letessier, Tom B; Bouchet, Phil J; Meeuwig, Jessica J

    2017-05-01

    improving and steering management by exploring facets of MOFS ecology thus far poorly grasped. Advances in telemetry are increasingly used to explore ontogenic and seasonal movements, and provide means to consider MOFS migration corridors and residency patterns. The characterisation of trophic relationships and prey distribution through biochemical analysis and hydro-acoustic surveys has enabled the tracking of dietary shifts and mapping of high-quality foraging grounds. We conclude that while a scientific framework is available to inform initial design and subsequent implementation of MPAs, there is a shortage in the capacity to answer basic but critical questions about MOFS ecology (who, when, where?) required to track populations non-extractively, thereby presenting a barrier to assessing empirically the performance of MPA-based management for MOFS. This sampling gap is exacerbated by the increased establishment of large (>10,000 km²) and very large MPAs (VLMPAs, >100,000 km²) - great expanses of ocean lacking effective monitoring strategies and survey regimes appropriate to those scales. To address this shortcoming, we demonstrate the use of a non-extractive protocol to measure MOFS population recovery and MPA efficiency. We further identify technological avenues for monitoring at the VLMPA scale, through the use of spotter planes, drones, satellite technology, and horizontal acoustics, and highlight their relevance to the ecosystem-based framework of MOFS management. © 2015 Cambridge Philosophical Society.

  8. A fast algorithm for computing binomial coefficients modulo powers of two.

    Science.gov (United States)

    Andreica, Mugurel Ionut

    2013-01-01

    I present a new algorithm for computing binomial coefficients modulo 2^N. The proposed method has an O(N^3·Multiplication(N)+N^4) preprocessing time, after which a binomial coefficient C(P, Q) with 0 ≤ Q ≤ P ≤ 2^N-1 can be computed modulo 2^N in O(N^2·log(N)·Multiplication(N)) time. Multiplication(N) denotes the time complexity of multiplying two N-bit numbers, which can range from O(N^2) to O(N·log(N)·log(log(N))) or better. Thus, the overall time complexity for evaluating M binomial coefficients C(P, Q) modulo 2^N with 0 ≤ Q ≤ P ≤ 2^N-1 is O((N^3+M·N^2·log(N))·Multiplication(N)+N^4). After preprocessing, we can actually compute binomial coefficients modulo any 2^R with R ≤ N. For larger values of P and Q, variations of Lucas' theorem must be used first in order to reduce the computation to the evaluation of multiple (O(log(P))) binomial coefficients C(P', Q') (or restricted types of factorials P'!) modulo 2^N with 0 ≤ Q' ≤ P' ≤ 2^N-1.
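
    As a point of reference for the quantities above, a minimal Pascal's-triangle baseline (not the paper's preprocessing-based algorithm) can compute C(P, Q) modulo 2^N for small P:

```python
from math import comb

def binom_mod_pow2(P, Q, N):
    """C(P, Q) mod 2**N via Pascal's triangle with modular arithmetic.

    A simple O(P^2) baseline for small P, not the paper's algorithm.
    """
    mod = 1 << N
    row = [1]  # row 0 of Pascal's triangle
    for _ in range(P):
        # Build the next row, reducing every entry mod 2**N as we go
        row = [1] + [(a + b) % mod for a, b in zip(row, row[1:])] + [1]
    return row[Q]

# Sanity check against exact arithmetic for small inputs
assert binom_mod_pow2(10, 4, 8) == comb(10, 4) % 256
```

    For the large P and Q regime the abstract addresses, such a baseline is infeasible, which is what motivates the preprocessing step and the Lucas-type reductions described above.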

  9. Water-selective excitation of short T2 species with binomial pulses.

    Science.gov (United States)

    Deligianni, Xeni; Bär, Peter; Scheffler, Klaus; Trattnig, Siegfried; Bieri, Oliver

    2014-09-01

    For imaging of fibrous musculoskeletal components, ultra-short echo time methods are often combined with fat suppression. Due to the increased chemical shift, spectral excitation of water might become a favorable option at ultra-high fields. Thus, this study aims to compare and explore short binomial excitation schemes for spectrally selective imaging of fibrous tissue components with short transverse relaxation time (T2). Water-selective 1-1 binomial excitation is compared with nonselective imaging using a sub-millisecond spoiled gradient echo technique for in vivo imaging of fibrous tissue at 3T and 7T. Simulations indicate a maximum signal loss from binomial excitation of approximately 30% in the limit of very short T2 (0.1 ms), as compared to nonselective imaging; this loss decreases rapidly with increasing field strength and increasing T2, e.g., to 19% at 3T and 10% at 7T for a T2 of 1 ms. In agreement with simulations, a binomial phase close to 90° yielded minimum signal loss: approximately 6% at 3T and close to 0% at 7T for menisci, and for ligaments 9% and 13%, respectively. Overall, for imaging of short-lived T2 components, short 1-1 binomial excitation schemes prove to offer marginal signal loss especially at ultra-high fields with overall improved scanning efficiency. Copyright © 2013 Wiley Periodicals, Inc.

  10. Censored Hurdle Negative Binomial Regression (Case Study: Neonatorum Tetanus Case in Indonesia)

    Science.gov (United States)

    Yuli Rusdiana, Riza; Zain, Ismaini; Wulan Purnami, Santi

    2017-06-01

    Hurdle negative binomial regression is a method that can be used for a discrete dependent variable with excess zeros and under- or overdispersion. It uses a two-part approach. The first part, the zero hurdle model, estimates the zero elements of the dependent variable; the second part, called the truncated negative binomial model, estimates the nonzero elements (positive integers). The discrete dependent variable in such cases is censored for some values. The type of censoring studied in this research is right censoring. This study aims to obtain the parameter estimator of hurdle negative binomial regression for a right-censored dependent variable. Maximum Likelihood Estimation (MLE) is used for parameter estimation. The hurdle negative binomial regression model for a right-censored dependent variable is applied to the number of neonatorum tetanus cases in Indonesia. The data are count data which contain zero values in some observations and various other values. This study also aims to obtain the parameter estimator and test statistic of the censored hurdle negative binomial model. Based on the regression results, the factors that influence neonatorum tetanus cases in Indonesia are the percentage of baby health care coverage and neonatal visits.
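
    The two-part structure described above can be written down directly. Below is a minimal sketch of the (uncensored) hurdle negative binomial pmf with hypothetical parameter values; pi0 is the zero probability, and r, q parameterize the negative binomial:

```python
from math import comb

def nb_pmf(k, r, q):
    """Negative binomial pmf: k failures before the r-th success, success prob q."""
    return comb(k + r - 1, k) * (q ** r) * ((1 - q) ** k)

def hurdle_nb_pmf(k, pi0, r, q):
    """Hurdle negative binomial pmf.

    The hurdle part assigns probability pi0 to a zero count; positive counts
    follow a zero-truncated negative binomial, rescaled to mass 1 - pi0.
    """
    if k == 0:
        return pi0
    trunc = 1.0 - nb_pmf(0, r, q)  # NB mass above zero
    return (1.0 - pi0) * nb_pmf(k, r, q) / trunc

# The pmf sums to 1 (checked over a long range; the tail is negligible)
total = sum(hurdle_nb_pmf(k, 0.4, 2, 0.3) for k in range(500))
```

    Censoring and regression structure (linking pi0 and the NB mean to covariates) would be layered on top of this pmf in the likelihood, as the study describes.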

  11. Technical change to the work plan for the remedial investigation of the Salmon Site, Lamar County, Mississippi: Sampling and analysis plan background soil and groundwater study

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-04-01

    The Salmon Site, formerly known as the Tatum Dome Test Site, is located in south-central Mississippi, southwest of the city of Hattiesburg, in Lamar County. Between 1964 and 1970, two nuclear and two non-nuclear gas explosions were conducted deep underground in the Tatum Salt Dome beneath the site. The tests were performed as part of the former US Atomic Energy Commission's Vela Uniform Program, which was conducted to improve the United States' capability to detect underground nuclear explosions. This document details technical changes to the existing work plan for the remedial investigation of the Salmon Site. A previously conducted Remedial Investigation for the Salmon Site involved the preparation of ecological and human health risk assessments. These risk assessments, which are incorporated into the Remedial Investigation Report, identified several constituents of potential concern (COPC) that could potentially have a negative impact on ecological and human health. These COPCs are the primary risk drivers for the Salmon Site; they include arsenic and naturally occurring, gamma-emitting radionuclides. If it can be demonstrated that similar concentrations of these COPCs occur naturally in surrounding areas, they can be removed from consideration in the risk assessments. The purpose of this sampling effort is to collect enough data to prove that the COPCs are naturally occurring and are not a result of the explosives testing activities conducted at the site. This will be accomplished by collecting enough soil samples to have a statistically valid population that can be used to produce defensible comparisons that prove the concentrations identified on site are the same as the background concentrations in surrounding areas.

  12. Correcting for binomial measurement error in predictors in regression with application to analysis of DNA methylation rates by bisulfite sequencing.

    Science.gov (United States)

    Buonaccorsi, John; Prochenka, Agnieszka; Thoresen, Magne; Ploski, Rafal

    2016-09-30

    Motivated by a genetic application, this paper addresses the problem of fitting regression models when the predictor is a proportion measured with error. While the problem of dealing with additive measurement error in fitting regression models has been extensively studied, the problem where the additive error is of a binomial nature has not been addressed. The measurement errors here are heteroscedastic for two reasons; dependence on the underlying true value and changing sampling effort over observations. While some of the previously developed methods for treating additive measurement error with heteroscedasticity can be used in this setting, other methods need modification. A new version of simulation extrapolation is developed, and we also explore a variation on the standard regression calibration method that uses a beta-binomial model based on the fact that the true value is a proportion. Although most of the methods introduced here can be used for fitting non-linear models, this paper will focus primarily on their use in fitting a linear model. While previous work has focused mainly on estimation of the coefficients, we will, with motivation from our example, also examine estimation of the variance around the regression line. In addressing these problems, we also discuss the appropriate manner in which to bootstrap for both inferences and bias assessment. The various methods are compared via simulation, and the results are illustrated using our motivating data, for which the goal is to relate the methylation rate of a blood sample to the age of the individual providing the sample. Copyright © 2016 John Wiley & Sons, Ltd.

  13. Squeezing and other non-classical features in k-photon anharmonic oscillator in binomial and negative binomial states of the field

    International Nuclear Information System (INIS)

    Joshi, A.; Lawande, S.V.

    1990-01-01

    A systematic study is presented of the squeezing obtained from a k-photon anharmonic oscillator (with interaction Hamiltonian of the form (a†)^k, k ≥ 2) interacting with light whose statistics can be varied from sub-Poissonian to Poissonian via the binomial state of the field, and from super-Poissonian to Poissonian via the negative binomial state of the field. The authors predict that for all values of k there is a tendency toward increased squeezing with increased sub-Poissonian character of the field, while the reverse is true for a super-Poissonian field. They also present the non-classical behavior of the first-order coherence function explicitly for the k = 2 case (i.e., for the two-photon anharmonic oscillator model used for a Kerr-like medium) with variation in the statistics of the input light.

  14. Analysis of generalized negative binomial distributions attached to hyperbolic Landau levels

    Energy Technology Data Exchange (ETDEWEB)

    Chhaiba, Hassan, E-mail: chhaiba.hassan@gmail.com [Department of Mathematics, Faculty of Sciences, Ibn Tofail University, P.O. Box 133, Kénitra (Morocco); Demni, Nizar, E-mail: nizar.demni@univ-rennes1.fr [IRMAR, Université de Rennes 1, Campus de Beaulieu, 35042 Rennes Cedex (France); Mouayn, Zouhair, E-mail: mouayn@fstbm.ac.ma [Department of Mathematics, Faculty of Sciences and Technics (M’Ghila), Sultan Moulay Slimane, P.O. Box 523, Béni Mellal (Morocco)

    2016-07-15

    To each hyperbolic Landau level of the Poincaré disc is attached a generalized negative binomial distribution. In this paper, we compute the moment generating function of this distribution and supply its atomic decomposition as a perturbation of the negative binomial distribution by a finitely supported measure. Using the Mandel parameter, we also discuss the nonclassical nature of the associated coherent states. Next, we derive a Lévy-Khintchine-type representation of its characteristic function when the latter does not vanish and deduce that it is quasi-infinitely divisible except for the lowest hyperbolic Landau level corresponding to the negative binomial distribution. By considering the total variation of the obtained quasi-Lévy measure, we introduce a new infinitely divisible distribution for which we derive the characteristic function.

  15. Possibility and Challenges of Conversion of Current Virus Species Names to Linnaean Binomials

    Energy Technology Data Exchange (ETDEWEB)

    Postler, Thomas S.; Clawson, Anna N.; Amarasinghe, Gaya K.; Basler, Christopher F.; Bavari, Sbina; Benkő, Mária; Blasdell, Kim R.; Briese, Thomas; Buchmeier, Michael J.; Bukreyev, Alexander; Calisher, Charles H.; Chandran, Kartik; Charrel, Rémi; Clegg, Christopher S.; Collins, Peter L.; Juan Carlos, De La Torre; Derisi, Joseph L.; Dietzgen, Ralf G.; Dolnik, Olga; Dürrwald, Ralf; Dye, John M.; Easton, Andrew J.; Emonet, Sébastian; Formenty, Pierre; Fouchier, Ron A. M.; Ghedin, Elodie; Gonzalez, Jean-Paul; Harrach, Balázs; Hewson, Roger; Horie, Masayuki; Jiāng, Dàohóng; Kobinger, Gary; Kondo, Hideki; Kropinski, Andrew M.; Krupovic, Mart; Kurath, Gael; Lamb, Robert A.; Leroy, Eric M.; Lukashevich, Igor S.; Maisner, Andrea; Mushegian, Arcady R.; Netesov, Sergey V.; Nowotny, Norbert; Patterson, Jean L.; Payne, Susan L.; PaWeska, Janusz T.; Peters, Clarence J.; Radoshitzky, Sheli R.; Rima, Bertus K.; Romanowski, Victor; Rubbenstroth, Dennis; Sabanadzovic, Sead; Sanfaçon, Hélène; Salvato, Maria S.; Schwemmle, Martin; Smither, Sophie J.; Stenglein, Mark D.; Stone, David M.; Takada, Ayato; Tesh, Robert B.; Tomonaga, Keizo; Tordo, Noël; Towner, Jonathan S.; Vasilakis, Nikos; Volchkov, Viktor E.; Wahl-Jensen, Victoria; Walker, Peter J.; Wang, Lin-Fa; Varsani, Arvind; Whitfield, Anna E.; Zerbini, F. Murilo; Kuhn, Jens H.

    2016-10-22

    Botanical, mycological, zoological, and prokaryotic species names follow the Linnaean format, consisting of an italicized Latinized binomen with a capitalized genus name and a lower case species epithet (e.g., Homo sapiens). Virus species names, however, do not follow a uniform format, and, even when binomial, are not Linnaean in style. In this thought exercise, we attempted to convert all currently official names of species included in the virus family Arenaviridae and the virus order Mononegavirales to Linnaean binomials, and to identify and address associated challenges and concerns. Surprisingly, this endeavor was not as complicated or time-consuming as even the authors of this article expected when conceiving the experiment. [Arenaviridae; binomials; ICTV; International Committee on Taxonomy of Viruses; Mononegavirales; virus nomenclature; virus taxonomy.]

  16. Analysis of generalized negative binomial distributions attached to hyperbolic Landau levels

    International Nuclear Information System (INIS)

    Chhaiba, Hassan; Demni, Nizar; Mouayn, Zouhair

    2016-01-01

    To each hyperbolic Landau level of the Poincaré disc is attached a generalized negative binomial distribution. In this paper, we compute the moment generating function of this distribution and supply its atomic decomposition as a perturbation of the negative binomial distribution by a finitely supported measure. Using the Mandel parameter, we also discuss the nonclassical nature of the associated coherent states. Next, we derive a Lévy-Khintchine-type representation of its characteristic function when the latter does not vanish and deduce that it is quasi-infinitely divisible except for the lowest hyperbolic Landau level corresponding to the negative binomial distribution. By considering the total variation of the obtained quasi-Lévy measure, we introduce a new infinitely divisible distribution for which we derive the characteristic function.

  17. Sampling and analysis plan for the Bear Creek Valley Boneyard/Burnyard Accelerated Action Project, Oak Ridge Y-12 Plant, Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    In the Bear Creek Valley Watershed Remedial Investigation, the Boneyard/Burnyard was identified as the source of the largest releases of uranium into groundwater and surface water in Bear Creek Valley. The proposed action for remediation of this site is selective excavation and removal of source material and capping of the remainder of the site. The schedule for this action has been accelerated so that this is the first remedial action planned to be implemented in the Bear Creek Valley Record of Decision. Additional data needs to support design of the remedial action were identified at a data quality objectives meeting held for this project. Sampling at the Boneyard/Burnyard will be conducted through the use of a phased approach. Initial or primary samples will be used to make in-the-field decisions about where to locate follow-up or secondary samples. On the basis of the results of surface water, soil, and groundwater analysis, up to six test pits will be dug. The test pits will be used to provide detailed descriptions of source materials and bulk samples. This document sets forth the requirements and procedures to protect the personnel involved in this project. This document also contains the health and safety plan, quality assurance project plan, waste management plan, data management plan, implementation plan, and best management practices plan for this project as appendices.

  18. Sampling and analysis plan for the Bear Creek Valley Boneyard/Burnyard Accelerated Action Project, Oak Ridge Y-12 Plant, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    1998-03-01

    In the Bear Creek Valley Watershed Remedial Investigation, the Boneyard/Burnyard was identified as the source of the largest releases of uranium into groundwater and surface water in Bear Creek Valley. The proposed action for remediation of this site is selective excavation and removal of source material and capping of the remainder of the site. The schedule for this action has been accelerated so that this is the first remedial action planned to be implemented in the Bear Creek Valley Record of Decision. Additional data needs to support design of the remedial action were identified at a data quality objectives meeting held for this project. Sampling at the Boneyard/Burnyard will be conducted through the use of a phased approach. Initial or primary samples will be used to make in-the-field decisions about where to locate follow-up or secondary samples. On the basis of the results of surface water, soil, and groundwater analysis, up to six test pits will be dug. The test pits will be used to provide detailed descriptions of source materials and bulk samples. This document sets forth the requirements and procedures to protect the personnel involved in this project. This document also contains the health and safety plan, quality assurance project plan, waste management plan, data management plan, implementation plan, and best management practices plan for this project as appendices

  19. Variability in results from negative binomial models for Lyme disease measured at different spatial scales.

    Science.gov (United States)

    Tran, Phoebe; Waller, Lance

    2015-01-01

    Lyme disease has been the subject of many studies due to increasing incidence rates year after year and the severe complications that can arise in later stages of the disease. Negative binomial models have been used to model Lyme disease in the past with some success. However, there has been little focus on the reliability and consistency of these models when they are used to study Lyme disease at multiple spatial scales. This study seeks to explore how sensitive/consistent negative binomial models are when they are used to study Lyme disease at different spatial scales (at the regional and sub-regional levels). The study area includes the thirteen states in the Northeastern United States with the highest Lyme disease incidence during the 2002-2006 period. Lyme disease incidence at county level for the period of 2002-2006 was linked with several previously identified key landscape and climatic variables in a negative binomial regression model for the Northeastern region and two smaller sub-regions (the New England sub-region and the Mid-Atlantic sub-region). This study found that negative binomial models, indeed, were sensitive/inconsistent when used at different spatial scales. We discuss various plausible explanations for such behavior of negative binomial models. Further investigation of the inconsistency and sensitivity of negative binomial models when used at different spatial scales is important for not only future Lyme disease studies and Lyme disease risk assessment/management but any study that requires use of this model type in a spatial context. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Entanglement properties between two atoms in the binomial optical field interacting with two entangled atoms

    International Nuclear Information System (INIS)

    Liu Tang-Kun; Zhang Kang-Long; Tao Yu; Shan Chuan-Jia; Liu Ji-Bing

    2016-01-01

    The temporal evolution of the degree of entanglement between two atoms in a system of the binomial optical field interacting with two arbitrary entangled atoms is investigated. The influence of the strength of the dipole–dipole interaction between two atoms, probabilities of the Bernoulli trial, and particle number of the binomial optical field on the temporal evolution of the atomic entanglement are discussed. The result shows that the two atoms are always in the entanglement state. Moreover, if and only if the two atoms are initially in the maximally entangled state, the entanglement evolution is not affected by the parameters, and the degree of entanglement is always kept as 1.

  1. On extinction time of a generalized endemic chain-binomial model.

    Science.gov (United States)

    Aydogmus, Ozgur

    2016-09-01

    We considered a chain-binomial epidemic model not conferring immunity after infection. Mean field dynamics of the model has been analyzed and conditions for the existence of a stable endemic equilibrium are determined. The behavior of the chain-binomial process is probabilistically linked to the mean field equation. As a result of this link, we were able to show that the mean extinction time of the epidemic increases at least exponentially as the population size grows. We also present simulation results for the process to validate our analytical findings. Copyright © 2016 Elsevier Inc. All rights reserved.
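
    A chain-binomial process without immunity can be simulated directly to observe extinction times. The sketch below is an SIS-type Reed-Frost variant with hypothetical parameter values, not necessarily the authors' exact formulation:

```python
import random

def chain_binomial_sis(n, i0, p, recover, t_max=10_000, rng=random):
    """Chain-binomial epidemic without immunity (SIS-type).

    Each susceptible escapes infection from each infective with prob 1 - p,
    so it is infected with prob 1 - (1 - p)**I_t; each infective recovers
    (returning to the susceptible pool) with prob `recover`.
    Returns the extinction time, capped at t_max.
    """
    infected = i0
    for t in range(t_max):
        if infected == 0:
            return t
        susceptible = n - infected
        p_inf = 1.0 - (1.0 - p) ** infected
        new_inf = sum(rng.random() < p_inf for _ in range(susceptible))
        recovered = sum(rng.random() < recover for _ in range(infected))
        infected = infected - recovered + new_inf
    return t_max

random.seed(1)
times = [chain_binomial_sis(50, 5, 0.01, 0.5) for _ in range(20)]
```

    Repeating such simulations for growing n is one way to observe numerically the at-least-exponential growth of the mean extinction time reported in the abstract.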

  2. Binomial confidence intervals for testing non-inferiority or superiority: a practitioner's dilemma.

    Science.gov (United States)

    Pradhan, Vivek; Evans, John C; Banerjee, Tathagata

    2016-08-01

    In testing for non-inferiority or superiority in a single arm study, the confidence interval of a single binomial proportion is frequently used. A number of such intervals are proposed in the literature and implemented in standard software packages. Unfortunately, use of different intervals leads to conflicting conclusions. Practitioners thus face a serious dilemma in deciding which one to depend on. Is there a way to resolve this dilemma? We address this question by investigating the performances of ten commonly used intervals of a single binomial proportion, in the light of two criteria, viz., coverage and expected length of the interval. © The Author(s) 2013.
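
    The dilemma is easy to reproduce: two of the most common intervals, Wald and Wilson, can disagree noticeably when events are rare. A minimal sketch using the standard textbook formulas (not tied to the paper's specific set of ten intervals):

```python
from math import sqrt

def wald_interval(x, n, z=1.96):
    """Wald interval for a binomial proportion (known to undercover)."""
    p = x / n
    half = z * sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

def wilson_interval(x, n, z=1.96):
    """Wilson score interval, with generally better coverage than Wald."""
    p = x / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# With 2 events in 20 trials the two intervals differ visibly,
# so a non-inferiority margin can fall inside one and outside the other.
wald_ci = wald_interval(2, 20)
wilson_ci = wilson_interval(2, 20)
```

    Comparing such intervals on the coverage and expected-length criteria is exactly the kind of evaluation the abstract describes.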

  3. Addendum to Sampling and Analysis Plan (SAP) for Assessment of LANL-Derived Residual Radionuclides in Soils within Tract A-16-d for Land Conveyance and Transfer for Sewage Treatment Facility Area

    Energy Technology Data Exchange (ETDEWEB)

    Whicker, Jeffrey Jay [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gillis, Jessica Mcdonnel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ruedig, Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-21

    This report summarizes the sampling design used, the associated statistical assumptions, and general guidelines for conducting post-sampling data analysis. Sampling plan components presented here include how many sampling locations to choose and where within the sampling area to collect those samples. The type of medium to sample (i.e., soil, groundwater, etc.) and how to analyze the samples (in-situ, fixed laboratory, etc.) are addressed in other sections of the sampling plan.

  4. A Mixed-Effects Heterogeneous Negative Binomial Model for Postfire Conifer Regeneration in Northeastern California, USA

    Science.gov (United States)

    Justin S. Crotteau; Martin W. Ritchie; J. Morgan. Varner

    2014-01-01

    Many western USA fire regimes are typified by mixed-severity fire, which compounds the variability inherent to natural regeneration densities in associated forests. Tree regeneration data are often discrete and nonnegative; accordingly, we fit a series of Poisson and negative binomial variation models to conifer seedling counts across four distinct burn severities and...

  5. Topology of unitary groups and the prime orders of binomial coefficients

    Science.gov (United States)

    Duan, HaiBao; Lin, XianZu

    2017-09-01

    Let $c:SU(n)\rightarrow PSU(n)=SU(n)/\mathbb{Z}_{n}$ be the quotient map of the special unitary group $SU(n)$ by its center subgroup $\mathbb{Z}_{n}$. We determine the induced homomorphism $c^{\ast}: H^{\ast}(PSU(n))\rightarrow H^{\ast}(SU(n))$ on cohomologies by computing with the prime orders of binomial coefficients.

  6. A Bayesian Approach to Functional Mixed Effect Modeling for Longitudinal Data with Binomial Outcomes

    Science.gov (United States)

    Kliethermes, Stephanie; Oleson, Jacob

    2014-01-01

    Longitudinal growth patterns are routinely seen in medical studies where individual and population growth is followed over a period of time. Many current methods for modeling growth presuppose a parametric relationship between the outcome and time (e.g., linear, quadratic); however, these relationships may not accurately capture growth over time. Functional mixed effects (FME) models provide flexibility in handling longitudinal data with nonparametric temporal trends. Although FME methods are well-developed for continuous, normally distributed outcome measures, nonparametric methods for handling categorical outcomes are limited. We consider the situation with binomially distributed longitudinal outcomes. Although percent correct data can be modeled assuming normality, estimates outside the parameter space are possible and thus estimated curves can be unrealistic. We propose a binomial FME model using Bayesian methodology to account for growth curves with binomial (percentage) outcomes. The usefulness of our methods is demonstrated using a longitudinal study of speech perception outcomes from cochlear implant users where we successfully model both the population and individual growth trajectories. Simulation studies also advocate the usefulness of the binomial model particularly when outcomes occur near the boundary of the probability parameter space and in situations with a small number of trials. PMID:24723495

  7. A mixed-binomial model for Likert-type personality measures.

    Science.gov (United States)

    Allik, Jüri

    2014-01-01

    Personality measurement is based on the idea that values on an unobservable latent variable determine the distribution of answers on a manifest response scale. Typically, it is assumed in Item Response Theory (IRT) that latent variables are related to the observed responses through continuous normal or logistic functions, determining the probability with which one of the ordered response alternatives on a Likert-scale item is chosen. Based on an analysis of 1731 self- and other-rated responses on the 240 NEO PI-3 questionnaire items, it was proposed that a viable alternative is a finite number of latent events which are related to manifest responses through a binomial function which has only one parameter: the probability with which a given statement is approved. For the majority of items, the best fit was obtained with a mixed-binomial distribution, which assumes two different subpopulations who endorse items with two different probabilities. It was shown that the fit of the binomial IRT model can be improved by assuming that about 10% of random noise is contained in the answers and by taking into account response biases toward one of the response categories. It was concluded that the binomial response model for the measurement of personality traits may be a workable alternative to the more habitual normal and logistic IRT models.
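
    The mixture described above is straightforward to write down. A sketch of the mixed-binomial response pmf for a 5-point Likert item (categories 0-4), with hypothetical mixture weight and endorsement probabilities:

```python
from math import comb

def binomial_response_pmf(k, m, p):
    """Probability of choosing category k on an m-category (0..m-1) Likert
    item under a binomial response model with endorsement probability p."""
    return comb(m - 1, k) * p ** k * (1 - p) ** (m - 1 - k)

def mixed_binomial_pmf(k, m, w, p1, p2):
    """Two-component mixture: subpopulations endorsing with probs p1 and p2,
    mixed with weight w."""
    return (w * binomial_response_pmf(k, m, p1)
            + (1 - w) * binomial_response_pmf(k, m, p2))

# A 5-point item: the mixture can be bimodal, which no single
# binomial component can produce.
probs = [mixed_binomial_pmf(k, 5, 0.5, 0.2, 0.8) for k in range(5)]
```

    Fitting such a model to observed response frequencies (e.g., by maximum likelihood over w, p1, p2) is the estimation task implied by the abstract.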

  8. Confidence Intervals for Weighted Composite Scores under the Compound Binomial Error Model

    Science.gov (United States)

    Kim, Kyung Yong; Lee, Won-Chan

    2018-01-01

    Reporting confidence intervals with test scores helps test users make important decisions about examinees by providing information about the precision of test scores. Although a variety of estimation procedures based on the binomial error model are available for computing intervals for test scores, these procedures assume that items are randomly…

  9. Estimating cavity tree and snag abundance using negative binomial regression models and nearest neighbor imputation methods

    Science.gov (United States)

    Bianca N.I. Eskelson; Hailemariam Temesgen; Tara M. Barrett

    2009-01-01

    Cavity tree and snag abundance data are highly variable and contain many zero observations. We predict cavity tree and snag abundance from variables that are readily available from forest cover maps or remotely sensed data using negative binomial (NB), zero-inflated NB, and zero-altered NB (ZANB) regression models as well as nearest neighbor (NN) imputation methods....

  10. Binomial Coefficients Modulo a Prime--A Visualization Approach to Undergraduate Research

    Science.gov (United States)

    Bardzell, Michael; Poimenidou, Eirini

    2011-01-01

    In this article we present, as a case study, results of undergraduate research involving binomial coefficients modulo a prime "p." We will discuss how undergraduates were involved in the project, even with a minimal mathematical background beforehand. There are two main avenues of exploration described to discover these binomial…

  11. Computational results on the compound binomial risk model with nonhomogeneous claim occurrences

    NARCIS (Netherlands)

    Tuncel, A.; Tank, F.

    2013-01-01

    The aim of this paper is to give a recursive formula for non-ruin (survival) probability when the claim occurrences are nonhomogeneous in the compound binomial risk model. We give recursive formulas for non-ruin (survival) probability and for distribution of the total number of claims under the

  12. The negative binomial distribution as a model for external corrosion defect counts in buried pipelines

    International Nuclear Information System (INIS)

    Valor, Alma; Alfonso, Lester; Caleyo, Francisco; Vidal, Julio; Perez-Baruch, Eloy; Hallen, José M.

    2015-01-01

    Highlights: • Observed external-corrosion defects in underground pipelines revealed a tendency to cluster. • The Poisson distribution is unable to fit extensive count data for these types of defects. • In contrast, the negative binomial distribution provides a suitable count model for them. • Two spatial stochastic processes lead to the negative binomial distribution for defect counts. • They are the Gamma-Poisson mixed process and the compound Poisson process. • A Roger's process also arises as a plausible temporal stochastic process leading to corrosion defect clustering and to negative binomially distributed defect counts. - Abstract: The spatial distribution of external corrosion defects in buried pipelines is usually described as a Poisson process, which leads to corrosion defects being randomly distributed along the pipeline. However, in real operating conditions, the spatial distribution of defects considerably departs from Poisson statistics due to the aggregation of defects in groups or clusters. In this work, the statistical analysis of real corrosion data from underground pipelines operating in southern Mexico leads to the conclusion that the negative binomial distribution provides a better description for defect counts. The origin of this distribution from several processes is discussed. The analysed processes are: the mixed Gamma-Poisson, compound Poisson and Roger's processes. The physical reasons behind them are discussed for the specific case of soil corrosion.
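
    Overdispersion (variance exceeding the mean) is the empirical signature that separates the negative binomial from the Poisson here. A method-of-moments sketch with hypothetical clustered defect counts (not the authors' pipeline data):

```python
def nb_moment_fit(counts):
    """Method-of-moments fit of a negative binomial to count data.

    Returns (r, p) with mean = r(1-p)/p. Requires variance > mean;
    otherwise a Poisson model may already be adequate.
    """
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    if var <= mean:
        raise ValueError("no overdispersion: a Poisson model may suffice")
    r = mean * mean / (var - mean)  # shape (cluster) parameter
    p = mean / var                  # success probability
    return r, p

# Hypothetical defect counts per pipeline segment, showing clustering
counts = [0, 0, 1, 0, 7, 2, 0, 9, 1, 0, 0, 12, 3, 0, 1]
r, p = nb_moment_fit(counts)
```

    A small fitted r (strong clustering) is what the Gamma-Poisson mixture interpretation in the abstract would predict for aggregated defects.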

  13. Use of the negative binomial-truncated Poisson distribution in thunderstorm prediction

    Science.gov (United States)

    Cohen, A. C.

    1971-01-01

    A probability model is presented for the distribution of thunderstorms over a small area given that thunderstorm events (1 or more thunderstorms) are occurring over a larger area. The model incorporates the negative binomial and truncated Poisson distributions. Probability tables for Cape Kennedy for spring, summer, and fall months and seasons are presented. The computer program used to compute these probabilities is appended.

  14. Determining order-up-to levels under periodic review for compound binomial (intermittent) demand

    NARCIS (Netherlands)

    Teunter, R. H.; Syntetos, A. A.; Babai, M. Z.

    2010-01-01

    We propose a new method for determining order-up-to levels for intermittent demand items in a periodic review system. Contrary to existing methods, we exploit the intermittent character of demand by modelling lead time demand as a compound binomial process. In an extensive numerical study using

  15. Negative binomial distribution for multiplicity distributions in e⁺e⁻ annihilation

    International Nuclear Information System (INIS)

    Chew, C.K.; Lim, Y.K.

    1986-01-01

    The authors show that the negative binomial distribution gives an excellent fit to the available charged-particle multiplicity distributions of e⁺e⁻ annihilation into hadrons at three different energies: √s = 14, 22, and 34 GeV

  16. A Bayesian approach to functional mixed-effects modeling for longitudinal data with binomial outcomes.

    Science.gov (United States)

    Kliethermes, Stephanie; Oleson, Jacob

    2014-08-15

    Longitudinal growth patterns are routinely seen in medical studies where individual growth and population growth are followed up over a period of time. Many current methods for modeling growth presuppose a parametric relationship between the outcome and time (e.g., linear and quadratic); however, these relationships may not accurately capture growth over time. Functional mixed-effects (FME) models provide flexibility in handling longitudinal data with nonparametric temporal trends. Although FME methods are well developed for continuous, normally distributed outcome measures, nonparametric methods for handling categorical outcomes are limited. We consider the situation with binomially distributed longitudinal outcomes. Although percent correct data can be modeled assuming normality, estimates outside the parameter space are possible, and thus, estimated curves can be unrealistic. We propose a binomial FME model using Bayesian methodology to account for growth curves with binomial (percentage) outcomes. The usefulness of our methods is demonstrated using a longitudinal study of speech perception outcomes from cochlear implant users where we successfully model both the population and individual growth trajectories. Simulation studies also support the usefulness of the binomial model, particularly when outcomes occur near the boundary of the probability parameter space and in situations with a small number of trials. Copyright © 2014 John Wiley & Sons, Ltd.

  17. Time evolution of negative binomial optical field in a diffusion channel

    International Nuclear Information System (INIS)

    Liu Tang-Kun; Wu Pan-Pan; Shan Chuan-Jia; Liu Ji-Bing; Fan Hong-Yi

    2015-01-01

    We find the time evolution law of a negative binomial optical field in a diffusion channel. We reveal that by adjusting the diffusion parameter, the photon number can be controlled. Therefore, the diffusion process can be considered a quantum controlling scheme through photon addition. (paper)

  18. Joint Analysis of Binomial and Continuous Traits with a Recursive Model

    DEFF Research Database (Denmark)

    Varona, Louis; Sorensen, Daniel

    2014-01-01

    This work presents a model for the joint analysis of a binomial and a Gaussian trait using a recursive parametrization that leads to a computationally efficient implementation. The model is illustrated in an analysis of mortality and litter size in two breeds of Danish pigs, Landrace and Yorkshir...

  19. Generating and revealing a quantum superposition of electromagnetic-field binomial states in a cavity

    International Nuclear Information System (INIS)

    Lo Franco, R.; Compagno, G.; Messina, A.; Napoli, A.

    2007-01-01

    We introduce the N-photon quantum superposition of two orthogonal generalized binomial states of an electromagnetic field. We then propose, using resonant atom-cavity interactions, nonconditional schemes to generate and reveal such a quantum superposition for the two-photon case in a single-mode high-Q cavity. We finally discuss the implementation of the proposed schemes

  20. Learning Binomial Probability Concepts with Simulation, Random Numbers and a Spreadsheet

    Science.gov (United States)

    Rochowicz, John A., Jr.

    2005-01-01

    This paper introduces the reader to the concepts of binomial probability and simulation. A spreadsheet is used to illustrate these concepts. Random number generators are great technological tools for demonstrating the concepts of probability. Ideas of approximation, estimation, and mathematical usefulness provide numerous ways of learning…
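The spreadsheet exercise described above can be mirrored in a few lines of code: simulate many runs of n coin tosses with a random number generator and compare the empirical frequencies with the exact binomial pmf (the parameters below are arbitrary choices for illustration):

```python
import math
import random

random.seed(1)

# Simulate many runs of n fair-coin tosses -- the code analogue of filling a
# spreadsheet column with RAND() -- and compare with the exact binomial pmf.
n, p, trials = 10, 0.5, 50000

counts = [0] * (n + 1)
for _ in range(trials):
    heads = sum(random.random() < p for _ in range(n))
    counts[heads] += 1

for k in (3, 5, 7):
    simulated = counts[k] / trials
    exact = math.comb(n, k) * p**k * (1 - p)**(n - k)
    print(f"P(X={k}): simulated {simulated:.4f}  exact {exact:.4f}")
```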

  1. Raw and Central Moments of Binomial Random Variables via Stirling Numbers

    Science.gov (United States)

    Griffiths, Martin

    2013-01-01

    We consider here the problem of calculating the moments of binomial random variables. It is shown how formulae for both the raw and the central moments of such random variables may be obtained in a recursive manner utilizing Stirling numbers of the first kind. Suggestions are also provided as to how students might be encouraged to explore this…
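The flavor of this computation can be sketched with the textbook identity E[X^m] = Σ_j S(m, j) · n^(j) · p^j, where S(m, j) are Stirling numbers of the second kind (computed by their usual recursion) and n^(j) is a falling factorial; the paper's own recursive treatment may differ in detail, so this is offered only as an independent check:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(m, j):
    """Stirling numbers of the second kind via S(m,j) = j*S(m-1,j) + S(m-1,j-1)."""
    if m == j:
        return 1
    if j == 0 or j > m:
        return 0
    return j * stirling2(m - 1, j) + stirling2(m - 1, j - 1)

def falling(n, j):
    """Falling factorial n(n-1)...(n-j+1)."""
    out = 1
    for i in range(j):
        out *= n - i
    return out

def raw_moment(n, p, m):
    """E[X^m] for X ~ Binomial(n, p) via the Stirling-number identity."""
    return sum(stirling2(m, j) * falling(n, j) * p**j for j in range(m + 1))

def raw_moment_direct(n, p, m):
    """Brute-force check: sum k^m over the binomial pmf."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * k**m
               for k in range(n + 1))

n, p = 12, 0.3
for m in range(1, 5):
    assert abs(raw_moment(n, p, m) - raw_moment_direct(n, p, m)) < 1e-9
print(raw_moment(n, p, 1), raw_moment(n, p, 2))  # exact values: 3.6 and 15.48
```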

  2. Studying the Binomial Distribution Using LabVIEW

    Science.gov (United States)

    George, Danielle J.; Hammer, Nathan I.

    2015-01-01

    This undergraduate physical chemistry laboratory exercise introduces students to the study of probability distributions both experimentally and using computer simulations. Students perform the classic coin toss experiment individually and then pool all of their data together to study the effect of experimental sample size on the binomial…
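The effect of pooled sample size on how closely the class data match the binomial distribution can be sketched as follows (toy parameters; the largest gap between empirical and exact probabilities shrinks roughly like 1/√runs):

```python
import math
import random

random.seed(3)

n, p = 10, 0.5
exact = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def max_pmf_error(runs):
    """Largest |empirical - exact| pmf gap after pooling `runs` coin-toss runs."""
    counts = [0] * (n + 1)
    for _ in range(runs):
        heads = sum(random.random() < p for _ in range(n))
        counts[heads] += 1
    return max(abs(counts[k] / runs - exact[k]) for k in range(n + 1))

# The pooled-class data hug the binomial pmf more tightly as runs grow.
for runs in (100, 1000, 20000):
    print(runs, round(max_pmf_error(runs), 4))
```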

  3. NBLDA: negative binomial linear discriminant analysis for RNA-Seq data.

    Science.gov (United States)

    Dong, Kai; Zhao, Hongyu; Tong, Tiejun; Wan, Xiang

    2016-09-13

    RNA-sequencing (RNA-Seq) has become a powerful technology to characterize gene expression profiles because it is more accurate and comprehensive than microarrays. Although statistical methods that have been developed for microarray data can be applied to RNA-Seq data, they are not ideal due to the discrete nature of RNA-Seq data. The Poisson distribution and negative binomial distribution are commonly used to model count data. Recently, Witten (Annals Appl Stat 5:2493-2518, 2011) proposed a Poisson linear discriminant analysis for RNA-Seq data. The Poisson assumption may not be as appropriate as the negative binomial distribution when biological replicates are available and in the presence of overdispersion (i.e., when the variance is larger than the mean). However, it is more complicated to model negative binomial variables because they involve a dispersion parameter that needs to be estimated. In this paper, we propose a negative binomial linear discriminant analysis for RNA-Seq data. By Bayes' rule, we construct the classifier by fitting a negative binomial model, and propose some plug-in rules to estimate the unknown parameters in the classifier. The relationship between the negative binomial classifier and the Poisson classifier is explored, with a numerical investigation of the impact of dispersion on the discriminant score. Simulation results show the superiority of our proposed method. We also analyze two real RNA-Seq data sets to demonstrate the advantages of our method in real-world applications. We have developed a new classifier using the negative binomial model for RNA-Seq data classification. Our simulation results show that our proposed classifier has a better performance than existing works. The proposed classifier can serve as an effective tool for classifying RNA-Seq data. Based on the comparison results, we have provided some guidelines for scientists to decide which method should be used in the discriminant analysis of RNA-Seq data.
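The core of such a classification step can be illustrated with a toy sketch: score each class by its negative binomial log-likelihood and pick the maximizer. The class means and shared dispersion below are assumed values for illustration only, not the plug-in estimators proposed in the paper:

```python
import math
import random

random.seed(11)

r = 5.0  # assumed shared NB dispersion ("size"); larger r -> closer to Poisson

def nb_logpmf(x, mu):
    """Log pmf of a negative binomial with mean mu and size r."""
    return (math.lgamma(x + r) - math.lgamma(r) - math.lgamma(x + 1)
            + r * math.log(r / (r + mu)) + x * math.log(mu / (r + mu)))

def nb_draw(mu):
    """NB count via its Gamma-Poisson mixture representation."""
    lam = random.gammavariate(r, mu / r)
    L, k, prod = math.exp(-lam), 0, random.random()   # Knuth's Poisson method
    while prod > L:
        k += 1
        prod *= random.random()
    return k

# Two hypothetical classes with different mean expression over two "genes".
means = {"A": [20.0, 4.0], "B": [4.0, 20.0]}

def classify(x):
    """Assign x to the class maximizing the NB log-likelihood (equal priors)."""
    scores = {c: sum(nb_logpmf(xi, mi) for xi, mi in zip(x, mu))
              for c, mu in means.items()}
    return max(scores, key=scores.get)

samples = [([nb_draw(mi) for mi in means[c]], c)
           for c in ("A", "B") for _ in range(200)]
accuracy = sum(classify(x) == c for x, c in samples) / len(samples)
print(f"classification accuracy: {accuracy:.2f}")
```

In the paper's setting the means and dispersion would be estimated from training data rather than assumed, and many genes, not two, would contribute to the score.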

  4. Confirmatory Sampling and Analysis Plan for the Lower East Fork Poplar Creek operable unit, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    1996-04-01

    On December 21, 1989, the EPA placed the US Department of Energy's (DOE's) Oak Ridge Reservation (ORR) on the National Priorities List (NPL). On January 1, 1992, a Federal Facilities Agreement (FFA) between the DOE Field Office in Oak Ridge (DOE-OR), EPA Region IV, and the Tennessee Department of Environment and Conservation (TDEC) went into effect. This FFA establishes the procedural framework and schedule by which DOE-OR will develop, coordinate, implement and monitor environmental restoration activities on the ORR in accordance with applicable federal and state environmental regulations. The DOE-OR Environmental Restoration Program for the ORR addresses the remediation of areas both within and outside the ORR boundaries. This sampling and analysis plan focuses on confirming the cleanup of the stretch of EFPC flowing from Lake Reality at the Y-12 Plant through the City of Oak Ridge, to Poplar Creek on the ORR and its associated floodplain. Both EFPC and its floodplain have been contaminated by releases from the Y-12 Plant since the mid-1950s. Because the EFPC site-designated as an ORR operable unit (OU) under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) is included on the NPL, its remediation must follow the specific procedures mandated by CERCLA, as amended by the Superfund Amendments and Reauthorization Act in 1986

  5. Confirmatory Sampling and Analysis Plan for the Lower East Fork Poplar Creek operable unit, Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

    On December 21, 1989, the EPA placed the US Department of Energy's (DOE's) Oak Ridge Reservation (ORR) on the National Priorities List (NPL). On January 1, 1992, a Federal Facilities Agreement (FFA) between the DOE Field Office in Oak Ridge (DOE-OR), EPA Region IV, and the Tennessee Department of Environment and Conservation (TDEC) went into effect. This FFA establishes the procedural framework and schedule by which DOE-OR will develop, coordinate, implement and monitor environmental restoration activities on the ORR in accordance with applicable federal and state environmental regulations. The DOE-OR Environmental Restoration Program for the ORR addresses the remediation of areas both within and outside the ORR boundaries. This sampling and analysis plan focuses on confirming the cleanup of the stretch of EFPC flowing from Lake Reality at the Y-12 Plant through the City of Oak Ridge, to Poplar Creek on the ORR and its associated floodplain. Both EFPC and its floodplain have been contaminated by releases from the Y-12 Plant since the mid-1950s. Because the EFPC site-designated as an ORR operable unit (OU) under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) is included on the NPL, its remediation must follow the specific procedures mandated by CERCLA, as amended by the Superfund Amendments and Reauthorization Act in 1986.

  6. Soil sampling and analysis plan for the Bear Creek Valley Floodplain at the Oak Ridge Y-12 Plant, Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    This Sampling and Analysis Plan (SAP) for the Bear Creek Valley (BCV) Floodplain presents the approach and rationale for characterizing potentially contaminated soils and sediments of the Bear Creek floodplain and the impact of any contaminants on the floodplain ecosystem. In addition to this SAP, the Remedial Investigation Work Plan for Bear Creek (Y02-S600) at the Oak Ridge Y-12 Plant, Oak Ridge, Tennessee (ES/ER-19&D2) presents background information pertaining to this floodplain investigation.

  7. Soil sampling and analysis plan for the Bear Creek Valley Floodplain at the Oak Ridge Y-12 Plant, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    1995-03-01

    This Sampling and Analysis Plan (SAP) for the Bear Creek Valley (BCV) Floodplain presents the approach and rationale for characterizing potentially contaminated soils and sediments of the Bear Creek floodplain and the impact of any contaminants on the floodplain ecosystem. In addition to this SAP, the Remedial Investigation Work Plan for Bear Creek (Y02-S600) at the Oak Ridge Y-12 Plant, Oak Ridge, Tennessee (ES/ER-19&D2) presents background information pertaining to this floodplain investigation

  8. A Mechanistic Beta-Binomial Probability Model for mRNA Sequencing Data.

    Science.gov (United States)

    Smith, Gregory R; Birtwistle, Marc R

    2016-01-01

    A main application for mRNA sequencing (mRNAseq) is determining lists of differentially-expressed genes (DEGs) between two or more conditions. Several software packages exist to produce DEGs from mRNAseq data, but they typically yield different DEGs, sometimes markedly so. The underlying probability model used to describe mRNAseq data is central to deriving DEGs, and not surprisingly, most packages use different models and assumptions to analyze mRNAseq data. Here, we propose a mechanistic justification to model mRNAseq as a binomial process, with data from technical replicates given by a binomial distribution, and data from biological replicates well-described by a beta-binomial distribution. We demonstrate good agreement of this model with two large datasets. We show that an emergent feature of the beta-binomial distribution, given parameter regimes typical for mRNAseq experiments, is the well-known quadratic polynomial scaling of variance with the mean. The so-called dispersion parameter controls this scaling, and our analysis suggests that the dispersion parameter is a continually decreasing function of the mean, as opposed to current approaches that impose an asymptotic value to the dispersion parameter at moderate mean read counts. We show how this leads to current approaches overestimating variance for moderately to highly expressed genes, which inflates false negative rates. Describing mRNAseq data with a beta-binomial distribution thus may be preferred since its parameters are relatable to the mechanistic underpinnings of the technique and may improve the consistency of DEG analysis across packages, particularly for moderately to highly expressed genes.
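The overdispersion that the beta-binomial captures is easy to demonstrate: mixing the binomial success probability over a Beta distribution inflates the variance by the factor 1 + (n − 1)ρ, giving the super-binomial mean-variance scaling noted above. The parameters below are illustrative, not fitted to any mRNAseq dataset:

```python
import random
import statistics

random.seed(5)

# Toy beta-binomial: per-replicate read fraction p ~ Beta(a, b), then reads
# for the gene ~ Binomial(n, p). All parameters are illustrative assumptions.
n, a, b = 200, 4.0, 96.0     # mean fraction a/(a+b) = 0.04
rho = 1 / (a + b + 1)        # intra-class correlation (overdispersion knob)

def beta_binomial():
    p = random.betavariate(a, b)
    return sum(random.random() < p for _ in range(n))

draws = [beta_binomial() for _ in range(20000)]
mean = statistics.fmean(draws)
var = statistics.variance(draws)

pi = a / (a + b)
binom_var = n * pi * (1 - pi)             # what a pure binomial would give
bb_var = binom_var * (1 + (n - 1) * rho)  # beta-binomial prediction

print(f"mean={mean:.2f} var={var:.2f} "
      f"(binomial: {binom_var:.2f}, beta-binomial: {bb_var:.2f})")
```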

  9. A binary logistic regression model with complex sampling design of unmet need for family planning among all women aged (15-49) in Ethiopia.

    Science.gov (United States)

    Workie, Demeke Lakew; Zike, Dereje Tesfaye; Fenta, Haile Mekonnen; Mekonnen, Mulusew Admasu

    2017-09-01

    Unintended pregnancy related to unmet need is a worldwide problem that affects societies. The main objective of this study was to identify the prevalence and determinants of unmet need for family planning among women aged 15-49 in Ethiopia. The Performance Monitoring and Accountability 2020/Ethiopia survey was conducted in April 2016 (round 4) on 7494 women selected by two-stage stratified sampling. Bi-variable and multi-variable binary logistic regression models with a complex sampling design were fitted. The prevalence of unmet need for family planning was 16.2% in Ethiopia. Women between the ages of 15 and 24 years were 2.266 times more likely to have unmet need for family planning compared to those above 35 years. Women who were currently married were about 8 times more likely to have unmet need for family planning compared to never-married women. Women who had no under-five child were 0.125 times as likely to have unmet need for family planning compared to those who had more than two under-five children. The key determinants of unmet need for family planning in Ethiopia were residence, age, marital status, education, household members, birth events, and number of under-five children. Thus, the Government of Ethiopia should take immediate steps to address the causes of high unmet need for family planning among women.

  10. Field sampling and analysis plan for the remedial investigation of Waste Area Grouping 2 at Oak Ridge National Laboratory, Oak Ridge, Tennessee. Environmental Restoration Program

    Energy Technology Data Exchange (ETDEWEB)

    Boston, H.L.; Ashwood, T.L.; Borders, D.M.; Chidambariah, V.; Downing, D.J.; Fontaine, T.A.; Ketelle, R.H.; Lee, S.Y.; Miller, D.E.; Moore, G.K.; Suter, G.W.; Tardiff, M.F.; Watts, J.A.; Wickliff, D.S.

    1992-02-01

    This field sampling and analysis (S & A) plan has been developed as part of the Department of Energy's (DOE's) remedial investigation (RI) of Waste Area Grouping (WAG) 2 at Oak Ridge National Laboratory (ORNL) located in Oak Ridge, Tennessee. The S & A plan has been written in support of the remedial investigation (RI) plan for WAG 2 (ORNL 1990). WAG 2 consists of White Oak Creek (WOC) and its tributaries downstream of the ORNL main plant area, White Oak Lake (WOL), White Oak Creek embayment (WOCE) on the Clinch River, and the associated floodplain and subsurface environment (Fig. 1.1). The WOC system is the surface drainage for the major ORNL WAGs and has been exposed to a diversity of contaminants from operations and waste disposal activities in the WOC watershed. WAG 2 acts as a conduit through which hydrologic fluxes carry contaminants from upgradient areas to the Clinch River. Water, sediment, soil, and biota in WAG 2 are contaminated and continue to receive contaminants from upgradient WAGs. This document provides an overview of the RI plan, background information on the WAG 2 system, and the objectives of the S & A plan; describes the scope and implementation of the first 2 years of the S & A effort, including recent information about contaminants of concern, the organization of S & A activities, interactions with other programs, and quality assurance specific to the S & A activities; details the field sampling plans for sediment, surface water, groundwater, and biota; and describes the sample tracking and records management plan.

  11. Field sampling and analysis plan for the remedial investigation of Waste Area Grouping 2 at Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    Boston, H.L.; Ashwood, T.L.; Borders, D.M.; Chidambariah, V.; Downing, D.J.; Fontaine, T.A.; Ketelle, R.H.; Lee, S.Y.; Miller, D.E.; Moore, G.K.; Suter, G.W.; Tardiff, M.F.; Watts, J.A.; Wickliff, D.S.

    1992-02-01

    This field sampling and analysis (S & A) plan has been developed as part of the Department of Energy's (DOE's) remedial investigation (RI) of Waste Area Grouping (WAG) 2 at Oak Ridge National Laboratory (ORNL) located in Oak Ridge, Tennessee. The S & A plan has been written in support of the remedial investigation (RI) plan for WAG 2 (ORNL 1990). WAG 2 consists of White Oak Creek (WOC) and its tributaries downstream of the ORNL main plant area, White Oak Lake (WOL), White Oak Creek embayment (WOCE) on the Clinch River, and the associated floodplain and subsurface environment (Fig. 1.1). The WOC system is the surface drainage for the major ORNL WAGs and has been exposed to a diversity of contaminants from operations and waste disposal activities in the WOC watershed. WAG 2 acts as a conduit through which hydrologic fluxes carry contaminants from upgradient areas to the Clinch River. Water, sediment, soil, and biota in WAG 2 are contaminated and continue to receive contaminants from upgradient WAGs. This document provides an overview of the RI plan, background information on the WAG 2 system, and the objectives of the S & A plan; describes the scope and implementation of the first 2 years of the S & A effort, including recent information about contaminants of concern, the organization of S & A activities, interactions with other programs, and quality assurance specific to the S & A activities; details the field sampling plans for sediment, surface water, groundwater, and biota; and describes the sample tracking and records management plan

  12. Using a Negative Binomial Regression Model for Early Warning at the Start of a Hand Foot Mouth Disease Epidemic in Dalian, Liaoning Province, China.

    Science.gov (United States)

    An, Qingyu; Wu, Jun; Fan, Xuesong; Pan, Liyang; Sun, Wei

    2016-01-01

    Hand, foot, and mouth disease (HFMD) is a human syndrome caused by intestinal viruses such as coxsackievirus A16 and enterovirus 71, and it readily develops into outbreaks in kindergartens and schools. Scientific and accurate early detection of the start of an HFMD epidemic is a key principle in planning control measures and minimizing the impact of HFMD. The objective of this study was to establish a reliable model for early detection of the start of an HFMD epidemic in Dalian and to evaluate the performance of the model by analyzing its detection sensitivity. A negative binomial regression model was used to estimate the weekly baseline case number of HFMD, and the optimal alerting threshold was identified from candidate threshold values tested for epidemic and non-epidemic years. The circular distribution method was used to calculate the gold standard for the start of an HFMD epidemic. From 2009 to 2014, a total of 62022 HFMD cases were reported (36879 males and 25143 females) in Dalian, Liaoning Province, China, including 15 fatal cases. The median age of the patients was 3 years. The incidence rate in epidemic years ranged from 137.54 to 231.44 per 100,000 population; the incidence rate in non-epidemic years was lower than 112 per 100,000 population. The negative binomial regression model with an AIC value of 147.28 was finally selected to construct the baseline level. Threshold values of 100 for epidemic years and 50 for non-epidemic years had the highest sensitivity (100%) in both retrospective and prospective early warning, with a detection time of 2 weeks before the actual start of the HFMD epidemic. The negative binomial regression model could provide early warning of the start of an HFMD epidemic with good sensitivity and appropriate detection time in Dalian.
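The general alerting mechanism, independent of the study's specific thresholds of 100 and 50 cases, can be sketched as follows: fit a negative binomial baseline to historical weekly counts (here by the method of moments on invented numbers, not the Dalian data) and flag a week whose count exceeds a high quantile of that baseline:

```python
import math
import statistics

# Hypothetical historical counts for the same surveillance week in past years
# (invented numbers for illustration -- not the Dalian data).
history = [38, 45, 52, 41, 60, 47, 55, 49, 44, 58]

m = statistics.fmean(history)
v = statistics.variance(history)
assert v > m, "this sketch assumes overdispersed (NB-like) history"

# Method-of-moments negative binomial fit: v = m + m^2/r.
r = m * m / (v - m)
p = r / (r + m)

def nb_pmf(k):
    """NB pmf with size r and success probability p (mean m)."""
    return math.exp(math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
                    + r * math.log(p) + k * math.log(1 - p))

def threshold(level=0.95):
    """Smallest count whose baseline CDF reaches `level`: the alert line."""
    c, cdf = 0, nb_pmf(0)
    while cdf < level:
        c += 1
        cdf += nb_pmf(c)
    return c

alert_at = threshold()
this_week = 83                      # hypothetical current observation
alarm = this_week > alert_at
print(f"baseline mean {m:.1f}, alert threshold {alert_at}, alarm={alarm}")
```

A full implementation would regress the baseline on covariates (season, trend) rather than using a single historical mean, but the alert logic is the same.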

  13. Sampling and Analysis Plan for Disposition of the Standing Legacy Wastes in the 105-B, -D, -H, -KE, and -KW Reactor Buildings

    International Nuclear Information System (INIS)

    McGuire, J.J.

    1999-01-01

    This sampling and analysis plan (SAP) presents the rationale and strategy for the sampling and analysis activities that support disposition of legacy waste in the Hanford Site's 105-B, 105-D, 105-H, 105-KE, 105-KW Reactor buildings. For the purpose of this SAP, legacy waste is identified as any item present in a facility that is not permanently attached to the facility and is easily removed without the aid of equipment larger than a standard forklift

  14. Sampling and Analysis Plan for Disposition of the Standing Legacy Wastes in the 105-B, -D, -H, -KE, and -KW Reactor Buildings

    International Nuclear Information System (INIS)

    McGuire, J. J.

    1999-01-01

    This sampling and analysis plan (SAP) presents the rationale and strategy for the sampling and analysis activities that support disposition of legacy waste in the Hanford Site's 105-B, 105-D, 105-H, 105-KE, 105-KW Reactor buildings. For the purpose of this SAP, legacy waste is identified as any item present in a facility that is not permanently attached to the facility and is easily removed without the aid of equipment larger than a standard forklift

  15. Seeps and springs sampling and analysis plan for the Environmental Monitoring Plan at Waste Area Grouping 6, Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    1995-09-01

    This Sampling and Analysis Plan addresses the monitoring, sampling, and analysis activities that will be conducted at seeps and springs and at two French drain outlets in support of the Environmental Monitoring Plan for Waste Area Grouping (WAG) 6. WAG 6 is a shallow-land-burial disposal facility for low-level radioactive waste at Oak Ridge National Laboratory, a research facility owned by the U.S. Department of Energy and operated by Lockheed Martin Energy Systems, Inc. Initially, sampling will be conducted at as many as 15 locations within WAG 6 (as many as 13 seeps and 2 French drain outlets). After evaluating the results obtained and reviewing the observations made by field personnel during the first round of sampling, several seeps and springs will be chosen as permanent monitoring points, together with the two French drain outlets. Baseline sampling of these points will then be conducted quarterly for 1 year (i.e., four rounds of sampling after the initial round). The samples will be analyzed for various geochemical, organic, inorganic, and radiological parameters. Permanent sampling points having suitable flow rates and conditions may be outfitted with automatic flow-monitoring equipment. The results of the sampling and flow-monitoring efforts will help to quantify flux moving across the ungauged perimeter of the site and will help to identify changes in releases from the contaminant sources

  16. Seeps and springs sampling and analysis plan for the Environmental Monitoring Plan at Waste Area Grouping 6, Oak Ridge National Laboratory, Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This Sampling and Analysis Plan addresses the monitoring, sampling, and analysis activities that will be conducted at seeps and springs and at two French drain outlets in support of the Environmental Monitoring Plan for Waste Area Grouping (WAG) 6. WAG 6 is a shallow-land-burial disposal facility for low-level radioactive waste at Oak Ridge National Laboratory, a research facility owned by the U.S. Department of Energy and operated by Lockheed Martin Energy Systems, Inc. Initially, sampling will be conducted at as many as 15 locations within WAG 6 (as many as 13 seeps and 2 French drain outlets). After evaluating the results obtained and reviewing the observations made by field personnel during the first round of sampling, several seeps and springs will be chosen as permanent monitoring points, together with the two French drain outlets. Baseline sampling of these points will then be conducted quarterly for 1 year (i.e., four rounds of sampling after the initial round). The samples will be analyzed for various geochemical, organic, inorganic, and radiological parameters. Permanent sampling points having suitable flow rates and conditions may be outfitted with automatic flow-monitoring equipment. The results of the sampling and flow-monitoring efforts will help to quantify flux moving across the ungauged perimeter of the site and will help to identify changes in releases from the contaminant sources.

  17. A Test of the Theory of Planned Behavior to Predict Physical Activity in an Overweight/Obese Population Sample of Adolescents from Alberta, Canada

    Science.gov (United States)

    Plotnikoff, Ronald C.; Lubans, David R.; Costigan, Sarah A.; McCargar, Linda

    2013-01-01

    Purpose: To examine the utility of the theory of planned behavior (TPB) for explaining physical activity (PA) intention and behavior among a large population sample of overweight and obese adolescents (Alberta, Canada), using a web-based survey. Secondary objectives were to examine the mediating effects of the TPB constructs and moderating effects…

  18. A novel derivation of a within-batch sampling plan based on a Poisson-gamma model characterising low microbial counts in foods.

    Science.gov (United States)

    Gonzales-Barron, Ursula; Zwietering, Marcel H; Butler, Francis

    2013-02-01

    This study proposes a novel step-wise methodology for the derivation of a sampling plan by variables for food production systems characterised by relatively low concentrations of the inspected microorganism. After representing the universe of contaminated batches by modelling the between-batch and within-batch variability in microbial counts, a tolerance criterion defining batch acceptability (i.e., up to a tolerance percentage of the food units having microbial concentrations lower or equal to a critical concentration) is established to delineate a limiting quality contour that separates satisfactory from unsatisfactory batches. The problem then consists of finding the optimum decision criterion - the arithmetic mean of the analytical results (microbiological limit, m(L)) and the sample size (n) - that satisfies a pre-defined level of confidence measured on the samples' mean distributions from all possible true within-batch distributions. This is approached by obtaining decision landscape curves representing collectively the conditional and joint producer's and consumer's risks at different microbiological limits, along with confidence intervals representing uncertainty due to the propagated between-batch variability. Whilst the method requires a number of risk management decisions to be made, such as the objective of the sampling plan (GMP-based or risk-based), the modality of derivation, the tolerance criterion or level of safety, and the statistical level of confidence, the proposed method can be used when past monitoring data are available so as to produce statistically-sound dynamic sampling plans with optimised efficiency and discriminatory power. For the illustration of Enterobacteriaceae concentrations on Irish sheep carcasses, a sampling regime of n = 10 and m(L) = 17.5 CFU/cm² is recommended to ensure that the producer has at least a 90% confidence of accepting a satisfactory batch whilst the consumer at least a 97.5% confidence that a batch will not be
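The acceptance logic of such a plan can be illustrated by simulation under an assumed Poisson-gamma (negative binomial) within-batch model; the parameters below are invented for illustration and only echo the recommended n = 10 and m(L) = 17.5 CFU/cm²:

```python
import math
import random

random.seed(21)

# Sketch of the plan's operating characteristic under an assumed Poisson-gamma
# within-batch model; the Gamma shape and batch means are illustrative only.
n, m_L = 10, 17.5          # sample size and microbiological limit (CFU/cm^2)
shape = 2.0                 # assumed within-batch Gamma shape

def batch_acceptance_prob(true_mean, reps=5000):
    """P(sample mean of n unit counts <= m_L) for a batch with this true mean."""
    accepted = 0
    for _ in range(reps):
        total = 0
        for _ in range(n):
            lam = random.gammavariate(shape, true_mean / shape)
            L, k, prod = math.exp(-lam), 0, random.random()  # Knuth Poisson draw
            while prod > L:
                k += 1
                prod *= random.random()
            total += k
        if total / n <= m_L:
            accepted += 1
    return accepted / reps

good = batch_acceptance_prob(10.0)   # a satisfactory batch
bad = batch_acceptance_prob(30.0)    # a clearly unsatisfactory batch
print(f"accept satisfactory batch: {good:.3f}; accept poor batch: {bad:.3f}")
```

Sweeping `true_mean` traces an operating characteristic curve, from which the producer's and consumer's risks of the plan can be read off.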

  19. Possibility and challenges of conversion of current virus species names to Linnaean binomials

    Science.gov (United States)

    Postler, Thomas; Clawson, Anna N.; Amarasinghe, Gaya K.; Basler, Christopher F.; Bavari, Sina; Benko, Maria; Blasdell, Kim R.; Briese, Thomas; Buchmeier, Michael J.; Bukreyev, Alexander; Calisher, Charles H.; Chandran, Kartik; Charrel, Remi; Clegg, Christopher S.; Collins, Peter L.; De la Torre, Juan Carlos; DeRisi, Joseph L.; Dietzgen, Ralf G.; Dolnik, Olga; Durrwald, Ralf; Dye, John M.; Easton, Andrew J.; Emonet, Sebastian; Formenty, Pierre; Fouchier, Ron A. M.; Ghedin, Elodie; Gonzalez, Jean-Paul; Harrach, Balazs; Hewson, Roger; Horie, Masayuki; Jiang, Daohong; Kobinger, Gary P.; Kondo, Hideki; Kropinski, Andrew; Krupovic, Mart; Kurath, Gael; Lamb, Robert A.; Leroy, Eric M.; Lukashevich, Igor S.; Maisner, Andrea; Mushegian, Arcady; Netesov, Sergey V.; Nowotny, Norbert; Patterson, Jean L.; Payne, Susan L.; Paweska, Janusz T.; Peters, C.J.; Radoshitzky, Sheli; Rima, Bertus K.; Romanowski, Victor; Rubbenstroth, Dennis; Sabanadzovic, Sead; Sanfacon, Helene; Salvato, Maria; Schwemmle, Martin; Smither, Sophie J.; Stenglein, Mark; Stone, D.M.; Takada, Ayato; Tesh, Robert B.; Tomonaga, Keizo; Tordo, N.; Towner, Jonathan S.; Vasilakis, Nikos; Volchkov, Victor E.; Jensen, Victoria; Walker, Peter J.; Wang, Lin-Fa; Varsani, Arvind; Whitfield, Anna E.; Zerbini, Francisco Murilo; Kuhn, Jens H.

    2017-01-01

    Botanical, mycological, zoological, and prokaryotic species names follow the Linnaean format, consisting of an italicized Latinized binomen with a capitalized genus name and a lower case species epithet (e.g., Homo sapiens). Virus species names, however, do not follow a uniform format, and, even when binomial, are not Linnaean in style. In this thought exercise, we attempted to convert all currently official names of species included in the virus family Arenaviridae and the virus order Mononegavirales to Linnaean binomials, and to identify and address associated challenges and concerns. Surprisingly, this endeavor was not as complicated or time-consuming as even the authors of this article expected when conceiving the experiment.

  20. Poisson and negative binomial item count techniques for surveys with sensitive question.

    Science.gov (United States)

    Tian, Guo-Liang; Tang, Man-Lai; Wu, Qin; Liu, Yin

    2017-04-01

    Although the item count technique is useful in surveys with sensitive questions, the privacy of respondents who possess the sensitive characteristic of interest may not be well protected due to a defect in its original design. In this article, we propose two new survey designs (namely, the Poisson item count technique and the negative binomial item count technique) which replace the several independent Bernoulli random variables required by the original item count technique with a single Poisson or negative binomial random variable, respectively. The proposed models not only provide a closed-form variance estimate and a confidence interval within [0, 1] for the sensitive proportion, but also simplify the survey design of the original item count technique. Most importantly, the new designs do not leak respondents' privacy. Empirical results show that the proposed techniques perform satisfactorily in the sense that they yield accurate parameter estimates and confidence intervals.
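
    The design above admits a simple moment-based sketch: controls report only the non-sensitive Poisson count, while the treatment group adds the sensitive indicator, so the difference in sample means estimates the sensitive proportion. The function name and the plain mean-difference estimator are illustrative assumptions; the article's closed-form variance and interval are not reproduced here.

```python
def poisson_item_count_estimate(treatment, control):
    """Moment-estimator sketch for the Poisson item count technique.

    Controls report a Poisson count Z; the treatment group reports
    Z + S, where S = 1 if the respondent bears the sensitive
    characteristic and 0 otherwise. The difference in sample means
    then estimates the sensitive proportion. (Illustrative only.)
    """
    pi_hat = sum(treatment) / len(treatment) - sum(control) / len(control)
    # Clip to [0, 1]: a raw mean difference can fall outside the unit interval.
    return min(1.0, max(0.0, pi_hat))
```

    For example, treatment reports of [3, 2, 4, 2] against control reports of [2, 2, 3, 1] give an estimated sensitive proportion of 0.75.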

  1. The option to expand a project: its assessment with the binomial options pricing model

    Directory of Open Access Journals (Sweden)

    Salvador Cruz Rambaud

    Full Text Available Traditional methods of investment appraisal, like the Net Present Value, are not able to include the value of the operational flexibility of the project. In this paper, real options, and more specifically the option to expand, are assumed to be included in the project information in addition to the expected cash flows. Thus, to calculate the total value of the project, we are going to apply the methodology of the Net Present Value to the different scenarios derived from the existence of the real option to expand. Taking into account the analogy between real and financial options, the value of including an option to expand is explored by using the binomial options pricing model. In this way, estimating the value of the option to expand is a tool which facilitates the control of the uncertainty element implicit in the project. Keywords: Real options, Option to expand, Binomial options pricing model, Investment project appraisal
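
    The valuation described above can be sketched with a Cox-Ross-Rubinstein binomial lattice. All parameter values below are hypothetical, and modeling the expansion as exercisable at every node (American-style) is one of several possible design choices, not necessarily the paper's.

```python
import math

def expand_option_value(V0, sigma, r, T, steps, factor, cost):
    """Value a project with an option to expand, on a CRR binomial lattice.

    V0     : present value of the project's expected cash flows
    sigma  : volatility of the project value
    r      : risk-free rate (continuous compounding)
    T      : lifetime of the option (years), split into `steps` periods
    factor : expansion multiplier applied to project value (e.g. 1.5)
    cost   : investment required to expand
    """
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1 / u
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)

    # Terminal nodes: expand only if it is worthwhile.
    values = [max(V, factor * V - cost)
              for V in (V0 * u**j * d**(steps - j) for j in range(steps + 1))]

    # Backward induction: keep the better of waiting or expanding now.
    for step in range(steps - 1, -1, -1):
        values = [max(disc * (p * values[j + 1] + (1 - p) * values[j]),
                      factor * (V0 * u**j * d**(step - j)) - cost)
                  for j in range(step + 1)]
    return values[0]
```

    With a zero expansion cost the total value collapses to factor × V0, and with a prohibitive cost it collapses to V0 (the plain NPV); intermediate costs give the NPV plus a positive option premium.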

  2. A Bayesian non-inferiority test for two independent binomial proportions.

    Science.gov (United States)

    Kawasaki, Yohei; Miyaoka, Etsuo

    2013-01-01

    In drug development, non-inferiority tests are often employed to determine the difference between two independent binomial proportions. Many test statistics for non-inferiority are based on the frequentist framework. However, research on non-inferiority in the Bayesian framework is limited. In this paper, we suggest a new Bayesian index τ = P(π₁ > π₂ - Δ₀ | X₁, X₂), where X₁ and X₂ denote binomial random variables with numbers of trials n₁ and n₂ and success probabilities π₁ and π₂, respectively, and Δ₀ > 0 is the non-inferiority margin. We show two calculation methods for τ: an approximate method that uses a normal approximation and an exact method that uses the exact posterior PDF. We compare the approximate probability with the exact probability for τ. Finally, we present the results of actual clinical trials to show the utility of index τ. Copyright © 2013 John Wiley & Sons, Ltd.
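
    The approximate method can be sketched as follows, assuming independent uniform Beta(1, 1) priors (the paper's prior choice may differ): each Beta posterior is replaced by a normal with matching mean and variance, and τ then follows from the standard normal CDF.

```python
import math

def tau_normal_approx(x1, n1, x2, n2, delta0):
    """Approximate tau = P(pi1 > pi2 - delta0 | X1, X2).

    Sketch of the normal-approximation route: under uniform Beta(1, 1)
    priors, pi_i | X_i ~ Beta(x_i + 1, n_i - x_i + 1); each posterior is
    approximated by a normal with the same mean and variance. (The exact
    method integrates the posterior PDF instead.)
    """
    def beta_moments(x, n):
        a, b = x + 1, n - x + 1                      # Beta posterior parameters
        mean = a / (a + b)
        var = a * b / ((a + b) ** 2 * (a + b + 1))
        return mean, var

    m1, v1 = beta_moments(x1, n1)
    m2, v2 = beta_moments(x2, n2)
    z = (m1 - m2 + delta0) / math.sqrt(v1 + v2)
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))    # standard normal CDF
```

    For equal observed proportions the index sits at 0.5 when Δ₀ = 0 and grows toward 1 as the margin Δ₀ widens.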

  3. Nested (inverse) binomial sums and new iterated integrals for massive Feynman diagrams

    International Nuclear Information System (INIS)

    Ablinger, Jakob; Schneider, Carsten; Bluemlein, Johannes; Raab, Clemens G.

    2014-07-01

    Nested sums containing binomial coefficients occur in the computation of massive operator matrix elements. Their associated iterated integrals lead to alphabets including radicals, for which we determined a suitable basis. We discuss algorithms for converting between sum and integral representations, mainly relying on the Mellin transform. To aid the conversion we worked out dedicated rewrite rules, based on which also some general patterns emerging in the process can be obtained.

  4. Study on Emission Measurement of Vehicle on Road Based on Binomial Logit Model

    OpenAIRE

    Aly, Sumarni Hamid; Selintung, Mary; Ramli, Muhammad Isran; Sumi, Tomonori

    2011-01-01

    This research evaluates emission measurement of on-road vehicles. It develops a failure-probability model of the vehicle emission test for passenger cars using a binomial logit model. The model takes failure of the CO and HC emission tests for the gasoline-car category and of the opacity emission test for the diesel-fuel-car category as dependent variables, with vehicle age, engine size, and the brand and type of the cars as independent variables. In order to imp...

  5. Use of a negative binomial distribution to describe the presence of Sphyrion laevigatum in Genypterus blacodes

    Directory of Open Access Journals (Sweden)

    Patricio Peña-Rehbein

    Full Text Available This paper describes the frequency and number of Sphyrion laevigatum in the skin of Genypterus blacodes, an important economic resource in Chile. The analysis of a spatial distribution model indicated that the parasites tended to cluster. Variations in the number of parasites per host could be described by a negative binomial distribution. The maximum number of parasites observed per host was two.
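
    A method-of-moments fit of the negative binomial is a minimal sketch of such an analysis (the paper's actual fitting procedure is not specified here); it also makes explicit the variance > mean condition that signals parasite aggregation.

```python
def fit_negative_binomial(counts):
    """Method-of-moments fit of a negative binomial to count data (sketch).

    Returns (r, p) in the parameterization with mean r*(1-p)/p and
    variance mean + mean**2/r. Requires the sample variance to exceed
    the sample mean (overdispersion), the signature of aggregation.
    """
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    if var <= mean:
        raise ValueError("no overdispersion: negative binomial not appropriate")
    r = mean ** 2 / (var - mean)   # dispersion parameter
    p = mean / var                 # success probability
    return r, p
```

    By construction the fitted distribution reproduces the sample mean exactly, so r·(1−p)/p can be checked against it directly.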

  6. Computation of Clebsch-Gordan and Gaunt coefficients using binomial coefficients

    International Nuclear Information System (INIS)

    Guseinov, I.I.; Oezmen, A.; Atav, Ue

    1995-01-01

    Using binomial coefficients, the Clebsch-Gordan and Gaunt coefficients were calculated for extremely large quantum numbers. The main advantage of this approach is that these coefficients are calculated directly, instead of through recursion relations. Accuracy of the results is quite high for quantum numbers l₁ and l₂ up to 100. Despite the direct calculation, the CPU times are found comparable with those given in the related literature. 11 refs., 1 fig., 2 tabs
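
    For orientation, a Clebsch-Gordan coefficient for integer quantum numbers can be computed directly from Racah's closed-form sum over factorials, which is algebraically equivalent to the binomial-coefficient reformulation; this sketch makes no attempt at the extreme quantum numbers handled in the paper.

```python
from math import factorial, sqrt

def clebsch_gordan(j1, m1, j2, m2, J, M):
    """Clebsch-Gordan coefficient <j1 m1; j2 m2 | J M> for integer
    quantum numbers, via Racah's closed-form sum (a standard sketch)."""
    if M != m1 + m2 or not abs(j1 - j2) <= J <= j1 + j2:
        return 0.0
    # Triangle coefficient and magnetic-quantum-number factorials.
    pref = (2 * J + 1) * factorial(J + j1 - j2) * factorial(J - j1 + j2) \
        * factorial(j1 + j2 - J) / factorial(j1 + j2 + J + 1)
    pref *= factorial(J + M) * factorial(J - M) * factorial(j1 - m1) \
        * factorial(j1 + m1) * factorial(j2 - m2) * factorial(j2 + m2)
    # Alternating sum; terms with any negative factorial argument vanish.
    total = 0.0
    for k in range(j1 + j2 - J + 1):
        args = (k, j1 + j2 - J - k, j1 - m1 - k, j2 + m2 - k,
                J - j2 + m1 + k, J - j1 - m2 + k)
        if any(a < 0 for a in args):
            continue
        term = 1.0
        for a in args:
            term /= factorial(a)
        total += (-1) ** k * term
    return sqrt(pref) * total
```

    As a check, ⟨1 0; 1 0 | 2 0⟩ = √(2/3) and ⟨1 1; 1 −1 | 2 0⟩ = 1/√6, matching standard tables.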

  7. Negative binomial multiplicity distributions, a new empirical law for high energy collisions

    International Nuclear Information System (INIS)

    Van Hove, L.; Giovannini, A.

    1987-01-01

    For a variety of high energy hadron production reactions, recent experiments have confirmed the findings of the UA5 Collaboration that charged particle multiplicities in central (pseudo)rapidity intervals and in full phase space obey negative binomial (NB) distributions. The authors discuss the meaning of this new empirical law on the basis of new data and show that the data support the interpretation of the NB distributions in terms of a cascading mechanism of hadron production.

  8. Generalized harmonic, cyclotomic, and binomial sums, their polylogarithms and special numbers

    Energy Technology Data Exchange (ETDEWEB)

    Ablinger, J.; Schneider, C. [Johannes Kepler Univ., Linz (Austria). Research Inst. for Symbolic Computation (RISC); Bluemlein, J. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)

    2013-10-15

    A survey is given on mathematical structures which emerge in multi-loop Feynman diagrams. These are multiply nested sums, and, associated to them by an inverse Mellin transform, specific iterated integrals. Both classes lead to sets of special numbers. Starting with harmonic sums and polylogarithms we discuss recent extensions of these quantities as cyclotomic, generalized (cyclotomic), and binomially weighted sums, associated iterated integrals and special constants and their relations.
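
    A simple instance of an inverse binomial sum is the series over reciprocal central binomial coefficients, which converges to a known closed form involving π. The partial sum below is a numerical illustration only, not one of the paper's operator matrix element sums.

```python
import math

def inverse_central_binomial_sum(terms=60):
    """Partial sum of sum_{n>=1} 1/C(2n, n).

    The series converges (term ratio -> 1/4) to the known closed form
    1/3 + 2*sqrt(3)*pi/27, a classic inverse binomial sum identity.
    math.comb keeps the binomial coefficients exact integers.
    """
    return sum(1 / math.comb(2 * n, n) for n in range(1, terms + 1))
```

    Sixty terms already agree with the closed form to well beyond double precision of the remaining tail.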

  9. Plan for proposed aquifer hydraulic testing and groundwater sampling at Everest, Kansas, in January-February 2006.

    Energy Technology Data Exchange (ETDEWEB)

    LaFreniere, L. M.; Environmental Science Division

    2006-01-31

    well and subsequent testing of a new well to be installed north-northeast of the Nigh well. On November 28, 2005, the KDHE provided written comments on the Cross Section Analysis and the recommendations outlined above. In response to the KDHE's comments, the CCC/USDA agreed (Roe 2005) to discontinue plans to test the Nigh well. The CCC/USDA agreed to proceed instead with the design and installation of a new pumping well and associated observation points to be used for pump testing of the Everest aquifer along the apparent contaminant migration pathway north-northeast of the Nigh property. In conjunction with this test, the CCC/USDA also proposed groundwater sampling with the cone penetrometer (CPT) at the possible monitoring well locations requested by the KDHE in lieu of installing permanent monitoring points.

  10. Meal planning is associated with food variety, diet quality and body weight status in a large sample of French adults.

    Science.gov (United States)

    Ducrot, Pauline; Méjean, Caroline; Aroumougame, Vani; Ibanez, Gladys; Allès, Benjamin; Kesse-Guyot, Emmanuelle; Hercberg, Serge; Péneau, Sandrine

    2017-02-02

    Meal planning could be a potential tool to offset time scarcity and therefore encourage home meal preparation, which has been linked with an improved diet quality. However, to date, meal planning has received little attention in the scientific literature. The aim of our cross-sectional study was to investigate the association between meal planning and diet quality, including adherence to nutritional guidelines and food variety, as well as weight status. Meal planning, i.e. planning ahead the foods that will be eaten for the next few days, was assessed in 40,554 participants of the web-based observational NutriNet-Santé study. Dietary measurements included intakes of energy, nutrients, and food groups, and adherence to the French nutritional guidelines (mPNNS-GS) estimated through repeated 24-h dietary records. A food variety score was also calculated using a Food Frequency Questionnaire. Weight and height were self-reported. Associations between meal planning and dietary intakes were assessed using ANCOVAs, while associations with quartiles of mPNNS-GS scores, quartiles of food variety score, and weight status categories (overweight, obesity) were evaluated using logistic regression models. A total of 57% of the participants reported planning meals at least occasionally. Meal planners were more likely to have a higher mPNNS-GS (OR quartile 4 vs. 1 = 1.13, 95% CI: [1.07-1.20]) and a higher overall food variety (OR quartile 4 vs. 1 = 1.25, 95% CI: [1.18-1.32]). In women, meal planning was associated with lower odds of being overweight (OR = 0.92 [0.87-0.98]) and obese (OR = 0.79 [0.73-0.86]). In men, the association was significant for obesity only (OR = 0.81 [0.69-0.94]). Meal planning was associated with a healthier diet and less obesity. Although no causality can be inferred from the reported associations, these data suggest that meal planning could potentially be relevant for obesity prevention.

  11. Estimation of Multi-State European Call Option Prices Using the Binomial Model

    Directory of Open Access Journals (Sweden)

    Mila Kurniawaty, Endah Rokhmati

    2011-05-01

    Full Text Available An option is a contract that gives its holder the right to buy (call option) or sell (put option) a certain underlying asset at a certain price (strike price) within a certain period (before or at the expiration date). Recent developments in options have given rise to many pricing models for estimating option prices; one widely used model is the Black-Scholes formula. A multi-state option is an option whose payoff is based on two or more underlying assets. Several methods can be used to estimate the price of a call option; in particular, the finance community often uses the binomial model to estimate a broad range of option types, such as multi-state call options. From the binomial-model estimates of the call option, the best formula is then determined by computing errors with the mean square error, which yields the average error of each formula in the binomial model. The results show that estimation using the five-point formula is better than estimation using the four-point formula.

  12. Discrimination of numerical proportions: A comparison of binomial and Gaussian models.

    Science.gov (United States)

    Raidvee, Aire; Lember, Jüri; Allik, Jüri

    2017-01-01

    Observers discriminated the numerical proportion of two sets of elements (N = 9, 13, 33, and 65) that differed either by color or orientation. According to the standard Thurstonian approach, the accuracy of proportion discrimination is determined by irreducible noise in the nervous system that stochastically transforms the number of presented visual elements onto a continuum of psychological states representing numerosity. As an alternative to this customary approach, we propose a Thurstonian-binomial model, which assumes discrete perceptual states, each of which is associated with a certain visual element. It is shown that the probability β with which each visual element can be noticed and registered by the perceptual system can explain data of numerical proportion discrimination at least as well as the continuous Thurstonian-Gaussian model, and better, if the greater parsimony of the Thurstonian-binomial model is taken into account using AIC model selection. We conclude that the Gaussian and binomial models represent two different fundamental principles, internal noise versus the use of only a fraction of the available information, both of which are plausible descriptions of visual perception.
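
    The Thurstonian-binomial idea can be simulated directly: each element is registered with probability β, and the observer compares the registered counts of the two sets. The set sizes, β values, and tie-breaking rule used here are arbitrary illustrations, not the paper's fitted model.

```python
import random

def discrimination_accuracy(n_a, n_b, beta, trials=10000, seed=7):
    """Thurstonian-binomial sketch of proportion discrimination.

    Each of the n_a + n_b elements is independently registered with
    probability beta; the observer picks the set with more registered
    elements (ties are split at random). Returns the proportion of
    trials in which the truly larger set was chosen.
    """
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        k_a = sum(rng.random() < beta for _ in range(n_a))
        k_b = sum(rng.random() < beta for _ in range(n_b))
        if k_a == k_b:
            correct += rng.random() < 0.5          # guess on ties
        else:
            correct += (k_a > k_b) == (n_a > n_b)  # larger registered count wins
    return correct / trials
```

    Accuracy rises from near chance toward 1 as β grows, since a larger fraction of the available information is used.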

  13. Genome-enabled predictions for binomial traits in sugar beet populations.

    Science.gov (United States)

    Biscarini, Filippo; Stevanato, Piergiorgio; Broccanello, Chiara; Stella, Alessandra; Saccomani, Massimo

    2014-07-22

    Genomic information can be used to predict not only continuous but also categorical (e.g. binomial) traits. Several traits of interest in human medicine and agriculture present a discrete distribution of phenotypes (e.g. disease status). Root vigor in sugar beet (B. vulgaris) is an example of a binomial trait of agronomic importance. In this paper, a panel of 192 SNPs (single nucleotide polymorphisms) was used to genotype 124 sugar beet individual plants from 18 lines, and to classify them as showing "high" or "low" root vigor. A threshold model was used to fit the relationship between binomial root vigor and SNP genotypes, through the matrix of genomic relationships between individuals in a genomic BLUP (G-BLUP) approach. From a 5-fold cross-validation scheme, 500 testing subsets were generated. The estimated average cross-validation error rate was 0.000731 (0.073%). Only 9 out of 12326 test observations (500 replicates for an average test set size of 24.65) were misclassified. The estimated prediction accuracy was quite high. Such accurate predictions may be related to the high estimated heritability for root vigor (0.783) and to the few genes with large effect underlying the trait. Despite the sparse SNP panel, there was sufficient within-scaffold LD where SNPs with large effect on root vigor were located to allow genome-enabled predictions to work.

  14. Analysis of railroad tank car releases using a generalized binomial model.

    Science.gov (United States)

    Liu, Xiang; Hong, Yili

    2015-11-01

    The United States is experiencing an unprecedented boom in shale oil production, leading to a dramatic growth in petroleum crude oil traffic by rail. In 2014, U.S. railroads carried over 500,000 tank carloads of petroleum crude oil, up from 9500 in 2008 (a 5300% increase). In light of continual growth in crude oil by rail, there is an urgent national need to manage this emerging risk. This need has been underscored in the wake of several recent crude oil release incidents. In contrast to highway transport, which usually involves a tank trailer, a crude oil train can carry a large number of tank cars, having the potential for a large, multiple-tank-car release incident. Previous studies exclusively assumed that railroad tank car releases in the same train accident are mutually independent, thereby estimating the number of tank cars releasing given the total number of tank cars derailed based on a binomial model. This paper specifically accounts for dependent tank car releases within a train accident. We estimate the number of tank cars releasing given the number of tank cars derailed based on a generalized binomial model. The generalized binomial model provides a significantly better description for the empirical tank car accident data through our numerical case study. This research aims to provide a new methodology and new insights regarding the further development of risk management strategies for improving railroad crude oil transportation safety. Copyright © 2015 Elsevier Ltd. All rights reserved.
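
    A beta-binomial mixture is one simple way to mimic dependent releases within an accident (an illustrative stand-in for the paper's generalized binomial model, whose exact form is not reproduced here): a shared severity draw induces a positive correlation ρ between cars and inflates the variance above the independent-binomial value np(1-p). All parameter values below are hypothetical.

```python
import random

def simulate_releases(n_derailed, mean_p, rho, n_sims, seed=1):
    """Monte Carlo sketch of dependent tank-car releases.

    A per-accident severity p is drawn from a Beta distribution with
    mean mean_p and intra-class correlation rho; each of the n_derailed
    cars then releases independently with that shared p. Marginally each
    car releases with probability mean_p, but releases within an
    accident are positively correlated.
    """
    rng = random.Random(seed)
    # Beta(a, b) chosen so the mixture has mean mean_p and ICC rho.
    a = mean_p * (1 - rho) / rho
    b = (1 - mean_p) * (1 - rho) / rho
    out = []
    for _ in range(n_sims):
        p = rng.betavariate(a, b)  # accident-level severity draw
        out.append(sum(rng.random() < p for _ in range(n_derailed)))
    return out
```

    Comparing the simulated variance with the independent-binomial variance n·p·(1−p) shows the overdispersion that motivates a generalized model.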

  15. Validity of the World Health Organization Adult ADHD Self-Report Scale (ASRS) Screener in a representative sample of health plan members

    OpenAIRE

    Kessler, Ronald C.; Adler, Lenard; Gruber, Michael J.; Sarawate, Chaitanya A.; Spencer, Thomas; Van Brunt, David L.

    2007-01-01

    The validity of the 6-question World Health Organization Adult ADHD Self-Report Scale (ASRS) Screener was assessed in a sample of subscribers to a large health plan in the US. A convenience sub-sample of 668 subscribers was administered the ASRS Screener twice to assess test-retest reliability and then a third time in conjunction with a clinical interview for DSM-IV adult ADHD. The data were weighted to adjust for discrepancies between the sample and the population on socio-demographics and...

  16. How Often Are Parents Counseled About Family Planning During Pediatric Visits? Results of a Nationally Representative Sample.

    Science.gov (United States)

    Venkataramani, Maya; Cheng, Tina L; Solomon, Barry S; Pollack, Craig Evan

    2017-07-01

    Maternal family planning plays an important role in child, maternal, and family health; children's health care providers are in a unique position to counsel adult caregivers regarding contraception and appropriate birth spacing. We sought to determine the prevalence of caregiver family planning counseling by children's health care providers during preventive care visits for infants and young children. Data from the National Ambulatory Medical Care Survey from 2009 to 2012 as well as National Hospital Ambulatory Medical Care Survey from 2009 to 2011 were analyzed to determine the weighted frequency of family planning/contraception counseling provided during preventive, primary care visits for children younger than the age of 2 years. Family planning/contraception counseling or education was documented in only 16 of 4261 preventive care visits in primary care settings for children younger than the age of 2 years, corresponding to 0.30% (95% confidence interval, -0.08% to 0.68%) of visits nationally. Similar frequencies were calculated for preventive visits with children younger than 1 year and with infants younger than 60 days of age. Despite Bright Futures' recommendations for children's health care providers to address caregiver family planning during well infant visits, documented counseling is rare. The results indicate that there are missed opportunities to promote family health in the pediatric setting. Copyright © 2016 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  17. Jump-and-return sandwiches: A new family of binomial-like selective inversion sequences with improved performance

    Science.gov (United States)

    Brenner, Tom; Chen, Johnny; Stait-Gardner, Tim; Zheng, Gang; Matsukawa, Shingo; Price, William S.

    2018-03-01

    A new family of binomial-like inversion sequences, named jump-and-return sandwiches (JRS), has been developed by inserting a binomial-like sequence into a standard jump-and-return sequence, discovered through use of a stochastic Genetic Algorithm optimisation. Compared to currently used binomial-like inversion sequences (e.g., 3-9-19 and W5), the new sequences afford wider inversion bands and narrower non-inversion bands with an equal number of pulses. As an example, two jump-and-return sandwich 10-pulse sequences achieved 95% inversion at offsets corresponding to 9.4% and 10.3% of the non-inversion band spacing, compared to 14.7% for the binomial-like W5 inversion sequence, i.e., they afforded non-inversion bands about two thirds the width of the W5 non-inversion band.

  18. Surface water sampling and analysis plan for environmental monitoring in Waste Area Grouping 6 at Oak Ridge National Laboratory, Oak Ridge, Tennessee. Environmental Restoration Program

    Energy Technology Data Exchange (ETDEWEB)

    1994-06-01

    This Sampling and Analysis Plan addresses surface water monitoring, sampling, and analysis activities that will be conducted in support of the Environmental Monitoring Plan for Waste Area Grouping (WAG) 6. WAG 6 is a shallow-burial land disposal facility for low-level radioactive waste at the Oak Ridge National Laboratory, a research facility owned by the US Department of Energy and managed by Martin Marietta Energy Systems, Inc. Surface water monitoring will be conducted at nine sites within WAG 6. Activities to be conducted will include the installation, inspection, and maintenance of automatic flow-monitoring and sampling equipment and manual collection of various water and sediment samples. The samples will be analyzed for various organic, inorganic, and radiological parameters. The information derived from the surface water monitoring, sampling, and analysis will aid in evaluating risk associated with contaminants migrating off-WAG, and will be used in calculations to establish relationships between contaminant concentration (C) and flow (Q). The C-Q relationship will be used in calculating the cumulative risk associated with the off-WAG migration of contaminants.
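
    A common way to establish such a C-Q relationship is a power-law rating curve C = a·Q^b fitted by least squares in log-log space; the plan does not specify the functional form, so this sketch is only one plausible choice.

```python
import math

def fit_power_law_cq(flows, concentrations):
    """Least-squares sketch of a power-law rating curve C = a * Q**b.

    Taking logs gives the linear model log C = log a + b * log Q, which
    is fitted by ordinary least squares. Inputs must be positive.
    """
    xs = [math.log(q) for q in flows]
    ys = [math.log(c) for c in concentrations]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b
```

    For data generated exactly from C = 3·Q^0.5 the fit recovers a = 3 and b = 0.5.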

  19. Surface water sampling and analysis plan for environmental monitoring in Waste Area Grouping 6 at Oak Ridge National Laboratory, Oak Ridge, Tennessee. Environmental Restoration Program

    International Nuclear Information System (INIS)

    1994-06-01

    This Sampling and Analysis Plan addresses surface water monitoring, sampling, and analysis activities that will be conducted in support of the Environmental Monitoring Plan for Waste Area Grouping (WAG) 6. WAG 6 is a shallow-burial land disposal facility for low-level radioactive waste at the Oak Ridge National Laboratory, a research facility owned by the US Department of Energy and managed by Martin Marietta Energy Systems, Inc. Surface water monitoring will be conducted at nine sites within WAG 6. Activities to be conducted will include the installation, inspection, and maintenance of automatic flow-monitoring and sampling equipment and manual collection of various water and sediment samples. The samples will be analyzed for various organic, inorganic, and radiological parameters. The information derived from the surface water monitoring, sampling, and analysis will aid in evaluating risk associated with contaminants migrating off-WAG, and will be used in calculations to establish relationships between contaminant concentration (C) and flow (Q). The C-Q relationship will be used in calculating the cumulative risk associated with the off-WAG migration of contaminants

  20. Health and safety work plan for sampling colloids in Waste Area Grouping 5 at Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    Marsh, J.D.; McCarthy, J.F.

    1994-07-01

    This Work Plan/Site Safety and Health Plan (SSHP) and the attached work plan are for the performance of the colloid sampling project at WAG 5. The work will be conducted by the Oak Ridge National Laboratory (ORNL) Environmental Sciences Division (ESD) and associated ORNL environmental, safety, and health support groups. This activity will fall under the scope of 29 CFR 1910.120, Hazardous Waste Operations and Emergency Response (HAZWOPER). The purpose of this document is to establish health and safety guidelines to be followed by all personnel involved in conducting work for this project. Work will be conducted in accordance with requirements as stipulated in the ORNL HAZWOPER Program manual, and applicable ORNL, Martin Marietta Energy Systems, Inc., and US Department of Energy (DOE) policies and procedures